Dear friend,
I bought myself a guitar this week. Hello beginner mode, my old friend. Even with decades of tricks that allow me to prevail over writer’s block, imposter syndrome, and the countless hidden obstacles that sink creative endeavors (that’s what Art of the Living Dead was about in a nutshell), it still surprised me how uncomfortable it is to be a guitar noob. Why is it so hard to learn something new? Part of the answer is explained by how our education system taught us to learn. That’s what this week’s chapter is about. Stay creative.
Your friend,
Ade
“We have thought erroneously of education as the mature wisdom and overbrimming knowledge of the grown-ups injected by the discipline pump into the otherwise ‘empty’ child’s head.” —Buckminster Fuller
What a relief it is to graduate. With education checked off the list, you can proceed to adulthood. We are led to believe you become an adult once your mind has finished indexing the world and your brain is correctly organized into tightly curated buckets. As adults, we are finally allowed to turn “write-access” off on our brains and switch into passive “read-only” mode. Falling for this con, most people spend their adulthood committed to the concepts stamped into their brains in school. The pressure to produce right answers in class metastasizes into a permanent fear of any thought that deviates from the gospel of textbooks.
School doesn’t equip us to manage the unique sparks bouncing around in our skulls. Instead, education fools us into believing that we are all the same, simple empty vessels waiting to hook our brains up to a knowledge tube. If our thoughts deviate too far from the average, we can medicate our minds closer to normal. Sometimes the drugs are prescribed by doctors, other times we self-medicate with chemicals as described in the last chapter.
Minds crave stimulation while they are simultaneously overwhelmed by the un-stimulating information we are force-fed at school or work. We outsource our reality curation to our devices, offloading the heavy lifting of critical thinking to computers that ping us whenever an algorithm detects dopamine-worthy anomalies. The things that capture our attention are the outliers, the too-good-to-be-true bargains, the extreme sports, the tabloid headlines, the fake news, and the politically incorrect. And yet the happiness algorithms haven’t made us happier. Armed with the most advanced technology in the history of humanity, everyone seems stuck on the verge of meltdown, road rage, maybe even annihilation.
The older you get, the harder it is to adapt to new technology. While grownups fumble with new tech, kids grasp it effortlessly. We shouldn’t be surprised that if you give a child an iPad in the morning they will be proficient by lunchtime. Come back a week later and new muscle memory will be permanently burned into their neural pathways. Augmented with instant access to anything their curiosity desires, kids today develop different mental models than those of us raised on television. Meanwhile, teens, the early-adopter guinea pigs of our society, are beta testing the next wave of software that the rest of us will catch up to eventually.
Parents find themselves in the difficult predicament of trying to regulate the flow of information that their children consume. Prescribing the right dopamine-to-knowledge ratio is a futile exercise that is nevertheless mandatory because we want our kids to reap the benefits of technology while at the same time we want to protect them from danger. To do otherwise would be neglect. Or would it?
What would happen if you put an unsupervised computer in the middle of a remote village in India or South Africa? This experiment was done by education scientist Sugata Mitra in 1999 (watch his TED talk here) and the results were stunning. Sugata placed computers in holes in walls in rural areas of India where children were allowed to use them freely. Even without knowing English, they were capable of learning to use the computers on their own, without supervision. Sugata concluded:
“In nine months, a group of children left alone with a computer in any language will reach the same standard as an office secretary in the West.”
What is the difference between an American child watching toy unboxing videos on YouTube and a child in India teaching themselves English so that they can use a computer? To answer this question we need to unwind our definition of education and imagine the world before computers.
Our modern minds struggle to comprehend how the world functioned before electricity. How could people orchestrate the entirety of global human life with only scrolls and quills? It was chaos until about 300 years ago. That’s when the education system we know today was invented. The British Empire built a system that produced a network of identical humans with identical knowledge. Like a flesh-based internet, anywhere on Earth you could find humans who could read, write, add, subtract, multiply, and divide. As schools spread, the output of the system was more and more nodes on the network. Year after year students graduated, magnifying the power of this pre-industrial, world-covering computer.
The education system invented by our ancestors has been so effective that today we still mandate that our kids follow the same formula. Learn to read, write, and do math; otherwise society has little use for you. The value of those skills can’t be denied, but that other byproduct, the identical humans falling off the end of the conveyor belt, has side effects. If we haven’t passed it already, there will come a tipping point where the benefits of cookie-cutter education are dwarfed by the problems caused by knowledge cloning.
The self-education of village children trying to make use of a computer is a mirror that should shock anyone invested in traditional schools. It shouldn’t be possible. If we don’t force kids to go to school, how can they possibly learn? And yet they will. Mitra observes that “Children will learn to do what they want to learn to do.” If a child wants to learn English because she wants to use a computer, she will do it.
When you realize that education can be free and self-directed, the definition of education itself has to change. Sugata Mitra offers a better one:
“Education is a self-organizing system where learning is an emergent phenomenon.”
Education isn’t something that arises from force-feeding curriculum. Education is initiated by a user zero moment. If you can ask a question that captures a child’s imagination, there is nothing in the world that can stop that child from learning. Sugata explains:
“The teacher sets the process in motion and then she stands back in awe and watches as learning happens.”
The challenge is to find the question, the magic spark that transforms a child’s iPad from a dopamine-satisfying pacifier into a bicycle for their mind. It is the same challenge that we all face as social media helps us procrastinate instead of doing meaningful work. We carry incredibly powerful computers in our pockets and we use them to document our pets.
Mitra’s experiments would be difficult to reproduce today because there is hardly a corner of the planet that isn’t filled with smartphones. The computer has become a universal tool. Software is continually improving, and the expectation that computers be usable by anyone, even people with disabilities, is resulting in unprecedentedly usable tools. Designers call this property “accessibility” and it is a noble goal. There is a hidden danger on the flip side of this universal tool, however. What gets lost as computers get easier to use?
The paradox of usability is that the same sleight of hand that makes powerful computers accessible to normal people also hides the internal workings that must be understood to truly master the tool. You don’t need to understand HTML to navigate a website. Nobody wants to go back to the computers of the 1990s, but there is something to be said about an era when anyone with the patience to read the manual could understand their machine intimately.
At the dawn of computing, a curious youngster could reverse engineer software, improve it, hack it, bend it to their imagination. Bill Gates dug through dumpsters to find the source code for the computer he wanted to create software for. The early internet grew on top of HTML that was only a “view source” away. The code that powers the web has shifted from transparent to opaque as the code underneath is increasingly compiled, obfuscated, minified, and processed in ways that human eyes can no longer penetrate. Like Volkswagen Beetles, computers evolved from sturdy, manual machines into self-driving black boxes.
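To make that shift concrete, here is a minimal sketch (the function and the names in it are hypothetical, invented purely for illustration): the first few lines are the kind of code a curious reader could once learn from with “view source,” and the comment at the end shows roughly what a modern build pipeline ships after minification.

```ts
// What a curious reader could once find with "view source":
// a small, self-explanatory function with meaningful names.
function greetVisitor(name: string): string {
  return `Welcome back, ${name}!`;
}

console.log(greetVisitor("Ada"));

// Roughly what a build pipeline ships today, after minification strips the
// types, whitespace, and meaningful names (equivalent JavaScript, one line):
//
//   function g(n){return"Welcome back, "+n+"!"}console.log(g("Ada"));
```

The two versions behave identically; only the second one keeps its secrets.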
The promise of the web was that anyone could host a website and inhabit their own corner of the internet. We traded that freedom for walled gardens and WYSIWYG editors. Why host your own website when it is easier to write on Medium? Why write an essay when you can tweet? Why create a web page when there’s Facebook?
The internet has shifted from an open source model that gave voice to the silent masses towards corporate monopolies that have no incentive to open the curtains. The ability of curious minds to penetrate the black box of technology is diminishing. This is why society’s reliance on ads is so toxic. Videos aren’t optimized for the benefit of the child; they prey on a child’s inability to distinguish an advertisement from quality content.
In the good old days of advertising, backroom deals were out in the open. The poster on the side of the bus stop or the ad printed on a bench represented a partnership that we could deduce. Somebody sold your attention to somebody who paid for the privilege of attaching their message to high-visibility real estate. We didn’t know how much money traded hands, but we recognized the game.
Before the internet, even product placement in movies and television clung to a sliver of transparency. Is there anything more charming than the trail of Reese’s Pieces in E.T.? But when it comes to social media, the economic engine is opaque. We can’t see the algorithm that Facebook uses to show us ads. Twitter won’t show us the black magic that fuels their feeds. You want to know the data profile Google has for you? Forget about it.
Even the Nigerian Prince has the courtesy of a bit of transparency in his scam. When his con job appears in your inbox, the natural response is, “Who are the idiots who fall for this scam?” Your second question is, “Are spammers so dumb that they can’t come up with a better subject line?” Eventually we realize that the email isn’t meant for us. The wolf isn’t attacking the herd; he is looking for the weak calf, skimming off the edges, satisfied with the easy kill. Big tech, on the other hand, isn’t content with scraps; it wants the entire herd.
We are flying blind, relying on machines that we don’t understand, trusting in the morality of machines that have no morals. We are passengers, locked in our seats, sealed off from the cockpit, where all we can do is hope that the pilots are capable of landing the plane. In the next chapter we will enter the cockpit and see who is actually in control.