“The mitochondria is the powerhouse of the cell.”
– Philip Siekevitz
A popular internet meme is that adulthood is like skipping the tutorial section in a role-playing video game and then having no idea what you’re doing as soon as you first step out into the world. For example, after creating my character in Blizzard’s MMORPG World of Warcraft (2004), I recall struggling to figure out where to go next and how to so much as progress the storyline. I would see higher-level players glide past me on their fancy mounts with their badass-looking armor and wonder how they had become so powerful. Surely, they must have had access to some esoteric knowledge of character progression, right?
Like adulthood, character progression in video games is a never-ending process, not a product. It’s about the journey you must undertake to acquire a fancy mount and badass-looking armor, not about the fancy mount and badass-looking armor in and of themselves.
I never stuck with World of Warcraft, but in 2014 I transitioned to Bungie’s shared-world shooter, Destiny. Destiny, I think, was addictive for many of the same reasons WoW was: it was a loot-based role-playing video game with a heavy emphasis on teamwork and cooperation. Above all else, Destiny was about “becoming legend”: grinding for a long period of time to create the most powerful character imaginable.
Although plagued by inexcusable problems at its launch, such as a fundamentally incoherent story mode, repetitive gameplay, and a broken reward system, Destiny has been renowned not only for its ability to cultivate relationships between people halfway across the world from each other, but also for its longevity. Whereas player populations in most games drop off after a year or two, Destiny maintains a strong and loyal following, with players to this day grinding to reach the maximum light level despite knowing that all the hard work and energy they expended will be nullified as soon as the next expansion or full sequel is released. Can you imagine, then, how quickly the Destiny community would have disbanded if, within the first few hours of the game, you could max out your character?
The answer to the preceding question is part of why the American education system is faulty. Just as video games with weaker communities operate under the assumption that character progression can be completed in a matter of hours, the institutions that teach our children assume that learning can be completed in a matter of years. On this model, education begins in preschool, continues through elementary and middle school, and ends in high school, when in reality education begins after we’ve received our diplomas and continues until we die.
Another point to consider is that an inadequate education is no different from skipping the tutorial section in a video game: it fails to articulate the skills we must practice before our real education can begin, and so, as soon as we reach the point where we are required to achieve independence (let’s say age 20 or 21), we suffer tremendously. That might seem cynical, as if I’m suggesting that the first 18 years of life are a waste of time, but quite the opposite: there is SO much to learn beyond high school that it’s actually very exciting. In fact, there is so much to learn beyond high school that it would be impossible to learn it all in a hundred lifetimes.
None of this is to say that you should downplay education, but you should question its value. For instance, my preliminary education has a special place in my heart for teaching me the rudiments of reading and writing, but I will always criticize it for failing to teach me how to pay taxes, manage expenses, do the laundry, and cook a meal, and, at a more advanced level, for failing to teach me how to remedy negative emotions, form and sustain a meaningful relationship, and make a sensitive decision with long-lasting ramifications.
Of course, it’s good to be able to write an essay about Shakespeare’s Romeo and Juliet or use the Pythagorean theorem to solve for the side of a triangle, but what practical applications do these principles have in the wider context of my life and the trials and tribulations I will face along its inevitably perilous course? Few, probably. Even worse, school teaches us what to think, but never how to think. This became evident to me upon taking an English examination in high school, in which I was instructed to complete a multiple-choice section after reading a short story whose language was bewilderingly antiquated. Why should I be conditioned into thinking that there is a single “right” interpretation of classic literature when classic literature is supposed to warrant multiple interpretations?
Sometimes I feel there are puppeteers dictating the content I should and shouldn’t put into my mind, or balding old men in slick suits who’ve written the exams I’ve taken all my life and who think that math equations are supposed to teach me how to solve a personal problem, and that poems and short stories from three centuries ago will allow me to develop a deeper appreciation for works of fiction. However, I’ve actually learned more from “turning my brain to mush,” watching television shows and reading Wikipedia articles.
And that, you could say, is why American education is destined to fail. It prescribes the knowledge that institutional gatekeepers “think” you ought to have once adulthood is on the horizon, but it doesn’t give you the tools you need to transition into adulthood and navigate it successfully.
So you know what I do? I study EVEN when I don’t have a test to take the following week. I learn new things not as part of an endless, narcissistic pursuit, but as part of a contingency, so that in the event my car breaks down in the middle of nowhere, I won’t just solve a silly math equation. I’ll repair the damn thing and make it to my destination.