How Problematic Are You?

“We’re all pretty bizarre. Some of us are just better at hiding it, that’s all.”

– Andrew, The Breakfast Club

In John Hughes’s The Breakfast Club (1985), five stereotypical teenagers (a “prom queen,” a “geek,” a “jock,” a “criminal,” and a “basket case”) attend detention in their high school’s library early on a Saturday morning. As punishment for their previous transgressions, the delinquents are instructed to compose a 1,000-word essay explaining “who they think they are,” but no work gets done, and they spend the remainder of their detention forming unlikely bonds with one another. At face value, the film can be likened to an extended bottle episode of a television show—not much happens beyond a band of misfits talking about their problems and standing up to an abusive authority figure for an hour and a half. Upon further examination, we find a surprising character study on the nuanced complexities of adolescence, with our five protagonists discovering that, despite the assigned stereotypes that ostensibly divide them, they are in fact united by common struggles inside and outside of school.

The Breakfast Club was so successful that it grossed more than fifty times its budget, cementing its status as one of the defining films of the 1980s, one that has stood the test of time. The film resonates with me for its smartly written, dynamic, and relatable characters, who all have rich and complicated histories that provide clarity on their personalities. For example, we learn that Allison, the basket case, struggles with forming meaningful relationships because her parents have neglected her all her life. Brian, the geek, is easily impressionable and contemplates suicide for fear of failing an important class. Claire, the prom queen, is deeply insecure about her virginity, while her friend group prevents her from forging a stable identity. Andrew, the jock, is pushed too hard by his father for not being a good enough wrestler. Finally, Bender, the criminal, incurs constant verbal and physical abuse from his father.

Unfortunately, there are no quick fixes for the issues that the Breakfast Club has spent the majority of its detention working through. The conclusion sees Allison and Andrew develop a relationship, Claire help Bender get in touch with his compassion, and Brian finish the essay for Mr. Vernon. However, their fate is largely open to interpretation; no sequel was ever made to tell us where Allison, Andrew, Claire, Bender, and Brian end up over the next 10 to 20 years, so we presume that they all go their separate ways after detention ends. What is especially poignant is the understanding that even if these characters’ lives never intersect again, and even if their issues persist through high school graduation and into adulthood, the impacts they leave on each other will last forever. This is best illustrated by the film’s hallmark song, “Don’t You (Forget About Me)” by Simple Minds, whose lyrics speak to the necessity of transparency in human relationships.

While Hughes’s beloved coming-of-age film serves as a commentary on typical teenage angst and how frequently adults misunderstand it, I think many of the characters’ insecurities extend well beyond, and perhaps outlast, the adolescent years, manifesting in a variety of cultures that are not limited to white, middle-class America. Left unresolved, they could yield disastrous consequences later in life. For instance, if Bender never makes amends with his abusive father, he could become an abusive father himself one day. If Claire never chooses the right friends, she might spend the rest of her life never knowing who she is. Finally, if Andrew never learns to cope with the prospect of failure, he may one day successfully attempt suicide. The aim of The Breakfast Club, then, is to encourage you, the viewer, to recognize how little effort is involved in judging, labeling, and dismissing another person based on an eccentricity, idiosyncrasy, or superficial attribute that he or she is best known for, and to see that by digging deeper into what makes that person tick, you will make shocking discoveries about them, and even about yourself.

Think of a person in your life whom, by virtue of something you don’t like, such as an annoying stutter or thick perfume, you assume to be completely one-dimensional and problem-free. Chances are, such a person doesn’t exist. In his video The Science of Awkwardness (2015), Michael from Vsauce makes this point while discussing “protagonist disease,” a condition that erodes our interpersonal interactions by deluding us into thinking the world revolves around us 24 hours a day, 7 days a week, or that we are the sole characters who drive our stories forward. Everyone else is just, as Michael puts it, “one-dimensional background characters” who have virtually no effect on your life. In fact, you couldn’t care less about them because they don’t understand what it’s like to be in your shoes—your goals, dreams, aspirations, internal conflicts, and all of the complexities that make you, you. Michael then uses the example of a guy who took too long to order in front of you earlier this morning to illustrate a closely related psychological phenomenon, the fundamental attribution error. He states, “He’s obviously just an innately annoying person. That’s his entire purpose, but when YOU take too long, it’s because the staff was unhelpful—you were flustered, preoccupied by an earlier conversation.” But what if, all along, the reason that guy took so long to order was that he was lost in thought about his wife of 26 years, who had passed away from cancer earlier that week?

The fundamental attribution error becomes evident in the scene where the Breakfast Club gathers for a group therapy session. Brian claims that he considers them all to be his friends, but worries that as soon as Monday arrives, everything will go back to normal and they will no longer speak to each other. Claire is brutally honest with Brian, stating that if Andrew saw Brian in the hallway on Monday, he would briefly acknowledge Brian’s presence but then disparage him behind his back so that his friends wouldn’t think he’s a loser for hanging out with the geeks. Allison asks Claire what would happen if she approached her in the hallway, and Claire replies, “Same exact thing.” Later, Brian calls Claire out on her conceit—of course she will look down upon the less privileged and less popular when she cannot even so much as determine who her real friends are, but Claire protests that it’s more complicated than that. “I hate it—I hate having to go along with everything my friends say,” complains Claire. Brian asks why, then, she continues to hang around people who clearly make her feel miserable. In tears, Claire admits, “I don’t know. You don’t understand. You’re not friends with the same kind of people that Andy and I are friends with. You know, you just don’t understand the pressure they can put on you.” An outraged Brian asks if Claire really thinks he doesn’t know what it’s like to be under pressure, and then shouts, “Well FUCK YOU! Fuck you.”

I am fascinated and quite relieved to know that everyone, not just high school students, has a unique set of challenges that they must overcome if they are expected to survive and thrive. I cannot, out of respect for my friends’ and family members’ privacy, go down an entire list of their personal problems, but let’s just say that they are not exempt from them. Furthermore, I, too, have made the fundamental attribution error on a number of occasions. For instance, I recently discovered that one guy whose excessive masculinity I had mistaken for his defining trait actually used to go into the closet to cry when his customers became too abusive for him to handle.

It just goes to show that stereotypes, whether we subscribe to them or not, are only a small fraction of our personas.

What Cannabidiol Therapy Can Do for You

Megan, an old friend, messaged me on Facebook asking if I could write an article about her reactions to cannabidiol (CBD), the non-psychoactive sister cannabinoid to THC. Like THC, CBD binds to cannabinoid receptors in the brain, but it elicits a wide array of effects that are not hallucinogenic in nature. Some of the reported effects include an improvement in mood, increased sleep and appetite, pain modulation, and refined memory (Butterfield, 2017). CBD has gained popularity with a growing number of patients who are interested in adopting cannabis as a form of treatment for their ailments but who want to do so without experiencing the taxing head highs that marijuana is known for.

With her permission, I can reveal why Megan chose CBD as her preferred treatment. Simply put, Megan suffers from mild depression and severe anxiety, and it took great courage for her to admit that to me, considering the public stigmas around mental health that plague Americans and ultimately turn them away from getting the help they so desperately need (Parcesepe & Cabassa, 2013). A common mental health stigma is that anxious or depressed people are weak. We know that is not always the case.

But Megan’s story doesn’t end with this article—she wants to encourage other sufferers of depression and anxiety not only to seek possible treatments, but to seek natural treatments. While drugs like Prozac and Xanax have their respective benefits, one has been linked to notable personality changes and the other carries a high potential for abuse, overdoses, and hospital admissions, especially when used irresponsibly (Harding, 2009; MacLaren, 2016). If I can use Megan’s story to spread the word that natural remedies are out there and can work as effectively as synthetic drugs, I like to think that I’d be doing the world a service.

Then again, it’s very easy to say that CBD therapy works, but that does not necessarily mean it will work for you. As such, this article will provide a brief rundown of Megan’s documented experiences with CBD over a period of 15 days so that you, the reader, can judge whether or not it is the right treatment option for you.

Before continuing, let me address the elephant in the room: CBD’s legal status. I’m sure you don’t want to obtain CBD hemp oil only to discover that it’s no less illegal in your state than THC is, so what does the legality of CBD look like, both state by state and at the federal level? The short answer is “it’s complicated.” In December of 2016, the Drug Enforcement Administration stated that any extracts from a cannabis plant are Schedule I controlled substances, effectively putting them on the same level as heroin, LSD, and bath salts. Nonetheless, CBD laws are inconsistent across the country. In the 28 states allowing for the possession and consumption of medical marijuana, CBD is also legal for medical purposes. Sixteen more states have passed laws that, although restrictive, have legalized CBD. In the 6 remaining states—Idaho, South Dakota, Nebraska, Kansas, Indiana, and West Virginia—CBD, THC, and other cannabis extracts are 100 percent illegal (Summers, 2017).

Now that we’ve gotten that part out of the way, how has Megan’s time with CBD been?

Day 1: This was the first day that Megan ingested CBD to treat her anxiety. She writes in the e-mail that she used a vape oil called “FX Chill.” The device she used for ingestion was the high-grade vaporizer Yocan Evolve C. She took two puffs from it at 9 A.M. and pledged to take two every morning and two every night. Instantly, she felt rejuvenated—a little high-strung from the events of the previous day, but much less apprehensive than she would have been otherwise.

Day 2: Megan woke up slightly anxious from what she describes as an “odd dream.” To her surprise, she wasn’t as on edge as she normally is when she wakes up, and her typical anxious symptoms, like heavy breathing and a rapid heartbeat, were absent. She also mentions that she lost the mouthpiece to her vaporizer. Whereas before her anxiety would have snowballed, this time she felt tranquil. “That is abnormal to me,” she writes.

Day 3: This was a Saturday, and Megan felt unusually contented. She worked in the evening and arrived home feeling calm.

Day 4: This day was an emotional rollercoaster for Megan. Apparently, she felt fine for the first half of it, then depressed toward the evening, and better at night. She also felt a tad nervous here and there. Despite this, Megan asserted that she would continue with CBD therapy in hopes of longer-term mood improvements.

Day 5: This day was a Monday, and Megan complained that Mondays are stressful for her because they net the most traffic at her workplace. She still felt calm and collected, and it turned out to be a fine day.

Day 6: Megan explains that Tuesdays are hard because after working 16 straight hours, she refuses to get any sleep. Lacking the patience to rest up, she becomes very tired, which aggravates her anxiety. However, on this particular day all her negative feelings—her depression, anxiety, and apprehension—were absent, and she felt only happy and carefree. She notes, “I am able to experience a glimpse of what life was like pre-onset of my anxiety and that was something I never thought I would see again.”

Day 7: Nothing of much importance happened on this day. Megan went out at midnight to celebrate her friend’s 21st birthday, emphasizing that while she normally feels uncomfortable in social situations of that nature, she felt like she could handle it well.

Day 8: On this day, Megan felt tired and restless but still wasn’t anxious. She worked a run-of-the-mill 8-4 shift and arrived home, relieved to discover that her boyfriend, in a gesture of affection, had completed an assortment of household tasks for her. However, he was troubled by things going on in his life, and Megan did all she could to make him feel better, but nothing worked. Even so, with the help of CBD, she felt more than capable of handling the acute stress associated with trying to console a partner who’s clearly distressed.

Day 9: Here, we start to notice a theme of liberation. Megan once again expresses that she just feels free, like all her troubles are ever present but minimized and less threatening. Later on, however, a rude and obnoxious customer triggered an episode of aggressive anxiety in her. She took a few more puffs of CBD to quell her frustration, but didn’t feel much better afterward.

Day 10: This was a bit of an off day for Megan. Still upset from yesterday, she cried intermittently but was able to pull herself together. In addition, she attended lunch with her dad and dinner with friends, and on both occasions, she drank alcohol.

Day 11: “It was a fairly normal Monday,” Megan writes. She experienced very little anxiety.

Days 12/13/14/15: After forgetting to take CBD on Tuesday and Wednesday, Megan’s anxiety came back in full force, with feelings of extreme sensitivity, despondency, self-loathing, and most of all, doubt about herself and her capabilities. When she resumed treatment late on Wednesday and into Thursday, she was able to get back to living life on her terms, attesting that CBD has worked wonders for her all this time and that she wouldn’t know what to do without it. The quality of her life, she states, has improved dramatically, and she didn’t realize how much better she felt until she missed her doses.

Based on Megan’s feedback, does CBD therapy work? If so, is it in your best interest? I’ll let you be the judge of that.

I would like to thank Megan for opening up a window into her life and allowing me to post this article. It is people like her who remind us that depression and anxiety are not simply character flaws, but rather afflictions that, much like a physical disability, can be treated and coped with. I hope that through sharing her story today, I can lift the stigma off mental health issues just a little bit and encourage my audience to finally request help.

 

References

Butterfield, D. (2017, February 09). CBD: Everything You Need To Know About Cannabidiol. Retrieved October 10, 2017, from https://herb.co/2016/07/26/everything-you-need-to-know-about-cbd/

 

Harding, A. (2009, December 08). Antidepressants change personality, study suggests. Retrieved October 10, 2017, from http://www.cnn.com/2009/HEALTH/12/08/antidepressant.personality.changes/index.html

 

MacLaren, E. (2016, October 06). Xanax History and Statistics. Retrieved October 10, 2017, from https://drugabuse.com/library/xanax-history-and-statistics/

 

Parcesepe, A. M., & Cabassa, L. J. (2013). Public Stigma of Mental Illness in the United States: A Systematic Literature Review. Adm Policy Ment Health, 40(5), 384-399. Retrieved October 10, 2017, from https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3835659/

 

Summers, D. (2017, March 22). Is CBD Oil Legal? Depends on Where You Are and Who You Ask. Retrieved October 10, 2017, from https://www.leafly.com/news/politics/cbd-oil-legal-depends-ask

Are You the Master of Your Fate?

“And this is the nature of the universe. We struggle against it. We fight to deny it, but it is of course pretense. It is a lie. Beneath our poised appearance, the truth is we are completely… out of control.”

– The Merovingian, The Matrix Reloaded (2003)

My History and Systems of Psychology professor posed an interesting question to my class the other week: you arrive home and surprise your girlfriend with a large and expensive bouquet of flowers, but how does she react? Does she run up to you with tears of joy in her eyes and hug you with all the love in the world? Or does she react with scorn, convinced that you’re fruitlessly trying to repair your damaged relationship by presenting something as meaningless as a bouquet of flowers whose colors she finds ugly?

Context could help with predicting her reaction. If the relationship were healthy, she would be more likely to give you all the love she has to offer. Alternatively, if the relationship were on the verge of dissolution, she would be more likely to kick you out of the bedroom and force you to sleep on the couch for the night! And even without context to clarify her reaction, one thing is certain: she is going to react one way or another.

The flower bouquet scenario came up during a class discussion on explanatory reductionism, a set of philosophical ideas stating that all universal phenomena, from the stars in the sky to the deep blue seas, can be explained by breaking them down into readily understandable terms. According to this paradigm, matter is broken down into molecules, which are broken down into atoms, which are further broken down into electrons and a nucleus of protons and neutrons. Reductionists argue that by reducing complexity down to its simplest form, we can understand virtually everything in the universe, including free will.

At the same time, there is something deeply unsettling about explanatory reductionism. If we were to assume, for instance, that free will could be broken down and summed up as nothing more than an illusion the brain constructs for itself to reconcile a startling lack of control, then what does that say about us as a species? If we could use explanatory reductionism as an avenue toward perfectly predicting how our girlfriends would react to an expensive bouquet of flowers, one might argue that it would take away a lot of the luster—the charm and the mystery—that makes life worth living in the first place.

Consider the difficulty of describing a long walk on the beach with a beautiful woman you are maddeningly attracted to, sipping an apple martini and sinking your toes into the sand as a gust of refreshing cool air brushes your face and the sun sets in the distance. However uniquely placid such an experience of consciousness may be, it cannot be reduced to a few paragraphs in a science textbook that you would read once and be tested on the following week. But if a reductionist’s approach can be taken to the stars in the sky, the deep blue sea, and the molecules that constitute matter, what is to stop us from taking it to the mechanisms behind human consciousness and, for the purposes of this article, free will?

Free will has always been a sensitive religious and spiritual subject because people despise being told that they do not have it, and they will go to great lengths to convince themselves that they are somehow special for having it. “Well, of course I have free will. After all, I chose what I wanted to eat for breakfast this morning, and I chose what I wanted to wear to work.” But rarely do they ever stop to consider where their choices come from, and how they are made.

The issue of free will became apparent to me after watching HBO’s critically acclaimed Westworld (2016). In Season 1, Episode 10 (“The Bicameral Mind”), one of the hosts, or synthetic humans, discovers that what she interpreted as an awakening of self-awareness was yet another behavior that somebody had preprogrammed into her. Despite having thought she was in control the whole time, it turned out that her control was simply a string of computer code. Aptly, the host snatched the tablet out of the programmer’s hands and broke it, content with perpetuating the lie that her choices belonged to her and nobody else. Maybe the lead writers of Westworld were trying to communicate something about humanity’s own understanding of power, control, and freedom. If we discovered that our choices were under the control of something more powerful than ourselves, we, too, might react in the same way. As the Architect puts it in The Matrix Reloaded, a movie that’s received heavy scrutiny for preaching determinism every five minutes, “Denial is the most predictable of all human responses.”

Whatever the case may be, I do not argue for free will because if there is no such thing as a “higher” or “lower” species, then there is very little to separate us from a spider, dog, chimpanzee, or rhinovirus. And yet, we never say that spiders, dogs, chimpanzees, or rhinoviruses possess free will—only that there is something special about the human being that endows him or her with the fantastic ability to choose. Why, then, must we always assume that our choices have any more weight to them than those of other species on the planet?

If you look at the animal kingdom, a lion’s life is essentially on rails from birth until death. The lion will hunt, look for food, and procreate, but nothing the lion ever does will deviate much from evolution’s prescription of its behavior. And if we happen to intervene in a lion’s hunt for its prey, we’ll take a step back and affirm that “it’s just nature. Let the lion do its thing,” as if to say that humankind should refrain from obstructing an animal’s lifestyle because, unlike animals, humans benefit from being able to choose everything.

I’m not saying that humans are by and large incapable of choice, because that’s a tough pill to swallow. To a certain degree, denial of the relative impossibility of free will is necessary for maintaining sanity. What I am saying is that many of the decisions we think aren’t automatic actually are, and the work of Ivan Pavlov supports this.

Ivan Pavlov was the first physiologist to systematically uncover the origins of learning. He discovered that dogs conditioned to associate the ringing of a bell with the presentation of a bowl of food would produce significantly more saliva than dogs that had not made the association; even when the bell was rung but the food was withheld, the conditioned dogs would salivate anyway. This was considered a landmark finding because it demonstrated that learned behavior can occur involuntarily when the subject is conditioned to pair a stimulus with a physiological response.
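Pavlov’s acquisition-and-extinction pattern can be captured with a simple learning rule. As a rough illustration (using the Rescorla-Wagner model, a standard textbook formalization that goes beyond what Pavlov himself wrote; the function name and parameter values below are my own choices), this sketch shows the bell-food association strengthening over paired trials and fading once the food is withheld:

```python
# Sketch of classical conditioning via the Rescorla-Wagner update rule:
#   V <- V + alpha * (lambda - V)
# where V is the associative strength between bell (CS) and food (US),
# alpha is the learning rate, and lambda is the reward on that trial.

def rescorla_wagner(trials, v=0.0, alpha=0.3):
    """Return the associative strength V after each trial.

    Each element of `trials` is the reward magnitude lambda:
    1.0 = bell paired with food, 0.0 = bell alone (food withheld).
    """
    history = []
    for lam in trials:
        v += alpha * (lam - v)  # V moves a fraction of the way toward lambda
        history.append(round(v, 3))
    return history

# 10 acquisition trials (bell + food), then 5 extinction trials (bell alone)
strengths = rescorla_wagner([1.0] * 10 + [0.0] * 5)
print(strengths)
```

The curve rises steeply at first and flattens as V approaches 1.0 (the salivation response is fully conditioned), then decays during extinction, mirroring how Pavlov’s dogs eventually stopped salivating when the bell repeatedly arrived without food.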

I remember learning about classical conditioning in high school and thinking about how cool it was to catch myself in the middle of an automatic behavior. For example, I clean my retainer every morning by placing it into a glass cup, filling the cup to the brim with hot water, and letting a dissolvable antibacterial tablet eradicate all of the plaque while I take a shower. One day, I happened to leave my retainer in the cup without cleaning it, so I never stowed it away into its respective plastic case. Before going to sleep, I then reached for the case and opened it, and curiously, it was empty! “Oh… That’s right,” I reminded myself. “I forgot to clean it today.”

A second example of behavioral automaticity from my experience comes from routinely charging my smartphone, as its battery does not last long. I spend a large portion of the day sitting at my desk writing articles, completing homework, and editing YouTube videos, so I look behind me quite often to see if my phone has received any text messages or Facebook notifications. However, when I unplug my smartphone to listen to Pandora and type a paper, I still check behind me for notifications, even when the phone is sitting on my lap. I’m sure you’ve experienced a similar phenomenon: a mini heart attack when you can’t find your car keys, only to discover that they were in the palm of your hand the whole time. How silly of you!

Above were two rather mundane representations of what a lack of free will might look like, but they get you to contemplate the extent to which behavior is rhythmic, especially since most of our days follow a pattern: we wake up, attend to school and work responsibilities, and go to sleep. Everything in between—brushing teeth, eating breakfast, and watching Netflix—is on autopilot. If we make even the slightest alteration to our schedules, like going to the bar instead of watching a show on Netflix, we exhaust many cognitive resources adjusting to it. After a while, however, it just becomes another box that we check off before the next day kicks in, and we don’t give it a second thought.

Next, consider the science of dreaming versus wakefulness as it applies to the debate over free will. We still don’t know exactly how and why we dream, but we can be confident that most vivid dreaming occurs during REM sleep. Dreams have also been speculated to be protective mechanisms against the overwhelming bombardment of stimuli that we take in every day. In other words, the brain staves off information overload by taking these stimuli, including people’s faces, music lyrics, and the smells and tastes of Chinese food, and weaving them into cohesive narratives that the hippocampus goes on to convert into memories. If dreams are an unconscious response to the stimuli that the brain has encoded over the past 8 to 10 hours, then our behaviors are likewise generated in large part by the unconscious mind, and everything we do, from going on a blind date to eating a slice of pizza at two in the morning, is nothing more than a story that’s been written out well in advance but that the brain has to “act out” by virtue of the vast amount of information it sorts through for the sake of its ongoing survival. And if wakefulness cannot function without a certain amount of sleep and dreaming, who is to say that we have any more control over our waking states than we do over our dreams?

Matsuhashi and Hallett (2008) might be able to answer that question. They wanted to measure the lag between when a person becomes consciously aware of the intention to move and when the brain actually begins generating the movement, a process they referred to as “movement genesis,” hypothesizing that if the conscious intention to move is what generates the movement, then movement genesis should occur after the conscious intention, and not before it.

In Matsuhashi and Hallett’s study, participants were instructed to perform brisk finger movements at random, self-paced intervals, without counting the movements they had already made or planning when to make the next ones. They were to move their fingers only after a thought of finger movement had precipitated the action. At random times, a tone was played; if a participant was already aware of an intention to move when the tone sounded, he or she was to immediately cancel any movement thereafter.

A graph of the tones documents two key conditions: (1) tones that arrived before participants were aware of any conscious intention to move their fingers, and (2) tones that arrived after participants were aware of their conscious intentions but too late for the movements to be canceled. One subject showed a lag of about 1 second between movement genesis and his conscious intention to move; that is, the process generating his movement began about 1 second before he ever thought about moving. As such, Matsuhashi and Hallett concluded that movement genesis occurs on multiple levels of the unconscious mind and is not as simple as first thinking about when to move and then carrying out the movement. The evidence indicates that the brain begins generating a movement well before any conscious thought of movement arises.

Obviously, there is more to be said about free will—its philosophy, psychology, and neuroscience—but even with the limited evidence at our disposal thus far, it is clear that there is a lot going on beneath the surface of every decision we’ve made and are going to make.

Maybe our minds just have minds of their own.

 

Reference

Matsuhashi, M., & Hallett, M. (2008). The timing of the conscious intention to move. European Journal of Neuroscience, 28(11), 2344-2351.

Video: Why We Are Already Living in the Apocalypse: A Walking Dead Video Essay – Part 5 (Strength)

Here is Part 5 of my five-part Walking Dead video essay.

Author’s note: I am so happy that I finished this project. After five long months of writing, recording, rerecording, editing, and rendering, I have created a Walking Dead video essay that runs over 60 minutes. No other video essay on YouTube is that long, at least not to my knowledge.

As part of a celebration for Walking Dead’s 100th episode milestone, I will be releasing a Definitive Edition of the video essay on October 22nd, the day of the Season 8 premiere.

Are You Only 20 Percent Effective?

Self-discrepancy theory states that our selves, or the core understandings of our identities, are split into three components: the ideal, ought, and actual selves. The ideal-self is the person we aspire to be, the ought-self is the person we believe we have a duty or obligation to be, often shaped by what others expect of us, and the actual-self is the person we actually are. The theory was developed by Edward Tory Higgins in 1987, and since then much research has been aimed at identifying how the three selves exist relative to one another and which of them is the most predominant. Not surprisingly, the actual-self dominates the other two.

Self-discrepancy represents a conflict that rages inside our own heads—and between us and our partners—every day. My spouse wants me to stop smoking, but cigarettes are the only things that remedy my stress. I know I should lose weight, but food tastes too damn good. My father wants me to become a doctor, but I’d rather be a pilot. I want to do well on that exam, but I’m too lazy to study for it. The list goes on. Do any of these conflicts sound familiar to you?

I am a classic example of a self-discrepant person. Need proof? I know that I should completely cut alcohol out of my life because it’s hazardous to my organs, but I still enjoy the occasional drink after a long day at work, or with a good friend. I know that I should revamp my diet because I consume too much grease and am probably clogging my arteries, but I can’t stop eating hamburgers, pizza, and pasta. I know that I should stop playing video games so often and start devoting more time to pursuing a career in psychology and expanding my reservoir of knowledge for the sake of it, but I love grinding my character in Destiny. And finally, I know that I should advertise my YouTube videos to stimulate viewership and conduct research on the stock market to make informed decisions on my investments, but I don’t care enough to do either of those things, so why even bother?

And yet, by sitting around and waiting to take initiative, I am actually doing more harm than good to myself. By continuing to drink alcohol, I am further damaging my organs. By continuing to consume greasy foods, I am routinely putting myself at risk for heart disease. By continuing to play too many video games instead of pursuing a career and expanding my base of knowledge, I am setting myself up to live with my parents until I’m 40 and making myself stupider. And by not advertising my YouTube videos and investing in the stock market, I am wasting the time I spend producing the videos in the first place and losing money.

If I were a truly self-sufficient person, I would write 5 of these articles a week instead of just 1 every other week. I would market my YouTube channel 8 hours a day to maximize audience retention and engagement, and I would release at least two high-quality, 30-minute video essays a month. I would quit my weekend job at the local supermarket and find a better one. I would practice meditation to manage my emotions more effectively. I would go to the bar to talk to women and come out of my shell. I would address every single criticism that I’ve ever had, or currently have, of myself—and then some.

The fact remains that I’m not 100% self-sufficient. Most of the time, I’m 10-15% self-sufficient, and 20% on a good day. That’s not very… sufficient of me.

Can you imagine where humanity would be today if it utilized 100% of its potential? We probably would have cured every known disease, colonized the galaxy, and transcended space and time itself. But we know that human beings are not THAT perfect. How could they be? They’re notoriously flawed creatures. We’ve accomplished many great things, but only to a certain degree. We still quibble amongst, and go to war with, each other, we still haven’t cured some of the most deadly diseases, and we still haven’t traversed and uncovered the secrets of the far reaches of the galaxy. At least we invented the fidget spinner and sliced bread.

Perhaps our aggressive laziness stems from our propensity to favor pleasure over self-improvement. The human brain is largely rewarded through instant gratification, not through the evaluation of long-term consequences. Given the proper time and training, it can learn to delay gratification in the interest of longer-term goals, but for the most part, it demands to be rewarded instantly and without obstruction. This explains why there are alcoholics, pornography addicts, and obese people—if they really wanted to improve themselves, they would’ve done so a while ago.

Self-discrepancy seems to be a conflict that arises from incessant instant gratification. In essence, we weigh the amount of pleasure we can derive from any given activity (e.g., playing a video game, partying, or reading a textbook) against whether that activity benefits our well-being, and almost always, our hedonistic instincts kick into overdrive.

So what can YOU do to reach your potential? Close the gaps between your ideal, ought, and actual selves as much as you can. I’m not saying that I’ve already done so myself; it’s a conflict I struggle with every day, but I have become more aware of it.

While you’ll never reach your full potential, you can come as close to it as possible. And that’s about the best you can do for your short (and sometimes miserable) time on this God-forsaken floating rock.

Can We Look Up to Fictional Role Models?

“Simply put, there’s a vast ocean of shit you people don’t know shit about. Rick knows every fine grain of said shit… and then some.”

– Abraham Ford, The Walking Dead

AMC’s The Walking Dead is one of my favorite television series, slated to return in October 2017 for its eighth season and whopping 100th episode. I adore the show not for its graphic depictions of gore and violence, but for its thoughtful illustrations of the sociology, psychology, and politics of the zombie apocalypse. In fact, I love The Walking Dead so much that I dedicated this entire past summer to creating a video essay for it, arguing that we’re already living in the apocalypse by discussing issues of power, sanity, philosophy, community, and strength in the context of AMC’s highest-rated series. You can find Parts 1 through 4 on this blog, and right now I’m working on Part 5 and a “definitive edition” to celebrate the show’s 100th-episode milestone.

As much as I commend The Walking Dead, I won’t overlook its flaws. Many of the characters are just plain weak and uninteresting (e.g., Daryl Dixon), with a few exceptions such as Carol, The Governor, Gareth, Morgan, King Ezekiel, and Negan. In addition, the show’s writing is at times shaky and questionable, with the more recent seasons characterized by four great episodes, four good episodes, and another eight episodes of pure filler—you can thank Screen Junkies on YouTube for that observation.

One thing that I will never criticize The Walking Dead for, however, is giving me my first TRUE role model to look up to: Sheriff Rick Grimes.

Rick Grimes has seen it all. He’s transformed from a small-town cop into the leader of The New World: calloused, exacting, and most of all, uncompromisingly tenacious. But Rick’s lived a hard life these past couple of years: he’s killed his best friend, grieved over a wife who died in childbirth, lost places he called home, faced betrayals and double-crossings, and witnessed two of his closest friends get brutally beaten to death by a sociopath with a baseball bat wrapped in barbed wire. Rick has even done things he’s not so proud of, killing people in cold blood to safeguard his group. Whereas other characters might have been rendered permanently insane by such experiences, Rick has always come out on the other side, more vigilant than before the world went to Hell.

Given Rick Grimes’s attributes, it’s no wonder he’s my role model, but the notion that “fictional characters are ineligible to be role models” is a myth. For a fictional character to exist in the first place, he, she, or it has to come from somebody’s mind. In other words, somebody, usually a professional writer, has to imbue a character with the values, morals, beliefs, and personality traits that justify said character’s behaviors and underlying motivations. Some characters can even reflect the writers who created them. For instance, Rocky Balboa’s identity crisis in Rocky II (1979) is said to reflect Sylvester Stallone’s own struggles in dealing with fame and finding a voice (Rockall-Schmidt, 2017). As such, you can imagine why audiences grieve over the death of a beloved character in a television show or movie franchise—their identities may become so inextricably tied to the character that they feel “chipped away” in the character’s absence.

I’ve struggled to come to terms with character deaths on a couple of occasions. Fear the Walking Dead (2015) is a classic example. Travis Manawa, a school teacher and my favorite character, was set up for an interesting arc at the end of Season 2, (*SPOILER*) brutally beating the hell out of two men responsible for inadvertently causing his son Chris’s death. However, Cliff Curtis, the actor who played Travis, was cast as the main villain in the upcoming Avatar sequels prior to the principal photography of Season 3, so the writers had to write his character out of the show by abruptly killing him off in episode 302 (“The New Frontier”). Since then, I’ve grown increasingly uninterested in the direction of Season 3, having found it difficult to identify and empathize with the new lead character, Madison.

I was under the impression that Travis Manawa would be the Rick Grimes of Fear, not Madison, Travis’s girlfriend. And my problem with Madison has nothing to do with her being a woman. Rather, she’s bland, boring, and generally not a suitable replacement for Travis. Rick Grimes will always be my #1.

But why do I hold Rick in such high esteem? In short, he’s experienced so much pain and loss in a short period of time, yet repeatedly come out stronger for it. I figured, then, that perhaps I could follow suit, for one day I will lose someone or something very dear to me—just as Rick lost his wife and the Prison. But that won’t be enough to stop me, because even when my life is shattered into a million pieces, I’ll somehow put them all back together again.

I don’t want to be weak. I want to be strong like Rick Grimes. And if you’ve been paying attention, that’s really what this blog is about.

 

Reference

Rockall-Schmidt, G. [George Rockall-Schmidt]. (2017, August 19). How The Rocky Films Changed Over Time [Video]. Retrieved from https://www.youtube.com/watch?v=mKTmkLvESI4

Why American Education Is Destined to Fail

“The mitochondria is the powerhouse of the cell.”

– Philip Siekevitz

A popular Internet meme holds that adulthood is like skipping the tutorial section of a role-playing video game and then having no idea what you’re doing as soon as you first step out into the world. For example, after creating my character in Blizzard’s MMORPG World of Warcraft (2004), I recall struggling to figure out where to go next, or even how to advance the storyline. I would see higher-level players glide past me on their fancy mounts in their badass-looking armor and wonder how they had become so powerful. Surely, they must’ve had access to some esoteric knowledge of character progression, right?

Like adulthood, character progression in video games is a never-ending process, not a product. It’s about the journey you must undertake to acquire a fancy mount and badass-looking armor—not about the fancy mount and badass-looking armor in and of themselves.

I never stuck with World of Warcraft, but I did transition to Bungie’s shared-world shooter Destiny in 2014. Destiny, I think, was addictive for many of the same reasons WoW was: it was a loot-based role-playing video game with a heavy emphasis on teamwork and cooperation. Above all else, Destiny was about “becoming legend,” or grinding for a long period of time to create the most powerful character imaginable.

Although plagued by inexcusable problems at its launch, such as a fundamentally incoherent story mode, repetitive gameplay, and a broken reward system, Destiny has been renowned not only for its ability to cultivate relationships between people halfway across the world from each other, but also for its longevity. Whereas player populations in most games drop off after a year or two, Destiny maintains a strong and loyal following, with players to this day grinding to reach the maximum light level despite knowing that all the hard work and energy they’ve expended will be nullified as soon as the next expansion or full sequel is released. Can you imagine, then, how quickly the Destiny community would have disbanded if you could max out your character within the first few hours of the game?

The answer to that question points to part of why the American education system is faulty. Just as video games with weaker communities operate under the assumption that character progression can be completed in a matter of hours, the institutions that teach our children assume that learning can be completed in a matter of years. That is, education begins in preschool, continues through elementary and middle school, and ends in high school, when in reality, education begins after we’ve received our diplomas and continues until we die.

Another point to consider: inadequate education is no different from skipping the tutorial section of a video game in that it fails to teach the skills we must practice before striking out on our own, so as soon as we reach a point where we’re required to be independent (say, age 20 or 21), we suffer tremendously. That might seem cynical, as if I’m suggesting that the first 18 years of life are a waste of time, but it’s quite the opposite: there is SO much to learn beyond high school that it’s actually very exciting. In fact, there is so much to learn beyond high school that it would be impossible to learn it all in a hundred lifetimes.

None of this is to say that you should downplay education, but you should question its value. For instance, my preliminary education holds a special place in my heart for teaching me the rudiments of reading and writing, but I will always criticize it for failing to teach me how to pay taxes, manage expenses, do the laundry, and cook a meal, and, at a more advanced level, for failing to teach me how to remedy negative emotions, form and sustain a meaningful relationship, and make a sensitive decision with long-lasting ramifications.

Of course, it’s good to be able to write an essay about Shakespeare’s Romeo and Juliet and apply the Pythagorean theorem, but what practical applications do these principles have in the wider context of my life, and the trials and tribulations I will face along its inevitably perilous course? Few, probably. Even worse, school teaches us what to think, but never how to think. This became evident to me while taking an English examination in high school, in which I was instructed to complete a multiple-choice section after reading a short story whose language was bewilderingly antiquated. Why should I be conditioned into thinking that there is a single “right” answer to interpreting classic literature when classic literature is supposed to invite multiple interpretations?

Sometimes I feel there are puppeteers dictating the contents I should and shouldn’t put into my mind, or balding old men in slick suits who’ve written the exams I’ve taken all my life—men who think that math equations are supposed to teach me how to solve a personal problem, and that poems and short stories from three centuries ago will give me a deeper appreciation for works of fiction. And yet I’ve actually learned more from “turning my brain to mush,” watching television shows and reading Wikipedia articles.

And that, you could say, is why American education is destined to fail. It prescribes knowledge that instruments of institutions “think” you should (or ought to) have once adulthood is on the horizon, but it doesn’t give you the tools you need to transition into adulthood, and navigate it successfully.

So you know what I do? I study EVEN when I don’t have a test to take the following week. I learn new things not as part of an endless, narcissistic pursuit, but as part of a contingency, so that in the event my car breaks down in the middle of nowhere, I won’t just solve a silly math equation. I’ll repair the damn thing and make it to my destination.