Category: Column

During my Crisis, before watching Güeros, I watched Parks and Recreation… like, all of it. All seven seasons. Before that, I watched the first five seasons of Futurama. Before that, Breaking Bad, Orange Is the New Black, Love, and every Best Picture winner since 1939 (minus a few bad eggs, not least 2005’s Crash—c’mon, give me some credit!).

For all my talk of the “primordial power of cinema” in my last column, I would be remiss—and indeed, quite hypocritical—if I failed to acknowledge cinema’s second primary function, born of its allure as spectacle: cinema as a medium for entertainment.

Indeed, cinema has always possessed a two-fold functionality—as emotional therapy and as spectacle—which became apparent immediately following film’s inception in the late 1890s. This dichotomy is most obvious between the two towering French pioneers of that early era: Auguste and Louis Lumière (the Lumière brothers) and Georges Méliès. Today these two are widely considered the “founding fathers” of cinema, though their bodies of work could not be more antithetical.

For Siegfried Kracauer, one of the most prominent figures in film theory, this opposition highlights what he famously termed the two “tendencies” of the cinema: the realistic and the formative.

The realistic tendency was first exemplified by the Lumière brothers’ actuality films. Their most famous work, Workers Leaving the Lumière Factory in Lyon, shows exactly what the title suggests—within the span of a then-whopping forty-six seconds. The Lumière brothers were interested in, above all, capturing “everyday life after the manner of photographs.” In other words, the realistic tendency strives to capture (or replicate, through staging) the “nakedness” of life, in the style of, say, a documentary.

By contrast, the formative tendency aims to go beyond the replication of physical reality, an aim that demands cinema-specific techniques (special effects). Méliès employed these techniques more adventurously and innovatively than any other filmmaker of his time. The popularization of now-universal techniques such as time-lapse photography, dissolves, and hand-painted color can all be attributed to him. His legacy as the founder of cinema as a “fantastical art” continues today, most recognizably in allusions to the iconic, anthropomorphic moon of A Trip to the Moon (1902).

Although most films clearly display an overlap between these two tendencies, Kracauer’s teachings have nevertheless continued to serve as a useful starting point for many a timid freshman entering the daunting realm of film theory for the first time. All subsequent cases for a “purpose of cinema” tend to fall within Kracauer’s rough outline of these two core functions: cinema as verisimilitude, and cinema as spectacle.

Most modern audiences would agree that the best works of cinema strike a harmonious balance between the two. Films at either end of the spectrum do not hold the attention of mass audiences for very long. If anything, a quick look at any recent “highest-grossing films” list will show the People’s obvious predilection for spectacle. Films like Captain America: Civil War, Finding Dory, and Rogue One: A Star Wars Story all feature fantastical worlds narratively rooted in and motivated by traditional, realist plots—realist insofar as they thematically mirror the plights of our own modern world, whether at the level of the individual, the community, or, as in Civil War’s case, the nation. Such films succeed in being both awe-inspiring and emotionally satisfying. On the flip side, this same predilection for spectacle means studios will blatantly abandon substance by backing projects that rely on spectacle alone (Suicide Squad, Batman v. Superman, The Legend of Tarzan, etc.).

(Note: the ongoing success of this era’s Golden Age of Television indicates an audience leaning towards verisimilitude, seemingly contradicting my observations thus far. However, one must take into account the different nature of the TV show, which is fundamentally distinct from that which can accurately be deemed “cinematic”—but all this for a later time.)

Although it’s safe to say that the average Columbian is more cultured than the average person, it’s also probably safe to say that, outside of film majors and cinephiles, the average Columbian isn’t as well-versed in film as in, say, literature, art, or music. The reason for this is obvious: much of the Columbian’s expansive cultural lore can be attributed to our beloved Core Curriculum, which sadly does not include a “Film Humanities” course.

Attempting to coin a term like “Film Humanities” might seem preposterous and naïve at the outset, but such a negative reaction is unwarranted, as it is probably based on one (or both) of two fallacious assumptions:

  1. Film is predominantly a “spectacle-based” art, unqualified for the kind of rich and complex analyses other arts tend to incite.
  2. Film is too young an art form and lacks the historical breadth necessary for making any substantial claims about the human condition that are worth investigating in a scholarly fashion.

To the first: we have already discussed film’s two-fold capacity for realism and spectacle, which implies that there exists a whole canon of films predominantly concerned with verisimilitude, with subject matter relevant to the human experience. The “spectacle-based” argument reflects a biased account of cinematic history, whereby the Digital Age at the turn of the millennium all but ensured that film as a “fantastical art” would be the way of the future, rendering all previous cinematic periods obsolete in the public eye.

I would also add that to reject “spectacle” outright, as an element that abolishes any degree of humanities-based discourse, is equally erroneous, for it fails to take into account the vast and rich spectrum of genre variations within the real and the fantastical (e.g., Ontological Realism, Psychological Realism, Aesthetic Realism; see Bazin’s “The Evolution of the Language of Cinema”)—a spectrum evident in literature as well. Consider, for example, the tremendous difference, from a genre standpoint, between Homer’s The Iliad and Woolf’s To the Lighthouse, both of which are required readings for Literature Humanities.

I will counter the second point in a later column, as it deserves its own thorough investigation.

For now, I encourage all Columbians—especially those for whom “cinema” is tantamount to “that which is relevant to the current cultural zeitgeist”—to voluntarily explore the history of cinema with the same seriousness the Core grants the other, more “noble” arts.

To begin, this will require a “survey of the greats,” for which I urge you to temporarily put your beloved Netflix/Hulu/Amazon Video TV show on hold and direct yourself to filmstruck.com, where you can subscribe for a two-week trial. This should be enough time to at least begin exploring the list I have curated for you below. (And if it’s not, you can use this website to see which other platforms offer these films.) All of the following works share a “crossover” appeal (a gateway into “artsy” film) that I hope to instill in all you soon-to-be cinephiles.

  1. The Red Balloon (1956), Albert Lamorisse.
  2. Y Tu Mamá También (2001), Alfonso Cuarón.
  3. Blue is the Warmest Color (2013), Abdellatif Kechiche.
  4. In the Mood for Love (2000), Wong Kar-wai.
  5. A Woman Under the Influence (1974), John Cassavetes.
  6. Aguirre, the Wrath of God (1972), Werner Herzog.
  7. Three Colors: Red (1994), Krzysztof Kieślowski.
  8. The Great Beauty (2013), Paolo Sorrentino.
  9. The Spirit of the Beehive (1973), Víctor Erice.
  10. Seven Samurai (1954), Akira Kurosawa.

Enjoy.

P.S. Here is my favorite reference to Méliès, from Martin Scorsese’s Hugo (2011).

After twenty-one years of a beautifully reciprocal relationship, Television and I have hit a rough patch. What have I done to deserve this? For years, I have given him every spare minute of my time, turned to him in my hour of need, loved him unconditionally and completely. Our relationship was always new and refreshing, and every time I thought he had begun to take me for granted, he’d surprise me with an incredible new show and remind me why I loved him. But things have changed. This fall’s enormous lack of good television has left me brokenhearted, alone, and rebounding with not-so-good-for-me-but-incredibly-enticing Netflix.

Network television’s fall premiere lineup seemed promising. There were the obvious shoo-ins in the shape of returning series: Season Two of This Is Us, more Brooklyn Nine-Nine, and cult favorites like Empire, Scandal, and Supernatural. Personally, I couldn’t wait to find out more about This Is Us’s Pearson family and Andy Samberg’s character’s stint in jail on Brooklyn Nine-Nine, and I eagerly counted down the days until The CW’s Jane the Virgin and Crazy Ex-Girlfriend returned to prime time. I was excited to return to the ins and outs of Firehouse 51 in Chicago Fire and even willing to give Designated Survivor’s sophomore season a chance. But they all let me down.

This Is Us has gotten so predictable that even the background music seems cliché. Rachel Bloom’s strong feminist character in Crazy Ex-Girlfriend is suddenly acting like an immature child. Jane the no-longer-virgin has a boring new love interest (I told you Michael was the heart and soul of that show), and nobody significant has been killed off Chicago Fire since Season Two (it’s now entering its sixth season), which makes the whole “will they survive?” vibe kind of ridiculous. To top it all off, Designated Survivor is no longer focused on the survivor, and Andy Samberg turns out to make a horrible convict.

Desperate to salvage my relationship, I turned to new premieres. ABC’s The Good Doctor had a fascinating premise (it’s about an autistic surgeon), but after the first episode I was already bored with every character other than the main one. Adam Scott and Craig Robinson’s new comedy Ghosted felt like an even less funny version of Men in Black (and MIB isn’t even a comedy…), and Daveed Diggs’s The Mayor is cute, but nothing to write home about.

So here I am: bored with TV, hoping for a better mid-season lineup, and watching Gossip Girl on Netflix to pass the time.  

So… it’s up to you now, Television: Woo me. Here’s to hoping we’ll rekindle our love in the winter.

Illustration by Laura Elizabeth Hand, CC’19

Why are we all so unsatisfied? It’s both an existential and a practical question, facing down administrators at colleges across the country. As diagnoses of mental disorders have skyrocketed and a palpable aura of discontent has begun to seep into millennial spaces, especially the college campus, most experts are left wringing their hands without explanation. While hundreds of think pieces have been written about the existential dread of the modern world, very few have wondered whether our brains themselves may be incompatible with the society we’ve made.

To understand where the disconnect between our brains and our society comes from, it’s worth focusing on biology. Humans have evolved complex brains over millennia to do one thing better than any other species: reduce uncertainty. We do this by predicting the future based on our past experiences, and then adjusting those models when they’re wrong. We call this process learning.

The neurons in our cortex and hippocampus, two areas essential for learning and prediction, are especially wired for these tasks. These neurons have two kinds of channels at their synapses that bind to glutamate, the primary excitatory neurotransmitter in the brain. The simpler channel (the AMPA receptor) opens up whenever glutamate is around, causing quick but fleeting pulses of activity. The more complex one (the NMDA receptor) needs a lot of glutamate to open, but when it does, it triggers a host of structural changes in the neuron that make it more responsive in the future.

This process is called long-term potentiation (LTP), and it is the molecular basis of learning from sea slugs all the way up the food chain to Homo sapiens. But one innovation made by mammals is the addition of dopamine to the picture. For us, whenever something turns out unexpectedly better than our predictions, our brains send those neurons a pulse of dopamine. This cements the molecular changes of LTP on a scale of weeks to months and makes sure the association is learned.

This process was ideal for the hundreds of thousands of years humans spent as hunter-gatherers. We lived in a much more uncertain world, where many of our predictions were wrong and small unexpected pleasures (such as finding berries where there were none previously) abounded. Our brains would frequently receive small pulses of episodic happiness through dopamine. Learning based on rewarding prediction errors to motivate similar behavior in the future works only when those prediction errors are common.
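To make that logic concrete, here is a minimal sketch of reward-prediction-error learning in Python, in the spirit of the classic Rescorla-Wagner delta rule. It illustrates the general idea only, not a model from Sterling’s work; the berry-patch scenario, the update() helper, the learning rate, and all numbers are hypothetical.

```python
# A minimal sketch of reward-prediction-error learning (the Rescorla-Wagner
# delta rule). Purely illustrative; the scenario and numbers are hypothetical.

def update(expectation: float, reward: float, learning_rate: float = 0.2):
    """One learning step: compute the surprise, then nudge the expectation."""
    prediction_error = reward - expectation  # the dopamine-like "surprise" signal
    expectation += learning_rate * prediction_error
    return expectation, prediction_error

# Expected reward at a berry patch, starting from no expectation at all.
expectation = 0.0
for visit in range(1, 16):
    expectation, surprise = update(expectation, reward=1.0)
    if visit in (1, 5, 15):
        print(f"visit {visit:2d}: expectation={expectation:.2f}, surprise={surprise:.2f}")
```

The point of the toy loop is the decay: once the berries are reliably there, the surprise term (and with it the dopamine pulse described above) shrinks toward zero, because a fully predictable world leaves no prediction errors to learn from.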

While the world may still seem uncertain existentially, in the most basic of ways it is far more predictable than it used to be. That lack of constant, small pulses of dopaminergic input may be the root cause of many modern issues. When most of our material comforts are taken care of by technology, we turn elsewhere to find the hits of dopamine our brains are wired to crave. Whether through an alert notification on our phones, a pint of ice cream, or a forced dopamine rush from alcohol, opioids, cannabinoids, or other substances, we increasingly engineer artificial means of dopamine release — sometimes to addictive and destructive ends.

So what can be done to alleviate the issue? Instead of turning to massive, artificial methods of dopamine generation, re-introducing small and, more importantly, surprising pleasures into your life can give a brain evolved to learn through unpredictability the reward prediction errors it needs. Eat a new food, explore a new place downtown without an agenda, have a conversation with someone unexpected. Our intelligence is how we’ve made it this far — maybe we can think our way out of this one.

 

Full credit for this conceptualization goes to Peter Sterling. For a more detailed elaboration on this idea, I wholeheartedly recommend reading his essay “On Human Design” or his book Principles of Neural Design, specifically Chapter 14.

Uniquely Human is written by Heather Macomber and runs every other Monday. To submit a comment/question or a piece of your own, email submissions@columbialion.com.

Photo from the 2014 film Güeros.

There is a moment in Alonso Ruizpalacios’s 2014 film Güeros that has stayed with me since my first viewing: following a confrontation with an angry neighbor, the film’s trio flees the scene by car, and Sombra, the protagonist’s older brother, lies in the backseat undergoing what is clearly meant to be an audiovisual representation of a panic attack.

The scene owes much of its haunting memorability to its experimental soundtrack. A selection of ambient sounds—an eerie screech, a low rumble, and an incessant beep—intensifies in sync with Sombra’s deteriorating mental condition, blurring his vision and muting the pleading voice of his younger brother (shown above) until his whole existence is reduced to the mere sound of frantic breaths against a backdrop of perilous sonic waves that threaten to overtake him.

The reason this scene continues to leave such a lasting impression on me is simple: I, too, suffer from anxiety, and the scene’s mise-en-scène (everything that physically appears before the camera) blends seamlessly with the avant-garde dreaminess and apprehension of the score to produce a convincing, unified reproduction of my mental affliction.

In fact, when I first saw this film two years ago, I was in the midst of my own personal Crisis. It began seconds after I realized it was mathematically impossible for me to pass one of my CS classes and that, consequently, I would be unable to graduate from Columbia within the traditional four-year span. Suffice it to say, this colossal failure (“colossal” insofar as it was the only notable one in my life thus far) amplified my anxiety-inducing imposter syndrome to the point where I physically couldn’t leave my room; the specifics of what followed, however, are for another time.

For now, I wish to briefly ruminate on one of cinema’s most sacred, primordial powers, illustrated by the aforementioned example: its ability to instill in the viewer catharsis (Greek: “katharsis,” meaning “purification” or “cleansing”) through poignant verisimilitude, especially as it relates to life’s immanently tragic nature.

As Aristotle teaches us in his seminal work on tragedy, the Poetics, this experience is marked by a profoundly satisfying purgation of “negative” emotions, especially fear and pity. In the end—if all has gone well—the viewer reemerges with the consoling reaffirmation that, despite their misfortunes, they will be able to cope nonetheless; in other words, that everything will end up okay.

But, on a more primal level, why do we experience catharsis at the movies at all?

Here it is helpful to quote the German Continental philosopher Hans-Georg Gadamer, best known for his 1960 work on hermeneutics, Truth and Method, in which he writes:

“What is experienced in such an excess of tragic suffering is something truly common. The spectator recognizes himself and his finiteness in the face of the power of fate…To see that ‘this is how it is’ is a kind of self-knowledge for the spectator, who emerges with new insight from the illusions he, like everyone else, lives.” (132)

The first step in reaching catharsis, “recognition,” is not to be misunderstood as something immediate, for it is the process by which the artist aims to get the viewer to empathize with the protagonist on at least some level (this implies neither likeability nor relatability—think Walter White from Breaking Bad). Neither should it be seen as “contextual” recognition: after all, who else has ever found themselves literally trapped by a boulder in a remote slot canyon in southeastern Utah (127 Hours)? The recognition, then, is a thematic one: to use the previous example, the viewer is familiar with the general feeling of being suddenly pitted against a formidable obstacle which, despite one’s initial unpreparedness, will come to test the limits of one’s resolve.

The instant of catharsis occurs when the character’s suffering reaches its crescendo, because it is here that the “power of fate” is most viscerally felt. Because we have been emotionally “led on” by the artist, the character has become us in an abstract sense, so that their trials and tribulations are likewise our own. Hence, we too are subjected to the great emotional weight of intense suffering when the crescendo arrives. It is here that recognition realizes its consummate form as an utterly affective phenomenon.

It is the aim of the artist to lead the viewer to this step of “affective immersion,” without which the next step is not possible: the acquisition of what Gadamer terms “self-knowledge,” or “new insight.” This is the most important stage of catharsis, for it is here that art fulfills its primordial power: the viewer can now walk away with a rejuvenating, newfound emotional clarity. All that is left is the dissection of this clarity and the study of its personal implications.

For me, after watching Güeros, this meant sitting in shock for several hours, letting the weight of time crush me as I slowly accepted the terrifying reality of my situation: I was having a chronic panic attack, fueled by a feral wave of anxiety, caught up in a truly desperate situation with no end in sight.

Finally, cinema’s greatest gift: the capacity to incite radical change in the viewer, for the betterment of his or her situation, or those of others.

For me to get up and say:

“Hm…Maybe it’s time I got out now.”

 

The Seventh Art is written by Juan Gomez and runs every other Sunday. To submit a comment/question or a piece of your own, email submissions@columbialion.com.

Photo Courtesy of the Vanishing Point Chronicles

Mid-October marks many things in a college student’s life. It’s the beginning of midterms, the end of the beginning-of-semester haze, the hangover from homecoming, the warm weather’s slow abandonment. We desperately begin to count down to Fall Break, but the wait seems impossible. In this hour of need, what else but film can lift our spirits? What films and shows can we turn to?

Fall TV premieres are slowly trickling in, but for immediate therapy, check out this summer’s best premieres and releases:

  1. Dunkirk

Only Christopher Nolan can write a 70-page screenplay, cast Harry Styles as the most talkative character, and then insist that his film be shown in 70mm in theaters across the US. And only Christopher Nolan can turn all of that into a smashing success. Based on a true story, Dunkirk is not only the most visually stunning film you’ll see this year, but also the most enthralling. Commonly mislabeled as a typical war movie, it really can’t be described to someone who hasn’t seen it. What Nolan has created is a plot with twists and characters unlike any you may be familiar with. And that’s precisely what makes it so great.

  2. The Big Sick

I don’t think I’d be able to count the number of times I burst out laughing while watching Kumail Nanjiani’s debut feature film. A movie based on Nanjiani’s own love story, The Big Sick was the romantic-comedy version of Dunkirk. Nanjiani refuses to conform to the tropes that often plague the genre and instead infuses this story, which isn’t really about romance at all, with an incredible sense of humor and relevant social commentary. This innovative story, combined with Ray Romano’s adorably dopey performance as the girlfriend’s dad, catapults The Big Sick to the top of the romantic-comedy heap.

  3. Spider-Man: Homecoming

If you’re only planning on watching one of this summer’s blockbuster superhero hits, skip Gal Gadot’s overrated Wonder Woman for Tom Holland’s stellar performance in Spider-Man. Sure, Wonder Woman broke a glass ceiling, and it’s great that a woman superhero is getting her chance to shine, but amidst the massive glut of superhero movies, Spider-Man returns to the genre’s roots. Unlike Wonder Woman and other recent films in the genre, Spider-Man is light and funny, and it finally feels like the movie-for-all-ages that superhero films promise to be. Holland’s character is indeed “super,” but he’s also relatable, and I found myself rooting more genuinely for him than I had for any Marvel or DC character in a long time.

  4. The Handmaid’s Tale

If you don’t want something dark, don’t watch The Handmaid’s Tale. But if you want to experience television’s most thrilling and thought-provoking series of the summer, it may be worth it. Based on the novel by Margaret Atwood, The Handmaid’s Tale is set in a dystopian future America in which women are forced to return to domesticity. Our protagonist, played by Elisabeth Moss, is chosen as a breeder, and while her performance is outstanding, nothing can prepare you for the chills that will run up your spine when Yvonne Strahovski’s and Ann Dowd’s characters come on screen. In fact, nothing can really prepare you for the show as a whole, so I guess you’ll just have to watch it yourself.

  5. Unbreakable Kimmy Schmidt

I know I’ve spoken about this show before, but in this season Kimmy attends Columbia, and her observations are so spot-on that it should probably be required viewing for incoming first-years. Although they filmed at UTS and not Columbia, the Kimmy Schmidt creators somehow found a way to harness the culture of Columbia, stress levels and all, in a wonderfully concocted season of puns, social commentary, and Hamilton’s Daveed Diggs. Even if you haven’t watched the first couple of seasons, season three is worth your time. Maybe use it as a study break when you’re up late in Butler, and perhaps take Kimmy’s advice when she tells you there’s more to life than studying.