
In a footnote from her essay “Against Interpretation,” Susan Sontag refers to film as a “subdivision of literature.” Now, I have never been one to uphold any kind of “hierarchy of the arts” (of what use would this be anyhow?), but I am interested in the relationship between different artistic mediums and, in particular, as Sontag describes, that between film and literature. “Subdivision of literature” suggests literature as a kind of umbrella term encompassing film within its greater arena, as opposed to, as one might have intuitively supposed, two separate subsets within the greater arena that is “art.” Furthermore, the phrase precludes the opposite (“literature as a subdivision of film”) from being true. What is it, then, that makes literature more “all-encompassing,” and what does it mean for a film to be “literary”?

An examination of “Godard’s Vivre Sa Vie,” Sontag’s essay on the French filmmaker’s fourth film, about a struggling-artist-turned-prostitute, will prove useful here. In the essay, Sontag points out two general tendencies of the artist: the tendency toward proof, characterized by an emphasis on considerations of form, and the tendency toward analysis, which is more akin to fruitless “rambling” within a work, as the artist chases after the “infinite angles of understanding.”

As you might have guessed, Sontag favors the former, insisting that “In great art, it is form—or, as I call it here, the desire to prove rather than the desire to analyze—that is ultimately sovereign. It is form that allows one to terminate.” Thus, it is characteristic of great art to contain “endings that exhibit grace and design, and only secondarily convince in terms of psychological motives or social forces.” Vivre Sa Vie is therefore “literary” in the sense that, as in all great literature (Sontag names Shakespeare’s early comedies and Dante’s Divine Comedy as paragons), at play is a predominant concern with proof—as opposed to analysis. The term “literary,” used to describe film, is thus a bit of a misnomer on Sontag’s part, as it might suggest the presence of qualities intrinsic to literature, whereas all she is referring to is that which defines good art, within any medium. For Sontag, this means the artist emphasizes the formal: that is, they include a conspicuous element of design (symmetry, repetition, inversion, doubling, etc.).

Sontag’s insistence on form strongly reminds me of my Art Hum instructor, Daniel Ralston, who would call us out whenever we responded to a painting with such platitudes as “I think the three birds represent the Holy Trinity” or “The expression of the left-most figure is one of intense melancholy”—statements that would no doubt have gone unchallenged (perhaps even praised) in some of my previous Core classes. For example, during my Literature Humanities course several years ago, a full hour was once spent on a Freudian analysis of Woolf’s To the Lighthouse (which, unfortunately for me, I consider to be one of the most beautiful novels of all time). Ralston would often respond to these comments by saying, “Yes, but what about formally—for example, what can you say about the composition?” And though I found this methodology frustratingly limiting and didactic at first, I eventually came to realize it was far more compatible with my personal relationship with art, which, for the most part, had tended to go ignored in many of my humanities classes at Columbia.

This issue came up once during the discussion section of my Western class (FILM 2120, Topics in American Cinema: The West) last semester. The topic of discourse was the Edenic imagery permeating some boring film whose name I can’t recall. Someone had said, “I don’t see it. I don’t see him [the director] trying to do that,” to which the others responded in a defensive chorus, “But it’s there,” leaving the poor girl outnumbered. In that moment, what none of us understood was that, at its core, the disagreement arose out of a difference in hermeneutical approach. On one hand, there was the school of thought that perpetuates myth by asserting that “this is there” and this isn’t, that “this ought to be but not that” (i.e. all the feminist readings of these films); on the other hand, there were those who believed that a work of art is the thing itself, not whatever meaning is forced out of it by some ulterior agenda.

This dry hermeneutical approach, prevalent in most schools of thought and prone to mistreating the work of art, is the subject of Sontag’s famous “Against Interpretation” essay. As she writes: “…it is still assumed that a work of art is its content. Or, as it’s usually put today, that a work of art by definition says something. (‘What X is saying is…,’ ‘What X is trying to say is…,’ ‘What X said is…’ etc., etc.)” (4). “Content,” in this sense, is tantamount to “what I think it says,” which is always subjective—whereas it should be acknowledged that content is in fact objective (“This is not Edenic imagery, just a shot of a meadow where this story happens to take place”), and that anything more than that is a stretch, fabricating superfluous intellectual delusions that numb the senses and are best suited to the most cerebral of students, those who relish the thought of life in academia and seek to write theses along the lines of “A Queer Reading of the Works of Pedro Almodóvar” or “Marxism in Kafka”—horrible titles, but you get the idea. Sontag beautifully sums up the problem as follows:

“Like the fumes of the automobile and of heavy industry which befoul the urban atmosphere, the effusion of interpretations of art today poisons our sensibilities. In a culture whose already classical dilemma is the hypertrophy of the intellect at the expense of energy and sensual capability, interpretation is the revenge of the intellect upon art.”

And what would fix this? A de-emphasis on content and a recognition of art as a sensory experience. Or, as Sontag put it: “In place of hermeneutics we need an erotics of art.” It is by abiding by this mantra that I’ve discovered the audiovisual intensity of Faulkner in Aronofsky’s crescendos, the minimalist serenity and ennui of Hemingway in Antonioni, and the hypnotic allure of Tolstoy’s flawed (but painfully realistic) characters in Kieslowski. Literature is thus capable of being as “cinematic” as the cinema is of being “literary”—it’s just a matter of form, form, form.

My previous column was all about the cultural importance of Star Wars as the quintessential modern myth. I even mentioned the need for myth in these troubled times, insinuating my desire for Star Wars: The Last Jedi to acknowledge, or comment on, the current political climate in some capacity. And so, having now watched it, I ask: how good was it, and how does it hold up as a modern myth?

To begin, much of the progressivism of The Force Awakens is carried over here, and in some instances is given much more room to breathe, as in Finn (John Boyega) and Rose’s (Kelly Marie Tran) excursion to Cantonica, a desert planet run by greedy, casino-obsessed corporate profiteers who benefit from the galactic war between the First Order and the Resistance. As many reviews have been quick to point out, this arc is easy to dismiss as a rambling digression, though this is attributable not to the narrative intentions of the arc but rather to its lackluster execution, which at times threatens to inspire blatant indifference on the audience’s part. From the moment Rose begins telling her sob-story backstory, which then leads into a preachy animal-rights midnight-exodus extravaganza, the narrative feels forced and progressive for the sake of being progressive—in short, it feels inauthentic.

I should stress that this lack of authenticity exists strictly on a formal level, by which I mean the film was admittedly doing some interesting things in theory. This includes the incorporation of Star Wars canon material previously unseen on the big screen (How did Luke get there?), the subversion of myth by questioning its authenticity, and the fabrication of a triadic collective protagonist (Luke, Rey, and Kylo Ren). However, most reviews that have defended The Last Jedi have tended to rely on these novel narrative deviations from the Star Wars canon as sufficient evidence of the film’s artistic merit—the equivalent of arguing that Pollock’s early works are redeemable insofar as they are “dense with mythology and Jungian archetypes,” or that James Joyce is a genius on the basis that Ulysses, “like many great works of literature…requires repeated reading and deep study fully to understand–and ultimately to enjoy–the many dimensions and layers.” All this is well and fine, but I would argue that the formal ramifications of a work of art (i.e. revolutionary or revisionist technique), or its utter abstruseness, are not enough to warrant—indeed, even measure—artistic merit. Hence, to defend The Last Jedi by way of uttering such generalizations as “The movie works equally well as an earnest adventure full of passionate heroes and villains and a meditation on sequels and franchise properties” is not enough; I mean, sure, but where specifically do you see this being done well, and, more importantly, how are you measuring “well”?

I would narrow down my problems with this movie to one pivotal, overarching problem that effectively ruined all of the things that could have worked for the film: pacing. By this I mean not only the editing from one plot to another, but the constant incorporation of “tonal distractions,” both of which, collectively, prevent any one point in the story from breathing and really coming into its own. One result of this is that, unlike The Force Awakens, the film no longer feels character-based—the word “feels” is crucial here, as the narrative was evidently attempting to darken and flesh out three of its main characters: Rey, Kylo Ren, and Luke Skywalker. This sophistication had the potential to be the holy grail of the film’s engagement, but whenever this character-building is at play, it is superfluously embroidered by the aforementioned tonal distractions, whether it’s Luke tricking Rey into “using the force” with a blade of grass, Kylo Ren being shirtless (but why?), or a Porg face-planting into a window during what should be a serious rescue scene on the planet Crait. It’s as if Robert Altman had been hired to write a Star Wars movie and immediately decided to Nashville the sh!+ out of it.

The thing is (and this gets to the heart of why I abhor Robert Altman films) that the film medium is temporally built to sustain only a well-chosen economy of narrative if it has any hope of fabricating and sustaining any degree of emotional investment. Shows like Game of Thrones and Orange Is the New Black have shown that the serial format is much more compatible with large ensemble casts, because the characters are given the room to be explored in an organic and engaging way. When these kaleidoscopic endeavors are condensed into a film, much of the emotional weight is lost in favor of what essentially amounts to “interesting ideas”: the philosophy underlying Luke’s cynicism, Rey’s development as a Jedi (we are given some “shocking” backstory, but how does this affect her character? She’s still on the good side at the end [I almost wanted her to go to the dark side, just to shake things up]), or Kylo Ren’s inner conflict (which, again, amounts to nothing—he is still the “bad guy” at the end of the film).

While The Last Jedi does not have a terribly high number of plots and characters, it does incessantly move from one thing we are meant to be taking seriously to another, which amounts to the same thing: the dilution of the audience’s emotional investment. Sure, much of the frantic pacing works for the fresh new theme of “let the past die, look to the future,” which may in fact be commenting on the generally pessimistic milieu of our times, and whose newness does manage to “keep the myth interesting, and hence relevant,” as I mentioned in my last column. However, The Last Jedi is revisionism done wrong, in the vein of Nolan’s The Dark Knight Rises, where a lot of interesting things are going down without succeeding in making us care. This is in sharp contrast to the much more cogent (and also revisionist) The Dark Knight, or The Empire Strikes Back. Recall how much time we spend following Luke’s training with Yoda in Episode V, or Rey the scavenger-for-parts at the beginning of The Force Awakens. These are some of my favorite moments in the franchise, and the reason they work is that we’re there for a while, to the point where the depicted world begins to feel organic, our own—thus paving the way for emotional investment.

If anything, The Last Jedi has compelled me to familiarize myself to a much greater extent with the Star Wars canon. Through my current efforts to understand just what in the world was happening in the film, I might eventually be able to tame my currently lashing and thrashing response to such a degree that the film may not appear as messy and improvised as it does now. Who knows, a year from now—maybe less—I may even like it.

The new Star Wars: The Last Jedi trailer has been out for months now, and fans—old and new alike—are still raving about it, once more submerging themselves in that paroxysm of fervent fan-boy anticipation which comes pre-packaged with every preview of an upcoming chapter and which dominates the masses, spreading like wildfire the moment it hits YouTube. “What this trailer did,” said Jeremy Jahns, popular YouTube movie reviewer, “is what Star Wars trailers do, and that’s put Star Wars at the forefront—like yeah, this is happening.”

One person who’s probably less excited about the upcoming film is the creator of Star Wars himself, George Lucas, who gave up creative rights to the Star Wars universe after selling the franchise to Disney in 2012 for a whopping 4.05 billion USD. In a 2015 interview with Charlie Rose, when asked how he felt about Episode VII: The Force Awakens (the first installment of the reboot trilogy), Lucas said: “We call it space opera but it’s actually a soap opera. And it’s all about family problems—it’s not about spaceships…They decided they were gonna go do their own thing…They wanted to make a retro movie—I don’t like that. I like…Every movie I make I work very hard to make them different. I make them completely different: with different planets, different spaceships—yenno, to make it new.”

I disagree with Lucas’ judgment of Disney’s “nostalgia” approach and maintain that, in order for the reboot to have the same awe-inspiring initial impression on the new generation as A New Hope (’77) had on the old, it had to retain as much of the saga’s mythic dimensions as possible—and adopting the nostalgia approach was clearly the most surefire way to accomplish that. Whatever backlash The Force Awakens (2015) might have received regarding its “uninteresting” and “boring” resemblance to the original fails to recognize what it is that makes Star Wars so compelling a cultural force: that is, its function as myth, which, by its very nature, must remain as little changed as possible if it is to remain relevant.

Here it is important to distinguish between myth and narrative, for the latter is merely the particular (and always varying) mediation of the former (which is always the same). Put another way, a narrative, or an individual story, is simply a representation of a kind of “master story” that pre-exists in the audience’s mind long before they sit down to watch The Force Awakens for the first time—assuming, of course, the audience has lived long enough to have acquired a fairly confident intuition regarding what constitutes this so-called “master story” that is myth.

“Myth” comes from the Greek word “mythos,” meaning “story.” It is from this definition that our understanding of myth must necessarily arise, for most theories of myth begin from the accepted idea of myth as a kind of “canon of story.” Here it is noteworthy that the medium of the story is not specified, for it would be erroneous to confine myth to a single art form (i.e. myth as the literary canon). Consider, for example, how ancient cave paintings are replete with narrative imagery, from the dancing scenes of Serra da Capivara, Piauí, Brazil (28,000 to 6,000 BC) to the enigmatic beings and animals of Kakadu, Northern Territory, Australia (26,000 BC); after all, the story “I saw a kangaroo” is still a story, though, to us, not a particularly interesting one (insofar as it is not all that sophisticated).

What is interesting is that such geographically disparate populations, who would have had no physical means of contact with one another, should engage in the same activity (one not necessary for biological survival) with the same level of behavioral predictability as birds from separate continents—all of whom seem to instinctively grasp the concept of “nest-building” as pivotal for their offspring’s protection. What is it, then, that prompts what appears to be a primordially entrenched instinct of human nature? What is the point of saying, “I saw a kangaroo”?

The answer can be arrived at by emphasizing, in turn, the two components of the sentence and studying the truth-values derived therefrom. For if the emphasis is placed on “a kangaroo,” then one extracts an empirical value tantamount to the scientist’s collected data. Here, the sentence derives significance from its illumination of some perceived aspect (in this case, the “kangaroo”) of the world, that is, of reality. On the other hand, if one places the emphasis on “I saw,” a second meaning is discovered, this time signifying the presence of “I,” that is, the storyteller. This too can be perceived as empirical but, more notably, as humanistic, for the manifested will to engage in an activity that will record the existence of oneself at a given time is a behavior unique to the human species.

What results from this innocuously curious act of paint-on-wall, then, is the radical evolutionary leap toward self-reflexivity, whereby an innate curiosity is cognitively mastered through creativity. Of course, this process had long been practiced by humans, but early on it operated strictly in the material sense, and was motivated by survival at that. With the emergence of art, however, the human’s cognitive faculties began to operate within a more fundamentally psychological dimension, one motivated not by survival but by the acquisition of knowledge, especially as that knowledge relates to the human being. In other words, cave painting illustrates a primordial desire to understand reality—that is, the universe—and humanity’s place in it.

The primary questions which myth asks, then, are: What is the nature of reality, and why am I a part of it?

The narrative patterns that emerge from humanity’s collective efforts to answer these questions constitute myth. These patterns can be found not only in paintings (depictions of animals, hunting scenes) but also, more complexly, in the literary tradition. Herein lies my earlier need to distinguish the “storytelling” canon from the “literary” one, since the literary, by its very nature, allows for a more immediate and elaborate representation of stories. Among these patterns we can count creation stories, Campbell’s “monomyths,” earth/water mothers, and so on. Those of us brought up with a classical education that included a relatively similar rubric of books are no longer surprised to find that the narrative elements of the Bible can also be found in the Epic of Gilgamesh, in the Popol Vuh, in Homer, Shakespeare, Faulkner—you get the idea.

The last author mentioned beautifully described this intrinsic human need for myth in his banquet speech upon receiving the 1949 Nobel Prize in Literature. Having discussed the paranoia bred by the Cold War, and the consequent nihilism of that milieu, he insisted that Man must remind Himself of “the old verities and truths of the heart, the old universal truths lacking which any story is ephemeral and doomed—love and honor and pity and pride and compassion and sacrifice…[otherwise] His griefs grieve on no universal bones.”

All the “universal truths” Faulkner mentioned are major narrative forces of George Lucas’ epic saga: Anakin’s pride leading up to his metamorphosis into Darth Vader (Revenge of the Sith, 2005), only for him to express compassion and pity in his final moments (Return of the Jedi, 1983); the honor and love between friends that keeps the pack together through all manner of adversities (as in, say, Leia’s rescue of Luke in The Empire Strikes Back, 1980); and, more recently, the sacrificial deaths of all of Rogue One’s (2016) major characters. Thus, The Last Jedi will be the latest installment of what can safely be called one of modernity’s greatest myths, for its treatment of these perennial themes has given it a universal appeal and, consequently, a formidable staying power worthy of mythic status.

In light of all this, the Reader (especially if they do not consider themselves a fan—on any level) may begin to appreciate the magnitude of cultural significance The Last Jedi is bound to have come this Christmas. Its arrival in cinemas this December will call upon (as the best mythic tales often do) a mass gathering of people who will expect to be awed and moved and shocked and, on top of all these things, reminded of these universal truths, thereby fostering, if only for a moment, a sense of solidarity among the masses which the cynical media eye would have us believe is practically nonexistent in modern times.

Too sentimental? Perhaps. Let’s just hope the film isn’t (i.e. don’t kill Rey yet—she’s by far my favorite Star Wars character ever!).

P.S. You can watch the trailer here, for those of you who (for whatever reason) haven’t seen it yet.

Image made by Laura Elizabeth Hand, CC’19

Content Warning: sexual assault

As everyone reels from the news about Harvey Weinstein, the question of inequality for women in Hollywood finds itself once again at the forefront of conversation. Behind the camera, women are coming forward with stories of sexual assault, and we’re finally engaging in a conversation that should have begun years ago. But… what about in front of the camera?

In her acceptance speech at last month’s Emmys, Best Actress winner Nicole Kidman explained that she and Reese Witherspoon produced Big Little Lies to create “more great roles for women.” She was met with thunderous applause acknowledging her role in Lies as “great.” But I was lost.

Over the summer, I binge-watched probably hundreds of episodes of television and saw every movie in theaters. And there were, indeed, great female roles. Elisabeth Moss’s Offred was a strong feminist, Kimmy Schmidt made her way to college, Wonder Woman dominated at the box office, Anne of Green Gables made a triumphant return to television, and the women of This Is Us, Veep, The Crown, and so many more were complex and inspiring.

But when I turned to Kidman’s Big Little Lies, I couldn’t help but gasp at the tireless repetition of sexist tropes and the same old plotlines. For those who don’t know, Big Little Lies follows four different mothers in an upper-middle-class suburban town. Madeline, played by Reese Witherspoon, is the town gossip and an overbearing and self-centered mother. Jane, played by Shailene Woodley, is a single mom, new in town, with a troubled past. Her son, Ziggy, gets into trouble with Renata Klein, the hard-working businesswoman whose daughter claims Ziggy hurt her. And Kidman’s character, Celeste, is a stay-at-home mom who’s hidden the truth about her abusive husband for years.

If you look at the logline, you may buy Kidman’s claim about “great roles for women.” Save for perhaps Witherspoon’s one-dimensional character (who’s literally portrayed as if Elle Woods just grew up a tiny bit), the rest of the women indeed seem complex. But rather than focusing on the crux of the women’s troubled stories, the show spends the bulk of its time rehashing the fight between Jane’s and Renata’s children. While the fight begins with a serious accusation, before long it becomes clear that Ziggy didn’t hurt Renata’s daughter, and that the fight has spiraled into an all-out war over who works harder: the working moms or the stay-at-home moms. By the end of the first episode, everyone in town has taken sides, and suddenly it’s like you’re watching a glorified version of a middle-school cat fight, but with Birkin bags instead of friendship bracelets.

The subplots are equally uncompelling, and wouldn’t pass the Bechdel Test if you gave them all the leeway possible. Madeline can’t seem to get her new husband to get along with her old one, or convince the town to let her put on a production of Avenue Q. These are ridiculously privileged problems, yet the show makes them out to be as dramatic as the abuse Celeste is experiencing at home. Madeline finally connects with her teenage daughter by admitting that she cheated on her new husband. Oh great, isn’t that wonderful motherly guidance? Meanwhile, Renata doesn’t have sex often enough with her husband, and her poor daughter can’t get enough kids to come to her million-dollar birthday party.

But while all this is happening, the only two characters with the possibility of a compelling subplot also fall short. A few episodes into the series, we learn that Jane was raped and that she fears Ziggy will inherit his father’s violent tendencies, but this intriguing storyline barely gets any airtime. Celeste finally works up the nerve to go to a therapist, and the show’s only truly “great” female moments are in Kidman’s painfully accurate portrayal of a woman struggling to come forward about abuse. When Celeste finally decides to leave her husband, the depiction of women on the show at last feels empowered.

But within one episode, everything swings back again. In the final scene, at a ridiculously over-the-top school function, Celeste’s husband discovers she’s leaving and starts to hit her. Coming to her defense, Madeline, Jane, Renata, and one other woman hit him back, and we learn that Celeste’s abusive husband was the man who raped Jane all those years ago. Finally, the women accidentally push him over a cliff and kill him. It was an act of self-defense, and the audience breathes a sigh of genuine relief and hope for Celeste’s brighter future.

But then, they deny the murder. In talking to the police, not one woman comes forward with the truth. He simply fell, they say. Why? I’m not sure. In their silence, the women of Big Little Lies end their show not with a message about the importance of speaking out for victims of abuse, but about the harmlessness of staying silent. Suddenly, everything about the show—Kidman’s character and even Jane’s intriguing subplot—seems far too convenient. For Jane, the question of her own PTSD and her son’s violent tendencies is suddenly resolved. And true, it seems like Celeste was about to finally stand up and leave, but by choosing to kill off the abuser, the writers eliminate the incredibly difficult period abused women struggle through, physically and emotionally, to take that step away. If these were real women, Jane’s and Celeste’s struggles would not be over with a timely shove off a cliff and a promise never to speak of it again. Abuse lives with people forever.

The show ends with a reconciliation. As if they’re in middle school again, Madeline, Jane, Celeste, and Renata are suddenly friends, joined together by a secret. But let me put it plainly: abuse is not a cute little secret you share with your friends. Abuse is not a problem that deserves less screen time and the same dramatic emphasis as the question of whether to put on Avenue Q. Abuse is real, abuse is terrible, and abuse doesn’t resolve itself that easily.

Big Little Lies took home five Emmys this year. In her acceptance speech, Nicole Kidman said that the show helped “shine a light” on abuse. Maybe, but the small light the show shines is not enough. The women in the show aren’t “great”: they’re simple, naive, entitled, and don’t reflect the true complexities that women like Celeste and Jane (or even real-life Madelines) face every day. And in an industry where actresses experience sexual harassment every day and a world where men like Harvey Weinstein find success, Hollywood needs to do better.

So yes, Ms. Kidman: you’re right. Hollywood does need more great roles for women. But I’m afraid this wasn’t it.

Photo from the 2014 film Güeros.

There is a moment in Alonso Ruizpalacios’ 2014 film, Güeros, that has stayed with me since my first viewing: following a confrontation with an angry neighbor, the film’s trio flees the scene by car, and Sombra, the protagonist’s older brother, lies in the backseat undergoing what is clearly meant to be an audiovisual representation of a panic attack.

The scene owes much of its haunting memorability to its experimental soundtrack. A selection of ambient sounds—an eerie screech, a low rumble, and an incessant beep—intensifies in sync with Sombra’s deteriorating mental condition, blurring his vision and muting the pleading voice of his younger brother (shown above) until his whole existence is reduced to the mere sound of frantic breaths against the backdrop of perilous sonic waves, which evidently threaten to overtake him.

The reason this scene continues to leave such a lasting impression on me is simple: I, too, suffer from anxiety, and the scene’s mise-en-scène (everything that physically appears before the camera) seamlessly blends with the avant-garde dreaminess and apprehension of the score to elicit a convincing and uniform reproduction of my mental affliction.

In fact, when I first saw this film, two years ago, I was in the midst of my own personal Crisis. This took place seconds after I realized it was mathematically impossible for me to pass one of my CS classes, and that, consequently, I would be unable to graduate from Columbia within the traditional four-year span. Suffice it to say, this colossal failure (“colossal” insofar as it was the only notable one in my life thus far) amplified my anxiety-inducing imposter syndrome to the point where I physically couldn’t leave my room; the specificities of what followed, however, are for another time.

For now, I wish to briefly ruminate on one of cinema’s most sacred, primordial powers, illustrated by the aforementioned example: its ability to instill in the viewer catharsis (Greek: “katharsis,” meaning “purification” or “cleansing”) through poignant verisimilitude, especially as it relates to life’s immanently tragic nature.

As Aristotle teaches us in his seminal work on tragedy, the Poetics, this experience is marked by a profoundly satisfying purgation of “negative” emotions, especially those characterized by fear and pity. In the end—if all has gone well—the viewer reemerges with the consoling reaffirmation that, despite their misfortunes, they will be able to cope nonetheless; in other words, that everything will end up okay.

But, on a more primal level, why do we experience catharsis at the movies at all?

Here it is helpful to quote the German Continental philosopher, Hans-Georg Gadamer, who is best known for his 1960 work on hermeneutics, Truth and Method, in which he writes:

“What is experienced in such an excess of tragic suffering is something truly common. The spectator recognizes himself and his finiteness in the face of the power of fate…To see that ‘this is how it is’ is a kind of self-knowledge for the spectator, who emerges with new insight from the illusions he, like everyone else, lives.” (132)

The first step in reaching catharsis, “recognition,” is not to be misunderstood as something immediate, for this is the process by which the artist aims to get the viewer to empathize with the protagonist on at least some level (this implies neither likeability nor relatability—think Walter White from Breaking Bad). Neither should it be seen as “contextual” recognition: after all, who else has ever found themselves literally trapped by a boulder in a remote slot canyon in southeastern Utah (127 Hours)? The recognition, then, is a thematic one: to use the previous example, the viewer is familiar with the general feeling of being suddenly pitted against a formidable obstacle which, despite one’s initial off-guardedness, will come to test the limits of one’s resolve.

The instant of catharsis occurs when the character’s suffering reaches its crescendo, because it is here that the “power of fate” is most viscerally felt. Because we have been emotionally “led on” by the artist, the character has become us in the abstract sense, so that their trials and tribulations are likewise our own. Hence, we too are subjected to the great emotional weight of intense suffering when the crescendo arrives. It is here that the recognition realizes its consummate form as an utterly affective phenomenon.

It is the aim of the artist to lead the viewer to this step of “affective immersion,” without which the next step is not possible: the acquisition of what Gadamer terms “self-knowledge,” or “new insight.” This is the most important stage of catharsis, for it is here that art fulfills its primordial power: the viewer can now walk away with a rejuvenating, newfound emotional clarity. All that is left is the dissection of this clarity and the study of its personal implications.

For me, after watching Güeros, this meant sitting in shock for several hours, letting the weight of time crush me as I slowly accepted the terrifying reality of my situation: I was having a chronic panic attack, fueled by a feral wave of anxiety, and was caught up in a truly desperate situation that seemed to have no end in sight.

Finally, cinema’s greatest gift: the capacity to incite radical change in the viewer, for the betterment of his or her situation, or that of others.

For me to get up and say:

“Hm…Maybe it’s time I got out now.”

 

The Seventh Art is written by Juan Gomez and runs every other Sunday. To submit a comment/question or a piece of your own, email submissions@columbialion.com.