Columnists


In a footnote from her essay “Against Interpretation,” Susan Sontag refers to film as a “subdivision of literature.” Now, I have never been one to uphold any kind of “hierarchy of the arts” (of what use would this be anyhow?), but I am interested in the relationship between different artistic mediums, and, in particular, as Sontag describes, that between film and literature. “Subdivision of literature” suggests literature as a kind of umbrella term encompassing film within its greater arena, as opposed to, as one might have intuitively supposed, two separate subsets within the greater arena that is “art.” Furthermore, the phrase rules out the converse (“literature as a subdivision of film”). What is it, then, that makes literature more “all-encompassing,” and what does it mean for a film to be “literary”?

An examination of “Godard’s Vivre Sa Vie,” Sontag’s essay on the French filmmaker’s fourth film, about a struggling-artist-turned-prostitute, will prove useful here. In the essay, Sontag points out two general tendencies of the artist: the tendency toward proof, characterized by an emphasis on considerations of form, and the tendency toward analysis, which is more akin to fruitless “rambling” within a work, as the artist chases after the “infinite angles of understanding.”

As you might have guessed, Sontag favors the former, insisting that “In great art, it is form—or, as I call it here, the desire to prove rather than the desire to analyze—that is ultimately sovereign. It is form that allows one to terminate.” Thus, it is characteristic of great art to contain “endings that exhibit grace and design, and only secondarily convince in terms of psychological motives or social forces.” Vivre Sa Vie is therefore “literary” in the sense that, as in all great literature (Sontag names Shakespeare’s early comedies and Dante’s Divine Comedy as paragons), at play is a predominant concern with proof—as opposed to analysis. The term “literary,” used to describe film, is thus a bit of a misnomer on Sontag’s part, as it might suggest the presence of qualities intrinsic to literature, whereas all she is referring to is that which defines good art in any medium. For Sontag, this means the artist emphasizes the formal: that is, they include a conspicuous element of design (symmetry, repetition, inversion, doubling, etc.).

Sontag’s insistence on form strongly reminds me of my Art Hum instructor, Daniel Ralston, who would call us out whenever we responded to a painting with such platitudes as: “I think the three birds represent the Holy Trinity” or “The expression of the left-most figure is one of intense melancholy”—statements that would no doubt have gone unchallenged (perhaps even praised) in some of my previous Core classes. For example, during my Literature Humanities course several years ago, a full hour was once spent on a Freudian analysis of Woolf’s To the Lighthouse (which, unfortunately for me, I consider to be one of the most beautiful novels of all time). Ralston would often respond to these comments by saying, “Yes, but, what about formally—for example, what can you say about the composition?” And though frustratingly constraining and didactic at first, this methodology, I eventually came to realize, was far more compatible with my personal relationship with art, which, for the most part, had tended to go ignored in many of my humanities classes at Columbia.

This issue came up once during the discussion section of my Western class (FILM 2120, Topics in American Cinema: The West) the previous semester. The topic of discourse was the Edenic imagery permeating some boring film whose name I can’t recall. Someone had said, “I don’t see it. I don’t see him [the director] trying to do that,” to which the others responded in a defensive chorus, “But it’s there,” leaving the poor girl outnumbered. In that moment, what none of us understood was that, at its core, the disagreement arose out of a difference in hermeneutical approach. On one hand, there was the school of thought that perpetuates myth by asserting that “this is there” and this isn’t, that “this ought to be but not that” (i.e., all the feminist readings of these films); on the other, there were those who believed that a work of art is the thing itself, not whatever meaning is forced out of it by some ulterior agenda.

This dry hermeneutical approach, prevalent in most schools and prone to mistreating the work of art, is the subject of Sontag’s famous essay “Against Interpretation.” As she writes: “…it is still assumed that a work of art is its content. Or, as it’s usually put today, that a work of art by definition says something. (‘What X is saying is…,’ ‘What X is trying to say is…,’ ‘What X said is…’ etc., etc.)” (4). “Content,” in this sense, is tantamount to “what I think it says,” which is always subjective. It should instead be acknowledged that content is in fact objective (“This is not Edenic imagery, just a shot of a meadow where this story happens to take place”), and that anything more than that is a stretch, fabricating superfluous intellectual delusions that numb the senses and are best suited to the most cerebral of students, those who relish the thought of life in academia and seek to write theses along the lines of “A Queer Reading of the Works of Pedro Almodóvar” or “Marxism in Kafka”—horrible titles, but you get the idea. Sontag beautifully sums up the problem as follows:

“Like the fumes of the automobile and of heavy industry which befoul the urban atmosphere, the effusion of interpretations of art today poisons our sensibilities. In a culture whose already classical dilemma is the hypertrophy of the intellect at the expense of energy and sensual capability, interpretation is the revenge of the intellect upon art.”

And what would fix this? A de-emphasis on content and a recognition of art as a sensory experience. Or, as Sontag put it: “In place of hermeneutics we need an erotics of art.” It is by abiding by this mantra that I’ve discovered the audiovisual intensity of Faulkner in Aronofsky’s crescendos, the minimalist serenity and ennui of Hemingway in Antonioni, and the hypnotic allure of Tolstoy’s flawed (but painfully realistic) characters in Kieślowski. Literature is thus capable of being as “cinematic” as the cinema is of being “literary”—it’s just a matter of form, form, form.

My previous column was all about the cultural importance of Star Wars as the quintessential modern myth. I even mentioned the need for myth in these troubled times, insinuating my desire for Star Wars: The Last Jedi to acknowledge, or comment on, the current political climate in some capacity. And so, having now watched it, I ask: how good was it, and how does it hold up as a modern myth?

To begin, much of the progressivism from The Force Awakens is carried over here, and it is given much more room to breathe in some instances, as in Finn (John Boyega) and Rose’s (Kelly Marie Tran) excursion to Cantonica, a desert planet run by greedy, corporate, casino-obsessed profiteers who benefit from the galactic war between the First Order and the Resistance. As many reviews have been quick to point out, this arc is an easy target to dismiss as a rambling digression, though the fault lies not with the arc’s narrative intentions but with its lackluster execution, which at times threatens to inspire blatant indifference on the audience’s part. From the moment Rose begins telling her sob-story backstory, which then leads into a preachy animal-rights midnight exodus extravaganza, the narrative feels forced and progressive for the sake of being progressive—in short, it feels inauthentic.

I should stress that this lack of authenticity exists strictly on a formal level, by which I mean the film was admittedly doing some interesting things in theory. This includes the incorporation of Star Wars canon material previously unseen on the big screen (How did Luke get there?), the subversion of myth by questioning its authenticity, and the fabrication of a triadic collective protagonist (Luke, Rey, and Kylo Ren). However, most reviews that have defended The Last Jedi have tended to rely on these novel narrative deviations from the Star Wars canon as sufficient evidence of the film’s artistic merit, the equivalent of arguing that Pollock’s early works are redeemable insofar as they are “dense with mythology and Jungian archetypes,” or that James Joyce is a genius on the basis that Ulysses, “[l]ike many great works of literature…requires repeated reading and deep study fully to understand–and ultimately to enjoy–the many dimensions and layers.” All this is well and fine, but I would argue that the formal ramifications of a work of art (i.e., revolutionary or revisionist technique), or its utter abstruseness, are not enough to warrant—indeed, even measure—artistic merit. Hence, to defend The Last Jedi by uttering such generalizations as “The movie works equally well as an earnest adventure full of passionate heroes and villains and a meditation on sequels and franchise properties” is not enough; I mean, sure, but where specifically do you see this being done well, and, more importantly, how are you measuring “well”?

I would narrow down my problems with this movie to one pivotal, overarching problem that effectively ruined all of the things that could have worked for the film: pacing. By this I mean not only the editing from one plot to another, but the consistent incorporation of “tonal distractions,” both of which, collectively, prevent any one point in the story from breathing and really coming into its own. One result of this is that, unlike The Force Awakens, the film no longer feels character-based—the word “feels” is crucial here, as the narrative was evidently attempting to darken and flesh out three of its main characters: Rey, Kylo Ren, and Luke Skywalker. This sophistication had the potential to be the holy grail of the film’s engagement, but, whenever this character-building is at play, it is superfluously embroidered by the aforementioned tonal distractions, whether it’s Luke tricking Rey into “using the force” with a blade of grass, Kylo Ren being shirtless (but why?), or a Porg face-planting into a window during what should be a serious rescue scene on the planet Crait. It’s as if Robert Altman had been hired to write a Star Wars movie and immediately decided to Nashville the sh!+ out of it.

The thing is (and this gets to the heart of why I abhor Robert Altman films), the film medium’s limited running time demands a well-chosen economy of narrative if a film has any hope of building and sustaining emotional investment. Shows like Game of Thrones and Orange Is the New Black have shown that the serial format is much more compatible with large ensemble casts, because those casts are given the room to be explored in an organic and engaging way. When these kaleidoscopic endeavors are condensed into a film, much of the emotional weight is lost in favor of what essentially amounts to “interesting ideas”: the philosophy underlying Luke’s cynicism, Rey’s development as a Jedi (we are given some “shocking” backstory, but how does this affect her character? She’s still on the good side at the end [I almost wanted her to go to the dark side, just to shake things up]), or Kylo Ren’s inner conflict (which, again, amounts to nothing—he is still the “bad guy” at the end of the film).

While The Last Jedi does not have a terribly high number of plots and characters, it does incessantly move from one thing we are meant to take seriously to another, a pattern that amounts to the same thing: the dilution of the audience’s emotional investment. Sure, much of the frantic pacing works for the fresh new theme of “let the past die, look to the future,” which may in fact be commenting on the generally pessimistic milieu of our times, and whose newness does manage to “keep the myth interesting, and hence relevant,” as I mentioned in my last column. However, The Last Jedi is revisionism done wrong, in the vein of Nolan’s The Dark Knight Rises, where a lot of interesting things go down without the film succeeding in making us care. This is in sharp contrast to the much more cogent (and also revisionist) The Dark Knight, or The Empire Strikes Back. Recall how much time we spend following Luke’s training with Yoda in Episode V, or Rey the scavenger-for-parts at the beginning of The Force Awakens. These are some of my favorite moments in the franchise, and the reason they work is that we’re there for a while, to the point where the depicted world begins to feel organic, our own—thus paving the way for emotional investment.

If anything, The Last Jedi has compelled me to familiarize myself to a much greater extent with the Star Wars canon. Through my current efforts to understand just what in the world was happening in the film, I might eventually be able to tame my currently lashing and thrashing response to such a degree that the film may not appear as messy and improvised as it does now. Who knows, a year from now—maybe less—I may even like it.

Image courtesy of Laura Elizabeth Hand, CC’19

The hippocampus is one of those brain regions that pops up again and again in popular science literature, and for good reason. Most people associate the hippocampus with memory, mainly thanks to Henry Molaison, better known as H.M. Over sixty years ago, a hotshot neurosurgeon named William Scoville removed most of Molaison’s hippocampus in an attempt to cure his severe epilepsy. The treatment worked, but at a severe cost: H.M. lost the ability to form new memories.

This curious case kicked off modern memory research as we know it. Decades of follow-up research have connected activity in the hippocampus to a variety of functions, most famously the formation of episodic memories. Inspired by this human case, researchers peered into the brains of awake mice in an attempt to learn more.

One of the reasons we can investigate this brain region across species is just how similar the hippocampus of a mouse is to that of a human. It is an ancient structure, millions of years old, yet it is arguably the first of the most ‘advanced’ brain regions to have developed. While there are obvious differences in size between the species, the underlying organizational principles are nearly identical. What makes the hippocampus so special that we and our rodent cousins have one, but frogs don’t?

During one of these mouse experiments, a scientist named John O’Keefe made a curious finding. When the animal ran around in its environment, a certain kind of cell in the hippocampus would consistently fire only when the mouse navigated through a particular position. This finding later won him the Nobel Prize in Physiology or Medicine and spurred another avenue of research into how these ‘place cells’ (as they have since been dubbed) form a sophisticated ‘cognitive map’ of space.
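
If it helps to make “place cell” concrete, here is a minimal toy sketch in Python of the standard textbook picture: a cell whose firing rate peaks when the animal is near one spot and falls off elsewhere. Every number here (field center, field width, peak rate) is invented for illustration; real place fields are noisier and more varied than a single Gaussian.

```python
import numpy as np

# Toy model of a single 'place cell' on a 1-meter linear track.
# All parameters below are illustrative, not from any particular study.

rng = np.random.default_rng(0)

field_center = 0.6   # meters: where this cell "prefers" to fire
field_width = 0.08   # meters: spread of the place field
peak_rate = 20.0     # spikes/second at the field center

def firing_rate(position):
    """Gaussian tuning curve: high rate near field_center, ~0 elsewhere."""
    return peak_rate * np.exp(-((position - field_center) ** 2)
                              / (2 * field_width ** 2))

# Simulate the mouse running the track, sampling spikes in 100 ms bins.
positions = np.linspace(0.0, 1.0, 50)
for x in positions[::10]:
    rate = firing_rate(x)
    spikes = rng.poisson(rate * 0.1)  # Poisson spike count in a 0.1 s bin
    print(f"position {x:.2f} m -> rate {rate:5.1f} Hz, spikes {spikes}")
```

Run it and the simulated spike counts jump only near the 0.6-meter mark, which is, in caricature, the pattern O’Keefe’s electrodes picked up.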

Meanwhile, the development of fMRI enabled researchers to study learning, memory, attention, curiosity, and many other cognitive functions of the hippocampus in humans. More than just a seat of memory, this enigmatic part of the brain is necessary for imagination, planning, and many other processes we consider essential to our human existence.

Given the similarities between mice and men, it’s reasonable to expect that the mouse and human hippocampus are doing similar things. So why are the two scopes of research so radically different? How exactly do cells that respond to a rodent’s current location in space create memory? While the two fields have long existed in different spheres, new research aims to bridge the gap.

From the mouse side, non-place features of place cells are increasingly providing evidence for a broader, more integrative role of hippocampal pyramidal neurons than simply recording place. Recent findings, some unpublished, from the Society for Neuroscience 2017 Annual Meeting demonstrated many of these newly discovered, more diverse functions.

In highly social bats, ‘place’ cells can record the location of their fellow bats just as well as their own. In rats, ‘place’ cells can ‘map out’ a representation of sound. In monkeys, ‘place’ cells can fire without movement simply by looking around the environment. Most convincingly, a number of studies have shown that ‘place’ cells can also record a detailed representation of time.

Increasingly, it seems that these special hippocampal cells fire not only at locations, but at a number of other things too. Some, if not most, of these cells respond to multiple things at once, like place and time, or sound and place. That feature, crucially, is indispensable in creating a memory. These cells aren’t just recording places, they’re combining different aspects of an experience together. Put another way, a ‘place cell’ isn’t simply mapping space, it’s making a memory.
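
To see why responding to two things at once matters, here’s a hypothetical extension of the earlier sketch: a cell whose firing depends on both place and elapsed time, modeled (purely for illustration, and with invented parameters) as the product of two tuning curves.

```python
import numpy as np

# Hypothetical 'conjunctive' cell: strong firing only when the animal
# is near a preferred place AND at a preferred moment in the trial.
# Modeled here, for illustration only, as a product of two Gaussians.

def gaussian(x, center, width):
    return np.exp(-((x - center) ** 2) / (2 * width ** 2))

def conjunctive_rate(position_m, time_s, peak_hz=15.0):
    place_part = gaussian(position_m, 0.4, 0.1)  # prefers ~0.4 m
    time_part = gaussian(time_s, 2.0, 0.5)       # prefers ~2 s into trial
    return peak_hz * place_part * time_part

# Right place at the right time fires; either feature alone barely does.
for pos, t in [(0.4, 2.0), (0.4, 5.0), (0.9, 2.0)]:
    print(f"pos {pos:.1f} m, t {t:.1f} s -> {conjunctive_rate(pos, t):6.3f} Hz")
```

A cell like this fires only for the conjunction “that place at that moment,” which starts to look less like a pin on a map and more like a fragment of an episode.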

While neither I nor neuroscience more generally has an answer to the question I posed at the beginning of this column, combining decades of research in mice and humans will help guide the way forward.


Citations and further reading:

  1. Scientific reviews are a great way to delve deeper than articles like mine without wading too deep into the terminology of primary articles. For an overview of the importance of H.M. to the field, I recommend: Squire, L. R. (2009). The legacy of patient H.M. for neuroscience. Neuron, 61(1), 6–9.
  2. To read the seminal place-cell study by O’Keefe: O’Keefe, J., & Dostrovsky, J. (1971). The hippocampus as a spatial map. Preliminary evidence from unit activity in the freely-moving rat. Brain Research, 34(1), 171–175.
  3. For a broader review of place cells by Nobel laureates in the field: Moser, M.-B., Rowland, D. C., & Moser, E. I. (2015). Place cells, grid cells, and memory. Cold Spring Harbor Perspectives in Biology, 7(2), a021808.
  4. Bats encoding spatial goals in 3D, from the same lab as the preliminary unpublished social findings (primary paper): Sarel, A., Finkelstein, A., Las, L., & Ulanovsky, N. (2017). Vectorial representation of spatial goals in the hippocampus of bats. Science, 355(6321), 176–180.
  5. Rats encoding a non-spatial ‘sound map’ (primary paper): Aronov, D., Nevers, R., & Tank, D. W. (2017). Mapping of a non-spatial dimension by the hippocampal–entorhinal circuit. Nature, 543, 719–722.
  6. Monkeys encoding a non-movement based ‘visual map’ (primary paper): Killian, N. J., Jutras, M. J., & Buffalo, E. A. (2012). A map of visual space in the primate entorhinal cortex. Nature, 491(7426), 761–764.
  7. Review of time cells by a giant in the field: Eichenbaum, H. (2014). Time cells in the hippocampus: a new dimension for mapping memories. Nature Reviews Neuroscience, 15(11), 732–744.
  8. To read more about a fascinating brand-new big-picture theory about the hippocampus: Stachenfeld, K. L., Botvinick, M. M., & Gershman, S. J. (2017). The hippocampus as a predictive map. Nature Neuroscience, 20(11), 1643–1653.

The new Star Wars: The Last Jedi trailer has been out for months now, and fans—old and new alike—are still raving about it, once more submerging themselves in that paroxysm of fervent fan-boy anticipation that comes pre-packaged with every preview of an upcoming chapter and spreads like wildfire the moment it hits YouTube. “What this trailer did,” said Jeremy Jahns, a popular YouTube movie reviewer, “is what Star Wars trailers do, and that’s put Star Wars at the forefront—like yeah, this is happening.”

One person who’s probably less excited about the upcoming film is the creator of Star Wars himself, George Lucas, who gave up creative rights to the Star Wars universe after selling the franchise to Disney in 2012 for a whopping $4.05 billion. In a 2015 interview with Charlie Rose, when asked how he felt about Episode VII: The Force Awakens (the first installment of the reboot trilogy), Lucas said: “We call it space opera but it’s actually a soap opera. And it’s all about family problems—it’s not about spaceships…They decided they were gonna go do their own thing…They wanted to make a retro movie—I don’t like that. I like…Every movie I make I work very hard to make them different. I make them completely different: with different planets, different spaceships—y’know, to make it new.”

I disagree with Lucas’s judgment of Disney’s “nostalgia” approach and maintain that, in order for the reboot to make the same awe-inspiring initial impression on the new generation that A New Hope (’77) made on the old, it had to retain as much of the saga’s mythic dimensions as possible—and adopting the nostalgia approach was clearly the most surefire way to accomplish that. Whatever backlash The Force Awakens (2015) might have received with regard to its “uninteresting” and “boring” resemblance to the original fails to recognize what it is that makes Star Wars so compelling a cultural force: that is, its function as myth, which, by its very nature, must remain as little changed as possible if it is to remain relevant.

Here it is important to distinguish between myth and narrative, for the latter is merely the particular (and always varying) mediation of the former (which is always the same). Put another way, a narrative, or an individual story, is simply a representation of a kind of “master story” that pre-exists in the audience’s mind long before they sit down to watch The Force Awakens for the first time—assuming, of course, the audience has lived long enough to have acquired a fairly confident intuition as to what constitutes this so-called “master story” that is myth.

“Myth” comes from the Greek word “mythos,” meaning “story.” It is from this definition that our understanding of myth must necessarily arise, for most theories of myth begin from the accepted idea of myth as a kind of “canon of story.” Here it is noteworthy that the medium of the story is not specified, for it would be erroneous to confine myth to a single art form (i.e., myth as the literary canon). Consider, for example, how ancient cave paintings are fraught with narrative imagery, from the dancing scenes of Serra da Capivara, Piauí, Brazil (28,000 to 6,000 BC) to the enigmatic beings and animals of Kakadu, Northern Territory, Australia (26,000 BC); after all, the story “I saw a kangaroo” is still a story, though, to us, not a particularly interesting one (insofar as it is not all that sophisticated).

What is interesting is that such geographically disparate populations, who would have had no physical means of contact with one another, should engage in the same activity (one not necessary for biological survival) with the same level of behavioral predictability as birds from separate continents—all of which seem to instinctively grasp the concept of “nest-building” as pivotal for their offspring’s protection. What is it, then, that prompts what appears to be a primordially entrenched instinct of human nature? What is the point of saying, “I saw a kangaroo”?

The answer can be arrived at by emphasizing each of the sentence’s two components in turn and studying the truth-values derived therefrom. For if the emphasis is placed on “a kangaroo,” then one extracts an empirical value tantamount to the scientist’s collected data. Here, the sentence derives significance from its illumination of some perceived aspect (in this case, the “kangaroo”) of the world, that is, of reality. On the other hand, if one places the emphasis on “I saw,” a second meaning is discovered, this time signifying the presence of “I,” that is, the storyteller. This too can be perceived as empirical but, more notably, as humanistic, for the manifested will to engage in an activity that records the existence of oneself at a given time is a behavior unique to the human species.

What results from this innocuously curious act of paint-on-wall, then, is the radical evolutionary leap toward self-reflexivity, whereby an innate curiosity is cognitively mastered through creativity. Of course, this process had long been practiced by humans, but early on it was strictly in the material sense, and motivated by survival at that. With the emergence of art, however, the human’s cognitive faculties began to operate within a more fundamentally psychological dimension, one motivated not by survival but by the acquisition of knowledge, especially as this knowledge relates to the human being. In other words, cave painting illustrates a primordial desire to understand reality—that is, the universe—and humanity’s place in it.

The primary questions which myth asks, then, are: What is the nature of reality, and why am I a part of it?

The narrative patterns that emerge from humanity’s collective efforts to answer these questions are myth. These patterns can be found not only in paintings (depictions of animals, hunting scenes), but also, more complexly, in the literary tradition. Herein lies my earlier need to distinguish the “storytelling” canon from the “literary” one, since the literary, by its very nature, allows for a more immediate and elaborate representation of stories. Among these patterns we can count creation stories, Campbell’s “monomyths,” earth/water mothers, etc. Most of us brought up with a classical education that included a relatively similar rubric of books are no longer surprised to find that the narrative elements of the Bible recur in the Epic of Gilgamesh, in the Popol Vuh, in Homer, Shakespeare, Faulkner—you get the idea.

The last author mentioned beautifully described this intrinsic human need for myth in his Banquet Speech upon receiving the 1949 Nobel Prize in Literature. Having discussed the paranoia bred by the Cold War, and the consequent nihilism of that milieu, he insisted that Man must remind Himself of “the old verities and truths of the heart, the old universal truths lacking which any story is ephemeral and doomed—love and honor and pity and pride and compassion and sacrifice…[otherwise] His griefs grieve on no universal bones.”

All the “universal truths” Faulkner mentioned are major narrative forces of George Lucas’ epic saga: Anakin’s pride leading up to his metamorphosis into Darth Vader (Revenge of the Sith, 2005), only for him to express compassion and pity in his final moments (Return of the Jedi, 1983); the honor and love between friends that keeps the pack together through all manner of adversities (as in, say, Leia’s rescue of Luke in The Empire Strikes Back, 1980); and, more recently, the sacrificial deaths of all of Rogue One’s (2016) major characters. Thus, The Last Jedi will be the latest installment of what can safely be called one of modernity’s greatest myths, for its treatment of these perennial themes has given it a universal appeal and, consequently, a formidable staying power worthy of mythic status.

In light of all this, the Reader (especially if they do not consider themselves a fan—on any level) may begin to appreciate the magnitude of cultural significance The Last Jedi is bound to have come this Christmas. Its arrival in cinemas this December will call upon (as the best mythic tales often do) a mass gathering of people who will expect to be awed and moved and shocked and, on top of all these things, reminded of these universal truths, thereby fostering, if only for a moment, a sense of solidarity among the masses that the cynical media eye would have us believe is practically nonexistent in modern times.

Too sentimental? Perhaps. Let’s just hope the film isn’t (i.e., don’t kill Rey yet, by far my favorite Star Wars character ever!).

P.S. You can watch the trailer here, for those of you who (for whatever reason) haven’t seen it yet.

Official White House Photo by Lawrence Jackson


Some of our loyal readers may have noticed this column has had an irregular publication schedule lately. This is because I wanted to give everyone a fresh update from the Society for Neuroscience’s 2017 annual meeting, where over 30,000 neuroscientists gather every year to discuss the most fascinating and cutting-edge research.

Unfortunately, that update will have to wait another week, because today I feel compelled to use my platform to talk about the current tax bill making its way through Congress. This bill, if passed, would effectively make graduate school impossible for all but the independently wealthy, and would decimate the structure of science as we know it.

I typically keep this column apolitical, as my goal is to spread interesting neuroscience knowledge to everyone rather than wading into the political thicket. Had this bill been proposed by the other side of the aisle, I would take equal issue with it. This overarching legislation aims, in part, to simplify taxes to, as its proponents so often state, ‘the back of a postcard.’

One such ‘simplification’ is the repeal of Section 117(d)(5), a tiny piece of the tax code that makes a huge difference to graduate students. In most STEM graduate programs, students have their tuition waived and are awarded a modest stipend of approximately $20,000–$30,000 per year to focus on their research. Under the current tax code, graduate students are taxed only on their stipends, which makes sense, as this is the only money they actually take home.

In the tax bill just approved by the House, this exemption is removed. That means a catastrophic increase in tax burdens for all STEM graduate students. Let’s take an average graduate student in Columbia’s Neurobiology PhD program. Their take-home income is just under $30,000 from their stipend, but Columbia’s tuition (which, again, a graduate student never sees or pays) is nearly $50,000. If the Senate passes the current version of this bill, graduate students will see a tripling of their tax burden, an increase of over $10,000.
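
For the curious, here is a rough back-of-the-envelope sketch of that arithmetic in Python. The bracket thresholds and deductions below approximate 2017 single-filer rules, and the stipend and tuition figures are the round numbers above; this ignores FICA, credits, and everything else a real return involves, so treat it as an illustration of the mechanism, not a tax calculation.

```python
# Back-of-the-envelope comparison: tax on stipend alone (current law)
# vs. tax if a tuition waiver counts as income (proposed repeal).
# Brackets/deduction roughly approximate 2017 single-filer rules.

BRACKETS = [  # (upper bound of bracket, marginal rate)
    (9_325, 0.10),
    (37_950, 0.15),
    (91_900, 0.25),
    (float("inf"), 0.28),
]
STANDARD_DEDUCTION = 6_350   # approx. 2017, single filer
PERSONAL_EXEMPTION = 4_050   # approx. 2017

def income_tax(gross):
    """Progressive tax on income after deduction and exemption."""
    taxable = max(0, gross - STANDARD_DEDUCTION - PERSONAL_EXEMPTION)
    tax, lower = 0.0, 0
    for upper, rate in BRACKETS:
        tax += rate * max(0, min(taxable, upper) - lower)
        lower = upper
    return tax

stipend, tuition_waiver = 30_000, 50_000

current = income_tax(stipend)                    # taxed on stipend only
proposed = income_tax(stipend + tuition_waiver)  # waiver counted as income

print(f"current law:  ${current:,.0f}")
print(f"under repeal: ${proposed:,.0f}  (+${proposed - current:,.0f})")
```

Even this crude sketch lands the increase above $10,000, simply because counting a $50,000 tuition waiver as income pushes a student into a higher bracket for money they never see.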

Essentially, by trying to simplify the tax code, this bill would prevent all but the wealthiest graduate students from pursuing higher education. While some universities may be able to increase stipends to compensate, most cannot afford to. Graduate students are the backbone of labs, and their projects make up the bulk of research happening in the US; without them, there is no science as we know it.

Without this tiny line of tax code, programs will slash acceptances, US scientific productivity will plummet, and the hundreds of innovations that have made us a superpower will grind to a halt. Like all of STEM, neuroscience relies on the productive output of graduate students. While we are on the cusp of incredible breakthroughs in understanding the brain — many of which could lead to cures for heartbreaking diseases — none of that is possible if this tax bill passes in its current form.

This is bigger than politics, and this is bigger than just science. This is about ensuring that the United States continues to be the world’s leader in innovative scientific and technological breakthroughs. If you enjoy the tiny computer in your pocket, have yourself been helped by modern medicine (or know someone who has), or believe in the necessity of scientific progress, please take the time to speak out against this bill and ensure that, if it progresses, it does so without this provision. You can find your representative’s information here; ask them to oppose the repeal of Section 117(d)(5) within the Tax Cuts and Jobs Act.

Next week, I promise we’ll be back to our regularly scheduled programming with some fun, new neuroscience findings.