Uniquely Human

A Column by Heather Macomber

About The Column


This column will use recent discoveries in psychology, neuroscience, and sociology to tackle modern issues, unravel common misconceptions, and search for a scientific solution to uniquely human problems.

About The Author


Heather Macomber

Columnist

Heather is a senior in Columbia College, majoring in Neuroscience and Behavior. In addition to writing for the Lion, she is the Vice President Emeritus and Research Coordinator of the Columbia Neuroscience Society, researches perception using mouse models in the Bruno Lab, edits for the Columbia Science Review, and serves as a peer advisor for the Neuroscience major. She is passionate about making neuroscience accessible and mentoring the next generation of neuroscientists, and she hopes to spend her life unraveling the mysteries of the brain.

Posts


While Columbia courses are advertised as mostly intimate and discussion-based, walking into your second (or even third, or fourth) lecture of the day is disturbingly common. Some courses, such as Introductory Biology, consistently reach over 200 students per section. Personally, 54% of my courses (by credit value) over my first two years have been large lectures.

In the engineering school, the percentage of time spent in Havemeyer 309 or Pupin 301 only increases: a close friend with a typical Biomedical Engineering courseload spent a whopping 81% of her initial coursework stuck in a lecture hall. While humanities students may admittedly see fewer lectures, a significant number of Columbia STEM students spend the majority of their time in lecture courses for their first few years here.

The central role that lectures play in today’s system of higher education cannot be overstated. Ever since parchment was precious and reading a skill reserved for the exclusive elite, any hope of educating the populace relied on the lecture for information transfer. The core format of the lecture would be recognizable to a medieval instructor, while the dramatically changed world outside would be entirely unrecognizable.

The reality of the 21st century makes information not only overwhelmingly available in written form, but also in new, innovative, and interactive formats. As creative ways of learning proliferate at an exponential pace, it is well past time for this Ivy League, world-renowned institution of higher education to seriously confront the ineffectiveness of its most overused workhorse.

Columbia owes it to both its students and itself as a leader to take into account the increasing consensus in neuroeducation research that there is a better way to teach than through lectures. When considering the best way to teach students, we should be thinking about how people actually learn, especially when implementing neuroscience-based changes would hardly cost more and would simultaneously increase both professor and student satisfaction with our Columbia-brand education.

Over forty years of scientific research shows that students can hardly pay attention to a lecture past its first twenty minutes (Columbia teaches in 75-minute blocks), that interactive learning is over twice as effective as passive listening, and that our Nobel laureates could be put to better use actually interacting with the students they teach instead of being kept at arm’s length. Lectures are simply incompatible with the way we’re wired to understand our world.

There is a better use for Columbia’s highly-esteemed professors than wasting time repeating the same information semester after semester to half-empty classrooms of bored and distracted students. There are better uses for its bright and energetic graduate students than re-explaining the material to confused undergraduates.

It’s ironic that some of the best research on learning, the very research that shows how ineffective lectures are, is coming from the labs of Columbia professors who have to turn around and continue to teach in this outdated style. Our diplomas cannot be valuable only on the merits of Columbia’s history; there must be true learning behind our degrees.

Isn’t that why we came here, to learn from the best and brightest, to learn for the rest of our lives and not just for the next exam? In the next few columns, we will be exploring how recent research on attention, learning, memory encoding, and recall can redesign the Columbia classroom. Columbia has always been at the forefront of societal change; it only makes sense that we should be leading the revolution in higher education as well.

Uniquely Human runs alternate Mondays. To submit a comment or a piece of your own, email submissions@columbialion.com.

I am far from the first person to wonder what the answer might be. The idea that we are special certainly isn’t new; the Core Curriculum gives us plenty of arguments to back it up. From Aristotle to Kant to, yes, even Darwin, our greatest thinkers have always believed that human cognition is unparalleled in the universe. So here we stand, at the top of the food chain, looking down at the rest of the animal kingdom and wondering: are we actually unique, or just egocentric?

Neuroscience might finally give us that answer we crave. For a field younger than some of our parents, it has managed to begin the daunting process of untangling the web of neurons in our brains, while giving us Buzzfeed-worthy headlines along the way. Neuroscience has a way of getting up in every other field’s business, with a reach that’s far exceeded standard academic discourse – perhaps that’s why I’m so hopelessly fascinated by it.

Like an angsty teenager, this young field has a tendency to argue and frequently change its mind. Unlike with a teenager, though, when new research dethrones one theory and crowns another, the public often loses faith in our credibility. After all, we once believed the heart was the seat of all intelligence, and the brain nothing more than a simple regulator. How can we be expected to really know anything at all?

As scientists, we learn to accept this inherent instability, the sobering truth that we’re wrong far more frequently than we’re right. But it’s also on us to explain why our work, even in its failures, is important. Perhaps more importantly, we do not work in a vacuum, and this field is poised to understand how we think and how we live. For all of humanity’s success in conquering the world, our species now stares down threats primarily of our own making. What better way to approach these issues than through understanding who we are and what makes us tick? After all, most of our salaries come from you, the taxpayer; ultimately it is up to the public to see the value in what we study.

This column is an attempt to use powerful discoveries about our brains to propose science-backed solutions to wider social issues. Neuroscience is an ever-evolving, consistently contradictory, frequently flawed, and ultimately, beautifully human pursuit of the kind of knowledge we like best: knowledge about ourselves. While it’s not perfect, it’s hard to deny that studying our brains might provide some valuable insight into our uniquely human problems.

Uniquely Human runs alternate Mondays. Questions, comments, concerns, and thoughtful dialogue are always welcome.
