Uniquely Human

A Column by Heather Macomber

About The Column


This column will use recent discoveries in psychology, neuroscience, and sociology to tackle modern issues, unravel common misconceptions, and search for a scientific solution to uniquely human problems.

About The Author


Heather Macomber

Columnist

Heather is a senior in Columbia College, majoring in Neuroscience and Behavior. In addition to writing for the Lion, she is the Vice President Emeritus and Research Coordinator of the Columbia Neuroscience Society, researches perception using mouse models in the Bruno Lab, edits for the Columbia Science Review, and serves as a peer advisor for the Neuroscience major. She is passionate about making neuroscience accessible and mentoring the next generation of neuroscientists, and she hopes to spend her life unraveling the mysteries of the brain.

Posts


Illustration made by Laura Elizabeth Hand, CC’19

Why are we all so unsatisfied? It’s both an existential and a practical question, one facing down administrators at colleges across the country. As diagnoses of mental disorders have skyrocketed and a palpable aura of discontent has begun to seep into millennial spaces, especially the college campus, most experts are left wringing their hands without an explanation. While hundreds of think pieces have been written about the existential dread of the modern world, very few have asked whether our brains themselves may be incompatible with the society we’ve made.

To understand where the disconnect between our brains and our society comes from, it’s worth focusing on biology. Humans have evolved our complex brains over millennia to do one thing better than any other species: reduce uncertainty. We do this by predicting the future based on our past experiences, and then adjusting those models when they’re wrong. We call this process learning.

The neurons in our cortex and hippocampus, two areas essential for learning and prediction, are especially wired for these tasks. At their synapses, these neurons carry two kinds of channels that bind glutamate, the primary excitatory neurotransmitter in the brain. The simpler channel opens whenever glutamate is around, producing quick but fleeting pulses of activity. The more complex one opens only when glutamate arrives while the neuron is already strongly active, and when it does, it triggers a host of structural changes that make the neuron more responsive in the future.

This process is called long-term potentiation (LTP), and it is the molecular basis of learning from sea slugs all the way up the food chain to Homo sapiens. But one innovation mammals added is dopamine. For us, whenever something turns out better than predicted, our brains send those neurons a pulse of dopamine. This cements the molecular changes of LTP on a scale of weeks to months, and makes sure the association is learned.
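
One common way computational neuroscientists formalize this kind of dopamine-gated learning is a ‘three-factor’ rule: a synapse is strengthened only when presynaptic and postsynaptic activity coincide and a dopamine signal arrives to lock the change in. The Python sketch below is a toy illustration of that idea with invented numbers, not a model taken from this essay or from Sterling’s work.

```python
import numpy as np

# Toy three-factor plasticity rule: coincident pre- and postsynaptic activity
# marks the synapse for change (early LTP), and a dopamine pulse converts that
# tag into a lasting increase (late LTP). All values are illustrative.

rng = np.random.default_rng(0)
learning_rate = 0.1
weight = 0.5  # synaptic strength, arbitrary units

for trial in range(20):
    pre = rng.integers(0, 2)        # did the presynaptic neuron fire?
    post = rng.integers(0, 2)       # did the postsynaptic neuron fire?
    dopamine = rng.random() < 0.3   # did something better than expected happen?

    if pre and post and dopamine:
        weight += learning_rate * (1.0 - weight)  # strengthen, saturating at 1

print(f"synaptic weight after 20 trials: {weight:.2f}")
```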

This process was ideal for the hundreds of thousands of years humans spent as hunter-gatherers. We lived in a much more uncertain world, where many of our predictions were wrong and small unexpected pleasures (such as finding berries where there were none previously) abounded. Our brains frequently received small pulses of episodic happiness through dopamine. Learning from reward prediction errors to motivate similar behavior in the future works only when those prediction errors are common.
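
To see why the frequency of prediction errors matters, here is a minimal simulation in the spirit of the Rescorla-Wagner learning rule. The reward probabilities, learning rate, and ‘surprise’ threshold are invented for illustration; the point is simply that an uncertain world keeps generating large positive prediction errors, while a fully predictable one stops generating them almost immediately.

```python
import numpy as np

def positive_surprises(reward_prob, n_days=1000, alpha=0.1, seed=1):
    """Count the days on which reward exceeds expectation by a wide margin."""
    rng = np.random.default_rng(seed)
    expected = 0.0
    count = 0
    for _ in range(n_days):
        reward = 1.0 if rng.random() < reward_prob else 0.0
        rpe = reward - expected          # reward prediction error
        if rpe > 0.5:                    # a big, pleasant surprise
            count += 1
        expected += alpha * rpe          # learn from the error
    return count

print("uncertain world (20% chance of finding berries):",
      positive_surprises(reward_prob=0.2))
print("predictable world (reward guaranteed every day):",
      positive_surprises(reward_prob=1.0))
```

Run it and the uncertain world produces a couple hundred big positive surprises over the simulated days, while the predictable one produces only a handful before the prediction catches up and the surprises disappear.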

While the modern world may still feel uncertain in an existential sense, in the most basic material ways it is far more predictable than the one our ancestors lived in. That lack of constant, small pulses of dopamine may be the root cause of many modern issues. When most of our material comforts are taken care of by technology, we turn elsewhere to find those hits of dopamine our brains are wired to crave. Whether through an alert notification on our phones, a pint of ice cream, or a forced dopamine rush from alcohol, opioids, cannabinoids, or other substances, we increasingly engineer artificial means of dopamine release — sometimes to addictive and destructive ends.

So what can be done to alleviate the issue? Instead of turning to massive and artificial methods of dopamine generation, re-introducing small and, more importantly, surprising pleasures into your life can give a brain evolved to learn through unpredictability the reward prediction errors it needs. Eat a new food, explore a new place downtown without an agenda, have a conversation with someone unexpected. Our intelligence is how we’ve made it this far — maybe we can think our way out of this one.

 

Full credit for this conceptualization goes to Peter Sterling. For a more detailed elaboration on this idea, I wholeheartedly recommend reading his essay “On Human Design” or his book Principles of Neural Design, specifically Chapter 14.

Uniquely Human is written by Heather Macomber and runs every other Monday. To submit a comment/question or a piece of your own, email submissions@columbialion.com.

Illustration by Laura Elizabeth Hand, CC’19

 

I’ve spent a lot of time in this column so far talking about studies carried out in humans, usually using techniques like fMRI, EEG, or PET scans. However, a lot of neuroscience research, my own included, happens in what we call ‘model organisms’, one of the most common being the humble mouse. In conversations about my research, I’ve frequently gotten a variant of this question: “Why are you working on mouse brains if you want to understand how humans work?”

Since I’ll be covering research done in many non-human species this semester, I wanted to take a column to talk about why I believe it is necessary to use animals in neuroscience research, and what they can tell us about the brain that human studies cannot.

Basically, it comes down to two things: in mice you can investigate the brain more directly at a much smaller scale, and you have much more causal control over the conditions of your experiments. First, let’s talk about the matter of scale.

Functional magnetic resonance imaging, or fMRI, was a massive breakthrough in human neuroscience. To this day, it offers the highest spatial resolution available for monitoring real-time neural activity in living humans, aside from the rare intracranial electrodes implanted in neurosurgery patients. In humans, fMRI is as far as you can ‘zoom in’ on the behaving brain.

However, like any technique, fMRI has downsides. While most popular science articles describe fMRI results as ‘neural activity,’ fMRI actually measures the oxygenation of the blood flowing through your brain, which serves as a proxy for neural activity. In other words, the assumption is that the more oxygenated blood a brain region is drawing, the more its neurons are firing.

The other huge issue with fMRI is scale. An fMRI scan is like a 3D video, and just as a movie has pixels, fMRI has a smallest possible unit of detection: the voxel. Its name combines the words ‘volume’ and ‘pixel,’ and it is essentially a pixel in three dimensions. At the highest resolutions currently possible, a single voxel averages the oxygenation of approximately 100,000 neurons over one second, which means the activity of 100,000 cells is reduced to a single uniform greyish box on the display.

While 100,000 neurons is a tiny fraction of the brain’s roughly 80 billion, an fMRI scan still can’t tell you which specific kinds of neurons are active, or anything about the pattern of activity below the scale of a voxel. So how do we understand neural circuits at a more detailed level?
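
For a sense of the numbers, here is a quick back-of-envelope calculation using the rough figures quoted above (about 100,000 neurons averaged per voxel and roughly 80 billion neurons in the brain). The exact values vary by scanner, scan parameters, and brain region, so treat the output as an order-of-magnitude estimate only.

```python
# Back-of-envelope arithmetic for the scale problem described above.
neurons_per_voxel = 100_000
neurons_in_brain = 80_000_000_000

fraction_per_voxel = neurons_per_voxel / neurons_in_brain
voxels_to_cover_brain = neurons_in_brain / neurons_per_voxel

print(f"one voxel lumps together ~{neurons_per_voxel:,} neurons")
print(f"that is ~{fraction_per_voxel:.7%} of the brain per voxel")
print(f"and ~{voxels_to_cover_brain:,.0f} voxels to tile the whole brain")
```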

That’s where mice come in. Mouse brains have most of the major features of human brains – they even have a neocortex that is structured almost identically to our own. In mice, it is much easier to observe these smaller scales, which range from single neurons up to the simultaneous observation of thousands of neurons at a time.

Mice are particularly well-suited to this task because of the immense control an experimenter can have over a given experiment. Every aspect of a lab mouse’s life is regulated from birth to death, a degree of control that is impossible in human studies.

Beyond behavioral control, genetic techniques enable causal manipulations at the cellular level. Thousands of mouse strains have been specially made to manipulate the expression of particular genes, optogenetic techniques let researchers turn specific neuronal populations on or off during behavior, and two-photon imaging paired with calcium indicators lets us observe the activity of individual neurons in real time.
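
To give a flavor of what that imaging data looks like in practice, below is a minimal sketch of the dF/F calculation commonly applied to a single neuron’s fluorescence trace from two-photon calcium imaging. The trace here is synthetic and the baseline estimate is deliberately simple; real pipelines also involve motion correction, cell segmentation, and often spike inference.

```python
import numpy as np

# Convert a raw fluorescence trace into dF/F: the change in fluorescence
# relative to a baseline estimate, a standard summary of calcium activity.

rng = np.random.default_rng(2)
baseline = 100.0
trace = baseline + rng.normal(0, 2, size=600)          # 600 frames of noise
trace[200:230] += 60 * np.exp(-np.arange(30) / 10.0)   # one calcium transient

f0 = np.percentile(trace, 20)   # a simple, commonly used baseline estimate
dff = (trace - f0) / f0         # dF/F for every frame

print(f"estimated baseline F0: {f0:.1f}")
print(f"peak dF/F: {dff.max():.2f} at frame {dff.argmax()}")
```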

These advantages of experimental control and fine-scale observation are only possible in animal models. While mice have their disadvantages too, most notably that without language their behavioral motivations are difficult to interpret, their use clearly contributes to neuroscience overall. Discoveries in mouse models help guide human researchers to better theories, better treatments, and ultimately, a better understanding of ourselves.

 

Uniquely Human is written by Heather Macomber and runs every other Monday. To submit a comment/question or a piece of your own, email submissions@columbialion.com.

Photo Courtesy of Columbia University.

Welcome back Columbia! From all of us at the Columbia Lion and from here at Uniquely Human, we hope all your summers were fruitful and relaxing. As we get back into the swing of classes, I wanted to write an update on the future of this column and what to expect going forward. First and foremost, Uniquely Human will be continuing its regular release schedule of every other Monday, starting today, so expect a new release two weeks from today.

As a neuroscience student here, I hear about all the impressive, exciting, and paradigm-shifting research coming out of Columbia labs. But no matter how interesting the research, most Columbia students don’t know what’s coming out of their own institution. Scientists often share their work only in journals aimed at other scientists in their subfields, so the most interesting conversations about neuroscience end up being the ones neuroscientists have with each other.

I want to change that.

This is your university, and this research is mostly paid for with your tax dollars. I think you have a right to understand what discoveries in neuroscience are coming out of Columbia, and how they may affect your lives in fascinating and surprising ways.

I believe the best kind of science happens when it’s in communication with the public. In these tumultuous times, now more than ever it’s critical that everyone knows what valuable contributions neuroscientists are making to how we understand ourselves. I think these kinds of conversations are most interesting when they’re had across disciplinary lines – with other scientists, with writers, philosophers, artists – and you, reader, have a worthy perspective to contribute.

So this semester, we’ll be shifting our thematic focus away from our series on education and the brain. In its place, I’ll be reviewing the latest and greatest discoveries coming out of Columbia neuroscience in straightforward language, with (hopefully) humorous analogies and an eye for the big-picture implications. When possible, I’ll interview researchers directly, to bring the best information straight from them to you.

Of course, the contents of this column depend mostly on what I want to write, which means not every column will be about Columbia neuroscience discoveries; there will also be stories relating neuroscience to campus and worldwide events.

As always, I am happy to take requests. This is only a column in conversation when I can hear your voice. If you have questions you want answered from a neuroscientific point of view, I’ll do my best to answer them. I can’t wait to share this amazing research with you all, and I hope to see you here in two weeks for the first true installment of our new series.

Uniquely Human is written by Heather Macomber and runs every other Monday. To submit a comment/question or a piece of your own, email submissions@columbialion.com.

Photo Courtesy of the Zuckerman Institute

The neuroscience major is not unlike many at Columbia in that it is co-sponsored by two departments, psychology and biology. In fact, a little over half of Columbia majors share this structure of co-sponsorship. Ideally, each department communicates with its counterpart to design a robust, cohesive course schedule that draws from the expertise of the individuals in both disciplines.

However, a majority of courses in the neuroscience major retain their specifically psychological or biological identities without fully integrating the other discipline, thereby falling short of a true neuroscience curriculum. I want to emphasize that I do not believe individual professors are at fault, and I have truly enjoyed my time as a neuroscience major. Still, I believe that better interdepartmental communication could dramatically improve the experience.

Despite efforts to heighten cross-disciplinary conversations, departments at Columbia largely remain insulated from one another. As each professor teaches their course, they are largely unaware of what material the students entering their class are already familiar with. To take a closer look, let us examine the path of a typical student in the Neuroscience and Behavior (N&B) major at Columbia.

A first-year interested in the major will likely fulfill the introductory chemistry requirement and take Science of Psychology. Neither course is neuroscience-specific, and both are lecture-style. While perhaps not ideal, such sizeable, nonspecific courses are typical for first-year students.

By sophomore year, the repetition becomes more readily apparent. On the psychology side of the major, N&B students can take either Mind, Brain, Behavior (MBB) or Behavioral Neuroscience. While both courses technically fulfill the ‘intro’ neuroscience requirement on the psychology side, they are very different.

MBB is the less science-heavy of the two and is commonly taken by non-science majors to fulfill the science requirement of the Core. The syllabus varies by professor, but in any given year the content of these two classes overlaps by almost 70%, with a good amount of overlap with Science of Psychology as well. Some refresher material is a good thing, and it helps students absorb new material. But for three courses in the same department, two of which are required for N&B majors, this much re-teaching is unnecessary when the time could instead be spent learning new information.

Material overlap is a significant concern on both sides of the major. In the rigorous two semesters of Professor Mowshowitz’s Introductory Biology course, at least a month is dedicated to neural mechanisms. Meanwhile, on the psychology side, the first few weeks of the lecture courses an N&B student may choose to take are spent covering the same introductory material these students have now encountered at least three times.

Continuing along the biology side, N&B students wade through prerequisites only to enter their first true neuroscience classes, the year-long Neurobiology I&II sequence. Even between these sequential courses a good deal of overlap remains, with systems-level information taught in the cellular-level fall course and cellular mechanisms covered again in the systems-level spring course.

The remaining requirements for the major include a statistics course and an additional biology course, neither of which is neuroscience-specific — a neuro-themed variant of the latter has not been taught since Fall 2013.

Overall, a common experience among N&B majors is a feeling of disjointed repetition and a lack of neuroscience-specific courses catered to their needs and interests. I do not believe this feeling is limited to frustrated neuroscience students — I have heard the same complaint expressed again and again by friends in various joint majors throughout Columbia College.

So what can be done to fix the glaring issues in the design of the N&B major? Ideally, the whole major would be restructured from the ground up into a fully integrated design. Realistically, the bureaucratic effort such an overhaul would require makes it untenable. Instead, I have a few simple proposals to streamline and vastly improve the experience of N&B majors at Columbia.

The greatest concern, of course, is the overlap of course material. Luckily, each professor has a fair amount of leeway over their syllabus. Because of this freedom, I suggest that the professors responsible for N&B courses on both the biology and psychology sides of the major set aside one full working day at the beginning of each semester to review their syllabi with an eye for overlap.

I believe much of the reason students are taught the action potential seven times is well-intentioned: each professor is unsure whether the students have covered the material before. A meeting each semester would eliminate that interdepartmental uncertainty and go a long way towards eliminating unnecessary repetition in courses.

Additionally, I propose expanding and integrating elective courses between the two departments, allowing more cellularly inclined neuroscience majors to focus on neurobiology courses and more psychology-minded majors to spend additional time on the psychology side. Allowing these electives outside the ‘core’ major courses to be taken in either department would let a wide range of students unify within a single major.

From a scheduling perspective, neuroscience courses above the introductory level must be offered on a regular basis. Here, the psychology department far outstrips biology, offering a wide range of rotating seminars. Though these offerings still skew towards psychology, at least some neuroscience-heavy courses are available from the psychology department each semester.

Overall, I recommend that Science of Psychology no longer be mandated for N&B majors, and that it instead be replaced by a comprehensive introductory Behavioral Neuroscience course tailored to them. With this change, Mind Brain Behavior could target a non-major audience more specifically and more accessibly, and Behavioral Neuroscience could serve as the sole prerequisite for Neurobiology I&II. Majors could then take that sequence in their sophomore or junior year, leaving space for more seminar-style neuroscience electives taught by professors in their areas of interest during the upperclass years.

With a graduating class of 65 majors last year, N&B is the eighth largest program within Columbia College, and has rapidly grown over the last few years. With the opening of the Zuckerman Mind Brain Behavior Institute, Columbia will only continue to attract the best and brightest neuroscience undergraduates. I believe that professors and administrators want to provide the best education possible to the student body — and that many of the problems within the N&B major can be solved by increased communication between the biology and psychology departments and some simple restructuring.

Image courtesy of freshNYC

How would you describe yourself?

Most people can immediately come up with at least a few adjectives to summarize their personalities, and when asked how well they know themselves on a scale of 1 to 10, their answers are overwhelmingly above 8. When asked whether their ‘core’ personalities have remained consistent over time, the majority agree that while they have indeed changed, certain fundamental aspects of themselves remain the same.

People make important decisions based on the idea that personality remains continuous even as individuals grow. You believe that the person you choose to marry has essential qualities which will remain good, that criminals have essential qualities which will remain bad, and that the people in your life have dependable qualities. When explaining the incredible successes or failures of CEOs, celebrities, or pro athletes, most of us tend to credit or blame their personalities.

While this convincing story pervades our culture, modern research indicates that the idea of a consistent individual personality is a myth. A few months ago, results from the longest-running study on personality were published. The study began in 1947, when teachers were asked to rate their fourteen-year-old students on six personality traits. Sixty-three years later, researchers tracked down as many of the original participants as they could and analyzed their personalities again.

Upon analysis, none of the six traits showed significant stability across that time-span. While ideas about personality and experimental methods have changed drastically in the intervening decades, more modern neuroscientific work, including fMRI studies of the changing brain, backs up what these long-running surveys suggest.

While many personality ‘tests’ exist on the internet, almost none of them hold any neuropsychological weight. This includes the famed Myers-Briggs model, which sorts individuals into sixteen distinct personality types according to four trait dichotomies, each with a corresponding letter. If you have ever had someone tell you they are an ENFP or an INTJ, that’s the model they’re referring to. Though certainly entertaining, such tests have long been discredited as too simplistic, binning people into binary categories.

Although many scientists disagree on the details, the generally accepted model of personality these days is the Big Five, which scores individuals from 1 to 100 on five distinct traits: Openness, Conscientiousness, Extraversion, Agreeableness, and Neuroticism. If you are interested, this is the best way to take it online.

Recently, neuroscientists have begun to examine how scores on the Big Five traits might map onto brain structure. Using structural MRI, one team examined how the volume of particular brain regions varies with trait scores, finding that extroverts had a larger medial orbitofrontal cortex, a brain region that processes reward. This area is heavily implicated in the response to social reward, so it is possible that extroverts enjoy social interactions because those interactions supply them with a ‘hit’ of dopamine.
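
To make that kind of analysis concrete, here is a small sketch that correlates a Big Five trait score with the volume of one brain region across participants. The data below are randomly generated with a built-in association purely for illustration; they are not drawn from the study discussed in this column.

```python
import numpy as np

# Toy version of a structural personality-neuroscience analysis: does a trait
# score predict the size of a brain region across people?

rng = np.random.default_rng(3)
n = 120
extraversion = rng.uniform(1, 100, size=n)    # Big Five trait score, 1-100
region_volume = 7.0 + 0.01 * extraversion + rng.normal(0, 0.5, size=n)  # cm^3, invented

r = np.corrcoef(extraversion, region_volume)[0, 1]
print(f"correlation between extraversion and region volume: r = {r:.2f}")
```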

Higher scores on neuroticism correlated with larger brain regions associated with threat, punishment, and negative emotion, such as the cingulate cortex. It is possible that neurotic people feel the potential threat of a negative event more powerfully than those with smaller cingulate cortices, and are therefore more concerned about potentially troubling events.

Agreeableness correlated with a larger lateral prefrontal cortex, a region that loosely corresponds with planning and higher-order processing. Though the team did not find a significant association for Openness, they did find some possible correlations with parts of the parietal cortex involved in integrating sensory stimuli.

While this study did not use functional MRI to show which regions activate when people actually exhibit the behaviors associated with these traits, there does appear to be some association between the sizes of these brain regions and an individual’s personality.

If psychology research tells us personality changes drastically over time, and neuroscience research indicates that our brains reflect our personalities, what mechanisms in our brains underlie these changes?

Some potential clues lie in memory research. A large body of evidence tells us that each time a memory is ‘accessed’, it is altered, sometimes dramatically, before going back into storage. As experiences pile up in our lifetimes, the memories we make are incorporated into the ways we face new information, and change the ways we make decisions.

The other massive factor in our decision-making comes from our surroundings — specifically, our social surroundings. The cultural norms that permeate a place can strongly influence how a personality changes over time, as new experiences reshape our neural wiring. With that in mind, it’s hard to think of a more distinctive social environment in the U.S. than our home, New York City itself.

When asked what made a person a New Yorker, former mayor Edward Koch put it most succinctly: “you have to live here for six months, and if at the end of the six months you find you walk faster, talk faster, think faster, you’re a New Yorker.” I have certainly found that a few years here have changed me in more ways than the knowledge gained in the classroom — parts of my personality seem fundamentally altered by my time at Columbia and by adapting to the unique social norms such a city carries.

In a place as hectic, stressful, and sometimes isolating as New York City, the environment likely shapes us all without our noticing. Combined with a student population of high-achieving and hard-working Columbians, it’s possible our particularly potent stress culture draws heavily on the city itself for fuel. While we often talk about the culture shock of NYC during orientation week, we rarely take the time to analyze how exactly our city might be changing us.

Maybe the pressures of Columbia’s subculture, paired with the tough-it-out mentality of the city, make us feel busier and more focused, and therefore prime us to think faster and act smarter. Maybe some of these changes are positive; learning how to ‘tough it out’ certainly has its benefits. But I’m more worried about the negatives, about how a city so known for indifference may be affecting our compassion and human integrity.

Luckily, any negative characteristics our brains may be picking up from the city aren’t permanent. The same neuroplasticity that hardened us can prioritize compassion again, if we make a conscious effort to treat others as just as important as our busy schedules. We have the ability to change our own culture at Columbia and let only the positive aspects of the city in.