
Illustration by Laura Elizabeth Hand, CC’19

 

I’ve spent a lot of time in this column so far talking about studies carried out in humans, usually using techniques like fMRI, EEG, or PET scans. However, a lot of neuroscience research, my own included, happens in what we call ‘model organisms’, one of the most common being the humble mouse. In conversations about my research, I’ve frequently gotten a variant of this question: “Why are you working on mouse brains if you want to understand how humans work?”

Since I’ll be covering research done in lots of non-human species this semester, I wanted to take a column to talk about why I believe it is necessary to use animals in neuroscience research, and what they can tell us about the brain that human studies cannot.

Basically, it comes down to two things: in mice you can investigate the brain more directly at a much smaller scale, and you have much more causal control over the conditions of your experiments. First, let’s talk about the matter of scale.

In humans, functional magnetic resonance imaging, or fMRI, was a massive breakthrough in neuroscience. To this day, it offers the highest spatial resolution available for monitoring real-time neural activity in living humans, apart from the rare cases in which neurosurgery patients consent to implanted electrodes. In humans, fMRI is as far as you can ‘zoom in’ on the behaving brain.

However, as with any technique, there are downsides to fMRI. While most popular science articles call fMRI results ‘neural activity,’ fMRI actually measures the oxygenation of the blood flowing through your brain, which serves as a proxy for neural activity. In other words, the assumption is that the more oxygenated blood a brain region draws on, the more neurons are firing in that region.

The other huge issue with fMRI is scale. An fMRI scan is like a 3D video, and just as a movie has pixels, fMRI has a smallest possible unit of detection – the voxel. Its name is a blend of the words ‘volume’ and ‘pixel,’ and it essentially is a pixel, just in three dimensions. At the highest resolution currently possible, a single voxel averages the oxygenation of approximately 100,000 neurons over one second, which means the activity of 100,000 cells is reduced to a single uniform greyish box on the display.
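To make this concrete, here is a toy sketch in Python of what it means for one voxel to collapse roughly 100,000 cells into a single number per second. The activity values are entirely made up; a real voxel signal reflects blood oxygenation, not a direct readout of individual neurons.

```python
import numpy as np

# Toy illustration only: one fMRI voxel summarizes the activity of
# ~100,000 neurons as a single averaged value per time point.
rng = np.random.default_rng(0)

neurons_per_voxel = 100_000
seconds = 5

# Pretend we could record each neuron's activity level every second.
activity = rng.random((seconds, neurons_per_voxel))

# The scanner cannot resolve individual cells; it effectively reports
# one average per voxel per time point -- the "uniform greyish box."
voxel_signal = activity.mean(axis=1)
print(voxel_signal)  # five numbers, one per second

# For scale: one voxel's neurons as a fraction of ~80 billion total.
print(neurons_per_voxel / 80e9)  # 1.25e-06, about 0.000125%
```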

While 100,000 is a tiny fraction of the brain’s ~80 billion neurons, an fMRI scan still can’t tell you which specific kinds of neurons are activating, or anything about the pattern of activity below the voxel scale. So how do we understand neural circuits at a more detailed level?

That’s where mice come in. Mouse brains have most of the major features of human brains – they even have a neocortex that is structured almost identically to our own. In mice, it is much easier to observe these smaller scales, from single neurons up to the simultaneous observation of thousands of neurons at a time.

Mice are particularly well-suited to this task because of the immense control an experimenter can have over a given experiment. Every aspect of a lab mouse’s life is regulated from birth to death, which is impossible to control for in human studies.

Beyond behavioral control, genetic techniques enable causal manipulations at a cellular level. Thousands of mouse strains have been specially made to manipulate the expression of particular genes, optogenetic techniques enable researchers to turn on or off specific neuronal populations during behavior, and two-photon imaging paired with calcium labeling lets us observe the activity of individual neurons in real time.

These advantages of experimental control and fine-scale observation are only possible in animal models. While mice have their disadvantages too – namely that, without language, behavioral motivations become difficult to interpret – their use clearly advances neuroscience overall. Discoveries in mouse models help guide human researchers to better theories, better treatments, and ultimately, a better understanding of ourselves.

 

Uniquely Human is written by Heather Macomber and runs every other Monday. To submit a comment/question or a piece of your own, email submissions@columbialion.com.

Photo Courtesy of the Zuckerman Institute

The neuroscience major is not unlike many at Columbia in that it is co-sponsored by two departments, psychology and biology. In fact, a little over half of Columbia majors share this structure of co-sponsorship. Ideally, each department communicates with its counterpart to design a robust, cohesive course schedule that draws from the expertise of the individuals in both disciplines.

However, a majority of courses in the neuroscience major retain their specific psychological or biological identities without fully integrating the other discipline, thereby falling short of a true neuroscience curriculum. I would like to emphasize that I do not believe individual professors are at fault, and I have truly enjoyed my time as a neuroscience major. Still, I believe that heightened interdepartmental communication could improve the experience dramatically.

Despite efforts to heighten cross-disciplinary conversations, departments at Columbia largely remain insulated from one another. Each professor teaching a course is largely unaware of what material students are already familiar with when entering their classes. To take a closer look, let us examine the course of a typical student in the Neuroscience and Behavior (N&B) major at Columbia.

For a first-year interested in the major, a potential N&B student will likely fulfill their introductory chemistry requirement and take Science of Psychology in their first year. Neither course is neuroscience-specific, and both are lecture-style. While perhaps not ideal, such sizeable and nonspecific courses are typical for first-year students.

By sophomore year, the repetition becomes more readily apparent. On the psychology side of the major, N&B students can take either Mind, Brain, Behavior (MBB) or Behavioral Neuroscience. While both courses technically fulfill the ‘intro’ neuroscience requirement from the psychology side, they are very different.

MBB is the less science-heavy of the two and is commonly taken by non-science majors to fulfill their science requirement for the Core. The syllabus varies by professor, but in any given year the content of these two classes overlaps by almost 70%, with a good amount of additional overlap with Science of Psychology. Some refresher material is a good thing and helps students absorb new material. However, for three courses in the same department, two of which are required for N&B majors, this much re-teaching is unnecessary when that time could instead be spent learning new information.

Material overlap continues to be a significant concern on both sides of the major. In the rigorous two semesters of Professor Mowshowitz’s Introductory Biology course, at least a month is dedicated to neural mechanisms. Meanwhile, on the psychology side, the first few weeks of the lecture courses an N&B student may choose to take are spent covering the same introductory material that these students have now encountered at least three times.

Continuing along the biology side, N&B students wade through pre-requisites only to enter their first neuroscience class in the year-long Neurobiology I&II sequence. A good deal of overlap remains between these sequential courses, with systems-level information taught in the cellular-level fall course and cellular mechanisms covered again to teach systems in the spring.

The remaining requirements for the major include a statistics course and an additional biology course, neither specific to neuroscience — a neuro-themed variant of the latter has not been taught since Fall 2013.

Overall, a common experience among N&B majors is a feeling of disjointed repetition and lack of neuroscience-specific courses catered to their needs and/or interests. I do not believe such a feeling is limited to frustrated neuroscience students — I have heard the same complaint expressed again and again by friends in various joint majors throughout Columbia College.

So what can be done to fix the glaring issues in the design of the N&B major? Ideally, the whole major would be restructured from the ground up into a fully-integrated design. Realistically, the bureaucratic effort such an overhaul would require makes it untenable. Instead, I have a few simple proposals to streamline and vastly improve the experience of N&B majors at Columbia.

The greatest concern, of course, is the overlap of course materials. Luckily, each professor has a fair amount of leeway in their syllabi. Because of this freedom, I suggest that the professors responsible for N&B courses on the biology and psychology sides of the major set aside one full working day at the beginning of each semester to review syllabi with an eye for overlap.

I believe much of the reason students are taught the action potential seven times is well-intentioned: each professor is unsure whether the students have covered the material before. Such a semesterly meeting would eliminate this interdepartmental uncertainty and go a long way towards eliminating unnecessary repetition in courses.

Additionally, I propose expanding and integrating elective courses between the two departments, allowing cellular-oriented neuroscience majors to focus on neurobiology courses and psychology-oriented majors to spend more time on the psychology side. Allowing these electives outside the ‘core’ major courses to be taken in either department would enable a range of students to unify within a single major.

From a scheduling perspective, neuroscience courses above the introductory level must be offered on a regular basis. Here, the psychology department far outstrips biology, offering a wide range of rotating seminars; while these still skew towards psychology, at least some neuroscience-heavy courses are offered every semester.

Overall, I recommend that Science of Psychology no longer be mandated for N&B majors, and that it instead be replaced by a comprehensive introductory Behavioral Neuroscience course tailored to them. With this change, Mind Brain Behavior could more specifically and more accessibly target a non-major audience, and Behavioral Neuroscience could serve as the sole prerequisite for Neurobiology I&II. Majors could then take that sequence in their sophomore or junior year, leaving room as upperclassmen for more seminar-style neuroscience electives taught by professors in their areas of interest.

With a graduating class of 65 majors last year, N&B is the eighth largest program within Columbia College, and has rapidly grown over the last few years. With the opening of the Zuckerman Mind Brain Behavior Institute, Columbia will only continue to attract the best and brightest neuroscience undergraduates. I believe that professors and administrators want to provide the best education possible to the student body — and that many of the problems within the N&B major can be solved by increased communication between the biology and psychology departments and some simple restructuring.

Image courtesy of freshNYC

How would you describe yourself?

Most people can immediately come up with at least a few adjectives to summarize their personalities, and when these people are asked how well they know themselves on a scale of 1-10, the answers are overwhelmingly above 8. When asked to estimate if their ‘core’ personalities have remained consistent over time, the majority agree that while they have indeed changed, certain fundamental aspects of themselves remain the same.

People make important decisions based on this idea of personality continuity. You believe that the person you choose to marry has essential qualities which will remain good, that criminals have essential qualities which will remain bad, and that the people in your life all have dependable qualities. When explaining the incredible successes or failures of CEOs, celebrities, or pro athletes, most of us tend to credit or blame their personalities.

While this convincing story pervades our culture, modern research indicates that the idea of a consistent individual personality is a myth. A few months ago, the longest-running study on personality was published. Begun in 1947, the study asked teachers to rate their fourteen-year-old students on six personality traits. Sixty-three years later, researchers tracked down as many of the original participants as they could and reassessed their personalities.

Upon analysis, none of the six traits showed any significant stability across that time span. While ideas about personality and experimental methods have changed drastically in the intervening decades, more modern neuroscience backs up these long-running surveys with MRI studies of the changing brain.

While many ‘tests’ of personality exist on the internet, almost none of them hold any neuropsychological weight. This includes the famed Myers-Briggs model, which sorts individuals into sixteen distinct personality types according to four trait dichotomies, each with a corresponding letter. If you have ever had someone tell you they are an ENFP or an INTJ, that’s the model they’re referring to. Though certainly entertaining, such tests have long been discredited for being too myopic and for binning people into binary categories.

Although many scientists disagree, the generally-accepted model of personality these days is the Big Five, which rates individuals from 1-100 on five distinct traits — Openness, Conscientiousness, Extraversion, Agreeableness, and Neuroticism. If you are interested, free versions of the test are easy to find online.

Recently, neuroscientists have begun to examine how high scores on the various Big Five factors might map onto brain structure. Using structural MRI, one team examined how the volume of particular brain regions varies with trait scores, finding that extroverts had a larger medial orbitofrontal cortex, a brain region which processes reward. This area is heavily implicated in responses to social reward, so it is possible that extroverts enjoy social interactions because they supply a ‘hit’ of dopamine.
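In spirit, that kind of structural study boils down to a correlation, across subjects, between a trait score and a region’s volume. Here is a minimal sketch using invented numbers rather than any real dataset:

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical data for illustration only: Big Five extraversion
# scores (1-100) and medial orbitofrontal cortex volumes (cm^3)
# for ten imaginary subjects.
extraversion = np.array([35, 80, 62, 45, 90, 55, 70, 30, 85, 50])
mofc_volume = np.array([6.1, 7.4, 6.8, 6.3, 7.9, 6.6, 7.1, 5.9, 7.6, 6.5])

# Structural-MRI personality studies test for exactly this kind of
# trait-to-volume relationship, usually across many more subjects
# and with corrections for total brain size.
r, p = pearsonr(extraversion, mofc_volume)
print(f"r = {r:.2f}, p = {p:.4f}")
```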

Higher scores on neuroticism correlated with larger volumes in regions associated with threat, punishment, and negative emotion, including the cingulate cortex. It is possible that neurotic people feel the potential threat of a negative event more powerfully than those with smaller cingulate cortices, and are therefore more concerned about potentially troubling events.

Agreeableness correlated with a larger lateral prefrontal cortex, a region loosely associated with planning and higher-order processing. Though the researchers did not find a significant association for Openness, they did find some possible correlations with parts of the parietal cortex involved in integrating sensory stimuli.

While this study did not use functional MRI, which could tell us which regions activate when people exhibit trait-related behaviors, there does appear to be some association between the sizes of these brain regions and an individual’s personality.

If psychology research tells us personality changes drastically over time, and neuroscience research indicates that our brains reflect our personalities, what mechanisms in our brains underlie these changes?

Some potential clues lie in memory research. A large body of evidence tells us that each time a memory is ‘accessed’, it is altered, sometimes dramatically, before going back into storage. As experiences pile up in our lifetimes, the memories we make are incorporated into the ways we face new information, and change the ways we make decisions.

The other massive factor in our decision-making comes from our surroundings — specifically, our social surroundings. The cultural norms which permeate a place can strongly influence how a personality changes over time, as new experiences reshape the neural wiring. With that in mind, it’s hard to think of a more distinct social environment in the U.S. than our home, New York City itself.

When asked what made a person a New Yorker, former mayor Edward Koch put it most succinctly: “you have to live here for six months, and if at the end of the six months you find you walk faster, talk faster, think faster, you’re a New Yorker.” I have certainly found that a few years here have changed me in more ways than knowledge gained in the classroom — parts of my personality seem fundamentally altered by my time living at Columbia and adapting to the unique social norms such a city carries.

In a place as hectic, stressful, and sometimes isolating as New York City, the unconscious effect of environment likely affects us all. Combined with a student population of high-achieving and hard-working Columbians, it’s possible our particularly potent stress culture might be drawing heavily from the city itself for fuel. While we often talk about the culture-shock of NYC on many of our students during orientation weeks, we rarely take the time to analyze how exactly our city might be changing us.

Maybe the pressures of Columbia’s subculture, paired with the tough-it-out mentality of the city, make us feel busier and more focused, and therefore prime us to think faster and act smarter. Maybe some of these changes are positive; learning how to ‘tough it out’ certainly has its benefits. But I’m more worried about the negatives, about how a city so known for indifference may be affecting our compassion and human integrity.

Luckily, any negative characteristics our brains may be picking up from the city aren’t permanent. The same neuroplasticity which hardened us can prioritize compassion again, if we make a conscious effort to make others as important as our busy schedules. We have the ability to change our own Columbia culture and let in only the positive aspects of the city.

 

Meet Mathew Pregasen. Mathew is a Columbia junior studying computer science who founded a startup with Anuke Ganegoda (CC ’18), Sahir Jaggi (SEAS ’17) and Rikhav Shah (MIT ’19). Named Parsegon Inc., the company implements a new method of transcribing English descriptions of math into mathematical script. For example, Parsegon’s technology can take the sentence “integral from 0 to 10 in region D of 2x squared + 3x cubed – the square root of x” and convert it into visual, textbook-formatted math.
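To give a flavor of the problem Parsegon tackles, here is a deliberately simplified, rule-based sketch in Python. This is not Parsegon’s actual algorithm; a real system needs a proper grammar rather than a handful of rewrite rules, but the sketch shows the basic idea of mapping English phrases onto notation.

```python
import re

# A toy sketch of translating English math phrases into LaTeX-style
# notation. Illustrative only; not Parsegon's actual implementation.
RULES = [
    (r"integral from (\S+) to (\S+)", r"\\int_{\1}^{\2}"),
    (r"the square root of (\w+)", r"\\sqrt{\1}"),
    (r"(\w+) squared", r"\1^2"),
    (r"(\w+) cubed", r"\1^3"),
]

def to_math(text: str) -> str:
    # Apply each rewrite rule in order.
    for pattern, replacement in RULES:
        text = re.sub(pattern, replacement, text)
    return text

print(to_math("integral from 0 to 10 of 2x squared + 3x cubed"))
# -> \int_{0}^{10} of 2x^2 + 3x^3
```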

How did you come up with the idea of Parsegon? What experience made you want to start your own business?

The way it started was pretty accidental. It was first a small project that we had no intention of turning into a company, but as it developed we realized it had more potential. Soon, we started to think of this project in a business context. We did Almaworks, raised some funding, hired some people for the summer, and further developed our business. In the end, though, it is a technology project.

How did Almaworks facilitate your business development process?

I think the most beneficial part is that it connects you with incredibly helpful mentors. At first, you might not know too much about design, planning, or the law associated with a startup business, but if you work closely with a mentor, you will get proper advice on business direction, project development, and, most importantly, legal services.

What’s the current entrepreneurial environment at Columbia like? How does it compare to other schools?

I think in the last two years, there have been some significant changes, where the administration—especially the entrepreneurship administration—has been putting a lot of resources into the entrepreneurship community. They have increased the grants they provide and have organized the Columbia Entrepreneurship Competition for the last four years. Alongside that, you have clubs like CORE (Columbia Organization of Rising Entrepreneurs) and ADI (Application Development Initiative) that push this culture. I think ultimately the culture should be self-accelerating instead of accessory, but you need to have some initial velocity at the beginning.

 

Mathew Pregasen

Image via Mathew Pregasen

So back to Parsegon. It seems to be designed for people who are not fast at typing math. How do you attract people who are already proficient at typing mathematical expressions in packages such as LaTeX?

We are not competing with LaTeX and we don’t expect people to write papers in Parsegon. That being said, we do have a very user-friendly environment that reduces time and difficulty in typing. Parsegon is also educational in the sense that it makes teaching more accessible to students and enables the entire classroom to engage in interactive math.

 

You have been trying to integrate Parsegon into classrooms. What is the feedback from teachers and students?

We primarily focus on high schools, and we’ve been getting very strong feedback.

What do you think is the biggest challenge for Parsegon?

I think the greatest challenge for us is to make a technology that provides a number of services for very diverse classroom environments. Some people might not be familiar with computer typing and some do prefer a very traditional and structured typing style, so although we are making it more accessible to people, it is still a big challenge to build the technology that accommodates the needs of everyone and strikes a proper balance between accessibility and formality.

Are there any computer science classes at Columbia that have helped you in this process?

Namely Operating Systems (W4118) with Jason Nieh. I also took a class called Computer Theory with Alfred Aho, which was useful for the theoretical angle.

What do you think is the future of Parsegon?

We want to build the best tool for educational practices in America. We believe that there is a big gap between the technology users have and the technology provided to education professionals, and we believe that our implementation will not only complement traditional learning methods, but also improve them. The importance of Parsegon is that it teaches students to understand the language of math. If you can understand the language of math, you usually also understand the theory of math much more coherently. And we believe that is the best way Parsegon could improve the learning of math on a more cognitive level.

As the new semester begins, The Lion will be spending some time in Uniquely Human on other people — how we interact with them, how they interact with us, and how those interactions shape our personality. This is the first column in our new series.

Columbia students spend a lot of time in elevators. Imagine – you step into an empty elevator on the top floor of a building. As you descend, one, two, even three people walk into the elevator, an experience so typical you hardly notice. But this time as they enter, something curious happens.

After walking into the elevator, each person faces the back instead of turning around to face the front doors. While one person doing this may go unnoticed, after two or three people perform this strange action, you too turn around to face the back.

Although your instinct may be to resist that ending to the story, from its origin on Candid Camera in the early 1960s through multiple scientific studies since, the result is always the same — the majority of people will adopt the new social norm.

This act of changing your behavior to match those around you is called social referencing, and for decades its powerful sway over social activity has been confirmed in sociological and psychological studies. That people adapt their behavior to their social situations is not itself revolutionary, although the extent to which people adopt ‘non-logical’ behaviors to fit in with a new social norm can often be humorous.

The truly controversial idea is a much newer one, and comes out of modern neuroscience: not only do you change your external behaviors to adjust to a new social environment, your core personality adjusts to fit with a new social reality.

This brain re-wiring is perhaps best illustrated, paradoxically, by what happens when the system goes wrong. Have you ever flinched when you saw someone get hit in a particularly painful spot, or felt warm when you saw two people hug? Now imagine if, instead of experiencing a vague sense of those feelings, you physically felt every sensation you saw in someone else. Every touch is replicated on your arm, with every swallow you see you feel food slither down your throat, and the pain of another sharply becomes your own.

This condition is called mirror-touch synesthesia, and it is one of the most common synesthesias –  an estimated 1.5% of the population experiences the world this way. While the physical aspects of this disorder are fascinating and deserve their own column, where it really gets interesting is in how synesthetes experience emotional reactions.

In a number of mirror-touch synesthetes, the act of seeing someone respond emotionally causes a mirrored emotional response. Because they can acutely feel the happiness, sadness, anguish of the people around them, it can become incredibly difficult for mirror-touch synesthetes to distinguish their own emotions from the emotions of those around them. They find themselves disappearing into others.

As is common in neuroscience, observing such an extreme example of a system going wrong teaches us how the system works under normal circumstances. One possible explanation comes from mirror neurons. First reported in monkeys in the early 1990s and more recently in humans, mirror neurons are cells located in parts of the brain corresponding to sensation and motor activity.

Unlike other cells nearby, these special mirror neurons fire identically both when you perform an activity, like processing touch or moving your arm, and when you observe someone else doing the same task. While the purpose of these neurons is still speculative, there is evidence of their role in subconscious mimicry, empathy, self-awareness, and even theory of mind.

Of course, when a typical human observes other people, they don’t acutely feel those external sensations in the same way. That is because there are other inhibitory neurons ‘downstream’ of the mirror neurons, which stop you from acting on their firing. It’s likely that in mirror-touch synesthetes, that ‘turn off’ signal does not get sent, or the original signal from mirror neurons is so strong that it cannot be turned off.

So while mirror neurons might allow us all to understand each other at low levels of activity, cranking their response up causes people to, in some ways, become other people. Mirror-touch synesthesia brings a normally subconscious process to the surface, and it raises some interesting questions in the process.

If we’re somehow experiencing the actions and emotions of other people within our own minds on a subconscious level, do these ‘outside’ factors become a part of us? Do we correspondingly change parts of our core personalities in response? We will seek to explore these very questions in the next column.