In an analogy many scientists hate, the human brain is often compared to a small, wet computer, functioning in almost the same way as the electronic kind. Two scientists at Cornell University report the analogy might be closer to the truth than anyone thought.
They have found an emotion code.
In a paper that took four years to write, neuroscientist Adam Anderson, associate professor of human ecology, and his graduate students found that when different people have a pleasant experience, the neurons, or nerve cells, in the part of the brain called the orbitofrontal cortex all react in the same way, firing in similar patterns. Additionally, the same pattern shows up in everyone regardless of the type of external sensory stimulus triggering the pleasant feeling, such as a sight, a sound, or both. The concept breaks the pattern of how science deals with emotions.
“The emotion code is distinct from other codes that represent objects,” Anderson said, separate from the process by which we see or taste.
Consider: you and another person are sitting on the deck of a bar on the beach on Maui with a glass of Pinot noir in your hands watching the sunset over the Pacific. Your eyes pick up the photons from the scene; your tongue picks up the taste of the wine. The sensory part of your nervous system processes the image and the taste.
You are aware of what you are doing. But how do you feel about it?
The brain categorizes what it sees, Anderson said. The sunset registers on the brain as a sunset, with colors and motion; the wine registers as fruity, perhaps a bit astringent.
“We used to think about patterns; we would monitor a specific part of the brain and the activity of the neurons. The more active the neurons were, we would say this was the pleasure center and monitor with some degree of accuracy,” said Anderson.
“It was the Swiss-army-knife analogy of the brain, with various tools for various purposes. That in my opinion has largely failed.”
To get the data for their study, the Cornell researchers showed 128 visual scenes to 16 subjects and monitored their brains using fMRI technology to measure activity in the orbitofrontal cortex, where emotions are processed. The same subjects also drank solutions of various tastes, including sweet, sour, salty, and bitter.
What they found, Anderson said, was that the brain has “information states,” just as computers do. It is not as though you have pictures of your mother in just one area of a hard drive on your computer, he said. In fact, parts of those pictures are scattered all over the hard drive, and the computer’s job is to organize them into an overall picture.
That, Anderson found, is what is going on in the brain with emotions.
In the same brain area, such as parts of the prefrontal cortex just above the eyes, there is actually something of a vote going on, and that pattern of voting determines aspects of emotion, something like retrieving those pictures. It determines whether you like something or not.
Psychologists call what is happening a valence, a word borrowed from chemistry, to describe the degree of pleasure or displeasure. To describe how valence is represented, they use the term coding, straight out of computer science.
Back on Maui, if both people like the wine, the same pattern of neural activity is flashing across both brains. If they both like sunsets, the same pattern appears again in the same way. The code is not sense-dependent: the same pattern exists whether they are savoring the wine or enthralled by the sunset, no matter whether the stimulus arrives through taste or vision.
If one person doesn’t like the wine or is indifferent to sunsets, the neurons do not fire in that pattern. The code is different, he said.
“You can significantly predict how someone is feeling by reading the code,” he said. “The code represents our subjective feelings.” The emotion code represents how we are feeling, not what we are seeing or tasting.
For a long time, psychologists looked at explanations anchored in the senses, said Andrew Connolly, a research scientist at Dartmouth College in New Hampshire. They assumed, like the British empiricist philosophers John Locke and David Hume, that the only coding in the brain, and in thought, derived directly from and depended on the senses.
The Anderson study breaks from that.
What Anderson and his team did, Connolly said, was use big-data analysis to find coding that was independent of the specific senses, and they found a pattern, not an average. Whether we taste something or see it, the emotion code is the same.
The paper was published in Nature Neuroscience.
Reprinted with permission from Inside Science, an editorially independent news product of the American Institute of Physics, a nonprofit organization dedicated to advancing, promoting and serving the physical sciences.
Joel Shurkin is a freelance writer based in Baltimore. He is the author of nine books on science and the history of science, and has taught science journalism at Stanford University, UC Santa Cruz and the University of Alaska Fairbanks. He tweets at @shurkin.