Mirror neuron activity predicts people’s decision-making in moral dilemmas
Neural resonance in the brain’s inferior frontal cortex is stronger in people with an aversion to harming others
— By the University of California, Los Angeles
It is wartime. You and your fellow refugees are hiding from enemy soldiers, when a baby begins to cry. You cover her mouth to block the sound. If you remove your hand, her crying will draw the attention of the soldiers, who will kill everyone. If you smother the child, you’ll save yourself and the others.
If you were in that situation, which was dramatized in the final episode of the ’70s and ’80s TV series M*A*S*H, what would you do?
The results of a new UCLA study suggest that scientists could make a good guess based on how the brain responds when people watch someone else experience pain. The study found that those responses predict whether people will be inclined to avoid causing harm to others when facing moral dilemmas.
“The findings give us a glimpse into what is the nature of morality,” said Dr. Marco Iacoboni, director of the Neuromodulation Lab at UCLA’s Ahmanson-Lovelace Brain Mapping Center and the study’s senior author. “This is a foundational question to understand ourselves, and to understand how the brain shapes our own nature.”
In the study, which was published in Frontiers in Integrative Neuroscience, Iacoboni and colleagues analyzed mirror neurons, brain cells that respond equally when someone performs an action or simply watches someone else perform the same action. Mirror neurons play a vital role in how people learn through mimicry and feel empathy for others.
When you wince while seeing someone experience pain — a phenomenon called “neural resonance” — mirror neurons are responsible.
Iacoboni wondered if neural resonance might play a role in how people navigate complicated problems that require both conscious deliberation and consideration of another’s feelings.
To find out, researchers showed 19 volunteers two videos: one of a hypodermic needle piercing a hand, and another of a hand being gently touched by a cotton swab. During both, the scientists used a functional MRI machine to measure activity in the volunteers’ brains.
Researchers later asked the participants how they would behave in a variety of moral dilemmas, including the scenario involving the crying baby during wartime, whether to torture another person to prevent a bomb from killing several others, and whether to harm research animals in order to cure AIDS.
Participants also responded to scenarios in which causing harm would make the world worse — inflicting harm on another person in order to avoid two weeks of hard labor, for example — to gauge their willingness to cause harm for moral reasons and for less-noble motives.
Iacoboni and his colleagues hypothesized that people who showed greater neural resonance than other participants while watching the hand-piercing video would also be less likely to choose to silence the baby in the hypothetical dilemma, and that proved to be true: people with stronger activity in the inferior frontal cortex, a brain region essential for empathy and imitation, were less willing to cause direct harm, such as silencing the baby.
But the researchers found no correlation between people’s brain activity and their willingness to hypothetically harm one person in the interest of the greater good, such as silencing the baby to save more lives. Unlike the aversion to causing direct harm, those decisions are thought to stem from more cognitive, deliberative processes.
The study confirms that genuine concern for others’ pain plays a causal role in moral dilemma judgments, Iacoboni said. In other words, a person’s refusal to silence the baby is due to concern for the baby, not just the person’s own discomfort in taking that action.
Iacoboni’s next project will explore whether a person’s decision-making in moral dilemmas can be influenced by decreasing or enhancing activity in the areas of the brain that were targeted in the current study.
“It would be fascinating to see if we can use brain stimulation to change complex moral decisions through impacting the amount of concern people experience for others’ pain,” Iacoboni said. “It could provide a new method for increasing concern for others’ well-being.”
The research could point to a way to help people with mental disorders such as schizophrenia that make interpersonal communication difficult, Iacoboni said.
Original research article: Deontological Dilemma Response Tendencies and Sensorimotor Representations of Harm to Others
Corresponding author: Dr. Marco Iacoboni
The study’s first author is Leo Moore, a UCLA postdoctoral scholar in psychiatry and biobehavioral sciences. Paul Conway of Florida State University and the University of Cologne, Germany, is the paper’s other co-author.
The study was supported by the National Institute of Mental Health, the Brain Mapping Medical Research Organization, the Brain Mapping Support Foundation, the Pierson-Lovelace Foundation, the Ahmanson Foundation, the William M. and Linda R. Dietel Philanthropic Fund at the Northern Piedmont Community Foundation, the Tamkin Foundation, the Jennifer Jones-Simon Foundation, the Capital Group Companies Charitable Foundation, the Robson family, and the Northstar Fund.
REPUBLISHING GUIDELINES: Open access and the sharing of research are part of Frontiers’ mission. Unless otherwise noted, you can republish articles posted in the Frontiers news blog, as long as you include a link back to the original research. Selling the articles is not allowed.