Is morality located in the brain?

Many people consider a candidate's morality when they head to the polls.
Patti McConville/Photographer's Choice/Getty Images

In every election cycle, voters size up the moral positions of the candidates. The public wants to know how a candidate's viewpoints match up with their own. Some political issues, such as stem cell research, abortion and gay marriage, have become hallmarks of the morality debate. Other issues, such as foreign policy and war, may seem more overtly political, but they have moral underpinnings as well. For example, would a candidate order an enemy city to be bombed if his or her own child were stationed there? Would the candidate consider that the people stationed there belong to somebody? That each soldier is somebody's father or mother, somebody's son or daughter, somebody's husband or wife?

It's not too much of a stretch, then, to imagine asking a candidate whether he or she would smother a baby to death. It may seem abominable to pose such a question, but let us explain. Imagine we're at war, and a group of people is hiding from the bad guys in a basement. The bad guys are upstairs, prowling the home for dissidents, when the baby in the basement begins to cry. Should the baby be smothered to death? If the baby is quieted, everyone else in the group lives. If the baby keeps crying, the bad guys find the group, and everyone dies, including the baby.

You may be able to understand rationally why it's better to sacrifice the baby for the good of the group, but could you actually be the one to put your hand over its mouth? Do you want a president who could? If some researchers are to be believed, we may not have much choice in the matter. While morality has long been the domain of philosophers, theologians and people who smoke marijuana, neuroscientists are getting in on the determination of right and wrong. And according to some, there's a very simple reason why presidential candidates -- or anyone, for that matter -- would answer questions of morality the way they do. As you might guess from the title of this article, it comes down to that vital organ that zombies just love to eat: brains. What's going on inside the brain when we're faced with a moral dilemma? And if everyone's morality is different, can the concept be reduced to one spot in the brain?

How Do You Choose? Moral Dilemmas and the Brain

Would you kill a baby if it would save several lives? The brain acts differently depending on who has to do the deed.
Peter Griffith/Taxi/Getty Images

In 2001, a research team led by philosopher and neuroscientist Joshua Greene published a paper detailing its use of functional MRI to scan the brains of people wrestling with a moral dilemma. Greene and his team wanted to see if there was a conflict between areas of the brain that deal with emotion and those that deal with reason.

The subjects in the study were presented with a scenario that involved killing a person with their own hands in order to save a large group of people, such as the crying-baby dilemma we discussed on the first page. As subjects wrestled with the dilemma, several areas of their brains lit up, including two parts of the frontal lobe: the part that regulates our emotions toward other people and the part that performs mental computation such as reasoning [source: Pinker]. The anterior cingulate cortex, the part of the brain that registers internal conflict, lit up as well. This suggests that the subjects weighed the benefit of saving the group against their emotions about killing an innocent baby.

Then the subjects were presented with a dilemma in which they didn't have to get their hands dirty. The same person would die, but someone else would do the deed, or a switch could be flipped to accomplish the task. In this scenario, only the reasoning part of the brain was active in the scans. When people didn't have to wrestle with their emotions about how they'd feel doing the deed themselves, they simply completed a utilitarian analysis of what was best for the group.

In a 2007 study, researchers from several universities looked further into which areas of the brain affect morality and what might happen if those areas were damaged. It was a small study -- the subjects consisted of 12 people with no brain damage; 12 people with brain damage in areas that regulate emotions, such as fear; and six people with brain damage in the ventromedial prefrontal cortex, thought to be the center of emotions such as shame, empathy, compassion and guilt [source: Gellene]. The subjects were presented with 50 hypothetical scenarios, some of which required moral decision-making and some of which didn't.

There was a good deal of overlap in the groups' responses. In situations that didn't require a moral choice, all of the groups answered the same way. When asked about scenarios that required moral decision-making but harmed no one -- such as whether it would be OK to classify some personal expenses as business expenses for a tax write-off -- the groups were all willing to bend the rules a little. Members of all the groups agreed they wouldn't kill or harm another person for selfish gain, such as killing a newborn simply because the parent didn't want to care for it. The groups diverged, however, on moral decisions that required deciding whether to harm or kill one person for the greater good. Those with damage to the ventromedial prefrontal cortex were approximately two to three times more likely to sacrifice one person for the greater good [source: Saletan].

It appears, then, that when the part of the brain that rules emotions such as empathy and shame is damaged, people are more likely to rely on a pure cost-benefit analysis of the greater good. But some are worried about the eventual implications of such a finding. Could knowing that the brain is damaged in this way have any impact in criminal cases? Could "damage to the ventromedial prefrontal cortex" become a common courthouse plea?

That may seem unlikely, because different cultures consider different things to be crimes. If a sense of morality is wired into the brain, then why do we all have different morals? Go to the next page for some of the leading theories.

The Moral Systems in Us All

Children from different cultures will have different morals.
iStockphoto.com/Jani Bryson

We might immediately chalk up differences in how people perceive morality to cultural influence or religious upbringing. Yet some scientists claim that morality is all in our brains and is merely shaped by outside forces. One such scientist is Marc Hauser, who draws on fields such as anthropology and linguistics to show that morality was around long before the first religions.

Anthropology comes into play when you consider that primates such as apes and monkeys exhibit behaviors associated with morality, such as forgoing food when taking it would harm another primate [source: Wade]. While we can't know the primates' motivation, they may serve as a model of how morality is necessary for the kind of communal living that humans would perfect. But Hauser's real leap came in linking the concept of morality to the concept of language.

In the 1950s, linguist Noam Chomsky hypothesized that we are born with a universal sense of grammar, but within each language, we have our own rules and quirks. Hauser believes morality is much the same. We are born with certain moral norms, such as "do no harm," but the norms are shaped by our upbringing. Hauser believes one reason for this type of unconscious wiring has to do with time constraints. If we had to juggle a mess of verbs, nouns and sentence diagrams each time we spoke, we'd never get anything done. Similarly, we don't have time to dwell on moral concerns each time one comes up. Just as we may know immediately when someone speaks incorrectly, though we may not be able to identify the specific rule, we know in an unconscious sense whether something is right or wrong [source: Glausiusz].

Psychologist Jonathan Haidt has identified five moral systems that might be innate in each person:

  • Prevention of harm to a person
  • Reciprocity and fairness
  • Loyalty to a group
  • Respect for authority
  • Sense of purity and sanctity

[source: Pinker]

It's possible that these innate systems served an evolutionary advantage. For example, a sense of purity may have come about as people faced decisions about who would make the best mate and which foods were safest to eat. Additionally, finding a group that believed the same way you did would aid your individual survival, because the group would help you out in times of need. The group as a whole would also survive when strengthened by the last three principles.

These moral systems can be shaped by different cultures, which is how people can look at the same situation and come to different conclusions about it. We all have all five systems, but we place greater emphasis on some of them based on our upbringing. In the case of honor killings, in which a woman is killed for committing adultery or even for speaking in public to a man who is not her husband, some Middle Eastern cultures see clear violations by the woman of respect for authority and sense of purity, while Western cultures see only the woman's death as wrongful harm to a person.

Sometimes, Haidt argues, we don't even realize how our culture has instilled these ideas in us [source: Wade]. That's because the more rational side of the brain -- the side that lit up in Joshua Greene's imaging experiments (discussed on the previous page) -- may have evolved later than the emotional side that contains our sense of right and wrong. These brain systems may be in competition, with the rational side trying to figure out why the emotional side is reacting a certain way. When the rational side can't explain what the emotional side did, the result is what Haidt calls moral dumbfounding [source: Wade]: sometimes we can't explain why we think something is right or wrong; we just know that it is.

As you might expect, some philosophers resent the intrusion of scientists on this turf [source: Wade]. Both scientists and philosophers still have to grapple with what these findings could mean for our brains and for society. One thing that doesn't require grappling, however, is the decision to go to the next page. There you'll find lots more information on morality and the brain.

Lots More Information

Sources

  • Blakeslee, Sandra. "Watching How the Brain Works as It Weighs a Moral Dilemma." New York Times. Sept. 25, 2001. (Sept. 30, 2008) http://query.nytimes.com/gst/fullpage.html?res=9D0DE7D7143AF936A1575AC0A9679C8B63&sec=&spon=&&scp=7&sq=morality,%20brain&st=cse
  • Carey, Benedict. "Brain Injury Said to Affect Moral Choices." New York Times. March 22, 2007. (Sept. 30, 2008) http://www.nytimes.com/2007/03/22/science/22brain.html?scp=5&sq=morality,%20brain&st=cse
  • Dunham, Will. "Moral dilemma? Brain tells right from wrong." ABC Science. March 22, 2007. (Sept. 30, 2008) http://www.abc.net.au/science/news/stories/2007/1878563.htm
  • Gellene, Denise. "Empathy is hard-wired into the mind, study finds." Los Angeles Times. March 22, 2007. (Sept. 30, 2008) http://articles.latimes.com/2007/mar/22/science/sci-empathy22
  • Gilbert, Susan. "Scientists Explore the Molding of Children's Morals." New York Times. March 18, 2003. (Sept. 30, 2008) http://query.nytimes.com/gst/fullpage.html?res=9B0CE7D71731F93BA25750C0A9659C8B63&sec=&spon=&&scp=25&sq=morality,%20brain&st=cse
  • Glausiusz, Josie. "Is Morality Innate and Universal?" Discover. May 10, 2007. (Sept. 30, 2008) http://discovermagazine.com/2007/may/the-discover-interview-marc-hauser
  • Greene, Joshua D., R. Brian Sommerville, Leigh E. Nystrom, John M. Darley, Jonathan D. Cohen. "An fMRI Investigation of Emotional Engagement in Moral Judgment." Science. Sept. 14, 2001. (Sept. 30, 2008) http://www.wjh.harvard.edu/~jgreene/GreeneWJH/Greene-et-al-Science-9-01.pdf
  • Hauser, Marc D. "Is Morality Natural?" Newsweek. Sept. 22, 2008. (Sept. 30, 2008) http://www.newsweek.com/id/158760
  • Koenigs, Michael, Liane Young, Ralph Adolphs, Daniel Tranel, Fiery Cushman, Marc Hauser, Antonio Damasio. "Damage to the prefrontal cortex increases utilitarian moral judgments." Nature. April 19, 2007. (Sept. 30, 2008) http://www.pubmedcentral.nih.gov/picrender.fcgi?artid=2244801&blobtype=pdf
  • Pinker, Steven. "The Moral Instinct." New York Times. Jan. 13, 2008. (Sept. 30, 2008) http://www.nytimes.com/2008/01/13/magazine/13Psychology-t.html?scp=3&sq=morality,%20brain&st=cse
  • Saletan, William. "Best of the Brain: The Five Biggest Neuroscience Developments of the Year." Slate. April 25, 2007. (Sept. 30, 2008) http://www.slate.com/id/2164996/
  • Saletan, William. "Mind Makes Right." Slate. March 31, 2007. (Sept. 30, 2008) http://www.slate.com/id/2162998
  • Saletan, William. "SpongeBoob." Slate. March 23, 2007. (Sept. 30, 2008) http://www.slate.com/id/2162104
  • Wade, Nicholas. "An Evolutionary Theory of Right and Wrong." New York Times. Oct. 31, 2006. (Sept. 30, 2008) http://www.nytimes.com/2006/10/31/health/psychology/31book.html?scp=4&sq=morality,%20brain&st=cse
  • Wade, Nicholas. "Is 'Do Unto Others' Written Into Our Genes?" New York Times. Sept. 18, 2007. (Sept. 30, 2008) http://www.nytimes.com/2007/09/18/science/18mora.html?sq=morality,%20brain&st=cse&scp=1&pagewanted=print
  • Wade, Nicholas. "Scientist Finds the Beginnings of Morality in Primate Behavior." New York Times. March 20, 2007. (Sept. 30, 2008) http://www.nytimes.com/2007/03/20/science/20moral.html?scp=23&sq=morality,%20brain&st=cse
