In 2001, a research team led by philosopher and neuroscientist Joshua Greene published a paper describing how the team used functional MRI to scan the brains of people wrestling with a moral dilemma. Greene and his colleagues wanted to see whether there was a conflict between areas of the brain that deal with emotion and those that deal with reason.
The subjects in the study were presented with a scenario that involved killing a person with their own hands in order to save a large group of people, such as the crying-baby dilemma we discussed on the first page. In wrestling with the dilemma, several areas of the subjects' brains lit up, including two parts of the frontal lobe: the region that regulates our emotions toward other people and the region that handles mental computation such as reasoning [source: Pinker]. The anterior cingulate cortex, the part of the brain that detects conflict between competing mental processes, also lit up. This suggests that people weighed the benefit of saving the group against their emotions about killing an innocent baby.
Then the subjects were presented with a dilemma in which they didn't have to get their hands dirty: the same person would die, but someone else would do the killing, or a switch could be flipped to accomplish the task. In this scenario, only the reasoning part of the brain was active in the scans. When people didn't have to wrestle with their emotions about committing the act themselves, they simply completed a utilitarian analysis of what was best for the group.
In a 2007 study, researchers from several universities looked further into which areas of the brain affect morality and what might happen if those areas were damaged. It was a small study -- the subjects consisted of 12 people with no brain damage; 12 people with brain damage in areas that regulate emotions, such as fear; and six people with brain damage in the ventromedial prefrontal cortex, thought to be the center of emotions such as shame, empathy, compassion and guilt [source: Gellene]. The subjects were presented with 50 hypothetical scenarios, some of which required decision-making related to morals and some of which didn't.
There was a good deal of overlap in the groups' responses to certain scenarios. In situations that didn't require a moral choice, each of the groups answered in the same way. When asked about scenarios that required moral decision-making but didn't harm another person, such as a question about whether it would be OK to classify some personal expenses as business expenses for a tax write-off, the groups were willing to bend the rules a little bit. Members of all the groups agreed they wouldn't kill or harm another person for selfish gain, such as killing a newborn simply because the parent didn't want to take care of it. But the difference between the groups was evident regarding moral decisions that required the participants to decide if they would harm or kill another person for the greater good. Those with damage to the ventromedial prefrontal cortex were approximately two to three times more likely to sacrifice one person for the greater good [source: Saletan].
It appears, then, that when the part of the brain that governs emotions such as empathy and shame is damaged, people are more likely to rely solely on a cost-benefit analysis of the greater good. But some worry about the eventual implications of such a finding. Could knowing that the brain is damaged in this way have any impact in criminal cases? Could "damage to the ventromedial prefrontal cortex" become a common courthouse plea?
Perhaps, but these findings raise another question: different cultures consider different things to be crimes. If a sense of morality is wired into the brain, then why do we all have different morals? Go to the next page for some of the leading theories.