Would you kill an innocent person to save five others?
If, like most people, you said no, it may be because following moral rules such as “don’t kill innocent people” sends a powerful social signal that you are trustworthy.
New research suggests people perceive those who hold fast to these moral rules – even when breaking them might lead to better overall consequences – as more trustworthy and valued social partners than those who would be willing to override the rules for the sake of the greater good. The paper, “Inference of Trustworthiness From Intuitive Moral Judgments,” was published April 7 in the Journal of Experimental Psychology: General.
David Pizarro, associate professor of psychology, and his co-authors, Jim A.C. Everett and Molly Crockett of Oxford University, set up a series of experiments in which participants were given information about how another person responded to a hypothetical moral dilemma. These dilemmas were designed to pit two styles of moral thinking against each other: one that says the morally right decision is whichever one brings about the best overall consequences, versus one that says certain actions are wrong regardless of their consequences. For example, in one dilemma, a person is standing on a footbridge overlooking an out-of-control train speeding toward a group of five people. Next to her is a large man. If she pushes him off the bridge onto the track below, it will stop the train. He will die, but the five others will be saved.
The researchers found that people who ignored the consequences and stuck to the rules – refusing to kill the innocent person even if doing so would bring about a greater good – were judged to be more trustworthy than those who were willing to override the rules because of the positive consequences. The study participants also treated the rule-followers differently. When playing an economic game designed to assess trust, in which a player stands to profit by handing over some of their own money to a partner, provided the partner gives it back, participants entrusted more money to the rule-followers and were more confident that they would get it back.
But the hypothetical moral decision itself was not the only thing that mattered when it came to trust. “When determining whether or not to trust others, there are a variety of different cues that might be informative,” said Pizarro. In their experiments the researchers found that those who chose to sacrifice someone for the greater good were judged less harshly if the decision seemed to be difficult for them. “If a person concluded too easily that an innocent person should be thrown to their death for the greater good, our participants viewed them a bit more suspiciously,” he said.
And it wasn’t always the case that those who refused to kill an innocent person for the sake of greater overall consequences were trusted more. If the moral dilemma described the person who might be sacrificed as expressing a specific desire to live or a willingness to die, people favored individuals who respected those wishes, even if that involved killing.
“This helps explain why we appear to like people who stick to these intuitive moral rules, not because they are sticklers for the letter of the law but because the rules themselves tend to emphasize the absolute importance of respecting the wishes and desires of others,” Pizarro said.
These findings, Pizarro said, “highlight an important feature of our moral decisions: they serve as valuable signals to others about our social commitments. When it comes to social partners, we seek out others who will treat us with unwavering loyalty and respect, even when the mathematics of doing so don’t necessarily work out.”
This article also appeared in the Cornell Chronicle.