How Do Your Emotions Affect Your Moral Compass?
Would you sacrifice one person to save five? What if you had to cause harm with your own hands? Your answer may depend on the emotions you’re feeling.
We’d like to believe that we make moral judgments based on rational thought, but the truth is that our moral thinking cannot escape our emotions. Let’s take a look at how anxiety, empathy, anger, and disgust shape our moral thinking, and how we can harness these emotions for making better moral decisions.
The infamous Trolley Problem
Imagine this scenario: As you’re walking by a train station, you notice there are some construction workers working on the tracks. There’s a fork in the track, so a train could either go left or right. On the left track, there’s only one person working. On the right track, there are five people working. They all have noise-canceling headphones on and don’t seem to know what’s going on around them.
Suddenly, you see an out-of-control train car coming down the tracks—it must have gotten loose from a train! The fork in the track is directed towards the right side, so the out-of-control car is headed straight for the five workers, certain to kill them all. There’s no way to stop the train car. The only thing you can do is to pull a switch to redirect the car towards the left track, which would kill the one worker there.
Do you pull the switch?
It’s a tough one, isn’t it? On the one hand, it seems like a no-brainer that killing one person to save five is better than letting five die to spare one. On the other hand, redirecting the train car would require you to purposely cause someone’s death rather than letting the accident take its course. Most of us feel at least squeamish about the kill-one-to-save-five choice, but most of us, when pressed, agree with it. At least when asked about it hypothetically, that is.
This is the classic moral dilemma called the Trolley Problem. Many philosophers and psychologists have used it to study and ponder the way we think about morality. One big question they’ve asked is: “Do people make these decisions based on rational thinking, or are they influenced by other factors?”
Well, consider this twist to the Trolley Problem for your answer: What if there is no switch to redirect the train car, but there is a large stranger walking by that you could push onto the track? This person would be killed, but their body would stop the train from killing the five construction workers. Would you push the stranger?
Here, the math is the same—sacrificing one to save five. But I bet you had a different gut reaction. If so, then it shows that something else is helping you make this decision. What is that something else?
It turns out that emotions play a big role in the way we judge morality and make moral decisions. What did you feel when considering the Trolley Problem and the Stranger variation of it? Fear? Empathy? Disgust? Let’s take a look at the science behind how these emotions—and your relationship to them—affect your moral compass:
Stress and anxiety make us more likely to rely on gut intuition
One major distinction that philosophers and psychologists talk about in moral judgment is between the “deontological” and the “utilitarian.” These are fancy terms that essentially describe whether we’re choosing to follow a moral principle like “thou shalt not kill” (deontological), or choosing to minimize harm on a case-by-case basis, such as sacrificing one to save many (utilitarian).
In the classic Trolley Problem, doing nothing is the deontological choice because you avoid having to kill anyone. Redirecting the train to kill one worker is the utilitarian choice because it minimizes harm overall according to the moral math.
Here’s where things get interesting. When we’re under stress, we’re more likely to lean towards the deontological—that is, choosing to not sacrifice the one. This is the case even when the stress has nothing to do with the moral dilemma.
Researchers from Germany showed that when research participants were told they had to give a speech in a few minutes, they understandably experienced anxiety. Then they were asked to make decisions about variations on the Trolley Problem. Compared to participants who didn’t have an upcoming speech, those who were nervously anticipating a speech made fewer utilitarian decisions and took longer to decide. In other words, they were more reluctant to sacrifice one person to save many. The more physically nervous they were, the stronger this effect.
Empathy (and soberness) keeps us from basing moral judgments on “doing the math”
It’s not just stress that makes us less rationally utilitarian. The emotion of empathy, too, makes us more reluctant to sacrifice one person to save many. Specifically, researchers found that people who reported feeling more empathic concern—feelings of warmth and compassion towards others—were less likely to make the utilitarian choice. That is, they were less likely to sacrifice the one worker.
By the way, drunkenness has the opposite effect of empathy when it comes to making moral judgments. Researchers went bar-hopping in France and asked people with varying blood alcohol levels about the Trolley Problem. The drunker they were, the more likely they were to make the utilitarian choice of killing one to save five, even when the scenario required them to personally push the person onto the tracks.
The fact that intoxicated people were more likely to make this choice is surprising because sacrificing one to save many is technically the more “rational” choice. You wouldn’t think that drunk people would be more likely to choose it. But perhaps what’s going on is that drunkenness lowers our usual ability to be empathic. If you’ve seen die-hard drunk sports fans setting cars on fire, you might know what I mean!
Disgust and anger make us into harsher judges
Now, let’s look at emotions on the opposite end of the spectrum from empathy—disgust and anger. They, too, can affect the way we consider morality.
One of my favorite studies on how disgust affects moral judgment showed that when people were sitting in a smelly room, they became harsher judges. The researchers found this out by asking people to rate the wrongness of various actions, such as falsifying a resume, marrying your first cousin, and of course, killing the worker in the classic Trolley Problem.
For some participants, the researchers prepared the testing room with a fart spray—yes, a fart spray!—while for others, the room was kept normal. The disgusted participants who had to sit with the fart spray rated the hypothetical scenarios as more wrong.
Anger seems to work similarly. In a classic study, some participants were made to feel angry by watching a video of someone being bullied and then being told that the bully got away with it. Next, they read unrelated vignettes about bad behaviors, such as a used car salesman not telling buyers about a car’s hidden engine defect. Compared to participants who didn’t watch an anger-inducing video, the angry participants judged the car salesman’s actions as more wrong and said he should pay a higher fine.
No wonder prosecutors are so good at appealing to a jury’s emotions! They need the jury to feel angry in order to judge the defendant as more guilty.
The emotional effect on moral judgment is based in the brain
How exactly do these emotions exert their influence on our moral thinking? It’s hard to pin down, but we know it’s based in the brain.
There’s an area of the brain called the ventromedial prefrontal cortex (VMPC) that sits roughly behind your forehead at the front of the brain. It’s known to be involved in emotional processing. Researchers have found that when people are asked to think about personal moral problems, like someone mugging an old lady, this area of the brain is activated. But when they’re thinking about impersonal moral problems, like someone stealing cash from an ATM, there is no particular activation in the VMPC.
The research shows that this emotional part of the brain does come into play depending on the type of moral problem we face. When I asked you to think about using your own hands to push someone onto the train tracks, even though the intention is to save five people, I bet your VMPC was lighting up more than when I asked you to think about pulling a switch.
We also know that emotions are involved because people who have damage to emotion-processing parts of the brain tend to make moral judgments differently from most people. For example, a fascinating study published in the prestigious journal Nature found that, compared to people without such damage, people who had suffered damage to the VMPC made more utilitarian moral judgments. This means that they were more likely to agree with sacrificing one to save many.
The really important finding was that they were more utilitarian even in very personal moral dilemmas, such as deciding whether a group of stranded plane crash survivors should kill and eat an injured child in order to survive.
We do have some control over how emotions affect our moral behavior
Just because there’s a brain basis for these behavioral effects doesn’t mean we have no control at all. We can use emotion regulation skills to turn up or down how much our emotions sway our moral compass.
For example, we tend to experience negative emotions when we’re asked about the Trolley Problem and similar harm-to-save dilemmas. But depending on what we do with those negative emotions, our decision might change. If we use “cognitive reappraisal,” which essentially means trying to see the angst-causing situation from different perspectives, we can bring down the intensity of how bad it feels and end up making a more utilitarian choice.
Just because utilitarian decisions seem the most rational doesn’t always make them the most desirable. It might depend on the situation. In the real world, we often assume that emotions cloud judgment and lead to worse decisions. But some have specifically argued that leaning into emotions is good for making moral judgments.
Here’s an example. Some believe that it’s not only good, but crucial, for nurses to feel empathy so they can better understand patients’ experiences. That understanding gives them a more accurate and sensitive moral perception, and ultimately, the ability to make better moral decisions.
It’s also possible that we need to lean into emotions to help us rally around climate action. Right now, climate change still feels abstract enough that many people may not see it as a moral issue. Or at least they don’t feel a strong moral intuition or motivation to become more involved.
But if we felt more empathy—or even disgust or anger—perhaps the issue would take on a moral and more urgent tone, motivating us to change our actions. In this way, emotions could end up being the fuel, and not the hindrance, to moral decisions that lead to the greater good.