Latest From the Lab: The Dark Corners of Safety
Imagine you’ve invested months in a project that promises to boost your bottom line and position your organization ahead of your competitors. As the work continues, issues pile up: Employees complain of burnout, and some even quit. And as you near the end, you discover a competitor has beaten you to it, and now all signs suggest you should abandon the project. Do you throw in the towel and walk away? Or do you persist while proclaiming, “I might as well continue … I’ve come so far,” despite the looks of unease from those around you?
The dark corner
We can all agree that once we’ve invested effort in a goal, it’s hard to let it go. But what if our perseverance affects our moral decision-making? In fascinating new work in the Journal of Judgment and Decision Making, participants were asked how likely they would be to continue with a project they had invested in after realizing it was going to fail. The study showed that people who had already started on a course of action and invested resources were more likely to continue, even when they knew continuing wouldn’t lead to success.
On the surface, this may seem harmless. But looking at the scenarios put forth in the study reveals a dark side. In one, you’re tasked with developing a cure for a painful disease, and to do so, you’d have to kill 1,000 monkeys. In another scenario, you’re working on a project to relieve traffic congestion and must bulldoze a neighborhood to achieve your objective. In both scenarios, you learn that another company has beaten you to your goal, and you’re asked how likely you are to continue the work — and kill the monkeys or destroy the homes.
What would you do?
From your current position, it might seem a ridiculous question. Why would you willingly choose to continue with either of these projects and make decisions that many would view as morally unacceptable? The researchers asked the same question and discovered that participants weren’t normally OK with making immoral decisions. But after investing in a course of action, they became biased in such a way that these questionable decisions now fell within the realm of morally acceptable ones.
The examples above represent one of many cognitive biases, or mental shortcuts that we take, largely unconsciously, when we make decisions. Whether we realize it or not, as we face a continuous barrage of decisions over the course of each day, we use shortcuts to make those decisions as efficiently as possible while avoiding pain or loss. One of the ways our brain has evolved to do this is to reach first for easy-to-access or recently used information, as well as any information that could help avoid a loss. Although an overwhelming number of cognitive biases exist, the SEEDS model categorizes them into five simple domains: Similarity, Expedience, Experience, Distance, and Safety. Falling within the Safety domain, this sometimes dark mental shortcut is known as sunk-cost bias: the difficulty we often have in throwing in the towel after we’ve invested resources in a goal.
The neuroscience behind our cognitive biases explains why we are sometimes driven to make questionable or immoral decisions, such as continuing to bulldoze a neighborhood even when it won’t help with traffic congestion. Shaped by our evolutionary past, we are motivated to avoid danger, loss, or pain to help us survive. This means we pay more attention to information about what we’ve lost or could still lose than to information about what we could save or gain. In the scenarios above, we have already lost resources, so our motivation to continue down the current path is stronger. We don’t want what we’ve already invested to have been for nothing — even if continuing will create still more losses.
Choose a sunnier avenue
The science that helps us understand why we make these kinds of decisions can also help steer us away from a potentially dark path. After acknowledging that your decisions may not always be objective because of safety biases, you can diverge from the shadowed path by placing some distance between yourself and the decision.
The first strategy for creating distance is to put yourself in the shoes of a friend, imagining that you’re making this decision for them. Research has shown that when making decisions for someone else, you can be less biased, because the decision is detached from your sense of self and your own potential gain or loss. Another way to distance yourself is to place the decision in the past. Imagining that you have already made the decision and are now looking back on it from a more objective vantage point makes the events feel less emotionally charged and less tied to your sense of self. Does it still seem like a good idea to bulldoze the homes?
These strategies target the core biology that drives sunk-cost bias by engaging a part of the brain called the ventrolateral prefrontal cortex, or vlPFC. This region, commonly referred to as the brain’s “braking system,” helps you regain self-control by drawing resources away from areas involved in emotion and fear and toward areas responsible for focused work and goal-directed planning. In fact, research has shown that when the vlPFC is more active, people tend to make less biased decisions.
So the next time you’re asked whether the project you just spent half a year on should be shelved, remove yourself from the driver’s seat, imagine some distance, and then decide whether you still want to continue.