Most of us would prefer a world that looks like this (World A):
But all of us are stuck with a world that looks more like this (World B):
(unless one of you is hiding a crystal ball under your mattress)
The difference between these two worlds is simple.
World A is governed by measurable uncertainty—known risks.
World B is governed by unmeasurable uncertainty—unknown risks.
We, as humans, prefer known risks to unknown risks. Unknown risks mean not knowing, and not knowing makes us very uncomfortable.
Behavioral economists call this ambiguity aversion.
Daniel Ellsberg came up with the Ellsberg Paradox to illustrate how it works.
Imagine an urn. It’s filled with 30 red balls and 60 balls that are either black or yellow. We don’t know how many black balls there are or how many yellow balls there are, but we know the combined total of black and yellow balls is 60.
We have a choice between two gambles:
Gamble A: Get $100 if we draw a red ball
Gamble B: Get $100 if we draw a black ball
Which gamble would you take?
Most people choose Gamble A—it’s a known risk. We know there’s a ⅓ chance that we’ll draw a red ball.
Gamble B is a bit more complicated—it involves an unknown risk. We’re not sure exactly how many black balls there are. There might be 50. There might be 10. There might be 55—or only 3.
We prefer the known risk to the ambiguous one.
As the saying goes, “Better the devil you know than the devil you don’t.”
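What makes this a paradox is that, if we have no reason to believe the urn favors yellow over black, the two gambles pay off equally often. A quick simulation makes that concrete (a sketch, not from Ellsberg’s paper—the assumption that every black/yellow split from 0 to 60 is equally likely is mine):

```python
import random

TRIALS = 100_000

def gamble_a() -> float:
    """Win if a red ball is drawn from 30 red + 60 black/yellow."""
    wins = sum(random.randrange(90) < 30 for _ in range(TRIALS))
    return wins / TRIALS

def gamble_b() -> float:
    """Win if a black ball is drawn. The black count is unknown,
    so we assume every count from 0 to 60 is equally likely."""
    wins = 0
    for _ in range(TRIALS):
        black = random.randint(0, 60)       # unknown composition
        wins += random.randrange(90) < black
    return wins / TRIALS

print(f"Gamble A win rate: {gamble_a():.3f}")  # close to 1/3
print(f"Gamble B win rate: {gamble_b():.3f}")  # also close to 1/3
```

Both rates hover around ⅓, yet most of us still reach for Gamble A. The discomfort isn’t about the odds—it’s about not knowing them.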
This paradox might seem a bit theoretical, but there’s another study from a group of researchers in the Netherlands that illustrates the idea with a bit more poignancy.
The researchers gave two groups of people a series of 20 electric shocks. Some were intense and some were mild.
The first group knew they’d receive an intense shock every time. 20 intense shocks.
The second group knew they’d receive only 3 intense shocks, but they had no clue when they’d receive an intense shock rather than a mild one. 17 mild shocks and 3 random intense shocks.
Their findings are interesting: the second group exhibited more symptoms of fear and anxiety than the first group, even though they received only 3 intense shocks. They sweated more profusely and their hearts raced faster.
There was no ambiguity for the first group—they knew they’d have 20 intense shocks, one after the other.
But the second group’s situation was more ambiguous. They were left to agonize over the uncertainty of the situation. Will the intense shock come this round? The next? How intense will it be?
In an op-ed that touches on this study, Harvard psychologist Dan Gilbert wrote, “human beings find uncertainty more painful than the things they’re uncertain about.”
Alfred Hitchcock made masterful suspense-thrillers by applying this to cinema. As he once said, “there is no terror in the bang, only in the anticipation of it.”
Ambiguity often acts as a smoke screen that clouds our ability to make sound decisions. We don’t like it—we go out of our way to avoid it.
We choose the burger joint with 500 reviews on Yelp over the Italian restaurant without a website. We choose the standard career path because the uncertainty of a non-traditional path paralyzes us. How many times do we forgo the unknown in favor of the known only because we’re afraid of the ambiguity that surrounds the unknown?
We don’t have to embrace ambiguity with open arms, but understanding how it shapes our decisions can help us make better ones.
As individuals, we can pay attention to the role ambiguity plays in our decision making. We shouldn’t choose the known risk because the ambiguous risk is just that—ambiguous. We can seek out more information; it might help us navigate the uncertainty.
As leaders, we can observe how ambiguity might prevent our people from taking action. We can work to reduce ambiguity where possible. We can strive to remove the smoke screen of ambiguity from our work.