When in doubt… do something, or do nothing?

(credit: Esa L CC BY)

When we’re not sure, our biases guide our choices… but which ones?

Imagine you’re the goalkeeper of a great football team in a championship final. The game is nearly over and your side is winning 2-1, but an unfortunate foul by one of your teammates in the final minute of injury time means you’re facing a penalty kick. Stop it, and you will be the hero of the match; let it in, and thanks to you it’s extra time, and possibly defeat. What is your plan – dive left, dive right… or stay put?

Penalty kicks take place at the boundary of human reaction time. The player needs to decide where to place the ball before the goalkeeper moves, and the keeper needs to choose what to do before the ball is actually kicked. If you did what most goalkeepers do, you’d go for one of the two sides. Unfortunately, about a third of penalty kicks are placed in the centre of the goal, as a study by Michael Bar-Eli, a psychologist at Ben-Gurion University, and colleagues found. They analysed 286 penalty kicks from top leagues and championships by watching footage of the matches (research can be really hard!), and worked out that 29% of them were kicked in the centre, 32% to the left, and 39% to the right. However, the goalie jumped left 49% of the time and right 45% of the time, and stayed in the centre just 6% of the time.
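To see why staying put could pay off, here is a rough back-of-the-envelope sketch in Python. It uses the kick distribution quoted above, but the conditional save rates are purely illustrative assumptions of mine, not figures from the Bar-Eli study.

```python
# Rough illustration of the goalkeeper's dilemma, using the kick
# distribution quoted above (29% centre, 32% left, 39% right).
# The save probabilities below are assumed for illustration only;
# they are NOT figures from the Bar-Eli study.

kick_dist = {"left": 0.32, "centre": 0.29, "right": 0.39}

def save_prob(keeper, kick):
    """Assumed chance of stopping the ball for a given keeper action and kick direction."""
    if keeper == kick == "centre":
        return 0.60   # assumption: a central kick is easy to stop if you stay put
    if keeper == kick:
        return 0.30   # assumption: even a correct dive often fails
    return 0.02       # assumption: a wrong guess almost always means a goal

def expected_save_rate(keeper_choice):
    return sum(p * save_prob(keeper_choice, kick) for kick, p in kick_dist.items())

for choice in ("left", "centre", "right"):
    print(f"always go {choice:>6}: expected save rate {expected_save_rate(choice):.1%}")
```

With these assumed numbers, staying in the centre comes out comfortably ahead of diving to either side – which is exactly the kind of comparison the jumping keepers appear not to be making.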

“Something must be done”… or must it?

Why this tendency to jump, rather than stay put, if staying put would increase the chance of stopping the ball? It is called action bias. People sometimes tend to “do something”, perhaps because if they didn’t, and the outcome was bad, they’d be blamed or they’d regret it. When they take action, at least they can say they tried. This is a phenomenon we can see not just in sports, but also in policy-making and organization management: “something must be done, this is something, so let’s do it”, with little regard for the actual effectiveness of our action.

Sometimes, that is. At other times, people seem to prefer to do nothing. An example that springs to mind is the controversy around vaccination. A rather alarming finding of a survey by the Wellcome Trust, published last Wednesday, is that across Western Europe 22% of respondents disagree or strongly disagree that vaccines are safe. In France, in particular, 1 in 3 people disagree that vaccines are safe. Not surprisingly, France is experiencing a significant rise in the incidence of measles, as parents decide not to have their children vaccinated.


Lack of trust, lack of action (source: Wellcome Global Monitor 2018)

Just like with the penalty kick, a decision needs to be made about an uncertain future, but here some people seem to choose inaction, rather than action. Perhaps this is because the result of ‘doing something’ can be – at least in the perception of the decision-maker – much more severe than losing the championship. Or is there more to it?

This brings to mind the classic philosophical thought experiment of the Trolley Problem. Faced with a runaway trolley heading for a section of track where five workers are about to be killed, would you flick a switch to divert it to a siding where only one worker will be killed? About 9 in 10 people would do so. However, if the method for saving five lives by sacrificing one is not flicking a switch, but throwing a fat person onto the track, far fewer people would take this course of action. We treat situations in which people suffer because we did nothing differently from situations in which they suffer as a direct result of something we did – even though both involve a deliberate decision.

But what the Trolley Problem (in its canonical form at least – there are numerous variants out there) does not reflect is the uncertainty of decisions like the goalkeeper’s or the parent’s: they don’t know exactly what the outcome will be if they do, or don’t, take action. It is not knowing for sure what the consequences of our action will be that makes the decision so much harder.

We can see this also in the reluctance of some women to opt for Hormone Replacement Therapy (HRT) to counteract the effects of the menopause. For many women, these can seriously reduce the quality of life: loss of libido, painful intercourse, bone loss and increased likelihood of fractures, weight gain, joint pain and more. HRT can be very effective at reducing these symptoms, but in 2002 it was found to be associated with an increase in the risk of breast cancer. A woman undergoing HRT for 5-9 years between the ages of 50 and 59 is about 4 times more likely to develop breast cancer than a woman who doesn’t. That is, however, from a low baseline: the risk goes from about 2% to about 8%. On the other hand, a higher weight (one of the effects of the menopause) is also associated with an elevated risk of breast cancer.
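It is easy to be swayed by the phrase “4 times more likely” without registering the absolute numbers behind it. A minimal sketch, using only the approximate figures above, makes the distinction between relative and absolute risk explicit.

```python
# Relative vs absolute risk, using the rough figures quoted above
# (a baseline of about 2% rising to about 8% after 5-9 years of HRT).
baseline_risk = 0.02
hrt_risk = 0.08

relative_risk = hrt_risk / baseline_risk        # "about 4 times more likely"
absolute_increase = hrt_risk - baseline_risk    # 6 percentage points

print(f"relative risk:     {relative_risk:.0f}x")
print(f"absolute increase: {absolute_increase:.0%}")
print(f"women on HRT who do not develop breast cancer (in this rough model): {1 - hrt_risk:.0%}")
```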

Two cognitive biases

Since the revelation, 17 years ago, that HRT could lead to breast cancer, many women have stopped using it (or decided not to start using it), despite the significant improvements in quality of life and the health benefits it can offer. This is a difficult trade-off to make, and two cognitive biases interfere with our ability to serenely reason about it: loss aversion and ambiguity aversion.

Loss aversion describes how we experience a loss more intensely than a gain of similar magnitude. Ambiguity aversion is our tendency to avoid situations where the probabilities are unknown. As an example, consider the following choice: you can draw a ball from one of two urns in front of you, and if it is white, you win. One urn contains 50 black and 50 white balls; the other contains 100 balls, black or white, but you don’t know how many of each. If you prefer to draw from the first urn, you exhibit ambiguity aversion.
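What makes the urn example so striking is that, on paper, the two options are equivalent: if we assume (purely for illustration) that every possible composition of the unknown urn is equally likely, the expected chance of drawing a white ball is 50% either way. A quick sketch:

```python
# The two-urn choice described above, under an assumed uniform prior
# over the composition of the unknown urn.
from fractions import Fraction

# Known urn: 50 white balls out of 100.
p_white_known = Fraction(50, 100)

# Unknown urn: 0 to 100 white balls, each composition assumed equally likely.
p_white_unknown = sum(Fraction(w, 100) for w in range(101)) / 101

print(p_white_known, p_white_unknown)   # both print 1/2
```

Yet most people reliably prefer the known 50/50 urn – that gap between two equally good options is ambiguity aversion at work.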


Uncertainty either way… get the jab, or don’t get the jab? (image: Victoria Borodinova)

It is of course not easy to compare the experienced ‘loss’ of developing breast cancer with the ‘gain’ of reducing or eliminating the detrimental menopause symptoms, but the prospect of the former can loom large in our perception of the choice. Together with an aversion to ambiguity, this can tilt the uncertainty balance: we don’t really know how bad the menopause symptoms would be, and perhaps we can learn to live with them. On the other hand, we may not know that we will develop cancer, but we know (“for certain”) that it is possible. And suddenly, the choice becomes clearer.

A similar thought process could explain the choice of a parent not to have their children vaccinated: measles is still a relatively rare disease, and if one is convinced that vaccines can cause autism in children, that settles the decision too. In both situations, a prior belief that HRT and vaccines are “unnatural”, and that Big Pharma and governments are in cahoots, may reinforce the chosen course of (in)action – such unambiguous beliefs are powerful arguments when one is ambiguity averse.

Of course, our decisions in situations like these depend crucially on our perception and indeed our prior beliefs. We could construct similar arguments in favour of vaccination and HRT. There may as yet be unknown detrimental side effects of vaccines (high ambiguity), but we know they are effective against a serious disease (low ambiguity). We don’t know whether we will develop breast cancer as a result of HRT (high ambiguity), but we do know it will vastly improve the quality of our post-menopausal life (low ambiguity). And if we trust scientists and the robustness of the scientific methods that produce the insights we use, that too will reinforce our choice.

When we cannot (or will not) use reasoned deliberation to evaluate the options before us, we fall back on simpler mental shortcuts. Unfortunately, we are a bundle of contradictions, and depending on how we look at the world, we may opt for action, or for inaction. Neither guarantees us the correct solution.

Thankfully, alongside our contradictory tendencies, we possess another powerful capacity: hindsight bias, or the conviction after the event that we chose to do the right thing – even if it was to do nothing.
