Belief without value

Our beliefs often stand in the way of better decisions – because we value them too much

The great economist John Maynard Keynes reportedly once said, ‘When my information changes, I alter my conclusions. What do you do, Sir?’ As so often with such powerful quotes, there are no reliable sources that confirm Keynes really said this. But that doesn’t detract from the profound wisdom in the guidance it offers.

It summarizes very well the basis of the scientific method (and not just for the dismal science, for that matter). When a theory no longer fits the observed facts, it is discarded or adapted. Research proceeds through hypotheses, which are accepted or rejected. Researchers are, ideally, agnostic: they don’t care which hypothesis turns out to be right or wrong.

Bad science

Unfortunately this is not always the case. Sometimes researchers are motivated by matters other than the pursuit of the truth. The chance that a study is published is a lot higher if it describes a positive result. That leads to publication bias (the world rarely hears about a study that finds an absence of an effect) and to p-hacking, the massaging of the data to arrive at a statistically significant result.

Even ideology can be involved in the manipulation of scientific research. Some people are convinced that living near electrical power lines is the cause of serious health problems. So when, in 1992, a Swedish study by Maria Feychting and Anders Ahlbom of the Karolinska Institute found that the incidence of leukaemia among children living within 300m of such a distribution line was four times higher than normal, this was grist to the mill of campaigners.

[Image: overhead power lines. Caption: ‘Hello, leukaemia?’ (image: Oran Viriynici)]

The American PBS show Frontline investigated the phenomenon in the 1995 episode Currents of Fear (the transcript is no longer available on the Frontline site, but as so often, Archive.org comes to the rescue!). The study in question appeared to have fallen foul of the so-called multiple comparisons fallacy. Eh? Well, the Swedish researchers did not just look for a link between proximity to overhead power lines and leukaemia: they simultaneously investigated no fewer than about 800 different risks.

Does that make a difference? Certainly. Consider this example: imagine I want to pay you with a coin, but you suspect it is counterfeit. If a fair coin is tossed 10 times, the chance that it turns up heads at least 9 times is quite small (1.07%). Let’s agree that if a coin does this, it is most likely a fake, so you can use that as a test.

But what if my debt were 100 coins, and you suspect there are fake coins in the purse… would the approach for an individual coin be sensible for a whole bag? The chance that a fair coin turns up heads fewer than 9 times is nearly 99% (100% − 1.07%). The probability that two fair coins both turn up heads fewer than 9 times is a little smaller: 98.93% × 98.93% = 97.86%. For three coins it is a bit smaller still (96.81%), and so it goes on, until for 100 coins it is just under 34%.

This means that the chance that at least one coin will turn up at least nine heads out of ten is 66% – nearly 2/3.
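If you want to check the arithmetic, here is a quick sketch in Python. It uses only the assumptions already in the example: ten tosses per coin, a threshold of at least nine heads, and a bag of 100 fair coins.

```python
from math import comb

# Chance that a single fair coin shows at least 9 heads in 10 tosses
p_single = sum(comb(10, k) for k in (9, 10)) / 2**10
print(f"one coin, >= 9 heads: {p_single:.4f}")                 # ~0.0107 (1.07%)

# Chance that none of 100 fair coins passes that threshold...
p_none = (1 - p_single) ** 100
print(f"100 coins, none >= 9 heads: {p_none:.4f}")             # ~0.34

# ...and hence that at least one coin looks 'fake' purely by chance
print(f"100 coins, at least one >= 9 heads: {1 - p_none:.4f}") # ~0.66
```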

It may then look as if that one coin is a fake, but that is only because we are testing so many coins at the same time. We never specified upfront which coin we suspected of being fake. The fact that one coin out of a hundred produces an unusual result is, well, pretty unremarkable, and certainly not proof that I am out to deceive you.

The same happens when you investigate the correlation between living near overhead power lines and 800 different conditions. Even if there is nothing to worry about, a handful out of that set of 800 conditions might well show an elevated frequency purely by chance. (This phenomenon is sometimes called the Texas Sharpshooter Fallacy, after the marksman who first fires at random at an old barn, and then paints the target around the bullet holes.)
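The same arithmetic scales up. Purely as a hypothetical illustration – the 5% significance level and the assumption of independent tests below are mine, not figures from the Swedish study – here is what testing roughly 800 conditions looks like:

```python
# Hypothetical illustration: ~800 independent tests at a conventional 5%
# significance level (assumed values, not taken from the Swedish study)
n_tests, alpha = 800, 0.05

expected_false_positives = n_tests * alpha    # ~40 spurious 'findings' expected
p_at_least_one = 1 - (1 - alpha) ** n_tests   # effectively certain to see some

print(expected_false_positives)               # 40.0
print(round(p_at_least_one, 6))               # ~1.0
```

Under those assumptions, a few dozen ‘elevated’ results are to be expected even when there is nothing going on at all.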

So there was nothing going on in Sweden. But several groups, with good (or not so good) intentions, maintained that something fishy was going on, finding a fertile breeding ground among people who already believed there was something sinister about these power lines.

A thin line between belief and confirmation bias

Confirmation bias is indeed often the culprit in situations like this: we look for information that supports what we believe, rather than for facts that might falsify it. Worse still, when we are confronted with information that should make us question our beliefs, we double down. This is why beliefs such as the claim that Barack Obama was not born in the USA and is a Muslim, or that EU membership costs the UK £350M per week, have proved impossible to eradicate. In 2005 and 2006, Brendan Nyhan and Jason Reifler studied the tendency of people to reject arguments and evidence that contradict their beliefs. In several cases, they found that corrections actually reinforced the misperceptions people held, especially among those who were the most committed.

This is known as the backfire effect. A 2016 study by neuroscientists Jonas Kaplan, Sarah Gimbel and Sam Harris placed subjects in an MRI scanner and confronted them with arguments that challenged their strongly held beliefs, political and otherwise. They found that the brain activity they observed was “signaling threats to deeply held beliefs in the same way they might signal threats to physical safety.” As Gimbel and Kaplan explain in an episode of the You Are Not So Smart podcast, the more a subject resisted changing their mind, the stronger the activity was in two areas: the amygdala and the insular cortex. “We find that the response in the brain is very similar to what would happen if you were walking in the forest and came across a bear.”

Of course we cannot switch off this response to threats: it is in our interest to engage protective mechanisms when we are under physical threat from a bear, a mugger or an out-of-control trolley. But we do have a choice in how deeply we hold a belief.

It is the strength of the attachment to a belief that makes us hold on to it, even in the face of unquestionable evidence to the contrary. If our belief is strong enough, we resist changing our mind as much as we would resist having an arm eaten by a wild animal.

Scientists ideally approach a subject from a neutral viewpoint, without an emotional pay-off for either the confirmation or the falsification of a hypothesis. The true purpose of research is to discover the truth, not to be right. But we ordinary people also sometimes have to choose between getting wiser and hanging on to an existing belief. If we really want to pursue wisdom, we must leave behind the idea that giving up a belief comes at an emotional cost. We have to accept that losing a belief can be a good thing.

For belief in itself has no value. Only a belief that is not contradicted by the facts can be valuable.

What about the question of whether Keynes really spoke the words quoted at the beginning of this article? Dyed-in-the-wool Keynesians might like to believe he did, because the quote fits his character so well. But anyone doing so really ought to be prepared to give up that belief. Even if, one day, someone came up with unequivocal proof that it was actually said by his rival, that other great 20th-century economist, Friedrich von Hayek.

 
