Bending your mind

(featured image credit: Emilio Garcia CC BY)

Our beliefs are not always as solid and stable as we tend to, well, believe they are…

Our view of the world is shaped significantly by our beliefs. We believe that this, that, or the other supermarket caters best to our needs, so that’s where we do our weekly shopping. We are convinced that cars from a particular country reflect its reputation, so that determines the makes we drive (or aspire to drive). We believe that this or that politician or party best serves our interests, so that is how we vote.

It would indeed be very hard to navigate the complex choices we need to make every day if we had to rationally weigh up every possibility, and if we couldn’t rely on our stable, persistent beliefs. We barely ever question them, treating them as though they were truths. So it is not surprising that we rarely modify, let alone reverse, our beliefs. For that to happen, something extraordinary is needed.

A surprising lunch

In the spring of this year, Greggs, a UK chain of sandwich shops known for decent but unremarkable food like pasties and sausage rolls, took part in a food festival in Richmond, Southwest London. However, they disguised their identity, appearing as Gregory and Gregory instead. With offerings including a slow-roasted tomato and feta pasta salad and a vegan Mexican bean wrap, they set out to present their new summer menu to a pretty posh bunch of visitors – undercover.

I can’t believe it’s Greggs – via YouTube

And those visitors didn’t half change their minds, according to this video. Of course, it is a publicity campaign with not even a pretence of scientific validity, and careful editing ensured that only the right visitors to the stall were shown (it’s called selection bias) – if they were not paid actors, that is. But it chimes with the idea that making people change their minds requires a stark confrontation with a different narrative. It takes a lot to shake up well-entrenched beliefs.

Or does it? How hard would it be to make you believe that purple was blue?

Fooled by prevalence

Recent research by Harvard psychologist David Levari and colleagues shines a remarkable light on how unstable our judgement really is. Behind the somewhat uncool title of their paper, “Prevalence-induced concept change in human judgment”, lie fascinating insights from seven studies investigating what happens when the relative occurrence of a particular stimulus is reduced.

In the first few experiments, they presented their subjects with a sequence of 1,000 coloured dots, one at a time, chosen from a purple-to-blue continuum (shown in the image below). For each dot, participants had to indicate whether or not it was blue.

True blue? (image source)

Participants were assigned to one of two conditions. One group saw dots picked at random, such that every single one had a 50% chance of coming from the blue half of the continuum – this was the stable prevalence condition. The other group was subject to the decreasing prevalence condition: dots from the blue half were chosen with decreasing likelihood (50% in the first 200 presentations, then respectively 40%, 28%, and 16% in the next three blocks of 50, and finally just 8% for dots 351-1000).
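To make the design concrete, here is a minimal sketch in Python of how such a stimulus schedule might be generated. Representing each dot as a position on a 0-1 purple-to-blue continuum is my own assumption for illustration, not the authors’ actual stimulus code.

```python
import random

# Hypothetical reconstruction of the stimulus schedule described above.
# Each dot is a position on a 0-1 continuum:
# 0.0 = clearly purple, 1.0 = clearly blue.

def make_trials(schedule):
    """schedule: list of (n_trials, p_blue) blocks. Returns a flat list of
    dot positions, each drawn from the blue half with probability p_blue."""
    trials = []
    for n_trials, p_blue in schedule:
        for _ in range(n_trials):
            if random.random() < p_blue:
                trials.append(random.uniform(0.5, 1.0))  # blue half
            else:
                trials.append(random.uniform(0.0, 0.5))  # purple half
    return trials

# Stable condition: 50% blue throughout all 1000 trials.
stable_trials = make_trials([(1000, 0.50)])

# Decreasing condition: 50% for the first 200 trials, then 40%, 28%, 16%
# over three blocks of 50, and 8% for the remaining 650 trials.
decreasing_trials = make_trials([(200, 0.50), (50, 0.40), (50, 0.28),
                                 (50, 0.16), (650, 0.08)])
```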

In the first group, the decision whether a particular dot was blue did not change between the first 200 and the last 200 dots presented. But the second group were more likely to call a given dot blue in the final 200 dots (where only 8% actually came from the blue side of the continuum) than in the first 200. When blue dots became less prevalent, participants reported seeing more of them than were really there.

The dot trials (image source)

The researchers then ran the same experiment again, but this time told the second group that the prevalence of blue dots would decrease. Even knowing that blue dots would occur less and less, participants still shifted their judgement. In further follow-up studies, the result replicated when participants were specifically instructed to remain consistent and not be swayed by the prevalence – even with a monetary incentive, and even when the change in prevalence was abrupt rather than gradual. And when the researchers increased the prevalence of blue dots, the shift happened in the other direction.

Consistent results, but maybe a touch artificial and contrived: it is rare for anyone to face such a task outside a psychology lab. So the researchers tried something more realistic: they showed their participants computer-generated faces on a continuum from ‘very threatening’ to ‘not very threatening’. The same phenomenon appeared. Presented with a decreasing prevalence of threatening faces, participants were more likely to identify a given face as a threat when it appeared in the final 200 presentations than in the first 200.

Threatening faces (image source)

What about non-visual stimuli? One study also looked at whether participants would judge research proposals to be unethical – again chosen from a continuum. On the ‘ethically OK’ side were innocent ideas, such as “Participants will make a list of the cities they would most like to visit around the world, and write about what they would do in each one”. The middle contained ambiguous ones, for example “Participants will be given a plant and told that it is a natural remedy for itching. In reality, it will cause itching. Their reaction will be recorded”. At the ‘very unethical’ end were proposals like “Participants will be asked to lick a frozen piece of human faecal matter. Afterwards, they will be given mouthwash. The amount of mouthwash used will be measured.” And here too, participants in the decreasing prevalence condition were more likely to reject ethically ambiguous proposals that appeared towards the end than at the beginning.

Pessimism rules

The researchers’ conclusion is that we tend to expand our concept of what constitutes a blue dot, a threatening face, or an unethical proposal as its prevalence decreases. What was previously seen as purple becomes blue, faces that earlier seemed neutral become threatening, and originally ambiguous research becomes unethical.
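One intuitive way to picture this expansion is a judge whose criterion for ‘blue’ is not fixed but relative to the dots seen recently. The toy model below is my own illustrative sketch, not the mechanism the authors tested; it reuses the decreasing_trials sequence from the earlier sketch.

```python
def adaptive_judge(trials, window=50):
    """Toy model of a relative judge: a dot counts as 'blue' if it is bluer
    than the average of the last `window` dots. When blue dots become rare,
    the recent average drifts toward purple, the criterion drops, and
    borderline purple dots start being called blue."""
    judgments = []
    for i, dot in enumerate(trials):
        recent = trials[max(0, i - window):i] or [0.5]
        criterion = sum(recent) / len(recent)
        judgments.append(dot > criterion)
    return judgments

# Compare how often borderline dots (positions 0.4-0.6) are called blue
# in the first versus the last 200 trials of the decreasing condition.
j = adaptive_judge(decreasing_trials)
borderline = lambda d: 0.4 < d < 0.6
first = [b for d, b in zip(decreasing_trials[:200], j[:200]) if borderline(d)]
last = [b for d, b in zip(decreasing_trials[-200:], j[-200:]) if borderline(d)]
print(sum(first) / max(len(first), 1), sum(last) / max(len(last), 1))
```

The point of the sketch is only that a relative criterion is sufficient to produce the pattern the studies found; whether that is the actual cognitive mechanism is a separate question.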

One consequence the authors imagine is that people whose mission it is to reduce some social ill will fail to recognize the result of their efforts. As the original problems become less prevalent, situations that used to be perfectly fine will begin to take their place, as the definition widens. This may lead to frustration, and to the misallocation of resources to solving problems that no longer exist.

But these findings are also a concern for those of us without such noble objectives. We already tend to overestimate the frequency of dramatic events like violent crime, because it features so prominently in the (social) media. If, thanks to effective interventions, violent crime becomes less commonplace, we will not necessarily feel safer. Instead, we may start judging minor incidents as evidence that violent crime persists.

Perhaps this effect on the general public is even more significant than that on policymakers and implementers. We may not actually change our minds, but we certainly bend them in the face of changing prevalence, and risk seeing the world as darker and worse than it really is. If this happens to us as individuals, it will probably also be reflected in public opinion. Max Roser and Mohamed Nagdy at Our World in Data devoted a fascinating post to optimism and pessimism; the research by Levari and colleagues offers one explanation for our collective pessimism.

This pessimism inevitably shapes our decision-making. It may make us more fearful than we should be – avoiding risks we overestimate, and receptive to anyone playing on those fears, whether they are trying to sell us insurance or to win our vote.

The fact that, even as the world gets better, our minds seem to bend so easily towards pessimism is itself, sadly, grounds for some pessimism.
