How behavioural science is becoming a victim of its own popularity
Not so long ago, most of the material published on behavioural economics and behavioural science had a limited audience of practitioners and academics. It may have been a bit dry (the majority were scientific papers), but then again, the readership was generally after facts and findings rather than fancy pictures. More recently, when you come across a reference to behavioural science, chances are it will be something like a colourful infographic or a listicle of cognitive biases.
That looks like it could be a good thing. A topic that, for a long time, was only of interest to specialists is now gaining the attention of a wider public. Efforts to educate, especially in a scientific field that is deeply relevant to how we act and interact with each other, have to be applauded.
But it is not so clear whether the popularization of behavioural science in this way is actually serving that educational purpose all that well.
A first problem is the way the information is framed. Putting an attention-grabbing title over a scientific paper may not have that great an effect on citations and downloads. But articles in the popular media tend to have headlines that emphasize the more spectacular aspects of the content. There is a more than passing parallel here with medical research findings. Anything that (apparently) dramatically increases the risk of cancer is a good bet to attract eyeballs, notably for the British tabloid the Daily Mail. The behavioural science equivalent is the cognitive bias – invariably wheeled out as something that makes us all incredibly irrational. Biases screw up decisions, convince us the world is falling apart, impact trading decisions, prevent us from being rational, or will drive the future of marketing. Hyperbole is rife, and nuance is scarce.
A second problem is that behavioural science is still evolving: insights get refined, or sometimes contradicted and relegated. But that doesn’t stop findings being widely reported (especially if they fit a popular narrative, hello confirmation bias!), before they have been replicated.
For example, a 2014 article by Alain Cohn et al. provides evidence for the idea that bank employees (and only they) act more dishonestly when their professional identity as bankers is made salient. They conclude that “the prevailing business culture in the banking industry weakens and undermines the honesty norm”. This is a message that resonates well with the popular perception of the finance industry, and so it easily found currency.
Yet a new paper by Jean-Michel Hupé suggests the statistical methods used were flawed and led the authors to distort the “evidence”. Will this change public opinion back to what it was? That is doubtful: the finding lacks the sensational characteristics of the original (“evidence of culture of dishonesty in banking industry”). More importantly, it is hard to unlearn something, as another example from the field of medicine illustrates. In the 1990s, cholesterol rapidly became the ultimate health bogeyman: it was held responsible for pretty much all of mankind’s cardiovascular troubles. We had to seriously restrict our cholesterol intake or risk early heart attacks and clogged arteries. More recently, the picture has become dramatically more nuanced, yet the meme that cholesterol is evil lives on undiminished.
Perhaps the biggest problem is not what people think, but what they do. Behavioural science is complex, messy and fuzzy. Human behaviour is subject to multiple influences, which combine in elaborate ways and often work against each other. The effect of an intervention cannot easily be predicted without proper consideration and indeed experimentation. The context in which a phenomenon is observed can play a huge role, but is not always taken into account.
Way(s) to mislead
Attempts at simplifying this complex, messy and fuzzy domain create impressions that can be misleading:
- “Biases are bad”
Almost invariably, they are pictured as the source of mistakes, poor decisions and undesirable outcomes. What is rarely mentioned is that biases have evolved over a long time, and have served us, as arguably the most successful species ever, very well. Biases are like tools: they can be used well, and for benign purposes, but they can also be used inappropriately and with unwanted outcomes. That subtlety is generally lost in fancy infographics.
- “Behavioural science is really a collection of effects, fallacies, heuristics and biases”
It is true that, in order to understand a topic well, having a vocabulary that describes it is essential. Dilip Soman, a behavioural economist at the University of Toronto, explains how this works for categories like wine, classical music and quilting: if we have terms to describe what we see, we can make sense of a complex subject. But a list of terms is not enough. You will not become a proficient speaker of a foreign language just by memorizing the definitions of a list of words, and you will not become a behaviour expert by being able to recite a list of biases.
This can tempt people who believe they have acquired the necessary competence from popular articles and infographics into making behavioural interventions. A remarkable example of the Dunning-Kruger effect (behavioural economics edition) is United Airlines’ ill-fated attempt to replace conventional performance bonuses with a lottery system in which a handful of lucky employees would ‘win’ large amounts of money, a luxury car or a holiday. The mind boggles.
In his 2015 blogpost, “Please! Not another behavioural bias”, Jason Collins shines a light on this obsession with biases as a fundamental problem in behavioural economics (and by extension in behavioural science). At the time, I thought he was a bit overcritical and alarmist, but time has proved him right on the money.
Dealing with the genie
So, what to do about it?
Unfortunately, the genie is well and truly out of the bottle, and that means the possibilities are limited. (Would it have been possible to keep the genie locked in? Looking at the field of medicine, probably not.)
One thing we can all do is be more critical about simplifying infographics. As Albert Einstein (may have) said: “Everything should be made as simple as possible, but not simpler”, and many of them would appear to cross that boundary into oversimplification. Infographics, even efforts like Buster Benson and John Manoogian III’s massive cognitive-bias cheat sheet (based on Wikipedia’s list of biases), risk giving the uninformed the illusion of knowledge and understanding. Don’t make the mistake of believing that a superficial summary tells the full story. It is not possible to condense the essence of quantum physics into an infographic – and neither is it possible to do so for behavioural science.
Even though it may feel like a futile effort, given how much more quickly fake stories spread than the truth, anyone with knowledge in the field should challenge and call out dubious or oversimplified material. A single critical tweet or blogpost may not make much difference, but if many people do so systematically, the effect could be significant.
Perhaps the best antidote to the abundance of oversimplified and misguided material is to spread correct information: source material, solid explanations, material that approaches the field in a nuanced rather than a sensationalist manner. The lack so far of a unifying theory of behavioural science doesn’t help, but recent work by Harvard professor Xavier Gabaix, for example, looks promising. In time this may make it easier to distinguish between superficial clickbait and valuable insights.
Let us ensure that the abbreviation for Behavioural Science prevails, and does not get confused with that other term with which it shares its initials.