If there’s one thing to take away from Daniel Kahneman’s Thinking, Fast and Slow it’s that we humans are not overwhelmingly guided by rational thought. In Kahneman’s language, we have a number of heuristics that we (subconsciously) use as shortcuts to deal with complex issues; whilst useful, these heuristics draw on biases that introduce errors and inaccuracies we don’t really recognise.
Kahneman’s thinking is a key part of a wide range of literature showing that humans are not the rational, predictable, benefit-maximising beings we think we are.
In the realm of public services this is fascinating because of the relentless focus on evidence-based policy – the equivalent of assuming that policy-making itself is rational, predictable, and benefit-maximising.
The appeal is understandable. Basing policy on evidence is common sense, isn’t it? Why would you base policy on anything else? Shouldn’t we only spend tax-payers’ money on what works?
But scratch the surface of these questions and things aren’t as rational, predictable, and benefit-maximising as evidence-based policy would have us believe. There are similar heuristics when it comes to making policy.
Several fascinating posts have been published recently that have prompted this reflection, all of which are worth reading.
The first is the findings of the Perils of Perception survey, which explores what the public thinks is the case on certain issues against what the reality is. Perhaps related to this, there’s George Osborne’s creative use of definitions of which areas of activity fall under which budget headings when it comes to where taxpayer money is spent. The third is Chris Dillow’s great post “Against evidence-based policy”, where he introduces four arguments that engage critically with the concept of evidence-based policy. The final is a tremendous post from Chris Hatton in which he explores a concern that Randomised Controlled Trials (RCTs) are becoming increasingly fetishised as the only valid research methodology. (I hope it’s not immodest to say that Chris’s post picks up on some points raised in my own post on facts, evidence and Personal Health Budgets.)
What might the heuristics of evidence-based policy be? Here are some suggestions, drawing on what each of the above might tell us:
- The Incomplete Picture heuristic: We think the evidence for evidence-based policy gives us a complete picture; in reality it can only give us a partial picture of what might happen when a policy is introduced
- The Definition heuristic: We assume everyone holds a common definition for what is “evidence” or “fact”; in reality not everyone agrees on such things
- The Political heuristic: We assume that what the evidence says should be pursued in policy is also what can be pursued politically; in reality it isn’t
Kahneman introduces the idea of two systems – System 1 and System 2 – that interact with each other to guide our behaviour. System 1 runs automatically and continuously and “effortlessly originates impressions and feelings that are the source of explicit beliefs and deliberate choices of System 2”. System 2 is “the conscious, reasoning self that has beliefs, makes choices, and decides what to think about and what to do.”
Unfortunately, System 1 has biases – systematic errors that it is prone to make in specific circumstances.
Because System 1 has limitations, such as little understanding of logic and statistics, it sometimes answers an easier question than the one it was asked. To do so it introduces a heuristic: a simple procedure that helps find adequate, though often imperfect, answers to difficult questions. That is, if a satisfactory answer to a hard question is not found quickly, System 1 will find a related question that is easier and will answer that instead.
(Jonathan Haidt has described Systems 1 and 2 in terms of the elephant (System 1) and the rider (System 2).)