Thinking, Fast and Slow – a book review

Daniel Kahneman is an interesting man. Born in 1934, he is a psychologist mostly concerned with prospect theory, decision making and the psychology of judgement. Almost as a sideshow, he also established the intellectual foundations of behavioural economics, for which he was awarded the 2002 Nobel Prize in Economics. Unsurprisingly (or is it? he would ask), he also writes very interesting books.

[Image: Which line is longer?]

Thinking, Fast and Slow is one such book. At its centre is the human mind. Intuitively, we know ourselves to be a rock of rationality, with sound decision-making skills influenced primarily by factors relevant to the decision at hand. We have emotions, and in extreme moments they might push us into rash decisions or regrettably missed opportunities, but on a day-to-day basis they do not have much effect. Our intuition is sometimes a good guide and sometimes not, but we have a good intuition as to when we should rely upon our intuition and when we shouldn’t. That rock of rationality might not be granite, but at the very least it’s a pretty tough sandstone. Something we can depend on.

Or so we might think. It turns out that our concept of how our mind works is grossly disconnected from how it actually works, and Kahneman has spent decades showing this in experiment after experiment. When we change our minds, we cannot remember what we used to believe; worse, we do not notice the change, and we unknowingly misreport what our previous beliefs were. We think we understand why something happened in the past, and so we exaggerate our ability to predict the future. We are incorrigibly optimistic and unaware of it. We consistently fail to optimise our decisions as a whole, instead making individual choices that leave us demonstrably, predictably worse off. We ignore base rates and judge more detailed stories to be more likely, even though every added detail can only make a story less probable.

One example that combines several of these fallacies: more than 90% of new businesses fail in the first three years, yet 81% of entrepreneurs thought their odds of success were better than 7 out of 10, and one third gave themselves a 0% chance of failure.

And I could go on listing more biases and more examples, but let’s try an experiment instead – and remember that in being forewarned you are also forearmed. Consider the following descriptions of Alan and Ben and note your immediate reaction to each. Would this person impress you? Would you like them? Would you want to work for them? Would you want to be their manager?

  • Alan is intelligent, industrious, impulsive, critical, stubborn and envious. What is your reaction?
  • Ben is envious, stubborn, critical, impulsive, industrious and intelligent. What is your reaction?

If you are like most people (and me), you will have immediately viewed Alan more favourably than Ben. If you’re a close reader, you probably also noticed that the two lists contain exactly the same adjectives, just in reverse order. First impressions matter: the meaning of many personality traits is ambiguous and is coloured by the traits that precede them, even when that ordering is pure chance. We’re also really bad at making fair comparisons. The reality is that we are no rock. At best we’re a plastic bag blowing in the wind.

Kahneman has also gone some way towards uncovering the mechanisms by which these mistakes are made. At its core, his model of the human mind consists of two agents. System 1 operates quickly and automatically, with no conscious control. It makes fast associations and carries out tasks autonomously, whether those tasks are visual processing (“that car is driving towards me”), simple numerical evaluations (“2 + 2 = …”), assessing others’ emotional and mental states (“he couldn’t keep his eyes off you”) or matching examples to stereotypes (“Tom is meek and quiet with a passion for detail; you said he’s either a farmer or a librarian, but I think he sounds more like a librarian”).
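
As an aside, the Tom vignette is precisely where base rates should enter the picture, and a rough Bayesian sketch makes this concrete. The numbers below – twenty farmers for every librarian, and how well the description fits each group – are made up for illustration; they are not figures from the book:

$$
\frac{P(\text{librarian}\mid\text{description})}{P(\text{farmer}\mid\text{description})}
= \frac{P(\text{description}\mid\text{librarian})}{P(\text{description}\mid\text{farmer})}
\times \frac{P(\text{librarian})}{P(\text{farmer})}
= \frac{0.5}{0.1}\times\frac{1}{20}
= \frac{1}{4}
$$

Even when the description fits librarians five times better, a 20:1 base rate still makes Tom four times more likely to be a farmer. System 1 matches the stereotype and never thinks to ask how many farmers there are.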

System 2 is the opposite. It allocates attention to effortful mental activities like focusing on a single person’s voice in a hubbub, looking for a man wearing a blue jacket, searching memory to identify a surprising sound, multiplying two long numbers together in your head or parking in a small space. System 2 is also lazy and “the often-used phrase ‘pay attention’ is apt: you dispose of a limited budget of attention that you can allocate to activities, and if you try to go beyond your budget, you will fail”.

It is these limitations of System 1 and System 2 that bring about the systematic fallacies behind so many of our mistakes. We substitute difficult questions with easier ones and then answer the easier one without realising it. It takes enormous effort to factor anything not perceptually present into our decisions – “What You See Is All There Is”. We anchor on arbitrary numbers, colours and categories when we make decisions. We ignore overwhelming statistics in favour of causal stories that we already know are untrue. We are primed by our immediate and distant past experience. Remember Alan and Ben?

At the end of the day, the rational rock-cum-plastic bag is in trouble and there is no magic wand that stops us making bad decisions. However, there are a number of techniques that we can use to mitigate and reduce the impact of our inevitable biases.

Consider the problem of planning a project. Despite the most strenuous efforts, such plans are almost always the best-case scenario. You simply cannot know the “unknown unknowns” that cause almost all project delays. And so it should be no surprise that the International Space Station was budgeted to cost $37 billion but ultimately cost $105 billion, or that the Sydney Opera House was meant to be built in 4 years but took 14.
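
For a sense of scale, simple arithmetic on the figures just quoted gives the overrun factors:

$$
\frac{\$105\text{B}}{\$37\text{B}} \approx 2.8 \quad \text{(ISS, cost)}
\qquad
\frac{14\text{ years}}{4\text{ years}} = 3.5 \quad \text{(Opera House, time)}
$$

Both blew out by roughly a factor of three.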

Here’s one partial mitigation, albeit incomplete, imperfect and hard work: the premortem.

It’s simple. Make the project plan, and be as realistically conservative as you can. Then take that plan to a group of people who know it well. Ask them to imagine that it is one year in the future, that the plan has been implemented as it now stands, and that the result has been an unmitigated disaster. What is the story of that disaster? What went wrong?

In this one somewhat difficult and time-consuming move you have legitimised doubts, and you may even have brainstormed your way into noticing threats you would otherwise have missed. The plan might even be less over-optimistic now than it was before. Congratulations.

And on that note I end this review – the book is a long read, but it is well written, interesting and enjoyable. I plan to re-read it many times. I strongly recommend it.
