For Sabine

The Art of Thinking Clearly
Rolf Dobelli

First published in Great Britain in 2013 by Sceptre
An imprint of Hodder & Stoughton
An Hachette UK company


Copyright © Rolf Dobelli 2013

The right of Rolf Dobelli to be identified as the Author of the
Work has been asserted by him in accordance with the
Copyright, Designs and Patents Act 1988.

All rights reserved. No part of this publication may be
reproduced, stored in a retrieval system, or transmitted, in any
form or by any means without the prior written permission of the
publisher, nor be otherwise circulated in any form of binding or
cover other than that in which it is published and without a
similar condition being imposed on the subsequent purchaser.

A CIP catalogue record for this title is available from the British Library.

eBook ISBN 978 1 444 75955 6
Hardback ISBN 978 1 444 75954 9

Hodder & Stoughton Ltd
338 Euston Road
London NW1 3BH


2 DOES HARVARD MAKE YOU SMARTER?: Swimmer’s Body Illusion
Social Proof
7 BEWARE THE ‘SPECIAL CASE’: Confirmation Bias (Part 1)
8 MURDER YOUR DARLINGS: Confirmation Bias (Part 2)
9 DON’T BOW TO AUTHORITY: Authority Bias
12 WHY ‘NO PAIN, NO GAIN’ SHOULD SET ALARM BELLS RINGING: The It’ll-Get-Worse-Before-It-Gets-Better Fallacy
AND ABILITIES: Overconfidence Effect

17 YOU CONTROL LESS THAN YOU THINK: Illusion of Control
18 NEVER PAY YOUR LAWYER BY THE HOUR: Incentive Super-Response
PSYCHOTHERAPISTS: Regression to Mean
21 LESS IS MORE: The Paradox of Choice
23 DON’T CLING TO THINGS: Endowment Effect
Scarcity Error
Gambler’s Fallacy
33 WHY TEAMS ARE LAZY: Social Loafing
34 STUMPED BY A SHEET OF PAPER: Exponential Growth
Fundamental Attribution Error

40 FALSE PROPHETS: Forecast Illusion
45 DON’T BLAME ME: Self-Serving Bias
46 BE CAREFUL WHAT YOU WISH FOR: Hedonic Treadmill
50 SWEET LITTLE LIES: Cognitive Dissonance
Hyperbolic Discounting
52 ANY LAME EXCUSE: ‘Because’ Justification
53 DECIDE BETTER – DECIDE LESS: Decision Fatigue
with Averages

60 HURTS SO GOOD: Effort Justification
61 WHY SMALL THINGS LOOM LARGE: The Law of Small Numbers
62 HANDLE WITH CARE: Expectations
63 SPEED TRAPS AHEAD!: Simple Logic
67 BE YOUR OWN HERETIC: Introspection Illusion
68 WHY YOU SHOULD SET FIRE TO YOUR SHIPS: Inability to Close Doors
71 WHY IT’S NEVER JUST A TWO-HORSE RACE: Alternative Blindness
72 WHY WE TAKE AIM AT YOUNG GUNS: Social Comparison Bias
73 WHY FIRST IMPRESSIONS DECEIVE: Primacy and Recency Effects
74 WHY YOU CAN’T BEAT HOME-MADE: Not-Invented-Here Syndrome
77 THE MYTH OF LIKE-MINDEDNESS: False-Consensus Effect
78 YOU WERE RIGHT ALL ALONG: Falsification of History

84 WHY MONEY IS NOT NAKED: House-Money Effect
89 HOT AIR: Strategic Misrepresentation
90 WHERE’S THE OFF SWITCH?: Overthinking
91 WHY YOU TAKE ON TOO MUCH: Planning Fallacy
95 WHY CHECKLISTS DECEIVE YOU: Feature-Positive Effect
97 THE STONE-AGE HUNT FOR SCAPEGOATS: Fallacy of the Single Cause
Author Biography
A Note on Sources


Introduction

In the fall of 2004, a European media mogul invited me to Munich to partake in
what was described as an ‘informal exchange of intellectuals’. I had never
considered myself an ‘intellectual’ – I had studied business, which made me quite
the opposite, really – but I had also written two literary novels and that, I guessed,
must have qualified me for such an invitation.
Nassim Nicholas Taleb was sitting at the table. At that time, he was an obscure
Wall Street trader with a penchant for philosophy. I was introduced to him as an
authority on the English and Scottish Enlightenment, particularly the philosophy
of David Hume. Obviously I had been mixed up with someone else. Stunned, I
nevertheless flashed a hesitant smile around the room and let the resulting
silence act as proof of my philosophical prowess. Straight away, Taleb pulled
over a free chair and patted the seat. I sat down. After a cursory exchange about
Hume, the conversation mercifully shifted to Wall Street. We marvelled at the
systematic errors in decision making that CEOs and business leaders make –
ourselves included. We chatted about the fact that unexpected events seem much
more likely in retrospect. We chuckled about why it is that investors cannot part
with their shares when they drop below acquisition price.
Following the event, Taleb sent me pages from his manuscript, a gem of a
book, which I commented on and partly criticised. These went on to form part of
his international best-seller, The Black Swan. The book catapulted Taleb into the
intellectual all-star league. Meanwhile, my appetite whetted, I began to devour
books and articles written by cognitive and social scientists on topics such as
‘heuristics and biases’, and I also increased my email conversations with a large
number of researchers and started to visit their labs. By 2009, I had realised that,
alongside my job as a novelist, I had become a student of social and cognitive
psychology.
The failure to think clearly, or what experts call a ‘cognitive error’, is a
systematic deviation from logic – from optimal, rational, reasonable thought and
behaviour. By ‘systematic’ I mean that these are not just occasional errors in
judgement, but rather routine mistakes, barriers to logic we stumble over time and
again, repeating patterns through generations and through the centuries. For
example, it is much more common that we overestimate our knowledge than that
we underestimate it. Similarly, the danger of losing something stimulates us much
more than the prospect of making a similar gain. In the presence of other people
we tend to adjust our behaviour to theirs, not the opposite. Anecdotes make us
overlook the statistical distribution (base rate) behind it, not the other way round.
The errors we make follow the same pattern over and over again, piling up in one
specific, predictable corner like dirty laundry while the other corner remains
relatively clean (i.e. they pile up in the ‘overconfidence corner’, not the
‘underconfidence corner’).
To avoid frivolous gambles with the wealth I had accumulated over the course
of my literary career, I began to put together a list of these systematic cognitive
errors, complete with notes and personal anecdotes – with no intention of ever
publishing them. The list was originally designed to be used by me alone. Some
of these thinking errors have been known for centuries; others have been
discovered in the last few years. Some come with two or three names attached to
them. I chose the terms most widely used. Soon I realised that such a compilation
of pitfalls was not only useful for making investing decisions, but also for business
and personal matters. Once I had prepared the list, I felt calmer and more
clearheaded. I began to recognise my own errors sooner and was able to change
course before any lasting damage was done. And, for the first time in my life, I
was able to recognise when others might be in thrall to these very same
systematic errors. Armed with my list, I could now resist their pull – and perhaps
even gain an upper hand in my dealings. I now had categories, terms, and
explanations with which to ward off the spectre of irrationality. Since Benjamin
Franklin’s kite-flying days, thunder and lightning have not grown less frequent,
powerful or loud – but they have become less worrisome. This is exactly how I
feel about my own irrationality now.
Friends soon learned of my compendium and showed interest. This led to a
weekly newspaper column in Germany, Holland and Switzerland, countless
presentations (mostly to medical doctors, investors, board members, CEOs and
government officials) and eventually to this book.
Please keep in mind three things as you peruse these pages: first, the list of
fallacies in this book is not complete. Undoubtedly new ones will be discovered.
Second, the majority of these errors are related to one another. This should come
as no surprise. After all, all brain regions are linked. Neural projections travel from
region to region in the brain; no area functions independently. Third, I am
primarily a novelist and an entrepreneur, not a social scientist; I don’t have my
own lab where I can conduct experiments on cognitive errors, nor do I have a staff
of researchers I can dispatch to scout for behavioural errors. In writing this book, I
think of myself as a translator whose job is to interpret and synthesise what I’ve
read and learned – to put it in terms others can understand. My great respect goes
to the researchers who, in recent decades, have uncovered these behavioural
and cognitive errors. The success of this book is fundamentally a tribute to their
research. I am enormously indebted to them.
This is not a how-to book. You won’t find ‘seven steps to an error-free life’ here.
Cognitive errors are far too ingrained for us to be able to rid ourselves of them
completely. Silencing them would require superhuman willpower, but that isn’t
even a worthy goal. Not all cognitive errors are toxic, and some are even
necessary for leading a good life. Although this book may not hold the key to
happiness, at the very least it acts as insurance against too much self-induced
unhappiness.
Indeed, my wish is quite simple: if we could learn to recognise and evade the
biggest errors in thinking – in our private lives, at work or in government – we
might experience a leap in prosperity. We need no extra cunning, no new ideas,
no unnecessary gadgets, no frantic hyperactivity – all we need is less irrationality.

Survivorship Bias

No matter where Rick looks, he sees rock stars. They appear on television, on the
front pages of magazines, in concert programmes and at online fan sites. Their
songs are unavoidable – in the mall, on his playlist, in the gym. The rock stars are
everywhere. There are lots of them. And they are successful. Motivated by the
stories of countless guitar heroes, Rick starts a band. Will he make it big? The
probability lies a fraction above zero. Like so many others, he will most likely end
up in the graveyard of failed musicians. This burial ground houses 10,000 times
more musicians than the stage does, but no journalist is interested in failures –
with the exception of fallen superstars. This makes the cemetery invisible to
outsiders.
In daily life, because triumph is made more visible than failure, you
systematically overestimate your chances of succeeding. As an outsider, you (like
Rick) succumb to an illusion, and you misjudge how minuscule the probability of
success really is. Rick, like so many others, is a victim of Survivorship Bias.
Behind every popular author you can find 100 other writers whose books will
never sell. Behind them are another 100 who haven’t found publishers. Behind
them are yet another 100 whose unfinished manuscripts gather dust in drawers.
And behind each one of these are 100 people who dream of – one day – writing a
book. You, however, hear of only the successful authors (these days, many of
them self-published) and fail to recognise how unlikely literary success is. The
same goes for photographers, entrepreneurs, artists, athletes, architects, Nobel
Prize winners, television presenters and beauty queens. The media is not
interested in digging around in the graveyards of the unsuccessful. Nor is this its
job. To elude the survivorship bias, you must do the digging yourself.
You will also come across survivorship bias when dealing with money and risk:
imagine that a friend founds a start-up. You belong to the circle of potential
investors and you sense a real opportunity: this could be the next Google. Maybe
you’ll be lucky. But what is the reality? The most likely scenario is that the
company will not even make it off the starting line. The second most likely
outcome is that it will go bankrupt within three years. Of the companies that
survive these first three years, most never grow to more than ten employees. So,
should you never put your hard-earned money at risk? Not necessarily. But you
should recognise that the survivorship bias is at work, distorting the probability of
success like cut glass.
Take the Dow Jones Industrial Average Index. It consists of out-and-out
survivors. Failed and small businesses do not enter the stock market, and yet
these represent the majority of business ventures. A stock index is not indicative
of a country’s economy. Similarly, the press does not report proportionately on all
musicians. The vast number of books and coaches dealing with success should
also make you sceptical: the unsuccessful don’t write books or give lectures on
their failures.
Survivorship bias can become especially pernicious when you become a
member of the ‘winning’ team. Even if your success stems from pure coincidence,
you’ll discover similarities with other winners and be tempted to mark these as
‘success factors’. However, if you ever visit the graveyard of failed individuals and
companies, you will realise that its tenants possessed many of the same traits
that characterise your success.
If enough scientists examine a particular phenomenon, a few of these studies
will deliver statistically significant results through pure coincidence – for example
the relationship between red wine consumption and high life expectancy. Such
(false) studies immediately attain a high degree of popularity and attention. As a
result, you will not read about the studies with the ‘boring’, but correct results.
Survivorship bias means this: people systematically overestimate their chances
of success. Guard against it by frequently visiting the graves of once-promising
projects, investments and careers. It is a sad walk, but one that should clear your
mind.
See also Self-serving Bias (ch. 45); Beginner’s Luck (ch. 49); Base-Rate Neglect (ch. 28);
Induction (ch. 31); Neglect of Probability (ch. 26); Illusion of Skill (ch. 94); Intention-To-Treat Error (ch. 98)

Swimmer’s Body Illusion

As essayist and trader Nassim Taleb resolved to do something about the
stubborn extra pounds he’d been carrying, he contemplated taking up various
sports. However, joggers seemed scrawny and unhappy, bodybuilders looked
broad and stupid, and tennis players? Oh, so upper-middle class!
Swimmers, though, appealed to him with their well-built, streamlined bodies. He
decided to sign up at his local swimming pool and to train hard twice a week.
A short while later, he realised that he had succumbed to an illusion.
Professional swimmers don’t have perfect bodies because they train extensively.
Rather, they are good swimmers because of their physiques. How their bodies
are designed is a factor for selection and not the result of their activities. Similarly,
female models advertise cosmetics and thus, many female consumers believe
that these products make you beautiful. But it is not the cosmetics that make these
women model-like. Quite simply, the models are born attractive and only for this
reason are they candidates for cosmetics advertising. As with the swimmers’
bodies, beauty is a factor for selection and not the result.
Whenever we confuse selection factors with results, we fall prey to what Taleb
calls the swimmer’s body illusion. Without this illusion, half of advertising
campaigns would not work. But this bias has to do with more than just the pursuit
of chiselled cheekbones and chests. For example, Harvard has the reputation of
being a top university. Many highly successful people have studied there. Does
this mean that Harvard is a good school? We don’t know. Perhaps the school is
terrible, and it simply recruits the brightest students around. I experienced this
phenomenon at the University of St Gallen in Switzerland. It is said to be one of
the top ten business schools in Europe, but the lessons I received (although note
that this was twenty-five years ago) were mediocre. Nevertheless, many of its
graduates were successful. The reason behind this is unknown – perhaps it was
due to the climate in the narrow valley or even the cafeteria food. Most probable,
however, is the rigorous selection.
All over the world, MBA schools lure candidates with statistics regarding future
income. This simple calculation is supposed to show that the horrendously high
tuition fees pay for themselves after a short period of time. Many prospective
students fall for this approach. I am not implying that the schools doctor the
statistics, but still their statements must not be swallowed wholesale. Why?
Because those who pursue an MBA are different from those who do not. The
income gap between these groups stems from a multitude of reasons that have
nothing to do with the MBA degree itself. Once again we see the swimmer’s body
illusion at work: the factor for selection confused with the result. So, if you are
considering further study, do it for reasons other than a bigger pay cheque.
When I ask happy people about the secret of their contentment, I often hear
answers like ‘You have to see the glass half-full rather than half-empty.’ It is as if
these individuals do not realise that they were born happy, and now tend to see
the positive in everything. They do not realise that cheerfulness – according to
many studies, such as those conducted by Harvard’s Dan Gilbert – is largely a
personality trait that remains constant throughout life. Or, as social scientists
Lykken and Tellegen starkly suggest, ‘trying to be happier is as futile as trying to
be taller.’ Thus, the swimmer’s body illusion is also a self-illusion. When these
optimists write self-help books, the illusion can become treacherous. That’s why
it’s important to give a wide berth to tips and advice from self-help authors. For
billions of people, these pieces of advice are unlikely to help. But because the
unhappy don’t write self-help books about their failures, this fact remains hidden.
In conclusion: be wary when you are encouraged to strive for certain things –
be it abs of steel, immaculate looks, a higher income, a long life, a particular
demeanour or happiness. You might fall prey to the swimmer’s body illusion.
Before you decide to take the plunge, look in the mirror – and be honest about
what you see.
See also Halo Effect (ch. 38); Outcome Bias (ch. 20); Self-Selection Bias (ch. 47);
Alternative Blindness (ch. 71); Fundamental Attribution Error (ch. 36)

Clustering Illusion

In 1957, Swedish opera singer Friedrich Jorgensen bought a tape player to
record his vocals. When he listened back to the recording, he heard strange
noises throughout, whispers that sounded like supernatural messages. A few
years later, he recorded birdsong. This time, he heard the voice of his deceased
mother in the background whispering to him: ‘Fried, my little Fried, can you hear
me? It’s Mammy.’ That did it. Jorgensen turned his life around and devoted
himself to communicating with the deceased via tape recordings.
In 1994, Diane Duyser from Florida also had an otherworldly encounter. After
biting into a slice of toast and placing it back down on the plate, she noticed the
face of the Virgin Mary in it. Immediately, she stopped eating and stored the
divine message (minus a bite) in a plastic container. In November 2004, she
auctioned the still fairly well preserved snack on eBay. Her daily bread earned
her $28,000.
In 1978, a woman from New Mexico had a similar experience. Her tortilla’s
blackened spots resembled Jesus’ face. The press latched on to the story, and
thousands of people flocked to New Mexico to see the saviour in burrito form.
Two years earlier, in 1976, the orbiter of the Viking Spacecraft had photographed
a rock formation that, from high above, looked like a human face. The ‘Face on
Mars’ made headlines around the world.
And you? Have you ever seen faces in the clouds or the outlines of animals in
rocks? Of course. This is perfectly normal. The human brain seeks patterns and
rules. In fact, it takes it one step further: if it finds no familiar patterns, it simply
invents some. The more diffuse the signal, such as the background noise on the
tape, the easier it is to find ‘hidden messages’ in it. Twenty-five years after
uncovering the ‘Face on Mars’, the Mars Global Surveyor sent back crisp, clear
images of the rock formations: the captivating human face had dissolved into
plain old scree.
These frothy examples make the clustering illusion seem innocuous; it is not.
Consider the financial markets, which churn out floods of data every second.

Grinning ear to ear, a friend told me that he had discovered a pattern in the sea of
data: ‘If you multiply the percentage change of the Dow Jones by the percentage
change of the oil price, you get the move of the gold price in two days’ time.’ In
other words, if share prices and oil climb or fall in unison, gold will rise the day
after tomorrow. His theory worked well for a few weeks, until he began to
speculate with ever-larger sums and eventually squandered his savings. He had
sensed a pattern where none existed.
oxxxoxxxoxxoooxooxxoo. Is this sequence random or planned? Psychology
professor Thomas Gilovich interviewed hundreds of people for an answer. Most
did not want to believe the sequence was arbitrary. They figured some law must
govern the order of the letters. Wrong, explained Gilovich, and pointed to some
dice: it is quite possible to roll the same number four times in a row, which
mystifies many people. Apparently we have trouble accepting that such events
can take place by chance.
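Gilovich’s dice point is easy to check numerically. The sketch below is my own illustration, not anything from the book (all names are invented): it estimates how often four identical faces turn up in a row somewhere in 100 rolls of a fair die.

```python
import random

def has_run_of_four(rolls):
    """Return True if the sequence contains four identical values in a row."""
    streak = 1
    for prev, cur in zip(rolls, rolls[1:]):
        streak = streak + 1 if cur == prev else 1
        if streak >= 4:
            return True
    return False

random.seed(42)
trials = 10_000
hits = sum(
    has_run_of_four([random.randint(1, 6) for _ in range(100)])
    for _ in range(trials)
)
print(f"P(four-in-a-row somewhere in 100 rolls) ~ {hits / trials:.2f}")
```

The estimate lands on the order of one in three: far from mystifying, such a streak is routine once the sequence is long enough.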
During WWII, the Germans bombed London. Among other ammunition, they
used V1 rockets, a kind of self-navigating drone. With each attack, the impact
sites were carefully plotted on a map, terrifying Londoners: they thought they had
discovered a pattern, and developed theories about which parts of the city were
the safest. However, after the war, statistical analysis confirmed that the
distribution was totally random. Today it’s clear why: the V1’s navigation system
was extremely inaccurate.
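The V1 finding can be reproduced in miniature. As a hypothetical sketch (mine, not from the book): scatter 100 impacts uniformly at random over a 10 × 10 grid of city districts and see how uneven the purely random result looks.

```python
import random
from collections import Counter

# Scatter 100 impact points uniformly over a 10 x 10 grid of districts.
random.seed(7)
impacts = Counter(
    (random.randrange(10), random.randrange(10)) for _ in range(100)
)

# Even though the process is purely random, some districts are hit
# several times while others are untouched, which looks like a pattern.
busiest = max(impacts.values())
untouched = 100 - len(impacts)
print(f"most-hit district: {busiest} impacts; untouched districts: {untouched}")
```

Typically a few dozen districts receive no hits at all while one or two take four or five, exactly the kind of lopsidedness Londoners mistook for targeting.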
In conclusion: when it comes to pattern recognition, we are oversensitive.
Regain your scepticism. If you think you have discovered a pattern, first consider
it pure chance. If it seems too good to be true, find a mathematician and have the
data tested statistically. And if the crispy parts of your pancake start to look a lot
like Jesus’ face, ask yourself: if he really wants to reveal himself, why doesn’t he
do it in Times Square or on CNN?
See also Illusion of Control (ch. 17); Coincidence (ch. 24); False Causality (ch. 37)

Social Proof

You are on your way to a concert. At an intersection, you encounter a group of
people, all staring at the sky. Without even thinking about it, you peer upwards
too. Why? Social proof. In the middle of the concert, when the soloist is displaying
absolute mastery, someone begins to clap and suddenly the whole room joins in.
You do, too. Why? Social proof. After the concert you go to the coat check to pick
up your coat. You watch how the people in front of you place a coin on a plate,
even though, officially, the service is included in the ticket price. What do you do?
You probably leave a tip as well.
Social proof, sometimes roughly termed the herd instinct, dictates that
individuals feel they are behaving correctly when they act the same as other
people. In other words, the more people who follow a certain idea, the better
(truer) we deem the idea to be. And the more people who display a certain
behaviour the more appropriate this behaviour is judged to be by others. This is,
of course, absurd.
Social proof is the evil behind bubbles and stock market panic. It exists in
fashion, management techniques, hobbies, religion and diets. It can paralyse
whole cultures, such as when sects commit collective suicide.
A simple experiment carried out in the 1950s by legendary psychologist
Solomon Asch shows how peer pressure can warp common sense. A subject is
shown a line drawn on paper, and next to it three lines – numbered 1, 2 and 3 –
one shorter, one longer and one of the same length as the original one. He or she
must indicate which of the three lines corresponds to the original one. If the
person is alone in the room, he gives correct answers – unsurprising, because
the task is really quite simple. Now five other people enter the room; they are all
actors, which the subject does not know. One after another, they give wrong
answers, saying ‘number 1’, although it’s very clear that number 3 is the correct
answer. Then it is the subject’s turn again. In one third of cases, he will answer
incorrectly to match the other people’s responses.

Why do we act like this? Well, in the past, following others was a good survival
strategy. Suppose that 50,000 years ago, you were travelling around the
Serengeti with your hunter-gatherer friends, and suddenly they all bolted. What
would you have done? Would you have stayed put, scratching your head, and
weighing up whether what you were looking at was a lion or something that just
looked like a lion but was in fact a harmless animal that could serve as a great
protein source? No, you would have sprinted after your friends. Later on, when
you were safe, you could have reflected on what the ‘lion’ had actually been.
Those who acted differently from the group – and I am sure there were some –
exited the gene pool. We are the direct descendants of those who copied the
others’ behaviour. This pattern is so deeply rooted in us that we still use it today,
even when it offers no survival advantage, which is most of the time. Only a few
cases come to mind where social proof is of value. For example, if you find
yourself hungry in a foreign city and don’t know a good restaurant, it makes sense
to pick the one that’s full of locals. In other words, you copy the locals’ behaviour.
Comedy and talk shows make use of social proof by inserting canned laughter
at strategic spots, inciting the audience to laugh along. One of the most
impressive, though troubling, cases of this phenomenon is the famous speech by
Nazi propaganda minister Joseph Goebbels, delivered to a large audience in
1943. (See it for yourself on YouTube.) As the war went from bad to worse for
Germany, he demanded to know: ‘Do you want total war? If necessary, do you
want a war more total and radical than anything that we can even imagine today?’
The crowd roared. If the attendees had been asked individually and
anonymously, it is likely that nobody would have consented to this crazy
proposal.
The advertising industry benefits greatly from our weakness for social proof.
This works well when a situation is unclear (such as deciding among various car
makes, cleaning products, beauty products etc. with no obvious advantages or
disadvantages), and where people ‘like you and me’ appear.
So, be sceptical whenever a company claims its product is better because it is
‘the most popular’. How is a product better simply because it sells the most units?
And remember novelist W. Somerset Maugham’s wise words: ‘If 50 million
people say something foolish, it is still foolish.’

See also Groupthink (ch. 25); Social Loafing (ch. 33); In-Group Out-Group Bias (ch. 79);
False-Consensus Effect (ch. 77)

Sunk Cost Fallacy

The film was dire. After an hour, I whispered to my wife: ‘Come on, let’s go home.’
She replied: ‘No way. We’re not throwing away $30.’ ‘That’s no reason to stay,’ I
protested. ‘The money’s already gone. This is the sunk cost fallacy at work – a
thinking error!’ She glared at me as if she had just bitten off a piece of lemon. OK,
I sometimes go overboard on the subject, itself an error called déformation
professionnelle (see chapter 92). ‘We have spent the $30 regardless of whether
we stay or leave, so this factor should not play a role in our decision,’ I said,
desperately trying to clarify the situation. Needless to say, I gave in in the end and
sank back down in my seat.
The next day, I sat in a marketing meeting. Our advertising campaign had been
running for four months and had not met even one of its goals. I was in favour of
scrapping it. The advertising manager resisted, saying: ‘But we’ve invested so
much money in it. If we stop now, it’ll all have been for nothing.’ Another victim of
the sunk cost fallacy.
A friend struggled for years in a troubled relationship. His girlfriend cheated on
him time and again. Each time, she came back repentant and begged for
forgiveness. He explained it to me this way: ‘I’ve invested so much energy in the
relationship, it would be wrong to throw it away.’ A classic case of the sunk cost
fallacy.
The sunk cost fallacy is most dangerous when we have invested a lot of time,
money, energy or love in something. This investment becomes a reason to carry
on, even if we are dealing with a lost cause. The more we invest, the greater the
sunk costs are, and the greater the urge to continue becomes.
Investors frequently fall victim to the sunk cost fallacy. Often they base their
trading decisions on acquisition prices. ‘I lost so much money with this stock, I
can’t sell it now,’ they say. This is irrational. The acquisition price should play no
role. What counts is the stock’s future performance (and the future performance of
alternative investments). Ironically, the more money a share loses, the more
investors tend to stick by it.

This irrational behaviour is driven by a need for consistency. After all,
consistency signifies credibility. We find contradictions abominable. If we decide
to cancel a project halfway through, we create a contradiction: we admit that we
once thought differently. Carrying on with a meaningless project delays this
painful realisation and keeps up appearances.
Concorde is a prime example of a government deficit project. Even though both
parties, Britain and France, had long known that the supersonic aircraft business
would never work, they continued to invest enormous sums of money in it – if only
to save face. Abandoning the project would have been tantamount to admitting
defeat. The sunk cost fallacy is therefore often referred to as the Concorde effect.
It leads to costly, even disastrous errors of judgement. The Americans extended
their involvement in the Vietnam War because of this. Their thinking: ‘We’ve
already sacrificed so much for this war; it’d be a mistake to give up now.’
‘We’ve come this far . . .’ ‘I’ve read so much of this book already . . .’ ‘But I’ve
spent two years doing this course . . .’ If you recognise any of these thought
patterns, it shows that the sunk cost fallacy is at work in a corner of your brain.
Of course, there may be good reasons to continue investing in something to
finalise it. But beware of doing so for the wrong reasons, such as to justify
non-recoverable investments. Rational decision-making requires you to forget about
the costs incurred to date. No matter how much you have already invested, only
your assessment of the future costs and benefits counts.
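The closing rule, that only future costs and benefits count, fits in a one-line decision function. This is a minimal illustration of my own, with invented names, not anything from the book:

```python
def should_continue(future_benefit, future_cost, sunk_cost=0.0):
    """Rational rule: continue only if what lies ahead is worth it.
    The sunk_cost argument is accepted -- and deliberately ignored."""
    return future_benefit > future_cost

# The $30 cinema tickets are gone either way; only the remaining hour matters.
print(should_continue(future_benefit=0, future_cost=1, sunk_cost=30))  # False: leave
print(should_continue(future_benefit=5, future_cost=1, sunk_cost=30))  # True: stay
```

Note that the answer is identical whether the sunk cost is $30 or $30 million; if it ever changes your decision, the fallacy is at work.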
See also The It’ll-Get-Worse-Before-It-Gets-Better Fallacy (ch. 12); Inability to Close
Doors (ch. 68); Endowment Effect (ch. 23); Effort Justification (ch. 60); Loss Aversion
(ch. 32); Outcome Bias (ch. 20)


Reciprocity

Not so long ago, you may have come across disciples of the Hare Krishna sect
floating around in saffron-coloured robes as you hurried to catch a flight or a train
to your destination. A member of the sect presented you with a small flower and a
smile. If you’re like most people, you took the flower, if only not to be rude. If you
tried to refuse, you would have heard a gentle ‘Take it, this is our gift to you.’ If
you wanted to dispose of the flower in the next trashcan, you found that there
were already a few there. But that was not the end. Just as your bad conscience
started to tug at you, another disciple of Krishna approached you, this time asking
for a donation. In many cases, this plea was successful – and so pervasive that
many airports banned the sect from the premises.
Psychologist Robert Cialdini can explain the success of this and other such
campaigns. He has studied the phenomenon of reciprocity and has established
that people have extreme difficulty being in another person’s debt.
Many NGOs and philanthropic organisations use exactly the same techniques:
first give, then take. Last week, a conservation organisation sent me an envelope
full of postcards featuring all sorts of idyllic landscapes. The accompanying letter
assured me that the postcards were a gift to be kept, whether or not I decided to
donate to their organisation. Even though I understood the tactic, it took a little
willpower and ruthlessness to throw them in the trash.
Unfortunately, this kind of gentle blackmail – you could also call it corruption –
is widespread. A supplier of screws invites a potential customer to join him at a
big sports game. A month later, it’s time to order screws. The desire not to be in
debt is so strong that the buyer gives in and places an order with his new friend.
It is also an ancient technique. We find reciprocity in all species whose food
supplies are subject to high fluctuations. Suppose you are a hunter-gatherer. One
day you are lucky and kill a deer. You can’t possibly eat all of it in a day, and
refrigerators are still a few centuries away. You decide to share the deer with the
group, which ensures that you will benefit from others’ spoils when your haul is
less impressive. The bellies of your buddies serve as your refrigerator.

Reciprocity is a very useful survival strategy, a form of risk management.
Without it, humanity – and countless species of animal – would be long extinct. It
is at the core of cooperation between people who are not related to each other
and a necessary ingredient for economic growth and wealth creation. There
would be no global economy without it – there would be no economy at all. That’s
the good side of reciprocity.
But there is also an ugly side of reciprocity: retaliation. Revenge breeds
counter-revenge and you soon find yourself in a full-scale war. Jesus preached
that we should break this cycle by turning the other cheek, which proves very
difficult to do. Such is the compelling pull of reciprocity, even when the stakes
are far lower.
Several years ago, a couple invited me and my wife to dinner. We had known
this couple casually for quite some time. They were nice, but far from entertaining.
We couldn’t think of a good excuse to refuse, so we accepted. Things played out
exactly as we had imagined: the dinner party was beyond tedious. Nevertheless,
we felt obliged to invite them to our home a few months later. The constraint of
reciprocity had now presented us with two wearisome evenings. And, lo and
behold, a few weeks later a follow-up invitation from them arrived. I wonder how
many dinner parties have been endured in the name of reciprocity, even if the
participants would have preferred to drop out of the vicious cycle years ago.
In much the same way, if someone approaches you in the supermarket,
whether to offer you a taste of wine, a chunk of cheese or a handful of olives, my
best advice is to refuse their offer – unless you want to end up with a refrigerator
full of stuff you don’t even like.
See also Framing (ch. 42); Incentive Super-Response Tendency (ch. 18); Liking Bias
(ch. 22); Motivation Crowding (ch. 56)

Confirmation Bias (Part 1)

Gil wants to lose weight. He selects a particular diet and checks his progress on
the scales every morning. If he has lost weight, he pats himself on the back and
considers the diet a success. If he has gained weight, he writes it off as a normal
fluctuation and forgets about it. For months, he lives under the illusion that the diet
is working, even though his weight remains constant. Gil is a victim of the
confirmation bias – albeit a harmless form of it.
The confirmation bias is the mother of all misconceptions. It is the tendency to
interpret new information so that it becomes compatible with our existing theories,
beliefs and convictions. In other words, we filter out any new information that
contradicts our existing views (‘disconfirming evidence’). This is a dangerous
practice. ‘Facts do not cease to exist because they are ignored,’ said writer
Aldous Huxley. However, we do exactly that, as super-investor Warren Buffett
knows: ‘What the human being is best at doing, is interpreting all new information
so that their prior conclusions remain intact.’
The confirmation bias is alive and well in the business world. One example: an
executive team decides on a new strategy. The team enthusiastically celebrates
any sign that the strategy is a success. Everywhere the executives look, they see
plenty of confirming evidence, while indications to the contrary remain unseen or
are quickly dismissed as ‘exceptions’ or ‘special cases’. They have become blind
to disconfirming evidence.
What can you do? If the word ‘exception’ crops up, prick up your ears. Often it
hides the presence of disconfirming evidence. It pays to listen to Charles Darwin:
from his youth, he set out systematically to fight the confirmation bias. Whenever
observations contradicted his theory, he took them very seriously and noted them
down immediately. He knew that the brain actively ‘forgets’ disconfirming
evidence after a short time. The more correct he judged his theory to be, the more
actively he looked for contradictions.
The following experiment shows how much effort it takes to question your own
theory. A professor presented his students with the number sequence 2–4–6.

They had to calculate the underlying rule that the professor had written on the
back of a sheet of paper. The students had to provide the next number in the
sequence, to which the professor would reply ‘fits the rule’ or ‘does not fit the
rule’. The students could guess as many numbers as they wanted, but could try to
identify the rule only once. Most students suggested 8 as the next number, and
the professor replied: ‘Fits the rule.’ To be sure, they tried 10, 12 and 14. The
professor replied each time: ‘Fits the rule.’ The students concluded that: ‘The rule
is to add two to the last number.’ The professor shook his head: ‘That is not the rule.’
One shrewd student tried a different approach. He tested out the number -2.
The professor said: ‘Does not fit the rule.’ ‘Seven?’ he asked. ‘Fits the rule.’ The
student tried all sorts of numbers: -24, 9, -43 . . . Apparently he had an idea, and
he was trying to find a flaw with it. Only when he could no longer find a
counter-example did the student say: ‘The rule is this: the next number must be higher than
the previous one.’ The professor turned over the sheet of paper, and this was
exactly what he’d written down.
What distinguished the resourceful student from the others? While the majority
of students sought merely to confirm their theories, he tried to find fault with his,
consciously looking for disconfirming evidence. You might think: ‘Good for him,
but not the end of the world for the others.’ However, falling for the confirmation
bias is not a petty intellectual offence. How it affects our lives will be revealed in
the next chapter.
See also Availability Bias (ch. 11); Feature-Positive Effect (ch. 95); Coincidence (ch. 24);
Forer Effect (ch. 64); Illusion of Attention (ch. 88)

Confirmation Bias (Part 2)

In the previous chapter, we met the father of all fallacies, the confirmation bias.
We are forced to establish beliefs about the world, our lives, the economy,
investments, our careers and more. We deal mostly in assumptions, and the more
nebulous these are, the stronger the confirmation bias. Whether you go through
life believing that ‘people are inherently good’ or ‘people are inherently bad’, you
will find daily proof to support your case. Both parties, the philanthropists and the
misanthropes, simply filter disconfirming evidence (evidence to the contrary) and
focus instead on the do-gooders and dictators who support their worldviews.
Astrologers and economists operate on the same principle. They utter
prophecies so vague that any event can substantiate them: ‘In the coming weeks
you will experience sadness,’ or ‘in the medium term, the pressure on the dollar
will increase.’ But what is the medium term? What will cause the dollar to
depreciate? And, depreciation measured against what – gold, yen, pesos, wheat,
residential property in Manhattan, the average price of a hot dog?
Religious and philosophical beliefs represent an excellent breeding ground for
the confirmation bias. Here, in soft, spongy terrain, it grows wild and free. For
example, worshippers always find evidence for God’s existence, even though he
never shows himself overtly – except to illiterates in the desert and in isolated
mountain villages; never to the masses in, say, Frankfurt or New York.
Counter-arguments are dismissed by the faithful, demonstrating just how powerful
the confirmation bias is.
No professionals suffer more from the confirmation bias than business
journalists. Often, they formulate an easy theory, pad it out with two or three
pieces of ‘evidence’ and call it a day. For example: ‘Google is so successful
because the company nurtures a culture of creativity.’ Once this idea is on paper,
the journalist corroborates it by mentioning a few other prosperous companies
that foster ingenuity. Rarely does the writer seek out disconfirming evidence,
which in this instance would be struggling businesses that live and breathe
creativity or, conversely, flourishing firms that are utterly uncreative. Both groups

have plenty of members, but the journalist simply ignores them. If he or she were
to mention just one, the storyline would be ruined.
Self-help and get-rich-quick books are further examples of blinkered
storytelling. Their shrewd authors collect piles of proof to pump up the most banal
of theories, such as ‘meditation is the key to happiness.’ Any reader seeking
disconfirming evidence does so in vain: nowhere in these books do we see
people who lead fulfilled lives without meditation, or those who, despite
meditation, are still sad.
The Internet is particularly fertile ground for the confirmation bias. To stay
informed, we browse news sites and blogs, forgetting that our favoured pages
mirror our existing values, be they liberal, conservative or somewhere in between.
Moreover, a lot of sites now tailor content to personal interests and browsing
history, causing new and divergent opinions to vanish from the radar altogether.
We inevitably land in communities of like-minded people, further reinforcing our
convictions – and the confirmation bias.
Literary critic Arthur Quiller-Couch had a memorable motto: ‘Murder your
darlings.’ This was his advice to writers who struggled with cutting cherished but
redundant sentences. Quiller-Couch’s appeal is not just for hesitant hacks, but for
all of us who suffer from the deafening silence of assent. To fight against the
confirmation bias, try writing down your beliefs – whether in terms of worldview,
investments, marriage, healthcare, diet, career strategies – and set out to find
disconfirming evidence. Axeing beliefs that feel like old friends is hard work, but essential.
See also Introspection Illusion (ch. 67); Salience Effect (ch. 83); Cognitive Dissonance
(ch. 50); Forer Effect (ch. 64); News Illusion (ch. 99)

Authority Bias

The first book of the Bible explains what happens when we disobey a great
authority: we get ejected from paradise. This is also what less celestial authorities
would have us believe – political pundits, scientists, doctors, CEOs, economists,
government heads, sports commentators, consultants and stock market gurus.
Authorities pose two main problems to clear thinking: first, their track records
are often sobering. There are about one million trained economists on the planet,
and not one of them could accurately predict the timing of the 2008 financial crisis
(with the exception of Nouriel Roubini and Nassim Taleb), let alone how the
collapse would play out, from the real-estate bubble bursting to credit default
swaps collapsing, right through to the full-blown economic crunch. Never has a
group of experts failed so spectacularly. The story from the medical world is much
the same: up until 1900 it was discernibly wiser for patients to avoid doctor’s
visits; too often the ‘treatment’ only worsened the illness, due to poor hygiene and
folk practices such as bloodletting.
Psychologist Stanley Milgram demonstrated the authority bias most clearly in
an experiment in 1961. His subjects were instructed to administer ever-increasing
electrical shocks to a person sitting on the other side of a pane of glass. They
were told to start with 15 volts, then 30V, 45V and so on, until they reached the
maximum – a lethal dose of 450V. In reality, no electrical current was actually
flowing; Milgram used an actor to play the role of victim, but those charged with
administering the shocks didn’t know that. The results were, well, shocking: as
the person in the other room wailed and writhed in pain, and the subject
administering the shock wanted to stop, the professor would say, ‘Keep going, the
experiment depends on it.’ The majority of people continued with the
electrocution. More than half of the participants went all the way up to maximum
voltage – out of sheer obedience to authority.
Over the past decade, airlines have also learned the dangers of the authority
bias. In the old days, the captain was king. His commands were not to be
doubted. If a co-pilot suspected an oversight, he wouldn’t have dared to address it

out of respect for – or fear of – his captain. Since this behaviour was discovered,
nearly every airline has instituted ‘Crew Resource Management’ (CRM), which
coaches pilots and their crews to discuss any reservations they have openly and
quickly. In other words: they carefully deprogramme the authority bias. CRM has
contributed more to flight safety in the past twenty years than any technical
advances have.
Many companies are light years from this sort of foresight. Especially at risk are
firms with domineering CEOs, where employees are likely to keep their ‘lesser’
opinions to themselves – much to the detriment of the business.
Authorities crave recognition and constantly find ways to reinforce their status.
Doctors and researchers sport white coats. Bank directors don suits and ties.
Kings wear crowns. Members of the military wield rank badges. Today, even
more symbols and props are used to signal expertise: from appearances on talk
shows and on the covers of magazines, to book tours and their own Wikipedia
entries. Authority changes much like fashion does, and society follows it just as much.
In conclusion: whenever you are about to make a decision, think about which
authority figures might be exerting an influence on your reasoning. And when you
encounter one in the flesh, do your best to challenge him or her.
See also Twaddle Tendency (ch. 57); Chauffeur Knowledge (ch. 16); Forecast Illusion
(ch. 40); Illusion of Skill (ch. 94)

Contrast Effect

In his book Influence, Robert Cialdini tells the story of two brothers, Sid and
Harry, who ran a clothing store in 1930s America. Sid was in charge of sales and
Harry led the tailoring department. Whenever Sid noticed that the customers who
stood before the mirror really liked their suits, he became a little hard of hearing.
He would call to his brother: ‘Harry, how much for this suit?’ Harry would look up
from his cutting table and shout back: ‘For that beautiful cotton suit, $42.’ (This
was a completely inflated price at that time.) Sid would pretend he hadn’t
understood: ‘How much?’ Harry would yell again: ‘Forty-two dollars!’ Sid would
then turn to his customer and report: ‘He says $22.’ At this point, the customer
would have quickly put the money on the table and hastened from the store with
the suit before poor Sid noticed his ‘mistake’.
Maybe you know the following experiment from your schooldays: take two
buckets. Fill the first with lukewarm water and the second with ice water. Dip your
right hand into the ice water for one minute. Then put both hands into the
lukewarm water. What do you notice? The lukewarm water feels as it should to
the left hand but piping hot to the right hand.
Both of these stories epitomise the contrast effect: we judge something to be
beautiful, expensive or large if we have something ugly, cheap or small in front of
us. We have difficulty with absolute judgements.
The contrast effect is a common misconception. You order leather seats for
your new car because compared to the $60,000 price tag on the car, $3,000
seems a pittance. All industries that offer upgrade options exploit this illusion.
The contrast effect is at work in other places, too. Experiments show that
people are willing to walk an extra ten minutes to save $10 on food. But those
same people wouldn’t dream of walking ten minutes to save $10 on a thousand-dollar suit. An irrational move, because ten minutes is ten minutes, and $10 is
$10. Logically, you should walk back in both cases or not at all.

Without the contrast effect, the discount business would be completely
untenable. A product that has been reduced from $100 to $70 seems better value
than a product that has always cost $70. The starting price should play no role.
The other day an investor told me: ‘The share is a great value because it’s 50 per
cent below the peak price.’ I shook my head. A share price is never ‘low’ or ‘high’.
It is what it is, and the only thing that matters is whether it goes up or down from
that point.
When we encounter contrasts, we react like birds to a gunshot: we jump up and
get moving. Our weak spot: we don’t notice small, gradual changes. A magician
can make your watch vanish because, when he presses on one part of your body,
you don’t notice the lighter touch on your wrist as he relieves you of your Rolex.
Similarly, we fail to notice how our money disappears. It constantly loses its
value, but we do not notice because inflation happens over time. If it were
imposed on us in the form of a brutal tax (and basically that’s what it is), we would
be outraged.
The contrast effect can ruin your whole life: a charming woman marries a fairly
average man. But because her parents were awful people, the ordinary man
appears to be a prince.
One final thought: bombarded by advertisements featuring supermodels, we
now perceive beautiful people as only moderately attractive. If you are seeking a
partner, never go out in the company of your supermodel friends. People will find
you less attractive than you really are. Go alone or, better yet, take two ugly friends.
See also Availability Bias (ch. 11); Endowment Effect (ch. 23); Halo Effect (ch. 38);
Social Comparison Bias (ch. 72); Regression to Mean (ch. 19); Scarcity Error (ch. 27);
Framing (ch. 42)

Availability Bias

‘Smoking can’t be that bad for you: my grandfather smoked three packs of
cigarettes a day and lived to be more than 100.’ Or: ‘Manhattan is really safe. I
know someone who lives in the middle of the Village and he never locks his door.
Not even when he goes on vacation, and his apartment has never been broken
into.’ We use statements like these to try to prove something, but they actually
prove nothing at all. When we speak like this, we succumb to the availability bias.
Are there more English words that start with a K or more words with K as their
third letter? Answer: more than twice as many English words have K in the third
position as start with a K. Why do most people believe the opposite is true?
Because we can think of words beginning with a K more quickly. They are more
available to our memory.
T h e availability bias says this: we create a picture of the world using the
examples that most easily come to mind. This is absurd, of course, because in
reality things don’t happen more frequently just because we can conceive of them
more easily.
Thanks to the availability bias, we travel through life with an incorrect risk map
in our heads. Thus, we systematically overestimate the risk of being the victim of
a plane crash, a car accident or a murder. And we underestimate the risk of dying
from less spectacular means, such as diabetes or stomach cancer. The chances
of bomb attacks are much rarer than we think, and the chances of suffering
depression are much higher. We attach too much likelihood to spectacular, flashy
or loud outcomes. Anything silent or invisible we downgrade in our minds. Our
brains imagine show-stopping outcomes more readily than mundane ones. We
think dramatically, not quantitatively.
Doctors often fall victim to the availability bias. They have their favourite
treatments, which they use for all possible cases. More appropriate treatments
may exist, but these are in the recesses of the doctors’ minds. Consequently they
practise what they know. Consultants are no better. If they come across an
entirely new case, they do not throw up their hands and sigh: ‘I really don’t know

what to tell you.’ Instead they turn to one of their more familiar methods, whether
or not it is ideal.
If something is repeated often enough, it gets stored at the forefront of our
minds. It doesn’t even have to be true. How often did the Nazi leaders have to
repeat the term ‘the Jewish question’ before the masses began to believe that it
was a serious problem? You simply have to utter the words ‘UFO’, ‘life energy’ or
‘karma’ enough times before people start to credit them.
The availability bias has an established seat at the corporate board’s table, too.
Board members discuss what management has submitted – usually quarterly
figures – instead of more important things, such as a clever move by the
competition, a slump in employee motivation or an unexpected change in
customer behaviour. They tend not to discuss what’s not on the agenda. In
addition, people prefer information that is easy to obtain, be it economic data or
recipes. They make decisions based on this information rather than on more
relevant but harder to obtain information – often with disastrous results. For
example, we have known for ten years that the so-called Black–Scholes formula
for the pricing of derivative financial products does not work. But we don’t have
another solution, so we carry on with an incorrect tool. It is as if you were in a
foreign city without a map, and then pulled out one for your home town and simply
used that. We prefer wrong information to no information. Thus, the availability
bias has presented the banks with billions in losses.
What was it that Frank Sinatra sang? ‘Oh, my heart is beating wildly/And it’s all
because you’re here/When I’m not near the girl I love/I love the girl I’m near.’ A
perfect example of the availability bias. Fend it off by spending time with people
who think differently from you – people whose experiences and expertise
are different from yours. We require others’ input to overcome the availability bias.
See also Ambiguity Aversion (ch. 80); Illusion of Attention (ch. 88); Association Bias
(ch. 48); Feature-Positive Effect (ch. 95); Confirmation Bias (ch. 7–8); Contrast Effect (ch.
10); Neglect of Probability (ch. 26)

The It’ll-Get-Worse-Before-It-Gets-Better Fallacy

A few years ago, I was on vacation in Corsica and fell sick. The symptoms were
new to me, and the pain was growing by the day. Eventually I decided to seek
help at a local clinic. A young doctor began to inspect me, prodding my stomach,
gripping my shoulders and knees and then poking each vertebra. I began to
suspect that he had no idea what my problem was, but I wasn’t really sure so I
simply endured the strange examination. To signal its end, he pulled out his
notebook and said: ‘Antibiotics. Take one tablet three times a day. It’ll get worse
before it gets better.’ Glad that I now had a treatment, I dragged myself back to my
hotel room with the prescription in hand.
The pain grew worse and worse – just as the doctor had predicted. The doctor
must have known what was wrong with me after all. But, when the pain hadn’t
subsided after three days, I called him. ‘Increase the dose to five times a day. It’s
going to hurt for a while more,’ he said. After two more days of agony, I finally
called the international air ambulance. The Swiss doctor diagnosed appendicitis
and operated on me immediately. ‘Why did you wait so long?’ he asked me after
the surgery.
I replied: ‘It all happened exactly as the doctor said, so I trusted him.’
‘Ah, you fell victim to the it’ll-get-worse-before-it-gets-better fallacy. That
Corsican doctor had no idea. Probably just the same type of stand-in you find in
all the tourist places in high season.’
Let’s take another example: a CEO is at his wits’ end. Sales are in the toilet, the
salespeople are unmotivated, and the marketing campaign has sunk without a
trace. In his desperation, he hires a consultant. For $5,000 a day, this man
analyses the company and comes back with his findings: ‘Your sales department
has no vision, and your brand isn’t positioned clearly. It’s a tricky situation. I can
fix it for you – but not overnight. The measures will require sensitivity, and most
likely, sales will fall further before things improve.’ The CEO hires the consultant.
A year later, sales fall, and the same thing happens the next year. Again and
again, the consultant stresses that the company’s progress corresponds closely

to his prediction. As sales continue their slump in the third year, the CEO fires the consultant.
A mere smokescreen, the It’ll-Get-Worse-Before-It-Gets-Better Fallacy is a
variant of the so-called confirmation bias. If the problem continues to worsen, the
prediction is confirmed. If the situation improves unexpectedly, the customer is
happy and the expert can attribute it to his prowess. Either way he wins.
Suppose you are president of a country, and have no idea how to run it. What
do you do? You predict ‘difficult years’ ahead, ask your citizens to ‘tighten their
belts’, and then promise to improve the situation only after this ‘delicate stage’ of
the ‘cleansing’, ‘purification’ and ‘restructuring’. Naturally you leave the duration
and severity of the period open.
The best evidence of this strategy’s success is Christianity: its literal followers
believe that before we can experience heaven on earth, the world must be
destroyed. Disasters, floods, fires, death – they are all part of the larger plan and
must take place. Believers will view any deterioration of the situation as
confirmation of the prophecy, and any improvement as a gift from God.
In conclusion: if someone says ‘It’ll get worse before it gets better,’ you should
hear alarm bells ringing. But beware: situations do exist where things first dip and
then improve. For example, a career change requires time and often incorporates
loss of pay. The reorganisation of a business also takes time. But in all these
cases, we can see relatively quickly if the measures are working. The milestones
are clear and verifiable. Look to these rather than to the heavens.
See also Action Bias (ch. 43); Sunk Cost Fallacy (ch. 5); Regression to the Mean (ch. 19)

Story Bias

Life is a muddle, as intricate as a Gordian knot. Imagine that an invisible Martian
decides to follow you around with an equally invisible notebook, recording what
you do, think and dream. The rundown of your life would consist of entries such
as ‘drank coffee, two sugars’, ‘stepped on a thumbtack and swore like a sailor’,
‘dreamed that I kissed the neighbour’, ‘booked vacation, Maldives, now nearly out
of money’, ‘found hair sticking out of ear, plucked it straight away’ and so on. We
like to knit this jumble of details into a neat story. We want our lives to form a
pattern that can be easily followed. Many call this guiding principle ‘meaning’. If
our story advances evenly over the years, we refer to it as ‘identity’. ‘We try on
stories as we try on clothes,’ said Max Frisch, a famous Swiss novelist.
We do the same with world history, shaping the details into a consistent story.
Suddenly we ‘understand’ certain things; for example, why the Treaty of
Versailles led to the Second World War, or why Alan Greenspan’s loose
monetary policy created the collapse of Lehman Brothers. We comprehend why
the Iron Curtain had to fall or why Harry Potter became a best-seller. Here, we
speak about ‘understanding’, but these things cannot be understood in the
traditional sense. We simply build the meaning into them afterward. Stories are
dubious entities. They simplify and distort reality, and filter things that don’t fit. But
apparently we cannot do without them. Why remains unclear. What is clear is that
people first used stories to explain the world, before they began to think
scientifically, making mythology older than philosophy. This has led to the story bias.
In the media, the story bias rages like wildfire. For example: a car is driving
over a bridge when the structure suddenly collapses. What do we read the next
day? We hear the tale of the unlucky driver, where he came from and where he
was going. We read his biography: born somewhere, grew up somewhere else,
earned a living as something. If he survives and can give interviews, we hear
exactly how it felt when the bridge came crashing down. The absurd thing: not
one of these stories explains the underlying cause of the accident. Skip past the

driver’s account and consider the bridge’s construction: where was the weak
point? Was it fatigue? If not, was the bridge damaged? If so, by what? Was a
proper design even used? Where are there other bridges of the same design?
The problem with all these questions is that, though valid, they just don’t make for
a good yarn. Stories attract us; abstract details repel us. Consequently,
entertaining side issues and backstories are prioritised over relevant facts. (On
the upside, if it were not for this, we would be stuck with only non-fiction books.)
Here are two stories from the English novelist E. M. Forster. Which one would
you remember better? A) ‘The king died, and the queen died.’ B) ‘The king died,
and the queen died of grief.’ Most people will retain the second story more easily.
Here, the two deaths don’t just take place successively; they are emotionally
linked. Story A is a factual report, but story B has ‘meaning’. According to
information theory, we should be able to hold on to A better: it is shorter. But our
brains don’t work that way.
Advertisers have learned to capitalise on this too. Instead of focusing on an
item’s benefits, they create a story around it. Objectively speaking, narratives are
irrelevant, but still we find them irresistible. Google illustrated this masterfully in its
Super Bowl commercial from 2010, ‘Google Parisian Love’. Take a look at it on YouTube.
From our own life stories to global events, we shape everything into meaningful
stories. Doing so distorts reality and affects the quality of our decisions, but there
is a remedy: pick these apart. Ask yourself: what are they trying to hide? Visit the
library and spend half a day reading old newspapers. You will see that events
that today look connected weren’t so at the time. To experience the effect once
more, try to view your life story out of context. Dig into your old journals and notes,
and you’ll see that your life has not followed a straight arrow leading to today, but
has been a series of unplanned, unconnected events and experiences, as we’ll
see in the next chapter.
Whenever you hear a story, ask yourself: who is the sender, what are his
intentions and what did he hide under the rug? The omitted elements might not
be of relevance. But then again, they might be even more relevant than the
elements featured in the story, such as when ‘explaining’ a financial crisis or the
‘cause’ of war. The real issue with stories: they give us a false sense of

understanding, which inevitably leads us to take bigger risks and urges us to take
a stroll on thin ice.
See also False Causality (ch.37); ‘Because’ Justification (ch. 52); Personification (ch.
87); Hindsight Bias (ch. 14); Fundamental Attribution Error (ch. 36); Conjunction Fallacy
(ch. 41); Falsification of History (ch. 78); Cherry-Picking (ch. 96); News Illusion (ch. 99)

Hindsight Bias

I came across the diaries of my great-uncle recently. In 1932, he emigrated from a
tiny Swiss village to Paris to seek his fortune in the movie industry. In August
1940, two months after Paris was occupied, he noted: ‘Everyone is certain that
the Germans will leave by the end of the year. Their officers also confirmed this to
me. England will fall as fast as France did, and then we will finally have our
Parisian lives back – albeit as part of Germany.’ The occupation lasted four years.
In today’s history books, the German occupation of France seems to form part
of a clear military strategy. In retrospect, the actual course of the war appears the
most likely of all scenarios. Why? Because we have fallen victim to the hindsight bias.
Let’s take a more recent example: in 2007, economic experts painted a rosy
picture for the coming years. However, just twelve months later, the financial
markets imploded. Asked about the crisis, the same experts enumerated its
causes: monetary expansion under Greenspan, lax validation of mortgages,
corrupt rating agencies, low capital requirements, and so forth. In hindsight, the
reasons for the crash seem painfully obvious.
The hindsight bias is one of the most prevalent fallacies of all. We can aptly
describe it as the ‘I told you so’ phenomenon: in retrospect, everything seems
clear and inevitable. If a CEO becomes successful due to fortunate circumstances
he will, looking back, rate the probability of his success a lot higher than it
actually was. Similarly, following Ronald Reagan’s massive election victory over
Jimmy Carter in 1980, commentators declared his victory to have been
foreseeable, even though the election lay on a knife-edge until a few days before
the final vote. Today, business journalists opine that Google’s dominance was
predestined, even though each of them would have snorted had such a prediction
been made in 1998. One particularly blundering example: nowadays it seems
tragic, yet completely plausible, that a single shot in Sarajevo in 1914 would
totally upturn the world for thirty years and cost 50 million lives. Every child learns
this historical detail in school. But back then, nobody would have dreamed of

such an escalation. It would have sounded too absurd.
So why is the hindsight bias so perilous? Well, it makes us believe we are
better predictors than we actually are, causing us to be arrogant about our
knowledge and consequently to take too much risk. And not just with global
issues: ‘Have you heard? Sylvia and Chris aren’t together any more. It was
always going to go wrong, they were just so different.’ Or: ‘They were just so
similar.’ Or: ‘They spent too much time together.’ Or even: ‘They barely saw one another.’
Overcoming the hindsight bias is not easy. Studies have shown that people
who are aware of it fall for it just as much as everyone else. So, I’m very sorry, but
you’ve just wasted your time reading this chapter.
If you’re still with me, I have one final tip, this time from personal rather than
professional experience: keep a journal. Write down your predictions – for
political changes, your career, your weight, the stock market and so on. Then,
from time to time, compare your notes with actual developments. You will be
amazed at what a poor forecaster you are. Don’t forget to read history too – not
the retrospective, compacted theories compiled in textbooks, but the diaries, oral
histories and historical documents from the period. If you can’t live without news,
read newspapers from five, ten or twenty years ago. This will give you a much
better sense of just how unpredictable the world is. Hindsight may provide
temporary comfort to those overwhelmed by complexity, but as for providing
deeper revelations about how the world works, you’ll benefit by looking elsewhere.
See also Fallacy of the Single Cause (ch. 97); Falsification of History (ch. 78); Story
Bias (ch. 13); Forecast Illusion (ch. 40); Outcome Bias (ch. 20); Self-Serving Bias (ch. 45)

Overconfidence Effect

My favourite musician, Johann Sebastian Bach, was anything but a one-hit
wonder. He composed numerous works. How many there were I will reveal at the
end of this chapter. But for now, here’s a small assignment: how many concertos
do you think Bach composed? Choose a range, for example, between 100 and
500, making it wide enough that you are 98% confident the true number falls inside it – in other words, that you would expect to be wrong only 2% of the time.
How much confidence should we have in our own knowledge? Psychologists
Howard Raiffa and Marc Alpert, wondering the same thing, have interviewed
hundreds of people in this way. They have asked participants to estimate the total
egg production in the U.S., or the number of physicians and surgeons listed in the
Yellow Pages of the phone directory for Boston, or the number of foreign
automobiles imported into the U.S., or even the toll collections of the Panama
Canal in millions of dollars. Subjects could choose any range they liked, with the
aim of not being wrong more than 2% of the time. The results were amazing. In
the final tally, instead of just 2%, they were off 40% of the time. The researchers
dubbed this amazing phenomenon overconfidence.
Overconfidence also applies to forecasts, such as stock market performance
over a year or your firm’s profits over three years. We systematically overestimate
our knowledge and our ability to predict – on a massive scale. The
overconfidence effect does not deal with whether single estimates are correct or
not. Rather, it measures the difference between what people really know and
what they think they know. What’s surprising is this: experts suffer even more from
overconfidence than laypeople do. If asked to forecast oil prices in five years’
time, an economics professor will be as wide of the mark as a zookeeper would be.
However, the professor will offer his forecast with certitude.
Overconfidence does not stop at economics: in surveys, 84% of Frenchmen
estimate that they are above-average lovers. Without the overconfidence effect,
that figure should be exactly 50% – after all, the statistical ‘median’ means 50%
should rank higher and 50% should rank lower. In another survey, 93% of the

U.S. students asked estimated themselves to be ‘above average’ drivers. And
68% of the faculty at the University of Nebraska rated themselves in the top 25%
for teaching ability. Entrepreneurs and those wishing to marry also deem
themselves to be different: they believe they can beat the odds. In fact,
entrepreneurial activity would be a lot lower if overconfidence did not exist. For
example, every restaurateur hopes to establish the next Michelin-starred
restaurant, even though statistics show that most close their doors after just three
years. The return on investment in the restaurant business lies chronically below zero.
Hardly any major projects exist that are completed in less time and at a lower
cost than forecasted. Some delays and cost overruns are even legendary, such
as the Airbus A400M, the Sydney Opera House and Boston’s Big Dig. The list
can be added to at will. Why is that? Here, two effects act in unison. First, you
have classic overconfidence. Second, those with a direct interest in the project
have an incentive to underestimate the costs: consultants, contractors and
suppliers seek follow-up orders. Builders feel bolstered by the optimistic figures
and, through their activities, politicians get more votes. We will examine this
strategic misrepresentation (Chapter 89) later in the book.
What makes overconfidence so prevalent and its effect so confounding is that it
is not driven by incentives; it is raw and innate. And it’s not counterbalanced by
the opposite effect, ‘underconfidence’, which doesn’t exist. No surprise to some
readers: overconfidence is more pronounced in men – women tend not to
overestimate their knowledge and abilities as much. Even more troubling:
optimists are not the only victims of overconfidence. Even self-proclaimed
pessimists overrate themselves – just less extremely.
In conclusion: be aware that you tend to overestimate your knowledge. Be
sceptical of predictions, especially if they come from so-called experts. And with
all plans, favour the pessimistic scenario. This way you have a chance of judging
the situation somewhat realistically.
Back to the question from the beginning: Johann Sebastian Bach composed
1127 works that survived to this day. He may have composed considerably more,
but they are lost.
See also Illusion of Skill (ch. 94); Forecast Illusion (ch. 40); Strategic Misrepresentation

(ch. 89); Incentive Super-Response Tendency (ch. 18); Self-Serving Bias (ch. 45)

Chauffeur Knowledge

After receiving the Nobel Prize for Physics in 1918, Max Planck went on tour
across Germany. Wherever he was invited, he delivered the same lecture on the new
quantum mechanics. Over time, his chauffeur grew to know it by heart: ‘It has to
be boring giving the same speech each time, Professor Planck. How about I do it
for you in Munich? You can sit in the front row and wear my chauffeur’s cap.
That’d give us both a bit of variety.’ Planck liked the idea, so that evening the
driver delivered a long lecture on quantum mechanics in front of a distinguished
audience. Later, a physics professor stood up with a question. The driver
recoiled: ‘Never would I have thought that someone from such an advanced city
as Munich would ask such a simple question! My chauffeur will answer it.’
According to Charlie Munger, one of the world’s best investors (and from whom
I have borrowed this story), there are two types of knowledge. First, we have real
knowledge. We see it in people who have committed a large amount of time and
effort to understanding a topic. The second type is chauffeur knowledge –
knowledge from people who have learned to put on a show. Maybe they have a
great voice or good hair, but the knowledge they espouse is not their own. They
reel off eloquent words as if reading from a script.
Unfortunately, it is increasingly difficult to separate true knowledge from
chauffeur knowledge. With news anchors, however, it is still easy. These are
actors. Period. Everyone knows it. And yet it continues to astound me how much
respect these perfectly-coiffed script readers enjoy, not to mention how much they
earn moderating panels about topics they barely fathom.
With journalists, it is more difficult. Some have acquired true knowledge. Often
they are veteran reporters who have specialised for years in a clearly defined
area. They make a serious effort to understand the complexity of a subject and to
communicate it. They tend to write long articles that highlight a variety of cases
and exceptions. The majority of journalists, however, fall into the category of
chauffeur. They conjure up articles off the tops of their heads, or rather, from
Google searches. Their texts are one-sided, short, and – often as compensation

for their patchy knowledge – snarky and self-satisfied in tone.
The same superficiality is present in business. The larger a company, the more
the CEO is expected to possess ‘star quality’. Dedication, solemnity, and
reliability are undervalued, at least at the top. Too often shareholders and
business journalists seem to believe that showmanship will deliver better results,
which is obviously not the case.
To guard against the chauffeur effect, Warren Buffett, Munger’s business
partner, has coined a wonderful phrase, ‘circle of competence’. What lies inside
this circle you understand intuitively; what lies outside, you may only partially
comprehend. One of Munger’s best pieces of advice is: ‘You have to stick within
what I call your circle of competence. You have to know what you understand and
what you don’t understand. It’s not terribly important how big the circle is. But it is
terribly important that you know where the perimeter is.’ Munger underscores this:
‘So you have to figure out what your own aptitudes are. If you play games where
other people have the aptitudes and you don’t, you’re going to lose. And that’s as
close to certain as any prediction that you can make. You have to figure out
where you’ve got an edge. And you’ve got to play within your own circle of competence.’
In conclusion: be on the lookout for chauffeur knowledge. Do not confuse the
company spokesperson, the ringmaster, the newscaster, the schmoozer, the
verbiage vendor or the cliché generator with those who possess true knowledge.
How do you recognise the difference? There is a clear indicator: true experts
recognise the limits of what they know and what they do not know. If they find
themselves outside their circle of competence, they keep quiet or simply say, ‘I
don’t know.’ This they utter unapologetically, even with a certain pride. From
chauffeurs, we hear every line except this.
See also Authority Bias (ch. 9); Domain Dependence (ch. 76); Twaddle Tendency (ch.

Illusion of Control

Every day, shortly before nine o’clock, a man with a red hat stands in a square
and begins to wave his cap around wildly. After five minutes he disappears. One
day, a policeman comes up to him and asks: ‘What are you doing?’ ‘I’m keeping
the giraffes away.’ ‘But there aren’t any giraffes here.’ ‘Well, I must be doing a
good job, then.’
A friend with a broken leg was stuck in bed and asked me to pick up a lottery
ticket for him. I went to the store, checked a few boxes, wrote his name on it and
paid. As I handed him the copy of the ticket, he balked. ‘Why did you fill it out? I
wanted to do that. I’m never going to win anything with your numbers!’
‘Do you really think it affects the draw if you pick the numbers?’ I inquired. He
looked at me blankly.
In casinos, most people throw the dice as hard as they can if they need a high
number, and as gingerly as possible if they are hoping for a low number – which
is as nonsensical as football fans thinking they can swing a game by
gesticulating in front of the TV. Unfortunately they share this illusion with many
people who also seek to influence the world by sending out the ‘right’ thoughts
(vibrations, positive energy, karma . . .).
The illusion of control is the tendency to believe that we can influence
something over which we have absolutely no sway. This was discovered in 1965
by two researchers, Jenkins and Ward. Their experiment was simple, consisting
of just two switches and a light. The researchers could set when the switches were
connected to the light and when they were not. Even when the light flashed on and off at
random, subjects remained convinced that they could influence it by flicking the switches.
Or consider this example: an American researcher has been investigating
acoustic sensitivity to pain. For this, he placed people in sound booths and
increased the volume until the subjects signalled him to stop. The two rooms, A
and B, were identical, save one thing: room B had a red panic button on the wall.

The button was purely for show, but it gave participants the feeling that they were
in control of the situation, leading them to withstand significantly more noise. If
you have read Aleksandr Solzhenitsyn, Primo Levi or Viktor Frankl, this finding
will not surprise you: the idea that people can influence their destiny even by a
fraction encouraged these prisoners not to give up hope.
Crossing the street in Los Angeles is a tricky business, but luckily, at the press
of a button, we can stop traffic. Or can we? The button’s real purpose is to make
us believe we have an influence on the traffic lights, so that we can endure the
wait for the signal to change with more patience. The same goes for
‘door-open’ and ‘door-close’ buttons in elevators: many are not even connected to
the electrical panel. Such tricks are also designed into open-plan offices: for
some people it will always be too hot, for others too cold. Clever technicians
create the illusion of control by installing fake temperature dials. This reduces
energy bills – and complaints. Such ploys are called ‘placebo buttons’ and they
are being pushed in all sorts of realms.
Central bankers and government officials employ placebo buttons masterfully.
Take, for instance, the federal funds rate, which is an extremely short-term rate, an
overnight rate to be precise. While this rate doesn’t affect long-term interest rates
(which are a function of supply and demand, and an important factor in
investment decisions), the stock market, nevertheless, reacts frenetically to its
every change. Nobody understands why overnight interest rates can have such
an effect on the market, but everybody thinks they do, and so they do. The same
goes for pronouncements made by the Chairman of the Federal Reserve; markets
move, even though these statements inject little of tangible value into the real
economy. They are merely sound waves. And still we allow economic heads to
continue to play with the illusory dials. It would be a real wake-up call if all
involved realised the truth – that the world economy is a fundamentally
uncontrollable system.
And you? Do you have everything under control? Probably less than you think.
Do not think you command your way through life like a Roman emperor. Rather,
you are the man with the red hat. Therefore, focus on the few things of importance
that you can really influence. For everything else: que sera, sera.
See also Coincidence (ch. 24); Neglect of Probability (ch. 26); Forecast Illusion (ch. 40);
Illusion of Skill (ch. 94); Clustering Illusion (ch. 3); Introspection Illusion (ch. 67)

Incentive Super-Response Tendency

To control a rat infestation, French colonial rulers in Hanoi in the nineteenth
century passed a law: for every dead rat handed in to the authorities, the catcher
would receive a reward. Yes, many rats were destroyed, but many were also bred
specially for this purpose.
In 1947, when the Dead Sea scrolls were discovered, archaeologists set a
finder’s fee for each new parchment. Instead of lots of extra scrolls being found,
they were simply torn apart to increase the reward. Similarly, in China in the
nineteenth century, an incentive was offered for finding dinosaur bones. Farmers
located a few on their land, broke them into pieces and cashed in. Modern
incentives are no better: company boards promise bonuses for achieved targets.
And what happens? Managers invest more energy in trying to lower the targets
than in growing the business.
These are examples of the incentive super-response tendency. Credited to
Charlie Munger, this titanic name describes a rather trivial observation: people
respond to incentives by doing what is in their best interests. What is noteworthy
is, first, how quickly and radically people’s behaviour changes when incentives
come into play or are altered and, second, the fact that people respond to the
incentives themselves and not the grander intentions behind them.
Good incentive systems comprise both intent and reward. An example: in
Ancient Rome, engineers were made to stand underneath the construction at
their bridges’ opening ceremonies. Poor incentive systems, on the other hand,
overlook and sometimes even pervert the underlying aim. For example, censoring
a book makes its contents more famous and rewarding bank employees for each
loan sold leads to a miserable credit portfolio. Making CEOs’ pay public didn’t
dampen the astronomical salaries; to the contrary, it pushed them upward.
Nobody wants to be the loser CEO in his industry.
Do you want to influence the behaviour of people or organisations? You could
always preach about values and visions, or you could appeal to reason. But in
nearly every case, incentives work better. These need not be monetary; anything

is useable, from good grades to Nobel Prizes to special treatment in the afterlife.
For a long time I tried to understand what made well-educated nobles from the
Middle Ages bid adieu to their comfortable lives, swing themselves up on to
horses and take part in the Crusades. They were well aware that the arduous ride
to Jerusalem lasted at least six months and passed directly through enemy
territory, yet they took the risk. And then it came to me: the answer lies in incentive
systems. If they came back alive, they could keep the spoils of war and live out
their days as rich men. If they died, they automatically passed on to the afterlife as
martyrs – with all the benefits that came with it. It was win-win.
Imagine for a moment that, instead of demanding enemies’ riches, warriors and
soldiers charged by the hour. We would effectively be incentivising them to take
as long as possible, right? So why do we do just this with lawyers, architects,
consultants, accountants and driving instructors? My advice: forget hourly rates
and always negotiate a fixed price in advance.
Be wary, too, of investment advisers endorsing particular financial products.
They are not interested in your financial well-being, but in earning a commission
on these products. The same goes for entrepreneurs’ and investment bankers’
business plans. These are often worthless because, again, the vendors have
their own interests at heart. What is the old adage? ‘Never ask a barber if you
need a haircut.’
In conclusion: keep an eye out for the incentive super-response tendency. If a
person’s or an organisation’s behaviour confounds you, ask yourself what
incentive might lie behind it. I guarantee you that you’ll be able to explain 90% of
the cases this way. What makes up the remaining 10%? Passion, idiocy,
psychosis or malice.
See also Motivation Crowding (ch. 56); Reciprocity (ch. 6); Overconfidence Effect (ch. 15)

Regression to Mean

His back pain was sometimes better, sometimes worse. There were days when
he felt like he could move mountains, and those when he could barely move.
When that was the case – fortunately it happened only rarely – his wife would
drive him to the chiropractor. The next day he would feel much more mobile and
would recommend the therapist to everyone.
Another man, younger and with a respectable golf handicap of 12, gushed in a
similar fashion about his golf instructor. Whenever he played miserably, he
booked an hour with the pro, and lo and behold, in the next game he fared much better.
A third man, an investment adviser at a major bank, invented a sort of ‘rain
dance’, which he performed in the restroom every time his stocks had performed
extremely badly. As absurd as it seemed, he felt compelled to do it: and things
always improved afterward.
What links the three men is a fallacy: the regression-to-mean delusion.
Suppose your region is experiencing a record period of cold weather. In all
probability, the temperature will rise in the next few days, back toward the monthly
average. The same goes for extreme heat, drought or rain. Weather fluctuates
around a mean. The same is true for chronic pain, golf handicaps, stock market
performance, luck in love, subjective happiness and test scores. In short, the
crippling back pain would most likely have improved without a chiropractor. The
handicap would have returned to 12 without additional lessons. And the
performance of the investment adviser would also have shifted back toward the
market average – with or without the restroom dance.
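The mechanism can be made concrete with a small simulation – a sketch of my own, not from the text – assuming a simple model in which every observed score is stable skill plus fresh random luck:

```python
import random

random.seed(42)
N = 10_000
# Hypothetical model: each observed score = stable skill + fresh luck per round.
skill = [random.gauss(0, 1) for _ in range(N)]
round1 = [s + random.gauss(0, 1) for s in skill]   # first measurement
round2 = [s + random.gauss(0, 1) for s in skill]   # second measurement, new luck

# Take the worst 3% from round one and see how the same people do next time.
worst = sorted(range(N), key=lambda i: round1[i])[: N * 3 // 100]
avg1 = sum(round1[i] for i in worst) / len(worst)
avg2 = sum(round2[i] for i in worst) / len(worst)
print(f"bottom 3% first time: {avg1:.2f}, same people next time: {avg2:.2f}")
```

With no intervention at all, the group selected for its extreme first result scores markedly closer to the average the second time, because its bad luck does not repeat.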
Extreme performances are interspersed with less extreme ones. The most
successful stock picks from the past three years are hardly going to be the most
successful stocks in the coming three years. Knowing this, you can appreciate
why some athletes would rather not make it on to the front pages of the

newspapers: subconsciously they know that the next time they race, they
probably won’t achieve the same top result – which has nothing to do with the
media attention, but is to do with natural variations in performance.
Or, take the example of a division manager who wants to improve employee
morale by sending the least motivated 3% of the workforce on a course. The
result? The next time he looks at motivation levels, the same people will not make
up the bottom few – there will be others. Was the course worth it? Hard to say,
since the group’s motivation levels would probably have returned to their
personal norms even without the training. The situation is similar with patients
who are hospitalised for depression. They usually leave the clinic feeling a little
better. It is quite possible, however, that the stay contributed absolutely nothing.
Another example: in Boston, the lowest-performing schools were entered into a
complex support programme. The following year, the schools had moved up in
the rankings, an improvement that the authorities attributed to the programme
rather than to natural regression to mean.
Ignoring regression to mean can have destructive consequences, such as
teachers (or managers) concluding that the stick is better than the carrot. For
example, following a test the highest performing students are praised, and the
lowest are castigated. In the next exam, other students will probably – purely
coincidentally – achieve the highest and lowest scores. Thus, the teacher
concludes that reproach helps and praise hinders. A fallacy that keeps on giving.
In conclusion: when you hear stories such as: ‘I was sick, went to the doctor,
and got better a few days later’ or ‘the company had a bad year, so we got a
consultant in and now the results are back to normal’, look out for our old friend,
the regression-to-mean error.
See also Problem with Averages (ch. 55); Contrast Effect (ch. 10); The It’ll-Get-Worse-Before-It-Gets-Better Fallacy (ch. 12); Coincidence (ch. 24); Gambler’s Fallacy (ch. 29)

Outcome Bias

A quick thought experiment: say one million monkeys speculate on the stock market. They
buy and sell stocks like crazy and, of course, completely at random. What
happens? After one week, about half of the monkeys will have made a profit and
the other half a loss. The ones that made a profit can stay; the ones that made a
loss you send home. In the second week, one half of the monkeys will still be
riding high, while the other half will have made a loss and are sent home. And so
on. After ten weeks, about 1,000 monkeys will be left – those who have always
invested their money well. After twenty weeks, just one monkey will remain – this
one always, without fail, chose the right stocks and is now a billionaire. Let’s call
him the success monkey.
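The numbers in the story are nothing more than repeated halving; a quick check (my own sketch, not part of the original text):

```python
# Each week, pure chance sends home about half of the remaining monkeys.
monkeys = 1_000_000
after_10 = monkeys / 2**10   # survivors after ten coin-flip weeks
after_20 = monkeys / 2**20   # survivors after twenty coin-flip weeks
print(round(after_10), round(after_20, 2))  # → 977 0.95
```

About a thousand monkeys remain after ten weeks and, on average, roughly one after twenty – exactly the 'success monkey' of the story, produced by chance alone.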
How does the media react? They will pounce on this animal to understand its
‘success principles’. And they will find some: perhaps the monkey eats more
bananas than the others. Perhaps he sits in another corner of the cage. Or,
maybe he swings headlong through the branches, or he takes long, reflective
pauses while grooming. He must have some recipe for success, right? How else
could he perform so brilliantly? Spot-on for twenty weeks – and that from a simple
monkey? Impossible!
The monkey story illustrates the outcome bias: we tend to evaluate decisions
based on the result rather than on the decision process. This fallacy is also
known as the historian error. A classic example is the Japanese attack on Pearl
Harbor. Should the military base have been evacuated or not? From today’s
perspective: obviously, for there was plenty of evidence that an attack was
imminent. However, only in retrospect do the signals appear so clear. At the time,
in 1941, there was a plethora of contradictory signals. Some pointed to an attack;
others did not. To assess the quality of the decision, we must use the information
available at the time, filtering out everything we know about it post-attack
(particularly that it did indeed take place).
Another experiment: you must evaluate the performance of three heart
surgeons. To do this, you ask each to carry out a difficult operation five times.

Over the years, the probability of dying from these procedures has stabilised at
20%. With surgeon A, no one dies. With surgeon B, one patient dies. With
surgeon C, two die. How do you rate the performance of A, B and C? If you think
like most people, you rate A the best, B the second best, and C the worst. And
thus you’ve just fallen for the outcome bias. You can guess why: the samples are
too small, rendering the results meaningless. You can only really judge a surgeon
if you know something about the field, and then carefully monitor the preparation
and execution of the operation. In other words, you assess the process and not
the result. Alternatively, you could employ a larger sample, if you have enough
patients who need this particular operation: 100 or 1,000 operations. For now it is
enough to know that, with an average surgeon, there is a 33% chance that no one
will die, a 41% chance that one person will die and a 20% chance that two people
will die. That’s a simple probability calculation. What stands out: there is no huge
difference between zero dead and two dead. To assess the three surgeons purely
on the basis of the outcomes would be not only negligent but also unethical.
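The 33%, 41% and 20% figures follow from the binomial distribution for five operations with a 20% chance of death each; a quick verification in Python (not from the book):

```python
from math import comb

n, p = 5, 0.20  # five operations, 20% mortality per operation
for deaths in range(3):
    # Binomial probability of exactly `deaths` fatalities in n operations.
    prob = comb(n, deaths) * p**deaths * (1 - p)**(n - deaths)
    print(deaths, f"{prob:.0%}")  # 0 → 33%, 1 → 41%, 2 → 20%
```

The spread between 'zero dead' and 'two dead' is well within what chance alone produces for an average surgeon over only five operations, which is why the outcomes say so little.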
In conclusion: never judge a decision purely by its result, especially when
randomness or ‘external factors’ play a role. A bad result does not automatically
indicate a bad decision and vice versa. So rather than tearing your hair out about
a wrong decision, or applauding yourself for one that may have only
coincidentally led to success, remember why you chose what you did. Were your
reasons rational and understandable? Then you would do well to stick with that
method, even if you didn’t strike lucky last time.
See also Sunk Cost Fallacy (ch. 5); Swimmer’s Body Illusion (ch. 2); Hindsight Bias
(ch. 14); Illusion of Skill (ch. 94)

The Paradox of Choice

My sister and her husband bought an unfinished house a little while ago. Since
then, we haven’t been able to talk about anything else. The sole topic of
conversation for the past two months has been bathroom tiles: ceramic, granite,
marble, metal, stone, wood, glass and every type of laminate known to man.
Rarely have I seen my sister in such anguish. ‘There are just too many to choose
from,’ she exclaims, throwing her hands in the air and returning to the tile
catalogue, her constant companion.
I’ve counted and researched: my local grocery store stocks 48 varieties of
yogurt, 134 types of red wine, 64 different cleaning products and a grand total of
30,000 items. Amazon, the Internet bookseller, has two million titles available.
Nowadays, people are bombarded with options, such as hundreds of mental
disorders, thousands of different careers, even more holiday destinations and an
infinite variety of lifestyles. There has never been more choice.
When I was young, we had three types of yogurt, three television channels, two
churches, two kinds of cheese (mild or strong), one type of fish (trout) and one
telephone, provided by the Swiss Post. The black box with the dial served no
other purpose than making calls, and that did us just fine. In contrast, anyone who
enters a phone store today runs the risk of being flattened by an avalanche of
brands, models and contract options.
And yet, selection is the yardstick of progress. It is what sets us apart from
planned economies and the Stone Age. Yes, abundance makes you giddy, but
there is a limit. When it is exceeded, a surfeit of choices destroys quality of life.
The technical term for this is the paradox of choice.
In his book of the same title, psychologist Barry Schwartz describes why this is
so. First, a large selection leads to inner paralysis. To test this, a supermarket set
up a stand where customers could sample twenty-four varieties of jelly. They
could try as many as they liked and then buy them at a discount. The next day,
the owners carried out the same experiment with only six flavours. The result?
They sold ten times more jelly on day two. Why? With such a wide range,
customers could not come to a decision, so they bought nothing. The experiment
was repeated several times with different products. The results were always the
same.
Second, a broader selection leads to poorer decisions. If you ask young people
what is important in a life partner, they reel off all the usual qualities: intelligence,
good manners, warmth, the ability to listen, a sense of humour and physical
attractiveness. But do they actually take these criteria into account when
choosing someone? In the past, a young man from a village of average size could
choose among maybe twenty girls of similar age with whom he went to school.
He knew their families and vice versa, leading to a decision based on several
well-known attributes. Nowadays, in the era of online dating, millions of potential
partners are at our disposal. It has been proven that the stress caused by this
mind-boggling variety is so great that the male brain reduces the decision to a
single criterion: physical attractiveness. The consequences of this selection
process you already know – perhaps even from personal experience.
Finally, large selection leads to discontent. How can you be sure you are
making the right choice when 200 options surround and confound you? The
answer is: you cannot. The more choice you have, the more unsure and therefore
dissatisfied you are afterwards.
So, what can you do? Think carefully about what you want before you inspect
existing offers. Write down these criteria and stick to them rigidly. Also, realise
that you can never make a perfect decision. Aiming for this, given the flood of
possibilities, is a form of irrational perfectionism. Instead, learn to love a ‘good’
choice. Yes, even in terms of life partners. Only the best will do? In this age of
unlimited variety, rather the opposite is true: ‘good enough’ is the new optimum
(except, of course, for you and me).
See also Decision Fatigue (ch. 53); Alternative Blindness (ch. 71); Default Effect (ch. 81)

Liking Bias

Kevin has just bought two boxes of fine Margaux. He rarely drinks wine – not
even Bordeaux – but the sales assistant was so nice, not fake or pushy, just really
likeable. So he bough