Daniel Kahneman won the Nobel Prize in economics in 2002 for his brilliant work with the late Amos Tversky on the way people make decisions. Kahneman clearly worked hard to make the conclusions of his research accessible to all in his book Thinking, Fast and Slow. No other book has given me as much useful insight into the workings of my own mind.
I understand much better why I would hesitate before accepting a 50/50 bet to lose $200 or win $300, and why I have an opinion on the future of Apple stock even though I know I really have no idea what will happen. The key is to understand the workings of my “System 1”.
Kahneman portrays the human mind as consisting of two actors, System 1 and System 2. He’s careful to explain that they are not really separate characters inside your mind, but speaking of them as separate entities is such a useful device for explaining the research results that the book would be far less accessible without it.
Roughly speaking, System 1 is the fast and involuntary part of your brain. It sees potential threats and answers just about any question quickly, whether it knows a good answer or not. System 2 is slow and lazy by comparison. It usually just accepts judgments from System 1, but sometimes it kicks in and thinks things over to come to a decision different from System 1’s decision.
It is my System 1 that is overly averse to losses. It tends to feel losses about twice as intensely as it feels gains. System 1 is the reason why my initial reaction was to turn down a 50/50 bet to lose $200 or win $300. It is my System 2 that takes its time to verify that the 50/50 coin toss will be fair and decide to accept the bet.
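The arithmetic behind that bet can be made concrete. The sketch below uses the roughly 2x loss weight Kahneman reports for loss aversion; the exact multiplier varies from person to person, so treat it as an illustrative assumption:

```python
# Expected dollar value of a 50/50 bet: lose $200 or win $300.
expected_value = 0.5 * (-200) + 0.5 * 300
print(expected_value)  # 50.0 -- positive, so System 2 says take the bet

# System 1 weighs losses roughly twice as heavily as gains
# (the ~2x factor is an approximation, not a precise constant).
loss_aversion = 2.0
felt_value = 0.5 * (-200 * loss_aversion) + 0.5 * 300
print(felt_value)  # -50.0 -- feels like a losing bet, so System 1 refuses
```

The bet is objectively worth $50, but once losses are doubled it *feels* like minus $50, which is why the snap reaction is to refuse.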
It is my System 1 that decides quickly if Apple stock will go up or down based on available evidence. Of course, it might just substitute an easier question, such as “do I like Apple?” System 2 almost always accepts System 1 judgments; it couldn’t possibly verify every decision made by System 1. There was a time when I acted on my System 1 judgment of what would happen to Apple. Now my System 2 overrides these snap decisions and prevents me from trading Apple stock.
System 2 is good at coming up with reasons to back up the judgments that come from System 1. Most people would turn down the $200/$300 bet. When pressed they could come up with intelligent-sounding reasons for their decision. But the real reason is that their System 1 hates losses. The truth is that “you know far less about yourself than you feel you do.”
System 1 is a remarkable machine that we can’t do without, but it does tend to make certain types of predictable errors. One such error is to confuse familiarity with truth. “A reliable way to make people believe in falsehoods is frequent repetition.” This explains some of the “reporting” on Fox News.
System 1 “deals well with averages but poorly with sums.” Perhaps this explains why I’ve found that when asked about casual spending such as lunches out, most people can fairly accurately state the average cost of their lunches and seem to know how often they buy lunch, yet consistently underestimate, by a long shot, their total spending on lunches.
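A quick back-of-the-envelope sum shows how large the gap can be. The figures here are hypothetical, just a plausible lunch pattern:

```python
# The average is easy to report; the annual sum is what gets underestimated.
avg_lunch = 12.00        # dollars per lunch (hypothetical)
lunches_per_week = 4     # hypothetical frequency
weeks_per_year = 50

annual_total = avg_lunch * lunches_per_week * weeks_per_year
print(annual_total)  # 2400.0 -- a sum most people would guess far too low
```

Both inputs are things people report accurately; it is only the multiplication into a total that System 1 handles poorly.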
The human mind tends to be bad with probabilities. When it comes to small risks, “we either ignore them altogether or give them far too much weight – nothing in between.”
The idea of regression to the mean is familiar to me, but Kahneman shows how it can be hidden. For example, a sports coach who praises good performance and criticizes poor performance can easily get the impression that criticism works and praise doesn’t. However, even without any words from the coach, unusually poor performance is usually followed by an improvement, and strong performance is usually followed by a more average effort.
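The coach’s illusion can be reproduced with a tiny simulation. A minimal sketch, assuming performance is just a fixed skill level plus independent luck on each outing (all numbers hypothetical):

```python
import random

random.seed(0)
# Each outing = skill + independent luck; no coaching effect exists in this model.
skill, luck_sd, trials = 50.0, 10.0, 100_000
pairs = [(skill + random.gauss(0, luck_sd),
          skill + random.gauss(0, luck_sd)) for _ in range(trials)]

# Follow-ups after unusually poor first performances (would draw criticism)...
after_poor = [b for a, b in pairs if a < 35]
# ...and after unusually strong first performances (would draw praise).
after_strong = [b for a, b in pairs if a > 65]

print(sum(after_poor) / len(after_poor))      # near 50: "criticism worked"
print(sum(after_strong) / len(after_strong))  # near 50: "praise backfired"
```

Both follow-up averages drift back toward 50 with no words from the coach at all, which is exactly the pattern that gets misread as criticism working and praise failing.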
If we like a person, our mind’s “halo effect” tends to make us think well of them in all respects, including assessment of their skills. It works the other way too: we judge people we don’t like to be bad at everything. Kahneman uses the example “Hitler loved dogs and little children” to drive home this point. We instantly expect someone who makes such a statement to follow it with denials of past atrocities and other offensive statements.
Kahneman tells an amusing story about demonstrating to a Wall Street firm that, based on statistical evidence the firm itself provided, its advisers had absolutely no stock-picking skill whatsoever. However, “people can maintain an unshakable faith in any proposition, however absurd, when they are sustained by a community of like-minded believers.” At this point I’m not sure if Kahneman was talking about stock picking, religion, or both.
In an experiment showing a failing of System 1 thinking, students were given a chance to win a prize if they pulled a red marble from one of two urns. They were told that the first urn had 1 red marble out of 10 marbles total. The second had 8 red out of 100. More than a third of students chose the second urn because it contained more winners, even though the better odds of winning are with the first urn.
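The odds are straightforward to check, which is what makes the result striking:

```python
# Urn 1: 1 red out of 10; Urn 2: 8 red out of 100.
p_urn1 = 1 / 10    # 0.10
p_urn2 = 8 / 100   # 0.08
print(p_urn1 > p_urn2)  # True -- the urn with fewer winners has better odds
```

System 1 latches onto the vivid count of eight winning marbles; dividing by the total is System 2 work.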
To show that we’re susceptible to how a question is framed, consider two drivers. The first changes from a 12 mpg gas-guzzler to one that runs at 14 mpg. The second driver changes cars from one that gets 30 mpg to a 40 mpg car. Who saves more money? The surprising answer is that the first driver saves more money. If the figures were given as the number of litres (or gallons) per 100 km (or miles), the correct answer would have been more obvious.
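Working the fuel numbers directly shows why. A sketch assuming both drivers cover the same annual distance (10,000 miles is an arbitrary choice; any shared distance gives the same ranking):

```python
miles = 10_000  # annual mileage, assumed equal for both drivers

# Gallons saved per year by each switch.
driver1_saved = miles / 12 - miles / 14   # 12 mpg -> 14 mpg
driver2_saved = miles / 30 - miles / 40   # 30 mpg -> 40 mpg

print(round(driver1_saved, 1))  # about 119.0 gallons
print(round(driver2_saved, 1))  # about 83.3 gallons

# The fuel-per-distance framing (like litres per 100 km) makes it obvious:
print(round(100 / 12 - 100 / 14, 2))  # driver 1: gallons saved per 100 miles
print(round(100 / 30 - 100 / 40, 2))  # driver 2: a smaller saving
```

Miles per gallon puts distance in the numerator, which hides the fact that fuel consumed is the reciprocal; the per-100-km framing puts fuel in the numerator, so the comparison becomes a simple subtraction.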
The book summarizes the “focusing illusion” with the following interesting quote: “Nothing in life is as important as you think it is when you are thinking about it.”
Overall, I found this book very valuable to me personally. I now understand why it is so important to figure out when it is safe to trust my snap judgments and when I should slow down and think things through. I highly recommend this book to my readers.