Book Review: “Thinking, Fast and Slow” by Daniel Kahneman

This book shines some light on what makes people tick – how we make judgments and decisions. It presents so many concepts that it takes a long time to relate them all to our daily lives. I listened to the audiobook twice. The first time was more than a year ago; even the second time wasn’t sufficient, so I also read the book for this review. By the way, the Kindle version of the book is selling for just $3. What a bargain for a book this rich!

My key take-aways:
1. Thinking statistically is really hard, but it may be the way to avoid falling into the traps of our own System 1.
2. The planning fallacy really hit home in my career. I like the remedy of starting from the statistics of similar past cases and adjusting from there.
3. The use of the “premortem” (Part 3) to counter groupthink is brilliant. I think it’s a great way to think about one’s big personal decisions. Ask yourself, “Imagine X years into the future and you have failed. Write a brief history of your failure.” Wow, that’s powerful.
4. WYSIATI (What You See Is All There Is) is the basic theme of System 1. We can only decide based on what we perceive at that moment, mostly from our intuition, i.e. System 1.

Overall, it is a very good book if you’re interested in understanding human nature and the conflicts we often find in ourselves.

Here is a quick summary:

There are two systems and two selves in each of us:

Part 1 goes into System 1 and System 2.
System 1: Fast and automatic, with little effort and voluntary control.
System 2: Slow, effortful, and attentive – it carries out effortful mental activities, articulates judgments, and makes choices, but it often endorses or rationalizes ideas and feelings that were generated by System 1.
Priming effects come from System 1. Cognitive ease makes us favor whatever is easier – names that are easier to pronounce, statements that rhyme. Explaining away coincidences like “Black Swan” events, the halo effect, and jumping to conclusions are what System 1 is good at, assisted by System 2. System 2 plays the “apologist” for the emotions of System 1 rather than their critic – an endorser rather than an enforcer.

Part 2 talks about heuristics vs. thinking statistically, which System 1 lacks. Small samples do not lend themselves to statistically meaningful conclusions – we are easily fooled by small samples. We are also subject to anchoring on numbers we have no reference for, as in: “Is the tallest redwood tree taller than 1,200 ft? What’s your best guess?” The availability bias makes us think things are more frequent when instances come to mind easily, like consecutive plane crashes. The “availability cascade” is attributed to the limitation of our minds in dealing with small risks: we either ignore them altogether or give them far too much weight, fueled by viral news coverage.

The sins of representativeness: first, an excessive willingness to predict the occurrence of unlikely (low base-rate) events. Both System 1 and System 2 may be guilty of incorrect intuitive judgment – System 1 suggests the incorrect intuition and System 2 endorses it and expresses it as a judgment, out of ignorance or laziness. The second sin: insensitivity to the quality of evidence. The remedy is Bayesian: anchor your judgment of the probability of an outcome on a plausible base rate and question the diagnosticity of your evidence. The conjunction fallacy in the “Linda” example highlights the laziness of System 2: a more plausible story appeals to System 1 even though it breaks statistical logic – less is more. People often don’t learn from statistics (especially base rates) until they can derive a causal interpretation from them; this is where Bayesian thinking matters. Grasping regression to the mean is likewise difficult because of System 1’s insistent demand for causal interpretations.
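The base-rate lesson can be made concrete with a quick Bayes’ rule calculation, using the book’s well-known taxicab example (85% of cabs are Green, 15% are Blue; a witness says “Blue” and is right 80% of the time). The `posterior` helper below is my own sketch, not code from the book:

```python
# Bayes' rule applied to the taxicab problem: naive intuition follows the
# witness and says ~80% Blue; anchoring on the 15% base rate says otherwise.

def posterior(prior_blue, witness_accuracy):
    """P(cab is Blue | witness says Blue) via Bayes' rule."""
    p_says_blue_given_blue = witness_accuracy          # true identification
    p_says_blue_given_green = 1 - witness_accuracy     # mistaken identification
    numerator = p_says_blue_given_blue * prior_blue
    denominator = numerator + p_says_blue_given_green * (1 - prior_blue)
    return numerator / denominator

print(round(posterior(0.15, 0.80), 2))  # 0.41 -- far below the intuitive 0.80
```

Only when the base rate is 50/50 does the intuitive answer (the witness’s 80% accuracy) actually match the posterior.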

Part 3 describes the limitations of our minds and our overconfidence in what we believe we know. Hindsight bias leads observers to assess the quality of a decision not by whether the process was sound but by whether its outcome was good or bad. Outcome bias: when outcomes are bad, the agents get blamed for not seeing the writing on the wall. The illusion of validity is acting as if each of our specific predictions were valid – for example, the author’s predictions of soldiers’ future performance, or fund managers in finance. Experts are often inferior to algorithms because experts try to be clever and consider complex combinations of features, and humans are inconsistent in making summary judgments of complex information. Only trust experts who meet the two basic conditions for acquiring a skill: an environment sufficiently regular to be predictable, and an opportunity to learn those regularities through prolonged practice and feedback. The planning fallacy describes plans and forecasts that are unrealistically close to best-case scenarios; they can be improved by consulting the statistics of similar cases. We tend to exaggerate our forecasting ability, which fosters optimistic overconfidence – though this optimistic bias is also what drives the entrepreneurs who drive our economy. The “premortem” procedure: when a group has almost come to an important decision but has not formally committed itself, gather people knowledgeable about the decision and ask, “Imagine that we are a year into the future. We implemented the plan as it now exists. The outcome was a disaster. Please take 5 to 10 minutes to write a brief history of that disaster.” The premortem legitimizes doubts rather than suppressing a healthy amount of doubt and paranoia.

Part 4 challenges the rationality assumption in standard economics. The author debunks Bernoulli’s utility theory by arguing that it doesn’t take the reference point into account: going from $4M to $2M feels very different from going from $1M to $2M, even though the end state is the same. A person becomes risk seeking when all his options are bad – the basis for the author’s Prospect Theory. The endowment effect explains why you tend to value what you own more than what others own: your reference point differs, and losses loom larger than gains – unlike for a trader. The possibility effect places a heavy weight on the move from 0% to 5% probability, and the certainty effect on the move from 95% to 100%. In the author’s fourfold pattern, people tend to be risk averse when there is a high probability of gain or a low probability of loss (insurance), but risk seeking when there is a high probability of loss or a low probability of gain (lottery tickets). People tend to over-estimate the probability of rare events and overweight them in decisions. The emotion of regret is felt more strongly for an outcome produced by action than for the same outcome produced by inaction; by anticipating regret before your decision, you may minimize it.
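The asymmetry between losses and gains can be sketched with Prospect Theory’s value function. The parameter values below (curvature 0.88, loss-aversion coefficient 2.25) are the estimates from Tversky and Kahneman’s later work; the `value` helper itself is just my illustration, not the author’s code:

```python
# Prospect Theory value function: defined over gains and losses relative
# to a reference point (not over total wealth), concave for gains, convex
# for losses, and steeper for losses ("losses loom larger than gains").

def value(x, alpha=0.88, lam=2.25):
    """Subjective value of a gain or loss x relative to the reference point."""
    if x >= 0:
        return x ** alpha          # diminishing sensitivity to gains
    return -lam * (-x) ** alpha    # losses weighted ~2.25x more heavily

# Losing $100 hurts more than gaining $100 feels good:
print(round(value(100), 1))   # 57.5
print(round(value(-100), 1))  # -129.5
```

The same curve explains the endowment effect: giving up an owned item registers as a loss, so the asking price exceeds the buying price.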

Part 5 describes the two selves: the experiencing self and the remembering self.
When it comes to pain, two findings stand out. The peak-end rule: the worst pain and the pain at the end dominate the memory. Duration neglect: the duration of a procedure has no effect whatsoever on the rating of its total pain. Is it the experience or the memory of the experience that matters? We often confuse the two – a cognitive illusion. We would prefer pain to be short and pleasure long, but our memory, a function of System 1, has evolved to remember the peak (of pain or pleasure) and the end, and to neglect duration. Of the experiencing self and the remembering self, we seem to care more about the latter, as demonstrated in the “amnesia vacation” example.
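The peak-end rule and duration neglect can be illustrated with a toy calculation modeled on the book’s cold-hand experiment. Averaging peak and end is a simplification often used in the literature, not an exact law, and the code is my own sketch:

```python
# Peak-end rule: remembered pain tracks the average of the worst moment
# and the final moment; total duration is neglected.

def remembered_pain(episode):
    """Peak-end approximation of how a pain episode is remembered (0-10 scale)."""
    return (max(episode) + episode[-1]) / 2

short_trial = [2, 5, 8]           # ends at its worst moment
long_trial = [2, 5, 8, 7, 6, 4]   # same peak, longer, but a milder ending

# The longer trial involves strictly more total pain...
print(sum(long_trial) > sum(short_trial))                          # True
# ...yet is remembered as less painful, so people choose to repeat it.
print(remembered_pain(long_trial) < remembered_pain(short_trial))  # True
```

This is the experiencing self vs. remembering self conflict in miniature: the experiencing self suffers more in the long trial, but the remembering self prefers it.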
