Book Review – Thinking Fast & Slow


Thinking Fast and Slow by Daniel Kahneman is a dense read. It’s one of the tougher things I’ve made it through recently, but it’s well worth it if you have the time and inclination. Thinking Fast and Slow is a master class on the human brain: its systems, its strengths and weaknesses, and how our own programming can work against us.

As a Nobel Prize winner in economics and a giant of behavioral research, Kahneman is uniquely qualified to help us explore, with dozens of examples, exactly how our minds are not the logical machines we think they are. At the very least, being aware of these errors gives you a chance at controlling or avoiding them.

The title of the book comes largely from Kahneman’s breakdown of the human mind into the intuitive, unconscious, and fast System 1 and the methodical, logical, but slow and lazy System 2. These basic ideas can transform how you think about thinking.

If you are a fan of Charlie Munger and mental models, this is a must-read. It will certainly help your investing, and it is the rigorous counterpart to Malcolm Gladwell’s fun but incomplete (and often inaccurate) book Blink.

Michael Lewis said of the book in Vanity Fair: “[Thinking Fast and Slow] is wonderful, of course. To anyone with the slightest interest in the workings of his own mind, it is so rich and fascinating that any summary would seem absurd.”

My Notes:

Systematic errors (biases) recur predictably.

  • Halo Effect – A good-looking and confident person’s words will be judged more favorably than the same words from someone unattractive.
  • Availability Heuristic – People tend to assess the relative importance of issues by the ease with which they are retrieved from memory, which is largely influenced by what the media talks about.
  • Intuitive Heuristic – When faced with a difficult question, we often answer an easier one instead, WITHOUT NOTICING the substitution.

System 1

Automatic and quick, with little or no effort and no sense of voluntary control. It originates the impressions, feelings, and associations that are the main source of System 2’s deliberate choices. It recognizes patterns and makes snap judgements.

System 2

Allocates attention, makes mental effort and complex computations. System 2 is largely lazy and will defer to System 1 unless forced. System 2 can’t function when distracted and it must focus. The division of labor is highly efficient between the systems and works well most of the time.

System 2 is in charge of self-control.

I love the example of the basketball players because I did that experiment in high school and have always remembered it. I’m putting the video below, and you can let me know what you see.

If you saw the gorilla, congratulations. If not, then like a shocking number of people (myself included), a gorilla walked across the screen and you never noticed. Intense focus can make us blind: your brain was so locked in on the task that it failed to see the obvious. It teaches us that we can be blind to the obvious, and that we are blind to our blindness.

Attention and Effort – Pupils are sensitive indicators of mental effort and dilate when your System 2 is doing work. The harder the problem, the more they dilate. Dilated pupils subconsciously make a person more attractive.

We only have a certain amount of attention, and our brain allocates it on a second-by-second basis. If the bulk of it is consumed by a difficult task, it isn’t getting allocated elsewhere.

System 2 is the only one that can follow rules, compare objects on several attributes, and make deliberate choices. System 1 detects simple relationships and excels at integrating information about one thing.

System 2 is Lazy – You can think and walk at the same time, but you tend to stop walking if a task suddenly demands a large share of your short-term memory. You also can’t think as well if you have to move faster than your natural pace.

Self-control and deliberate thought draw on the same limited budget of effort.

It’s possible to exert considerable effort without willpower – the modern term for it is “flow,” a state of effortless concentration so deep that we lose our sense of time, ourselves, and our problems.

Ego Depletion – An effort of will or self-control is tiring. If you force yourself to do something, you are less able to force yourself to do the next thing. This mental energy is more than a metaphor: studies have found that acts of self-control actually deplete glucose.

Intelligence is not only the ability to reason; it is also the ability to find relevant material in memory and to deploy attention when needed. In fact, in studies, children who exhibited more self-control had substantially higher scores on tests of intelligence.

The Associative Machine – The book’s example is that if you put “banana” and “vomit” next to each other on the page, your brain instantly creates a brief, barely noticeable dislike of bananas. Your body also reacts to the words immediately, raising your heart rate, dilating your pupils, and turning on your sweat glands – all without your control.

Priming Effect – A prior idea or word sets your brain up to associate the next idea or word with it. Example – If you see EAT and then see SO_P, you think SOUP. If it had been WASH, you would have thought SOAP. You were primed.

Reciprocal Priming – If primed to think of old age, it makes you feel and act old, reinforcing the thought. Another example would be that smiling tends to make you feel happy or things seem funnier. It also would lend a lot of credence to something I’m a firm believer in – Fake it till you Make it.

  • Money-primed people are more selfish. Money primes individualism.
  • Mortality primes people to accept authoritarian ideas.
  • The Lady Macbeth Effect – Feeling you’ve done something morally dirty makes you want to clean your physical body.

The Mere Exposure Effect – The link between repeated exposure to an arbitrary stimulus and the mild affection people eventually develop for it. This doesn’t even have to be conscious.

You perceive things more easily when you’ve seen them before. Familiarity increases clarity. When in doubt, people choose the familiar as truth because it seems more likely than the unfamiliar. Familiarity is not easily distinguished from truth. You can see this used a lot in politics.

Surprise is the most sensitive indication of how we understand our world and what we expect from it. A single occurrence of something makes any further occurrence less of a surprise, even when it should still surprise us.

Humans search for causality even when none exists. We find patterns where there aren’t any.

System 1 immediately jumps to conclusions without any effort on your part. This is an efficient use of the system, but dangerous in unfamiliar environments. When System 2 is otherwise engaged, we will believe almost anything.

Confirmation Bias – We look for evidence to confirm what we already believe and disregard information that does not fit that narrative.

To avoid the halo or horns effect, decorrelate error. Don’t let the first piece of data influence the next. To avoid the same problem in a meeting, have all participants write down a brief summary of their position before the meeting.
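As a rough sketch of why decorrelating error matters (my own illustration, not from the book, with made-up numbers): when each person’s estimate errs independently, averaging the group cancels much of the noise, but when everyone anchors on the first opinion they hear, the shared error never averages out.

```python
import random

TRUE_VALUE = 100.0
random.seed(1)

def independent_estimates(n):
    # Each person errs on their own; the errors are independent.
    return [TRUE_VALUE + random.gauss(0, 10) for _ in range(n)]

def anchored_estimates(n):
    # Everyone hears the first guess and drifts toward it,
    # so the group shares a common error that never cancels.
    anchor = TRUE_VALUE + random.gauss(0, 10)
    return [0.7 * anchor + 0.3 * (TRUE_VALUE + random.gauss(0, 10)) for _ in range(n)]

def average_group_error(make_group, trials=2000, group_size=8):
    errors = []
    for _ in range(trials):
        group = make_group(group_size)
        group_mean = sum(group) / group_size
        errors.append(abs(group_mean - TRUE_VALUE))
    return sum(errors) / trials

print("independent estimates, average group error:", round(average_group_error(independent_estimates), 2))
print("anchored estimates,    average group error:", round(average_group_error(anchored_estimates), 2))
```

The independent group lands roughly twice as close to the true value as the anchored one, which is the whole point of collecting opinions before anyone speaks.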

Consistency of information makes a good story, not its completeness.

Answering Easier Questions – If a satisfactory answer to a hard question cannot be found quickly, System 1 will find a related question that is easier and answer that instead. Just as important, most people will not realize they have made the substitution.

Law of Small Numbers – Humans are bad at understanding statistics. Small samples naturally lend themselves to extreme outcomes (good or bad). You can see this in tiny schools posting both the best and the worst results, in cancer-rate clusters in small towns, or in the illusion of the hot hand in basketball.
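Here’s a quick simulation of that idea (my own sketch, not from the book): draw “schools” of different sizes from the exact same population, and the smallest ones produce both the best and the worst results, purely from sampling noise.

```python
import random

random.seed(0)

def extreme_pass_rates(num_schools, school_size):
    # Every student has the same 50% chance of passing, so any
    # difference between schools is pure sampling noise.
    rates = []
    for _ in range(num_schools):
        passes = sum(random.random() < 0.5 for _ in range(school_size))
        rates.append(passes / school_size)
    return min(rates), max(rates)

for size in (10, 100, 1000):
    worst, best = extreme_pass_rates(500, size)
    print(f"school size {size:4d}: worst {worst:.0%}, best {best:.0%}")
```

The 10-student schools span something like 10% to 90%, while the 1,000-student schools all cluster near 50%, even though every school is identical underneath.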

The Anchoring Effect – When people consider a particular value for an unknown quantity before estimating that quantity. In other words, you grab the first thing provided and base all your estimates off that.

Availability Bias – We judge the frequency of something by the ease at which examples and instances come to mind. Basically, we exaggerate what we remember and minimize what we don’t.

The keys to Bayesian Reasoning – (1) anchor your judgment of the probability of an outcome to a plausible base rate, and (2) question the diagnosticity of your evidence.
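A worked example of those two keys, with hypothetical numbers of my own: Bayes’ rule combines the base rate with how diagnostic the evidence really is, and a low base rate can swamp even impressive-looking evidence.

```python
def posterior(base_rate, true_positive_rate, false_positive_rate):
    # Bayes' rule: P(hypothesis | evidence), combining the base rate
    # with how diagnostic the evidence is.
    p_evidence = (base_rate * true_positive_rate
                  + (1 - base_rate) * false_positive_rate)
    return base_rate * true_positive_rate / p_evidence

# Hypothetical numbers: 1 in 100 startups is a big winner (base rate),
# and a glowing pitch shows up for 90% of winners but also for 20% of the rest.
print(round(posterior(0.01, 0.90, 0.20), 3))  # 0.043 -- far lower than the pitch "feels"
```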

Adding details to scenarios makes them more persuasive but less likely to be true (statistically).
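The statistics behind that note is the conjunction rule: every added detail is one more condition that has to be true, so the joint probability can only shrink even as the story feels more plausible. A tiny illustration with made-up numbers:

```python
# Each added detail is another condition that must also be true, so the
# joint probability can only shrink (probabilities are made up for illustration).
p_recession = 0.30                  # "there will be a recession next year"
p_oil_shock_given_recession = 0.40  # "...triggered by an oil shock"
p_detailed_story = p_recession * p_oil_shock_given_recession
print(p_detailed_story)  # 0.12 -- more vivid than plain "recession", but less likely
```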

In Skill Training – Rewards for improved performance work better than punishment for mistakes.

Humans go to great lengths to invent reasons and stories for what is simply the natural tendency to revert to the mean.
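Here’s a small sketch (my own, not from the book) of what reversion to the mean looks like with no story at all: if performance is skill plus luck, the luck doesn’t carry over, so the top performers in round one land closer to average in round two all by themselves.

```python
import random

random.seed(2)

# Performance = skill + luck; skill persists between rounds, luck does not.
skill = [random.gauss(0, 1) for _ in range(10_000)]
round1 = [s + random.gauss(0, 1) for s in skill]
round2 = [s + random.gauss(0, 1) for s in skill]

# Take the top 1% of performers from round 1 and watch them in round 2.
top = sorted(range(len(round1)), key=lambda i: round1[i], reverse=True)[:100]
avg_round1 = sum(round1[i] for i in top) / len(top)
avg_round2 = sum(round2[i] for i in top) / len(top)
print(f"top group, round 1 average: {avg_round1:.2f}")
print(f"same group, round 2 average: {avg_round2:.2f}  (closer to the mean, no story required)")
```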

Narrative Fallacy – Flawed stories of the past shape our views of the world and our expectations for the future. A compelling narrative fosters an illusion of inevitability. Human minds don’t deal well with non-events.

Hindsight Bias – The “I knew it all along” effect. It leads decision makers to assess the quality of a decision not by the process but by the outcome. Fear of a bad outcome drives people to follow strict procedures to avoid blame. Our tendency to construct and believe coherent narratives of the past makes it difficult to accept the limits of our forecasting ability.

The Planning Fallacy – Plans and forecasts that are unrealistically close to best-case scenarios and could be improved by consulting the statistics of similar cases (we tend to be overly optimistic). This can cause us to take on significant risks we shouldn’t, but it also encourages us to be persistent.

Something I took a lot of value from was the idea of “the premortem.” When an organization has come to an important decision but has not yet formally committed itself, gather your informed decision makers and have them imagine that the project has been a spectacular failure, then write a brief history of what happened. This overcomes groupthink and unleashes the imagination to find all the problems that had not occurred to you.

Losses always loom larger than gains; our reaction to outcomes is not symmetric. Humans don’t evaluate gambles as a simple probability-weighted average of outcomes.
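To make that concrete, here’s a sketch using a loss-aversion factor of roughly 2, a commonly cited ballpark rather than an exact figure from the book: a coin flip that wins $120 or loses $100 has a positive expected value, but if losses are felt about twice as strongly as gains, the gamble feels like a loser.

```python
def expected_value(outcomes):
    # The rational weighted average: sum of probability * payoff.
    return sum(p * x for p, x in outcomes)

def felt_value(outcomes, loss_aversion=2.0):
    # Weight losses roughly twice as heavily as gains (an assumed factor).
    return sum(p * (x if x >= 0 else loss_aversion * x) for p, x in outcomes)

gamble = [(0.5, 120), (0.5, -100)]   # coin flip: win $120 or lose $100
print(expected_value(gamble))  # +10.0 -> the weighted average says take the bet
print(felt_value(gamble))      # -40.0 -> the loss looms larger, so it feels like a bad bet
```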

The Endowment Effect – Something you have is assigned a greater value than you would assign if you didn’t have it.

Goals are reference points. We are driven more strongly to avoid losses than achieve gains. The aversion to the failure of not reaching the goal is much stronger than the desire to exceed it.

The Disposition Effect – our massive preference to sell winners and hold losers.

The Sunk-Cost Fallacy – Our poor mental accounting in which we throw good money after bad, rather than cutting our losses. This fallacy keeps people in jobs, marriages and other things they should have long left.

A quirk of human psychology: we feel a stronger emotional reaction (including regret) to an outcome produced by action than to the same outcome produced by inaction.

I found the idea that the experiencing self and the remembering self are two different things fascinating, but I’m not quite sure how to even take a note about it. It may require a full blog post, so I’ll just say it was worth the price of admission.

Focusing Illusion – Nothing in life is as important as you think it is when you are thinking about it.

Conclusion

Thinking Fast and Slow is an incredible book and if you are someone who cares about human psychology, decision making and improving those processes, it’s a must read.

