Thinking, Fast and Slow | 11 Key Learnings from This Book

Prashant Aggarwal
8 min read · Sep 28, 2021

Thinking, Fast and Slow describes the two main modes of thinking our brains use. Much like a computer, the brain comprises systems. System 1 is quick and intuitive; System 2 is slow, deliberate, and rational. Daniel Kahneman encourages us to reduce our dependence on System 1, which is the primary source of snap judgments and errors, and to tap into the more thoughtful System 2 more often. Alongside this suggestion, Kahneman explains why and how we make the choices we do.

1. System 1 Is Innate

There are two different systems connected with our thinking processes. For each system, Kahneman lays out the fundamental tasks and decision-making methods that accompany it.

For instance, you do not have to think about what the capital of England is. Over time, you have formed an automatic association with the question "What is the capital of England?" System 1 also handles learned skills such as reading an article or a book and riding a bicycle, and it guides how you behave in everyday social situations.

2. System 2 Can Manage Parts of System 1

System 2 is younger; it developed over the last few millennia and has become more and more crucial with modernization and changing priorities. Most activities of the second system demand focused attention, for example, giving someone your phone number. The operations of System 2 are associated with the experience of agency, choice, and concentration. When we think about ourselves, we identify with System 2: the conscious, rational self that holds beliefs, makes decisions, and chooses what to think about and what to do.

3. The Two Systems Work Together

Another instance of the two systems working together is playing a sport. Certain aspects of any sport involve automatic actions. Think about a tennis match. Tennis requires running, a natural human skill controlled by System 1. With practice, hitting the ball can also become a System 1 task, but certain strokes and tactical choices still require System 2. Thus the two systems complement each other when you play a sport like tennis.

4. Heuristics As Mental Shortcuts

5. The Biases We Create in Our Own Minds

Kahneman discusses seven common biases and heuristics that can cause poor decision-making:

  1. Law of Small Numbers: We hold strong beliefs that small samples resemble the population from which they are drawn. Many people underestimate the variability inherent in small sample sizes; in other words, they overestimate what a small study can show. Suppose a drug works for 80% of patients: how many of a sample of five will respond? Intuition says four, but in fact there is only about a 41% probability that exactly four of the five will respond (see the first sketch after this list).
  2. Anchoring: When people make decisions, they rely too heavily on prior information, or on the first piece of information they encounter. This is called anchoring bias. If you first look at an item priced at $1,200 and then see another for $100, the second one seems cheap. If you had only seen the $100 item, you would not necessarily think of it as cheap. The anchor, the first price you saw, had an unintentional influence on your judgment.
  3. Priming: Our brains work through associations between words, ideas, and objects, so we are prone to being primed: exposure to one idea steers our subsequent choices in a particular direction. Kahneman notes that priming is the basis for nudges and for advertisements built on positive imagery. Nike, for instance, primes feelings of achievement and exercise.
  4. Cognitive ease: Anything that is easy for the mind to process is more likely to be accepted as true. Ease comes from repetition of an idea, a clear presentation, a previously primed idea, and even one's mood. In the end, repeating a lie can lead people to believe it, even if they once knew it was false, because the notion becomes familiar and simple to process.
  5. Making assumptions without thinking: Kahneman describes System 1 as a machine for jumping to conclusions, operating on the principle "What you see is all there is." In reality, System 1 draws conclusions from easily accessible and sometimes inaccurate data, and once those conclusions are drawn, we rarely question them. The halo effect, confirmation bias, framing effects, and base-rate neglect are all real-world manifestations of this tendency.
  • The halo effect is assigning extra positive attributes to a person or thing based on a single positive impression, for example, believing that someone is smarter than they really are because they are good-looking.
  • Confirmation bias is holding a belief and searching for information that confirms it while avoiding information that contradicts it. A detective might settle on a suspect early in a case and then look only for confirming rather than disconfirming evidence. Filter bubbles, or "algorithmic editing," amplify confirmation bias on social media: the algorithms present users with only the posts and information they are likely to agree with, instead of exposing them to different viewpoints.
  • Framing effects describe how the way a problem is presented shapes people's behavior. Individuals tend to be cautious when a choice is framed positively and more prone to risk when it is framed negatively. In one study, Ph.D. students were far more likely to register early when a penalty for late registration was imposed; the early-registration rate fell to 67% when the same price difference was offered as a discount for registering early.
  • Finally, base-rate neglect (the base-rate fallacy) is our tendency to focus on individuating information rather than base-rate information. Individuating information is specific to a particular person or event; base-rate information is general statistical data. We attach more significance to the specific data and, in most cases, ignore the base rate completely, basing our judgments on individual traits rather than on how common something actually is.
  • The fallacy bites hardest in situations where false positives outnumber genuine positives. For instance, suppose 100 out of 1,000 people test positive for an infection, but only 20 actually have the disease: the other 80 results are false positives. The chance that a positive result is genuine depends on factors such as the accuracy of the test and the population being tested. When the prevalence, the fraction of the population with the condition, is lower than the test's false-positive rate, even a test with a low chance of producing a false positive in any single case will generate more false positives than genuine positives overall (see the second sketch after this list).
  • Here is another example: even if a student in your chemistry elective looks and behaves like a stereotypical doctor, the likelihood that they are studying medicine is slim, because medical programs typically admit only about 100 people, in contrast to the thousands of students in faculties such as Business and Engineering.
  • It is fine to make quick judgments about individuals from specific information, but we should not let that information erase the statistical base rates.
  6. Availability: This bias occurs when we rely on a significant, recent, or especially memorable event to reach our conclusions. People who operate mostly in System 1 are more susceptible to the availability bias than others. A typical illustration is watching the news, hearing that there has been a massive plane crash in another country, and overestimating the risk of flying.
  7. The Sunk-Cost Fallacy: This fallacy occurs when people keep committing resources to a losing proposition despite better options being available. Investors who let the price at which they bought a stock decide when to sell it fall victim to it, and the tendency to sell winning stocks too early while holding losing stocks too long has been studied extensively. Another instance is staying in a relationship long after it has become emotionally destructive. People are afraid to start over because it implies that everything they did before was worthless, yet this anxiety is often more damaging than letting go; it is also a reason people get hooked on gambling. To overcome this fallacy, avoid escalating your commitment to something that is unlikely to succeed.
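
The 41% figure in the law-of-small-numbers item above can be checked directly. Below is a minimal Python sketch; the 80% success rate is an assumption (the example only says the drug works for "the majority" of patients), chosen because it reproduces the article's 41% result.

```python
from math import comb

def binom_pmf(k: int, n: int, p: float) -> float:
    """Probability of exactly k successes in n independent trials."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Assumed success rate: 80% (the article only says "the majority").
p_success = 0.80
n_patients = 5

print(f"P(exactly 4 of 5 respond) = {binom_pmf(4, n_patients, p_success):.4f}")
# -> 0.4096, i.e. the ~41% from the example above

print(f"P(all 5 respond) = {binom_pmf(5, n_patients, p_success):.4f}")
# -> 0.3277: even a highly effective drug often looks imperfect in tiny samples
```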
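The base-rate example can be worked through the same way. This second sketch uses only the numbers from the article (1,000 people tested, 20 infected, 100 positive results); the 100% test sensitivity and the roughly 8.2% false-positive rate are assumptions derived from those figures, not stated in the article.

```python
# Numbers taken from the base-rate example above.
population = 1_000
infected = 20          # people who actually have the disease
positives = 100        # total positive test results
false_positives = positives - infected  # 80

# Direct reading: what fraction of positive results are genuine?
print(f"P(disease | positive) = {infected / positives:.0%}")  # 20%

# Same answer via Bayes' theorem. Assumptions (derived, not stated in
# the article): the test catches every true case (sensitivity = 1.0),
# so all 80 false positives come from the 980 healthy people.
prevalence = infected / population                    # 2% base rate
fp_rate = false_positives / (population - infected)   # ~8.2%
p_positive = prevalence * 1.0 + (1 - prevalence) * fp_rate
print(f"P(disease | positive) = {prevalence / p_positive:.0%}")  # 20%
```

A positive test sounds alarming, yet four times out of five it is a false alarm, exactly the kind of result base-rate neglect hides from intuition.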

6. Regression to the Mean

  • The illusion of understanding: We create narratives to help us make sense of the world, and we find causality where there is none.
  • The illusion of validity: Pundits, stock pickers, and other experts overestimate the value of their expertise; their forecasts often do no better than chance.
  • Expert intuition: Simple algorithms applied with discipline usually outperform experts and their intuition.
  • Planning fallacy: People underestimate the time, cost, and risk of future actions, and overestimate their benefits, simply because they have a plan.
  • Optimism and the entrepreneurial delusion: Most people are overconfident, routinely overlook competitors, and believe they are better than average.

7. Hindsight Strongly Influences Decision-Making

8. Risk Aversion

9. Loss Aversion

This kind of risk-taking frequently turns manageable failures into catastrophes. Because defeat is so hard to accept, the losing side in a war typically keeps fighting long past the point at which the other side's victory is assured.

10. Don’t Believe That Your Preferences Reflect Your Personal Interests

11. Our Memories Influence Our Choices

Memories shape our decisions. And, worryingly, they can be wrong. This inconsistency is built into the design of our minds. We have strong preferences about the duration of our experiences of pleasure and pain: we want pain to be brief and satisfaction to last. But memory, a part of System 1, has evolved to record the most intense moment of an experience of pleasure or pain and how it ends, not how long it lasts. A memory that ignores duration cannot serve our preference for long pleasures and short pains.

In a Nutshell

Thinking, Fast and Slow outlines the way humans think. We run on two different systems that support each other. The problem arises when we depend too much on our fast, impulsive System 1. This dependence produces a variety of biases that adversely affect decision-making. The key is to recognize the sources of these biases and then engage the analytical System 2 accordingly.

If you found my post helpful, do share it with your friends and colleagues. If you have any feedback or questions, you may leave a comment below.

Originally published at https://prashantaggarwal.com on September 28, 2021.
