Thinking, Fast and Slow PDF by Daniel Kahneman

Download Thinking, Fast and Slow PDF book free online by Daniel Kahneman. In the highly anticipated Thinking, Fast and Slow, Daniel Kahneman takes us on a groundbreaking tour of the mind and explains the two systems that drive the way we think. System 1 is fast, intuitive, and emotional; System 2 is slower, more deliberative, and more logical. Kahneman exposes the extraordinary capabilities, as well as the faults and biases, of fast thinking, and reveals the pervasive influence of intuitive impressions on our thoughts and behavior.

The impact of loss aversion and overconfidence on corporate strategies, the difficulties of predicting what will make us happy in the future, the challenges of properly framing risks at work and at home, the profound effect of cognitive biases on everything from playing the stock market to planning the next vacation—each of these can be understood only by knowing how the two systems work together to shape our judgments and decisions.

Engaging the reader in a lively conversation about how we think, Kahneman reveals where we can and cannot trust our intuitions and how we can tap into the benefits of slow thinking. He offers practical and enlightening insights into how choices are made in both our business and our personal lives—and how we can use different techniques to guard against the mental glitches that often get us into trouble. Thinking, Fast and Slow will transform the way you think about thinking.

Summary of Thinking, Fast and Slow PDF

Our cognitive processes are driven by two different systems. Kahneman summarizes the major roles and decision-making processes associated with each system.

System 1 encompasses the abilities that are innate and generally shared with other animals. Each of us, for example, is born with the ability to recognize objects, focus attention on crucial stimuli, and fear things associated with death or disease. System 1 is also in charge of mental activities that have become almost instinctive as they have grown faster and more automatic. Through repeated practice, such tasks usually migrate into system 1. Certain pieces of information will come naturally to you. For example, you don't even need to think about what England's capital is; over time you have developed an automatic association with the question, "What is the capital of England?" In addition to intuitive knowledge, System 1 handles learned skills such as reading a book, riding a bike, and responding in social situations.

There are also some actions that are usually classified as system 1 but can shift into system 2. This overlap occurs when you make a conscious effort to engage with the action. Chewing, for example, is usually a system 1 activity. Suppose, though, that you realize you should be chewing your food more thoroughly than you have been. Some of your chewing will then move into the effortful system 2.

Both systems 1 and 2 are closely linked to attention, and they work together. System 1 will, for example, control your involuntary response to a loud sound. Your system 2 will then take over, devoting voluntary attention to the sound and reasoning about its origin.

System 1 acts as a filter through which you interpret your experiences. It is the method you employ to make intuitive decisions. Because it is evolutionarily primitive, it is by far the older of the two systems. System 1 is also impulsive and unconscious. Even if you don't think system 1 has much of an impact on your life, it does: it influences many of your decisions and judgments.

System 2 Can Control Parts of System 1 (StoryShot #2)
System 2 is made up of a variety of activities. Each of these processes demands attention and is disrupted when attention is diverted. Your performance in these activities will suffer if you don't pay attention. System 2 can also significantly alter the way system 1 operates. Detection, for example, is usually a system 1 action, but you can set system 2 the task of looking for a specific individual in a crowd. This priming by system 2 improves the performance of system 1, making it more probable that you will find that person. It is the same method we use when doing a word search.

System 2 tasks demand more effort than system 1 activities because they require attention. It is also difficult to perform multiple system 2 activities at the same time. The only activities that can be combined are those that require little effort, such as conversing while driving, although conversing while overtaking a truck on a narrow road is not a good idea. In short, the more attention a task demands, the less feasible it is to work on another system 2 task at the same time.

System 2 is newer, having only existed for a few thousand years. As we adjust to modernization and shifting priorities, System 2 has become increasingly critical. The majority of the operations in the second system, such as giving someone your phone number, necessitate conscious attention. System 2’s functions are frequently linked to subjective feelings of agency, choice, and attentiveness. We identify with System 2 when we think about ourselves. The conscious, thinking self is the one who holds beliefs, makes decisions, and decides what to think about and do.

StoryShot #3: The Two Systems Help Each Other
Based on the descriptions of the two systems, it is easy to imagine that they operate one after the other. According to Kahneman, however, the two systems are genuinely interconnected: practically every task draws on both, and they are mutually beneficial. Emotions (system 1), for example, are critical in supporting logical reasoning (system 2); they help us make more meaningful and successful decisions.

Another time the two systems operate together is when we play sports. Certain aspects of the game will be automatic. Consider a tennis match. Tennis makes use of running, a natural human skill regulated by system 1. Through practice, hitting a ball can also become a system 1 activity. However, specific strokes or tactical considerations will always require system 2. When playing a sport like tennis, the two systems therefore complement each other.

When people rely too much on system 1 because it involves less work, problems can occur. Activities that aren’t part of your regular routine can cause additional problems. This is the point at which systems 1 and 2 will clash.

Heuristics as Mental Shortcuts (StoryShot #4)
The book's second section introduces the concept of heuristics. Heuristics are mental shortcuts that we develop when making decisions. We are continually looking for the most efficient way to solve problems, so heuristics are extremely useful for conserving energy in our daily lives. Our heuristics, for example, allow us to apply previous knowledge to slightly different situations automatically. Although heuristics can be beneficial, it is important to remember that they can also be the root of prejudice. For example, you might have had a bad experience with someone from a particular ethnic group. If you rely exclusively on your heuristics, you may stereotype other people from the same group. Heuristics can thus produce cognitive biases, systematic errors in thinking, poor decisions, and misreadings of events.

The Biases We Create in Our Own Minds (StoryShot #5)
Kahneman introduces eight common biases and heuristics that can lead to poor decision-making:

The law of small numbers: This describes our biased perception that small samples are highly representative of the populations from which they are drawn. We tend to underestimate the variability in small samples; to put it another way, people exaggerate the reliability of a small study. Assume that a medicine is effective in 80% of patients. If five patients are treated, how many will respond? In fact, there is only a 41% probability that exactly four of the five will respond (the sketch after this list checks the arithmetic).
Anchoring: When making decisions, people tend to rely on pre-existing information or the first information they encounter; this is called anchoring bias. If you first see a T-shirt that costs $1,200 and then see one that costs $100, you are likely to regard the second shirt as cheap. You would not consider the $100 shirt cheap if you saw it on its own. Your judgment was swayed by the anchor: the first price you saw.
Priming: Our minds function by associating words and objects. As a result, we are vulnerable to priming. Anything might elicit a common association, leading us to make decisions in a certain manner. According to Kahneman, priming is the foundation for nudges and positive imagery advertising. Nike, for example, sets the stage for feelings of accomplishment and activity. Consumers are likely to think about Nike products when starting a new sport or maintaining their health. Nike promotes professional athletes and utilizes slogans like “Just Do It” to highlight their achievements and dedication. Here’s another illustration: If a restaurant owner has too much Italian wine on hand, playing Italian music in the background can encourage people to buy it.
Cognitive ease: What is easier for System 2 to believe is more likely to be believed. Repetition of an idea, clear presentation, a primed idea, and even one's own good mood all contribute to ease. It turns out that even when people know a claim is false, they may accept it because the idea has become familiar and cognitively easy to absorb. An example is someone surrounded by people who believe and repeat fake news: even though the evidence says the belief is untrue, the ease with which it can now be processed makes it much easier to accept.
Making snap judgments: According to Kahneman, system 1 functions like a machine that jumps to conclusions. These conclusions rest on the principle "what you see is all there is": system 1 makes decisions based on easily available, but sometimes incorrect, data. Once such conclusions are reached, we tend to stick with them. In practice, halo effects, confirmation bias, framing effects, and base-rate neglect all stem from jumping to conclusions.
The halo effect occurs when you attribute additional positive characteristics to a person or object based on a single positive impression, for example, believing a person is smarter than they are simply because they are attractive.
Confirmation bias arises when you have a strong view and search out evidence to back it up. You also dismiss material that contradicts your beliefs. A detective, for example, may identify a suspect early in the investigation but only seek confirming evidence rather than disproving evidence. Confirmation bias in social media is amplified by filter bubbles or “algorithmic editing.” The algorithms achieve this by displaying only material and messages that the user is likely to agree with, rather than exposing them to competing viewpoints.
Framing effects are about how the context in which a choice is presented can influence people's behavior. For example, people tend to avoid risk when a choice is framed positively and to seek risk when it is framed negatively. In one study, when a late-registration penalty was introduced, 93% of PhD students registered early; when the same difference was presented as a discount for early registration, the figure dropped to 67%.
Finally, base-rate neglect, also known as the base-rate fallacy, refers to our tendency to focus on individuating information rather than base-rate information. Individuating information describes a specific person or event; base-rate information is objective and statistical. We tend to give too much weight to the specific information and ignore the base rate, so we draw conclusions from individual traits rather than from how often something occurs in general. The base-rate fallacy is illustrated by the false-positive puzzle: there are times when false positives outnumber actual positives. For example, 100 people out of 1,000 may test positive for an infection while only 20 of them truly have it, which means 80 of the positive results were false positives (the sketch after this list works through these numbers). Positive results depend on several factors, including the accuracy of the test and the characteristics of the tested population. The prevalence of a condition, the percentage of people who have it, can be lower than the test's false-positive rate. In such a setting, even a test with a very low chance of giving a false positive in any single case will produce more false positives than true positives overall. Here is another illustration: even if a student in your Chemistry elective looks and acts like a typical medical student, the chances that they are studying medicine are small, because medical programs often have only 100 or so students compared with the thousands enrolled in faculties such as Business or Engineering. While it may be tempting to make quick judgments about people based on specific details, we must not let those details completely overshadow the statistical data.
Availability bias arises when we base decisions on a significant event, a recent experience, or anything that is especially vivid in our minds. People guided mainly by System 1 are more affected by it. One example is hearing news of a major plane crash in another country: if you have a flight the following week, you may develop an irrational fear that your plane might crash as well.
The sunk-cost fallacy: This fallacy occurs when people keep putting money into a losing venture even though better options are available. It appears, for example, when investors let the purchase price of a stock dictate when they sell; the tendency to sell winning stocks too soon while holding losing ones far too long has been thoroughly studied. Another example is staying in a long-term relationship that is emotionally draining. People are afraid to start over because it would imply that everything they have invested was for nothing, but that fear is usually more damaging than letting go. The same illusion contributes to gambling addiction. To combat it, avoid escalating your commitment to something that is likely to fail.
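As flagged in the items above, the 41% figure and the false-positive counts can be checked with a few lines of arithmetic. Here is a minimal Python sketch (mine, not from the book) that applies the binomial formula to the medicine example and the raw counts to the testing example:

```python
from math import comb

# Law of small numbers: a medicine works in 80% of patients.
# Probability that exactly 4 of 5 treated patients respond (binomial, n=5, p=0.8).
p_respond = 0.8
n, k = 5, 4
p_four_of_five = comb(n, k) * p_respond**k * (1 - p_respond)**(n - k)
print(f"P(exactly 4 of 5 respond) = {p_four_of_five:.2%}")  # 40.96%, i.e. about 41%

# Base-rate example: 100 of 1,000 people test positive, but only 20 truly have the infection.
tested_positive = 100
true_positives = 20
false_positives = tested_positive - true_positives          # 80 false positives
chance_positive_is_real = true_positives / tested_positive
print(f"False positives: {false_positives}")
print(f"Chance a positive result is real: {chance_positive_is_real:.0%}")  # 20%
```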
Regression to the Mean (StoryShot #6)
Regression to the mean is the statistical tendency for extreme results in a sequence of trials to be followed by results closer to the average. Despite this, humans are prone to treating lucky and unlucky streaks as indicators of future outcomes, for example: "I've lost five consecutive slot machine pulls, so I'm due for a win" (a short simulation after the list below shows why that reasoning fails). According to Kahneman, this belief is linked to a number of mental flaws:

Illusion of comprehension: To make sense of the world, we create narratives; where none exists, we invent one.
Illusion of validity: Pundits, stock pickers, and other specialists develop a disproportionate sense of confidence in their own knowledge.
Expert intuition: Algorithms applied with discipline can often outperform experts and their intuition.
The planning fallacy: People underestimate the time, costs, and risks of their plans while overestimating the benefits, because they judge by the plan itself rather than by how similar projects have actually turned out.
Optimism and the entrepreneurial delusion: Most people are overconfident, overlook the competition, and believe they will outperform the average.
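The slot-machine belief mentioned above can be tested directly. The following Python sketch is an illustration of mine, not from the book, and assumes a 10% payout probability; it simulates many independent pulls and checks the win rate immediately after five consecutive losses:

```python
import random

random.seed(0)
WIN_PROB = 0.10        # assumed payout probability for an illustrative slot machine
PULLS = 1_000_000

wins_after_streak = 0
pulls_after_streak = 0
losing_streak = 0      # consecutive losses seen so far

for _ in range(PULLS):
    win = random.random() < WIN_PROB
    if losing_streak >= 5:          # the previous five pulls were all losses
        pulls_after_streak += 1
        wins_after_streak += win
    losing_streak = 0 if win else losing_streak + 1

print(f"Win rate right after 5 straight losses: {wins_after_streak / pulls_after_streak:.3f}")
# Prints roughly 0.100: a losing streak does not make a win any more likely.
```

Because each pull is independent, the conditional win rate matches the unconditional one; feeling "due for a win" is an illusion.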
StoryShot #7: Hindsight Has a Big Impact on Decision-Making
Drawing several strands together, Daniel Kahneman demonstrates how little we actually know about our past. He highlights hindsight bias, which has a particularly harmful impact on decision-making: hindsight changes the standard by which decisions are judged, shifting the focus from the soundness of the process to the nature of the outcome. According to Kahneman, actions that looked sensible in the moment can appear irresponsibly negligent in hindsight.

A common human weakness is our inability to accurately reconstruct past states of knowledge or beliefs that have since changed. The impact of hindsight bias on how decision-makers are evaluated is significant: it leads observers to judge the quality of a decision by its outcome rather than by whether the process was sound.

Physicians, financial advisers, third-base coaches, CEOs, social workers, diplomats, and politicians are among the decision-makers who suffer most from hindsight. We have a habit of blaming decision-makers when good decisions turn out badly, and we give them too little credit for successful moves that look obvious only after the results are in. The result is a clear outcome bias.

Although hindsight and outcome bias generally promote risk aversion, they also unfairly reward reckless risk-takers, such as entrepreneurs who take wild gambles and win. Lucky leaders are never penalized for having taken too much risk.

Risk Aversion (StoryShot #8)
According to Kahneman, humans are risk averse, meaning we want to avoid risk wherever possible. Most people dislike risk because it carries the chance of receiving the worst possible outcome. As a result, if given the choice between a gamble and a sure amount equal to its expected value, they will choose the sure amount. The expected value is calculated by multiplying each possible outcome by the probability that it will occur and adding the results together. A risk-averse decision-maker will even accept a sure amount that is less than the gamble's expected value; in effect, they pay a premium to avoid risk.
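To make the expected-value calculation concrete, here is a small Python sketch. The 50/50 gamble over $100 or $0 and the sure $45 are assumed numbers for illustration, not figures from the book:

```python
# Expected value: weight each possible outcome by its probability and sum.
gamble = [(0.5, 100.0), (0.5, 0.0)]   # (probability, payoff) pairs, assumed for illustration

expected_value = sum(p * payoff for p, payoff in gamble)
print(f"Expected value of the gamble: ${expected_value:.2f}")   # $50.00

# A risk-averse decision-maker may prefer a sure amount below the expected value.
sure_amount = 45.0
premium = expected_value - sure_amount
print(f"Accepting a sure ${sure_amount:.2f} pays a ${premium:.2f} premium to avoid risk.")
```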

Loss Aversion (StoryShot #9)
Kahneman also introduces the concept of loss aversion. Many of the choices we make in life mix a possible gain with a possible loss, and we must decide whether the risk is worth taking.

Loss aversion refers to the relative strength of two motives: we are driven more strongly to avoid losses than to achieve gains. Gains and losses are judged relative to a reference point, which might be the current situation or a future goal. For example, failing to meet a goal is felt as a loss, while exceeding it is felt as a gain.

The two motives are not equal in strength: the aversion to falling short of a goal is considerably more powerful than the desire to exceed it. As a result, people frequently set short-term goals that they strive to meet but not necessarily to exceed. Once they have achieved their immediate objective, they tend to reduce their efforts, so their outcomes may defy economic rationality.

According to Kahneman, people attach value to gains and losses rather than to overall wealth, and the decision weights they assign to outcomes differ from the actual probabilities. When all their options are bad, people take desperate risks, accepting a high chance of making things worse in exchange for a slim hope of avoiding a large loss. This kind of risk-taking often turns manageable failures into disasters. Because accepting a sure loss is so difficult, the losing side in a war frequently keeps fighting long after the other side's victory is assured.

StoryShot #10: Don’t Take Your Preferences Seriously If They Don’t Match Your Interests
Daniel Kahneman argues that we all assume our choices are made in our own best interests, when often this is not the case. Our memories, which are not always accurate or correctly interpreted, have a large influence on our decisions.

For believers in the rationality of choice, decisions that do not deliver the best possible experience are bad news. We cannot rely on our preferences to accurately reflect our interests, even when those preferences are based on personal experience and recent memories.

StoryShot #11: How Do Our Memories Influence Our Decisions?
Our decisions are shaped by our memories, and worryingly, our memories are prone to error; inconsistency is built into the design of our minds. We have strong preferences about the duration of our experiences of pain and pleasure: we want pleasure to last and pain to be brief. Our memory, a System 1 function, has instead evolved to represent the most intense moments of an episode of pain or pleasure. A memory that neglects duration will not serve our preference for long pleasures and short pains.

It is difficult to capture the experience of a particular moment or event with a single happiness value. Although positive and negative emotions can exist at the same time, most moments of life can be classified as predominantly positive or negative. An individual's temperament and overall contentment shape their mood at any given time, yet emotional well-being still varies from day to day and week to week, and the current situation plays the largest part in determining the current mood.

Thinking, Fast and Slow: A Summary and Review
The book Thinking, Fast and Slow explains how the human mind works. We all have two systems that work together and support each other. Problems arise when we rely too heavily on our impulsive, quick-thinking system 1; this overreliance produces a variety of biases that can harm decision-making. The idea is to recognize these biases and use our analytical system 2 to keep system 1 in check.

Book Review by ahall

First, for reasons explained below, I would not buy this as an audiobook.

I have mixed feelings about this book for various reasons. The first 200 pages (Parts 1 and 2) are heavily focused on the author trying to convince the reader that it is better to think statistically rather than instinctively or intuitively. After citing countless studies to support this premise, the author (very briefly) admits in Chapter 21 that "formulas based on statistics or on common sense" are both good ways to develop useful algorithms. Doesn't common sense fit under instinct or intuition? Later in the same chapter the author concedes that intuition adds value, but only to the extent that the individual bases it on sufficient research. To me, the way most of the book was written, especially in Parts 1 and 2, was a little over the top. The chapters are short, and each one cites at least one study that the author or someone else performed. It becomes example after example after example, and redundant. The early chapters read as if the author stitched a group of journal articles together to form part of the book. Don't get me wrong, many of the studies are really interesting and I find them very helpful; I just think it became a little repetitive. There is also some evidence that many of the studies referenced in this book could not be reproduced, which casts further doubt on the evidence supporting the author's premise.

Furthermore, the book is very interactive with the reader, and some parts are a little condescending. For example, in the Introduction the author poses a question asking whether a personality description suggests the person in question is a farmer or a librarian. Rather than allowing that readers might come up with different responses, the author states, "Did it occur to you that there are more than 20 male farmers?" While I understand where the author was going with the question, he presumed that readers would answer only one way, and this recurs throughout the book. Another example, in Chapter 16, assumes that the reader came up with the wrong answer and even states that the most common answer to the question is wrong; however, the author does not explain how to arrive at the correct answer.

Since this book is very interactive, I wouldn't purchase the audiobook. I have both the hard copy and the audiobook, and I noticed a few discrepancies between them. Sometimes the mistake was minimal, such as words being flip-flopped, but at the end of Chapter 17 the author asks a question that requires some thought and work by the reader, and the total in the audiobook was completely off. Instead of stating the total as 81 million (as in the hard copy), the audiobook read it as 61 million, and the total for another part of the same example was 67.1 million in the audiobook instead of the 89.1 million stated in the hard copy.

All in all, a good part of the book is intriguing. The author has clearly conducted extensive research throughout his career and presents much of it here in a form that is comprehensible to readers without a background in economics or psychology.

About the Author

Daniel Kahneman (Hebrew: דניאל כהנמן; born 5 March 1934) is an Israeli-American psychologist and winner of the 2002 Nobel Memorial Prize in Economic Sciences, notable for his work on the psychology of judgment and decision-making, behavioral economics, and hedonic psychology.

Download Thinking, Fast and Slow PDF
