Thinking, Fast and Slow by Daniel Kahneman

Excerpts from Thinking, Fast and Slow by Daniel Kahneman. “The book’s main thesis is a differentiation between two modes of thought: ‘System 1’ is fast, instinctive and emotional; ‘System 2’ is slower, more deliberative, and more logical.”


Most impressions and thoughts arise in your conscious experience without your knowing how they got there.

People tend to assess the relative importance of issues by the ease with which they are retrieved from memory — and this is largely determined by the extent of coverage in the media.

I describe mental life by the metaphor of two agents, called System 1 and System 2, which respectively produce fast and slow thinking. The intuitive System 1 is more influential than your experience tells you, and it is the secret author of many of the choices and judgements you make.

We can be blind to the obvious, and we are also blind to our blindness.

When people believe a conclusion is true, they are also very likely to believe arguments that appear to support it, even when these arguments are unsound.

Cognition is embodied; you think with your body, not only with your brain.

A reliable way to make people believe in falsehoods is frequent repetition, because familiarity is not easily distinguished from truth.

Understanding a statement must begin with an attempt to believe it.

Contrary to the rules of philosophers of science, who advise testing hypotheses by trying to refute them, people (and scientists, quite often) seek data that are likely to be compatible with the beliefs they currently hold.

The world in our heads is not a precise replica of reality; our expectations about the frequency of events are distorted by the prevalence and emotional intensity of the messages to which we are exposed.

success = talent + luck
great success = a little more talent + a lot of luck
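
Kahneman draws regression to the mean out of this formula: because luck does not repeat, the people who are most successful in any one round are, on average, less successful the next time around. The following is a minimal simulation sketch of my own (not from the book) to make the arithmetic concrete; the standard-normal distributions and the top-1% cutoff are assumptions chosen purely for illustration.

```python
# Illustrative sketch only (not from the book): simulate the
# "success = talent + luck" model and show that extreme success
# implies unusually good luck, so top performers fall back toward
# their talent level when luck is redrawn.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

talent = rng.normal(0, 1, n)        # stable attribute, same in both rounds
luck_round1 = rng.normal(0, 1, n)   # redrawn every round
luck_round2 = rng.normal(0, 1, n)

success1 = talent + luck_round1
success2 = talent + luck_round2

top = np.argsort(success1)[-n // 100:]   # top 1% of round-one performers

print("top 1% mean success, round 1:", success1[top].mean())
print("top 1% mean talent:          ", talent[top].mean())
print("top 1% mean luck, round 1:   ", luck_round1[top].mean())
print("top 1% mean success, round 2:", success2[top].mean())
```

Under these assumptions, the top 1% in round one owe roughly half of their advantage to luck, and their round-two scores fall about halfway back toward the mean.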

Our mind is strongly biased toward causal explanations and does not deal well with “mere statistics.”

We humans constantly fool ourselves by constructing flimsy accounts of the past and believing they are true.

You cannot help dealing with the limited information you have as if it were all there is to know. Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance.

In everyday language, we apply the word ‘know’ only when what is known is true and can be shown to be true. We can know something only if it is true and knowable.

For some of our most important beliefs we have no evidence at all, except that people we love and trust hold these beliefs. Considering how little we know, the confidence we have in our beliefs is preposterous — and it is also essential.

Hindsight Bias – A general limitation of the human mind is its imperfect ability to reconstruct past states of knowledge, or beliefs that have changed. Once you adopt a new view of the world (or of any part of it), you immediately lose much of your ability to recall what you used to believe before your mind changed.

“The brains of humans and other animals contain a mechanism that is designed to give priority to bad news.”

Premortem – When the organization has almost come to an important decision but has not formally committed itself, Klein proposes gathering for a brief session a group of individuals who are knowledgeable about the decision. The premise of the session is a short speech: Imagine that we are a year into the future. We implemented the plan as it now exists. The outcome was a disaster. Please take 5 to 10 minutes to write a brief history of that disaster.

The Planning Fallacy – Most of us view the world as more benign than it really is, our own attributes as more favorable than they truly are, and the goals we adopt as more achievable than they are likely to be.

Optimistic individuals play a disproportionate role in shaping our lives.

Anyone who has been in the business world for a bit will recognize “the planning fallacy.” I just didn’t know it had a name.

When forecasting the outcomes of risky projects, executives too easily fall victim to the planning fallacy. In its grip, they make decisions based on delusional optimism rather than on a rational weighting of gains, losses, and probabilities. They overestimate benefits and underestimate costs. They spin scenarios of success while overlooking the potential for mistakes and miscalculations. As a result, they pursue initiatives that are unlikely to come in on budget or on time or to deliver the expected returns — or even to be completed.

In this view, people often (but not always) take on risky projects because they are overly optimistic about the odds they face. I will return to this idea several times in this book—it probably contributes to an explanation of why people litigate, why they start wars, and why they open small businesses.

How important is the CEO? – Because luck plays a large role, the quality of leadership and management practices cannot be inferred reliably from observations of success. And even if you had perfect foreknowledge that a CEO has brilliant vision and extraordinary competence, you still would be unable to predict how the company will perform with much better accuracy than the flip of a coin. On average, the gap in corporate profitability and stock returns between the outstanding firms and the less successful firms studied in Built to Last shrank to almost nothing in the period following the study. The average profitability of the companies identified in the famous In Search of Excellence dropped sharply as well within a short time.

Availability cascade – William Easterly calls Daniel Kahneman’s Thinking, Fast and Slow “one of the greatest and most engaging collections of insights into the human mind I have read.” I only mention this so I’ll have a reason to link to Professor Easterly’s review below. Tell me if this description of an “availability cascade” sounds familiar:

“An availability cascade is a self-sustaining chain of events, which may start from media reports of a relatively minor event and lead up to public panic and large-scale government action. On some occasions, a media story about a risk catches the attention of a segment of the public, which becomes aroused and worried. This emotional reaction becomes a story in itself, prompting additional coverage in the media, which in turn produces greater concern and involvement. The cycle is sometimes sped along deliberately by “availability entrepreneurs,” individuals or organizations who work to ensure a continuous flow of worrying news. The danger is increasingly exaggerated as the media compete for attention-grabbing headlines. Scientists and others who try to dampen the increasing fear and revulsion attract little attention, most of it hostile: anyone who claims that the danger is overstated is suspected of association with a “heinous cover-up.” The issue becomes politically important because it is on everyone’s mind, and the response of the political system is guided by the intensity of public sentiment. The availability cascade has now reset priorities. Other risks, and other ways that resources could be applied for the public good, all have faded into the background.”

“The dominance of conclusions over arguments is most pronounced where emotions are involved. The psychologist Paul Slovic has proposed an affect heuristic in which people let their likes and dislikes determine their beliefs about the world. Your political preference determines the arguments that you find compelling. If you like the current health policy, you believe its benefits are substantial and its costs more manageable than the costs of alternatives. If you are a hawk in your attitude toward other nations, you probably think they are relatively weak and likely to submit to your country’s will. If you are a dove, you probably think they are strong and will not be easily coerced. Your emotional attitude to such things as irradiated food, red meat, nuclear power, tattoos, or motorcycles drives your beliefs about their benefits and their risks. If you dislike any of these things, you probably believe that its risks are high and its benefits negligible.”

“A simple rule can help: before an issue is discussed, all members of the committee should be asked to write a very brief summary of their position. This procedure makes good use of the value of the diversity of knowledge and opinion in the group. The standard practice of open discussion gives too much weight to the opinions of those who speak early and assertively, causing others to line up behind them.”

Experience vs Memory