Summary of "Thinking Fast and Slow" by Daniel Kahneman

By Andreas Ramos. Updated December, 2023

Book published by Farrar, Straus and Giroux, April, 2013

Notes by Howard Derneld

Page numbers in parentheses

Daniel Kahneman is an Israeli-American psychologist notable for his work on the psychology of judgment and decision-making, as well as behavioral economics, for which he was awarded the 2002 Nobel Prize in Economic Sciences. His empirical findings challenge the assumption of human rationality prevailing in modern economic theory. With Amos Tversky and others, Kahneman established a cognitive basis for common human errors that arise from heuristics and biases, and developed prospect theory. He is professor emeritus of psychology and public affairs at Princeton University's Woodrow Wilson School.

Control of Attention Is Shared by Two Systems:

System 1: Fast Thinking

  • Involuntary, impulsive, intuitive
  • Operates automatically and quickly
  • Seeing and orienting
  • Cannot be turned off
  • Detects simple relations (“they are all alike”)
  • Associative activation: ideas that have been evoked trigger many other ideas in a spreading cascade of brain activity; each element is connected and each supports and strengthens the others
  • Perceives coherence through association
  • Finds coherent causal stories that link fragments of knowledge
  • Generates impressions, feelings and inclinations; when endorsed by System 2 these become beliefs, attitudes and intentions
  • Can be directed to mobilize attention to search for a pattern
  • Links a sense of cognitive ease to illusions of truth, pleasant feelings, and reduced vigilance
  • Distinguishes surprising from the normal
  • Exaggerates emotional consistency (halo effect)
  • Focuses on existing evidence and ignores absent evidence (WYSIATI)
  • Represents sets by norms and prototypes, does not integrate
  • Substitutes an easier question for a difficult one: heuristics
  • More sensitive to changes than to states: prospect theory
  • Overweights low probabilities
  • Shows diminishing sensitivity to quantity: psychophysics
  • Responds more strongly to losses than to gains: loss aversion
  • Frames decision problems narrowly, in isolation from one another

System 2: Slow Thinking

  • Voluntary actions
  • Allocates attention to the effortful mental activities that demand it
  • Surge of conscious attention whenever you are surprised
  • Continuous monitoring of your own behavior
  • Maintain in memory several ideas that require separate actions
  • Can follow rules, compare objects on several attributes, and make deliberate choices between options. It is capable of reasoning and is cautious, but in at least some people it is also lazy

You dispose of a limited budget of attention that you can allocate to activities (23)

Impressions and intuitions turn into beliefs, and impulses turn into voluntary actions.

Peak of pupil size coincides with maximum effort (33)

Self control and deliberate thought draw on the same limited budget of effort (40)

Mihaly Csikszentmihalyi studied the state of effortless attending, called flow (40)

People who are cognitively busy are also more likely to make selfish choices, use sexist language, and make superficial judgments in social situations. All variants of voluntary effort—cognitive, emotional, or physical—draw at least partially on a shared pool of mental energy. Ego Depletion: An effort of will or self-control is tiring; if you have had to force yourself to do something, you are less willing or less able to exert self-control when the next challenge comes

The exertion of self-control is depleting and unpleasant. Unlike cognitive load, ego depletion is at least in part a loss of motivation. Ego depletion is not the same mental state as cognitive busyness. The nervous system consumes more glucose than most other parts of the body, and effortful mental activity appears to be especially expensive in its use of glucose.

System 2 is lazy: many people are overconfident, prone to place too much faith in their intuitions. They find cognitive effort at least mildly unpleasant and avoid it. (45)

It is as if we have separate minds. One mind (the algorithmic mind) deals with slow thinking and demanding computation; it underlies raw brain power, excels at intelligence tests, and switches from one task to another quickly and efficiently. However, high intelligence does not make people immune to biases; avoiding them requires a separate ability, rationality, which should be distinguished from intelligence.

Association is affected by priming, without any awareness. Priming—the influencing of an action by the idea—is the ideomotor effect.

Reciprocal links are common in the associative network: being amused makes one smile, and smiling makes you feel amused. (54)

Money-primed people become more independent than they would be without the associative trigger. They persevered almost twice as long in trying to solve a very difficult problem.

Living in a culture that surrounds us with reminders of money may shape our behavior and our attitudes in ways that we do not know about. Other cultures provide frequent reminders of respect, others of God, and some prime obedient behavior. The effects of the primes are robust but not necessarily large (56)

The various causes of ease or strain have interchangeable effects. When you are in a state of cognitive ease, you are probably in a good mood, like what you see, believe what you hear, trust your intuitions, and feel that the current situation is comfortably familiar. You are also likely to be relatively casual and superficial in your thinking. When you feel strained, you are more likely to be vigilant and suspicious, invest more effort in what you are doing, feel less comfortable, and make fewer errors—but you are also less intuitive and less creative than usual. (60)

You can enlist cognitive ease to have others believe you. Studies of truth illusions provide specific suggestions:

  • Reduce cognitive strain for the recipient
  • Choose type, colors, and background to maximize legibility
  • Use simple language
  • Make the message memorable—alliteration or rhyme helps
  • Stay away from anything complex or complicated

If information comes from a source linked by logic or association to other beliefs or preferences you hold, or comes from a source you trust and like, you will feel a sense of cognitive ease. (64) Cognitive strain, whatever its source, mobilizes System 2, which is more likely to reject the intuitive answer suggested by System 1. (65)

Mere exposure effect: the effect of repetition on liking is a profoundly important biological fact, and it extends to all animals. Survival prospects are poor for an animal that is not suspicious of novelty. The mere exposure effect occurs because the repeated exposure of a stimulus is followed by nothing bad. Such a stimulus will eventually become a safety signal, and safety is good. Words that are presented more frequently were rated much more favorably. (66-67)

People’s guesses are much more accurate than they would be by chance. A sense of cognitive ease is apparently generated by a very faint signal from the associative machine, which knows that the words are coherent (share an association) long before the association is retrieved.

We see causality, just as directly as we see color. From birth we have impressions of causality which do not depend on reasoning about patterns of causation. We are born prepared to make intentional attributions: infants under one year old identify bullies and victims, and expect a pursuer to follow the most direct path in attempting to catch whatever it is chasing (77)

Paul Bloom makes the provocative claim that our inborn readiness to separate physical and intentional causality explains the near universality of religious beliefs. We perceive the world of objects as essentially separate from the world of minds, making it possible for us to envision soulless bodies and bodiless souls. The two modes of causation that we are set to perceive make it natural for us to accept the two central beliefs of many religions: an immaterial divinity is the ultimate cause of the physical world, and immortal souls temporarily control our bodies while we live and leave them behind as we die. The two concepts of causality were shaped by evolutionary forces, building the origins of religion into the structure of System 1. (77)

The prominence of causal intuitions is a recurrent theme in this book because people are prone to apply causal thinking inappropriately to situations that require statistical reasoning. Statistical thinking derives conclusions about individual cases from properties of categories and ensembles. System 1 does not have the capability for this mode of reasoning.

In System 1 only one interpretation comes to mind and you are not aware of any ambiguity. System 1 does not keep track of alternatives that it rejects, or even of the fact that there were alternatives. When System 2 is otherwise engaged, we will believe almost anything. System 1 is gullible and biased to believe; System 2 is in charge of doubting and unbelieving. However, System 2 is sometimes busy and often lazy. (81)

A deliberate search for confirming evidence, known as positive test strategy, is also how System 2 tests a hypothesis. Contrary to the rules of philosophers of science, who advise testing hypotheses by trying to refute them, people seek data that are likely to be compatible with the beliefs they currently hold. The confirmatory bias of System 1 favors uncritical acceptance of suggestions and exaggeration of the likelihood of extreme and improbable events.

The halo effect is the tendency to like or dislike everything about a person, including things you have not observed. System 1 generates a representation of the world that is simpler and more coherent than the real thing. The halo effect is an example of suppressed ambiguity.

To derive the most useful information from multiple sources of evidence, you should always try to make these sources independent of each other.

System 1 continuously monitors what is going on outside and inside the mind, and continuously generates assessments of various aspects of the situation. These basic assessments are easily substituted for more difficult questions—the essential idea of heuristics and biases.

Alex Todorov showed that there are biological roots of rapid judgments of how safe it is to interact with a stranger. We are endowed with an ability to evaluate, in a single glance at a stranger’s face, two crucial factors about the person: dominance (a threat) and trustworthiness (whether he is friendly). The shape of the face provides cues for assessing dominance, such as a square chin. People judge competence by combining the two dimensions of strength and trustworthiness. (91)

Because System 1 represents categories by a prototype or a set of typical exemplars, it deals well with averages but poorly with sums. There is almost a complete neglect of quantity in emotional contexts. (93)

Matching intensities across scales. For System 1, an underlying scale of intensity allows matching across diverse dimensions, such as matching the intensity of an event with the intensity of colors or music.

The technical definition of heuristic is a simple procedure that helps find adequate, though often imperfect answers to difficult questions.

Anchoring Effect. People consider a particular value for an unknown quantity before estimating that quantity. System 1 uses a priming effect and System 2 makes a deliberate adjustment to move away from the anchor, which is effortful—so people stay closer to the anchor when their mental resources are depleted. (120)

Availability heuristic is the process of judging frequency by the ease with which instances come to mind. If recalling instances feels difficult, that difficulty is more influential than the actual count. The ease and fluency of retrieval has more impact than the number of instances [that would presumably validate the proposition]. This is partly because we have an experience of diminishing fluency as instances are produced (recalled). People who let themselves be guided by System 1 are more strongly susceptible to the availability bias than others who are in a state of higher vigilance. (135)

Media coverage warps our estimates because it is biased toward novelty and poignancy.

The amount of concern we have is not adequately sensitive to the probability of harm; you are imagining the numerator, such as the tragic story, and not the denominator.

Sets of things are represented by norms and prototypes. A larger set that matched a smaller one but added less desirable elements was valued more than the smaller set in joint evaluation, but less in single evaluation. (161)

If you visit a courtroom you will observe that lawyers apply two styles of criticism: to demolish a case they raise doubts about the strongest arguments that favor it; to discredit a witness they focus on the weakest part of the testimony. (165)

There are two types of base rates. Statistical base rates are facts about a population to which a case belongs, but they are not relevant to the individual case. Causal base rates change your view of how the individual case came to be:

  • Statistical base rates are generally underweighted, and sometimes neglected altogether, when specific information about the case at hand is available.
  • Causal base rates are treated as information about the individual case and are easily combined with other case-specific information.
  • Stereotypes are statements about the group that are accepted as facts about every member. System 1 represents categories as norms and prototypical examples. When the categories are social, these representations are called stereotypes. In sensitive social contexts, we do not want to draw possibly erroneous conclusions about the individual from the statistics of the group. We consider it morally desirable for base rates to be treated as statistical facts about the group rather than as presumptive facts about individuals. In other words, we reject causal base rates. (169)

    Causes trump statistics. When presented with a surprising statistical fact, people managed to learn nothing at all. But when the students were surprised by individual cases they immediately made the generalization and inferred the same statistical fact.

    Statistical results with a causal interpretation have a stronger effect on our thinking than noncausal information. But even compelling causal statistics will not change long-held beliefs or beliefs rooted in personal experience. However, surprising individual cases have a powerful impact and are a more effective tool for learning because the incongruity must be resolved and embedded in a causal story. (174)

    Regression to the mean. Whenever the correlation between two scores is imperfect, there will be regression to the mean. Groups with extreme attributes regress to the mean over time.

    (1) Some intuitions draw primarily on skill and expertise acquired by repeated experience—the solution to the current problem comes to mind quickly because familiar cues are recognized. (2) Other intuitions, which are sometimes subjectively indistinguishable from the first, arise from the operation of heuristics that often substitute an easy question for the harder one that was asked.

    The correlation between two measures is equal to the proportion of shared factors among their determinants. Assuming shared factors of up to 30% (a correlation of .30) is already optimistic. The corrective procedure:

    • Start with an estimate of the average [measure]
    • Determine the [measure] that matches your impression of the evidence
    • Estimate the correlation between your evidence and the [measure]
    • If the correlation is .30, then move 30% of the distance from the average to the matching [measure]

    The approach builds on your intuition but moderates it, regressing it toward the mean; see the numeric sketch below. (190)
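    A minimal numeric sketch of the four-step procedure above, in Python. The average, the evidence-matched value, and the correlation are made-up illustrative numbers, not figures from the book:

```python
# Sketch of the regress-toward-the-mean procedure described above.
# Only the four-step recipe comes from the text; every number is hypothetical.

def regressed_estimate(average, matched, correlation):
    """Move `correlation` of the distance from the average toward the intuition-matched value."""
    return average + correlation * (matched - average)

average_gpa = 3.0    # step 1: estimate of the average [measure]
matched_gpa = 3.8    # step 2: the [measure] that matches your impression of the evidence
correlation = 0.30   # step 3: estimated correlation between the evidence and the [measure]

# step 4: move 30% of the way from 3.0 toward 3.8, giving 3.24
print(regressed_estimate(average_gpa, matched_gpa, correlation))
```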

    Nassim Taleb introduced the narrative fallacy to describe how flawed stories of the past shape our views of the world and our expectations. The explanatory stories that people find compelling are simple; are concrete rather than abstract; assign a larger role to talent, stupidity, and intentions than to luck; and focus on a few striking events that happened rather than on the countless events that failed to happen. (199)

    The ultimate test of an explanation is whether it would have made the event predictable in advance. No story of Google’s unlikely success will meet that test, because no story can include the myriad of events that would have caused a different outcome. The human mind does not deal well with nonevents. The Google story is like watching a skilled rafter in whitewater; however, the skilled rafter has gone down the rapids hundreds of times and has learned how to read the roiling water to avoid obstacles. In Google’s case there was a lot more luck involved, and the more luck was involved, the less there is to be learned.

    Paradoxically, it is easier to construct a coherent story when you know little, when there are fewer pieces to fit into the puzzle. Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance. (201)

    The mind that makes up narratives about the past is a sense-making organ. When an unpredicted event occurs, we immediately adjust our view of the world to accommodate the surprise.

    Hindsight bias: “I knew it all along.” Hindsight is especially unkind to decision makers who act as agents for others—physicians, financial advisers, third-base coaches, CEOs, social workers, diplomats, politicians. We are prone to blame decision makers for good decisions that worked out badly and to give them too little credit for successful moves that appear obvious only after the fact—the outcome bias. (203)

    The halo effect is so powerful that you probably find yourself resisting the idea that the same person and the same behaviors appear methodical when things are going well and rigid when things are going poorly. Because of the halo effect we get the causal relationship backward: we are prone to believe that the firm fails because its CEO is rigid, when the truth is that the CEO appears to be rigid because the firm is failing. This is how illusions of understanding are born. (206)

    Because luck plays a large role, the quality of leadership and management practices cannot be inferred reliably from observations of success. The performance of the top companies in Built to Last and In Search of Excellence regressed to the mean. We are tempted to reach for causal explanations, such as the successful firms becoming complacent or the less successful firms trying harder, but this is the wrong way to think about what happened. The average gap must shrink because the original gap was due in good part to luck. (207)

    • Illusion of validity: confidence in our judgments and predictions persists even when we know that their accuracy is poor.
    • Illusion of skill: believing one can predict outcomes (such as stock performance) that are essentially random.

    Professional investors tend to do better than individual investors. Individual investors like to lock in their gains by selling winners that have appreciated since they were purchased, and they hang on to losers. Unfortunately for them, recent winners tend to do better than recent losers in the short run, so individuals sell the wrong stocks. They also buy the wrong stocks: individual investors predictably flock to companies that draw their attention because they are in the news. Professional investors are more selective in responding to news. (214)

    Research analysts are not better investors because skill in evaluating the business prospects of a firm is not sufficient for successful stock trading, where the key question is whether the information about the firm is already incorporated in the price of the stock. (217)

    The illusion that we understand the past fosters overconfidence in our ability to predict the future. People who spend their time and earn their living studying a particular topic produce poorer predictions than dart-throwing monkeys. Those who know more forecast very slightly better than those who know less. But those with the most knowledge are often less reliable. The reason is that the person who acquires more knowledge develops an enhanced illusion of her skill and becomes unrealistically overconfident. (219)

    Errors of prediction are inevitable because the world is unpredictable. High subjective confidence is not to be trusted as an indicator of accuracy—low confidence could be more informative. Short term trends can be forecast, and behavior and achievements can be predicted with fair accuracy from previous behaviors and achievements.

    Low-validity environments: the accuracy of experts was matched or exceeded by a simple algorithm. Experts try to be clever, think outside the box, and consider complex combinations of features in making their predictions.

    The prejudice against algorithms is magnified when the decisions are consequential.

    Intuition: The situation has provided a cue; this cue has given the expert access to information stored in memory, and the information provides the answer. Intuition is nothing more and nothing less than recognition.

    In Pavlov’s famous conditioning experiments, the dogs learned to recognize the sound of the bell as a signal that food was coming. This is learned hope. Learned fears are even more easily acquired. (238)

    Judgments reflect true expertise when two conditions are fulfilled:

    • An environment that is sufficiently regular to be predictable
    • An opportunity to learn these regularities through prolonged practice

    Intuition cannot be trusted in the absence of stable regularities in the environment.

    Planning fallacy: plans and forecasts that:

    • Are unrealistically close to best-case scenarios
    • Could be improved by consulting the statistics of similar cases

    There are many unknown unknowns. (250)

    Economists identified optimistic CEOs by the amount of company stock that they owned personally and observed that highly optimistic leaders took excessive risks. They assumed debt rather than issuing equity and were more likely than others to overpay for target companies and undertake value-destroying mergers. (258) Cognitive biases contribute to competition neglect:

    • We focus on our goal, anchor on our plan, and neglect relevant base rates, exposing ourselves to the planning fallacy.
    • We focus on what we want to do and can do, neglecting the plans and skills of others.
    • Both in explaining the past and in predicting the future, we focus on the causal role of skill and neglect the role of luck. We are therefore prone to an illusion of control.
    • We focus on what we know and neglect what we do not know, which makes us overly confident in our beliefs.

    People tend to be overly optimistic about their relative standing on any activity in which they do moderately well. (260)

    As Taleb has argued, inadequate appreciation of the uncertainty of the environment inevitably leads economic agents to take risks they should avoid. However, optimism is highly valued, socially and in the market; people and firms reward the providers of dangerously misleading information more than they reward truth tellers. (262)

    According to Martin Seligman, the founder of positive psychology, an “optimistic explanation style” contributes to resilience by defending one’s self-image. In essence, the optimistic style involves taking credit for successes but little blame for failures.

    Overconfidence is a direct consequence of features of System 1 that can be tamed—but not vanquished. The main obstacle is that subjective confidence is determined by the coherence of the story one has constructed, not by the quality and amount of the information that supports it. (264)

    Prospect Theory builds on operating characteristics of System 1 that are common to many automatic processes of perception, judgment, and emotion (282); its three core features, sketched in code after this list, are:

    • Evaluation is relative to a neutral reference point, which is sometimes referred to as an “adaptation level.”
    • A principle of diminishing sensitivity applies to both sensory dimensions and the evaluation of changes in wealth. Turning on a weak light has a large effect in a dark room, but the same increment of light may be undetectable in a brightly illuminated room.
    • Loss aversion: when directly compared or weighed against each other, losses loom larger than gains.
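    A minimal sketch, in Python, of a value function with these three properties. The curvature (0.88) and loss-aversion coefficient (2.25) are commonly cited Tversky–Kahneman estimates, used here only as illustrative assumptions; they are not given in this summary:

```python
# Sketch of a value function with the three properties listed above: reference
# dependence, diminishing sensitivity, and loss aversion. The exponent (0.88) and
# loss-aversion factor (2.25) are illustrative assumptions, not taken from the text.

def value(outcome, reference=0.0, alpha=0.88, lam=2.25):
    """Subjective value of an outcome, evaluated as a change from a reference point."""
    x = outcome - reference            # changes relative to the reference point, not states
    if x >= 0:
        return x ** alpha              # concave for gains: diminishing sensitivity
    return -lam * ((-x) ** alpha)      # steeper (and convex) for losses: loss aversion

# A loss of 100 hurts roughly 2.25 times as much as a gain of 100 pleases.
print(value(100.0), value(-100.0))     # approximately 57.5 and -129.4
```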

    Endowment effect applies especially to goods that are not regularly traded. Some assets are held for exchange, to be traded for other goods. Others are held “for use,” to be consumed or otherwise enjoyed. (294) For goods held for use, the average selling price is often about double the average buying price, and the number of trades is less than half the number predicted by standard theory. The magic of the market did not work for a good that the owners expected to use. There is an emotion in the moment: the high price that sellers set reflects the reluctance to give up an object that they already own—loss aversion is built into the automatic evaluations of System 1. (296) No endowment effect is expected when owners view their goods as carriers of value for future exchanges.

    Veteran traders have apparently learned to ask the correct question, which is “How much do I want to have that mug, compared with other things I could have instead?” This is the question that economic or rational thinkers ask, and with this question there is no endowment effect, because the asymmetry between the pleasure of getting and the pain of giving up is irrelevant. The poor do not experience the endowment effect because they always live below their reference point. (298)

    Bad Events: The brains of humans and other animals contain a mechanism that is designed to give priority to bad news. By shaving a few hundredths of a second from the time needed to detect a predator, this circuit improves the animal’s odds of living long enough to reproduce. Threats are privileged above opportunities, as they should be. The brain responds quickly to even symbolic threats. Emotionally loaded words quickly attract attention, and bad words (war, crime) attract attention faster than do happy words (peace, love). (301)

    Loss aversion is one of many manifestations of a broad negativity dominance. The self is more motivated to avoid bad self-definitions than to pursue good ones. Bad impressions and bad stereotypes are quicker to form and more resistant to disconfirmation than good ones.

    Goals are reference points. Loss aversion refers to the relative strength of two motives: we are driven more strongly to avoid losses than to achieve gains. People often adopt short-term goals that they strive to achieve but not necessarily to exceed. (303)

    Loss aversion creates an asymmetry that makes agreements difficult to reach. The concessions you make to me are my gains, but they are your losses; they cause you much more pain than they give me pleasure. Inevitably, you will place a higher value on them than I do. The same is true, of course, of the very painful concessions you demand from me, which you do not appear to value sufficiently! Negotiations over a shrinking pile are especially difficult, because they require an allocation of losses. People tend to be much more easygoing when they bargain over an expanding pie. (304)

    Negotiators often pretend intense attachment to some good, although they actually view that good as a bargaining chip and intend ultimately to give it away in exchange. Animals, including people, fight harder to prevent losses than to achieve gains. In the world of territorial animals, this principle explains the success of defenders. A biologist observed that “when a territory holder is challenged by a rival, the owner almost always wins the contest—usually within a matter of seconds.” Loss aversion is a powerful conservative force that favors minimal changes from the status quo in the lives of both institutions and individuals. (305)

    The basic principle is that the existing wage, price, or rent sets a reference point, which has the nature of an entitlement that must not be infringed. It is considered unfair for the firm to impose losses on its customers or workers relative to the reference transaction, unless it must do so to protect its own entitlement. A basic rule of fairness, we found, is that the exploitation of market power to impose losses on others is unacceptable. (307)

    Increasing the probability of a favorable outcome by 5 percentage points is not valued the same at different ranges. The large impact of going from 0% to 5% illustrates the possibility effect, which causes highly unlikely outcomes to be weighted disproportionately more than they “deserve.” The improvement from 95% to 100% is a qualitative change that has a large impact: the certainty effect. (311)

    Because of the possibility effect, we tend to overweight small risks and are willing to pay far more than expected value to eliminate them altogether. Overweighting of small probabilities increases the attractiveness of both gambles and insurance policies. The decision weights that people assign to outcomes are not identical to the probabilities of those outcomes, contrary to the expectation principle. (312)
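    A minimal sketch of a probability-weighting function that produces both effects. The functional form and the 0.61 parameter are commonly cited estimates, used here purely as illustrative assumptions:

```python
# Sketch of a nonlinear probability-weighting function: small probabilities are
# overweighted (possibility effect) and the final step to certainty looms large
# (certainty effect). The functional form and gamma=0.61 are illustrative assumptions.

def decision_weight(p, gamma=0.61):
    """Decision weight attached to an outcome that occurs with probability p."""
    return p ** gamma / (p ** gamma + (1.0 - p) ** gamma) ** (1.0 / gamma)

for p in (0.0, 0.05, 0.50, 0.95, 1.0):
    print(f"probability {p:.2f} -> decision weight {decision_weight(p):.2f}")

# A 5% chance gets a weight of about 0.13 (overweighted), while moving from 0.95 to
# 1.00 raises the weight by about 0.21, far more than the 0.05 change in probability.
```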

    The Fourfold Pattern

    Simplification of the theory:

    • People overestimate the probabilities of unlikely events
    • People overweight unlikely events in their decisions (324)

    The probability of a rare event is most likely to be overestimated when the alternative is not fully specified.

    The valuation of gambles was much less sensitive to probability when the (fictitious) outcomes were emotional than when the outcomes were gains or losses of cash. …a rich and vivid representation of the outcome, whether or not it is emotional, reduces the role of probability in the evaluation of an uncertain prospect. (328)

    Vivid probabilities: denominator neglect. Individuals focus on the number of possible wins and neglect the probability of actually winning.

    It is costly to be risk averse for gains and risk seeking for losses. These attitudes make you willing to pay a premium to obtain a sure gain rather than face a gamble.

    Mental accounts are a form of narrow framing: they keep things under control and manageable by a finite mind. (343) In finance, there is a massive preference for selling winners rather than losers—a bias that has been given an opaque label: the disposition effect. There is a well-documented market anomaly that stocks that recently gained in value are likely to go on gaining for at least a short while. The decision to invest additional resources in a losing account, when better investments are available, is known as the sunk-cost fallacy. (345)

    People expect to have stronger emotional reactions (including regret) to an outcome that is produced by action than to the same outcome when it is produced by inaction. The asymmetry is at least as strong for losses, and it applies to blame as well as to regret. (348)

    Losses are weighted about twice as much as gains in several contexts: choice between gambles, the endowment effect, and reactions to price changes.

    The intense aversion to trading increased risk for some other advantage plays out on a grand scale in the laws and regulations governing risk. This trend is especially strong in Europe, where the precautionary principle, which prohibits any action that might cause harm, is a widely accepted doctrine. Enhanced loss aversion is embedded in a strong and widely shared moral intuition; it originates in System 1. (351)

    Preference Reversals. You are asked to choose between a safe bet and a riskier one: an almost certain win of a modest amount, or a small chance to win a substantially larger amount and a high probability of losing. Safety prevails and the safe bet is chosen. However, the features that caused the difference between the judgments of the options in single evaluation are suppressed or irrelevant when the options are evaluated jointly. The emotional reactions of System 1 are much more likely to determine single evaluation; the comparison that occurs in joint evaluation always involves a more careful and effortful assessment, which calls for System 2. (355)

    Emotional framing. Words such as “survival” versus “mortality” in logically equivalent descriptions affect our decisions. Unless there is an obvious reason to do otherwise, most of us passively accept decision problems as they are framed and therefore rarely have an opportunity to discover the extent to which our preferences are frame-bound rather than reality-bound. (367)

    When people evaluate an experience that spans a period of time, they overweight the peak and the end and neglect the duration. Retrospective assessments are insensitive to duration and weight two singular moments, the peak and the end, much more than the others. The experiencing self is the one that answers the question “Does it hurt now?” The remembering self is the one that answers the question “How was it, on the whole?”

    Confusing experience with the memory of it is a compelling cognitive illusion—and it is the substitution that makes us believe a past experience can be ruined. The experiencing self does not have a voice. The remembering self is sometimes wrong, but it is the one that keeps score and governs what we learn from living, and it is the one that makes decisions. What we learn from the past is to maximize the qualities of our future memories, not necessarily our future experience. This is the tyranny of the remembering self. (381)

    There are multiple examples of less-is-more effects. One is the situation in which adding dishes to a set of 24 dishes lowered the total value because some of the added dishes were broken. Another was the activist woman who is judged more likely to be a feminist bank teller than a bank teller. A third is the cold-hand experiment. The similarity is not accidental: the same operating feature of System 1 accounts for all three situations. System 1 represents sets by averages, norms, and prototypes, not by sums. Each cold-hand episode is a set of moments, which the remembering self stores as a prototypical moment. This leads to a conflict. For an objective observer evaluating the episode from the reports of the experiencing self, what counts is the “area under the curve” that integrates pain over time; it has the nature of a sum. The memory that the remembering self keeps, in contrast, is a representative moment, strongly influenced by the peak and the end. (383)
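    A minimal sketch contrasting the experiencing self’s “area under the curve” with the remembering self’s representative moment. Taking the average of the peak and the end is a simplified stand-in for the peak-end rule, and the pain ratings are made-up numbers:

```python
# Sketch of duration neglect and the peak-end rule: experienced pain is a sum over
# moments ("area under the curve"), while the remembered score is approximated here
# by averaging the peak and the final moment. All pain ratings are made-up numbers.

def experienced_total(pain_by_moment):
    """What an objective observer would integrate: pain summed over time."""
    return sum(pain_by_moment)

def remembered_score(pain_by_moment):
    """Peak-end approximation of the remembering self's representative moment."""
    return (max(pain_by_moment) + pain_by_moment[-1]) / 2

short_episode = [2, 6, 8, 8]          # ends at its worst moment
longer_episode = [2, 6, 8, 8, 5, 3]   # same moments plus a milder, longer ending

print(experienced_total(short_episode), remembered_score(short_episode))    # 24 and 8.0
print(experienced_total(longer_episode), remembered_score(longer_episode))  # 32 and 5.5
# The longer episode contains strictly more total pain yet is remembered as less bad.
```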

    A story is about significant events and memorable moments, not about time passing. Duration neglect is normal in a story, and the ending often defines its character. The same core features appear in the rules of narratives and in the memories of colonoscopies, vacations, and films. This is how the remembering self works: it composes stories and keeps them for future reference. Caring for people often takes the form of concern for the quality of their stories, not for their feelings. Most important, of course, we all care intensely for the narrative of our own life and very much want it to be a good story, with a decent hero. (387)

    In evaluating the quality of a person’s life there is clear evidence of both duration neglect and a peak-end effect. The person’s life was represented by a prototypical slice of time, not as a sequence of time slices. Experimenters find a less-is-more effect, a strong indication that an average (prototype) has been substituted for a sum. For example, adding five “slightly happy” years to a very happy life caused a substantial drop in the evaluations of the total happiness of that life. (388)

    Intentions for future vacations were entirely determined by the final evaluation—even when that score did not accurately represent the quality of the experience that was described in the diaries. People choose by memory when they decide whether or not to repeat an experience.

    Most people are remarkably indifferent to the pains of their experiencing self. Some say they don’t care at all. Others share my feeling, which is that I feel pity for my suffering self but not more than I would feel for a stranger in pain. Odd as it may seem, I am my remembering self, and the experiencing self, who does my living, is like a stranger to me. (390)

    Experienced well-being. A striking observation was the extent of inequality in the distribution of emotional pain. About half our participants reported going through an entire day without experiencing an unpleasant episode. On the other hand, a significant minority of the population experienced considerable emotional distress for much of the day. It appears that a small fraction of the population does most of the suffering—whether because of physical or mental illness, an unhappy temperament, or the misfortunes and personal tragedies in their life. (394)

    It is only a slight exaggeration to say that happiness is the experience of spending time with the people you love and who love you.

    More education is associated with higher evaluation of one’s life, but not with greater experienced well-being. Religious participation has a relatively greater favorable impact on positive affect and stress reduction than on life evaluation. Surprisingly, however, religion provides no reduction of feelings of depression or worry. Being poor makes one miserable, and being rich may enhance one’s life satisfaction, but it does not (on average) improve experienced well-being. The satiation level beyond which experienced well-being no longer increases was a household income of about $75,000 in high-cost areas. Higher income is associated with a reduced ability to enjoy the small pleasures in life. (397)

    Focusing illusion: nothing in life is as important as you think it is when you are thinking about it.

    The remembering self is a construction of System 2. However, the distinctive features of the way it evaluates episodes and lives are characteristics of our memory. Duration neglect and the peak-end rule originate in System 1 and do not necessarily correspond to the values of System 2. We believe that duration is important, but our memory tells us it is not. The rules that govern the evaluation of the past are poor guides for decision making, because time does matter.

    The central fact of our existence is that time is the ultimate finite resource, but the remembering self ignores that reality. The neglect of duration combined with the peak-end rule causes a bias that favors a short period of intense joy over a long period of moderate happiness. The mirror image of the same bias makes us fear a short period of intense but tolerable suffering more than we fear a much longer period of moderate pain. Duration neglect also makes us prone to accept a long period of mild unpleasantness because the end will be better, and it favors giving up an opportunity for a long happy period if it is likely to have a poor ending. (409)

    As interpreted by the important Chicago school of economics, faith in human rationality is closely linked to an ideology in which it is unnecessary and even immoral to protect people against their choices. Rational people should be free, and they should be responsible for taking care of themselves. (411)

    The acquisition of skills requires a regular environment, an adequate opportunity to practice, and rapid and unequivocal feedback about the correctness of thoughts and actions.

    System 1 registers the cognitive ease with which it processes information, but it does not generate a warning signal when it becomes unreliable. Intuitive answers come to mind quickly and confidently, whether they originate from skills or from heuristics. There is no simple way for System 2 to distinguish between a skilled and a heuristic response. Its only recourse is to slow down and attempt to construct an answer on its own, which it is reluctant to do because it is indolent. (417)

    The voice of reason may be much fainter than the loud and clear voice of an erroneous intuition, and questioning your intuitions is unpleasant when you face the stress of a big decision. More doubt is the last thing you want when you are in trouble. The upshot is that it is much easier to identify a minefield when you observe others wandering into it than when you are about to do so. Observers are less cognitively busy and more open to information than actors. That was my reason for writing a book that is oriented more to critics and gossipers than to decision makers. (417)