Summaries of Thinking, Fast and Slow



http://www.hyperink.com/Overall-Summary-For-Thinking-Fast-And-Slow-b895a12

The book is divided into five parts, with a pair of appendixes – the pioneering

articles Kahneman authored alongside Amos Tversky – providing additional

context. Each part of the book is split into numerous sections, averaging

around 10 pages each. The organization successfully makes an intellectually

imposing book feel more accessible and navigable.

Part 1

In Part 1, Kahneman explains how the brain’s thought process switches

between “System 1” and “System 2.” System 1 is the automatic response or

kneejerk reaction. System 2, on the other hand, involves concentration and

deliberation. “Although System 2 believes itself to be where the action is, the

automatic System 1 is the hero of the book,” Kahneman writes.

He argues that overreliance on System 2 can essentially result in tunnel vision, then provides exercises that the reader can easily follow to test themselves and see firsthand some of the tricks System 1 and System 2 can play on one another. Further sections dive into deeper detail about the

manner in which we generate opinions and make judgment calls. Part 1

closes with a comprehensive list of characteristics assigned to System 1,

including “is biased to believe and confirm” and “neglects ambiguity and

suppresses doubt.”

Part 2

Part 2, “Heuristics and Biases,” will be of particular interest to readers

possessing a strong feeling about statistics, be it positive or negative.

Kahneman repeatedly demonstrates how easily statistics can be

misinterpreted – always a timely reminder during any sort of election season.

He also demonstrates the risk of using statistics as an attempt to assign

greater meaning to something that occurred due to chance. “Causal explanations of chance events are inevitably wrong,” he summarizes. He

goes on to demonstrate how “educated guesses” supported by statistics (or

stereotypes) can sometimes backfire.


In one example, the reader is asked to picture a passenger on the New York

subway reading the Times, and to guess whether the person has a PhD or

didn’t go to college at all. The common gut reaction is to pick the PhD, even

though there are far fewer PhDs on the subway at a given time than

passengers without college degrees. Pausing to weigh the statistics instead of taking a situation at face value runs counter to how System 1 is programmed to operate.

Kahneman sprinkles some academic autobiography through the book as

well. Part 2 includes a section on what he calls “the best-known and most

controversial” of his experiments with Amos Tversky: a seemingly simple

question about a young woman named Linda. Linda is introduced to the

crowd as a young woman who majored in philosophy and kept active with

various social causes. Kahneman’s audience then had to choose the most

likely outcome for Linda. Was she a bank teller or a bank teller who was

active in the feminist movement? Although the former is the smarter choice,

an overwhelming number of undergraduates chose the latter due to the

associations they were making about “Linda.” Even renowned scientist

Stephen Jay Gould fell into the trap.

Part 3

This part of the book tackles overconfidence. Citing the work of statistician

Nassim Taleb (The Black Swan), Kahneman puts the spotlight on our

constant need to think in concrete terms about the world around us, and how

frequently we don’t quite see the full picture, no matter what we may think.

He draws from further case studies to show that “hindsight is 20/20” is more

than just a cliche; in revisiting their past beliefs and opinions, people tend to

paint a much more flattering picture of themselves.

Kahneman extends his critique into Wall Street. “A major industry appears to

be built largely on an illusion of skill,” he writes, going on to say that,

statistically speaking, mutual fund managers perform more like dice rollers

than poker players. In an effort to override faulty intuition, Kahneman even

dishes out tips for any managers facing a hiring decision, with the goal of

having you never again say “Let’s go with that person; I got a good vibe.”

Part 4


“Choices,” the fourth part, provides the reader with an opportunity to absorb

some of the economics-centric work that led to Kahneman’s Nobel Prize. He

sheds some autobiographical light onto the story of how a pair of

psychologists (he and Tversky) became interested in studying the rationality

of economics. It wasn’t an unprecedented leap between fields, but it was

certainly atypical. The duo conducted intensive research in the areas of risk

aversion and prospect theory to learn more about the ways in which humans

make decisions and weigh risk.

Research suggested that the negative often trumped the positive; that is, if

presented with data that was 50 percent positive and 50 percent negative,

the general tendency is for our brains to come away with an impression of

“negative,” even though the positive/negative factors are equally divided.

Similarly, people tend to overestimate the probability of unlikely

occurrences. In a business setting, that may mean failing to take a smart risk

because of an unlikely hypothetical scenario. A gambler may refuse to cash

out on a hot hand because he or she has a “feeling” that it will continue. A

day trader may make a decision that is based on compensating for a past

mistake rather than an objective assessment of future potential. “The fear of

regret is a factor in many of the decisions that people make,” Kahneman

writes.

Part 5

While Part 4 stems from previous work, Part 5 focuses on current findings.

The reader learns about the relationship between the “experiencing self” and

the “remembering self.” Kahneman tells the story of a man who listened with

great pleasure to a long symphony on a recorded disc, only to find that the

disc was scratched near its end. The man said the experience had been

ruined, when in actuality it was only his memory of the experience that

suffered (the vast majority of his experience remained very pleasant, just

buried beneath the negative association at the end).

“Tastes and decisions are shaped by memories, and the memories can be

wrong,” Kahneman points out.


We sometimes feel our preferences and beliefs so strongly that they can

seem to be somehow rooted in fact; this, however, is often not the case. This

applies even to our choice of vacation. For many people, the value of a

vacation is not the experience but the anticipation beforehand and the

memory afterward. If mountain climbers didn’t remember climbing the

mountain, would they still feel the endeavor was worth experiencing?

Similarly, many people will say that they would be willing to experience a

tremendous amount of pain if they were only guaranteed a drug that would

wipe any trace of the memory from their brains.

“Odd as it may seem,” Kahneman sums up, “I am my remembering self, and

the experiencing self, who does my living, is like a stranger to me.”


The New York Times

Two Brains Running

By JIM HOLT

THINKING, FAST AND SLOW

By Daniel Kahneman

499 pp. Farrar, Straus & Giroux. $30

In 2002, Daniel Kahneman won the Nobel in economic science. What made

this unusual is that Kahneman is a psychologist. Specifically, he is one-half of

a pair of psychologists who, beginning in the early 1970s, set out to

dismantle an entity long dear to economic theorists: that arch-rational

decision maker known as Homo economicus. The other half of the

dismantling duo, Amos Tversky, died in 1996 at the age of 59. Had Tversky

lived, he would certainly have shared the Nobel with Kahneman, his longtime

collaborator and dear friend.

Human irrationality is Kahneman’s great theme. There are essentially three

phases to his career. In the first, he and Tversky did a series of ingenious

experiments that revealed twenty or so “cognitive biases” — unconscious

errors of reasoning that distort our judgment of the world. Typical of these is

the “anchoring effect”: our tendency to be influenced by irrelevant numbers

that we happen to be exposed to. (In one experiment, for instance,

experienced German judges were inclined to give a shoplifter a longer

sentence if they had just rolled a pair of dice loaded to give a high number.)

In the second phase, Kahneman and Tversky showed that people making

decisions under uncertain conditions do not behave in the way that economic

models have traditionally assumed; they do not “maximize utility.” The two

then developed an alternative account of decision making, one more faithful

to human psychology, which they called “prospect theory.” (It was for this

achievement that Kahneman was awarded the Nobel.) In the third phase of

his career, mainly after the death of Tversky, Kahneman has delved into

“hedonic psychology”: the science of happiness, its nature and its causes.

His findings in this area have proved disquieting — and not just because one

of the key experiments involved a deliberately prolonged colonoscopy.


“Thinking, Fast and Slow” spans all three of these phases. It is an

astonishingly rich book: lucid, profound, full of intellectual surprises and self-help value. It is consistently entertaining and frequently touching, especially

when Kahneman is recounting his collaboration with Tversky. (“The pleasure

we found in working together made us exceptionally patient; it is much

easier to strive for perfection when you are never bored.”) So impressive is

its vision of flawed human reason that the New York Times columnist David

Brooks recently declared that Kahneman and Tversky’s work “will be

remembered hundreds of years from now,” and that it is “a crucial pivot

point in the way we see ourselves.” They are, Brooks said, “like the Lewis

and Clark of the mind.”

Now, this worries me a bit. A leitmotif of this book is overconfidence. All of

us, and especially experts, are prone to an exaggerated sense of how well

we understand the world — so Kahneman reminds us. Surely, he himself is

alert to the perils of overconfidence. Despite all the cognitive biases, fallacies

and illusions that he and Tversky (along with other researchers) purport to

have discovered in the last few decades, he fights shy of the bold claim that

humans are fundamentally irrational.

Or does he? “Most of us are healthy most of the time, and most of our

judgments and actions are appropriate most of the time,” Kahneman writes

in his introduction. Yet, just a few pages later, he observes that the work he

did with Tversky “challenged” the idea, orthodox among social scientists in

the 1970s, that “people are generally rational.” The two psychologists

discovered “systematic errors in the thinking of normal people”: errors

arising not from the corrupting effects of emotion, but built into our evolved

cognitive machinery. Although Kahneman draws only modest policy

implications (e.g., contracts should be stated in clearer language), others —

perhaps overconfidently? — go much further. Brooks, for example, has

argued that Kahneman and Tversky’s work illustrates “the limits of social

policy”; in particular, the folly of government action to fight joblessness and

turn the economy around.


Such sweeping conclusions, even if they are not endorsed by the author,

make me frown. And frowning — as one learns on Page 152 of this book —

activates the skeptic within us: what Kahneman calls “System 2.” Just

putting on a frown, experiments show, works to reduce overconfidence; it

causes us to be more analytical, more vigilant in our thinking; to question

stories that we would otherwise unreflectively accept as true because they

are facile and coherent. And that is why I frowningly gave this extraordinarily

interesting book the most skeptical reading I could.

System 2, in Kahneman’s scheme, is our slow, deliberate, analytical and

consciously effortful mode of reasoning about the world. System 1, by

contrast, is our fast, automatic, intuitive and largely unconscious mode. It is

System 1 that detects hostility in a voice and effortlessly completes the

phrase “bread and ...” It is System 2 that swings into action when we have

to fill out a tax form or park a car in a narrow space. (As Kahneman and

others have found, there is an easy way to tell how engaged a person’s

System 2 is during a task: just look into his or her eyes and note how dilated

the pupils are.)

More generally, System 1 uses association and metaphor to produce a quick

and dirty draft of reality, which System 2 draws on to arrive at explicit beliefs

and reasoned choices. System 1 proposes, System 2 disposes. So System 2

would seem to be the boss, right? In principle, yes. But System 2, in addition

to being more deliberate and rational, is also lazy. And it tires easily. (The

vogue term for this is “ego depletion.”) Too often, instead of slowing things

down and analyzing them, System 2 is content to accept the easy but

unreliable story about the world that System 1 feeds to it. “Although System

2 believes itself to be where the action is,” Kahneman writes, “the automatic

System 1 is the hero of this book.” System 2 is especially quiescent, it

seems, when your mood is a happy one.

At this point, the skeptical reader might wonder how seriously to take all this

talk of System 1 and System 2. Are they actually a pair of little agents in our

head, each with its distinctive personality? Not really, says Kahneman.

Rather, they are “useful fictions” — useful because they help explain the

quirks of the human mind.


To see how, consider what Kahneman calls the “best-known and most

controversial” of the experiments he and Tversky did together: “the Linda

problem.” Participants in the experiment were told about an imaginary

young woman named Linda, who is single, outspoken and very bright, and

who, as a student, was deeply concerned with issues of discrimination and

social justice. The participants were then asked which was more probable:

(1) Linda is a bank teller. Or (2) Linda is a bank teller and is active in the

feminist movement. The overwhelming response was that (2) was more

probable; in other words, that given the background information furnished,

“feminist bank teller” was more likely than “bank teller.” This is, of course, a

blatant violation of the laws of probability. (Every feminist bank teller is a

bank teller; adding a detail can only lower the probability.) Yet even among

students in Stanford’s Graduate School of Business, who had extensive

training in probability, 85 percent flunked the Linda problem. One student,

informed that she had committed an elementary logical blunder, responded,

“I thought you just asked for my opinion.”

What has gone wrong here? An easy question (how coherent is the

narrative?) is substituted for a more difficult one (how probable is it?). And

this, according to Kahneman, is the source of many of the biases that infect

our thinking. System 1 jumps to an intuitive conclusion based on a

“heuristic” — an easy but imperfect way of answering hard questions — and

System 2 lazily endorses this heuristic answer without bothering to scrutinize

whether it is logical.

Kahneman describes dozens of such experimentally demonstrated

breakdowns in rationality — “base-rate neglect,” “availability cascade,” “the

illusion of validity” and so on. The cumulative effect is to make the reader

despair for human reason.


Are we really so hopeless? Think again of the Linda problem. Even the great

evolutionary biologist Stephen Jay Gould was troubled by it. As an expert in

probability he knew the right answer, yet he wrote that “a little homunculus

in my head continues to jump up and down, shouting at me — ‘But she can’t

just be a bank teller; read the description.’ ” It was Gould’s System 1,

Kahneman assures us, that kept shouting the wrong answer at him. But

perhaps something more subtle is going on. Our everyday conversation

takes place against a rich background of unstated expectations — what

linguists call “implicatures.” Such implicatures can seep into psychological

experiments. Given the expectations that facilitate our conversation, it may

have been quite reasonable for the participants in the experiment to take

“Linda is a bank teller” to imply that she was not in addition a feminist. If so,

their answers weren’t really fallacious.

This might seem a minor point. But it applies to several of the biases that

Kahneman and Tversky, along with other investigators, purport to have

discovered in formal experiments. In more natural settings — when we are

detecting cheaters rather than solving logic puzzles; when we are reasoning

about things rather than symbols; when we are assessing raw numbers

rather than percentages — people are far less likely to make the same

errors. So, at least, much subsequent research suggests. Maybe we are not

so irrational after all.

Some cognitive biases, of course, are flagrantly exhibited even in the most

natural of settings. Take what Kahneman calls the “planning fallacy”: our

tendency to overestimate benefits and underestimate costs, and hence

foolishly to take on risky projects. In 2002, Americans remodeling their

kitchens, for example, expected the job to cost $18,658 on average, but they

ended up paying $38,769.


The planning fallacy is “only one of the manifestations of a pervasive

optimistic bias,” Kahneman writes, which “may well be the most significant

of the cognitive biases.” Now, in one sense, a bias toward optimism is

obviously bad, since it generates false beliefs — like the belief that we are in

control, and not the playthings of luck. But without this “illusion of control,”

would we even be able to get out of bed in the morning? Optimists are more

psychologically resilient, have stronger immune systems, and live longer on

average than their more reality-based counterparts. Moreover, as Kahneman

notes, exaggerated optimism serves to protect both individuals and

organizations from the paralyzing effects of another bias, “loss aversion”: our

tendency to fear losses more than we value gains. It was exaggerated

optimism that John Maynard Keynes had in mind when he talked of the

“animal spirits” that drive capitalism.

Even if we could rid ourselves of the biases and illusions identified in this

book — and Kahneman, citing his own lack of progress in overcoming them,

doubts that we can — it is by no means clear that this would make our lives

go better. And that raises a fundamental question: What is the point of

rationality? We are, after all, Darwinian survivors. Our everyday reasoning

abilities have evolved to cope efficiently with a complex and dynamic

environment. They are thus likely to be adaptive in this environment, even if

they can be tripped up in the psychologist’s somewhat artificial experiments.

Where do the norms of rationality come from, if they are not an idealization

of the way humans actually reason in their ordinary lives? As a species, we

can no more be pervasively biased in our judgments than we can be

pervasively ungrammatical in our use of language — or so critics of research

like Kahneman and Tversky’s contend.


Kahneman never grapples philosophically with the nature of rationality. He

does, however, supply a fascinating account of what might be taken to be its

goal: happiness. What does it mean to be happy? When Kahneman first took

up this question, in the mid 1990s, most happiness research relied on asking

people how satisfied they were with their life on the whole. But such

retrospective assessments depend on memory, which is notoriously

unreliable. What if, instead, a person’s actual experience of pleasure or pain

could be sampled from moment to moment, and then summed up over time?

Kahneman calls this “experienced” well-being, as opposed to the

“remembered” well-being that researchers had relied upon. And he found

that these two measures of happiness diverge in surprising ways. What

makes the “experiencing self” happy is not the same as what makes the

“remembering self” happy. In particular, the remembering self does not care

about duration — how long a pleasant or unpleasant experience lasts.

Rather, it retrospectively rates an experience by the peak level of pain or

pleasure in the course of the experience, and by the way the experience

ends.

These two quirks of remembered happiness — “duration neglect” and the

“peak-end rule” — were strikingly illustrated in one of Kahneman’s more

harrowing experiments. Two groups of patients were to undergo painful

colonoscopies. The patients in Group A got the normal procedure. So did the

patients in Group B, except — without their being told — a few extra minutes

of mild discomfort were added after the end of the examination. Which group

suffered more? Well, Group B endured all the pain that Group A did, and then

some. But since the prolonging of Group B’s colonoscopies meant that the

procedure ended less painfully, the patients in this group retrospectively

minded it less. (In an earlier research paper though not in this book,

Kahneman suggested that the extra discomfort Group B was subjected to in

the experiment might be ethically justified if it increased their willingness to

come back for a follow-up!)
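
As a rough illustration (not from the book: the pain profiles and the 0-10 scale below are invented, and the peak-end score is computed as a simple average of the worst and final moments), a few lines of code show how a longer procedure with a gentler ending can come out better by the remembering self's measure:

```python
# Hypothetical pain profiles on a 0-10 scale, one reading per minute.
group_a = [2, 4, 7, 8, 6]             # normal procedure, ends near peak discomfort
group_b = [2, 4, 7, 8, 6, 3, 2, 1]    # prolonged, with milder minutes added at the end

def total_pain(profile):
    # What the experiencing self endures: the sum over every moment.
    return sum(profile)

def peak_end_score(profile):
    # What the remembering self keeps: the average of the worst moment and the
    # final moment, with duration playing no role.
    return (max(profile) + profile[-1]) / 2

for name, profile in [("Group A", group_a), ("Group B", group_b)]:
    print(name, "total pain:", total_pain(profile),
          "remembered score:", peak_end_score(profile))
# Group B has the larger total (33 vs 27) but the lower remembered score (4.5 vs 7.0).
```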


As with colonoscopies, so too with life. It is the remembering self that calls

the shots, not the experiencing self. Kahneman cites research showing, for

example, that a college student’s decision whether or not to repeat a spring-break vacation is determined by the peak-end rule applied to the previous

vacation, not by how fun (or miserable) it actually was moment by moment.

The remembering self exercises a sort of “tyranny” over the voiceless

experiencing self. “Odd as it may seem,” Kahneman writes, “I am my

remembering self, and the experiencing self, who does my living, is like a

stranger to me.”

Kahneman’s conclusion, radical as it sounds, may not go far enough. There

may be no experiencing self at all. Brain-scanning experiments by Rafael

Malach and his colleagues at the Weizmann Institute in Israel, for instance,

have shown that when subjects are absorbed in an experience, like watching

“The Good, the Bad and the Ugly,” the parts of the brain associated with

self-consciousness are not merely quiet, they’re actually shut down

(“inhibited”) by the rest of the brain. The self seems simply to disappear.

Then who exactly is enjoying the film? And why should such egoless

pleasures enter into the decision calculus of the remembering self?

Clearly, much remains to be done in hedonic psychology. But Kahneman’s

conceptual innovations have laid the foundation for many of the empirical

findings he reports in this book: that while French mothers spend less time

with their children than American mothers, they enjoy it more; that

headaches are hedonically harder on the poor; that women who live alone

seem to enjoy the same level of well-being as women who live with a mate;

and that a household income of about $75,000 in high-cost areas of the

country is sufficient to maximize happiness. Policy makers interested in

lowering the misery index of society will find much to ponder here.

By the time I got to the end of “Thinking, Fast and Slow,” my skeptical frown

had long since given way to a grin of intellectual satisfaction. Appraising the

book by the peak-end rule, I overconfidently urge everyone to buy and read

it. But for those who are merely interested in Kahneman’s takeaway on the

Malcolm Gladwell question it is this: If you’ve had 10,000 hours of training in

a predictable, rapid-feedback environment — chess, firefighting,

anesthesiology — then blink. In all other cases, think.



The central (“title page”) idea is that humans use two systems to make decisions: one

fast and generally correct, and one slow and rational, but lazy. Sometimes

the slow system provides after-the-fact rationalization for the conclusions

drawn by the fast system, but other times it provides correction, reining in

erroneous conclusions.

The concluding chapter includes a critique of libertarianism that is rather

weak. By framing libertarianism as resting on a foundation of assumed hyper-rationality, Kahneman proceeds to show this position untenable. Of course, the position would be untenable if libertarians actually based their ideas on 100% rational human behavior. However, a

brief acquaintance with Von Mises, Rothbard, Hayek, or many others of the

Austrian school of economics would reveal that no such assumption is

necessary to prove that the state always destroys, rather than creates,

value. I have previously linked to PraxGirl’s YouTube channel, which has

many educational videos relevant to the matter.

A good place to start with the science of praxeology is with her video Study

Praxeology.

Key Concepts

System 1 vs. System 2: System 1 is quick and instinctive. System 2 is

slow, lazy, and rational. System 1 is right “most of the time” but suffers from

bias and over-simplification. System 2 can correct System 1, but it has

limited resources to work with. System 1 works day-in and day-out, but

important decisions need System 2 involvement.

The Econs vs. The Humans: “Econs” are the “homo economicus” or

“rational man” used in modeling economics. [Austrian scholars are exempt

from this generalization.] Humans are generally rational, but are subject to

some eccentricities such as framing, biased decisions, and judgment

anomalies.

The Experiencing Self vs. The Remembering Self: The experiencing self

“does the living” while the remembering self “keeps score and makes

choices.”

A selection of other ideas follows.

Regression to the Mean

Four Step Process for “Un-Biasing” Predictions


Given limited information and asked to make a prediction, the usual, intuitive

method is to think of a result that matches the remarkability of the

information given. However, the subject could have been (un)lucky,

especially if the data is a mere snapshot.

1. Begin with an estimate of average results. ["The average GPA is 3.0."]

2. Determine your "intuitive" judgment or impression of what the result will be. ["Susie could read at age 4, which is pretty unusual. A similarly exceptional GPA for a college senior would be 3.8."]

3. Estimate the correlation between your evidence and the result. ["But the correlation between early reading and college GPA is probably small. Let's call it 25%."]

4. Move from the average toward the intuitive prediction by that fraction of the distance. ["25% of the distance from 3.0 to 3.8 gives 3.2."]

The Fourfold Pattern

Page 317 contains a summary of the over-weighting of various outcomes

that are subject to chance. When probabilities are very low, or very high,

outcomes tend to be weighted in a manner not strictly in accord with

statistical rationality.

When there is a high (but not 100%) probability of gains, people become risk averse, generally accepting an unfavorable settlement in exchange for a sure thing.

When there is a high (but not 100%) probability of losses, people become

risk seeking, rejecting favorable settlements.

When there is a very low (but not 0%) probability of gains, people again

become risk-seeking.

When there is a very low (but not 0%) probability of losses, people become

risk averse.

For example, if you were to encounter a situation wherein you had a 95% chance of winning $1,000,000 (for example, in a lawsuit), you would be biased toward accepting a settlement of $900,000 and letting someone else take the risk of not winning anything, even though the "value" of a 95% probability of winning $1,000,000 is technically $950,000.
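
A quick check of that arithmetic, treating "value" as the plain probability-weighted average rather than a full prospect-theory valuation:

```python
p_win, prize, settlement = 0.95, 1_000_000, 900_000

expected_value = p_win * prize      # 950,000: the statistically "rational" benchmark
print(expected_value, settlement)

# Accepting the sure 900,000 gives up 50,000 of expected value -- the premium
# people pay to avoid the 5% chance of walking away with nothing.
print(expected_value - settlement)  # 50000.0
```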

http://www.albertsuckow.com/thinking-fast-slow-daniel-kahneman/


This book reminds me very much of another very good book called "The

Happiness Hypothesis". I enjoyed reading that one very much, got very

absorbed and rushed to finish it. After having finished it, though, I realized

that I could only vaguely recall what it was that I had found so inspiring at

the time. Damn :) I am actually reading it now a second time, little by little. I

do feel that, by getting carried away, I had somehow sacrificed the long term

for the sake of a short term gain :D So I am trying not to make the same

mistake with Daniel Kahneman's book, which seems to belong to the same

category - very interesting, and lots of ideas that will just fly off unless I write them down.

Looks like the author put a lot of effort into writing this book, for what

he sees as a worthy cause. He made it pretty clear that he was not exactly in

a state of "flow" while doing it. I think the reader should do a bit of that too. I

am making this summary as I am reading the book. This is mostly for my

own use :) I suggest that the reader should get the book, rather than waste

time with my notes. Looks like these will be very long notes.

Chapter 1

Quick overview of the framework - we have System 1 and System 2 inside our heads, where System 1 is the autopilot, which can handle routine operations by itself, and System 2 is our conscious effort to solve

problems, make decisions or control behavior. A bit like Jonathan Haidt's

rider and elephant, but not quite. Or like Robert Kurzban's "modules". Looks

like Kahneman can make a lot more sense with the two precisely defined

systems, than Mr. Kurzban with his infinite set of vaguely defined modules.

There seems to be strong evidence that self control, conscious attention

focusing and deliberate thought draw on the same pool of energy resources.

Exhausting the available resources on one of these activities can be expected to lead to poor performance on the others - for example, an intense effort to stay focused for a longer period can lead to poor self control right after

(being more likely to choose the "sinful chocolate cake" over the "virtuous

fruit salad"). Also there seems to be some correlation between cognitive

abilities and capacity for self control.


Doing something demanding that you don't enjoy depletes resources twice

as fast - once to do the task at hand, and again to keep yourself focused.

There is a good word for when you are low on this energy - "ego depletion".

This doesn't have to happen if you love what you are doing and achieve a

state of flow (I am beginning to think all psychology books published recently mention Mr. Csikszentmihalyi - this is the fifth time I have run into him in a couple of months - this time with a different suggestion for the pronunciation of his name - used to be "cheeks sent me high", now it is "six cent me high").

There is strong evidence that pupil dilation is a good indicator of the rate at

which mental energy is being used, much like the electricity meter we have

at home. Pupil contraction is also a good indicator that a solution has been

reached or the subject is just about to give up.

Some mental tasks, like remembering long sequences, are difficult because

the material stored in working memory decays very fast. One has to keep

repeating what is where (similar to playing blind chess).

When solving problems, System 1 (which can never be switched off) comes up with some quick suggestions, if it can find any, and System 2 reviews

them. Our beliefs usually start up as "suggestions" from System 1, which

System 2 will later adopt. System 2 is lazy by design, and sometimes the

review it performs is quite superficial, with differences from person to

person. "Engaged" is a good word for people with a a more active System 2.

This is not so much a function of education, as shown by a neat example: "A bat and a ball cost $1.10. The bat costs $1 more than the ball. How much does the ball cost?"

Apparently more than 50% of Harvard and MIT students get it wrong. And

even more at regular universities. System 1 quickly comes up with a very appealing suggestion, and System 2 is easily persuaded to approve it. I can

recall some instances of this from my own experience with financial

derivatives, but will mention them in another post.
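
For reference, the algebra behind the puzzle (working in cents to keep it exact):

```python
# Work in cents to keep the arithmetic exact.
total, difference = 110, 100   # bat + ball = 110 cents; bat - ball = 100 cents

# The intuitive answer: ball = 10 cents. Then the bat is 110 cents and the
# pair costs 120 cents, which contradicts the stated total of 110.
print(10 + (10 + difference))          # 120

# Solving ball + (ball + difference) = total gives ball = 5 cents.
ball = (total - difference) // 2
print(ball, ball + difference)         # 5 and 105, which do sum to 110
```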


Ideas are, at an unconscious level, strongly associated with other ideas.

Words evoke memories, those memories evoke emotions, and emotions will

change our facial expressions, etc., all very quickly, as some kind of ripple

effect inside our brains. Scientists think of ideas as nodes in a network called

"associative memory". Which introduces the next topic - priming - being

exposed to certain information, messages, images etc. will influence your

subsequent behavior without you being aware of it. This has strong cultural

implications. The concept of money has been shown to prime individualism.

Reminding people of their mortality seems to increase the appeal of

authoritarian ideas.

There is a state we can call "cognitive ease"- when System 1 is running in a

"business as usual" mode, like when doing things we are very familiar with

that require little conscious attention focus. Its opposite is "cognitive strain",

and that's when System 1 encounters something unexpected, and calls in

System 2 for help. As strain increases, you get less superficial, more

suspicious, but also less intuitive and less creative. This seems to happen

regardless of the cause of the cognitive strain.

At an unconscious level, "cognitive ease" comes from positive feelings

associated with the object of our attention. Cognitive ease is also associated

with the idea of validity. We sometimes think of statements as true because

we perceive them as natural and familiar. System 1 does not alert System 2

that anything unusual has happened, and System 2 will endorse the view.

Prolonged exposure to anything leads to familiarity. There is something

called "mere exposure effect", common to most animals. Organisms react

with caution to new stimuli, but exposure to them with nothing bad happening afterwards gradually leads from cognitive strain to familiarity and

cognitive ease. That stimulus becomes a safety signal, after a while. It can

turn into liking. In the case of humans, it applies to ideas, images, words,

people etc. Hearing the same thing many times around us makes System 1

perceive its familiarity, and System 2 tends to rely on that familiarity for a

true/false judgement. "Familiarity isn't easily distinguished from truth."


Changing conditions from cognitive ease to cognitive strain (strain can be

induced in many ways) will lead to predictable changes in assessment. The

reverse is also true - "anything that makes it easier for the associative

machine to run smoothly will also bias beliefs." "You can enlist cognitive ease

to work in your favor" using lots of tricks. Mood affects System 1. It seems

good mood, creativity, gullibility, increased reliance on System 1 belong

together. Sadness, vigilance, suspicion, analytic approach, increased effort

also go together.

Now more details about System 1:

- maintains and updates all the time a model of your personal world
- the model is our benchmark for "normality"; it can sense items / words / actions that are out of context or out of the norm
- if they are close enough to the model norm, they go unnoticed - see the Moses illusion (and then replace with George W :D)
- a good word for normality here is "associative coherence"
- violations of normality are detected with astonishing speed
- our models of our personal worlds share a lot of items / categories, and we can communicate easily by referring to them
- System 1 automatically looks for causal connections; big events are destined to be the explanation of subsequent events (see market commentary)
- we have a need for coherence
- System 1 also perceives intention and emotion
- it can also understand language
- System 1 perceives the world of objects as distinct from the world of minds -> Paul Bloom speculates this could be the reason for the near universality of religion
- System 1 needs little repetition for an experience to feel normal (maybe related to the law of small numbers)
- its only measure of success is the coherence of the narrative it manages to generate; it assumes the information available is complete. WYSIATI - what you see is all there is (have been looking for this one)


- ambiguous or missing information is automatically adjusted by something similar to "interpolation", without us ever being aware of the existence of the ambiguity
- System 1 is incapable of conscious doubt
- a small amount of information is always easier to fit into a coherent pattern

Understanding a statement starts with an unconscious attempt to believe it,

which is equivalent to constructing the best possible interpretation of the

situation. System 2 later unbelieves it, if anything looks suspicious. Keeping

system 2 busy with other tasks will impair its ability to perform this function.

System 1 is gullible, System 2 is sometimes busy and always lazy.

The halo effect is filling in missing information in a way that seems consistent with the initial impression. Our assessments of anything are "path dependent" - order matters. We try to create consistency with our first impressions and avoid the disturbing feeling of inconsistency = "suppress ambiguity".

High confidence in a decision can be a sign of a bad decision, based on a spurious impression of coherence. For example, listening to only one side in a trial will lead to high confidence in a biased decision.

It is better to do things in a way that decorrelates errors. Opinions spoken

early and assertively influence others.


Related biases: overconfidence, base rate neglect, framing effects

System 1 instantly evaluates the faces of strangers on two scales: 1)

dominance and 2) trustworthiness. Researchers managed to predict the outcomes of elections by showing pictures of the candidates to test subjects who didn't know them. The feature most correlated with the choice of candidate was "competence" - approximated by a combination of dominance and trustworthiness. Uninformed voters who watch a lot of TV were more likely to

make up their mind this way.

When dealing with categories, System 1 is very good with averages and very

poor with sums. It can pick the "representative" items very quickly. Test

subjects were asked how much they would donate to save birds after the Exxon Valdez incident. Some were told that there were 2,000 birds' lives to be saved, others 200,000 birds. The average contribution did not change between these cases. Participants responded to the emotional intensity of the idea of one bird dying and neglected the number. Intensity of feelings can

be converted to many other scales, including money.

In everyday life, instances when no answer comes to mind are very rare. The

normal state is that we have intuitive feelings about anything and

everything. To achieve that when faced with complex issues or questions, we

unconsciously substitute the actual questions with related easier ones. The

author calls this "substitution".

Example of substitution: Actual question - called target question - how happy

are you with your life these days? Simplified question - what is my mood

right now?

Another example - target - how should the financial advisers who prey on the elderly be punished? Heuristic - how much anger do I feel when I think of

financial predators?

Answers to the heuristic question need to be fitted to the original question.

This is done by "intensity matching". Feelings and dollars are both on

intensity scales.


Our emotional attitude to a lot of stuff (nuclear power, motorcycles,

bankers :D) drives our beliefs about the benefits and risks. Then System 2

becomes an apologist for System 1. It looks for information that is consistent

with existing beliefs, and doesn't have the intention to examine them.

Chapter 2 - Heuristics and biases

Goes through a lot of known biases one at a time. Roughly the same stuff

covered "Fooled by randomness".

One is the law of small numbers, seen as part of a larger story - we are

prone to exaggerate the consistency and coherence of what we see. It is a

lot harder to sustain doubt. System 1 is quick to produce a rich

representation of reality from scraps of evidence, and that reality will

make too much sense. The idea of pure randomness is very difficult to

accept - people saw a pattern in the areas of London bombed in WWII, and

suspected that spies were living in the areas that had not been damaged.

We pay more attention to the content of the message than to the

information about its reliability (I guess we are quick to check that if we

don't like the content).

Anchoring effect - both System 1 and System 2 are involved. A random reference is perceived to be wrong, the direction of the necessary adjustment

is guessed right, but the adjustment process ends prematurely (System 2).

Adjustment stops at the near edge of the uncertain region. If you start from a

much lower reference, you will hit the other edge of the uncertain region.

System 1 also involved - compatible memories are activated automatically -

priming.

Both experts and common people are subject to it. The only difference is that

experts don't admit it. The anchoring effect can be plotted on a graph (provided you have enough experiments): the estimate appears as some function f(x) of the anchor value x.
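
A hedged sketch of how such a graph might be summarized numerically - the data points are invented, and the slope-as-anchoring-index is my reading of the idea, not a procedure taken from the book:

```python
# Hypothetical average estimates of some quantity after exposure to different anchors.
anchors   = [10, 25, 50, 75, 100]
estimates = [22, 30, 41, 55, 63]

# Fit estimate = a + b * anchor with an ordinary least-squares slope.
n = len(anchors)
mean_x = sum(anchors) / n
mean_y = sum(estimates) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(anchors, estimates))
         / sum((x - mean_x) ** 2 for x in anchors))

# A slope of 0 would mean the anchor has no pull; 1 would mean estimates move
# one-for-one with the anchor. Values in between measure the strength of anchoring.
print(round(slope, 2))  # ~0.46 for these invented numbers
```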


Impact of anchoring in our daily lives - caps and floors act as anchors.

Availability heuristic = judging frequency by the ease with which things

come to mind. How many instances do we need to retrieve from memory to

get a feel for how easy it is? None. The sense of ease comes immediately,

somehow.

Asking people to list examples can influence their assessment. They can be

surprised by the difficulty with which more than a few instances come to

mind and reverse their initial guess. The initially perceived "availability" now

becomes "unexplained unavailability". If you start by offering subjects

plausible ways to explain unavailability, you will make it less surprising, so

the effect is reversed again. System 2 resets expectations of System 1 on

the fly. People are less confident in their choices when they are asked to produce more arguments to support them.

In everyday life - media coverage, availability cascades. Ease with which

ideas of risk come to mind is strongly associated with the intensity of

emotional reactions to those risks.

Affect heuristic - people make decisions by consulting their emotions. It is

an instance of substitution - an easier question (how do I feel about it) is

substituted for a more difficult one (what do I think about it).

There is a strong negative correlation between our perceived benefits of

something, and our perceived risks of that same something. You would

expect higher risks to go with higher returns, like in finance. We actually

start with one emotional attitude towards that thing and transpose it on two

other scales - expected benefits and expected risks. We need to keep

"consistent affect" in our "associative coherence".

The emotional tail wags the rational dog (Jonathan Haidt).

Our inner world is much tidier than reality.


The importance of an idea is judged by the fluency and emotional charge with which it comes to mind.

Availability + affect can cause availability cascades. We have a basic

limitation to handle small risks. We either ignore them completely or blow

them out of proportion.

Representativeness heuristic - we judge probability and frequency by

how much the item in question resembles our stereotypical image of it. A

nerdy looking guy is more likely to be considered an IT student even in a

university where only 5% study IT (I made this up based on other examples

in the book). The "base rate" of 5% is usually completely ignored, and the

judgement is based only on the description of the guy. Again substitution -

representativeness instead of frequency and probability.
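
To see why ignoring the 5% base rate matters, here is a hedged Bayesian sketch in the same spirit as the made-up example above (all numbers, including the "nerdy-looking" likelihoods, are invented):

```python
# Invented numbers: 5% of students study IT; suppose 60% of IT students look
# "nerdy" versus 10% of everyone else.
base_rate   = 0.05
p_nerdy_it  = 0.60
p_nerdy_not = 0.10

# Bayes' rule: P(IT | nerdy) = P(nerdy | IT) * P(IT) / P(nerdy)
p_nerdy = p_nerdy_it * base_rate + p_nerdy_not * (1 - base_rate)
p_it_given_nerdy = p_nerdy_it * base_rate / p_nerdy
print(round(p_it_given_nerdy, 2))  # ~0.24: still far more likely NOT an IT student
```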

Probability is a difficult concept. Although only vaguely defined in everyday

speech, people don't notice the ambiguity around it. Need to come back to

this.

Another example - Linda - the probability of a subcategory is judged as greater than that of the whole category it belongs to - because the subcategory is seen as

closer to the stereotype.
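
A tiny sketch of the conjunction rule those answers violate, using an invented toy population:

```python
# Invented toy population of 1,000 people.
bank_tellers          = 100
feminist_bank_tellers = 30   # necessarily a subset of the bank tellers

p_teller          = bank_tellers / 1000
p_feminist_teller = feminist_bank_tellers / 1000

# The conjunction can never be more probable than either of its parts.
assert p_feminist_teller <= p_teller
print(p_teller, p_feminist_teller)  # 0.1 0.03
```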

There are ways to reduce the bias. We are sensitive to the way in which

questions are worded, although the algebraic meaning is the same. For example, "how many of the 100 people" comes across differently from "what

percentage (of a general population)".

Also, data can be presented in a way that already offers some causal

interpretation (or begs for one), or it can be shown as pure statistical data with no apparent meaning. We will be more sensitive to the causal - System 1.

Results that go against our intuition are quickly "explained away".

Students "quietly exempt themselves" from the conclusions of experiments

that surprise them. (some experiment where few people offered to help,


knowing that many others are around the person asking for help). "Subjects' unwillingness to deduce the particular from the general was matched only by

their willingness to infer the general from the particular".

Regression to the mean - I find the part from here to the end of the

chapter is a bit obscure, and will have to place the ideas under "quarantine"

for now. The idea that luck plays a bigger role in our lives than we want to

think is taken further, reaching some unusual conclusions. Those conclusions

are argued with the help of inadequate statistical language - something like

"statistics 101 for people who hated math all their lives and can't add". He

refers to what could accurately be described by terms like bivariate distributions, stochastic processes, correlations, confidence intervals, first and second moments, etc., but avoids these concepts, and this makes the

whole argument quite ambiguous. 

We make biased guesses, and the author suggests a rational way to make

them less biased - 1. get the base rate estimate 2. adjust for your intuition

while accounting for "correlation" !?... Isn't there supposed to be a beta somewhere, r squared and all? I have my suspicions about other stuff.

Thinking, Fast and Slow, Daniel Kahneman. Summary 2


Chapter 3 - Overconfidence

Again the "fooled by randomness" stuff, explained in a bit more detail.

Hindsight bias - "The core of the illusion is that we believe that we

understand the past, which implies that the future should also be knowable,

but we understand the past less than we believe we do".

Recent events alter not only our view of the future, but also our view of the

past. If two teams with comparable track record play against each other, and

one is the clear winner, in our "revised model of the world", the losing team's

track record to that point will be a lot less impressive than it was before the

match. "A general limitation of the human mind is its imperfect ability to

reconstruct past states of knowledge, or beliefs that have changed". "Once

you adopt one view of the world (or any part of it), you immediately lose

much of your ability to recall what you used to believe before your mind

changed. Asked to reconstruct their former beliefs, people retrieve their

current ones instead - an instance of substitution - and many cannot believe

that they ever felt differently". We can call it the "I knew it all along" effect. If an event occurs, people exaggerate the probability that they had assigned to it earlier.

This impacts the way people make decisions. Somebody who expects to

have his decisions scrutinized later will favor those that are the most

defensible, to protect himself against the subsequent hindsight bias of

reviewers, in case something bad happens. The author lists physicians as an

example, but it is true for bankers too.

"Subjective confidence in a judgement is not a reasoned evaluation of the

probability that this judgement is correct. Confidence is a feeling, which

reflects the coherence of the information and the cognitive ease of

processing it. It is wise to take admissions of uncertainty seriously, but

declarations of high confidence mainly tell you that an individual has

constructed a coherent story in his mind, not necessarily that the story is

true"


Now he goes on to talk about the world of finance. Again "fooled by

randomness", "black swan" stuff. While I agree to the general idea, I think it

has been taken too far. Professional investors and advisers do no better than

dart throwing chimps, and careful scrutiny of financial statements and other

information is useless. I wonder whether the author considered what the

world would look like if everybody completely ignored financial information

and switched to dart throwing chimps for stock picking. I don't think the

world would look the same, and would definitely not be better. It is easy to

take the "efficient market hypothesis" for granted, as if some law of nature,

like gravity, automatically makes markets efficient. I think there is no such

law. If markets are close to being efficient, they are so precisely because of

people who think markets are not efficient, and actively look for

inefficiencies and trade accordingly. Those who believe that markets are

already efficient have a negative contribution to that efficiency.

Stock picking skill is super-over-rated in the culture of financial industry, and

it's extremely difficult to tell skill from luck, no matter which method you use.

But the "randomness" idea is also oversold, in some circles. In its extreme

form, it says that there is no such thing as stock picking skill. It replaces one

certainty with another - or one coherent narrative with another coherent

narrative :) Luckily, the author explains the biases that he himself falls prey

to elsewhere in the book :D

Hedgehogs and foxes - an interesting way to look at experts. I have seen it before, but it is presented differently here, as far as I can remember. The

hedgehog is in love with one theory that explains everything and completely

defensive about anything else.  Foxes have more theories and are less

confident about each. Hedgehogs are by far more fun to see on TV.

The author moves on to compare experts against simple statistics. He

crosses the line from "expert predictions are no better than simple algorithms" to "expert predictions are clearly worse than simple algorithms". He lists the example of his economist friend, a wine lover, who found a clever

formula to predict the price of fine wines:


"His formula provides accurate price forecasts years and even decades

into the future. Indeed, his formula forecasts future prices much more

accurately than the current prices of young wines do". This guy's formula "is

extremely accurate - the correlation between his predictions and actual

prices is above 0.9"...  90% correlation... decades into the future... Daniel, for

God's sake, read your own god-damn book man :( Particularly this current

chapter 3, called... "overconfidence". You are implying that your friend is a

genius and everybody else who takes an interest in the market for fine wines is a complete retard. And, with 90% accuracy, he must also be the richest, most successful wine trader in history.

The rest of the chapter goes on about the contest between experts and

algorithms. Human experts can have valid skills, depending on the general

predictability of the domain in question (with chess masters and firefighters at one end of the scale, in "high validity environments", and stock pickers and political analysts at the other, in "low validity environments"). If there is enough feedback, and if it comes soon enough, a lot can be learned from

experience - works well for drivers, not so well for psychotherapists. It seems

experts don't compare options - they go straight for the first guess, and only

if that one doesn't feel right, they look for another. It's pattern recognition

after hours of training. System 1 can recognize the cues, but System 2 can't

put it into words. This is just "the norm of mental life" for everybody, not just

experts.

But the boundaries of expert skills are not obvious, and certainly not obvious

to the experts themselves. The author prefers simple formulas to expert

advice, because the formula always gives the same result for the same

inputs (unlike humans), doesn't leave out evidence etc. It just leaves one

small question - which simple formula? Not sure why such a basic point needed to be made at all. It's pretty obvious to every grown-up that in most fields some checklist or procedure is preferable to having an expert make predictions based on nothing but his gut feel.

The author introduces the "inside view" and "outside view". A team or

company tends to predict the outcome of whatever it is they are engaged in


by giving weight only to information to do with their activities, skills,

resources, etc. E.g. Hollywood studios try to predict box office success by how good the film and the promotion campaign are; how good the other films released the same weekend are is also very important. People often ignore the statistics for the class they belong to - the "outside view". For example,

most entrepreneurs think their start-up will succeed, ignoring the odds of

doing so as implied from the experiences of others.

A very interesting one is the following - when people are asked to rate

themselves with regard to a specific task, they will actually rate the task

itself :) For example "are you better than average at starting conversations

with strangers". If the task is perceived as difficult => poor rating, and

reverse also true. I remember this from my school days - people were happy

when the exam questions were easy, even though we knew everybody had

the exact same questions and only the top x candidates got accepted.

Being overconfident is a social necessity, particularly for people in

management positions. In scientific research, optimism is essential. "I have

yet to meet a successful scientist who lacks the ability to exaggerate the

importance of what he or she is doing".

The last question - can optimism be overcome by training? The author goes

on to say he is skeptical. I am wondering whether we should wish for that to

happen in the first place. If there was a way to do that, Columbus would have

had to postpone his trip for a couple of hundred years. Perhaps before that,

we never would have made it out of Africa. It all depends on what we

consider to be the overall cost of failure, and overall reward for success, for

mankind taken as a whole. The optimism of "unreasonable men" seems to have

helped us out quite a bit.

Chapter 4 - Decisions

This is mainly a presentation of prospect theory. Classic economic theory

assumes all players to be rational. The author considers rational players to

be a different species, which he calls "Econs". This is different from us,

humans - we have a System 1, they don't.


The chapter presents the conclusions of a large number of experiments in

which participants are asked for their choices on lots of hypothetical

gambles. These hypothetical gambles are "to the students of decision

making what the fruit flies are to geneticists".

The difference between utility perceived by Econs and Humans is that Econs

care only about aggregate wealth. Humans care about changes in wealth

with regard to some unspoken reference point. This reference point depends

on context, framing of the consequences of choice to be made etc. To use

financial option language, the utility function seems to be "path dependent".

The same thing can be perceived as a gain or as a loss, depending on each

individual's reference point - the psychological equivalent of the experiment

in which one hand feels same temperature water as warm and another hand

as cold. It is the emotional response to the perceived changes relative to reference points that influences the decision. Reference points can be moved around by the way the question is worded, etc. The reference point is sometimes referred to as the "adaptation level" or "default case".

Loss aversion has evolutionary roots - organisms that treat threats as more urgent than opportunities do better. A loss aversion ratio can be calculated - it seems to be in the range of 1.5 to 2.5 - on average we are willing to take a 50/50 bet if the outcomes are +200/-100, but not worse.
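To make the ratio concrete, here is a minimal sketch using the prospect-theory value function with the parameter values commonly quoted from Kahneman and Tversky's later work (alpha = beta = 0.88, lambda = 2.25); the exact numbers are my own addition for illustration, not something stated in these notes:

```python
# Minimal prospect-theory sketch of loss aversion in a 50/50 gamble.
# Parameters are the commonly cited estimates from Tversky & Kahneman (1992),
# used here purely for illustration.
ALPHA = 0.88   # diminishing sensitivity for gains
BETA = 0.88    # diminishing sensitivity for losses
LAMBDA = 2.25  # loss-aversion coefficient

def value(x: float) -> float:
    """Subjective value of a change x relative to the reference point."""
    return x ** ALPHA if x >= 0 else -LAMBDA * (-x) ** BETA

def fifty_fifty(gain: float, loss: float) -> float:
    """Prospect value of a coin flip: win `gain` or lose `loss`."""
    return 0.5 * value(gain) + 0.5 * value(-loss)

for gain in (100, 150, 200, 250):
    print(f"+{gain}/-100 -> {fifty_fifty(gain, 100):+7.1f}")
# With these parameters the gamble only turns acceptable somewhere around
# +250/-100, i.e. a required gain of roughly 2.5x the loss - the upper end of
# the 1.5-2.5 loss aversion range mentioned above.
```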

Preferences are not constant, they depend on variable reference points. For

example the minimum price for which we are willing to sell something we

own is much higher than the price we are willing to pay for getting the same

thing when we don't already own it. This is called the "endowment effect". Suggested explanation - in one case we have the pain of giving up something we own, in the other the pleasure of getting something new. They are not symmetrical, ultimately because of loss aversion. This also shows that emotional attachment to many things we own is a lot higher than to the money we have. Money is perceived as a means of exchange (quite rightly, I would say). The endowment effect doesn't apply to professional traders, for the same reason as the money example - the emotional attachment doesn't exist. It's the difference between "items held for use" and "items held for exchange" (I bought a pair of Levi's jeans two weeks ago - and traded in my old jeans for 50 dollars - I kind of regretted the decision, although it was a good deal). There are "frames" in which people can be persuaded to be less influenced by the endowment effect.

For an Econ, the buying price should be irrelevant with regard to the

subsequent decision to sell. Not the case for Humans (applies to stocks,

houses, anything).

"Being poor, in prospect theory, is living below one's reference point".

Neurologists proved that the brain reacts to a perceived threat way before

we become conscious of it - via some super fast neural channel. There is a

mechanism designed to give priority to bad news.  There is no equivalent

system for processing good news. Loss aversion is part of a bigger negativity

bias. Bad impressions are quicker to form and more resistant to

disconfirmation than good ones. Success in a marital relationship seems to depend much more on avoiding the negative than on seeking the positive. One expert (John Gottman) even worked out a ratio of good to bad experiences of 5:1.

Pleasure has the biological role of indicating the direction of a significant

improvement of circumstances.

Risk aversion and conservatism are the gravitational force that holds our

lives together near the reference point.

Moral judgements are also reference-dependent. And we are not designed to

reward generosity as reliably as we punish meanness (asymmetry again).

Example given by author : "This reform will not pass. Those who stand to

lose will fight harder than those who gain" - nearly identical to a quote from

"The Prince".

Possibility effect - the perceived change of likelihood from 0% to 5% is much

larger than from 5% to 10%. We overweight small probabilities. This makes both gambles and insurance policies more attractive than they should be - the price of peace of mind, or the price of the hope and dreams of winning big - in both cases the reward is immediate. Decision weights are not identical to probabilities. I don't particularly like this probability talk - I find the idea of known probabilities of outcomes to be an artificial construction of the modern world. You can only find them in casinos and lotteries, not in real life.
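For the "decision weights are not probabilities" point, here is a small sketch of the kind of probability weighting curve used in prospect theory; the one-parameter form and the gamma value of about 0.61 come from Tversky and Kahneman's 1992 paper and are my own addition here, not something the summary quotes:

```python
# Probability weighting: small probabilities get overweighted, large ones
# underweighted. One-parameter form from cumulative prospect theory.
GAMMA = 0.61  # illustrative estimate for gains

def weight(p: float, gamma: float = GAMMA) -> float:
    if p in (0.0, 1.0):          # the endpoints stay fixed
        return p
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

for p in (0.0, 0.01, 0.05, 0.10, 0.50, 0.90, 0.95, 0.99, 1.0):
    print(f"p = {p:4.2f} -> decision weight ~ {weight(p):.3f}")
# The jump from 0 to 5% earns far more decision weight (about 0.13) than the
# jump from 5% to 10% (only about 0.05 more) - the possibility effect above.
```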

Value is derived from perceived gains and losses, not from states of wealth.

Planning fallacy - people come up with "best case scenarios" when asked for

fair estimates. It is easy to overweight the plan itself and to neglect the unknown reasons for failure.

"Measurable reactions like hart rate show that the fear of an impending

electric shock was essentially uncorrelated with the probability of receiving

the shock". "Affect laden imagery" overwhelmed the response to probability

and can be shown to disrupt calculation. Representation of outcome is more

important than probability.

"denominator neglect fallacy" - 0.1% is smaller than 1 in ten, which is

smaller than 10 in 100. This is present when each case is taken individually.

When compared, the effect disappears. Without comparison, it takes an

exceptionally active System 2.

Choices from experience are very different from choices from description.

Broad framing usually leads to sounder decisions than narrow framing. We

are by nature narrow framers. "We have neither the inclination nor the

mental resources to enforce consistency of our preferences". The "outside

view" is a broad frame.

"Money is a proxy for points on a scale of self-regard and achievement".

We keep separate "mental accounts" for each decision we make - these are

a form of narrow framing. (Sometimes I find myself wanting my last trade to work well, even when it is only a partial reduction of an existing position - meaning I would lose money overall if the last trade made money - not particularly consistent.) A massive preference for selling winners has been documented - if you NEED to sell something, that is.

"Regret is one of the counterfactual emotions that are triggered by the

availability of alternatives to reality".

Acting and failing to act are perceived very differently. Emotional reactions to the results of action are stronger than to the results of inaction. Blackjack example - the asymmetry of regret for bad outcomes after "hit" vs "stand".

It is departure from the default that produces regret. (My boss was once upset with me for having hedged a position which we hadn't intended to keep, and that would have made money if left open - his default was the open position.)

Efficient risk management is sometimes at odds with intensely loss-averse moral judgements. Spending too much money on insurance for some risks leaves fewer resources available to cover other risks which are not so present in people's minds. System 1 risk management actually minimizes the expectation of regret. And people seem to anticipate more regret than they

will actually face - they underestimate the power of the "psychological

immune system" (Daniel Gilbert).

Change of framing can reverse judgement on choices that have identical

outcomes - this is the clearest proof of irrationality to me. For example, a gamble with a high-probability small loss and a low-probability big gain is declined, but buying a lottery ticket with identical terms is accepted - the difference between the mental representations of a "loss" vs a "cost".

Evaluability hypothesis (Hsee) - info without a term of comparison is

underweighted.

"Italy won" and "Frnace lost" have identiacl meanings to Econs, but not to

Humans. They trigger different mental representations. They are not

emotionally equivalent.


"The credit card lobby pushed for making differential pricing illegal, but it

had a fallback position: the difference, if allowed, would be labelled a cash

discount, not a credit surcharge." It's easier to forgo a discount than to pay a

surcharge. The same can be shown for some moral judgements.

Some people are more rational than others, and there is evidence of this in

brain scans. Less rational choices seem to be associated with the amygdala,

suspected to be involved heavily in System 1 stuff.

Important stuff about framing - it is not so much that framing distorts our true preferences, as that our preferences themselves are about frames.

Loss of cash is charged to a different mental account than loss of objects

that can be bought with the same amount of cash.

Opting in vs opting out has a huge impact on organ donations.


The term "utility" has two different meanings. One corresponds to Bentham's

definition - the amount of pleasure or pain experienced. Another one, used

by economic theory, is some kind of "wantability". These two correspond to

what the author calls "experienced utility" and "remembered utility".

Experienced utility is some kind of integral of pain and pleasure dt (so to

speak), whereas remembered utility is not proportional to the integral. The

distinction is thus made between the experiencing self and remembering

self. One derives pleasure from the experience, the other from the memory

of the experience. One answers the question "does it hurt now?", the other

"how was it on the whole?".

Inconsistency between actual experience and remembered experience can

be demonstrated. Experiments show that sometimes people choose the

worse experience, based on the fact that they retain a better memory of that

experience compared to a less painful one. The duration of the pain is not so important in the memory of it - we retain only a few slices of time. The memory is tied mostly to the peak and the end of the experience. A sudden end at high pain leaves a worse memory than a gradual decrease in pain over a longer period.
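A minimal sketch of that asymmetry between the experiencing self and the remembering self, with invented pain ratings loosely modelled on the cold-hand experiment (the numbers and the "peak plus end, divided by two" score are my own illustration, not figures from the book):

```python
# Pain ratings sampled every 10 seconds (0 = none, 10 = worst). Invented data.
short_episode = [7, 7, 7, 7, 7, 7]             # 60s of strong pain, abrupt end
long_episode = [7, 7, 7, 7, 7, 7, 5, 4, 3]     # same 60s plus 30s of easing pain

def total_pain(ratings):
    """What the experiencing self accumulates: the 'integral' of pain over time."""
    return sum(ratings)

def peak_end(ratings):
    """Rough remembered-pain score: average of the worst moment and the last one."""
    return (max(ratings) + ratings[-1]) / 2

for name, ep in [("short", short_episode), ("long", long_episode)]:
    print(f"{name:5s}: total pain = {total_pain(ep):2d}, peak-end memory = {peak_end(ep):.1f}")
# The long episode contains strictly more pain (54 vs 42) yet scores better on
# the peak-end measure (5.0 vs 7.0), so it is remembered as the less bad one.
```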

People confuse the experience itself with the memory of it, which seems to

be a strong cognitive bias - it's substitution again.

Experiencing self doesn't have a voice. We identify ourselves with the

remembering self, and the experiencing self is some kind of stranger (I

agree...). The remembering self works by composing stories and keeping

them for future reference. "Caring for people often takes the form of caring

for the quality of their stories, not for their feelings". "Tourism is about

helping people to construct stories and collect memories". "Ed Diener and his

team provided evidence that it is the remembering self that chooses

vacations" - I would guess the remembering self chooses a lot, not just that.

Something called the U-index (experienced unhappiness) can and has been

surveyed for large samples. Normally we draw pleasure and pain from what

is happening that moment, but there are exceptions - somebody who is in

love can be happily stuck in traffic. Similar surveys can be done for the remembering self at the same time. It looks like some aspects of life influence the overall evaluation of the quality of life more than they influence experienced well-being.

Educational attainment for instance, is associated with higher evaluation of

one's life, but not with greater experienced well being. Ill health has a much

stronger adverse effect on experienced well-being than on life evaluation.

Religion doesn't seem to provide a reduction of feelings of depression or

worry, based on these studies.  Money beyond a certain point does not

increase experienced well being, despite the fact that money permits the

purchase of many pleasures. Higher income does seem to bring higher life

satisfaction. The importance that people attach to money at age 18 is a good

predictor of the satisfaction with their income as adults.

The decision to marry reflects for many people a massive error of "affective

forecasting". Experienced well being is on average unaffected by marriage -

some things change for the better, some for the worse.

Life satisfaction is influenced to some degree by recent events. In general it

is influenced by a small sample of highly available ideas. Both experienced

happiness and life satisfaction seem to be heavily influenced by the genes

(shown by experiments with twins separated at birth). People who appear

equally fortunate vary greatly in how happy they are.

Exclusive focus on happiness as experienced well being is wrong, in the

author's view. I would agree - people spend a lot of time with their memories

as well. I guess that the same experiencing self can focus on memories or

outside events.

"Any aspect of life to which attention is directed will loom large in a global

evaluation". This is the "focusing illusion". "Nothing in live is as important as

you think it is when you are thinking about it" :)

When you are asked to rate how much you enjoy your car, you answer a

simpler question - "how much pleasure do you get from your car when you

are thinking about it". Most of the time, even when driving, you think about

something else though.


Any aspect of life is likely to be more salient if a contrasting alternative is

highly available.

There is no adaptation to chronic pain, constant exposure to loud noises and

severe depression. They constantly attract attention. Adaptation works by

gradually switching focus on something else. We can adapt to most of the

long term circumstances of life though. For paraplegics, life satisfaction is

much more affected than experienced well-being.

"Miswanting" is a term for the bad choices that arise from errors of "affective

forecasting" (I think I have a long list of those). The focusing illusion is a rich

source of miswanting. Consider the difference in the impact on happiness of buying a new car vs joining a club: social interaction will keep attracting your attention; the car won't.

Author's Conclusions

The author doesn't claim that people are "irrational" - a term that conjures

"impulsivity, emotionality and a stubborn resistance to reasonable

argument". Just that the rational agent model is not a good description. This

has some implications way beyond psychology, into public policy. Extreme libertarian views (the Chicago school of economics) claim that people are rational, and thus that it is "unnecessary and even immoral to protect people against their choices".

An interesting example of excessively rational interpretation of human behavior: the obesity epidemic could be explained by people's belief that a cure for it will soon be available :) The same thing happens sometimes in financial markets

- people enforce some weird implied rational view on behaviors that

contradict each other - like bonds going up at the same time as gold - the

"printing money"  and "inflation fears" story can't explain both.

"For behavioral economists freedom has a cost, borne by individuals who

made bad choices, and by a society that feels obligated to help them". For

the Chicago guys, freedom is free of charge. The author is in favor of some libertarian paternalism - Thaler and Sunstein, in a book called Nudge - in which, without restricting freedom, the available choices are framed in such a way as to favor the better choice. For example, opting out of health insurance rather than opting in. Or making regulations that prevent companies from exploiting the psychological weaknesses of Humans by pretending they are Econs (like hiding important information in small print). I actually agree with this, and have had arguments with extreme libertarians, whom I consider a crazy bunch :D

It's hard to keep ourselves from falling prey to all these cognitive biases, but

we can spot others when they do :) and that can improve decision making in

an organization (which is a factory for making decisions). We can also learn to recognize "cognitive minefields" and call in reinforcements from System 2.


http://cataligninnovation.blogspot.in/2012/02/thinking-fast-slow-landmark-

book-in.html

Thinking, fast & slow: A landmark book in intuitive thinking & decision

making

"What is E=mc2 of this century?", our son Kabir asked me a few days after

we watched the NOVA film "Einstein's big idea" together sometime last year.

I thought about it for a few moments and said, "Psychology is the physics of

the 21st century, Kahneman is the Einstein and his work on happiness is like

E=mc2". My answer and speed with which it came surprised me more than it

surprised Kabir. I was clearly under the spell of Kahneman. Who is this

Kahneman? And what is his work about? If you want to know, your best bet is

to read Nobel Laureate Daniel Kahneman's bestselling book "Thinking, fast

and slow". It has come out a couple of months back in India. It is a landmark

among the psychology literature intended for layman, more specifically

about intuitive thinking and decision making under uncertainty. It covers a

vast spectrum. Let me articulate a few nuggets of wisdom from it.


Answering an easier question: When I look back at my answer to Kabir's

question, it is clear that the answer was given off-the-cuff. In fact, I was in no

position to answer the original question because I am clueless about most of

the recent breakthrough developments in various sciences including those in

psychology. Then what happened here? Kahneman explains it in chapter 9

titled "Answering an easier question". Whenever we are asked a question

whose answer we don't know, the fast thinking mode answers an easier

question. And unless the slow thinking mode checks & rejects the answer, it

is given out like it happened with me. In my case the easier question that got

answered was, "Among the scientific results I know, which one appeals to me

the most?" Kahneman calls such shortcuts "heuristics" and we use them all

the time. They mostly work, and occasionally trick us badly. When we are

asked, "How happy are you with life these days?" we tend to answer the

easier question, "What is my mood right now?"

Expert intuition: when can we trust it? I wrote an article last month

about the mathematician Srinivasa Ramanujan and the marvels of his expert intuition. The article also highlighted how that intuition occasionally tricked Ramanujan, creating incorrect results. In chapter 22, which is titled "Expert

intuition: when can we trust it?" Kahneman presents the exact criteria under

which experts develop real expertise. You acquire skill when (a) you have an

environment that is sufficiently regular to be predictable and (b) you have an

opportunity to learn these regularities through prolonged practice. In some

cases like that of stock analysts, the environment isn't sufficiently regular.

On the other hand, chess players and basketball players do develop real

skills.


Illusion of understanding: Kodak filed for bankruptcy protection last

month. All the related stories published in the Economist, the Wall Street Journal, Wharton School of Business, and an HBR blog had one thing in common: none of them contained the phrase "bad luck". Most of them cited some story or other related to poor management decision making, the most common being Kodak's invention of the digital camera and how Kodak didn't pursue its development, etc. Implicit in these articles was an assumption that bad management decisions cause bad outcomes like bankruptcy and that randomness or bad luck plays an insignificant role. Kahneman calls these generalizations, where randomness is substituted by causation, "illusions of understanding" (chapter 19). The illusion stems from 3 things: (1) We believe that we understand the past. However, in reality, we know a story of the past constructed from the very limited information easily accessible to us. (2) We

can't reconstruct the uncertainty associated with the past events. It leads to

things like the HBR blog suggesting Kodak should have created a photo

sharing social network. (3) We have a strong tendency to see causation

where there is none. Many times a single story such as Kodak's digital

camera story is good enough for us to label Kodak management as the root

cause of its debacle. I don't know how much of the strategy literature suffers from the "illusion of understanding".

Kahneman's law of happiness: Ask yourself "How special is your car to

you?" and then ask yourself "How much do you enjoy the commute?"

Psychologists have found a decent correlation (0.35) between the first

question and the blue book value of the car. And, they have found zero

correlation between the second question and the blue book value of the car.

Such observations have led Kahneman to formulate the following law

- Nothing in life is as important as you think it is when you are thinking

about it. Your car's importance to you is exaggerated only when you think

about it. Unfortunately, you spend significantly more time driving it than

thinking about it.


http://richarddeaves.com/behavioralfinanceresearch/?p=144

Dual process theory and behavioral bias

By admin

Thursday, March 29, 2012

From time to time I will be posting book reviews.  Here is the first.  It is

forthcoming in the Journal of Economic Psychology.

Thinking, Fast and Slow, Daniel Kahneman, Farrar, Straus and Giroux, New

York (2011).  499 pp., $30.00, ISBN: 978-0-374-27563-1.

In 2002 Daniel Kahneman was awarded the Nobel Prize in Economic Sciences

for his research in “decision-making under uncertainty, where he has

demonstrated how human decisions may systematically depart from those

predicted by standard economic theory…[and for documenting that] human

judgment may take heuristic shortcuts that systematically depart from basic

principles of probability” (Royal Swedish Academy of Sciences 2002).  Much

of this research of course was undertaken in collaboration with the late Amos

Tversky.  In his memoir-like Thinking, Fast and Slow Kahneman reviews

much of this seminal research.  What makes this book far more than just a

retrospective though is Kahneman’s choice to begin with a lengthy

description of dual process theory and then to move forward with a

reinterpretation through this lens of the main body of his research.


Dual process theory is based on the view that the mind operates using two

parallel systems.  These have been termed the Unconscious and the

Conscious (e.g., Wilson 2002), Intuition and Reasoning (e.g., Kahneman

2003), or (after Stanovich and West 2000) System 1 and System 2. 

Kahneman adopts the latter usage and, appealing to a broad readership – the

book is a bestseller – likens them to characters in a movie: “In the unlikely

event of this book being made into a film, System 2 would be a supporting

character who believes herself to be the hero” (p. 31).  System 1 operates

continuously and automatically, without reflection and any sense of

voluntary control, while System 2 is powered by effortful mental activity and

is associated with agency, choice and concentration.  While we like to think

that System 2 (our reasoning) makes all but a few trivial decisions for us,

proponents of dual process theory argue that this is an illusion.  In Strangers

to Ourselves, Timothy Wilson writes that “a better working definition of the

unconscious [i.e., System 1] is mental processes that are inaccessible to

consciousness but that influence judgments, feelings and behavior” (p. 23). 

So clearly dual process theory should be of potential concern to students of

judgment and decision making.


Kahneman devotes the first quarter of the book (chapters 1-9) to dual

process theory.  After introducing the “characters,” he goes on to describe

their roles using a series of entertaining examples based on research. 

System 2 is inherently lazy and is often content to accept answers provided

by System 1.  This is evidenced by priming research.  For example, there is

evidence that money-primed people are more independent, persevere longer

and show a preference for being alone (chapter 4).  System 2, being in

charge of self-control and susceptible to fatigue, is particularly prone to

making poor choices when it is “ego-depleted.”  For example, Israeli judges

are less likely to grant parole when they are tired and hungry (p. 43). 

Kahneman argues that System 1 holds in memory a kind of inner model of

our normal world.  When something untoward occurs, triggering surprise,

System 2 is likely to become engaged.  On the other hand, when in a state of

“cognitive ease,” which is characterized by related experience and good

mood, one is more likely to be gullible and to believe that the familiar is

“good” (Zajonc (1968)).   System 1 is a “machine for jumping to conclusions”

(chapter 7).  It digests the data on hand and quickly comes up with a good

story.  Kahneman calls this WYSIATI: “what you see is all there is.”  Unlike

the scientific method which searches for disconfirming evidence, in looking

for coherence System 1 is subject to confirmation bias.  Halo effects can

result: one observes a positive attribute and then (sometimes

inappropriately) “connects the dots.”  Basic assessments are constantly

being generated by System 1.  When a difficult question requiring the

resources of System 2 is asked, answers to questions arising from these

basic assessments may be substituted for hard questions.  When one is

asked “Are you happy?” the answer given may really be to the question “Are

you currently in a good mood?”  People allow likes and dislikes to determine

beliefs: for example, your political preferences can determine which

arguments you find compelling (chapter 9).


Next in Part 2 (chapters 10-18) Kahneman turns to his work on heuristics and

biases.  This research agenda, to be sure, has attracted its share of critics

(e.g., Hertwig and Ortmann 2005) and some of their points are well-taken. 

For example, overconfidence (in the sense of miscalibration) is often inflated

because researchers (whether unconsciously or otherwise) choose questions

most likely to take in System 1’s tendency to jump to wrong conclusions

(Gigerenzer 1991).  Still this review is not the place to fan these flames. 

Consider some examples of heuristics and biases and how dual process

theory can potentially elucidate their origins.  The tendency to anchor to

available cues, even if they are patently meaningless, documented in Kahneman and Tversky's famous spinning-wheel and African-nations-in-the-U.N. experiment, is argued to have both a System 1 and a System 2 basis

(chapter 11).  System 2 being grounded in rationality knows that it must

move away from the cue in (what is likely to be) the direction of the correct

answer.  But being lazy it does not go far enough, allowing the anchor to

influence the answer.  In line with the notion that those in a state of

cognitive ease tend to be less cognitively vigilant, those nodding/shaking

their heads stay closer to/depart further from the anchor (Epley and Gilovich

2001).  As for System 1, research indicates that its gullibility and tendency to

look for confirming evidence induces it to search memory for related

answers.  Next consider availability: when people are to assess frequency

they search memory for known instances of an event.  If retrieval is easy, it will be guessed that the frequency is large.  Of course, retrieval may be easy

because the frequency is truly large.  On the other hand, it may be the case

that recent, salient occurrences foster easy retrieval.  The substitution of an

easy question (how easy is it to think of occurrences?) for a hard question

(how large is the category?), done by System 1 and endorsed by a lazy

System 2, is argued to be at the root of this bias.  Finally, consider the

tendency to be surprised by regression to the mean.  The tendency of

System 1 to look for a coherent story leads to an exaggeration of

predictability.  Indeed, decomposing randomness from systematic (i.e., story-

based) factors is complex and it is not surprising that those untrained in

statistics have difficulty with the task.  And so System 2 defers to intuitions

supplied by System 1.
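(To make the regression-to-the-mean point concrete, here is a small simulation of my own, not taken from the review or the book: performance is modelled as stable skill plus transient luck, and the top performers of one round fall back toward the average in the next.)

```python
import numpy as np

# Regression to the mean: outcome = stable skill + transient luck.
rng = np.random.default_rng(1)

n = 100_000
skill = rng.normal(0, 1, n)           # persists across rounds
round1 = skill + rng.normal(0, 1, n)  # luck in round 1
round2 = skill + rng.normal(0, 1, n)  # fresh luck in round 2

top = round1 > np.quantile(round1, 0.95)   # the "stars" of round 1
print(f"top 5% average, round 1: {round1[top].mean():.2f}")
print(f"same people,  round 2: {round2[top].mean():.2f}")
# Round-2 scores fall roughly halfway back toward the overall mean of 0,
# with no causal story needed - just luck failing to repeat.
```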


The proclivity to overconfidently believe that the world is more predictable than it really is forms the major theme of Part 3 (chapters 19-24) of the book.  The

illusion of understanding, reflecting our tendency “to constantly fool

ourselves by constructing flimsy accounts of the past and believing they are

true” (p. 199) leads us to erroneously believe that the future is much more

predictable than it really is.  Like Nassim Nicholas Taleb in The Black Swan (2007), Kahneman rails against false prophets who are at their most convincing when

they believe their own patter.  The fact is that many domains are close to

unpredictable: while it is true that the weather has a significant predictable

component, financial, economic and political systems are much less

predictable.  Indeed in unpredictable environments there is evidence that

one does better utilizing simple algorithmic decision rules while eschewing

the forecasts of the experts – and simple is usually better than complex

because of the overfitting problem (chapter 21).  Experts fall short because

they “try to be clever, think outside the box, and consider complex

combinations of features” (p. 224).  This will not be surprising to those who

have investigated the performance of professional forecasters and found it

wanting relative to naïve benchmarks (e.g., Deaves 1994).

One of the delights of Kahneman’s book is that he peppers it with personal

anecdote, sometimes at his own expense.  For example, he tells the story of

a curriculum project to teach judgment and decision making in Israeli high

schools that he headed (chapter 23).  Early on he asked committee members

to opine on how long they thought it would take to complete the project, with

forecasts ranging between one and a half and two and a half years.  One

committee member with experience in such initiatives was next asked what

was typical, replying that of those groups that successfully completed their

tasks (40% didn’t), the completion range was 7-10 years – yet this same

individual, when taking the “inside view,” was as subject to the planning fallacy as

the rest of the team.  Clearly it is advisable to take the “outside view” and be

appropriately influenced by the base rate.   Kahneman confesses that years

later he concluded that he too was at fault (he calls himself “chief dunce and inept leader,” p. 253).  Despite the fact that the project was eventually

finished (eight years later, after Kahneman had left the team), if he had not

been subject to the sunk cost fallacy he would have terminated the initiative

once historical data were made available to him.


In Part 4 (chapters 25-34) Kahneman turns to his famous work on choices

made under uncertainty.  He begins with another anecdote: “Amos handed

me a mimeographed essay by a Swiss economist which said: ‘the agent in

economic theory is rational, selfish, and his tastes do not change’…[but] to a

psychologist it is self-evident that people are neither fully rational nor

completely selfish, and that their tastes are anything but stable…Soon

after…Amos suggested that we make the study of decision making our next

project” (p. 269).  So began Kahneman and Tversky’s research which led to

prospect theory and related issues such as the endowment effect and

framing.  The emotional side of System 1 plays a larger role in this

discussion.  Indeed, there is a growing view – and Kahneman shares it (p. 12)

– that emotion plays a significant role in decision making.  While decision

making has been traditionally viewed as a purely cognitive activity with at

most anticipated emotions (such as regret) being considered, in the “risk-as-

feelings” view of Loewenstein, Weber, Hsee and Welch

(2001), immediate visceral reactions concurrent with the decision itself may

occur and potentially influence the cognitive process.  Importantly,

Kahneman does not equate System 1 with affective assessment.  For one

thing “there is no one part of the brain that either of the systems would call

home” (p. 29) — not that we really know with certainty where emotions or

any particular brain function truly reside despite the sometimes inflated

claims of neuroeconomics (e.g., Harrison (2008)).

Loss aversion is at the heart of prospect theory.  Positive-expected-value

mixed prospects are routinely rejected by most people because while the

rejections of such gambles are System 2 decisions “the critical inputs are

emotional responses that are generated by System 1” (p. 284).  Such

behavior is likely adaptive since the view that threats are urgent (or losses

exceedingly painful) likely increased the probability of survival.  The

endowment effect which some view as the non-random counterpart to loss

aversion is also argued to have an emotional basis (chapter 27).  “Thinking

like a trader” can induce people to avoid both loss aversion and endowment

effects.  Probability weighting is another key component of prospect theory. 

While typical probability weighting suggests that very low-probability events

are overweighted, Kahneman discusses evidence that this is especially so

when such events are emotionally vivid, such as in the work of Rottenstreich

and Hsee (2001) on kissing movie stars.


Finally, in Part 5 (chapters 35-38), more recent research undertaken by

Kahneman in the area of happiness is reviewed.  This work will be least

known to many.  One of the major insights coming from this research is that

in the realm of well-being there can be a divergence between its experience

and its memory, and it is not necessarily clear which trumps (or should

trump) the other.

Naturally reviewers are expected to quibble.  The one that comes to mind is

that sometimes references are wanting.  While the decision to do away with

standard footnotes was perhaps imposed – likely publishers have decided lay

readers will not pick up a book with full-blown footnotes – at times references

are missing in the pseudo-footnotes at the end of the book.  To take one

example, in his discussion of vivid outcomes and how they might impact

probability weighting (p. 326), Kahneman spends some time talking about

research done by a “Princeton team” without identifying either the paper or

the authors.  This is naturally frustrating for researchers like myself who wish

to do some follow-up reading.

Thinking, Fast and Slow is a well-written, truly enjoyable book.  While

accessible to the lay reader, even researchers well acquainted with much of

Kahneman’s work will be engaged by its sense of (academic) history, its

variegated anecdotes and its refreshing dual process theory-based

perspective (though such a perspective will perhaps not be to everyone’s

taste).  Likely this is a book that will be read sooner or later by all readers of

the Journal of Economic Psychology. This reviewer recommends sooner.

REFERENCES

Deaves, R., 1994, “How good are Canadian macroeconomic

forecasters?” Canadian Business Economics 2 (No. 3), 60-66.

Epley, N., and T. Gilovich, 2001, “Putting the adjustment back in the

anchoring and adjustment heuristic: Differential processing of self-generated

and experimenter-provided anchors,” Psychological Science 12, 391-96.


Harrison, G. W., 2008, “Neuroeconomics: A critical

reconsideration.” Economics and Philosophy 24, 303-44.

Hertwig, R., and A. Ortmann, 2005, “The cognitive illusion controversy: A

methodological debate in disguise that matters to economists,” in Zwick, R.,

and A. Rapoport (eds.), Experimental Business Research 3, 113-30.

Kahneman, D., 2003, “A perspective on judgment and choice,” American

Psychologist 58, 697-720.

Loewenstein, G., E. U. Weber, C. K. Hsee and N. Welch, 2001, “Risk as

feelings,” Psychological Bulletin 127, 267-286.

Rottenstreich, Y., and C. K. Hsee, 2001, “Money, kisses and electric shocks:

On the affective psychology of risk,” Psychological Science 12, 185-90.

Royal Swedish Academy of Sciences, 2002, Press release announcing the

Nobel Prize in Economic

Sciences, http://www.nobelprize.org/nobel_prizes/economics/laureates/2002/press.html.

Stanovich, K. E., and R. F. West, 2000, “Individual differences in reasoning:

Implications for the rationality debate,” Behavioral and Brain Sciences 23,

645-65.

Taleb, N. N., 2007, The Black Swan: The Impact of the Highly Improbable, Random House, New York.

Wilson, T. D., 2002, Strangers to Ourselves: Discovering the Adaptive

Subconscious, Harvard University Press, Cambridge, Massachusetts.

Zajonc, R. B., 1968, “Attitudinal effects of mere exposure,” Journal of Personality

and Social Psychology 9, 1-27.


http://www.guardian.co.uk/books/2011/dec/13/thinking-fast-slow-daniel-

kahneman

These days, the bulk of the explanation is done by something else: the "dual-

process" model of the brain. We now know that we apprehend the world in

two radically opposed ways, employing two fundamentally different modes of

thought: "System 1" and "System 2". System 1 is fast; it's intuitive,

associative, metaphorical, automatic, impressionistic, and it can't be

switched off. Its operations involve no sense of intentional control, but it's

the "secret author of many of the choices and judgments you make" and it's

the hero of Daniel Kahneman's alarming, intellectually aerobic

book Thinking, Fast and Slow.

System 2 is slow, deliberate, effortful. Its operations require attention. (To

set it going now, ask yourself the question "What is 13 x 27?" And to see how

it hogs attention, go to theinvisiblegorilla.com/videos.html and follow the

instructions faithfully.) System 2 takes over, rather unwillingly, when things

get difficult. It's "the conscious being you call 'I'", and one of Kahneman's

main points is that this is a mistake. You're wrong to identify with System 2,

for you are also and equally and profoundly System 1. Kahneman compares

System 2 to a supporting character who believes herself to be the lead actor

and often has little idea of what's going on.

System 2 is slothful, and tires easily (a process called "ego depletion") – so it

usually accepts what System 1 tells it. It's often right to do so, because

System 1 is for the most part pretty good at what it does; it's highly sensitive

to subtle environmental cues, signs of danger, and so on. It kept our remote

ancestors alive. Système 1 a ses raisons que Système 2 ne connaît point ("System 1 has its reasons that System 2 knows nothing of"), as Pascal might have said. It does, however, pay a high price for speed. It loves

to simplify, to assume WYSIATI ("what you see is all there is"), even as it

gossips and embroiders and confabulates. It's hopelessly bad at the kind of

statistical thinking often required for good decisions, it jumps wildly to

conclusions and it's subject to a fantastic suite of irrational biases and

interference effects (the halo effect, the "Florida effect", framing effects,

anchoring effects, the confirmation bias, outcome bias, hindsight bias,

availability bias, the focusing illusion, and so on).


The general point about the size of our self-ignorance extends beyond the

details of Systems 1 and 2. We're astonishingly susceptible to being

influenced – puppeted – by features of our surroundings in ways we don't

suspect. One famous (pre-mobile phone) experiment centred on a New York

City phone booth. Each time a person came out of the booth after having

made a call, an accident was staged – someone dropped all her papers on

the pavement. Sometimes a dime had been placed in the phone booth,

sometimes not (a dime was then enough to make a call). If there was no

dime in the phone booth, only 4% of the exiting callers helped to pick up the

papers. If there was a dime, no fewer than 88% helped.

Since then, thousands of other experiments have been conducted, right

across the broad board of human life, all to the same general effect. We

don't know who we are or what we're like, we don't know what we're really

doing and we don't know why we're doing it. That's a System-1

exaggeration, for sure, but there's more truth in it than you can easily

imagine. Judges think they make considered decisions about parole based

strictly on the facts of the case. It turns out (to simplify only slightly) that it is

their blood-sugar levels really sitting in judgment. If you hold a pencil

between your teeth, forcing your mouth into the shape of a smile, you'll find

a cartoon funnier than if you hold the pencil pointing forward, by pursing

your lips round it in a frown-inducing way. And so it goes. One of the best

books on this subject, a 2002 effort by the psychologist Timothy D Wilson, is

appropriately called Strangers to Ourselves.

We also hugely underestimate the role of chance in life (this is System 1's

work). Analysis of the performance of fund managers over the longer term

proves conclusively that you'd do just as well if you entrusted your financial

decisions to a monkey throwing darts at a board. There is a tremendously

powerful illusion that sustains managers in their belief their results, when

good, are the result of skill; Kahneman explains how the illusion works. The

fact remains that "performance bonuses" are awarded for luck, not skill.

They might as well be handed out on the roll of a die: they're completely

unjustified. This may be why some banks now speak of "retention bonuses"

rather than performance bonuses, but the idea that retention bonuses are

needed depends on the shared myth of skill, and since the myth is known to

be a myth, the system is profoundly dishonest – unless the dart-throwing

monkeys are going to be cut in.


In an experiment designed to test the "anchoring effect", highly experienced

judges were given a description of a shoplifting offence. They were then

"anchored" to different numbers by being asked to roll a pair of dice that had

been secretly loaded to produce only two totals – three or nine. Finally, they

were asked whether the prison sentence for the shoplifting offence should be

greater or fewer, in months, than the total showing on the dice. Normally the

judges would have made extremely similar judgments, but those who had

just rolled nine proposed an average of eight months while those who had

rolled three proposed an average of only five months. All were unaware of

the anchoring effect.

The same goes for all of us, almost all the time. We think we're smart; we're

confident we won't be unconsciously swayed by the high list price of a

house. We're wrong. (Kahneman admits his own inability to counter some of

these effects.) We're also hopelessly subject to the "focusing illusion", which

can be conveyed in one sentence: "Nothing in life is as important as you

think it is when you're thinking about it." Whatever we focus on, it bulges in

the heat of our attention until we assume its role in our life as a whole is

greater than it is. Another systematic error involves "duration neglect" and

the "peak-end rule". Looking back on our experience of pain, we prefer a

larger, longer amount to a shorter, smaller amount, just so long as the

closing stages of the greater pain were easier to bear than the closing stages

of the lesser one.

Daniel Kahneman won a Nobel prize for economics in 2002 and he is,

with Amos Tversky, one of a famous pair. For many in the humanities, their

names are fused together, like Laurel and Hardy or Crick and

Watson. Thinking, Fast and Slow has its roots in their joint work, and is

dedicated to Tversky, who died in 1996. It is an outstanding book,

distinguished by beauty and clarity of detail, precision of presentation and

gentleness of manner. Its truths are open to all those whose System 2 is not

completely defunct; I have hardly touched on its richness. Some chapters are

more taxing than others, but all are mercifully short, and none requires any

special learning.


http://www.dayonbay.ca/index.php/book-reviews/thinking-fast-and-slow.html

One brain, two systems

Thinking Fast and Slow by Daniel Kahneman presents an up-to-date survey

of decision making theory in order to help individuals identify errors of judgment and choice in others and, hopefully but with more difficulty, in themselves. The book explores how one’s brain uses two distinct systems

(aptly characterized by Kahneman as ‘System 1’ and ‘System 2’) to make

decisions, and how the operation of these systems can lead to suboptimal

results.

Thinking Fast and Slow fits into the ever-expanding genre of popular

psychology books and assumes little prior knowledge of the subject. It is

accessible to the average reader, and gives them an overview of

Kahneman’s “current understanding of judgment and decision making” (pg

4). Kahneman’s understanding is worth taking note of as he is an

authoritative figure in the world of psychology. Kahneman won the Nobel Prize in Economics in 2002 for his work on decision making, and currently serves as a professor at Princeton in both the Psychology and Public & International Affairs departments.

System 1 and System 2

2 + 2 = ?

17 x 24 = ?

The above equations can help one understand the two thinking systems

Kahneman discusses. System 1 (i.e. 2+2) represents quick, effortless,

automatic thought processes which one does not have a sense of control

over. However, this system is ridden with biases, is blind to logic and

statistics, and has a tendency to turn difficult questions into easier ones

which it then proceeds to answer. System 2 is the antithesis of System 1.

System 2 is deliberative, and requires mental effort; this system generally

takes its cues from System 1 but has a tendency to take over when the going

gets tough.


The author accepts that, because System 2 requires so much mental effort, it would be impractical to rely on it for every decision one makes in the course of a day, given that humans have only a limited supply of mental effort. However,

the author suggests that by learning about decision making individuals can

learn to recognize situations (i.e. use System 1) where judgmental mistakes

are likely. In doing so we are better able to know when we must expend our

mental effort (i.e. System 2) to avoid making significant mistakes.

Key Points

Thinking Fast and Slow covers a wide range of topics, many of which can be

found in an introduction-to-psychology textbook. Reading this book through the lens of how it applies to finance/trading, a few points stood out:

Mental Effort

Intense focus can make people blind to things which should obviously catch

their attention (pg 23). An example is an experiment where participants had

to watch a video and count the number of times people threw balls back and

forth. In the middle of the video a man in a gorilla suit appeared on screen;

shockingly, half of the participants failed to notice anything unusual, as they were engrossed in counting the throws.

Switching from one task to another requires great mental effort when time

pressure is involved (pg 37).

Restoring the available sugar in the brain (e.g. from drinking a sugary drink)

prevents the deterioration of mental performance (pg 43).

“When people are instructed to frown while doing a task, they actually try

harder and experience greater cognitive strain” (pg 132).

The Illusion of Confidence

Confidence depends on the quality of the story one can construct from the evidence seen; participants in a study who saw only one side of a case “were more confident of their judgments than those who saw both sides” (pg 87).

Intuitive predictions from System 1 are generally overconfident and overly

extreme. Unbiased predictions “permit the prediction of rare or extreme

events only when the information is very good . . . if your predictions are

unbiased, you will never have the satisfying experience of correctly calling

an extreme case” (pg 192).


Confidence is a function of how coherent information is and how easily this

information is processed cognitively. Notice that confidence is not contingent

upon how ‘correct’ someone is (pg 212).

Ethics

One of the most interesting tidbits from the book can be directly applied to acting ethically, in the sense that a single slip-up can doom an individual’s

reputation. “The psychologist Paul Rozin, an expert on disgust, observed that

a single cockroach will completely wreck the appeal of a bowl of cherries, but

a cherry will do nothing at all for a bowl of cockroaches” (pg 302).