Thinking, Fast and Slow, by Daniel Kahneman. Jeran Binning, DAU San Diego


DESCRIPTION

A compilation of reviews of the book Thinking, Fast and Slow by Daniel Kahneman.

TRANSCRIPT

Page 1: Thinking fast and_slow

Thinking, Fast and Slow, by Daniel Kahneman

Jeran Binning, DAU San Diego

Page 2: Thinking fast and_slow

Two Brains Running. By JIM HOLT. Published: November 25, 2011

http://www.nytimes.com/pages/books/review/index.html

Illustration by David Plunkert

Copyright 2011 The New York Times Company

Page 3: Thinking fast and_slow

Bottom Line Up Front

Expertise can be learnt by prolonged exposure to situations that are “sufficiently regular to be predictable”, and in which the expert gets quick and decisive feedback on whether he did the right or the wrong thing.

Experts can thus train their unconscious “pattern recognition” mechanism to produce the right answer quickly.

So this certainly applies to chess, and it certainly does not apply to predicting the course of Middle East politics.

Page 4: Thinking fast and_slow

NY Times Book Review

Kahneman’s career has had three major phases

Page 5: Thinking fast and_slow

Cognitive biases

In the first, he and Tversky did a series of ingenious experiments that revealed twenty or so “cognitive biases” — unconscious errors of reasoning that distort our judgment of the world.

Typical of these is the “anchoring effect”: our tendency to be influenced by irrelevant numbers that we happen to be exposed to.

Page 6: Thinking fast and_slow

Anchoring & Adjustment

Anchoring and adjustment: People who have to make judgments under uncertainty use this heuristic by starting from a certain reference point (the anchor) and then adjusting it, usually insufficiently, to reach a final conclusion.

Example: If you have to judge another person's productivity, the anchor for your final (adjusted) judgment may be your own level of productivity. Depending on your own level of productivity, you might therefore underestimate or overestimate the productivity of this person.

Page 7: Thinking fast and_slow

NY Times Book Review

In the second phase, Kahneman and Tversky showed that people making decisions under uncertain conditions do not behave in the way that economic models have traditionally assumed; they do not “maximize utility.”

Page 8: Thinking fast and_slow

Prospect Theory

The two then developed an alternative account of decision making, one more faithful to human psychology, which they called “prospect theory.” (It was for this achievement that Kahneman was awarded the Nobel.)

Page 10: Thinking fast and_slow

Prospect Theory

Prospect theory was developed by Daniel Kahneman and Amos Tversky in 1979 as a psychologically realistic alternative to expected utility theory. It allows one to describe how people make choices in situations where they have to decide between alternatives that involve risk, e.g. in financial decisions. Starting from empirical evidence, the theory describes how individuals evaluate potential losses and gains. In the original formulation the term prospect referred to a lottery.

The theory describes such decision processes as consisting of two stages, editing and evaluation. In the first, possible outcomes of the decision are ordered following some heuristic. In particular, people decide which outcomes they see as basically identical and they set a reference point and consider lower outcomes as losses and larger as gains. In the following evaluation phase, people behave as if they would compute a value (utility), based on the potential outcomes and their respective probabilities, and then choose the alternative having a higher utility.

The formula that Kahneman and Tversky assume for the evaluation phase is (in its simplest form) given by U = w(p1)·v(x1) + w(p2)·v(x2) + …, where x1, x2, … are the potential outcomes and p1, p2, … their respective probabilities. v is a so-called value function that assigns a value to an outcome. The value function, which passes through the reference point, is s-shaped and, as its asymmetry implies, given the same variation in absolute value, there is a bigger impact of losses than of gains (loss aversion). In contrast to Expected Utility Theory, it measures losses and gains, but not absolute wealth. The function w is called a probability weighting function and expresses that people tend to overreact to small probability events, but underreact to medium and large probabilities.

To see how Prospect Theory (PT) can be applied in an example, consider a decision about buying an insurance policy. Let us assume the probability of the insured risk is 1%, the potential loss is $1000 and the premium is $15. If we apply PT, we first need to set a reference point. This could be, e.g., the current wealth or the worst case (losing $1000). If we set the frame to the current wealth, the decision would be to either pay $15 for sure (which gives the PT-utility of v(−15)) or a lottery with outcomes $0 (probability 99%) or −$1000 (probability 1%), which yields the PT-utility of w(0.99)·v(0) + w(0.01)·v(−1000). These expressions can be computed numerically. For typical value and weighting functions, the former expression could be larger due to the convexity of v in losses, and hence the insurance looks unattractive. If we set the frame to −$1000, both alternatives are set in gains. The concavity of the value function in gains can then lead to a preference for buying the insurance.
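To make the evaluation concrete, here is a minimal Python sketch of the insurance example, using the parametric value and weighting functions Tversky and Kahneman later estimated (1992); the parameters 0.88, 2.25 and 0.61 are their published estimates, not figures from this deck.

```python
# Minimal prospect-theory sketch of the insurance example above.
# Parameters are Tversky & Kahneman's (1992) estimates, assumed here.

def v(x, alpha=0.88, lam=2.25):
    """S-shaped value function: concave in gains, convex and steeper in losses."""
    return x ** alpha if x >= 0 else -lam * ((-x) ** alpha)

def w(p, gamma=0.61):
    """Probability weighting: overweights small p, underweights moderate/large p."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

def pt_utility(prospect):
    """Evaluate a prospect given as (outcome, probability) pairs."""
    return sum(w(p) * v(x) for x, p in prospect)

# Frame: current wealth. Pay the $15 premium for sure, or face the lottery.
buy  = pt_utility([(-15, 1.00)])               # about -24.4
skip = pt_utility([(0, 0.99), (-1000, 0.01)])  # about -54.2
print(buy, skip)  # buy > skip: overweighting the 1% loss makes insurance attractive
```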

We see in this example that a strong overweighting of small probabilities can also undo the effect of the convexity of v in losses: the potential outcome of losing $1000 is overweighted.

The interplay of overweighting of small probabilities and concavity-convexity of the value function leads to the so-called four-fold pattern of risk attitudes: risk-averse behavior in gains involving moderate probabilities and of small probability losses; risk-seeking behavior in losses involving moderate probabilities and of small probability gains. This is an explanation for the fact that people, e.g., simultaneously buy lottery tickets and insurances, but still invest money conservatively.

Page 11: Thinking fast and_slow

Prospect theory: An Analysis of Decision Under Risk

An important paper in the development of the behavioral finance and economics fields was written by Kahneman and Tversky in 1979. This paper, 'Prospect theory: An Analysis of Decision Under Risk', used cognitive psychological techniques to explain a number of documented divergences of economic decision making from neo-classical theory.

Over time many other psychological effects have been incorporated into behavioral finance, such as overconfidence and the effects of limited attention.

Further milestones in the development of the field include a well-attended and diverse conference at the University of Chicago,[4] a special 1997 edition of the Quarterly Journal of Economics ('In Memory of Amos Tversky') devoted to the topic of behavioral economics, and the award of the Nobel Prize to Daniel Kahneman in 2002 "for having integrated insights from psychological research into economic science, especially concerning human judgment and decision-making under uncertainty."

Page 12: Thinking fast and_slow

Happiness

In the third phase of his career, mainly after the death of Tversky, Kahneman has delved into “hedonic psychology”: the science of happiness, its nature and its causes. His findings in this area have proved disquieting — and not just because one of the key experiments involved a deliberately prolonged colonoscopy.

Page 13: Thinking fast and_slow

Happiness

Kahneman never grapples philosophically with the nature of rationality. He does, however, supply a fascinating account of what might be taken to be its goal: happiness.

What does it mean to be happy? When Kahneman first took up this question, in the mid 1990s, most happiness research relied on asking people how satisfied they were with their life on the whole.

But such retrospective assessments depend on memory, which is notoriously unreliable.

Page 14: Thinking fast and_slow

These two quirks of remembered happiness — “duration neglect” and the “peak-end rule”

What if, instead, a person’s actual experience of pleasure or pain could be sampled from moment to moment, and then summed up over time? Kahneman calls this “experienced” well-being, as opposed to the “remembered” well-being that researchers had relied upon.

And he found that these two measures of happiness diverge in surprising ways. What makes the “experiencing self” happy is not the same as what makes the “remembering self” happy.

In particular, the remembering self does not care about duration — how long a pleasant or unpleasant experience lasts. Rather, it retrospectively rates an experience by the peak level of pain or pleasure in the course of the experience, and by the way the experience ends.

Page 15: Thinking fast and_slow

NY Times Book Review

Clearly, much remains to be done in hedonic psychology. But Kahneman’s conceptual innovations have laid the foundation for many of the empirical findings he reports in this book:

that while French mothers spend less time with their children than American mothers, they enjoy it more;

that headaches are hedonically harder on the poor;

that women who live alone seem to enjoy the same level of well-being as women who live with a mate; and

that a household income of about $75,000 in high-cost areas of the country is sufficient to maximize happiness.

Policy makers interested in lowering the misery index of society will find much to ponder here.

Page 16: Thinking fast and_slow

System 1 and System 2

System 2, in Kahneman’s scheme, is our slow, deliberate, analytical and consciously effortful mode of reasoning about the world.

System 1, by contrast, is our fast, automatic, intuitive and largely unconscious mode.

It is System 1 that detects hostility in a voice and effortlessly completes the phrase “bread and…”

It is System 2 that swings into action when we have to fill out a tax form or park a car in a narrow space. (As Kahneman and others have found, there is an easy way to tell how engaged a person’s System 2 is during a task: just look into his or her eyes and note how dilated the pupils are.)

Page 17: Thinking fast and_slow

System 1 and System 2

More generally, System 1 uses association and metaphor to produce a quick and dirty draft of reality, which System 2 draws on to arrive at explicit beliefs and reasoned choices.

System 1 proposes, System 2 disposes.

So System 2 would seem to be the boss, right?

In principle, yes. But System 2, in addition to being more deliberate and rational, is also lazy. And it tires easily. (The vogue term for this is “ego depletion.”)

Page 18: Thinking fast and_slow

Review NY Times Book Review

“Although System 2 believes itself to be where the action is,” Kahneman writes, “the automatic System 1 is the hero of this book.” System 2 is especially quiescent, it seems, when your mood is a happy one.

Page 19: Thinking fast and_slow

The Linda Problem

Page 20: Thinking fast and_slow

Memory (Thinking, Fast and Slow, System 1)

Memory also holds the vast repertory of skills we have acquired in a lifetime of practice, which automatically produce adequate solutions to challenges as they arise…. The acquisition of skills requires a regular environment, an adequate opportunity to practice, and rapid and unequivocal feedback about the correctness of thoughts and actions.

Page 21: Thinking fast and_slow

Thinking Fast and Slow Conclusions

When these conditions are fulfilled, skill eventually develops, and the intuitive judgments and choices that quickly come to mind will mostly be accurate. All of this is the work of System 1, which means it occurs automatically and fast. A marker of skilled performance is the ability to deal with vast amounts of information swiftly and efficiently. (p. 416, Conclusions)

Page 22: Thinking fast and_slow

Thinking Fast and Slow Conclusions

The way to block errors that originate in System 1 is simple in principle: recognize the signs that you are in a cognitive minefield, slow down, and ask for reinforcement from System 2.

Page 23: Thinking fast and_slow

Thinking Fast and Slow Conclusions

What can be done about biases? How can we improve judgments and decisions, both our own and those of the institutions that we serve and that serve us?

Page 24: Thinking fast and_slow

Thinking Fast and Slow Conclusions

The short answer is that little can be achieved without considerable investment of effort. As I know from experience, System 1 is not readily educable. Except for some effects that I attribute mostly to my age, my intuitive thinking is just as prone to overconfidence, extreme predictions, and the planning fallacy as it was before I made a study of these issues.

Page 25: Thinking fast and_slow

Thinking Fast and Slow Conclusions

I have improved only in my ability to recognize situations in which errors are likely: “This number will be an anchor…” “The decision could be changed if the question were reframed…” And I have made much more progress in recognizing the errors of others than my own.

Page 26: Thinking fast and_slow

Review New York Times

By the time I got to the end of “Thinking, Fast and Slow,” my skeptical frown had long since given way to a grin of intellectual satisfaction. Appraising the book by the peak-end rule, I overconfidently urge everyone to buy and read it. But for those who are merely interested in Kahneman’s takeaway on the Malcolm Gladwell question it is this:

If you’ve had 10,000 hours of training in a predictable, rapid-feedback environment — chess, firefighting, anesthesiology — then blink. In all other cases, think.

Page 27: Thinking fast and_slow

The Premortem: "Performing a Project Premortem," by Gary Klein, HBR, September 2007

Projects fail at a spectacular rate. One reason is that too many people are reluctant to speak up about their reservations during the all-important planning phase. By making it safe for dissenters who are knowledgeable about the undertaking and worried about its weaknesses to speak up, you can improve a project’s chances of success.

Research conducted in 1989 by Deborah J. Mitchell, of the Wharton School; Jay Russo, of Cornell; and Nancy Pennington, of the University of Colorado, found that prospective hindsight—imagining that an event has already occurred—increases the ability to correctly identify reasons for future outcomes by 30%. We have used prospective hindsight to devise a method called a premortem, which helps project teams identify risks at the outset.

Page 28: Thinking fast and_slow

Kahneman and Tversky

Judgment Under Uncertainty: Heuristics and Biases. Amos Tversky and Daniel Kahneman

Science, Volume 185, 1974

Research for DARPA N00014-73C-0438 monitored by ONR and Research and Development Authority of Hebrew University, Jerusalem, Israel.

Page 29: Thinking fast and_slow

Review The Guardian

The same goes for all of us, almost all the time. We think we're smart; we're confident we won't be unconsciously swayed by the high list price of a house. We're wrong. (Kahneman admits his own inability to counter some of these effects.) We're also hopelessly subject to the "focusing illusion", which can be conveyed in one sentence: "Nothing in life is as important as you think it is when you're thinking about it."

Whatever we focus on, it bulges in the heat of our attention until we assume its role in our life as a whole is greater than it is.

Another systematic error involves "duration neglect" and the "peak-end rule". Looking back on our experience of pain, we prefer a larger, longer amount to a shorter, smaller amount, just so long as the closing stages of the greater pain were easier to bear than the closing stages of the lesser one.

Galen Strawson guardian.co.uk, Tuesday 13 December 2011 06.56 EST

Page 30: Thinking fast and_slow

System 2

System 2 is slothful, and tires easily (a process called "ego depletion") – so it usually accepts what System 1 tells it. It's often right to do so, because System 1 is for the most part pretty good at what it does; it's highly sensitive to subtle environmental cues, signs of danger, and so on.

It kept our remote ancestors alive. Système 1 a ses raisons que Système 2 ne connaît point, as Pascal might have said (System 1 has its reasons, of which System 2 knows nothing).

It does, however, pay a high price for speed. It loves to simplify, to assume WYSIATI ("what you see is all there is"), even as it gossips and embroiders and confabulates.

It's hopelessly bad at the kind of statistical thinking often required for good decisions, it jumps wildly to conclusions and it's subject to a fantastic suite of irrational biases and interference effects (the halo effect, the "Florida effect", framing effects, anchoring effects, the confirmation bias, outcome bias, hindsight bias, availability bias, the focusing illusion, and so on).

Page 31: Thinking fast and_slow

Business Week

Behavioralists, Kahneman included, have been cataloging people’s systematic mistakes and nonlogical patterns for years. A few of the examples he cites:

1. Framing. Test subjects are more likely to opt for surgery if told that the “survival” rate is 90 percent, rather than that the mortality rate is 10 percent.

2. The sunk-cost fallacy. People seek to avoid feelings of regret; thus, they invest more money and time in a project with dubious results rather than give it up and admit they were wrong.

3. Loss aversion. In experiments, most subjects would prefer to receive a sure $46 than have a 50 percent chance of making $100.

Page 32: Thinking fast and_slow

Review Business Week

Kahneman is perhaps least persuasive in his treatment of the business world. Noting that even top performers in business—and in sports—tend eventually to revert to the mean, he attributes success largely to luck.

This confuses events that may not be predictable with those that are determined by chance.

A high-achieving retail store, to cite one of his examples, is not lucky—it is well-situated. And if its sales later decline, that is not necessarily a sign that its prior success was random.

Business has a self-correcting cycle that fosters mean reversion. Success attracts competitors.

Page 33: Thinking fast and_slow

Review Wall Street Journal

Mr. Kahneman stresses that he is just as susceptible as the rest of us to the cognitive illusions he has discovered. He tries to recognize situations when mistakes are especially likely to occur—such as when he is starting a big project or making a forecast—and then act to rethink his System 1 inclinations.

The tendency to underestimate the costs of future projects, he notes, can be countered by taking an "outside view": looking at your own project as an outsider would.

To avoid overconfidence, Mr. Kahneman recommends an exercise called the "premortem," developed by the psychologist Gary Klein: Before finalizing a decision, imagine that, a year after it has been made, it has turned out horribly, then write a history of how it went wrong and why.

By CHRISTOPHER F. CHABRIS, a psychology professor at Union College and a co-author of "The Invisible Gorilla: And Other Ways Our Intuitions Deceive Us."

http://online.wsj.com/home-page

Page 34: Thinking fast and_slow

Review Financial Times

A transitional moment linking the positive and the negative aspects of thinking fast illustrates why the author’s personality – and thus the book – is so engaging. Kahneman regards even the experts as prone to the mistakes of System 1 listed above, and cheerfully admits that he is no exception. But he wants to know whether this view can be reconciled with cases such as that of the firefighting captain. So he engages one of his vehement critics on this issue and they debate their way to a joint paper.

Their answer is that expertise can be learnt by prolonged exposure to situations that are “sufficiently regular to be predictable”, and in which the expert gets quick and decisive feedback on whether he did the right or the wrong thing.

Experts can thus train their unconscious “pattern recognition” mechanism to produce the right answer quickly.

So this certainly applies to chess, and it certainly does not apply to predicting the course of Middle East politics.

Page 35: Thinking fast and_slow

Review Washington Post

One of Kahneman and Tversky’s most famous ideas is what they call prospect theory: our inclination to fear possible losses more than we value possible gains.

Would you take a bet on a one-time coin flip that paid $200 if you won but cost you $150 if you lost?

Most people wouldn’t, though it’s tilted in your favor.
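The loss-aversion arithmetic behind that refusal, assuming a loss-aversion coefficient of about 2 (a common estimate, not a figure given in the review): the bet's expected dollar value is +$25, yet its felt value is negative.

```latex
0.5\,v(200) + 0.5\,v(-150) \;\approx\; 0.5(200) + 0.5(-2 \times 150) \;=\; 100 - 150 \;=\; -50 \;<\; 0
```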

Pro golfers tend to make a higher proportion of their putts when they’re trying to avoid a bogey (which would result in losing a stroke) than when they have a chance for a birdie (for the possible gain of a stroke); here, too, avoiding a failure is more crucial than achieving a triumph.

Page 36: Thinking fast and_slow

Review Washington Post

If the brain is “a machine for jumping to conclusions,” as Kahneman writes, it’s System 1 that yells “Geronimo!” Autopilot thinking explains the popular opinion that tornadoes kill more people than asthma, although in fact asthma kills 20 times as many:

We fixate on scenes from TV showing homes turned to matchsticks and overestimate the representativeness of such scenes.

Page 37: Thinking fast and_slow

Review

The first bit of mental machinery, which Kahneman blandly calls System One, works 24/7 to keep us out of trouble, while alerting us to fleeting opportunities. Appropriate for a species that is both predator and prey, System One lives in a world of snap judgments, extensible metaphors, ill-informed biases, and loosely constructed rules of thumb. We sometimes call this decision making apparatus intuition. Man’s intuition is sophisticated enough that it has helped us thrive across a variety of ever changing environments.

Despite its utility, System One is often wrong, especially if numbers are involved. For a trivial example, answer quickly: If the cost of a ball and a bat together is $1.10, and the bat costs a dollar more than the ball, what does the ball cost?

Your System One answer (most likely wrong) is good enough to avoid mistaking a hungry lion for a tasty chicken. But it’s not good enough to build an airplane or design an effective income tax code. (The answer is a nickel, not a dime.)
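For completeness, the algebra behind the nickel: let b be the price of the ball.

```latex
b + (b + 1.00) = 1.10 \;\Rightarrow\; 2b = 0.10 \;\Rightarrow\; b = \$0.05
```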

System Two is associated with enumeration, computation, objective analysis, and complex chains of logic. It is our rational brain. Kahneman’s work shows that even scientists like himself use System Two very sparingly, calling on it only when System One asks for help. In addition, in order to function at the highest levels, System Two requires training, discipline, concentration, skeptical and impartial evaluation of purported facts, and the ruthless elimination of contradictions.

Page 38: Thinking fast and_slow

Don’t Blink! The Hazards of Confidence

It afflicts us all, because confidence in our own judgments is part of being human. By DANIEL KAHNEMAN. Published: October 19, 2011

Page 39: Thinking fast and_slow

Don’t Blink! The Hazards of Confidence

Decades later, I can see many of the central themes of my thinking about judgment in that old experience. One of these themes is that people who face a difficult question often answer an easier one instead, without realizing it. We were required to predict a soldier’s performance in officer training and in combat, but we did so by evaluating his behavior over one hour in an artificial situation. This was a perfect instance of a general rule that I call WYSIATI, “What you see is all there is.” We had made up a story from the little we knew but had no way to allow for what we did not know about the individual’s future, which was almost everything that would actually matter. When you know as little as we did, you should not make extreme predictions like “He will be a star.” The stars we saw on the obstacle field were most likely accidental flickers, in which a coincidence of random events — like who was near the wall — largely determined who became a leader. Other events — some of them also random — would determine later success in training and combat.

Page 40: Thinking fast and_slow

Don’t Blink! The Hazards of Confidence

We often interact with professionals who exercise their judgment with evident confidence, sometimes priding themselves on the power of their intuition. In a world rife with illusions of validity and skill, can we trust them? How do we distinguish the justified confidence of experts from the sincere overconfidence of professionals who do not know they are out of their depth? We can believe an expert who admits uncertainty but cannot take expressions of high confidence at face value. As I first learned on the obstacle field, people come up with coherent stories and confident predictions even when they know little or nothing. Overconfidence arises because people are often blind to their own blindness.

Page 41: Thinking fast and_slow

Planning Fallacy

Some cognitive biases, of course, are flagrantly exhibited even in the most natural of settings. Take what Kahneman calls the “planning fallacy”: our tendency to overestimate benefits and underestimate costs, and hence foolishly to take on risky projects.

In 2002, Americans remodeling their kitchens, for example, expected the job to cost $18,658 on average, but they ended up paying $38,769.

Page 42: Thinking fast and_slow

Planning Fallacy & Illusion of Control

The planning fallacy is “only one of the manifestations of a pervasive optimistic bias,” Kahneman writes, which “may well be the most significant of the cognitive biases.” Now, in one sense, a bias toward optimism is obviously bad, since it generates false beliefs — like the belief that we are in control, and not the playthings of luck.

But without this “illusion of control,” would we even be able to get out of bed in the morning?

Optimists are more psychologically resilient, have stronger immune systems, and live longer on average than their more reality-based counterparts.

Page 43: Thinking fast and_slow

Biases

Biases in the evaluation of compound events are particularly significant in the context of planning. The successful completion of an undertaking, such as the development of a new product, typically has a conjunctive character: for the undertaking to succeed, each of a series of events must occur. Even when each of these events is very likely, the overall probability of success can be quite low if the number of events is large.
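A worked illustration with hypothetical numbers: a project requiring ten independent events, each 95% likely to occur, succeeds only about 60% of the time.

```latex
P(\text{success}) = \prod_{i=1}^{10} p_i = 0.95^{10} \approx 0.60
```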

Page 44: Thinking fast and_slow

Illusion of Control

In a series of experiments, Ellen Langer (1975) demonstrated first the prevalence of the illusion of control and second, that people were more likely to behave as if they could exercise control in a chance situation where “skill cues” were present.

By skill cues, Langer meant properties of the situation more normally associated with the exercise of skill, in particular the exercise of choice, competition, familiarity with the stimulus and involvement in decisions.

One simple form of this fallacy is found in casinos: when rolling dice in craps, it has been shown that people tend to throw harder for high numbers and softer for low numbers.

Under some circumstances, experimental subjects have been induced to believe that they could affect the outcome of a purely random coin toss. Subjects who guessed a series of coin tosses more successfully began to believe that they were actually better guessers, and believed that their guessing performance would be less accurate if they were distracted.

Page 45: Thinking fast and_slow

Jackson Lears analyzed why the dominant American “culture of control” denies the importance of luck

Drawing on a vast body of research, Lears ranges through the entire sweep of American history as he uncovers the hidden influence of risk taking, conjuring, soothsaying, and sheer dumb luck on our culture, politics, social lives, and economy.

T.J. Jackson Lears “Something for Nothing” (2003)

Page 46: Thinking fast and_slow

Loss Aversion

Moreover, as Kahneman notes, exaggerated optimism serves to protect both individuals and organizations from the paralyzing effects of another bias, “loss aversion”: our tendency to fear losses more than we value gains.

It was exaggerated optimism that John Maynard Keynes had in mind when he talked of the “animal spirits” that drive capitalism.

Page 47: Thinking fast and_slow

Loss Aversion

"losses loom larger than corresponding gains”

"In prospect theory, loss aversion refers to the tendency for people to strongly prefer avoiding losses than acquiring gains. Some studies suggest that losses are as much as twice as psychologically powerful as gains. Loss aversion was first convincingly demonstrated by Amos Tversky and Daniel Kahneman.”

"The principle of loss aversion was first introduced by Kahneman and Tversky (1979)"

Tversky and Kahneman (1991) "The central assumption of the theory is that losses and disadvantages have greater impact on preferences than gains and advantages.”

"Numerous studies have shown that people feel losses more deeply than gains of the same value (Kahneman and Tversky 1979, Tversky and Kahneman 1991)."Goldberg and von Nitzsch (1999) pages 97-98

"Both the status quo bias and the endowment effect are part of a more general issue known as loss aversion." (Montier 2007, p. 32)

Page 48: Thinking fast and_slow

Hindsight Bias

the inclination to see events that have occurred as more predictable than they in fact were before they took place. Hindsight bias has been demonstrated experimentally in a variety of settings, including politics, games and medicine.

In psychological experiments of hindsight bias, subjects also tend to remember their predictions of future events as having been stronger than they actually were, in those cases where those predictions turn out correct.

Prophecy that is recorded after the fact is an example of hindsight bias, given its own rubric: vaticinium ex eventu, "foretelling after the event."

One explanation of the bias is the availability heuristic: the event that did occur is more salient in one's mind than the possible outcomes that did not.

It has been shown that examining possible alternatives may reduce the effects of this bias.

Page 49: Thinking fast and_slow

Framing Effect

The framing of alternatives also affects decisions.

For example, when people (including doctors) who are considering a risky medical procedure are told that 90 percent survive five years, they are far more likely to accept the procedure than when they are told that 10 percent do not survive five years.

Because framing affects people's behavior, providing more information cannot remedy matters, unless the information is presented in a fully neutral fashion.

In some cases, additional information only increases people's anxiety and confusion, thereby reducing their welfare.

Page 50: Thinking fast and_slow

Heuristic

A heuristic (hyu-RIS-tik) is a method to help solve a problem, commonly informal. The term is particularly used for a method that often rapidly leads to a solution that is usually reasonably close to the best possible answer.

Heuristics are "rules of thumb": educated guesses, intuitive judgments, or simply common sense.

In more precise terms, heuristics stand for strategies using readily accessible, though loosely applicable, information to control problem-solving in human beings and machines.

Page 51: Thinking fast and_slow

Rules of Thumb

“There is always a well-known solution to every human problem – neat, plausible, and wrong.”

H. L. Mencken, Prejudices: Second Series, 1920

Tailors' Rule of Thumb. This is the fictional rule described by Jonathan Swift in his satirical novel Gulliver's Travels:

"Then they measured my right Thumb, and desired no more; for by a mathematical Computation, that twice round the Thumb is once around the Wrist, and so on to the Neck and Waist, and by the help of my old Shirt, which I displayed on the Ground before them for a Pattern, they fitted me exactly."

Page 52: Thinking fast and_slow

Brewing Rule of Thumb

Before the invention of thermometers, the brewer tested the wort by placing his thumb in it. When he could reliably place his thumb in the wort without having to remove it because of the heat, the wort was cool enough to pitch the yeast.

People rely on a limited number of heuristic principles (essentially rules of thumb) which reduce the complex tasks of assessing probabilities and predicting values to simpler judgmental operations.

In general, these heuristics are quite useful, but sometimes they lead to severe and systematic errors.

Daniel Kahneman


Page 53: Thinking fast and_slow

Rules of Thumb: Financial - Rule of 72

A rule of thumb for exponential growth at a constant rate. It is an approximation of the "doubling time" formula used in population growth, which says to divide 70 by the percent growth rate (the exact number is 69.3147181, from the natural logarithm of 2, and the approximation holds when the percent growth is much less than 1% per period). In terms of money, it is frequently easier to use 72 (rather than 70) because it works better in the 4%-10% range where interest rates often lie. Therefore, divide 72 by the percent interest rate to determine the approximate amount of time to double your money in an investment. For example, at 8% interest, your money will double in approximately 9 years (72/8 = 9).
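A minimal sketch checking the rule against the exact doubling-time formula ln 2 / ln(1 + r); the rates tested are arbitrary.

```python
# Compare the Rule of 72 with the exact doubling time ln(2) / ln(1 + r).
import math

for pct in (4, 6, 8, 10):
    exact = math.log(2) / math.log(1 + pct / 100)
    print(f"{pct}%: rule of 72 -> {72 / pct:.2f} years, exact -> {exact:.2f} years")
```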

Tailors' Rule of Thumb A simple approximation that was used by tailors to determine the wrist, neck, and waist circumferences of a person through one single measurement of the circumference of that person's thumb. The rule states, typically, that twice the circumference of a person's thumb is the circumference of their wrist, twice the circumference of the wrist is the circumference of the neck, and twice around the neck is the person's waist. For example, if the circumference of the thumb is 4 inches, then the wrist circumference is 8 inches, the neck is 16 and the waist is 32. An interesting consequence of this is that — for those to whom the rule applies — this simple method can be used to determine if trousers will fit: the trousers are wrapped around the neck, and if the two ends barely touch, then they will fit. Any overlap or lack thereof corresponds to the trousers being too loose or tight, respectively.

Marine Navigation A ship's captain should navigate to keep the ship more than a thumb's width from the shore, as shown on the nautical chart being used. Thus, with a coarse scale chart, that provides few details of near shore hazards such as rocks, a thumb's width would represent a great distance, and the ship would be steered far from shore; whereas on a fine scale chart, in which more detail is provided, a ship could be brought closer to shore.[1]

Statistics: The statistical rule of thumb says that for data sets that are approximately normally distributed, about 68% of data points will occur within one standard deviation of the mean, and about 95% will occur within two standard deviations. Chebyshev's inequality is a more general rule along these same lines and applies to all data sets.
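Chebyshev's bound, for reference; unlike the 68-95 rule it holds for any distribution with finite variance.

```latex
P\big(|X - \mu| \ge k\sigma\big) \le \frac{1}{k^2}
\quad\Rightarrow\quad
\text{at least } 1 - \tfrac{1}{2^2} = 75\% \text{ of any data set lies within } 2\sigma \text{ of the mean.}
```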

Page 54: Thinking fast and_slow

Conclusions from "Conditions for Intuitive Expertise: A Failure to Disagree" (Daniel Kahneman, Princeton University; Gary Klein, Applied Research Associates)

In an effort that spanned several years, we attempted to answer one basic question:

Under what conditions are the intuitions of professionals worthy of trust?

We do not claim that the conclusions we reached are surprising (many were anticipated by Shanteau, 1992, Hogarth, 2001, and Myers, 2002, among others), but we believe that they add up to a coherent view of expert intuition, which is more than we expected to achieve when we began.

Page 55: Thinking fast and_slow

Two Perspectives: Origins of the Naturalistic Decision Making Approach

The NDM approach, which focuses on the successes of expert intuition, grew out of early research on master chess players conducted by deGroot (1946/1978) and later by Chase and Simon (1973). DeGroot showed that chess grand masters were generally able to identify the most promising moves rapidly, while mediocre chess players often did not even consider the best moves.

The chess grand masters mainly differed from weaker players in their unusual ability to appreciate the dynamics of complex positions and quickly judge a line of play as promising or fruitless.

Chase and Simon (1973) described the performance of chess experts as a form of perceptual skill in which complex patterns are recognized. They estimated that chess masters acquire a repertoire of 50,000 to 100,000 immediately recognizable patterns, and that this repertoire enables them to identify a good move without having to calculate all possible contingencies.

Strong players need a decade of serious play to assemble this large collection of basic patterns, but of course they achieve impressive levels of skill even earlier.

On the basis of this work, Simon defined intuition as the recognition of patterns stored in memory.

Page 56: Thinking fast and_slow

Conclusions from "Conditions for Intuitive Expertise: A Failure to Disagree" (Daniel Kahneman, Princeton University; Gary Klein, Applied Research Associates)

Kahneman read Meehl’s book in 1955 while serving in the Psychological Research Unit of the Israel Defense Forces, and the book helped him make sense of his own encounters with the difficulties of clinical judgment.

One of Kahneman’s duties was to assess candidates for officer training, using field tests and other observations as well as a personal interview.

Kahneman (2003) described the powerful sense of getting to know each candidate and the accompanying conviction that he could foretell how well the candidate would do in further training and eventually in combat.

The subjective conviction of understanding each case in isolation was not diminished by the statistical feedback from officer training school, which indicated that the validity of the assessments was negligible.

Kahneman coined the term illusion of validity for the unjustified sense of confidence that often comes with clinical judgment. His early experience with the fallibility of intuitive impressions could hardly be more different from Klein’s formative encounter with the successful decision making of fire-fighting ground commanders.

Page 57: Thinking fast and_slow

Conditions for Intuitive Expertise: A Failure to Disagree (Daniel Kahneman, Princeton University; Gary Klein, Applied Research Associates)

Our starting point is that intuitive judgments can arise from genuine skill—the focus of the Naturalistic Decision Making (NDM) approach—but that they can also arise from inappropriate application of the heuristic processes on which students of the heuristics-and-biases tradition have focused.

Page 58: Thinking fast and_slow

Conditions for Intuitive Expertise: A Failure to Disagree (Daniel Kahneman, Princeton University; Gary Klein, Applied Research Associates)

Skilled judges are often unaware of the cues that guide them, and individuals whose intuitions are not skilled are even less likely to know where their judgments come from.

Page 59: Thinking fast and_slow

Conditions for Intuitive Expertise: A Failure to Disagree (Daniel Kahneman, Princeton University; Gary Klein, Applied Research Associates)

True experts, it is said, know when they don’t know.

However, non-experts (whether or not they think they are) certainly do not know when they don’t know.

Subjective confidence is therefore an unreliable indication of the validity of intuitive judgments and decisions.

Page 60: Thinking fast and_slow

Conditions for Intuitive Expertise: A Failure to Disagree (Daniel Kahneman, Princeton University; Gary Klein, Applied Research Associates)

The determination of whether intuitive judgments can be trusted requires an examination of the environment in which the judgment is made and of the opportunity that the judge has had to learn the regularities of that environment.

Page 61: Thinking fast and_slow

Conditions for Intuitive Expertise: A Failure to Disagree (Daniel Kahneman, Princeton University; Gary Klein, Applied Research Associates)

We describe task environments as “high-validity” if there are stable relationships between objectively identifiable cues and subsequent events or between cues and the outcomes of possible actions.

Medicine and firefighting are practiced in environments of fairly high validity.

In contrast, outcomes are effectively unpredictable in zero-validity environments.

To a good approximation, predictions of the future value of individual stocks and long-term forecasts of political events are made in a zero-validity environment.

Page 62: Thinking fast and_slow

Conditions for Intuitive Expertise: A Failure to Disagree (Daniel Kahneman, Princeton University; Gary Klein, Applied Research Associates)

Validity and uncertainty are not incompatible. Some environments are both highly valid and substantially uncertain.

Poker and warfare are examples. The best moves in such situations reliably increase the potential for success.

Page 63: Thinking fast and_slow

Conditions for Intuitive Expertise: A Failure to Disagree (Daniel Kahneman, Princeton University; Gary Klein, Applied Research Associates)

An environment of high validity is a necessary condition for the development of skilled intuitions.

Other necessary conditions include adequate opportunities for learning the environment (prolonged practice and feedback that is both rapid and unequivocal).

If an environment provides valid cues and good feedback, skill and expert intuition will eventually develop in individuals of sufficient talent.

Page 64: Thinking fast and_slow

Conditions for Intuitive Expertise: A Failure to Disagree (Daniel Kahneman, Princeton University; Gary Klein, Applied Research Associates)

Although true skill cannot develop in irregular or unpredictable environments, individuals will sometimes make judgments and decisions that are successful by chance.

These “lucky” individuals will be susceptible to an illusion of skill and to overconfidence (Arkes, 2001).

The financial industry is a rich source of examples.

Page 65: Thinking fast and_slow

Conditions for Intuitive Expertise: A Failure to Disagree (Daniel Kahneman, Princeton University; Gary Klein, Applied Research Associates)

The situation that we have labeled fractionation of skill is another source of overconfidence. Professionals who have expertise in some tasks are sometimes called upon to make judgments in areas in which they have no real skill.

(For example, financial analysts may be skilled at evaluating the likely commercial success of a firm, but this skill does not extend to the judgment of whether the stock of that firm is underpriced.)

It is difficult both for the professionals and for those who observe them to determine the boundaries of their true expertise.

Page 66: Thinking fast and_slow

Conditions for Intuitive Expertise: A Failure to Disagree (Daniel Kahneman, Princeton University; Gary Klein, Applied Research Associates)

We agree that the weak regularities available in low-validity situations can sometimes support the development of algorithms that do better than chance. These algorithms only achieve limited accuracy, but they outperform humans because of their advantage of consistency.
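A hedged sketch of the kind of simple, consistent algorithm meant here: an equal-weights ("improper") linear model in the spirit of Dawes (1979). The cue names and data are hypothetical.

```python
# Equal-weights linear scoring: crude, but perfectly consistent across cases.

CUES = ["structured_interview", "test_score", "past_performance"]

def score(candidate):
    """Average of cue values (each assumed pre-scaled to 0..1); no fitted weights."""
    return sum(candidate[c] for c in CUES) / len(CUES)

applicants = [
    {"structured_interview": 0.7, "test_score": 0.9, "past_performance": 0.6},
    {"structured_interview": 0.4, "test_score": 0.5, "past_performance": 0.8},
]
# Consistency is the advantage: identical inputs always produce identical rankings.
ranking = sorted(range(len(applicants)), key=lambda i: -score(applicants[i]))
print(ranking)  # -> [0, 1]
```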

However, the introduction of algorithms to replace human judgment is likely to evoke substantial resistance and sometimes has undesirable side effects.

Page 67: Thinking fast and_slow

The Drunkard’s Walk

– Functional magnetic resonance imaging, for example, shows that risk and reward are assessed by parts of the dopaminergic system.

– A brain-reward circuit important for motivational and emotional processes

• The images show, too, that the amygdala, which is also linked to our emotional state, especially fear, is activated when we make decisions couched in uncertainty.

Page 68: Thinking fast and_slow

The Drunkard’s Walk

• Fortune is fair in potentialities; she is not fair in outcomes. (p. 13)

• When we look at extraordinary accomplishments in sports---or elsewhere—we should keep in mind that extraordinary events can happen without extraordinary causes.

• Random events often look like nonrandom events, and in interpreting human affairs we must take care not to confuse the two. (p. 20)

Page 69: Thinking fast and_slow

Selection Bias

This bias makes us miscompute the odds and wrongly ascribe skills. If you funded 1,000,000 unemployed people endowed with no more than the ability to say "buy" or "sell", odds are that you will break even in the aggregate, minus transaction costs, but a few will hit the jackpot, simply because the base cohort is very large. It will be almost impossible not to have small Warren Buffetts by luck alone. After the fact they will be very visible and will derive precise and well-sounding explanations about why they made it. It is difficult to argue with them; "nothing succeeds like success". All these retrospective explanations are pervasive, but there are scientific methods to correct for the bias. This has not filtered through to the business world or the news media; researchers have evidence that professional fund managers are just no better than random and cost money to society (the total revenues from these transaction costs are in the hundreds of billions of dollars), but the public will remain convinced that "some" of these investors have skills.
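A minimal simulation of that thought experiment (the 50/50 "skill" and the ten-year horizon are assumptions for illustration):

```python
# Give 1,000,000 "managers" pure coin-flip ability and count how many
# nevertheless compile a flawless ten-year record by luck alone.
import random

random.seed(1)
N_MANAGERS, N_YEARS = 1_000_000, 10
lucky = sum(
    all(random.random() < 0.5 for _ in range(N_YEARS))  # beat the market every year?
    for _ in range(N_MANAGERS)
)
print(lucky)  # expected about 1_000_000 / 2**10, i.e. roughly 977 "small Warren Buffetts"
```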

Page 70: Thinking fast and_slow

False Discovery Rate

• July 13, 2008 • STRATEGIES • The Prescient Are Few • By MARK HULBERT

• HOW many mutual fund managers can consistently pick stocks that outperform the broad stock market averages — as opposed to just being lucky now and then?

• Countless studies have addressed this question, and have concluded that very few managers have the ability to beat the market over the long term. Nevertheless, researchers have been unable to agree on how small that minority really is, and on whether it makes sense for investors to try to beat the market by buying shares of actively managed mutual funds.

• A new study builds on this research by applying a sensitive statistical test borrowed from outside the investment world. It comes to a rather sad conclusion: There was once a small number of fund managers with genuine market-beating abilities, as judged by having past performance so good that their records could not be attributed to luck alone. But virtually none remain today. Index funds are the only rational alternative for almost all mutual fund investors, according to the study’s findings.

• The study, “False Discoveries in Mutual Fund Performance: Measuring Luck in Estimating Alphas,” has been circulating for over a year in academic circles. Its authors are Laurent Barras, a visiting researcher at Imperial College’s Tanaka Business School in London; Olivier Scaillet, a professor of financial econometrics at the University of Geneva and the Swiss Finance Institute; and Russ Wermers, a finance professor at the University of Maryland.

• The statistical test featured in the study is known as the “False Discovery Rate,” and is used in fields as diverse as computational biology and astronomy. In effect, the method is designed to simultaneously avoid false positives and false negatives — in other words, conclusions that something is statistically significant when it is entirely random, and the reverse.
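The study's exact procedure isn't reproduced in this deck; below is a minimal sketch of the classic Benjamini–Hochberg step-up rule, the standard way to control the False Discovery Rate. The p-values stand in for hypothetical fund-alpha tests.

```python
# Benjamini–Hochberg: declare discoveries while controlling the expected
# proportion of false positives among them at level q.
def benjamini_hochberg(pvals, q=0.10):
    """Return indices of p-values declared significant at FDR level q."""
    order = sorted(range(len(pvals)), key=lambda i: pvals[i])
    m = len(pvals)
    k = 0  # largest rank whose p-value clears its step-up threshold q * rank / m
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= q * rank / m:
            k = rank
    return sorted(order[:k])

pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.3, 0.5, 0.7, 0.9]
print(benjamini_hochberg(pvals))  # -> [0, 1, 2, 3, 4, 5]
```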

• Both of those problems have plagued previous studies of mutual funds, Professor Wermers said. The researchers applied the method to a database of actively managed domestic equity mutual funds from the beginning of 1975 through 2006. To ensure that their results were not biased by excluding funds that have gone out of business over the years, they included both active and defunct funds. They excluded any fund with less than five years of performance history. All told, the database contained almost 2,100 funds.

• The researchers found a marked decline over the last two decades in the number of fund managers able to pass the False Discovery Rate test. If they had focused only on managers running funds in 1990 and their records through that year, for example, the researchers would have concluded that 14.4 percent of managers had genuine stock-picking ability. But when analyzing their entire fund sample, with records through 2006, this proportion was just 0.6 percent — statistically indistinguishable from zero, according to the researchers.

• This doesn’t mean that no mutual funds have beaten the market in recent years, Professor Wermers said. Some have done so repeatedly over periods as short as a year or two. But, he added, “the number of funds that have beaten the market over their entire histories is so small that the False Discovery Rate test can’t eliminate the possibility that the few that did were merely false positives” — just lucky, in other words.

• Professor Wermers says he was surprised by how rare stock-picking skill has become. He had “generally been positive about the existence of fund manager ability,” he said, but these new results have been a “real shocker.”

• WHY the decline? Professor Wermers says he and his co-authors suspect various causes. One is high fees and expenses. The researchers’ tests found that, on a pre-expense basis, 9.6 percent of mutual fund managers in 2006 showed genuine market-beating ability — far higher than the 0.6 percent after expenses were taken into account.

This suggests that one in 10 managers may still have market-beating ability. It’s just that they can’t come out ahead after all their funds’ fees and expenses are paid.

• Another possible factor is that many skilled managers have gone to the hedge fund world. Yet a third potential reason is that the market has become more efficient, so it’s harder to identify undervalued or overvalued stocks. Whatever the causes, the investment implications of the study are the same: buy and hold an index fund benchmarked to the broad stock market.

• Professor Wermers says his advice has evolved significantly as a result of this study. Until now, he says, he wouldn’t have tried to discourage a sophisticated investor from trying to pick a mutual fund that would outperform the market. Now, he says, “it seems almost hopeless.”

• Mark Hulbert is editor of The Hulbert Financial Digest, a service of MarketWatch. E-mail: [email protected].

• Copyright 2008 The New York Times Company

Page 71: Thinking fast and_slow

Megaprojects and Risk

Bent Flyvbjerg Nils Bruzelius Werner Rothengatter

Page 72: Thinking fast and_slow

Megaprojects and Risk

To begin, the book identifies a common feature of conventional megaproject development: despite overwhelming cost overruns, below-projection revenues, and strikingly poor performance records in terms of economy, environment and public support, megaprojects grow continuously in number and scale around the world, forming the so-called megaprojects paradox.

Understanding of this problem and its consequences is explored in the first six chapters, which document the cost overruns, the demand over-forecasts, and the viability inflation of major megaprojects.

Page 73: Thinking fast and_slow

They cite a study out of Aalborg University that looked at 258 projects costing approximately US$90 B

In 9 out of 10 transport infrastructure projects, costs are underestimated, resulting in cost overrun;

For rail, actual costs are, on the average, 45% higher than estimated costs (standard deviation, S.D. = 38);

For fixed links (tunnels and bridges), actual costs are, on the average, 34% higher than estimated costs (S.D. = 62);

For roads, actual costs are, on the average, 20% higher than estimated costs (S.D. = 30);

For all project types, actual costs are, on the average, 28% higher than estimated costs (S.D. = 39);

Cost underestimation and overrun exist across 20 nations and five continents; it appears to be a global phenomenon;

Cost underestimation and overrun appear to be more pronounced in developing nations than in North America and Europe (data for rail only);

Cost underestimation and overrun have not decreased over the past 70 years. No learning seems to take place;

Cost underestimation and overrun cannot be explained by error and seem to be best explained by strategic misrepresentation, namely, lying, with a view to getting projects started [5].

Page 74: Thinking fast and_slow

Megaprojects and Risk

The authors cite as a core factor, underlying all of these dreadful situations with megaprojects, the lack of transparency in decision making and the weak involvement of the civil society, or what they call a "democracy deficit."

They make a big point of the failure to pay attention to risks and the lack of accountability in the project decision-making processes as a main source of difficulties.

The authors' concept of risk is almost entirely limited to financial risk.

Page 75: Thinking fast and_slow

Problems with Heuristics

Heuristic algorithms are often employed because they may be seen to "work" without having been mathematically proven to meet a given set of requirements.

Great care must be given when employing a heuristic algorithm. One common pitfall in implementing a heuristic method to meet a requirement comes when the engineer or designer fails to realize that the current data set doesn't necessarily represent future system states.

While the existing data can be pored over and an algorithm can be devised to successfully handle the current data, it is imperative to ensure that the heuristic method employed is capable of handling future data sets.

This means that the engineer or designer must fully understand the rules that generate the data and develop the algorithm to meet those requirements and not just address the current data sets.

A simple example of how heuristics can fail is to answer the question "What is the next number in this sequence: 1, 2, 4?". One heuristic algorithm might say that the next number is 8 because the numbers are doubling - leading to a sequence like 1, 2, 4, 8, 16, 32... Another, equally valid, heuristic would say that the next number is 7 because each number is being raised by one higher interval than the one before - leading to a series that looks like 1, 2, 4, 7, 11, 16...
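The two heuristics from that example, as a toy sketch:

```python
# Two equally valid heuristics for continuing the sequence 1, 2, 4, ...
def doubling(seq):
    """Assume each term doubles the previous one: 1, 2, 4, 8, 16, ..."""
    return seq[-1] * 2

def growing_gap(seq):
    """Assume the gap between terms grows by one each step: 1, 2, 4, 7, 11, ..."""
    return seq[-1] + (seq[-1] - seq[-2]) + 1

seq = [1, 2, 4]
print(doubling(seq), growing_gap(seq))  # -> 8 7: same data, different rules
```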

Statistical analysis must be conducted when employing heuristics, to ensure that enough data points are utilized to make incorrect outcomes statistically insignificant.

Page 76: Thinking fast and_slow
Page 77: Thinking fast and_slow

Review of Hubbard

A major problem with expert estimates is overconfidence. To overcome this, Hubbard advocates using calibrated probability assessments to quantify analysts' abilities to make estimates. Calibration assessments involve getting analysts to answer trivia questions and eliciting confidence intervals for each answer. The confidence intervals are then checked against the proportion of correct answers. Essentially, this assesses experts' ability to estimate by tracking how often they are right. It has been found that people can improve their ability to make subjective estimates through calibration training, i.e. repeated calibration testing followed by feedback.
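A minimal sketch of such a calibration check; the intervals and true values are hypothetical.

```python
# Compare stated confidence (90% intervals) with the observed hit rate.
estimates = [  # ((low, high) 90% confidence interval, true value)
    ((1000, 3000), 2500),
    ((10, 50), 75),
    ((1900, 1950), 1912),
    ((5, 20), 18),
]
hits = sum(low <= truth <= high for (low, high), truth in estimates)
print(f"stated 90%, observed {hits / len(estimates):.0%}")
# A hit rate well below 90% indicates overconfident (too narrow) intervals.
```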

Page 78: Thinking fast and_slow

Dunning–Kruger effect

The Dunning–Kruger effect is a cognitive bias in which unskilled people make poor decisions and reach erroneous conclusions, but their incompetence denies them the metacognitive ability to recognize their mistakes.[1]

The unskilled therefore suffer from illusory superiority, rating their ability as above average, much higher than it actually is, while the highly skilled underrate their own abilities, suffering from illusory inferiority.

Actual competence may weaken self-confidence, as competent individuals may falsely assume that others have an equivalent understanding.

As Kruger and Dunning conclude, "the miscalibration of the incompetent stems from an error about the self, whereas the miscalibration of the highly competent stems from an error about others" (p. 1127).[2] The effect is about paradoxical defects in cognitive ability, both in oneself and as one compares oneself to others.

Page 79: Thinking fast and_slow


Page 80: Thinking fast and_slow
Page 81: Thinking fast and_slow
Page 82: Thinking fast and_slow

Happiness November 2010

Jeran Binning

Page 83: Thinking fast and_slow

Would You Be Happier If You Were Richer? A Focusing Illusion

Daniel Kahneman [1], Alan B. Krueger [1,2]*, David Schkade [3], Norbert Schwarz [4], Arthur A. Stone [5]

The belief that high income is associated with good mood is widespread but mostly illusory.

People with above-average income are relatively satisfied with their lives but are barely happier than others in moment-to-moment experience, tend to be more tense, and do not spend more time in particularly enjoyable activities.

Moreover, the effect of income on life satisfaction seems to be transient. We argue that people exaggerate the contribution of income to happiness because they focus, in part, on conventional achievements when evaluating their life or the lives of others.

[1] Princeton University, Princeton, NJ 08544, USA. [2] National Bureau of Economic Research, Cambridge, MA 02138, USA. [3] Rady School of Management, University of California, San Diego, San Diego, CA 92093, USA. [4] Department of Psychology, University of Michigan, Ann Arbor, MI 48106, USA. [5] Stony Brook University, Stony Brook, NY 11794, USA.

Page 84: Thinking fast and_slow

Life changes can put a smile on your face. USA Today, October 6, 2010, Your Life

Proceedings of the National Academy of Sciences: preferences and choices can affect long-term happiness. Findings from a German study:

▪ Marry someone who is not neurotic
▪ Focus more on friends, less on material goods
▪ Get involved with making the world a better place
▪ Have a job, but also enough time for leisure
▪ Stay physically active
▪ For men, don't be underweight; for women, don't be obese

Page 85: Thinking fast and_slow

Set Point

"Don't worry, be happy" may be more than just a wishful mantra. A new study finds that people's happiness levels can change substantially over their lifetimes, suggesting that happiness isn't predetermined by genes or personality.

Psychologists have long argued that people have a "set point" for happiness. Regardless of what life brings, the set-point theory goes, happiness levels tend to be stable. A big life event could create a boost of joy or a crush of sorrow, but within a few years, people return to a predetermined level of life satisfaction, according to the theory.

The new study, which used a nationally representative sample of almost 150,000 German adults, finds the opposite. People's long-term life satisfaction can change, the researchers report today (Oct. 4) in the online early edition of the Proceedings of the National Academy of Sciences. In fact, a substantial number of people followed over 25 years saw their happiness levels shift by one-third or more.

Page 86: Thinking fast and_slow

Money

The study also echoed previous happiness research in finding that money doesn't buy happiness.

"People with a lot of money are more satisfied with their lives... but mainly due to the more interesting and challenging jobs they have," study author Gert Wagner, a researcher at the Max Planck Institute for Human Development in Germany, told LiveScience. "Money is simply a byproduct of good and satisfying jobs. If you want to be satisfied with your life, you must spend time with your friends and your family."

Wagner said that previous work suggests findings on happiness from one developed country, like Germany, should also hold true for another, such as the United States. In fact, a study in May found that in the United States, happiness tends to increase with age.

Page 87: Thinking fast and_slow

I'm happier than you

The researchers used data from a study of German adults spanning 1984 to 2008. Each year, the participants answered questions on their life satisfaction, life goals and other measures like how much they exercise and socialize.

By averaging life-satisfaction responses to even out any short-term effects, the researchers plotted out the respondents' happiness by percentiles. Someone in the 99th percentile, for example, would be happier than 99 percent of the study participants.

People shifted in the rankings — and thus in their levels of happiness — quite a bit. Just over 38 percent changed their position in the distribution by 25 percentiles or more during the study period. About 25 percent changed by 33.3 percentiles or more, and 11.8 percent changed by 50 percentiles.
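
As a rough sketch of that ranking step (invented toy data; not the study's actual code), one might average each respondent's yearly answers and convert the averages to percentile positions like this:

```python
# Toy data: respondent id -> yearly life-satisfaction scores (0-10 scale).

from statistics import mean

scores = {
    "A": [6, 7, 6, 8],
    "B": [4, 5, 5, 4],
    "C": [9, 8, 9, 9],
}

# Average each respondent's answers to even out short-term effects,
# then rank the averages and convert rank to a percentile.
averages = {rid: mean(vals) for rid, vals in scores.items()}
ranked = sorted(averages, key=averages.get)

for rank, rid in enumerate(ranked):
    percentile = 100 * rank / (len(ranked) - 1)  # 0 = least satisfied
    print(f"{rid}: mean={averages[rid]:.2f}, percentile={percentile:.0f}")
```

Repeating the ranking for an early and a late wave of the survey would then reveal how far each respondent moved in the distribution, the shifts the study reports.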

Feel-good factors

So what contributed to long-term happiness? The researchers found several correlations between life choices and life satisfaction:

Marry well: The personality traits of partners influenced people's happiness. Neuroticism, or a tendency toward anxiety, emotional instability and depression, was most influential. People who married or partnered with neurotic people were less likely to be happy than people who married non-neurotic types.

Focus on the family: People who assigned relatively high value to altruistic and family goals compared with career goals were happier. Women were also happier when their male partners ranked family goals high.

Go to church: People who went to church more often were happier, though the study can't determine whether the happiness is related to religious views or to the social circle religious organizations offer.

Work, but not too much (or too little): People's happiness tracked how closely their actual work hours matched the hours they wanted to work. In other words, people who worked more or fewer hours than they preferred were less happy. Working less or being unemployed was worse than working too much, presumably because underemployment is a financial blow, the researchers wrote.

Get social, and get moving: Social interaction and exercise were both associated with happiness. Working out made people happier regardless of body weight. The only correlation between body weight and happiness was that underweight men and obese women were more likely to be unhappy.


Page 88: Thinking fast and_slow

Mysteries of happiness

"In its extreme form, set-point theory was never credible," Daniel Kahneman, an emeritus professor of psychology at Princeton University and the winner of the 2002 Nobel Prize in Economic Sciences, told LiveScience. "If it was taken to mean that the only factor that determines happiness or life satisfaction is genetic, so that people always come back to exactly the same point, this was utterly incredible."

The current study is a useful demonstration that life changes can influence people's life satisfaction, said Kahneman, who was not involved in the research. However, the correlations between certain goals and traits and happiness don't necessarily answer the nature-versus-nurture question.

"They're suggesting that the goals are chosen. But the goals may be part of personality," and thus partially genetic, he said. "The fact that goals matter, like altruism and materialism, that really doesn't help us distinguish between personality and circumstances."

More studies are needed that track large populations of people after influential changes, like the enactment of new laws, said Andrew Oswald, a professor of behavioral science at the University of Warwick who studies happiness but was not involved in the current study. By comparing people who lived under, say, a new state tax law that affected income to those who lived in a nearby state without the law, researchers could begin to look at happiness in a more experimental way, he said.

"The key thing is that life events good and bad do shape happiness over long periods," Oswald said. "We are, in part, the product of our experiences. It's not all born into us."


Page 89: Thinking fast and_slow