WHY WE DENY

FROM EVOLUTION TO CLIMATE CHANGE TO THE HOLOCAUST, THERE ARE THOSE WHO DENY CLAIMS DESPITE OVERWHELMING EVIDENCE. WHAT DRIVES THESE PEOPLE? MICHAEL SHERMER’S NEW BOOK THE BELIEVING BRAIN DESCRIBES THE MENTAL MECHANISMS THAT ARE AT WORK HERE AND PAINTS A PICTURE OF OUR ALARMINGLY PRIMITIVE REASONING CAPACITY.


It is not only crackpot conspiracy theorists and religious fanatics who deny certain facts in ways that astonish the average person. The Republican American presidential candidate Rick Perry, for instance, denied in interviews earlier this year that evolution and man-made global warming had been convincingly demonstrated by science. What makes people believe such improbable things?

Beliefs before explanations

The American psychologist Michael Shermer has made it his life's work to figure out how people come to believe things and where their reasoning goes haywire. Founding publisher of Skeptic Magazine, editor of Skeptic.com and author of eleven books on the subject, Shermer is the leading figure of the skeptic community and a professional debunker. When the US media need a rational voice against pseudoscience, the paranormal or the supernatural, they call Shermer to explain that the latest alien abduction might also be attributed to hallucinations, sleep anomalies or hypnosis.

His latest book, The Believing Brain, is a fascinating synthesis of 30 years of research on the subject. Shermer's conclusion about our belief-forming machinery is disturbing: most beliefs are not formed by carefully weighing the evidence for or against a particular claim, but are snap decisions made for psychological, emotional and social reasons, in the context of an environment created by family, friends, colleagues, culture and society at large. Only after the belief is formed do people try to rationalize it and subconsciously seek out confirmatory evidence which, once found, reinforces the belief in a positive feedback loop.

American physiologist Mark Hoofnagle, one of the originators of the concept of 'denialism' and a blogger on denialist anti-science tactics, finds this a plausible process. He adds: “At the basis of almost all denialism is some ideology that overrides people’s rational mind. Most people are probably irrational about one thing or another. It’s not a liberal or conservative thing; all sides have something that is threatening to them.” Following this line of thought, you can imagine people so blinded by religious ideology that they take its scripture literally, leading them to deny that the Earth is round, that it is older than 6,000 years or that evolution is true.


Seeking patterns and agents

How did we end up with such a flawed belief system? Shermer argues that one ancient brain process at work here is our tendency to find patterns everywhere we look. This tendency has been useful since the early days in our ancestral environment, the African savannah, where quickly establishing the pattern 'rustle in the grass means dangerous predator' could save your life. Of course, sometimes a pattern is false and the rustle in the grass is just the wind. However, Shermer makes the case that the cost of missing a real pattern (failing to spot a predator) often greatly outweighs the cost of believing a false one (mistaking the wind for a predator). That asymmetry biases our brains toward believing false patterns.
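
Shermer's cost argument is, at bottom, a comparison of expected costs, and a back-of-the-envelope sketch can make the asymmetry explicit. The notation here is ours for illustration, not Shermer's: let p be the probability that a rustle really is a predator, C_miss the cost of ignoring a real predator, and C_false the cost of fleeing from mere wind. Per rustle, the habitual skeptic who never flees pays an expected p · C_miss, while the habitual believer who always flees pays (1 - p) · C_false, so always believing the pattern is the cheaper strategy whenever

\[
  p \, C_{\text{miss}} \;>\; (1 - p) \, C_{\text{false}}
  \quad\Longleftrightarrow\quad
  p \;>\; \frac{C_{\text{false}}}{C_{\text{miss}} + C_{\text{false}}}
\]

If being eaten is, say, a thousand times costlier than a wasted sprint (C_miss = 1000 · C_false), believing the pattern pays off whenever the chance of a predator exceeds 1/1001, roughly 0.1 percent. With stakes that lopsided, a brain wired to treat every rustle as a predator is the economical design, which is exactly the trade-off Shermer describes.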

Another characteristic of our brain is that, once we have established a pattern, we tend to infuse it with meaning, intention and agency. So in the example above, when we are dealing with a predator, we correctly assume that we are dealing with an intentional agent rather than an inanimate force like the wind. Shermer suspects this tendency is related to the fact that people have a 'theory of mind': the capacity to be aware of mental states like desires and intentions, in both ourselves and others. Problems arise, of course, when we assume agency where there is none, for instance when we take the wind to be an angry higher power instead of plain physics. In fact, most patterns in the world lack agents and are governed by bottom-up causal laws and randomness; assuming agency in those cases has led to practices like shamanism, animism and magical thinking in the past, and to religion, superstition and New Age spiritualism today.

Brain biases

To make matters worse, once we are committed to a belief, it is extremely hard to change our minds. Shermer identifies no fewer than 39 cognitive biases that make us stick to our guns (see "Bias Bonanza" below). The most important of them all, he argues, is the confirmation bias: our tendency to seek out confirmatory evidence in support of our existing beliefs and to ignore or reinterpret disconfirming evidence. This effect has turned up in many studies, including one in which participants had to assess somebody's personality after reading a fictitious profile of that person, which led their assessments to become strikingly similar to the profile. In another study, involving a murder trial, participants did not evaluate the evidence first, as you might expect, but quickly concocted a narrative in their minds about what had happened and then riffled through the evidence, picking out whatever most closely fit the story.

An especially revealing study was a neuroimaging experiment by the American psychologist Drew Westen during the 2004 American presidential election. Westen found that both Republicans and Democrats were much more critical of the opposing party's candidate when confronted with contradictory statements made by both candidates. Strikingly, the brain areas most active in this process were not those involved in reasoning but those associated with emotions and conflict resolution, and once the participants had arrived at a conclusion that made them emotionally comfortable, the brain's reward area became active. Shermer concludes that instead of rationally evaluating a candidate's position on an issue, the participants had an emotional reaction to the conflicting data and were neurochemically rewarded after rationalizing it away.

The reluctance to change one's mind could ultimately be another legacy of our evolutionary past. Shermer argues that our tribal tendencies lead us to form coalitions with like-minded members of our group and to demonize others who hold differing beliefs. Possibly this effect supported group cohesion in the past and thereby promoted the group's survival. Furthermore, our faulty reasoning could have to do with what Shermer calls folk numeracy: our natural tendency to misperceive probabilities, to think anecdotally instead of statistically, and to focus on short-term trends and small-number runs (e.g. we notice a short stretch of cool days and ignore the long-term global warming trend). On the African savannah this way of thinking was probably adequate for survival, but in the modern world it can fall painfully short.

Science as antidote

You might wonder how we can avoid all of these irrational belief pitfalls. According to Shermer, the best tool we have is science. Before accepting a claim, the scientific process requires an impressive number of checks and balances: control groups, double-blind tests, replication by independent labs and peer-reviewed publication.


In addition, science has a built-in self-correcting mechanism: eventually, after enough data come in, the truth comes out.

All the more worrisome, then, that according to a 2002 survey by the National Science Foundation, 70 percent of Americans do not understand the scientific process (defined in the survey as grasping probability, the experimental method and hypothesis testing). To tackle this problem, Shermer recommends better communication about science in the media, and especially explaining how science works rather than only what science knows.

Mark Hoofnagle adds that conspiracy theories are often an important element of denialism because, in order to deny well-proven facts, you have to assume that a huge number of people are lying. Pointing out the absurdity of these theories, he writes, can also be a successful strategy for convincing some deniers that they are wrong.

Unfortunately, as we have seen, the majority of our deeply held beliefs are immune to attack by direct educational tools, especially for those who are not ready to hear contradictory evidence. The pope won't become an atheist anytime soon, and conservatives rarely turn suddenly into liberals, or vice versa. Shermer concludes that belief change ultimately comes from a combination of personal psychological readiness and a deeper shift in the underlying social and cultural zeitgeist, which is affected in part by education but is more the product of harder-to-define political, economic, religious and social changes. In other words, it can take a lifetime for people to change their minds, if they ever do.

BIAS BONANZA

MICHAEL SHERMER DESCRIBES AN IMPRESSIVE NUMBER OF COGNITIVE BIASES LEADING OUR BRAINS TO CONSTRUCT FALSE BELIEFS AND STICK TO THEM. HERE IS A SMALL SELECTION:

CONFIRMATION BIAS: TENDENCY TO SEEK AND FIND CONFIRMATORY EVIDENCE IN SUPPORT OF ALREADY EXISTING BELIEFS AND TO IGNORE OR REINTERPRET DISCONFIRMING EVIDENCE

HINDSIGHT BIAS: TENDENCY TO RECONSTRUCT THE PAST TO FIT WITH PRESENT KNOWLEDGE

SELF-JUSTIFICATION BIAS: TENDENCY TO RATIONALIZE DECISIONS AFTER THE FACT TO CONVINCE OURSELVES THAT WHAT WE DID WAS THE BEST THING WE COULD HAVE DONE

ATTRIBUTION BIAS: TENDENCY TO ATTRIBUTE DIFFERENT CAUSES TO OUR OWN BELIEFS AND ACTIONS THAN TO THOSE OF OTHERS

SUNK-COST BIAS: TENDENCY TO BELIEVE IN SOMETHING BECAUSE OF THE INVESTMENT ALREADY MADE IN THAT BELIEF

STATUS QUO BIAS: TENDENCY TO OPT FOR WHATEVER WE ARE USED TO, THAT IS, THE STATUS QUO

BIAS BLIND SPOT: TENDENCY TO RECOGNIZE THE POWER OF COGNITIVE BIASES IN OTHER PEOPLE BUT TO BE BLIND TO THEIR INFLUENCE ON OUR OWN BELIEFS

BY MARC SMEEHUIZEN
