
Page 1

Science, Values and Risk

RD300

15 October 2001

Page 2

“It is by no means uncommon to find decision makers interpreting the same scientific information in different ways in different countries.”

(Jasanoff, 1991, p. 29)

Page 3

Cultural Variation

• U.S. environmental regulators value formal analytical methods (testable validity) more highly than do their European counterparts. US regulators tend to address scientific uncertainty through quantitative analysis.

• Result: evidence sufficient to trigger action in one country may not do so in another.

Page 4

The Problem with Policy-Relevant Science

• When knowledge is uncertain or ambiguous, facts alone are inadequate to compel a choice.

• Policymakers inevitably look beyond just the science and blend scientific and policy considerations together in their preferred reading of the evidence.

Page 5

Risk Assessment

• Different risk assessment methodologies can produce widely varying risk estimates (see the illustrative sketch below).

• Can animal data be extrapolated to humans?

• Do policy makers hide behind the numbers?

• Most lay persons don’t understand quantitative risk assessments.
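
A minimal sketch of the first two points above, using entirely invented numbers: two standard ways of extrapolating the same animal bioassay result down to a low human dose (a linear no-threshold model versus a threshold model) yield very different risk estimates. The doses, observed incidence, and assumed NOAEL are all hypothetical.

    # Hypothetical animal bioassay result (assumed numbers throughout).
    observed_dose = 10.0     # dose in the animal study, mg/kg-day
    observed_risk = 0.10     # 10% excess tumour incidence at that dose
    human_dose = 0.001       # estimated environmental exposure, mg/kg-day

    # Linear no-threshold model: risk scales proportionally with dose.
    slope = observed_risk / observed_dose
    linear_risk = slope * human_dose

    # Threshold model: no effect assumed below a no-observed-effect level.
    noael = 1.0              # assumed NOAEL, mg/kg-day
    threshold_risk = 0.0 if human_dose < noael else slope * (human_dose - noael)

    print(f"linear no-threshold estimate: {linear_risk:.1e}")    # about 1e-05
    print(f"threshold model estimate:     {threshold_risk:.1e}")  # 0.0e+00

The same data point thus supports either a roughly one-in-a-hundred-thousand risk estimate or a risk of essentially zero, depending on a modelling choice that is itself a value judgment.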

Page 6

• Value judgments and uncertainties in risk assessments may not be stated by the experts.

• Risks of less than one in a million are often considered negligible from a regulatory standpoint.
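
As a rough back-of-the-envelope illustration (the population figure is an assumption, not from the slides), a one-in-a-million lifetime risk applied across a large national population still implies a non-zero number of expected cases:

    population = 280_000_000      # assumed national population of roughly this era
    de_minimis_risk = 1e-6        # "one in a million" lifetime risk

    expected_cases = population * de_minimis_risk
    print(expected_cases)         # 280.0 expected lifetime cases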

Page 7

Judgmental Probability Encoding

• Field of US health risk assessment.

• Attempts to ascertain the range of scientific expert opinion on a particular risk, as well as the levels of confidence attached to each of those judgments (e.g. ambient air quality standards); a toy sketch follows below.

• Has proven to be problematic (e.g. biased selection of experts).
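
A toy sketch of the idea, not the formal elicitation protocol used in any actual standard-setting exercise: each (hypothetical) expert supplies a subjective probability for a health effect plus a self-rated confidence, and the analyst reports both the spread of opinion and a confidence-weighted pooled estimate. The experts, numbers, and pooling rule are illustrative assumptions.

    # Hypothetical elicited judgments: subjective probability that a health
    # effect occurs at a given exposure, plus self-rated confidence (0 to 1).
    expert_judgments = [
        {"expert": "A", "p_effect": 0.02, "confidence": 0.9},
        {"expert": "B", "p_effect": 0.10, "confidence": 0.6},
        {"expert": "C", "p_effect": 0.30, "confidence": 0.4},
    ]

    # Range of expert opinion on the risk.
    low = min(j["p_effect"] for j in expert_judgments)
    high = max(j["p_effect"] for j in expert_judgments)

    # Confidence-weighted linear opinion pool.
    total_weight = sum(j["confidence"] for j in expert_judgments)
    pooled = sum(j["p_effect"] * j["confidence"] for j in expert_judgments) / total_weight

    print(f"range of expert opinion: {low} to {high}")
    print(f"confidence-weighted estimate: {pooled:.3f}")

The slide’s caveat applies directly: if the experts consulted are not representative, both the reported range and the pooled estimate inherit that selection bias.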

Page 8

British Approach

• Multi-stakeholder commissions with noted academics and major interest groups. Collective credibility.

• Unlike the US approach, risk assessment and risk management are examined together (science and policy).

Page 9

• With respect to lead and the risk to children’s health, they were equivocal in their findings and reported no persuasive evidence of a risk.

• Described the risk in qualitative (“small”) rather than numerical terms.

• Yet they recommended that lead additives be phased out of gasoline.

• Interpreted the Precautionary Principle as “dangerous until proven safe”, a way of dealing with uncertainty.

Page 10

USA vs Britain: Administrative and Political Cultures

• Regulatory processes:

– Britain – consensual, non-litigious, relatively closed.

– USA – adversarial, litigious, open.

• USA – regulatory process more open to political pressures. Quantitative analysis becomes a “lifeline to legitimacy”.

Page 11

Slovic Article

• “the goal of informing the public about risk issues – which in principle seems easy to attain – is surprisingly difficult to accomplish.”

• Why?

Page 12

Three Categories of Reasons

• Limitations of risk assessment.

• Limitations of public understanding.

• The problems of communicating complex technical information.

Page 13

Limitations of Public Understanding

• The public’s perceptions of risk are sometimes inaccurate.

– Memorable past events

– Imaginability of future events

– Media coverage can influence perceptions

– Overestimation of dramatic causes of death

Page 14

How good are the public at estimating risks?

• Rare causes of death tend to be overestimated while common causes are underestimated.

• Example: most people think their chances of dying of a heart attack are about 1 in 20. The truth is closer to 1 in 4.

• Judgmental bias – people’s predilection for exaggerating their personal immunity from many hazards. “Optimistic bias”.

Page 15

• Risk information may frighten and frustrate the public.

– Simply mentioning a risk may enhance perceptions of danger.

– Even neutral information may elevate fears (e.g. transmission lines).

– People may try to reduce their anxiety about a hazard and its uncertainty by denying its existence or in their minds making the risk smaller than it is.

Page 16

• Strong beliefs are hard to modify.

“strong beliefs about risks, once formed, change very slowly and are extraordinarily persistent in the face of contrary evidence.” (Vincent Covello)

People gravitate toward, and tend to accept, evidence that supports their pre-existing beliefs on the subject.

Page 17

• When people lack strong opinions they can be easily manipulated by presentation format.

– “Framing effects”

– Ethical issues

Page 18

Expert versus Lay Conceptions of Risk

• Risk experts employ a technical evaluation of risk (a worked example follows below):

Risk = Probability x Consequences

• The public applies a broader conception of risk that also incorporates: accountability, economics, values, and trust.
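
A toy worked example of the experts’ formula above, with invented numbers: under Risk = Probability x Consequences, a frequent small accident and a rare catastrophic failure can come out numerically identical, even though the broader lay conception (dread, voluntariness, trust) would rank them very differently.

    # Two hypothetical hazards: (annual probability of the event, fatalities if it occurs).
    hazards = {
        "frequent small accidents": (1e-2, 1),
        "rare catastrophic failure": (1e-5, 1000),
    }

    for name, (probability, consequences) in hazards.items():
        risk = probability * consequences   # expected fatalities per year
        print(f"{name}: expected loss = {risk}")

    # Both hazards come out to 0.01 expected fatalities per year, yet public
    # risk judgments typically treat the catastrophic, involuntary case as far riskier.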

Page 19

• As our technical control has increased in the technological age, our social control has decreased.

• “Most citizens’ calls for ‘scientific’ decisions, in reality, are a request for something a bit broader – in most cases, a call for ways of assuring that ‘the human element’ of societal decision making will be not just technically competent, but equitable, fair, and responsive to deeply felt concerns” (Freudenburg)

Page 20

Should Zero-risk be the goal?

As Harvard professor John Graham has said,

“We all want zero risk. The problem is if every citizen in this country demands zero risk, we’re going to bankrupt the country”.

Page 21

• Perceptual cues (e.g. odor) may signal more ominous events.

• Risk as a ‘collective construct’ – cultural theory of risk.

• Studies have found cross-national differences in risk judgments.

• Value orientation influences risk perceptions, as do worldviews.

Page 22

The Mad Cow Crisis

• In March 1996, the British government announced that scientists had linked Creutzfeldt-Jakob disease with the human consumption of cattle with bovine spongiform encephalopathy (BSE), or “mad cow disease”.

Page 23

• For almost a decade, British authorities had insisted there was no risk of BSE being transferred to humans.

• With the March 1996 announcement, the British beef market collapsed virtually overnight.

• The EU banned the export of British beef.

• Consumption of all beef in countries such as France, Germany and Japan dropped significantly.

Page 24

The scientific question at the heart of the BSE crisis:

Can humans develop CJD after eating beef from cattle infected with BSE? In other words, can the infectious agent jump the species barrier?

Page 25

Public Perception

• That the British government was more interested in propping up the beef industry than in admitting that there might be a risk, however small that risk might be.

• People stopped buying beef because they no longer trusted the government.

Page 26

Risk Characteristics of the Mad Cow Disease Crisis

• High level of dread of the disease.

• Scientific uncertainty.

• Possible involvement of children.

• Catastrophic potential.

• Non-voluntary exposure.

• Lack of trust in decision-makers.

• A history of food safety controversies.

Page 27

What mistakes did the British Government make in handling the issue of mad cow disease?

Page 28

What lessons can be learned from the mad cow crisis?