COMMENT

Keep it complex

When knowledge is uncertain, experts should avoid pressures to simplify their advice. Render decision-makers accountable for decisions, says Andy Stirling.

Nature 468, 1029–1031 (23/30 December 2010)


Worldwide and across many fields, there lurks a hidden assumption about how scientific expertise can best serve society. Expert advice is often thought most useful to policy when it is presented as a single definitive interpretation. Even when experts acknowledge uncertainty, they tend to do so in ways that reduce unknowns to measurable risk. In this way, policy-makers are encouraged to pursue (and claim) science-based decisions. It is also not uncommon for senior scientists to assert that there is no alternative to some scientifically contestable policy. After years researching and participating in science advisory processes, I have come to the conclusion that this practice is misguided.

An overly narrow focus on risk is an inadequate response to incomplete knowledge. It leaves science advice vulnerable to the social dynamics of groups and to manipulation by political pressures seeking legitimacy, justification and blame management. When the intrinsically plural, conditional nature of knowledge is recognized, I believe that science advice can become more rigorous, robust and democratically accountable.

A rigorous definition of uncertainty can be traced back to the twentieth-century economist Frank Knight (ref. 1). For Knight, 'a measurable uncertainty, or "risk" proper ... is so far different from an unmeasurable one that it is not in effect an uncertainty at all'. This is not just a matter of words, or even methods. The stakes are potentially much higher. A preoccupation with assessing risk means that policy-makers are denied exposure to dissenting interpretations and the possibility of downright surprise.

Of course, no one can reliably foresee the unpredictable, but there are lessons to be learned from past mistakes. Take, for example, the belated recognition that seemingly inert and benign halogenated hydrocarbons were interfering with the ozone layer, or the slowness to acknowledge the possibility of novel transmission mechanisms for spongiform encephalopathies in animal breeding and in the food chain. In the early stages, these sources of harm were not formally characterized as possible risks; they were early warnings offered by dissenting voices. Policy recommendations that miss such warnings court overconfidence and error.

[Figure: A UK crop circle, created by activists to signify uncertainty over where genetic contamination can occur. Credit: G. Graf/Greenpeace]


The question is how to move away from this narrow focus on risk to broader and deeper understandings of incomplete knowledge. Many practical quantitative and qualitative methods already exist (see 'Uncertainty matrix'), but political pressure and expert practice often prevent them being used to their full potential. Choosing between these methods requires a more rigorous approach to assessing incomplete knowledge, avoiding the temptation to treat every problem as a risk 'nail', to be reduced by a probabilistic 'hammer'. Instead, experts should pay more attention to neglected areas of uncertainty (in Knight's strict sense) as well as to deeper challenges of ambiguity and ignorance (ref. 2). For policy-making purposes, the main difference between the risk methods shown in the matrix and the rest is that the others discourage single definitive policy interpretations.

ANY JUSTIFICATION

There are still times when risk-based techniques are appropriate and can yield important information for policy. This can be so for consumer products in normal use, general road or airline-safety statistics, or the epidemiology of familiar diseases. Yet even in these seemingly familiar and straightforward areas, unforeseen possibilities, and over-reliance on aggregation, can undermine probabilistic assessments. There is a need for humility about science-based decisions.

For example, consider the risk assessment of energy technologies. The other graphic (see 'The perils of science-based advice') summarizes 63 studies on the economic costs arising from health and environmental impacts of different sets of energy technologies. The aim of the studies is to help policy-makers identify the options that are likely to have the lowest impact. This is one of the most sophisticated and mature fields for quantitative risk-based comparisons. Individual policy reports commonly express their findings as if there were little room for doubt. Many of the studies present no or tiny uncertainty ranges. But taken together, these 63 studies tell a very different story (ref. 3), one usually hidden from policy-makers. The discrepancies between equally authoritative, peer-reviewed studies span many orders of magnitude, and the overlapping uncertainty ranges can support almost any ranking order of technologies, justifying almost any policy decision as science based.
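To see how overlapping ranges can licence almost any ranking, consider a minimal sketch. The cost intervals below are invented for illustration (they are not the figures from ref. 3); the point is simply that once reported ranges overlap this widely, nearly every ordering of the options is consistent with the evidence.

```python
# Hypothetical cost intervals (dollars per megawatt hour), invented to
# mimic the orders-of-magnitude spread described in the text.
from itertools import permutations
from math import factorial

ranges = {
    "coal":    (0.1,  10000.0),
    "gas":     (0.5,  1000.0),
    "nuclear": (0.01, 1000.0),
    "wind":    (0.05, 100.0),
}

def is_consistent(order):
    """A cheapest-first ranking is consistent with the intervals if every
    earlier option could take a value below every later option's maximum."""
    return all(ranges[a][0] < ranges[b][1]
               for i, a in enumerate(order)
               for b in order[i + 1:])

consistent = [o for o in permutations(ranges) if is_consistent(o)]
print(f"{len(consistent)} of {factorial(len(ranges))} possible rankings "
      f"are consistent with these overlapping ranges")
```

With intervals this wide, the script reports that every one of the 24 possible rankings survives, which is exactly the sense in which such evidence can justify 'almost any' policy decision.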

This is not just a problem with quantitative analysis. Qualitative science advice is also usually presented in aggregated and consensual form: there is always pressure on expert committees to reach a consensus opinion. This raises profound questions over what is most accurate and useful for policy. Is it a picture asserting an apparent consensus, even where one does not exist? Or would it be more helpful to set out a measured array of contrasting specialist views, explaining the underlying reasons for different interpretations of the evidence? Whatever the political pressures for the former, surely the latter is more consistent both with scientific rigour and with democratic accountability?

I believe that the answer lies in supporting more plural and conditional methods for science advice (the non-risk quadrants shown in 'Uncertainty matrix'). These are plural because they even-handedly illuminate a variety of alternative reasonable interpretations. And they are conditional because they explore explicitly, for each alternative, the associated questions, assumptions, values or intentions (ref. 4). Under Knightian uncertainty, for instance, pessimistic and optimistic interpretations can be treated separately, each explicitly associated with assumptions, disciplines, values or interests, so that these can be clearly appraised. This reminds experts that absence of evidence of harm is not the same as evidence of absence of harm. It also allows scenario analysis and the consideration of sensitivity, enabling more accountable evaluation. For example, it could allow experts to highlight conditional decision rules aimed at maximizing the best or worst possible outcomes, or minimizing regrets (ref. 5).
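To make such conditional decision rules concrete, here is a minimal sketch with an invented payoff table (the options, scenarios and numbers are hypothetical, not from the article). When no probabilities can be attached to the scenarios, an optimistic rule, a pessimistic rule and a regret-minimizing rule can each recommend a different option, and each recommendation is explicitly conditional on the attitude it encodes.

```python
# Three classic decision rules applied to one hypothetical payoff table.
# Under Knightian uncertainty no probabilities attach to the scenarios,
# so the recommendation is conditional on the rule (the attitude) chosen.

options = {  # payoff of each option under two scenarios
    "path A": {"benign": 100, "harsh": -50},
    "path B": {"benign": 60,  "harsh": 10},
    "path C": {"benign": 30,  "harsh": 25},
}
scenarios = ["benign", "harsh"]

# Optimistic rule: maximize the best possible outcome (maximax).
maximax = max(options, key=lambda o: max(options[o].values()))

# Pessimistic rule: maximize the worst possible outcome (maximin).
maximin = max(options, key=lambda o: min(options[o].values()))

# Regret rule: minimize the worst shortfall from the best achievable
# payoff in each scenario (minimax regret).
best_in = {s: max(options[o][s] for o in options) for s in scenarios}
worst_regret = {o: max(best_in[s] - options[o][s] for s in scenarios)
                for o in options}
min_regret = min(worst_regret, key=worst_regret.get)

print("maximize best case: ", maximax)     # path A
print("maximize worst case:", maximin)     # path C
print("minimize regret:    ", min_regret)  # path B
```

Presenting all three recommendations side by side, each labelled with its underlying attitude, is the kind of plural, conditional output advocated here.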

The few sporadic examples of the application of this approach show that it can be practical. One particularly politicized and high-stakes context for expert policy advice is the setting of financial interest rates. The Bank of England's Monetary Policy Committee, for example, describes its expert advisory process as a 'two-way dialogue' with a priority placed on public accountability. Great care is taken to inform the committee, not just of the results of formal analysis by the sponsoring bodies, but also of complex real-world conditions and perspectives. Reports detail contrasting recommendations by individual members and explain the reasons for differences (ref. 6). Why is this kind of thing not normal in science advice?

When scientists are faced with unmeasurable uncertainties, it is much more usual for a committee to spend hours negotiating a single interpretation across a spread of contending contexts, analyses and judgements. From my own experiences of standard-setting for toxic substances, it would often be more accurate and useful to accept these divergent expert interpretations and focus instead on documenting the reasons. In my view, concrete policy decisions could still be made, and possibly more efficiently. Moreover, the relationship between the decision and the available science would be clearer, and the inherently political dimensions more honest and accountable.

Problems of ambiguity arise when experts disagree over the framing of possible options, contexts, outcomes, benefits or harms. Like uncertainty, these cannot be reduced to risk analysis, and they demand plural and conditional treatment. Such methods can highlight rather than conceal different regulatory questions, such as: 'what is best?', 'what is safest?', 'is this safe?', 'is this tolerable?' or (as is often routine) 'is this worse than what we have now?'. Nobel-prizewinning work in rational choice shows that when ambiguity rules, there is no guarantee, as a matter of logic, that scientific analysis will lead to a unique policy answer (ref. 7). Consequently, definitive 'science-based' decisions are not just potentially misleading; they are a fundamental contradiction in terms.

METHODS THAT WORK

One practical example of ways to be plural and conditional when considering questions and options, as well as in deriving answers, is multicriteria mapping. Other participatory and deliberative procedures include interactive modelling and scenario workshops, as well as Q-method and dissensus methods. Multicriteria mapping makes use of simple but rigorous scoring and weighting procedures to reveal the ways in which overall rankings depend on divergent ways of framing the possible options (a minimal sketch follows below). In 1999, Unilever funded me and colleagues to use multicriteria mapping to study the perspectives of different leading science advisers on genetically modified (GM) crops (ref. 8).
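The core scoring-and-weighting move can be sketched in a few lines. Everything below is invented for illustration (the options, criteria, scores and weights are not taken from the 1999 study); it shows how the same performance scores, weighted under different framings, can reverse an overall ranking rather than yield one definitive answer.

```python
# Minimal multicriteria-mapping sketch: identical performance scores,
# weighted under two different framings, yield opposite rankings.
# Options, criteria, scores and weights are all hypothetical.

scores = {  # option -> score per criterion (0-100, higher is better)
    "GM crops": {"yield": 90, "ecological safety": 30},
    "organic":  {"yield": 40, "ecological safety": 85},
}

framings = {  # each framing weights the criteria differently
    "productivity first": {"yield": 0.8, "ecological safety": 0.2},
    "precaution first":   {"yield": 0.2, "ecological safety": 0.8},
}

for framing, weights in framings.items():
    overall = {
        option: sum(weights[c] * s for c, s in per_criterion.items())
        for option, per_criterion in scores.items()
    }
    ranked = sorted(overall, key=overall.get, reverse=True)
    print(f"{framing}: ranking {ranked}, weighted scores {overall}")
```

Rather than collapsing these results into one number, multicriteria mapping keeps each ranking attached to the framing that produced it, so decision-makers can see exactly which assumptions drive which conclusion.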

[Figure: UNCERTAINTY MATRIX. A tool to catalyse nuanced deliberations: experts must look beyond risk (top-left quadrant) to ambiguity, uncertainty and ignorance, using quantitative and qualitative methods. The columns distinguish knowledge about possibilities, and the rows knowledge about probabilities, each as either unproblematic or problematic. Political pressures tend to push attention from plural conditional methods towards single definitive ones.

RISK (probabilities and possibilities both unproblematic): risk assessment; optimizing models; expert consensus; cost–benefit analysis; aggregated beliefs.

UNCERTAINTY (probabilities problematic, possibilities unproblematic): interval analysis; scenario methods; sensitivity testing; decision rules; evaluative judgement.

AMBIGUITY (probabilities unproblematic, possibilities problematic): interactive modelling; participatory deliberation; focus & dissensus groups; multicriteria mapping; Q-method, repertory grid.

IGNORANCE (probabilities and possibilities both problematic): monitoring & surveillance; reversibility of effects; flexibility of commitments; adaptability, resilience; robustness, diversity.]
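Read as a lookup table, the matrix amounts to a simple mapping from the two knowledge dimensions to a family of methods. The sketch below transcribes the method lists from the figure; the boolean encoding is merely one convenient representation.

```python
# The uncertainty matrix as a lookup: given whether knowledge about
# probabilities and about possibilities is problematic, return the
# quadrant and the method families the figure associates with it.

METHODS = {
    # (probabilities problematic?, possibilities problematic?)
    (False, False): ("RISK",
                     ["risk assessment", "optimizing models",
                      "expert consensus", "cost-benefit analysis",
                      "aggregated beliefs"]),
    (True, False):  ("UNCERTAINTY",
                     ["interval analysis", "scenario methods",
                      "sensitivity testing", "decision rules",
                      "evaluative judgement"]),
    (False, True):  ("AMBIGUITY",
                     ["interactive modelling", "participatory deliberation",
                      "focus & dissensus groups", "multicriteria mapping",
                      "Q-method, repertory grid"]),
    (True, True):   ("IGNORANCE",
                     ["monitoring & surveillance", "reversibility of effects",
                      "flexibility of commitments", "adaptability, resilience",
                      "robustness, diversity"]),
}

# Example: an emerging technology where both dimensions are problematic.
quadrant, methods = METHODS[(True, True)]
print(quadrant, "->", ", ".join(methods))
```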


The backing of this transnational company helped draw high-level UK government attention. A series of civil servants told me, in quite colourful terms, that results mapped out in plural, conditional fashion would be absolutely no use in practical policy-making. Yet when a chance finally emerged to present results to Mo Mowlam, the relevant cabinet minister, the reception was very positive. She immediately appreciated the value of having alternative perspectives laid out for a range of policy options. It turned out, in this case, that the real block to a plural, conditional approach was not the preferences of the decision-maker herself, but those of some of the people around her.

In my experience, it is the single definitive representations of science that are most vulnerable to political manipulation. Plural, conditional approaches are not immune, but they can help make political pressures more visible. Indeed, this is what happened during another GM policy process in which I was involved: the 2003 UK science review of GM crops. Reporting included explicit discussion of uncertainties, gaps in knowledge and divergent views, and was described as 'neither a red nor a green light' for GM technology. A benefit of this more open approach is that it helped GM proponents and critics to work more effectively together during the committee deliberations, without a high-stakes, 'winner takes all' dynamic. There was more space to express alternative interpretations, free from implications that one party or another was wrong. This is important in a highly politicized area such as GM science, where there are entrenched interests on both sides. Yet this unusual attempt to acknowledge uncertainty was not universally popular. Indeed, it was also the only occasion, to my knowledge, on which the minutes of a UK science advisory committee formally documented covert attempts to damage the career of one of its members (me, in this case) (ref. 9). Perhaps for political rather than scientific reasons, this experiment towards plural and conditional advice has not been repeated.

A further argument for using more plural approaches arises from the state of ignorance, in which we don't know what we don't know. Ignorance typically looms in the choice of which of a range of feasible, economically viable future paths to support, whether through funding or regulation, for emerging technologies. In a finite and globalizing world, no single path can be fully realized without detracting from the potential for others. Even in the most competitive consumer markets, for instance, development routinely 'locks in' to dominant technologies such as the QWERTY keyboard or VHS tape. The same is true of infrastructures, such as narrow-gauge rail, AC electricity or light-water reactors. This is not evidence of inevitability, but of the crowding out of potential alternatives. Likewise, locking-in occurs in the prioritizing of certain areas of scientific enquiry over others. The paths taken by scientific and technological progress are far from inevitable. Deliberately or blindly, the direction of progress is inherently a matter of social choice (ref. 10).
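The lock-in dynamic itself is easy to demonstrate. The following sketch is a stylized increasing-returns adoption model in the spirit of the QWERTY and VHS examples (the model and its parameters are illustrative, not from the article): because reinforcement is superlinear, one technology ends up with nearly all adopters, and which one wins is settled by early chance rather than intrinsic merit.

```python
# Stylized lock-in sketch: an increasing-returns adoption model in which
# early, essentially random choices crowd out the alternative.
import random

def simulate(adopters=10_000, seed=None):
    rng = random.Random(seed)
    shares = {"A": 1, "B": 1}  # one notional early adopter each
    for _ in range(adopters):
        # Probability of choosing A grows faster than A's share:
        # a stylized network/compatibility benefit (superlinear).
        wa, wb = shares["A"] ** 2, shares["B"] ** 2
        choice = "A" if rng.random() < wa / (wa + wb) else "B"
        shares[choice] += 1
    return shares

# Different random histories lock in different winners: evidence of
# path dependence, not of the winner's inevitability or superiority.
for seed in range(5):
    print(seed, simulate(seed=seed))
```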

A move towards plural, conditional advice would help avoid erroneous one-track, 'race to the future' visions of progress. Such advice corrects the fallacy that scepticism over a specific technology implies a general anti-science sentiment. It defends against simplistic or cynical support for some particular favoured direction of change that is backed on the spurious grounds that it is somehow synonymous with 'sound science' or uniquely 'pro-innovation'.

Instead, plural, conditional advice helps enable mature and sophisticated policy debate on broader questions. How reversible are the effects of a particular path, if we learn later that it was ill-advised? How flexible are the associated industrial and institutional commitments, allowing us later to shift direction? How adaptable are the innovation systems? What part might be played by the deliberate pursuit of diverse approaches, to hedge ignorance, defend against lock-in or foster innovation in any given area?

Thus, such advice provides the basis for a more equal partnership between social and natural science in policy advice. Plural and conditional advice may also help resolve some polarized fault-lines in current debates about science in policy. It shows how we might better integrate quantitative and qualitative methods; articulate risk assessment and risk management; and reconcile science-based and precautionary appraisal methods.

A move towards plural and conditional expert advice is not a panacea. It cannot promise escape from the deep intractabilities of uncertainty, the perils of group dynamics or the perturbing effects of power. It differs from prevailing approaches in that it makes these influences more rigorously explicit and democratically accountable.

Andy Stirling is research director at SPRU (Science and Technology Policy Research) and co-directs the joint Centre for Social, Technological and Environmental Pathways to Sustainability at Sussex University, Falmer, Brighton BN1 9QE, UK.
e-mail: [email protected]

1. Knight, F. Risk, Uncertainty and Profit (Houghton Mifflin, 1921).
2. Wynne, B. Glob. Environ. Change 2, 111–127 (1992).
3. Stirling, A. Ann. NY Acad. Sci. 1128, 95–110 (2008).
4. Stirling, A. Sci. Technol. Hum. Values 33, 262–294 (2008).
5. Farman, J. in Late Lessons from Early Warnings: The Precautionary Principle 1896–2000 (eds Harremoës, P. et al.) 76–83 (European Environment Agency, 2001).
6. Treasury Committee Inquiry into the Monetary Policy Committee of the Bank of England: Ten Years On (Bank of England, 2007); available at go.nature.com/v2h4l
7. Leach, M., Scoones, I. & Stirling, A. Dynamic Sustainabilities: Technology, Environment, Social Justice (Earthscan, 2010).
8. Stirling, A. & Mayer, S. Environ. Plann. C 19, 529–555 (2001).
9. UK GM Science Review Panel Minutes of the Seventh Meeting (2003); available at go.nature.com/lxbmb
10. ESRC Centre on Social, Technological and Environmental Pathways to Sustainability A New Manifesto (Univ. Sussex, 2010); available at go.nature.com/zqkg

[Figure: THE PERILS OF SCIENCE-BASED ADVICE. A survey of 63 peer-reviewed studies of health and environmental risks associated with energy technologies. Individual studies offer conclusions with surprisingly narrow uncertainty ranges, yet together the literature offers no clear consensus for policy-makers. For each energy option (coal, oil, gas, nuclear, hydro, wind, solar, biomass) the chart shows the lowest result, 25th percentile, median, 75th percentile and highest result for economic risk (in dollars per megawatt hour, on a logarithmic axis from 0.01 to 10,000), together with the number of studies covering each option; variability across studies runs from low to high. Source: ref. 3.]
