
ROGER Z. GEORGE

Fixing the Problem of Analytical Mind-Sets: Alternative Analysis

The main difference between professional scholars or intelligence officers on the one hand, and all other people on the other hand, is that the former are supposed to have had more training in the techniques of guarding against their own intellectual frailties.

    Sherman Kent, Strategic Intelligence, 1949

The surprise to me is not that there have been and will continue to be surprises, but that we are surprised that there are surprises. . . . As von Clausewitz wrote, "The unexpected is the prince of the battlefield."

    Donald Rumsfeld, 1998

I am conscious every day of how important it is for our analysts to challenge the conventional wisdom, to separate what we really know from what we merely think, to consider alternative outcomes, in short, not to fall victim to mind-set, overconfidence, or anyone's pet paradigm.

    DDCI John McLaughlin, 2001

Dr. Roger Z. George is currently the Director of Central Intelligence Faculty Representative to the National War College at the National Defense University, Washington, D.C., where he teaches international politics, national security policymaking, and intelligence studies. A career intelligence analyst at the Central Intelligence Agency, he was a senior analyst in its Office of Policy Support, where he fostered the development and use of Alternative Analysis methodologies. Dr. George, a specialist on German and European security policies, was the National Intelligence Officer for Europe from 1991 to 1995, and director of a policy planning unit within the Office of the Assistant Secretary of Defense for International Security Affairs from 1995 to 1997. The opinions expressed in this article are solely those of the author and do not necessarily represent the views of the United States government or any of its agencies.


International Journal of Intelligence and CounterIntelligence, 17: 385–404, 2004

Copyright © Taylor & Francis Inc. ISSN: 0885-0607 print/1521-0561 online

    DOI: 10.1080/08850600490446727

Ever since Pearl Harbor, the United States Intelligence Community, senior civilian policymakers, and military strategists have been fixated on improving their understanding of America's adversaries, and on averting major surprises that would threaten the nation's security. The ongoing controversy over the October 2002 National Intelligence Estimate (NIE) on Iraqi weapons of mass destruction (WMD) is merely the latest example of how reality can deviate from the best attempts to characterize it. In many, if not most cases, the paternity of surprise is usually traced to an "intelligence failure," which can take many forms: failure of access, such as the inability to penetrate a terrorist cell; the missed collection of an event or activity, such as India's 1998 nuclear test; and the ineffective dissemination of intelligence information, as was the case prior to the events of September 2001.

Failure can also be the product of flawed work by intelligence analysts, and by how intelligence information is understood and used by policymakers and military commanders.

THE SECOND OLDEST PROFESSION AND A CONTEMPORARY CONCERN

Lamenting the state of intelligence during the Napoleonic War, Carl von Clausewitz noted that "Many intelligence reports are contradictory; even more are false, and most are uncertain."1 He aptly captured the frustration and disappointment that so many generals and statesmen have expressed with the quality of the intelligence on which they must base critical decisions. Good intelligence analysis can aid strategists in understanding an enemy's intentions and capabilities, in considering the broader context of events and how others might behave, and in forecasting the short- and long-term risks and opportunities presented by different courses of action.

Ultimately, however, strategists, no less than intelligence analysts, must develop a sharp appreciation for what intelligence can and cannot be expected to provide. They must also rely on their own intellectual abilities and expertise when reaching momentous decisions of war and peace. The success of their policies often depends on how well strategists, not just intelligence analysts, accurately assess, understand, and exploit the international environment in which they are operating.

Mind-sets can pose a fatal trap in that process: history is full of examples in which commanders have erred because they held to an inaccurate picture of the other's values, goals, intentions, or capabilities. A simple definition of a "mind-set" might be a series of expectations through which a human being sees the world. Over time, the strategist and intelligence analyst develop these expectations, based on how past events have occurred; each will draw general conclusions about the relationships among important international phenomena, about how states typically behave (e.g., maximizing power vis-à-vis others), or about foreign leaders' motivations. As new events occur, data consistent with earlier patterns of beliefs are more likely to be accepted as valid, while data that conflict with an analyst's expectations are discounted or set aside. It is human nature, according to many psychological studies, for individuals to perceive what they expect to perceive, and holding such mind-sets is virtually unavoidable.2 The more expert one becomes, the more firm becomes one's set of expectations about the world. While these mind-sets can be very helpful in sorting through incoming data, they become an Achilles' heel to a professional strategist or intelligence analyst when they become out of date because of new international dynamics. Knowing when a mind-set is becoming obsolete and in need of revision can test the mettle of the best expert.

This challenge has no perfect or permanent solutions. But the past decade has brought a greater recognition that the application of rigorous analytic techniques can help significantly in averting the likelihood of surprise by uncovering analytical mind-sets and sensitizing policymakers to the inherent uncertainty surrounding the major international developments that they confront each day. U.S. strategists would do well to understand these advances in analytical tradecraft, in order to encourage the Intelligence Community to better exploit them and to guard against susceptibility to distorted or inaccurate views of the world.

The study of intelligence analysis is a relatively new field. While spying is an old art, and the exploitation and analysis of the spy's information has always been a part of the business, few practitioners have concentrated their attention on how best to organize analysis for the best possible results. In the United States, multi-disciplinary intelligence analysis, as a central and discrete function, began only as recently as the 1940s, when a small group of historians was assembled as the Research and Analysis (R&A) branch within what soon became the Office of Strategic Services (OSS). R&A was the precursor to today's Central Intelligence Agency (CIA) Directorate of Intelligence (DI). From the very beginning, the problem of mind-sets was recognized as a key impediment to effective intelligence analysis.

Sherman Kent, a Yale historian, wartime member of the OSS, and faculty member at the National War College in 1947, was one of the earliest intelligence officers to identify, among many things, the problem of mind-sets as a barrier to proper interpretation of international developments. Kent became known as the father of modern American intelligence analysis for his efforts to codify the elements of good intelligence analysis, and to argue that intelligence analysis is another social science discipline with its own set of methodological problems. As a member of the CIA's Board of National Estimates, established in 1949,

Kent spent most of his career trying to instill academic rigor into the new profession of intelligence analysis. In a ground-breaking book, Strategic Intelligence for American World Policy, he underlined the importance of marshalling the best minds with access to the most complete information, in order to give U.S. policymakers the foresight necessary to make good decisions. If expertise and information were available, applying the scientific method would ensure the development of the right conclusions. Assembling facts, developing hypotheses, and critically weighing evidence was the mission of the intelligence analyst. Moreover, the measure of good analysis was the ability to establish one or more hypotheses that best approximate a situation's "ground truth."

Yet Kent also warned that the intelligence profession had its own unique methodological problems. First, "in our business we are as likely to be faced by the problem of a plethora of raw intelligence as by one of its paucity. In many of our tasks we have so large a volume of data that no single person can read, evaluate, and mentally file it all."3 How, then, can intelligence analysts guard against the difficulties inherent in analyzing a mountain of data, much of which may be very incomplete, inaccurate, or false, without running the risk of improperly interpreting it? Second, Kent acknowledged that analysts come to their profession possessing different temperaments and abilities in conceiving alternative hypotheses about the world. "Some minds are fertile in the generation of new hypotheses and roam freely and widely among them," while other minds "not merely are sterile in this respect but actively resist the new idea."4 To produce the best strategic intelligence requires people who are imaginative in their hypotheses but also able to identify and adjust for their own analytical preconceptions and prejudices. Despite all the technological advances made in U.S. intelligence capabilities, the battle against mind-sets, or what Kent described as "intellectual frailties," remains unfinished business.

    THE ENDURING PROBLEM: MIND-SETS

One of an analyst's most difficult tasks is the challenge of identifying the proper analytical framework for interpreting incomplete information. For example, in hindsight, information existed in 1941 which might have warned Washington of a surprise Japanese attack against the U.S. Fleet in Honolulu; instead, partly because of Japan's war in China and occupation of Indochina in 1941, U.S. policymakers anticipated an attack in East Asia and were unprepared to think about an attack on what might now be called the "Homeland." A comment attributed to President Franklin D. Roosevelt's Secretary of the Interior, Harold Ickes, reveals a mind-set that also contributed to the disaster: "It seems to be pretty well understood . . . that the Japanese are pretty poor airmen. . . ." In 1950, reports had indicated increased Chinese military movements, but analysts dismissed the possibility of China entering the Korean War. In September 1962, the U.S. Intelligence Community reported an increase in Soviet shipments of military equipment to Cuba, but discounted the possibility of the Soviets introducing offensive missiles as too provocative to be a "rational" decision. In defending his own mis-estimate, Sherman Kent wrote in 1964, "No estimating process can be expected to divine exactly when the enemy is about to make a dramatically wrong decision. We were not brought up to underestimate our enemies." Whether this was "mirror-imaging" (imposing an American definition of rationality on an adversary) or simply a lack of understanding of how a foreign leadership assesses risks and opportunities, the results were disastrous for the U.S. Intelligence Community, and for Kent personally.5

The misinterpretation of Soviet intentions in 1962 was not to be the last time the CIA and other U.S. intelligence analysts fell into the mind-set trap. Nor has the U.S. been alone in suffering from this symptom. In October 1973, the Yom Kippur War broke out, despite repeated U.S. intelligence assessments that the probability of war was low. In the face of mounting evidence that Egypt and Syria were mobilizing, both U.S. and Israeli intelligence analysts continued to insist that these movements could be read as defensive rather than offensive. The dominant analytical mind-set was that the Arab states were inferior to Israeli military might, and could not possibly harbor realistic expectations of victory against the Israel Defense Forces (IDF).6 U.S. and Israeli analysts were ultimately proven correct; the combined Syrian and Egyptian armies were not able to defeat Israel. Nonetheless, Arab political leaders were able to achieve far more important political objectives: namely, inflicting a costly tactical surprise on Israel, puncturing its sense of invulnerability, and forcing the U.S. and USSR to reinvigorate a stalled negotiating process. As former Secretary of State Henry A. Kissinger recounts, Egyptian President Anwar Sadat literally told Washington what he was doing and U.S. officials refused to believe him.7 Again and again, whether it was the Soviet invasion of Afghanistan or Saddam Hussein's 1990 attack on Kuwait, analysts have had to admit that their assumptions about foreign actors' intentions have frequently been wide of the mark.

Writing more recently about analytical tradecraft, Richards Heuer has identified a lack of imagination rather than a lack of information as the culprit. Analysts naturally favor information that fits their well-formed mental models, and often dismiss other information that might support an alternative hypothesis. Or, they minimize the importance of intelligence gaps and foreign deception and denial efforts, believing U.S. intelligence systems are spoof-proof or American analysts too clever to be outwitted by third-class intelligence services. Even more invidious, the more expert one becomes, the more likely one is to discount the possibility of surprise. The disadvantage of a mind-set, notes Heuer, is that it "can color and control our perception to the extent that an experienced specialist may be among the last to see what is really happening when events take a new and unexpected turn."8

Ironically, the CIA's impressive record of developing sophisticated analytical methodologies to understand closed societies and complex international trends has not been able to overcome the fundamental challenge of eliminating these human cognitive blinders to unexpected events. Despite the efforts of Kent and others, as the Cold War drew to a close, senior intelligence managers became more aware of their discipline's weak intellectual underpinnings.9

    TRADECRAFT 2000: BUILDING BLOCKS FOR ALTERNATIVE ANALYSIS

With the end of the Soviet Union and the outbreak of the first Gulf War, senior intelligence managers began to take stock of their analytical profession and its deficits. In 1993–1994 a major reappraisal of DI tradecraft found that what really mattered to policymakers was not the opinion of DI analysts, but "What do we know and how do we know it?" A clear distinction was needed between known facts and analysts' findings. Moreover, these findings were no longer considered unassailable, particularly if policymakers were not able to readily understand the analytic assumptions and logic that supported those conclusions. As the somewhat critical Paul D. Wolfowitz put it, "I frequently think I am as capable of coming up with an informed opinion about a matter as any number of the people within the Intelligence Community who feel that they have been uniquely anointed with this responsibility."10

Out of this review came the most noticeable change in the CIA's analytical tradecraft since the establishment of the Intelligence Directorate. In simple terms, new DI guidelines were crafted to make CIA analysis transparent to the policymaker, paralleling similar guidance that General Colin Powell once gave to his military intelligence staff:

What do we know?
How do we know it?
What do we not know?
And only lastly, what do we think?11

To inculcate these basic principles of good analysis, the DI conducted two-week-long workshops for all intelligence analysts and many managers. Termed "Tradecraft 2000," these workshops became virtually mandatory for junior analysts to help them strengthen their craft. Many first-line managers of analysts also internalized these principles by serving as instructors for the Tradecraft 2000 courses.



A new art-form was created. Called "linchpin analysis," intelligence assessments were to be constructed by using the following steps:

• Identify the main uncertain factors or key variables (drivers) that will determine an outcome.

• Identify working assumptions (linchpin premises) about how the key drivers will operate.

• Advance convincing evidence and reasoning to support the linchpin premises.

• Address any indicators or signposts that would render linchpin premises unreliable.

• Ask what dramatic events or triggers could reverse the expected outcomes.12

    Tradecraft 2000

Behind this seemingly simple recipe for good analysis stood a radical admission: namely, that the CIA had to regain credibility with the nation's policymakers by establishing more rigor and transparency in its analysis. Moreover, the CIA was convinced that to remain relevant to the policy process, it would have to provide more customized support to policymakers. Instead of marketing general assessments to the broadest possible set of consumers, from assistant secretaries down to the junior desk officer, the DI would aim to support senior-level policymakers and tailor its analysis to their specific intelligence needs. What flowed from this new vision was a redesign of many intelligence products, in which the DI would target a very small number of customers who already knew their issues extremely well; moreover, in some cases it also meant providing sensitive information regarding "how we know it" in order to give senior policymakers a firmer basis for placing confidence in CIA assessments. Simultaneously, the DI began deploying increasing numbers of senior officers to key policymaking agencies (NSC, State, Defense, etc.) and providing Cabinet- and sub-Cabinet-level customers with daily personal briefers who could provide them with the latest information tailored to their agendas and give them a direct channel to CIA expertise.

THE JEREMIAH AND RUMSFELD COMMISSIONS: THE PUSH FOR GREATER CHANGE AND ALTERNATIVE ANALYSIS

By the mid-1990s, in the midst of what would prove to be nearly a decade of shrinking intelligence budgets, the recognition was growing that a customer-focused approach to intelligence analysis also served to prioritize limited analytical resources. By 1995, the CIA's analytic ranks had shrunk by 17 percent from where they had been in 1990. By the end of the 1990s, the DI had declined by 22 percent.13 In the post-Cold War order, fewer analysts meant satisfying the intelligence needs of only senior civilian and military leaders, focusing analytical attention on their specific agenda items, and accepting a lower level of global coverage against second- and third-order intelligence topics.

Then, in May 1998, India exploded five nuclear devices at an underground test facility, without any CIA warning to United States policymakers. Soon after, Pakistan did the expected and conducted tests of its own. Congressional oversight committees demanded to know how such an intelligence surprise could have occurred. Responding to this pressure, as well as to Clinton administration officials' grumbling about the lack of advance warning, DCI George J. Tenet asked retired Admiral David Jeremiah to review the record to see what had led to this failure to warn the policymakers. While the report remains classified, Admiral Jeremiah noted at his June 1998 press conference that his "bottom line is that both the intelligence and the policy communities had an underlying mind-set going into these tests that the BJP [newly governing party] would behave as we behave." As in the 1962 Cuban SNIE, CIA analysts were accused of harboring a mind-set that hampered their ability to see the world as a foreign government might. Going further, Jeremiah proposed that CIA analysts be more aggressive in thinking through how the other side might behave: "You could argue that you need to have a contrarian view that might be part of our warning process, ought to include some divergent thinkers who look at the same evidence and come to a different conclusion and then you test that different set of conclusions against other evidence to see if it could be valid."14

Almost simultaneously, the 1998 Commission to Assess the Ballistic Missile Threat to the United States issued a similar assessment. It found analysts unwilling to make estimates that extended beyond the hard evidence they had in hand, which effectively precluded developing and testing alternative hypotheses about actual foreign programs taking place.15 The Commission, headed by Donald H. Rumsfeld, was asked by Congress to review the Intelligence Community's analysis of the global ballistic missile forecast because there was significant concern that the NIE had played down the long-term threat at a time when the Clinton administration was deliberating on the deployment of a national missile defense system. The Commission provided its views to the congressional oversight committees in July 1998. Its general conclusions were that "the threat to the U.S. posed by these emerging capabilities is broader, more mature and evolving more rapidly than has been reported in estimates and reports by the Intelligence Community."16 The Commission found that, had analysts considered not just what they knew, but also what they did not know, they might have been able to employ alternative hypotheses and thereby develop better indicators and collection priorities that could have narrowed information gaps, and thus led to better assessments. More importantly, Commissioners believed that the result would be earlier warning than if analysts wait for "proof of a capability in the form of hard evidence of a test or a deployment."17

The recommendations of these two commissions, along with a separate in-house Inspector General report on Alternative Analysis, renewed interest in and attention to the state of DI analytical tradecraft.18 Among other things, Richards Heuer's own work on mind-sets became required reading among senior intelligence officers. Published in 1999, his book, Psychology of Intelligence Analysis, became the intellectual foundation for a major revamping of the Directorate of Intelligence's analytical training. As Heuer saw it, the Directorate's analytical objectives should be to:

• Encourage products that clearly delineate their assumptions and chains of inference and specify the degree and source of uncertainty involved in the conclusions.

• Support analyses that periodically reexamine key problems from the ground up in order to avoid the pitfalls of the incremental approach.

• Emphasize procedures that expose and elaborate alternative points of view.

• Educate consumers about the limitations as well as the capabilities of intelligence analysis; define a set of realistic expectations as a standard against which to judge analytical performance.19

Collectively, these developments led to a new push for improvements in analytic practice, and specifically the adoption of techniques to deal with the recurring problem of mind-sets that shaped analytic judgments.

    ALTERNATIVE ANALYSIS

Alternative Analysis (AA) seeks to impose an explicit self-review by using specific techniques to reveal unconscious analytical assumptions, challenge weak evidence or logic, and consider alternative hypotheses or outcomes, even in the absence of convincing evidence. Simply put, intelligence analysts are now obliged to question explicitly and rigorously the assumptions that underlie their conclusions, and to guard against conventional wisdom masking a fundamental change in the dynamics of an issue. In many ways, AA merely builds upon the earlier Tradecraft 2000 emphasis that encouraged analysts to identify linchpin assumptions, key drivers, indicators, and triggers to future events. Unlike Tradecraft 2000, however, AA seeks to display this analytical self-review to the policymaker and not merely to use it as an in-house critique. The most powerful techniques include: Key Assumptions Checks, Devil's Advocacy, Team A/Team B, Red Cell exercises, Contingency "What If" Analysis, High Impact/Low Probability Analysis, and Scenario Development. Similarities and overlap are present in some of these techniques, and most are designed to highlight uncertainty and identify intelligence collection gaps.



  • A. Key Assumptions Check

When drafting important analysis that contains far-reaching conclusions, analysts must identify key "linchpin" assumptions as well as the key drivers or factors that shape these assumptions. By explicitly citing the assumptions and drivers, analysts can test the validity of the relationships between the two. For example, analysis may conclude that a foreign government is likely to introduce painful economic reforms without creating major instability. In this case, the analyst might need to identify as a key assumption that security forces will remain loyal and be willing to use lethal force to maintain order. Should that assumption not hold, or if signs of dissatisfaction should emerge within the security services, then perhaps the analyst's judgment that the leadership will impose reforms will become less well founded. By making a list of critical assumptions and drivers transparent, analysts allow the policymaker to see the argumentation behind the key conclusion, consider whether the assumptions are valid, and understand what evidence might then lead analysts to alter their judgments.

B. Devil's Advocacy

This technique is most valuable for challenging a deeply ingrained view of an issue. When confidence in an important, perhaps war-or-peace, judgment is high, the use of such contrarian analysis can be more than justified. To be sure, making such a case sometimes involves a contrived argument that selectively uses information to challenge the conventional wisdom. The art of devil's advocacy is to turn accepted linchpin assumptions and key drivers on their heads. For example, the conventional assumption about India's BJP party in 1998 was that it would not test a nuclear device because it was leading a fragile coalition. A devil's advocate might have argued to the contrary: that the key to stabilizing a weak coalition would be a dramatic symbolic act to mobilize Indian nationalism behind the newly elected government. In May 1998, that would have been contrarian, and not supported by official Indian statements given to the United States government; in hindsight, however, a contrarian analysis might have sensitized U.S. policymakers to uncertainty about how a newly elected and unproven government might perceive its own national interests. Similarly, a devil's advocate might have been assigned the task of arguing that Saddam Hussein intended to invade Kuwait in 1990, in order to challenge the conventional view that Iraq was using a high-stakes bluff to extract concessions from other Arab states.

Ever since 1973, Israeli military intelligence has performed devil's advocacy on a selective basis to convince itself that its neighbors' military maneuvers are not disguised preparations for war. The Defense Intelligence Agency (DIA) has also employed this technique periodically to challenge conventional views of strategic military issues. Belief in the position is not necessary when arguing a contrarian case. Indeed, the reality is that most devil's advocacy proves to be unpersuasive. Still, the exercise has value in raising confidence levels in, and perhaps refining, the prevailing analytic judgment. But its chief drawback is that contrived advocacy can be too easily dismissed if senior intelligence officials and policymakers do not put much credence in the technique.

C. Team A/Team B

This technique has been used periodically when major strategic issues have been judged too important to let conventional wisdom drive policy. For example, in 1976, then-DCI George H. W. Bush invited a "Team B" of outside experts to examine National Intelligence Estimates of Soviet strategic force developments and to propose its own conclusions.20 Similarly, the 1998 Rumsfeld Commission was, in essence, a Team B exercise that challenged the underlying assumptions of CIA analysis of foreign ballistic missile developments.

But the technique need not be used only when national policy issues are at stake. Whenever opposing views of an issue are strongly held, there is utility in laying out each side's linchpin assumptions and key drivers, then explicitly describing how the data support their conclusions. In presenting both sides of an argument, advocates on either team are exposed to an alternative way of thinking about an issue. Thus, a dialogue over analytical assumptions and logic is opened, rather than a simple focus on the key conclusions of either side's analysis. Unlike Devil's Advocacy, this approach is not contrived analysis. In the real world, analysts on both sides of an issue are usually present. Thus, the Team A/Team B method can take advantage of analytical disagreements and put both sides of important issues before the policymaker.

    D. Red Cell Analysis

As the Jeremiah Commission noted, the Intelligence Community should be making more frequent use of a "red cell" approach to analyzing foreign government behavior, and should avoid the tendency to mirror-image U.S. adversaries. Accordingly, a red cell is a group of analysts assembled to role-play senior leaders of a foreign government or entity, and to propose courses of action that would complicate American foreign and security policy objectives. Through this technique, analysts attempt to step out of an American strategic logic and instead reflect an adversary's mode of thinking. To be effective, the red cell draws from country experts who understand the other nation's culture as well as its political system; to the extent possible, these analysts try to look at the situation from the enemy's vantage point and produce recommendations that would approximate the cultural norms, decision-making style, and nationalist rhetoric used within actual foreign governing circles. As a written art-form, then, a red cell might produce a briefing book for a putative Saddam Hussein or Slobodan Milosevic on how to counter American diplomatic and military pressures.

These red cell products are aimed at provoking thinking among policymakers and strategists, and are not designed to be consensus-driven assessments. Seldom are they formally coordinated with mainline analytical units. Various government agencies have used red cell analysis to model how foreign leaders might develop diplomatic strategies, conduct negotiations, or wage asymmetric warfare against the United States. Most recently, in response to the attacks of 11 September 2001, the Director of Central Intelligence has established his own DCI Red Cell to think unconventionally about the full range of analytic issues. This work is shared, but not coordinated, with the CIA's DI offices. It is explicitly identified as intended "to provoke thought rather than provide authoritative assessment." The subject matter can range widely, from thinking like al-Qaeda cells to understanding what kinds of information campaigns would be most effective in countering terrorist recruitment propaganda. But such "outside the box" analysis can also blur the line between intelligence analysis and policy advocacy. For example, postulating what a credible terrorist recruitment strategy might be is only a short step from identifying the vulnerabilities of such a strategy and proposing measures that could undermine it. Red cell analysts are sometimes encouraged to devise creative courses of action that might undermine adversaries, without any expectation that these would reflect the analysts' policy preferences. Properly done, and with explicit caveats about its intended purposes, red cell work has found an eager policymaking audience and contributed to more creative strategic planning.

E. Contingency "What If" Analysis

While most conventional intelligence analysis focuses on what analysts believe is the most likely outcome, contingency "what if" analysis focuses on the possible causes and consequences of an unlikely event. For example, what if India were to decide to conduct nuclear tests (as it later did) and seek to deceive the United States? The analysts would then be compelled to consider the motives behind such a decision; they also would determine the signposts or indicators preceding that decision or, alternately, how the Indians might try to hide preparations from U.S. intelligence, and the indications of such deception. Moreover, how might the Indian government justify conducting a nuclear test despite expected

    396 ROGER Z. GEORGE

    INTERNATIONAL JOURNAL OF INTELLIGENCE

American displeasure? Thinking through these sorts of questions forces analysts and policymakers to appreciate what they do not know as much as what they do know. Could India be actively misleading the U.S.? Do U.S. diplomats have access to the small circle of Delhi decisionmakers who would actually make such a decision? Would a foreign government official lie to U.S. diplomats? How might a newly elected government judge the results to be more positive than the incumbent administration would? Such "what if" thinking forces analysts to discard the obvious American way of thinking. And, unlike Devil's Advocacy, the thinking is not contrived to argue a specific outcome, regardless of the merits of the case; instead, it asks the awkward question that might lead to further questions regarding the quality of U.S. intelligence information, the presence of foreign deception, and the possibly faulty linchpin assumption that underlies current intelligence judgments. It might also have permitted policymakers to see that India could easily tolerate U.S. criticism and sanctions if its actions caused more trouble for Pakistan (which shortly afterwards tested in response) and led to wider popular support for a newly elected Indian government.
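The signpost logic sketched above can be made concrete as a simple indicator checklist. The indicator names and the thresholds below are invented purely for illustration; they are not drawn from any actual warning methodology.

```python
# Hypothetical signpost checklist for a "what if" exercise (an unannounced
# nuclear test). Indicator names and thresholds are illustrative only.

def warning_level(indicators):
    """Map the share of observed signposts to a coarse warning level."""
    observed = sum(indicators.values())
    share = observed / len(indicators)
    if share >= 0.75:
        return "high"
    if share >= 0.5:
        return "elevated"
    if share > 0:
        return "watch"
    return "baseline"

signposts = {
    "activity_at_test_site": False,
    "movement_of_scientific_personnel": True,
    "hardening_of_official_denials": True,
    "spike_in_nationalist_rhetoric": False,
}

print(warning_level(signposts))  # prints "elevated" (2 of 4 signposts observed)
```

A parallel checklist of deception indicators (e.g., signs that diplomats' access to decisionmakers is being managed) would give analysts the same kind of structured handle on the "could India be actively misleading the U.S.?" question.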

F. High Impact/Low Probability Analysis (HI/LP)

Like "what if" analysis, this technique focuses on the examination of unlikely events of the kind that would have huge consequences for the United States. In this situation, analysts accept as a given that the event itself (such as an Indian nuclear test) has significant implications, and focus instead on the possible ways by which the Indian government might decide to conduct a test. In this case, analysts look for plausible combinations of domestic and international factors that could precipitate the occurrence of an unlikely event. What changes in key assumptions, or different sets of key drivers, would have to be present to overturn the prevailing analytical line that India would not test a nuclear device? Such argumentation is probably most useful as a longer analysis for a policymaker who is already expert on the issues and likely shares the conventional wisdom. The HI/LP analysis is more valuable to such specialists as a check on their own thinking, and less valuable to the policymaker who is a generalist and does not have a well-formed opinion on the issue.

    G. Scenario Analysis

Analysts facing more mysteries (unknowable) than secrets (discoverable) find it very useful to address intelligence questions by using the technique of multiple scenario development. This powerful approach explores a range of possible outcomes when there are many unknowns and no certainty about a single outcome. Typically, a group of experts will use structured

397 FIXING THE PROBLEM OF ANALYTICAL MIND-SETS

    AND COUNTERINTELLIGENCE VOLUME 17, NUMBER 3

brainstorming to identify key factors and forces that will shape an issue. First, experts will agree that a focal issue (for example, the testing of nuclear weapons in South Asia) is sufficiently important to justify exploring a range of futures. Next, experts describe what they believe is the conventional wisdom and general assumptions about the issue, e.g., a belief that neither India nor Pakistan would test because of international condemnation and possible sanctions. In the course of their brainstorming, analysts list areas of relative certainty (e.g., relative economic/political dependence on the U.S., technological capabilities to test, government control over nuclear materials, decisionmaking processes) and identify critical uncertainties (e.g., stability of governing coalitions, role of public opinion, perception of threat). Among those areas of uncertainty, analysts then select two or more key uncertainties (defined as "drivers") and develop a matrix of possible alternative scenarios.

One possible scenario matrix, for example, might display the relative stability of India's government coalition (from highly stable to highly unstable) as one axis and array a range of Indian threat perceptions (from highly benign to highly threatening) as the other axis. This will produce at least four different conditions: a highly stable or unstable coalition within a highly threatening or very benign environment. Analysts can then openly speculate as to the likely actions taken by a government under these different conditions. They can select the scenarios that are most divergent from the conventional wisdom and most threatening to U.S. interests for further examination. Typically, analysts will look for key developments or indicators (sometimes called signposts) that are associated with a specific scenario; these signposts would indicate the emergence of a scenario that will challenge U.S. policy objectives.

The virtue of scenario analysis is that it allows the decisionmaker to imagine what might happen under a range of plausible outcomes, and what actions might be needed to reduce the damage or take advantage of the opportunities posed. While the conventional wisdom (e.g., India will not test) can be described as one plausible scenario, analysts are able to expose policymakers to alternative futures that challenge them to think through what they might face if current analysis is operating under faulty assumptions or incomplete information.
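The mechanics of the two-driver matrix can be sketched in a few lines. The driver endpoints below follow the India example; the cell contents an analyst would attach (likely government actions, signposts) are left as placeholders.

```python
from itertools import product

# Two key uncertainties ("drivers"), each arrayed between its endpoints.
coalition_stability = ["highly stable", "highly unstable"]
threat_perception = ["highly benign", "highly threatening"]

# Crossing the two drivers yields the four scenario cells of the matrix.
scenarios = [
    {
        "coalition": c,
        "threat": t,
        "signposts": [],  # analysts attach per-cell indicators here
    }
    for c, t in product(coalition_stability, threat_perception)
]

for s in scenarios:
    print(f"{s['coalition']} coalition / {s['threat']} environment")
```

Adding a third driver multiplies the cells (2 × 2 × 2 = 8), which is one practical reason facilitators usually limit a workshop to the two most consequential uncertainties.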

    INSTITUTIONALIZING CHANGE

Perhaps the first sign of senior CIA support to improve analytical tradecraft, the Sherman Kent School for Intelligence Analysis opened in May 2000. Created with the primary mission of developing a more professional analytical cadre, the school now offers a wide variety of courses, including a four-month, entry-level career analyst course, as well as discipline-specific training on military, scientific, economic, leadership, and political analysis techniques. The Kent School's new analyst training included Alternative Analysis techniques,21 specifically aimed at more explicit testing of prevailing analytical judgments. This course emphasized using contrarian analysis, making the prevailing linchpin assumptions explicit in DI writing, and identifying best practices in Alternative Analysis (AA) that could be adopted throughout the DI. From 1999 to the present, AA workshops have instructed more than one-third of the Directorate's current analysts, as well as other IC analysts, in the proper use of these techniques. As an outgrowth of the AA workshops, the Directorate of Intelligence also produced a short primer on AA techniques, describing the purposes and applications of different techniques that could be used by individual analysts or teams to challenge their own thinking. It also stimulated intelligence analysis that would sensitize policymakers to the uncertainties surrounding some key intelligence judgments.22

    LIMITATIONS OF ALTERNATIVE ANALYSIS

These Alternative Analysis techniques, in addition to other more specialized gaming and simulation techniques, have been put to use in current publications when senior intelligence officials believed that going beyond conventional intelligence judgments was appropriate. But, consistent with the renewed emphasis on communicating the limitations of intelligence products to its customers, policymakers and senior military officials should be aware of some important limitations in the use of AA. The challenges of presenting Alternative Analysis to senior policymakers are almost as great as are the dangers of not using it at all.

First, some policymakers continue to prefer precise, single-point judgments from intelligence analysts; to these decisionmakers, the DI's resorting to "what if" or contrarian analysis undermines their confidence in what the CIA or other IC analysts are telling them. If on Monday the CIA asserts that an Indian nuclear test is unlikely, but on Tuesday writes an analysis suggesting, in the absence of compelling information, that New Delhi might possibly see things differently and could be planning to test secretly, then the reader might not know what the Agency thinks is most likely. Unless AA techniques are carefully explained and used to sensitize policymakers to important uncertainties in the CIA's analysis, the risk exists of its being considered "CYA" analysis.

The second pitfall to using AA techniques is that some of its methodologies are arcane. A key assumptions check assumes that the reader has a fairly sophisticated grasp of an issue. The decision to use AA or other techniques will always rest on the Agency's knowing the customers, their level of knowledge, and their interest in having the CIA


challenge their thinking on a vital issue. Naturally, there always will be senior policymakers who are not experts in their own right and will wonder why CIA is questioning its own analysis of a critical issue. Moreover, unless policymakers have actually been through a scenario development exercise, they probably will not understand the process by which the scenarios were developed. Very few have the luxury of participating in the lengthy workshops that scenario exercises often require.

In truth, some of these techniques turn out to be of most benefit to those

who use them. Experience has shown that analysts, as well as policymakers, who have participated in scenario development workshops value the brainstorming exercise but find the written product less helpful. Inevitably, the scenarios cannot capture all the interesting insights shared among experts and policymakers during the one- or two-day workshops.23

Similarly, the value of Devil's Advocacy is primarily its challenging of the weaker assumptions or evidentiary base of a current intelligence judgment. Since most of this contrarian analysis will be proven wrong, and will reaffirm the conventional wisdom rather than replace it, there is little value in publishing it. Only when Devil's Advocacy is judged useful to provoke a dialogue between policymakers and intelligence analysts would it serve the DI's interest in disseminating such analysis. In the end, most of the learning occurs during the process of using unconventional analytical techniques. The reported findings are not always judged as insightful as the critical thinking that went into the process.

Third, using Alternative Analysis will create friction within analytical ranks if there is a well-accepted view of important intelligence issues. Trying to argue against the current analytical line can be seen as undermining teamwork or even a sign of personal self-promotion. Hence, unless there is higher-level receptivity to AA, the analyst eager to try out Devil's Advocacy or other contrarian techniques against an analytical unit's conventional wisdom could face considerable resistance, if not open hostility. Senior intelligence leaders must encourage the use of these techniques as an important tradecraft tool that all analysts should practice. Even so, when strong views are held within an analytical office, trying to set up a Team A/Team B exercise, or conduct Devil's Advocacy against the prevailing wisdom, can set analysts on edge. The use of public investigative commissions, like the ones chaired by Admiral Jeremiah and Donald Rumsfeld, tends to accentuate the adversarial nature of these reviews. Many analysts will react defensively to the notion of using a Team A/Team B approach if it is perceived to be a blatant condemnation of their analysis and tradecraft. Some analysts can become so personally invested in being proven right that any hope of understanding the assumptions and logic behind analytical judgments, or of learning how to improve analysis, is derailed.


Fourth, Alternative Analysis can be resource intensive. Many managers believe it must be used sparingly, and only on the most important topics. Few intelligence analysts, much less their managers, will wish to invest the time and effort in conducting AA on a third-order issue which does not confront policymakers with major difficulties. To focus on a highly unlikely event that will not make a difference to anyone in Washington would be just as foolish as not conducting it against critical issues where the risks of being wrong might be catastrophic. Clearly, where there are major equities, and where the Directorate of Intelligence invests substantial resources (such as on Iraq, Iran, North Korea, China, counterterrorism, etc.), the case can then be made to employ these techniques periodically to test analysts' judgments and highlight where confidence in the conventional wisdom might be unjustified.

To proselytize the use of AA everywhere and every day is unrealistic, and a poor use of scarce analytical resources. As a rule, intelligence managers will have to be convinced that: (1) the issue is sufficiently important to justify using these techniques; (2) the consequences of being wrong would be a major surprise to the U.S. intelligence and policy communities; and (3) the use of the techniques will raise a policymaker's understanding of the issue and the uncertainties surrounding it. Similarly, good judgment is required in deciding when to present Alternative Analysis to a policymaker. Contrarian analysis can be downright irritating to the policymaking community after it has completed a major policy review and selected a course of action; had this same analysis been presented a few months earlier, however, it might have been well received and helpful in stimulating and raising the level of the policy debate.

    EVOLUTION OR REVOLUTION IN TRADECRAFT?

Over the past sixty years, has an evolution or revolution in analytical tradecraft taken place? The jury is still out. Without question, the analytical tradecraft has changed markedly since the end of the Cold War. The CIA is using Alternative Analysis more explicitly to challenge its analyses, raise important questions to policymakers, and reduce their susceptibility to surprise. The CIA is also providing a much fuller range of training opportunities for new and mid-career analysts that will improve their skills and understanding of policymakers' needs. The establishment of the DI's Sherman Kent School in 2000 was a major step toward professionalizing the CIA's analytical ranks and providing a center where the best practices in analytical tradecraft can be identified and used in advanced analytical training courses. Perhaps more far-reaching will be the fostering of new analytical units designed to emphasize nonconventional, nonlinear thinking. Already the Directorate is home to several such


groups. The Office of Transnational Issues has expanded the domain of its Strategic Analysis Group, which regularly conducts alternative analysis, uses sophisticated gaming and simulation techniques, and actively solicits outside expertise to challenge and enrich its own analysis. Similarly, the Kent School's small staff in the Global Futures Partnership (GFP) has become an in-house incubator of novel ideas and counterintuitive thinking. As its name implies, the GFP is active in partnering with American and even foreign academies interested in futures work that touches on global issues of mutual concern.

But there remains room for improvement. Has the Directorate of Intelligence employed these analytical tradecraft techniques enough? Are intelligence managers supportive enough of them? Do policymakers themselves understand the value of these techniques, and encourage their use to sharpen the CIA's own analytical skills? The answer to these questions is probably still no.

On the positive side, senior CIA officials have openly embraced the philosophy of challenging analysts to be more self-critical and skeptical of their own infallibility. "Connecting the dots" or "putting the puzzle pieces together" will always remain easier after the fact, but the U.S. Intelligence Community is obliged to strive to creatively assemble evidence of threats and opportunities in as many ways as is possible before the fact. As DDCI John McLaughlin put it in 2001, "Our country and its interests are at their most vulnerable if its intelligence professionals are not always ready for something completely different."24 In the wake of 11 September, the practice of thinking differently is the strongest defense against the unexpected.

REFERENCES

1. Carl von Clausewitz, On War, edited and translated by Michael Howard and Peter Paret (Princeton, NJ: Princeton University Press, 1986), p. 117.

2. See Richards J. Heuer's excellent book, The Psychology of Intelligence Analysis (Washington, DC: Center for the Study of Intelligence, 1999). Chapter 2, "Perception: Why Can't We See What Is There to Be Seen?" lays out the psychological basis for mind-sets.

3. Sherman Kent, "A Crucial Estimate Relived," Studies in Intelligence, Vol. 8, No. 2, Spring 1964, p. 6.

4. Ibid., p. 7.

5. According to many accounts, Director of Central Intelligence John McCone, who was alone in believing the Soviets might risk placing offensive missiles in Cuba, never quite recovered confidence in the Office of National Estimates or in Kent's judgment.

6. Both the U.S. and Israel conducted extensive post-mortems on this intelligence surprise. In Israel, the two-year-long Agranat Commission issued extensive findings as well as punished senior intelligence officials. One recommendation was that Israeli military intelligence create a Devil's Advocacy unit within its analytic directorate that would have the authority to challenge conventional wisdom and draft contrarian analysis to ensure that all possible hypotheses about hostile military movements were explored. A description of this unit is found in Lt. Col. Smuel, "The Imperative of Criticism: The Role of Intelligence Review," IDF Journal, Vol. 2, No. 3, pp. 62–69.

7. See Henry Kissinger, Years of Upheaval (Boston: Little, Brown, 1982). His Chapter XI, "The Middle East War," talks at length about why we were surprised. "Sadat boldly all but told us what he was going to do and we did not believe him. He overwhelmed us with information and let us draw the wrong conclusion" (p. 457).

8. Richards J. Heuer, Jr., Psychology of Intelligence Analysis, p. 5.

9. DCI Robert Gates, as a former DDI as well as Deputy National Security Advisor in the first Bush administration, was an early critic of the Agency's analytical performance. As both consumer and producer of intelligence, he signaled the necessity of tradecraft reforms that were to become more formalized in the 1990s.

10. See comments of Paul D. Wolfowitz in Roy Godson, Ernest R. May, and Gary Schmitt, eds., U.S. Intelligence at the Crossroads: Agendas for Reform (Washington, DC: Brassey's, 1995), p. 76.

11. "Tell me what you know . . . tell me what you don't know . . . tell me what you think . . . always distinguish which is which." General Colin Powell, Chairman, Joint Chiefs of Staff, 1989–1993 (quoted in DOD/JCS Publication 2.0, Doctrine for Intelligence Support to Joint Operations, 9 March 2000, page III-5).

12. Jack Davis, "Changes in Analytic Tradecraft in CIA's Directorate of Intelligence," Product Evaluation Staff/Directorate of Intelligence, April 1995, p. 8.

13. John McLaughlin, "The Changing Nature of CIA Analysis in the Post-Soviet World," Remarks of the DDCI at the Conference on CIA's Analysis of the Soviet Union, 1947–1991, 9 March 2001, p. 3.

14. Jeremiah News Conference, 2 June 1998. Inter alia, Jeremiah also highlighted scarce imagery assets and improper collection priorities, as well as clever Indian deception and denial, as among the root causes for this surprise.

15. Report of the Commission to Assess the Ballistic Missile Threat to the United States, Intelligence Side Letter, 18 March 1999, Unclassified Version of the Intelligence Side Letter, p. 6. Members of the Commission included Donald Rumsfeld (Chairman), Paul D. Wolfowitz, and former DCI R. James Woolsey.

16. Executive Summary of the Report of the Commission to Investigate the Ballistic Missile Threat to the United States, 15 July 1998, Pursuant to Public Law 201, 104th Congress.

17. Ibid., p. 7.

18. Office of Inspector General, "Alternative Analysis in the Directorate of Intelligence," Central Intelligence Agency, 1999, cited in Jack Davis, "Improving CIA Analytic Performance: Strategic Warning," Sherman Kent Center for Intelligence Analysis, Occasional Papers, p. 4.


19. Richards J. Heuer, Jr., Psychology of Intelligence Analysis, p. 16.

20. For a detailed review of this exercise, see John Prados, The Soviet Estimate: United States Intelligence and Soviet Strategic Forces (Princeton, NJ: Princeton University Press, 1986), especially Chapter 15, "Intelligence Alarum," pp. 245–257. The Senate Select Committee on Intelligence also issued a detailed report of its findings on the A-B Team exercise and concluded, "There is a need for competitive and alternative analyses. Both within the estimative body and with respect to outside expertise, competing and on occasion alternative estimates should be encouraged." Report of the U.S. Senate Select Committee on Intelligence, 1978, cited in Harold Ford, ibid., Annex III, pp. 267–268.

21. This course has since been renamed Advanced Analytical Techniques. For more on the school, see Stephen Marrin, "CIA's Kent School: Improving Training for New Analysts," International Journal of Intelligence and CounterIntelligence, Vol. 16, No. 4, Winter 2003–2004, pp. 609–637.

22. The DI's Office of Policy Support produced a 25-page Alternative Analysis Primer that was provided to every DI analyst and made available to all AA Workshops and to other interested agencies.

23. Like simulation exercises (another technique, not discussed here), scenario development requires considerable planning and logistics. The DI's Strategic Assessment Group and the Global Futures Partnership both conduct these one- or two-day workshops that bring together senior experts from both inside and outside the government.

24. John McLaughlin, "The Changing Nature of CIA Analysis in the Post-Soviet World," p. 6.
