
Risk Analysis, Vol. 24, No. 3, 2004

Perspective

How Useful Is Quantitative Risk Assessment?

George E. Apostolakis∗

∗ Department of Nuclear Engineering and Engineering Systems Division, Room 24-221, Massachusetts Institute of Technology, Cambridge, MA 02139-4307, USA; [email protected].

This article discusses the use of quantitative risk assessment (QRA) in decision making regarding the safety of complex technological systems. The insights gained by QRA are compared with those from traditional safety methods and it is argued that the two approaches complement each other. It is argued that peer review is an essential part of the QRA process. The importance of risk-informed rather than risk-based decision making is emphasized. Engineering insights derived from QRAs are always used in combination with traditional safety requirements and it is in this context that they should be reviewed and critiqued. Examples from applications in nuclear power, space systems, and an incinerator of chemical agents are given to demonstrate the practical benefits of QRA. Finally, several common criticisms raised against QRA are addressed.

1. INTRODUCTION

It has been about 30 years since quantitative risk assessment (QRA) was first applied to large technological systems.(1) Since then, we have seen many methodological advances and applications to nuclear power reactors (where it is called probabilistic risk assessment, or PRA), space systems, waste repositories (where it is called performance assessment), and incinerators of chemical munitions.

In every application, I have observed a familiar pattern of progress. At first (Phase 1), the safety community of that industry is very skeptical about the usefulness of this new technology. Then (Phase 2), as engineers and decision makers become more familiar with the technology, they begin to pay attention to the insights produced by QRA. Typically, the decision makers first pay attention to the “negative” insights, i.e., those that reveal failure modes of the system that had not been identified previously. Actions are taken to make these failure modes and their consequences less likely. With time (Phase 3), confidence in QRA increases as more safety analysts use it and they begin to pay attention to the “positive” insights, i.e., that some of the previously imposed safety requirements can be relaxed because either they do not contribute to safety or they contribute a very small amount that cannot be justified when it is compared with the corresponding cost. Entering Phase 3 usually requires a cultural change regarding safety management. This change is not always easy for engineers who have been using traditional “deterministic” methods for years. In all three phases, risk insights alone are never the sole basis for decision making.

Of course, there are no sharp lines dividing the three phases. The phase in which a particular industry is depends on the extent to which it is regulated. Thus, the heavily regulated U.S. nuclear power industry entered Phase 1 in 1975 and Phase 2 a few years later. The first regulatory guidance on how risk information could be used in regulatory affairs was issued in 1998,(2) which marks the beginning of Phase 3. It took about a quarter century for Phase 3 to begin. This is evidence of the caution with which QRA is accepted by real decision makers. Most foreign nuclear regulatory agencies are still in Phase 2 and are watching carefully the U.S. experiment with Phase 3. In my view, NASA is in Phase 1 at this time and is cautiously experimenting with more substantive use of QRA insights.

Citizen groups that oppose a particular decision or industry frequently raise questions about the validity of QRA, especially the “Q” part.(3) Although there is occasionally a reply by responsible groups,(4) these criticisms, especially articles in the popular press (a recent example being Reference 5), remain largely unanswered.

My purpose in this article is to address some of these criticisms and to place the use of QRA in perspective. It is my thesis that QRAs should be viewed as an additional tool in safety analysis that improves safety-related decision making. QRA is not a wholesale replacement of traditional safety methods or philosophies. QRA analysts will be the first to admit that this tool is not perfect, yet it represents tremendous progress toward rational decision making.

2. TRADITIONAL SAFETY ANALYSIS

It is important to recognize that even traditional safety analyses must deal with probabilities, but, unlike in QRA, these probabilities are not quantified.1 The evaluation of safety is typically bottom-up, i.e., it starts with postulated failures and proceeds to identify their consequences. If an event, such as the failure of a component, is judged to lead to unacceptable consequences, measures are taken either to make it less likely (without knowing quantitatively by how much) or to mitigate its potential consequences. Typically, these actions include the introduction of redundant elements and additional safety margins, i.e., the difference between the failure point and the anticipated operating point is made larger. These actions are based on engineering judgment informed by analyses, tests, and operating experience. The result is frequently a complex set of requirements for the design and operation of the system.

1 Traditional safety analyses are often called “deterministic.” This is a misnomer. Uncertainties are always present.

A facility that meets these requirements is judged “acceptable” in the sense that there is no “undue risk” to the public or the crew. What “undue risk” is remains unquantified. The presumption is that meeting the requirements guarantees adequate protection of public and crew health and safety, i.e., the (unquantified) risk is acceptably low.
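Before turning to QRA, a schematic sketch (mine, not from the article, with assumed normal distributions and invented numbers) indicates what it would take to quantify one of these traditional measures: the safety margin between the anticipated operating point and the failure point maps to a failure probability only once distributions are assumed for both quantities, which is precisely the step that traditional analysis leaves implicit.

    # Schematic sketch: how a safety margin maps to a failure probability.
    # Assumptions (illustrative only): the operating load and the failure
    # (capacity) point are normally distributed with the parameters below.
    import random

    def failure_probability(load_mean, capacity_mean, load_sd=5.0,
                            capacity_sd=5.0, trials=200_000):
        """Monte Carlo estimate of P(load exceeds capacity)."""
        random.seed(0)
        failures = sum(
            random.gauss(load_mean, load_sd) > random.gauss(capacity_mean, capacity_sd)
            for _ in range(trials)
        )
        return failures / trials

    # Increasing the margin (capacity_mean - load_mean) lowers the failure
    # probability, but only a quantitative treatment says by how much.
    for margin in (5, 10, 15):
        p = failure_probability(load_mean=100, capacity_mean=100 + margin)
        print(f"margin = {margin:2d}  estimated P(failure) ~ {p:.3f}")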

3. QUANTITATIVE RISK ASSESSMENT

In its bare essence, QRA answers the three questions posed in Reference 6, namely: (1) What can go wrong? (2) How likely is it? and (3) What are the consequences? It is a top-down approach that proceeds as follows (a schematic numerical sketch appears after the list):

1. A set of undesirable end states (adverse consequences) is defined, e.g., in terms of risk to the public, loss of crew, and loss of the system. These answer the third question of Reference 6.

2. For each end state, a set of disturbances to normal operation is developed that, if uncontained or unmitigated, can lead to the end state. These are called initiating events (IEs).

3. Event and fault trees or other logic diagrams are employed to identify sequences of events that start with an IE and end at an end state. Thus, accident scenarios are generated. These scenarios include hardware failures, human errors, fires, and natural phenomena. The dependencies among failures of systems and redundant components (common-cause failures) receive particular attention. These scenarios answer the first question.

4. The probabilities of these scenarios are evaluated using all available evidence, primarily past experience and expert judgment. These probabilities are the answer to the second question.

5. The accident scenarios are ranked according to their expected frequency of occurrence.
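As a schematic illustration of these steps (not drawn from any actual QRA; the initiating events, branch probabilities, and end states below are invented), the sketch computes each scenario's frequency as the product of an initiating-event frequency and the conditional failure probabilities along an event-tree path, and then ranks the scenarios as in step 5.

    # Schematic event-tree quantification (illustrative numbers only).
    # Each scenario: (initiating event, frequency per year, conditional
    # failure probabilities along the path, end state).
    scenarios = [
        ("IE-A", 1e-2, [1e-3],       "end state 1"),
        ("IE-A", 1e-2, [1e-3, 1e-2], "end state 2"),
        ("IE-B", 5e-4, [1e-2],       "end state 1"),
        ("IE-B", 5e-4, [1e-2, 5e-2], "end state 2"),
    ]

    def scenario_frequency(ie_freq, branch_failure_probs):
        """Scenario frequency = IE frequency x product of the conditional
        probabilities of the failures along the event-tree path."""
        freq = ie_freq
        for p in branch_failure_probs:
            freq *= p
        return freq

    ranked = sorted(
        ((ie, end, scenario_frequency(f, probs)) for ie, f, probs, end in scenarios),
        key=lambda s: s[2],
        reverse=True,
    )
    for ie, end, freq in ranked:
        print(f"{ie} -> {end}: {freq:.2e} per year")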

A peer review by independent experts is an essential part of the process. Depending on the significance of the issue, this review may involve national and international experts occasionally supported by staff. The first PRAs for nuclear power reactors and severe accidents were subjected to such detailed reviews.(7, 8) NASA’s QRA for the International Space Station was also subjected to peer review.(9) The QRA for the Tooele incinerator of chemical munitions, sponsored by the U.S. Army, was reviewed by independent experts(10) and the whole operation was under the oversight of an independent committee of the National Research Council.(11) These reviews had significant impact on the respective QRAs. The PRA Standard issued by the American Society of Mechanical Engineers contains a peer review as an integral part.(12) Guidance on peer reviews can be found in Reference 13. Insights from QRAs should not be used in decision making unless they have been subjected to a peer review by independent experts.


4. QRA BENEFITS

QRA has been found useful because it:

1. Considers thousands of scenarios that involve multiple failures, thus providing an in-depth understanding of system failure modes. Such an enormous number of possible accident scenarios is not investigated by traditional methods. The completeness of the analysis is significantly enhanced by the QRA investigation.

2. Increases the probability that complex interactions between events/systems/operators will be identified.

3. Provides a common understanding of the problem, thus facilitating communication among various stakeholder groups.

4. Is an integrated approach, thus identifying the needs for contributions from diverse disciplines such as the engineering and the social and behavioral sciences.

5. Focuses on uncertainty quantification and creates a better picture of what the community of experts knows or does not know about a particular issue, thus providing valuable input to decisions regarding needed research in diverse disciplines, e.g., physical phenomena and human errors.

6. Facilitates risk management by identifying the dominant accident scenarios so that resources are not wasted on items that are insignificant contributors to risk.

5. QRA LIMITATIONS

Several items that are either not handled well or not at all by current QRAs are:

1. Human errors during accident conditions. There is general agreement among analysts that the Human Reliability Handbook(14) provides adequate guidance for human errors during routine operations, e.g., maintenance. For an accident in progress, we can distinguish between errors of omission (the crew fails to take prescribed actions) and errors of commission (the crew does something that worsens the situation). These errors, especially those of commission, are not handled well and research efforts are underway to improve the situation. I point out, however, that this is a good example of the fifth benefit that I mentioned in Section 4. Even in times of very limited resources, the Nuclear Regulatory Commission has expended significant resources on research on errors of commission.(15) This work has benefited tremendously from the insights developed by human error theorists, e.g., Reference 16. It is also important to point out that experience has shown that the crews often become innovative during accidents and use unusual means for mitigation. These human actions are not modeled in QRAs.

2. Digital software failures. This is an active area of research. The aim is not to quantify the failure probabilities but, rather, to understand the kinds of failure modes that may be introduced. It is to the credit of QRA analysts that they have not rushed to use any of the many models available in the literature that treat software as black boxes and assign failure rates to them based on dubious assumptions. Protecting against digital software errors is still within the realm of traditional safety methods, e.g., by requiring extensive testing and the use of diverse software systems.

3. Safety culture. When asked, managers of hazardous activities or facilities say that they put safety first. Unfortunately, experience shows that this is not always the case. While it is relatively easy to ascribe an accident that has occurred to a bad safety culture, the fact is that defining indicators of a good or bad safety culture in a predictive way remains elusive. QRAs certainly do not include the influence of culture on crew behavior and one can make a good argument that they will not do so for a very long time, if ever.

4. Design and manufacturing errors. These are especially important for equipment that would be required to operate under unusual conditions, such as accident environments. Traditional safety methods of testing and equipment qualification address these errors. It is encouraging to note that a study by the Nuclear Regulatory Commission(17) found that, of the design basis violations reported by nuclear power plants in 1998, only 1% had some safety significance.

6. EXAMPLES OF QRA INSIGHTS

The publication of the Reactor Safety Study (RSS)(1) in 1975 and subsequent industry-sponsored PRAs(18) had a tremendous impact on the thinking of nuclear safety experts. Several major new insights were as follows:(19)

• Prior thinking was that the (unquantified) frequency of severe core damage was extremely low and that the consequences of such damage would be catastrophic. The RSS calculated a core damage frequency on the order of one event every 10,000 to 100,000 reactor-years, a much higher number than anticipated, and showed that the consequences would not always be catastrophic.

• A significant failure path for radioactivity release that bypasses the containment building was identified. Traditional safety analysis methods had failed to do so.

• The significance of common-cause failures of redundant components and the significance of human errors were demonstrated quantitatively.

An early QRA for the auxiliary power units (APUs) on the shuttle’s orbiter produced some interesting results.(20) A traditional qualitative method for developing failure modes (Failure Modes and Effects Analysis) had been employed to identify hazards. Of the 313 scenarios identified, 106 were placed on the so-called critical items list using conservative screening criteria.

• The QRA showed that 99% of the probability of loss of crew/vehicle was contributed by 20 scenarios, two of which had not been on the critical items list.

• Clearly, risk management is much more effective when one has to deal with 20 scenarios as opposed to the 106 of the critical items list, which, in addition, missed two important scenarios.

• The QRA demonstrated, once again, the significance of common-cause and cascading failures. These failure modes are handled very poorly in traditional safety methods.
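To indicate why common-cause failures matter quantitatively, the following sketch uses the standard beta-factor model with invented numbers (it is not taken from the APU study): a fraction β of each redundant train's failure probability is attributed to a shared cause, and even a modest β raises the failure probability of a two-train redundant system by orders of magnitude compared with a naive independence assumption.

    # Beta-factor common-cause sketch for a two-train redundant system
    # (both trains must fail for the system to fail). Numbers are illustrative.
    def redundant_pair_failure(q_total, beta):
        """q_total: total failure probability of each train per demand.
        beta: fraction of q_total attributed to a shared (common) cause."""
        q_independent = (1.0 - beta) * q_total
        q_common = beta * q_total
        # System fails if the common cause occurs, or if both trains fail
        # independently (rare-event approximation).
        return q_common + q_independent ** 2

    q = 1.0e-3
    naive = q ** 2                            # assumes full independence
    with_ccf = redundant_pair_failure(q, beta=0.1)
    print(f"independent assumption: {naive:.1e}")
    print(f"with common-cause term: {with_ccf:.1e}")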

Unlike the QRAs for nuclear power reactors and the space shuttle, the QRA for the Tooele incinerator of chemical munitions was performed while the facility was still under construction. This allowed the use of QRA insights in design and operation changes.(11,21) For example, the QRA identified a scenario that dominated worker risk, which involved the potential for buildup and ignition of agent vapors in a furnace feed airlock. As a result of this finding, a hardware change was implemented to vent the airlock and preclude agent buildup. In addition, the operations were changed to limit the time any item is held in the feed airlock.

7. RISK-INFORMED DECISION MAKING

I wish to make one thing very clear: QRA results are never the sole basis for decision making by responsible groups. In other words, safety-related decision making is risk-informed, not risk-based. The requirements of the traditional safety analysis that I described above are largely intact.

In all three examples that I have been using (Nuclear Regulatory Commission, NASA, and the Tooele incinerator), the safety requirements are overwhelmingly traditional. Only the Nuclear Regulatory Commission, which, as I have stated in Section 1, has entered Phase 3, has a formal process for modifying some traditional requirements using risk insights.(2)

Even in the Nuclear Regulatory Commission’s case, however, it is interesting to see how cautious the Commission was when it issued its policy statement in 1995:(22) “The use of PRA technology should be increased in all regulatory matters to the extent supported by the state of the art in PRA methods and data and in a manner that complements the NRC’s deterministic approach and supports the NRC’s traditional defense-in-depth philosophy.” The Commissioners made it very clear that PRA insights were to be scrutinized and were subordinate to the traditional safety philosophy of defense-in-depth.

At this time, formal methods for combining risk information with traditional safety methods do not exist. The process relies on the judgment of the decision makers and is akin to the analytic-deliberative process that the National Research Council has recommended for environmental decision making.(23)

The continuous risk management process that NASA employs is similar in the sense that risk information is only one of the many inputs that the process considers.

8. COMMENTARY ON FREQUENTLY RAISED CRITICISMS

All QRA analysts care about is the bottom-line numbers.

I know of no QRA analysts who act this way (and I know a lot of them). The uncertainties about the results and the dominant scenarios are the results that experienced analysts look for. The lower the probabilities that are reported, the more suspicious these analysts become.

QRAs are performed to understand how the system can fail and to prioritize the failure modes, not to produce a set of numbers. The only QRA results that have any chance of influencing risk management are those that provide engineering insights.

NASA doesn’t seem to be able to get the numbers for the shuttle right.

This is an interesting comment that one encounters frequently. Old studies are cited(24) that apparently produced very low failure probabilities for the shuttle (as low as once every 100,000 flights). Then, it is pointed out that there have been two failures in 113 shuttle flights; therefore, QRAs are no good.

An early QRA for the shuttle(25) gives a probability for the loss of vehicle that ranges from once every 230 flights to once every 76 flights. The analysts state that they are 90% confident that the failure probability is in this interval. The reported interval is on the right order of magnitude, especially for a first effort that has not been peer reviewed formally and in an industry that is still in Phase 1 of QRA development.2 I doubt very much that the old studies mentioned above had been subjected to the rigorous peer-review process that modern QRAs receive.
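A simple consistency check (mine, not the author’s or the cited study’s) supports the point about order of magnitude: treating each flight as an independent trial with a per-flight loss probability at either end of the reported 90% interval, the chance of seeing two or more losses in 113 flights is far from negligible, so the observed record does not contradict the interval.

    # Illustrative check: probability of observing >= 2 losses in 113 flights
    # if the per-flight loss probability sits at either end of the reported
    # 90% interval (1/230 to 1/76). Binomial model, independence assumed.
    n = 113

    def prob_at_least_two(p, n):
        """P(X >= 2) for X ~ Binomial(n, p)."""
        p_zero = (1.0 - p) ** n
        p_one = n * p * (1.0 - p) ** (n - 1)
        return 1.0 - p_zero - p_one

    for flights_per_loss in (230, 76):
        p = 1.0 / flights_per_loss
        print(f"p = 1/{flights_per_loss}: P(>=2 losses in {n} flights) "
              f"= {prob_at_least_two(p, n):.2f}")

With these assumptions, the two ends of the interval give probabilities of roughly 0.09 and 0.44, respectively.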

But this criticism raises another important point. There is a presumption that QRA should “get the number right” right away. This does not allow time for the technology to mature in the NASA environment. As I mentioned in Section 1, the nuclear power industry has had plenty of time to improve its PRAs and one would indeed expect its numerical results to be consistent with operating experience. NASA, on the other hand, is still exploring what QRAs can do and how to apply them realistically to its systems, some of which are very different from nuclear power applications. For example, methodological improvements are needed to handle the dynamic nature of space flights. I believe that the agency should be given time to make this technology part of its standard analytical tools.

The fundamental question should not be whether old studies failed to accurately predict the number (although the results I cited above show that this is not quite true). The question should be: What did NASA do with these numbers? The answer is very little, if anything. There are no guidelines as to which numbers are acceptable and, as I have said several times, the traditional safety requirements are intact. It is the impact on decision making that matters, not that some study produced indefensible numbers.

2 NASA is currently in the process of producing a comprehensive QRA for the shuttle that involves several NASA centers and contractors. This QRA is undergoing rigorous peer review and is considered as one that is truly embraced by NASA. The agency intends to risk-inform its decision making processes using this QRA.

Having seen first-hand how PRAs have improved safety in the nuclear power industry, I believe strongly that it would be a disservice to the nation if NASA were discouraged from using this technology.

QRA results are not useful when they are highly uncertain. Why bother doing a QRA in those cases?

These uncertainties exist independently of whether we do a QRA or not. The decisions that need to be made will be better if quantitative information that has been peer reviewed is available. Recall the misperceptions of the frequency and consequences of core damage that nuclear safety professionals had before the Reactor Safety Study was issued (Section 6).

By attempting to quantify the uncertainties and to identify the dominant contributors, QRA analysts contribute to the common understanding of the issues and, in addition, may provide useful input to the allocation of research resources.

Probabilities cannot be realistically calculated.

This criticism presumably means that one cannot use straightforward statistical methods and divide the number of failures by the number of trials to calculate “realistic” probabilities. This criticism appears to miss the point that QRAs are performed for systems that are highly reliable and well defended, so a plethora of failures does not exist (and is highly undesirable to begin with). QRA methods analyze rare events in a systematic way and use all the available information in evaluating probabilities. QRA analysts are fully aware of the extensive use of expert judgment and always look at the uncertainties associated with the results. As I have stated, peer reviews are essential. Ultimately, the decision making process is risk-informed and not risk-based.
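The following sketch (mine, with invented numbers, using a textbook Bayesian update rather than any method from the article) illustrates how a rare-event probability can be estimated when the raw failure count would give a meaningless zero: prior experience or expert judgment is encoded in a prior distribution, and the sparse observed data then update it.

    # Schematic Bayesian estimate of a failure-on-demand probability when
    # few or no failures have been observed. Beta prior, binomial data.
    # The prior parameters and demand count below are invented for illustration.
    def posterior_mean(prior_a, prior_b, failures, demands):
        """Posterior mean of a Beta(prior_a, prior_b) prior updated with
        'failures' out of 'demands' Bernoulli trials."""
        return (prior_a + failures) / (prior_a + prior_b + demands)

    # Jeffreys prior Beta(0.5, 0.5); zero failures observed in 200 demands.
    failures, demands = 0, 200
    naive = failures / demands                       # 0.0 -- not usable
    bayes = posterior_mean(0.5, 0.5, failures, demands)
    print(f"naive estimate:     {naive:.4f}")
    print(f"posterior estimate: {bayes:.4f}")        # about 0.0025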

QRAs do not include “one technician’s unbelievable stupidity.”(5)

I do not know how one defines “unbelievable stupidity,” so I have difficulty dealing with this criticism. Of course, these facilities are usually operated by crews of several people, so I am not sure how many trained persons would have to behave in an unbelievably stupid manner for something extraordinary to happen. As I mentioned above (Section 5), QRAs do not model acts of unbelievable intelligence either, which, as the operating experience shows, occur frequently.


ACKNOWLEDGMENTS

My views on PRA and its proper use have been shaped by many stimulating debates with my colleagues on the Advisory Committee on Reactor Safeguards of the Nuclear Regulatory Commission. This article has also benefited from comments by H. Dezfuli, T. Ghosh, and L. Pagani. The tone of some of my comments and all potential errors are mine.

REFERENCES

1. U.S. Nuclear Regulatory Commission. (1975). Reactor Safety Study, an Assessment of Accident Risks in U.S. Nuclear Power Plants, WASH-1400. Report, NUREG-75/014. Washington, DC.

2. U.S. Nuclear Regulatory Commission. (1998). An Approach for Using Probabilistic Risk Assessment in Risk-Informed Decisions on Plant Specific Changes to the Licensing Basis. Regulatory Guide 1.174. Washington, DC. Available at www.nrc.gov.

3. Union of Concerned Scientists. (2000). Nuclear Plant Risk Studies: Failing the Grade. Available at www.ucsusa.org.

4. Advisory Committee on Reactor Safeguards. (2000). Union of Concerned Scientists, Nuclear Plant Risk Studies: Failing the Grade. Available at www.nrc.gov.

5. San Francisco Chronicle. (2003). NASA's nuclear-fueled oddsmaking. Assurances can't calm fears of a Chernobyl in the sky. April 28.

6. Kaplan, S., & Garrick, B. J. (1981). On the quantitative definition of risk. Risk Analysis, 1, 11–28.

7. Lewis, H. W., Budnitz, R. J., Kouts, H. J. C., Loewenstein, W. B., Rowe, W. D., von Hippel, F., & Zachariasen, F. (1978). Risk Assessment Review Group Report to the U.S. Nuclear Regulatory Commission. Report, NUREG/CR-0400. Washington, DC: U.S. Nuclear Regulatory Commission.

8. Kastenberg, W. E., Apostolakis, G., Bickel, J. H., Blond, R. M., Board, S. J., Epstein, M., Hofmann, P., King, F. K., Ostrach, S., Reed, J. W., Ritzman, R. L., Stetkar, J. W., Theofanous, T. G., Viskanta, R., & Guarro, S. (1988). Findings of the Peer Review Panel on the Draft Reactor Risk Reference Document. Report, NUREG/CR-5113. Washington, DC: U.S. Nuclear Regulatory Commission.

9. Apostolakis, G. E., Dezfuli, H., Gahring, S. D., Sattison, M. B., & Vesely, W. E. (2002). Report of the Independent Peer Review Panel on the Probabilistic Risk Assessment of the International Space Station. Phase II—Stage 7A Configuration. Prepared for NASA Headquarters, Office of Safety and Mission Assurance.

10. Apostolakis, G. E., Budnitz, R. J., Hedman, P. O., Parry, G. W., & Prugh, R. W. (1996). Report of the Risk Assessment Expert Panel on the Tooele Chemical Agent Disposal Facility Quantitative Risk Assessment. Prepared for Mitretek Systems.

11. Magee, R. S., Drake, E. M., Bley, D. C., Dyer, G. H., Falter, V. E., Gibson, J. R., Greenberg, M. R., Kolb, C. E., Kossen, D. S., May, W. G., Mushkatel, A. H., Niemiec, P. J., Parshall, G. W., Tumas, W., & Wu, J.-S. (1997). Risk Assessment and Management at Deseret Chemical Depot and the Tooele Chemical Agent Disposal Facility. Committee on Review and Evaluation of the Army Chemical Stockpile Disposal Program, National Research Council. Washington, DC: National Academy Press.

12. American Society of Mechanical Engineers. (2002). Standard for Probabilistic Risk Assessment for Nuclear Power Plant Applications, ASME RA-S-2002.

13. Budnitz, R. J., Apostolakis, G., Boore, D. M., Cluff, L. S., Coppersmith, K. J., Cornell, C. A., & Morris, P. A. (1998). Use of technical expert panels: Applications to probabilistic seismic hazard analysis. Risk Analysis, 18, 463–469.

14. Swain, A. D., & Guttmann, H. E. (1983). Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications. Report, NUREG/CR-1278. Washington, DC: U.S. Nuclear Regulatory Commission.

15. Barriere, M., Bley, D., Cooper, S., Forester, J., Kolaczkowski, A., Luckas, W., Parry, G., Ramey-Smith, A., Thompson, C., Whitehead, D., & Wreathall, J. (1998). Technical Basis and Implementation Guidelines for a Technique for Human Event Analysis (ATHEANA). Report, NUREG-1624. Washington, DC: U.S. Nuclear Regulatory Commission.

16. Reason, J. (1990). Human Error. Cambridge, UK: Cambridge University Press.

17. U.S. Nuclear Regulatory Commission, Office of Nuclear Regulatory Research. (2000). Causes and Significance of Design Basis Issues at U.S. Nuclear Power Plants. Draft Report. Washington, DC.

18. Pickard, Lowe, & Garrick, Inc., Westinghouse Electric Corporation, & Fauske & Associates, Inc. (1981). Zion Probabilistic Safety Study. Report prepared for Commonwealth Edison Company, Chicago.

19. Beckjord, E. S., Cunningham, M. C., & Murphy, J. A. (1993). Probabilistic safety assessment development in the United States 1972–1990. Reliability Engineering and System Safety, 39, 159–170.

20. Frank, M. V. (1989). Quantitative risk analysis of a space shuttle subsystem. In Proceedings of the International Topical Meeting on Probability, Reliability, and Safety Assessment, PSA '89, Pittsburgh, Pennsylvania, April 2–7.

21. Science Applications International Corporation. (1996). Tooele Chemical Agent Disposal Facility Quantitative Risk Assessment. Report, SAIC-96/2600. Prepared for the U.S. Army Program Manager for Chemical Demilitarization.

22. U.S. Nuclear Regulatory Commission. (1995). Use of probabilistic risk assessment methods in nuclear activities: Final policy statement. Federal Register, 60, 42622.

23. National Research Council. (1996). Understanding Risk: Informing Decisions in a Democratic Society. Washington, DC: National Academy Press.

24. Seife, C. (2003). Columbia disaster underscores the risky nature of risk analysis. Science, 299, 1001–1002.

25. Science Applications International Corporation. (1995). Probabilistic Risk Assessment of the Space Shuttle. Report, SAICNY95-02-25. Prepared for NASA Office of Space Flight.