
LESSONS LEARNED

QUALITY FUNCTION DEPLOYMENT AS A TOOL FOR IMPLEMENTING COST AS AN INDEPENDENT VARIABLE

David R. Wollover

The essence of cost as an independent variable (CAIV) is using reliable tools to balance cost with mission needs for new program development. This article addresses concerns about implementing CAIV for Department of Defense (DoD) acquisition programs that vary by scope, budget, and dimension. Perhaps no single CAIV implementation tool is robust enough to apply to all cases. However, we are interested in tools to implement CAIV for a maximum number of programs to collect lessons learned and related beneficial aspects of the CAIV learning curve.

This article describes and illustrates Quality Function Deployment (QFD) as a tool with good potential to help implement CAIV for a variety of DoD acquisition programs. An example of a generic acquisition system (a weapon system in this writing) not attributed to any specific program is used. The example is actually elementary compared to some advanced QFD applications. However, it is still manifold enough to illustrate a fairly detailed QFD application. While this paper focuses on a weapon system, the same process may be applied to automated information system (AIS) programs, with appropriate modifications.

QFD consists of six general steps: (a) identifying and analyzing customer needs and requirements, (b) identifying technical performance measures (TPMs), (c) benchmarking TPMs, (d) assigning priority to customer requirements, (e) establishing TPMs to identify specific design characteristics, and (f) evolving technical performance measures into the follow-up design phase's requirements. This elementary example will illustrate QFD, providing a framework to transform vague customer requirement statements into TPMs that are deployed throughout system design and development.

DoD has adopted a strategy to use aggressive, realistic cost objectives to acquire systems, and to manage risks to obtain those objectives. These objectives must balance mission needs with projected out-year resources, accounting for existing technology as well as high-confidence maturation of new technologies.


David R. Wollover is currently a candidate for the Master of Engineering Administration degree from Virginia Polytechnic Institute and State University. Previously, he completed separate master's degrees in Resource Economics and Regional Planning. In 1985, he became an operations research analyst with the U.S. Navy, and then worked with the Air Force Center for Studies and Analyses from 1987 to 1989. In 1989, he joined Applied Research, Inc., and has provided direct cost analysis support to the Ballistic Missile Defense Organization for over five years, focusing on preparing LCC estimates for major systems facing acquisition milestone review. Currently a contract member of the Senior Staff for the United Missile Defense Company (UMDC), he performs LCC analysis and cost as an independent variable (CAIV) work in a systems engineering environment. He is a member of the Society of Cost Estimating and Analysis, has served as a referee for the Journal of Cost Analysis, and has presented various cost analysis papers for SCEA, ISPA, and the REVIC Users Group. He attained the Certified Cost Estimator/Analyst (CCE/A) professional designation in 1989.

This concept is called cost as an independent variable (CAIV), meaning that once a system's performance and objective costs are decided on the basis of cost-performance tradeoffs, the acquisition process establishes cost as a constraint, rather than as a dependent variable, while still getting the needed military capability (ODUSD[AR], 1996). Tradeoffs are made among cost, schedule, and performance based on CAIV analysis (Office of the Secretary of Defense, 1996).

In a Dec. 4, 1995, memorandum on life cycle cost reduction, Dr. Paul Kaminski requested: (a) cost-performance trades; (b) aggressive program management, making cost a major independent driver, while preserving warfighter requirements; (c) expanding use of existing techniques to meet program goals; and (d) reducing unnecessary program and product complexity (Kaminski, 1995).

Guidance attached to Kaminski's memorandum calls for CAIV to include: (a) adopting aggressive, realistic cost goals for operations and support, as well as production, with well-defined steps leading to objectives; (b) using existing practices proven to manage meeting customer requirements; and (c) formalizing the cost-performance tradeoff process through performance specifications used to state requirements in a manner that clearly directs the CAIV process to evaluate all pertinent design parameters that serve as key metrics and observables, while assuring preservation of needed military capability (Longuemare, 1995).

CAIV METRICS AND OBSERVABLES

The CAIV Working Group Paper Summary, an attachment to Kaminski (1995), describes the instrumental role of key metrics and observables. This attachment describes the importance of setting early cost objectives. The ability to set cost objectives depends on results of early cost-performance tradeoff analyses. Metrics and observables are needed to assess CAIV implementation progress.

Metrics and observables identify observable steps for meeting aggressive production and operations and support cost objectives, and then managing for their achievement. Conrow (1995, p. 209) indicates that incorrectly specified technical possibilities are a significant influence on DoD program development cost, technical, and schedule risk. Both government and contractors "routinely underestimate the risk present in military programs." Risk reduction steps for technology development and application, manufacturing, and operations can be guided by unbiased metrics and observables tailored to specific programs.

SIGNIFICANCE OF TECHNICAL PERFORMANCE MEASURES

Examining the DoD description of "key metrics and observables" (Kaminski, 1995) reveals they are similar to what system engineers call technical performance measures (TPMs) (Verma, Chilakapati, & Blanchard, 1996, p. 39). Titles are less important than insights to correlations among key performance parameters (KPPs), critical technical parameters (CTPs), or other TPM candidates described in the literature, for example, by Higgins (1997, pp. 45–46) and Jones (1996, p. 151).

Risks to meeting performance requirements with aggressive cost goals must be managed through iterated cost, performance, and schedule tradeoffs, identifying performance, manufacturing, or operations uncertainties, and demonstrating solutions prior to final design. We seek to efficiently manage weapon system complexity, defined here as an evolving large number of interfaces, parts, and final testing requirements among maturing system configuration elements (Gindele, 1996, p. 66). In this context we seek proven means to systematically organize all independent variables and their interrelationships.

Commitments to technology, system configuration, performance, and life cycle cost are strong even in early system design. Many system characteristics interact; consequences of these static and dynamic interactions are rarely well evaluated or understood. There are ample opportunities to reduce costs while life cycle decisions continue to be made. Progress may be created by techniques that enable earlier use of integrated design information (Fabrycky, 1994, pp. 134–136).

The best time to reduce life cycle costs is early in the acquisition process, when cost-performance tradeoff analyses are conducted to decide an acquisition approach. However, because factors both internal and external to the program change, tradeoffs must occur throughout the acquisition process, and key TPMs may also significantly change throughout program evolution.

Still, it is critical to CAIV that the process of setting TPMs reflecting cost and performance objectives begin as early as possible. The ability to achieve cost objectives greatly depends on early executed cost-performance tradeoffs, including using TPMs to measure and thus better manage risk mitigation. Specifically, for example, as in the case of the F–22 program, TPM changes may be observed in direct response to risk reduction efforts (Justice, 1996, p. 70).

Consequently, applying CAIV TPMs to DoD programs entails: (a) setting cost and performance objectives as early as possible; (b) quantifying these objectives as TPM threshold values, tailored to specific assets and activities; (c) setting pathways supporting observable transitions between objective-oriented actions; (d) adhering to a cost-performance tradeoff process that has structured all relationships among TPMs; and (e) empowering program managers to flexibly respond to changes in the set(s) of TPMs and their values.
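A minimal sketch of item (b), assuming a simple record per TPM; the class name, fields, and numbers below are hypothetical illustrations, not taken from the article:

    # Hypothetical record for tracking a TPM threshold alongside an aggressive objective.
    from dataclasses import dataclass

    @dataclass
    class TechnicalPerformanceMeasure:
        name: str                # e.g., "Motor burnout velocity"
        unit: str                # e.g., "km/sec"
        threshold: float         # minimum acceptable value for this example
        objective: float         # aggressive target from cost-performance tradeoffs
        current_estimate: float  # latest design estimate

        def meets_threshold(self) -> bool:
            # "Higher is better" is assumed here; a fuller TPM set would also
            # carry a direction flag for measures such as cost or mass.
            return self.current_estimate >= self.threshold

    velocity = TechnicalPerformanceMeasure("Motor burnout velocity", "km/sec",
                                           threshold=1.5, objective=2.0,
                                           current_estimate=1.7)
    print(velocity.meets_threshold())  # True: the current estimate clears the threshold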

INTRODUCTION TO QUALITY FUNCTION DEPLOYMENT

Quality function deployment (QFD) is a well-established procedure that essentially uses a series of interdependent matrices. The matrices are used to organize and translate customer requirements, in an integrated fashion, to the successive steps that ultimately meet these requirements. QFD has been used in a wide variety of industries to use TPMs to translate and literally map customer needs into objective product outcomes. QFD is a historically proven means to guide process development using TPMs to systematically organize all independent variables (cost, etc.) and their interrelationships. QFD is cited as the most widespread implementation of total quality management (TQM) (Sage, 1992, p. 222), and as a key facilitating tool in concurrent engineering environments (Menon et al., 1994, p. 91).

QFD is a process tool that helps strengthen management of key elements of the system engineering process for DoD advanced technology development programs. QFD is structured to accommodate vaguely stated customer specifications, and through a series of interdependent matrices, allocate and map requirements into specific design strategies, development processes, product characteristics, and program operations controls. For each intended result of the design and production process, engineers identify TPMs, and then specify corresponding threshold values to be met in order to achieve the required features of the overall system. These assignments set the minimum levels of achievement required to satisfy customer requirements.

ORIGINS OF QUALITY FUNCTION DEPLOYMENT

QFD was developed in the late 1960s by Shigeru Mizuno of the Tokyo Institute of Technology (Menon et al., 1994, p. 94). Mitsubishi Heavy Industries also began to use it then on supertanker projects at Kobe Shipyard. Mitsubishi tried to build 300-yard-long supertankers having sophisticated propulsion, maneuvering, and balance control, challenging design and manufacturing logistical requirements, and having essentially no production line (Guinta and Praizler, 1993, p. 1).

Toyota adopted the Kobe Shipyard QFD methodology in the mid-1970s. Toyota set performance benchmarks combined with customer focus groups. They experienced 40 percent reductions in new model development costs, and a 50 percent reduction in development time (Menon et al., 1994, p. 94; Prasad, 1996, p. 82). Panasonic Consumer Electronics pushed QFD to greater limits in the mid-1970s. They used it to predict what consumers would want in the future, hence their slogan "Just slightly ahead of our time" (Guinta and Praizler, 1993, p. 4).

A 1986 survey of the Japanese Union of Scientists and Engineers reported that 54 percent used QFD, most of them in high technology and transportation industries. The Japanese exploited QFD to structure production and supporting operations to become less sensitive to variations caused by operators, equipment, and materials (Guinta and Praizler, 1993, p. 7). A most interesting historical note is that QFD was applied principally in companies and products primary to Japan's export business, particularly to the United States (Sanchez et al., 1993, p. 239).

THE UNITED STATES EXPERIENCE WITH QFD

Ford Automotive applied QFD in the early 1980s, using it to reorganize sequential functions into concurrent interaction of design, engineering, and manufacturing. Ford used more than 50 applications of QFD to (a) establish quality goals; (b) identify customers and others affected; (c) discover customer needs, such as increased reliability; (d) develop longer maintenance-free operation; (e) clarify the impact of manufacturing process plans on design; and (f) establish process controls coordination among functions (Hauser and Clausing, 1988, p. 63).

Ernst and Young innovated QFD applications in the paper products industry during 1990, which included importance weighting, measured correlation among customer requirements, and completed competitive evaluations (a.k.a. "benchmarking") (Juran and Gryna, 1993, p. 255). Thiokol Strategic Operations used QFD specifically to better measure and certify its parts suppliers, and consequently reduce the development time to build strategic and tactical weapon system solid rocket motors (Guinta and Praizler, 1993, p. 13).

Other companies using QFD include Aerojet Ordnance, ITT, IBM, Digital Equipment, Texas Instruments, Chrysler, General Motors, Procter and Gamble, Deere & Company, Polaroid, Rockwell International, Hughes Aircraft, and Hewlett-Packard (Sanchez et al., 1993, p. 239). Research by Guinta and Praizler (1993, p. 8) revealed that various domestic service and manufacturing companies using QFD experienced 50 percent cost reductions and 33 percent project time reductions.

The DoD Joint Strike Fighter Program (JSFP) has an activity referred to as the Strategy-to-Task Technology QFD II Analysis, which has been awarded the American Supplier Institute (ASI) "Best Application" Award, recognizing exemplary use of QFD. This award was granted at the 1995 ASI Product Development Symposium. ASI cited this QFD II analysis as "the most robust aggressive use of QFD to analyze weapon system requirements seen to date." The award was presented by Dr. Genichi Taguchi, a four-time Deming Prize winner (JSFP, 1996a).

QFD has been successfully used in a wide variety of industries: aircraft, aerospace, automobiles, computer software, construction equipment, copiers, consumer goods, electronics, paper products, shipbuilding, and textiles (Menon et al., 1994). The literature review, taken together, reliably indicates that QFD is deeply integrated into our commercial industry culture.

United States military cost-constrained effectiveness is influenced by our quality of organizing and deploying technology according to specific functions. Other nations that compete militarily with the United States are evidencing their understanding of this (Brauchli, 1997, p. A14; Chen, 1997, p. A15; Fisher, 1996, p. A18). The lessons learned from the competitive strategies practiced under Secretary of Defense Caspar Weinberger during the Reagan administration are not lost in this era characterized by aggressive nations actively seeking technologies providing greater military leverage.

Research by Pisano and Wheelwright demonstrates that outstanding high-tech companies such as Intel and Hewlett-Packard have integrated their product development skills with a new focus on process development, and built unique sustainable competitive positions without expending more resources (1995, p. 105). The type of plan chosen does make a difference!

HOW WELL-SUITED IS QUALITY FUNCTION DEPLOYMENT TO DOD CAIV?

The applicability of QFD to CAIV is enhanced through the instrumental role of integrated product and process development (IPPD). IPPD is a philosophy of integrating all acquisition activities throughout the program life cycle. Integrated product teams (IPTs) are at the core of IPPD; IPTs are most instrumental to CAIV development and implementation.

A standardized structure for cost IPT operations is desirable for common implementation of CAIV initiatives across all DoD programs. System-level cost objectives, in turn decomposed to the subsystem level, are key technical performance measures shared by the program manager and corresponding IPTs. Cost/Performance IPTs (CPIPTs) are empowered to recommend engineering and design changes to the program manager.

IPPD facilitates IPTs for synthesizing acquisition activities throughout the program life cycle. As such, IPTs offer DoD an unprecedented opportunity to implement QFD as an interfunctional planning and communications tool. This is especially true for currently planned advanced military technology implementation programs that require demonstrating a clear path toward reducing costs as well as meeting operational requirements (Wollover and Koontz, 1996, pp. 1–7).

ILLUSTRATION OF APPLYING QFD: GENERIC WEAPON SYSTEM DESIGN

Now we'll illustrate the application of QFD to the process of developing a generic weapon system. A notional weapon system example was selected to provide adequate design complexity, to permit a fairly detailed QFD application example. This example system need not be platform-specific; it is most broadly considered deployable to strike any target (e.g., underwater, surface, or airborne) from any platform (e.g., human, vehicle, aircraft, ship, spacecraft).

The QFD process consists of the following steps: (a) identifying and analyzing customer needs and requirements, (b) identifying TPMs, (c) benchmarking TPMs, (d) assigning priority to customer requirements, (e) establishing TPMs to identify specific design characteristics, and (f) evolving TPMs into the follow-up design phase's requirements. Actual steps do vary in the literature (Guinta and Praizler, 1993; Sanchez et al., 1993; Menon et al., 1994; Verma et al., 1996). The above steps suit our example.

IDENTIFY AND ANALYZE CUSTOMER NEEDS AND REQUIREMENTS

Customer need is defined in the context of a single on-target engagement. Initially, customer language is qualitative and subjective, imparting vagueness and imprecision to the early weapon system design. For example, needs such as maximizing mission effectiveness, maximizing affordability, maximizing supportability, minimizing risk, and optimizing personnel use are all too general for design engineers to immediately respond to. Hence these fuzzy statements are analyzed and translated into more specific requirements to better understand and respond to the perceived deficiency.

The first two columns of Table 1 illustrate these translations. For example, "maximize mission effectiveness" is translated into more concrete goals, such as "locate, track, reach, and destroy target." Developing a common dictionary for the overall QFD model aids in understanding user requirements in light of later tradeoff decisions (Bregard and Chasteen, 1996, p. 172). Once identified, similar customer requirements are grouped with like functional items. Referring again to Table 1, note how the five general customer requirements are the basis of the grouping of subsequent more concrete requirement statements.
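A small illustrative sketch, using an assumed representation rather than the author's, of grouping the vague customer statements with the concrete requirements of Table 1:

    # Broad customer needs mapped to the more concrete requirements of Table 1.
    customer_requirements = {
        "Maximize mission effectiveness": [
            "Locate target", "Track target", "Reach target", "Destroy target"],
        "Maximize affordability": [
            "Minimize R&D cost", "Minimize production cost",
            "Minimize support cost", "Minimize operations cost"],
        "Maximize supportability": [
            "Maximize reliability", "Maximize maintainability"],
        "Minimize risk": [
            "Maximize producibility", "Minimize design complexity"],
        "Optimize personnel use": [
            "Maximize operator effectiveness", "Minimize support errors",
            "Optimize anthropometric factors", "Optimize sensory factors"],
    }

    for broad_need, concrete in customer_requirements.items():
        print(broad_need, "->", ", ".join(concrete))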

IDENTIFY TECHNICAL PERFORMANCE MEASURES

TPMs are the keys to estimating progress for the weapon system's design and development.


Table 1. Notional Generic Weapon System Objectives Table

Columns: customer requirement (a); translated requirement; TPM; quantitative requirement (unit of measure).

Maximize mission effectiveness
  Locate target; track target; reach target; destroy target:
    Motor burnout velocity: km/sec
    Range: km
    Maneuverability: Gl/ms
    Data processing speed: MHz
    Data reception speed: Kb/sec
    Length * diameter: m^2
    Mass: kg
    Sensor accuracy: S/N

Maximize affordability
  Minimize R&D cost:
    R&D: constant year $M
  Minimize production cost:
    Production: constant year $M
  Minimize support cost; minimize operations cost:
    O&S: constant year $M

Maximize supportability
  Maximize reliability (b):
    MTBF: months
    Failure rate: failures/mission engagement
    MTBM: months
  Maximize maintainability:
    Mean preventive maintenance time, BITE (MPMT-B): minutes
    Mean preventive maintenance time, ExTE (MPMT-E): minutes
    Mean corrective maintenance time (MCMT), organizational level: minutes

Minimize risk
  Maximize producibility:
    Amount of major modifications (reintegrating subsystems): percent
    Amount of minor modifications (repackaging subsystems): percent
  Minimize design complexity:
    Hardware complexity: no. of interfaces/no. of subsystems
    Software complexity: no. of interfaces/no. of subsystems
    Subsystems integration complexity: no. of interfaces/no. of subsystems

Optimize personnel use
  Maximize operator effectiveness:
    Operator response times: seconds
    Errors per mission engagement: no. of errors
  Minimize support errors:
    Errors per testing event series: no. of errors
    Errors per maintenance action: no. of errors
  Optimize anthropometric factors:
    Size of maintenance access panel areas: in. x in. x in.
    Time to open each maintenance access panel: seconds
  Optimize sensory factors:
    Each maintenance access panel lighted: lumens
    Color coded panels: indicate tool needs

(a) Ranking order determined on the basis of customer-perceived relative degree of shortfall toward the existing benchmark.

(b) Because mission engagements are not continuous, it is readily assumed that the relationship between reliability and operating time/MTBF is not exponential. Hence MTBF and failure rate may be somewhat more independently specified as design goals.


As "design-dependent parameters," TPMs offer various functionalities. They provide visibility into the status of actual versus required system performance, define corresponding future design goals, provide guideposts to evaluating design concepts and configurations, provide early detection of performance problems requiring management attention, assess the technical impact of proposed changes, and contrast implications of design alternatives. Consequently, TPMs are integral to the program's risk management.

At this early stage of design, TPMs are key parameters that are under the control of the design team. They are manipulated either directly or indirectly to meet customer requirements. TPMs are tangible and describe any relevant system attribute in measurable terms.

Columns: Technical Performance Measure; Quantitative Requirement; Current Benchmark (competing systems); Relative Importance.

Motor burnout velocity (km/sec) 2N N 8

Range (km) 1.5N N 6

Maneuverability (Gl/ms) 2N N 8

Data processing speed 5N Mhz N Mhz 3

Data reception speed 10N Kb/sec N Kb/sec 3

Length * diameter N Meter^2 N Meter^2 1

Mass .8N Kg N Kg 1

Sensor accuracy 2N:N Signal/noise N:N Signal/noise 8

R&D (constant year $M) .9N Dollars N Dollars 4

Production (constant year $M) .8N Dollars N Dollars 6

O&S (constant year $M) .7N Dollars N Dollars 7

MTBF .5N Months N Months 2

Failure (F) rate .5N F / mission engagement N F / Mission engagement 2

MTBM .5N Months N Months 1

Mean prevent. maint. time -BITE (MPMT-B) .8N Minutes N Minutes 1

Mean prevent. maint. time -ExTE (MPMT-E) .8N Minutes N Minutes 1

Mean corr. maint. time (MCMT) -Org Level .8N Minutes N Minutes 1

Amount of major modifications .5N % N % 7

Amount of minor modifications .5N % N % 3

Hardware complexity Sustain No. of Interfaces 4

Software complexity Sustain No. of Interfaces 4

Subsystems integration complexity Sustain No. of Interfaces 5

Operator response times N Sec N Sec 3

Errors per mission engagement .5*(0.N) Errors 0.N Errors 5

Errors per testing event series .5*(0.N) Errors 0.N Errors 1

Errors per maintenance action .5*(0.N) Errors 0.N Errors 1

Size of maintenance access panel areas Sustain n in. x n in. x n in. 1

Time to open each maintenance panel .8N Seconds N Seconds 1

Each maintenance access panel lighted 1.5N Lumens N Lumens 1

Color coded panels Reflect tool needs Using B&W symbols only 1

Table 2. Notional Generic Weapon System Objectives Table: Benchmarking of Technical Performance Measures (Pre-Rank-Ordering)


TPM ratios may be used. An example is effectiveness-to-cost ratios, for which a very wide variety of options may be specified (Wollover, 1991, pp. 149–153). While discrete changes in design measures leading to distinct effectiveness changes may be discerned, effectiveness-to-cost ratios may be normalized so equivalent comparisons of TPMs may be made for purposes such as the six functions mentioned at the beginning of this section. Ratio examples are: system effectiveness to life cycle cost, or reliability to development cost. Ratios such as these may be specified in the form of Δ(customer benefit)/Δ(cost), to facilitate comparison of relative changes among alternative TPM values.
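A minimal sketch of the delta-benefit over delta-cost comparison described above; the function and sample values below are hypothetical illustrations:

    # Relative change in a benefit TPM divided by the relative change in cost.
    def delta_ratio(benefit_new, benefit_base, cost_new, cost_base):
        d_benefit = (benefit_new - benefit_base) / benefit_base
        d_cost = (cost_new - cost_base) / cost_base
        return d_benefit / d_cost

    # e.g., reliability gained per unit of added development cost (notional values):
    ratio = delta_ratio(benefit_new=1.3, benefit_base=1.0,
                        cost_new=1.1, cost_base=1.0)
    print(ratio)  # about 3: benefit grows roughly three times faster than cost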

In Table 1, the TPMs in the third column evolve from the more concretely defined customer requirements shown in the second column. The fourth column displays specific quantitative requirements associated with each TPM.

BENCHMARK TECHNICAL PERFORMANCE MEASURES

Table 2 lists the TPMs and the quantitative requirement (the latter being the same measure found in the fourth column of Table 1). The third column in Table 2, "Current Benchmark," holds the corresponding quantified TPMs found either in the predecessor weapon system or in domestic or foreign competing weapon systems. Consequently, the system developers would like to surpass these benchmarks.
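A brief sketch, following the notional "N" value pattern of Table 2, of flagging which TPM targets surpass their current benchmarks; the dictionary below is a hypothetical subset:

    # Notional benchmark scale; purely illustrative.
    N = 100.0

    tpm_targets = {
        # TPM: (quantitative requirement, current benchmark, higher_is_better)
        "Motor burnout velocity (km/sec)": (2 * N, N, True),
        "Mass (kg)": (0.8 * N, N, False),
        "O&S (constant year $M)": (0.7 * N, N, False),
    }

    for tpm, (target, benchmark, higher_is_better) in tpm_targets.items():
        better = target > benchmark if higher_is_better else target < benchmark
        print(f"{tpm}: {'surpasses' if better else 'does not surpass'} the benchmark")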

PRIORITIZE CUSTOMER REQUIREMENTS

Various system requirements will likely conflict. For example, adding weapon speed and range conflicts with minimizing development and production costs. Consequently, assuming a limited budget, tradeoffs are inevitable. The issue here is on what basis the various interdependent tradeoffs should be made. To help resolve if not overcome these conflicts, the requirements are assigned relative weights that reflect the customer's priorities. For this step, there is little substitute for direct customer survey techniques (Salomone, 1995, p. 108), although appropriate weapon system operations simulations are invaluable for enhancing customer decision processes.

Here we have used an arbitrary but systematic process to assign relative weights, as follows. The last column of Table 2, "Relative Importance," is reserved for assigning customer weights to TPMs. The first pass through the entire TPM series assigned a weight equal to one to all TPMs. The second pass entailed assigning a relative weighting equal to two for more important TPMs. The third pass assigned weights of three to those progressively more important TPMs, and so on, until the sum of the relative importance measures equaled 100. Finally, this list of TPMs was sorted based on these relative importance measures; Table 3 lists these sorted TPMs.
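A minimal sketch of that pass-based weighting; the "more important" sets below are hypothetical stand-ins for customer input, and a scaled-down total replaces 100 for this toy example:

    # Every TPM starts at weight 1; later passes bump the TPMs judged still more
    # important to the next weight, stopping once the weights reach the target sum.
    def assign_weights(tpms, passes, total=100):
        weights = {t: 1 for t in tpms}                  # first pass: all TPMs get 1
        for level, important in enumerate(passes, start=2):
            for t in important:                         # later passes bump weights
                weights[t] = level
            if sum(weights.values()) >= total:
                break
        # Finally, sort by relative importance, as done to produce Table 3.
        return dict(sorted(weights.items(), key=lambda kv: kv[1], reverse=True))

    tpms = ["Motor burnout velocity", "Range", "Sensor accuracy", "Mass"]
    passes = [["Motor burnout velocity", "Range", "Sensor accuracy"],   # weight 2
              ["Motor burnout velocity", "Sensor accuracy"]]            # weight 3
    print(assign_weights(tpms, passes, total=9))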

The above ranking procedure is suitable for our illustration; more rigorous prioritizing procedures are available. For example, variations of the commonly referenced analytical hierarchy process (AHP) are cited in the literature (Armacost et al., 1994, p. 72; Wasserman, 1993, p. 59; Lyman, 1990, p. 307).


Columns: Technical Performance Measure; Quantitative Requirement; Current Benchmark (competing systems); Relative Importance.

Motor burnout velocity (km/sec) 2N N 8

Maneuverability (Gl/ms) 2N N 8

Sensor accuracy 2N:N Signal/noise N:N Signal/noise 8

O&S (constant year $M) .7N Dollars N Dollars 7

Amount of major modifications .5N % N % 7

Range (km) 1.5N N 6

Production (constant year $M) .8N Dollars N Dollars 6

Subsystems integration complexity Sustain No. of Interfaces 5

Errors per mission engagement .5*(0.N) Errors 0.N Errors 5

R&D (constant year $M) .9N Dollars N Dollars 4

Hardware complexity Sustain No. of Interfaces 4

Software complexity Sustain No. of Interfaces 4

Data processing speed 5N Mhz N Mhz 3

Data reception speed 10N Kb/Sec N Kb/Sec 3

Amount of minor modifications .5N % N % 3

Operator response times N Sec. N Sec 3

MTBF .5N Months N Months 2

Failure (F) rate .5N F / Mission engagement N F / Mission engagement 2

Length * diameter N Meter^2 N Meter^2 1

Mass .8N Kg N Kg 1

MTBM .5N Months N Months 1

Mean prevent. maint. time -BITE (MPMT-B) .8N Minutes N Minutes 1

Mean prevent. maint. time -ExTE (MPMT-E) .8N Minutes N Minutes 1

Mean corr. maint. time (MCMT) -org. level .8N Minutes N Minutes 1

Errors per testing event series .5*(0.N) Errors 0.N Errors 1

Errors per maintenance action .5*(0.N) Errors 0.N Errors 1

Size of maintenance access panel areas Sustain N in. x N in. x N in. 1

Time to open each maintenance access panel .8N Seconds N Seconds 1

Each maintenance access panel lighted 1.5N Lumens N Lumens 1

Color coded panels Reflect tool needs Using B&W symbols only 1

Table 3. Notional Generic Weapon System Objectives Table: Prioritization of Technical Performance Measures (Rank-Ordered According to Priority)

These procedures essentially are driven by using sophisticated customer query techniques to develop and assign explicit weighting variables that represent customer priorities. These techniques, while "methodologically intense," should result in fairly unambiguous communication of which customer inputs most greatly influence QFD.


Table 4. First-Order Quality Function Deployment Correlation Matrix

Rows (customer desired attributes, the "whats," each row closing with its relative importance): Maximize mission effectiveness (locate target; track target; reach target; destroy target); Maximize affordability (minimize R&D cost; minimize production cost; minimize support cost; minimize operations cost); Maximize supportability (maximize reliability; maximize maintainability); Minimize risk (maximize producibility; minimize design complexity); Optimize personnel use (maximize operator effectiveness; minimize support errors; optimize anthropometric factors; optimize sensory factors).

Columns (technical/engineering characteristics, the "hows," grouped as mission effectiveness, cost, supportability, risk, and personnel use): Motor burnout velocity (km/sec); Range (km); Maneuverability (Gl/ms); Data processing speed (MHz); Data reception speed (Kb/sec); Length * diameter; Mass (kg); Sensor accuracy (S/N); R&D expenditure (constant year $M); Production expenditure (constant year $M); O&S expenditure (constant year $M); MTBF (months); Failure rate (failures/mission engagement); MTBM (months); Mean preventive maintenance time, BITE (MPMT-B) (min); Mean preventive maintenance time, ExTE (MPMT-E) (min); Mean corrective maintenance time (MCMT), organizational level (min); Amount of major modification (%); Amount of minor modification (%); Hardware, software, and integration complexity (no. of interfaces); Operator response times (sec); Errors per mission engagement (no. of errors); Errors per testing event series (no. of errors); Errors per maintenance action (no. of errors); Size of maintenance access panel areas (in. x in.); Time to open each maintenance access panel (sec); Each maintenance access panel lighted (lumens); Color coded panels.

Cell entries range from 3, the maximum positive correlation, to (3), the maximum negative correlation. The bottom two rows score each column for technical difficulty (10 = most difficult) and imputed importance (10 = most important). The cells supporting Examples 1, 2, and 3 in the text are highlighted with bold borders.


ESTABLISHING TPMS TO IDENTIFY SPECIFIC DESIGN CHARACTERISTICS

All TPMs are considered on an integrated basis using the QFD correlation matrices. This is the first opportunity to integrate all requirements, including effectiveness, cost, operations, and logistics support, into the mainstream design and development process.

Configure the QFD matrix. Customer requirements from the first two columns of Table 1 are set as the "customer desired attributes" (Table 4, the first-order QFD matrix, left-hand section). These are the "whats" to be satisfied. In response to these requirements, TPMs from Tables 1 through 3 are positioned along the top of the matrix. These TPMs are the "hows," to the extent that they support customer requirements.

Correlate customer requirements with TPMs. This is the key step of the QFD process. It involves populating the correlation matrix to reflect program-directed or otherwise inherent cause and effect relationships. Each TPM is analyzed in terms of the extent of its influence on customer requirements.

Varying relative levels of this correlation are notionally depicted in the example correlation matrix (Table 4), ranging from a value of +3, the maximum positive correlation, depicted in the matrix as 3, to –3, the maximum negative correlation, depicted in the matrix as (3). We especially note that these values do not simply correlate, but rather indicate the degree to which the TPMs support the customer requirements. Three examples follow; their occurrences in the first-order QFD matrix shown in Table 4 are highlighted using bold borders surrounding the relevant correlation cells.

Example 1: Note the negative correlations between the TPM mass and the customer requirements for minimizing R&D and production cost. Here, the matrix is not negatively correlating mass with cost, but with the customer requirement of minimizing cost.

Example 2: An interesting example occurs where all six supportability TPMs are negatively correlated with R&D and production costs, but are more positively correlated with operations and support costs. This reflects the normative view advanced by CAIV proponents that greater up-front investment in supportability is warranted for ultimately reducing overall life cycle costs through disproportionately greater savings in the operations and support phase of the program. This example particularly helps to emphasize QFD's utility in identifying clusters of interaction elements.

Example 3: The last example illustrates the relatively minor yet real contribution that improved personnel performance (e.g., quicker human response times, fewer errors, reduced maintenance durations) makes to overall system performance and reliability. This example is intended to call attention to the high value-added human-machine interface (HMI) avenues to cost reduction, such as anthropometric factors, as described generally by Blanchard and Fabrycky (1990, pp. 436–440), and as directly applied to advanced aerospace design as described by Reed (1994, pp. 54–59).
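A minimal sketch of the correlation bookkeeping above. The weights, correlation values, and roll-up rule here are assumptions; the article does not give a formula for the imputed importance row of Table 4, and one common QFD convention is to weight each correlation by the requirement's relative importance and sum down each TPM column:

    # Hypothetical requirement weights (stand-ins for customer priorities).
    relative_importance = {"Minimize R&D cost": 4, "Minimize production cost": 6}

    # Requirement -> {TPM: correlation}, with correlations from -3 to +3.
    # The -1 entries echo Example 1: mass works against minimizing cost.
    correlations = {
        "Minimize R&D cost": {"Mass (kg)": -1, "Motor burnout velocity": -1},
        "Minimize production cost": {"Mass (kg)": -1, "Motor burnout velocity": -1},
    }

    def imputed_importance(correlations, weights):
        # Sum each TPM column, weighting every correlation by its requirement's importance.
        totals = {}
        for requirement, row in correlations.items():
            for tpm, corr in row.items():
                totals[tpm] = totals.get(tpm, 0) + weights[requirement] * corr
        return totals

    print(imputed_importance(correlations, relative_importance))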

General evaluation of the correlation matrix. Empty matrix rows represent unaddressed customer requirements. Where this is so, the set of TPMs is reevaluated, and additional TPMs are specified where needed. By contrast, empty columns in the matrix indicate design or other development actions that are not traced to any customer requirement; they may indicate under-leveraged, redundant, or unnecessary system-level design requirements (Verma et al., 1996, p. 38); a short sketch of these checks appears at the end of this subsection. Other matrix evaluation strategies not covered by the above example could involve the following five avenues:

• Contrast complementary technical solutions against each other, to assess the degree to which they conflict (Hartzell and Schmitz, 1996, p. 36). Correlations among design inputs may be shown using a triangular table at the top of the matrix. However, care is needed to validate the true interactions between design inputs, as many of these interactions may strengthen or weaken as the design evolves (Maisel, 1996, p. 16).

• Evaluate the functional (cause-effect) relationships among concurrent activities to determine not only how flexible the overall development process is, but where additional flexibility is most needed to maintain process responsiveness to customer requirements volatility (Jordan and Graves, 1995, pp. 577–583).

• Cooper and Chew argue that it is insufficient to focus on customers; competitors are a parallel concern (1996, p. 95). Expand the matrix to focus on competitors as well as customers, using key mission or other customer satisfaction TPMs. Use existing competitor TPMs in a benchmarking fashion, as illustrated earlier.

• Thurston and Locascio (1993, pp. 208–213) use multiattribute utility theory to interpret QFD matrices as a general formulation of a design optimization problem. Customer requirements are expressed as constraint functions, and tradeoffs among design attributes are formally specified as sets of variables whose optimal values are solved using selected mathematical optimization models (p. 211).

• Sanchez et al. (1993, pp. 244–249) display perhaps the most creative and productive QFD application. They show matrix appendages containing trend line analyses of data populating the matrix, and separately illustrate iterating successive matrices to the extent of serving as inputs to statistical process control. Both of these enhancements help take QFD beyond the realm of a cognitive tool to that of a hard empirical data generator.

Other value-added assessments from checking the correlation matrix are likely. Well-populated QFD matrices permit synergistic insights particular to a program's chief concerns.
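A short sketch of the empty-row and empty-column checks from the general evaluation above; the matrix literal is hypothetical:

    # An empty row is a customer requirement no TPM addresses; an empty column
    # is a TPM traced to no requirement.
    requirements = ["Maximize reliability", "Minimize operations cost"]
    tpms = ["MTBF (months)", "Operator response times (sec)", "Mass (kg)"]

    matrix = {
        "Maximize reliability": {"MTBF (months)": 3},
        "Minimize operations cost": {},   # empty row: unaddressed requirement
    }

    unaddressed = [r for r in requirements if not any(matrix.get(r, {}).values())]
    untraced = [t for t in tpms
                if not any(matrix.get(r, {}).get(t) for r in requirements)]

    print("Unaddressed requirements:", unaddressed)
    print("TPMs traced to no requirement:", untraced)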

EVOLVE TPMS INTO THE NEXT DESIGN PHASE'S REQUIREMENTS

Table 5 illustrates the transition of the "hows" in the first-order matrix to the "whats" in the second-order matrix. In this series of steps, the TPM outputs of an nth-order matrix become inputs to the successive (n+1)th-order matrix as the design resolution is enhanced, until system design detail has ideally progressed to the point where: (a) all significant design tradeoffs are defined and resolved, (b) specific subsystem or component packaging is determined and tested, so that (c) overall program risk is reduced to acceptable levels.
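A minimal sketch of that cascade, assuming the "hows" of the nth-order matrix become the "whats" of the (n+1)th-order matrix and carry their imputed importance forward as the new row weights; the names and numbers are notional, following Table 5:

    # TPM: imputed importance carried over from the first-order matrix (notional).
    first_order_hows = {
        "Motor burnout velocity (km/sec)": 8,
        "Sensor accuracy (S/N)": 8,
    }

    def next_order_matrix(prior_hows, new_hows):
        # Start an (n+1)th-order matrix: prior hows become the new whats,
        # with empty cells awaiting correlation scores.
        return {what: {"row weight": weight,
                       "correlations": {how: None for how in new_hows}}
                for what, weight in prior_hows.items()}

    second_order = next_order_matrix(
        first_order_hows,
        new_hows=["Motor thrust (Newtons)", "Peak transmitted power (watts)"])
    print(second_order["Motor burnout velocity (km/sec)"])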

Third-, fourth-, fifth-, and higher-order QFD matrices are preferred for translating customer requirements into highly detailed subsystem attributes, or even detailed control of operations (Menon et al., 1994, p. 94). For diverse examples of progressive QFD translation matrix series used to translate the "voice of the customer" into the more evolved "voice of the engineer," see Guinta and Praizler (1993), Hauser and Clausing (1988), Sanchez et al. (1993), and Sage (1992). The number of translation matrices is influenced by the complexity and diversity of the program, in combination with the degree of required design detail. While translation matrices' content may vary widely, they all have a similar structure.

USING QFD TO BENEFIT IMPLEMENTING CAIV

QFD's multi-attribute structure can systematically capture data and interrelate it in a variety of tailored arrangements. This enables higher quality integrated program analysis and evaluation (including cost and effectiveness analysis) earlier in the weapon system life cycle.

Table 5. Second-Order Quality Function Deployment Correlation Matrix

Rows (the "whats," carried forward from the first-order matrix under "maximize mission effectiveness," with their relative importance): Motor burnout velocity (km/sec), 8; Range (km), 6; Maneuverability (Gl/ms), 8; Data processing speed (MHz), 3; Data reception speed (Kb/sec), 3; Length * diameter (m * m), 1; Mass (kg), 1; Sensor accuracy (S/N), 8.

Columns (the "hows," second-order technical/engineering characteristics): Motor thrust (Newtons); Total missile available fuel mass (kg); Missile divert thruster thrust acceleration (m/s2); COTS processor generation upgrade (200 MHz); Missile uplink-receive antenna gain (watts); Miniaturization techniques; Selection of lighter materials; Peak transmitted power (watts).

Cell entries are correlation values from 1 to 3. The bottom two rows score the columns, in the order listed, for technical difficulty (10 = most difficult): 8, 8, 8, 7, 7, 5, 5, 7; and imputed importance (10 = most important): 8, 6, 8, 3, 3, 1, 1, 8.


With QFD's collective knowledge, program managers can less ambiguously evaluate what technologies or other initiatives will fit in and advance the program.

QFD facilitates comprehensively displaying relationship nodes among cost and noncost variables. This encourages structured analyses that yield improved rank orderings of where cost reduction opportunities, owing to earlier discovery and resolution of conflicts, are most significant, before large shares of system life cycle cost become locked in during the earliest program phases.

Also, there is perhaps no better tool to ascertain whether the degree of system definition and development is commensurate with requirements determination, or alternatively, whether requirements determination is lagging behind system specification and development.

The traditional system engineering process provides a standardized "top-down" context for comprehensively applying QFD to program management, as described, for example, by Blanchard and Fabrycky (1990, pp. 22, 50). It consists of: conceptual design and advanced planning, preliminary systems design and advanced development, and detail system design and development.

Applying QFD through sound system engineering principles will allow greater exploitation of modern manufacturing processes and controls. Marshall and Van der Ha (1996, pp. 218–226) provide a pertinent beneficial example of the system engineering approach applied to designing space system ground segments to reduce operations costs.

QFD provides a consistent, robust structure to arrange interactions among cross-functional team members. As organizations gain experience with QFD, the model becomes a source of historical information and "hard-wired" corporate memory. This promotes growing an integrated product team-facilitating learning curve. Thus, an expected output of implementing QFD is to advance the efficiency of the organization, especially for better controlling the flow of IPT interactions, while safeguarding against organizational de-evolution due to loss of corporate memory.

SUMMARY

QFD is a procedure-oriented yet nonmechanistic enabling technology for new program development. It provides a structured framework that uses TPMs to ensure that customer needs are deployed into all phases of design, development, production, and operations. This framework drives the process of developing a road map showing how key steps from design to manufacturing, operations, and support interact at various levels to fulfill customer requirements. This road map promotes documenting overall system logic, reflected by a series of interrelated matrices that translate customer needs into process and product characteristics. Well-documented QFD matrices provide a flexible, dynamic communication vehicle of prior, present, and future actions. Thus, QFD provides a communications tool to accelerate building better relationships and promote trust among cross-functional team members earlier in the system life cycle.

QFD is a team-building, consensus-oriented, flexibly disciplined approach that structures the synthesis of new ideas. It works as a cognitive map to ease communication of evolving knowledge across cost-performance integrated product team elements, enhancing their work in an integrated fashion to give customers what they are asking for. QFD can apply throughout steps ranging from requirements determination through design, delivery, operations, and support.

Incorporating QFD in the design helps to identify critical driving design attributes that should be addressed up front, where they most greatly benefit design evolution.

QFD models integrate data from many areas: customer requirements, strategic plans, engineering expertise, cost, mission effectiveness, production capability, logistic support, hardware and software reliability, and operations and maintenance. The QFD model presents these data in a side-by-side format showing relationships, correlations, and conflicts. It can show, where needed, tradeoffs among requirements, resources, and organizations. A single-page QFD matrix can easily communicate what would require a large number of text pages.

By better connecting developer, user, and supporter, QFD facilitates making CAIV tradeoffs among performance, schedule, and cost.

Figure 1. An Iteration of the Quality Function Deployment Process: (1.0) identify and analyze customer requirements; (2.0) identify technical performance metrics; (3.0) benchmark technical performance metrics; (4.0) prioritize/rank order customer requirements and technical performance metrics; (5.0) configure the quality function deployment matrix; (6.0) inter/intra-correlate customer requirements and technical performance metrics; (7.0) evolve technical performance metrics into the next design phase's requirements.


QFD's iterative nature of using progressively refined TPMs clarifies system design detail to the point where significant design tradeoffs are defined and resolved, and subsystem function and packaging are determined and tested. This reduces program risk. Consequently, through exploiting detailed, coordinated TPMs, QFD functions as a key part of program risk management. Figure 1 generalizes a single iteration of the overall QFD sequence.

One of the most challenging system development steps is sustaining the translation of subjective, evolutionary customer requirement statements into objective engineering performance measures. Hartzell and Schmitz (1996, p. 36) point out that volatile customer requirements are a significant developmental program risk driver. Here, QFD may be used to relate different aspects of design, test, manufacturing, cost, reliability, and technology while both maintaining and archiving the changing customer's voice as the product development driving force. In this sense, QFD is usable as a DoD equivalent of sound commercial business practices that do not lose sight of the fact that the developing voice of the customer is critical to successful implementation.

DoD has recognized QFD as a viable option in complex analyses involving integrated product teams. QFD has been acknowledged as a process enabling true understanding of user requirements and expectations, and documenting the best approach to satisfy requirements. DoD has cited QFD as a way to track the expected tradeoffs through requirements determination, design decisions, production, and support (OUSD[A&T], 1996, pp. 2–5, 6).

No single management tool is a panacea. DoD acquisitions heavily dependent on integrated product teams will benefit from QFD. It is a strong tool for structuring IPT processes to comprehensively identify what to do, coordinate actions and their interfaces, monitor all tradeoffs among activities, and understand evolutions of program features and interrelationships. QFD imposes a self-revealing logic and structure on program development.

Implementing QFD emphasizes the requirement that IPT members take time to learn other functional areas' terminology and develop a common definition of terms, to build renaissance multidisciplinary teams. It is best to set QFD use objectives that stretch organizations, not break them.

Upper-level management may benefit from becoming familiar with QFD. This familiarity can furnish the benefit of understanding what questions to ask to evoke useful information from the QFD framework. A chief example of this is the QFD matrix revealing new relationships among cost and performance variables that indicate emerging cost reduction responsibilities associated with implementing new weapon system technologies. This information, revealed by QFD, may then be used to evolve management strategies that support organization efforts in a manner that guides development toward optimizing system cost-effectiveness.


REFERENCES

Fabrycky, W. (1994, July-September).Modeling and indirect experimentationin system design evaluation. Journal ofthe National Council of Systems Engi-neering, 1 (1), 133–144.

Fisher, R. (1996, December 30). China’smissile threat. Wall Street Journal, p. A16.

Gindele, M. (1996, May-June). Evaluat-ing concurrent engineering programs.Program Manager, 66.

Guinta, L., & Praizler, N. (1993). TheQFD book—the team approach to solv-ing problems and satisfying customersthrough quality function deployment.New York: Amacom Books (AmericanManagement Association).

Hartzell, R., & Schmitz, D. (1996, July-August). Manufacturing questions pro-gram managers should ask. ProgramManager, 34–38.

Hauser, J., & Clausing, D. (1988, May-June). The house of quality. HarvardBusiness Review, 63–73.

Higgins, G. (1997, January-February).CAIV—An important principle of ac-quisition reform. Program Manager,44–47.

Jones, E. (1996). Specification, harmoni-zation, and linkage of test parameters.Acquisition Review Quarterly, 3 (2),147–162.

Armacost, R., et al. (1994, July). An AHPframework for prioritizing customer re-quirements in QFD: An industrializedhousing application. Institute of Indus-trial Engineers Transactions, 26 (4), 72–79.

Blanchard. B., & Fabrycky., W. (1990).System Engineering and Analysis.Englewood Cliffs, NJ: Prentice-Hall.

Brauchli, M. (1997, January 14). Chinastays a magnet for overseas money. WallStreet Journal, p. A13.

Bregard, R., & Chasteen, T. (1996). Imple-menting integrated product develop-ment: A project manager’s perspective.Acquisition Review Quarterly, 3 (2), pp.163–182.

Chen, K. (1997, January 14). Latest bigChina-Russia arms deal could raise con-cern in Asia and U.S. Wall Street Jour-nal, p. A14.

Conrow, E. (1995, Summer). Some long-term issues and impediments affectingmilitary systems acquisition reform.Acquisition Review Quarterly, pp. 199–211.

Cooper, R., & Chew. W. (1996, January–February). Control tomorrow’s coststoday’s designs. Harvard Business Re-view, pp. 88–97.

Page 20: DRW - Quality Function

Acquisition Review Quarterly—Summer 1997

334

Jordan, W., & Graves, S. (1995, April).Principles on the benefits of manufac-turing process flexibility. ManagementScience, 41 (4), 577–594.

Juran, J., & Gryna, F. (1993). Quality plan-ning and analysis—from product devel-opment through use (pp. 253–258). NewYork: McGraw-Hill..

JSFP. (1996b, February 7). JAST virtualmanufacturing fast track demonstratedbenefits. Joint Strike Fighter ProgramWebsite: http\\www.jast.mil\.

Justice, R. (1996, July-August). Risk inthe F–22: A defense science board taskforce analyzes F–22 concurrency andrisk. Program Manager, 68–74.

Kaminski, P. (1995, December 4). Reduc-ing life cycle costs for new and fieldedsystems. Memorandum for distribution.Washington, DC: Office of the Secre-tary of Defense.

Longuemare, R. (1995). Reducing lifecycle costs for new and fielded systems.Memorandum for distribution—imple-mentation guidance—systems in devel-opment. Washington, DC: Office ofPrincipal Deputy Under Secretary ofDefense, Acquisition & Technology.

Lyman, D. (1990). Deployment normal-ization. Transactions from a secondsymposium of quality function deploy-ment, a conference co-sponsored by theAutomotive Division of the AmericanSociety for Quality Control. Dearborn,MI: GOAL/QPC.

Maisel, L. (1996). Activity-based costmanagement. Paper presented at the1996 Conference on Life Cycle Analy-sis: Refining the Trade-Off Tool for In-tegrated Product and Process Develop-ment (IPPD). Harrison, NY: ParamountConsulting Group.

Marshall, M., & Van der Ha, J. (1996).Reducing mission operations cost. In J.Wertz and W. Larson (Eds.), Reducingspace mission cost. Torrance, CA: SpaceTechnology Library Publishers.

Menon, U. et al. (1994). Quality functiondeployment: An overview. In C. Syanand U. Menon (Eds.), Concurrent engi-neering—concepts, implementation andpractice (pp. 90–99). London: Chapmanand Hall.

Office of Deputy Under Secretary of De-fense (AR). (1996). Cost as an indepen-dent variable. Defense acquisitiondeskbook. Defense Acquisition Joint Pro-gram Office. http://[email protected].

Office of Principal Deputy Under Secre-tary of Defense, Acquisition and Tech-nology. (1995). CAIV working group pa-per—executive summary. Washington,D.C.

Office of the Secretary of Defense. (1996,March 15). DoD directive 5000.2-R.Mandatory procedures for major de-fense acquisition programs (MDAP)and major automated information sys-tems (MAIS) acquisition programs.DoD 5000.2-R, part 3.3.3.1. Washing-ton, D.C.

Pisano, G., & Wheelwright, S. (1995, September–October). High-tech R&D. Harvard Business Review, 93–105.

Prasad, B. (1996). Concurrent engineering fundamentals: Vol. 1. Integrated product and process organization (pp. 82–89, 154–156). Upper Saddle River, NJ: Prentice Hall.

Reed, F. (1994, April–May). You can look but you can’t touch. Air & Space Magazine, 54–59.

Sage, A. (1992). Systems engineering (pp. 278–284). Fairfax, VA: George Mason University, School of Information Technology and Engineering.

Salomone, T. (1995). What every engineer should know about concurrent engineering (pp. 100–115). New York: Marcel Dekker.

Sanchez, S., et al. (1993). Quality by design. In A. Kusiak (Ed.), Concurrent engineering—automation, tools, and techniques (pp. 235–251). New York: John Wiley and Sons.

Thurston, D., & Locascio, A. (1993). Multiattribute design optimization and concurrent engineering. In H. Parsaei and W. Sullivan (Eds.), Concurrent engineering—contemporary issues and modern design tools (pp. 207–213). London: Chapman and Hall.

Under Secretary of Defense for Acquisition and Technology. (1996). DoD Guide to Integrated Product and Process Development (Version 1.0). Washington, DC.

Under Secretary of Defense for Acquisition and Technology. (1995). Rules of the road, a guide for leading successful integrated product teams. Washington, DC.

Verma, D., Chilakapati, R., & Blanchard, B. (1996). Quality function deployment (QFD): Integration of logistics requirements into mainstream system design (department paper). Blacksburg, VA: Virginia Polytechnic Institute and State University, Department of Industrial and Systems Engineering, Systems Engineering Design Laboratory (SEDL).

Wasserman, G. (1993, May). On how to prioritize design requirements during the QFD planning process. Institute of Industrial Engineers Transactions, 25 (3), 59–65.

Wollover, D. (1991). Accounting for residual value and the probability of war when estimating cost-to-effectiveness ratios for evaluating alternative weapon systems. In R. Kankey and J. Robbins (Eds.), Cost analysis and estimating: Shifting U.S. priorities (pp. 143–154). New York: Springer-Verlag.

Wollover, D., & Koontz, R. (1996). Issues surrounding the joint common cost model of the joint strike fighter program. Rockville, MD: Princeton Economic Research, Inc.

SUPPLEMENTAL BIBLIOGRAPHY

Akao, Y. (1990). Quality function deployment—integrating customer requirements into product design. Cambridge, MA: Productivity Press.

ASI. (1988). Implementation manual for the three-day QFD workshop. Dearborn, MI: American Supplier Institute.

Aswad, A. (1989). Quality function deployment: A systems approach. Proceedings of the 1989 Institute of Industrial Engineers Integrated Systems Conference. Atlanta, GA.

Bordley, R., & Paryani, K. (1990). A concept for prioritizing design improvements using quality function deployment. Dearborn, MI: American Supplier Institute.

Bossert, J. (1991). Quality function deployment—a practitioner’s approach. Milwaukee, WI: ASQC Press/Marcel Dekker.

Burgar, P. (1994). Applying QFD to course design in higher education. Proceedings, American Society for Quality Control Congress. Las Vegas, NV.

Cavanagh, J. (1990, June). What do I put on my QFD charts? Proceedings, Second Symposium on Quality Function Deployment. Novi, MI.

Chang, C. (1989). Quality function deployment process in an integrated quality information system. Computers and Industrial Engineering, 17 (1–4).

Clausing, D., & Pugh, S. (1991, February). Enhanced quality function deployment. Proceedings, Design Productivity Conference. Honolulu, HI.

Drucker, P. (1995). Managing in a time of great change. New York: Truman-Talley/Dutton.

Ernst and Young Quality Improvement Consulting Group. (1990). Total quality. Homewood, IL: Dow Jones-Irwin.

Eureka, W., & Ryan, N. (1988). The customer-driven company: Managerial perspectives on QFD. Dearborn, MI: American Supplier Institute.

Funk, P. (1993). How we used quality function deployment to shorten new product introduction cycle time. Proceedings, APICS International Conference and Exhibition. San Antonio, TX.

Gavoor, M., & Wasserman, G. (1989). Framing QFD as a mathematical program (internal technical report). Detroit, MI: Wayne State University, Department of Industrial and Manufacturing Engineering.

Hales, R., et al. (1990). Quality function deployment and the expanded house of quality (technical report). International TechneGroup Inc.

Hales, R., et al. (1993). Quality function deployment in concurrent product/process development. Proceedings, IEEE Symposium on Computer-Based Medical Systems. Ann Arbor, MI.

King, B. (1987). Better designs in half the time: Implementing QFD in America. Methuen, MA: GOAL/QPC.

Langdon, R. (1990, February–March). Quality function deployment: New weapon in the zero defects war. Automotive Engineer.

Locascio, A., & Thurston, D. (1993). Multiattribute design optimization with quality function deployment. Proceedings, 2nd IE Research Conference. Los Angeles, CA.

Maddux, G., et al. (1991, June). Organizations can apply quality function deployment as a strategic planning tool. Industrial Engineering, 33–37.

Mallon, J., & Mulligan, D. (1993, September). Quality function deployment—a system for meeting customer needs. Journal of Construction Engineering and Management, 119 (3).

Mann, G., & Halbleib, L. (1992). Application of QFD to a national security issue. Annual Quality Congress Transactions.

Masud, A., & Dean, E. (1993). Using fuzzy sets in quality function deployment. Proceedings, 2nd Industrial Engineering Research Conference.

Prasad, B. (1995). A structured approach to product and process optimization for manufacturing and service industries. International Journal of Quality and Reliability Management, 12 (9).

ReVelle, J. (1990). The new quality technology—an introduction to QFD and Taguchi methods (Report B138). Los Angeles, CA: Hughes Aircraft Company.

Ross, P. (1988, June). The role of Taguchi methods and design of experiments in QFD. Quality Progress.

Schubert, M. (1989). Quality function deployment—a comprehensive tool for planning and development. Proceedings of the IEEE 1989 National Aerospace and Electronics Conference, 4.

Slabey, W. (1990). QFD: A basic primer—excerpts from the implementation manual for the three-day QFD workshop. Transactions, Second Symposium on QFD, Novi, MI.

Smith, L. (1991, February 6–9). QFD and its application in concurrent engineering. Honolulu, HI: Design Productivity International Conference.

Sullivan, L. (1986, June). Quality function deployment. Quality Progress.

Wasserman, G., et al. (1989). Integrated systems quality through quality function deployment. Proceedings from the 1989 Integrated Systems Conference, Institute of Industrial Engineers Conference. Atlanta, GA.

Wheelwright, S., & Clark, K. (1992). Revolutionizing product development. New York: Free Press.

Williams, R. (1994, Summer). Small business manufacturing: An important component of the U.S. defense industrial base. Acquisition Review Quarterly, 250–267.

Zultner, R. (1989). Software quality function deployment. American Society for Quality Control Quality Congress Transactions.

Zultner, R. (1990). Software quality function deployment—adapting QFD to software. Transactions, Second Symposium on QFD, Novi, MI.