
Systems engineering return on investment

    by

    Eric C. Honour BSSE, MSEE

    A thesis submitted for the degree of

    Doctor of Philosophy

    Defence and Systems Institute

    School of Electrical and Information Engineering

    January 2013

Contents

1 Introduction
1.1 Background
1.2 Goals
1.3 Scope
1.4 Research methods
1.5 Primary results
1.6 Thesis organization

2 Review of related research
2.1 Background information
2.2 Formative theory
2.3 Statistical research related to SE-ROI
2.4 Summary and findings of prior results

3 Research design
3.1 Research questions
3.2 Research activities
3.3 Guidance and participation
3.4 Ethics considerations
3.5 Observations and findings

4 Data gathering
4.1 Data to be gathered
4.2 Interview design
4.3 Approach to obtain data
…

5 Statistical results

6 Discussion of results

7 Conclusions and recommendations

Appendix A Bibliography
Appendix B Interview instruments
Appendix C Developmental papers
C.1 Advancing an ontology
C.2 Practical program of research
C.3 Gathering data
C.4 Design of experiments
C.5 Demographics

List of Figures

Figure 1. Intuitive value of SE
Figure 2. Impact of front end project definition effort (from Gruhl 1992)
Figure 3. Definition effort is not the same as systems engineering effort
Figure 4. Shorter schedule for more complex UHF using better SE (from Frantz 1995)
…
Figure 10. Histogram of submissions by SE Effort, % project cost (from Honour 2004)
Figure 11. Cost performance as a function of SE effort (from Honour 2004)
Figure 12. Schedule performance as a function of SE effort (from Honour 2004)
Figure 13. Subjective success as a function of SE effort (from Honour 2004)
Figure 14. SE effort required in COCOMO II for different size systems (from Boehm 2007)
…
Figure 1…. … has less effect for high challenge programs (from Elm 2008)
Figure 18. Seeking optimum level of SE effort within programs (from Honour 2002)
Figure 19. Cost overrun vs. systems engineering effort
Figure 20. Schedule overrun vs. systems engineering effort
Figure 21. Histograms of program 'size' parameters
Figure 22. Team understanding parameters
Figure 23. Problem difficulty parameters
Figure 24. Team capability and experience parameters
…
Figure 2…. Typical scatter plot of a success measure against an SE activity
Figure 31. Correlation charts of success measures against … without application of program characteristics
Figure 32. Weighting search example (original values)
…
Figure 34. Weighting search example (final values)
…
Figure 41. Selecting optimum SE effort using ROI
Figure 42. Correlation plot: cost (…) against MD (…)
Figure 43. Correlation plot: schedule (…) against MD (…)
Figure 44. Correlation plot: overall success (…) against MD (…)
…

List of Tables

Table 1. Advances of this research against past work
Table 2. Research activities prior to and during this thesis effort
Table 3. Data items for program success
Table 4. Data items for systems engineering effort
…
Table …. … levels by level of start definition
Table 10. Correlation coefficients for success measures against … without application of program characteristics
Table 11. Outliers removed from statistical data
Table 12. Correlation improvement for total SE
Table 13. Correlation improvement for mission/purpose definition (MD)
Table 14. Correlation improvement for requirements engineering (RE)
Table 15. Correlation improvement for system architecting (SA)
Table 16. Correlation improvement for system integration (SI)
Table 17. Correlation improvement for verification & validation (VV)
Table 18. Correlation improvement for technical analysis (TA)
Table 19. Correlation improvement for scope management (SM)
Table 20. Correlation improvement for technical leadership/management (TM)
Table 21. … decision values for significance (α = 0.05)
…
Table 23. Correlation significance tests for Systems Engineering effort (all data)
Table 24. Quantified systems engineering return on investment (SE ROI)
…
Table 2…. Summary of findings from statistical work
Table 36. Typical program characterization valuation definitions ('system size')
Table 37. Optimal total SE effort for variation in one factor
Table 38. Optimum and actual SE effort levels for space system program
Table 39. Optimum and actual SE effort levels for airborne training system
Table 40. Summary of findings from discussion of statistical results
Table 41. Most important factors for SE effort estimation

Glossary

ASEE  Adjusted systems engineering effort
AXXE  Adjusted XX effort (adjusted from XXE to correct for missing early-phase activities; see Section 5.1.5)

    C Cost compliance (also used as subscript)

CA  Actual cost at completion
CP  Planned cost
CXX  Cost of the effort expended in SE activity XX
Ĉ  Cost compliance, predicted trend for all programs
ĈG  Cost compliance, predicted trend for a given set of program characteristics

    CMMi Capability maturity model integrated

    COCOMO Constructive cost model

    COSYSMO Constructive systems engineering cost model

    EIA Electronics Industries Association

ESEE  Effective systems engineering effort
EXXE  Effective XX effort (corrected from AXXE to remove the specific characteristics of the program; see Section 5.1.9)

    GXX Correction factor applied to XX effort for a given set of program characteristics

    HX Hypothesis (used for several, with different subscript designations)

    HX0 Null hypothesis (used for several, with different subscript designations)

    INCOSE International Council on Systems Engineering

K, k  Value for a KPP
KPP  Key performance parameter

    LEP Large engineering projects

    MD Mission/purpose definition

MDE  Mission/purpose definition effort
MIT  Massachusetts Institute of Technology

N  Sample size
NDIA  National Defense Industry Association

O  Overall success (when used as subscript)
OS  Overall success
OXXE0  Optimum XX effort for the average program (predicted from data trends)


    OXXEG Optimum XX effort (predicted from data trends, corrected from OXXE0 based on the given program characteristics)

    PC Program challenge

    PCA Principal component analysis

    PM Program management

    PP Percentile point ranking of a program’s characteristic against all programs

    QF1-QF7 Principal components (factors) of quantitative program characterization parameters

R2  Statistical Pearson's correlation coefficient
RAG  Research advisory group

    RE Requirements engineering

REE  Requirements engineering effort
RESL  Architecture and risk resolution

    RFP Request for proposal

ROI  Return on investment
RQX  Research question (used for several, with different subscript designations)

S  Schedule compliance
s  Sample variation in the data obtained
SA  Actual duration
SP  Planned duration
SA  System architecting

SAE  System architecting effort
SE  Systems engineering

    SEC Systems engineering capability

    SECOE Systems Engineering Center of Excellence

SEE  Systems engineering effort
SEQ  Systems engineering quality
SE-ROI  Systems engineering return on investment

    SE% Raw cost ratio of effort expended for total SE activity against total program cost.

SF1-SF7  Principal components (factors) of subjective program characterization
SI  System integration

SIE  System integration effort
SM  Scope management

    SME Scope management effort


T  Technical quality (when used as subscript)
t  Student's t distribution statistic
TA  Technical analysis

TAE  Technical analysis effort
TC  Technical committee

    TM Technical leadership/management

TME  Technical leadership/management effort
TQ  Technical quality
U  Utility value
UHF  Universal holding fixture

    UniSA University of South Australia

    VV Verification/validation

VVE  Verification/validation effort
Weightj  Weighting factor used to correct SE activity efforts based on one characterization factor j
XX  General indicator for various SE activities MD, RE, SA, SI, VV, TA, SM, TM (also used as subscript)

    XXE Effort expended for SE activity XX, normalized from XX% for subjective quality of original effort (See Section 5.1.4)

XXQ  Subjective quality of the effort expended for SE activity XX
XX%  Raw cost ratio of effort expended for SE activity XX against total program cost.

α  Significance level (in statistics); probability of rejecting the null hypothesis wrongly; Type-I error rate; false alarm rate

β  Probability of accepting the null hypothesis wrongly; Type-II error rate; miss rate

    Acceptable variation in the calculated mean

ρ  Pearson's correlation coefficient

Summary

This Systems Engineering Return on Investment (SE-ROI) research project explored

    the quantifiable relationships between systems engineering (SE) activities and program

    success. The work discovered statistically significant relationships between SE

    activities and three success measures: cost compliance, schedule compliance, and

stakeholder overall success. SE-ROI was found to be as high as 7:1 for programs

with little SE effort and 3.5:1 for median programs. Optimum SE effort for median

    programs is 14.4% of total program cost; the work provides an a priori estimation

    method to determine this optimum for specific programs based on 14 characterization

    parameters. These findings address a significant state-of-the-art gap in that SE effort

    levels have typically been based on subjective heuristics rather than quantified success

    parameters.

    The research developed an interview methodology and interview instruments to obtain

    a rich set of data from completed programs. The research was supported by a Research

    Advisory Group of over 60 international members who evaluated the research plans,

    methods, and instruments during development, ensuring a robust research approach.

    Program interviews were performed on 51 completed programs in 16 organizations.

    Interview participants were the Program Manager and Lead Systems Engineer of each

    program. Programs were from a wide variety of both contracted and amortized

    development domains; had a wide range of cost, schedule and success; and evidenced

    SE effort level from near-nil to large.

    Through the use of Principal Component Analysis and a hill-climbing search for best

    correlation, the program SE effort levels were adjusted for the specific program

    characteristics, increasing correlations from R2 = 14% to as much as R2 = 80%. The

    high degree of resultant correlation indicates that the appropriately weighted program

    characterization parameters largely remove the confounding factors that usually

    obscure the relationship between SE effort and program success.

    The resulting relationships show that all SE activities correlate significantly with cost

    compliance, nearly all SE activities correlate with schedule compliance, and most SE

    activities correlate with stakeholder overall success. There is some indication of

    causality in qualitative and theoretical factors. While not definitive, the implication is

    that the level of selected SE effort is causative of the program success. If true, then the


    use of the SE effort estimation method herein would result in the best available

    program success.

    Some additional findings are also presented. The data shows no significant correlation

    between SE effort levels and system technical quality. There is indication that this lack

    of correlation is due to program emphasis on requirement thresholds rather than on

    stakeholder-defined technical quality. Optimizing technical leadership/management

    levels is shown to provide a unique benefit in simultaneously associating with cost

    compliance, schedule compliance, and overall stakeholder success. The work also

    contributed to the discovery of a commonly held SE ontology that could be expressed

    in eight SE activities. The worth of this ontology was evident in its easy understanding

    by all interview participants.

Declaration

This thesis presents work carried out by myself and does not incorporate without

    acknowledgment any material previously submitted for a degree or diploma in any

    university; to the best of my knowledge it does not contain any materials previously

    published or written by another person except where due reference is made in the text;

    and all substantive contributions by others to the work presented, including jointly

    authored publications, are clearly acknowledged.

    Eric C. Honour

    8 January 2013

Acknowledgements

This work would not have been possible without the participation, encouragement, and

    wisdom provided by many.

    The University of South Australia (UniSA) has played a significant role in the

    completion of this work. While the original three-phase research plan was created in

    1997, its progress was slow and difficult until UniSA graciously offered its sponsorship

    of Phase III as a doctoral candidacy. The ‘official’ nature of the research from that

    point served two primary purposes: (a) it provided deadlines and impetus to progress,

    to act in opposition to the pull of daily business that often held the research back, and

    (b) it opened doors to other advisors and for programs to interview that were not open

    to an independent researcher. In particular, I wish to acknowledge Prof. Stephen Cook,

    who twisted my arm to formalize this relationship and who reluctantly ended up as

    primary supervisor; A/Prof. Joseph Kasser, primary supervisor during the formative

    stages, who helped to keep the scope within control; and A/Prof. Timothy Ferris, whose

    detailed reviews were always insightful and helpful. Other researchers and academics

    on staff at UniSA Defence and Systems Institute (DASI) often helped with

    encouragement, review, contacts, and tidbits of essential knowledge. In addition, the

    entire DASI administrative staff was a delight to work with, with a special note for the

    selfless and always-present help of Dale Perin.

    Untold numbers of friends in the International Council on Systems Engineering

    (INCOSE) provided special help, encouragement, and knowledge to this effort. I can

    single out Dr. George Friedman, Dr. Elliot Axelband, and Dr. Azad Madni for their

    constant and necessary push toward successful completion. Others, who preceded me

    through the immense work to complete doctorates, gave me a helpful pull forward,

    including specifically Dr. Ricardo Valerdi and Dr. Sarah Sheard. I also thank Dr. Bill

    Ewald for his encouragement and always-positive attitude toward my work. Particular

    thanks go to Dr. Brian Mar, who worked with me and encouraged me in the earlier

    phases but has since gone on to a better world.

    The SE-ROI Research Advisory Group participants, over 60 strong, provided

    significant help by reviewing the formulation of the research plans and methods, with

    many timely comments and changes to improve the effort. They also often provided

    the necessary contacts to obtain the program interviews.


    And finally, I cannot express sufficiently my gratitude to the unnamed organizations

    that participated in interviews, and the individuals from those organizations who were

    interviewed. The work could not have been done without the generous access you

    allowed me to your programs through proprietary boundaries. I hope that this work

    fulfills your expectations, and that the results advance the discipline of systems

    engineering to your benefit and the benefit of mankind.


    This work is dedicated to

    My wife, Beth, whose unstinting support over 15 years

    has encouraged, cautioned and prodded this success,

    The systems engineering discipline, making the world

    better through technology,

    and mostly to

    God, who gave me the skills and reasons to perform it.


    1 Introduction

1.1 Background

The discipline of systems engineering (SE) has been recognized for 50 years as

    essential to the development of complex systems. Since its recognition in the 1950s

    (e.g. Goode 1957), SE has been applied to products as varied as ships, computers and

    software, aircraft, environmental control, urban infrastructure, automobiles, and many

    more (SE Applications TC 2000). Systems engineers have been the recognized

    technical leaders of complex program after complex program (Hall 1993, Frank 2000).

    In many ways, however, less is understood about SE than nearly any other engineering

    discipline. The engineering aspects of SE rely on systems sciences; they also rely on

    engineering relationships in many domains to analyze product system performance.

    But systems engineers still struggle with the basic mathematical relationships that

    control the development of systems. SE guides each system development by the use of

    heuristics learned by each practitioner during the personal experimentation of a career.

    The heuristics known by each differ; one need only view the fractured development of

    SE standards and SE certification to see how much they differ.

    As a result of this heuristic understanding of the discipline, it has in the past been

    nearly impossible to quantify the value of SE to programs (Sheard 2000). Yet both

    practitioners and managers intuitively understand that value. They typically

    incorporate some SE practices in every complex program. The differences in

    understanding, however, just as typically result in disagreement over the level and

formality of the practices to include. Prescriptivists create extensive standards,

    handbooks, and maturity models that prescribe the practices that ‘should’ be included.

    Descriptivists document the practices that were ‘successfully’ followed on given

    programs. In neither case, however, are the practices based on a quantified

    measurement of the actual value to the program.


    The intuitive understanding of the value of SE is shown in Figure 1. In traditional

    design, without consideration of SE concepts, the creation of a system product is

    focused on fixing problems during production, integration, and test. In a ‘system

    thinking’ design, greater emphasis on the front-end system design creates easier, more

    rapid integration and test. The overall result promises to save both time and cost, with

    a higher quality system product.

    The primary impact of the systems engineering concepts is to reduce risk early, as also

    shown in Figure 1. By reducing risk early, the problems of integration and test are

    prevented from occurring, thereby reducing cost and shortening schedule. The

    challenge in understanding the value of SE is to quantify these intuitive understandings.

[Figure: two panels. Top: development phases SYSTEM DESIGN, DETAIL DESIGN, PRODUCTION/INTEGRATION/TEST compared for Traditional Design vs. 'System Thinking' Design, showing saved time/cost. Bottom: Risk vs. Time curves for the two approaches.]

    Figure 1. Intuitive value of SE

    The research program described in this thesis commenced in 2005. In an earlier phase

    of this research Honour (2004) reported on survey work that indicated a correlation

    between the amount of SE work and the cost/schedule success of programs. Another

    more extensive survey performed by the National Defense Industry Association

    (NDIA) (Elm et al. 2007) showed levels of correlation between subjectively determined

    SE activities and the success of programs. A third work (Boehm et al. 2008) explored

    quantitative indications about SE correlations available from within software cost

    estimation data. Yet to date, none of these prior works have provided sufficient

    information to determine either the degree of correlation or the optimum level and type

    of SE to select based on the parameters of a program. These are the goals of this

    research. See chapter 2 for a more extensive discussion of prior work.


1.2 Goals

This Systems Engineering Return on Investment (SE-ROI) research was designed to

    gather empirical information about how systems engineering methods relate to

program[1] success. In particular, the research was aimed at two results of great value to

    the theory and practice of SE:

    1. Determining the degree of correlation between SE activities and program

    success.

    2. Determining the optimum amount and type of SE activities based on a

    program’s definition parameters.

    The effort leading to this thesis was originally conceptualized in work that preceded

    this thesis work. Sections 2.2 and 3.2 describe the distinction between the prior work

    and this thesis.

1.3 Scope

In performing this research, the field of SE has been defined by an ontological view

    based on a melding of the primary current SE standards as described in Section 4.1.3.

    This ontology resulted in the definition of eight major activities that collectively

    encompass what is usually perceived to be ‘systems engineering’:

    Mission/purpose definition

    Requirements engineering

    System architecting

    System integration

    Verification and validation

    Technical analysis

    Scope management

    Technical leadership/management

    Of these, the first five are roughly sequential in nature, defining the typical

    development lifecycle. The last three are largely continuous throughout the

    development. See section 4.1.3 for definitions of these eight activities and a

description of how they were selected.

[1] Possible confusion exists between the terms 'program' and 'project.' For clarity in this report, the word 'project' refers to the SE-ROI project. The word 'program' refers to the system development programs whose data is gathered.
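Because later chapters refer to these activities almost exclusively by two-letter abbreviation, it is convenient to see the ontology as a small lookup table. The following Python sketch is merely an illustrative encoding (the abbreviations are those defined in the Glossary; the sequential/continuous tags restate the paragraph above):

    # The eight SE activities of the ontology (Section 1.3), keyed by their
    # Glossary abbreviations. The first five are roughly sequential across the
    # development lifecycle; the last three are continuous throughout it.
    SE_ACTIVITIES = {
        "MD": ("Mission/purpose definition",      "sequential"),
        "RE": ("Requirements engineering",        "sequential"),
        "SA": ("System architecting",             "sequential"),
        "SI": ("System integration",              "sequential"),
        "VV": ("Verification and validation",     "sequential"),
        "TA": ("Technical analysis",              "continuous"),
        "SM": ("Scope management",                "continuous"),
        "TM": ("Technical leadership/management", "continuous"),
    }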


    The results of this research are bounded by the lifecycle of a system development.

    Because SE applies to the primary development of systems, the data obtained has been

    restricted to programs in which a system development occurred, with temporal lifecycle

    bounds from the beginning of development to the creation of first system(s). See

    section 4.5 for the methods used to define the bounds of each program.

    The results of this research are also bounded by the domain source of the data. The

    data source has been limited to the programs and organizations that made themselves

    available for the research. See section 4.6 for the demographics of the source

    organizations and programs. The results herein do not apply to SE as used outside

    these bounds.

1.4 Research methods

The research gathered data from real programs through an interview process. The

    information gathered included three major classes:

    Amount and type of SE activities used on the programs

    Success levels of the programs

    Program characterization parameters

    Prior to conducting interviews, the research project created a consistent structure for the

    interviews based on (a) theoretical hypotheses about the expected relationships, (b) the

    ontological view of systems engineering, and (c) feasible interview length. The

    interview design was vetted through peer review and through a process of test

    interviews. See section 4.2 for further discussion of the interview design.

    The interviews were conducted once per program, using data extracted from program

    records and from responses of the interview participants. Interviews were conducted

    over a three-year period by the author. Interview participants were the primary

    program and technical leaders of each program. At the outset of each interview, initial

    conversations set the program bounds to be used throughout the interview; from that

    point, all answers were consistent with the defined program bounds. The researcher

    performed all recording of data. See section 4.5 for further description of the interview

    methods.

    After gathering interview data, the research applied rigorous statistical methods to

    examine the relationships in the data against the hypotheses related to the research


    project’s goals. Through Principal Component Analysis and a hill-climbing search for

    best correlation, the data was used to determine the combination of program

    characteristics that most affected the correlations of success versus SE activities. Initial

    adjustments were then made to the raw data to convert each program from its native

    program characteristics to a median level of characteristics. These adjustments largely

    removed the confounding factors that usually obscure the primary relationships. Using

    the medianized data then, tests were performed on the primary relationships of success

    versus SE activities, to determine the level and confidence of the correlations. The

    relationships of the program characterization parameters to the primary correlations

    were tested to determine the effect of each parameter on the correlations. The ROI of

    SE was then calculated from the correlated relationships. See section 5.1 for further

    discussion of the statistical methods used, and sections 5.2 and 5.3 for the specific

    statistical analyses against the project goals.
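As a minimal sketch of this adjustment pipeline (not the project's actual code), the fragment below shows the general shape of a hill-climbing search over characterization-factor weights that maximizes the correlation between adjusted SE effort and a success measure. The linear adjustment form, the step size, and all variable names are assumptions made for illustration, and the Principal Component Analysis step that first reduces the characterization parameters is omitted:

    import numpy as np

    def r_squared(x, y):
        # Squared Pearson correlation between two 1-D arrays.
        r = np.corrcoef(x, y)[0, 1]
        return r * r

    def adjust_effort(se_effort, chars, weights, medians):
        # Shift each program's SE effort toward that of a 'median' program by
        # removing the weighted deviation of its characterization parameters
        # from the median values (an illustrative adjustment form only).
        return se_effort - (chars - medians) @ weights

    def hill_climb(se_effort, success, chars, steps=2000, step_size=0.01):
        # Greedy hill climb over the characterization weights, keeping any
        # random step that improves the success-vs-effort correlation.
        rng = np.random.default_rng(seed=1)
        medians = np.median(chars, axis=0)
        weights = np.zeros(chars.shape[1])
        best = r_squared(adjust_effort(se_effort, chars, weights, medians), success)
        for _ in range(steps):
            trial = weights + step_size * rng.standard_normal(weights.shape)
            r2 = r_squared(adjust_effort(se_effort, chars, trial, medians), success)
            if r2 > best:
                weights, best = trial, r2
        return weights, best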

1.5 Primary results

The research succeeded in demonstrating the expected relationships between program

    success and SE activities. In addition, the results provide a quantified evaluation of

    those relationships that can be used in the planning of system development programs.

    Findings of this research are included throughout this thesis and are listed at the end of

    each chapter. Major findings are described in Section 7.1. The primary results include

    the following, all of which are rigorously supported by the work herein:

    There is a quantifiable relationship between systems engineering effort levels and

    program success, demonstrated by high correlation coefficients well in excess of

the test values for a significance level of α = 0.05.

    The Return on Investment (ROI) for SE effort can be as high as 7:1 for programs

    expending little to no SE effort. For programs expending a median level of SE

    effort, the ROI is 3.5:1.

    No correlation was found between systems engineering effort levels and system

    technical quality.

    There is an optimum amount of systems engineering effort for best program

    success. For a program of median characterization parameters, that optimum is

    14.4% of the total program cost.


    Programs typically use less systems engineering effort than is optimum for best

    success.

    An effort estimation method is available to determine the optimal levels of SE

    effort for a given set of program characterization parameters. Variation in the

    program characterization typically changes the optimum between 8% and 19% of

    the total program cost.
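To make the arithmetic behind these findings concrete, the sketch below assumes a quadratic relationship between the SE effort fraction and the actual/planned cost ratio, with coefficients chosen only so that the curve roughly reproduces the reported numbers (an optimum near 14.4% and returns near 7:1 and 3.5:1); they are illustrative placeholders, not the fitted values from this research:

    # Illustrative model: cost_ratio(e) = a + b*e + c*e**2, where e is the SE
    # effort fraction. Coefficients are placeholders tuned to echo the reported
    # findings, not the thesis's fitted values.
    a, b, c = 1.58, -8.0, 27.78

    def cost_ratio(e):
        return a + b * e + c * e ** 2

    def marginal_roi(e):
        # Cost saved per additional unit of SE spend at effort level e,
        # read off as the negative slope of the cost-ratio curve.
        return -(b + 2 * c * e)

    e_opt = -b / (2 * c)                       # slope is zero at the optimum
    print(f"optimum SE effort:     {e_opt:.1%}")                 # ~14.4%
    print(f"ROI near 2% SE effort: {marginal_roi(0.02):.1f}:1")  # ~6.9:1
    print(f"ROI near 8% SE effort: {marginal_roi(0.08):.1f}:1")  # ~3.6:1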

    Prior research work in this area is examined in Chapter 2. The primary results listed

    above are a significant extension beyond anything available in the prior work. The

    extension is most apparent in the specific empirical evidence that provides quantifiable,

    management-oriented decision information as opposed to the prior work that is largely

    subjective in kind. Section 2.4 specifically summarizes the findings of prior work and

    compares the results obtained by this research.

1.6 Thesis organization

The thesis has the following organization:

    Chapter 1 Introduction (this chapter) contains a brief introduction to the research

    and summary of its results, with references to the chapters that follow.

    Chapter 2 Review of related research describes prior work by the author and

    others, establishing a basis upon which the current research builds. Some findings

    are offered based on the research of prior works.

    Chapter 3 Research design describes in detail the design of the research, including

    the research questions explored; the research activities performed; the guidance,

    participation, and organization of the research project; and the ethics

    considerations. Findings are offered based on the research design work.

    Chapter 4 Data gathering describes the methods used to obtain data, including the

    approach to gain access to programs; the methods used during interviews; the

    protection of the data; and the demographics of the programs interviewed. Findings

    are offered based on the interview work and demographics.

    Chapter 5 Statistical results describes the statistical methods used to analyze the

    gathered data and the results obtained. The research questions are reformulated as

    specific hypotheses. The statistical results are used to test each hypothesis.

    Findings are offered based on the statistical results, along with necessary

    limitations.


    Chapter 6 Discussion of results provides a logical analysis of the results, possible

    indications of causality, and their limitations, including examples of how the results

    can be used during or in advance of a system development program.

    Chapter 7 Conclusions and recommendations completes the work by

    summarizing the major findings and indicating areas of possible future work.

    Appendices provide supporting information including

    Bibliography

    Interview instruments

    Copies of developmental papers


    2 Review of related research

    This chapter provides a review of prior work done by the author and others, to lay a

    basis of knowledge for the research. For each work, this chapter provides a specific

    reference to the work (by reference to the Bibliography) and a summary of its

    contributions and findings.

2.1 Background information

Some prior work has provided a historical background of information related to the

    ROI of SE. Most of this work, while interesting, is anecdotal in nature because it (a)

    was not directed at the SE issues, (b) was based on few data points, and/or (c) was not

    based on a sound and declared research methodology. Yet the total of these works

    indicates an underlying trend that generally supports the possibility to calculate SE-

    ROI.

    Boundary management study. A statistical research project in the late 1980s (Ancona

    1990) studied the use of time in engineering projects. Ancona and Caldwell gathered

    data from 45 technology product development teams. Data included observation and

    tracking of the types of tasks performed by all project members throughout the projects.

    Secondary data included the degree of success in terms of product quality and

    marketability. Of the projects studied, 41 produced products that were later

    successfully marketed. The remaining four projects failed to produce a viable product.

    One primary conclusion of the research was that a significant portion of the project

    time was spent working at the team boundaries. Project time was divided as:

    Boundary management 14%

    Work within team 38%

    Individual work 48%


    Boundary management included work that was typically done by a few individuals

    rather than by all members of the team. The work included efforts in classes defined as

    Ambassador, Task Coordinator, Scout, and Guard, indicating the role of the work with

    relation to the project. More important to the value of systems engineering, the

    research also concluded statistically that high-performing teams did more boundary

    management than low-performing teams. This relates to systems engineering because

    many of the boundary management tasks are those that are commonly performed as

    part of SE management.

    NASA project definition. Werner Gruhl of the NASA Comptroller’s office presented

    results (Gruhl 1992) that relate project quality metrics with a form of systems

engineering effort (Figure 2). This data was developed within NASA in the late 1980s

    for 32 major projects over the 1970s and 1980s.

[Figure: scatter plot titled 'Total Program Overrun, 32 NASA Programs' (R2 = 0.5206); x-axis: Definition Percent of Total Estimate (0-20%); y-axis: Program Overrun (0-200%); labeled points include GRO, OMV, GALL, IRAS, TDRSS, HST, SEASAT, VOY, ULYS, HEAO, and other NASA programs.

Definition Percent = Definition $ / (Target + Definition $)

Program Overrun = (Actual + Definition $) / (Target + Definition $)]

    Figure 2. Impact of front end project definition effort. (from Gruhl 1992)

    The NASA data compares project cost overrun with the amount of the project spent

    during phases A and B of the NASA five-phase process (called by Gruhl the ‘definition

    percent’). The data shows that expending greater funds in the project definition results

    in significantly less cost overrun during project development. Most projects used less

    than 10% of funds for project definition; most projects had cost overruns well in excess


    of 40%. The trend line on Gruhl’s data seems to show an optimum project definition

    fraction of about 15%.
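Reading Gruhl's two ratio definitions back from the figure, a short worked example may help; the dollar figures below are hypothetical, chosen only to exercise the formulas:

    def definition_percent(definition, target):
        # Gruhl's x-axis: definition spending as a share of the
        # definition-inclusive target estimate.
        return definition / (target + definition)

    def program_overrun(actual, definition, target):
        # Gruhl's y-axis: definition-inclusive actual cost relative to the
        # definition-inclusive target.
        return (actual + definition) / (target + definition)

    # Hypothetical program: $6M definition effort, $100M target, $140M actual.
    print(f"definition percent: {definition_percent(6, 100):.1%}")    # 5.7%
    print(f"program overrun:    {program_overrun(140, 6, 100):.2f}")  # 1.38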

    The NASA data, however, does not directly apply to systems engineering. In Gruhl’s

    research, the independent variable is the percent of funding spent during NASA Phases

    A and B, the project definition phases. Figure 3 shows the difference between this and

    true systems engineering effort. It is apparent from this difference that the relationship

    shown in the NASA data only loosely supports any conclusion related to systems

    engineering.

[Figure: Development Effort vs. Time for a total project, showing the NASA definition effort (early phases) as a different quantity from the systems engineering effort, which continues across the project.]

    Figure 3. Definition effort is not the same as systems engineering effort.

    Impact of SE on quality and schedule. A unique opportunity occurred at Boeing as

    reported by (Frantz 1995), in which three roughly similar systems were built at the

    same time using different levels of systems engineering. The three systems were

    Universal Holding Fixtures (UHF) used for manipulating large assemblies during the

    manufacture of airplanes. Each UHF was of a size on the order of 10’ x 40’, with

accuracy on the order of thousandths of an inch. The three varied in their complexity,

    with differences in the numbers and types of sensors and interfaces.

[Figure: bar chart of Overall Development Time (weeks, 0-100) for UHF1, UHF2, and UHF3.]

    Figure 4. Shorter schedule for more complex UHF using better SE (from Frantz 1995)

    The three projects also varied in their use of explicit SE practices. In general, the more

    complex UHF also used more rigorous SE practices. Some differences in process, for


    example, included the approach to stating and managing requirements, the approach to

    subcontract technical control, the types of design reviews, the integration methods, and

    the form of acceptance testing.

    The primary differences noted in the results were in the subjective quality of work and

    the development time. Even in the face of greater complexity, the study showed that

    the use of more rigorous SE practices reduced the durations (a) from requirements to

    subcontract Request For Proposal (RFP), (b) from design to production, and (c) overall

    development time. Figure 4 shows the significant reduction in overall development

    time. It should be noted that UHF3 was the most complex system and UHF1 the least

    complex system. Even though it was the most complex system, UHF3 (with better SE)

completed in less than half the time of UHF1.

    Large engineering projects study. An international research project led by

    Massachusetts Institute of Technology (MIT) studied the strategic management of large

    engineering projects (LEP) (Miller 2000). The project reviewed the entire strategic

    history of 60 worldwide LEPs that included the development of infrastructure systems

    such as dams, power plants, road structures, and national information networks. The

    focus of the project was on strategic management rather than technical management.

    The project used both subjective and objective measures, including project goals,

    financial metrics and interviews with participants.

[Figure: Percent of Projects Meeting: Cost Targets 82%; Schedule Targets 72%; Objective Targets 45%, with 18% partially met and 37% failed.]

    Figure 5. Many engineering projects fail to meet objectives (from Miller 2000)

    The statistical results of the LEPs are shown in Figure 5. Cost and schedule targets

    were often not met, but technical objective targets were only met in 45% of the 60

    projects. Fully 37% of the projects completely failed to meet objectives, while another

    18% met only some objectives. The project found that the most important determinant

    in success was a coherent, well-developed organizational structure; in other words, a

    structure of leadership creates greater success. Because SE usually includes a


    component of technical leadership, this finding seems to indicate a significant value of

    SE.

    The Shangri-La of ROI. A popular and often-referenced paper is Sheard and Miller

    (2000), which describes the difficulties in attempting to define the ROI of SE. Through

    observation of the then-current state of measurement, they hypothesized that:

    (1) There are no ‘hard numbers.’ (2) There will be no hard numbers in the foreseeable future. (3) If there were hard numbers, there wouldn’t be a way to apply them to your situation, and (4) If you did use such numbers, no one would believe you anyway.

    Sheard and Miller built the theory based on the general lack of hard numbers in the

    preceding decade and the level of non-use of the few hard numbers that were available.

    In particular, they referenced Herbsleb (1994) and Frantz (1995), the small sample

    sizes used, and the lack of impact of those results. They then went on to discuss how to

    motivate SE process improvement through means other than hard numbers.

    Lessons learned from large, complex technical projects. In contrast with Sheard and

    Miller, another contemporary paper (Cook 2000) performed a survey of prior literature

    to determine what had actually been learned about SE from large, complex technical

    projects. After defining the basic boundaries of SE, Cook examined the NASA data

from Gruhl (1992), UK Ministry of Defence (MOD) reports, UK civil software

    development, US civil software development, US federal software development, and

    aircraft development case studies, resulting in a set of guidance principles for SE

    practitioners, planners, and process developers. As part of this survey, Cook cites the

    UK Downey principles (DERA 1996), defined since the 1960s, in which 15% of the

    total project costs should be expended during systems definition ‘to engender speedier,

    more coherent and interactive processes.’ This number is also contained in MOD

    (1999).

    Commercial systems engineering effectiveness study. IBM Commercial Products

    division implemented new SE processes in their development of commercial software.

    While performing this implementation, they tracked the effectiveness of the change

    through metrics of productivity. As reported by Barker (2003), productivity metrics

    existed prior to the implementation and were used in cost estimation. These metrics

    were based on the cost per arbitrary ‘point’ assigned as a part of system architecting.

    During the SE implementation, the actual costs of eight projects were tracked against


    the original estimates of ‘points.’ Three projects used prior ‘non-SE’ methods, while

    the remaining five used the new SE methods. In the reported analysis, the data

    indicated that the use of SE processes improved overall project productivity when

    effectively combined with the project management and test processes. Cost per point

    for the prior projects averaged $1350, while cost per point for the projects using SE

    processes averaged $944, a cost reduction of 30%.

Year   Project     "Points"   Cost ($K)   SE Costs (%)   $/Point
2000   Project 1     12,934      18,191       0            1,406
2000   Project 2      1,223       2,400       0            1,962
2001   Project 3     10,209      11,596       9.2          1,136
2001   Project 4      8,707      10,266       0            1,179
2001   Project 5      4,678       5,099      10.7          1,090
2002   Project 6      5,743       5,626      14.4            980
2002   Project 7     14,417      10,026      10.2            695
2002   Project 8        929       1,600      16.0          1,739

Figure 6. Implementation of SE processes resulted in statistically significant cost decrease (from Barker 2003)
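The $1350 and $944 averages are size-weighted (total cost divided by total points) rather than simple means of the per-project figures. A quick check in Python, assuming the three projects with 0% SE cost in Figure 6 are the 'non-SE' projects:

    # (points, cost in $K, used new SE processes?) for the eight projects.
    projects = [
        (12934, 18191, False), (1223, 2400, False), (10209, 11596, True),
        (8707, 10266, False), (4678, 5099, True), (5743, 5626, True),
        (14417, 10026, True), (929, 1600, True),
    ]

    def cost_per_point(rows):
        # Size-weighted cost per point: total dollars over total points.
        total_cost = sum(cost for _, cost, _ in rows)        # $K
        total_points = sum(points for points, _, _ in rows)
        return 1000.0 * total_cost / total_points            # dollars

    non_se = cost_per_point([p for p in projects if not p[2]])
    with_se = cost_per_point([p for p in projects if p[2]])
    print(f"non-SE projects: ${non_se:.0f}/point")         # ~$1350
    print(f"SE projects:     ${with_se:.0f}/point")        # ~$944
    print(f"reduction:       {1 - with_se / non_se:.0%}")  # ~30%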

    Impact of systems engineering on complex systems. A third study was reported by

    Kludze (2004), showing results of a survey on the impact of SE as perceived by NASA

    employees and by INCOSE members. The survey contained 40 questions related to

    demographics, cost, value, schedule, risk, and other general effects. While most of the

    survey relates in some way to the value of systems engineering, one primary result

    stands out. Respondents were asked to indicate the percent of their most recent project

    cost that was expended on SE, using aggregated brackets of 0-5%, 6-10%, 11-15%, and

    16% or more. Figure 7 shows the result. (It is noted that the study presented the results

    as shown in a continuous curve, although the actual results only support four data

    points.)

    The respondents believed that their projects most often spent between 6-10% on SE,

    with few projects spending more than 10%. It appears that INCOSE respondents

    believed their projects spent proportionately more on SE than did NASA respondents.

    There is, however, an anomaly in this data that is represented by the bimodal

    characteristic of the responses. Many respondents indicated that their projects spent

    16% or above. It is believed that this anomaly occurs because the respondents

    interpreted ‘project’ to include such projects as a system design effort, in which most of

    the project is spent on SE.


[Figure: bar chart of Percent of Respondents (0-35%) by SE cost bracket (0-5%, 6-10%, 11-15%, 16% & above) for NASA, INCOSE, and combined respondents.]

    Figure 7. Percent of total project cost spent on SE (from Kludze 2004)

    Project management and systems engineering in the commercial environment. This

    observational study (Gamgee 2006) examined 10 large programs in commercial

    industries (telecommunications, banking, gaming) for evidence of SE activities,

    comparing those activities to the success or difficulties on the projects. In the paper,

    SE activities include requirements development, technical design/solution, system

    integration & test, system implementation, and system support. Both SE activities and

    program success were evaluated using subjective measures. It was found that all

    projects with poor SE activity measures also had poor success measures.

    The study went further, to examine the management attitudes around the finding. The

    exploration showed that managers were generally aware of the finding, but often had

    bad experiences with increasing or improving SE. The bad experiences included

overwhelming process initiatives, overemphasis on technical design, and lengthened time-to-market.

2.2 Formative theory

Based on the background information, the author performed earlier work to determine

    theoretical relationships that apply to SE-ROI. In 1997, the author defined a three-

    phase approach to quantifying the value of SE:

    Phase I: Theoretical work to predict the quantified form of the relationship

    Phase II: Statistical evaluation of volunteer, subjective surveys

    Phase III: Detailed interviews with programs to obtain contractual data.


    The papers reported in this section and some of those in the next represent the results

    from Phase I and Phase II. It should be noted that this thesis is essentially Phase III of

    that original research plan.

    The Phase I work was presented in two papers, the second of which became

    foundational for the sequence of research activities that have led to this thesis.

    Characteristics of engineering disciplines. In Honour (1999), the author explored the

    theory that SE is essentially inter-disciplinary in nature and therefore dependent on the

    underlying engineering disciplines for its value. A review of ten major engineering

    disciplines revealed both common and diverging natures that affected the ability to

    engineer systems using those disciplines. In some cases, other engineering disciplines

    were found also to be practicing some essential characteristics of SE. (e.g. Civil

engineers usually develop systems – buildings and structures – using life-cycle-phased

    processes and inter-discipline coordination.) Each discipline has representative

    engineered products for which it is normally responsible, but many of those products

    are actually system products. Finally, the paper compared each of the ten disciplines to

    a primary SE standard (EIA-632) to determine what value the standard offered to that

    discipline. A primary conclusion was that the defined SE processes usually provide

    significant value, but that value differed depending on the engineering discipline

    involved.

    Toward a mathematical theory of systems engineering management. In Honour

    (2002b), the author provided the underlying formal theory that led to the work in this

    thesis, based on earlier work reported in Honour (2002a). In this theoretical work, the

    author explored the mathematical relationships among cost, schedule, technical value,

    and risk. (Technical value was further hypothesized as being comprised of size,

    complexity, and quality.) Each relationship was handled by examining the end-point

    values in ternary, heuristic combinations. This analysis showed that the end-points

    always devolve to trivial cases that are usually easily evaluated. Between the trivial

    cases always lies some optimum with unknown value. Figure 8 shows one example of

    such a relationship, taking into account the limits of (a) short duration impossibilities

    (left end point) and (b) constant administrative cost for long durations (right end point).

    A primary result of this work was Figure 9, which shows the value relationship against

    Systems Engineering Effort (SEE), a primary research question of this thesis.


[Figure: Expected Cost (C) vs. Expected Duration (D) curves for low, medium, and high technical value V, with an optimum cost/duration between the end points.]

    Figure 8. Theoretical relationship: cost against schedule for different levels of

    technical value (from Honour 2002b)

[Figure: Value (V) vs. SE Effort (SEE) %, showing E(v) for SEE = 0% and E(v) for better parameters.]

    Figure 9. Theoretical relationship: value against SE effort (from Honour 2002b)

2.3 Statistical research related to SE-ROI

This thesis is a direct result of a series of prior research works published by the author

    in pursuit of the theory presented in Honour (2002b). It is also directly related to

    several other statistical works.

    Value of SE – SECOE research progress report. The precursor empirical work to this

    thesis started following the theoretical work of Honour (2002b) and was first reported

    as interim results in Mar and Honour (2002). This was the Phase II work of the 1997

    plan. The work was performed under the auspices of the INCOSE Systems

    Engineering Center of Excellence (SECOE).

    The interim report showed the path used in Phase II, defining statistical parameters of

    cost, duration, SE costs, and SE quality that provided the basis for the survey effort.

    These same parameters have carried into this current research. The interim report also

    established four methods of calculating success that are also used in this thesis: cost

    compliance, schedule compliance, subjective success, and objective technical success.


    Reporting on 25 received surveys, the interim paper provided initial indications that

    closely matched the results later reported in Honour (2004) and in this thesis.

    One significant finding in the interim report, supported in later work, is that the

    correlations improve when the SE percent is modified by a subjective evaluation of SE

    quality. Following this realization, all graphs reported correlations against a modified

    ‘SE Effort,’ in which the SE percent is factored based on SE quality.

    Understanding the value of SE. The Phase II work was reported complete in Honour

    (2004) with a total of 43 surveys received. This work has been quoted and referenced

    widely to show the basic value of SE, including in the INCOSE Systems Engineering

    Handbook (INCOSE 2010).

    The surveys gathered anonymous data on the statistical parameters of cost

    (planned/actual), duration (planned/actual), SE costs, SE quality, objective technical

    success, and comparative subjective success. The data collected had sufficient

    variation in SE effort, SE quality, cost, and schedule to provide a good statistical basis

    for results. Figure 10 shows the variability in SE Effort as a percent of the total project.

[Figure: histogram of Number of Projects (0-10) by SE Effort % = SE Quality * SE Cost/Actual Cost (0-25%).]

    Figure 10. Histogram of submissions by SE Effort, % project cost (from Honour 2004)

    The data was subjected to statistical correlation analysis to determine the statistical

    relationship between the SE Effort and the various success measures. Figure 11

    provides the correlation graph of cost compliance versus SE Effort, while Figure 12

    provides the similar graph for schedule compliance and Figure 13 the graph for overall

    subjective success. The graphs showed that all three success measures have a usable

    level of correlation with SE Effort. The optimum level of SE Effort is not well

    determined due to a lack of data in the optimum region, but was reported as 15-20%.
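The analysis behind these graphs is ordinary correlation testing. A minimal sketch of the computation, using made-up data arrays rather than the survey records, with SciPy's pearsonr returning both the coefficient and its two-sided p-value:

    import numpy as np
    from scipy import stats

    # Made-up records: SE effort fraction and actual/planned cost ratio.
    se_effort  = np.array([0.02, 0.04, 0.06, 0.09, 0.12, 0.15, 0.18])
    cost_ratio = np.array([2.40, 1.90, 1.60, 1.30, 1.15, 1.05, 1.00])

    r, p = stats.pearsonr(se_effort, cost_ratio)
    print(f"Pearson r = {r:.2f}, R2 = {r * r:.2f}, p = {p:.4f}")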


[Figure: scatter plot; x-axis: SE Effort = SE Quality * SE Cost/Actual Cost (0–28%); y-axis: Actual/Planned Cost (0.6–3.0)]

    Figure 11. Cost performance as a function of SE effort (from Honour 2004)

[Figure: scatter plot; x-axis: SE Effort = SE Quality * SE Cost/Actual Cost (0–24%); y-axis: Actual/Planned Schedule (0.6–3.0)]

    Figure 12. Schedule performance as a function of SE effort (from Honour 2004)

[Figure: scatter plot; x-axis: SE Effort (0–24%); y-axis: Comparative Success (0.0–10.0)]

    Figure 13. Subjective success as a function of SE effort (from Honour 2004)

    The cost and schedule graphs showed the 5% and 95% probability bounds on the data

    distributions as dotted lines. The variability between the bounds became significantly


    less as the SE Effort increased. This fact shows a significant increase in predictability

    (i.e. reduction in success variance) as the SE Effort increases toward the optimum.

    This paper also reported an analysis of cost compliance correlated to the program size,

    showing that cost overruns appeared to be more prevalent for programs in the range of

    tens to hundreds of millions of dollars, with larger and smaller programs reporting

    smaller overruns.

    It should be noted that Honour (2004) has nothing to report about the technical quality

    of the product systems. There is indication in the interim report (Mar 2002) that

    technical quality was subjectively measured as ‘objective success,’ but that there was

    no correlation observed between objective success and SE Effort.

    Constructive SE cost model (COSYSMO). A doctoral dissertation at the University of

    Southern California (Valerdi 2005) extended the well-established Constructive Cost

    Model (COCOMO) for software development into the field of systems engineering.

    The resulting Constructive Systems Engineering Cost Model (COSYSMO) addressed

    the basic question as to how much SE effort should be allocated for the successful

    development of large-scale systems. Valerdi created a mathematical model based on

    four system size parameters (requirements, interfaces, algorithms, and operational

    scenarios), one scale factor, and 14 effort multipliers. Parameter relationships were

    developed first by expert-level consensus estimation, then by evaluation of gathered

    program data. The basic form of the COSYSMO mathematical model is

\( PM_{NS} = A \cdot (\mathrm{Size})^{E} \cdot \prod_{i=1}^{14} EM_{i} \)     (Equation 1)

in which

PM_NS is effort in person-months (nominal schedule)

A is a calibration constant

Size is the computed size based on the four size parameters

E is a factor for economy/diseconomy of scale (default is 1.0)

EM_i is the effort multiplier for each of the 14 cost drivers


    The dissertation work shows that this form of equation, when applied to various actual

    system development projects, results in consistent calculation of SE effort as used on

    the projects.
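A minimal sketch of Equation 1 in Python follows; the default arguments, the example size, and the calibration constant used in the call are assumptions for illustration only, not Valerdi's calibrated values.

```python
import math

def cosysmo_effort_pm(size: float, A: float, E: float = 1.0,
                      effort_multipliers: list[float] | None = None) -> float:
    """PM_NS = A * (Size)^E * prod(EM_i), i = 1..14.
    A is the calibration constant; E models economy/diseconomy of scale
    (default 1.0); effort_multipliers are the 14 cost drivers."""
    ems = effort_multipliers if effort_multipliers is not None else [1.0] * 14
    return A * (size ** E) * math.prod(ems)

# Illustrative call: assumed size of 100 (computed from the four size
# drivers), assumed A = 0.5, nominal (1.0) effort multipliers.
print(cosysmo_effort_pm(size=100.0, A=0.5, E=1.06))
```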

    By extension, the work offered the COSYSMO model as a means to calculate the SE

    effort that ‘should be’ planned for each project. Careful review of the method used, as

    well as personal discussions with Valerdi, however, shows that ‘should be’ is defined in

    terms of ‘the level of SE used on the sample programs’ rather than by an objective

    measure based on program success. This is an important work in the advancement of

    knowledge about SE value, but the work still does not provide the empirical

    relationship between SE effort and program success.

    ROI of SE for software-intensive systems. A further exploration into the constructive

    cost models (Boehm 2007) tied the extensive data from the COCOMO software model

    into the field of SE. The COCOMO II model included one specific ‘effort multiplier’

    parameter, Architecture and Risk Resolution (RESL), that represented the degree to

    which the software design was subject to front-end architectural analysis. This type of

    effort is indicative of SE activities, although the work acknowledges that it is not a

    complete representation of SE. Through a statistical calculation of the 161 projects in

    the COCOMO II database, the work determined the amount of time added due to RESL

    for different size projects as shown in Figure 14.

Figure 14. SE effort required in COCOMO II for different size systems (from Boehm 2007)


    SE effectiveness. An important extension to the work of Honour (2004) was reported

    in Elm (2008), in which Elm and others performed an extensive survey to obtain more

    detailed information about SE activities and program success. After formal

    development and testing of a survey instrument, the study obtained information from

    64 programs, 46 of which were complete enough to contribute to the statistical work.

    Most survey questions were subjective in nature, but were sufficiently detailed to

    obtain insight into the SE activities. SE was measured by the subjective evidence of

    artifacts and activities defined as part of the survey; success was measured by

    subjective answers on a set of questions about the results of the program. Correlation

    of the SE capability against program success was demonstrated by the success levels

    for three brackets of SE capability as shown in Figure 15. The figure shows that

    programs evidencing higher SE capability demonstrated greater success.

    Figure 15. Correlation of SE capability to program performance (from Elm 2008)

    In the statistical analysis, related questions were combined to represent the level of

    capability in specific areas of SE activity. Each SE activity was then checked for

    correlation against program performance, with statistical measurement of the degree of

    correlation using a Gamma test. The results are shown in Figure 16. Nearly all

    activities tested showed a positive correlation. The negative correlation to Monitor &

    Control was perceived to be an inverse causal relationship, in that programs with

    performance difficulty tend to receive greater management monitoring and control.

    The significant result in this effort is to show the ranking of which SE activities have

    stronger correlation.
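The Gamma test referred to here is the standard Goodman-Kruskal gamma over ordinal pairs. The sketch below is written from the textbook definition, not from Elm's actual implementation, and the example data are invented.

```python
from itertools import combinations

def goodman_kruskal_gamma(x, y):
    """Gamma = (C - D) / (C + D), where C and D count concordant and
    discordant pairs; tied pairs are excluded, per the standard definition."""
    concordant = discordant = 0
    for (x1, y1), (x2, y2) in combinations(zip(x, y), 2):
        s = (x1 - x2) * (y1 - y2)
        if s > 0:
            concordant += 1
        elif s < 0:
            discordant += 1
    if concordant + discordant == 0:
        return 0.0  # all pairs tied
    return (concordant - discordant) / (concordant + discordant)

# Example: ordinal SE-capability brackets vs. performance ratings -> 0.75
print(goodman_kruskal_gamma([1, 1, 2, 2, 3, 3], [1, 2, 2, 3, 2, 3]))
```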


    Figure 16. SE capabilities correlate with program performance (from Elm 2008)

    In another statistical analysis of the data, Elm explored how the program challenge

    (PC) affected the basic correlations by segmenting the program data into ‘low

    challenge’ and ‘high challenge’ programs. The results in Figure 17 show that the

    correlation between performance and SE Capability (SEC) is less for high challenge

    programs.

    Figure 17. SE capability has less effect for high challenge programs (from Elm 2008)

2.4 Summary and findings of prior results

Work on the value of SE has made significant progress in the past decade, beginning to

    build a basis of data that supports informed management decisions. Prior research has

    indicated a number of findings, but the findings do not all have equal validity.


    Front-end program work. Several reports indicate that front-end program work at

    about 15% of the total program cost minimizes the cost overrun during system

    development.

    Effect of SE on program success. Multiple reports show various forms of subjective

    linkage between SE activities and program success, always indicating that greater SE

    (than currently used) leads to better success. As with front-end program work, the prior

    research also shows that SE effort at about 15% of the total program cost minimizes the

    cost overrun. It has also been shown that the quality of that SE effort matters.

    Technical leadership. Two reports show that better technical leadership is a strong

    indicator of better success. Ancona and Caldwell (Ancona 1990) focused on the

    aspects of leadership involving “boundary management,” the technical interfaces

    between the development team and external entities. The large engineering projects

    study (Miller 2000) focused on the technical organization and coordination of a project

    team as part of the larger strategic management.

    Optimal levels. Theoretical work shows that there should exist an optimum level of

    SE effort; although most programs seem to operate below that optimum, too much SE

    effort would also be detrimental.

    Program size. One study showed that the worst cost overruns appear to occur with

    programs of mid-range size, on the order of $100 million. Another study showed that

    larger software-intensive programs need a greater level of SE effort.

Parametric estimation of SE effort. The COSYSMO effort shows that parametric estimation of SE effort can create consistent predictions using a multiplicative formula of size and subjective parameters.

    Program challenge. One study showed that the correlation between SE effort and

    program success was greater for low-challenge programs and lower for high-challenge

    programs.

    The findings from prior research indicate that this SE-ROI project provides a

    significant new advance in the knowledge about value of SE. The advance is

    specifically in the area of providing management decision information as to how much

    and what kind of SE is indicated for best program success. Table 1 shows the specific

    advances this SE-ROI project makes against the prior research, including a series of

    findings not covered at all by prior work.


Table 1. Advances of this research against past work

Prior finding – Front-end program work: front-end work at about 15% of total program cost minimizes the cost overrun.
SE-ROI advance: SE-ROI provides specific information about the SE activities throughout the system development, rather than just the front-end work, which includes SE and many other activities.

Prior finding – Effect of SE on program success: subjective linkage is shown that greater SE leads to better success; SE effort at about 15% of total program cost minimizes the cost overrun; the quality of the SE effort contributes to the relationship.
SE-ROI advance: SE-ROI provides proven empirical correlation between SE, its subordinate activities, and program success. SE-ROI shows specific empirical evidence that the optimum is 14.4% total SE for a median program. It also provides optimum values for eight subordinate SE activities, as well as a means to pre-calculate the optimum values for a given program based on its program characteristics. SE-ROI shows that the quality of each of the eight subordinate activities also matters.

Prior finding – Technical leadership: better technical leadership is a strong indicator of program success.
SE-ROI advance: SE-ROI provides specific empirical values for the correlation between technical leadership and program success, showing that 3.9% technical leadership effort (of total program cost) is optimum for a median program. SE-ROI shows that technical leadership/management is unique among the eight subordinate SE activities in that it provides optimum program success simultaneously in cost, schedule, and stakeholder acceptance.

Prior finding – Optimal levels: theory showed that an optimum level of SE effort must exist.
SE-ROI advance: SE-ROI provides specific values of the optimum for total SE (14.4%) and for each of eight subordinate SE activities.

Prior finding – Program size: worst cost overruns occur in programs of mid-range size, on the order of $100 million.
SE-ROI advance: SE-ROI showed that system size (not program size) is the most significant confounding factor in correlating SE activity to program success, and defined program size through a combination of nine parameters.

Prior finding – Parametric estimation of SE effort: COSYSMO provides a consistent methodology to estimate SE effort based on the effort used by other programs.
SE-ROI advance: SE-ROI provides a consistent methodology to estimate optimum SE effort based on program success.

Prior finding – Program challenge: correlation between SE effort and program success is greater for low-challenge programs.
SE-ROI advance: SE-ROI provides empirical estimation of the amount of SE effort required based on the level of technology risk.

Findings not covered in prior work:
SE-ROI shows a significant, quantifiable return on investment for SE activities, usually 3.5:1.
SE-ROI found no correlation between SE and system technical quality.
SE-ROI found that programs typically use less SE effort than is optimum for best success.
SE-ROI provides a quantified list of program characterization parameters that affect the relationship between SE activity and program success.
SE-ROI demonstrated that there is a common ontology of SE that is sufficient to be meaningful.
SE-ROI demonstrated that it is possible to effectively quantify SE effort using empirical data.
SE-ROI demonstrated that it is possible to obtain meaningful data about SE and success through program proprietary boundaries.

    Most of the prior research provides only anecdotal information about the value of SE,

    with subjective indications that it has value. Some prior work indicates the quantified

    gain that may be obtained with SE activities (30% cost reduction, 50-70% time

    reduction), without any indication of how much or what kind of SE is required to

    obtain these gains. Other work provides some quantifiable decision material (worst

    overruns occur for programs at about $100M size, parametric calculation of SE effort

    works), but again without evaluation of the objective value obtained by using SE

    activities. The COSYSMO work provides an excellent tool to predict the amount of SE

    on a program, but its results are not tied to the program success. The only quantified

    information about SE-ROI is obtained from the Gruhl and Honour studies, which

    provide indication that total SE (or total front-end work) should be on the order of 15%

    of the program cost to minimize cost and schedule overruns. While useful for

    management planning, these results give only the highest-level statistical indication.

    The next chapter describes the research method used to take these indications down to a

    much deeper level, with greater resolution and with greater completeness.


    3 Research design

    The SE-ROI research required a formal design to ensure that the results would be

    acceptable to the wider SE community. This chapter describes in detail the design of

    the research, including the research questions explored; the research activities and their

    order; the guidance, participation, and organization of the research project; and the

    ethics considerations. Findings are offered based on the research design work.

3.1 Research questions

Based on the background work of Chapter 2, the primary research questions of the SE-

    ROI project are:

    (RQA) Is there a quantifiable correlation between the amount, types and

    quality of systems engineering efforts used during a program and the success

    of the program?

    (RQB) For any given program, can an optimum amount, type and quality of

    systems engineering effort be predicted from the quantified correlations?

    Several terms in these questions require more definition. These are:

    Program – Each program sought as a data point is a system development program that

    starts with an operational concept and ends with the first prototype system.

    Systems engineering effort – The scope of systems engineering effort to be considered

    is based on an analysis of the existing standards (Honour & Valerdi 2006) that

    demonstrates the widely-agreed categories of mission/purpose definition, requirements

    engineering, system architecting, system implementation, technical analysis, technical

    leadership/management, scope management, and verification/validation.

    Amount – Systems engineering effort is quantified herein in terms of the cost of SE

    effort applied as a fraction of the total program cost (SE%). As shown in Mar &


    Honour (2002), however, this must also be qualified by a measure of the quality of the

    effort applied.

Type – This research explores the eight categories of SE effort found in Honour &

    Valerdi (2006) as definitions of ‘type.’ Statistical correlation of each type against

    program success is sought.

Quality – The interview participants rate the quality of the systems engineering effort on a subjective scale.

    Success – The success of a program is measured herein by four separate success

    parameters: (a) cost compliance with plan, (b) schedule compliance with plan, (c)

    overall subjective success, and (d) technical performance against quantifiable key

    performance parameters (KPP).
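For concreteness, the four success measures could be recorded per program roughly as in the sketch below; the field names and scales are illustrative assumptions, not the actual interview instrument fields.

```python
from dataclasses import dataclass

@dataclass
class ProgramSuccess:
    """One interviewed program's four success measures (illustrative)."""
    cost_compliance: float       # actual cost / planned cost (1.0 = on plan)
    schedule_compliance: float   # actual duration / planned duration
    subjective_success: float    # comparative overall rating, e.g. 0-10
    kpp_compliance: float        # technical performance against the KPPs
```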

3.2 Research activities

Prior related research work by the author provided a literature review, definitions of

    theoretical mathematical concepts, effective methods to obtain data, preliminary data

    that validates the methodology, and an ontology of systems engineering categories

    suitable for measurement. Therefore, the research work covered in this thesis started at

    a more advanced level than many others.

Table 2. Research activities prior to and during this thesis effort

Prior Research Work (“Value of SE”):

Definition of a three-phase concept to quantify the value of SE

Implementation of Phase I, theoretical exploration of quantified relationships

Implementation of Phase II, informal survey to obtain initial indications

Thesis Research Work (“SE-ROI”):

Implementation of Phase III, detailed program interviews to obtain extensive empirical data

Statistical analysis of the empirical data to determine relationships and findings

Determination of findings

    The research worked through the following five major activities, which often

    overlapped, in some cases supporting each other simultaneously.

    Research organization

    Technical structuring

    Data gathering

    Data analysis

    Reporting


    The general plan for the research was published as a developmental paper in Honour

    (2006a). That paper is included herein as Appendix C.2. The following material

    expands on the general plan in that paper.

3.2.1 Research organization

The research organization activity provided the underlying organization and structure

    for the research. Tasks that were part of this activity included:

    Creation and maintenance of research plans

    Development of the Research Advisory Group (See Section 3.3.1.)

    Monthly status reporting to the Research Advisory Group and to UniSA.

    This activity started with the first creation of a research plan in October 2005 and

    completed its creation/development tasks in October 2006. Maintenance and reporting

    efforts continued throughout the research.

3.2.2 Technical structuring

The technical structuring activity provided the technical concepts and data structures

    necessary to start data gathering. Work in this activity involved the members of the

    Research Advisory Group and the UniSA supervisors. During this activity, the

    researcher was the primary worker while coordinating ideas and results with the

    advisory group and supervisors. The work created concepts and structures that are a

    consensus product of the advisory group. Specific goals for the activity were to create:

    Technical correlations to be tested by the research

    Data structures to obtain the necessary source data

    Access to real programs

    The technical structuring activity started in late 2006 as the Research Advisory Group

    was assembled. It continued through the initial stages of data gathering, to allow

    modification of the technical concepts and data structures based on initial data.

Technical correlations to be tested. The basic research hypotheses are stated in Section 5.1.1.

    The intent of data gathering was to quantify the hypotheses in three dimensions:

    Program success, measured in cost compliance, schedule compliance, overall

    success, and technical quality.


    Systems engineering effort, measured in effort costs against the program total cost,

    in each of eight categories.

    Program characterization values (size, complexity, quality) that parameterize the

    expected correlation of systems engineering effort with program success.

    The primary correlations included the following:

    Percent total SE effort against program success in each of cost, schedule, overall

    success, and technical quality (4 correlations).

    Percent subordinate SE activity effort (8 activities) against program success in each

    of cost, schedule, overall success, and technical quality (32 correlations).

    Each of the 36 correlations was tested against the program characterization parameters

    to determine whether the correlations are improved by adjustment against the

    characterization parameters. Possible characterization parameters to be tested were

    drawn from the work on COSYSMO (Valerdi 2004) and the experience base of the

    Research Advisory Group.
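The 36-correlation grid can be made concrete by enumerating it. The sketch below uses the eight activity categories from Honour & Valerdi (2006) and is purely illustrative of the experimental design, not code used in the research.

```python
from itertools import product

activities = [
    "mission/purpose definition", "requirements engineering",
    "system architecting", "system implementation",
    "technical analysis", "technical leadership/management",
    "scope management", "verification/validation",
]
success_measures = ["cost", "schedule", "overall success", "technical quality"]

# 4 total-SE correlations plus 8 x 4 = 32 subordinate-activity correlations
grid = [("total SE", m) for m in success_measures]
grid += list(product(activities, success_measures))
assert len(grid) == 36
```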

    Data structures. Based on the technical correlations to be tested, the researcher

    guided and coordinated the efforts of the Research Advisory Group to define an

    effective set of data to be gathered. Section 4.1 describes the technical structuring of

    the data, starting with the research questions and continuing into the data necessary to

    support the research questions. The purpose of this task was to define data structures

    that could reasonably be attained during an interview and that capture the information

    necessary to explore the desired technical correlations.

    Access to programs. It was the responsibility of the researcher to identify programs

    and to negotiate access to those programs through key management individuals.

    Section 4.2 describes the approach used to obtain access to programs. As a result of the

    technical structuring work, some members of the Research Advisory Group gained

    sufficient interest in the research to assist in providing access within their

    organizations. Other program interviews occurred through the direct contacts and

    efforts of the researcher.

    The research required a sufficient number of programs to support the statistical

    correlations desired in the technical structure. The greatest challenge of the SE-ROI

    research, as it was for prior projects including COSYSMO, was to obtain data from a


    sufficient number of programs. To this end, the researcher made frequent contacts with

    industry and government individuals seeking access to the necessary program data.

3.2.3 Data gathering

The data gathering activity obtained data from programs in accordance with the data

    structures defined in the technical structuring activity. See Section 4.5 for the interview

    methods used in data gathering and Section 4.6 for the demographic description of the

    programs interviewed. This activity started in March 2007 on the initial completion of

the technical structuring activity and continued until there was sufficient data for

    completion, in September 2009.

Obtaining data from programs was time-consuming. Although individual interviews

    were short in research terms, the political work to obtain the interviews took many

    months and considerable personal contact.

3.2.4 Data analysis

The data analysis activity used the data gathered to seek correlations that support the

    hypotheses. This activity used statistical methods as described in Chapter 5. As each

    set of data was obtained, the statistical analysis was extended based on the quality and

    quantity of the total data set. Initially, there was insufficient data to reliably support

    any correlation. With a few data sets, high-level correlations were attempted. As the

    number of data sets increased, more correlations were attempted in accordance with

    ‘design of experiments’ methods. The interim results were provided to participating

    organizations. This activity started when a few data sets were obtained in September

    2007 and completed when the statistical correlations were sufficient to support or reject

    the primary hypotheses in March 2011. Further data analysis continued to extend the

    results into secondary hypotheses and SE cost estimation methods, completing in

    November 2011.

3.2.5 Reporting

The reporting activity included the generation of interim and final technical reports, in

    the forms of:

    A public website with summary information.

    An organization of SE practices that was vetted by the Research Advisory Group,

    published as an interim technical paper. (See Appendix C.1)


    Interim analysis results prepared as internal data and distributed to the Research

    Advisory Group.

    Benchmark reports prepared as written reports to each participating organization.

    The reports included specific data from the organization’s interviewed programs,

    compared with aggregate data from the research as a whole.

    Interim technical conference papers disseminating various aspects of the research as

    it progressed. (See Appendix C.)

    Final results in the form of this technical thesis to UniSA.

    Final results offered for publication as journal-level technical papers.

    The reporting activity started from the beginning of the project in February 2006 and

    continued through 2012.

3.3 Guidance and participation

The SE-ROI research required information that could only be obtained from system

    development organizations, and the collection of that data has proven difficult in past

    research. To ensure a high level of acceptance of the research results, the research

project sought guidance and participation from an unusually rich selection of senior people: a Research Advisory Group, numerous staff in the Defence and Systems Institute at UniSA, and senior representatives of the participating organizations.

3.3.1 Research advisory group

A Research Advisory Group was created to participate in developing the research

    methods and interview instruments. The Research Advisory Group:

    Provided general acceptance of the data organization,

    Built public interest in the research and its expected results, and

    Provided access to real programs in the group’s parent organizations.

    The Research Advisory Group comprised volunteer individuals who expressed an

    interest in the research and a willingness to participate in its development. The

    formation of this group followed the successful methods used on the COSYSMO

    project and documented by Valerdi (2004). The Research Advisory Group used virtual

    collaboration methods (email reflector, web-based repository, web-enabled


    presentations, teleconferencing) coupled with face-to-face working meetings (in

    conjunction with conferences). The group assisted the researcher to do the following.

    Create a high-level ontological structure for SE practices to act as a basis for the

    research hypotheses and data.

    Create the structure and format of data gathering to support the intended

    hypotheses.

    Facilitate access to data from programs within their parent organizations.

    Review and discuss interim results to provide consensus guidance to the researcher.

    (Interim results were carefully protected to guard the security of the source data.

    Members of the advisory group came from many competing organizations.)

    The Research Advisory Group grew and changed in membership during the research.

    In its final version, the group included 66 representatives from 59 organizations

    comprising 31 system development companies,