Using Parametric Software Estimates During Program Support Reviews

Chris Miller, Office of the Deputy Director, Software Engineering and System Assurance

Systems & Software Engineering
Office of the Deputy Under Secretary of Defense for Acquisition and Technology
US Department of Defense

October 2008
Version 2.0
OUSD (AT&L) Organization (May 2006)

Organization chart (flattened):
• USD, Acquisition, Technology & Logistics
• DUSD, Acquisition & Technology
• Dir, Joint Advanced Concepts
• Dir, Systems and Software Engineering
• Dir, Portfolio Systems Acquisition
• Industrial Programs
• Defense Procurement and Acquisition Policy
• Small Business Programs
• Defense Contract Management Agency
• Defense Acquisition University
Systems and Software Engineering

Director, Systems & Software Engineering
• Deputy Director, Enterprise Development
• Deputy Director, Developmental Test & Evaluation
• Deputy Director, Software Engineering & System Assurance
• Deputy Director, Assessments & Support
Elements of a DoD Strategy for Software
• Support Acquisition Success
– Ensure effective and efficient software solutions across the acquisition spectrum of systems, systems of systems (SoS), and capability portfolios
• Improve the State-of-the-Practice of Software Engineering
– Advocate and lead software initiatives that improve the state of the practice through transition of tools, techniques, etc.
• Leadership, Outreach and Advocacy
– Implement, at Department and national levels, a strategic plan for meeting Defense software requirements
• Foster Software Resources to Meet DoD Needs
– Enable US and global capability to meet Department software needs in an assured and responsive manner

Promote World-Class Leadership for Defense Software Engineering
Notional View of Software Measurement

Software Engineering and Systems Assurance (SSA) initiatives:
• Program feasibility analysis using estimation models
• Software Resources and Data Report: Feasibility Study
• Revision of MIL-HDBK-881A to improve software guidance
• Integration of software metrics with EVM to assess consistency of estimates

Lifecycle phases: Concepts – Requirements – Arch/Design – Development – Maintenance

Measurement activities spanning these phases:
• Determine methods of obtaining cost estimating data
• Use estimation tools, techniques, processes, & practices
• Generate SW-appropriate WBS
• Earned Value Management (EVM) for SW
• Link SW quality indicators to EVM
Driving Technical Rigor Back Into Programs: "Program Support Reviews"
• Program Support Reviews provide insight into a program's technical execution, focusing on:
– SE as envisioned in the program's technical planning
– T&E as captured in the verification and validation strategy
– Risk management: integrated, effective, and resourced
– Milestone exit criteria as captured in the Acquisition Decision Memorandum
– Acquisition strategy as captured in the Acquisition Strategy Report
• Independent, cross-functional view aimed at providing risk-reduction recommendations

The PSR reduces risk in the technical and programmatic execution of a program
Program Support Review Activity (as of August 2008)
• PSRs/NARs completed: 57
• AOTRs completed: 13
• Nunn-McCurdy certifications: 13
• Participation on Service-led IRTs: 3
• Technical assessments: 13

Decision Support Reviews: Pre-MS B 26%, Other 25%, Pre-MS C 19%, Nunn-McCurdy 13%, OTRR 9%, DAE Review 5%, Pre-MS A 3%

Service-Managed Acquisitions: Air Force 33%, Army 27%, Navy 21%, Marine Corps 10%, Agencies 9%

Programs by Domain Area: Rotary Wing 21%, Fixed Wing 18%, Land 13%, C2-ISR 13%, Missiles 9%, Ships 8%, Unmanned 4%, Space 4%, Munitions 3%, Comms 3%, Business 2%, Other 2%
DoD Software Performance: What We're Seeing*
• Software issues are significant contributors to poor program execution:
– Schedule realism (compressed, overlapping)
– Software requirements not well defined, traceable, or testable
– Immature architectures, COTS integration, interoperability, obsolescence (electronics/hardware refresh)
– Software development processes not institutionalized; planning documents missing or incomplete; reuse strategies inconsistent
– Software test/evaluation lacking rigor and breadth
– Lessons learned not incorporated into successive builds
– Software risks/metrics not well defined or managed

*Based on over 60 program reviews over the past 3½ years
Software Engineering and System Assurance (SSA) Role
• SSA produced software estimates to support several PSR teams in 2007
– Program A: SSA was asked to assess software schedule feasibility prior to MS B
– Program B: Significant software issues
• Opportunity for SSA to support program decision making by providing software estimates
– Estimation activities aimed at gauging overall program feasibility and quantifying the magnitude of top program risks
– Focus on support for engineering vs. budgeting decisions
Program A
• PSR of an aircraft program with an ambitious schedule
– Three years (36 months) from Milestone B to LRIP
– Modifications needed to meet U.S. requirements
• Developed three software estimates
– Software size (SLOC) estimates provided by the program office
– Re-estimated both new and reused code
» Based on reuse and code growth
» Optimistic, typical, and pessimistic estimates identified as "Min", "Mid", and "Max"
– Used three parametric models
» COCOMO II, SLIM, and SEER-SEM
– Most input parameters set at nominal
Program A
• Estimation & Feasibility Analysis
– Given an adjusted code size (ASIZE) of 918K to 1,590K SLOC, a range of 5,375 to 9,571 person-months should be expected over a 62- to 74-calendar-month schedule
– All three models forecast 65 to 68 months (assuming a 50% confidence level)
• Result:
– Analysis revealed the existing acquisition strategy was not feasible
– The Service added more schedule to the acquisition strategy
– DUSD(A&T) estimates the change in acquisition strategy saved $5 billion
COCOMO II       | Min   | Mid   | Max
ASize (KSLOC)   | 918   | 1,204 | 1,590
Effort (PM)     | 5,375 | 7,147 | 9,571
Duration (CM)   | 62    | 68    | 74
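For context, a minimal sketch of the nominal COCOMO II.2000 post-architecture equations, one of the three models used above. The constants below are the published nominal calibration values; the review's actual parameter settings are not given, so this sketch only approximates the table's figures.

```python
# Sketch: nominal COCOMO II.2000 post-architecture equations.
# A, B, C, D and the scale-factor sum are the published nominal
# calibration values, not the review team's actual settings.
A, B = 2.94, 0.91      # effort coefficient and base exponent
C, D = 3.67, 0.28      # schedule coefficient and base exponent
SF_SUM = 18.97         # sum of the five scale factors, all rated Nominal

def cocomo_ii_nominal(ksloc: float) -> tuple[float, float]:
    """Return (effort in person-months, duration in calendar months)."""
    E = B + 0.01 * SF_SUM               # effort exponent (~1.10)
    pm = A * ksloc ** E                 # all effort multipliers = 1.0
    F = D + 0.2 * 0.01 * SF_SUM         # schedule exponent (~0.32)
    return pm, C * pm ** F

for size in (918, 1204, 1590):          # Min / Mid / Max ASIZE, KSLOC
    pm, tdev = cocomo_ii_nominal(size)
    print(f"{size} KSLOC -> {pm:,.0f} PM, {tdev:.0f} months")
# Nominal settings give roughly 5,300-9,750 PM over ~56-68 months; the
# slide's 62-74-month range implies program-specific ratings.
```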
Program B
• A major subcontractor with significant software content did not use any parametric estimating tool
– Firm Fixed Price subcontract
• At one point there was a substantial conflict over estimates between the major subcontractor and the prime, related to requirements
– One portion of software grew from ~250 KSLOC to 863 KSLOC (3.5x), reflecting a "shall not degrade current capability" requirement
• Software engineering staff increased by 57 percent (40 people) over a 9-month period
– Based on the schedule, the contractor should have been drawing down software staff
• Significant variation in code estimates during the review
Program B
Changes in SLOC estimates…

CSCI  | Supplier | Original | NM Review | 2 wks. later | New     | Reuse   | % Complete | Test coverage to-date | DRs/Problem reports
1     | A        | 40,000   | 374,739   | 250,844      | 201,144 | 49,700  | 75%        | 68%                   | 47
2     | B        | 27,000   | 50,363    | 50,363       | 40,363  | 10,000  | 90%        | 85%                   | 0
3     | A        | 20,000   | 38,962    | 38,962       | 38,962  | 0       | 71%        | 60%                   | 5
4     | C        | 325,000  | 415,249   | 401,732      | 131,921 | 269,811 | 98%        | 98%                   | 1
5     | —        | —        | —         | —            | —       | —       | —          | —                     | 5
6     | —        | —        | —         | —            | —       | —       | —          | —                     | 0
7     | D        | 65,000   | 283,255   | 25,467       | 18,367  | 7,100   | 82%        | 60%                   | 0
8     | C        | 33,000   | 373,587   | 100,129      | 68,129  | 32,000  | 99%        | 99%                   | 0
9     | E        | 15,000   | 7,879     | 7,879        | 2,000   | 5,879   | 100%       | 100%                  | 1
10    | F        | 114,000  | 218,609   | 218,609      | 16,409  | 202,200 | 99%        | 95%                   | 0
11    | G        | 26,000   | 25,622    | 34,544       | 8,922   | 25,622  | 100%       | 100%                  | 2
12    | H        | 6,000    | 16,580    | 16,580       | 15,580  | 1,000   | 100%       | 100%                  | 2
13    | H        | 2,000    | 34,806    | 34,806       | 17,206  | 17,600  | 100%       | 100%                  | 0
14    | I        | 33,000   | 42,355    | 42,355       | 31,655  | 10,700  | 99%        | 97%                   | 1
15    | G        | 15,000   | 126,238   | 119,626      | 6,433   | 113,193 | 100%       | 100%                  | 0
16    | K        | 23,000   | 26,404    | 26,404       | 2,604   | 23,800  | 91%        | 91%                   | 1
17    | L        | 7,000    | 12,500    | 12,500       | 1,000   | 11,500  | 100%       | 100%                  | 1
18    | C        | 1,000    | 29,121    | 29,121       | 25,621  | 3,500   | 100%       | 100%                  | 7
19    | M        | NA       | 100       | 1,000        | 1,000   | Unavail | 100%       | 100%                  | 0
20    | C        | NA       | 2,600     | 2,600        | 400     | 2,200   | 81%        | 43%                   | 0
Total |          | 752,000  | 2,078,969 | 1,413,521    |         |         |            |                       |
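The total row alone shows the volatility. A quick arithmetic sketch using only the table's totals:

```python
# Quick arithmetic on the table's total row, showing how volatile the
# size estimates were during the review.
original, nm_review, two_weeks_later = 752_000, 2_078_969, 1_413_521
print(f"NM review vs. original:  {nm_review / original:.2f}x")         # 2.76x
print(f"Two weeks later vs. NM:  {two_weeks_later / nm_review:.2f}x")  # 0.68x
print(f"Net growth vs. original: {two_weeks_later / original:.2f}x")   # 1.88x
```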
Program B Schedule Estimates

[Chart: staff-months (0–1,200) vs. time from detailed requirements complete through end of developmental testing (0–30 months). Curves plotted for 50, 100, and 150 KSLOC, each under nominal and optimal-schedule assumptions. Annotation: 75 KSLOC remaining based on the last estimate.]
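The steepness of those curves is the classic schedule-compression effect. A hedged illustration using Putnam's software equation (one common parametric form; not necessarily the model behind the chart, and the process-productivity value below is a made-up placeholder chosen only to make the shape visible):

```python
# Illustration of the effort-vs-schedule tradeoff using Putnam's
# software equation. PP is a hypothetical process-productivity value,
# not calibrated to Program B.
PP = 10_000  # hypothetical process-productivity parameter

def staff_months(ksloc: float, months: float) -> float:
    """Putnam: Size = PP * Effort^(1/3) * Time^(4/3), Effort/Time in years."""
    effort_years = (ksloc * 1_000 / (PP * (months / 12) ** (4 / 3))) ** 3
    return effort_years * 12

for months in (30, 25, 20):  # 75 KSLOC remaining, per the chart annotation
    print(f"{months} months: {staff_months(75, months):,.0f} staff-months")
# ~130, ~270, ~660: effort scales as 1/T^4, so modest schedule
# compression is very expensive.
```

Compressing from 30 to 20 months multiplies effort by roughly (30/20)^4 ≈ 5x, which is the "significant increase in effort for minimal schedule savings" noted on the next slide.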
Program B
• Estimation and Feasibility Analysis
– Contractor appears to be driven to meet schedules rather than costs
» Significant increase in effort for minimal schedule savings
– Uncertainty in the scope of remaining software development
» Need to develop firmer size estimates
• Recommendations
– PMO reach a decision on unstable requirements, in or out, to prevent further code growth
– Program office bring in a parametric estimating consultant to review the contractor's estimates for the most volatile software components
• Result
– Acquisition Decision Memorandum requires the Service to conduct a software review
Integrating Management Indicators

[Diagram linking Estimation & Planning, Measurement & Analysis, and Risk Management within the program context. Artifacts exchanged: SDP (WBS, IMP, IMS, metric definitions), data & insight, prioritized risks, risk-mitigation status, and the insight needed.]
SSA Software Measurement & Analysis Initiatives
• Software EVM Study/Pilot
– Focus: Development of cost controls for the software component of a program
– Execution: Partnering with DCMA, ARA, PA&E, and contractors
• Cost Estimation / SRDR Feasibility Study
– Focus: Analysis of software cost estimation practices, availability of historical data, and increasing confidence in estimates
– Execution: Partnering with PA&E/DCARC and the Hill AFB STSC to evaluate existing data sources
• Work Breakdown Structure Handbook Update
– Focus: Recommendations for an update of MIL-HDBK-881A to improve the handling of software
– Execution: Partnering with the Services, DCMA, ARA, DAU, NDU, and PA&E/DCARC, and using an NDIA expert panel
Software EVM Study/Pilot
• Development of a methodology combining EVM and software metrics analysis to predict cost and schedule overruns
• Piloting on a 5-year ACAT 1D SW development program as part of a cross-cutting study
• Pilot indicator shows ETC (estimate-to-complete) forecasts for:
– Existing program management plans
– EVM measures
– Software metrics (i.e., growth profiles of size, effort, and defects)

Equivalent ETC forecasts provide increased confidence in project plans
Integrated EVM Concept

[Chart: CPMO Spiral – Sample Summary Performance Measurement Baseline. Cumulative dollars ($K, $0–$1,800) plotted monthly from Oct 07 through Dec 08 for BAC, BCWS (PV), BCWP (EV), ACWP (AC), and ETC; BAC = $1,420K; data date indicated.]

EV metrics (cumulative CPI = 0.79):
• Calculated schedule range: 09/08/08 – 10/12/08
• Calculated estimate-at-complete range: $1,797K – $2,515K

SW metrics:
• Calculated schedule range: 05/14/08 – 06/18/08
• Calculated estimate-at-complete range: $1,570K – $1,710K

Acquisition oversight confidence increases as ETCs overlap, i.e., as multiple measures point to the same conclusion
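As a sanity check on the chart's figures, the standard cost-performance estimate-at-complete formula reproduces the low end of the EV-metrics range (a sketch; the $2,515K upper bound presumably comes from a schedule-adjusted index the slide does not spell out):

```python
# Standard cost-performance EAC applied to the chart's figures. With
# BAC = $1,420K and cumulative CPI = 0.79, EAC = BAC / CPI lands on the
# low end of the EV-metrics range; the $2,515K upper bound presumably
# uses a schedule-adjusted index (e.g., CPI * SPI) not shown here.
BAC = 1420.0   # budget at complete, $K
CPI = 0.79     # cumulative cost performance index

print(f"EAC = ${BAC / CPI:,.0f}K")  # -> $1,797K
```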
SRDR Feasibility Study
• SSA funded an STSC feasibility study on normalizing DCARC data
– The purpose of the study was to determine whether the DCARC data set could be used to develop a parametric software estimation model
– Dr. Randall Jensen at Hill AFB is the creator of Sage and SEER-SEM
• Preliminary study results
– Data was submitted using a wide variety of definitions, which made it difficult to use directly, but the study succeeded in normalizing the data and validating the results
– The data collected in the SRDR forms allows a first-level parametric model to be constructed (lack of environment information prohibits the creation of a detailed model)
– Minimum development time projection model
Minimum Software Development Time Projection Model

Source: Long, L. G. et al., Aerospace Corp Report, 2004

[Chart: schedule vs. size, with an "Impossible Zone" below the minimum-schedule boundary.]

Analysis of CSCI data:
• Apparent maximum size of 200 KSLOC (per CSCI)
• Apparent minimum schedule ("Impossible Zone")
• Environment & complexity are the primary drivers of variation
• Apparent maximum schedule of 50 months
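The "Impossible Zone" idea also appears in other models; COCOMO II, for instance, treats schedules compressed below roughly 75% of the nominal duration as infeasible. A sketch of that style of check (an illustration only, not the Aerospace Corporation model cited above):

```python
# Sketch: flag schedules inside an "impossible zone", using COCOMO II's
# rule of thumb that compression below ~75% of nominal TDEV is
# infeasible. (Illustration only; not the Aerospace Corp model.)
def in_impossible_zone(proposed_months: float, nominal_tdev_months: float,
                       floor: float = 0.75) -> bool:
    """True if the proposed schedule falls below the feasibility floor."""
    return proposed_months < floor * nominal_tdev_months

# e.g., Program A's 36-month plan against a 62-month nominal estimate
print(in_impossible_zone(36, 62))  # True: 36 < 0.75 * 62 = 46.5
```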
Updating MIL-HDBK-881A for Software: Overview
• Revision objectives:
– Revise to make the handbook reflective of software engineering practice
– Correct errors
• Notable changes:
– Replace 'material item' with 'acquisition program'
– Add words to make 'artifacts' equal to 'products'
– Include words that make 'product-oriented' and 'DoDAF architecture' views interchangeable with respect to creating a WBS
• Results
– Compiled a comment matrix
– Drafted a new Appendix B (for software)
• During final review and cleanup, the working group was made aware of a 1990s effort (draft MIL-HDBK-171)
MIL-HDBK-881A Next Steps
• The Software Cost Control working group's goal is to provide a community-coordinated set of recommendations
• Reached out to industry members of NDIA to review the suggested changes
– Ensure recommended changes are improvements from an industry perspective
– Obtain additional examples to include in Appendix B
– NDIA feedback is due on September 19th
• Suggested changes will be submitted to PA&E ARA after NDIA feedback is collected and consolidated into a single baseline
Questions/Discussion
Contact Information:
Chris Miller
SSE/SSA Support
Software Engineering & Systems Assurance
ODUSD(A&T) Systems & Software Engineering
[email protected]