Estimation Perspectives (v5). © 2005 by Richard D. Stutzke.
Estimation Perspectives
Richard D. Stutzke (SAIC)
25 October 2005
Presented at the 20th International COCOMO and Software Cost Modeling Forum
Los Angeles, 25-28 October 2005
Topics
• Measurement and Estimation
• Product Size
• Project Cost and Schedule
• Product Performance
• Product Quality
• Process Performance
• Summary
Reasons to Estimate and Measure

• Product Size, Performance, and Quality
- Evaluate feasibility of requirements
- Analyze alternate product designs
- Determine the required capacity and speed of hardware components
- Evaluate product performance (accuracy, speed, reliability, availability)
- Quantify resources needed to develop, deploy, and support a product
- Identify and assess technical risks
- Provide technical baselines for tracking and controlling

• Project Effort, Cost, and Schedule
- Determine project feasibility in terms of cost and time
- Identify and assess project risks
- Negotiate achievable commitments
- Prepare realistic plans and budgets
- Evaluate business value (cost versus benefit)
- Provide cost and schedule baselines for tracking and controlling

• Process Capability and Performance
- Predict resource consumption and efficiency
- Establish norms for expected performance
- Identify opportunities for improvement
Estimation Uncertainty Typically Decreases with Time
[Chart: estimation uncertainty, expressed as a multiple of the estimated value (0.0 to 5.0), plotted against elapsed project time (0.0 to 1.0). The upper and lower 80% bounds converge toward the zero-error line as the project progresses.]
Total Estimation Life Cycle

[Diagram: the estimation life cycle spans three stages.
BEFORE: define the project (products, process), estimate cost and schedule, and identify and evaluate risks, producing a work breakdown structure, a budget and schedule, and a project plan. Inputs include requirements, environmental and business factors, staff skill and availability, the organization's historical data, and documented estimation models.
DURING: perform planned activities, measure, compare planned and actual values, and re-estimate cost and schedule (Loop 1) from revised inputs as requirements, design, and the environment change, producing status reports and estimates to complete.
AFTER: produce a close-out report and project actuals, calibrate models, and improve the estimation process (Loop 2), feeding enhancements back into the organization's historical data and updated procedures and checklists.]
Ways to Improve Estimation Accuracy
• Before starting the project
- Understand product requirements and architecture
- Choose a suitable production process and supporting tools
- Use a mix of estimating techniques, models, and historical data
- Produce the estimate in a disciplined way

• During the project
- Collect actual measurements (costs, performance, progress)
- Watch for violations of the estimating assumptions
- Watch for changes in key factors (software size, turnover)
- Update the estimate with the latest information

• After the project
- Collect final measurements
- Analyze data and lessons learned
- Capture the new information (calibrate models, update checklists)
- Improve the estimating process (procedures, new tools, training)
Quantities of Interest

• Products and Services Delivered
- Size or amount (created, modified, purchased)
- Performance (capacity, accuracy, speed, response time)
- Quality (conformance to requirements, dependability)
- Price and total ownership cost

• Project
- Effort (direct and indirect, by activity)
- Staff (number, skill and experience, turnover)
- Time (phases, schedule milestones)
- Costs (labor and non-labor)
- Computer resources used for development and test

• Process
- Effectiveness
- Efficiency
- Flexibility
Common Challenges
• Identifying the key factors that affect the quantity of interest

• Defining precise measures
– Operational definitions
– Affordable
– Appropriate scale ([Stevens, 1948], [Zuse, 1997])

• Quantifying the influence of the key factors (valid models)

• The “object” evolves during development
– The “object” can be the product, the project, or the process
– The final object is not the one initially conceived
Product Size
Labor Dominates Software Project Cost
• The concept:
Effort [phr] = Size [SLOC] / Productivity [SLOC/phr]*

• The challenges
– Choice and definition of the size measure
– Defining productivity
– Identifying all the pieces

*Productivity = (product size) / (resources expended)
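The concept above can be made concrete with a minimal sketch. All of the numbers here are hypothetical, and 152 person-hours per person-month is only one common planning assumption; substitute your organization's own figures.

```python
def effort_person_hours(size_sloc, productivity_sloc_per_phr):
    """Effort [phr] = Size [SLOC] / Productivity [SLOC/phr]."""
    return size_sloc / productivity_sloc_per_phr

size = 20_000          # SLOC (assumed)
productivity = 2.5     # SLOC per person-hour (assumed)

effort_phr = effort_person_hours(size, productivity)
effort_pm = effort_phr / 152   # assumed person-hours per person-month
print(f"{effort_phr:.0f} phr = {effort_pm:.1f} person-months")
```

The arithmetic is trivial; the hard part, as the challenges above note, is defining "size" and "productivity" consistently with the historical data used to calibrate them.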
Physical Architecture for a Monitoring Application
[Diagram: data flows from Sensors through a Front End to a Mainframe, and from the Mainframe to a Display.
Front End: device drivers, polling logic, communications management.
Mainframe: executive (control logic); algorithms for data validation, data smoothing, and trend analysis; data management; report generator; GUI services; communications management.
Display: display generation, GUI services, communications management.]
Software Architecture for a Monitoring Application
Control
- Executive
- Polling Logic(?)
- Self-test
- Data Archiving
System Services
- Communications Mgt.
- Math and Statistics (e.g., LSQ fitting)
- GUI Services
- Data management (RDBMS)
Report Generation
- Report Generation
- Display Generation
Data Analysis
- Data Validation(?)
- Data Smoothing
- Trend Analysis (threshold detect)
Data Acquisition
- Device Drivers
- Polling Logic(?)
- Data Validation(?)
- Units Conversion (scaling, clipping)
Software Product
*The items followed by “(?)” may be allocated to different areas depending on various considerations.
Tooling
- Emulators
- Stubs
- Test Drivers
- Test Data Preparation
- Test Data Analysis
- Scripts
Productivity Is Slippery
• Common Issues
– What is being counted? (new, modified, prototypes, tooling, breakage)
– How is size defined? (Example: a line of source code)
– What activities and phases are covered? (CM? QA? RA? I&T?)
– How formal is the process? (reviews, documentation)
– How skilled is the staff?
– How many person-hours per person-month? (overtime, days off)

• Specific Challenges
– Software (all components unique, “exceptional” functions overlooked)
– Hardware (components and assemblies)
– Services (installation, training)
– Documents (originals, copies)
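The person-hours-per-person-month question above is worth a back-of-the-envelope check. Every calendar figure below is an assumption chosen for illustration; replace each with your organization's own data.

```python
# All calendar figures below are illustrative assumptions.
weekend_days = 104
holidays = 10
leave_days = 15
work_days_per_year = 365 - weekend_days - holidays - leave_days  # 236 days

hours_per_day = 8.0
overtime_fraction = 0.05   # assumed average (often unreported) overtime

phr_per_person_month = (work_days_per_year * hours_per_day
                        * (1 + overtime_fraction)) / 12
print(f"{phr_per_person_month:.0f} person-hours per person-month")
```

Small changes in these assumptions shift the conversion by 10% or more, which is exactly why mixing productivity numbers from organizations with different conventions is risky.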
Migration of Reused Code to the “Dark Side”
[Chart: estimated size (0 to 300 KSLOC) versus elapsed time (relative units, 0.0 to 1.0). Total size grows beyond the original total as additional new code is added, while code initially counted as copied or unmodified migrates into the modified and new categories.]
Project Cost and Schedule
The Precursors of Estimation
• Customer’s needs and operational environment
• Products and services to be delivered
• Production process to be used
• Project goals and constraints
• Estimate’s purpose and constraints
• Applicable estimating techniques and tools
Project Information Flow

[Diagram: planning flows into tracking. The specification, production process, product architecture, and contract (SOW, CLINs) establish the technical baseline. From it, the WBS (work packages) drives a resource-loaded network that yields the cost baseline, the master schedule (analyze, design, code, test, review, deliver) that yields the schedule baseline, the earned value plan (BCWS, BCWP, ACWP), and technical performance measures (TPMs). During tracking, actuals (project data, engineering data, procurement data) are compared against these baselines.]
Estimating Techniques*
• Expert Judgment (Delphi or PERT)
• Analogy (“nearest neighbors”)
• Scaling (additive, multiplicative)
• Top-Down
• Bottom-Up
• Parametric Models
*See comparison in Chapter 22 of [Boehm, 1981].
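As a small worked example of the first technique, the classic PERT calculation combines three expert judgments (optimistic, most likely, pessimistic) into a beta-distribution point estimate. The task values below are hypothetical.

```python
def pert_estimate(optimistic, most_likely, pessimistic):
    """Classic PERT point estimate and standard deviation:
    E = (a + 4m + b) / 6, sigma = (b - a) / 6."""
    expected = (optimistic + 4 * most_likely + pessimistic) / 6
    std_dev = (pessimistic - optimistic) / 6
    return expected, std_dev

# Hypothetical task estimates in person-weeks.
e, s = pert_estimate(4, 6, 14)
print(f"expected = {e:.1f} pw, sigma = {s:.2f} pw")
```

Note how the skewed pessimistic value pulls the expected effort above the most likely value, which is one reason "most likely" single-point estimates tend to run optimistic.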
Usage of Estimating Techniques Changes
[Chart: percentage use of each method (0 to 100%) across the product's life-cycle phases (Conception, Elaboration, Construction, Transition, Operation), for expert judgment, analogies, bottom-up estimation (using the WBS), parametric models, and extrapolation of actual data.]
Product Performance
Reasons to Estimate Product Performance
• Evaluate feasibility of specified performance
• Establish bounds on expected performance
• Identify the type and amount of computer resources needed
– “Capacity planning”
– System sizing
• Identify significant design parameters
• Provide information when measurement is not possible
– Product not yet built
– Product is inaccessible (e.g., operates 24 x 7 x 52)
– Measurement is too dangerous or expensive
Typical TPMs for Computer Systems
Accuracy
- Correctness (corresponds to the real world)
- Adequate precision (resolution)

Dependability
- Reliability
- Availability
- Probability of failure on demand

Speed
- Response time (GUI)
- Execution time (per function, transaction, etc.)
- Data transfer rates

Resource Consumption
- CPU usage
- Storage (memory, disk)
- Transmission channel usage (I/O, LAN)
- Peripherals
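Of the dependability TPMs above, steady-state availability is the most direct to compute, since it follows from two measurable quantities. The MTBF and MTTR values below are hypothetical.

```python
def availability(mtbf_hours, mttr_hours):
    """Steady-state availability = MTBF / (MTBF + MTTR)."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

# Hypothetical: a failure every 500 h on average, 2 h to restore service.
a = availability(500.0, 2.0)
print(f"availability = {a:.4f}")
```

The same two knobs suggest two distinct improvement strategies: make failures rarer (raise MTBF) or make recovery faster (lower MTTR).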
Resource Usage versus System Performance
[Diagram: the software algorithms, the assumed operational profile, and the platform capacity and configuration determine the estimated resource usage and estimated system performance; the actual operational profile determines the actual resource usage and actual system performance. These in turn drive hardware costs, mission success, and user acceptance.]
Estimation and Measurement Challenges
Platform characteristics
- Multiple processors
- Memory caching
- Configurable COTS components (e.g., buffer size)
- “Hidden” factors (the operating system's scheduler)
- Internal workload (other concurrent processes)

External environment and workload
- Number of users
- External stimuli (probabilistic events, multiple scenarios)

Algorithms
- Use of loops, iteration, searches, or sorts
- Suitability for the assumed workload (scalability)
- Choice of parameters (e.g., step size and convergence criteria)
- Poorly understood or not yet identified

Relating component behavior to system behavior
- Complex (and possibly unknown) interactions of components
- Many states and modes (dynamic load balancing, fault tolerance)
Possible Performance Analysis Techniques
• Order-of-magnitude calculations
• Analytic models
• Simple queueing results
• Bottleneck analysis
• Semi-analytic models (e.g., PDQ by Gunther)
• Queueing network models
• Simulation models
• Smart stubs
• Benchmark runs on a similar system
• Measurement of the actual system
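As an example of the "simple queueing results" entry, the mean response time of a single open M/M/1 server is R = S / (1 - U), where S is the mean service time and U the utilization. This is an idealization (Poisson arrivals, exponential service, one server); the workload numbers below are hypothetical.

```python
def mm1_response_time(arrival_rate, service_time):
    """Mean response time of an open M/M/1 queue:
    R = S / (1 - U), with utilization U = lambda * S."""
    utilization = arrival_rate * service_time
    if utilization >= 1.0:
        raise ValueError("queue is unstable (utilization >= 1)")
    return service_time / (1.0 - utilization)

# Hypothetical workload: 40 transactions/s, 20 ms mean service time.
r = mm1_response_time(40.0, 0.020)
print(f"mean response time = {r * 1000:.0f} ms")
```

Even this crude model conveys the key nonlinearity: at 80% utilization the response time is already five times the bare service time, and it diverges as utilization approaches 100%.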
Comparison of Options
(Estimation techniques: Analytic Models, Simulation. Measurement techniques: Smart Stubs, Benchmark Runs.)

Characteristic | Analytic Models | Simulation | Smart Stubs | Benchmark Runs
Choice of Performance Measures | Limited | Many | Limited | Limited
Level of Detail | Limited | Arbitrary | Varies | Varies
Accuracy (Realism and Fidelity) | Approximate | Good | Good | Perfect
Steady State Behavior | Yes | Yes | Yes | Yes
Transient Response | Limited | Yes | Yes | Yes
Range of Off-Nominal Conditions | Some | Wide | Some | Limited
Effort to Develop and Validate the Model or Tool | High to Extra High | Low to High | Low | Not Applicable
Effort to Use the Model or Tool (set up, configure, calculate, measure) | Low | Low to Very High | Nominal to Very High | Medium to Very High
Performance Estimation is a Cyclic Process
[Diagram, adapted from Figure 2-1 in [Smith, 2002], used with permission. Performance estimation cycles through these steps:
1. Define quantitative performance objectives.
2. Identify critical use cases.
3. Select key performance scenarios.
4. Construct performance model(s).
5. Verify and validate the models.
6. Run and evaluate the performance model(s).
7. Assess performance risk.
If performance is acceptable, proceed to the next iteration. If performance is unacceptable but a fix is feasible, modify the product architecture and design or modify/create scenarios, then repeat. If a fix is infeasible, revise the performance objectives.]
Product Quality
Quality Factors Vary by System and Stakeholder
[Chart: ratings from 0 to 100 for the quality factors Correctness, Performance, Dependability, Usability, Efficiency, Supportability, Maintainability, Interoperability, and Portability, comparing the user's view, the developer's view, and a “perfect” product.]
Example: Product Dependability
[Diagram: estimated dependability is derived from the integrated software exercised with functional test cases on a test platform under an assumed operational profile; observed correctness comes from these tests. Actual dependability is exhibited by the installed and configured software on the operational platform under the actual operational profile, and drives user acceptance.]
Estimating Challenges
• Distinguishing faults, failures, and problem reports

• Understanding fault generation and detection
– Product requirements and architecture
– Process (methods, reviews, tests, tools)

• Relating faults to operational failures
– Failures often arise from the occurrence of multiple events
– The assumed operational profile
– Predicting the actions of users and abusers

• The “product” evolves during development
– Add new features
– Revise the design
Testing (Measurement) Challenges
• The test profile differs from the operational profile

• Detection is flawed (Type 1 and Type 2 errors)
– A test case (or peer review) fails to expose a real fault
– A test case reports a defect when there is actually no fault

• Comprehensive testing is seldom possible
– Many combinations to cover (inputs, modes, internal states)
– Limited time for testing
– Some faults are never triggered (so the resulting failures are never seen)

• The product changes during testing
– Remove defects (“reliability growth”)
– Add defects (“bad fixes”)
– May even add features
– Every change gives a new product to sample
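One technique (not covered in this presentation) for coping with the fact that testing never exposes every fault is capture-recapture estimation: when two independent inspectors find overlapping sets of defects, the size of the overlap suggests how many defects exist in total. A sketch of the simple Lincoln-Petersen estimator, with hypothetical counts:

```python
def lincoln_petersen(found_by_a, found_by_b, found_by_both):
    """Two-inspector capture-recapture estimate of the total defect count:
    N ~= (n_a * n_b) / n_both."""
    if found_by_both == 0:
        raise ValueError("no overlap; estimator is undefined")
    return (found_by_a * found_by_b) / found_by_both

# Hypothetical: reviewer A finds 25 defects, B finds 20, 10 in common.
total = lincoln_petersen(25, 20, 10)
remaining = total - (25 + 20 - 10)   # estimated defects not yet found
print(f"estimated total = {total:.0f}, still latent = {remaining:.0f}")
```

The estimator assumes the two inspections are independent and every defect is equally catchable, which real reviews only approximate, so treat the result as a rough lower bound on latent defects.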
Comparison of Options
(Estimation techniques: Analytic Models, Simulation. Measurement techniques: Stress Tests, Operational Use.)

Characteristic | Analytic Models | Simulation | Stress Tests | Operational Use
Choice of Quality Measures* | Limited | Limited | Limited | Limited
Level of Detail | Limited | Varies | Limited to dependability | Varies
Accuracy (Realism and Fidelity) | Approximate | Varies | Perfect (if product is complete) | Good
Effort to Develop and Validate the Model or Tool | High to Extra High | Low to High | Low | Not Applicable
Effort to Use the Model or Tool (set up, configure, calculate, measure) | Low | Low to Very High | Low to Medium | Nominal to Medium

*Dependability is the most objective and mature measure. Customer preference is subjective.
Determining Product Dependability

[Diagram: across the life-cycle phases (concept definition, system definition, software requirements analysis, product design, detailed design, coding and unit testing, integration testing, dependability (stress) testing, factory acceptance testing, site operational testing, operations), the focus shifts from specifying a dependability goal, to estimating dependability (“prediction”) from application, requirement, design, and code metrics, to measuring dependability (“estimation”) from test data, to observing dependability from actual performance.]
Process Performance
Process Control: Measurements + Models
[Diagram: a closed loop. The defined (and improved) process is used to predict process performance; predictions guide control of the process; the process is executed and measured; and the resulting data, information, and measurements feed back to define and improve the process and to refine the predictions.]
Measuring Process Performance

• Key Questions
– What is the current performance?
– Is this value “good”?
– Is it changing?
– How can I make the value “better”?

• Candidate Attributes*
– Definition (completeness, compatibility)
– Usage (compliance, consistency)
– Stability (repeatability, variability)
– Effectiveness (capability)
– Efficiency (productivity, affordability)
– Predictive ability (accuracy, effects of tailoring and improvements)

*Motivated by [Florac, 1999, Section 2.4].
Some Examples

Goal | Measure
Completeness | Number of process elements added, changed, and deleted during tailoring
Compliance | Number of discrepancy reports generated by Quality Assurance audits
Stability (volatility) | Number of process elements changed within a specified time interval
Effectiveness | Product quality
Effectiveness | Defect leakage to subsequent phases
Efficiency | Productivity (or production coefficient)
Efficiency | Rework as a fraction of total effort
Predictability | Probability distribution for an estimated quantity, or related statistics
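Two of the example measures above reduce to simple ratios. A sketch with hypothetical phase and effort data:

```python
# Hypothetical counts and effort figures for illustration only.
defects_injected_in_phase = 40     # assumed total injected in a phase
defects_found_before_exit = 28     # found by that phase's reviews and tests
defect_leakage = 1 - defects_found_before_exit / defects_injected_in_phase

total_effort_phr = 5000.0          # assumed total project effort
rework_effort_phr = 900.0          # assumed effort spent on rework
rework_fraction = rework_effort_phr / total_effort_phr

print(f"defect leakage = {defect_leakage:.0%}, "
      f"rework fraction = {rework_fraction:.0%}")
```

The catch in practice is the denominator: the number of defects actually injected is never observed directly and must itself be estimated from downstream detection data.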
Types of Process Performance Models

Type | Handles Unstable Processes? | Representation of Process Mechanisms | Examples
Statistical | No | None | Statistical process control
Functional | No | Explicit | Parametric models (algorithms based on causal mechanisms), e.g., COQUALMO and staged models
Dynamic | Yes | Implicit (via propagation) | System dynamics models (coupled equations embody the causal mechanisms; solving them numerically gives the predicted behavior)
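The "statistical" row above typically means control charts. A minimal individuals-and-moving-range (XmR) limit calculation, in the spirit of [Florac, 1999], using made-up weekly observations:

```python
def xmr_limits(values):
    """Natural process limits for an individuals (XmR) chart:
    mean +/- 2.66 * (average moving range)."""
    mean = sum(values) / len(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    return mean - 2.66 * mr_bar, mean + 2.66 * mr_bar

# Hypothetical weekly productivity observations (SLOC per person-hour).
lcl, ucl = xmr_limits([2.1, 2.4, 2.2, 2.6, 2.3, 2.5])
print(f"natural process limits: {lcl:.2f} to {ucl:.2f}")
```

Points outside these limits signal assignable causes; only when the process stays within them (is stable) do its averages become trustworthy inputs for estimation.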
Comparison of Options
(Measurement technique: Statistical. Estimation techniques: Functional Models, Dynamic Simulation.)

Characteristic | Statistical | Functional Models | Dynamic Simulation
Choice of Performance Measures | Limited | Limited | Many
Level of Detail | Limited | Varies | Arbitrary
Accuracy (Realism and Fidelity) | Perfect (for the specified scope) | Approximate | Varies
Steady State Behavior | Yes | Yes | Yes
Transient Response | No | Limited | Yes
Range of Off-Nominal Conditions | Limited | Some | Wide
Effort to Develop and Validate the Model or Tool | Low to High | Low to High | Low to Very High
Effort to Use the Model or Tool (set up, configure, calculate, measure) | Medium to High | Low | Low to High
Factors Affecting Predictive Accuracy

• The process definition
– Detail
– Stability
– Tailoring

• The process execution
– Compliance
– Consistency

• The model's scope and validity
– Relevant factors and interactions
– Fidelity
Summary
Estimation and Measurement Are Coupled
• Estimates provide predicted information to:
– Assess product and project feasibility
– Evaluate tradeoffs in performance, quality, cost, and schedule
– Identify and obtain resources
– Prepare plans to coordinate and track work

• Measurements provide actual information to:
– Verify expectations and detect deviations
– Control processes (preserve estimating assumptions, adapt to changes)
– Prepare revised estimates
– Formulate, validate, and calibrate predictive models
– Improve decision making for projects and businesses
– Help the business adapt to changing conditions
Recurring Themes
• Estimation is a continuing process
– Accuracy usually increases with time (“learning curve”)
– You get what you pay for at any particular time
– Perfectly accurate estimates are impossible (and are not needed!)

• A little good data goes a long way (90/10 rule)
– There are never enough resources (time, money)
– Use Goal-Question-Metric to choose what to measure
– Precise definitions are essential (e.g., “line of code” and “defect”)

• Some quantities are harder to estimate than others
Relative Estimation Difficulty
1. Project Cost
2. Product Size
3. Project Effort
4. Project Schedule
5. Product Performance
6. Product Quality
7. Process Performance
(Listed in order of increasing difficulty.)
References (1 of 2)

General
[Stutzke, 2005] Estimating Software-Intensive Systems: Projects, Products, and Processes, Richard D. Stutzke, Addison-Wesley, 2005, ISBN 0-201-70312-2.
Project Effort, Cost, and Schedule
[Boehm, 2000] Software Cost Estimation with COCOMO II, Barry W. Boehm, Chris Abts, A. Winsor Brown, Sunita Chulani, Bradford K. Clark, Ellis Horowitz, Ray Madachy, Donald Reifer, and Bert Steece, Prentice-Hall, 2000, ISBN 0-13-026692-2.
[Boehm, 1981] Software Engineering Economics, Barry W. Boehm, Prentice-Hall, 1981, ISBN 0-13-822122-7.
Product Performance
[Jain, 1991] The Art of Computer Systems Performance Analysis Techniques for Experimental Design, Measurement, Simulation, and Modeling, Raj Jain, John Wiley and Sons, 1991, ISBN 0-471-50336-3.
[Gunther, 2000] The Practical Performance Analyst: Performance-By-Design Techniques for Distributed Systems, Neil J. Gunther, Authors Choice Press (an imprint of iUniverse.com, Inc.), 2000, ISBN 0-595-12674-X. Originally published in 1998 by McGraw-Hill as ISBN 0-07-912946-3.
[Smith, 2002] Performance Solutions: A Practical Guide to Creating Responsive, Scalable Software, Connie U. Smith and Lloyd G. Williams, Addison-Wesley, 2002, ISBN 0-201-72229-1.
References (2 of 2)
Product Quality
[Kan, 2003] Metrics and Models in Software Quality Engineering, 2nd edition, Stephen H. Kan, Addison-Wesley, 2003, ISBN 0-201-72915-6.
[Lyu, 1996] Handbook of Software Reliability Engineering, Michael R. Lyu (editor), IEEE Computer Society Press (McGraw-Hill), 1996, ISBN 0-07-039400-8. An excellent, up-to-date coverage of the subject.
Process Performance
[Florac, 1999] Measuring the Software Process: Statistical Process Control for Software Process Improvement, William A. Florac and Anita D. Carlton, Addison-Wesley, 1999, ISBN 0-201-60444-2.
[Madachy, 2005] Software Process Modeling with System Dynamics, Raymond J. Madachy, John Wiley & Sons, 2005, ISBN 0-471-27555-0.