
Page 1

Carnegie Mellon Software Engineering Institute

© 2006 by Carnegie Mellon University

Software Process Performance Measures

James Over

Software Engineering Institute
Carnegie Mellon University
Pittsburgh, PA 15213-3890

Page 2

Purpose

Why are you interested in process improvement?

Hopefully for the process performance benefits.

If so, process performance measurement is a key concern.

Many of the examples in this presentation are from the Team Software Process (SM); however, the concepts are broadly applicable.

SM Team Software Process is a registered service mark of Carnegie Mellon University.

Page 3

Team Software Process

The Team Software Process (TSP) is an integrated set of practices for developing software.

TSP is a process-based solution to common software engineering and management issues.
• cost and schedule predictability
• productivity and product quality
• process improvement

Unlike other methods, TSP
• relies on self-directed teams
• emphasizes measurement and quality management
• provides immediate and measurable benefits
• accelerates CMMI-based improvement

Page 4

TSP Performance Summary -1

TSP Impact Study (2003)*:
• Schedule error, average: 6%
• Schedule error, range: -20% to +27%

Typical industry performance (Standish Group)**:
• Cancelled: 29%
• On-time: 26%
• Less than 20% late: 6%
• 21%-50% late: 8%
• 51%-100% late: 9%
• 101%-200% late: 16%
• More than 200% late: 6%

* From a study of 20 projects in 13 organizations conducted in 2003.
** Of the unsuccessful projects, the average schedule error was 222%.

Page 5

TSP Performance Summary -2

• System test defects per thousand instructions: TSP 0.4 avg. (0.0 to 0.9); typical industry 2 to 14
• Released defects per thousand instructions: TSP 0.06 avg. (0.0 to 0.2); typical industry 1 to 7
• System test effort (% of total effort): TSP 4% avg. (2% to 7%); typical industry 40%

* From a study of 20 projects in 13 organizations conducted in 2003.

Page 6

TSP Performance Summary -3

An analysis of 20 projects in 13 organizations showed TSP teams averaged 0.06 defects per thousand lines of new or modified code.

Approximately 1/3 of these projects were defect-free.

These results are substantially better than those achieved in high maturity organizations.

Defects/KLOC by maturity level (Source: CMU/SEI-2003-TR-014):
• Level 1: 7.5
• Level 2: 6.24
• Level 3: 4.73
• Level 4: 2.28
• Level 5: 1.05
• TSP: 0.06

Page 7

TSP-CMMI Overall Coverage

[Figure: stacked bar chart of the percentage of CMMI specific practices (SPs) at maturity levels 2 through 5 that are directly addressed, supported, partially addressed, not addressed, or unrated by TSP.]

Page 8

Topics

Process management concepts

TSP measurement framework

Performance measures

Page 9

SEI Process Management Premise

“The quality of a software system is governed by the quality of the process used to develop and evolve it.”

- Watts Humphrey

Page 10

Managed Process

The CMMI defines a managed process as a process with the following characteristics.
• a performed process that is planned and executed in accordance with policy
• the process employs skilled people who have adequate resources to produce controlled outputs
• it involves relevant stakeholders
• it is monitored, controlled, and reviewed
• it is evaluated for adherence to its process description

Page 11

Process Management

Process ownership: responsibility for designing, establishing, and implementing the process, and for the mechanisms for measurement and corrective action, is assigned.

Process definition: the design and formal documentation of the components of the process and their relationships.

Process control: the function of ensuring that the process output meets specifications, including
• measurement
• control variable(s)
• feedback loop(s)
• defect detection, correction, and defect prevention

Source: Quality Process Management by Gabriel Pall

Page 12

Process Management Concept

[Diagram: a work process transforms input into output, with a control governing the process.]

Page 13

Example

[Diagram: an inspection process transforms input into output, with a control. Review rate is the control variable; process yield and system test yield are the associated measures.]
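To make the control loop concrete, here is a minimal sketch that treats the review rate as the control variable and the yields as feedback. The threshold values (200 LOC/hour, 70% yield) are illustrative assumptions, not figures from this presentation.

```python
# Illustrative sketch of the inspection-process control loop (thresholds are assumptions).
def check_inspection(loc_reviewed, review_hours, defects_found, defects_escaped_to_test):
    """Return warnings when the control variable or the feedback measures look unhealthy."""
    warnings = []
    review_rate = loc_reviewed / review_hours             # control variable: LOC reviewed per hour
    total_defects = defects_found + defects_escaped_to_test
    process_yield = 100.0 * defects_found / total_defects if total_defects else 100.0

    if review_rate > 200:                                  # assumed guideline, not an SEI-published limit
        warnings.append(f"Review rate {review_rate:.0f} LOC/hr is high; inspection may be superficial.")
    if process_yield < 70:                                 # assumed yield target
        warnings.append(f"Process yield {process_yield:.0f}% is low; tighten the inspection process.")
    return warnings

print(check_inspection(loc_reviewed=1200, review_hours=3, defects_found=6, defects_escaped_to_test=4))
```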

Page 14

Process Management Conclusions

A defined process is a prerequisite for process management.

The enactment of the process should not differ from the defined process in any substantive way.

The key determinants of process performance must be instrumented and measured.

Failure to measure, or limited measurement scope, can lead to sub-optimization or “process tampering.”

The process and measures should be designed to support process management from the start.

Page 15

Topics

Process management concepts

TSP measurement framework

Performance measures

Page 16

Process Measurement Issues

Some common process measurement issues…
• substantial variation in measurement reporting requirements across development groups and suppliers
• few measures of quality
• standards emphasize derived measures instead of common base measures
• inability to summarize, aggregate, drill down, or extend
• cannot benchmark or make comparisons
• limited use as a management indicator
• lack of accountability

The measurement framework is often tied “literally” to CMMI process areas and to the examples of derived measures from CMMI.

Page 17

Measurement System

Design and a “systems” approach solve many measurement issues.

• Define a few common base measurement categories and establish standards for the most used instances.

• Develop a measurement framework that relates the base measures to the key elements of software process work.

• Create derived measures from the standard base measures.

• Identify process performance models and benchmarks that predict future performance.

• Integrate into monitoring and decision-making processes.

Page 18

TSP Measurement Framework -1

Base measurement categories:
• Size
• Effort
• Schedule
• Defects

Example derived measures:
• Estimation accuracy
• Prediction intervals
• Productivity
• Cost performance index
• Planned value
• Earned value
• Predicted earned value
• Defect density
• Defect density by phase
• Defect removal rate by phase
• Defect removal leverage
• Review rates
• Process yield
• Phase yield
• Failure cost of quality
• Appraisal cost of quality
• Appraisal/Failure COQ ratio
• Percent defect free
• Defect removal profiles
• Quality profile
• Quality profile index
• …
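To show how derived measures fall out of the four base categories, here is a minimal sketch computing a handful of them from planned and actual values. The field names and sample numbers are illustrative assumptions, not a TSP tool schema.

```python
# Minimal sketch: a few derived measures computed from the base measures (size, effort, defects).
plan   = {"size_kloc": 10.0, "effort_hours": 400.0}
actual = {"size_kloc": 11.5, "effort_hours": 460.0, "system_test_defects": 5}

size_error   = (actual["size_kloc"] - plan["size_kloc"]) / plan["size_kloc"]           # +15%
effort_error = (actual["effort_hours"] - plan["effort_hours"]) / plan["effort_hours"]  # +15%
productivity = 1000.0 * actual["size_kloc"] / actual["effort_hours"]    # LOC per task hour
defect_density = actual["system_test_defects"] / actual["size_kloc"]    # system test defects per KLOC
cpi = plan["effort_hours"] / actual["effort_hours"]                     # cost performance index (completed work)

print(f"size error {size_error:+.0%}, effort error {effort_error:+.0%}, "
      f"productivity {productivity:.0f} LOC/hr, defect density {defect_density:.2f}/KLOC, CPI {cpi:.2f}")
```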

Page 19

TSP Measurement Framework -2

A model of key process elements and their relations provides a context for the base measures.
• processes and phases
• projects and sub-projects
• products and parts
• teams and team members
• tasks
• period (week, month, etc.)

The model facilitates
• analysis
• aggregation and drill-down
• queries and views
• scalability

[Diagram: the key process elements (process, project, team, product, tasks, period) and their relationships.]
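As a minimal sketch of this idea, the record layout below keys each base measure by the model's dimensions so the data can be aggregated and drilled into along any of them. The layout and sample entries are assumptions for illustration, not the actual TSP tool schema.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class EffortRecord:
    # One time-log entry, keyed by the framework's dimensions (illustrative layout).
    project: str
    component: str     # product part
    member: str        # team member
    phase: str         # process phase
    week: int          # period
    task: str
    task_hours: float

log = [
    EffortRecord("ProjA", "Parser", "alice", "DLD",  12, "design parser", 6.0),
    EffortRecord("ProjA", "Parser", "alice", "DLDR", 12, "review design", 2.5),
    EffortRecord("ProjA", "Lexer",  "bob",   "CODE", 12, "code lexer",    8.0),
]

# Drill-down: task hours by (project, phase); the same pattern works for any dimension.
by_phase = defaultdict(float)
for r in log:
    by_phase[(r.project, r.phase)] += r.task_hours
print(dict(by_phase))
```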

Page 20

Estimated and Actual Size

Size is a measure of the magnitude of the software deliverable, e.g., lines of code or function points.

Size is estimated and actual size is measured for each component.

Five size accounting categories are used.
• Base
• Modifications to the base
• Deletions from the base
• Added or new
• Reused

Size data are used to
• estimate effort
• track progress
• normalize other measures
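A sketch of size accounting with the five categories above. The roll-up formulas (new-and-changed = added + modified; total = base - deleted + added + reused) follow a common PSP/TSP-style convention, but treat them here as an assumption rather than the definitive TSP definition.

```python
# Size accounting sketch: the five categories, with two common roll-ups (assumed convention).
def size_accounting(base, modified, deleted, added, reused):
    new_and_changed = added + modified                # basis for effort estimates and defect density
    total = base - deleted + added + reused           # size of the resulting product
    return {"new_and_changed": new_and_changed, "total": total}

print(size_accounting(base=12000, modified=800, deleted=500, added=2500, reused=3000))
# {'new_and_changed': 3300, 'total': 17000}
```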

Page 21

Estimated and Actual Effort

Effort is a measure of time on task.

The TSP effort measure is called a task hour.

Task hours are estimated and measured by
• process phase
• task
• day or week

How many task hours are there in a 40-hour week? About 15 to 20.

Page 22

Estimated and Actual Schedule

Schedule has two components
• resource availability
• task completion dates

Planned task dates are calculated from estimates of resource availability and planned task hours.

Actual date completed is recorded as tasks are finished.

Actual resource availability is derived from actual task hours.
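A minimal sketch of the date calculation described above: planned task hours are consumed against the estimated weekly task-hour availability to yield a planned completion week for each task. The week-granularity scheduling rule is an illustrative assumption.

```python
# Sketch: planned completion week per task from planned task hours and weekly availability.
def plan_task_weeks(tasks, weekly_task_hours):
    """tasks: list of (name, planned_hours) in plan order; returns name -> planned completion week."""
    plan = {}
    cumulative = 0.0
    for name, hours in tasks:
        cumulative += hours
        # Week in which the cumulative planned hours are exhausted (1-based, ceiling division).
        plan[name] = int(-(-cumulative // weekly_task_hours))
    return plan

tasks = [("requirements", 30), ("high-level design", 45), ("detailed design", 60)]
print(plan_task_weeks(tasks, weekly_task_hours=50))
# {'requirements': 1, 'high-level design': 2, 'detailed design': 3}
```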

Page 23

Estimated and Actual Defects

Defects are the measure of quality.

The number of defects injected and removed is estimated, and the actual number of defects injected and removed is counted.

Defect data includes
• component
• phase injected
• phase removed

Definition: a work product element that must be changed, after it was completed, in order to ensure proper design, implementation, test, use, or maintenance.
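A sketch of a defect log entry with the fields listed above, plus a phase-yield calculation (defects removed in a phase divided by those present on entry plus those injected during the phase). The record layout and phase names are assumptions; the yield formula follows the usual TSP-style definition.

```python
from dataclasses import dataclass

@dataclass
class Defect:
    component: str
    phase_injected: str
    phase_removed: str

PHASES = ["DLD", "DLDR", "CODE", "CR", "COMPILE", "UT"]   # illustrative phase order

def phase_yield(defects, phase):
    """Percent of defects present in, or injected during, `phase` that were removed in it."""
    idx = PHASES.index(phase)
    entering = sum(1 for d in defects
                   if PHASES.index(d.phase_injected) < idx and PHASES.index(d.phase_removed) >= idx)
    injected = sum(1 for d in defects if d.phase_injected == phase)
    removed = sum(1 for d in defects if d.phase_removed == phase)
    available = entering + injected
    return 100.0 * removed / available if available else 100.0

log = [Defect("Parser", "DLD", "DLDR"), Defect("Parser", "CODE", "CR"), Defect("Parser", "CODE", "UT")]
print(phase_yield(log, "CR"))   # 50.0: one of the two code defects was removed in code review
```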

Page 24

Topics

Process management concepts

TSP measurement framework

Performance measures

Page 25

TSP Performance Measures

The most often used TSP performance measures are:
• Planned value, earned value, predicted earned value
• Planned and actual task hours
• Estimation error
• Growth
• Defect density
• Percent defect-free
• Quality profile and index

These measures support planning and tracking.

Combined with historical data and/or benchmarks, these measures also support process performance modeling.
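A sketch of the earned-value measures in the list above: each task's planned value is its share of total planned hours, earned value accrues when the task completes, and predicted earned value projects the current weekly EV rate forward. The straight-line prediction rule and the sample data are illustrative assumptions.

```python
# Sketch of TSP-style planned value (PV), earned value (EV), and a simple predicted EV.
tasks = [
    # (name, planned_hours, completed?)  -- illustrative data
    ("requirements", 30, True),
    ("design",       45, True),
    ("code",         60, False),
    ("test",         40, False),
]

total_planned = sum(h for _, h, _ in tasks)
pv = {name: 100.0 * h / total_planned for name, h, _ in tasks}     # planned value per task (%)
ev_to_date = sum(pv[name] for name, _, done in tasks if done)       # EV earned so far (%)

weeks_elapsed, weeks_total = 6, 12
ev_rate = ev_to_date / weeks_elapsed                                 # average EV per week
predicted_ev_at_end = min(100.0, ev_to_date + ev_rate * (weeks_total - weeks_elapsed))

print(f"EV to date: {ev_to_date:.1f}%, predicted EV at week {weeks_total}: {predicted_ev_at_end:.1f}%")
```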

Page 26

Process Performance Models

[Diagram: project data feeds a process performance model, which produces predicted project performance; historical data and benchmarks also feed the model.]

Page 27

Example: Quality Profile

Project data:
• time in design, design review, coding, and code review
• defects found in compile and unit test
• product size

Process performance model: the quality profile

Benchmarks:
• development time ratio criteria
• defect density criteria

Predicted value: likelihood of post-system-test defects

Page 28

Quality Profile Benchmarks

These software quality benchmarks predict post-development defects.

Modules that meet these criteria were found to be largely defect free in system test and after deployment.

Software Quality Benchmarks (derived measure: desired value)
• Design time vs. code time ratio: 1 to 1
• Design time vs. design review time ratio: 2 to 1
• Code time vs. code review time ratio: 2 to 1
• Compile defect density: < 10 per KLOC
• Unit test defect density: < 5 per KLOC
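A sketch of turning the five benchmarks above into the 0-to-1 profile values plotted on the following slides. The normalization expressions are a common PSP/TSP-style formulation, but treat the exact formulas here as an assumption for illustration.

```python
# Sketch: quality profile components on a 0-1 scale, from the five benchmarks (formulas assumed).
def quality_profile(design_hrs, design_review_hrs, code_hrs, code_review_hrs,
                    compile_defects, unit_test_defects, new_changed_kloc):
    cd_kloc = compile_defects / new_changed_kloc
    ut_kloc = unit_test_defects / new_changed_kloc
    return {
        "design/code time":       min(1.0, design_hrs / code_hrs),               # benchmark 1:1
        "design review time":     min(1.0, 2.0 * design_review_hrs / design_hrs),  # benchmark 2:1
        "code review time":       min(1.0, 2.0 * code_review_hrs / code_hrs),      # benchmark 2:1
        "compile defects/KLOC":   min(1.0, 20.0 / (10.0 + cd_kloc)),             # 1.0 at or below 10/KLOC
        "unit test defects/KLOC": min(1.0, 10.0 / (5.0 + ut_kloc)),              # 1.0 at or below 5/KLOC
    }

profile = quality_profile(design_hrs=40, design_review_hrs=18, code_hrs=45, code_review_hrs=20,
                          compile_defects=8, unit_test_defects=3, new_changed_kloc=1.2)
print(profile)
```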

Page 29

Quality Profile

The quality profile is a process performance model that provides an early warning indicator for post-development defects.

The quality profile uses the five software quality benchmarks.

Satisfied criteria are plotted at the outside edge of the chart.

[Figure: example quality profiles for two components (Component 2 and Component 5 risk factors), one a high quality component and one a poor quality component. Each plots Design/Code Time, Design Review Time, Code Review Time, Compile D/KLOC, and Unit Test D/KLOC. Inadequate design review time results in design defects escaping to test and production.]

Page 30

Using the Quality Profile

[Figure: plan vs. actual quality profiles for four assemblies: Common Query Changes (BE), BOM Query Sproc Changes (BE), User Report Settings (BE), and OEMMOO Delivery.aspx (FE-Server). Each profile plots Design/Code Time, Design Review Time, Code Review Time, Compile Defects/KLOC, and Unit Test Defects/KLOC on a 0 to 1 scale.]

Page 31

Quality Performance Index

The Quality Performance Index is the product of the five parameters in the quality profile.

QPI predicts the likelihood of post-development defects in a system.

[Figure: scatter plot of Quality Performance Index vs. post-development defect density. The index ranges from 0 to 1.2 on the horizontal axis; post-development defects/KLOC range from 0 to 25 on the vertical axis.]

Interpreting the Quality Performance Index
• 0.0 to 0.2: re-inspect; test and post-development defects likely
• 0.2 to 0.4: re-inspect if test defects are found
• 0.4 to 1.0: component is of high quality
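A sketch of the index as described above: the product of the five quality profile values, interpreted against the ranges from the table. It assumes the profile values are already normalized to the 0-1 scale (for example, by the sketch following the benchmarks slide).

```python
# Sketch: QPI = product of the five quality profile values (each in 0..1), with the slide's ranges.
from math import prod

def qpi(profile):
    """profile: dict of the five quality profile values, each between 0 and 1."""
    index = prod(profile.values())
    if index < 0.2:
        advice = "re-inspect; test and post-development defects likely"
    elif index < 0.4:
        advice = "re-inspect if test defects are found"
    else:
        advice = "component is of high quality"
    return index, advice

profile = {"design/code time": 0.9, "design review time": 0.9, "code review time": 0.85,
           "compile defects/KLOC": 1.0, "unit test defects/KLOC": 1.0}
print(qpi(profile))   # (~0.69, 'component is of high quality')
```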

Page 32

Conclusion

Measurement and process management are inseparable; you should incorporate measurement in your initial processes.

A common problem with software process measurement is the lack of an integrated, well-designed measurement system, which results in unnecessary complexity and in usability issues such as poor scalability and extensibility.

Process management can be successfully applied to the software process with a few simple derived measures integrated into a measurement framework.