
Test Performance Indicators

White Paper

idexcel

Keep them Simple. Make them Meaningful. Track and Use them.


Background

Introduction

Well-defined test performance indicators support decision making by management and provide a methodical approach to assessing the efficiency and effectiveness of current test processes. When proper test performance indicators and measurements are not defined and agreed upon, assessments of software quality and process efficiency get skewed, resulting in wrong decisions.

Although practitioners, researchers and professionals have defined several software test performance indicators, these indicators are often used in isolation or, worse, the wrong indicators are used. This white paper describes the metrics used by Idexcel and how they help our clients.

Test Performance Indicators (TPIs) are high-level metrics of effectiveness or efficiency used to guide and control progressive test development, the test process and product quality. Idexcel has adopted the Test Maturity Model integration (TMMi) process model for all its testing projects, and TPIs are defined in line with the best practices recommended by TMMi. While developing these TPIs, Idexcel has established procedures for data collection, ownership, storage, analysis and reporting. The key project stakeholders periodically analyze the reported data and take necessary action.


Test Performance Indicators Used by Idexcel

In line with industry practices, Idexcel has broadly classified TPIs into two categories: “Test Process Indicators” and “Product Quality Indicators”.

Test Process Indicators (TPIs)

One of the objectives of Idexcel’s Test Policy is to provide a cost-effective and efficient software testing process to its customers. To meet this objective, we continuously measure the effectiveness and efficiency of software testing. This helps in assessing the quality and effectiveness of the process and the productivity of the personnel involved in testing activities, which in turn helps to improve software testing procedures, methods, tools and tasks. Gathering these TPIs helps Idexcel learn from historical data, resulting in improved software testing maturity.

Product Quality Indicators

These TPIs provide engineers and project stakeholders with insight into the test state and testing status of a software product, and are generated from test execution, code fixes and deferment. Using these metrics, we measure the product’s test state and indicate its level of quality. Based on experience gained by working with several customers, we have divided product quality metrics into two parts: ‘Product Quality TPIs’ and ‘Customer Oriented Indicators’.


We use test performance indicators for:


Identifying testing strengths, weaknesses and areas of improvements

Analyzing risk

Benchmarking for future process optimization

Taking product and process decisions

Analyzing the current state of organization’s testing processes

Determining customer involvement and satisfaction

Controlling and monitoring of test process

Measuring test productivity and effectiveness.


Measuring process effectiveness is an objective, orderly method of quantifying, assessing, adjusting and ultimately improving the testing process. We collect data for each project or program and analyze it at regular intervals to assess software quality and process performance throughout the software test lifecycle, creating a baseline and guidance for future projects.

Collect data and store it in a common repository

Review the data and ensure only quality data is available for analysis

Analyze the data, prepare reports and present them to management

Evolve the process based on the analysis


Effort Variance

This is a process efficiency indicator that helps in understanding any extra effort injected into a testing project to complete the activities planned at the beginning of the project. Combined with the phase-wise effort indicator, it gives insights into the problematic phases that must be improved in terms of efficiency and effectiveness. Alternatively, if investigations reveal that the benchmark used for initial effort estimation needs tweaking based on changed circumstances, management can take the necessary action to revise the benchmark.

Negative results are also useful to management, as they can be used to revise the organization-wide benchmark for effort estimation of similar projects.
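The paper does not spell out a formula; effort variance is commonly computed as the deviation of actual from planned effort, relative to the plan. A minimal Python sketch (the function name and sample figures are illustrative, not from the paper):

```python
def effort_variance_pct(planned_hours: float, actual_hours: float) -> float:
    """Extra (or saved) testing effort as a percentage of the planned effort."""
    return (actual_hours - planned_hours) / planned_hours * 100.0

# Positive: more effort was injected than planned at the start of the project.
print(effort_variance_pct(400, 460))   # 15.0
# Negative: under the estimate; a candidate input for revising the benchmark.
print(effort_variance_pct(400, 380))   # -5.0
```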

Defects Slippage

This is one of the most important ‘Process Effectiveness’ indicators, helping to identify the effectiveness of the testing process adopted by an organization. Defect slippage is directly related to the company’s reputation, warranty costs, future business, contractual requirements and customer satisfaction. Idexcel regularly monitors this indicator by project and analyzes the data to identify and plug gaps in the existing testing process.
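The paper does not define the calculation; defect slippage is often measured as the share of all known defects that escaped testing and were found after release. A sketch under that assumption (names and numbers are illustrative):

```python
def defect_slippage_pct(found_in_testing: int, found_after_release: int) -> float:
    """Share of all known defects that slipped past testing to the customer."""
    total = found_in_testing + found_after_release
    return found_after_release / total * 100.0 if total else 0.0

print(defect_slippage_pct(190, 10))   # 5.0 -> 5% of defects escaped to production
```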

Defect Removal Efficiency

This is a ‘Process Efficiency’ indicator that measures the defects removed per unit of time (hours/days/weeks/phase).

Idexcel has derived this indicator directly from the organization’s debug policy. It denotes the efficiency of defect removal methods and is also an indirect measurement of the quality of a product. We calculate this indicator at every stage of the software development lifecycle and maintain the DRE for different stages such as unit, integration, system, UAT, operational readiness and documentation.
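Following the per-unit-of-time definition above, the indicator can be sketched as a removal rate tracked per lifecycle stage (stage names and figures below are illustrative assumptions):

```python
def defect_removal_rate(defects_removed: int, elapsed_days: float) -> float:
    """Defects removed per unit of time (here: per day)."""
    return defects_removed / elapsed_days

# Maintained separately for each stage (unit, integration, system, UAT, ...).
stage_data = {"unit": (42, 10.0), "integration": (30, 12.0), "system": (18, 9.0)}
rates = {stage: defect_removal_rate(n, days) for stage, (n, days) in stage_data.items()}
print(rates["unit"])   # 4.2 defects removed per day during unit testing
```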


Key Test Performance Indicators Used by Idexcel

Phase-Wise Testing Effort

This process efficiency indicator helps in identifying effort-intensive areas across prevention, verification and validation activities in the project. Prevention activities include planning and training; verification and validation activities include time spent on tasks such as test case walkthroughs, reviews, environment set-up, testing, test requirements and re-testing. An example of phase-wise effort in testing is shown in the chart below.

Cost Variance

This is a process efficiency indicator that helps in understanding actual cost versus planned and budgeted expenditure. Whenever the testing cost overshoots the budget, management can analyze and gain an in-depth understanding of the lessons learned during the given project, and this understanding can be used to lower the testing costs of future projects. However, due to changes in internal or external factors, the benchmark used for creating budgets may become outdated, causing the overshoot.
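Like effort variance, cost variance is typically the deviation of actual from budgeted spend, relative to the budget; the paper gives no formula, so the sketch below is an assumption (function name and amounts are illustrative):

```python
def cost_variance_pct(budgeted: float, actual: float) -> float:
    """Overshoot (positive) or saving (negative) against the budgeted testing cost."""
    return (actual - budgeted) / budgeted * 100.0

print(cost_variance_pct(50_000, 57_500))   # 15.0 -> testing overshot the budget by 15%
```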

[Chart: example phase-wise effort distribution across Review, Rework, Test Execution, Environment set up, Test Reporting, Documentation, Planning and Training]


Test Coverage

The purpose of the Test Coverage indicator is to assess whether sufficient testing has been performed and to identify areas that may require additional testing. This metric measures the number of test cases tested successfully against the total number of test cases planned for each product component at major milestones of the software development life cycle. This indicator tells us what is missing much better than what is done well.




Defect Rejection Ratio

Some of the assumptions made during the process controlled software development for testing activities are:

1. Testers understand the application and minor details of the requirements

2. All decisions related to requirements are well communicated to the testers, and revised documents are shared with them

3. Testing happens in a controlled test environment

4. Configuration management is in place and testers are getting the intended build for testing

5. Testers are well trained in application process and business logic.

A defect initially raised by a tester could later be rejected if any one of the above assumptions goes wrong. The main objective of this ‘Process Effectiveness’ indicator is to ensure that testers correctly understand the requirements and are involved in all phases of the software development lifecycle. Too many rejected defects result in inefficiency and indicate a lapse in process effectiveness.
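A common way to express this indicator is the share of reported defects later rejected as invalid; the formula below is that assumption, not a definition from the paper:

```python
def defect_rejection_ratio_pct(rejected: int, reported: int) -> float:
    """Share of reported defects later rejected as invalid."""
    return rejected / reported * 100.0 if reported else 0.0

# A rising ratio suggests one of the five assumptions above has broken down.
print(defect_rejection_ratio_pct(12, 150))   # 8.0
```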

Test Execution Productivity Trend

This process efficiency indicator helps track the productivity of the test team. Productivity may be defined as the test cases executed by a team per unit of time (hours/days/weeks/phase/release). This indicator helps in identifying the problematic areas impacting a team’s productivity and supports remedial action.
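Using the per-unit-of-time definition above, productivity can be sketched as a simple rate (person-days chosen here as the unit; figures are illustrative):

```python
def execution_productivity(cases_executed: int, person_days: float) -> float:
    """Test cases executed per person-day; other time units work the same way."""
    return cases_executed / person_days

print(execution_productivity(540, 45.0))   # 12.0 test cases per person-day
```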



[Chart: Test Coverage Metric Example, showing coverage percentages per component: Admin UI, Website UI, 3rd Party Integration, Stored Procedure, Billing, Data Migration, Settlement, Configuration]


Scope Volatility

After the requirements and deliverables are signed off, requirement volatility becomes a major factor, especially for testing tasks in a program or project. Requirement volatility can be categorized into the following types: additions to existing requirements, deletions from existing requirements, changes in scope to an existing requirement and shifts in design. Under the definition of scope volatility, we can also include specific issues related to testing, such as re-opening fixed defects, multiple builds handed over to a testing team and changes in configurations. These indicators not only measure the test team’s efficiency against changes in project dynamics, but also demonstrate process stability and the improvements required in the existing process. The following diagram shows the scope volatility caused by different reasons.
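One conventional way to quantify the churn described above is the sum of added, deleted and changed requirements relative to the signed-off baseline; the paper gives no formula, so this is a hedged sketch with illustrative numbers:

```python
def scope_volatility_pct(baseline_reqs: int, added: int, deleted: int, changed: int) -> float:
    """Requirement churn relative to the signed-off baseline."""
    return (added + deleted + changed) / baseline_reqs * 100.0

print(scope_volatility_pct(200, 14, 6, 10))   # 15.0 -> 15% of the baseline churned
```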

Origin of Defects

This is a ‘Test Effectiveness’ indicator showing the software development phase or activity in which the defect occurred. This indicator helps in analyzing:

1. Major problem areas in project/program or company-wide projects/programs,

2. Patterns in defects

3. Scope to build a baseline that characterizes errors, faults or failures



4. Patterns associated with defects that will support prediction

The classification of software development phases contributing to defects is shown in the example below.

[Pie chart: classification of defects by originating phase: Requirement, Specification, Design, Code, Environment, Others]

[Bar chart: scope volatility by cause: Requirement Change, Design, Multiple builds for same version, Configuration changes]


Defect Priority

This indicator provides insight into the quality of the product under test. High-priority defects indicate low product quality. This information helps project stakeholders to make release decisions based on the number of defects and their priority levels. Defect priorities may be classified as “Critical”, “Major”, “Minor”, “Medium” and “Low”, and must be applied consistently across different projects in an organization.

Defect Distribution Across Components

This indicator gives information about how defects are distributed across the various components of the system. We also assign priorities to different components or subsystems, helping project stakeholders to assess risks and address issues in components or subsystems that are critical to the product ahead of others.

Examples of Product Quality Metrics Used by Idexcel


[Pie chart: example defect distribution by priority: Critical, Major, Minor, Medium, Low]

Time to Fix Defects

This indicator gives insight into the effort required to fix defects by priority. It is the time elapsed between reporting a defect and closing the defect after retesting. Idexcel represents this with two charts: a bar chart with current values for each priority, and the historical trend across all priorities. This is a key indicator of the maintainability of the product and serves as an input to project maintenance costs and patch schedules.
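The elapsed time per defect, and its average within a priority band, can be sketched as follows (timestamps and the helper name are illustrative assumptions):

```python
from datetime import datetime
from statistics import mean

def hours_to_fix(reported: str, closed: str) -> float:
    """Elapsed hours between reporting a defect and closing it after retesting."""
    fmt = "%Y-%m-%d %H:%M"
    delta = datetime.strptime(closed, fmt) - datetime.strptime(reported, fmt)
    return delta.total_seconds() / 3600

# Average time to fix for one priority band.
critical = [
    hours_to_fix("2014-11-03 09:00", "2014-11-04 09:00"),   # 24 hours
    hours_to_fix("2014-11-05 10:00", "2014-11-05 22:00"),   # 12 hours
]
print(mean(critical))   # 18.0 -> average hours to close a critical defect
```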

Status Indicators

These indicators give the status of test execution per unit of time, and give management the overall status of test activity and product quality. They include the “number of test cases”, “number of test cases executed”, “number of test cases passed” and “number of test cases failed”.

Defect Arrival Rate

This indicator gives a high-level status of active defects for a project along with daily defect open rates, showing the trend in the quality of the product. Ideally, reported defects should show a declining trend over time, especially when nearing the UAT/production release. Management and project stakeholders can make product release decisions based on this indicator.
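The declining-trend check described above can be sketched as a simple comparison across measurement periods (the counts below are illustrative):

```python
def declining_trend(new_defects_per_period: list) -> bool:
    """True when each period reports no more new defects than the previous one."""
    return all(b <= a for a, b in zip(new_defects_per_period, new_defects_per_period[1:]))

print(declining_trend([9, 7, 4, 2, 1]))   # True  -> trending toward release readiness
print(declining_trend([3, 5, 2, 6]))      # False -> product quality not yet stabilizing
```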

Cumulative Defects by Status

This indicator helps to evaluate the number of defects by status in a particular release. It can ensure that all known defects have been corrected and that action plans are in place for open defects. Combined with the Defect Removal Rate indicator, it supports better release decisions.



Examples of decision criteria we set for a particular program are:

1. Zero open critical defects

2. Less than 5% open defects, with stakeholders’ approval and workarounds

3. Open minor defects are less than the agreed numbers

4. Defect arrival rate shows a declining trend towards zero over the last few measurement periods.
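The four example criteria above can be combined into a single release-readiness check. This is a sketch only: the thresholds are the paper's examples, the function name and inputs are assumptions, and criterion 2 additionally requires stakeholder approval and workarounds, which a script cannot verify:

```python
def ready_for_release(open_by_priority: dict, total_defects: int,
                      agreed_minor_limit: int, arrivals: list) -> bool:
    """Evaluate the four example decision criteria listed above."""
    open_total = sum(open_by_priority.values())
    return (
        open_by_priority.get("Critical", 0) == 0                      # 1. zero open critical defects
        and open_total < 0.05 * total_defects                         # 2. less than 5% open defects
        and open_by_priority.get("Minor", 0) <= agreed_minor_limit    # 3. minor defects within agreement
        and all(b <= a for a, b in zip(arrivals, arrivals[1:]))       # 4. declining arrival rate
    )

print(ready_for_release({"Critical": 0, "Major": 1, "Minor": 3},
                        total_defects=120, agreed_minor_limit=5,
                        arrivals=[6, 4, 2, 1]))   # True
```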

The metrics and performance indicators discussed so far address the needs of the testing process and product quality. However, we believe that successful customer experience management requires the collection and analysis of different types of customer-related metrics. Customer-related metrics help us to manage customer relationships and identify opportunities for improving customer satisfaction.

Some of the customer related metrics are discussed below.

Examples of Customer Related Indicators Used by Idexcel

Customer Problem Metrics

These metrics measure the problems that customers encounter while using the product. We consider all problems encountered by our customers while using the software product as problems with the software, not just the valid defects. Problems that are not valid defects could be usability problems, ambiguous documentation, etc. By carefully analyzing these metrics, we can:


• Improve the software development life cycle processes and reduce product defects

• Reduce the non-defect-oriented problems by improving all aspects of a product (such as usability and documentation), customer education and support.

Financial Performance Index

These metrics help us achieve a lower targeted level of budget spend on testing and help us turn testing into a more efficient process.


Conclusion

Continuous improvement is the key to success for any process. Well-defined test performance indicators provide a methodical approach to assessing the efficiency and effectiveness of current test processes, and significantly improve the testing process in terms of coverage, time and quality.


About the Author

Harsha B N works as a Test Architect in the Mobility division of Idexcel. He has twelve years of experience in developing and testing mobile applications. Prior to joining Idexcel, Harsha worked with Nokia for eight years in various capacities as Program Manager, Chief Test Engineer and Project Manager, working on OTA infrastructure development, Mobile Payments services and the S60 SDK.

About Idexcel

Idexcel is an innovative provider of IT products and services focused on emerging technologies. We help world-leading companies build efficiencies and stronger businesses. With more than 15 years in existence, Idexcel’s main focus is client satisfaction and technology innovation. Our industry expertise and a global, collaborative workforce form the backbone of our services. We offer a high degree of skill in Enterprise Applications, Cloud Services, Data Warehousing, Big Data, Analytics, QA & Testing Services, IT Consulting and Staffing. The Idexcel product line includes NDS, ERP and Cync, a revolutionary credit monitoring application for manufacturing and financial management. For more information, visit www.idexcel.com.

Global Headquarters
459 Herndon Parkway, Suite 11
Herndon, VA 20170
Tel: 703-230-2600
Fax: 703-467-0218
Email: [email protected]

India Operations
“Crystal Plaza” 9, 10, 11
Bhuvanappa Layout, Hosur Road
Bengaluru – 560 029
Karnataka
Tel: +91-80-2550 8830
Email: [email protected]

© Copyright, Idexcel. All rights reserved. No part of this document may be reproduced, stored in a retrieval system, transmitted in any form or by any means, electronic, mechanical, photocopying, recording, or otherwise, without the express written permission from Idexcel. The information contained herein is subject to change without notice. All other trademarks mentioned herein are the property of their respective owners.
