ISVV Effectiveness Measurement in ESA Space Projects



Page 1: ISVV Effectiveness Measurement in ESA Space Projects


ISVV Effectiveness Measurement in ESA Space Projects

Pedro A. Barrios, Maria Hernek, Marek Prochazka
European Space Agency

NASA IV&V Workshop 11-13 September 2012

Page 2: ISVV Effectiveness Measurement in ESA Space Projects


Objective / Outline

Objective

Present the results of an ESA study to assess the effectiveness of the ISVV process carried out in the scope of ESA missions

Assessment of past ISVV projects, with the following final objectives:

– Identify what is useful in the ISVV process (i.e. what brings results)

– Identify what needs to be improved (i.e. added/removed/clarified/...)

– Make unified metrics collection an integrated part of the process

Outline

• ESA ISVV process: a quick overview

• ISVV metrics definition

• ISVV metrics collection & analysis

• Conclusions and future work

Page 3: ISVV Effectiveness Measurement in ESA Space Projects


Independent Software Verification & Validation (ISVV) by ESA

1. ISVV is required for mission- and safety-critical software (ECSS-E-40/ECSS-Q-80)

2. ISVV tasks are additional and complementary to the nominal SW supplier’s verification and validation tasks

3. ISVV tasks cover verification and validation of software requirements, design, code and tests (typically starting at SW-SRR and finishing before the SW-QR)

4. ISVV supplier is required to be an organization independent of the software supplier as well as the prime/system integrator (full technical, managerial, and financial independence)

5. Most ESA projects implement the ISVV process as an industrial contract placed by the Prime contractor

Page 4: ISVV Effectiveness Measurement in ESA Space Projects


ESA ISVV Process overview

[Process diagram: MAN. Management (MAN.PM ISVV Process Management; MAN.VV ISVV level definition), IVE. Independent Verification (IVE.TA Technical Specification Analysis; IVE.DA Design Analysis; IVE.CA Code Analysis), IVA. Independent Validation (IVA.Validation)]

• 6 STAGES grouped into 3 activities: Management (MAN), Verification (IVE) and Validation (IVA)

• Activities are composed of TASKS, and these are further split into SUBTASKS

1. Management (MAN.PM and MAN.VV) is concerned with issues such as ISVV objectives and scope, planning, roles, responsibilities, budget, communication, competence, confidentiality, schedule and ISVV level definition (to limit the scope of ISVV)

2. Technical Specification Analysis (IVE.TA) is verification of the software requirements

3. Design Analysis (IVE.DA) is verification of the SW Architectural Design and the Software Detailed Design

4. Code Analysis (IVE.CA) is verification of the SW source code

5. Validation (IVA) is testing of the SW to demonstrate that the implementation meets the technical specification

Page 5: ISVV Effectiveness Measurement in ESA Space Projects


ESA ISVV Process overview

Example of a Task/Subtask description

• Activity: Technical Specification Analysis

• Task: SW Requirements Verification

• Subtasks: T1.S1, T1.S2 … T1.S11

• Start/End Events

• Inputs/Outputs

• Methods are identified for each subtask

Some numbers:

• IVE.TA: 1 task, 11 subtasks

• IVE.DA: 3 tasks, 15/12/5 subtasks

• IVE.CA: 3 tasks, 10/5/3 subtasks

• IVA: 3 tasks, 3/3/3 subtasks

Page 6: ISVV Effectiveness Measurement in ESA Space Projects


ESA ISVV Process overview
IVE: Technical Specification Analysis

[Diagram inputs: SW Requirements Specification (PDR); Interface Control Document (PDR); System Requirements allocated to Software (SRR); SW-HW Interface Requirements (SRR)]

TA.T1: Software Requirements Verification

Subtasks: To verify

• Software Requirements external consistency with the system requirements

• Interface Requirements external consistency with the system requirements

• software requirements correctness

• consistent documentation of the software requirements

• software requirements completeness

• dependability and safety requirements

• readability of the software requirements

• timing and sizing budgets of the software requirements

• Identify test areas and test cases for Independent Validation

• that software requirements are testable

• software requirements conformance with applicable standards


Page 7: ISVV Effectiveness Measurement in ESA Space Projects


ESA ISVV Process overview
IVE: Design Analysis

[Diagram inputs: Technical Specification (PDR); SW Architectural Design (PDR); Interface Control Document (PDR)]

DA.T1: Architectural Design Verification

Subtasks: To verify

• SW architectural design external consistency with Technical Specification

• SW architectural design external consistency with Interface Control Documents

• interfaces consistency between different SW components

• architectural design correctness

• architectural design completeness

• dependability & safety of the design

• readability of the architectural design

• timing and sizing budgets of the software

• Identify test areas and test cases for independent Validation

• architectural design conformance with applicable standards

If models are produced by the SW suppliers:

• Verify tests performed on the high-level model

• Verify development, verification and testing methods and environment

• Construct model test cases & model test procedures

• Execute model test procedures


Page 8: ISVV Effectiveness Measurement in ESA Space Projects


ESA ISVV Process overview
IVE: Design Analysis

DA.T2: Detailed Design Verification

Subtasks: To verify

• detailed design external consistency with Technical Specification

• detailed design external consistency with Interface Control Documents

• detailed design external consistency with Architectural Design

• interfaces consistency between different SW components

• detailed design correctness

• detailed design completeness

• dependability & safety of design

• readability of detailed design

• timing and sizing budgets of software

• accuracy of the model (in case models are produced by the SW suppliers)

• Identify test areas and test cases for independent Validation

• detailed design conformance with applicable standards

DA.T3: Software User Manual Verification

Subtasks: To verify

• timing and sizing budgets of software

• that dependability & safety aspects on the product are specified in the SUM

• readability of the User Manual

• completeness of the User Manual

• correctness of the User Manual

Page 9: ISVV Effectiveness Measurement in ESA Space Projects


ESA ISVV Process overview
IVE: Code Analysis

CA.T1: Source Code Verification

Subtasks: To verify

• source code external consistency with Technical Specification

• source code external consistency with Interface Control Documents

• source code external consistency with Architectural Design and Detailed Design

• interfaces consistency between different SW units

• source code correctness with respect to technical specification, architectural design and detailed design

• source code readability, maintainability and conformance with the applicable standards

• dependability & safety of source code

• source code accuracy

• Identify test areas and test cases for independent Validation

• timing and sizing budgets of the software

Page 10: ISVV Effectiveness Measurement in ESA Space Projects


ESA ISVV Process overview
IVE: Code Analysis

CA.T2: Integration Test Specification and Test Data Verification

Subtasks: To verify

• consistency with Technical Specification

• consistency with Software Architectural Design

• integration test procedures correctness and completeness

• If models are produced by the SW suppliers, evaluate model verification and validation test results

• integration test reports

CA.T3: Unit Test Procedure and Test Data Verification

Subtasks: To verify

• consistency with Software Detailed Design

• unit test procedures correctness and completeness

• unit test reports

Page 11: ISVV Effectiveness Measurement in ESA Space Projects


ISVV effectiveness metrics

• The key goal of this activity is to estimate the effectiveness of the ISVV process carried out in the scope of ESA projects

• Major objective is to provide measurements and conclusions to support identification and prioritization of ISVV activities based on their ‘efficiency’

• Improving the ISVV process is an additional objective

ISVV effectiveness is calculated based on the number of findings and their acceptance and impact

Based on the number of findings, the following metrics are computed: findings per ISVV stage/task/subtask; findings per severity; findings per type; and effective findings.
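
As an illustration of how such counts fall out of a findings database, here is a minimal Python sketch; the study itself used an Excel tool (described later), so the field names and sample records below are assumptions, not the actual data model:

```python
# A minimal sketch: deriving the metrics named above from finding records.
# Field names and the sample records are illustrative assumptions.
from collections import Counter

findings = [
    {"stage": "IVE.TA", "task": "T2", "subtask": "S1", "severity": "Major",
     "type": "Correctness", "effective": True},
    {"stage": "IVE.CA", "task": "T2", "subtask": "S5", "severity": "Minor",
     "type": "Completeness", "effective": False},
]

per_stage = Counter(f["stage"] for f in findings)
per_task = Counter((f["stage"], f["task"]) for f in findings)
per_subtask = Counter((f["stage"], f["task"], f["subtask"]) for f in findings)
per_severity = Counter(f["severity"] for f in findings)
per_type = Counter(f["type"] for f in findings)

# Effective findings: those that implied a change to the software product.
effectiveness = sum(f["effective"] for f in findings) / len(findings)
print(per_stage, per_severity, per_type, f"effectiveness={effectiveness:.0%}")
```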

Page 12: ISVV Effectiveness Measurement in ESA Space Projects


Measurement Process

• Three-step activity: ISVV metrics definition / ISVV metrics collection / ISVV metrics assessment

• Industrial context:

• Measurement needs and processes initiated by ESA

• Provision of metrics performed through several small contracts granted by ESA to different ESA ISVV suppliers

• Data collection, analysis and metrics calculation performed by an ESA contractor for this activity

Page 13: ISVV Effectiveness Measurement in ESA Space Projects


Measurement Process

Data gathering, with following contents:

• SW product metrics (size in kLOC, number of requirements, criticality)

• ISVV project metrics (ISVV level, ISVV scope and stages, documentation quality at reviews)

• Findings (task, subtask, which document, type, severity, use of tools, acceptance, impact measured in number of changes)

Note: an Excel tool was used
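
For illustration, a hypothetical record layout mirroring the fields listed above; the names and types here are assumptions, not the actual spreadsheet columns:

```python
# Hypothetical schema for one collected finding, mirroring the fields listed
# above. The actual study collected these in an Excel spreadsheet.
from dataclasses import dataclass

@dataclass
class FindingRecord:
    product: str         # e.g. "LISA PF ASW"
    task: str            # e.g. "IVE.TA.T2"
    subtask: str         # e.g. "S1"
    document: str        # document against which the finding was raised
    finding_type: str    # e.g. "Correctness", "Completeness"
    severity: str        # "Major", "Minor" or "Comment/very low"
    tool_usage: str      # "Manual", "Semi-automated" or "Automated"
    accepted: bool       # whether the finding was accepted
    impact_changes: int  # impact measured in number of changes
```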

Page 14: ISVV Effectiveness Measurement in ESA Space Projects


Measurement Process

• 15 products from 5 projects

• 4 different ISVV suppliers

• The IVE effectiveness metrics are assessed (see the sketch at the end of this slide):

o Per product

o Per SW products of similar size

o In total, i.e. across all projects and SW products considered

• Analysis is performed:

o Across all stages

o Per ISVV project stage

o Per ISVV task/subtask

Findings per stage/task/sub-task, per severity, per type, Effective Findings & Tools usage

Project       Product            SIZE    TYPE
GAIA          GAIA intermediate  Big     ASW
CryoSat       CryoSat CDMU       Medium  ASW
CryoSat       CryoSat AOCS       Medium  ASW
LISA PF       LISA PF BSW        Small   BSW
LISA PF       LISA PF ASW        Big     ASW
LISA PF       LISA PF DHSW       Medium  ASW
Galileo MSF   Galileo MSF        Big     ASW
Galileo MGF   Galileo MGF        Big     ASW
Galileo IPF   Galileo IPF AF     Medium  ASW
Galileo IPF   Galileo IPF RTMC   Big     ASW
Galileo NSGU  Galileo NSGU       Big     ASW
Galileo PxSU  Galileo PxSU BSW   Medium  BSW
Galileo PxSU  Galileo PxSU ASW   Medium  ASW
ATV           ATV FAS            Big     ASW
ATV           ATV MSU            Medium  ASW

Note: Only one product classified as small
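
Continuing the earlier sketch, the per-product / per-size slicing could look like the following (records again illustrative assumptions; sizes follow the Big/Medium/Small classes above):

```python
# Sketch: grouping finding counts by product and by (size, stage), the same
# slicing the study applies per product, per size class and in total.
from collections import Counter

findings = [
    {"product": "LISA PF ASW", "size": "Big", "stage": "IVE.TA"},
    {"product": "ATV MSU", "size": "Medium", "stage": "IVE.CA"},
]  # illustrative records only

per_product = Counter(f["product"] for f in findings)
per_size_stage = Counter((f["size"], f["stage"]) for f in findings)
total = len(findings)
print(per_product, per_size_stage, total)
```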

Page 15: ISVV Effectiveness Measurement in ESA Space Projects


ISVV metrics collection & analysis (1/10)

Total Findings

• Total number of IVE findings for the 15 products within this analysis is 2492

• No clear relationship between findings & product size

[Chart: findings per product, with mean & standard deviation; red = big products, blue = medium, green = small]

Page 16: ISVV Effectiveness Measurement in ESA Space Projects


ISVV metrics collection & analysis (2/10)

Findings per stage

TA: Technical Specification Analysis
DA: Design Analysis
CA: Code Analysis

Share of total findings per stage:

• IVE.TA: 958 (39%)

• IVE.DA: 710 (28%)

• IVE.CA: 824 (33%)

[Chart: findings per stage per product]

[Stacked bar chart: findings per size (Big/Medium/Small) per stage (IVE.TA / IVE.DA / IVE.CA)]

• Although there is some variability per product, the findings split roughly into thirds across the three stages

• The majority of findings are at the TA stage for big products, and at the CA stage for small products

Page 17: ISVV Effectiveness Measurement in ESA Space Projects


ISVV metrics collection & analysis (3/10)

Findings per task (TA: Technical Specification Analysis)

Share of total findings for TA tasks:

• IVE.TA.T1 (Requirements traceability verification): 191 (21%)

• IVE.TA.T2 (Requirements verification): 702 (79%)

Findings per size for TA tasks:

Size    IVE.TA.T1  IVE.TA.T2
Big     154        408
Medium  37         267
Small   0          27

• The majority of TA-stage findings are at the TA.T2 task (Software Requirements Verification) for all products, projects and product sizes, with only one exception.

• As product size decreases, a larger share of findings is discovered at the TA.T2 task

[Chart: findings per product for TA tasks]

Page 18: ISVV Effectiveness Measurement in ESA Space Projects


ISVV metrics collection & analysis (4/10)

Findings per task (DA: Design Analysis)

Total share of findings for DA stage:

• IVE.DA.T1 (AD traceability verification): 33 (5.4%)

• IVE.DA.T2 (AD verification): 288 (47.1%)

• IVE.DA.T3 (DD traceability verification): 32 (5.2%)

• IVE.DA.T4 (DD verification): 257 (42.1%)

• IVE.DA.T5 (SUM verification): 1 (0.2%)

Findings per size for DA tasks:

Size    DA.T1  DA.T2  DA.T3  DA.T4  DA.T5
Big     10     170    22     142    1
Medium  22     113    8      96     0
Small   1      5      2      19     0

The majority of DA-stage findings are either at the DA.T2 task (Architectural Design Verification) or at the DA.T4 task (Detailed Design Verification), depending on the product.

[Chart: findings per product for DA tasks]

Page 19: ISVV Effectiveness Measurement in ESA Space Projects


ISVV metrics collection & analysis (5/10)

Findings per task (CA: Code Analysis)

Total share of findings for CA stage:

• IVE.CA.T1 (traceability of code): 12 (2%)

• IVE.CA.T2 (verification of code): 299 (56%)

• IVE.CA.T3 (integration tests verification): 185 (35%)

• IVE.CA.T4 (unit tests verification): 40 (7%)

[Chart: share of findings per product for CA tasks]

Findings per size for CA tasks:

Size    CA.T1  CA.T2  CA.T3  CA.T4
Big     1      183    85     13
Medium  4      95     82     27
Small   7      21     18     0

Overall, the majority of CA-stage findings are at the CA.T2 task (Source Code Verification), though the distribution varies product by product. CA.T3 (integration tests verification) also represents a big share.

Page 20: ISVV Effectiveness Measurement in ESA Space Projects



ISVV metrics collection & analysis (6/10)

Findings per sub-task (e.g. TA subtasks)

T1: Requirements Traceability Verification

• IVE.TA.T1.S1: 74 (39%)

• IVE.TA.T1.S2: 17 (9%)

• IVE.TA.T1.S3: 1 (1%)

• IVE.TA.T1.S4: 99 (51%)

T2: Software Requirements Verification

• IVE.TA.T2.S1: 338 (48.1%)

• IVE.TA.T2.S2: 56 (8.0%)

• IVE.TA.T2.S3: 100 (14.2%)

• IVE.TA.T2.S4: 105 (15.0%)

• IVE.TA.T2.S5: 92 (13.1%)

• IVE.TA.T2.S8: 10 (1.4%)

• IVE.TA.T2.S10: 1 (0.1%)

• Exact numbers are available for all the subtasks

• There are subtasks producing a reduced number of findings. Three possible cases: the subtask was not performed within the ISVV project, the subtask produced no findings, or data was not available for the subtask


Page 21: ISVV Effectiveness Measurement in ESA Space Projects


ISVV metrics collection & analysis (7/10)

Findings per severity

• Minor: 1443 (58%)

• Major: 908 (36%)

• Comment, very low: 141 (6%)

• Most findings are minor; major findings account for 36%.

• The proportions found across the three stages (TA, DA, CA) are similar to these numbers

Page 22: ISVV Effectiveness Measurement in ESA Space Projects


ISVV metrics collection & analysis (8/10)

Findings per type

[Stacked bar chart: findings per type per size (Big/Medium/Small); types: Correctness, Completeness, External Consistency, Internal Consistency, Technical feasibility, No Problem, N/A, Readability & Maintainability, Typo]

Most findings are of type correctness, followed by findings of type completeness

Page 23: ISVV Effectiveness Measurement in ESA Space Projects


ISVV metrics collection & analysis (9/10)

Tools usage:

• Manual: 97%

• Semi-automated: 2%

• Automated: 1%

The majority of findings were discovered manually (97% of the total) and only very few using tools, either fully automated discovery or the so-called ‘semi-automated’ approach (tools used to support the further evaluation that led to the finding)

Page 24: ISVV Effectiveness Measurement in ESA Space Projects


ISVV metrics collection & analysis (10/10)

Effective findings (ISVV findings that implied a change, improvement or correction to the software product):

• Effective: 69%

• Not effective: 31%

[Chart: accepted findings per product]

• The majority of findings are effective regardless of product and size, except for the small product, for which the majority of findings are not effective

• Majority of findings per stage are effective (72% TA & DA stages; 61% at CA stage)

• Majority of findings are effective for all severities (70% for major, 69% for minor)

Accepted findings per size:

Size    Effective  Not effective
Big     880        287
Medium  807        418
Small   27         73
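
A small sketch computing the effectiveness rate per size class from the chart values above (our reconstruction of the chart, so the exact counts should be treated as approximate):

```python
# Effectiveness rate per size class, using the per-size counts read off the
# chart above (reconstructed values, so treat as approximate).
counts = {
    "Big":    {"effective": 880, "not_effective": 287},
    "Medium": {"effective": 807, "not_effective": 418},
    "Small":  {"effective": 27,  "not_effective": 73},
}

for size, c in counts.items():
    total = c["effective"] + c["not_effective"]
    print(f"{size}: {c['effective'] / total:.0%} effective of {total} findings")

# Overall: (880 + 807 + 27) / 2492 ~ 69%, matching the pie chart above.
```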

Page 25: ISVV Effectiveness Measurement in ESA Space Projects


Conclusions (1/2)

• Total number of findings

- Measurements based on number of findings

- Focus on IVE metrics

- No correlation found between number of findings & product size

• Total number of findings per ISVV stage / task /subtask

- Stage: Roughly even distribution (39% TA, 28% DA, 33% CA)

- Task/Subtasks: Identified the tasks producing most of the findings for TA, DA, CA (e.g. ‘correctness/completeness’ subtasks produce many findings; ‘consistency’ subtasks produce some)

• Type of findings: The majority of findings are of type correctness (36%) & completeness (28%)

• Effective findings: The majority of findings (69%) are effective (i.e. implying changes/corrections to the software product)

Page 26: ISVV Effectiveness Measurement in ESA Space Projects


Conclusions (2/2)

• Severity: Most findings are minor at all stages, with 58% minor, 36% major and remaining 6% for other severity (comment, very low)

• Tools: The majority of the findings were discovered manually (97%); tools were used for only 3% of findings, mostly at the CA stage

Example: if we started an ISVV contract on a project today, we could expect, on average: 166 findings, of which 115 would be effective; out of those, 41 would be major findings, spread across the stages as: 16 TA, 11 DA, 14 CA
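
A back-of-the-envelope sketch of how these averages appear to follow from the totals reported earlier; this is our reading of the arithmetic, not a formula stated in the slides:

```python
# Rough reconstruction of the per-contract expectation from earlier totals:
# 2492 IVE findings over 15 products, 69% effective, 36% major,
# and a 39% TA / 28% DA / 33% CA stage split.
total_findings, products = 2492, 15

per_contract = total_findings / products      # ~166 findings per product
effective = per_contract * 0.69               # ~115 effective findings
major = effective * 0.36                      # ~41 major findings
per_stage = {s: major * share for s, share in
             {"TA": 0.39, "DA": 0.28, "CA": 0.33}.items()}  # ~16 / ~12 / ~14

print(round(per_contract), round(effective), round(major))
print({s: round(v) for s, v in per_stage.items()})
```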

Page 27: ISVV Effectiveness Measurement in ESA Space Projects


Future work

• Collect metrics for the upcoming ISVV projects.

• Analyze tasks/sub-tasks not producing many findings (they might need better explanation within the ISVV guide, or a review of the methods and tools proposed for performing them, …)

• Analyze Independent Validation:

o Define useful metrics for IVA and assess IVA effectiveness

o Extend the scope of IVA, to cover Qualification & Acceptance of the SW and the Operational scenarios (i.e. having the operational view to create SW validation campaigns)

• Modeling:

o Some model-related sub-tasks have not been ‘profiled’

o Understand how models produced during SW development could be used during ISVV activities (e.g. Model-Based Testing techniques to produce validation campaigns)

• ISVV effectiveness Metrics:

o Other ways to measure effectiveness?

o Cost figures?

Page 28: ISVV Effectiveness Measurement in ESA Space Projects


Thanks for your attention!

For more information, please contact:

Pedro A. Barrios, European Space Agency

[email protected]

Page 29: ISVV Effectiveness Measurement in ESA Space Projects


Back-up Slides

Page 30: ISVV Effectiveness Measurement in ESA Space Projects


Findings per sub-task (1/3)

IVE: Technical Specification Analysis

TA.T1: Software Requirements Verification

(===) IVE.TA.T1.S1: Verify Software Requirements external consistency with the system requirements

(===) IVE.TA.T1.S2: Verify Interface Requirements external consistency with the system requirements

(+++) IVE.TA.T1.S3: Verify software requirements correctness

(===) IVE.TA.T1.S4: Verify the consistent documentation of the software requirements

(+++) IVE.TA.T1.S5: Verify software requirements completeness

(+++) IVE.TA.T1.S6: Verify the dependability and safety requirements

(+++) IVE.TA.T1.S7: Verify the readability of the software requirements

(---) IVE.TA.T1.S8: Verify the timing and sizing budgets of the software requirements

(---) IVE.TA.T1.S9: Identify test areas and test cases for Independent Validation

(---) IVE.TA.T1.S10: Verify that the software requirements are testable

(---) IVE.TA.T1.S11: Verify software requirements conformance with applicable standards

High-level view of number of findings per sub-task

Legend:

(+++): Subtask producing a considerable number of findings

(===): Subtask producing some findings

(---): Subtask producing a reduced number of findings

(xxx): Metrics not available for that subtask

Page 31: ISVV Effectiveness Measurement in ESA Space Projects


Findings per sub-task (2/3)

IVE: Design Analysis

DA.T1: Architectural Design Verification

(===) IVE.DA.T1.S1: Verify the SW architectural design external consistency with the Technical Specification

(---) IVE.DA.T1.S2: Verify the SW architectural design external consistency with the Interface Control Documents

(===) IVE.DA.T1.S3: Verify interfaces consistency between different SW components

(===) IVE.DA.T1.S4: Verify architectural design correctness

(===) IVE.DA.T1.S5: Verify architectural design completeness

(===) IVE.DA.T1.S6: Verify the dependability & safety of the design

(+++) IVE.DA.T1.S7: Verify the readability of the architectural design

(===) IVE.DA.T1.S8: Verify the timing and sizing budgets of the software

(---) IVE.DA.T1.S9: Identify test areas and test cases for independent Validation

(---) IVE.DA.T1.S10: Verify architectural design conformance with applicable standards

(xxx) IVE.DA.T1.S11: Verify the tests performed on the high-level model

(xxx) IVE.DA.T1.S12: Verify the development and verification and testing methods and environment

(xxx) IVE.DA.T1.S13: Construct model test cases

(xxx) IVE.DA.T1.S14: Construct model test procedures

(xxx) IVE.DA.T1.S15: Execute model test procedures

DA.T2: Detailed Design Verification

(---) IVE.DA.T2.S1: Verify the detailed design external consistency with the Technical Specification

(---) IVE.DA.T2.S2: Verify the detailed design external consistency with the Interface Control Documents

(---) IVE.DA.T2.S3: Verify the detailed design external consistency with the Architectural Design

(+++) IVE.DA.T2.S4: Verify interfaces consistency between different SW components

(===) IVE.DA.T2.S5: Verify detailed design correctness

(===) IVE.DA.T2.S6: Verify detailed design completeness

(+++) IVE.DA.T2.S7: Verify the dependability & safety of the design

(---) IVE.DA.T2.S8: Verify the readability of the detailed design

(===) IVE.DA.T2.S9: Verify the timing and sizing budgets of the software

(xxx) IVE.DA.T2.S10: Verify the accuracy of the model (in case models are produced by the SW suppliers)

(---) IVE.DA.T2.S11: Identify test areas and test cases for independent Validation

(---) IVE.DA.T2.S12: Verify detailed design conformance with applicable standards

Page 32: ISVV Effectiveness Measurement in ESA Space Projects


Findings per sub-task (3/3)

DA.T3: Software User Manual Verification

(---) IVE.DA.T3.S1: Verify the timing and sizing budgets of the software

(---) IVE.DA.T3.S2: Verify the dependability & safety aspects on the product are specified in the SUM

(---) IVE.DA.T3.S3: Verify the readability of the User Manual

(---) IVE.DA.T3.S4: Verify the completeness of the User Manual

(---) IVE.DA.T3.S5: Verify the correctness of the User Manual

IVE: Code Analysis

CA.T1: Source Code Verification

(---) IVE.CA.T1.S1: Verify source code external consistency with Technical Specification

(---) IVE.CA.T1.S2: Verify source code external consistency with Interface Control Documents

(---) IVE.CA.T1.S3: Verify source code external consistency with Architectural Design and Detailed Design

(---) IVE.CA.T1.S4: Verify interfaces consistency between different SW units

(+++) IVE.CA.T1.S5: Verify source code correctness with respect to technical specification, architectural design & detailed design

(+++) IVE.CA.T1.S6: Verify the source code readability, maintainability and conformance with the applicable standards

(+++) IVE.CA.T1.S7: Verify the dependability & safety of the source code

(---) IVE.CA.T1.S8: Verify the accuracy of the source code

(---) IVE.CA.T1.S9: Identify test areas and test cases for independent Validation

(===) IVE.CA.T1.S10: Verify the timing and sizing budgets of the software

CA.T2: Integration Test Specification and Test Data Verification

(===) IVE.CA.T2.S1: Verify consistency with Technical Specification

(---) IVE.CA.T2.S2: Verify consistency with Software Architectural Design

(+++) IVE.CA.T2.S3: Verify integration test procedures correctness and completeness

(xxx) IVE.CA.T2.S4: If models are produced by the SW suppliers, evaluate model verification and validation test results

(xxx) IVE.CA.T2.S5: Verify integration test reports

CA.T3: Unit Test Procedure and Test Data Verification

(---) IVE.CA.T3.S1: Verify consistency with Software Detailed Design

(===) IVE.CA.T3.S2: Verify unit test procedures correctness and completeness

(xxx) IVE.CA.T3.S3: Verify unit test reports

Page 33: ISVV Effectiveness Measurement in ESA Space Projects


IVA: Independent Validation

IVA.T1: Identification of Test Cases

• IVA.T1.S1: Evaluate Task Input Inspection

• IVA.T1.S2: Perform Analysis

• IVA.T1.S3: Writing Independent Validation Test Plan

IVA.T2: Construction of Test Procedures

• IVA.T2.S1: Achieve knowledge about the SVF

• IVA.T2.S2: Implement Test Cases into Test Procedures

• IVA.T2.S3: Updating the Independent Validation Test Plan

IVA.T3: Execution of Test Procedures

• IVA.T3.S1: Execute the Test Procedures

• IVA.T3.S2: Investigation of failed tests

• IVA.T3.S3: Produce Test Report