SENG 550: Software Verification and Validation V&V Processes and Techniques Prof. Bojan Cukic Lane Department of Computer Science and Electrical Engineering West Virginia University


SENG 550: Software Verification and Validation

V&V Processes and Techniques

Prof. Bojan Cukic

Lane Department of Computer Science and Electrical Engineering

West Virginia University

2

Overview
Software Inspections: today. Software Metrics: 02/21/2002. Software Reliability Engineering: 02/28/2002.

3

Agenda
The big picture. Inspection Process. Applying the Inspection Process. Utilizing Orthogonal Defect Classification. Inspection Checklists.

4

The Big Picture: V&V Principles, Foundations
V&V MUST be conducted throughout the entire life-cycle. The outcome of V&V should not be considered a binary variable. If using models, build them to satisfy certain objectives; a model's credibility is measured w.r.t. these objectives. V&V requires 'some' independence to prevent developer's bias.

5

V&V Principles (2)
V&V is difficult in general; creativity and insight are required. Credibility can be claimed ONLY for the prescribed conditions which have been validated/verified. Complete model testing is not possible: testing demonstrates the presence of defects, not their absence! V&V must be planned and documented.

6

V&V Principles (3)
Errors/defects should be detected as early as possible. A V&V environment must provide for repeatability. Success in V&V of submodels (modules) does not imply overall system credibility. A well-founded problem is ESSENTIAL to the acceptability and accreditation of the V&V results.

7

V&V Techniques
Informal techniques: rely heavily on human reasoning and subjectivity. This does not imply a lack of structure or guidelines.
Static techniques: concerned with accuracy assessment based on the static design (mental execution).
Dynamic techniques: require instrumentation, execution, and analysis.
Formal techniques: based on mathematical correctness proofs.

8

V&V Techniques
Informal: inspections, documentation checks, reviews, walkthroughs.
Static: cause-effect graphs, control analysis, data analysis, fault/failure analysis, interface analysis, semantic analysis, traceability analysis.
Dynamic: acceptance tests, assertion checking, comparison tests, compliance tests, execution profiling, fault injection, interface tests, partition tests, regression tests, sensitivity analysis, special input tests, statistical techniques, visualization and animation.
Formal: induction, inductive assertions, inference, logical deduction, logical abduction, correctness proofs, convergence proofs, stability proofs.
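Assertion checking, one of the dynamic techniques listed above, can be illustrated with a minimal Python sketch: pre- and postconditions are executed alongside the code so that a violation is caught at run time. The function and its invariant are hypothetical examples, not from the slides.

```python
def integer_sqrt(n: int) -> int:
    """Integer square root via Newton's method, instrumented with assertions."""
    assert n >= 0, "precondition: n must be non-negative"
    x = n
    while x * x > n:
        x = (x + n // x) // 2
    # Postcondition (the checked assertion): x is the floor of sqrt(n).
    assert x * x <= n < (x + 1) * (x + 1), "postcondition violated"
    return x

print(integer_sqrt(17))  # 4
```

Running the program with assertions enabled turns every call into a small dynamic V&V check; disabling them (`python -O`) removes the instrumentation.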

9

Agenda
The big picture. Inspection Process. Applying the Inspection Process. Utilizing Orthogonal Defect Classification. Inspection Checklists.

10

Software Inspection Process
The goal of software inspections is to remove as many defects as possible prior to product release. Defect removal efficiency:

DRE = [Num. defects found prior to release / total faults] x 100%

Inspections contribute to high defect removal efficiency.
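The defect removal efficiency formula above is simple enough to compute directly; the counts in this sketch are invented for illustration.

```python
def defect_removal_efficiency(found_before_release: int, total_faults: int) -> float:
    """DRE = (defects found prior to release / total faults) x 100%."""
    if total_faults <= 0:
        raise ValueError("total_faults must be positive")
    return 100.0 * found_before_release / total_faults

# Hypothetical project: 92 of 100 known faults caught before release.
print(defect_removal_efficiency(92, 100))  # 92.0
```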

11

Why effective?
Most software problems can be traced to requirements. Requirements are usually written as English prose: imprecise, ambiguous, nondeterministic. Personnel training problems: requirements elicitation, analysis, negotiation, specification, validation, management, etc. Software, by its formal nature, is precise, unambiguous, deterministic (?).

12

What is an Inspection?
A rigorous, in-depth technical review. Identifies problems close to the point of their origin. Developed in the 1970s at IBM [Fagan]. Objectives: upon finding problems, assure that an agreement is reached on the course of action.

13

Inspection objectives (2)
Verify rework against predefined criteria. Provide data on product quality and process effectiveness. Build technical knowledge of team members. Increase the effectiveness of software testing. Raise the standards of excellence for software engineers.

14

Common questions
Are inspections a formal or informal V&V technique? They are formal w.r.t. the defined roles and responsibilities of participants. They are informal w.r.t. the level of mathematical rigor.

15

Common questions (2)
Inspection meeting participants:
Moderator: coordinates the process, leads discussions.
Producer: submits the artifacts being inspected.
Reader: presents the artifacts at the meeting.
Inspector(s): inspect the artifact.
Recorder: documents and records problems.
Manager: supports the organization of meetings.

16

Responsibilities: Moderator
A key player: a trained, adequately prepared individual. Technical and managerial skills required. Selects the inspection team. Ensures team members devote sufficient time to preparation. If the team is unprepared, postpones the meeting. Leads team discussions, mediates disputes. Recognizes issues, keeps the team focused. Ensures proper documentation of problems.

17

Responsibilities: Producer
Inspections are conducted to HELP him/her. Ensures product readiness. Resolves problems identified by the inspection team. Must remain objective (not defensive). At the meeting, clarifies issues that are not clear to the inspectors. No need to justify design/implementation style, unless it affects compliance with the requirements.

18

Responsibilities: Reader
Selects and describes the portions of the product that are the focus of the inspection. Diverts attention from the producer to the product. Thoroughly familiar with the product. Identifies logical chunks of the product, allowing the meeting to stay focused on one problem at a time.

19

Responsibilities: Inspectors
Selected based on knowledge and familiarity with the product. Represent a cross-section of available skills (software, marketing, manufacturing…). Look for discrepancies between the product, documentation, and standards. Inspectors are producers too (in different meetings). Focus on problem identification, not a solution. Objectivity: criticize the product, not the producer.

20

Responsibilities: Recorder
An optional role, needed if recording is a very time-consuming task. Captures and records the descriptions of noticed problems. The recorder is an inspector too. Supports the moderator by providing/recording additional information.

21

Responsibilities: Manager
Helps decide what to inspect. Must accommodate scheduling issues. Must allocate resources. Supports inspection training. May participate in the selection of moderators. Discusses the results with the moderator. Not present at actual meetings.

22

Inspections vs. Walk-Through

23

Inspection Process Attributes
Defined roles and responsibilities. Documentation supporting the process. Collection of product and process metrics: supports the analysis of global trends in the quality of the product under consideration. Inspection against the documents preceding the current artifact. Availability of supporting infrastructure: training (avoiding “why did you do it this way?” questions), planning, support of managers and supervisors.

24

What to inspect?
The producer and manager make the choice. Critical product functions. Complex modules. Modules that have been “problematic” in the past. Experience of the producer. Safety, criticality, reliability, maintainability, availability, security (integrity and confidentiality)…

25

26

Mechanics of inspections
The team must reach consensus on issues recorded as errors and defects. An error is a problem (lack of compliance with the requirements) identified at the point of origin. A defect is found beyond the point of origin (e.g., a design problem identified in code). The producer doesn’t get to vote.

27

Mechanics of inspections (2)
Inspection meetings are limited to 2 hours. Posting the results of individual inspection meetings is controversial; consider posting aggregate results instead, which supports quality improvement without personalizing the guilt. An inspection is complete when all the problem reports are closed.

28

Agenda
The big picture. Inspection Process. Applying the Inspection Process. Utilizing Orthogonal Defect Classification. Inspection Checklists.

29

Inspection Process Attributes
Inspections MUST BE an integral part of software development. The process must be defined and documented, yet flexible, allowing changes. Participants agree to follow the process. The process includes metrics collection; metrics are utilized for process modifications. Actively managed. These attributes are indicators of process maturity.

30

Corporate resistance
Management issues: support for objections, commitment of resources, schedule concerns. Software development process: does it exist? Could it accommodate the inclusion of inspections? Training? Software developers: a fear of performance reviews. Metrics collection: readiness, acceptance, focus on software quality.

31

Requirements Inspections
Objectives: Is every requirement traceable to the preceding document? Is every requirement clear and concise, internally consistent, unambiguous, testable/demonstrable?
Prerequisites: The preceding document exists and is accepted, the SRS has been internally reviewed, and a checklist is available.

32

Requirements Inspections (2)
Planning phase: diversity of inspectors' backgrounds; include clients (if possible); identification of the SRS and other documentation.
Preparation phase: self-study, inspector-producer discussions.
Inspection meeting: checking preparedness, discussing discrepancies.
Follow-up phase.

33

Design Inspection
Objectives: SRS-SDD compliance, traceability, design conformance to standards.
Prerequisites: the SRS has been inspected and completed, the SDD has been internally reviewed, checklists are available, and design documentation (CASE tools) is available.

34

Design Inspection (2)
Planning phase: inspectors' backgrounds include software engineers, QA, hardware engineers; identification of the SRS, SDD, and other documentation.
Overview meeting phase: producer's presentation; inspectors ask questions.
Preparation phase.
Inspection meeting: checking preparedness, discussing discrepancies, going through checklists.
Follow-up phase.

35

Code Inspections
Objectives and prerequisites: the SDD has been inspected and accepted; the code compiles. Expect 50-100 lines of C code per hour of preparation and 100-200 lines per hour of inspection.

36

Test Script Inspections
Objectives: accurate validation of the requirements in the SRS, taking advantage of known design decisions.
Prerequisites: internally reviewed and executed tests/scripts, checklists, acceptable test results.

37

Standardized forms
The inspection problem report form, the inspection process summary form, and checklists. See examples in Appendices C and D [S. Rakitin].

38

Agenda
The big picture. Inspection Process. Applying the Inspection Process. Utilizing Orthogonal Defect Classification. Inspection Checklists.

39

Principles of ODC
A technique that bridges the gap between quantitative and qualitative methods. It extracts semantic information from defects via classification. The quantitative progression of defect counts through a project lifecycle is shown next.

40

41

Defect-Type Attribute
Function: affects significant capability, user features, APIs; requires a formal design change.
Assignment: initializations, etc.
Interface: errors in inter-component interactions.
Checking: program logic that fails to properly validate data, loop conditions…

42

Defect-Type Attribute (2)
Timing/Serialization: shared and real-time resources.
Build/Package/Merge: library systems, version control.
Documentation: errors in publications and maintenance notes.
Algorithm: efficiency or correctness problems; requires reimplementing an algorithm or a data structure.
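Because the defect-type attribute is a closed set of categories, distributions like the ones shown later in these slides fall out of a simple counter once each report carries a type label. A minimal sketch (the sample reports are invented for illustration):

```python
from collections import Counter

# The ODC defect-type categories from the two slides above.
ODC_TYPES = {"Function", "Assignment", "Interface", "Checking",
             "Timing/Serialization", "Build/Package/Merge",
             "Documentation", "Algorithm"}

# Hypothetical classified defect reports: (report id, defect type).
reports = [
    (101, "Assignment"), (102, "Checking"), (103, "Assignment"),
    (104, "Algorithm"), (105, "Checking"), (106, "Assignment"),
]

# Reject labels outside the classification scheme, then count.
assert all(t in ODC_TYPES for _, t in reports)
distribution = Counter(t for _, t in reports)
print(distribution.most_common())
# [('Assignment', 3), ('Checking', 2), ('Algorithm', 1)]
```

This is the quantitative half of ODC; the qualitative half is choosing the label for each report.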

43

44

Defect Trigger Attribute
The activity that facilitates defect discovery.
System test triggers: recovery and exception handling, workload and stress, HW and SW configurations, etc.
Function test triggers: test coverage, test sequencing, test interaction, coverage (covered later in this class).

45

Review and Inspection Triggers
Backward release compatibility. Lateral compatibility: documentation within the same release. Design conformance. Concurrency. Operational semantics: understanding the logic flow. Document consistency and completeness. Rare situation: requires extensive experience of an inspector.

46


Agenda
The big picture. Inspection Process. Applying the Inspection Process. Utilizing Orthogonal Defect Classification. Inspection Checklists: an experiment.

51

The use of checklists
[Diagram (steps 1-7): Checklist Model, Software Defect Reports, Defect Model, Synthesized Checklist, Source Code, Source Code Model, Processor-Identified Defects.]

52

NASA Inspection Process
A peer-review process that examines work-in-progress, with the objective of finding defects!
Generic process structure: Planning, Overview, Preparation, Examination, Rework, Follow-up.
Several variations of this structure have been explored (e.g., N-fold inspections, two-person inspections, GroupWare environments).

53

Orthogonal Defect Classification (ODC) of NASA Project Defect Data
Reasons for selecting ODC: ODC is a proven classification scheme. It makes the NASA process comparable with other research efforts that have chosen to employ ODC. It is believed that the structure of ODC enables the synthesis of checklists.
Automated techniques are based on the analysis of the defect model, which reflects the latest project experience and the rules most frequently broken. Techniques that can find ‘simple’ defects will free precious time to look for more project-specific issues.

54

ODC of NASA Project Defect Data

Frequency of defect triggers: Logic/Flow 340, Language Dependency 189, Internal Document 94, Concurrency 6, Rare Situation 10.

[Bar chart: frequency of defects by type — counts 273, 161, 126, 53, 8, 5, 4, 0, 0; type labels not transcribed.]

Results from code analysis. Project (SLOC > 3 million): a distributed, real-time system used to control, monitor, and prepare processes that will be used during safety-critical events requiring high assurance.

55

Inspection Checklists
The Defect Model (ODC) provides these values: Defect Type = where to look; Defect Trigger = how to detect.
Problem: a defect with its associated defect type and trigger is vague, so synthesizing a checklist is a subjective process.
Where to look: the defect type points to certain regions of code.
How to detect: with respect to the defect type, the defect trigger shows what the inspector was concentrating on when the defect was discovered.

56

Inspection Checklists (cont)
The “where to look” component of the checklist reflects the language constructs that Assignment-type defects refer to. The “how to detect” component is derived by assessing the description of the defect and determining what was wrong; the description of the defect provides the construct.
Example: Assignment → Variables → must be initialized; MAGIC literals must be assigned to a constant.
***Future work: a complete mapping of defect type to language construct.***

57

Building an Automated Environment
An ideal mechanism to automate defect detection should consist of: the means to model the source code, and the means to search the source code in pursuit of violated rules.
The purpose of the environment: apply checklists to source code in order to automatically find ‘simple’ defects, freeing up inspectors' time to focus on more project-specific issues; and be reconfigurable to allow a change in a checklist.

58

A Prototype
These checklist items are used as examples:

Defect No. | Where to look | How to detect
1 | Variables    | Are variables not being initialized?
2 | Commenting   | Are the comments and comment percentage adequate?
3 | Variables    | Are variables being assigned to MAGIC and/or hard-coded literals?
4 | Conditionals | Are exact equality tests being used on floating-point numbers?
5 | Loops        | Does the value of the condition ever change in the header or the body of the loop?
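Items like Nos. 3 and 4 are mechanical enough for a crude automated pass. The sketch below is my own illustration in Python, using regular expressions over C-like source text; the actual prototype described here worked on a parsed model of the code (JavaML), not raw text, so treat the patterns as stand-ins.

```python
import re

def check_source(lines):
    """Flag (line_no, rule) pairs for two sample checklist items."""
    findings = []
    for no, line in enumerate(lines, start=1):
        # Item 3: assignment of a bare numeric (magic) literal; 0 and 1 allowed.
        m = re.search(r'=\s*(\d+)\s*;', line)
        if m and m.group(1) not in ("0", "1"):
            findings.append((no, "magic literal"))
        # Item 4: exact equality test against a floating-point literal.
        if re.search(r'[=!]=\s*\d+\.\d+', line):
            findings.append((no, "float equality"))
    return findings

code = [
    "int retries = 37;",      # magic literal
    "if (ratio == 0.1) { }",  # exact float comparison
    "count = 0;",             # fine
]
print(check_source(code))  # [(1, 'magic literal'), (2, 'float equality')]
```

Swapping in a different rule set is what the slide means by a reconfigurable checklist.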

59

A Prototype (2)
[Examples of the prototype detecting Defect Nos. 1-5 — not transcribed.]

60

Lessons Learned
Automation of inspections is achievable and desirable. The difficulty lies in generally stating what the inspectors are concerned about, but that is also where the value of inspections lies. Using XML and its family of technologies is widely supported and internet-based (supports distributed inspection teams). XML models of source code are LARGE: the project used during this research, 558 SLOC, generated a JavaML model of over 2 KSLOC.