TRANSCRIPT
NASA IV&V Facility
Software Independent Verification and Validation (IV&V)
NASA IV&V Facility, Fairmont, West Virginia
Judith N. Bruner, Acting Director, 304-367-8202
Content
• Why are we discussing IV&V?
• What is IV&V?
• How is IV&V done?
• IV&V process
• Why perform IV&V?
• Summary
• Points of Contact
Why are we discussing IV&V?
Setting the Stage
In the 90s, the Commanding General of the Army's Operational Test and Evaluation Agency noted that 90 percent of systems that were not ready for scheduled operational tests had been delayed by immature software.
Software "Chaos"
The Standish Group examined 8,380 software projects:
• 16% Successful: in budget, on time, meets requirements, user involved
• 53% "Challenged": over budget by 189%, late by 222%, missing 39% of capabilities
• 31% Cancelled during development
Note: for large companies, 9% were successful; 61.5% were challenged (over budget by 178%, late by 230%, missing 58% of capabilities); 29.5% were cancelled.
Error Densities
[Chart: distribution of errors by phase - 68% Requirements/Specification, 23% Design & Implementation, 9% Installation & Commissioning]
Increasing Cost of Changes
(Normalized to Requirements Phase)
The cost to correct a software error multiplies during the development lifecycle.
[Chart: cost scale factor by lifecycle phase]
What is IV&V?
Independent Verification and Validation (IV&V)
• Independent
– Technical: IV&V prioritizes its own efforts
– Managerial: independent reporting route to Program Management
– Financial: budget is allocated by the program and controlled at a high level such that IV&V effectiveness is not compromised
• Verification (Are we building the product right?)
– The process of determining whether the products of a given phase of the software development cycle fulfill the requirements established during the previous phase
– Is each product internally complete, consistent, and correct enough to support the next phase?
• Validation (Are we building the right product?)
– The process of evaluating software throughout its development to ensure compliance with software requirements. This process ensures:
• Expected behavior when subjected to anticipated events
• No unexpected behavior when subjected to unanticipated events
• The system performs to the customer's expectations under all operational conditions
Independent Verification & Validation
Software IV&V is a systems engineering process employing rigorous methodologies to evaluate the correctness and quality of the software product throughout the software life cycle, adapted to the characteristics of the target program.
How is IV&V done?
IV&V Activities Throughout Lifecycle
Requirements Phase (Verify)
• System Reqts Analysis
• S/W Reqts Analysis
• Interface Analysis
• Process Analysis
• Technical Reviews & Audits
Design Phase (Verify)
• Design Analysis
• Interface Analysis
• Test Program Analysis
• Supportability Analysis
• Process Analysis
• Technical Reviews & Audits
Code Phase (Verify)
• Code Analysis
• Test Program Analysis
• Supportability Analysis
• Process Analysis
• Technical Reviews & Audits
Test Phase (Validate)
• Test Program Analysis
• Independent Test
• Supportability Analysis
• Technical Reviews & Audits
Across all phases: Catastrophic/Critical/High Risk Functions List, Traceability Analysis, Issues Tracking, Metrics Assessment, Loading Analysis, Change Impact Analysis, Special Studies
IV&V Life Cycle Functions
• The IV&V process provides tools and analysis procedures appropriate to each phase of the software development life cycle:
– Formulation Phase:
• Is the development process sound, repeatable, and managed?
– Requirements Phase:
• Verify that system and software requirements are correct, complete, traceable, and testable
• Analyze system-level requirements: are test plans and acceptance criteria sufficient to validate system requirements and operational needs?
• Are testing methods sufficient to verify and validate software requirements?
• Are the correct software development, management, and support processes in place?
– Design Phase:
• Does the design support the requirements?
• Are test plans and test environments sufficient to verify and validate software and operational requirements?
• Does the design have any characteristics that will cause it to fail under operational scenarios? What solutions are appropriate?
IV&V Life Cycle Functions (cont.)
• Typical IV&V functions by software life-cycle phase (cont.):
– Coding Phase:
• Does the code reflect the design?
• Is the code correct?
• Verify that test cases trace to and cover software requirements and operational needs
• Verify that software test cases, expected results, and evaluation criteria fully meet testing objectives
• Analyze selected code unit test plans and results to verify full coverage of logic paths, range of input conditions, error handling, etc.
– Test Phase:
• Analyze correct dispositioning of software test anomalies
• Validate software test results against acceptance criteria
• Verify tracing and successful completion of all software test objectives
– Operational Phase:
• Verify that regression tests are sufficient to identify adverse impacts of changes
IV&V Testing Involvement
• IV&V identifies deficiencies in the program's test planning
• The program changes its procedures to address the deficiencies, rather than IV&V testing independently
• IV&V may independently test highly critical software using an IV&V testbed:
– Whitebox
– Stress
– Endurance
– Limit
• The developer is motivated to show the software works
• IV&V attempts to break the software
IV&V Process
IV&V Process
[Diagram: IV&V planning and execution flow]
Planning:
• Program: integrates IV&V into the program, provides IV&V funding, resolves exception issues
• Developer: reflects IV&V in the program management plan, agrees to a data transfer plan, reflects the agreement in subcontracts
• IV&V: defines IV&V scope and objectives (inputs: assessment information, IV&V proposal, agreement, CARA results)
Execute (IV&V in phase with development: Requirements, Design, Code, Test):
• Normal: introduce issues at the lowest level, allow the developer time to respond; issue resolved
• Exception: introduce issues at the lowest level, allow the developer time to respond; if no resolution, take the issue to the program
IV&V Scope
• Scope is determined so as to minimize risk within the program's IV&V budget. Effort is based on:
– Criticality and risk of system functions performed/managed by software
– Budget limitations
[Flowchart: specifications, program goals, and development plans/schedules feed a Criticality Analysis and Risk Assessment; resource requirements are estimated against the program's IV&V budget; if acceptable, the IV&V Plan is produced; if not, scope is revised (breadth vs. depth, exceptions) and the estimate repeated]
CARA Scoring Methodology
Criticality (Catastrophic=4, Critical=3, Moderate=2, Low=1):
Rated categories: Performance and Operations, Safety, Cost/Schedule -> Average Criticality
Risk (High=3, Moderate=2, Low=1):
Rated categories: Complexity, Technology Maturity, Reqts Definition & Stability, Testability, Developer Experience -> Average Risk
The average criticality and average risk combine into a CARA score for each software function, which sets the IV&V Analysis Level (IAL) thresholds:
None: 1 < CARA < 2
Limited (L): 2 < CARA < 5
Focused (F): 5 < CARA < 8
Comprehensive (C): 8 < CARA < 12
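The scoring above can be sketched in a few lines of Python. This is a hedged illustration, not the Facility's actual tooling: the slides give a 1-4 criticality scale, a 1-3 risk scale, and a CARA score spanning 1-12, so multiplying the two averages is an assumption consistent with that range rather than a documented formula, and the example ratings are hypothetical.

```python
# Sketch of CARA scoring -> IV&V Analysis Level (IAL) mapping.
# Assumption: CARA score = average criticality (1-4) x average risk (1-3),
# which matches the 1-12 range and thresholds shown on the slide.
from statistics import mean

def cara_score(criticality_ratings, risk_ratings):
    """Combine per-category criticality and per-driver risk ratings."""
    return mean(criticality_ratings) * mean(risk_ratings)

def ial_for(score):
    """Map a CARA score to the IAL thresholds from the slide."""
    if score < 2:
        return "None"
    if score < 5:
        return "Limited (L)"
    if score < 8:
        return "Focused (F)"
    return "Comprehensive (C)"

# Hypothetical function rated Catastrophic/Critical/Moderate on the three
# criticality categories and High/Moderate/Moderate/Low/Low on the five
# risk drivers.
score = cara_score([4, 3, 2], [3, 2, 2, 1, 1])
print(score, ial_for(score))  # 3.0 * 1.8 = 5.4 -> Focused (F)
```

A function scoring in the Focused band would receive the Focused-level analyses listed in the IAL tables that follow.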
CARA Criticality
Sample Criticality Evaluation Criteria
Performance and Operation:
• Catastrophic (4): Failure could cause loss of use of the system for an extended time, or loss of capability to perform all mission objectives. Failure is not ameliorated.
• Critical (3): Failure could cause loss of a critical function not resulting in loss of system use, lengthy maintenance downtime, or loss of multiple mission objectives. Failure is partially ameliorated.
• Moderate (2): Failure could cause loss of a single mission objective or reduction in operational capability. Failure is fully ameliorated.
• Low (1): Failure could cause inconvenience (e.g., rerun of programs, computer reset, manual intervention).
Safety:
• Catastrophic (4): Failure could result in loss of life or system or cause severe personal injury.
• Critical (3): Failure could result in non-disabling personal injury, serious occupational illness, or loss of emergency procedures.
• Moderate (2): Failure could result in minor injury.
• Low (1): No safety implications.
Development Cost/Schedule:
• Catastrophic (4): Failure could result in cost overruns large enough to result in unachievable operational capability.
• Critical (3): Failure could result in large cost and schedule overruns. Alternate means to implement the function are not available.
• Moderate (2): Failure results in significant schedule delay. Alternate means to implement the function are available but at reduced operational capability. Full operational capability is delayed.
• Low (1): Failure results in minor impact to cost and schedule. Problems are easily corrected with insignificant impact to cost and schedule.
CARA Risk
Sample Risk Driver Criteria
Complexity:
• High (3): highly complex control/logic operations; unique devices/complex interfaces; many interrelated components; function uses a different sensor/effector set in different modes or stages
• Moderate (2): moderately complex control/logic; may be device dependent; moderately complex interfaces; several interrelated components; function has different behavior in different modes or stages
• Low (1): simple control/logic; not device dependent; function applies to a single mode or stage
Maturity of Technology:
• High (3): new/unproven algorithms, languages & support environments; high probability for redesign; little or no experience base
• Moderate (2): proven on other systems with a different application; moderate experience base
• Low (1): proven on other systems with the same application; mature experience
Requirements Definition & Stability:
• High (3): rapidly changing, baselines not established; many organizations required to define requirements; much integration required; high degree of international interaction
• Moderate (2): potential for some changes; some integration required; little interaction with international components
• Low (1): solid requirements with little potential for change; little to no integration required; no interaction with international components
Testability:
• High (3): difficult to test; requires much data analysis to determine acceptability of results; many operational environments and inputs
• Moderate (2): requires some test data analysis to determine acceptability of results; moderate amount of operational environments and inputs
• Low (1): acceptability of test results easily determined; few operational environments and inputs
Requirements Analysis IALs
Activity (applicable IALs):
• Verify documentation meets intended purpose, has appropriate detail and all necessary elements (L, F, C)
• Validate ability of requirements to meet system needs (L, F, C)
• Verify traceability to and from parent requirements (L, F, C)
• Analyze data/adaptation requirements (L, F, C)
• Analyze testability, qualification requirements (L, F, C)
• Analyze data flow, control flow, moding and sequencing (L, F, C)
• Assess development metrics (L, F, C)
• Analyze development risks/mitigation plans (L, F, C)
• Analyze timing and sizing requirements (L, F, C)
• Review developer timing/sizing/loading engineering analysis (F, C)
• Perform engineering analysis of key algorithms (F, C)
• Review/use developer prototypes or dynamic models (F, C)
• Develop alternative static representations (diagrams, tables) (F, C)
• Develop prototypes or models (C)
• Perform timing/sizing/loading analysis (C)
• Apply formal methods (C)
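The bidirectional traceability activity above ("verify traceability to and from parent requirements") can be illustrated with a small sketch. The requirement IDs and trace maps here are hypothetical; real IV&V analysis would work from the program's requirements database.

```python
# Sketch of a bidirectional requirements traceability check.
# A gap in either direction is an IV&V finding: an orphan child requirement
# has no parent justification; a childless parent was never flowed down.

def trace_gaps(parent_to_child, child_to_parent):
    """Return (orphans, childless) requirement ID lists."""
    orphans = sorted(c for c, parents in child_to_parent.items() if not parents)
    childless = sorted(p for p, children in parent_to_child.items() if not children)
    return orphans, childless

# Hypothetical system-level (SYS) and software-level (SWR) requirements.
parent_to_child = {"SYS-1": ["SWR-10", "SWR-11"], "SYS-2": []}
child_to_parent = {"SWR-10": ["SYS-1"], "SWR-11": ["SYS-1"], "SWR-12": []}

orphans, childless = trace_gaps(parent_to_child, child_to_parent)
print(orphans)    # ['SWR-12'] - software requirement with no parent
print(childless)  # ['SYS-2'] - system requirement never flowed down
```

In practice the same check is repeated at each phase boundary (system to software requirements, requirements to design, design to code), matching the "Verify Traceability" rows of the IAL tables.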
Design Analysis IALs
Activity (applicable IALs):
• Verify documentation meets intended purpose, has appropriate detail and all necessary elements (L, F, C)
• Validate ability of design to meet system needs (L, F, C)
• Verify traceability to and from requirements (L, F, C)
• Analyze database design (L, F, C)
• Analyze design testability, qualification requirements (L, F, C)
• Analyze design data flow, control flow, moding, sequencing (L, F, C)
• Analyze control logic, error/exception handling design (L, F, C)
• Assess design development metrics (L, F, C)
• Analyze development risks/mitigation plans (L, F, C)
• Review developer timing/sizing/loading engineering analysis (F, C)
• Perform design analysis of select critical algorithms (F, C)
• Review/use developer prototypes or dynamic models (F, C)
• Develop alternative static representations (diagrams, tables) (F, C)
• Develop prototypes or models (C)
• Perform timing/sizing/loading analysis (C)
• Apply formal methods (C)
Code Analysis IALs
Activity (applicable IALs):
• Verify documentation meets intended purpose, has appropriate detail and all necessary elements (L, F, C)
• Verify traceability to and from design (L, F, C)
• Verify architectural design compliance (structure, external I/O, & CSCI executive moding, sequencing & control) (L, F, C)
• Verify supportability and maintainability (L, F, C)
• Assess code static metrics (L, F, C)
• Verify CSU & CSC level logical structure and control flow (L, F, C)
• Verify internal data structures and data flow/usage (F, C)
• Verify error and exception handling (F, C)
• Verify code & external I/O data consistency (F, C)
• Review code compilation results & syntax checking (F, C)
• Verify correct adaptation data & ability to reconfigure (F, C)
• Verify correct operating system & run time libraries (F, C)
• For select algorithms, verify correctness and stability under full range of potential input conditions (C)
• Verify code data compliance with data dictionary (C)
• Verify compliance with coding standards (C)
Test Analysis IALs
Activity (applicable IALs):
• Analyze system-level verification requirements to verify that test definition, objectives, plans and acceptance criteria are sufficient to validate system requirements and operational needs associated with CCHR functions (L, F, C)
• Verify Software Test Plan qualification testing methods and plans are sufficient to validate software requirements and operational needs (L, F, C)
• Verify test case traceability and coverage of software requirements, operational needs, and capabilities (L, F, C)
• Verify software STD test case definition inputs, expected results, and evaluation criteria comply with STP plans and testing objectives (L, F, C)
• Analyze correct dispositioning of software test anomalies (L, F, C)
• Validate software test results compliance with test acceptance criteria (L, F, C)
• Verify trace and successful completion of all software test case objectives (L, F, C)
• Verify ability of software test environment plans and designs to meet software testing objectives (L, F, C)
• Verify regression tests are sufficient to determine that the software is not adversely affected by changes (L, F, C)
• Analyze STD procedures for test setup, execution, and data collection (F, C)
• Monitor execution of software testing (F, C)
• Analyze select CSC test plans, procedures, and results to verify adequate logic path coverage, testing of the full range of input conditions, error and exception handling, key algorithm stability, and performance in compliance with the design (C)
• Perform life cycle IV&V on software test environment components (C)
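The "verify test case traceability and coverage of software requirements" row above can be illustrated with a minimal sketch. The test-case and requirement IDs are hypothetical; an actual analysis would pull the trace matrix from the program's test documentation.

```python
# Sketch of a test-coverage check: which software requirements are not
# exercised by any test case? Each uncovered requirement is a potential
# IV&V issue against the test program.

def uncovered_requirements(requirements, test_traces):
    """test_traces maps test-case ID -> requirement IDs it verifies."""
    covered = {req for reqs in test_traces.values() for req in reqs}
    return sorted(set(requirements) - covered)

# Hypothetical software requirements and test-to-requirement traces.
requirements = ["SWR-10", "SWR-11", "SWR-12"]
test_traces = {"TC-001": ["SWR-10"], "TC-002": ["SWR-10", "SWR-11"]}

print(uncovered_requirements(requirements, test_traces))  # ['SWR-12']
```

The same set-difference pattern applies to the regression-test row: re-run the check against only the regression suite's traces to confirm that changed requirements remain covered.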
IV&V Is Process As Well As Product Oriented
• Program processes: software schedules, development tracking, critical path analysis, configuration management
• Ancillary developments: simulations, trainers, test environments
• Increased probability of success:
– Good processes allow early error identification and correction
– Quality documentation enhances software maintenance
IV&V Increases Program Awareness
[Timeline diagram spanning the Requirements and Design phases: IV&V holds weekly status reviews and delivers a phase-complete analysis report, giving the program identification of top risks and evaluation of program development and schedule status]
IV&V is a program level "tool" to efficiently and effectively manage software development risk.
Staffing Paradigm
[Diagram: IV&V staffing across three locations]
• Developer site: developers plus on-site IV&V staff
• Program site: program management plus on-site IV&V staff
• On-site IV&V staff serve as eyes, ears, advocates, and domain experts (validation)
• S/W IV&V Facility: critical mass of analysts and tools
Why perform IV&V?
IV&V Benefits
Technical:
• Better software/system performance
• Higher confidence in software reliability
• Compliance between specs & code
• Criteria for program acceptance
Management:
• Better visibility into development
• Better decision criteria
• Second-source technical alternative
• Reduced maintenance cost
• Reduced frequency of operational change
Summary
IV&V Key Points
• IV&V works with the Project
– Goal is project success
• IV&V is an engineering discipline
– IV&V processes are defined and tailored to the specific program
– Mission, operations, and systems knowledge is used to perform engineering analyses of system components
• IV&V is most effective when started early
– 70% of errors found in testing are traceable to problems in the requirements and design
• IV&V works problems at the lowest possible level
– Primarily works via established informal interfaces with the development organization: working groups, IPTs, etc.
– Elevates issues only when necessary
IV&V Approach Efficiently Mitigates Risk
• It is not necessary or feasible to perform all IV&V analyses on all software functions
• IV&V resources are allocated to reduce overall exposure to operational, development, and cost/schedule risks
– Software functions with higher criticality and development risk receive enhanced levels of analysis (the CARA process)
– Systems analyses are performed to reduce costly interface and integration problems
– Process analyses are performed to verify the ability to produce the desired result relative to program plans, needs, and goals
• IV&V working interfaces promote timely problem resolution
– Proactive participation on pertinent development teams
– Emphasis on early identification of technical problems
– Engineering recommendations provided to expedite solution development and implementation
Analyses Are Value-Added and Complementary, Not Duplicative
• Analyses are performed from a systems perspective considering mission needs and system use, hazards, and interfaces
– Discipline experts are assigned to perform analysis across all life cycle phases
– Horizontal specialty skills are matrixed across IV&V functional teams to verify correct systems integration
– Specialized tools and simulations perform complex analyses
• IV&V testing activities complement developer testing, enhancing overall software confidence
– Developer testing focuses on demonstrating nominal behavior; IV&V testing activities try to break the software
• The overall program integration, test, and verification approach is analyzed for completeness, integrity, and effectiveness
Why use NASA IV&V Facility?
Software IV&V, as practiced by the NASA Software IV&V Facility, is a well-defined, proven, systems engineering discipline designed to reduce the risk in major software developments.
NASA IV&V Facility Points of Contact
• Judy Bruner, Acting Director, 304-367-8202
• Bill Jackson, Deputy Director, 304-367-8215