SQA & Reuse

Page 1

SQA & Reuse

Katerina Goseva-Popstojanova, WVU
Aaron Wilson, NASA IV&V
Kalynnda Berens & Richard Plastow, GRC
Joanne Bechta Dugan, UVa
David Gilliam, JPL

Page 2

Projects

• Real-time Linux Evaluations (Kalynnda Berens & Richard Plastow, GRC)
• Performability of Web-based Applications (Katerina Goseva-Popstojanova, WVU)
• Reducing Software Security Risk through an Integrated Approach (David Gilliam & John Powel, JPL)
• Software Assurance of Web-based Applications (Tim Kurtz, GRC)
• Software Quality & Safety Assessment Using Bayesian Belief Networks (Joanne Bechta Dugan, UVa)

Page 3

Real-time Linux Evaluations

• Performance benchmarking on flight-like hardware:
  – RTLinux (free version) V3.2 pre3
  – RTLinux Pro (commercial) V2.0
  – RTAI V24.1.11
  – Linux 2.6.7 kernel (future)
  – Jaluna (future)
• RTLinux and RTAI are:
  – Stable
  – Support many processors
  – Require a learning curve
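The benchmarks in this evaluation ran against the native RTLinux/RTAI kernel APIs. As a language-neutral illustration of what one of them (harmonic timing jitter) measures, here is a minimal sketch: wake a periodic task at a fixed rate and record each wakeup's deviation from its ideal time. Plain Python, the 1 ms period, and the iteration count are assumptions for illustration only.

```python
# Minimal sketch of a harmonic timing-jitter benchmark: wake a periodic task
# and record each wakeup's deviation from its ideal time. Plain Python and
# the 1 ms period are illustrative assumptions; the GRC runs used the native
# RTLinux/RTAI kernel APIs on flight-like hardware.
import time

PERIOD_NS = 1_000_000   # nominal 1 ms period (assumed)
ITERATIONS = 1_000

def measure_jitter(period_ns=PERIOD_NS, n=ITERATIONS):
    """Return per-cycle deviations (ns) from the ideal wakeup times."""
    deviations = []
    next_wakeup = time.monotonic_ns() + period_ns
    for _ in range(n):
        delay = next_wakeup - time.monotonic_ns()
        if delay > 0:
            time.sleep(delay / 1e9)   # a real RTOS task would block on a timer
        deviations.append(time.monotonic_ns() - next_wakeup)
        next_wakeup += period_ns
    return deviations

if __name__ == "__main__":
    d = measure_jitter()
    print(f"max jitter: {max(d)} ns, mean: {sum(d) / len(d):.0f} ns")
```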

Page 4

Which Real-Time Linux is best?

Benchmark                      RTLinux Pro   RTLinux free   RTAI
User-space Task Creation       Best          Worst          OK
User-space Program Forking     OK            Best           OK
Timing Jitter (Harmonic)       Worst         Best           Good
Timing Jitter (Non-harmonic)   Worst         OK             Best
Context Switch Timing          Best          Very Good      Worst
Hardware Interrupts            OK            Worst          Best
Software Interrupts            OK            Worst          Best
Kernel Task Creation           Worst         Best           Good
Inter-task Messaging           Worst         OK             Best
Get Semaphore                  Good          Worst          Best
Get/Release Semaphore          Good          Worst          Best
Release/Get Semaphore          Good          Worst          Best
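One hedged way to turn the matrix into a single answer to the slide's question is to map each rating to a numeric score and total per OS. The score values below are an assumption, not part of the original evaluation.

```python
# Hypothetical scoring of the matrix above: map each rating to a number and
# total per OS. The weights are an assumption, not part of the GRC study.
SCORE = {"Best": 3, "Very Good": 2.5, "Good": 2, "OK": 1, "Worst": 0}

ratings = {
    "RTLinux Pro":  ["Best", "OK", "Worst", "Worst", "Best", "OK",
                     "OK", "Worst", "Worst", "Good", "Good", "Good"],
    "RTLinux free": ["Worst", "Best", "Best", "OK", "Very Good", "Worst",
                     "Worst", "Best", "OK", "Worst", "Worst", "Worst"],
    "RTAI":         ["OK", "OK", "Good", "Best", "Worst", "Best",
                     "Best", "Good", "Best", "Best", "Best", "Best"],
}

for rtos, marks in ratings.items():
    print(f"{rtos}: {sum(SCORE[m] for m in marks)}")
```

Under these arbitrary weights RTAI scores highest, consistent with its column of Best ratings, but a real selection would weight the benchmarks by mission relevance.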

Page 5

Web measurement and modeling framework

[Diagram: the framework couples measurement to models across four layers.
Measurement: Web access log analysis (user session characterization, yielding a realistic workload); application & hardware resource monitoring (software/hardware resource utilization); Web error log analysis (request-based and session-based error characterization; software/hardware failure/recovery characterization).
Layers: Session layer (user view), Service layer (software architectural view), System layer (deployment view), Resource layer (hardware device view).
Models: a performance model and a reliability/availability model combine into a performability model.]
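The session layer starts from user-session characterization of the Web access log. A minimal sketch of that step, assuming a conventional inactivity timeout; the 30-minute value and the (client, timestamp) log layout are illustrative assumptions:

```python
# Minimal sketch of user-session characterization from a Web access log.
# Field layout and the 30-minute inactivity timeout are assumptions.
from collections import defaultdict

SESSION_TIMEOUT = 30 * 60  # seconds of inactivity that ends a session (assumed)

def sessionize(requests):
    """Group (client_ip, timestamp) pairs into per-user sessions."""
    by_client = defaultdict(list)
    for ip, ts in requests:
        by_client[ip].append(ts)
    sessions = []
    for ip, times in by_client.items():
        times.sort()
        current = [times[0]]
        for t in times[1:]:
            if t - current[-1] > SESSION_TIMEOUT:
                sessions.append((ip, current))  # gap too long: close session
                current = []
            current.append(t)
        sessions.append((ip, current))
    return sessions

# Example: two clients; the second has a gap longer than the timeout.
log = [("10.0.0.1", 0), ("10.0.0.1", 60), ("10.0.0.2", 0), ("10.0.0.2", 7200)]
for ip, hits in sessionize(log):
    print(ip, "session with", len(hits), "requests")
```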

Page 6

Cost-effective way to improve quality

[Plot: Hill estimator of alpha vs. k, for k = 0 to 390]
[Bar chart: % errors (0-35%) accounted for by the top three files (File 1, File 2, File 3), for NASAPvt1-3, NASAPub1-3, CSEE 1, CSEE 2]
[Plot: number of unique errors (log scale, 1 to 100,000) vs. frequency of occurrence (bins 1, 2, 3, 4, 5, 6-10, 11-20, 21-50, 51-100, 101-250, 251-700, 701-1500, 1501-3000, 3001-5000), for CSEE 1 and CSEE 2]

10-35% of the total number of errors are due to only 3 files.
Fixing the errors with the highest frequency of occurrence is the most cost-effective way to improve Web quality.
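The left panel plots the Hill estimator of the tail index alpha against the number of order statistics k. The textbook form, which may differ in detail from the study's implementation, is alpha_hat(k) = k / sum over i = 1..k of (ln X_(i) - ln X_(k+1)), taken over the descending order statistics. A short sketch:

```python
# Textbook Hill estimator of the tail index alpha, as plotted against k on
# the slide; the study's exact variant may differ.
import math
import random

def hill_alpha(data, k):
    """alpha_hat(k) = k / sum_{i=1..k} (ln X_(i) - ln X_(k+1)),
    X_(1) >= X_(2) >= ... being the descending order statistics."""
    x = sorted(data, reverse=True)
    if not 1 <= k < len(x):
        raise ValueError("need 1 <= k < len(data)")
    s = sum(math.log(x[i]) - math.log(x[k]) for i in range(k))
    return k / s

# Sanity check on synthetic Pareto(alpha = 2) data (hypothetical, not the
# NASA/CSEE logs): the estimate should hover near 2.
random.seed(0)
sample = [random.paretovariate(2.0) for _ in range(1000)]
print(round(hill_alpha(sample, 100), 2))
```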

Page 7

Reducing Software Security Risk Through an Integrated Approach (NASA)

• Software vulnerabilities expose IT systems and infrastructure to security risks
• Goal: reduce security risk in software and protect IT systems, data, and infrastructure
• Security training for system engineers and developers
• Software Security Checklist for the end-to-end life cycle
• Software Security Assessment Instrument (SSAI), which includes:
  – Model-Based Verification
  – Property-Based Testing
  – Security Checklist
  – Vulnerability Matrix
  – A collection of security tools

[Diagram: Software Component Relationships, a fault tree in which components C1 to C4 feed AND gates And_1 and And_2 that determine the Safe/Unsafe outcome]
[Diagram: Technology Integration, showing the coverage of Vmatrix, PBT, and MC: known attacks captured in the Vmatrix/PBT libraries versus discovered attacks that have not been seen in the wild]
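The Software Component Relationships diagram is a small fault tree: components feed AND gates that determine the Safe/Unsafe outcome. A minimal evaluation sketch, with the gate wiring assumed, since the transcript does not preserve which components feed which gate:

```python
# Illustrative evaluation of the Software Component Relationships fault tree:
# components C1..C4 feed two AND gates. The wiring (C1,C2 -> And_1 and
# C3,C4 -> And_2) is an assumption; the diagram itself did not survive.
from itertools import product

def unsafe(c1, c2, c3, c4):
    """True if the assumed failure combination reaches the Unsafe state."""
    and_1 = c1 and c2          # And_1 fires only if C1 and C2 both fail
    and_2 = c3 and c4          # And_2 fires only if C3 and C4 both fail
    return and_1 or and_2      # either gate firing is Unsafe

for states in product([False, True], repeat=4):
    if unsafe(*states):
        failed = [f"C{i + 1}" for i, s in enumerate(states) if s]
        print("unsafe when failed:", ", ".join(failed))
```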

Page 8

Womb-to-Tomb Process

• Coincides with organizational policies and requirements
• Security risk mitigation process in the software lifecycle
• Software lifecycle integration
  – Training
  – Software Security Checklist
• Phase 1: provide an instrument to integrate security as a formal approach to the software life cycle
  – Requirements driven
• Phase 2: external release of software
  – Release process
  – Vulnerability Matrix (NASA Top 20)
  – Security assurance instruments
    • Early development: Model Checking / FMF
    • Implementation: Property-Based Testing
  – Security Assessment Tools (SATs)
    • Description of available SATs
    • Pros and cons of each, and related tools with web sites
• Notification process when software or systems are de-commissioned / retired

Page 9

Software Assurance of Web-based Applications

• How should NASA SA assure web-based applications?
• Solution
  – Implement the same types of controls on web-app development that are used on other types of software development
  – Audit and review projects' web-app development activities using a set of checklists
  – Pilot the guidebook/checklists
• Deliverables
  – Best Practices guidebook
  – Checklists
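Since the deliverables are a guidebook plus checklists, an audit reduces to walking a list of yes/no items and recording findings. A hypothetical sketch; the item texts are invented examples, not taken from the actual guidebook:

```python
# Hypothetical sketch of a checklist-driven web-app assurance audit; the
# item texts are invented examples, not the actual guidebook checklist.
from dataclasses import dataclass

@dataclass
class ChecklistItem:
    question: str
    satisfied: bool
    note: str = ""

audit = [
    ChecklistItem("Web-app requirements are documented and reviewed", True),
    ChecklistItem("Input validation is tested on all forms", False,
                  "search form accepts unescaped input"),
    ChecklistItem("Access and error logs are retained and reviewed", True),
]

findings = [item for item in audit if not item.satisfied]
print(f"{len(audit) - len(findings)}/{len(audit)} items satisfied")
for f in findings:
    print(f"FINDING: {f.question}: {f.note}")
```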

Page 10

Software Quality & Safety Assessment Using BBN

GETR Decision: how can we investigate and document the decision process used to go from "Is the system good enough to release?" to "I have an acceptable level of belief that the system will operate as specified" for a computer-based system?

[Diagram: evidence sources feeding the decision: Quality Assurance, Test Results, Personal and Team CMM, Prototype Performance, Requirements Review, Code Inspection, Risk Assessment, Formal Methods, Engineering Judgment]
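One way to make the GETR question concrete is to treat each evidence source as an observation that updates a prior belief that the system is good enough to release. A naive sketch using Bayes' rule with invented likelihoods; a real BBN would also model dependencies among the sources:

```python
# Naive-Bayes style combination of independent evidence sources into a
# belief that the system is good enough to release. All probabilities
# below are invented for illustration.
def update(prior, likelihood_good, likelihood_bad):
    """Posterior P(good | evidence) from one evidence item via Bayes' rule."""
    num = likelihood_good * prior
    return num / (num + likelihood_bad * (1.0 - prior))

belief = 0.5  # indifferent prior
# (P(evidence | good), P(evidence | bad)) per observed item, assumed values:
evidence = {
    "tests passed":        (0.9, 0.3),
    "code inspection ok":  (0.8, 0.4),
    "prototype performed": (0.7, 0.5),
}
for name, (lg, lb) in evidence.items():
    belief = update(belief, lg, lb)
    print(f"after {name}: P(good) = {belief:.2f}")
```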

Page 11

BBN model of Software Development Process

[Diagram: a BN model of the software development process links Specification, Design, Coding, and Testing phase BNs through causal relations. Software metrics and process measurements from the existing software development process are entered as evidence; evaluating the model yields defect content & reliability estimates plus in-process feedback (observe and fix). A unified analysis framework carries the estimates into system reliability analysis via a system-level fault tree over software failure events (SW-1, SW-2).]
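The central quantity the phase BNs estimate is residual defect content. A deterministic toy version of that flow, where each phase introduces defects and removes a fraction of those present; the rates are invented, and the actual model treats these quantities as random variables linked by causal relations:

```python
# Simplified sketch of the quantity the phase BNs estimate: residual defect
# content after each development phase. Introduction and removal rates per
# phase are invented for illustration, not measurements from the slide.
phases = [
    # (phase, defects introduced, fraction of current defects removed)
    ("Specification", 40, 0.30),
    ("Design",        60, 0.40),
    ("Coding",        90, 0.50),
    ("Testing",        5, 0.70),
]

residual = 0.0
for name, introduced, removal in phases:
    residual = (residual + introduced) * (1.0 - removal)
    print(f"after {name:13s}: {residual:6.1f} defects remain (est.)")
```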

Page 12

Technology Readiness Level

[Chart: Technology Readiness Level of each project: Software Quality & Safety, Web performability, Reducing software security risk]

Page 13

Brief description of the field

– Quality attributes: reliability, performance, security, maintainability, and reusability
– Techniques
  • Testing: property testing, performance testing
  • Real system, real workload
  • Analysis & modeling: model checking, statistical & probabilistic analysis, BBN
– Process & product

Page 14

Potential benefits

• Improved decision support, prioritization, and better allocation of resources
• A better product, delivered cost-effectively through integrated approaches
• Increased fidelity without increasing complexity

Page 15

Directions

• Increased coordination through unified approaches

• Infusion of improved techniques into current processes

• Improving the state of practice

Page 16

Why

• Potential benefits to NASA
  – Fewer mission failures
  – Reduced complexity
  – Greater reuse of software artifacts and process improvements
  – Transference of best practices and lessons learned

Page 17

Why not

• Standard traps
  – "There is no silver bullet"
  – "Teaching to the test"
  – Deadline-driven vs. quality-driven development
  – Tunnel vision
  – Dependencies on hardware and OS
  – Poor documentation and quality of data

Page 18

Who is using this technology

• NASA projects that are using this technology
  – Security checklist at JPL
  – RTLinux Pro at Glenn
  – Web performability at NASA IV&V
  – Web-based process assurance at Glenn
  – Seal of Approval process for PRA tools at NASA HQ
• Other projects outside of NASA that are using these tools/approaches
  – Web performability at LDCSEE
  – Formal security verification at Patchlink

Page 19

Questions/Issues

• Reliability, availability, performance, security
  – Integrated approaches needed
  – What are the interactions & tradeoffs?
• Process & product
• Better, Cheaper, Faster
  – Can we have it all?
  – Should we pick (any) two?