Page 1

A Comparison of Information Management using Imprecise Probabilities and Precise Bayesian Updating of Reliability Estimates

Jason Matthew Aughenbaugh, [email protected]

Applied Research Laboratories, University of Texas at Austin

Jeffrey W. Herrmann, [email protected]

Department of Mechanical Engineering and Institute for Systems Research, University of Maryland

Third International Workshop on Reliable Engineering Computing, NSF Workshop on Imprecise Probability in Engineering Analysis & Design, Savannah, Georgia, February 20-22, 2008.

Page 2

Motivation

• Need to estimate the reliability of a system whose components have uncertain reliability.

• Which components should we test to reduce uncertainty about system reliability?

[Diagram: component A in series with components B and C in parallel]

Page 3

Introduction

[Flow diagram: existing information (Is it relevant? Is it accurate?) informs a prior characterization; new experiments produce data; a statistical modeling and updating approach combines the prior characterization with the data to give an updated (posterior) characterization. Small distribution plots illustrate each stage.]

Page 4

Statistical Approaches

• Compare the following approaches:
   (Precise) Bayesian
   Robust Bayesian
      • sensitivity analysis of the prior
   Imprecise probabilities
      • the actual "true" probability is imprecise
      • the imprecise beta model

These approaches have different philosophical motivations, but equivalent mathematics for this problem.

Page 5

Is precise probability sufficient?

• Problem with "equiprobable": does it mean we know nothing, or that we know the outcomes are equally likely?

• Why does it matter?
   Engineer A states that input values 1 and 2 have equal probabilities.
   Engineer B is designing a component that is very sensitive to this input.
   Should Engineer B proceed with a costly but versatile design, or study the problem further?
• Case 1: Engineer A had no idea, so stated equal probabilities. Additional study = good.
• Case 2: Engineer A performed substantial analysis. Additional study = wasteful.

Page 6

Moving beyond precise probability

• Start with well-established principles and mathematics; conclude that precise probability is insufficient

• Abandon probability completely?

• Relax conditions, extend applicability?

Think sensitivity analysis. How much do deviations from a precise prior matter?

Page 7

Robust Bayes, Imprecise Beta Model

• Instead of one prior, consider many (a set)

Conjugate model: beta model parameterized by $s$ and $t$:

$f(\theta) \propto \theta^{st-1}(1-\theta)^{s(1-t)-1}, \quad 0 < \theta < 1$

Prior knowledge:
   $t_0 \in [\underline{t}_0, \overline{t}_0]$ : prior estimate of the mean
   $s_0$ : prior "sample size"

Experiment: observe $m$ failures in $n$ trials.

Update:
   $s_n = s_0 + n$
   $\underline{t}_n = \min\{(s_0 t_0 + m)/(s_0 + n)\}$
   $\overline{t}_n = \max\{(s_0 t_0 + m)/(s_0 + n)\}$
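A minimal Python sketch of this update, assuming a fixed prior sample size s0 (the later imprecise scenarios also let s0 range over an interval); the function name and example numbers are illustrative, not from the original slides:

```python
# Sketch of the imprecise beta update (illustrative; fixed s0 for simplicity).
def update_imprecise_beta(t0_lo, t0_hi, s0, m, n):
    """Posterior mean bounds and posterior 'sample size' after m failures in n trials."""
    s_n = s0 + n
    # (s0*t0 + m)/(s0 + n) increases with t0, so the bounds sit at the interval endpoints.
    return (s0 * t0_lo + m) / s_n, (s0 * t0_hi + m) / s_n, s_n

# Example: prior mean in [0.05, 0.15] with s0 = 100, then 2 failures observed in 12 trials.
print(update_imprecise_beta(0.05, 0.15, 100, m=2, n=12))
```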

[Figure: cumulative probability of $\theta$ under three beta priors: $\alpha = 5, \beta = 95$ ($t_0 = 0.05$, $s_0 = 100$); $\alpha = 10, \beta = 90$ ($t_0 = 0.10$, $s_0 = 100$); $\alpha = 15, \beta = 85$ ($t_0 = 0.15$, $s_0 = 100$).]

Page 8

Problem Description

• A simple parallel-series system with some existing information about its components
• Assume we can test 12 more components
   How should these tests be allocated?
   A single test plan can have different outcomes

• Compare different scenarios of existing information

[Diagram: component A in series with components B and C in parallel]

Page 9

Multiple Outcomes of Experiment

• Precise probability
   Consider one outcome: test A 12 times, 2 fail
      Get one new posterior with precise parameters
   Consider all possible outcomes of testing A
      Get a new posterior for each possible outcome: a set of parameters
• Imprecise probability
   One outcome: one SET of posteriors
   Multiple outcomes: a SET of SETS of posteriors (sketched below)
• How do we measure uncertainty? How do we make comparisons and decisions?
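A minimal sketch of the bookkeeping this implies, for a single component and assuming binomial test outcomes; the helper names are illustrative, and the example prior corresponds to t0 = 0.15, s0 = 10 as in scenario 1:

```python
# Sketch: track posteriors over every possible outcome of testing one component n times
# (illustrative names; binomial outcomes assumed).

def precise_posteriors(alpha0, beta0, n):
    """Precise Bayesian: one beta(alpha, beta) posterior per possible number of failures m."""
    return {m: (alpha0 + m, beta0 + n - m) for m in range(n + 1)}

def imprecise_posteriors(t0_lo, t0_hi, s0, n):
    """Imprecise beta model: one set of posteriors per outcome, represented here by the
    extreme posterior means (fixed s0 for simplicity)."""
    s_n = s0 + n
    return {m: ((s0 * t0_lo + m) / s_n, (s0 * t0_hi + m) / s_n) for m in range(n + 1)}

# Testing component A 12 times gives 13 possible outcomes (0 to 12 failures).
# alpha = s0*t0 = 1.5 and beta = s0*(1 - t0) = 8.5 correspond to t0 = 0.15, s0 = 10.
print(precise_posteriors(alpha0=1.5, beta0=8.5, n=12)[2])      # outcome with 2 failures
print(imprecise_posteriors(0.15, 0.20, s0=10, n=12)[2])
```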

Page 10

Metrics of Uncertainty: Precise Distributions

• Variance-based sensitivity analysis ($SV_i$) (Sobol, 1993; Chan et al., 2000) (a simulation sketch follows after this list)
   the variance of the conditional expectation divided by the total variance
   focuses on the status quo and the next (local) piece of information
   testing a component with a large sensitivity index should reduce the variance of the system reliability estimate
• Mean and variance observations
• Posterior variance
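As an illustration of the variance-based index, here is a minimal Monte Carlo sketch using a standard Sobol pick-freeze estimator (not necessarily the authors' implementation). It assumes the system structure P_sys = P_A + P_B*P_C - P_A*P_B*P_C from the backup "Formulae" slide and beta priors of the kind used later in scenario 1; for those priors the estimates should land near the SV values reported on the scenario 1 results slide:

```python
import numpy as np

rng = np.random.default_rng(0)

def p_sys(p):
    """System failure probability P_A + P_B*P_C - P_A*P_B*P_C (backup 'Formulae' slide)."""
    pa, pb, pc = p[:, 0], p[:, 1], p[:, 2]
    return pa + pb * pc - pa * pb * pc

# Beta priors with alpha = s0*t0 and beta = s0*(1 - t0); values mirror scenario 1.
alpha = np.array([1.5, 0.3, 5.5])
beta = np.array([8.5, 1.7, 4.5])

N = 200_000
A = rng.beta(alpha, beta, size=(N, 3))   # independent samples of (P_A, P_B, P_C)
B = rng.beta(alpha, beta, size=(N, 3))

fA, fB = p_sys(A), p_sys(B)
var_total = fA.var()
for i, name in enumerate("ABC"):
    AB = B.copy()
    AB[:, i] = A[:, i]                   # "freeze" component i between the two samples
    # First-order index: V(E[P_sys | P_i]) / V(P_sys), pick-freeze estimate.
    sv = (np.mean(fA * p_sys(AB)) - fA.mean() * fB.mean()) / var_total
    print(f"SV_{name} ~ {sv:.3f}")
```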

Page 11

Metrics of Uncertainty: Imprecise Distributions

• Imprecise variance-based sensitivity analysis (Hall, 2006)
   Does not worry about outcomes; a local metric (a sketch of computing its bounds follows below)

• Mean and variance dispersion

• Imprecision in the mean

• Imprecision in the variance

$\underline{SV}_i = \min_{p \in F}\, SV_{i,p} \qquad \overline{SV}_i = \max_{p \in F}\, SV_{i,p}$

where $F$ is the set of probability distributions under consideration and $SV_{i,p}$ is the sensitivity index for component $i$ computed under distribution $p$.
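One brute-force way to realize these bounds is to sweep the set of priors and keep the extreme indices. The sketch below combines the closed-form SV expressions from the backup slides with a coarse grid over each component's (t0, s0) interval (scenario 3 values); the grid search and the independence of the component intervals are assumptions for illustration, not the authors' stated method:

```python
import itertools
import numpy as np

def beta_mv(t0, s0):
    """Mean and variance of a beta prior with mean t0 and 'sample size' s0."""
    return t0, t0 * (1.0 - t0) / (s0 + 1.0)

def sv_indices(params):
    """Closed-form SV_A, SV_B, SV_C (backup-slide expressions) for the system
    P_sys = P_A + P_B*P_C - P_A*P_B*P_C, given (t0, s0) for each component."""
    (eA, vA), (eB, vB), (eC, vC) = (beta_mv(t, s) for t, s in params)
    e2A, e2B, e2C = vA + eA * eA, vB + eB * eB, vC + eC * eC
    e_sys = eA + eB * eC - eA * eB * eC
    e2_sys = (e2A + 2 * eA * eB * eC - 2 * e2A * eB * eC
              + e2B * e2C - 2 * eA * e2B * e2C + e2A * e2B * e2C)
    v_sys = e2_sys - e_sys ** 2
    return np.array([(1 - eB * eC) ** 2 * vA,
                     (eC * (1 - eA)) ** 2 * vB,
                     (eB * (1 - eA)) ** 2 * vC]) / v_sys

# Scenario 3 intervals for (t0, s0) of components A, B, C; coarse grid over each box.
t_box = [(0.15, 0.20), (0.15, 0.55), (0.55, 0.60)]
s_box = [(10.0, 12.0), (2.0, 5.0), (10.0, 12.0)]
grids = [list(itertools.product(np.linspace(*t, 5), np.linspace(*s, 5)))
         for t, s in zip(t_box, s_box)]
vals = np.array([sv_indices(p) for p in itertools.product(*grids)])
print("lower bounds:", vals.min(axis=0))   # approximate lower bounds on SV_A, SV_B, SV_C
print("upper bounds:", vals.max(axis=0))   # approximate upper bounds
```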

Page 12

Scenarios with Precise Distributions

• Components have beta distributions for the prior distributions of failure probability

• Scenario 1: system failure probability
   mean = 0.2201
   variance = 0.0203
• Scenario 2: system failure probability
   mean = 0.1691
   variance = 0.0116

Scenario 1 priors:
   Component:   A      B      C
   t0:          0.15   0.15   0.55
   s0:          10     2      10

Scenario 2 priors:
   Component:   A      B      C
   t0:          0.15   0.15   0.15
   s0:          10     2      10

Scenario 2 differs from scenario 1 only in the prior mean of component C (reduced from 0.55 to 0.15).

Page 13

Scenario 1 Results

• Variance-based sensitivity analysis:
   SV_A = 0.4814
   SV_B = 0.4583
   SV_C = 0.0181

• Posterior variance:

Table 1. Posterior variance for scenario 1 (minimum and maximum across the possible test results)
   Test plan {n_A, n_B, n_C}    Min      Max
   1: {12, 0, 0}                0.0110   0.0151   (best worst-case)
   2: {0, 12, 0}                0.0117   0.0175
   3: {0, 0, 12}                0.0131   0.0291
   4: {4, 4, 4}                 0.0071   0.0195
   5: {6, 6, 0}                 0.0059   0.0181   (best best-case)
   6: {6, 0, 6}                 0.0094   0.0228
   7: {0, 6, 6}                 0.0117   0.0177

(The computation behind the Min and Max columns is sketched below.)

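The Min and Max columns can be reproduced by enumerating every possible outcome of a test plan, updating each component's beta prior, and propagating moments through the system formula from the backup slides. A minimal sketch with illustrative names; small differences from the table may remain depending on how the original values were computed:

```python
import itertools

def beta_moments(alpha, beta):
    """Mean, variance, and second moment of a beta(alpha, beta) random variable."""
    s = alpha + beta
    mean = alpha / s
    var = alpha * beta / (s * s * (s + 1.0))
    return mean, var, var + mean * mean

def system_variance(components):
    """Variance of P_sys = P_A + P_B*P_C - P_A*P_B*P_C for independent beta components."""
    (eA, _, e2A), (eB, _, e2B), (eC, _, e2C) = (beta_moments(a, b) for a, b in components)
    e_sys = eA + eB * eC - eA * eB * eC
    e2_sys = (e2A + 2 * eA * eB * eC - 2 * e2A * eB * eC
              + e2B * e2C - 2 * eA * e2B * e2C + e2A * e2B * e2C)
    return e2_sys - e_sys * e_sys

def posterior_variance_range(priors, plan):
    """Min and max posterior system variance over all outcomes of the plan (n_A, n_B, n_C)."""
    variances = []
    for fails in itertools.product(*(range(n + 1) for n in plan)):
        posterior = [(a + m, b + n - m) for (a, b), n, m in zip(priors, plan, fails)]
        variances.append(system_variance(posterior))
    return min(variances), max(variances)

# Scenario 1 priors as beta(alpha, beta) with alpha = s0*t0, beta = s0*(1 - t0).
scenario1 = [(1.5, 8.5), (0.3, 1.7), (5.5, 4.5)]
print(posterior_variance_range(scenario1, plan=(12, 0, 0)))   # roughly (0.011, 0.015)
print(posterior_variance_range(scenario1, plan=(6, 6, 0)))    # compare with the table
```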

Page 14

Scenario 1 Results

[Plot: posterior variance versus posterior mean of the system failure probability for the possible outcomes of each test plan (plans 1 through 7) under scenario 1, with the prior shown for reference. Annotations: SV_A = 0.4814, SV_B = 0.4583, SV_C = 0.0181.]

Page 15

Scenario 2 Results

• Variance-based sensitivity analysis:
   SV_A = 0.8982
   SV_B = 0.0560
   SV_C = 0.0153

• Posterior variance:

Table 2. Posterior variance for scenario 2 (minimum and maximum across the possible test results)
   Test plan {n_A, n_B, n_C}    Min      Max
   1: {12, 0, 0}                0.0042   0.0109   (best best-case and best worst-case)
   2: {0, 12, 0}                0.0115   0.0155
   3: {0, 0, 12}                0.0116   0.0218
   4: {4, 4, 4}                 0.0064   0.0158
   5: {6, 6, 0}                 0.0051   0.0145
   6: {6, 0, 6}                 0.0054   0.0160
   7: {0, 6, 6}                 0.0115   0.0145

Page 16

Scenario 2 Results

[Plot: posterior variance versus posterior mean of the system failure probability for the possible outcomes of each test plan (plans 1 through 7) under scenario 2, with the prior shown for reference. Annotations: SV_A = 0.8982, SV_B = 0.0560, SV_C = 0.0153.]

Page 17

Scenario 3: Imprecise Distributions

• Component failure probabilities are modeled using imprecise beta distributions

• System failure probability is an imprecise distribution:
   Mean: 0.2201 to 0.4640
   Variance: 0.0136 to 0.0332
• Imprecise variance-based sensitivity analysis:
   SV_A: 0.1363 to 0.7204
   SV_B: 0.2406 to 0.6960
   SV_C: 0.0116 to 0.2512

Scenario 3 priors:
   Component:   A              B              C
   t0:          0.15 to 0.20   0.15 to 0.55   0.55 to 0.60
   s0:          10 to 12       2 to 5         10 to 12

Since the failure probability of B is poorly known, we allow for a wide range.
Scenario 3 is comparable to precise scenario 1.
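Because P_sys = P_A + P_B*P_C - P_A*P_B*P_C (backup slides) is increasing in each component failure probability, the bounds on the prior system mean follow from the endpoints of the t0 intervals. A minimal sketch with illustrative names that reproduces the mean range quoted above:

```python
def system_mean(tA, tB, tC):
    """Mean system failure probability for independent components with means tA, tB, tC."""
    return tA + tB * tC - tA * tB * tC

# Scenario 3 prior-mean intervals for components A, B, C.
t_lo = (0.15, 0.15, 0.55)
t_hi = (0.20, 0.55, 0.60)

# P_sys increases in each component's failure probability, so the extremes
# of the mean occur at the endpoints of the t0 intervals.
print(system_mean(*t_lo), system_mean(*t_hi))   # approximately 0.2201 and 0.4640
```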

Page 18

Posterior Variance Analysis

Table 3. Posterior variance analysis for scenario 3

                                 Posterior variance V                                          Imprecision in V
   Test design {n_A, n_B, n_C}   Min. minimum   Max. maximum   Min. average   Max. average    Minimum   Maximum
   Prior                         0.0136         0.0332         n.a.           n.a.            0.0196
   1: {12, 0, 0}                 0.0075         0.0344         0.0094         0.0304          0.0046    0.0259
   2: {0, 12, 0}                 0.0099         0.0181         0.0103         0.0153          0.0035    0.0051
   3: {0, 0, 12}                 0.0103         0.0465         0.0134         0.0310          0.0070    0.0293
   4: {4, 4, 4}                  0.0059         0.0162         0.0075         0.0118          0.0020    0.0054
   5: {6, 6, 0}                  0.0056         0.0189         0.0083         0.0150          0.0022    0.0063
   6: {6, 0, 6}                  0.0068         0.0458         0.0107         0.0295          0.0041    0.0309
   7: {0, 6, 6}                  0.0100         0.0183         0.0109         0.0183          0.0026    0.0060

Smallest variances and smallest imprecision in the variances: test plan 4, {4, 4, 4}.

Page 19

Results for Scenario 3

[Plots: posterior variance versus posterior mean of the system failure probability for scenario 3, with the prior shown in each panel. Panel 1: sample results and the convex hulls of the results for test plans [12, 0, 0], [0, 12, 0], and [6, 6, 0]. Panel 2: convex hulls of the results for test plans [0, 0, 12], [6, 0, 6], and [0, 6, 6]. Panel 3: convex hulls of the results for test plans [0, 12, 0], [4, 4, 4], [6, 6, 0], and [0, 6, 6].]

Page 20

Scenario 4: Imprecise Distributions

• Component failure probabilities are modeled using imprecise beta distributions

• System failure probability is also an imprecise distribution:
   Mean: 0.1691 to 0.2880
   Variance: 0.0100 to 0.0173
• Imprecise variance-based sensitivity analysis:
   SV_A: 0.5438 to 0.9590
   SV_B: 0.0210 to 0.1819
   SV_C: 0.0095 to 0.2515

Scenario 4 priors:
   Component:   A              B              C
   t0:          0.15 to 0.20   0.15 to 0.55   0.15 to 0.20
   s0:          10 to 12       2 to 5         10 to 12

Compared to scenario 3, the failure probability of C is reduced.
This makes scenario 4 comparable to precise scenario 2.

Page 21

Results for Scenario 4

[Plots: posterior variance versus posterior mean of the system failure probability for scenario 4, with the prior shown in each panel. Panel 1: convex hulls of the results for test plans [12, 0, 0], [0, 12, 0], and [6, 6, 0]. Panel 2: convex hulls of the results for test plans [0, 0, 12], [6, 0, 6], and [0, 6, 6]. Panel 3: convex hulls of the results for test plans [12, 0, 0], [4, 4, 4], and [0, 6, 6].]

Page 22

Discussion / Future Work

• Multiple sources of uncertainty
   Existing knowledge
   Results of future tests
• How do we prioritize different aspects?
   Variance or imprecision reduction?
   Best case, worst case, or average case of results?
   Incorporate economic/utility metrics?
• Other imprecision/total uncertainty measures?
   "Breadth" of p-boxes (Ferson and Tucker, 2006)
   Aggregate uncertainty, others (Klir and Smith, 2001)

Page 23

Summary

• Showed how to use different statistical approaches for evaluating experimental test plans
• Used direct uncertainty metrics
   Variance-based sensitivity analysis
      • precise and imprecise
   Posterior variance
   Dispersion of the mean and variance
   Imprecision in the mean and variance

Page 24

Thank you for your attention.

• Questions? Comments? Discussion?

This work was supported in part by the Applied Research Laboratories at UT-Austin, Internal IR&D grant 07-09.

Page 25

SVi

$SV_A = \left(1 - E[P_B]E[P_C]\right)^2 V[P_A] \,/\, V[P_{sys}]$

$SV_B = \left(E[P_C]\right)^2 \left(1 - E[P_A]\right)^2 V[P_B] \,/\, V[P_{sys}]$

$SV_C = \left(E[P_B]\right)^2 \left(1 - E[P_A]\right)^2 V[P_C] \,/\, V[P_{sys}]$

Page 26

Formulae

$E[P_A] = \dfrac{\alpha_A}{\alpha_A + \beta_A}; \qquad V[P_A] = \dfrac{\alpha_A \beta_A}{(\alpha_A + \beta_A)^2 (\alpha_A + \beta_A + 1)}; \qquad E[P_A^2] = V[P_A] + \left(E[P_A]\right)^2$

(and similarly for components B and C).

The mathematical model for the reliability of the system shown in Figure 1 follows.

$R_{sys} = R_A \left[1 - (1 - R_B)(1 - R_C)\right]$

$P_{sys} = P_A + P_B P_C - P_A P_B P_C$

$E[P_{sys}] = E[P_A] + E[P_B]E[P_C] - E[P_A]E[P_B]E[P_C]$

$E[P_{sys}^2] = E[P_A^2] + 2E[P_A]E[P_B]E[P_C] - 2E[P_A^2]E[P_B]E[P_C] + E[P_B^2]E[P_C^2] - 2E[P_A]E[P_B^2]E[P_C^2] + E[P_A^2]E[P_B^2]E[P_C^2]$

$V[P_{sys}] = E[P_{sys}^2] - \left(E[P_{sys}]\right)^2$
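A quick numeric check of these formulas, plugging in the scenario 1 priors (alpha = s0*t0, beta = s0*(1 - t0)); a minimal sketch with illustrative names, which should reproduce the scenario 1 mean, variance, and sensitivity indices quoted earlier:

```python
# Numeric check using the scenario 1 priors (alpha = s0*t0, beta = s0*(1 - t0)); illustrative.

def moments(alpha, beta):
    s = alpha + beta
    e = alpha / s
    v = alpha * beta / (s * s * (s + 1.0))
    return e, v, v + e * e          # E[P], V[P], E[P^2]

eA, vA, e2A = moments(1.5, 8.5)     # component A: t0 = 0.15, s0 = 10
eB, vB, e2B = moments(0.3, 1.7)     # component B: t0 = 0.15, s0 = 2
eC, vC, e2C = moments(5.5, 4.5)     # component C: t0 = 0.55, s0 = 10

e_sys = eA + eB * eC - eA * eB * eC
e2_sys = (e2A + 2 * eA * eB * eC - 2 * e2A * eB * eC
          + e2B * e2C - 2 * eA * e2B * e2C + e2A * e2B * e2C)
v_sys = e2_sys - e_sys ** 2

print(e_sys, v_sys)                            # about 0.2201 and 0.0203
print((1 - eB * eC) ** 2 * vA / v_sys)         # SV_A, about 0.4814
print((eC * (1 - eA)) ** 2 * vB / v_sys)       # SV_B, about 0.4583
print((eB * (1 - eA)) ** 2 * vC / v_sys)       # SV_C, about 0.0181
```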