Chapter 8: Revising Judgments in the Light of New Information

Upload: griffin-shepherd

Post on 18-Jan-2018


TRANSCRIPT

Page 1: Chapter 8 Revising Judgments in the Light of New Information

Page 2:

In this chapter we will look at the process of revising initial probability estimates in the light of new information.

Page 3:

Bayes’ theorem

Prior probability

New information

Posterior probability

Page 4:

The components problem (Fig. 8.1)

Page 5:

In total, we would expect 410 (i.e. 140 + 270) components to fail the test.

Now the component you selected is one of these 410 components. Of these, only 140 are 'OK', so your posterior probability that the component is 'OK' should be 140/410, which is 0.341, i.e.

P(component OK|failed test) = 140/410 = 0.341
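This calculation can be checked directly; the counts of 140 and 270 failing components are the figures quoted above (assumed to come from Fig. 8.1):

```python
# Posterior probability that a component is 'OK' given that it
# failed the test, using the counts quoted in the text.
ok_and_failed = 140      # components that are OK but fail the test
not_ok_and_failed = 270  # components that are not OK and fail the test

failed = ok_and_failed + not_ok_and_failed   # 410 in total
p_ok_given_failed = ok_and_failed / failed
print(round(p_ok_given_failed, 3))   # 0.341
```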

Page 6:

Applying Bayes’ theorem to the components problem (Fig. 8.2)

Page 7:

The steps in the process which we have just applied are summarized below:

(1) Construct a tree with branches representing all the possible events which can occur and write the prior probabilities for these events on the branches.

(2) Extend the tree by attaching to each branch a new branch which represents the new information which you have obtained. On each branch write the conditional probability of obtaining this information given the circumstance represented by the preceding branch.

(3) Obtain the joint probabilities by multiplying each prior probability by the conditional probability which follows it on the tree.

(4) Sum the joint probabilities.

(5) Divide the 'appropriate' joint probability by the sum of the joint probabilities to obtain the required posterior probability.
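Steps (3) to (5) can be sketched as a small function; the two-event numbers in the usage example are illustrative, not figures from the text:

```python
def revise(priors, likelihoods):
    """Revise prior probabilities given new information.

    priors      -- prior probability of each event, step (1)
    likelihoods -- conditional probability of the new information
                   given each event, step (2)
    """
    joints = [p * l for p, l in zip(priors, likelihoods)]  # step (3)
    total = sum(joints)                                    # step (4)
    return [j / total for j in joints]                     # step (5)

# Illustrative figures: two events with priors 0.6 and 0.4, and
# probabilities 0.2 and 0.9 of observing the new information
# under each event.
posteriors = revise([0.6, 0.4], [0.2, 0.9])
print([round(p, 2) for p in posteriors])   # [0.25, 0.75]
```

Note that the posterior probabilities always sum to 1, because each joint probability is divided by their total.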

Page 8:

Example: An engineer makes a cursory inspection of a piece of equipment and estimates that there is a 75% chance that it is running at peak efficiency. He then receives a report that the operating temperature of the machine is exceeding 80°C. Past records of operating performance suggest that there is only a 0.3 probability of this temperature being exceeded when the machine is working at peak efficiency. The probability of the temperature being exceeded if the machine is not working at peak efficiency is 0.8. What should be the engineer's revised probability that the machine is operating at peak efficiency?

Refer to Fig. 8.3
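The revised probability implied by these figures can be computed directly (a sketch of the calculation behind Fig. 8.3):

```python
# Engineer's problem: revise the prior probability that the machine
# is running at peak efficiency, given the temperature report.
p_peak = 0.75                  # prior from the cursory inspection
p_temp_given_peak = 0.3        # P(temp exceeded | peak efficiency)
p_temp_given_not_peak = 0.8    # P(temp exceeded | not peak efficiency)

joint_peak = p_peak * p_temp_given_peak                # 0.225
joint_not_peak = (1 - p_peak) * p_temp_given_not_peak  # 0.2
posterior_peak = joint_peak / (joint_peak + joint_not_peak)
print(round(posterior_peak, 3))   # 0.529
```

The report therefore cuts the engineer's probability of peak efficiency from 0.75 to about 0.53.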

Page 9:

Another example (more than two events)

A company's sales manager estimates that there is a 0.2 probability that sales in the coming year will be high, a 0.7 probability that they will be medium and a 0.1 probability that they will be low. She then receives a sales forecast from her assistant and the forecast suggests that sales will be high. By examining the track record of the assistant's forecasts she is able to obtain the following probabilities:

Page 10:

p(high sales forecast given that the market will generate high sales) = 0.9

p(high sales forecast given that the market will generate only medium sales) = 0.6

p(high sales forecast given that the market will generate only low sales) = 0.3

Refer to Fig. 8.4

Page 11:

We obtain the following posterior probabilities:

p(high sales) = 0.18/0.63 = 0.2857

p(medium sales) = 0.42/0.63 = 0.6667

p(low sales) = 0.03/0.63 = 0.0476
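These posterior probabilities follow from the priors and the assistant's track record (cf. Fig. 8.4), and can be checked as follows:

```python
# Sales problem with three events: revise the prior probabilities of
# high, medium and low sales after receiving a 'high sales' forecast.
priors = {'high': 0.2, 'medium': 0.7, 'low': 0.1}
p_forecast_high = {'high': 0.9, 'medium': 0.6, 'low': 0.3}

joints = {s: priors[s] * p_forecast_high[s] for s in priors}
total = sum(joints.values())                   # 0.63
posteriors = {s: joints[s] / total for s in joints}
for s in posteriors:
    print(s, round(posteriors[s], 4))
# high 0.2857
# medium 0.6667
# low 0.0476
```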

Page 12:

The effect of new information on the revision of probability judgments

It is interesting to explore the relative influence which prior probabilities and new information have on the resulting posterior probabilities.

Consider a situation where a geologist is not very confident about his prior probabilities and where the test drilling is very reliable.

Page 13:

Vague priors and very reliable information

Page 14:

The posterior probabilities depend only upon the reliability of the new information.

The 'vague' prior probabilities have had no influence on the result.
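A minimal check in Python, with an illustrative 50:50 prior and a test assumed to be right 90% of the time in either state (these particular figures are not from the text):

```python
# With 'vague' 50:50 priors, the posterior probability equals the
# reliability of the test, whatever that reliability is: the prior
# has no influence on the result.
prior = 0.5
reliability = 0.9   # P(correct indication) in either state

joint_true = prior * reliability
joint_false = (1 - prior) * (1 - reliability)
posterior = joint_true / (joint_true + joint_false)
print(round(posterior, 4))   # 0.9
```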

Page 15:

A more general view of the relationship between the 'vagueness' of the prior probabilities and the reliability of the new information can be seen in Figure 8.6.

Page 16:

The effect of the reliability of information on the modification of prior probabilities (Fig. 8.6)

Page 17:

If the test drilling has only a 50% probability of giving a correct result then its result will not be of any interest and the posterior probability will equal the prior, as shown by the diagonal line on the graph.

Page 18:

The more reliable the new information, the greater will be the modification of the prior probabilities.

For any given level of reliability, however, this modification is relatively small either where the prior probability is high, or where the prior probability is very small.

Page 19:

At the extreme, if your prior probability of an event occurring is zero then the posterior probability will also be zero.

In general, assigning prior probabilities of zero or one is unwise.

Page 20:

Applying Bayes’ theorem to a decision problem

Decision           Low sales   High sales
Hold small stocks  $80 000     $140 000
Hold large stocks  $20 000     $220 000

Profit   $20 000   $80 000   $140 000   $220 000
Utility  0         0.5       0.8        1.0

Page 21:

The retailer estimates that there is a 0.4 probability that sales will be low and a 0.6 probability that they will be high.

What level of stocks should he hold? In Figure 8.7(a) it can be seen that his expected utility is maximized if he decides to hold a small stock of the commodity.

Page 22:

The retailer’s problem with prior probabilities

Page 23:

Before implementing his decision the retailer receives a sales forecast which suggests that sales will be high.

P(forecast of high sales|high sales) = 0.75

P(forecast of high sales|low sales) = 0.2
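A sketch of the revision and the re-evaluated decision, assuming the 0.2 figure is P(forecast of high sales | low sales) and reusing the prior probabilities and utilities quoted earlier:

```python
# Revise the retailer's priors after the 'high sales' forecast, then
# re-evaluate both stock levels with the utilities from the table.
p_high, p_low = 0.6, 0.4                     # prior probabilities
p_fc_given_high, p_fc_given_low = 0.75, 0.2  # P(forecast 'high' | sales)

joint_high = p_high * p_fc_given_high        # 0.45
joint_low = p_low * p_fc_given_low           # 0.08
post_high = joint_high / (joint_high + joint_low)
post_low = 1 - post_high

# Utilities from the earlier table.
utility = {('small', 'low'): 0.5, ('small', 'high'): 0.8,
           ('large', 'low'): 0.0, ('large', 'high'): 1.0}

eu_small = (post_low * utility[('small', 'low')]
            + post_high * utility[('small', 'high')])
eu_large = (post_low * utility[('large', 'low')]
            + post_high * utility[('large', 'high')])

print(round(post_high, 3))                       # 0.849
print(round(eu_small, 3), round(eu_large, 3))    # 0.755 0.849
```

On these figures the forecast reverses the decision: holding large stocks now maximizes expected utility.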

Page 24:

Applying Bayes’ theorem to the retailer’s problem

Page 25:

Applying posterior probabilities to the retailer’s problem

Page 26:

Assessing the value of new information

New information can remove or reduce the uncertainty involved in a decision and thereby increase the expected payoff.

The decision maker therefore needs to determine whether it is worth obtaining the information in the first place and, if there are several potential sources of information, which one is to be preferred.

Page 27:

The expected value of perfect information

The concept of the expected value of perfect information (EVPI) can still be useful.

A problem is used to show how the value of perfect information can be measured.

For simplicity, we will assume that the decision maker is neutral to risk so that the expected monetary value criterion can be applied.

Refer to the following figure. (Descriptions are on page 227.)

Page 28:

Determining the EVPI (Fig. 8.8)

Page 29:

Calculating the EVPI

Page 30:

Even if the test were perfectly accurate, it would not be worth paying Ceres more than $15 000 for it.

It is likely that the test will be less than perfect, in which case the information it yields will be of less value. Nevertheless, the EVPI can be very useful in giving an upper bound to the value of new information.

Page 31:

If the manager is risk averse or risk seeking, or if he also has non-monetary objectives, then it may be worth his paying more or less than this amount.

Page 32:

The expected value of imperfect information

Suppose that, after making further enquiries, the farm manager discovers that the Ceres test is not perfectly reliable.

If the virus is still present in the soil the test has only a 90% chance of detecting it, while if the virus has been eliminated there is a 20% chance that the test will incorrectly indicate its presence.

How much would it now be worth paying for the test?

Page 33:

Deciding whether to buy imperfect information

Page 34:

If test indicates virus is present

Page 35:

If test indicates virus is absent

Page 36:

Determining the EVII

Page 37:

Expected profit with imperfect information = $62 155

Expected profit without the information = $57 000

Expected value of imperfect information (EVII) = $5 155

Refer to Page 232

Page 38:

It would not, therefore, be worth paying Ceres more than $5 155 for the test.

You will recall that the expected value of perfect information was $15 000, so the value of information from this test is much less than that from a perfectly reliable test.

Of course, the more reliable the new information, the closer its expected value will be to the EVPI.

Page 39:

A summary of the main stages

(1) Determine the course of action which would be chosen using only the prior probabilities and record the expected payoff of this course of action;

(2) Identify the possible indications which the new information can give;

(3) For each indication:

(a) Determine the probability that this indication will occur;

(b) Use Bayes' theorem to revise the probabilities in the light of this indication;

(c) Determine the best course of action in the light of this indication (i.e. using the posterior probabilities) and the expected payoff of this course of action;

Page 40:

(4) Multiply the probability of each indication occurring by the expected payoff of the course of action which should be taken if that indication occurs and sum the resulting products. This will give the expected payoff with imperfect information;

(5) The expected value of the imperfect information is equal to the expected payoff with imperfect information (derived in stage 4) less the expected payoff of the course of action which would be selected using the prior probabilities (which was derived in stage 1).
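The five stages can be sketched as a function for a risk-neutral decision maker. The payoff table, priors and forecast reliabilities in the usage example are illustrative (a retailer-style problem using the monetary payoffs quoted earlier), not figures computed in the text:

```python
def evii(payoffs, priors, likelihoods):
    """Expected value of imperfect information, stages (1)-(5).

    payoffs[action][state]        -- monetary payoff
    priors[state]                 -- prior probability
    likelihoods[indication][state] -- P(indication | state)
    """
    states = list(priors)

    def best_payoff(probs):
        # Expected payoff of the best course of action under `probs`.
        return max(sum(probs[s] * payoffs[a][s] for s in states)
                   for a in payoffs)

    prior_payoff = best_payoff(priors)                # stage (1)

    expected_with_info = 0.0
    for like in likelihoods.values():                 # stage (2)
        p_ind = sum(priors[s] * like[s] for s in states)              # (3a)
        posterior = {s: priors[s] * like[s] / p_ind for s in states}  # (3b)
        expected_with_info += p_ind * best_payoff(posterior)          # (3c), (4)

    return expected_with_info - prior_payoff          # stage (5)

# Hypothetical inputs: the stock-level payoffs and priors quoted
# earlier, with a forecast that says 'high' with probability 0.75
# when sales will be high and 0.2 when they will be low.
payoffs = {'small': {'low': 80_000, 'high': 140_000},
           'large': {'low': 20_000, 'high': 220_000}}
priors = {'low': 0.4, 'high': 0.6}
likelihoods = {'forecast high': {'low': 0.2, 'high': 0.75},
               'forecast low':  {'low': 0.8, 'high': 0.25}}
print(round(evii(payoffs, priors, likelihoods)))   # 7200
```

On these illustrative figures the imperfect forecast is worth at most $7 200 to a risk-neutral retailer.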