Chapter 7: Properties of Expectation

Upload: beth

Posted on 11-Feb-2016



TRANSCRIPT

Page 1: Chapter 7

1

Chapter 7

Properties of Expectation

Page 2: Chapter 7

2

Outline

- Introduction
- Expectation of Sums of Random Variables
- Moments of the Number of Events that Occur
- Covariance, Variance of Sums, and Correlations
- Conditional Expectation
- Conditional Expectation and Prediction
- Moment Generating Functions
- Additional Properties of Normal Random Variables
- General Definition of Expectation

Page 3: Chapter 7

3

7.1 Introduction

Expected value of a random variable X

Page 4: Chapter 7

4

7.2 Expectation of Sums of R. V.

Page 5: Chapter 7

5

Page 6: Chapter 7

6

EXAMPLE 2a
An accident occurs at a point X that is uniformly distributed on a road of length L. At the time of the accident, an ambulance is at a location Y that is also uniformly distributed on the road. Assuming that X and Y are independent, find the expected distance between the ambulance and the point of the accident.
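The expected distance works out to L/3. As an illustrative aside (not part of the original slides; the trial count is an arbitrary choice), a short Monte Carlo sketch agrees:

```python
import random

random.seed(1)

def mean_distance(L=1.0, trials=100_000):
    """Monte Carlo estimate of E|X - Y| for independent X, Y ~ Uniform(0, L)."""
    total = 0.0
    for _ in range(trials):
        x = random.uniform(0, L)
        y = random.uniform(0, L)
        total += abs(x - y)
    return total / trials

estimate = mean_distance()
print(estimate)  # close to L/3 = 1/3
```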

Page 7: Chapter 7

7

Page 8: Chapter 7

8

Page 9: Chapter 7

9

EXAMPLE 2b
Suppose that X ≥ Y for random variables X and Y; that is, for any outcome of the probability experiment, the value of the random variable X is greater than or equal to the value of the random variable Y. Since the preceding is equivalent to the inequality X − Y ≥ 0, it follows that E[X − Y] ≥ 0, or, equivalently,

E[X] ≥ E[Y]

Using Equation (2.1), a simple induction proof shows that if E[Xi] is finite for all i = 1, …, n, then

E[X1+X2+...+Xn]= E[X1]+...+E[Xn]

Page 10: Chapter 7

10

Example 2c

Page 11: Chapter 7

11

Page 12: Chapter 7

12

Example 2d

Page 13: Chapter 7

13

Page 14: Chapter 7

14

EXAMPLE 2e Expectation of a binomial random variable
Let X be a binomial random variable with parameters n and p. Recalling that such a random variable represents the number of successes in n independent trials, each with probability p of being a success, we have that E[X] = np.
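The identity E[X] = np can be checked numerically. A minimal sketch (illustrative only; n, p, and the trial count are arbitrary choices, not from the slides):

```python
import random

random.seed(2)

n, p, trials = 20, 0.3, 50_000
# Simulate X = number of successes in n independent Bernoulli(p) trials.
mean_x = sum(
    sum(random.random() < p for _ in range(n)) for _ in range(trials)
) / trials
print(mean_x)  # close to n * p = 6.0
```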

Page 15: Chapter 7

15

EXAMPLE 2f Mean of a negative binomial random variable
If independent trials, each having probability p of being a success, are performed, determine the expected number of trials required to amass a total of r successes.
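The answer is r/p, obtained by writing the total as a sum of r geometric waiting times. A quick simulation sketch (illustrative parameter values, not from the slides):

```python
import random

random.seed(3)

def trials_until_r_successes(r, p):
    """Count trials until the r-th success in independent Bernoulli(p) trials."""
    count = successes = 0
    while successes < r:
        count += 1
        if random.random() < p:
            successes += 1
    return count

r, p, n = 5, 0.4, 50_000
mean_trials = sum(trials_until_r_successes(r, p) for _ in range(n)) / n
print(mean_trials)  # close to r / p = 12.5
```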

Page 16: Chapter 7

16

EXAMPLE 2g Mean of a hypergeometric random variable
If n balls are randomly selected from an urn containing N balls, of which m are white, find the expected number of white balls selected.
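The expected count is nm/N, by writing the number of white balls as a sum of indicators. A simulation sketch (the urn sizes below are arbitrary illustrations):

```python
import random

random.seed(4)

N, m, n, trials = 20, 8, 5, 50_000
urn = [1] * m + [0] * (N - m)          # 1 marks a white ball
# Draw n balls without replacement and count the white ones.
mean_white = sum(sum(random.sample(urn, n)) for _ in range(trials)) / trials
print(mean_white)  # close to n * m / N = 2.0
```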

Page 17: Chapter 7

17

Page 18: Chapter 7

18

EXAMPLE 2h Expected number of matches
A group of N people throw their hats into the center of a room. The hats are mixed up, and each person randomly selects one. Find the expected number of people that select their own hat.
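By indicators, each person selects their own hat with probability 1/N, so the expected number of matches is exactly 1, whatever N is. A simulation sketch (N chosen arbitrarily for illustration):

```python
import random

random.seed(5)

def own_hat_count(n):
    """Shuffle n hats among n people; count fixed points of the permutation."""
    perm = list(range(n))
    random.shuffle(perm)
    return sum(perm[i] == i for i in range(n))

n, trials = 30, 50_000
mean_matches = sum(own_hat_count(n) for _ in range(trials)) / trials
print(mean_matches)  # close to 1, regardless of n
```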

Page 19: Chapter 7

19

EXAMPLE 2i Coupon-collecting problems
Suppose that there are N different types of coupons and that each time one obtains a coupon, it is equally likely to be any one of the N types. Find the expected number of coupons one needs to amass before obtaining a complete set with at least one of each type.
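Summing the geometric waits for each new type gives the classical answer N(1 + 1/2 + ... + 1/N). A simulation sketch (illustrative N and trial count, not from the slides):

```python
import random

random.seed(6)

def coupons_needed(N):
    """Draw uniformly among N coupon types until every type has appeared."""
    seen, count = set(), 0
    while len(seen) < N:
        seen.add(random.randrange(N))
        count += 1
    return count

N, trials = 10, 20_000
mean_needed = sum(coupons_needed(N) for _ in range(trials)) / trials
expected = sum(N / i for i in range(1, N + 1))  # N * (1 + 1/2 + ... + 1/N)
print(mean_needed, expected)
```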

Page 20: Chapter 7

20

Page 21: Chapter 7

21

EXAMPLE 2j
Ten hunters are waiting for ducks to fly by. When a flock of ducks flies overhead, the hunters fire at the same time, but each chooses his target at random, independently of the others. If each hunter independently hits his target with probability p, compute the expected number of ducks that escape unhurt when a flock of size 10 flies overhead.
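Each duck survives only if all ten hunters miss it, giving E = 10(1 − p/10)^10. A simulation sketch with an arbitrary p (illustrative, not from the slides):

```python
import random

random.seed(7)

def ducks_escaping(hunters=10, ducks=10, p=0.6):
    """Each hunter picks a duck at random and hits it with probability p."""
    hit = set()
    for _ in range(hunters):
        target = random.randrange(ducks)
        if random.random() < p:
            hit.add(target)
    return ducks - len(hit)

trials = 50_000
mean_escaped = sum(ducks_escaping() for _ in range(trials)) / trials
exact = 10 * (1 - 0.6 / 10) ** 10
print(mean_escaped, exact)  # both near 5.39
```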

Page 22: Chapter 7

22

Page 23: Chapter 7

23

EXAMPLE 2k Expected number of runs

Page 24: Chapter 7

24

Page 25: Chapter 7

25

EXAMPLE 2L A random walk in the plane
Consider a particle initially located at a given point in the plane, and suppose that it undergoes a sequence of steps of fixed length in completely random directions. Specifically, suppose that the new position after each step is one unit of distance from the previous position, at an angle of orientation from the previous position that is uniformly distributed over (0, 2π) (see Figure 7.1). Compute the expected square of the distance from the origin after n steps.
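Because the cross terms have mean zero, the expected squared distance is exactly n. A simulation sketch (n and the trial count are arbitrary illustrations):

```python
import math
import random

random.seed(8)

def squared_distance(n):
    """Walk n unit steps at uniformly random angles; return squared distance."""
    x = y = 0.0
    for _ in range(n):
        theta = random.uniform(0, 2 * math.pi)
        x += math.cos(theta)
        y += math.sin(theta)
    return x * x + y * y

n, trials = 25, 40_000
mean_sq = sum(squared_distance(n) for _ in range(trials)) / trials
print(mean_sq)  # close to n = 25
```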

Page 26: Chapter 7

26

Page 27: Chapter 7

27

Page 28: Chapter 7

28

EXAMPLE 2m Analyzing the quick-sort algorithm

Page 29: Chapter 7

29

Page 30: Chapter 7

30

Example 2n The probability of a union of events.

Page 31: Chapter 7

31

Page 32: Chapter 7

32

Page 33: Chapter 7

33

Page 34: Chapter 7

34

Example 2o

Page 35: Chapter 7

35

Example 2p

Page 36: Chapter 7

36


Page 37: Chapter 7

37

7.3 Moments of the Number of Events that Occur

Page 38: Chapter 7

38

7.4 Covariance, Variance of Sums, and Correlations

Page 39: Chapter 7

39

Page 40: Chapter 7

40

Page 41: Chapter 7

41

Page 42: Chapter 7

42

Page 43: Chapter 7

43

Page 44: Chapter 7

44

Page 45: Chapter 7

45

Page 46: Chapter 7

46

Page 47: Chapter 7

47

Page 48: Chapter 7

48

Page 49: Chapter 7

49

Page 50: Chapter 7

50

Example 4b Variance of a binomial random variable

Page 51: Chapter 7

51

Page 52: Chapter 7

52

Page 53: Chapter 7

53

Page 54: Chapter 7

54

Page 55: Chapter 7

55

Page 56: Chapter 7

56

Page 57: Chapter 7

57

Page 58: Chapter 7

58

Page 59: Chapter 7

59

Page 60: Chapter 7

60

Page 61: Chapter 7

61

Page 62: Chapter 7

62

Page 63: Chapter 7

63

Page 64: Chapter 7

64

7.5 Conditional Expectation

7.5.1 Definitions

Page 65: Chapter 7

65

EXAMPLE 5a
If X and Y are independent binomial random variables with identical parameters n and p, calculate the conditional expected value of X, given that X + Y = m.

Page 66: Chapter 7

66

Page 67: Chapter 7

67

Page 68: Chapter 7

68

Page 69: Chapter 7

69

7.5.2 Computing Expectations by Conditioning

Let us denote by E[X | Y] that function of the random variable Y whose value at Y = y is E[X | Y = y]. Note that E[X | Y] is itself a random variable. An extremely important property of conditional expectations is given by the following proposition.

Proposition 5.1. E[X] = E[E[X | Y]]   (5.1)
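Proposition 5.1 (the tower property) can be illustrated numerically. A toy sketch (the die-and-coins setup below is invented for illustration, not from the slides):

```python
import random

random.seed(9)

# Y ~ fair die; given Y = y, X = number of heads in y fair coin flips.
# E[X | Y] = Y / 2, so Proposition 5.1 gives E[X] = E[Y] / 2 = 3.5 / 2 = 1.75.
trials = 100_000
total = 0.0
for _ in range(trials):
    y = random.randint(1, 6)
    total += sum(random.random() < 0.5 for _ in range(y))
mean_x = total / trials
print(mean_x)  # close to 1.75
```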

Page 70: Chapter 7

70

Page 71: Chapter 7

71

Page 72: Chapter 7

72

EXAMPLE 5c
A miner is trapped in a mine containing 3 doors. The first door leads to a tunnel that will take him to safety after 3 hours of travel. The second door leads to a tunnel that will return him to the mine after 5 hours of travel. The third door leads to a tunnel that will return him to the mine after 7 hours. If we assume that the miner is at all times equally likely to choose any one of the doors, what is the expected length of time until he reaches safety?
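Conditioning on the first door chosen gives E = 5 + (2/3)E, so E = 15 hours. A simulation sketch (illustrative, not from the slides):

```python
import random

random.seed(10)

def time_to_safety():
    """Pick doors uniformly at random until the safety tunnel is chosen."""
    t = 0.0
    while True:
        door = random.randrange(3)
        if door == 0:
            return t + 3              # 3 hours to safety
        t += 5 if door == 1 else 7    # back to the mine, time wasted

trials = 50_000
mean_time = sum(time_to_safety() for _ in range(trials)) / trials
print(mean_time)  # close to 15 hours
```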

Page 73: Chapter 7

73

Page 74: Chapter 7

74

Page 75: Chapter 7

75

EXAMPLE 5d Expectation of a sum of a random number of random variables
Suppose that the number of people entering a department store on a given day is a random variable with mean 50. Suppose further that the amounts of money spent by these customers are independent random variables having a common mean of $8. Assume also that the amount of money spent by a customer is independent of the total number of customers to enter the store. What is the expected amount of money spent in the store on a given day?
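Conditioning on the number of customers gives E[total] = E[N] · E[amount] = 50 × $8 = $400. A simulation sketch (the particular distributions below are invented; only their means matter here):

```python
import random

random.seed(11)

trials = 20_000
total = 0.0
for _ in range(trials):
    n_customers = sum(random.random() < 0.5 for _ in range(100))      # mean 50
    total += sum(random.uniform(0, 16) for _ in range(n_customers))   # mean $8
mean_spent = total / trials
print(mean_spent)  # close to 50 * 8 = 400
```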

Page 76: Chapter 7

76

Page 77: Chapter 7

77

EXAMPLE 5e
The game of craps is begun by rolling an ordinary pair of dice. If the sum of the dice is 2, 3, or 12, the player loses. If it is 7 or 11, the player wins. If it is any other number i, the player continues to roll the dice until the sum is either 7 or i. If it is 7, the player loses; if it is i, the player wins. Let R denote the number of rolls of the dice in a game of craps. Find (a) E[R]; (b) E[R | player wins]; (c) E[R | player loses].
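Conditioning on the first roll, the unconditional mean works out to E[R] = 1 + 2(3/9 + 4/10 + 5/11) ≈ 3.376. A simulation sketch of part (a) (illustrative, not from the slides):

```python
import random

random.seed(12)

def rolls_in_craps_game():
    """Play one game of craps; return the number of rolls it takes."""
    roll = lambda: random.randint(1, 6) + random.randint(1, 6)
    first, count = roll(), 1
    if first in (2, 3, 7, 11, 12):
        return count                  # immediate win or loss
    while True:                       # roll until the point or a 7 appears
        count += 1
        if roll() in (first, 7):
            return count

trials = 100_000
mean_rolls = sum(rolls_in_craps_game() for _ in range(trials)) / trials
print(mean_rolls)  # close to 3.376
```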

Page 78: Chapter 7

78

Page 79: Chapter 7

79

Page 80: Chapter 7

80

Page 81: Chapter 7

81

Page 82: Chapter 7

82

Example 5f

Page 83: Chapter 7

83

Page 84: Chapter 7

84

Example 5g

Page 85: Chapter 7

85

Page 86: Chapter 7

86

Page 87: Chapter 7

87

EXAMPLE 5h Variance of the geometric distribution
Independent trials, each resulting in a success with probability p, are successively performed. Let N be the time of the first success. Find Var(N).
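Conditioning on the first trial yields E[N] = 1/p and Var(N) = (1 − p)/p². A simulation sketch (p and the trial count are arbitrary illustrations):

```python
import random

random.seed(13)

def time_of_first_success(p):
    """Number of Bernoulli(p) trials up to and including the first success."""
    n = 1
    while random.random() >= p:
        n += 1
    return n

p, trials = 0.3, 100_000
samples = [time_of_first_success(p) for _ in range(trials)]
mean_n = sum(samples) / trials
var_n = sum((x - mean_n) ** 2 for x in samples) / trials
print(mean_n, var_n)  # close to 1/p ≈ 3.33 and (1 - p)/p**2 ≈ 7.78
```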

Page 88: Chapter 7

88

Page 89: Chapter 7

89

Example 5i

Page 90: Chapter 7

90

Page 91: Chapter 7

91

7.5.3 Computing Probabilities by Conditioning

Page 92: Chapter 7

92

EXAMPLE 5j The best-prize problem
Suppose that we are to be presented with n distinct prizes in sequence. After being presented with a prize, we must immediately decide whether to accept it or to reject it and consider the next prize. The only information we are given when deciding whether to accept a prize is the relative rank of that prize compared with those already seen. That is, for instance, when the fifth prize is presented, we learn how it compares with the four prizes already seen. Suppose that once a prize is rejected it is lost, and that our objective is to maximize the probability of obtaining the best prize. Assuming that all n! orderings of the prizes are equally likely, how well can we do?
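The optimal strategy turns out to be: let roughly n/e prizes pass, then accept the first prize better than all of them, which wins with probability about 1/e ≈ 0.368. A simulation sketch (n and the trial count are arbitrary choices):

```python
import math
import random

random.seed(14)

def wins_best_prize(n, k):
    """Reject the first k prizes, then accept the first one better than all seen."""
    ranks = list(range(n))            # rank 0 is the best prize
    random.shuffle(ranks)
    best_seen = min(ranks[:k])
    for r in ranks[k:]:
        if r < best_seen:
            return r == 0             # accepted this prize; did we win?
    return False                      # the best prize was among the first k

n, trials = 40, 40_000
k = round(n / math.e)                 # the ~n/e rule
win_rate = sum(wins_best_prize(n, k) for _ in range(trials)) / trials
print(win_rate)  # a little above 1/e ≈ 0.368
```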

Page 93: Chapter 7

93

Page 94: Chapter 7

94

Page 95: Chapter 7

95

Page 96: Chapter 7

96

Page 97: Chapter 7

97

EXAMPLE 5k
Let U be a uniform random variable on (0, 1), and suppose that the conditional distribution of X, given that U = p, is binomial with parameters n and p. Find the probability mass function of X.
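The answer is that X is uniform on {0, 1, ..., n}: P{X = k} = ∫₀¹ C(n, k) p^k (1 − p)^(n−k) dp = 1/(n + 1). The integral can be evaluated exactly in Python (n = 7 is an arbitrary illustration):

```python
from fractions import Fraction
from math import comb, factorial

# The beta integral: C(n,k) * Int_0^1 p^k (1-p)^(n-k) dp
#                  = C(n,k) * k! * (n-k)! / (n+1)!  =  1 / (n+1).
n = 7
pmf = [
    Fraction(comb(n, k) * factorial(k) * factorial(n - k), factorial(n + 1))
    for k in range(n + 1)
]
print(pmf)  # every entry equals 1/(n + 1)
```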

Page 98: Chapter 7

98

Page 99: Chapter 7

99

Example 5l

Page 100: Chapter 7

100

Example 5m

Page 101: Chapter 7

101

7.5.4 Conditional Variance

Page 102: Chapter 7

102

Page 103: Chapter 7

103

EXAMPLE 5n
Suppose that the number of people that have arrived at a train depot by any time t is a Poisson random variable with mean λt. If the initial train arrives at the depot at a time (independent of when the passengers arrive) that is uniformly distributed over (0, T), what are the mean and variance of the number of passengers that enter the train?
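Conditioning on the train's arrival time U ~ Uniform(0, T) gives E[N] = λT/2 and, by the conditional variance formula, Var(N) = λT/2 + λ²T²/12. A simulation sketch (the λ and T values are arbitrary illustrations):

```python
import math
import random

random.seed(16)

def poisson(mu):
    """Knuth's method: count uniforms whose running product stays above e^-mu."""
    limit, k, prod = math.exp(-mu), 0, random.random()
    while prod > limit:
        k += 1
        prod *= random.random()
    return k

lam, T, trials = 2.0, 3.0, 40_000
samples = [poisson(lam * random.uniform(0, T)) for _ in range(trials)]
mean_n = sum(samples) / trials
var_n = sum((x - mean_n) ** 2 for x in samples) / trials
print(mean_n, var_n)  # close to lam*T/2 = 3.0 and 3.0 + lam**2 * T**2 / 12 = 6.0
```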

Page 104: Chapter 7

104

Page 105: Chapter 7

105

Example 5o

Page 106: Chapter 7

106

7.6 Conditional Expectation and Prediction

Page 107: Chapter 7

107

Page 108: Chapter 7

108

Example 6a

Page 109: Chapter 7

109

EXAMPLE 6b
Suppose that if a signal value s is sent from location A, then the signal value received at location B is normally distributed with parameters (s, 1). If S, the value of the signal sent at A, is normally distributed with parameters (μ, σ²), what is the best estimate of the signal sent if R, the value received at B, is equal to r?

Page 110: Chapter 7

110

Page 111: Chapter 7

111

Page 112: Chapter 7

112

Example 6c

Page 113: Chapter 7

113

Page 114: Chapter 7

114

Page 115: Chapter 7

115

Page 116: Chapter 7

116

Page 117: Chapter 7

117

Page 118: Chapter 7

118

7.7 Moment Generating Functions

Page 119: Chapter 7

119

Page 120: Chapter 7

120

Page 121: Chapter 7

121

Example 7a

Page 122: Chapter 7

122

Page 123: Chapter 7

123

Example 7b

Page 124: Chapter 7

124

Example 7c

Page 125: Chapter 7

125

Example 7d

Page 126: Chapter 7

126

Page 127: Chapter 7

127

Page 128: Chapter 7

128

Page 129: Chapter 7

129

Continuous Probability Distributions

Page 130: Chapter 7

130

Page 131: Chapter 7

131

Example 7e

Page 132: Chapter 7

132

Example 7f

Page 133: Chapter 7

133

Page 134: Chapter 7

134

Example 7h

Page 135: Chapter 7

135

Example 7i

Page 136: Chapter 7

136

Example 7j

Page 137: Chapter 7

137

Page 138: Chapter 7

138

Page 139: Chapter 7

139

Page 140: Chapter 7

140

EXAMPLE 7k
Let Y denote a uniform random variable on (0, 1), and suppose that, conditional on Y = p, the random variable X has a binomial distribution with parameters n and p. In Example 5k we showed that X is equally likely to take on any of the values 0, 1, ..., n. Establish this result by using moment generating functions.

Page 141: Chapter 7

141

Page 142: Chapter 7

142

7.7.1 Joint Moment Generating Functions

Page 143: Chapter 7

143

Example 7l

Page 144: Chapter 7

144

Page 145: Chapter 7

145

Page 146: Chapter 7

146

7.8 Additional Properties of Normal Random Variables

7.8.1 The Multivariate Normal Distribution

Page 147: Chapter 7

147

Page 148: Chapter 7

148

Page 149: Chapter 7

149

Page 150: Chapter 7

150

Page 151: Chapter 7

151

Page 152: Chapter 7

152

Page 153: Chapter 7

153

Page 154: Chapter 7

154

Page 155: Chapter 7

155

Page 156: Chapter 7

156

Page 157: Chapter 7

157

Page 158: Chapter 7

158

7.9 General Definition of Expectation

Page 159: Chapter 7

159

Page 160: Chapter 7

160

Page 161: Chapter 7

161

Page 162: Chapter 7

162