TRANSCRIPT
18.440: Lecture 14
More discrete random variables
Scott Sheffield
MIT
18.440 Lecture 14
Outline
Geometric random variables
Negative binomial random variables
Problems
Geometric random variables

- Consider an infinite sequence of independent tosses of a coin that comes up heads with probability p.
- Let X be such that the first heads is on the Xth toss.
- For example, if the coin sequence is T, T, H, T, H, T, ... then X = 3.
- Then X is a random variable. What is P{X = k}?
- Answer: P{X = k} = (1 - p)^(k-1) p = q^(k-1) p, where q = 1 - p is the tails probability.
- Can you prove directly that these probabilities sum to one?
- Say X is a geometric random variable with parameter p.
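That the probabilities sum to one is just the geometric series: ∑_{k≥1} q^(k-1) p = p/(1 - q) = 1. A quick numeric sketch (Python, not part of the lecture; the function name is ours):

```python
def geometric_pmf(k, p):
    """P{X = k} = (1 - p)**(k - 1) * p: first heads on the kth toss."""
    q = 1.0 - p
    return q ** (k - 1) * p

p = 0.3
# Partial sums of the geometric series approach 1 as more terms are added:
total = sum(geometric_pmf(k, p) for k in range(1, 200))
print(total)  # very close to 1.0
```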
Geometric random variable expectation

- Let X be geometric with parameter p, i.e., P{X = k} = (1 - p)^(k-1) p = q^(k-1) p for k ≥ 1.
- What is E[X]?
- By definition E[X] = ∑_{k=1}^∞ q^(k-1) p k.
- There's a trick to computing sums like this.
- Note E[X - 1] = ∑_{k=1}^∞ q^(k-1) p (k - 1). Setting j = k - 1, we have E[X - 1] = q ∑_{j=0}^∞ q^(j-1) p j = q E[X].
- Kind of makes sense. X - 1 is the "number of extra tosses after the first." Given the first coin is heads (probability p), X - 1 is 0. Given the first coin is tails (probability q), the conditional law of X - 1 is geometric with parameter p. In the latter case, the conditional expectation of X - 1 is the same as the a priori expectation of X.
- Thus E[X] - 1 = E[X - 1] = p · 0 + q E[X] = q E[X], and solving for E[X] gives E[X] = 1/(1 - q) = 1/p.
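As a sanity check on E[X] = 1/p, one can truncate the defining sum numerically (a Python sketch, not from the lecture; the function name and truncation length are ours):

```python
def truncated_expectation(p, n_terms=10_000):
    """Approximate E[X] = sum over k of k * q**(k-1) * p by truncating the series."""
    q = 1.0 - p
    return sum(k * q ** (k - 1) * p for k in range(1, n_terms + 1))

for p in (0.5, 0.25, 1 / 6):
    print(p, truncated_expectation(p), 1 / p)  # truncated sum matches 1/p
```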
Geometric random variable variance

- Let X be a geometric random variable with parameter p. Then P{X = k} = q^(k-1) p.
- What is E[X^2]?
- By definition E[X^2] = ∑_{k=1}^∞ q^(k-1) p k^2.
- Let's try to come up with a similar trick.
- Note E[(X - 1)^2] = ∑_{k=1}^∞ q^(k-1) p (k - 1)^2. Setting j = k - 1, we have E[(X - 1)^2] = q ∑_{j=0}^∞ q^(j-1) p j^2 = q E[X^2].
- Thus E[(X - 1)^2] = E[X^2 - 2X + 1] = E[X^2] - 2 E[X] + 1 = E[X^2] - 2/p + 1 = q E[X^2].
- Solving for E[X^2] gives (1 - q) E[X^2] = p E[X^2] = 2/p - 1, so E[X^2] = (2 - p)/p^2.
- Var[X] = E[X^2] - E[X]^2 = (2 - p)/p^2 - 1/p^2 = (1 - p)/p^2 = 1/p^2 - 1/p = q/p^2.
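The two closed forms, E[X^2] = (2 - p)/p^2 and Var[X] = q/p^2, can be checked against the truncated series directly (Python sketch, not part of the lecture; names and the truncation length are ours):

```python
def truncated_moments(p, n_terms=20_000):
    """Truncated sums for E[X] and E[X^2] of a geometric(p) random variable."""
    q = 1.0 - p
    ex = sum(k * q ** (k - 1) * p for k in range(1, n_terms + 1))
    ex2 = sum(k * k * q ** (k - 1) * p for k in range(1, n_terms + 1))
    return ex, ex2

p = 0.4
ex, ex2 = truncated_moments(p)
print(ex2, (2 - p) / p ** 2)            # both ≈ 10.0
print(ex2 - ex ** 2, (1 - p) / p ** 2)  # both ≈ 3.75
```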
Example

- Toss a die repeatedly. Say we get a 6 for the first time on the Xth toss.
- What is P{X = k}?
- Answer: (5/6)^(k-1) (1/6).
- What is E[X]?
- Answer: 6.
- What is Var[X]?
- Answer: 1/p^2 - 1/p = 36 - 6 = 30.
- It takes 1/p coin tosses on average to see a heads.
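The die example is easy to simulate; empirical mean and variance should land near 6 and 30 (Python sketch, not from the lecture; the seed and sample size are arbitrary choices of ours):

```python
import random

def tosses_until_six(rng):
    """Count die rolls until the first 6 appears."""
    count = 0
    while True:
        count += 1
        if rng.randint(1, 6) == 6:
            return count

rng = random.Random(0)  # fixed seed so the run is reproducible
samples = [tosses_until_six(rng) for _ in range(100_000)]
mean = sum(samples) / len(samples)
var = sum((x - mean) ** 2 for x in samples) / len(samples)
print(mean, var)  # close to 6 and 30
```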
Outline
Geometric random variables
Negative binomial random variables
Problems
Negative binomial random variables

- Consider an infinite sequence of independent tosses of a coin that comes up heads with probability p.
- Let X be such that the rth heads is on the Xth toss.
- For example, if r = 3 and the coin sequence is T, T, H, H, T, T, H, T, T, ... then X = 7.
- Then X is a random variable. What is P{X = k}?
- Answer: we need exactly r - 1 heads among the first k - 1 tosses, and a heads on the kth toss.
- So P{X = k} = (k-1 choose r-1) p^(r-1) (1-p)^(k-r) p. Can you prove these sum to 1?
- Call X a negative binomial random variable with parameters (r, p).
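Note the two factors of p combine: the PMF is (k-1 choose r-1) p^r (1-p)^(k-r) for k ≥ r. Checking numerically that it sums to 1 (Python sketch, not part of the lecture; the function name is ours):

```python
from math import comb

def neg_binom_pmf(k, r, p):
    """P{X = k}: r - 1 heads in the first k - 1 tosses, then a heads on toss k."""
    return comb(k - 1, r - 1) * p ** r * (1 - p) ** (k - r)

r, p = 3, 0.3
# Terms for k >= r; the tail past k = 600 is negligible here.
total = sum(neg_binom_pmf(k, r, p) for k in range(r, 600))
print(total)  # very close to 1.0
```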
Expectation of negative binomial random variable
I Consider an infinite sequence of independent tosses of a coin that comes up heads with probability p.
I Let X be such that the rth head is on the X th toss.
I Then X is a negative binomial random variable with parameters (r, p).
I What is E[X]?
I Write X = X1 + X2 + . . . + Xr, where Xk is the number of tosses (following the (k−1)th head) required to get the kth head. Each Xk is geometric with parameter p.
I So E[X] = E[X1 + X2 + . . . + Xr] = E[X1] + E[X2] + . . . + E[Xr] = r/p.
I How about Var[X]?
I It turns out that Var[X] = Var[X1] + Var[X2] + . . . + Var[Xr], since the Xk are independent. So Var[X] = rq/p^2, where q = 1 − p.
18.440 Lecture 14
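The formulas E[X] = r/p and Var[X] = rq/p^2 can be checked by summing directly against the pmf; r and p below are illustrative choices.

```python
from math import comb

# Compare direct summation over the pmf with the closed forms r/p and r*q/p^2.
# r and p are illustrative; truncation at k = 4000 leaves a negligible tail.
r, p = 5, 0.3
q = 1 - p

pmf = {k: comb(k - 1, r - 1) * p**r * q**(k - r) for k in range(r, 4000)}
mean = sum(k * w for k, w in pmf.items())
var = sum(k**2 * w for k, w in pmf.items()) - mean**2

print(mean, r / p)        # both ~16.666...
print(var, r * q / p**2)  # both ~38.888...
```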
Outline
Geometric random variables
Negative binomial random variables
Problems
18.440 Lecture 14
Problems
I Nate and Natasha have a beautiful new baby. Each minute, with probability .01 (independently of all else), the baby cries.
I Additivity of expectation: How many times do they expect the baby to cry between 9 p.m. and 6 a.m.?
I Geometric random variables: What's the probability the baby is quiet from midnight to three, then cries at exactly three?
I Geometric random variables: What's the probability the baby is quiet from midnight to three?
I Negative binomial: What's the probability the fifth cry is at midnight?
I Negative binomial expectation: How many minutes do I expect to wait until the fifth cry?
I Poisson approximation: Approximate the probability that there are exactly five cries during the night.
I Exponential random variable approximation: Approximate the probability the baby is quiet all night.
18.440 Lecture 14
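A hedged numerical sketch of these answers, under one natural reading of the problems: the night is 9 p.m. to 6 a.m., i.e. 540 one-minute trials with p = .01, and "midnight" is taken to be the 180th minute after 9 p.m. (and the 181st minute as "cries at exactly three" after the 180 quiet minutes from midnight).

```python
from math import comb, exp, factorial

p, q = 0.01, 0.99
night = 540  # minutes from 9 p.m. to 6 a.m.

print(night * p)                          # expected number of cries: 5.4
print(q**180 * p)                         # quiet for 180 minutes, then a cry
print(q**180)                             # quiet from midnight to three
print(comb(179, 4) * p**5 * q**175)       # fifth cry on minute 180 (midnight)
print(5 / p)                              # expected wait for fifth cry: 500 min
lam = night * p
print(exp(-lam) * lam**5 / factorial(5))  # Poisson approx: exactly 5 cries
print(exp(-lam), q**night)                # quiet all night: approx vs exact
```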
More fun problems
I Suppose two soccer teams play each other. One team's number of points is Poisson with parameter λ1, and the other's is independently Poisson with parameter λ2. (You can google "soccer" and "Poisson" to see the academic literature on the use of Poisson random variables to model soccer scores.) Using Mathematica (or similar software), compute the probability that the first team wins if λ1 = 2 and λ2 = 1. What if λ1 = 2 and λ2 = .5?
I Imagine you start with the number 60. Then you toss a fair coin to decide whether to add 5 to your number or subtract 5 from it. Repeat this process with independent coin tosses until the number reaches 100 or 0. What is the expected number of tosses needed until this occurs?
18.440 Lecture 14
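A Python stand-in for the suggested Mathematica computation (truncating the Poisson sums at a score large enough to be negligible), plus the standard gambler's-ruin identity for the coin-toss walk: rescaling steps of 5 gives a simple random walk on 0..20 started at 12, whose expected duration is k(N − k) tosses.

```python
from math import exp, factorial

def poisson_pmf(lam, k):
    return exp(-lam) * lam**k / factorial(k)

def p_first_team_wins(lam1, lam2, cutoff=60):
    # P(X > Y) for independent X ~ Poisson(lam1), Y ~ Poisson(lam2),
    # with both sums truncated at `cutoff` goals.
    return sum(poisson_pmf(lam1, i) * poisson_pmf(lam2, j)
               for i in range(cutoff) for j in range(i))

print(p_first_team_wins(2, 1))    # lam1 = 2, lam2 = 1
print(p_first_team_wins(2, 0.5))  # lam1 = 2, lam2 = .5

# Coin-toss walk: positions 0..100 in steps of 5 -> walk on 0..20 from 12.
# Expected number of tosses to hit either boundary is k * (N - k).
k, N = 60 // 5, 100 // 5
print(k * (N - k))  # 96
```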