07: Random Variables II
Lisa Yan
April 22, 2020
Binomial RV
(07d_binomial)
Lisa Yan, CS109, 2020

Binomial Random Variable

Consider an experiment: n independent trials of Ber(p) random variables.

def: A Binomial random variable X is the number of successes in n trials.

Examples:
• # heads in n coin flips
• # of 1's in a randomly generated length-n bit string
• # of disk drives crashed in a 1000-computer cluster (assuming disks crash independently)

X ~ Bin(n, p)
Support: {0, 1, …, n}
PMF: P(X = k) = (n choose k) p^k (1 - p)^(n-k), for k = 0, 1, …, n
Expectation: E[X] = np
Variance: Var(X) = np(1 - p)
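The deck contains no code; as a quick numerical sanity check of the formulas above, here is a minimal Python sketch (the helper name `binomial_pmf` and the example parameters are mine, not from the course):

```python
from math import comb

def binomial_pmf(k, n, p):
    # P(X = k) = (n choose k) p^k (1 - p)^(n - k)
    return comb(n, k) * p**k * (1 - p)**(n - k)

n, p = 10, 0.3
pmf = [binomial_pmf(k, n, p) for k in range(n + 1)]
# Expectation and variance computed directly from the PMF:
expectation = sum(k * q for k, q in enumerate(pmf))                  # should match np = 3
variance = sum((k - expectation)**2 * q for k, q in enumerate(pmf))  # should match np(1-p) = 2.1
```

Summing k·P(X = k) over the support reproduces E[X] = np, and the second moment about the mean reproduces Var(X) = np(1 - p).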
Reiterating notation

The parameters of a Binomial random variable:
• n: number of independent trials
• p: probability of success on each trial

X ~ Bin(n, p)
Read as: (1) the random variable X (2) is distributed as a (3) Binomial (4) with parameters n and p.
Reiterating notation

If X is a Binomial with parameters n and p, the PMF of X is

X ~ Bin(n, p)
P(X = k) = (n choose k) p^k (1 - p)^(n-k)

(the Probability Mass Function for a Binomial: the probability that X takes on the value k)
Three coin flips

Three fair ("heads" with p = 0.5) coins are flipped.
• X is the number of heads
• X ~ Bin(3, 0.5)

Compute the following event probabilities, using X ~ Bin(n, p), P(X = k) = (n choose k) p^k (1 - p)^(n-k):
P(X = 0), P(X = 1), P(X = 2), P(X = 3), P(X = 7)
Three coin flips

Three fair ("heads" with p = 0.5) coins are flipped.
• X is the number of heads
• X ~ Bin(3, 0.5)

P(X = 0) = P(0) = (3 choose 0) p^0 (1 - p)^3 = 1/8
P(X = 1) = P(1) = (3 choose 1) p^1 (1 - p)^2 = 3/8
P(X = 2) = P(2) = (3 choose 2) p^2 (1 - p)^1 = 3/8
P(X = 3) = P(3) = (3 choose 3) p^3 (1 - p)^0 = 1/8
P(X = 7) = 0

Extra math note: by the Binomial Theorem, we can prove Σ_{k=0}^{n} P(X = k) = 1.
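The eighths above, and the Binomial-Theorem fact that the PMF sums to 1, can be checked in a few lines of Python (not from the slides; `binomial_pmf` is my own helper):

```python
from math import comb

def binomial_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Three fair coins: X ~ Bin(3, 0.5)
probs = [binomial_pmf(k, 3, 0.5) for k in range(4)]  # 1/8, 3/8, 3/8, 1/8
total = sum(probs)                                   # Binomial Theorem: the PMF sums to 1
```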
Binomial Random Variable (recap)

X ~ Bin(n, p); Range: {0, 1, …, n}
PMF: P(X = k) = (n choose k) p^k (1 - p)^(n-k), for k = 0, 1, …, n
Expectation: E[X] = np;  Variance: Var(X) = np(1 - p)
Binomial RV is sum of Bernoulli RVs

Ber(p) = Bin(1, p)

Bernoulli: Y ~ Ber(p)
Binomial: X ~ Bin(n, p), the sum of n independent Bernoulli RVs:

X = Σ_{i=1}^{n} Y_i,  where Y_i ~ Ber(p)
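This sum-of-Bernoullis view can be simulated directly; a sketch not found in the deck (seed and sample counts are arbitrary choices of mine):

```python
import random

random.seed(109)  # arbitrary seed, for reproducibility

def binomial_sample(n, p):
    # X = sum of n independent Ber(p) indicators Y_i
    return sum(1 if random.random() < p else 0 for _ in range(n))

n, p, trials = 10, 0.5, 50_000
samples = [binomial_sample(n, p) for _ in range(trials)]
sample_mean = sum(samples) / trials  # should be close to np = 5
```

Every sample lands in the support {0, …, n}, and the empirical mean sits near E[X] = np.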
Binomial Random Variable (recap)

X ~ Bin(n, p); Range: {0, 1, …, n}
PMF: P(X = k) = (n choose k) p^k (1 - p)^(n-k); E[X] = np; Var(X) = np(1 - p)

Proof (expectation): since X = Y_1 + ⋯ + Y_n with Y_i ~ Ber(p), E[X] = Σ E[Y_i] = np.
Binomial Random Variable (recap)

X ~ Bin(n, p); Range: {0, 1, …, n}
PMF: P(X = k) = (n choose k) p^k (1 - p)^(n-k); E[X] = np; Var(X) = np(1 - p)
(We'll prove the variance later in the course.)
No, give me the variance proof right now: proofwiki.org
Poisson
(08a_poisson)
Before we start

The natural exponent e (https://en.wikipedia.org/wiki/E_(mathematical_constant)):

lim_{n→∞} (1 - λ/n)^n = e^(-λ)

(Jacob Bernoulli, while studying compound interest in 1683)
Algorithmic ride sharing

Probability of k requests from this area in the next 1 minute?
Suppose we know: on average, λ = 5 requests per minute.
Algorithmic ride sharing, approximately

Break a minute down into 60 seconds. At each second:
• Independent trial
• You get a request (1) or you don't (0).

Let X = # of requests in a minute; E[X] = λ = 5.

X ~ Bin(n = 60, p = 5/60)
P(X = k) = (60 choose k) (5/60)^k (1 - 5/60)^(60-k)

But what if there are two requests in the same second?
Algorithmic ride sharing, approximately

Break a minute down into 60,000 milliseconds. At each millisecond:
• Independent trial
• You get a request (1) or you don't (0).

Let X = # of requests in a minute; E[X] = λ = 5.

X ~ Bin(n = 60000, p = λ/n)
P(X = k) = (n choose k) (λ/n)^k (1 - λ/n)^(n-k)

But what if there are two requests in the same millisecond?
Algorithmic ride sharing, approximately

Break a minute down into infinitely small buckets. For each time bucket:
• Independent trial
• You get a request (1) or you don't (0).

Let X = # of requests in a minute; E[X] = λ = 5.

X ~ Bin(n, p = λ/n)
P(X = k) = lim_{n→∞} (n choose k) (λ/n)^k (1 - λ/n)^(n-k)

Who wants to see some cool math?
Binomial in the limit

P(X = k) = lim_{n→∞} (n choose k) (λ/n)^k (1 - λ/n)^(n-k)

         = lim_{n→∞} [n! / (k! (n-k)!)] (λ^k / n^k) (1 - λ/n)^n (1 - λ/n)^(-k)

         = lim_{n→∞} [n! / (n^k (n-k)!)] (λ^k / k!) (1 - λ/n)^n (1 - λ/n)^(-k)

(recall lim_{n→∞} (1 - λ/n)^n = e^(-λ))

         = lim_{n→∞} [n! / (n^k (n-k)!)] (λ^k / k!) e^(-λ) (1 - λ/n)^(-k)

         = lim_{n→∞} [n(n-1)⋯(n-k+1) / n^k] (λ^k / k!) e^(-λ) (1 - λ/n)^(-k)

         = 1 · (λ^k / k!) e^(-λ) · 1

         = (λ^k / k!) e^(-λ)
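The limit can also be watched happening numerically: holding λ = 5 fixed, the Bin(n, λ/n) PMF at a point approaches the Poisson value as n grows. A small Python check (not from the slides; the choice of k and the two bucket counts are mine):

```python
from math import comb, exp, factorial

lam, k = 5, 3
poisson = exp(-lam) * lam**k / factorial(k)

# Bin(n, lam/n) PMF at k, for coarser and finer time buckets
binom = {n: comb(n, k) * (lam / n)**k * (1 - lam / n)**(n - k)
         for n in (60, 60_000)}
```

With n = 60,000 buckets the Binomial PMF already matches the Poisson PMF to several decimal places.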
Algorithmic ride sharing

Probability of k requests from this area in the next 1 minute?
On average, λ = 5 requests per minute.

P(X = k) = (λ^k / k!) e^(-λ)
Siméon-Denis Poisson

French mathematician (1781–1840)
• Published his first paper at age 18
• Professor at age 21
• Published over 300 papers
"Life is only good for two things: doing mathematics and teaching it."
Poisson Random Variable

Consider an experiment that lasts a fixed interval of time.

def: A Poisson random variable X is the number of successes over the experiment duration.

Examples:
• # earthquakes per year
• # server hits per second
• # of emails per day

X ~ Poi(λ)
Support: {0, 1, 2, …}
PMF: P(X = k) = e^(-λ) λ^k / k!
Expectation: E[X] = λ
Variance: Var(X) = λ

Yes, expectation == variance for a Poisson RV! More later.
Earthquakes

There are an average of 2.79 major earthquakes in the world each year.
What is the probability of 3 major earthquakes happening next year?

1. Define RVs: X ~ Poi(λ), with λ = E[X] = 2.79
2. Solve: P(X = 3) = e^(-λ) λ^3 / 3!

[Chart: PMF P(X = k) vs. number of earthquakes k, for k = 0, …, 10]
Are earthquakes really Poissonian?
Poisson Paradigm
(08b_poisson_paradigm)
DNA

All the movies, images, emails and other digital data from more than 600 smartphones (10,000 GB) can be stored in the faint pink smear of DNA at the end of this test tube.

What is the probability that DNA storage stays uncorrupted?
DNA

What is the probability that DNA storage stays uncorrupted?
• In DNA (and real networks), we store large strings.
• Let the string length be long, e.g., n ≈ 10^4
• The probability of corruption of each base pair is very small, e.g., p = 10^(-6)
• Let X = # of corruptions.

What is P(DNA storage is uncorrupted) = P(X = 0)?

Approach 1 (unwieldy!):
X ~ Bin(n = 10^4, p = 10^(-6))
P(X = 0) = (10^4 choose 0) (10^(-6))^0 (1 - 10^(-6))^(10^4 - 0) ≈ 0.990049829

Approach 2 (a good approximation!):
X ~ Poi(λ = 10^4 · 10^(-6) = 0.01)
P(X = 0) = e^(-λ) λ^0 / 0! = e^(-0.01) ≈ 0.990049834
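Both approaches are two lines of Python each, which makes the quality of the approximation easy to see (this code is illustrative, not from the course):

```python
from math import comb, exp

n, p = 10**4, 10**-6
lam = n * p                              # 0.01

exact = comb(n, 0) * p**0 * (1 - p)**n   # Binomial P(X = 0)
approx = exp(-lam)                       # Poisson P(X = 0)
gap = abs(exact - approx)                # agreement to roughly 8 decimal places
```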
The Poisson Paradigm, part 1

Poisson can approximate Binomial.

Poisson approximates Binomial when n is large, p is small, and λ = np is "moderate."
Different interpretations of "moderate":
• n > 20 and p < 0.05
• n > 100 and p < 0.1

Poisson is Binomial in the limit: λ = np, where n → ∞ and p → 0.

[Chart: PMFs of Bin(10, 0.3), Bin(100, 0.03), Bin(1000, 0.003), and Poi(3) for k = 0, …, 10.
X ~ Bin(n, p) has E[X] = np; X ~ Poi(λ) has E[X] = λ.]
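The chart's convergence can be reproduced numerically: holding λ = np = 3 fixed while n grows, the worst pointwise PMF gap shrinks. A sketch under my own choice of n values (not course code):

```python
from math import comb, exp, factorial

def binomial_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    return exp(-lam) * lam**k / factorial(k)

lam = 3
def max_gap(n):
    # largest pointwise PMF difference over k = 0..10, holding np = 3 fixed
    return max(abs(binomial_pmf(k, n, lam / n) - poisson_pmf(k, lam))
               for k in range(11))

gaps = [max_gap(n) for n in (10, 100, 1000)]  # shrinks as n grows
```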
Poisson Random Variable (recap)

X ~ Poi(λ); Support: {0, 1, 2, …}
PMF: P(X = k) = e^(-λ) λ^k / k!
E[X] = λ;  Var(X) = λ

Time to show intuition for why expectation == variance!
Properties of Poi(λ) with the Poisson paradigm

Recall the Binomial: X ~ Bin(n, p), with E[X] = np and Var(X) = np(1 - p).

Consider X ~ Poi(λ), where λ = np (n → ∞, p → 0). Proof:
E[X] = np = λ
Var(X) = np(1 - p) → λ(1 - 0) = λ

So X ~ Poi(λ) has E[X] = λ and Var(X) = λ.
A Real License Plate Seen at Stanford

No, it's not mine… but I kind of wish it was.
Poisson Paradigm, part 2

Poisson can still provide a good approximation of the Binomial, even when assumptions are "mildly" violated.

You can apply the Poisson approximation when:
• "Successes" in trials are not entirely independent
  (e.g., # entries in each bucket in a large hash table)
• The probability of "success" in each trial varies (slightly), like a small relative change in a very small p
  (e.g., average # requests to a web server/sec may fluctuate slightly due to load on the network)

We won't explore this too much, but I want you to know it exists.
Other Discrete RVs
(08c_other_discrete)
Grid of random variables

Number of successes:
• One trial → Ber(p)
• Several trials → Bin(n, p)   (Bin with n = 1 is Ber)
• Interval of time → Poi(λ)

Time until success (tomorrow):
• One success
• Several successes
• Interval of time to first success

Focus on understanding how and when to use RVs, not on memorizing PMFs.
Geometric RV

Consider an experiment: independent trials of Ber(p) random variables.

def: A Geometric random variable X is the # of trials until the first success.

Examples:
• Flipping a coin (P(heads) = p) until the first heads appears
• Generating bits with P(bit = 1) = p until the first 1 is generated

X ~ Geo(p)
Support: {1, 2, …}
PMF: P(X = k) = (1 - p)^(k-1) p
Expectation: E[X] = 1/p
Variance: Var(X) = (1 - p)/p^2
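The Geometric facts above can be sanity-checked by truncating the infinite series far into the tail (a sketch of mine, not course material):

```python
def geometric_pmf(k, p):
    # P(X = k) = (1 - p)^(k - 1) p, for k = 1, 2, ...
    return (1 - p)**(k - 1) * p

p = 0.1
# Truncate the infinite sums deep in the tail, where terms are negligible:
ks = range(1, 2000)
total = sum(geometric_pmf(k, p) for k in ks)            # ~1
approx_mean = sum(k * geometric_pmf(k, p) for k in ks)  # ~1/p = 10
```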
Negative Binomial RV

Consider an experiment: independent trials of Ber(p) random variables.

def: A Negative Binomial random variable X is the # of trials until r successes.

Examples:
• Flipping a coin until the r-th heads appears
• # of strings to hash into a table until bucket 1 has r entries

X ~ NegBin(r, p)
Support: {r, r + 1, …}
PMF: P(X = k) = (k - 1 choose r - 1) (1 - p)^(k-r) p^r
Expectation: E[X] = r/p
Variance: Var(X) = r(1 - p)/p^2   (fixed lecture error)

Geo(p) = NegBin(1, p)
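The identity Geo(p) = NegBin(1, p) follows directly from the PMFs, and checks out in code (helpers are my own names):

```python
from math import comb

def negbin_pmf(k, r, p):
    # P(X = k) = (k-1 choose r-1) (1 - p)^(k - r) p^r, for k = r, r+1, ...
    return comb(k - 1, r - 1) * (1 - p)**(k - r) * p**r

def geometric_pmf(k, p):
    return (1 - p)**(k - 1) * p

# Geo(p) = NegBin(1, p): the PMFs coincide on the whole support
p = 0.3
gap = max(abs(negbin_pmf(k, 1, p) - geometric_pmf(k, p)) for k in range(1, 50))
```

With r = 1 the binomial coefficient (k-1 choose 0) is 1, so the two formulas are literally the same expression.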
Grid of random variables (updated)

Time until success:
• One success → Geo(p)
• Several successes → NegBin(r, p)   (NegBin with r = 1 is Geo)
• Interval of time to first success → (tomorrow)
Catching Pokemon

Wild Pokemon are captured by throwing Pokeballs at them.
• Each ball has probability p = 0.1 of capturing the Pokemon.
• Each ball is an independent trial.
What is the probability that you catch the Pokemon on the 5th try?

1. Define events/RVs & state goal: X ~ some distribution; want P(X = 5).
A. X ~ Bin(5, 0.1)  B. X ~ Poi(0.5)  C. X ~ NegBin(5, 0.1)  D. X ~ NegBin(1, 0.1)  E. X ~ Geo(0.1)  F. None/other
Catching Pokemon

1. Define events/RVs & state goal: X ~ Geo(0.1); want P(X = 5).

2. Solve, using X ~ Geo(p) with P(X = k) = (1 - p)^(k-1) p:
P(X = 5) = (0.9)^4 (0.1) ≈ 0.0656
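In code, the solve step is one line (my illustration, not from the deck):

```python
p = 0.1                         # capture probability per ball
# Geo(p) PMF at k = 5: four misses, then a capture
p_fifth = (1 - p)**(5 - 1) * p  # = 0.9^4 * 0.1 ~ 0.0656
```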
(live) 08: Random Variables III
Oishi Banerjee and Cooper Raterink
Adapted from Lisa Yan
July 8, 2020
Review: Our first common RVs

X ~ Ber(p)
Example: heads in one coin flip, P(heads) = 0.8 = p
Read as: (1) the random variable X (2) is distributed as a (3) Bernoulli (4) with parameter p.

X ~ Bin(n, p)
Example: # heads in 40 coin flips, P(heads) = 0.8 = p

Otherwise: identify the PMF, or identify as a function of an existing random variable.

Think (by yourself): the next slide has a matching question to go over by yourself. We'll go over it together afterwards.
Post any clarifications here: https://us.edstem.org/courses/667/discussion/84212
Think by yourself: 2 min
Visualizing Binomial PMFs (by yourself)

X ~ Bin(n, p): P(X = k) = (n choose k) p^k (1 - p)^(n-k), E[X] = np

Match the distribution to the graph:
1. Bin(10, 0.5)
2. Bin(10, 0.3)
3. Bin(10, 0.7)
4. Bin(5, 0.5)

[Four PMF plots A–D, each showing P(X = k) for k = 0, …, 10]
Review: Binomial RV is the sum of n independent Bernoulli RVs:
X = Σ_{i=1}^{n} Y_i,  where Y_i ~ Ber(p)
NBA Finals and genetics

Think, then Breakout Rooms: check out the questions on the next slide.
Post any clarifications here: https://us.edstem.org/courses/667/discussion/84212
By yourself: 2 min; breakout rooms: 5 min.
NBA Finals and genetics

1. The Golden State Warriors are going to play the Toronto Raptors in a 7-game series during the 2019 NBA finals.
• The Warriors have a probability of 58% of winning each game, independently.
• A team wins the series if they win at least 4 games (we play all 7 games).
What is P(Warriors winning)?

2. Each person has 2 genes per trait (e.g., eye color).
• A child receives 1 gene (equally likely) from each parent.
• Brown is "dominant", blue is "recessive": a child has brown eyes if either (or both) genes are brown, and blue eyes only if both genes are blue.
• The parents each have 1 brown and 1 blue gene.
A family has 4 children. What is P(exactly 3 children with brown eyes)?
NBA Finals

1. Define events/RVs & state goal:
X: # games the Warriors win; X ~ Bin(7, 0.58)

Desired probability? (select all that apply)
A. P(X > 4)  B. P(X ≥ 4)  C. P(X > 3)  D. 1 - P(X ≤ 3)  E. 1 - P(X < 3)
NBA Finals

1. Define events/RVs & state goal:
X: # games the Warriors win; X ~ Bin(7, 0.58); want P(X ≥ 4).

2. Solve:
P(X ≥ 4) = Σ_{k=4}^{7} P(X = k) = Σ_{k=4}^{7} (7 choose k) (0.58)^k (0.42)^(7-k) ≈ 0.6706

Cool Algebra/Probability Fact: this is identical to the probability of winning if we define winning = first to win 4 games.
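The sum is easy to evaluate in Python (my illustration; the course slides leave the arithmetic to the reader):

```python
from math import comb

p = 0.58  # per-game win probability
# P(X >= 4) for X ~ Bin(7, p): sum the PMF over k = 4..7
p_series = sum(comb(7, k) * p**k * (1 - p)**(7 - k) for k in range(4, 8))
```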
Genetic inheritance

A family has 4 children (gene setup as above). What is P(exactly 3 children with brown eyes)?

Subset of ideas:
A. Product of 4 independent events
B. Probability tree
C. Bernoulli, success p = 3 children with brown eyes
D. Binomial, n = 3 trials, success p = brown-eyed child
E. Binomial, n = 4 trials, success p = brown-eyed child
Genetic inheritance

1. Define events/RVs & goal:
E: a child has brown eyes; X: # brown-eyed children, X ~ Bin(4, p); want P(X = 3).

2. Identify known probabilities:
p = P(E) = 1 - P(both genes blue) = 1 - (1/2)(1/2) = 3/4

3. Solve:
P(X = 3) = (4 choose 3) (3/4)^3 (1/4)^1 = 27/64 ≈ 0.42
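The three steps translate directly into code (an illustration of mine, not from the deck):

```python
from math import comb

# Step 2: a child is blue-eyed only if both inherited genes are blue
p_brown = 1 - 0.5 * 0.5               # = 3/4
# Step 3: X ~ Bin(4, p_brown), evaluate the PMF at 3
p_exactly_3 = comb(4, 3) * p_brown**3 * (1 - p_brown)**1  # = 27/64
```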
Interlude for jokes/announcements

Announcements

Midterm Quiz
Time frame: Mon–Tues, July 20–21, 5pm–5pm PT
Covers: up to and including Lecture 11
Info and practice: http://web.stanford.edu/class/archive/cs/cs109/cs109.1208/exams/quizzes.html
Interesting probability news
https://theconversation.com/polly-knows-probability-this-parrot-can-predict-the-chances-of-something-happening-132767
Discrete RVs (LIVE)

The hardest part of problem-solving is determining what is a random variable.
Review: grid of random variables

Number of successes: Ber(p) (one trial), Bin(n, p) (several trials), Poi(λ) (interval of time)
Time until success (today!): Geo(p) (one success), NegBin(r, p) (several successes)
Breakout Rooms: check out the question on the next slide.
Post any clarifications here: https://us.edstem.org/courses/667/discussion/84212
Breakout rooms: 5 min. Introduce yourself!
An RV Tour

How would you model the following? Choose from:
A. Ber(p)  B. Bin(n, p)  C. Poi(λ)  D. Geo(p)  E. NegBin(r, p)

1. # of snapchats you receive in a day
2. # of children until the first one with brown eyes (same parents)
3. Whether a stock went up or down in a day
4. # of probability problems you try until you get 5 correct (if you are randomly correct)
5. # of years in some decade with at least 6 Atlantic hurricanes
An RV Tour

1. # of snapchats you receive in a day → C. Poi(λ)
2. # of children until the first one with brown eyes → D. Geo(p) or E. NegBin(1, p)
3. Whether a stock went up or down in a day → A. Ber(p) or B. Bin(1, p)
4. # of probability problems you try until you get 5 correct → E. NegBin(r = 5, p)
5. # of years in some decade with at least 6 Atlantic hurricanes → B. Bin(n = 10, p), where p = P(≥ 6 hurricanes in a year) is calculated from C. Poi(λ)
CS109 Learning Goal: Use new RVs

Let's say you are learning about servers/networks. You read about the M/D/1 queue:
"The service time busy period is distributed as a Borel with parameter μ = 0.2."
Goal: you can recognize terminology and understand the experiment setup.
Poisson RV (LIVE)
Review: Poisson Random Variable

In CS109, a Poisson RV X ~ Poi(λ) most often models:
• # of successes over a fixed interval of time; λ = E[X], the average successes/interval
• An approximation of X ~ Bin(n, p) where n is large and p is small; λ = E[X] = np
• An approximation of the Binomial even when successes in trials are not entirely independent (explored in problem set 3)

X ~ Poi(λ); Support: {0, 1, 2, …}
PMF: P(X = k) = e^(-λ) λ^k / k!
E[X] = λ;  Var(X) = λ
Breakout Rooms: the next slide has two questions to go over in groups.
Post any clarifications here: https://us.edstem.org/courses/667/discussion/84212
Breakout rooms: 5 mins
Web server load

1. Consider requests to a web server in 1 second.
• In the past, server load averages 2 hits/second.
• Let Y = # hits the server receives in a second.
What is P(Y < 5)?

2. Can the following Binomial RVs be approximated with Poisson?

(Recall Y ~ Poi(λ): P(Y = k) = e^(-λ) λ^k / k!, E[Y] = λ)
1. Web server load

Consider requests to a web server in 1 second.
• In the past, server load averages 2 hits/second.
• Let Y = # hits the server receives in a second.
What is P(Y < 5)?

1. Define RVs: Y ~ Poi(2), with λ = E[Y] = 2.
2. Solve: P(Y < 5) = Σ_{k=0}^{4} e^(-2) 2^k / k! ≈ 0.947
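Evaluating the sum numerically (my illustration, not course code):

```python
from math import exp, factorial

lam = 2  # average hits per second
# P(Y < 5) = sum of the Poi(2) PMF over k = 0..4
p_less_5 = sum(exp(-lam) * lam**k / factorial(k) for k in range(5))
```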
2. Can these Binomial RVs be approximated?

Poisson approximates Binomial when n is large, p is small, and λ = np is "moderate."
Different interpretations of "moderate":
• n > 20 and p < 0.05
• n > 100 and p < 0.1

• Bin(100, 0.5) vs. Poi(50): ✗ (p is not small)
• Bin(100, 0.04) vs. Poi(4): ✓
• Bin(100, 0.96) vs. Poi(4): ✗, but ⚠ we can approximate the complement, Bin(100, 1 - 0.96)
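A quick numerical comparison backs up the ✓/✗ calls above (a sketch of mine; the gap metric is one reasonable choice, not the course's):

```python
from math import comb, exp, factorial

def binomial_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    return exp(-lam) * lam**k / factorial(k)

def max_gap(n, p, lam):
    # largest pointwise difference between the two PMFs
    return max(abs(binomial_pmf(k, n, p) - poisson_pmf(k, lam))
               for k in range(n + 1))

bad = max_gap(100, 0.5, 50)   # p not small: poor fit
good = max_gap(100, 0.04, 4)  # n large, p small: good fit
```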