EEE251 PROBABILITY METHODS IN ENGINEERING
Bakhtiar Ali, Assistant Professor, Electrical Engineering, COMSATS, Islamabad.
In this chapter we are InShaAllah going to study:
5.1 Two Random Variables
5.2 Pairs of Discrete Random Variables
5.2.1 Marginal Probability Mass Function
5.3 The Joint CDF of X and Y
5.4 The Joint PDF of Two Continuous Random Variables
5.5 Independence of Two Random Variables
5.6 Joint Moments and Expected Values of a Function of Two Random Variables
5.6.1 Expected Value of a Function of Two Random Variables
5.6.2 Joint Moments, Correlation, and Covariance
5.7 Conditional Probability and Conditional Expectation
5.8 Functions of Two Random Variables
The notion of a random variable as a mapping is easily generalized to the case where two quantities are of interest.
Consider a random experiment with sample space S and event class F. We are interested in a function that assigns a pair of real numbers X(ζ) = (X(ζ), Y(ζ)) to each outcome ζ in S.
Basically we are dealing with a vector function that maps S into R², the real plane.
Example 5.1: Let a random experiment consist of selecting a student's name from an urn. Let ζ denote the outcome of this experiment, and define the following two functions:
H(ζ) = height of student ζ in centimeters
W(ζ) = weight of student ζ in kilograms
(H(ζ), W(ζ)) assigns a pair of numbers to each ζ in S.
We are interested in events involving the pair (H, W). For example, the event B = {H ≤ 183, W ≤ 82} represents students with height less than 183 cm (6 feet) and weight less than 82 kg (180 lb).
The events involving a pair of random variables (X, Y) are specified by conditions that we are interested in and can be represented by regions in the plane. The figure shows three examples of events:
A = {X + Y ≤ 10}
B = {min(X, Y) ≤ 5}
C = {X² + Y² ≤ 100}
Let the vector random variable X = (X, Y) assume values from some countable set
S_{X,Y} = {(x_j, y_k) : j = 1, 2, …, k = 1, 2, …}
The joint probability mass function of X specifies the probabilities of the event {X = x} ∩ {Y = y}:
p_{X,Y}(x, y) = P[{X = x} ∩ {Y = y}] = P[X = x, Y = y]  for (x, y) ∈ R²
The probability of any event B is the sum of the pmf over the outcomes in B:
P[X in B] = Σ_{(x_j, y_k) in B} p_{X,Y}(x_j, y_k)
When the event B is the entire sample space we have:
Σ_{j=1}^{∞} Σ_{k=1}^{∞} p_{X,Y}(x_j, y_k) = 1
Graphical representation of pmf's:
a) Table format
b) Use of arrows to show height
c) Labeled dots corresponding to pmf values
Example 5.6: A random experiment consists of tossing two "loaded" dice and noting the pair of numbers (X, Y) facing up. The joint pmf p_{X,Y}(j, k) for j = 1, …, 6 and k = 1, …, 6 is given by the two-dimensional table shown in the figure. The (j, k) entry in the table contains the value p_{X,Y}(j, k). Find P[min(X, Y) = 3].
P[min(X, Y) = 3]
= p_{X,Y}(6,3) + p_{X,Y}(5,3) + p_{X,Y}(4,3) + p_{X,Y}(3,3) + p_{X,Y}(3,4) + p_{X,Y}(3,5) + p_{X,Y}(3,6)
= 6(1/42) + 2/42 = 8/42.
The joint pmf of X provides the information about the joint behavior of X and Y. We are also interested in the probabilities of events involving each of the random variables in isolation. These can be found in terms of the marginal probability mass functions:
p_X(x_j) = P[X = x_j] = Σ_k p_{X,Y}(x_j, y_k)
and similarly
p_Y(y_k) = P[Y = y_k] = Σ_j p_{X,Y}(x_j, y_k)
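The joint pmf, marginals, and event probabilities of Example 5.6 can be checked with a short script. The full table is not reproduced on these slides, so the pmf below is an assumption: diagonal entries 2/42 and off-diagonal entries 1/42, which is consistent with the 6(1/42) + 2/42 computation above.

```python
from fractions import Fraction

# Hypothetical loaded-dice pmf (an assumption consistent with Example 5.6):
# diagonal entries 2/42, off-diagonal entries 1/42.
pmf = {(j, k): Fraction(2 if j == k else 1, 42)
       for j in range(1, 7) for k in range(1, 7)}

assert sum(pmf.values()) == 1  # pmf sums to one over the sample space

# Marginal pmf of X: p_X(j) = sum over k of p_{X,Y}(j, k)
p_X = {j: sum(pmf[(j, k)] for k in range(1, 7)) for j in range(1, 7)}

# P[min(X, Y) = 3]: sum the pmf over outcomes whose smaller value is 3
p_min3 = sum(p for (j, k), p in pmf.items() if min(j, k) == 3)
print(p_min3)   # 4/21, i.e. 8/42
print(p_X[5])   # 1/6, i.e. 7/42
```

Note how the event probability is just the pmf summed over the outcomes in the event, exactly as in the formula above.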
The joint cumulative distribution function of X and Y is defined as the probability of the event {X ≤ x1} ∩ {Y ≤ y1}:
F_{X,Y}(x1, y1) = P[X ≤ x1, Y ≤ y1]
The joint cdf satisfies the following properties. The joint cdf is a non-decreasing function of x and y:
F_{X,Y}(x1, y1) ≤ F_{X,Y}(x2, y2)  if x1 ≤ x2 and y1 ≤ y2
We obtain the marginal cumulative distribution functions by removing the constraint on one of the variables:
F_X(x) = F_{X,Y}(x, ∞)  and  F_Y(y) = F_{X,Y}(∞, y)
The joint cdf is continuous from the "north" and from the "east," that is,
lim_{x→a+} F_{X,Y}(x, y) = F_{X,Y}(a, y)  and  lim_{y→b+} F_{X,Y}(x, y) = F_{X,Y}(x, b)
The probability of the rectangle {x1 < x ≤ x2, y1 < y ≤ y2} is given by:
P[x1 < X ≤ x2, y1 < Y ≤ y2] = F_{X,Y}(x2, y2) − F_{X,Y}(x2, y1) − F_{X,Y}(x1, y2) + F_{X,Y}(x1, y1)
Example 5.11: The joint cdf for the pair of random variables X = (X, Y) is given by the expression shown in the figure. Plot the joint cdf and find the marginal cdf of X.
The marginal cdf of X is:
F_X(x) = F_{X,Y}(x, ∞) = x  for 0 ≤ x ≤ 1
X is uniformly distributed in the unit interval.
Example 5.12: The joint cdf for the vector of random variables X = (X, Y) is given by the expression shown in the figure. Find the marginal cdf's:
F_X(x) = F_{X,Y}(x, ∞)
F_Y(y) = F_{X,Y}(∞, y)
Example 5.13: Find the probability of the events A = {X ≤ 1, Y ≤ 1}, B = {X > x, Y > y}, where x > 0 and y > 0, and D = {1 < X ≤ 2, 2 < Y ≤ 5}.
The probability of A follows directly from the cdf: P[A] = F_{X,Y}(1, 1). The probability of D follows from the rectangle formula. The probability of B requires more work. By DeMorgan's rule:
B^c = {X ≤ x} ∪ {Y ≤ y}, so
P[B] = 1 − P[B^c] = 1 − [F_X(x) + F_Y(y) − F_{X,Y}(x, y)]
The joint probability density function of X and Y is defined as the function f_{X,Y}(x, y) whose integral over a region gives the probability of that region.
For discrete random variables:
P[X in B] = Σ_{(x_j, y_k) in B} p_{X,Y}(x_j, y_k)
For jointly continuous random variables:
P[X in B] = ∫∫_{(x,y) in B} f_{X,Y}(x', y') dx' dy'
The probability of A is the integral of f_{X,Y}(x, y) over the region defined by A.
When B is the entire plane, the integral must equal one:
∫_{−∞}^{∞} ∫_{−∞}^{∞} f_{X,Y}(x', y') dx' dy' = 1
The joint cdf can be obtained in terms of the joint pdf of jointly continuous random variables by integrating over the semi-infinite rectangle defined by (x, y):
F_{X,Y}(x, y) = ∫_{−∞}^{x} ∫_{−∞}^{y} f_{X,Y}(x', y') dy' dx'
It follows that, where the derivative exists, f_{X,Y}(x, y) = ∂²F_{X,Y}(x, y)/∂x∂y.
The probability of a rectangular region is obtained by
P[a1 < X ≤ b1, a2 < Y ≤ b2] = ∫_{a1}^{b1} ∫_{a2}^{b2} f_{X,Y}(x', y') dy' dx'
The marginal pdf's are obtained by integrating out the other variable:
f_X(x) = ∫_{−∞}^{∞} f_{X,Y}(x, y') dy'
f_Y(y) = ∫_{−∞}^{∞} f_{X,Y}(x', y) dx'
Example 5.16: Find the normalization constant c and the marginal pdf's for the following joint pdf (the same pdf is used again in Example 5.32):
f_{X,Y}(x, y) = c e^{−x} e^{−y}  for 0 ≤ y ≤ x < ∞, and 0 elsewhere.
Normalization requires ∫_0^∞ ∫_0^x c e^{−x} e^{−y} dy dx = c ∫_0^∞ e^{−x}(1 − e^{−x}) dx = c/2 = 1, so c = 2. The marginal pdf's are
f_X(x) = 2e^{−x}(1 − e^{−x})  for x ≥ 0
f_Y(y) = 2e^{−2y}  for y ≥ 0
Example 5.17: Find P[X + Y ≤ 1] in Example 5.16. The figure shows the intersection of the event and the region where the pdf is nonzero.
P[X + Y ≤ 1] = ∫_0^{1/2} ∫_y^{1−y} 2e^{−x} e^{−y} dx dy = ∫_0^{1/2} (2e^{−2y} − 2e^{−1}) dy = 1 − 2e^{−1}
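A numerical sanity check of Examples 5.16 and 5.17 is easy with a midpoint-rule grid. The pdf 2e^{−x}e^{−y} on 0 ≤ y ≤ x is the one reconstructed above (it is consistent with the conditional pdf's worked out in Example 5.32); the grid step and truncation point are arbitrary numerical choices.

```python
import numpy as np

# Midpoint-rule integration of f(x, y) = 2 e^{-x} e^{-y} on 0 <= y <= x.
# The infinite range is truncated at 12 (e^{-12} is negligible).
h = 0.01
g = np.arange(h / 2, 12, h)
X, Y = np.meshgrid(g, g, indexing="ij")
f = np.where(Y <= X, 2.0 * np.exp(-X) * np.exp(-Y), 0.0)

total = f.sum() * h * h                 # should be ~1, confirming c = 2
p = (f * (X + Y <= 1)).sum() * h * h    # P[X + Y <= 1]; exact value 1 - 2/e

print(abs(total - 1.0) < 0.02)
print(abs(p - (1 - 2 / np.e)) < 0.02)
```

The small discrepancy from the exact values comes from the grid cells straddling the boundary y = x, and shrinks with the step size h.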
X and Y are independent random variables if any event A1 defined in terms of X is independent of any event A2 defined in terms of Y; that is,
P[X in A1, Y in A2] = P[X in A1] P[Y in A2].
If X and Y are independent discrete random variables, then the joint pmf is equal to the product of the marginal pmf's:
p_{X,Y}(x_j, y_k) = p_X(x_j) p_Y(y_k)  for all x_j and y_k
In general, it can be shown that the random variables X and Y are independent if and only if their joint cdf is equal to the product of its marginal cdf's:
F_{X,Y}(x, y) = F_X(x) F_Y(y)  for all x and y
Similarly, if X and Y are jointly continuous, then X and Y are independent if and only if their joint pdf is equal to the product of the marginal pdf's:
f_{X,Y}(x, y) = f_X(x) f_Y(y)  for all x and y
Example 5.21: Are the random variables X and Y in Example 5.16 independent?
No: the joint pdf is nonzero only on the region y ≤ x, so the product of the marginal pdf's does not give the joint pdf.
In the case of two random variables we are interested in how X and Y vary together. In particular, we are interested in whether the variations of X and Y are correlated. For example, if X increases, does Y tend to increase or to decrease? The joint moments of X and Y, which are defined as expected values of functions of X and Y, provide this information.
5.6.1 Expected Value of a Function of Two Random Variables
The expected value of Z = g(X, Y) can be found directly from the joint distribution:
E[Z] = ∫∫ g(x', y') f_{X,Y}(x', y') dx' dy'  (jointly continuous)
E[Z] = Σ_m Σ_n g(x_m, y_n) p_{X,Y}(x_m, y_n)  (discrete)
Example 5.24 Sum of Random Variables: Let Z = X + Y. Find E[Z].
E[Z] = E[X + Y] = ∫∫ (x' + y') f_{X,Y}(x', y') dx' dy' = E[X] + E[Y]
The joint moments of two random variables X and Y summarize information about their joint behavior. The jk-th joint moment of X and Y is defined by
E[X^j Y^k] = ∫∫ x'^j y'^k f_{X,Y}(x', y') dx' dy'  (jointly continuous)
E[X^j Y^k] = Σ_m Σ_n x_m^j y_n^k p_{X,Y}(x_m, y_n)  (discrete)
In electrical engineering, it is customary to call the j = 1, k = 1 moment, E[XY], the correlation of X and Y. If E[XY] = 0, then we say that X and Y are orthogonal.
The jk-th central moment of X and Y is defined as the joint moment of the centered random variables X − E[X] and Y − E[Y]:
E[(X − E[X])^j (Y − E[Y])^k]
The covariance of X and Y is defined as the j = k = 1 central moment:
COV(X, Y) = E[(X − E[X])(Y − E[Y])]
The above equation can be simplified to
COV(X, Y) = E[XY] − E[X]E[Y]
Note that COV(X, Y) = E[XY] if either of the random variables has mean zero.
Example 5.26 Covariance of Independent Random Variables: Let X and Y be independent random variables. Find their covariance.
COV(X, Y) = E[XY] − E[X]E[Y] = E[X]E[Y] − E[X]E[Y] = 0
since independence implies E[XY] = E[X]E[Y].
The correlation coefficient of X and Y is defined by
ρ_{X,Y} = COV(X, Y)/(σ_X σ_Y) = (E[XY] − E[X]E[Y])/(σ_X σ_Y)
The correlation coefficient is a number that is at most 1 in magnitude:
−1 ≤ ρ_{X,Y} ≤ 1
When X and Y are related linearly, Y = aX + b, then ρ_{X,Y} = 1 if a > 0 and ρ_{X,Y} = −1 if a < 0.
X and Y are said to be uncorrelated if ρ_{X,Y} = 0. If X and Y are independent, then COV(X, Y) = 0, so ρ_{X,Y} = 0. Thus if X and Y are independent, then X and Y are uncorrelated.
It is possible for X and Y to be uncorrelated but not independent.
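A classic illustration of "uncorrelated but not independent," not taken from these slides: let X be uniform on [−1, 1] and Y = X². Then COV(X, Y) = E[X³] − E[X]E[X²] = 0, yet Y is completely determined by X. A short simulation makes both facts visible:

```python
import numpy as np

# X uniform on [-1, 1] and Y = X^2: an illustrative pair that is
# uncorrelated (Cov = E[X^3] - E[X]E[X^2] = 0) but clearly dependent.
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 1_000_000)
y = x ** 2

cov = np.mean(x * y) - np.mean(x) * np.mean(y)
print(abs(cov) < 0.01)                # True: sample covariance is ~0

# Dependence: P[Y <= 1/4, |X| <= 1/2] = 1/2, but the product of the
# marginals is (1/2)(1/2) = 1/4, so the product rule fails.
p_joint = np.mean((y <= 0.25) & (np.abs(x) <= 0.5))
p_prod = np.mean(y <= 0.25) * np.mean(np.abs(x) <= 0.5)
print(p_joint - p_prod > 0.1)         # True: joint != product
```

This is exactly the situation in the last sentence above: zero correlation measures only the absence of a linear trend, not independence.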
FIGURE 5.3: A scattergram for 200 observations of four different pairs of random variables.
5.48. Let X and Y be independent random variables that are uniformly distributed in [0, 1]. Find the probability of the following events:
a) P[X² < 1/2, Y < 1/2] = P[X < 1/√2] P[Y < 1/2] = (1/√2)(1/2) = 1/(2√2)
5.58. Find E[X²e^Y] where X and Y are independent random variables, X is a zero-mean, unit-variance Gaussian random variable, and Y is a uniform random variable in the interval [0, 3].
By independence the expectation factors:
E[X²e^Y] = E[X²] E[e^Y] = 1 × (1/3) ∫_0^3 e^y dy = (1/3)(e³ − 1)
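The factorization in Problem 5.58 can be confirmed by a Monte Carlo estimate of E[X²e^Y] using the distributions stated in the problem (the sample size and seed are arbitrary choices):

```python
import math
import random

# Monte Carlo check of E[X^2 e^Y] = E[X^2] E[e^Y] = (1/3)(e^3 - 1) ~ 6.36
random.seed(1)
n = 500_000
acc = 0.0
for _ in range(n):
    x = random.gauss(0.0, 1.0)      # zero-mean, unit-variance Gaussian
    y = random.uniform(0.0, 3.0)    # uniform on [0, 3]
    acc += x * x * math.exp(y)

estimate = acc / n
exact = (math.exp(3) - 1) / 3
print(abs(estimate - exact) < 0.1)  # True: estimate agrees with the formula
```

The agreement is exactly what independence predicts: sampling X and Y separately and multiplying reproduces the product of the individual expectations.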
Many random variables of practical interest are not independent: The output Y of a communication channel must depend on the input X in order to convey information; consecutive samples of a waveform that varies slowly are likely to be close in value and hence are not independent.
5.7.1 Conditional Probability
Case 1: X Is a Discrete Random Variable: For X and Y discrete random variables, the conditional pmf of Y given X = x is defined by:
p_Y(y|x) = P[Y = y | X = x] = P[X = x, Y = y]/P[X = x] = p_{X,Y}(x, y)/p_X(x)  for p_X(x) > 0
The conditional pmf satisfies all the properties of a pmf.
The probability of an event A given X = x_k is found by adding the pmf values of the outcomes in A:
P[Y in A | X = x_k] = Σ_{y_j in A} p_Y(y_j | x_k)
If X and Y are independent, then
p_Y(y | x_k) = P[Y = y] = p_Y(y)
In other words, knowledge that X = x_k does not affect the probability of events A involving Y.
Example 5.29 Loaded Dice: Find p_Y(y|5) in the loaded dice experiment considered in Examples 5.6 and 5.8.
p_Y(y|5) = p_{X,Y}(5, y)/p_X(5)
With p_X(5) = 7/42:
p_Y(5|5) = (2/42)/(7/42) = 2/7
p_Y(2|5) = (1/42)/(7/42) = 1/7, and likewise p_Y(y|5) = 1/7 for every y ≠ 5.
Suppose Y is a continuous random variable. Define the conditional cdf of Y given X = x_k by
F_Y(y | x_k) = P[Y ≤ y, X = x_k]/P[X = x_k]
It is easy to show that F_Y(y|x_k) satisfies all the properties of a cdf.
The conditional pdf of Y given X = x_k, if the derivative exists, is given by
f_Y(y | x_k) = d/dy F_Y(y | x_k)
If X and Y are independent, P[Y ≤ y, X = x_k] = P[Y ≤ y] P[X = x_k], so F_Y(y|x) = F_Y(y) and f_Y(y|x) = f_Y(y).
Example 5.31 Binary Communications System: The input X to a communication channel assumes the values +1 or −1 with probabilities 1/3 and 2/3. The output Y of the channel is given by Y = X + N, where N is a zero-mean, unit-variance Gaussian random variable. Find the conditional pdf of Y given X = +1 and given X = −1. Find P[X = +1 | Y > 0].
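The last part of Example 5.31 is Bayes' rule with Gaussian tail probabilities: given X = ±1, Y > 0 exactly when N > ∓1. A sketch of that final step, writing the Gaussian tail Q(t) = P[N > t] in terms of the standard erfc function:

```python
import math

def Q(t):
    # Gaussian tail probability P[N > t] for a standard normal N
    return 0.5 * math.erfc(t / math.sqrt(2))

p_plus, p_minus = 1 / 3, 2 / 3        # priors on X = +1 and X = -1
lik_plus = Q(-1.0)                    # P[Y > 0 | X = +1] = P[N > -1]
lik_minus = Q(1.0)                    # P[Y > 0 | X = -1] = P[N > 1]

# Bayes' rule: P[X = +1 | Y > 0]
posterior = p_plus * lik_plus / (p_plus * lik_plus + p_minus * lik_minus)
print(round(posterior, 3))            # 0.726
```

So observing a positive output makes X = +1 the more likely input, even though its prior probability was only 1/3.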
If X is a continuous random variable, then P[X = x] = 0, and the conditional cdf of Y given X = x is defined as a limit:
F_Y(y|x) = lim_{h→0} P[Y ≤ y | x < X ≤ x + h]
The conditional pdf of Y given X = x is then:
f_Y(y|x) = f_{X,Y}(x, y)/f_X(x)
It is easy to show that f_Y(y|x) satisfies the properties of a pdf.
The probability of event A given X = x is obtained as follows:
P[Y in A | X = x] = ∫_A f_Y(y|x) dy
Example 5.32: Let X and Y be the random variables in Example 5.16. Find f_X(x|y) and f_Y(y|x).
Using the marginal pdf's obtained in Example 5.16, we have
f_X(x|y) = 2e^{−x}e^{−y} / (2e^{−2y}) = e^{−(x−y)}  for x ≥ y
f_Y(y|x) = 2e^{−x}e^{−y} / (2e^{−x}(1 − e^{−x})) = e^{−y}/(1 − e^{−x})  for 0 < y < x
The conditional expectation of Y given X = x is defined by
E[Y | X = x] = ∫_{−∞}^{∞} y f_Y(y|x) dy
An interesting corollary is
E[Y] = E[E[Y | X]] = ∫_{−∞}^{∞} E[Y | X = x] f_X(x) dx
5.8.1 One Function of Two Random Variables: Let the random variable Z be defined as a function of two random variables:
Z = g(X, Y)
The cdf of Z is found by first finding the equivalent event of {Z ≤ z}, that is, the set R_z = {x = (x, y) such that g(x) ≤ z}; then
F_Z(z) = P[X in R_z] = ∫∫_{(x,y) in R_z} f_{X,Y}(x', y') dx' dy'
The pdf of Z is then found by taking the derivative of F_Z(z).
Example 5.39 Sum of Two Random Variables: Let Z = X + Y. Find F_Z(z) and f_Z(z) in terms of the joint pdf of X and Y.
The cdf of Z is found by integrating the joint pdf of X and Y over the region of the plane corresponding to the event {Z ≤ z}, as shown in the figure:
F_Z(z) = ∫_{−∞}^{∞} ∫_{−∞}^{z−y'} f_{X,Y}(x', y') dx' dy'
Thus the pdf for the sum of two random variables is given by a superposition integral:
f_Z(z) = d/dz F_Z(z) = ∫_{−∞}^{∞} f_{X,Y}(z − y', y') dy'
If X and Y are independent random variables, then the pdf is given by the convolution integral of the marginal pdf's of X and Y:
f_Z(z) = ∫_{−∞}^{∞} f_X(z − y') f_Y(y') dy'
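The convolution integral can be sketched numerically. As an illustrative choice (not from the slides), take X and Y independent and uniform on [0, 1]; their sum should have the triangular pdf on [0, 2] that peaks at z = 1:

```python
import numpy as np

# Discrete approximation of f_Z = f_X * f_Y (convolution) for
# independent X, Y ~ Uniform[0, 1].
h = 0.001
t = np.arange(0.0, 1.0, h)
f_X = np.ones_like(t)                 # uniform pdf on [0, 1]
f_Y = np.ones_like(t)

f_Z = np.convolve(f_X, f_Y) * h       # approximates the convolution integral
# Triangular pdf: f_Z(z) = z on [0, 1], f_Z(z) = 2 - z on [1, 2]
print(round(f_Z[int(0.5 / h)], 2))    # 0.5
print(round(f_Z[int(1.0 / h)], 2))    # 1.0
```

The same recipe (discretize, convolve, scale by the step) approximates the sum pdf for any pair of independent densities.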
5.8. For the pair of random variables (X, Y), sketch the region of the plane corresponding to the following events. Identify which events are of product form.
5.28. The random vector (X, Y) is uniformly distributed (i.e., f_{X,Y}(x, y) = k) in the regions shown in the figure and zero elsewhere.
(a) Find the value of k in each case.
(b) Find the marginal pdf for X and for Y in each case.
(c) Find P[X > 0, Y > 0].
5.48. Let X and Y be independent random variables that are uniformly distributed in [−1, +1]. Find the probability of the following events:
For (X, Y) uniformly distributed on the unit disk, f_{X,Y}(x, y) = 1/π for x² + y² ≤ 1:
f_Y(y|x) = f_{X,Y}(x, y)/f_X(x)
f_X(x) = ∫_{−√(1−x²)}^{√(1−x²)} (1/π) dy = (2/π)√(1 − x²)
f_Y(y|x) = (1/π) / ((2/π)√(1 − x²)) = 1/(2√(1 − x²))  for −√(1−x²) ≤ y ≤ √(1−x²)
E[Y | X = x] = ∫_{−∞}^{∞} y f_Y(y|x) dy = ∫_{−√(1−x²)}^{√(1−x²)} y/(2√(1 − x²)) dy
= [y²/(4√(1 − x²))]_{−√(1−x²)}^{√(1−x²)} = 0
E[Y] = E[E[Y | X]] = 0
Example 5.34: X is selected at random from the unit interval; Y is then selected at random from the interval (0, X). Find the cdf of Y.
When X = x, Y is uniformly distributed in (0, x), so the conditional cdf given X = x is
F_Y(y|x) = y/x  for 0 ≤ y ≤ x
The corresponding pdf is obtained by taking the derivative of the cdf:
f_Y(y|x) = 1/x  for 0 < y < x
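Averaging the conditional cdf over X completes Example 5.34: F_Y(y) = ∫_0^y 1 dx + ∫_y^1 (y/x) dx = y(1 − ln y) for 0 < y < 1. That closed form is our own working, not stated on the slide, so here is a simulation sketch that checks it at one point:

```python
import math
import random

# Simulate Example 5.34: X uniform on (0, 1), then Y uniform on (0, X).
# Compare the empirical P[Y <= 0.5] against F_Y(0.5) = 0.5 (1 - ln 0.5).
random.seed(2)
n = 400_000
count = 0
for _ in range(n):
    x = random.random()            # X uniform on (0, 1)
    y = random.uniform(0.0, x)     # Y uniform on (0, X)
    if y <= 0.5:
        count += 1

exact = 0.5 * (1 - math.log(0.5))  # ~0.847
print(abs(count / n - exact) < 0.005)   # True
```

Note how much probability mass Y carries near 0: conditioning on small X squeezes Y toward the origin, which is why f_Y(y) = ln(1/y) blows up as y → 0.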
For the joint pdf of Example 5.16, f_{X,Y}(x, y) = 2e^{−(x+y)} for 0 ≤ y ≤ x, the cdf of Z = X + Y is
F_Z(z) = P[Z ≤ z] = P[X + Y ≤ z] = ∫_0^{z/2} ∫_y^{z−y} 2e^{−(x+y)} dx dy
5.103. Find the joint cdf of W = min(X, Y) and Z = max(X, Y) if X and Y are independent exponential random variables with the same mean.
f_{X,Y}(x, y) = λ² e^{−λx} e^{−λy}
F_W(w) = P[min(X, Y) ≤ w]
= ∫_0^w ∫_0^∞ f_{X,Y}(x, y) dy dx + ∫_w^∞ ∫_0^w f_{X,Y}(x, y) dy dx
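Carrying out the two integrals above gives F_W(w) = (1 − e^{−λw}) + e^{−λw}(1 − e^{−λw}) = 1 − e^{−2λw}: the minimum of two independent Exponential(λ) random variables is Exponential(2λ). A simulation sketch confirms this (λ = 1.5 and w = 0.4 are arbitrary test values):

```python
import math
import random

# Check F_W(w) = 1 - e^{-2*lam*w} for W = min(X, Y) with X, Y
# independent Exponential(lam).
random.seed(3)
lam, w, n = 1.5, 0.4, 400_000
count = sum(
    min(random.expovariate(lam), random.expovariate(lam)) <= w
    for _ in range(n)
)
print(abs(count / n - (1 - math.exp(-2 * lam * w))) < 0.005)   # True
```

The rate-doubling makes intuitive sense: either of the two variables falling below w triggers the event, so small values arrive twice as fast.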