
Transcript
Page 1: Chap 5 PME

EEE251 PROBABILITY METHODS IN ENGINEERING

Bakhtiar Ali, Assistant Professor, Electrical Engineering, COMSATS, Islamabad.

Page 2: Chap 5 PME
Page 3: Chap 5 PME

In this chapter we are InShaAllah going to study:

5.1 Two Random Variables
5.2 Pairs of Discrete Random Variables
5.2.1 Marginal Probability Mass Function
5.3 The Joint CDF of X and Y
5.4 The Joint PDF of Two Continuous Random Variables
5.5 Independence of Two Random Variables
5.6 Joint Moments and Expected Values of a Function of Two Random Variables
5.6.1 Expected Value of a Function of Two Random Variables
5.6.2 Joint Moments, Correlation, and Covariance
5.7 Conditional Probability and Conditional Expectation
5.8 Functions of Two Random Variables

Page 4: Chap 5 PME

The notion of a random variable as a mapping is easily generalized to the case where two quantities are of interest.

Consider a random experiment with sample space S and event class F. We are interested in a function that assigns a pair of real numbers X(ζ) = (X(ζ), Y(ζ)) to each outcome ζ in S.

Basically we are dealing with a vector function that maps S into R², the real plane.

Page 5: Chap 5 PME

Example 5.1: Let a random experiment consist of selecting a student's name from an urn. Let ζ denote the outcome of this experiment, and define the following two functions:

H(ζ) = height of student ζ in centimeters
W(ζ) = weight of student ζ in kilograms

(H(ζ), W(ζ)) assigns a pair of numbers to each ζ in S.

We are interested in events involving the pair (H, W). For example, the event B = {H ≤ 183, W ≤ 82} represents students with height less than 183 cm (6 feet) and weight less than 82 kg (180 lb).

Page 6: Chap 5 PME

The events involving a pair of random variables (X, Y) are specified by conditions that we are interested in and can be represented by regions in the plane. The figure shows three examples of events:

A = {X + Y ≤ 10}
B = {min(X, Y) ≤ 5}
C = {X² + Y² ≤ 100}
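These event regions can be expressed directly as predicates on a point (x, y). A minimal Python sketch (the function names are our own, not from the text):

```python
# Each event is a region of the plane; membership is a simple predicate.

def in_A(x, y):          # A = {X + Y <= 10}
    return x + y <= 10

def in_B(x, y):          # B = {min(X, Y) <= 5}
    return min(x, y) <= 5

def in_C(x, y):          # C = {X^2 + Y^2 <= 100}
    return x**2 + y**2 <= 100

print(in_A(3, 4), in_B(6, 7), in_C(8, 8))  # True False False
```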

Page 7: Chap 5 PME

Let the vector random variable X = (X, Y) assume values from some countable set

S_{X,Y} = {(x_j, y_k) : j = 1, 2, …, k = 1, 2, …}

The joint probability mass function of X specifies the probabilities of the event {X = x} ∩ {Y = y}:

p_{X,Y}(x, y) = P[{X = x} ∩ {Y = y}] = P[X = x, Y = y] for (x, y) ∈ R²

The probability of any event B is the sum of the pmf over the outcomes in B:

P[X in B] = Σ_{(x_j, y_k) in B} p_{X,Y}(x_j, y_k)

When the event B is the entire sample space we have:

Σ_{j=1}^∞ Σ_{k=1}^∞ p_{X,Y}(x_j, y_k) = 1

Page 8: Chap 5 PME

Graphical representation of pmf's:

a) Table format
b) Use of arrows to show height
c) Labeled dots corresponding to pmf values

Page 9: Chap 5 PME

Example 5.6: A random experiment consists of tossing two “loaded” dice and noting the pair of numbers (X, Y) facing up. The joint pmf p_{X,Y}(j, k) for j = 1, …, 6 and k = 1, …, 6 is given by the two-dimensional table shown in the figure. The (j, k) entry in the table contains the value p_{X,Y}(j, k). Find P[min(X, Y) = 3].

P[min(X, Y) = 3] = p_{X,Y}(6, 3) + p_{X,Y}(5, 3) + p_{X,Y}(4, 3) + p_{X,Y}(3, 3) + p_{X,Y}(3, 4) + p_{X,Y}(3, 5) + p_{X,Y}(3, 6)

= 6 × (1/42) + 2/42 = 8/42
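The table itself is in the figure, but the slide's arithmetic (six terms of 1/42 plus one of 2/42) is consistent with a table that puts 2/42 on the diagonal and 1/42 elsewhere. Assuming that table, the computation can be checked exactly:

```python
from fractions import Fraction

# Joint pmf inferred from the slide's arithmetic (an assumption):
# 2/42 on the diagonal (j == k), 1/42 elsewhere; entries sum to 1.
def p(j, k):
    return Fraction(2, 42) if j == k else Fraction(1, 42)

total = sum(p(j, k) for j in range(1, 7) for k in range(1, 7))
p_min3 = sum(p(j, k) for j in range(1, 7) for k in range(1, 7)
             if min(j, k) == 3)

print(total)    # 1
print(p_min3)   # 4/21  (= 8/42, as on the slide)
```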

Page 10: Chap 5 PME

The joint pmf of X provides the information about the joint behavior of X and Y. We are also interested in the probabilities of events involving each of the random variables in isolation. These can be found in terms of the marginal probability mass functions:

p_X(x_j) = P[X = x_j] = Σ_{k=1}^∞ p_{X,Y}(x_j, y_k)

and similarly

p_Y(y_k) = P[Y = y_k] = Σ_{j=1}^∞ p_{X,Y}(x_j, y_k)
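A marginal pmf is just a row or column sum of the joint table. A sketch, reusing the assumed loaded-dice table (2/42 diagonal, 1/42 off-diagonal) as a stand-in for the figure:

```python
from fractions import Fraction

# Assumed loaded-dice joint pmf (2/42 diagonal, 1/42 off-diagonal).
def p(j, k):
    return Fraction(2, 42) if j == k else Fraction(1, 42)

# Marginals: sum the joint pmf over the other variable.
p_X = {j: sum(p(j, k) for k in range(1, 7)) for j in range(1, 7)}
p_Y = {k: sum(p(j, k) for j in range(1, 7)) for k in range(1, 7)}

print(p_X[5])              # 1/6  (= 7/42)
print(sum(p_X.values()))   # 1
```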

Page 11: Chap 5 PME

The joint cumulative distribution function of X and Y is defined as the probability of the event {X ≤ x1} ∩ {Y ≤ y1}:

F_{X,Y}(x1, y1) = P[X ≤ x1, Y ≤ y1]

The joint cdf satisfies the following properties. The joint cdf is a non-decreasing function of x and y:

F_{X,Y}(x1, y1) ≤ F_{X,Y}(x2, y2) if x1 ≤ x2 and y1 ≤ y2

We obtain the marginal cumulative distribution functions by removing the constraint on one of the variables:

F_X(x) = F_{X,Y}(x, ∞) and F_Y(y) = F_{X,Y}(∞, y)

The joint cdf is continuous from the “north” and from the “east,” that is,

lim_{y'↓y} F_{X,Y}(x, y') = F_{X,Y}(x, y) and lim_{x'↓x} F_{X,Y}(x', y) = F_{X,Y}(x, y)

Page 12: Chap 5 PME

The probability of the rectangle {x1 < x ≤ x2, y1 < y ≤ y2} is given by:

P[x1 < X ≤ x2, y1 < Y ≤ y2] = F_{X,Y}(x2, y2) − F_{X,Y}(x2, y1) − F_{X,Y}(x1, y2) + F_{X,Y}(x1, y1)

Page 13: Chap 5 PME

Example 5.11: The joint cdf for the pair of random variables X = (X, Y) is given by

Plot the joint cdf and find the marginal cdf of X.

Page 14: Chap 5 PME

The marginal cdf of X is:

F_X(x) = F_{X,Y}(x, ∞)

X is uniformly distributed in the unit interval.

Page 15: Chap 5 PME

Example 5.12: The joint cdf for the vector of random variables X = (X, Y) is given by

Find the marginal cdf's:
F_X(x) =
F_Y(y) =

Example 5.13: Find the probability of the events A = {X ≤ 1, Y ≤ 1}, B = {X > x, Y > y}, where x > 0 and y > 0, and D = {1 < X ≤ 2, 2 < Y ≤ 5}.

Page 16: Chap 5 PME

The probability of B requires more work. By De Morgan's rule:

Page 17: Chap 5 PME

The joint probability density function of X and Y is defined as

f_{X,Y}(x, y) = ∂²F_{X,Y}(x, y) / ∂x ∂y

whenever the derivative exists.

For discrete random variables: P[X in B] = Σ_{(x_j, y_k) in B} p_{X,Y}(x_j, y_k)

For a continuous random variable: P[X in B] = ∫∫_B f_{X,Y}(x', y') dx' dy'

Page 18: Chap 5 PME

The probability of A is the integral of f_{X,Y}(x, y) over the region defined by A.

Page 19: Chap 5 PME

When B is the entire plane, the integral must equal one:

∫_{−∞}^{∞} ∫_{−∞}^{∞} f_{X,Y}(x', y') dx' dy' = 1

The joint cdf can be obtained in terms of the joint pdf of jointly continuous random variables by integrating over the semi-infinite rectangle defined by (x, y):

F_{X,Y}(x, y) = ∫_{−∞}^{x} ∫_{−∞}^{y} f_{X,Y}(x', y') dy' dx'

The probability of a rectangular region is obtained by

P[x1 < X ≤ x2, y1 < Y ≤ y2] = ∫_{x1}^{x2} ∫_{y1}^{y2} f_{X,Y}(x', y') dy' dx'

Page 20: Chap 5 PME

The marginal pdf's are obtained by integrating the joint pdf over the other variable:

f_X(x) = ∫_{−∞}^{∞} f_{X,Y}(x, y') dy'

f_Y(y) = ∫_{−∞}^{∞} f_{X,Y}(x', y) dx'

Page 21: Chap 5 PME

Example 5.16: Find the normalization constant c and the marginal pdf's for the following joint pdf:

f_{X,Y}(x, y) = c e^{−x} e^{−y} for 0 ≤ y ≤ x < ∞, and 0 elsewhere

Page 22: Chap 5 PME

Example 5.17: Find P[X + Y ≤ 1] in Example 5.16. The figure shows the intersection of the event and the region where the pdf is nonzero.

P[X + Y ≤ 1] = ∫_0^{1/2} ∫_y^{1−y} 2e^{−x} e^{−y} dx dy = ∫_0^{1/2} (2e^{−2y} − 2e^{−1}) dy = 1 − 2e^{−1}
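A crude Riemann-sum check of this probability, assuming (as the Example 5.32 slide later indicates) that the Example 5.16 pdf is 2e^{−x}e^{−y} on 0 ≤ y ≤ x:

```python
import math

# Midpoint-rule integration of f(x, y) = 2 e^{-x} e^{-y} over the
# region {0 <= y <= x} intersected with {x + y <= 1}.
n = 800
h = 1.0 / n
prob = 0.0
for i in range(n):
    x = (i + 0.5) * h
    for j in range(n):
        y = (j + 0.5) * h
        if y <= x and x + y <= 1:
            prob += 2 * math.exp(-x - y) * h * h

exact = 1 - 2 / math.e   # the closed-form answer 1 - 2e^{-1}
print(round(prob, 3), round(exact, 3))
```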

Page 23: Chap 5 PME

X and Y are independent random variables if any event A1 defined in terms of X is independent of any event A2 defined in terms of Y; that is,

P[X in A1, Y in A2] = P[X in A1] P[Y in A2]

If X and Y are independent discrete random variables, then the joint pmf is equal to the product of the marginal pmf's:

p_{X,Y}(x_j, y_k) = p_X(x_j) p_Y(y_k) for all x_j and y_k

Page 24: Chap 5 PME

In general, it can be shown that the random variables X and Y are independent if and only if their joint cdf is equal to the product of its marginal cdf's:

F_{X,Y}(x, y) = F_X(x) F_Y(y) for all x and y

Similarly, if X and Y are jointly continuous, then X and Y are independent if and only if their joint pdf is equal to the product of the marginal pdf's:

f_{X,Y}(x, y) = f_X(x) f_Y(y) for all x and y

Page 25: Chap 5 PME

Example 5.21: Are the random variables X and Y in Example 5.16 independent?

No, the product of the marginal pdf's does not give us the joint pdf: the joint pdf is nonzero only for x ≥ y, while the product f_X(x) f_Y(y) is positive everywhere in the first quadrant.

Page 26: Chap 5 PME

In the case of two random variables we are interested in how X and Y vary together. In particular, we are interested in whether the variations of X and Y are correlated. For example, if X increases does Y tend to increase or to decrease? The joint moments of X and Y, which are defined as expected values of functions of X and Y, provide this information.

5.6.1 Expected Value of a Function of Two Random Variables

Page 27: Chap 5 PME

Example 5.24 Sum of Random Variables: Let Z = X + Y. Find E[Z].

E[Z] = E[X + Y] = E[X] + E[Y]

Page 28: Chap 5 PME

The joint moments of two random variables X and Y summarize information about their joint behavior. The jk-th joint moment of X and Y is defined by

E[X^j Y^k] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x^j y^k f_{X,Y}(x, y) dx dy (jointly continuous case)

In electrical engineering, it is customary to call the j = 1, k = 1 moment, E[XY], the correlation of X and Y. If E[XY] = 0 then we say that X and Y are orthogonal.

The jk-th central moment of X and Y is defined as the joint moment of the centered random variables X − E[X] and Y − E[Y]:

E[(X − E[X])^j (Y − E[Y])^k]

Page 29: Chap 5 PME

The covariance of X and Y is defined as the central moment:

COV(X, Y) = E[(X − E[X])(Y − E[Y])]

The above equation can be simplified to

COV(X, Y) = E[XY] − E[X]E[Y]

Note that COV(X, Y) = E[XY] if either of the random variables has mean zero.

Example 5.26 Covariance of Independent Random Variables: Let X and Y be independent random variables. Find their covariance.

COV(X, Y) = E[XY] − E[X]E[Y] = E[X]E[Y] − E[X]E[Y] = 0, since E[XY] = E[X]E[Y] when X and Y are independent.

Page 30: Chap 5 PME

The correlation coefficient of X and Y is defined by

ρ_{X,Y} = COV(X, Y) / (σ_X σ_Y) = (E[XY] − E[X]E[Y]) / (σ_X σ_Y)

The correlation coefficient is a number that is at most 1 in magnitude:

−1 ≤ ρ_{X,Y} ≤ 1

When X and Y are related linearly, Y = aX + b, then ρ_{X,Y} = 1 if a > 0 and ρ_{X,Y} = −1 if a < 0.

X and Y are said to be uncorrelated if ρ_{X,Y} = 0. If X and Y are independent, then COV(X, Y) = 0, so ρ_{X,Y} = 0. Thus if X and Y are independent, then X and Y are uncorrelated.

It is possible for X and Y to be uncorrelated but not independent.
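A standard illustration of this last point (not one of the slides' examples) is X uniform on [−1, 1] with Y = X²: the covariance vanishes by symmetry, even though Y is a deterministic function of X. A Monte Carlo sketch:

```python
import random

# X ~ Uniform[-1, 1], Y = X^2: uncorrelated (COV = E[X^3] - E[X]E[X^2] = 0)
# yet clearly dependent, since Y is a function of X.
random.seed(1)
xs = [random.uniform(-1, 1) for _ in range(200_000)]
ys = [x * x for x in xs]

mx = sum(xs) / len(xs)
my = sum(ys) / len(ys)
cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)

print(round(cov, 2))   # approximately 0
```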

Page 31: Chap 5 PME

FIGURE 5.3: A scattergram for 200 observations of four different pairs of random variables.

Page 32: Chap 5 PME

5.48. Let X and Y be independent random variables that are uniformly distributed in [0, 1]. Find the probability of the following events:

a) P[X² < 1/2, Y < 1/2] = P[X < 1/√2] P[Y < 1/2] = (1/√2)(1/2)

5.58. Find E[X² e^Y] where X and Y are independent random variables, X is a zero-mean, unit-variance Gaussian random variable, and Y is a uniform random variable in the interval [0, 3].

E[X² e^Y] = E[X²] E[e^Y] = 1 × (1/3) ∫_0^3 e^y dy = (1/3)(e³ − 1)
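A Monte Carlo check of the 5.58 result, (e³ − 1)/3 ≈ 6.36:

```python
import math
import random

# E[X^2 e^Y] for X ~ N(0, 1) independent of Y ~ Uniform[0, 3]
# should factor into E[X^2] E[e^Y] = 1 * (e^3 - 1)/3.
random.seed(1)
N = 200_000
est = sum(random.gauss(0, 1) ** 2 * math.exp(random.uniform(0, 3))
          for _ in range(N)) / N

exact = (math.e ** 3 - 1) / 3
print(round(est, 1), round(exact, 1))
```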

Page 33: Chap 5 PME

Many random variables of practical interest are not independent: The output Y of a communication channel must depend on the input X in order to convey information; consecutive samples of a waveform that varies slowly are likely to be close in value and hence are not independent.

5.7.1 Conditional Probability

Case 1: X Is a Discrete Random Variable: For X and Y discrete random variables, the conditional pmf of Y given X = x is defined by:

p_Y(y | x) = P[Y = y | X = x] = p_{X,Y}(x, y) / p_X(x) for p_X(x) > 0

Page 34: Chap 5 PME

The conditional pmf satisfies all the properties of a pmf.

The probability of an event A given X = x_k is found by adding the pmf values of the outcomes in A:

P[Y in A | X = x_k] = Σ_{y_j in A} p_Y(y_j | x_k)

If X and Y are independent, then

p_Y(y_j | x_k) = p_Y(y_j)

In other words, knowledge that X = x_k does not affect the probability of events A involving Y.

Page 35: Chap 5 PME

Example 5.29 Loaded Dice: Find p_Y(y | 5) in the loaded dice experiment considered in Examples 5.6 and 5.8.

p_Y(y | 5) = p_{X,Y}(5, y) / p_X(5)

p_Y(5 | 5) = (2/42) / (7/42) = 2/7

p_Y(2 | 5) = (1/42) / (7/42) = 1/7
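Assuming the same loaded-dice table as in Example 5.6 (2/42 on the diagonal, 1/42 off-diagonal, so p_X(5) = 7/42), the conditional pmf can be computed exactly:

```python
from fractions import Fraction

# Assumed loaded-dice joint pmf (2/42 diagonal, 1/42 off-diagonal).
def p(j, k):
    return Fraction(2, 42) if j == k else Fraction(1, 42)

p_X5 = sum(p(5, k) for k in range(1, 7))        # marginal p_X(5) = 7/42
cond = {y: p(5, y) / p_X5 for y in range(1, 7)}  # p_Y(y | 5)

print(cond[5], cond[2])    # 2/7 1/7
print(sum(cond.values()))  # 1  (a valid pmf)
```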

Page 36: Chap 5 PME

Suppose Y is a continuous random variable. Define the conditional cdf of Y given X = x_k:

F_Y(y | x_k) = P[Y ≤ y, X = x_k] / P[X = x_k]

It is easy to show that F_Y(y | x_k) satisfies all the properties of a cdf.

The conditional pdf of Y given X = x_k, if the derivative exists, is given by

f_Y(y | x_k) = d F_Y(y | x_k) / dy

If X and Y are independent, P[Y ≤ y, X = x_k] = P[Y ≤ y] P[X = x_k], so F_Y(y | x_k) = F_Y(y) and f_Y(y | x_k) = f_Y(y).

Page 37: Chap 5 PME

Example 5.31 Binary Communications System: The input X to a communication channel assumes the values +1 or −1 with probabilities 1/3 and 2/3. The output Y of the channel is given by Y = X + N, where N is a zero-mean, unit-variance Gaussian random variable. Find the conditional pdf of Y given X = +1 and given X = −1. Find P[X = +1 | Y > 0].

Page 38: Chap 5 PME

If X is a continuous random variable, then P[X = x] = 0, so the conditional cdf is defined as a limit over the event {x < X ≤ x + h}:

F_Y(y | x) = lim_{h→0} F_Y(y | x < X ≤ x + h)

The conditional pdf of Y given X = x is then:

f_Y(y | x) = f_{X,Y}(x, y) / f_X(x)

It is easy to show that f_Y(y | x) satisfies the properties of a pdf.

The probability of event A given X = x is obtained as follows:

P[Y in A | X = x] = ∫_A f_Y(y | x) dy

Page 39: Chap 5 PME

Example 5.32: Let X and Y be the random variables in Example 5.16. Find f_X(x | y) and f_Y(y | x).

Using the marginal pdf's obtained in Example 5.16, we have

f_X(x | y) = 2e^{−x} e^{−y} / (2e^{−2y}) = e^{−(x−y)} for x ≥ y

f_Y(y | x) = 2e^{−x} e^{−y} / (2e^{−x}(1 − e^{−x})) = e^{−y} / (1 − e^{−x}) for 0 < y < x

Page 40: Chap 5 PME

The conditional expectation of Y given X = x is defined by

E[Y | X = x] = ∫_{−∞}^{∞} y f_Y(y | x) dy

An interesting corollary is the law of total expectation:

E[Y] = E[E[Y | X]]

Page 41: Chap 5 PME

5.8.1 One Function of Two Random Variables: Let the random variable Z be defined as a function of two random variables:

Z = g(X, Y)

The cdf of Z is found by first finding the equivalent event of {Z ≤ z}, that is, the set R_z = {x = (x, y) such that g(x) ≤ z}; then

F_Z(z) = P[X in R_z] = ∫∫_{(x, y) in R_z} f_{X,Y}(x', y') dx' dy'

The pdf of Z is then found by taking the derivative of F_Z(z).

Page 42: Chap 5 PME

Example 5.39 Sum of Two Random Variables: Let Z = X + Y. Find F_Z(z) and f_Z(z) in terms of the joint pdf of X and Y.

The cdf of Z is found by integrating the joint pdf of X and Y over the region of the plane corresponding to the event {Z ≤ z}, as shown in the figure:

F_Z(z) = ∫_{−∞}^{∞} ∫_{−∞}^{z−x'} f_{X,Y}(x', y') dy' dx'

f_Z(z) = dF_Z(z)/dz = ∫_{−∞}^{∞} f_{X,Y}(x', z − x') dx'

Thus the pdf for the sum of two random variables is given by a superposition integral.

Page 43: Chap 5 PME

If X and Y are independent random variables, then by the last equation on the previous slide the pdf is given by the convolution integral of the marginal pdf's of X and Y:

f_Z(z) = ∫_{−∞}^{∞} f_X(x') f_Y(z − x') dx'
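A discrete sketch of the convolution formula for two independent Uniform[0, 1] variables, whose sum has the triangular pdf on [0, 2] peaking at height 1:

```python
# Sample each pdf on a grid of step h and approximate the convolution
# integral by a direct double sum scaled by h.
n = 500
h = 1.0 / n
fx = [1.0] * n            # Uniform[0, 1] pdf on the grid
fy = [1.0] * n

fz = [0.0] * (2 * n - 1)  # f_Z on a grid covering [0, 2]
for i in range(n):
    for j in range(n):
        fz[i + j] += fx[i] * fy[j] * h

peak = max(fz)            # triangular pdf peaks at z = 1 with height 1
print(round(peak, 2), round(sum(fz) * h, 2))
```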

Page 44: Chap 5 PME

5.8. For the pair of random variables (X, Y) sketch the region of the plane corresponding to the following events. Identify which events are of product form.

5.28. The random vector (X, Y) is uniformly distributed (i.e., f_{X,Y}(x, y) = k) in the regions shown in the figure and zero elsewhere.

(a) Find the value of k in each case.
(b) Find the marginal pdf for X and for Y in each case.
(c) Find P[X > 0, Y > 0].

Page 45: Chap 5 PME

5.48. Let X and Y be independent random variables that are uniformly distributed in [−1, +1]. Find the probability of the following events:

Page 46: Chap 5 PME

π‘“π‘“π‘Œπ‘Œ 𝑦𝑦 π‘₯π‘₯ = π‘“π‘“π‘‹π‘‹π‘Œπ‘Œ(π‘₯π‘₯,𝑦𝑦)𝑓𝑓𝑋𝑋(π‘₯π‘₯)

𝑓𝑓𝑋𝑋 π‘₯π‘₯ = οΏ½βˆ’ 1βˆ’π‘₯π‘₯2

1βˆ’π‘₯π‘₯2 1πœ‹πœ‹ 𝑑𝑑𝑦𝑦 =

2πœ‹πœ‹ 1 βˆ’ π‘₯π‘₯2

π‘“π‘“π‘Œπ‘Œ 𝑦𝑦 π‘₯π‘₯ =π‘“π‘“π‘‹π‘‹π‘Œπ‘Œ(π‘₯π‘₯,𝑦𝑦)𝑓𝑓𝑋𝑋(π‘₯π‘₯) =

1/πœ‹πœ‹2πœ‹πœ‹ 1 βˆ’ π‘₯π‘₯2

=1

2 1 βˆ’ π‘₯π‘₯2

𝐸𝐸 𝑦𝑦 𝑋𝑋 = π‘₯π‘₯ = βˆ«βˆ’βˆžβˆž 𝑦𝑦 π‘“π‘“π‘Œπ‘Œ 𝑦𝑦 π‘₯π‘₯ 𝑑𝑑𝑦𝑦

𝐸𝐸 𝑦𝑦 𝑋𝑋 = π‘₯π‘₯ = οΏ½βˆ’ 1βˆ’π‘₯π‘₯2

1βˆ’π‘₯π‘₯2

𝑦𝑦1

2 1 βˆ’ π‘₯π‘₯2𝑑𝑑𝑦𝑦

𝐸𝐸 𝑦𝑦 𝑋𝑋 = π‘₯π‘₯ =1

2 1 βˆ’ π‘₯π‘₯2𝑦𝑦2

2|βˆ’ 1βˆ’π‘₯π‘₯21βˆ’π‘₯π‘₯2 = 0

𝐸𝐸 𝑦𝑦 = 𝐸𝐸 𝐸𝐸 𝑦𝑦 𝑋𝑋 = π‘₯π‘₯ = 0
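A rejection-sampling sketch of this example: draw (X, Y) uniformly on the unit disk and check that the sample mean of Y is near E[Y] = 0:

```python
import random

# Sample uniformly inside x^2 + y^2 <= 1 by rejection from the square.
random.seed(1)
pts = []
while len(pts) < 100_000:
    x, y = random.uniform(-1, 1), random.uniform(-1, 1)
    if x * x + y * y <= 1:
        pts.append((x, y))

mean_y = sum(y for _, y in pts) / len(pts)
print(round(mean_y, 2))   # approximately 0, matching E[E[Y | X]] = 0
```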

Page 47: Chap 5 PME

Example 5.34: X is selected at random from the unit interval; Y is then selected at random from the interval (0, X). Find the cdf of Y.

When X = x, Y is uniformly distributed in (0, x), so the conditional cdf given X = x is

F_Y(y | x) = y/x for 0 ≤ y ≤ x

The corresponding pdf is obtained by taking the derivative of the cdf:

f_Y(y | x) = 1/x for 0 < y < x

Page 48: Chap 5 PME

F_Z(z) = P[Z ≤ z] = P[X + Y ≤ z] = ∫_0^{z/2} ∫_y^{z−y} e^{−(x+y)} dx dy

5.103. Find the joint cdf of W = min(X, Y) and Z = max(X, Y) if X and Y are independent exponential random variables with the same mean.

f_{X,Y}(x, y) = λ² e^{−λx} e^{−λy}

F_W(w) = P[min(X, Y) ≤ w] = ∫_0^w ∫_0^∞ f_{X,Y}(x, y) dy dx + ∫_w^∞ ∫_0^w f_{X,Y}(x, y) dy dx

