EEE251 PROBABILITY METHODS IN ENGINEERING Bakhtiar Ali Assistant Professor, Electrical Engineering, COMSATS, Islamabad.


Page 1: Chap 5 PME

EEE251 PROBABILITY METHODS IN ENGINEERING

Bakhtiar Ali, Assistant Professor, Electrical Engineering, COMSATS, Islamabad.

Page 2: Chap 5 PME
Page 3: Chap 5 PME

In this chapter we are InShaAllah going to study:

5.1 Two Random Variables
5.2 Pairs of Discrete Random Variables
5.2.1 Marginal Probability Mass Function
5.3 The Joint CDF of X and Y
5.4 The Joint PDF of Two Continuous Random Variables
5.5 Independence of Two Random Variables
5.6 Joint Moments and Expected Values of a Function of Two Random Variables
5.6.1 Expected Value of a Function of Two Random Variables
5.6.2 Joint Moments, Correlation, and Covariance
5.7 Conditional Probability and Conditional Expectation
5.8 Functions of Two Random Variables

Page 4: Chap 5 PME

The notion of a random variable as a mapping is easily generalized to the case where two quantities are of interest.

Consider a random experiment with sample space S and event class F. We are interested in a function that assigns a pair of real numbers X(ζ) = (X(ζ), Y(ζ)) to each outcome ζ in S.

Basically we are dealing with a vector function that maps S into R², the real plane.

Page 5: Chap 5 PME

Example 5.1: Let a random experiment consist of selecting a student's name from an urn. Let ζ denote the outcome of this experiment, and define the following two functions:

H(ζ) = height of student ζ in centimeters
W(ζ) = weight of student ζ in kilograms

(H(ζ), W(ζ)) assigns a pair of numbers to each ζ in S.

We are interested in events involving the pair (H, W). For example, the event B = {H ≤ 183, W ≤ 82} represents students with height less than 183 cm (6 feet) and weight less than 82 kg (180 lb).

Page 6: Chap 5 PME

The events involving a pair of random variables (X, Y) are specified by conditions that we are interested in and can be represented by regions in the plane. The figure shows three examples of events:

A = {X + Y ≤ 10}
B = {min(X, Y) ≤ 5}
C = {X² + Y² ≤ 100}

Page 7: Chap 5 PME

Let the vector random variable X = (X, Y) assume values from some countable set

S_X,Y = {(x_j, y_k) : j = 1, 2, …, k = 1, 2, …}

The joint probability mass function of X specifies the probabilities of the event {X = x} ∩ {Y = y}:

p_X,Y(x, y) = P[{X = x} ∩ {Y = y}] = P[X = x, Y = y] for (x, y) in R²

The probability of any event B is the sum of the pmf over the outcomes in B:

P[X in B] = Σ_{(x_j, y_k) in B} p_X,Y(x_j, y_k)

When the event B is the entire sample space we have:

Σ_{j=1}^∞ Σ_{k=1}^∞ p_X,Y(x_j, y_k) = 1

Page 8: Chap 5 PME

Graphical representation of pmf’s

a) Table format
b) Use of arrows to show height
c) Labeled dots corresponding to pmf values

Page 9: Chap 5 PME

Example 5.6: A random experiment consists of tossing two "loaded" dice and noting the pair of numbers (X, Y) facing up. The joint pmf p_X,Y(j, k) for j = 1, …, 6 and k = 1, …, 6 is given by the two-dimensional table shown in the figure. The (j, k) entry in the table contains the value p_X,Y(j, k). Find P[min(X, Y) = 3].

P[min(X, Y) = 3]
= p_X,Y(6, 3) + p_X,Y(5, 3) + p_X,Y(4, 3) + p_X,Y(3, 3) + p_X,Y(3, 4) + p_X,Y(3, 5) + p_X,Y(3, 6)
= 6(1/42) + 2/42 = 8/42.
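The table itself lives in a figure; as a sanity check, a short script can redo the sum under the pmf commonly used for this example (an assumption here: 2/42 on the diagonal, 1/42 off it), which reproduces the 8/42 above:

```python
from fractions import Fraction

# Loaded-dice joint pmf assumed from the figure of Example 5.6:
# p(j, k) = 2/42 when j == k, and 1/42 otherwise.
def p(j, k):
    return Fraction(2, 42) if j == k else Fraction(1, 42)

# Sanity check: the pmf sums to 1 over the 6x6 grid.
total = sum(p(j, k) for j in range(1, 7) for k in range(1, 7))

# P[min(X, Y) = 3]: sum over all (j, k) with min(j, k) == 3.
p_min3 = sum(p(j, k) for j in range(1, 7) for k in range(1, 7)
             if min(j, k) == 3)
print(total, p_min3)  # 1, 4/21 (= 8/42)
```

Using exact rational arithmetic avoids any floating-point doubt about whether the table sums to one.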

Page 10: Chap 5 PME

The joint pmf of X provides the information about the joint behavior of X and Y. We are also interested in the probabilities of events involving each of the random variables in isolation. These can be found in terms of the marginal probability mass functions:

p_X(x_j) = P[X = x_j] = Σ_k p_X,Y(x_j, y_k)

and similarly

p_Y(y_k) = P[Y = y_k] = Σ_j p_X,Y(x_j, y_k)

Page 11: Chap 5 PME

The joint cumulative distribution function of X and Y is defined as the probability of the event {X ≤ x1} ∩ {Y ≤ y1}:

F_X,Y(x1, y1) = P[X ≤ x1, Y ≤ y1]

The joint cdf satisfies the following properties. The joint cdf is a nondecreasing function of x and y:

F_X,Y(x1, y1) ≤ F_X,Y(x2, y2) if x1 ≤ x2 and y1 ≤ y2

We obtain the marginal cumulative distribution functions by removing the constraint on one of the variables:

F_X(x) = F_X,Y(x, ∞) and F_Y(y) = F_X,Y(∞, y)

The joint cdf is continuous from the "north" and from the "east," that is,

lim_{h→0⁺} F_X,Y(x, y + h) = F_X,Y(x, y) and lim_{h→0⁺} F_X,Y(x + h, y) = F_X,Y(x, y)

Page 12: Chap 5 PME

The probability of the rectangle {x1 < X ≤ x2, y1 < Y ≤ y2} is given by:

P[x1 < X ≤ x2, y1 < Y ≤ y2] = F_X,Y(x2, y2) − F_X,Y(x2, y1) − F_X,Y(x1, y2) + F_X,Y(x1, y1)

Page 13: Chap 5 PME

Example 5.11: The joint cdf for the pair of random variables X = (X, Y) is given by the expression shown in the figure.

Plot the joint cdf and find the marginal cdf of X.

Page 14: Chap 5 PME

The marginal cdf of X is:

F_X(x) = F_X,Y(x, ∞) = x for 0 ≤ x ≤ 1

X is uniformly distributed in the unit interval.

Page 15: Chap 5 PME

Example 5.12: The joint cdf for the vector of random variables X = (X, Y) is given by the expression shown in the figure.

Find the marginal cdf's:
F_X(x) =
F_Y(y) =

Example 5.13: Find the probability of the events A = {X ≤ 1, Y ≤ 1}, B = {X > x, Y > y}, where x > 0 and y > 0, and D = {1 < X ≤ 2, 2 < Y ≤ 5}.

Page 16: Chap 5 PME

The probability of B requires more work. By DeMorgan's rule, the complement of B = {X > x, Y > y} is {X ≤ x} ∪ {Y ≤ y}, so

P[B] = 1 − P[{X ≤ x} ∪ {Y ≤ y}] = 1 − [F_X(x) + F_Y(y) − F_X,Y(x, y)]

Page 17: Chap 5 PME

The joint probability density function of X and Y is defined as

f_X,Y(x, y) = ∂²F_X,Y(x, y)/∂x∂y

wherever the derivative exists. For discrete random variables the probability mass is concentrated at points; for jointly continuous random variables the joint cdf is obtained by integrating this pdf.

Page 18: Chap 5 PME

The probability of A is the integral of f_X,Y(x, y) over the region defined by A:

P[X in A] = ∫∫_A f_X,Y(x', y') dx' dy'

Page 19: Chap 5 PME

When B is the entire plane, the integral must equal one:

∫_{−∞}^{∞} ∫_{−∞}^{∞} f_X,Y(x', y') dx' dy' = 1

The joint cdf can be obtained in terms of the joint pdf of jointly continuous random variables by integrating over the semi-infinite rectangle defined by (x, y):

F_X,Y(x, y) = ∫_{−∞}^{x} ∫_{−∞}^{y} f_X,Y(x', y') dy' dx'

The probability of a rectangular region is obtained by

P[x1 < X ≤ x2, y1 < Y ≤ y2] = ∫_{x1}^{x2} ∫_{y1}^{y2} f_X,Y(x', y') dy' dx'

Page 20: Chap 5 PME

The marginal pdf's are obtained by integrating out the other variable:

f_X(x) = ∫_{−∞}^{∞} f_X,Y(x, y') dy'
f_Y(y) = ∫_{−∞}^{∞} f_X,Y(x', y) dx'

Page 21: Chap 5 PME

Example 5.16: Find the normalization constant c and the marginal pdf's for the following joint pdf (shown in the figure; consistent with the conditional pdfs quoted in Example 5.32):

f_X,Y(x, y) = c e^{−x} e^{−y} for 0 ≤ y ≤ x < ∞, and 0 elsewhere

Normalization gives c = 2, with marginals f_Y(y) = 2e^{−2y} for y ≥ 0 and f_X(x) = 2e^{−x}(1 − e^{−x}) for x ≥ 0.

Page 22: Chap 5 PME

Example 5.17: Find P[X + Y ≤ 1] in Example 5.16. The figure shows the intersection of the event and the region where the pdf is nonzero.

P[X + Y ≤ 1] = ∫_0^{1/2} ∫_y^{1−y} 2e^{−x} e^{−y} dx dy = 1 − 2e^{−1}
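The closed form 1 − 2e^{−1} ≈ 0.264 can be checked numerically. The sketch below assumes the Example 5.16 pdf f(x, y) = 2e^{−x}e^{−y} on 0 ≤ y ≤ x and integrates it with a midpoint rule, both over the whole support (normalization) and over the triangle x + y ≤ 1:

```python
import math

# Assumed Example 5.16 pdf: f(x, y) = 2 e^{-x} e^{-y} on 0 <= y <= x, else 0.
def f(x, y):
    return 2.0 * math.exp(-x - y) if 0.0 <= y <= x else 0.0

def double_integral(xmax, ymax, pred, n):
    """Midpoint-rule integral of f over [0, xmax] x [0, ymax], restricted to pred."""
    hx, hy = xmax / n, ymax / n
    s = 0.0
    for i in range(n):
        x = (i + 0.5) * hx
        for j in range(n):
            y = (j + 0.5) * hy
            if pred(x, y):
                s += f(x, y)
    return s * hx * hy

total = double_integral(25.0, 25.0, lambda x, y: True, 800)         # ~1
p_sum = double_integral(1.0, 0.5, lambda x, y: x + y <= 1.0, 600)   # P[X+Y <= 1]
exact = 1.0 - 2.0 / math.e
print(round(total, 3), round(p_sum, 4), round(exact, 4))
```

The grid is truncated at 25 because the exponential tail beyond that is negligible.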

Page 23: Chap 5 PME

X and Y are independent random variables if any event A1 defined in terms of X is independent of any event A2 defined in terms of Y; that is,

P[X in A1, Y in A2] = P[X in A1] P[Y in A2].

If X and Y are independent discrete random variables, then the joint pmf is equal to the product of the marginal pmf's:

p_X,Y(x_j, y_k) = p_X(x_j) p_Y(y_k) for all x_j and y_k

Page 24: Chap 5 PME

In general, it can be shown that the random variables X and Y are independent if and only if their joint cdf is equal to the product of its marginal cdf's:

F_X,Y(x, y) = F_X(x) F_Y(y) for all x and y

Similarly, if X and Y are jointly continuous, then X and Y are independent if and only if their joint pdf is equal to the product of the marginal pdf's:

f_X,Y(x, y) = f_X(x) f_Y(y) for all x and y

Page 25: Chap 5 PME

Example 5.21: Are the random variables X and Y in Example 5.16 independent?

No: the product of the marginal pdf's does not give us the joint pdf. (In particular, the support 0 ≤ y ≤ x is not a product set.)
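One concrete way to see this is to compare the joint pdf with the product of the marginals at a single point. The sketch below assumes the Example 5.16 pdf and its marginals; the point (0.5, 1.0) lies outside the support (y > x), so the joint pdf vanishes there while the product does not:

```python
import math

# Assumed from Examples 5.16/5.21: joint pdf 2 e^{-x} e^{-y} on 0 <= y <= x,
# with marginals f_X(x) = 2 e^{-x}(1 - e^{-x}) and f_Y(y) = 2 e^{-2y}.
def f_joint(x, y):
    return 2.0 * math.exp(-x - y) if 0.0 <= y <= x else 0.0

def f_X(x):
    return 2.0 * math.exp(-x) * (1.0 - math.exp(-x)) if x >= 0.0 else 0.0

def f_Y(y):
    return 2.0 * math.exp(-2.0 * y) if y >= 0.0 else 0.0

# Independence would require f_joint(x, y) == f_X(x) * f_Y(y) everywhere;
# a single counterexample point suffices to refute it.
print(f_joint(0.5, 1.0), f_X(0.5) * f_Y(1.0))   # 0.0 vs. a positive number
print(f_joint(2.0, 1.0), f_X(2.0) * f_Y(1.0))   # both positive but unequal
```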

Page 26: Chap 5 PME

In the case of two random variables we are interested in how X and Y vary together. In particular, we are interested in whether the variations of X and Y are correlated. For example, if X increases, does Y tend to increase or to decrease? The joint moments of X and Y, which are defined as expected values of functions of X and Y, provide this information.

5.6.1 Expected Value of a Function of Two Random Variables

Page 27: Chap 5 PME

Example 5.24 Sum of Random Variables: Let Z = X + Y. Find E[Z].

E[Z] = E[X + Y] = E[X] + E[Y], whether or not X and Y are independent.

Page 28: Chap 5 PME

The joint moments of two random variables X and Y summarize information about their joint behavior. The jk-th joint moment of X and Y is defined by

E[X^j Y^k] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x^j y^k f_X,Y(x, y) dx dy

In electrical engineering, it is customary to call the j = 1, k = 1 moment, E[XY], the correlation of X and Y. If E[XY] = 0, then we say that X and Y are orthogonal.

The jk-th central moment of X and Y is defined as the joint moment of the centered random variables X − E[X] and Y − E[Y]:

E[(X − E[X])^j (Y − E[Y])^k]

Page 29: Chap 5 PME

The covariance of X and Y is defined as the central moment:

COV(X, Y) = E[(X − E[X])(Y − E[Y])]

The above equation can be simplified to

COV(X, Y) = E[XY] − E[X]E[Y]

Note that COV(X, Y) = E[XY] if either of the random variables has mean zero.

Example 5.26 Covariance of Independent Random Variables: Let X and Y be independent random variables. Find their covariance.

COV(X, Y) = E[XY] − E[X]E[Y] = E[X]E[Y] − E[X]E[Y] = 0

since E[XY] = E[X]E[Y] for independent random variables.

Page 30: Chap 5 PME

The correlation coefficient of X and Y is defined by

ρ_X,Y = COV(X, Y) / (σ_X σ_Y) = (E[XY] − E[X]E[Y]) / (σ_X σ_Y)

The correlation coefficient is a number that is at most 1 in magnitude:

−1 ≤ ρ_X,Y ≤ 1

When X and Y are related linearly, Y = aX + b, then ρ_X,Y = 1 if a > 0 and ρ_X,Y = −1 if a < 0.

X and Y are said to be uncorrelated if ρ_X,Y = 0. If X and Y are independent, then COV(X, Y) = 0, so ρ_X,Y = 0. Thus if X and Y are independent, then X and Y are uncorrelated.

It is possible for X and Y to be uncorrelated but not independent.
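A standard illustration of this last point (not from the slides): take X uniform on [−1, 1] and Y = X². Then COV(X, Y) = E[X³] − E[X]E[X²] = 0, yet Y is completely determined by X. A seeded simulation shows both facts:

```python
import random
import statistics

# Classic uncorrelated-but-dependent pair: X ~ Uniform[-1, 1], Y = X^2.
random.seed(0)
xs = [random.uniform(-1.0, 1.0) for _ in range(200_000)]
ys = [x * x for x in xs]

mean_x = statistics.fmean(xs)
mean_y = statistics.fmean(ys)
mean_xy = statistics.fmean([x * y for x, y in zip(xs, ys)])
cov = mean_xy - mean_x * mean_y        # ~0: uncorrelated

# Dependence: P[Y <= 1/4] = 1/2 overall, but given |X| > 1/2 it is exactly 0,
# since Y = X^2 > 1/4 whenever |X| > 1/2.
p_y = sum(y <= 0.25 for y in ys) / len(ys)
cond = [y for x, y in zip(xs, ys) if abs(x) > 0.5]
p_y_given = sum(y <= 0.25 for y in cond) / len(cond)
print(round(cov, 4), round(p_y, 3), p_y_given)
```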

Page 31: Chap 5 PME

FIGURE 5.3: A scattergram for 200 observations of four different pairs of random variables.

Page 32: Chap 5 PME

5.48. Let X and Y be independent random variables that are uniformly distributed in [0, 1]. Find the probability of the following events:

a) P[X² < 1/2, Y < 1/2] = P[X < 1/√2] P[Y < 1/2] = (1/√2)(1/2) = 1/(2√2)

5.58. Find E[X² e^Y] where X and Y are independent random variables, X is a zero-mean, unit-variance Gaussian random variable, and Y is a uniform random variable in the interval [0, 3].

E[X² e^Y] = E[X²] E[e^Y] = 1 × (1/3) ∫_0^3 e^y dy = (1/3)(e³ − 1)
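Problem 5.58 can be sanity-checked by simulation. The sketch below draws seeded samples of X ~ N(0, 1) and Y ~ Uniform[0, 3] and compares the sample mean of X²e^Y with (e³ − 1)/3 ≈ 6.36:

```python
import math
import random

# Monte Carlo check of E[X^2 e^Y] = E[X^2] E[e^Y] = (e^3 - 1)/3
# for independent X ~ N(0, 1) and Y ~ Uniform[0, 3].
random.seed(1)
n = 400_000
acc = 0.0
for _ in range(n):
    x = random.gauss(0.0, 1.0)
    y = random.uniform(0.0, 3.0)
    acc += x * x * math.exp(y)

estimate = acc / n
exact = (math.exp(3.0) - 1.0) / 3.0
print(round(estimate, 3), round(exact, 3))
```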

Page 33: Chap 5 PME

Many random variables of practical interest are not independent: the output Y of a communication channel must depend on the input X in order to convey information; consecutive samples of a waveform that varies slowly are likely to be close in value and hence are not independent.

5.7.1 Conditional Probability

Case 1: X is a discrete random variable. For X and Y discrete random variables, the conditional pmf of Y given X = x is defined by:

p_Y(y | x) = P[Y = y | X = x] = p_X,Y(x, y) / p_X(x) for p_X(x) > 0

Page 34: Chap 5 PME

The conditional pmf satisfies all the properties of a pmf.

The probability of an event A given X = x_k is found by adding the pmf values of the outcomes in A:

P[Y in A | X = x_k] = Σ_{y_j in A} p_Y(y_j | x_k)

If X and Y are independent, then

p_Y(y_j | x_k) = p_Y(y_j)

In other words, knowledge that X = x_k does not affect the probability of events A involving Y.

Page 35: Chap 5 PME

Example 5.29 Loaded Dice: Find p_Y(y | 5) in the loaded dice experiment considered in Examples 5.6 and 5.8.

p_Y(y | 5) = p_X,Y(5, y) / p_X(5), with p_X(5) = 2/42 + 5(1/42) = 7/42, so

p_Y(5 | 5) = (2/42)/(7/42) = 2/7
p_Y(y | 5) = (1/42)/(7/42) = 1/7 for y ≠ 5
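With the same assumed loaded-dice table (2/42 on the diagonal, 1/42 off it), these conditional probabilities can be reproduced exactly with rational arithmetic:

```python
from fractions import Fraction

# Loaded-dice pmf assumed from Example 5.6: 2/42 on the diagonal, 1/42 off it.
def p(j, k):
    return Fraction(2, 42) if j == k else Fraction(1, 42)

# Marginal p_X(5), then conditional pmf p_Y(y | 5) = p(5, y) / p_X(5).
p_x5 = sum(p(5, k) for k in range(1, 7))
cond = {y: p(5, y) / p_x5 for y in range(1, 7)}
print(p_x5, cond[5], cond[2])  # 1/6, 2/7, 1/7
```

Note that the conditional pmf sums to 1, as it must.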

Page 36: Chap 5 PME

Suppose Y is a continuous random variable. Define the conditional cdf of Y given X = x_k:

F_Y(y | x_k) = P[Y ≤ y, X = x_k] / P[X = x_k]

It is easy to show that F_Y(y | x_k) satisfies all the properties of a cdf.

The conditional pdf of Y given X = x_k, if the derivative exists, is given by

f_Y(y | x_k) = d F_Y(y | x_k) / dy

If X and Y are independent, P[Y ≤ y, X = x_k] = P[Y ≤ y] P[X = x_k], so F_Y(y | x_k) = F_Y(y) and f_Y(y | x_k) = f_Y(y).

Page 37: Chap 5 PME

Example 5.31 Binary Communications System: The input X to a communication channel assumes the values +1 or −1 with probabilities 1/3 and 2/3. The output Y of the channel is given by Y = X + N, where N is a zero-mean, unit-variance Gaussian random variable. Find the conditional pdf of Y given X = +1 and given X = −1. Find P[X = +1 | Y > 0].
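A sketch of the last part: given X = x, Y is Gaussian with mean x and unit variance, so P[Y > 0 | X = ±1] = Φ(±1), and Bayes' rule gives P[X = +1 | Y > 0]. The helper `phi_cdf` below is a hypothetical name; it evaluates the standard normal cdf via `math.erf`:

```python
import math

# Standard normal cdf via the error function (hypothetical helper name).
def phi_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Priors from Example 5.31: X = +1 w.p. 1/3, X = -1 w.p. 2/3.
p_plus, p_minus = 1.0 / 3.0, 2.0 / 3.0

# Given X = x, Y ~ N(x, 1): P[Y > 0 | X = +1] = phi_cdf(1), etc.
lik_plus = phi_cdf(1.0)
lik_minus = phi_cdf(-1.0)

# Bayes' rule for P[X = +1 | Y > 0].
posterior = p_plus * lik_plus / (p_plus * lik_plus + p_minus * lik_minus)
print(round(posterior, 4))  # ~0.7261
```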

Page 38: Chap 5 PME

If X is a continuous random variable, P[X = x] = 0, so the conditional cdf of Y given X = x is defined as a limit:

F_Y(y | x) = lim_{h→0} P[Y ≤ y | x < X ≤ x + h]

The conditional pdf of Y given X = x is then:

f_Y(y | x) = f_X,Y(x, y) / f_X(x)

It is easy to show that f_Y(y | x) satisfies the properties of a pdf.

The probability of event A given X = x is obtained as follows:

P[Y in A | X = x] = ∫_A f_Y(y | x) dy

Page 39: Chap 5 PME

Example 5.32: Let X and Y be the random variables in Example 5.16. Find f_X(x | y) and f_Y(y | x).

Using the marginal pdf's obtained in Example 5.16, we have

f_X(x | y) = 2e^{−x}e^{−y} / (2e^{−2y}) = e^{−(x−y)} for x ≥ y

f_Y(y | x) = 2e^{−x}e^{−y} / (2e^{−x}(1 − e^{−x})) = e^{−y} / (1 − e^{−x}) for 0 < y < x
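A conditional pdf must integrate to 1 over its support for every conditioning value. The sketch below checks this for both conditional pdfs above, at the arbitrarily chosen points y = 1 and x = 2, with a midpoint-rule integral:

```python
import math

# Conditional pdfs from Example 5.32.
def f_x_given_y(x, y):
    return math.exp(-(x - y)) if x >= y else 0.0              # support: x >= y

def f_y_given_x(y, x):
    return math.exp(-y) / (1.0 - math.exp(-x)) if 0.0 < y < x else 0.0

def riemann(g, a, b, n=100_000):
    """Midpoint-rule integral of g over [a, b]."""
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

y0, x0 = 1.0, 2.0
area_x = riemann(lambda x: f_x_given_y(x, y0), y0, y0 + 40.0)  # tail truncated
area_y = riemann(lambda y: f_y_given_x(y, x0), 0.0, x0)
print(round(area_x, 4), round(area_y, 4))  # both ~1.0
```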

Page 40: Chap 5 PME

The conditional expectation of Y given X = x is defined by

E[Y | X = x] = ∫_{−∞}^{∞} y f_Y(y | x) dy

An interesting corollary is that the overall expectation can be found in two steps:

E[Y] = E[E[Y | X]]

Page 41: Chap 5 PME

5.8.1 One Function of Two Random Variables: Let the random variable Z be defined as a function of two random variables:

Z = g(X, Y)

The cdf of Z is found by first finding the equivalent event of {Z ≤ z}, that is, the set R_z = {(x, y) such that g(x, y) ≤ z}; then

F_Z(z) = P[X in R_z] = ∫∫_{R_z} f_X,Y(x', y') dx' dy'

The pdf of Z is then found by taking the derivative of F_Z(z).

Page 42: Chap 5 PME

Example 5.39 Sum of Two Random Variables: Let Z = X + Y. Find F_Z(z) and f_Z(z) in terms of the joint pdf of X and Y.

The cdf of Z is found by integrating the joint pdf of X and Y over the region of the plane corresponding to the event {Z ≤ z}, as shown in the figure:

F_Z(z) = ∫_{−∞}^{∞} ∫_{−∞}^{z−x'} f_X,Y(x', y') dy' dx'

Differentiating with respect to z gives

f_Z(z) = ∫_{−∞}^{∞} f_X,Y(x', z − x') dx'

Thus the pdf for the sum of two random variables is given by a superposition integral.

Page 43: Chap 5 PME

If X and Y are independent random variables, then by the last equation on the previous slide the pdf is given by the convolution integral of the marginal pdf's of X and Y:

f_Z(z) = ∫_{−∞}^{∞} f_X(x') f_Y(z − x') dx'
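For example, convolving two Uniform[0, 1] pdfs should produce the triangular pdf f_Z(z) = z on [0, 1] and 2 − z on [1, 2]. The sketch below evaluates the convolution integral numerically at a few points:

```python
# Uniform[0, 1] pdf.
def f_u(x):
    return 1.0 if 0.0 <= x <= 1.0 else 0.0

def f_Z(z, n=20_000):
    """Midpoint-rule evaluation of the convolution integral at z."""
    a, b = -0.5, 2.5          # generous truncation of the real line
    h = (b - a) / n
    return sum(f_u(a + (i + 0.5) * h) * f_u(z - (a + (i + 0.5) * h))
               for i in range(n)) * h

print(f_Z(0.5), f_Z(1.0), f_Z(1.5))  # ~0.5, ~1.0, ~0.5 (triangular shape)
```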

Page 44: Chap 5 PME

5.8. For the pair of random variables (X, Y), sketch the region of the plane corresponding to the following events. Identify which events are of product form.

5.28. The random vector (X, Y) is uniformly distributed (i.e., f_X,Y(x, y) = k) in the regions shown in the figure and zero elsewhere.

(a) Find the value of k in each case.
(b) Find the marginal pdf for X and for Y in each case.
(c) Find P[X > 0, Y > 0].

Page 45: Chap 5 PME

5.48. Let X and Y be independent random variables that are uniformly distributed in [−1, +1]. Find the probability of the following events:

Page 46: Chap 5 PME

For (X, Y) uniformly distributed on the unit disk, f_X,Y(x, y) = 1/π for x² + y² ≤ 1:

f_Y(y | x) = f_X,Y(x, y) / f_X(x)

f_X(x) = ∫_{−√(1−x²)}^{√(1−x²)} (1/π) dy = (2/π)√(1 − x²)

f_Y(y | x) = (1/π) / ((2/π)√(1 − x²)) = 1 / (2√(1 − x²)) for |y| ≤ √(1 − x²)

E[Y | X = x] = ∫_{−∞}^{∞} y f_Y(y | x) dy = ∫_{−√(1−x²)}^{√(1−x²)} y / (2√(1 − x²)) dy = [y² / (4√(1 − x²))] from −√(1−x²) to √(1−x²) = 0

E[Y] = E[E[Y | X = x]] = 0
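The symmetry argument can be checked numerically: for any |x| < 1, the conditional pdf 1/(2√(1 − x²)) should integrate to 1 over |y| ≤ √(1 − x²) and have zero mean. A midpoint-rule sketch (x = 0.3 is an arbitrary test point):

```python
import math

# For (X, Y) uniform on the unit disk, f_Y(y | x) = 1 / (2 sqrt(1 - x^2))
# on |y| <= sqrt(1 - x^2). Check total probability 1 and zero conditional mean.
def check(x, n=100_000):
    half = math.sqrt(1.0 - x * x)      # sqrt(1 - x^2)
    h = 2.0 * half / n                 # midpoint-rule step over [-half, half]
    c = 1.0 / (2.0 * half)             # constant conditional density
    area = 0.0
    mean = 0.0
    for i in range(n):
        y = -half + (i + 0.5) * h
        area += c * h
        mean += y * c * h
    return area, mean

area, mean = check(0.3)
print(round(area, 6), round(mean, 6))  # ~1.0 and ~0.0
```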

Page 47: Chap 5 PME

Example 5.34: X is selected at random from the unit interval; Y is then selected at random from the interval (0, X). Find the cdf of Y.

When X = x, Y is uniformly distributed in (0, x), so the conditional cdf given X = x is

F_Y(y | x) = y/x for 0 ≤ y ≤ x, and 1 for y > x

Averaging over X gives

F_Y(y) = ∫_0^y 1 dx + ∫_y^1 (y/x) dx = y − y ln y for 0 < y ≤ 1

The corresponding pdf is obtained by taking the derivative of the cdf:

f_Y(y) = −ln y for 0 < y ≤ 1
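Example 5.34 is easy to simulate: draw X uniform on (0, 1), then Y uniform on (0, X). Averaging the conditional cdf y/x over X gives F_Y(y) = y − y ln y on (0, 1], and a seeded simulation agrees:

```python
import math
import random

# Simulate Example 5.34: X ~ Uniform(0, 1), then Y ~ Uniform(0, X).
random.seed(2)
n = 300_000
ys = []
for _ in range(n):
    x = random.random()
    ys.append(random.uniform(0.0, x))

# Compare the empirical cdf with y - y*ln(y) at a few points.
for y0 in (0.25, 0.5, 0.75):
    empirical = sum(y <= y0 for y in ys) / n
    exact = y0 - y0 * math.log(y0)
    print(y0, round(empirical, 4), round(exact, 4))
```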

Page 48: Chap 5 PME

F_Z(z) = P[Z ≤ z] = P[X + Y ≤ z] = ∫_0^{z/2} ∫_y^{z−y} 2e^{−(x+y)} dx dy = 1 − (1 + z)e^{−z} for z ≥ 0

(This is the cdf of Z = X + Y for the joint pdf of Example 5.16; at z = 1 it reduces to the answer of Example 5.17.)

5.103. Find the joint cdf of W = min(X, Y) and Z = max(X, Y) if X and Y are independent exponential random variables with the same mean.

f_X,Y(x, y) = λ² e^{−λx} e^{−λy} for x, y ≥ 0

F_W(w) = P[min(X, Y) ≤ w] = ∫_0^w ∫_0^∞ f_X,Y(x, y) dy dx + ∫_w^∞ ∫_0^w f_X,Y(x, y) dy dx = (1 − e^{−λw}) + e^{−λw}(1 − e^{−λw}) = 1 − e^{−2λw}
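As a check on W = min(X, Y): the cdf derived this way says the minimum of two independent Exponential(λ) variables is itself Exponential(2λ). A seeded simulation sketch (λ = 1.5 is an arbitrary test value):

```python
import math
import random

# Minimum of two independent Exponential(lam) variables is Exponential(2*lam):
# F_W(w) = 1 - e^{-2*lam*w}.
random.seed(3)
lam = 1.5
n = 200_000
ws = [min(random.expovariate(lam), random.expovariate(lam)) for _ in range(n)]

w0 = 0.4
empirical = sum(w <= w0 for w in ws) / n
exact = 1.0 - math.exp(-2.0 * lam * w0)
print(round(empirical, 4), round(exact, 4))
```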