Jointly Distributed Random Variables
COMP 245 STATISTICS
Dr N A Heard
Contents
1 Jointly Distributed Random Variables
  1.1 Definition
  1.2 Joint cdfs
  1.3 Joint Probability Mass Functions
  1.4 Joint Probability Density Functions

2 Independence and Expectation
  2.1 Independence
  2.2 Expectation
  2.3 Conditional Expectation
  2.4 Covariance and Correlation

3 Examples
1 Jointly Distributed Random Variables
1.1 Definition
Joint Probability Distribution

Suppose we have two random variables $X$ and $Y$ defined on a sample space $S$ with probability measure $P(E)$, $E \subseteq S$. Jointly, they form a map $(X, Y) : S \to \mathbb{R}^2$ where $s \mapsto (X(s), Y(s))$. From before, we know to define the marginal probability distributions $P_X$ and $P_Y$ by, for example,

$$P_X(B) = P(X^{-1}(B)), \quad B \subseteq \mathbb{R}.$$

We now define the joint probability distribution $P_{XY}$ by

$$P_{XY}(B_X, B_Y) = P\{X^{-1}(B_X) \cap Y^{-1}(B_Y)\}, \quad B_X, B_Y \subseteq \mathbb{R}.$$

So $P_{XY}(B_X, B_Y)$, the probability that $X \in B_X$ and $Y \in B_Y$, is given by the probability $P$ of the set of all points in the sample space that get mapped both into $B_X$ by $X$ and into $B_Y$ by $Y$.

More generally, for a single region $B_{XY} \subseteq \mathbb{R}^2$, find the collection of sample space elements

$$S_{B_{XY}} = \{s \in S \mid (X(s), Y(s)) \in B_{XY}\}$$

and define

$$P_{XY}(B_{XY}) = P(S_{B_{XY}}).$$
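To make this concrete, $P_{XY}(B_{XY})$ for any region can be approximated by simulation: draw many realisations of the pair $(X, Y)$ and count the fraction that land in $B_{XY}$. A minimal Python sketch; the bivariate normal pair and the particular sets used are illustrative assumptions, not part of these notes:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative joint distribution: bivariate normal, correlation 0.5.
n = 1_000_000
xy = rng.multivariate_normal(mean=[0.0, 0.0],
                             cov=[[1.0, 0.5], [0.5, 1.0]], size=n)
x, y = xy[:, 0], xy[:, 1]

# P_XY(B_X, B_Y) with B_X = (0, 1], B_Y = (0, 2]: the fraction of
# sample points mapped into B_X by X and into B_Y by Y.
p_rect = np.mean((x > 0) & (x <= 1) & (y > 0) & (y <= 2))

# A general region B_XY in R^2, e.g. the unit disc:
p_disc = np.mean(x**2 + y**2 <= 1)

print(p_rect, p_disc)
```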
1.2 Joint cdfs
Joint Cumulative Distribution Function

We thus define the joint cumulative distribution function as

$$F_{XY}(x, y) = P_{XY}((-\infty, x], (-\infty, y]), \quad x, y \in \mathbb{R}.$$

It is easy to check that the marginal cdfs for $X$ and $Y$ can be recovered by

$$F_X(x) = F_{XY}(x, \infty), \quad x \in \mathbb{R}, \qquad F_Y(y) = F_{XY}(\infty, y), \quad y \in \mathbb{R},$$

and that the two definitions will agree.
Properties of a Joint cdf

For $F_{XY}$ to be a valid cdf, we need to make sure the following conditions hold.

1. $0 \le F_{XY}(x, y) \le 1$, $\forall x, y \in \mathbb{R}$;
2. Monotonicity: $\forall x_1, x_2, y_1, y_2 \in \mathbb{R}$,
   $x_1 < x_2 \Rightarrow F_{XY}(x_1, y_1) \le F_{XY}(x_2, y_1)$ and $y_1 < y_2 \Rightarrow F_{XY}(x_1, y_1) \le F_{XY}(x_1, y_2)$;
3. $\forall x, y \in \mathbb{R}$, $F_{XY}(x, -\infty) = 0$, $F_{XY}(-\infty, y) = 0$ and $F_{XY}(\infty, \infty) = 1$.
Interval Probabilities
Suppose we are interested in whether the random variable pair $(X, Y)$ lies in the interval cross product $(x_1, x_2] \times (y_1, y_2]$; that is, whether $x_1 < X \le x_2$ and $y_1 < Y \le y_2$.

[Figure: the rectangle $(x_1, x_2] \times (y_1, y_2]$ in the $(X, Y)$ plane.]
First note that

$$P_{XY}((x_1, x_2], (-\infty, y]) = F_{XY}(x_2, y) - F_{XY}(x_1, y).$$

It is then easy to see that $P_{XY}((x_1, x_2], (y_1, y_2])$ is given by

$$F_{XY}(x_2, y_2) - F_{XY}(x_1, y_2) - F_{XY}(x_2, y_1) + F_{XY}(x_1, y_1).$$
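This four-term identity is easy to check numerically. A minimal sketch, assuming scipy is available and using its bivariate normal cdf (scipy.stats.multivariate_normal.cdf) purely as an example joint cdf; the mean, covariance and interval endpoints are illustrative assumptions:

```python
import numpy as np
from scipy.stats import multivariate_normal

# Example joint cdf: bivariate normal with correlation 0.5.
mvn = multivariate_normal(mean=[0.0, 0.0], cov=[[1.0, 0.5], [0.5, 1.0]])
F = lambda x, y: mvn.cdf([x, y])

def rect_prob(x1, x2, y1, y2):
    """P(x1 < X <= x2, y1 < Y <= y2) via the four-term cdf identity."""
    return F(x2, y2) - F(x1, y2) - F(x2, y1) + F(x1, y1)

# Cross-check against a Monte Carlo estimate:
xy = mvn.rvs(size=500_000, random_state=np.random.default_rng(0))
mc = np.mean((xy[:, 0] > 0) & (xy[:, 0] <= 1) &
             (xy[:, 1] > -1) & (xy[:, 1] <= 1))
print(rect_prob(0, 1, -1, 1), mc)  # the two values should agree closely
```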
1.3 Joint Probability Mass Functions

If $X$ and $Y$ are both discrete random variables, then we can define the joint probability mass function as

$$p_{XY}(x, y) = P_{XY}(\{x\}, \{y\}), \quad x, y \in \mathbb{R}.$$

We can recover the marginal pmfs $p_X$ and $p_Y$ since, by the law of total probability, $\forall x, y \in \mathbb{R}$,

$$p_X(x) = \sum_y p_{XY}(x, y), \qquad p_Y(y) = \sum_x p_{XY}(x, y).$$
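For a finite support, the joint pmf can be stored as a matrix and the marginals are then its row and column sums. A minimal sketch; the particular table of probabilities is an arbitrary assumption for illustration:

```python
import numpy as np

# Joint pmf p_XY over x in {0, 1, 2} (rows) and y in {0, 1} (columns);
# the entries are an illustrative choice summing to 1.
p_xy = np.array([[0.10, 0.20],
                 [0.30, 0.15],
                 [0.15, 0.10]])

# Marginals by the law of total probability:
p_x = p_xy.sum(axis=1)  # sum over y for each x
p_y = p_xy.sum(axis=0)  # sum over x for each y
print(p_x, p_y)
```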
Properties of a Joint pmf

For $p_{XY}$ to be a valid pmf, we need to make sure the following conditions hold.

1. $0 \le p_{XY}(x, y) \le 1$, $\forall x, y \in \mathbb{R}$;
2. $\sum_y \sum_x p_{XY}(x, y) = 1$.
1.4 Joint Probability Density Functions
On the other hand, if $f_{XY} : \mathbb{R} \times \mathbb{R} \to \mathbb{R}$ s.t.

$$P_{XY}(B_{XY}) = \iint_{(x, y) \in B_{XY}} f_{XY}(x, y)\, dx\, dy, \quad B_{XY} \subseteq \mathbb{R} \times \mathbb{R},$$

then we say $X$ and $Y$ are jointly continuous and we refer to $f_{XY}$ as the joint probability density function of $X$ and $Y$.

In this case, we have

$$F_{XY}(x, y) = \int_{t=-\infty}^{y} \int_{s=-\infty}^{x} f_{XY}(s, t)\, ds\, dt, \quad x, y \in \mathbb{R}.$$

By the Fundamental Theorem of Calculus we can identify the joint pdf as

$$f_{XY}(x, y) = \frac{\partial^2}{\partial x\, \partial y} F_{XY}(x, y).$$
Furthermore, we can recover the marginal densities $f_X$ and $f_Y$:

$$f_X(x) = \frac{d}{dx} F_X(x) = \frac{d}{dx} F_{XY}(x, \infty) = \frac{d}{dx} \int_{y=-\infty}^{\infty} \int_{s=-\infty}^{x} f_{XY}(s, y)\, ds\, dy.$$

By the Fundamental Theorem of Calculus, and through a symmetric argument for $Y$, we thus get

$$f_X(x) = \int_{y=-\infty}^{\infty} f_{XY}(x, y)\, dy, \qquad f_Y(y) = \int_{x=-\infty}^{\infty} f_{XY}(x, y)\, dx.$$
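In practice these marginalisation integrals can be approximated by quadrature. A minimal sketch assuming scipy; the joint density is the one from Example 1 in Section 3, with the rates fixed at $\lambda_1 = 1$, $\lambda_2 = 2$ as an illustrative assumption:

```python
import numpy as np
from scipy import integrate

# Illustrative joint density: f(x, y) = l1*l2*exp(-l1*x - l2*y), x, y > 0.
l1, l2 = 1.0, 2.0
f_xy = lambda x, y: l1 * l2 * np.exp(-l1 * x - l2 * y)

# Marginal f_X(x): integrate out y.
def f_x(x):
    val, _ = integrate.quad(lambda y: f_xy(x, y), 0, np.inf)
    return val

print(f_x(1.0), l1 * np.exp(-l1 * 1.0))  # quadrature vs closed form

# Normalisation (property 2 of a joint pdf); dblquad integrates func(y, x).
total, _ = integrate.dblquad(lambda y, x: f_xy(x, y),
                             0, np.inf, lambda x: 0, lambda x: np.inf)
print(total)  # should be close to 1
```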
Properties of a Joint pdf

For $f_{XY}$ to be a valid pdf, we need to make sure the following conditions hold.

1. $f_{XY}(x, y) \ge 0$, $\forall x, y \in \mathbb{R}$;
2. $\int_{y=-\infty}^{\infty} \int_{x=-\infty}^{\infty} f_{XY}(x, y)\, dx\, dy = 1$.
2 Independence and Expectation
2.1 Independence
Independence of Random Variables

Two random variables $X$ and $Y$ are independent iff

$$P_{XY}(B_X, B_Y) = P_X(B_X) P_Y(B_Y), \quad \forall B_X, B_Y \subseteq \mathbb{R}.$$

More specifically, two discrete random variables $X$ and $Y$ are independent iff

$$p_{XY}(x, y) = p_X(x) p_Y(y), \quad \forall x, y \in \mathbb{R};$$

and two continuous random variables $X$ and $Y$ are independent iff

$$f_{XY}(x, y) = f_X(x) f_Y(y), \quad \forall x, y \in \mathbb{R}.$$
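In the finite discrete case, the factorisation criterion can be tested directly: independence holds iff the pmf matrix equals the outer product of its marginals. A minimal sketch, reusing the illustrative table from Section 1.3:

```python
import numpy as np

p_xy = np.array([[0.10, 0.20],
                 [0.30, 0.15],
                 [0.15, 0.10]])
p_x = p_xy.sum(axis=1)
p_y = p_xy.sum(axis=0)

# X and Y independent iff p_XY(x, y) = p_X(x) p_Y(y) for every (x, y):
print(np.allclose(p_xy, np.outer(p_x, p_y)))  # False for this table
```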
Conditional Distributions

For two r.v.s $X, Y$ we define the conditional probability distribution $P_{Y|X}$ by

$$P_{Y|X}(B_Y \mid B_X) = \frac{P_{XY}(B_X, B_Y)}{P_X(B_X)}, \quad B_X, B_Y \subseteq \mathbb{R}, \; P_X(B_X) \neq 0.$$

This is the revised probability of $Y$ falling inside $B_Y$ given that we now know $X \in B_X$.

Then we have: $X$ and $Y$ are independent $\iff P_{Y|X}(B_Y \mid B_X) = P_Y(B_Y)$, $\forall B_X, B_Y \subseteq \mathbb{R}$.

For discrete r.v.s $X, Y$ we define the conditional probability mass function $p_{Y|X}$ by

$$p_{Y|X}(y \mid x) = \frac{p_{XY}(x, y)}{p_X(x)}, \quad x, y \in \mathbb{R}, \; p_X(x) \neq 0,$$

and for continuous r.v.s $X, Y$ we define the conditional probability density function $f_{Y|X}$ by

$$f_{Y|X}(y \mid x) = \frac{f_{XY}(x, y)}{f_X(x)}, \quad x, y \in \mathbb{R}.$$
[Justification:

$$P(Y \le y \mid X \in [x, x + dx)) = \frac{P_{XY}([x, x + dx), (-\infty, y])}{P_X([x, x + dx))} = \frac{F(y, x + dx) - F(y, x)}{F(x + dx) - F(x)} = \frac{\{F(y, x + dx) - F(y, x)\}/dx}{\{F(x + dx) - F(x)\}/dx}$$

$$\Rightarrow f(y \mid X \in [x, x + dx)) = \frac{d}{dy} \frac{\{F(y, x + dx) - F(y, x)\}/dx}{\{F(x + dx) - F(x)\}/dx}$$

$$\Rightarrow f(y \mid x) = \lim_{dx \to 0} f(y \mid X \in [x, x + dx)) = \frac{f(y, x)}{f(x)}.]$$
In either case, $X$ and $Y$ are independent $\iff p_{Y|X}(y \mid x) = p_Y(y)$ or $f_{Y|X}(y \mid x) = f_Y(y)$, $\forall x, y \in \mathbb{R}$.
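For the finite discrete case, conditioning amounts to renormalising each row of the pmf matrix by the corresponding marginal value. A minimal sketch with the same illustrative table:

```python
import numpy as np

p_xy = np.array([[0.10, 0.20],
                 [0.30, 0.15],
                 [0.15, 0.10]])
p_x = p_xy.sum(axis=1)

# Row i of this matrix is the conditional pmf p_{Y|X}(. | x_i):
p_y_given_x = p_xy / p_x[:, None]
print(p_y_given_x.sum(axis=1))  # each row sums to 1
```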
2.2 Expectation
E{g(X, Y)}

Suppose we have a bivariate function of interest of the random variables $X$ and $Y$, $g : \mathbb{R} \times \mathbb{R} \to \mathbb{R}$.

If $X$ and $Y$ are discrete, we define $E\{g(X, Y)\}$ by

$$E_{XY}\{g(X, Y)\} = \sum_y \sum_x g(x, y)\, p_{XY}(x, y).$$

If $X$ and $Y$ are jointly continuous, we define $E\{g(X, Y)\}$ by

$$E_{XY}\{g(X, Y)\} = \int_{y=-\infty}^{\infty} \int_{x=-\infty}^{\infty} g(x, y)\, f_{XY}(x, y)\, dx\, dy.$$
Immediately from these definitions we have the following:

If $g(X, Y) = g_1(X) + g_2(Y)$,

$$E_{XY}\{g_1(X) + g_2(Y)\} = E_X\{g_1(X)\} + E_Y\{g_2(Y)\}.$$

If $g(X, Y) = g_1(X) g_2(Y)$ and $X$ and $Y$ are independent,

$$E_{XY}\{g_1(X) g_2(Y)\} = E_X\{g_1(X)\}\, E_Y\{g_2(Y)\}.$$

In particular, considering $g(X, Y) = XY$ for independent $X, Y$ we have

$$E_{XY}(XY) = E_X(X)\, E_Y(Y).$$
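In the finite discrete case the double sum is a single grid evaluation. A minimal sketch, again with illustrative support points and pmf table:

```python
import numpy as np

xs = np.array([0.0, 1.0, 2.0])          # support of X
ys = np.array([0.0, 1.0])               # support of Y
p_xy = np.array([[0.10, 0.20],
                 [0.30, 0.15],
                 [0.15, 0.10]])

g = lambda x, y: x * y                  # e.g. g(X, Y) = XY

# E{g(X, Y)} = sum over the grid of g(x, y) * p_XY(x, y):
gx, gy = np.meshgrid(xs, ys, indexing="ij")
print(np.sum(g(gx, gy) * p_xy))
```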
2.3 Conditional Expectation
$E_{Y|X}(Y \mid X = x)$

In general $E_{XY}(XY) \neq E_X(X) E_Y(Y)$.

Suppose $X$ and $Y$ are discrete r.v.s with joint pmf $p(x, y)$. If we are given the value $x$ of the r.v. $X$, our revised pmf for $Y$ is the conditional pmf $p(y \mid x)$, for $y \in \mathbb{R}$.

The conditional expectation of $Y$ given $X = x$ is therefore

$$E_{Y|X}(Y \mid X = x) = \sum_y y\, p(y \mid x).$$

Similarly, if $X$ and $Y$ were continuous,

$$E_{Y|X}(Y \mid X = x) = \int_{y=-\infty}^{\infty} y\, f(y \mid x)\, dy.$$

In either case, the conditional expectation is a function of $x$ but not of the unknown $Y$.
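Continuing the discrete sketch: each row of the conditional pmf matrix yields one value of the conditional expectation, so the result is indeed a function of $x$ alone:

```python
import numpy as np

ys = np.array([0.0, 1.0])
p_xy = np.array([[0.10, 0.20],
                 [0.30, 0.15],
                 [0.15, 0.10]])
p_y_given_x = p_xy / p_xy.sum(axis=1, keepdims=True)

# E_{Y|X}(Y | X = x_i) = sum_y y * p(y | x_i), one value per x_i:
print(p_y_given_x @ ys)
```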
2.4 Covariance and Correlation
Covariance

For a single variable $X$ we considered the expectation of $g(X) = (X - \mu_X)(X - \mu_X)$, called the variance and denoted $\sigma_X^2$.

The bivariate extension of this is the expectation of $g(X, Y) = (X - \mu_X)(Y - \mu_Y)$. We define the covariance of $X$ and $Y$ by

$$\sigma_{XY} = \mathrm{Cov}(X, Y) = E_{XY}[(X - \mu_X)(Y - \mu_Y)].$$

Correlation

Covariance measures how the random variables move in tandem with one another, and so is closely related to the idea of correlation. We define the correlation of $X$ and $Y$ by

$$\rho_{XY} = \mathrm{Cor}(X, Y) = \frac{\sigma_{XY}}{\sigma_X \sigma_Y}.$$

Unlike the covariance, the correlation is invariant to the scale of the r.v.s $X$ and $Y$. It is easily shown that if $X$ and $Y$ are independent random variables, then $\sigma_{XY} = \rho_{XY} = 0$.
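Both quantities follow directly from the expectation formulas above. A minimal sketch for the illustrative finite pmf used throughout:

```python
import numpy as np

xs = np.array([0.0, 1.0, 2.0])
ys = np.array([0.0, 1.0])
p_xy = np.array([[0.10, 0.20],
                 [0.30, 0.15],
                 [0.15, 0.10]])
p_x, p_y = p_xy.sum(axis=1), p_xy.sum(axis=0)

mu_x, mu_y = xs @ p_x, ys @ p_y
gx, gy = np.meshgrid(xs, ys, indexing="ij")

cov = np.sum((gx - mu_x) * (gy - mu_y) * p_xy)   # sigma_XY
sd_x = np.sqrt(xs**2 @ p_x - mu_x**2)
sd_y = np.sqrt(ys**2 @ p_y - mu_y**2)
print(cov, cov / (sd_x * sd_y))                  # covariance, correlation
```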
3 Examples
Example 1

Suppose that the lifetime, $X$, and brightness, $Y$, of a light bulb are modelled as continuous random variables. Let their joint pdf be given by

$$f(x, y) = \lambda_1 \lambda_2 e^{-\lambda_1 x - \lambda_2 y}, \quad x, y > 0.$$

Are lifetime and brightness independent?

The marginal pdf for $X$ is

$$f(x) = \int_{y=-\infty}^{\infty} f(x, y)\, dy = \int_{y=0}^{\infty} \lambda_1 \lambda_2 e^{-\lambda_1 x - \lambda_2 y}\, dy = \lambda_1 e^{-\lambda_1 x}.$$

Similarly $f(y) = \lambda_2 e^{-\lambda_2 y}$. Hence $f(x, y) = f(x) f(y)$ and $X$ and $Y$ are independent.
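The factorisation can be confirmed symbolically. A minimal sketch assuming sympy is available:

```python
import sympy as sp

x, y = sp.symbols("x y", positive=True)
l1, l2 = sp.symbols("lambda_1 lambda_2", positive=True)

f_xy = l1 * l2 * sp.exp(-l1 * x - l2 * y)

# Marginals: integrate out the other variable over (0, oo).
f_x = sp.integrate(f_xy, (y, 0, sp.oo))  # lambda_1 * exp(-lambda_1 * x)
f_y = sp.integrate(f_xy, (x, 0, sp.oo))  # lambda_2 * exp(-lambda_2 * y)

# The joint pdf factorises into the product of its marginals:
print(sp.simplify(f_xy - f_x * f_y) == 0)  # True => independent
```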
Example 2

Suppose continuous r.v.s $(X, Y) \in \mathbb{R}^2$ have joint pdf

$$f(x, y) = \begin{cases} \dfrac{1}{\pi}, & x^2 + y^2 \le 1, \\ 0, & \text{otherwise.} \end{cases}$$

Determine the marginal pdfs for $X$ and $Y$.

Well, $x^2 + y^2 \le 1 \iff |y| \le \sqrt{1 - x^2}$. So

$$f(x) = \int_{y=-\infty}^{\infty} f(x, y)\, dy = \int_{y=-\sqrt{1-x^2}}^{\sqrt{1-x^2}} \frac{1}{\pi}\, dy = \frac{2}{\pi} \sqrt{1 - x^2}.$$

Similarly $f(y) = \frac{2}{\pi} \sqrt{1 - y^2}$. Hence $f(x, y) \neq f(x) f(y)$ and $X$ and $Y$ are not independent.
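The same symbolic check applies here. A minimal sketch, again assuming sympy:

```python
import sympy as sp

x, y = sp.symbols("x y", real=True)

# Marginal of the uniform density on the unit disc: integrate 1/pi
# over |y| <= sqrt(1 - x^2).
f_x = sp.integrate(1 / sp.pi, (y, -sp.sqrt(1 - x**2), sp.sqrt(1 - x**2)))
print(sp.simplify(f_x))  # 2*sqrt(1 - x**2)/pi

# Inside the disc f(x, y) = 1/pi, which differs from f(x)*f(y),
# so X and Y are not independent.
```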