Booklet for Exam
8/8/2019
Version 1, January 2009
Probability
Definition
Outcome: An outcome of an experiment is any possible observation of that experiment.
Sample Space: The sample space of an experiment is the finest-grain, mutually exclusive, collectively exhaustive set of all possible outcomes.
Event: An event is a set of outcomes of an experiment.
De Morgan's law:
$(A \cup B)^c = A^c \cap B^c$
Axioms of Probability: A probability measure P[·] is a function that maps events in the sample space to real numbers such that
Axiom 1. For any event A, $P[A] \ge 0$.
Axiom 2. $P[S] = 1$.
Axiom 3. For any countable collection $A_1, A_2, \ldots$ of mutually exclusive events,
$P[A_1 \cup A_2 \cup \cdots] = P[A_1] + P[A_2] + \cdots$
Theorem: For any event A and event space $\{B_1, B_2, \ldots, B_m\}$,
$P[A] = \sum_{i=1}^{m} P[A \cap B_i]$
Conditional Probability:
$P[A \mid B] = \dfrac{P[AB]}{P[B]}$
Law of Total Probability:
If $B_1, B_2, \ldots, B_m$ is an event space and $P[B_i] > 0$ for $i = 1, \ldots, m$, then
$P[A] = \sum_{i=1}^{m} P[A \mid B_i]\, P[B_i]$
Independent Events: Events A and B are independent if and only if
$P[AB] = P[A]\, P[B]$
Bayes' Theorem:
$P[B \mid A] = \dfrac{P[A \mid B]\, P[B]}{P[A]}$
$P[B_i \mid A] = \dfrac{P[A \mid B_i]\, P[B_i]}{\sum_{i=1}^{m} P[A \mid B_i]\, P[B_i]}$
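A quick numeric sketch of the second form of Bayes' theorem over an event space {B1, B2}; the prior and conditional probabilities are illustrative assumptions, not from this booklet:

```python
# Illustrative Bayes computation: B1 = "condition present" (prior 0.01),
# B2 = "condition absent", A = "test positive". All numbers hypothetical.
p_B = [0.01, 0.99]            # P[B_i]
p_A_given_B = [0.99, 0.05]    # P[A | B_i]

# Denominator via the law of total probability: P[A] = sum_i P[A|B_i] P[B_i]
p_A = sum(pa * pb for pa, pb in zip(p_A_given_B, p_B))

# Bayes' theorem: P[B1 | A] = P[A|B1] P[B1] / P[A]
p_B1_given_A = p_A_given_B[0] * p_B[0] / p_A
print(round(p_B1_given_A, 4))  # 0.1667
```

Even with a strong test, the small prior keeps the posterior well below 1.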
Theorem:
The number of k-permutations of n distinguishable objects is
$(n)_k = n(n-1)(n-2)\cdots(n-k+1) = \dfrac{n!}{(n-k)!}$
The number of k-combinations of n distinguishable objects is
$\dbinom{n}{k} = \dfrac{n!}{k!\,(n-k)!}$
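Both counting formulas can be evaluated directly with Python's standard library (3.8+); a minimal sketch:

```python
import math

n, k = 5, 2
n_perm = math.perm(n, k)  # k-permutations: n!/(n-k)! = 5*4 = 20
n_comb = math.comb(n, k)  # k-combinations: n!/(k!(n-k)!) = 10

# Each combination corresponds to k! orderings, so the counts differ by k!
assert n_perm == n_comb * math.factorial(k)
print(n_perm, n_comb)  # 20 10
```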
Discrete Random Variables
Expectation:
$E[X] = \mu_X = \sum_{x \in S_X} x\, P_X(x)$
Theorem:
Given a random variable X with PMF $P_X(x)$ and the derived random variable $Y = g(X)$, the expected value of Y is
$E[Y] = \mu_Y = \sum_{x \in S_X} g(x)\, P_X(x)$
Variance:
$\mathrm{Var}[X] = \sigma_X^2 = E\big[(X - \mu_X)^2\big]$
$\mathrm{Var}[X] = E[X^2] - \mu_X^2 = E[X^2] - (E[X])^2$
Theorem:
If $Y = X + b$, $\mathrm{Var}[Y] = \mathrm{Var}[X]$.
Theorem:
If $Y = aX$, $\mathrm{Var}[Y] = a^2\, \mathrm{Var}[X]$.
Bernoulli RV
For $0 \le p \le 1$,
$P_X(x) = \begin{cases} 1-p & x = 0 \\ p & x = 1 \\ 0 & \text{otherwise} \end{cases}$
$E[X] = p$, $\mathrm{Var}[X] = p(1-p)$
Binomial RV
For a positive integer n and $0 \le p \le 1$,
$P_X(x) = \begin{cases} \binom{n}{x}\, p^x (1-p)^{n-x} & x = 0, 1, 2, \ldots, n \\ 0 & \text{otherwise} \end{cases}$
$E[X] = np$, $\mathrm{Var}[X] = np(1-p)$
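The binomial PMF and its moments can be checked numerically; a small sketch (the helper name `binomial_pmf` is illustrative):

```python
import math

def binomial_pmf(x, n, p):
    """Binomial(n, p) PMF, straight from the formula above."""
    if 0 <= x <= n:
        return math.comb(n, x) * p**x * (1 - p)**(n - x)
    return 0.0

n, p = 10, 0.3
pmf = [binomial_pmf(x, n, p) for x in range(n + 1)]

mean = sum(x * pmf[x] for x in range(n + 1))
var = sum((x - mean)**2 * pmf[x] for x in range(n + 1))

# Agrees with E[X] = np and Var[X] = np(1-p)
assert abs(mean - n * p) < 1e-9
assert abs(var - n * p * (1 - p)) < 1e-9
```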
Geometric RV
For $0 < p \le 1$,
$P_X(x) = \begin{cases} p(1-p)^{x-1} & x = 1, 2, \ldots \\ 0 & \text{otherwise} \end{cases}$
$E[X] = \dfrac{1}{p}$, $\mathrm{Var}[X] = \dfrac{1-p}{p^2}$
Theorem:
For a random variable X and event space $\{B_1, \ldots, B_m\}$,
$P_X(x) = \sum_{i=1}^{m} P_{X|B_i}(x)\, P[B_i]$
Theorem:
The conditional PMF $P_{X|B}(x)$ of X given B satisfies
$P_{X|B}(x) = \begin{cases} \dfrac{P_X(x)}{P[B]} & x \in B \\ 0 & \text{otherwise} \end{cases}$
Conditional Expected Value: The conditional expected value of random variable X given condition B is
$E[X \mid B] = \mu_{X|B} = \sum_{x \in B} x\, P_{X|B}(x)$
Theorem: For a random variable X resulting from an experiment with event space $B_1, \ldots, B_m$,
$E[X] = \sum_{i=1}^{m} E[X \mid B_i]\, P[B_i]$
Continuous Random Variables
CDF: $F_X(x) = P[X \le x]$
PDF: $f_X(x) = \dfrac{dF_X(x)}{dx}$
Theorem:
$P[x_1 < X \le x_2] = \int_{x_1}^{x_2} f_X(x)\, dx$
Expectation:
$E[X] = \int_{-\infty}^{\infty} x\, f_X(x)\, dx$
Uniform RV - Continuous
For constants a < b,
$f_X(x) = \begin{cases} \dfrac{1}{b-a} & a \le x < b \\ 0 & \text{otherwise} \end{cases}$
Exponential RV
For $\lambda > 0$,
$f_X(x) = \begin{cases} \lambda e^{-\lambda x} & x \ge 0 \\ 0 & \text{otherwise} \end{cases}$
$E[X] = \dfrac{1}{\lambda}$, $\mathrm{Var}[X] = \dfrac{1}{\lambda^2}$
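Since the exponential CDF is $F_X(x) = 1 - e^{-\lambda x}$, samples can be drawn by inverting it: $X = -\ln(1-U)/\lambda$ with U uniform on (0, 1). A simulation sketch checking the moments above (seed and sample count are arbitrary choices):

```python
import math
import random

random.seed(0)
lam = 2.0
# Inverse-CDF sampling: X = -ln(1 - U)/lambda, U ~ Uniform(0, 1)
samples = [-math.log(1 - random.random()) / lam for _ in range(200_000)]

mean = sum(samples) / len(samples)
var = sum((x - mean)**2 for x in samples) / len(samples)

# Close to E[X] = 1/lambda = 0.5 and Var[X] = 1/lambda^2 = 0.25
print(round(mean, 2), round(var, 2))
```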
Erlang RV
For a positive integer n and $\lambda > 0$,
$f_X(x) = \begin{cases} \dfrac{\lambda^n x^{n-1} e^{-\lambda x}}{(n-1)!} & x \ge 0 \\ 0 & \text{otherwise} \end{cases}$
$E[X] = \dfrac{n}{\lambda}$, $\mathrm{Var}[X] = \dfrac{n}{\lambda^2}$
Gaussian RV
For constants $\sigma > 0$, $-\infty < \mu < \infty$,
$f_X(x) = \dfrac{1}{\sqrt{2\pi\sigma^2}}\, e^{-(x-\mu)^2/(2\sigma^2)}, \quad -\infty < x < \infty$
$E[X] = \mu$, $\mathrm{Var}[X] = \sigma^2$
Standard Normal Random Variable Z:
$z = \dfrac{x - \mu}{\sigma}$
Standard Normal CDF:
$\Phi(z) = \dfrac{1}{\sqrt{2\pi}} \int_{-\infty}^{z} e^{-u^2/2}\, du$
Standard Normal Complementary CDF:
$Q(z) = P[Z > z] = \dfrac{1}{\sqrt{2\pi}} \int_{z}^{\infty} e^{-u^2/2}\, du = 1 - \Phi(z)$
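Both functions are available numerically through the error function, via the identity $\Phi(z) = \tfrac{1}{2}\big(1 + \mathrm{erf}(z/\sqrt{2})\big)$; a minimal sketch:

```python
import math

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def Q(z):
    """Standard normal complementary CDF, Q(z) = 1 - Phi(z)."""
    return 1.0 - phi(z)

# Sanity checks against well-known values
assert abs(phi(0.0) - 0.5) < 1e-12          # symmetry: Phi(0) = 1/2
assert abs(Q(1.0) - 0.15866) < 1e-4         # Q(1) ~ 0.1587
assert abs(phi(1.64) + Q(1.64) - 1.0) < 1e-12
```

This replaces a lookup in the standard normal tables of Appendices 1 and 2.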
Theorem:
Given random variable X and a constant a > 0, the PDF and CDF of $Y = aX$ are
$f_Y(y) = \dfrac{1}{a}\, f_X\!\left(\dfrac{y}{a}\right), \qquad F_Y(y) = F_X\!\left(\dfrac{y}{a}\right)$
Theorem:
Given random variable X, the PDF and CDF of $V = X + b$ are
$f_V(v) = f_X(v - b), \qquad F_V(v) = F_X(v - b)$
Pairs of Random Variables
Joint CDF: $F_{X,Y}(x, y) = P[X \le x, Y \le y]$
$F_{X,Y}(x, y) = \int_{-\infty}^{x} \int_{-\infty}^{y} f_{X,Y}(u, v)\, dv\, du$
Joint PDF:
$f_{X,Y}(x, y) = \dfrac{\partial^2 F_{X,Y}(x, y)}{\partial x\, \partial y}$
Marginal PDF:
$f_X(x) = \int_{-\infty}^{\infty} f_{X,Y}(x, y)\, dy, \qquad f_Y(y) = \int_{-\infty}^{\infty} f_{X,Y}(x, y)\, dx$
Independence:
$f_{X,Y}(x, y) = f_X(x)\, f_Y(y)$
Joint Probability Mass Function:
$P_{X,Y}(x, y) = P[X = x, Y = y]$
Marginal PMF:
$P_X(x) = \sum_{y} P_{X,Y}(x, y), \qquad P_Y(y) = \sum_{x} P_{X,Y}(x, y)$
Expectation:
The expected value of the discrete random variable $W = g(X, Y)$ is
$E[W] = \sum_{x \in S_X} \sum_{y \in S_Y} g(x, y)\, P_{X,Y}(x, y)$
The expectation of $g(X, Y) = g_1(X, Y) + \cdots + g_n(X, Y)$ is
$E[g(X, Y)] = E[g_1(X, Y)] + \cdots + E[g_n(X, Y)]$
Expectation of the sum of 2 RVs:
$E[X + Y] = E[X] + E[Y]$
Variance of the sum of 2 RVs:
$\mathrm{Var}[X + Y] = \mathrm{Var}[X] + \mathrm{Var}[Y] + 2E\big[(X - \mu_X)(Y - \mu_Y)\big]$
$\mathrm{Var}[X + Y] = \mathrm{Var}[X] + \mathrm{Var}[Y] + 2\,\mathrm{Cov}[X, Y]$
Covariance:
$\mathrm{Cov}[X, Y] = E\big[(X - \mu_X)(Y - \mu_Y)\big]$
Correlation:
$r_{XY} = E[XY]$
$\mathrm{Cov}[X, Y] = E[XY] - \mu_X \mu_Y$
Correlation Coefficient:
$\rho_{XY} = \dfrac{\mathrm{Cov}[X, Y]}{\sqrt{\mathrm{Var}[X]\, \mathrm{Var}[Y]}}$
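A numeric check that the two covariance forms agree, and of the correlation coefficient, on a small joint PMF chosen purely for illustration:

```python
# P_{X,Y}(x, y) for a pair of dependent Bernoulli-valued RVs (illustrative)
joint = {
    (0, 0): 0.2, (0, 1): 0.1,
    (1, 0): 0.1, (1, 1): 0.6,
}

def ev(g):
    """E[g(X, Y)] = sum over the joint PMF."""
    return sum(g(x, y) * p for (x, y), p in joint.items())

mx, my = ev(lambda x, y: x), ev(lambda x, y: y)
cov = ev(lambda x, y: (x - mx) * (y - my))

# Equivalent form: Cov[X,Y] = E[XY] - mu_X mu_Y
assert abs(cov - (ev(lambda x, y: x * y) - mx * my)) < 1e-12

vx = ev(lambda x, y: (x - mx)**2)
vy = ev(lambda x, y: (y - my)**2)
rho = cov / (vx * vy)**0.5
print(round(cov, 3), round(rho, 3))
```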
Sums of Random Variables
$W_n = X_1 + \cdots + X_n$
Expected value of $W_n$:
$E[W_n] = E[X_1] + \cdots + E[X_n]$
Variance of $W_n$:
$\mathrm{Var}[W_n] = \sum_{i=1}^{n} \mathrm{Var}[X_i] + 2 \sum_{i=1}^{n-1} \sum_{j=i+1}^{n} \mathrm{Cov}[X_i, X_j]$
PDF of W = X + Y:
$f_W(w) = \int_{-\infty}^{\infty} f_{X,Y}(x, w - x)\, dx = \int_{-\infty}^{\infty} f_{X,Y}(w - y, y)\, dy$
PDF of W = X + Y, when X and Y are independent:
$f_W(w) = \int_{-\infty}^{\infty} f_X(x)\, f_Y(w - x)\, dx = \int_{-\infty}^{\infty} f_X(w - y)\, f_Y(y)\, dy$
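The discrete analogue of the independent-sum result replaces the integral with a sum: the PMF of W = X + Y is the convolution of the two PMFs. A sketch, with PMFs represented as value-to-probability dicts:

```python
from collections import defaultdict

def pmf_of_sum(pX, pY):
    """PMF of W = X + Y for independent discrete X, Y (convolution)."""
    pW = defaultdict(float)
    for x, px in pX.items():
        for y, py in pY.items():
            pW[x + y] += px * py
    return dict(pW)

die = {k: 1 / 6 for k in range(1, 7)}   # fair six-sided die
two_dice = pmf_of_sum(die, die)

assert abs(two_dice[7] - 6 / 36) < 1e-12    # 7 is the most likely total
assert abs(sum(two_dice.values()) - 1.0) < 1e-12
```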
Moment Generating Function (MGF):
$\phi_X(s) = E\big[e^{sX}\big]$
The MGF satisfies $\phi_X(s)\big|_{s=0} = 1$.
The MGF of $Y = aX + b$ satisfies
$\phi_Y(s) = e^{sb}\, \phi_X(as)$
The nth moment:
$E[X^n] = \dfrac{d^n \phi_X(s)}{ds^n}\bigg|_{s=0}$
Theorem: For $X_1, \ldots, X_n$ a sequence of independent RVs, the MGF of $W = X_1 + \cdots + X_n$ is
$\phi_W(s) = \phi_{X_1}(s)\, \phi_{X_2}(s) \cdots \phi_{X_n}(s)$
For iid $X_1, \ldots, X_n$, each with MGF $\phi_X(s)$, the MGF of $W = X_1 + \cdots + X_n$ is
$\phi_W(s) = \big[\phi_X(s)\big]^n$
Random Sums of Independent RVs:
$R = X_1 + \cdots + X_N$
$\phi_R(s) = \phi_N\big(\ln \phi_X(s)\big)$
$E[R] = E[N]\, E[X]$
$\mathrm{Var}[R] = E[N]\, \mathrm{Var}[X] + \mathrm{Var}[N]\, (E[X])^2$
Central Limit Theorem Approximation:
Let $W_n = X_1 + \cdots + X_n$ be the sum of n iid random variables with $E[X] = \mu_X$ and $\mathrm{Var}[X] = \sigma^2$. The CDF of $W_n$ may be approximated by
$F_{W_n}(w) \approx \Phi\!\left(\dfrac{w - n\mu_X}{\sqrt{n\sigma^2}}\right)$
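The approximation can be checked against an exact binomial CDF, since a sum of n iid Bernoulli(p) variables is binomial(n, p) with $\mu_X = p$ and $\sigma^2 = p(1-p)$; a sketch (n, p, and w are arbitrary choices):

```python
import math

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

n, p = 100, 0.5
w = 55

# Exact CDF of W_n ~ binomial(n, p) at w
exact = sum(math.comb(n, x) * p**x * (1 - p)**(n - x) for x in range(w + 1))

# CLT approximation: Phi((w - n*mu_X) / sqrt(n*sigma^2))
approx = phi((w - n * p) / math.sqrt(n * p * (1 - p)))

# The two agree to within a few percent at this n
assert abs(exact - approx) < 0.05
print(round(exact, 3), round(approx, 3))
```

(Adding a continuity correction, evaluating at w + 0.5, tightens the agreement further.)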
Stochastic Process
The Expected Value of a Process: $\mu_X(t) = E[X(t)]$
Poisson Process of rate $\lambda$:
$P_{N(T)}(n) = \begin{cases} \dfrac{(\lambda T)^n e^{-\lambda T}}{n!} & n = 0, 1, 2, \ldots \\ 0 & \text{otherwise} \end{cases}$
Theorem:
For a Poisson process of rate $\lambda$, the inter-arrival times $X_1, X_2, \ldots$ are an iid random sequence with the exponential PDF
$f_X(x) = \begin{cases} \lambda e^{-\lambda x} & x \ge 0 \\ 0 & \text{otherwise} \end{cases}$
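This theorem gives a direct way to simulate a Poisson process: accumulate iid exponential inter-arrival times. A simulation sketch checking that the arrival count in [0, T] has mean close to $\lambda T$ (rate, horizon, and trial count are arbitrary choices):

```python
import random

random.seed(1)
lam, T, trials = 3.0, 10.0, 5000

def arrivals_in(T, lam):
    """Count arrivals in [0, T] by summing exponential inter-arrival times."""
    t, count = 0.0, 0
    while True:
        t += random.expovariate(lam)   # iid exponential(lam) inter-arrival
        if t > T:
            return count
        count += 1

counts = [arrivals_in(T, lam) for _ in range(trials)]
mean = sum(counts) / trials
print(round(mean, 1))  # close to E[N(T)] = lam * T = 30
```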
Autocovariance:
$C_X(t, \tau) = \mathrm{Cov}\big[X(t), X(t + \tau)\big]$
Theorem:
$C_X(t, \tau) = R_X(t, \tau) - \mu_X(t)\, \mu_X(t + \tau)$
Autocorrelation: $R_X(t, \tau) = E\big[X(t)\, X(t + \tau)\big]$
Stationary Process:
$f_{X(t_1), \ldots, X(t_m)}(x_1, \ldots, x_m) = f_{X(t_1 + \tau), \ldots, X(t_m + \tau)}(x_1, \ldots, x_m)$
Properties of a stationary process:
$\mu_X(t) = \mu_X$
$R_X(t, \tau) = R_X(0, \tau) = R_X(\tau)$
$C_X(t, \tau) = R_X(\tau) - \mu_X^2 = C_X(\tau)$
Wide Sense Stationary (WSS) Random Process:
$E[X(t)] = \mu_X$
$R_X(t, \tau) = R_X(0, \tau) = R_X(\tau)$
Properties of ACF for WSS RP:
$R_X(0) \ge 0$
$R_X(\tau) = R_X(-\tau)$
$|R_X(\tau)| \le R_X(0)$
Average Power of a WSS RP:
$R_X(0) = E\big[X^2(t)\big]$
Cross Correlation Function (CCF): The cross correlation of random processes X(t) and Y(t) is
$R_{XY}(t, \tau) = E\big[X(t)\, Y(t + \tau)\big]$
Jointly WSS Processes: The random processes X(t) and Y(t) are jointly wide sense stationary if X(t) and Y(t) are each wide sense stationary, and the cross correlation satisfies
$R_{XY}(t, \tau) = R_{XY}(\tau)$
Random Signal Processing
Theorem: If the input to a linear time invariant filter with impulse response h(t) is a WSS RP X(t), the output Y(t) is a WSS RP with mean value
$\mu_Y = \mu_X \int_{-\infty}^{\infty} h(t)\, dt = \mu_X H(0)$,
and ACF
$R_Y(\tau) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} h(u)\, h(v)\, R_X(\tau + u - v)\, dv\, du$
Power Spectral Density (PSD): For a WSS RP X(t), the ACF $R_X(\tau)$ and PSD $S_X(f)$ are the Fourier transform pair
$S_X(f) = \int_{-\infty}^{\infty} R_X(\tau)\, e^{-j 2\pi f \tau}\, d\tau$
$R_X(\tau) = \int_{-\infty}^{\infty} S_X(f)\, e^{j 2\pi f \tau}\, df$
Theorem: For a WSS process X(t), the PSD $S_X(f)$ is a real-valued function with the following properties:
$S_X(f) \ge 0$
$\int_{-\infty}^{\infty} S_X(f)\, df = E\big[X^2(t)\big] = R_X(0)$
$S_X(-f) = S_X(f)$
Power Spectral Density of a Random Sequence: The PSD of a WSS random sequence $X_n$ is
$S_X(\phi) = \lim_{L \to \infty} \dfrac{1}{2L + 1}\, E\left[\left|\sum_{n=-L}^{L} X_n\, e^{-j 2\pi \phi n}\right|^2\right]$
Cross Spectral Density:
$S_{XY}(f) = \int_{-\infty}^{\infty} R_{XY}(\tau)\, e^{-j 2\pi f \tau}\, d\tau$
Theorem: When a WSS RP X(t) is the input to a linear time invariant filter with frequency response H(f), the PSD of the output Y(t) is
$S_Y(f) = |H(f)|^2\, S_X(f)$
Theorem: Let X(t) be a wide sense stationary input to a linear time invariant filter H(f). The input X(t) and output Y(t) satisfy
$S_{XY}(f) = H(f)\, S_X(f)$
$S_Y(f) = H^*(f)\, S_{XY}(f)$
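A discrete-time analogue of the filtering theorems can be checked by simulation: pass a white WSS sequence $X_n$ (zero mean, unit variance) through the two-tap FIR filter $Y_n = (X_n + X_{n-1})/2$, whose theoretical output ACF is $R_Y(0) = 1/2$, $R_Y(\pm 1) = 1/4$, and $R_Y(k) = 0$ otherwise. A sketch (the filter and sample size are illustrative choices, not from the booklet):

```python
import random

random.seed(2)
N = 200_000
x = [random.gauss(0.0, 1.0) for _ in range(N)]        # white, R_X(k) = delta(k)
y = [(x[n] + x[n - 1]) / 2.0 for n in range(1, N)]    # Y_n = (X_n + X_{n-1})/2

def acf(seq, k):
    """Empirical autocorrelation estimate of R(k) = E[Y_n Y_{n+k}]."""
    m = len(seq) - k
    return sum(seq[n] * seq[n + k] for n in range(m)) / m

# Estimates should be near 0.5, 0.25, and 0 respectively
print(round(acf(y, 0), 3), round(acf(y, 1), 3), round(acf(y, 2), 3))
```

Equivalently, in the frequency domain the output PSD is $|H(\phi)|^2$ times the flat input PSD, consistent with $S_Y(f) = |H(f)|^2 S_X(f)$ above.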
APPENDIX 1: Standard normal CDF
APPENDIX 2: The standard normal complementary CDF
APPENDIX 3: Moment Generating Function for families of random variables
APPENDIX 4: Fourier Transform Pairs
APPENDIX 5: Mathematical Tables
Trigonometric Identities
Approximation
Indefinite Integrals
Definite Integrals
Sequences and Series