ADC - Lecture 2 - Probability

DESCRIPTION

Introduction to probability: how to calculate the probability, PDF, and CDF of a communication message, and how to find its expected value and median.

TRANSCRIPT

Advanced Digital Communication

Probability

Random Experiment

• Random experiment: an experiment whose outcome, for some reason, cannot be predicted with certainty. Examples: throwing a die, flipping a coin, and drawing a card from a deck.

[Diagram: Procedure (e.g., flipping a coin) → Outcome (e.g., the value observed, head or tail, after flipping the coin) → Sample Space (the set of all possible outcomes).]

Sample Space

• Sample space: the set of all possible outcomes, denoted by S. Outcomes are denoted by ω's, and each ω lies in S, i.e., ω ∈ S.

• A sample space can be discrete or continuous.

• Events are subsets of the sample space for which measures of their occurrences, called probabilities, can be defined or determined.

Example

• For the roll of a die, various events can be defined: "the outcome is an even number of dots", "the outcome is fewer than 4 dots", "the outcome is more than 3 dots", etc.

Probability

• Consider the rolling of a die with six possible outcomes.

• The sample space S consists of all these possible outcomes, i.e. S = {1, 2, 3, 4, 5, 6}.

• Consider an event A which is a subset of S, say A = {2, 4}. Ac is the complement of A, which consists of all points of S not in A, i.e. Ac = {1, 3, 5, 6}.

• Two events are mutually exclusive if they have no points in common. A and Ac are mutually exclusive events.

Probability

• The union of two events is the event which contains all sample points in either event, i.e. A ∪ Ac = S.

• If B = {1, 3, 6} and C = {1, 2, 3} are events of S, then their intersection is the event containing the points common to both, i.e. E = B ∩ C = {1, 3}.

• For mutually exclusive events the intersection is the null event, i.e. A ∩ Ac = ∅.

Probability

• P(A) is the probability of event A in S.

• The probability of event A satisfies the condition P(A) ≥ 0.

• The probability of the sample space is P(S) = 1.

• Mutually exclusive events cannot both occur: their intersection is the null event and the probability of their union is the sum of the individual probabilities, i.e. P(A and B) = 0 and P(A or B) = P(A) + P(B).

Joint event and probabilities

• Perform two experiments together and consider their outcomes, e.g. a single toss of two dice.

• The sample space consists of 36 two-tuples (i, j), where i, j = 1, 2, …, 6. In general, if one experiment has outcomes Ai, i = 1, 2, …, n and the second experiment has outcomes Bj, j = 1, 2, …, m, the combined experiment has joint outcomes (Ai, Bj), i = 1, 2, …, n, j = 1, 2, …, m.

• The joint probability satisfies the condition 0 ≤ P(Ai, Bj) ≤ 1.


Joint event and probabilities

• Since the outcomes Bj, j = 1, 2, …, m are mutually exclusive (and together exhaust the sample space of the second experiment), summing over them recovers the marginal probability: ∑_{j=1}^{m} P(Ai, Bj) = P(Ai)

• Similarly we have ∑_{i=1}^{n} P(Ai, Bj) = P(Bj).

• If the outcomes of the two experiments are mutually exclusive, ∑_{i=1}^{n} ∑_{j=1}^{m} P(Ai, Bj) = 1 (these relations are illustrated in the sketch below).
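As an illustration (not part of the original slides), here is a minimal Python sketch of these relations for the single toss of two fair dice; the variable names are illustrative choices:

```python
# Minimal sketch: joint pmf of a single toss of two fair dice,
# used to check the marginalization identities above.
from fractions import Fraction

# Joint outcomes (i, j), i, j = 1, ..., 6, each with probability 1/36.
joint = {(i, j): Fraction(1, 36) for i in range(1, 7) for j in range(1, 7)}

# 0 <= P(Ai, Bj) <= 1 and the joint probabilities sum to 1.
assert all(0 <= p <= 1 for p in joint.values())
assert sum(joint.values()) == 1

# Summing the joint pmf over the second die recovers the marginal P(i) = 1/6.
marginals = {i: sum(p for (a, b), p in joint.items() if a == i) for i in range(1, 7)}
print(marginals[1])  # 1/6
```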

Important properties of probability measures

• P(Ac) = 1 − P(A), where Ac denotes the complement of A. This property implies that P(Ac) + P(A) = 1, i.e., something has to happen.

• P(∅) = 0 (again, something has to happen).

• P(A ∪ B) = P(A) + P(B) − P(A ∩ B). Note that if two events A and B are mutually exclusive then P(A ∪ B) = P(A) + P(B); otherwise the common probability P(A ∩ B) needs to be subtracted off (checked numerically after this list).

• If A ⊆ B then P(A) ≤ P(B). This says that if event A is contained in B, then the occurrence of A means B has occurred, but the converse is not true.
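A quick numerical check of the third property, using the die events B = {1, 3, 6} and C = {1, 2, 3} from the earlier slide (a sketch assuming all six faces are equally likely):

```python
# Sketch: P(B ∪ C) = P(B) + P(C) − P(B ∩ C) for a fair die (equally likely faces assumed).
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}                              # sample space of one die
prob = lambda event: Fraction(len(event), len(S))   # P(E) = |E| / |S| for a fair die

B, C = {1, 3, 6}, {1, 2, 3}
lhs = prob(B | C)                        # P(B ∪ C) = P({1, 2, 3, 6}) = 4/6
rhs = prob(B) + prob(C) - prob(B & C)    # 3/6 + 3/6 − 2/6 = 4/6
assert lhs == rhs == Fraction(2, 3)
```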

Conditional Probability

• Suppose event B has occurred and we wish to determine the probability of occurrence of event A.

• The conditional probability of event A given the occurrence of event B is: P(A|B) = P(A, B)/P(B), provided P(B) > 0 (see the sketch after this list).

• In a similar way, B conditioned on the occurrence of A is given by: P(B|A) = P(A, B)/P(A), provided P(A) > 0.

• P(A, B) is the probability of the simultaneous occurrence of A and B, i.e. of A ∩ B. For mutually exclusive events P(A|B) = 0. If A is a subset of B, A ∩ B = A, so P(A|B) = P(A)/P(B).

• If B is a subset of A, A ∩ B = B, so P(A|B) = P(B)/P(B) = 1.
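A numerical sketch of the conditional-probability formula, reusing the die events B = {1, 3, 6} and C = {1, 2, 3} (equally likely faces assumed):

```python
# Sketch: P(B|C) = P(B ∩ C) / P(C) for a fair die.
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}
prob = lambda event: Fraction(len(event), len(S))

B, C = {1, 3, 6}, {1, 2, 3}
p_B_given_C = prob(B & C) / prob(C)   # (2/6) / (3/6)
print(p_B_given_C)                    # 2/3
```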

Bayes’ Rule

• Since P(A, B) = P(A|B)P(B) = P(B|A)P(A), we have

• P(A|B) = P(B|A)P(A)/P(B)

• where P(A), the prior, is the initial degree of belief in A; P(A|B), the posterior, is the degree of belief having accounted for B; and the quotient P(B|A)/P(B) represents the support B provides for A.
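A small numerical sketch of Bayes' rule; the event labels and the prior/likelihood values below are illustrative assumptions, not taken from the slides:

```python
# Sketch of Bayes' rule with assumed, purely illustrative numbers:
# A = "bit 1 was sent", B = "detector output exceeds the threshold".
p_A = 0.5            # prior P(A) (assumed)
p_B_given_A = 0.9    # likelihood P(B|A) (assumed)
p_B_given_Ac = 0.2   # P(B|Ac) (assumed)

# Total probability: P(B) = P(B|A)P(A) + P(B|Ac)P(Ac)
p_B = p_B_given_A * p_A + p_B_given_Ac * (1 - p_A)

# Posterior: P(A|B) = P(B|A)P(A) / P(B)
p_A_given_B = p_B_given_A * p_A / p_B
print(round(p_A_given_B, 4))  # 0.8182
```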

Statistically Independent

• Consider two or more experiments, or repeated trials of the same experiment.

• Consider the case of the conditional probability P(A|B) and suppose A does not depend on B, so that P(A|B) = P(A).

• Then P(A, B) = P(A|B)P(B) = P(A)P(B) is the joint probability of the statistically independent events A and B.

• This can be extended to three or more events, e.g. P(A, B, C) = P(A)P(B)P(C) (a check for two dice is sketched below).
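A sketch of this factorization for the two-dice experiment used earlier; the particular events A and B are illustrative choices:

```python
# Sketch: statistical independence for a single toss of two fair dice.
# A = "first die shows an even number", B = "second die shows more than 3 dots".
from fractions import Fraction

joint = {(i, j): Fraction(1, 36) for i in range(1, 7) for j in range(1, 7)}

p_A  = sum(p for (i, j), p in joint.items() if i % 2 == 0)            # 1/2
p_B  = sum(p for (i, j), p in joint.items() if j > 3)                 # 1/2
p_AB = sum(p for (i, j), p in joint.items() if i % 2 == 0 and j > 3)  # 9/36 = 1/4

assert p_AB == p_A * p_B   # P(A, B) = P(A)P(B): A and B are statistically independent
```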

• All useful message signals appear random; that is, the receiver does not know, a priori, which of the possible waveforms has been sent.

• Let a random variable X(A) represent the functional relationship between a random event A and a real number.

• Notation: capital letters, usually X or Y, are used to denote random variables. Corresponding lower-case letters, x or y, are used to denote particular values of the random variables X or Y.

• Example: P(X ≤ 3) means the probability that the random variable X will take a value less than or equal to 3.

Random Variables

Types of Random Variables

• Discrete Random Variable

• Continuous Random Variable

• Mixed Random Variable

Discrete Random Variable

• Discrete random variables have a countable (finite or infinite) image, e.g.

Sx = {0, 1}

Sx = {…, -3, -2, -1, 0, 1, 2, 3, …}

• The probability mass function is the discrete analogue of the probability density function; it provides the probability of a particular point in the sample space of a discrete random variable.

• The probability mass function p(x) specifies the probability of each outcome x and has the properties:

p(x) ≥ 0

∑_x p(x) = 1

P(X = x) = p(x)

Probability Mass Function (pmf)
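A minimal sketch of these pmf properties for the random variable generated by flipping a fair coin (the 0/1 labelling of tails/heads is an assumed convention):

```python
# Sketch: pmf of a fair-coin random variable (X = 0 for tails, X = 1 for heads, assumed labelling).
from fractions import Fraction

pmf = {0: Fraction(1, 2), 1: Fraction(1, 2)}

assert all(p >= 0 for p in pmf.values())   # p(x) >= 0
assert sum(pmf.values()) == 1              # sum over all x of p(x) = 1
print(pmf[1])                              # P(X = 1) = 1/2
```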

Cumulative Distribution Function (cdf)

• The cdf specifies the probability that the random variable will assume a value less than or equal to a given value x.

• The cumulative distribution function, F(x), of a discrete random variable X with probability mass function, p(x), is given by:

F(x) = P(X ≤ x) = ∑_{t ≤ x} p(t)

Note: for a continuous random variable the cdf is obtained by integrating the pdf, whereas for a discrete random variable it is obtained by summing the pmf; generally the cdf is discussed for continuous variables, but here we use its relationship with the pmf.
Relationship
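A sketch of this pmf-to-cdf relationship, reusing the fair-coin pmf above; the cdf helper is an illustrative function, not something defined in the slides:

```python
# Sketch: building the cdf F(x) = P(X <= x) from a pmf by summation.
from fractions import Fraction

pmf = {0: Fraction(1, 2), 1: Fraction(1, 2)}   # fair coin; a fair die would use six values of 1/6

def cdf(x, pmf):
    # F(x) = sum of p(t) over all outcomes t <= x
    return sum(p for t, p in pmf.items() if t <= x)

print(cdf(-1, pmf), cdf(0, pmf), cdf(1, pmf))  # 0  1/2  1
```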

Examples of cdf of discrete random variables

• The cdf of the discrete random variable generated by flipping a fair coin is:

• Similarly, the cdf of the discrete random variable generated by tossing a fair die is:

Mean, Standard Deviation and Variance of a Discrete Random Variable X

• Mean or expected value:

µ = E(X) = ∑_{all x} x p(x)

• Standard deviation:

σ = [ ∑_{all x} (x − µ)² p(x) ]^(1/2) = [ ∑_{all x} x² p(x) − µ² ]^(1/2)

Mean, Standard Deviation and Variance of a Discrete Random Variable X

• Variance:

σ² = var{X} = ∑_{all x} (x − µ)² p(x)

Example

The probability mass function of X is:

x     p(x)
0     0.001
1     0.027
2     0.243
3     0.729

The cumulative distribution function of X is:

x     F(x)
0     0.001
1     0.028
2     0.271
3     1.000

Example Contd.

[Figures: plots of the probability mass function p(x) and the cumulative distribution function F(x) versus x = 0, 1, 2, 3 for this example.]

Example Contd.

µ = E(X) = ∑_{x=0}^{3} x p(x) = (0)(0.001) + (1)(0.027) + (2)(0.243) + (3)(0.729) = 2.7

σ² = Var(X) = ∑_{x=0}^{3} (x − µ)² p(x) = 0.00729 + 0.07803 + 0.11907 + 0.06561 = 0.27

and σ = √0.27 = 0.5196
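A short Python check of these numbers; the pmf dictionary simply encodes the table from the example:

```python
# Sketch: verify the worked example's mean, variance and standard deviation.
pmf = {0: 0.001, 1: 0.027, 2: 0.243, 3: 0.729}

mean = sum(x * p for x, p in pmf.items())                # E(X)
var  = sum((x - mean) ** 2 * p for x, p in pmf.items())  # Var(X)
std  = var ** 0.5                                        # sigma

print(round(mean, 4), round(var, 4), round(std, 4))      # 2.7 0.27 0.5196
```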

Questions

• What is the average value of the following random variable: SX = {1, 6, 7, 9, 13}?

Ans: 7.2

• What is the expected value of the random variable in the following figure?

[Figure: pmf pX(x) over the values x = 1, 6, 7, 9, 13, with probability labels 0.2, 0.1, 0.4 and 0.3 visible.]
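For the first question, a one-line check (assuming the five values are equally likely, which is what the stated answer of 7.2 implies):

```python
# Sketch: average of SX assuming equally likely values (probability 1/5 each).
SX = [1, 6, 7, 9, 13]
print(sum(SX) / len(SX))  # 7.2
```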


Questions

• Which of the following has higher variance, and why?

• Calculate the variance for both cases and justify.

[Figures: two pmfs over the values x = 1, 6, 7, 9, 13: pX(x), with probability labels 0.033, 0.5, 0.4 and E{X} = 3.87, and qX(x), with probability labels 0.2, 0.1, 0.4 and E{X} = 5.2.]

Continuous Random Variable

• There are physical systems that generate continuous outcomes.

• Continuous random variables have an uncountable image, e.g.

Sx = (0, 1)

Sx = R

• E.g. the noise voltage generated by an electronic amplifier.

• In such cases the random variable is said to be a continuous random variable.

Example of cdf of continuous random variables

• The cdf of a continuous random variable is:

• This is a smooth, non-decreasing function of x.

Mixed Random Variables

• Mixed random variables have an image which contains both continuous and discrete parts, e.g.

Sx = {0} ∪ (0, 1)

Example of cdf of mixed random variables

• The cdf of a mixed random variable is:

• The cdf of such a random variable is a smooth, non-decreasing function on certain parts of the real line and contains jumps at a number of discrete values of x.