COMMUNICATION SYSTEMS
BY
Mr. V. Sudheer Raja, M.Tech, Assistant Professor, Department of Electrical Engineering
Adama Science and Technology University
E-Mail : [email protected]
CHAPTER I: Review of Probability, Random Variables and Random Processes
Contents:
• Introduction
• Definition of Probability, Axioms of Probability
• Sub-topics of Probability
• Random Variable and its Advantages
• Probability Mass Function & Probability Density Function
• Conditional, Joint, Bernoulli and Binomial Probabilities
• Random Process
Mr V Sudheer Raja ASTU
Introduction
Models of a Physical Situation:
• A model is an approximate representation.
  -- Mathematical models
  -- Simulation models
• Deterministic versus Stochastic/Random
  -- Deterministic models offer repeatability of measurements. Ex: Ohm's law; models of a capacitor, inductor, or resistor.
  -- Stochastic models don't. Ex: processor caching, queuing, and estimation of task execution time.
• The emphasis of this course would be on Stochastic Modeling.
Probability
In any communication system the signals encountered may be of two types: deterministic and random.
Deterministic signals are the class of signals that may be predicted at any instant of time, and can be modeled as completely specified functions of time.
For random signals, it is not possible to predict their precise values in advance. It is, however, possible to describe these signals in terms of their statistical properties, such as the average power in the random signal or the spectral distribution of that power.
The mathematical discipline that deals with the statistical characterization of random signals is probability theory.
Probability Theory
The phenomenon of statistical regularity is used to explain probability.
Probability theory is rooted in phenomena that, explicitly or implicitly, can be modeled by an experiment with an outcome that is subject to chance. If the experiment is repeated, the outcome may differ because of the underlying random phenomenon or chance mechanism. Such an experiment is referred to as a random experiment.
To use the idea of statistical regularity to explain the concept of probability, we proceed as follows:
1. We prescribe a basic experiment, which is random in nature and is repeated under identical conditions.
2. We specify all the possible outcomes of the experiment. As the experiment is random in nature, the outcome of any trial is unpredictable, i.e., any of the prescribed outcomes may result.
3. For a large number of trials of the experiment, the outcomes exhibit statistical regularity; that is, a definite average pattern of outcomes is observed if the experiment is repeated a large number of times.
Random Experiment
Random Experiment: the outcome of the experiment varies in a random manner.
A possible outcome of an experiment is termed an event.
--Ex: In the coin-tossing experiment, the occurrence of "head" or "tail" is treated as an event.
Sample Space:
--The set of possible results or outcomes of an experiment, i.e., the totality of all events without repetition, is called the sample space and is denoted 'S'.
--Ex: In the coin-tossing experiment, the possible outcomes are either "head" or "tail"; thus the sample space can be defined as
S = {head, tail}
Mr V Sudheer Raja ASTU
Sample Spaces for some example experiments
1. Select a ball from an urn containing balls numbered 1 to 50
2. Select a ball from an urn containing balls numbered 1 to 4. Balls 1 and 2 are black, and 3 and 4 are white. Note the number and the color of the ball
3. Toss a coin three times and note the number of heads
4. Pick a number at random between 0 and 1
5. Measure the time between two message arrivals at a messaging center
6. Pick 2 numbers at random between 0 and 1
7. Pick a number X at random between zero and one, then pick a number Y at random between 0 and X
Some events in the experiments on the previous slide
1. An even numbered ball is selected
2. The ball is white and even-numbered
3. Each toss is the same outcome
4. The number selected is non-negative
5. Less than 5 seconds elapse between message arrivals
6. Both numbers are less than 0.5
7. The two numbers differ by less than one-tenth
• Null event, elementary event, certain event
The Urn Experiment: Questions?
Consider an urn with three balls in it, labeled 1, 2, 3.
• What are the chances that a ball withdrawn at random from the urn is labeled '1'? How do we quantify this 'chance'?
• Is withdrawing any of the three balls equally likely (equiprobable), or is any ball more likely to be drawn than the others?
• If '1' means 'sure occurrence' and '0' means 'no chance of occurrence', then what number would you give to the chance of getting ball 1?
• How do you compare the chance of withdrawing an odd-numbered ball to that of withdrawing an even-numbered ball?
The count of occurrences of the kth outcome in n iterations (trials) of the random experiment is denoted
Nk(n)
The ratio Nk(n)/n is called the relative frequency of the kth outcome.
If the kth outcome occurs in none of the trials, then Nk(n) = 0, i.e., Nk(n)/n = 0; if the kth outcome occurs in all of the trials, then Nk(n) = n, i.e., Nk(n)/n = 1. Clearly the relative frequency is a nonnegative real number less than or equal to 1, i.e.,
0 ≤ Nk(n)/n ≤ 1
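This bound, and the statistical regularity discussed next, can be observed numerically. Below is a minimal Python sketch, assuming a fair six-sided die as the random experiment; the function name and seed are illustrative, not part of the slides:

```python
import random

def relative_frequency(k, n, sides=6):
    """Run n trials of a fair die toss and return N_k(n)/n,
    the relative frequency of outcome k."""
    random.seed(0)  # fixed seed so the trials are reproducible
    count = sum(1 for _ in range(n) if random.randint(1, sides) == k)
    return count / n

# The relative frequency always lies in [0, 1], and for a fair die
# it settles near 1/6 as n grows (statistical regularity).
for n in (10, 1000, 100000):
    f = relative_frequency(3, n)
    assert 0.0 <= f <= 1.0
    print(n, f)
```

Running the loop shows the ratio wandering for small n and stabilizing near 1/6 for large n.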
Inferences
Statistical regularity:
--Long-term averages of repeated iterations of a random experiment tend to yield the same value; that is, as n → ∞ the relative frequency Nk(n)/n settles to a constant, which we take as the probability of the kth outcome.
A few ideas of note: for the urn with balls 1, 2, 3, each ball is equally likely, so the chance of withdrawing ball 1 is 1/3. And regarding the chances of withdrawing an odd-numbered ball: two of the three balls (1 and 3) are odd-numbered, so that chance is 2/3, against 1/3 for an even-numbered ball.
Axioms of Probability
• A probability system consists of the following triple:
1. A sample space 'S' of elementary events (outcomes)
2. A class of events that are subsets of 'S'
3. A probability measure P(.) assigned to each event (say 'A') in the class, which has the following properties:
(i) P(S) = 1,
the probability of the sure event is 1;
(ii) 0 ≤ P(A) ≤ 1,
the probability of an event A is a nonnegative real number that is less than or equal to 1;
(iii) if A and B are two mutually exclusive events in the given class, then
P(A+B) = P(A) + P(B)
These three axioms define the probability measure and are also used to derive other properties of probability.
• Property 1: P(Ā) = 1 − P(A),
where Ā is the complement of event 'A'.
• Property 2: If M mutually exclusive events A1, A2, ..., AM have the exhaustive property
A1 + A2 + ... + AM = S, then
P(A1) + P(A2) + P(A3) + ... + P(AM) = 1
• Property 3: When events A and B are not mutually exclusive, the probability of the union event "A or B" equals
P(A+B) = P(A) + P(B) − P(AB),
where P(AB) is called the joint probability.
--Joint probability has the following relative-frequency interpretation:
P(AB) = lim n→∞ NAB(n)/n,
where NAB(n) is the number of trials in which both A and B occur.
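Properties 1 and 3 can be verified by brute-force enumeration over a small equiprobable sample space. A sketch assuming a single fair die; the particular events A and B are illustrative choices, not from the slides:

```python
from fractions import Fraction

S = set(range(1, 7))            # sample space of a fair die
def P(event):
    """Probability of an event (a subset of S) under equal likelihood."""
    return Fraction(len(event & S), len(S))

A = {2, 4, 6}                   # event "even"
B = {4, 5, 6}                   # event "greater than 3"

# Property 1: P(complement of A) = 1 - P(A)
assert P(S - A) == 1 - P(A)
# Property 3: A and B are not mutually exclusive, so
# P(A + B) = P(A) + P(B) - P(AB)
assert P(A | B) == P(A) + P(B) - P(A & B)
print(P(A | B))   # 2/3
```

Exact rational arithmetic (Fraction) avoids any floating-point rounding in the check.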
Conditional Probability
If an experiment involves two events A and B, let P(B/A) denote the probability of event B given that event A has occurred. The probability P(B/A) is called the conditional probability of B given A. Assuming that A has nonzero probability, the conditional probability P(B/A) is defined as
P(B/A) = P(AB)/P(A),
where P(AB) is the joint probability of A and B.
In relative-frequency terms, P(B/A) = lim n→∞ NAB(n)/NA(n), where NAB(n)/NA(n) represents the relative frequency of B given that A has occurred.
The joint probability of two events may be expressed as the product of the conditional probability of one event given the other, times the elementary probability of the other; that is,
P(AB) = P(B/A)·P(A)
      = P(A/B)·P(B)
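Both the definition P(B/A) = P(AB)/P(A) and the product rule can be checked exactly on a small sample space. A sketch assuming two fair dice; the events chosen are illustrative:

```python
from fractions import Fraction
from itertools import product

# Sample space: ordered outcomes of two fair dice.
S = list(product(range(1, 7), repeat=2))

def P(pred):
    """Probability of the event described by predicate pred."""
    return Fraction(sum(1 for s in S if pred(s)), len(S))

A = lambda s: s[0] % 2 == 0          # event A: first die is even
B = lambda s: s[0] + s[1] > 8        # event B: the sum exceeds 8

P_A  = P(A)
P_AB = P(lambda s: A(s) and B(s))
P_B_given_A = P_AB / P_A             # P(B/A) = P(AB)/P(A)

# Joint probability as conditional times elementary probability:
assert P_AB == P_B_given_A * P_A
print(P_B_given_A)   # 1/3
```

Counting by hand confirms it: of the 18 outcomes with an even first die, 6 have a sum above 8, giving 6/18 = 1/3.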
Random Variables
A function whose domain is a sample space and whose range is some set of real numbers is called a random variable of the experiment.
A random variable is denoted X(s), or simply X, where 's' is a sample point corresponding to an outcome in the sample space 'S'.
Random variables may be discrete or continuous.
The random variable X is a discrete random variable if X can take on only a finite number of values in any finite observation interval.
Ex: X(k) = k, where 'k' is the sample point of the event showing k dots when a die is thrown; there is only a limited number of possibilities, namely 1, 2, 3, 4, 5 or 6 dots.
If X can take on any value in a whole observation interval, X is called a continuous random variable.
Ex: a random variable representing the amplitude of a noise voltage at a particular instant of time, since it may take any value between plus and minus infinity.
For a probabilistic description of a random variable, consider the random variable X and the probability of the event X ≤ x, denoted P(X ≤ x); this probability is a function of the dummy variable x and can be expressed as
FX(x) = P(X ≤ x)
The function FX(x) is called the cumulative distribution function, or distribution function, of the random variable X, and it has the following properties:
1. The distribution function FX(x) is bounded between zero and one.
2. The distribution function FX(x) is a monotone non-decreasing function of x, i.e.,
FX(x1) ≤ FX(x2), if x1 < x2
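Both properties are easy to check for a concrete distribution function. A sketch using the fair-die random variable X(k) = k from the earlier example; exact rational arithmetic, and the helper name is illustrative:

```python
from fractions import Fraction

def F(x):
    """Distribution function F_X(x) = P(X <= x) for a fair die."""
    if x < 1:
        return Fraction(0)          # no outcome is below 1
    return Fraction(min(int(x), 6), 6)

# Property 1: F_X(x) is bounded between zero and one.
assert all(0 <= F(x) <= 1 for x in (-5, 0.5, 3, 6, 100))
# Property 2: F_X(x) is monotone non-decreasing.
xs = [-1, 0, 1, 2.5, 3, 5.9, 6, 10]
assert all(F(a) <= F(b) for a, b in zip(xs, xs[1:]))
print(F(3))   # 1/2
```

Note the staircase shape: for a discrete random variable the distribution function jumps by 1/6 at each integer from 1 to 6.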
Probability density function
--The derivative of the distribution function is called the probability density function, i.e.,
fX(x) = dFX(x)/dx -------Eq.(1)
--Differentiation in Eq.(1) is with respect to the dummy variable x, and the name "density function" arises from the fact that the probability of the event x1 < X ≤ x2 equals
P(x1 < X ≤ x2) = P(X ≤ x2) − P(X ≤ x1)
= FX(x2) − FX(x1)
= ∫ from x1 to x2 of fX(x) dx ---------Eq.(2)
Since FX(∞) = 1 corresponds to the probability of the certain event, and FX(−∞) = 0 corresponds to the probability of an impossible event, it follows immediately that
fX(x) ≥ 0 and ∫ from −∞ to ∞ of fX(x) dx = 1,
i.e., the probability density function must always be a nonnegative function with a total area of one.
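Eq.(2) and the unit-area property can be verified numerically for a concrete density. A sketch assuming the exponential density fX(x) = e^(−x) for x ≥ 0, whose distribution function is FX(x) = 1 − e^(−x); the trapezoidal step count and the truncation of the infinite upper limit are arbitrary choices:

```python
import math

f = lambda x: math.exp(-x)            # pdf  f_X(x), x >= 0
F = lambda x: 1 - math.exp(-x)        # cdf  F_X(x)

def integrate(g, a, b, n=100000):
    """Trapezoidal approximation of the integral of g over [a, b]."""
    h = (b - a) / n
    return h * (g(a)/2 + g(b)/2 + sum(g(a + i*h) for i in range(1, n)))

# Eq.(2): P(x1 < X <= x2) = F(x2) - F(x1) = integral of f from x1 to x2
x1, x2 = 0.5, 2.0
assert abs(integrate(f, x1, x2) - (F(x2) - F(x1))) < 1e-6
# Total area under the pdf is one (upper limit truncated at 50).
assert abs(integrate(f, 0, 50) - 1.0) < 1e-6
print(round(integrate(f, x1, x2), 4))
```

The same check works for any density whose distribution function is known in closed form.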
Statistical Averages
The mean value, or expected value, of a random variable X is defined as
mX = E[X] = ∫ from −∞ to ∞ of x fX(x) dx,
where E denotes the expectation operator. The mean value mX locates the center of gravity of the area under the probability density function curve of the random variable X.
The variance of the random variable X is a measure of the variable's randomness; it constrains the effective width of the probability density function fX(x) of the random variable X about the mean mX and is expressed as
σX² = E[(X − mX)²] = ∫ from −∞ to ∞ of (x − mX)² fX(x) dx
The variance of a random variable is normally denoted σX².
The square root of the variance, σX, is called the standard deviation of the random variable X.
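The mean, variance and standard deviation can be estimated from samples and compared with the density-based values. A sketch assuming X uniform on [0, 1], for which the integrals above give mX = 1/2 and σX² = 1/12; sample size and seed are arbitrary choices:

```python
import random, math

random.seed(1)
N = 200000
xs = [random.random() for _ in range(N)]    # samples uniform on [0, 1)

m_X = sum(xs) / N                           # sample mean
var_X = sum((x - m_X)**2 for x in xs) / N   # sample variance
sigma_X = math.sqrt(var_X)                  # standard deviation

# For the uniform density f_X(x) = 1 on [0, 1]:
#   m_X = 1/2  and  variance = 1/12
assert abs(m_X - 0.5) < 0.01
assert abs(var_X - 1/12) < 0.01
print(round(m_X, 3), round(sigma_X, 3))
```

The sample statistics converge to the density-based values as N grows, another instance of statistical regularity.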
Random Processes
• Description of Random Processes
• Stationarity and ergodicity
• Autocorrelation of Random Processes
• Cross-correlation of Random Processes
Random Processes
A RANDOM VARIABLE X is a rule for assigning to every outcome λ of an experiment a number X(λ).
Note: X denotes a random variable and X(λ) denotes a particular value.
A RANDOM PROCESS X(t) is a rule for assigning to every outcome λ a function X(t, λ).
Note: for notational simplicity we often omit the dependence on λ.
Ensemble of Sample Functions
The set of all possible functions X(t, λ) is called the ENSEMBLE.
Random Processes
A general random or stochastic process can be described as:
• a collection of time functions (signals) corresponding to various outcomes of random experiments;
• a collection of random variables observed at different times.
Examples of random processes in communications: channel noise, information generated by a source, interference.
Collection of Time Functions
Consider the time-varying function x(t, λi) representing a random process, where λi represents an outcome of a random event.
Example: a box has infinitely many resistors (i = 1, 2, ...) of the same resistance R. Let λi be the event that the ith resistor has been picked from the box, and let v(t, λi) represent the voltage of the thermal noise measured on this resistor.
Collection of Random Variables
For a particular time t = t0, the value x(t0, λi) is a random variable.
To describe a random process we can use the collection of random variables {x(t0, λ1), x(t0, λ2), x(t0, λ3), ...}.
Type: a random process can be either discrete-time or continuous-time.
Ex: the probability of obtaining a sample function of a RP that passes through a given set of windows is the probability of a joint event.
Description of Random Processes
Analytical description: X(t) = f(t, λ), where λ is an outcome of a random event.
Statistical description: for any integer N and any choice of (t1, t2, ..., tN), the joint pdf of {X(t1), X(t2), ..., X(tN)} is known. To describe the random process completely, the joint pdf
fx(x), x = [x1, x2, ..., xN], where xi = x(ti), i.e., fx = f{x(t1), x(t2), ..., x(tN)},
is required.
Activity: Ensembles
Consider the random process x(t) = At + B. Draw ensembles of the waveforms:
• B is constant, A is uniformly distributed between [−1, 1] (the slope is random);
• A is constant, B is uniformly distributed between [0, 2] (the intercept is random).
Does having an "ensemble" of waveforms give you a better picture of how the system performs?
[Figure: sketches of the two ensembles of x(t) = At + B — one with random slope A, one with random intercept B.]
Stationarity
Definition: A random process is STATIONARY to order N if, for any t1, t2, ..., tN and any time shift t0,
fx{x(t1), x(t2), ..., x(tN)} = fx{x(t1+t0), x(t2+t0), ..., x(tN+t0)}
This means that the process behaves similarly (follows the same PDF) regardless of when you measure it.
A random process is said to be STRICTLY STATIONARY if it is stationary to order N → ∞.
Is the random process from the coin tossing experiment stationary?
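Stationarity can be probed numerically: for a first-order stationary process, statistics measured at two different times should agree. A sketch using the random-phase sinusoid x(t) = A sin(ω0t + θ), θ uniform on [−π, π), which appears in the next example; the amplitude, frequency, times and sample count are arbitrary choices:

```python
import math, random

random.seed(2)
A, w0, N = 1.0, 2 * math.pi, 50000

def ensemble(t):
    """N samples of x(t) = A sin(w0*t + theta), theta uniform on [-pi, pi)."""
    return [A * math.sin(w0 * t + random.uniform(-math.pi, math.pi))
            for _ in range(N)]

# First-order statistics measured at two different times t.
for t in (0.3, 1.7):
    xs = ensemble(t)
    mean = sum(xs) / N
    var = sum(x * x for x in xs) / N
    # The mean stays near 0 and the variance near A^2/2, independent of t.
    assert abs(mean) < 0.02
    assert abs(var - A**2 / 2) < 0.02
```

The measured statistics do not drift with t, consistent with first-order stationarity.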
Illustration of Stationarity
Time functions pass through the corresponding windows at different times with the same probability.
Example of First-Order Stationarity
RANDOM PROCESS: x(t) = A sin(ω0t + θ0)
Assume that A and ω0 are constants; θ0 is a uniformly distributed RV over [−π, π); t is time.
The first-order PDF of x(t):
fx(x) = 1/(π√(A² − x²)), |x| ≤ A
      = 0, elsewhere
Note: there is NO dependence on time; the PDF is not a function of t.
The RP is STATIONARY (to first order).
Non-Stationary Example
RANDOM PROCESS: x(t) = A sin(ω0t + θ0)
Now assume that A, ω0 and θ0 are all constants; t is time.
The value of x(t) is then known for any time with probability 1, so the first-order PDF of x(t) is
fx(x) = δ(x − A sin(ω0t + θ0))
Note: the PDF depends on time, so the RP is NONSTATIONARY.
Ergodic Processes
Definition: A random process is ERGODIC if all time averages of any sample function are equal to the corresponding ensemble averages (expectations).
Example: for ergodic processes, we can use ensemble statistics to compute DC values and RMS values.
Ergodic processes are always stationary; stationary processes are not necessarily ergodic:
Ergodic ⊂ Stationary
Example: Ergodic Process
RANDOM PROCESS: x(t) = A sin(ω0t + θ0)
A and ω0 are constants; θ0 is a uniformly distributed RV over [−π, π); t is time.
Mean (ensemble statistics):
mx = E[x(t)] = (1/2π) ∫ from −π to π of A sin(ω0t + θ) dθ = 0
Variance:
σx² = E[x²(t)] = (1/2π) ∫ from −π to π of A² sin²(ω0t + θ) dθ = A²/2
Example: Ergodic Process
Mean (time average, with T large):
⟨x(t)⟩ = lim T→∞ (1/T) ∫ from 0 to T of A sin(ω0t + θ0) dt = 0
Variance (time average):
⟨x²(t)⟩ = lim T→∞ (1/T) ∫ from 0 to T of A² sin²(ω0t + θ0) dt = A²/2
The ensemble and time averages are the same, so the process is ERGODIC.
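The time averages can be reproduced numerically from a single sample function. A sketch with one fixed random phase; T spans an integer number of periods, an arbitrary but convenient choice:

```python
import math, random

A, w0 = 2.0, 2 * math.pi
random.seed(3)
theta0 = random.uniform(-math.pi, math.pi)   # one fixed sample function

# Time averages over [0, T] of the single realization x(t).
T, n = 100.0, 100_000                         # T = 100 periods of w0
dt = T / n
xs = [A * math.sin(w0 * (i * dt) + theta0) for i in range(n)]

time_mean = sum(xs) * dt / T
time_var = sum(x * x for x in xs) * dt / T

# Ensemble statistics from the previous slide: mean 0, variance A^2/2.
assert abs(time_mean) < 1e-6
assert abs(time_var - A**2 / 2) < 1e-6
```

The time averages of this one waveform match the ensemble statistics, which is exactly the ergodic property.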
Autocorrelation of Random Process
The autocorrelation function of a real random process x(t) at two times t1, t2 is:
Rx(t1, t2) = E[x(t1)x(t2)] = ∫∫ x1x2 fx(x1, x2; t1, t2) dx1 dx2
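For the random-phase sinusoid of the previous slides, this expectation evaluates in closed form to Rx(t1, t2) = (A²/2)cos(ω0(t2 − t1)). A Monte-Carlo sketch of the ensemble average; parameters and sample count are arbitrary:

```python
import math, random

random.seed(4)
A, w0, N = 1.0, 2 * math.pi, 100000
t1, tau = 0.25, 0.1
t2 = t1 + tau

# Ensemble average of x(t1)*x(t2) over many realizations of theta.
acc = 0.0
for _ in range(N):
    theta = random.uniform(-math.pi, math.pi)
    acc += (A * math.sin(w0 * t1 + theta)) * (A * math.sin(w0 * t2 + theta))
R_est = acc / N

# Closed form: R_x(t1, t2) = (A^2 / 2) * cos(w0 * (t2 - t1))
R_true = (A**2 / 2) * math.cos(w0 * tau)
assert abs(R_est - R_true) < 0.02
print(round(R_est, 3), round(R_true, 3))
```

Note that the estimate depends only on the gap τ = t2 − t1, previewing the wide-sense stationarity discussed next.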
Wide-Sense Stationary
A random process that is stationary to order 2 or greater is wide-sense stationary. A random process is wide-sense stationary if
E[x(t)] = constant and Rx(t1, t2) = Rx(τ), where τ = t2 − t1.
Usually t1 = t and t2 = t + τ, so that t2 − t1 = τ.
A wide-sense stationary process does not DRIFT with time.
The autocorrelation depends only on the time gap τ, not on where the time difference is taken.
The autocorrelation gives an idea about the frequency response of the RP.
Cross-Correlation of RPs
The cross-correlation of two RPs x(t) and y(t) is defined similarly:
Rxy(t1, t2) = E[x(t1)y(t2)] = ∫∫ x1y2 fxy(x1, y2; t1, t2) dx1 dy2
If x(t) and y(t) are jointly stationary processes,
Rxy(t1, t2) = Rxy(t2 − t1) = Rxy(τ)
If the RPs are jointly ERGODIC,
Rxy(τ) = E[x(t)y(t + τ)] = ⟨x(t)y(t + τ)⟩
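The time-average form can be checked for a simple jointly ergodic pair: take y(t) to be a delayed copy of x(t), for which Rxy(τ) = (A²/2)cos(ω0(τ − d)). A sketch; the delay, phase and sampling grid are arbitrary choices:

```python
import math

A, w0, theta0, d = 1.0, 2 * math.pi, 0.4, 0.2
x = lambda t: A * math.sin(w0 * t + theta0)
y = lambda t: x(t - d)                      # y is x delayed by d

def R_xy(tau, T=100.0, n=100_000):
    """Time-average cross-correlation <x(t) y(t + tau)> over [0, T]."""
    dt = T / n
    return sum(x(i * dt) * y(i * dt + tau) for i in range(n)) * dt / T

# For this pair, R_xy(tau) = (A^2 / 2) * cos(w0 * (tau - d)).
for tau in (0.0, 0.1, 0.35):
    expected = (A**2 / 2) * math.cos(w0 * (tau - d))
    assert abs(R_xy(tau) - expected) < 1e-6
```

The cross-correlation peaks at τ = d, which is why cross-correlation is a standard tool for estimating delay between two signals.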