
ESDU Data Item No. 83041
Issued December 1983

SUMMARY OF THE CONCEPTS RELATING TO RANDOM PROCESSES

1. NOTATION

$A(if)$   Fourier transform of $x(t)$

$a_n$, $b_n$, $c_n$   Fourier series coefficients

$C_{xx}(\tau)$   autovariance function of a non-stationary or stationary random process $\{x(t)_k\}$ at time difference $\tau$

$C_{xy}(\tau)$   covariance function of two non-stationary or stationary random processes $\{x(t)_k\}$ and $\{y(t)_k\}$ at time difference $\tau$

$f$   frequency

$K_{xx}(\tau)$   autocorrelation function of a non-stationary or stationary random process $\{x(t)_k\}$ at time difference $\tau$

$K_{xy}(\tau)$   cross-correlation function of two non-stationary or stationary random processes $\{x(t)_k\}$ and $\{y(t)_k\}$ at time difference $\tau$

$k$   a variable denoting a member of an ensemble

$N$   the number of members of an ensemble

$n$   the number of coefficients of a Fourier series

$P_N$   N'th order probability distribution function

$R_{xx}(\tau)$   autocorrelation or autovariance function of an ergodic random process $\{x(t)_k\}$ at time difference $\tau$

$R_{xy}(\tau)$   cross-correlation or covariance function of two ergodic random processes $\{x(t)_k\}$ and $\{y(t)_k\}$ at time difference $\tau$

$S(f)$   spectral density function

$T$, $-T$   time limits of a member function

$t$   time

$t_1$, $t_2$, ...   time values of a member

$W_N$   N'th order probability density function

$W_N(\,|\,)$   N'th order conditional probability density function

$x(t)_k$, $y(t)_k$, $z(t)_k$   member variables of an ensemble

$\tau$   a time difference, e.g. equal to $(t_1 - t_2)$

$\phi$   phase angle

$\omega$   angular frequency of a function

Special symbols

$\langle F[x(t_1)_k] \rangle$   an ensemble average of the function $F[x(t)_k]$ at $t = t_1$

$\overline{F[x(t)_k]}$   a time average of the function $F[x(t)_k]$

$\{x(t)_k\}$   an ensemble of time functions of a random process consisting of the member functions $x(t)_k$, $k = 1, 2, \ldots, N$, for $N \to \infty$

$\langle\ \rangle$   denotes the ensemble average of a quantity

Pr   stands for “the probability that ....”


2. INTRODUCTION

This Item presents a summary of random processes and is intended for use by those who require a reference to the basic concepts and definitions. It is suited to those who have not previously encountered the subject or only have a nodding acquaintance with it, but will also act as a reference guide to those familiar with the subject. No previous knowledge of statistics is assumed.

There are numerous examples of random processes, such as the fluctuation of noise currents in an electrical circuit, fluid motion turbulence or the vibration response of structures due to wind loading. Each of these cases is characterised by an inability to describe precisely the future behaviour of a quantity from past knowledge of that quantity. The need exists, however, to describe the general character of such “randomness” so that a degree of prediction can be achieved.

Random processes are therefore quantified using statistical properties which describe, in the amplitude, time and frequency domains, quantities representative of the “level distribution” and the “shape characteristics” of the process.

This Item adopts a strictly mathematical approach to the presentation of the concepts relating to random processes. This is necessary so that it is clear what limitations and assumptions apply to each of the given relationships.

3. DETERMINISTIC PROCESSES

A process is called deterministic when a known past record of a quantity, which belongs to the process, allows all future values of the quantity to be determined.

As an example, consider a process which is characterised by the time-dependent periodic function

$$x(t) = A \sin(\omega t + \phi) , \qquad (3.1)$$

where $A$ and $\phi$ are constants, for $t < T$. Then Equation (3.1) is taken to hold for $t > T$, so that the known past record determines all future values.



4. RANDOM PROCESSES

There are many processes of interest which are not deterministic, that is, processes in which future values cannot be completely determined from past knowledge of the process. Such processes are random.

4.1 Definition

It is necessary to describe random processes in terms of a mathematical formulation. Suppose that $N$ identical experiments (experiments in which identical apparatus is used in identical situations) are conducted simultaneously and that a quantity, $x(t)_k$, is recorded, where $k$ refers to the specific experiment. The “ensemble” of records $\{x(t)_k\}$ for $-\infty < t < \infty$ and $k = 1, \ldots, N$ where $N \to \infty$, defines the random process. It is usual to use curly brackets around the record, $\{x(t)_k\}$, to denote the ensemble of all possible time functions and keep the unbracketed quantity, $x(t)_k$, to denote a single representative record, or “member” record, of the ensemble.

The above definition of a random process has time as the independent variable and, furthermore, each record, $x(t)_k$, is taken to be a continuous function of time. The independent variable need not be time; e.g. $\{x(d)_k\}$ might represent the vibration response of vehicles to road roughnesses, where the independent variable, $d$, is distance.

It is possible to have discrete random processes, for example the throwing of dice, where the independent variable would be the number of times the dice are thrown.

4.2 Ensemble Averages

Consider the ensemble illustrated in Sketch 4.1, where member records $x(t)_1$, $x(t)_2$ and $x(t)_N$ have been plotted as a function of $t$. Each member extends mathematically from $t = -\infty$ to $t = \infty$ and the complete ensemble or random process is given when $N \to \infty$.

The properties of the random process are expressed in terms of ensemble averages.

An ensemble average is found by summing in a ‘vertical’ direction over Sketch 4.1 for all $k$ at a chosen time, say $t = t_1$, and dividing this sum by the total number of members. This ensemble average is dependent upon $t_1$ and independent of $k$.

The ensemble average of $x(t)_k$ at $t = t_1$ is mathematically defined as

$$\langle x_k(t_1) \rangle = \lim_{N \to \infty} \frac{1}{N} \sum_{k=1}^{N} x_k(t_1) , \qquad (4.1)$$

where the angular brackets $\langle\ \rangle$ around the quantity of interest indicate an ensemble average.

Equation (4.1) is known as the ensemble mean average value of the process at time $t_1$.

The ensemble mean square value of the process at time $t_1$ is similarly defined as

$$\langle (x_k(t_1))^2 \rangle = \lim_{N \to \infty} \frac{1}{N} \sum_{k=1}^{N} (x_k(t_1))^2 . \qquad (4.2)$$
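The limits in Equations (4.1) and (4.2) translate directly into finite-ensemble estimates. The short sketch below (Python with NumPy; the code, the synthetic ensemble and all parameter values are illustrative additions to this transcription, not part of the original Item) forms both averages for a non-stationary ensemble at one chosen time:

import numpy as np

rng = np.random.default_rng(0)
N = 10_000                          # ensemble members: a finite stand-in for N -> infinity
t = np.linspace(0.0, 1.0, 201)      # common time base for every member record

# Synthetic non-stationary ensemble: each member x_k(t) = a_k * sin(2 pi t) with a
# random amplitude a_k, so the statistics depend on the time at which the
# 'vertical' sum over the ensemble is taken.
a = rng.normal(loc=2.0, scale=0.5, size=(N, 1))
x = a * np.sin(2.0 * np.pi * t)     # x[k, i] is member k sampled at time t[i]

i1 = 50                             # index of the chosen time t1 (= 0.25 here)
mean_t1 = x[:, i1].sum() / N        # Equation (4.1): (1/N) * sum over k of x_k(t1)
msq_t1 = (x[:, i1] ** 2).sum() / N  # Equation (4.2): (1/N) * sum over k of x_k(t1)^2

print(f"t1 = {t[i1]:.2f}")
print(f"ensemble mean        <x_k(t1)>   = {mean_t1:.4f}  (population value 2.0)")
print(f"ensemble mean square <x_k(t1)^2> = {msq_t1:.4f}  (population value 4.25)")

Because each member has its own random amplitude, the printed values depend on the chosen time $t_1$, which is the hallmark of a non-stationary process.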


Sketch 4.1 Ensemble member records

If a quantity of interest is a particular function of each record, $x_k(t)$, at $t_1$, say $F[x_k(t)]$, then the ensemble average is given by†

$$\langle F[x_k(t_1)] \rangle = \lim_{N \to \infty} \frac{1}{N} \sum_{k=1}^{N} F[x_k(t_1)] . \qquad (4.3)$$

More general averages, other than those which are simply functions of one value of time, can be formulated; for example the ensemble average of $x_k(t_1) \cdot x_k(t_2)$ yields the autocorrelation function of the random process $\{x_k(t)\}$ at times $t_1$ and $t_2$, which is given by

$$\langle x_k(t_1) \cdot x_k(t_2) \rangle = \lim_{N \to \infty} \frac{1}{N} \sum_{k=1}^{N} x_k(t_1) \cdot x_k(t_2) . \qquad (4.4)$$

As $t_2 \to t_1$ the autocorrelation function reduces to the ensemble mean square value of the process at $t_1$.

† An alternative notation for the ensemble average, which some authors adopt, is to use a curly bar $(\tilde{\ })$ over the function, $\tilde{F}[x(t_1)_k]$, in which the suffix $k$ does not appear, indicating that the sum has been taken over all $k$.


Extending Equation (4.4) to ensemble averages involving $t_1, t_2, \ldots, t_M$, the following expression is obtained:

$$\langle x_k(t_1) \cdot x_k(t_2) \cdots x_k(t_M) \rangle = \lim_{N \to \infty} \frac{1}{N} \sum_{k=1}^{N} \left( x_k(t_1) \cdot x_k(t_2) \cdots x_k(t_M) \right) . \qquad (4.5)$$

A random process is completely defined from a knowledge of all the ensemble averages: for $M = 1$ for all $t_1$; for $M = 2$ for all $t_1$ and $t_2$; up to $M \to \infty$ for all $t_1, t_2, t_3, \ldots, t_\infty$. Successive knowledge of all these ensemble averages given by Equation (4.5) describes the random process in more and more detail.

4.3 Probability

Probability is a measure of the ratio of the probable occurrence of a particular event to the occurrence of all possible events. An event which cannot possibly occur has zero probability of occurrence and an event which is certain to occur has a probability of unity. Any other event will therefore have a probability of occurrence of between zero and unity.

The probability that $x(t)_k \le X_1$ at $t = t_1$ for any member $k$ is usually written as $\Pr[x(t_1)_k \le X_1]$ and is referred to as the first-order probability distribution function, $P_1(X_1, t_1)$, and so

$$P_1(X_1, t_1) = \Pr[x(t_1)_k \le X_1] , \quad \text{at time } t = t_1 \text{ for any } k. \qquad (4.6)$$

By definition

$$P_1(\infty, t_1) = 1 \quad \text{and} \quad P_1(-\infty, t_1) = 0 .$$

There is a relationship between $P_1(X_1, t_1)$ and the ensemble $\{x(t)_k\}$. If a function $F[x(t_1)_k]$ is defined by

$$F[x_k(t_1)] = 1 , \quad \text{when } x(t_1)_k < X_1 ,$$
$$\qquad\;\;\, = 0 , \quad \text{when } x(t_1)_k > X_1 , \qquad (4.7)$$

then

$$P_1[X_1, t_1] = \langle F[x(t_1)_k] \rangle = \lim_{N \to \infty} \frac{1}{N} \sum_{k=1}^{N} F[x_k(t_1)] . \qquad (4.8)$$

The probability distribution function associated with the ensemble $\{x(t_1)_k\}$ at time $t_1$, for all values of $X_1$ between $-\infty$ and $\infty$, is illustrated in Sketch 4.2(a).

The probability that $x(t)_k$ at $t = t_1$ lies between $X_1$ and $X_1 + \Delta X_1$ is defined as

$$W_1(X_1, t_1) \, \Delta X_1 = \Pr[X_1 < x(t_1)_k \le X_1 + \Delta X_1] , \qquad (4.9)$$

where $W_1(X_1, t_1)$ is the first-order probability density function and is illustrated in Sketch 4.2(b).
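The indicator-function construction of Equations (4.7) and (4.8), together with the density definition (4.9), can be checked numerically. The sketch below (Python/NumPy, an illustration added to this transcription rather than part of the Item; the Gaussian ensemble is an arbitrary choice) estimates $P_1$ as an ensemble average and $W_1$ as a finite difference of $P_1$:

import numpy as np

rng = np.random.default_rng(1)
N = 100_000                        # ensemble size, standing in for N -> infinity
x_t1 = rng.normal(0.0, 1.0, N)     # the ensemble values x_k(t1) at the chosen time t1

def P1(X1):
    # Equations (4.7) and (4.8): F = 1 where x_k(t1) < X1, 0 otherwise,
    # and P1(X1, t1) is the ensemble average of F.
    F = (x_t1 < X1).astype(float)
    return F.sum() / N

X1, dX1 = 0.5, 0.05
# Equations (4.9) and (4.11): W1 is the derivative of P1, here a finite difference.
W1 = (P1(X1 + dX1) - P1(X1)) / dX1

print(f"P1({X1}, t1) = {P1(X1):.4f}   (exact value for a unit Gaussian: 0.6915)")
print(f"W1({X1}, t1) = {W1:.4f}   (exact value: 0.3521)")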


Sketch 4.2 (a) Probability distribution (b) Probability density

Thus, by definition,

$$P_1(X_1, t_1) = \int_{-\infty}^{X_1} W_1(X_1, t_1) \, dX_1 , \qquad (4.10)$$

and

$$W_1(X_1, t_1) = \frac{dP_1(X_1, t_1)}{dX_1} . \qquad (4.11)$$

The probability that $x(t)_k \le X_1$ at $t = t_1$ and simultaneously that $x(t)_k \le X_2$ at $t = t_2$ is given by the second-order probability distribution function, $P_2(X_1, t_1; X_2, t_2)$.

The probability that $x_k(t)$ lies between $X_1$ and $X_1 + \Delta X_1$ at time $t_1$ and simultaneously that $x_k(t)$ lies between $X_2$ and $X_2 + \Delta X_2$ at time $t_2$ is given by $W_2(X_1, t_1; X_2, t_2) \, \Delta X_1 \Delta X_2$, where $W_2(X_1, t_1; X_2, t_2)$ is the second-order probability density function. The second-order probability distribution and density functions are related by

$$P_2(X_1, t_1; X_2, t_2) = \int_{-\infty}^{X_1} \int_{-\infty}^{X_2} W_2(X_1, t_1; X_2, t_2) \, dX_2 \, dX_1 . \qquad (4.12)$$


In terms of the ensemble, if

$$F[x_k(t)] = 1 , \quad \text{when } x_k(t) < X_1 \text{ at time } t_1 \text{ and when } x_k(t) < X_2 \text{ at time } t_2 ,$$
$$\qquad\;\;\, = 0 , \quad \text{otherwise} , \qquad (4.13)$$

then

$$P_2(X_1, t_1; X_2, t_2) = \langle F[x_k(t)] \rangle . \qquad (4.14)$$

Third-order and higher-order probability functions can be defined. Knowledge of all the $W_N$, or $P_N$, functions as $N \to \infty$ for $-\infty < X_N < \infty$ completely determines all the statistical properties of the random process, and is an alternative to the descriptions of Equation (4.5).

Knowledge of $W_N$ (or $P_N$) implies knowledge of all the lower order functions ($W_{N-1}$, $W_{N-2}$, etc.) since

$$W_{N-1}(X_1, t_1; \ldots; X_{N-1}, t_{N-1}) = \int_{-\infty}^{+\infty} W_N(X_1, t_1; \ldots; X_N, t_N) \, dX_N . \qquad (4.15)$$

The basis of Equation (4.15) is that $W_{N-1}$ is independent of the values of $\{x(t)_k\}$ at time $t_N$, while integration of $W_N$ with respect to $X_N$ covers all possible values of $\{x(t)_k\}$ at time $t_N$.

Now,

$$\langle x_k(t_1) \rangle = \frac{1}{N} \sum_{k=1}^{N} x_k(t_1) ,$$

$$\langle x_k(t_1) \rangle = \sum_{X_i = -\infty}^{X_i = +\infty} X_i \, \frac{\text{number of times } X_i < x_k(t_1) < X_i + \Delta X_i}{N} , \qquad (4.16)$$

$$\langle x_k(t_1) \rangle = \int_{-\infty}^{+\infty} X_1 \, W_1(X_1, t_1) \, dX_1 . \qquad (4.17)$$

Similarly, for any real single-valued continuous function $F[x(t_1)_k]$,

$$\langle F[x(t_1)_k] \rangle = \int_{-\infty}^{+\infty} F(X_1) \, W_1(X_1, t_1) \, dX_1 . \qquad (4.18)$$

Higher order averages (or higher moments) can be similarly defined; thus

$$\langle x_k(t_1) \cdot x_k(t_2) \cdots x_k(t_N) \rangle = \int_{-\infty}^{+\infty} \cdots \int_{-\infty}^{+\infty} (X_1 X_2 \cdots X_N) \, W_N(X_1, t_1; X_2, t_2; \ldots; X_N, t_N) \, dX_1 \ldots dX_N . \qquad (4.19)$$

This equation shows how the ensemble averages and the probability density functions are related.

4.4 Conditional Probability

To formulate the concept of the degree of randomness of a random process it is necessary to define the conditional probability distribution.


Consider the random process $\{x(t)_k\}$ where $x(t_1)_k$ is known to lie in the range $(X_1, X_1 + \Delta X_1)$ at time $t_1$; then the probability that $X_2 < x(t_2)_k \le X_2 + \Delta X_2$ at time $t_2$ is denoted by $W_2(X_1, t_1 | X_2, t_2) \, dX_2$, where $W_2(\,|\,)$ is the second-order conditional probability density function. The vertical bar is used to denote conditional probability; it separates the past known conditions, placed on the left of the vertical bar, from the probability of an event in the future.† Now $W_1(X_1, t_1) \, dX_1$ is the probability that $X_1 < x(t_1)_k \le X_1 + \Delta X_1$ at time $t_1$, and $W_2(X_1, t_1 | X_2, t_2) \, dX_2$ is the probability that $X_2 < x(t_2)_k \le X_2 + \Delta X_2$ at time $t_2$ when $X_1 < x(t_1)_k \le X_1 + \Delta X_1$ at time $t_1$. So $(W_1(X_1, t_1) \, dX_1) \cdot (W_2(X_1, t_1 | X_2, t_2) \, dX_2)$ equals the probability that $x(t_1)_k$ lies between $X_1$ and $X_1 + \Delta X_1$ at time $t_1$, and simultaneously that $x(t_2)_k$ lies between $X_2$ and $X_2 + \Delta X_2$ at time $t_2$, which is equal to $W_2(X_1, t_1; X_2, t_2) \, dX_1 \, dX_2$,

i.e.

$$W_2(X_1, t_1 | X_2, t_2) = \frac{W_2(X_1, t_1; X_2, t_2)}{W_1(X_1, t_1)} . \qquad (4.20)$$

† An alternative notation which some authors adopt is to place the past known conditions to the right of the vertical bar.

Higher-order conditional probabilities can be defined. Suppose that $x(t_i)_k$ lies in the range $X_i$ to $X_i + \Delta X_i$ at time $t_i$ for $i = 1, 2, \ldots, N - 1$; then the probability that $X_N < x(t_N)_k \le X_N + \Delta X_N$ at time $t_N$ is given by $W_N(X_1, t_1; X_2, t_2; \ldots | X_N, t_N) \, dX_N$. $W_N(\,|\,)$ is the N'th-order conditional probability density function and is related to the probability density function by

$$W_N(X_1, t_1; \ldots; X_{N-1}, t_{N-1} | X_N, t_N) = \frac{W_N(X_1, t_1; X_2, t_2; \ldots; X_N, t_N)}{W_{N-1}(X_1, t_1; X_2, t_2; \ldots; X_{N-1}, t_{N-1})} . \qquad (4.21)$$

Conditional probability can be illustrated by considering the probability of drawing two cards of a given suit successively from the usual pack of 52 cards, without replacing the first card. The success of the second draw is conditional upon the first since, if it is successful, it will reduce the number of cards belonging to that suit when the second draw is made.

4.4.1 First-order random process

The second-order conditional probability density function for a first-order (or purely random) process is given by

$$W_2(X_1, t_1 | X_2, t_2) = W_1(X_2, t_2) . \qquad (4.22)$$

Thus the probability that $X_2 < x(t_2)_k \le X_2 + \Delta X_2$ is entirely independent of whether $x(t_1)_k$ lies between $X_1$ and $X_1 + \Delta X_1$.

Similarly, the N'th-order probability is dependent only on the probability that $X_N < x(t_N)_k \le X_N + \Delta X_N$ and so

$$W_N(X_1, t_1; X_2, t_2; \ldots | X_N, t_N) = W_1(X_N, t_N) . \qquad (4.23)$$

For this type of process, events are completely uncorrelated and so the probability that two events occur together is equal to the probability of each event occurring separately. The second-order probability density function can be written as

$$W_2(X_1, t_1; X_2, t_2) = W_1(X_1, t_1) \cdot W_1(X_2, t_2) , \qquad (4.24)$$


and similarly the N'th-order probability is

$$W_N(X_1, t_1; X_2, t_2; \ldots; X_N, t_N) = \prod_{i=1}^{N} W_1(X_i, t_i) , \quad \text{for all } N. \qquad (4.25)$$

This type of process can be illustrated by the results of throwing a die. The probability of a particular result from the throw of the die will not depend on the results of previous throws.

4.4.2 Second-order random process

The second-order random process, or Markov process, is one which is characterised by a relationship to the past that depends on only one preceding observation.

The N'th-order conditional probability distribution function is given by

$$W_N(X_1, t_1; X_2, t_2; \ldots | X_N, t_N) = W_2(X_{N-1}, t_{N-1} | X_N, t_N) , \qquad (4.26)$$

or

$$W_N(X_1, t_1; X_2, t_2; \ldots; X_N, t_N) = W_1(X_1, t_1) \prod_{i=2}^{N} W_2(X_{i-1}, t_{i-1} | X_i, t_i) . \qquad (4.27)$$
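Equations (4.26) and (4.27) state that, for a Markov process, the statistics of the next value are fixed by the current value alone. A familiar discrete illustration (a hypothetical example added here, not taken from the Item) is the first-order autoregressive scheme x[n+1] = rho * x[n] + noise, whose lag-m correlation decays as rho**m:

import numpy as np

rng = np.random.default_rng(2)
rho = 0.9                          # one-step 'memory' of the hypothetical Markov scheme
n_steps = 100_000
eps = rng.normal(0.0, 1.0, n_steps)

# Each new value depends only on the immediately preceding one, which is the
# defining property expressed by Equation (4.26).
x = np.empty(n_steps)
x[0] = 0.0
for n in range(n_steps - 1):
    x[n + 1] = rho * x[n] + eps[n]

# Conditioning on the current value screens off all earlier ones, as in the
# factorisation of Equation (4.27); the correlation therefore decays as rho**m.
x0 = x - x.mean()
var = (x0 * x0).mean()
for m in (1, 2, 3):
    r = (x0[:-m] * x0[m:]).mean() / var
    print(f"lag {m}: sample correlation = {r:.3f}, rho**{m} = {rho ** m:.3f}")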

5. CLASSIFICATION OF RANDOM PROCESSES

Random processes are classified as non-stationary or stationary, the latter of which can be further separated into two classes, non-ergodic and ergodic. The three principal classes are:

(i) Non-stationary
(ii) Stationary (non-ergodic)
(iii) Ergodic (implying stationarity)

5.1 Non-Stationary Processes

These processes are of the most general type and all of the definitions of Section 4 refer to non-stationary random processes. This class of process is completely defined from knowledge of all the general ensemble averages given by Equation (4.5) as $M \to \infty$, or alternatively, knowledge of all the probability density functions for the complete ensemble as $N \to \infty$.

5.2 Stationary Processes (Non-Ergodic)

A stationary process by definition is one in which all the ensemble statistical properties associated with the process are invariant with respect to time translations. Ensemble averages are independent of the particular time or times at which the ensemble quantity is taken. For example, the ensemble mean average, $\langle x_k(t) \rangle$, and the ensemble mean square value, $\langle (x(t)_k)^2 \rangle$, are the same for all values of time, $t$, and the autocorrelation function, $\langle x(t_1)_k \cdot x(t_2)_k \rangle$, is only a function of the time difference, $\tau$, where $\tau = (t_1 - t_2)$.

In general, for a function, F, of each record, the stationary ensemble average is

$$\langle F[x_k(t_1), x_k(t_2), \ldots, x_k(t_N)] \rangle = \langle F[x_k(t_1 + t), x_k(t_2 + t), \ldots, x_k(t_N + t)] \rangle \qquad (5.1)$$

and is valid for all $F$, $t$ and $N$.



A random process is termed “strongly stationary” if all the ensemble averages given by Equation (4.5), for all M and t, are invariant with respect to time translations.

When only the ensemble mean value, $\langle x(t)_k \rangle$, for all values of $t$, and the autocorrelation function, $\langle x(t_1)_k \cdot x(t_2)_k \rangle$, for all values of $(t_1 - t_2)$, are the same, the process is termed “weakly stationary”; this is used by practising engineers as sufficient justification for invoking the condition for full stationarity.

5.3 Ergodic Processes

An ergodic random process is one in which each member of the ensemble incorporates the random process such that an ensemble average is equal to the time average from each and every member record. Consequently, for each record,

$$\lim_{T \to \infty} \frac{1}{2T} \int_{-T}^{T} x_k(t) \, dt = \langle x_k(t) \rangle , \qquad (5.2)$$

which is the same for all $k$.

For a general function, F, of each member record, the following time average may be written:

$$\overline{F[x_k(t)]} = \lim_{T \to \infty} \frac{1}{2T} \int_{-T}^{T} F[x_k(t)] \, dt \equiv \langle F[x_k(t)] \rangle . \qquad (5.3)$$

The bar over the function indicates a time average.

Mathematically it should be remembered that stationarity is a necessary but not a sufficient condition for ergodicity, since for stationary processes that are not ergodic, time averages for different members can be different.
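Equation (5.2) and the remark above can be checked numerically. In the sketch below (Python/NumPy, illustrative only; all parameters are arbitrary) the ergodic ensemble consists of independent realisations of one stationary noise process, while the non-ergodic ensemble gives each member its own random offset:

import numpy as np

rng = np.random.default_rng(3)
N, n_t = 200, 20_000               # ensemble members and time samples per record

# Ergodic case: every member is an independent realisation of the same
# stationary process, so time and ensemble averages agree (Equation (5.2)).
x = 1.5 + rng.normal(0.0, 1.0, size=(N, n_t))
print(f"ergodic:     time average of record 0 = {x[0].mean():.3f}, "
      f"ensemble average = {x[:, 0].mean():.3f}")

# Stationary but non-ergodic case: each member carries its own random offset,
# so the time average differs from record to record.
y = rng.normal(1.5, 1.0, size=(N, 1)) + rng.normal(0.0, 1.0, size=(N, n_t))
print(f"non-ergodic: time average of record 0 = {y[0].mean():.3f}, "
      f"of record 1 = {y[1].mean():.3f}, ensemble average = {y[:, 0].mean():.3f}")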

6. CORRELATION

Correlation has been briefly introduced in Section 4.2, where the ensemble autocorrelation function of a single random process was given as a particular example of the general ensemble averages of Equation (4.5).

Correlation functions provide a measure of the similarity between two quantities and will be larger when the two quantities are similar or identical than when they are dissimilar. The two quantities that form a correlation function may be part of the same random process, in which case autocorrelation functions are involved; or each may form part of two different random processes, in which case cross-correlation functions are used.

6.1 Autocorrelation Functions

The autocorrelation function at times $t_1$ and $t_2$ of a non-stationary random process $\{x(t)_k\}$, as previously defined, is given by

$$K_{xx}(t_1, t_2) = \langle x_k(t_1) \cdot x_k(t_2) \rangle = \lim_{N \to \infty} \frac{1}{N} \sum_{k=1}^{N} x_k(t_1) \cdot x_k(t_2) . \qquad (6.1)$$



If the ensemble mean value of $x(t)_k$ across the ensemble at times $t_1$ and $t_2$ is subtracted out, then Equation (6.1) becomes the autovariance function, which is given by

$$C_{xx}(t_1, t_2) = \langle (x_k(t_1) - \langle x_k(t_1) \rangle)(x_k(t_2) - \langle x_k(t_2) \rangle) \rangle . \qquad (6.2)$$

Combining Equations (6.1) and (6.2) it follows that

$$C_{xx}(t_1, t_2) = K_{xx}(t_1, t_2) - \langle x_k(t_1) \rangle \cdot \langle x_k(t_2) \rangle . \qquad (6.3)$$

The value of the autocorrelation function is dependent upon the particular times $t_1$ and $t_2$ chosen, and may be either positive or negative.

6.1.1 Stationary autocorrelation functions

Stationary random processes are invariant with respect to time translations and so the following autocorrelation equality may be written:

$$K_{xx}(t_1, t_2) = K_{xx}(t_1 + t, t_2 + t) . \qquad (6.4)$$

By choosing $t = -t_1$ and letting $(t_2 - t_1) = \tau$,

$$K_{xx}(t_1, t_2) = K_{xx}(0, \tau) \equiv K_{xx}(\tau) , \qquad (6.5)$$

where $K_{xx}$, for stationary processes, need only involve the time difference, $\tau$.

A further translation of time allows the following expression to be found:

$$K_{xx}(\tau) = K_{xx}(-\tau) , \qquad (6.6)$$

which shows that $K_{xx}(\tau)$ is an even function, symmetric about $\tau = 0$.

As $\tau \to 0$,

$$K_{xx}(0) = \langle x_k(0) \cdot x_k(0) \rangle \equiv \langle (x_k(t))^2 \rangle , \qquad (6.7)$$

which is valid for any chosen value of $t$ and equals the ensemble mean square value for the random process, $\{x(t)_k\}$.

It can also be shown that

$$K_{xx}(0) \ge K_{xx}(\tau) . \qquad (6.8)$$

Hence the value of the autocorrelation function at $\tau = 0$ is greater than or equal to the value at any other value of $\tau$. It is also intuitively clear that, for processes which do not contain deterministic components, the autocorrelation function, $K_{xx}(\tau)$, will approach the ensemble mean value squared, $(\langle x_k(t) \rangle)^2$, as $\tau \to \infty$, since widely separated values will not be correlated.



The autovariance function for stationary processes, $C_{xx}(\tau)$, is related to the autocorrelation function by

$$C_{xx}(\tau) = K_{xx}(\tau) - (\langle x_k(t) \rangle)^2 . \qquad (6.9)$$

Sketch 6.1 shows illustrations of both the autocorrelation and the autovariance function, which are identical in shape but displaced in ordinate by $(\langle x_k(t) \rangle)^2$.

Sketch 6.1 Autocorrelation and autovariance functions

6.1.2 Ergodic autocorrelation functions

A necessary condition for ergodicity is stationarity, and so all the results of Section 6.1.1 will apply to ergodic processes. In addition, the autocorrelation function given by Equation (6.1) can be derived from a time average using a single representative record, $x(t)_k$; and so

$$R_{xx}(\tau) = \lim_{T \to \infty} \frac{1}{2T} \int_{-T}^{T} x_k(t) \cdot x_k(t + \tau) \, dt , \qquad (6.10)$$

where the symbol R is used to indicate an ergodic process.

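In practice Equation (6.10) is applied with the infinite record replaced by a long finite one. A minimal estimator (Python/NumPy; the record, a sinusoid in noise, is a synthetic illustration added to this transcription, not an example from the Item) is:

import numpy as np

rng = np.random.default_rng(4)
n_t, dt = 200_000, 0.01
t = np.arange(n_t) * dt

# A single member record: a unit sinusoid at 1 Hz buried in broadband noise.
x = np.sin(2.0 * np.pi * t) + rng.normal(0.0, 1.0, n_t)

def Rxx(x, lag):
    # Equation (6.10) with the limit T -> infinity replaced by a long finite
    # record: the time-averaged lagged product (1/2T) * integral x(t) x(t+tau) dt.
    return (x[:-lag] * x[lag:]).mean()

# For this record R_xx(tau) ~ 0.5 cos(2 pi tau) once the noise has decorrelated.
for tau in (0.25, 0.50, 1.00):
    lag = int(round(tau / dt))
    print(f"tau = {tau:4.2f}: Rxx = {Rxx(x, lag):+.3f}, "
          f"0.5*cos(2 pi tau) = {0.5 * np.cos(2.0 * np.pi * tau):+.3f}")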


6.2 Cross-correlation Functions

The cross-correlation function is used to provide a measure of the similarity between two random processes $\{x(t)_k\}$ and $\{y(t)_k\}$ at fixed times $t_1$ and $t_2$, and is defined as

$$K_{xy}(t_1, t_2) = \langle x_k(t_1) \cdot y_k(t_2) \rangle = \lim_{N \to \infty} \frac{1}{N} \sum_{k=1}^{N} x_k(t_1) \cdot y_k(t_2) . \qquad (6.11)$$

This equation is a more general one than Equation (6.1) and includes the autocorrelation function as a special case when $\{x(t)_k\} = \{y(t)_k\}$. This function is normally required when it is known that a random process $\{z(t)_k\}$ is a function of two other random processes $\{x(t)_k\}$ and $\{y(t)_k\}$. Consider an example where each member record of z is related to x and y by

$$z_k(t) = x_k(t) + y_k(t) .$$

The autocorrelation function of z, at times $t_1$ and $t_2$, is then

$$K_{zz}(t_1, t_2) = \langle (x_k(t_1) + y_k(t_1))(x_k(t_2) + y_k(t_2)) \rangle ,$$

which becomes

$$K_{zz}(t_1, t_2) = K_{xx}(t_1, t_2) + K_{xy}(t_1, t_2) + K_{yx}(t_1, t_2) + K_{yy}(t_1, t_2) . \qquad (6.12)$$

This new autocorrelation function of z is seen to contain both the autocorrelation and the cross-correlation functions of x and y. If the cross-correlation functions $K_{xy}$ and $K_{yx}$ are zero then $\{x_k(t)\}$ and $\{y_k(t)\}$ are independent random processes.

The covariance function is the cross-correlation function when the mean values are subtracted, and is defined by analogy with Equation (6.2) as

$$C_{xy}(t_1, t_2) = \langle (x_k(t_1) - \langle x_k(t_1) \rangle)(y_k(t_2) - \langle y_k(t_2) \rangle) \rangle . \qquad (6.13)$$
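The decomposition in Equation (6.12) is easy to confirm numerically. In this sketch (Python/NumPy, illustrative; time averages over long records stand in for the ensemble averages) z = x + y is formed from two independent zero-mean records, so the cross terms nearly vanish and K_zz is close to K_xx + K_yy:

import numpy as np

rng = np.random.default_rng(5)
n_t, lag = 400_000, 3

x = rng.normal(0.0, 1.0, n_t)
x = np.convolve(x, np.ones(5) / 5.0, mode="same")   # give x some correlation structure
y = rng.normal(0.0, 2.0, n_t)                       # independent of x
z = x + y                                           # as in the example above

def K(u, v, lag):
    # Lagged-product time average standing in for the ensemble average <u(t1) v(t2)>.
    return (u[:-lag] * v[lag:]).mean()

kzz = K(z, z, lag)
ksum = K(x, x, lag) + K(x, y, lag) + K(y, x, lag) + K(y, y, lag)
print(f"K_zz                      = {kzz:+.5f}")
print(f"K_xx + K_xy + K_yx + K_yy = {ksum:+.5f}   (Equation (6.12))")
print(f"cross terms K_xy, K_yx    = {K(x, y, lag):+.5f}, {K(y, x, lag):+.5f}")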

6.2.1 Stationary cross-correlation functions

The cross-correlation function for two stationary random processes $\{x_k(t)\}$ and $\{y_k(t)\}$ can be defined in a similar manner to Equation (6.11); thus

$$K_{xy}(\tau) = \langle x_k(0) \cdot y_k(\tau) \rangle = \lim_{N \to \infty} \frac{1}{N} \sum_{k=1}^{N} x_k(0) \cdot y_k(\tau) . \qquad (6.14)$$

The following relationships may also be derived:

$$K_{xy}(-\tau) = K_{yx}(\tau) , \qquad (6.15)$$

$$K_{xy}^2(\tau) \le K_{xx}(0) \cdot K_{yy}(0) . \qquad (6.16)$$



6.2.2 Ergodic cross-correlation functions

The cross-correlation function for two ergodic random processes $\{x_k(t)\}$ and $\{y_k(t)\}$ is defined as

$$R_{xy}(\tau) = \lim_{T \to \infty} \frac{1}{2T} \int_{-T}^{T} x_k(t) \cdot y_k(t + \tau) \, dt . \qquad (6.17)$$

6.3 Hidden Deterministic Components

If a random process $\{x_k(t)\}$ contains a hidden deterministic component, then the autocorrelation function, $K_{xx}(\tau)$, of this “mixed process” will not approach $(\langle x_k(t) \rangle)^2$ as $\tau$ approaches infinity.

Consider a random process $\{z_k(t)\}$ which is given by

$$\{z_k(t)\} = \{x_k(t)\} + \{y_k(t)\} ,$$

where $\{x_k(t)\}$ and $\{y_k(t)\}$ are a stationary random process and a stationary deterministic process respectively. $\{x_k(t)\}$ and $\{y_k(t)\}$ are statistically independent and so

$$K_{zz}(\tau) = K_{xx}(\tau) + K_{yy}(\tau) ,$$

since the cross-correlation functions $K_{xy}$ and $K_{yx}$ are zero.

$K_{xx}(\tau)$ approaches $(\langle x_k(t) \rangle)^2$ as $\tau \to \infty$, but the term $K_{yy}(\tau)$ will not approach $(\langle y_k(t) \rangle)^2$ as $\tau \to \infty$. Thus, by studying the autocorrelation function at large $\tau$, periodic deterministic components may be found.
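This use of the autocorrelation function at large tau can be demonstrated numerically; the sketch below (Python/NumPy, an illustrative addition to this transcription) buries a sinusoid in noise of much larger mean square, and the autocorrelation settles onto the persistent oscillation of the deterministic part instead of decaying to zero:

import numpy as np

rng = np.random.default_rng(6)
n_t, dt = 500_000, 0.01
t = np.arange(n_t) * dt

A, f_det = 0.5, 2.0                 # hidden sinusoid: amplitude and frequency
x = A * np.sin(2.0 * np.pi * f_det * t) + rng.normal(0.0, 2.0, n_t)

def Kxx(x, lag):
    return (x[:-lag] * x[lag:]).mean()

# At large tau the noise contribution has died away, leaving the persistent
# 0.5 * A^2 * cos(2 pi f tau) of the deterministic component.
for tau in (5.000, 5.125, 5.250):
    lag = int(round(tau / dt))
    expected = 0.5 * A ** 2 * np.cos(2.0 * np.pi * f_det * tau)
    print(f"tau = {tau:5.3f}: Kxx = {Kxx(x, lag):+.4f}, "
          f"deterministic part = {expected:+.4f}")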

7. SPECTRA

The basic methods of harmonic analysis are not, as they stand, applicable to random processes, but they can be developed in a way that makes them applicable, thus providing an additional way of describing a randomly varying quantity. The ideas of Fourier series, integrals and transforms are fundamental to the analysis of random processes and will be described first.

7.1 Fourier Series and Transforms

7.1.1 Periodic records

The essence of Fourier series, or the Fourier transform, of a waveform is to decompose or separate the waveform into a sum of sinusoids of different frequencies, amplitudes and phase lags.

Consider a periodic waveform x(t) with period 2T, where x(t) = x(t + 2T) for all t, and $f_0 = 1/2T$; then, providing the waveform satisfies Dirichlet’s† conditions, it can be represented by

$$x(t) = \frac{a_0}{2} + \sum_{n=1}^{\infty} \left( a_n \cos(2\pi n f_0 t) + b_n \sin(2\pi n f_0 t) \right) , \qquad (7.1)$$

where

$$a_n = \frac{1}{T} \int_{-T}^{T} x(t) \cos(2\pi n f_0 t) \, dt , \quad n = 0, 1, 2, 3, \ldots ,$$
$$b_n = \frac{1}{T} \int_{-T}^{T} x(t) \sin(2\pi n f_0 t) \, dt , \quad n = 1, 2, 3, \ldots . \qquad (7.2)$$

† A sufficient set of conditions by which a periodic waveform can be represented by Equation (7.1) is that the waveform and its integral are finite, single valued and continuous except for a finite number of both finite discontinuities and finite minima and maxima. See Reference 8.



The mean square value, $\overline{x^2(t)}$, can be expressed in terms of the amplitudes $a_n$ and $b_n$ as

$$\overline{x^2(t)} = \frac{1}{2T} \int_{-T}^{T} x^2(t) \, dt = \frac{a_0^2}{4} + \frac{1}{2} \sum_{n=1}^{\infty} (a_n^2 + b_n^2) . \qquad (7.3)$$

The waveform, $x(t)$, can also be expressed in complex form, given by

$$x(t) = \sum_{n=-\infty}^{+\infty} c_n \, e^{+i 2\pi n f_0 t} , \qquad (7.4)$$

where

$$c_n = \frac{1}{2T} \int_{-T}^{T} x(t) \, e^{-i 2\pi n f_0 t} \, dt \equiv \begin{cases} \tfrac{1}{2}(a_n + i b_n) , & n \le -1 , \\ \tfrac{1}{2} a_0 , & n = 0 , \\ \tfrac{1}{2}(a_n - i b_n) , & n \ge 1 . \end{cases} \qquad (7.5)$$

The mean square value in terms of the coefficients $c_n$ is

$$\overline{x^2(t)} = \sum_{n=-\infty}^{+\infty} |c_n|^2 . \qquad (7.6)$$
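Equations (7.2) and (7.3) can be checked by direct quadrature; the sketch below (Python/NumPy, added for illustration, using a square wave as an arbitrary test period) computes a few coefficients and confirms that the partial sum in Equation (7.3) approaches the directly computed mean square value:

import numpy as np

T = 1.0                             # half-period: the waveform has period 2T
f0 = 1.0 / (2.0 * T)
n_t = 20_000
t = np.linspace(-T, T, n_t, endpoint=False)
dt = t[1] - t[0]

x = np.sign(np.sin(2.0 * np.pi * f0 * t))    # unit square wave

def a(n):   # Equation (7.2), evaluated by simple quadrature
    return (x * np.cos(2.0 * np.pi * n * f0 * t)).sum() * dt / T

def b(n):   # Equation (7.2)
    return (x * np.sin(2.0 * np.pi * n * f0 * t)).sum() * dt / T

msq_direct = (x ** 2).mean()                 # (1/2T) * integral of x^2 dt
msq_series = a(0) ** 2 / 4.0 + 0.5 * sum(    # Equation (7.3), truncated at n = 30
    a(n) ** 2 + b(n) ** 2 for n in range(1, 31))

print(f"b_1 = {b(1):.4f}   (exact 4/pi = {4.0 / np.pi:.4f})")
print(f"mean square, direct    = {msq_direct:.4f}")
print(f"mean square, Eq. (7.3) = {msq_series:.4f}  (approaches 1 as more terms are kept)")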

7.1.2 Non-periodic records

A non-periodic record can be expressed as a Fourier series only if it is considered to be periodic with infinite period. The fundamental frequency, $f_0$, in this case becomes infinitesimal, the spectrum becomes a continuous curve and the series becomes an integral.† Combining Equations (7.4) and (7.5) and substituting $\Delta f$ for $f_0$ and for $(1/2T)$; then

$$x(t) = \sum_{n=-\infty}^{+\infty} e^{+i 2\pi n \Delta f t} \, \Delta f \int_{-T}^{T} x(t) \, e^{-i 2\pi n \Delta f t} \, dt .$$

In the limit as $\Delta f \to df$, $T \to \infty$ and $n \Delta f \to f$, then providing Dirichlet’s conditions are satisfied,

$$x(t) = \int_{-\infty}^{\infty} e^{+i 2\pi f t} \, df \int_{-\infty}^{\infty} x(t) \, e^{-i 2\pi f t} \, dt . \qquad (7.7)$$

† Dirichlet's conditions are applicable to non-periodic records and can be extended to include impulse functions by the theory of distributions. See Reference 8.



This equation may be conveniently separated into two equations:

$$x(t) = \int_{-\infty}^{\infty} A(if) \, e^{+i 2\pi f t} \, df \qquad (7.8)$$

and

$$A(if) = \int_{-\infty}^{\infty} x(t) \, e^{-i 2\pi f t} \, dt . \qquad (7.9)$$

The quantity A(if) is called the Fourier transform of x(t), and x(t) is said to be the inverse transform of A(if). These two quantities x(t) and A(if) form what is called a Fourier transform pair.

The complex conjugate of $A(if)$ is denoted by $A^*(if)$ and is given by

$$A^*(if) = \int_{-\infty}^{\infty} x(t) \, e^{+i 2\pi f t} \, dt , \qquad (7.10)$$

and

$$A(if) \, A^*(if) = |A(if)|^2 . \qquad (7.11)$$

Then,

$$\int_{-\infty}^{\infty} x^2(t) \, dt = \int_{-\infty}^{\infty} x(t) \int_{-\infty}^{\infty} A(if) \, e^{+i 2\pi f t} \, df \, dt = \int_{-\infty}^{\infty} A(if) \int_{-\infty}^{\infty} x(t) \, e^{+i 2\pi f t} \, dt \, df = \int_{-\infty}^{\infty} A(if) \, A^*(if) \, df = \int_{-\infty}^{\infty} |A(if)|^2 \, df . \qquad (7.12)$$

If only positive frequencies are considered, then

$$\int_{-\infty}^{\infty} x^2(t) \, dt = 2 \int_{0}^{\infty} |A(if)|^2 \, df , \qquad (7.13)$$

since $|A(if)|^2$ is an even function of $f$.

7.2 Random Processes and Spectral Density

The results of Section 7.1 cannot, as they stand, be applied to random signals. A random signal is not periodic and so cannot be expressed as a Fourier series, nor can it be expressed as a Fourier integral, since a random signal, in general, continues over an infinite time and will not satisfy Dirichlet's condition that the integral of the wave-form is finite. It is possible, however, to develop the results of Section 7.1 in terms of a quantity known as the spectral density, which has no convergence difficulties.

Consider a member function, $x(t)_k$, of a random process, $\{x(t)_k\}$, from which a truncated member function $x(t, T)_k$ is formed:

$$x_k(t, T) = \begin{cases} x(t)_k , & -T \le t \le T , \\ 0 , & \text{otherwise} . \end{cases} \qquad (7.14)$$



Using Equation (7.3), the mean square value is

$$\overline{(x_k(t, T))^2} = \frac{1}{2T} \int_{-T}^{T} (x_k(t, T))^2 \, dt$$

or

$$\overline{(x_k(t, T))^2} = \frac{1}{2T} \int_{-\infty}^{\infty} (x_k(t, T))^2 \, dt ,$$

and, using Equation (7.13),

$$\overline{(x_k(t, T))^2} = \int_{0}^{\infty} \frac{1}{T} |A_k(if, T)|^2 \, df .$$

If $T$ is allowed to approach $\infty$, then

$$\overline{(x_k(t, T))^2} = \int_{0}^{\infty} \lim_{T \to \infty} \frac{1}{T} |A_k(if, T)|^2 \, df \qquad (7.15)$$

$$= \int_{0}^{\infty} S_k(f) \, df , \qquad (7.16)$$

where

$$S_k(f) = \lim_{T \to \infty} \frac{1}{T} |A_k(if, T)|^2 . \qquad (7.17)$$

The quantity $S_k(f)$ is called the spectral density of the member $x(t)_k$. The spectrum for the total random process is then defined as

$$S(f) = \langle S_k(f) \rangle . \qquad (7.18)$$

For ergodic processes,

$$S_k(f) = S(f) . \qquad (7.19)$$

Equation (7.15) describes the distribution of the harmonic content of the signal over the frequency range zero to infinity. The mean square value, $\overline{x^2(t)}$, associated with a band of frequency $\Delta f$, is $S(f)\Delta f$.

Spectral density is often described as “power spectral density”, which derives from its use in electrical problems where a randomly varying current, x(t), which passes through a unit resistance results in a mean power consumption of $\overline{x^2(t)}$.
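In discrete form, Equations (7.17) and (7.18) amount to the averaged periodogram used in practice. A minimal sketch (Python/NumPy, an illustrative addition; the factor of two makes the estimate one-sided so that it integrates over positive frequencies only, as in Equation (7.16)) is:

import numpy as np

rng = np.random.default_rng(7)
K, n_t, dt = 200, 4_000, 0.01      # ensemble members, samples per record, time step
D = n_t * dt                       # record duration, playing the role of 2T

# Ensemble of noisy records sharing a common sinusoidal component at 3 Hz.
t = np.arange(n_t) * dt
x = np.sin(2.0 * np.pi * 3.0 * t) + rng.normal(0.0, 1.0, size=(K, n_t))

A = np.fft.rfft(x, axis=1) * dt    # A_k(if, T): discrete stand-in for Equation (7.9)
S_k = 2.0 * np.abs(A) ** 2 / D     # Equation (7.17), one-sided (factor 2 for f > 0)
S = S_k.mean(axis=0)               # Equation (7.18): ensemble average over the members
f = np.fft.rfftfreq(n_t, dt)
df = f[1] - f[0]

print(f"mean square of the records      : {(x ** 2).mean():.4f}")
print(f"integral of S(f) df over f >= 0 : {(S * df).sum():.4f}   (Equation (7.16))")
print(f"spectral peak at f = {f[np.argmax(S)]:.2f} Hz (sinusoid at 3.00 Hz)")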

7.3 The Relations between Spectral Density and Correlation

For a truncated member record, $x(t, T)_k$, given by Equation (7.14), the spectral density function may be related to the spectral density function for, in general, a non-stationary random process by

$$S(f) = \langle S_k(f) \rangle = \lim_{T \to \infty} \left[ \langle S_k(f, T) \rangle \right] , \qquad (7.20)$$



and $S_k(f, T)$ may be defined as

$$S_k(f, T) = \frac{1}{T} |A_k(if, T)|^2 .$$

Also, for a truncated member record, the following function may be defined:

$$J_k(\tau, T) = \frac{1}{2T} \int_{-\infty}^{\infty} x_k(t, T) \cdot x_k(t + \tau, T) \, dt \equiv \frac{1}{2T} \int_{-T}^{T} x_k(t, T) \cdot x_k(t + \tau, T) \, dt + 0(1) \quad \text{as } T \to \infty ,$$

where $0(1)$ is an error term which approaches zero as $T \to \infty$. The Fourier transform of $J_k(\tau, T)$ is

$$\int_{-\infty}^{\infty} J_k(\tau, T) \, e^{-i 2\pi f \tau} \, d\tau = \int_{-\infty}^{\infty} e^{-i 2\pi f \tau} \, \frac{1}{2T} \int_{-\infty}^{\infty} x_k(t, T) \cdot x_k(t + \tau, T) \, dt \, d\tau = \frac{1}{2T} \int_{-\infty}^{\infty} x_k(t, T) \, e^{+i 2\pi f t} \, dt \int_{-\infty}^{\infty} x_k(t + \tau, T) \, e^{-i 2\pi f (t + \tau)} \, d\tau .$$

But

$$\int_{-\infty}^{\infty} x_k(t + \tau, T) \, e^{-i 2\pi f (t + \tau)} \, d\tau = \int_{-\infty}^{\infty} x_k(t, T) \, e^{-i 2\pi f t} \, dt = A_k(if, T) .$$

Therefore

$$\int_{-\infty}^{\infty} J_k(\tau, T) \, e^{-i 2\pi f \tau} \, d\tau = \frac{1}{2} S_k(f, T) .$$

Using Equation (7.20),

$$S(f) = \lim_{T \to \infty} \left[ \langle S_k(f, T) \rangle \right] = \lim_{T \to \infty} 2 \int_{-\infty}^{\infty} \langle J_k(\tau, T) \rangle \, e^{-i 2\pi f \tau} \, d\tau = \lim_{T \to \infty} 2 \int_{-\infty}^{\infty} e^{-i 2\pi f \tau} \, \frac{1}{2T} \int_{-\infty}^{\infty} \langle x_k(t, T) \cdot x_k(t + \tau, T) \rangle \, dt \, d\tau .$$

Thus for any general non-stationary random process,

$$S(f) = \lim_{T \to \infty} 2 \int_{-\infty}^{\infty} e^{-i 2\pi f \tau} \left[ \frac{1}{2T} \int_{-T}^{T} K_{xx}(t, t + \tau) \, dt \right] d\tau . \dagger \qquad (7.21)$$

Significant simplification occurs if the random process is stationary, as

$$K_{xx}(t, t + \tau) = K_{xx}(0, \tau) = K_{xx}(\tau) ,$$

and so

$$\frac{1}{2T} \int_{-T}^{T} K_{xx}(t, t + \tau) \, dt = K_{xx}(\tau) .$$

† Note: the suffixes ‘xx’ are often attached to the spectral density S, to denote the association with the autocorrelation function, $K_{xx}$.



Equation (7.21) for stationary processes then becomes

$$S(f) = 2 \int_{-\infty}^{\infty} K_{xx}(\tau) \, e^{-i 2\pi f \tau} \, d\tau . \qquad (7.22)$$

This equation shows that the Fourier transform of $2K_{xx}(\tau)$ is $S(f)$, and it follows that

$$K_{xx}(\tau) = \frac{1}{2} \int_{-\infty}^{\infty} S(f) \, e^{+i 2\pi f \tau} \, df . \qquad (7.23)$$

Equations (7.22) and (7.23) can also be expressed using integrals between zero and infinity, thus avoiding the need to consider negative frequencies. These equations are

$$K_{xx}(\tau) = \int_{0}^{\infty} S(f) \cos(2\pi f \tau) \, df \qquad (7.24)$$

and

$$S(f) = 4 \int_{0}^{\infty} K_{xx}(\tau) \cos(2\pi f \tau) \, d\tau . \qquad (7.25)$$

These two equations are known as the Wiener-Khinchin relations and show, for a stationary process, that the autocorrelation function and the spectral density are each other's Fourier cosine transforms.
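The Wiener-Khinchin pair can be verified numerically for a process whose autocorrelation is known in closed form. The sketch below (Python/NumPy, illustrative; the exponential autocorrelation is an assumed test case, not drawn from the Item) evaluates Equation (7.25) by quadrature and then transforms back through Equation (7.24):

import numpy as np

a = 2.0                                        # decay rate of the assumed autocorrelation
tau = np.linspace(0.0, 20.0, 200_001)
dtau = tau[1] - tau[0]
Kxx = np.exp(-a * tau)                         # assumed stationary autocorrelation

# Equation (7.25): S(f) = 4 * integral_0^inf Kxx(tau) cos(2 pi f tau) d tau,
# which for this Kxx has the closed form 4a / (a^2 + (2 pi f)^2).
for fi in (0.0, 0.5, 1.0):
    S_num = 4.0 * (Kxx * np.cos(2.0 * np.pi * fi * tau)).sum() * dtau
    S_exact = 4.0 * a / (a ** 2 + (2.0 * np.pi * fi) ** 2)
    print(f"f = {fi:3.1f}: S from (7.25) = {S_num:.5f}, closed form = {S_exact:.5f}")

# Equation (7.24): transforming back recovers Kxx; checked here at tau = 0.3.
f = np.linspace(0.0, 200.0, 400_001)
df = f[1] - f[0]
S = 4.0 * a / (a ** 2 + (2.0 * np.pi * f) ** 2)
K_back = (S * np.cos(2.0 * np.pi * f * 0.3)).sum() * df
print(f"Kxx(0.3) from (7.24) = {K_back:.5f}, exact exp(-a*0.3) = {np.exp(-a * 0.3):.5f}")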

For cross-correlation functions and cross-spectral density functions a similar equation to Equation (7.21) may be derived. For two random processes $\{x(t)_k\}$ and $\{y(t)_k\}$ the cross-spectral density is given by

$$S(f) = \lim_{T \to \infty} 2 \int_{-\infty}^{\infty} e^{-i 2\pi f \tau} \left[ \frac{1}{2T} \int_{-T}^{T} K_{xy}(t, t + \tau) \, dt \right] d\tau , \qquad (7.26)$$

where $K_{xy}(t, t + \tau)$ is the general non-stationary form of the cross-correlation function of $\{x(t)_k\}$ and $\{y(t)_k\}$ at times $t$ and $(t + \tau)$.

For stationary processes,

$$S(f) = 2 \int_{-\infty}^{\infty} K_{xy}(\tau) \, e^{-i 2\pi f \tau} \, d\tau . \qquad (7.27)$$

7.3.1 The special case of white noise

There are cases when the spectral density is approximately constant over a wide frequency range and consequently it is natural to consider a random process whose spectrum is constant over the complete frequency range zero to infinity. A signal which has such a spectral density is called white noise and is purely a theoretical abstraction, since this implies an infinite mean square value (see Equation (7.16)). However, results obtained using a white spectrum are often meaningful, since bandwidth-limited white noise has finite average power. For white noise, S(f) = S, where S is constant. Then from Equation (7.23) it follows that

$$K_{xx}(\tau) = \frac{1}{2} S \, \delta(\tau) , \qquad (7.28)$$

where $\delta(\tau)$ is the Dirac $\delta$-function, for which $\int_{0-}^{0+} \delta(\tau) \, d\tau = 1$.
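A bandwidth-limited version makes Equation (7.28) concrete. If S(f) equals a constant S up to a cut-off $f_c$ and is zero beyond, Equation (7.24) gives $K_{xx}(\tau) = S \sin(2\pi f_c \tau)/(2\pi\tau)$; as $f_c \to \infty$ the peak $K_{xx}(0) = S f_c$ grows without bound while the area under $K_{xx}$ stays at $S/2$, which is the behaviour of $\tfrac{1}{2} S \delta(\tau)$. The sketch below (Python/NumPy, an illustrative addition with assumed cut-off values) shows this numerically:

import numpy as np

S = 1.0                                        # constant spectral density level
tau = np.linspace(-5.0, 5.0, 2_000_001)
dtau = tau[1] - tau[0]

for f_c in (1.0, 10.0, 100.0):
    # Equation (7.24) with S(f) = S for 0 <= f <= f_c, zero beyond, gives
    # Kxx(tau) = S sin(2 pi f_c tau) / (2 pi tau), so Kxx(0) = S f_c.
    Kxx = S * f_c * np.sinc(2.0 * f_c * tau)   # np.sinc(u) = sin(pi u) / (pi u)
    print(f"f_c = {f_c:6.1f}: Kxx(0) = {Kxx[tau.size // 2]:7.1f}, "
          f"integral of Kxx d tau = {(Kxx.sum() * dtau):.3f}   (S/2 = {S / 2:.3f})")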



7.4 Spectral Density and Probability Distribution

The spectral density of a random process only enables, in general, the autocorrelation function to be found and does not completely define the process, since far more general averages, given by Equation (4.5) as $M \to \infty$, are needed for a complete description.

In the case of a Gaussian or normal distribution the spectral density plays a particularly significant role because it determines the distribution completely.

7.4.1 Gaussian (or normal) distribution

The stationary random process (with zero mean), $\{x(t)_k\}$, is called Gaussian if its N'th-order probability density function is given by

$$W_N(X_1, t_1; \ldots; X_N, t_N) = \frac{(2\pi)^{-N/2}}{|C_{xx}(t_i, t_j)|^{1/2}} \exp\left( -\frac{1}{2} \sum_{i=1}^{N} \sum_{j=1}^{N} C_{xx}^{-1}(t_i, t_j) \, X_i X_j \right) , \qquad (7.29)$$

where $C_{xx}(t_i, t_j) = C_{xx}(t_i - t_j)$ is the autovariance function; here $|C_{xx}(t_i, t_j)|$ denotes the determinant of the autovariance matrix and $C_{xx}^{-1}(t_i, t_j)$ the elements of its inverse.

It is normal practice to subtract the ensemble mean value, $\langle x(t)_k \rangle$, for a stationary process, and so the autovariance function appears in Equation (7.29).

The random process is completely defined by the autovariance functions as $N \to \infty$, or alternatively by the spectral density functions given by Equation (7.22).

For a first-order random process, when N = 1, Equation (7.29) reduces to

$$W_1(X_1, t_1) = \frac{1}{\sigma \sqrt{2\pi}} \, e^{-X_1^2 / 2\sigma^2} , \qquad (7.30)$$

where $\sigma$ is the root mean square value of $X_1$, more usually referred to as the standard deviation. Equation (7.30) is the usual form of the Gaussian distribution involving only one random variable.
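Equation (7.30) can be compared with a first-order density estimated from an ensemble, in the spirit of Equations (4.7) to (4.9). The sketch below (Python/NumPy, illustrative; sigma = 1.5 is an arbitrary choice) samples a zero-mean Gaussian process at one instant and compares a histogram estimate of $W_1$ with the closed form:

import numpy as np

rng = np.random.default_rng(8)
sigma = 1.5
N = 1_000_000
x_t1 = rng.normal(0.0, sigma, N)   # ensemble values x_k(t1) of a zero-mean Gaussian process

edges = np.linspace(-4.0, 4.0, 17)
counts, _ = np.histogram(x_t1, bins=edges)
W1_est = counts / (N * (edges[1] - edges[0]))   # density estimate, cf. Equation (4.9)

centres = 0.5 * (edges[:-1] + edges[1:])
W1_exact = np.exp(-centres ** 2 / (2.0 * sigma ** 2)) / (sigma * np.sqrt(2.0 * np.pi))

for c, est, exact in zip(centres[::4], W1_est[::4], W1_exact[::4]):
    print(f"X1 = {c:+5.2f}: W1 estimated = {est:.4f}, Equation (7.30) = {exact:.4f}")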



8. REFERENCES

The following list of references is a selected list of sources of information supplementary to that presented in this Item.

1. BENDAT, J.S. Principles and applications of random noise theory, John Wiley & Sons, 1958.

2. BENDAT, J.S., PIERSOL, A.G. Measurement and analysis of random data, John Wiley & Sons, 1966.

3. CRANDALL, S.H. (Editor) Random vibration, Technology Press/John Wiley & Sons, 1958.

4. DAVENPORT, W.B., ROOT, W.L. An introduction to the theory of random signals and noise, McGraw-Hill, 1960.

5. GOODMAN, F.O. A summary of the theory of random processes, R.A.E. Tech. Note No. CPM 73, April 1964.

6. MIDDLETON, D. An introduction to statistical communication theory, McGraw-Hill, 1960.

7. ROBSON, J.D. An introduction to random vibration, Edinburgh Univ. Press, 1963.

8. BRIGHAM, E.O. The fast Fourier transform, Prentice-Hall, 1974.



THE PREPARATION OF THIS DATA ITEM

The work on this particular Item was monitored and guided by the Dynamics Committee, which first met in 1962 and now has the following membership:

Chairman
Dr H.H.B.M. Thomas – Independent

Members
Prof. G.J. Hancock – Queen Mary College
Mr M.R. Heath – British Aerospace Dynamics Group, Stevenage Division
Dr M.A. Woodhead – University of Salford

The work on this Item was carried out in the Mechanical Motion and System Dynamics Group of ESDU. The member of staff who undertook the technical work involved in the initial assessment of the available information and the construction and subsequent development of the Item was

Mr C.J. Loughton – Group Head.
