
Appendix: Basic Concepts of Probability Theory and Stochastic Processes

Probability Theory

Probability theory, stochastic processes and related properties can be found in axiomatic form in the books by Soong [1] or Papoulis [2]. Nevertheless, for the reader's convenience, the basic concepts and properties of stochastic processes are recalled here.

A basic concept of probability theory is a probability space. A probability space is a triple (Ω, F, P) such that

• Ω is a non-empty set of elementary events.

• F is a σ-algebra of subsets of Ω, i.e. F is a family of subsets of Ω which satisfies

a) Ω ∈ F,

b) A ∈ F ⇒ Ω \ A ∈ F,

c) A_n ∈ F, n ∈ ℕ ⇒ ⋃_{n∈ℕ} A_n ∈ F.

• P is a probability measure on (Ω, F), i.e. P : F → [0, 1] satisfies

d) P(Ω) = 1, P(∅) = 0,

e) P(⋃_{n=0}^∞ A_n) = Σ_{n=0}^∞ P(A_n), whenever {A_n : n ∈ ℕ} ⊂ F is a family of pairwise disjoint sets.
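For a finite sample space these axioms can be verified directly. The following minimal sketch (a uniform measure on a six-point space; all names and the example events are illustrative, not from the text) checks the normalization and additivity properties:

```python
from fractions import Fraction

# Finite probability space: Omega = {1,...,6}, F = the power set, P uniform.
Omega = frozenset(range(1, 7))

def P(A):
    """Uniform probability measure on subsets of Omega."""
    assert A <= Omega
    return Fraction(len(A), len(Omega))

# d) P(Omega) = 1 and P(empty set) = 0.
assert P(Omega) == 1 and P(frozenset()) == 0

# e) Additivity over pairwise disjoint events: A1 and A2 are disjoint.
A1, A2 = frozenset({1, 2}), frozenset({5})
assert P(A1 | A2) == P(A1) + P(A2)

# b) The power set is closed under complements: Omega \ A1 is again an event.
complement = Omega - A1
assert P(A1) + P(complement) == 1
```

Countable additivity e) reduces to finite additivity here, since a finite space carries only finitely many disjoint non-empty events.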



192 ____________ NONLINEAR STOCHASTIC EVOLUTION EQUATION

A function ξ : Ω → ℝ is called a random variable if it is measurable when the Borel σ-algebra B(ℝ) is considered on ℝ. This condition is equivalent to the following:

{ξ < a} = {ω ∈ Ω : ξ(ω) < a} ∈ F   for every a ∈ ℝ.

Given T, any non-empty set of parameters, a function ξ = {ξ(t, ω)} defined on T × Ω is called a stochastic process indexed by T iff, for any t ∈ T, ξ(t, ·) : Ω → ℝ is a random variable.

For fixed t ∈ T, the random variable ξ(t, ·) is called a realization of ξ at the time t. For fixed ω ∈ Ω, the function T ∋ t → ξ(t, ω) is called a trajectory or path of ξ. Usually a random process as above is denoted by {ξ_t}_{t∈T} or simply ξ_t, if the set of indexes is not ambiguous.

Very often, the set T is some subset of the set of real numbers ℝ, usually ℕ, [0, ∞), and sometimes ℝ or ℤ. In those cases a natural interpretation of such a parameter is the time, continuous (if T = [0, ∞) or ℝ) or discrete (in the other cases).

Some definitions can now be given.

At any particular time t ∈ T, the process ξ(t, ω) is a random variable. Then it has a Distribution Function F(x; t) defined by the following formula:

F(x; t) = P{ω ∈ Ω : ξ(t, ω) ≤ x},   x ∈ ℝ.

In other words, F(x; t) is the probability that the random variable ξ(t, ·) takes values not greater than a real number x.

If there exists a measurable function f(x, t) of the variable x ∈ ℝ such that

∫_{−∞}^{x} f(y, t) dy = F(x; t)   for all x ∈ ℝ,

then f(·, t) is called the Probability Density Function of the variable ξ_t.
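Numerically, the relation between f and F can be checked for a concrete density. The sketch below (standard Gaussian law, with the grid and cutoff chosen arbitrarily; NumPy assumed available) compares a trapezoidal integral of f with the closed form of F obtained from the error function:

```python
import math
import numpy as np

# Standard Gaussian distribution function F(x), via the error function.
F = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

x = 1.3
ys = np.linspace(-10.0, x, 200_001)        # -10 stands in for -infinity
f = np.exp(-ys**2 / 2.0) / np.sqrt(2.0 * np.pi)  # Gaussian density on the grid

# Trapezoidal approximation of the integral of f from -infinity to x.
h = ys[1] - ys[0]
F_num = h * (f.sum() - 0.5 * (f[0] + f[-1]))

print(abs(F_num - F(x)))                   # discretization error only
```

The discrepancy is of the order of the quadrature error, illustrating that the density integrates to the distribution function.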


If two elements t_1, t_2 ∈ T are chosen, then we define the Second Order Distribution Function (or simply Two-Dimensional Distribution Function) as the function

F(x_1, x_2; t_1, t_2) = P{ω ∈ Ω : ξ(t_1, ω) ≤ x_1, ξ(t_2, ω) ≤ x_2}.

In a similar way, a measurable function f(x_1, x_2; t_1, t_2) is called a Second Order Probability Density if it is a probability density of the ℝ²-valued random function (ξ(t_1, ·), ξ(t_2, ·)). In other words, the following holds:

∫_{−∞}^{x_1} ∫_{−∞}^{x_2} f(y_1, y_2; t_1, t_2) dy_2 dy_1 = F(x_1, x_2; t_1, t_2).

This procedure can easily be generalized by choosing n elements t_1, …, t_n of T, so that we obtain the n-Order Distribution Function

F(x_1, …, x_n; t_1, …, t_n) = P{ω ∈ Ω : ξ(t_1, ω) ≤ x_1, …, ξ(t_n, ω) ≤ x_n}

and the n-Order Probability Density f(x_1, …, x_n; t_1, …, t_n).

It is worth recalling that the knowledge of all the higher order distribution functions (or probability densities, if they exist) implies the knowledge of the lower order distribution functions. In fact the following property holds:

F(x_1; t_1) = lim_{x_2 → ∞} F(x_1, x_2; t_1, t_2),


and, more generally, if m < n then

F(x_1, …, x_m; t_1, …, t_m) = lim_{x_{m+1} → ∞, …, x_n → ∞} F(x_1, …, x_n; t_1, …, t_n).

If the n-Order Probability Density exists, then the m-Order Probability Density also exists for m < n, and the last formula takes the form

f(x_1, …, x_m; t_1, …, t_m) = ∫_{ℝ^{n−m}} f(x_1, …, x_n; t_1, …, t_n) dx_{m+1} … dx_n.

Moreover, a permutation of the arguments in an n-order distribution function (or probability density, if it makes sense) does not change the function itself, i.e. if σ is an n-permutation then

F(x_{σ(1)}, …, x_{σ(n)}; t_{σ(1)}, …, t_{σ(n)}) = F(x_1, …, x_n; t_1, …, t_n).

The last two properties are actually known as Kolmogorov's compatibility conditions.

All the definitions which have been given until now are related to a single stochastic process ξ_t. If ξ_t is an ℝⁿ-valued stochastic process, so that n stochastic processes ξ_t^1, …, ξ_t^n are given, then again the joint distributions can be considered. Accordingly one can define the n-Order Joint Distribution, together with the n-Order Joint Probability Density.

Statistical Measures

When the finite dimensional distributions of a stochastic process up to a certain order are known, some of the statistical properties of the stochastic process can be characterized, for example by computing its moments (if they exist).

In particular, if the first order distribution function F(x; t) is known, then the n-Order Moment of ξ_t at every t ∈ T can be computed by

E{ξ_t^n} = ∫_ℝ x^n dF(x; t),

or, if there exists the first order probability density function,

E{ξ_t^n} = ∫_ℝ x^n f(x, t) dx.

The First Order Moment is called the Mean Value,

m(t) = E{ξ_t};

moreover, the n-th Order Central Moment of ξ_t is defined by

E{(ξ_t − m(t))^n} = ∫_ℝ (x − m(t))^n dF(x; t).

An obvious modification of this formula is necessary if one uses the density function. The second order central moment, called the Variance of the stochastic process ξ_t, is given by

σ²(t) = E{(ξ_t − m(t))²}.


Higher order statistics can be derived from the knowledge of the second order probability density function f(x_1, x_2; t_1, t_2). In fact the Joint Moments can be computed for the stochastic process at two different instants of time, t_1 and t_2:

R_{nm}(t_1, t_2) = E{ξ_{t_1}^n ξ_{t_2}^m} = ∫_ℝ ∫_ℝ x_1^n x_2^m f(x_1, x_2; t_1, t_2) dx_1 dx_2.

This last formula gives the expression of the elements R_{nm}(t_1, t_2) of the matrix R(t_1, t_2). In particular the element R_{11}(t_1, t_2) is called the Autocorrelation Function.

Analogously to the first order statistics, the Central Joint Moments can be defined as follows:

C_{nm}(t_1, t_2) = E{(ξ_{t_1} − m(t_1))^n (ξ_{t_2} − m(t_2))^m}.

Again the matrix C(t_1, t_2) can then be computed, and its element C_{11}(t_1, t_2) is called the Autocovariance Function. In this case, i.e. for n = m = 1, we simply write C(t_1, t_2). Then one can verify that

C(t_1, t_2) = R_{11}(t_1, t_2) − m(t_1) m(t_2).

A measure of the correlation between the events at t_1 and t_2 is given by the Autocorrelation Coefficient Function ρ(t_1, t_2), which is defined by

ρ(t_1, t_2) = C(t_1, t_2) / (σ(t_1) σ(t_2)).
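These second order statistics are easy to estimate by Monte Carlo. The sketch below uses the toy process ξ_t = A cos t with a single random amplitude A (a choice made purely for illustration, not taken from the text). Its two time sections are both multiples of A, so the sample statistics should satisfy C(t_1, t_2) = R_{11}(t_1, t_2) − m(t_1) m(t_2), with a correlation coefficient of ±1:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(1.0, 1.0, size=200_000)    # random amplitude, one draw per sample
t1, t2 = 0.4, 2.0
x1, x2 = A * np.cos(t1), A * np.cos(t2)   # samples of xi(t1) and xi(t2)

m1, m2 = x1.mean(), x2.mean()
C12 = ((x1 - m1) * (x2 - m2)).mean()      # autocovariance C(t1, t2)
R12 = (x1 * x2).mean()                    # autocorrelation R11(t1, t2)
rho = C12 / (x1.std() * x2.std())         # autocorrelation coefficient

# The two sections are proportional, so |rho| = 1 (here rho = -1, since
# cos(t1) and cos(t2) have opposite signs), and C = R11 - m(t1) m(t2).
print(C12, R12 - m1 * m2, rho)
```

The identity C = R_{11} − m m holds exactly for the sample estimators; the value of ρ is ±1 because the randomness enters through a single factor.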


Until now we have defined the joint moments of autocorrelation statistics, in the sense that all operations have been made upon the same stochastic process ξ_t. If we consider two stochastic processes, say ξ_t and η_t, then cross-correlation statistics can be defined by the second-order distribution function F(x, y; t_1, t_2).

In particular, it is immediate to define the Cross Joint Moments

R_{nm}^{ξη}(t_1, t_2) = E{ξ_{t_1}^n η_{t_2}^m},

where R_{11}^{ξη} is the Cross-Correlation Function.

Moreover, the Cross Central Joint Moments are

C_{nm}^{ξη}(t_1, t_2) = E{(ξ_{t_1} − m_ξ(t_1))^n (η_{t_2} − m_η(t_2))^m},

where C_{11}^{ξη} is the Cross-Covariance Function.

Obviously the cross-correlation statistics also take into account moments calculated for realizations of the stochastic processes ξ_t and η_t at the same instant of time t_1 = t_2 = t, i.e. R_{nm}^{ξη}(t, t) and C_{nm}^{ξη}(t, t).

It should be plain to the reader that higher order statistics, which involve n-order probability densities, can be defined in the same fashion. However, in practice, as we shall see in the examples given in what follows, the knowledge of the second order statistics is often sufficient to characterize a stochastic process.

Stationary Processes

A stochastic process ξ_t, which may be vector valued, defined on a complete probability space (Ω, F, P), is called a Stationary Process iff its distribution functions are invariant with respect to an arbitrary translation in time. In other words, this type of process is characterized by the property

F(x_1, …, x_n; t_1 + τ, …, t_n + τ) = F(x_1, …, x_n; t_1, …, t_n)

for all n ∈ ℕ, t_1, …, t_n ∈ T, τ ∈ T, x_1, …, x_n ∈ ℝ.

It is assumed that T is a translation invariant subset of ℝ. This means that the statistics of a stationary process depend only on the time differences between the instants considered, not on the instants themselves.

Another, different, notion of a stationary process, which is generally called Stationarity in the Wide Sense, can be stated in L². A stochastic process ξ_t is called stationary in the wide sense iff its second moments are finite,

E{ξ_t} = m = constant,

and

C(t_1, t_2) = C(t_2 − t_1)

for some function C(s).

One can show that a stationary process with finite second moments is stationary in the wide sense. On the other hand, there is a class of stochastic processes, the so-called Gaussian Processes, for which these two notions are equivalent.
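A discrete-time illustration of wide-sense stationarity is the Gaussian AR(1) recursion (all parameter values below are arbitrary): started from its stationary law, the process has constant mean and a covariance that depends only on the lag, E{x_k x_{k+j}} = a^j / (1 − a²):

```python
import numpy as np

rng = np.random.default_rng(1)
a, n, n_paths = 0.8, 400, 20_000
var_st = 1.0 / (1.0 - a**2)            # stationary variance of the AR(1)

# Start every path from the stationary law, then iterate x_{k+1} = a x_k + e_k.
x = np.empty((n_paths, n))
x[:, 0] = rng.normal(0.0, np.sqrt(var_st), n_paths)
for k in range(n - 1):
    x[:, k + 1] = a * x[:, k] + rng.normal(0.0, 1.0, n_paths)

# Mean is constant (zero) and the covariance depends only on the lag j:
# C(k, k + j) = a**j * var_st for every k.
cov = lambda i, j: (x[:, i] * x[:, j]).mean()
print(x[:, 100].mean(), cov(50, 55), cov(200, 205), a**5 * var_st)
```

The two lag-5 covariance estimates, taken at well-separated base times, agree with each other and with a⁵/(1 − a²), which is exactly the lag dependence the definition requires.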

Starting from the autocorrelation function it is possible to define a statistical functional which plays an important role in the study of stochastic processes, namely the Power Spectral Density S(λ). This quantity is defined as the inverse Fourier transform of the autocorrelation function,

S(λ) = (1/2π) ∫_{−∞}^{∞} e^{iλτ} C(τ) dτ.


Conversely, we have

C(τ) = ∫_{−∞}^{∞} e^{−iλτ} S(λ) dλ.

These last two relationships are known as the Wiener-Khintchine formulae. Moreover, from the fact that C(τ) is an even function of τ, the Wiener-Khintchine formulae reduce to

S(λ) = (1/π) ∫_0^∞ cos(λτ) C(τ) dτ

and

C(τ) = 2 ∫_0^∞ cos(λτ) S(λ) dλ.

Consequently, for τ = 0, we have

C(0) = σ² = 2 ∫_0^∞ S(λ) dλ.
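The reduced Wiener-Khintchine formula can be checked numerically on a standard example. Assuming C(τ) = e^{−|τ|} (a common textbook autocovariance, chosen here for illustration only), the cosine transform should give S(λ) = 1/(π(1 + λ²)):

```python
import numpy as np

# Autocovariance C(tau) = exp(-|tau|); its power spectral density should be
# S(lambda) = 1 / (pi * (1 + lambda^2)).
tau = np.linspace(0.0, 60.0, 600_001)   # 60 stands in for +infinity
C = np.exp(-tau)
h = tau[1] - tau[0]

def S(lam):
    """Trapezoidal approximation of (1/pi) * integral of cos(lam*tau)*C(tau)."""
    vals = np.cos(lam * tau) * C
    return (vals.sum() - 0.5 * (vals[0] + vals[-1])) * h / np.pi

for lam in (0.0, 0.5, 2.0):
    print(S(lam), 1.0 / (np.pi * (1.0 + lam**2)))
```

With this C, one also recovers C(0) = 2∫₀^∞ S(λ) dλ = 1, consistent with the formula above.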

An important property of stationary processes is Ergodicity. Let {ξ_t}_{t∈T}, with T ⊂ ℝ, be a stationary process and let g : ℝ → ℝ be a given function. Then we can define the pathwise time averages as

⟨g(ξ)⟩(ω) = lim_{Θ→∞} (1/Θ) ∫_0^Θ g(ξ(t, ω)) dt.

Therefore, a process ξ_t is called an Ergodic Process iff for every suitable function g the pathwise time averages are constant functions of ω ∈ Ω and

⟨g(ξ)⟩ = E{g(ξ_t)},


or, in other words, if the time average of the process equals its mean value for every time t. One should point out that there are other, more general definitions of ergodicity, of which our definition is a consequence (by means of ergodic theorems of different types). For practical purposes, the definition above is sufficient.
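Ergodicity can be illustrated by comparing time averages along one long trajectory with the corresponding ensemble values. The sketch below uses a stationary Gaussian AR(1) recursion, which is ergodic (parameter values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
a, n = 0.9, 200_000

# One long trajectory of a stationary, ergodic Gaussian AR(1) process.
eps = rng.normal(0.0, 1.0, n)
x = np.empty(n)
x[0] = 0.0
for k in range(n - 1):
    x[k + 1] = a * x[k] + eps[k]

# Pathwise time averages approach the ensemble values:
# for g(x) = x,   E{x_t}   = 0;
# for g(x) = x^2, E{x_t^2} = 1 / (1 - a^2).
print(x.mean(), (x**2).mean(), 1.0 / (1.0 - a**2))
```

A single path suffices here precisely because the process is ergodic; for a non-ergodic process (such as ξ_t = A cos t with random A) the time average would remain a non-constant function of ω.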

Examples of Stochastic Processes

In general a stochastic process is completely determined by its finite order distribution functions. By this we mean that all its statistical measures can be calculated by using only these functions. Nevertheless, in many cases, stochastic processes are well characterized by lower order probability densities. Some examples of such processes, which are often met in applications, are given here.

Processes with Independent Increments

A family {X_α} of random variables is called independent iff for any finite subset α_1, …, α_k of indexes and any Borel sets A_i ⊂ ℝ,

P({ω ∈ Ω : X_{α_i}(ω) ∈ A_i, i = 1, …, k}) = ∏_{i=1}^{k} P({ω ∈ Ω : X_{α_i}(ω) ∈ A_i}).

This is equivalent to the following condition:

F(x_1, …, x_k; α_1, …, α_k) = ∏_{i=1}^{k} F(x_i; α_i)   for all x_i ∈ ℝ,

where F(x_1, …, x_k; α_1, …, α_k) is the joint distribution function of the k variables X_{α_1}, …, X_{α_k}.

A stochastic process ξ_t is said to have Independent Increments iff for any finite number of elements t_1 ≤ t_2 ≤ … ≤ t_k of T the random variables

ξ(t_2) − ξ(t_1), ξ(t_3) − ξ(t_2), …, ξ(t_k) − ξ(t_{k−1})

are independent.

Markov Processes

Classically, a Markov process is a process {ξ_t}_{t≥0} such that for each t the behaviour of ξ_s for s > t depends only on ξ_t and is independent of ξ_r for r < t. This rough picture has the following more rigorous version.

Assume that there exists a family P(s, x, t, A) which satisfies the conditions:

i) P(s, x, t, ·) is a probability measure on X, for all s ≤ t ∈ T and x ∈ X, where X is a metric space, usually ℝ or ℝⁿ;

ii) P(s, ·, t, A) is measurable for all s ≤ t ∈ T and all Borel sets A ⊂ X;

iii) P(s, x, s, ·) = δ_x for all s ∈ T, x ∈ X.

We say that a stochastic process ξ_t is a Markov Process iff there exists a function P(s, x, t, A) which satisfies conditions i)-iii) and, in addition, for any pair s ≤ t ∈ T,

P(ξ_t ∈ A | ξ_s = x) = P(s, x, t, A),

where on the left hand side one has the so-called Conditional Probability.

The last condition can also be expressed in the equivalent manner

P(ξ_t ∈ A | ξ_s) = P(s, ξ_s, t, A)   for all s ≤ t and A, a.e. in Ω.

This last equality, in turn, has the following meaning: for any Borel set B in ℝ we have

P({ξ_t ∈ A, ξ_s ∈ B}) = ∫_{ξ_s^{−1}(B)} P(s, ξ_s(ω), t, A) dP(ω).


Gaussian Process

A stochastic process is called Gaussian if the random vector (ξ_{t_1}, …, ξ_{t_n}) is Gaussian for any finite set of t_1, …, t_n ∈ T. This means that there exist n-order density functions which are given by the following formula:

f(x_1, …, x_n; t_1, …, t_n) = ((2π)^n det B)^{−1/2} exp(−½ ⟨B^{−1}(x − m), x − m⟩),

where x = (x_1, …, x_n), m = (m(t_1), …, m(t_n)) and B is an n × n matrix.

One can compute

E{ξ_{t_i}} = m(t_i)

and

E{(ξ_{t_i} − m(t_i))(ξ_{t_j} − m(t_j))} = B_{ij}.

Therefore m(t_i) is the mean value of ξ_{t_i} and B is the correlation matrix of ξ_t. It follows that a Gaussian process is completely characterized by its mean value m(t) and by its autocorrelation function.

Wiener Process

A Wiener Process is a stochastic process {w(t)}_{t≥0} which is Gaussian and, in addition, satisfies the following properties:

i) w(0) = 0,

ii) m(t) = E(w(t)) = 0 for all t ≥ 0,

iii) R_{11}(s, t) = E(w(s) w(t)) = D s ∧ t = D min{s, t},

where D > 0 is a given constant. When D = 1, w(t) is called a Standard Wiener Process. Generally, the word Standard is not even mentioned.

There is an equivalent definition of a Wiener process, which we give for completeness. A stochastic process w(t) is called a Wiener process iff it satisfies conditions i) and ii) above, has independent increments, and for each t ≥ 0, w(t) is a Gaussian random variable with zero mean and variance Dt.
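These defining properties can be checked on simulated paths built from independent Gaussian increments (the step size, horizon and path count below are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(3)
D, dt, n_steps, n_paths = 2.0, 0.001, 1000, 10_000

# Wiener paths from independent increments w(t+dt) - w(t) ~ N(0, D dt), w(0) = 0.
dw = rng.normal(0.0, np.sqrt(D * dt), size=(n_paths, n_steps))
w = np.cumsum(dw, axis=1)              # samples of w at t = dt, 2 dt, ..., 1.0

s_idx, t_idx = 399, 799                # s = 0.4, t = 0.8
mean_w1 = w[:, -1].mean()              # ~ 0:       zero mean, property ii)
var_w1 = (w[:, -1]**2).mean()          # ~ D * 1.0: variance D t
corr_st = (w[:, s_idx] * w[:, t_idx]).mean()  # ~ D min(s, t) = 0.8
print(mean_w1, var_w1, corr_st)
```

The estimate of E(w(s) w(t)) reproduces D min{s, t}: the shared increments up to time s contribute, while the later increments are independent and average out.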

Rice Noise

The Rice Noise is a stochastic process, characterized by a probability density function of an order greater than two, which can be written in the form

ξ_t(ω) = Σ_{j=1}^{n} a_j(ω) cos(ν_j(ω) t + φ_j(ω)),

where a_j, ν_j and φ_j are 3n given random variables.

It is worth remarking that a Rice process is characterized by the 3n-Order Joint Distribution Function of the random variables a_j, ν_j and φ_j.
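The Rice form is straightforward to sample once the laws of a_j, ν_j and φ_j are fixed. In the sketch below the distributions are chosen arbitrarily for illustration, with independent phases uniform on [0, 2π]; in that case the mean of ξ_t is 0 and its variance at any fixed t is Σ_j E{a_j²}/2:

```python
import numpy as np

rng = np.random.default_rng(4)
n_terms, n_samples = 3, 100_000

# One Monte Carlo draw of the 3n random variables a_j, nu_j, phi_j per sample.
a = rng.normal(1.0, 0.2, size=(n_samples, n_terms))            # amplitudes
nu = rng.uniform(0.5, 2.0, size=(n_samples, n_terms))          # frequencies
phi = rng.uniform(0.0, 2.0 * np.pi, size=(n_samples, n_terms)) # phases

def xi(t):
    """Rice noise xi_t(omega) = sum_j a_j cos(nu_j t + phi_j)."""
    return (a * np.cos(nu * t + phi)).sum(axis=1)

# With independent uniform phases: E{xi_t} = 0 and Var{xi_t} = sum_j E{a_j^2}/2.
var_theory = n_terms * (1.0**2 + 0.2**2) / 2.0
v = xi(1.7)
print(v.mean(), v.var(), var_theory)
```

The variance formula follows from E{cos²(ν t + φ)} = ½ when φ is uniform and independent of a and ν.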

Random Fields

This book is mainly devoted to partial differential equations where the unknown functions are stochastic processes which depend both upon the time and upon a spatial variable x ∈ ℝⁿ, i.e. u(t, x, ω).

In the literature (see, for instance, the book [3]), a stochastic process which depends only on the position x ∈ ℝⁿ and not on the time, i.e. u(x, ω), is called a Random Field. If the random field also depends on the time, then one generally speaks of a Time-Dependent Random Field. A few peculiarities of random fields will be summarized in this Appendix. First of all, if a finite set of times t_1, …, t_n is chosen, the corresponding functions u(t_1, x, ·), …, u(t_n, x, ·) form a finite set of random fields. In the same way, if we select a finite set of positions x_1, …, x_n, then the functions u(t, x_1, ·), …, u(t, x_n, ·) form a finite family of stochastic processes. Conversely, a selection of finite sets of positions and times transforms the random field into a sequence u(t_1, x_1), …, u(t_n, x_n) of random variables.

Accordingly, the statistical measures can be computed either with respect to the time, as in the standard stochastic processes, or with respect to the position. In other words, one can study the correlation between the two random variables u(t_1, x_1, ω)


and u(t_2, x_1, ω) at different times but at the same space position, or, vice versa, compute the correlations of u(t_1, x_1, ω) and u(t_1, x_2, ω) at the same time instant t_1 for two different space positions. As a consequence, we can define two families of n-order probability density functions, and compute the statistical moments of any order with respect to the time or the position, respectively, as shown above.

Ito's Integral

The relevance of a Wiener process consists mainly in its connection with the analysis of stochastic differential equations. The basic notion associated with a Wiener process is the so-called Ito Integral. We will give here its definition and then list its basic properties. Let w(t) be a d-dimensional Wiener process and F_t be its natural filtration, i.e. F_t = σ(w_s : s ≤ t); that is, F_t is the smallest σ-algebra with respect to which all the random variables w_s, s ≤ t, are measurable. In other words, it is a σ-algebra of events observable up to time t. A stochastic process ξ_t is said to be adapted if for each t ∈ T the function

(T ∩ (−∞, t]) × Ω ∋ (s, ω) → ξ(s, ω) ∈ ℝ

is measurable.

The Ito Integral is defined only for adapted processes. First, it is defined for simple processes, i.e. for processes of the form

ξ(t) = Σ_{i=0}^{k−1} ξ_i 1_{[t_i, t_{i+1})}(t),

where 0 = t_0 < t_1 < … < t_k = T and each ξ_i is an ℝ^{n×d}-valued, F_{t_i}-measurable random variable with finite second moment. For such a process we simply put

I(ξ) = Σ_{i=0}^{k−1} ⟨ξ_i, Δ_i w⟩,

where Δ_i w = w(t_{i+1}) − w(t_i) and, for w ∈ ℝ^d and ξ ∈ ℝ^{n×d}, ⟨ξ, w⟩ is the ℝⁿ vector whose i-th coordinate is equal to Σ_j ξ_{ij} w_j. The properties of the Wiener process together with the fact that ξ(t) is adapted yield the equality

E|I(ξ)|² = E ∫_0^T |ξ(s)|² ds,

again only for a simple process ξ. But this is sufficient to define I(ξ) for any process ξ(t) which is a limit of simple processes in the norm

‖ξ‖² = E ∫_0^T |ξ(s)|² ds.

It can be proved that any progressively measurable process with finite norm has this property. Hence, there is a linear map I that maps the space M²(0, T; ℝ^{n×d}) of all ℝ^{n×d}-valued progressively measurable processes into L²(Ω; ℝⁿ). Finally we define the Ito integral by putting

∫_0^t ξ(s) dw(s) = I(1_{[0,t)} ξ).

In particular, we see that

∫_0^T ξ(s) dw(s) = I(ξ).

This procedure can be followed for any T > 0, so we will omit T in the sequel.

Now we list the basic properties of Ito's Integral:


1. Linearity: for all α, β ∈ ℝ and ξ, η ∈ M²(0, T),

I(αξ + βη) = α I(ξ) + β I(η).

2. Isometry property: for all ξ ∈ M²(0, T),

E|I(ξ)|² = E ∫_0^T |ξ(s)|² ds.

3. Ito's Formula: assume that ξ and η are progressively measurable processes, respectively ℝ^{n×d}- and ℝⁿ-valued, and z(t) is defined by

z(t) = z_0 + ∫_0^t ξ(s) dw(s) + ∫_0^t η(s) ds,

where the second integral is in the Lebesgue sense. We say that z(t) has a stochastic differential and write

dz(t) = ξ(t) dw(t) + η(t) dt.

Assume that F : ℝⁿ → ℝ is a function of class C². Then the stochastic process F(z(t)) has a stochastic differential and

dF(z(t)) = Σ_{j=1}^{d} Σ_{i=1}^{n} (∂F/∂x_i)(z(t)) ξ_{ij}(t) dw^j(t) + Σ_{i=1}^{n} (∂F/∂x_i)(z(t)) η_i(t) dt + ½ Σ_{i,l=1}^{n} (∂²F/∂x_i ∂x_l)(z(t)) (ξ(t) ξ(t)ᵀ)_{il} dt.   (A.1)

4. Doob-Burkholder inequality: for ξ ∈ M²(0, T) and r ∈ (1, ∞),

E(sup_{t≤T} |∫_0^t ξ(s) dw(s)|^r) ≤ (r/(r − 1))^r sup_{t≤T} E|∫_0^t ξ(s) dw(s)|^r.
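Both the isometry and Ito's formula can be seen numerically with left-point Riemann sums, which is exactly how the integral is built from simple processes. The sketch below (one-dimensional standard Wiener process, integrand ξ(t) = w(t), discretization parameters arbitrary) checks E|I(ξ)|² = ∫₀^T s ds = T²/2 and the formula d(w²) = 2w dw + dt, i.e. w(T)² = 2∫₀^T w dw + T:

```python
import numpy as np

rng = np.random.default_rng(5)
n_steps, n_paths, T = 1000, 5000, 1.0
dt = T / n_steps

dw = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
w = np.cumsum(dw, axis=1)
w_left = np.hstack([np.zeros((n_paths, 1)), w[:, :-1]])  # w(t_i): left endpoints

# Ito integral of the adapted integrand xi(t) = w(t): left-point sums.
I = (w_left * dw).sum(axis=1)

# Isometry: E|I|^2 = E int_0^T w(s)^2 ds = int_0^T s ds = T^2 / 2.
isometry_lhs = (I**2).mean()
print(isometry_lhs, T**2 / 2)

# Ito's formula for F(x) = x^2: w(T)^2 = 2 I + T, up to discretization error.
ito_err = np.abs(w[:, -1]**2 - (2.0 * I + T)).max()
print(ito_err)
```

The left-endpoint evaluation matters: evaluating the integrand at the right endpoints instead would shift the answer by T, which is precisely the second-order correction term appearing in (A.1).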

Stochastic Differential Equations

A very useful and important notion is that of Stochastic Differential Equations. Let us assume that, as before, w(t) is a d-dimensional Wiener process on a probability space (Ω, F, P). Consider two Lipschitz functions

b : ℝⁿ → ℝⁿ,   σ : ℝⁿ → ℝ^{n×d},

where n ∈ ℕ; we assume global Lipschitz properties of these functions for the simplicity of exposition. We are interested in solving the following ordinary stochastic differential equation (but one might consider a system of ordinary stochastic differential equations)

dξ(t) = b(ξ(t)) dt + σ(ξ(t)) dw(t),   (A.2)

subjected to the initial condition

ξ(0) = ξ_0,

where ξ_0 ∈ L²(Ω; ℝⁿ) is F_0-measurable. Very often we will simply take ξ_0 = x for some x ∈ ℝⁿ. It is known that under the above assumptions a unique global solution to the initial value problem exists. The proof of this theorem can be obtained by several different methods, one of them being the fixed point method. We fix T > 0 and


observe that a process ξ ∈ M²(0, T) is a solution to the initial value problem iff ξ is a fixed point of the mapping Φ, where

Φ(ξ)(t) = ξ_0 + ∫_0^t b(ξ(s)) ds + ∫_0^t σ(ξ(s)) dw(s).

Next, we observe that if Φ(ξ) = ξ then, in view of Doob's inequality, ξ belongs to the space X_T of continuous square-integrable processes. Therefore we may look for a fixed point of Φ, considered as a map on X_T. Endowing X_T with a suitable norm (equivalent to the original one, sometimes called Bielecki's norm), i.e.

‖ξ‖_λ = sup_{t∈[0,T]} e^{−λt} (E|ξ(t)|²)^{1/2},

we prove that for λ sufficiently large Φ is a strict contraction in X_T, and so by the Banach fixed point theorem it has a unique fixed point.
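Equation (A.2) is also easy to approximate numerically. The standard Euler-Maruyama scheme (a textbook method, treated elsewhere in this book but not in this appendix) replaces dt and dw(t) by finite increments; below it is sketched for the purely illustrative scalar choice b(x) = −x, σ(x) = 1, whose solution approaches the stationary law N(0, 1/2):

```python
import numpy as np

rng = np.random.default_rng(6)

# Drift and (constant) diffusion for a scalar Ornstein-Uhlenbeck example,
# b(x) = -x and sigma(x) = 1, chosen here purely for illustration.
b = lambda x: -x
sigma = lambda x: np.ones_like(x)

dt, n_steps, n_paths = 0.01, 1000, 20_000
x = np.zeros(n_paths)                       # xi_0 = 0 for every path
for _ in range(n_steps):
    dw = rng.normal(0.0, np.sqrt(dt), n_paths)
    x = x + b(x) * dt + sigma(x) * dw       # Euler-Maruyama step for (A.2)

# At t = 10 the solution is close to its stationary law N(0, 1/2).
print(x.mean(), x.var())
```

Global Lipschitz drift and diffusion, as assumed above, are also what guarantees convergence of this scheme as dt → 0.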

Ito's formula provides a very efficient way of deriving the Fokker-Planck Equation. With this in mind, assume that the unique solution ξ(t) to the stochastic differential equation exists. Assume that the functions b and σ are sufficiently regular, and finally assume that the density function of the stochastic process ξ(t, x), the unique solution to the stochastic differential equation (A.2) with initial condition ξ_0 = x, exists for t > 0. Let u(t, x) be the function defined by

u(t, x) = E{φ(ξ(t, x))},


where φ is a C² function with compact support. By Ito's formula, the process φ(ξ(t, x)) satisfies

dφ(ξ(t)) = Σ_{i=1}^{n} b_i(ξ(t)) (∂φ/∂x_i)(ξ(t)) dt + Σ_{i=1}^{n} Σ_{j=1}^{d} σ_{ij}(ξ(t)) (∂φ/∂x_i)(ξ(t)) dw^j(t) + ½ Σ_{i,j=1}^{n} a_{ij}(ξ(t)) (∂²φ/∂x_i ∂x_j)(ξ(t)) dt.

Taking the mean value of this equality (the stochastic integral has zero mean) we find that u(t, x) satisfies

du(t, x) = Au(t, x) dt,

u(0, x) = φ(x),

where A is a second order differential operator given by

A = Σ_{i=1}^{n} b_i(x) ∂/∂x_i + ½ Σ_{i,j=1}^{n} a_{ij}(x) ∂²/∂x_i ∂x_j,

where a_{ij}(x) = Σ_k σ_{ik}(x) σ_{jk}(x). This allows the following argument. If p(t, y) is the probability density of ξ(t, x), then

d/dt ∫ φ(y) p(t, y) dy = d/dt E{φ(ξ(t, x))} = d/dt u(t, x) = Au(t, x).

Taking into account that Au(t, x) is equal to the solution of

∂u/∂t = Au


with initial condition Aφ, this yields

Au(t, x) = ∫ Aφ(y) p(t, y) dy = ∫ { Σ_{i=1}^{n} b_i(y) (∂φ/∂y_i)(y) + ½ Σ_{i,j=1}^{n} a_{ij}(y) (∂²φ/∂y_i ∂y_j)(y) } p(t, y) dy

= ∫ φ(y) { −Σ_{i=1}^{n} ∂/∂y_i (b_i(y) p(t, y)) + ½ Σ_{i,j=1}^{n} ∂²/∂y_i ∂y_j (a_{ij}(y) p(t, y)) } dy,

where in the last equality the formula of integration by parts has been used.

We conclude the proof by observing that, since φ can be taken arbitrarily, p(t, y) satisfies

∂p/∂t = −Σ_{i=1}^{n} ∂/∂y_i (b_i(y) p) + ½ Σ_{i,j=1}^{n} ∂²/∂y_i ∂y_j (a_{ij}(y) p),

which is the Fokker-Planck Equation.
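For the purely illustrative scalar choice b(y) = −y, a(y) = 1 (not taken from the text), the Fokker-Planck equation ∂p/∂t = ∂(y p)/∂y + ½ ∂²p/∂y² has the stationary solution p(y) = e^{−y²}/√π, i.e. the N(0, 1/2) density, and a histogram of simulated solutions of (A.2) reproduces it:

```python
import numpy as np

rng = np.random.default_rng(7)

# Euler-Maruyama simulation of d(xi) = -xi dt + dw, run long enough that the
# law of xi(t) is close to the stationary Fokker-Planck solution.
dt, n_steps, n_paths = 0.01, 1000, 100_000
x = np.zeros(n_paths)
for _ in range(n_steps):
    x = x - x * dt + rng.normal(0.0, np.sqrt(dt), n_paths)

# Compare the empirical density with p(y) = exp(-y^2) / sqrt(pi).
hist, edges = np.histogram(x, bins=40, range=(-2.0, 2.0), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
p = np.exp(-centers**2) / np.sqrt(np.pi)
dev = np.abs(hist - p).max()
print(dev)    # histogram tracks the stationary density
```

One can check directly that this p makes the right hand side of the Fokker-Planck equation vanish, since ∂(y p)/∂y = p − 2y² p and ½ ∂²p/∂y² = −p + 2y² p.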


References to Appendix

1. Soong T.T., Random Differential Equations in Science and Engineering, Academic Press, New York (1973).

2. Papoulis A., Probability, Random Variables and Stochastic Processes, McGraw-Hill, New York (1965).

3. Ivanov A.V. and Leonenko N.N., Statistical Analysis of Random Fields, Mathematics and Its Applications (Soviet Series), Ed. M. Hazewinkel, Kluwer Academic Publishers, Dordrecht (1989).


AUTHORS INDEX

Abramowitz M. 31, 60, 151, 153, 165.

Adomian G. 2, 9, 11, 20, 22, 34, 60, 68, 82, 92, 98, 150, 154, 163, 164, 169, 188.

Akilov G. 3, 22, 50, 60.

Arnold L. 2, 20, 22, 34, 60, 101, 122, 130.

Ash R. 66, 98.

Baker G. A. 43, 44, 61.

Beck J. 179, 189.

Becus G. 78, 98, 181, 188.

Bellomo N. 1, 2, 11, 12, 20, 22, 34, 57, 60, 61, 62, 72, 78, 82, 84, 89, 98, 99,

102, 123, 127, 129, 130, 137, 138, 144, 145, 147, 155, 156, 165, 169, 184, 186,

188, 189.

Bellman R. 34, 60, 68, 77, 98, 169, 188.

Bensoussan A. 102, 108, 132.

Bernard P. 102, 122, 123, 131, 164, 165.

Bharucha Reid A. 35, 38, 61, 68, 98.

Blackwell B. 179, 189.

Bonzani I. 57, 62, 137, 147, 152, 163, 164, 165.

Brzezniak Z. 102, 108, 109, 111, 130.

Cabannes H. 14, 15, 16, 23.

Cannon J. 185, 186, 189.

Canuto C. 76, 99.

Capinski M. 102, 108, 109, 111, 130.

Carlomusto L. 156, 159, 165.

Cashef J. 169, 188.


Casti J. 34, 35, 59.

Chorin A. 106, 107, 133.

Colton D. 185, 189.

Conti R. 34, 39, 59.

Courant R. 3, 22, 61.

Da Prato G. 102, 108, 130.

De Blasi G. 102, 131.

de Socio L. 11, 22, 34, 39, 45, 60, 61, 72, 78, 84, 92, 98, 99, 156, 159, 165, 180,

184, 186, 189.

Du Chateau P. 185, 186, 189.

Elworthy K. 100, 131.

Evans J. 169, 188.

Ewing R. 185, 186, 189.

Fitzgibbon W. 169, 187.

FitzHugh R. 172, 173, 188.

Flandoli F. 61, 102, 108, 109, 111, 123, 126, 129, 130, 132.

Friedman A. 20, 23, 34, 60, 100, 131.

Gabetta E. 138, 147, 156, 164.

Gardner M. 66, 98.

Gatignol R. 14, 15, 23.

Graves Morris P. 43, 44, 61.

Gualtieri G. 39, 45, 61, 92, 99.

Hilbert D. 3, 22, 61.

Hille B. 172, 186.

Hodgkin A. 169, 170, 188.

Hussaini M. 76, 99.

Huxley A. 169, 170, 188.

Kampe de Feriet J. 181, 188.

Kantorovic L. 3, 22, 50, 60.

Kashef B. 34, 59.

Kazimierzik P. 101, 130.

Keener J. 172, 173, 181, 188, 189.


Kotulski Z. 101, 102, 130.

Kreener A. 102, 130.

Kustnezov P. 155, 165.

Ichikawa A. 102, 130.

Ito S. 102, 131.

Ivanov A. 206, 211.

Yakhono V. 185, 186, 188.

Lachowicz M. 39, 49, 50, 51, 61, 174, 189.

Lavrent'ev M. 185, 188.

Leonenko N. 206, 211.

Lions J. L. 6, 11, 22, 61, 102, 113, 132.

Lieberstein H. 169, 188.

Lobry C. 102, 130.

Malakian K. 154, 164.

Marsden J. 106, 107, 133.

McShane E. J. 20, 23, 34, 60, 67, 98, 101, 121, 130.

Mikhailov V. 3, 22.

Myjak J. 102, 131.

Monaco R. 34, 39, 49, 50, 51, 60, 61, 72, 78, 83, 98, 156, 158, 166, 174, 189.

Nelson J. 101, 130.

Padgett W. 68, 98.

Payne L. 185, 186, 187, 189.

Papageorgiu N. 102, 131.

Papoulis A. 12, 14, 22, 191, 211.

Pardoux E. 102, 108, 131.

Pianese A. 154, 159, 165.

Pistone G. 137, 144, 147, 164.

Preziosi L. 72, 78, 92, 98, 99, 156, 158, 165, 182, 184, 186, 189.

Pugachev V. 150, 164, 165.

Quarteroni A. 76, 99.

Repaci A. 182, 189.

Reznitskaya K. 183, 184, 188.


Riganti R. 1, 2, 12, 20, 22, 34, 60, 98, 138, 150, 152, 154, 164, 165.

Rybinski L. 102, 122, 131.

Roozen H. 102, 120, 131.

Rundell W. 185, 186, 188, 189.

Sambandham M. 35, 38, 61.

Sansone G. 34, 39, 60.

Saraiyan D. 89, 99.

Satofuka A. 39, 45, 46, 61.

Shannon C. 142, 164.

Schuss Z. 102, 131.

Sinitzin T. 150, 165.

Smale S. 6, 23.

Smart D. 12, 23.

Sobczyk K. 20, 23, 57, 61, 62, 99, 101, 102, 121, 130.

Soize C. 102, 121, 123, 131, 164, 165.

Soong T. 20, 23, 34, 60, 68, 98, 101, 121, 130, 138, 144, 146, 164, 191, 211.

St. Claire G. 179, 189.

Stegun I. 31, 60, 151, 153, 165.

Steube K. 185, 186, 189.

Stratonovich 1. 144, 154, 165.

Sussman H. 102, 111, 131.

Temam R. 6, 23, 51, 102, 108, 130.

Teppati G. 72, 78, 98, 184, 186, 189.

Tichonov 1. 144, 154, 165.

Tsokos C. 68, 98.

Vacca M. T. 169, 188.

Weaver W. 142, 164.

Wehr A. 143, 164.

Zang T. 76, 99.

Zavattaro M. G. 135, 164.


SUBJECT INDEX

Autocorrelation 12, 13, 196.

Autocovariance 12-14.

Autonomous equations 6.

Banach spaces 10.

Boundary conditions 2-6, 27, 69-70.

Boltzmann equation 14-16, 26, 28, 48-53, 156-158.

Bernstein polynomials 36.

Brownian motion 9, 101, 103-106, 108.

Burgers' equation 29,52-56, 107-108.

Chebyshev collocation 37.

Classification of PDE 3-9.

Classification of models 3-7.

Coloured noise 110-114.

Conditional probability 200.

Correlation function 12-15.

Covariance function 13-14, 197.

Distribution function 192-194.

Doob-Burkholder inequality 207.

Entropy function 140-146, 156-158.

Ergodic process 199.

Euler (stochastic) integration 120-121.

Evolution of the Probability Density 143-147.

Faedo-Galerkin approximation 111.

Fixed point theorems 11, 50-54, 55.


Fokker-Planck-Kolmogorov equation 121-125, 150, 163, 209-210.

Fourier random expansion 42.

Function spaces 9-10.

Gaussian process 202.

Gronwall's Lemma 37.

Hermite polynomials 152-154.

Karhunen-Loève expansion 66.

Kato-Trotter's theorem 113.

Kolmogorov's compatibility conditions 194.

Kronecker delta 31.

Ill-posed problems 6, 174-181.

Initial value problem 4,25,136-141, 174.

Initial-boundary value problem in the half-space 4, 32, 75-77, 84-88.

Initial-boundary value problem 5, 25, 138, 174.

Integro-differential equations 173-174.

Interpolation techniques 38-48, 149-155.

Inverse problems 174-181.

Ito's differential 100.

Ito's differential equation 207.

Ito's integral 204-207.

Lagrange polynomials 30-31, 71-72.

Laguerre polynomials 151.

Lipschitz condition 34, 38.

Markov process 201.

Mean value 12, 17.

Moments 12-14, 34-35, 195-197.

Moment approximation 149-154, 156-160.

Moving boundary problems 91-95, 181-185.

Navier-Stokes (stochastic) equations 107-108.

Nonautonomous equations 6.

Pade's approximants 43-44.

Periodic polynomials 41-43.


Picard iterative scheme 90.

Power spectral density 189

Probability density 135-154, 192-194.

Probability space 191.

Processes with independent increments 200.

Power spectral density 198.

Random fields 203-204.

Random heat equation 68-70, 77-90.

Random variables 191-192.

Rice noise 78, 203.

Runge-Kutta (stochastic) integration 121.

Semilinear equations 6-7.

Separable stochastic processes 8, 66-67.

Sobolev imbedding theorem 55.

Solution to the initial-boundary value problem 9-10.

Splines 39-41.

Stationary Processes 197-200.

Statistical measures 12-13, 195-197.

Stochastic calculus 11.

Stochastic differential equations 207-210.

Stochastic operators 67.

Weakly nonlinear equations 7.

Well-posed problems 6.

Well-specified problems 6.

White noise 106, 110-114, 116.

Wiener-Khintchine formulae 199.

Wiener process 106, 110-114, 116, 202.