
Lecture Notes on Nonparametric Spectral Estimation

Erik Axell
June 29, 2011

I. INTRODUCTION

In this paper, we discuss the classical nonparametric methods for spectral estimation. In particular, we analyze the periodogram, correlogram, averaged periodogram and Blackman-Tukey spectral estimators. The content and notation follow mainly [1, Ch. 2] and in some parts [2]. A set of slides that presents a major part of the material can be found at [3, Lectures 2-3].

The autocovariance function for a zero-mean (wide-sense) stationary process y(t) is defined as

r(k) ≜ E{y(t)y∗(t − k)}.

It is easy to see from the definition that the autocovariance function has the following property:

r(k) = r∗(−k). (1)

First definition of the power spectral density (PSD):

φ(ω) ≜ Σ_{k=−∞}^{∞} r(k)e^{−jωk}. (2)

Second definition of PSD:

φ(ω) ≜ lim_{N→∞} E{ (1/N) |Σ_{t=1}^{N} y(t)e^{−jωt}|² }. (3)

Definitions (2) and (3) are equivalent under the assumption that the autocovariance decays sufficiently rapidly, so that [1]

lim_{N→∞} (1/N) Σ_{k=−N}^{N} |k| |r(k)| = 0. (4)

The problem of interest is to determine an estimate φ̂(ω) of the power spectral density from a finite sequence of observations y(1), . . . , y(N).
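As a concrete illustration of the two definitions, the following minimal Python/NumPy sketch compares (2), evaluated with the closed-form autocovariance of an MA(1) process, against a Monte Carlo approximation of (3). The MA(1) model and all parameter and variable names are illustrative choices, not part of the material above.

import numpy as np

# Illustration only: MA(1) process y(t) = e(t) + theta*e(t-1) with
# r(0) = sigma2*(1 + theta^2), r(+-1) = sigma2*theta, r(k) = 0 otherwise.
rng = np.random.default_rng(0)
theta, sigma2 = 0.8, 1.0
omega = np.linspace(-np.pi, np.pi, 512)

# Definition (2): phi(w) = sum_k r(k) e^{-jwk} (real, since r(k) = r*(-k))
phi_def2 = sigma2 * (1 + theta**2) + 2 * sigma2 * theta * np.cos(omega)

# Definition (3): average of (1/N)|sum_t y(t) e^{-jwt}|^2 over many realizations
N, n_mc = 1024, 200
t = np.arange(1, N + 1)
E = np.exp(-1j * np.outer(t, omega))            # N x len(omega) matrix of e^{-jwt}
acc = np.zeros_like(omega)
for _ in range(n_mc):
    e = np.sqrt(sigma2) * rng.standard_normal(N + 1)
    y = e[1:] + theta * e[:-1]
    acc += np.abs(y @ E) ** 2 / N
phi_def3 = acc / n_mc

print(np.max(np.abs(phi_def2 - phi_def3)))      # small for large N and many realizations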

II. PERIODOGRAM AND CORRELOGRAM METHODS

The periodogram spectral estimator relies on the definition (3), and is computed as

φ̂_p(ω) ≜ (1/N) |Σ_{t=1}^{N} y(t)e^{−jωt}|². (5)

The correlogram spectral estimator relies on the definition (2), and is computed as

φ̂_c(ω) ≜ Σ_{k=−(N−1)}^{N−1} r̂(k)e^{−jωk}, (6)


where r̂(k) denotes an estimate of the covariance r(k) obtained from the observations y(1), . . . , y(N). There are two standard ways to obtain the estimated covariance:

r̂(k) = (1/(N − k)) Σ_{t=k+1}^{N} y(t)y∗(t − k), 0 ≤ k ≤ N − 1, (7)

and

r̂(k) = (1/N) Σ_{t=k+1}^{N} y(t)y∗(t − k), 0 ≤ k ≤ N − 1. (8)

The autocovariance estimates for negative lags are constructed using (1), so that

r̂(−k) = r̂∗(k), 0 ≤ k ≤ N − 1.

The only difference between the two estimators (7) and (8) is the factor in front of the sum. Consider the expectation of the autocovariance estimate (8):

E{r̂(k)} = E{ (1/N) Σ_{t=k+1}^{N} y(t)y∗(t − k) } = (1/N) Σ_{t=k+1}^{N} E{y(t)y∗(t − k)} = ((N − k)/N) r(k), k ≥ 0,

E{r̂(−k)} = E{r̂∗(k)} = ((N − k)/N) r∗(k) = ((N − k)/N) r(−k), k ≥ 0. (9)

Hence, the autocovariance estimate (8) is biased. Similarly, it is easy to see that the autocovariance estimate (7) is unbiased for any N. Even so, the biased estimator (8) is the one most commonly used, for several reasons (cf. [1, pp. 24-25]). Hence, in the sequel we will consider the correlogram spectral estimator using the autocovariance estimate (8). It should be noted that the number of terms used to estimate the autocovariance decreases with increasing k. Therefore, the autocovariance estimates are very poor for large k; for example, the estimate at the extreme lag k = N − 1 is computed from one term only. A small code sketch of both covariance estimators and of the correlogram (6) is given below.
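The following is a minimal Python/NumPy sketch of (6)-(8); the function names and the use of NumPy are illustrative choices, and the code is only one possible realization of these formulas.

import numpy as np

def acov(y, biased=True):
    # Sample autocovariance r_hat(k), k = 0, ..., N-1: the biased estimate (8)
    # divides by N, the unbiased estimate (7) divides by N - k.
    y = np.asarray(y)
    N = len(y)
    r = np.array([np.sum(y[k:] * np.conj(y[:N - k])) for k in range(N)])
    return r / N if biased else r / (N - np.arange(N))

def correlogram(y, omega):
    # Correlogram (6) on an arbitrary frequency grid, using the biased estimate (8);
    # negative lags follow from property (1): r_hat(-k) = r_hat(k)^*.
    r = acov(y, biased=True)
    N = len(y)
    k = np.arange(-(N - 1), N)
    r_full = np.concatenate([np.conj(r[:0:-1]), r])
    return np.real(np.exp(-1j * np.outer(omega, k)) @ r_full)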

A. Equivalence of the Periodogram and the Correlogram

In the following, we will show that the periodogram (5) and the correlogram (6) are in fact equivalent. Consider the signal

x(t) = (1/√N) Σ_{k=1}^{N} y(k)e(t − k),

where the y(k) are fixed constants and e(t) is white noise with unit variance, i.e., E{e(t)e∗(s)} = δ_{t,s}. This can be seen as a convolution, and therefore x(t) is the output of a filter with transfer function

Y(ω) = (1/√N) Σ_{k=1}^{N} y(k)e^{−jωk}.

Since e(t) is white noise, its PSD is given by φ_e(ω) = 1, and

φ_x(ω) = |Y(ω)|² φ_e(ω) = (1/N) |Σ_{k=1}^{N} y(k)e^{−jωk}|² = φ̂_p(ω). (10)


A straightforward computation of the autocovariance function (for k ≥ 0) yields

r_x(k) = E{x(t)x∗(t − k)} = E{ ((1/√N) Σ_{p=1}^{N} y(p)e(t − p)) ((1/√N) Σ_{s=1}^{N} y(s)e(t − k − s))∗ }

 = (1/N) Σ_{p=1}^{N} Σ_{s=1}^{N} y(p)y∗(s) E{e(t − p)e∗(t − k − s)} = (1/N) Σ_{p=1}^{N} Σ_{s=1}^{N} y(p)y∗(s) δ_{p,k+s}

 = {substitute p = s + k} = (1/N) Σ_{p=k+1}^{N} y(p)y∗(p − k) = { r̂(k), k = 0, . . . , N − 1;  0, k ≥ N },

where r̂(k) is given by (8). Now, we obtain the PSD from (2) as

φ_x(ω) = Σ_{k=−∞}^{∞} r_x(k)e^{−jωk} = Σ_{k=−(N−1)}^{N−1} r̂(k)e^{−jωk} = φ̂_c(ω). (11)

Equations (10) and (11) show the equivalence of the periodogram and the correlogram methods.
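This equivalence is easy to verify numerically. Below is a small self-contained Python/NumPy check (arbitrary data and my own variable names) that the periodogram (5) and the correlogram (6) with the biased estimate (8) coincide on any frequency grid.

import numpy as np

rng = np.random.default_rng(1)
N = 64
y = rng.standard_normal(N) + 1j * rng.standard_normal(N)   # arbitrary complex data
omega = np.linspace(-np.pi, np.pi, 257)
t = np.arange(1, N + 1)

# Periodogram (5)
phi_p = np.abs(np.exp(-1j * np.outer(omega, t)) @ y) ** 2 / N

# Correlogram (6) with the biased covariance estimate (8)
r = np.array([np.sum(y[k:] * np.conj(y[:N - k])) for k in range(N)]) / N
k = np.arange(-(N - 1), N)
r_full = np.concatenate([np.conj(r[:0:-1]), r])             # r_hat(-k) = r_hat(k)^*
phi_c = np.real(np.exp(-1j * np.outer(omega, k)) @ r_full)

print(np.max(np.abs(phi_p - phi_c)))                        # zero up to rounding errors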

III. PROPERTIES OF THE PERIODOGRAM METHOD

In what follows, we will analyze some properties of the nonparametric spectral estimators.

A. Bias Analysis

Using the equivalence of the correlogram and the periodogram, we compute the expectation of the spectral estimate as

E{φ̂_p(ω)} = E{φ̂_c(ω)} = E{ Σ_{k=−(N−1)}^{N−1} r̂(k)e^{−jωk} } = Σ_{k=−(N−1)}^{N−1} E{r̂(k)} e^{−jωk}.

Using (9), we obtain

E{φ̂_p(ω)} = Σ_{k=−(N−1)}^{N−1} (1 − |k|/N) r(k)e^{−jωk}.

Define the triangular, or Bartlett, window

w_B(k) ≜ { 1 − |k|/N, 0 ≤ |k| ≤ N − 1;  0, elsewhere. }

Then, we can write the expectation as

E{φ̂_p(ω)} = Σ_{k=−∞}^{∞} w_B(k)r(k)e^{−jωk}.

The DFT of the product of two sequences is equal to the convolution of their respective DFTs (scaled by 1/(2π)). Hence,

E{φ̂_p(ω)} = (1/(2π)) ∫_{−π}^{π} φ(ψ) W_B(ω − ψ) dψ, (12)

where

W_B(ω) = (1/N) (sin(ωN/2) / sin(ω/2))²

is the DFT of the Bartlett window w_B(k); it is shown in Figure 1.
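A quick numerical cross-check of this closed form (a sketch assuming NumPy and Matplotlib, which are illustrative choices) compares it with a directly computed transform of w_B(k) and reproduces the dB plot of Figure 1:

import numpy as np
import matplotlib.pyplot as plt

N = 25
omega = np.linspace(-np.pi, np.pi, 2000)      # even count, so omega = 0 is avoided
k = np.arange(-(N - 1), N)

wB = 1 - np.abs(k) / N                        # Bartlett (triangular) lag window
WB_direct = np.real(np.exp(-1j * np.outer(omega, k)) @ wB)
WB_closed = (np.sin(omega * N / 2) / np.sin(omega / 2)) ** 2 / N
print(np.max(np.abs(WB_direct - WB_closed)))  # agreement to numerical precision

plt.plot(omega, 10 * np.log10(WB_closed / WB_closed.max()))
plt.xlabel("omega"); plt.ylabel("W_B(omega)/W_B(0) [dB]")
plt.title("Bartlett window (biased covariance estimator), N = 25")
plt.show()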


Fig. 1. Weighting functions for the standard biased (triangular window, W_B(ω)/W_B(0) in dB) and unbiased (rectangular window, W_R(ω)/W_R(0) in dB) covariance estimators, respectively, for N = 25.

Note that if we had instead used the autocovariance estimate (7), the corresponding weighting function would be a rectangular window,

w_R(k) ≜ { 1, 0 ≤ |k| ≤ N − 1;  0, elsewhere, }

since in that case E{r̂(k)} = r(k).

Consider the asymptotic expectation

lim_{N→∞} E{φ̂_p(ω)} = lim_{N→∞} Σ_{k=−(N−1)}^{N−1} (1 − |k|/N) r(k)e^{−jωk} = φ(ω) − lim_{N→∞} (1/N) Σ_{k=−(N−1)}^{N−1} |k| r(k)e^{−jωk} = φ(ω),

where we used (4) in the final step. Hence, the periodogram is asymptotically unbiased.

B. Spectral Resolution

• The expectation of the estimator, E{φ̂_p(ω)}, can be interpreted as the output of a system with weighting (or window) function W_B(ω) and the true PSD φ(ω) as input.


Fig. 2. Width of the mainlobe of a Bartlett window (W_B(f)/W_B(0) in dB versus frequency f), for N = 25 and N = 5 respectively.

• The periodogram is asymptotically unbiased (infinite-length window). This corresponds to a window function that is a Dirac impulse. It is also seen from (12) that this would give the true spectrum.

• The window function should be as close as possible to a Dirac impulse. That is, we wish to have a narrow main lobe and low sidelobes. The window approaches a Dirac impulse with increasing N.

• Leakage is caused by the sidelobes: power from the frequency band that contains most of the power is transferred (leaks) to frequency bands that contain less power.

• Smearing is caused by the main lobe: the width of the main lobe causes a smoothing or smearing of the spectrum.

• Details in the spectrum that are separated by less than 1/N cannot be resolved. Hence, 1/N is called the spectral resolution limit. The half-power (3 dB) width of the main lobe of W_B(ω) is approximately 2π/N radians (1/N in frequency). Figure 2 shows the 3 dB width of a Bartlett window for N = 25 and N = 5, respectively (see also the numerical sketch after this list).
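The 3 dB width is easy to check numerically; the following sketch (my own check, assuming NumPy) locates the half-power point of W_B(ω) and compares the resulting main-lobe width with 2π/N:

import numpy as np

for N in (5, 25, 100):
    omega = np.linspace(1e-6, np.pi, 200000)
    WB = (np.sin(omega * N / 2) / np.sin(omega / 2)) ** 2 / N
    w3 = omega[np.argmax(WB <= N / 2)]     # first half-power (3 dB) frequency, W_B(0) = N
    print(N, 2 * w3, 2 * np.pi / N)        # full 3 dB width is close to (slightly below) 2*pi/N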

C. Variance Analysis

In what follows, we derive the (asymptotic) covariance of the periodogram spectral estimate for a zero-mean circularly symmetric complex white Gaussian noise process {y(t)} (see Appendix A). For white noise, φ(ω) = σ². Then, the covariance of the periodogram is computed as

Cov{φ̂_p(ω1), φ̂_p(ω2)} = E{ (φ̂_p(ω1) − φ(ω1)) (φ̂_p(ω2) − φ(ω2)) } = E{φ̂_p(ω1)φ̂_p(ω2)} − φ(ω1)φ(ω2) = E{φ̂_p(ω1)φ̂_p(ω2)} − σ⁴. (13)

Using (5), we obtain

E{φ̂_p(ω1)φ̂_p(ω2)} = E{ (1/N)|Σ_{t=1}^{N} y(t)e^{−jω1t}|² · (1/N)|Σ_{p=1}^{N} y(p)e^{−jω2p}|² }

 = E{ (1/N²) (Σ_{t=1}^{N} y(t)e^{−jω1t}) (Σ_{s=1}^{N} y(s)e^{−jω1s})∗ (Σ_{p=1}^{N} y(p)e^{−jω2p}) (Σ_{m=1}^{N} y(m)e^{−jω2m})∗ }

 = E{ (1/N²) Σ_{t=1}^{N} y(t)e^{−jω1t} Σ_{s=1}^{N} y∗(s)e^{jω1s} Σ_{p=1}^{N} y(p)e^{−jω2p} Σ_{m=1}^{N} y∗(m)e^{jω2m} }

 = (1/N²) Σ_{t=1}^{N} Σ_{s=1}^{N} Σ_{p=1}^{N} Σ_{m=1}^{N} E{y(t)y∗(s)y(p)y∗(m)} e^{−jω1(t−s)} e^{−jω2(p−m)}. (14)

To compute the expectation, we use the general result for the expectation of the product of four jointly Gaussian random variables, that is,

E{abcd} = E{ab}E{cd} + E{ac}E{bd} + E{ad}E{bc} − 2E{a}E{b}E{c}E{d}.

Then,

E{y(t)y∗(s)y(p)y∗(m)} = E{y(t)y∗(s)}E{y(p)y∗(m)} + E{y(t)y(p)}E{y∗(s)y∗(m)} + E{y(t)y∗(m)}E{y∗(s)y(p)} − 2E{y(t)}E{y∗(s)}E{y(p)}E{y∗(m)} = σ⁴ (δ_{t,s}δ_{p,m} + δ_{t,m}δ_{p,s}). (15)

Inserting (15) into (14) yields

E{φ̂_p(ω1)φ̂_p(ω2)} = (1/N²) Σ_{t=1}^{N} Σ_{s=1}^{N} Σ_{p=1}^{N} Σ_{m=1}^{N} σ⁴ (δ_{t,s}δ_{p,m} + δ_{t,m}δ_{p,s}) e^{−jω1(t−s)} e^{−jω2(p−m)}

 = (1/N²) Σ_{t=1}^{N} Σ_{p=1}^{N} σ⁴ + (σ⁴/N²) Σ_{t=1}^{N} Σ_{s=1}^{N} e^{−jω1(t−s)} e^{−jω2(s−t)} = σ⁴ + (σ⁴/N²) Σ_{t=1}^{N} e^{−j(ω1−ω2)t} Σ_{s=1}^{N} e^{j(ω1−ω2)s}. (16)

Consider the double sum in the second term, and note that it can be rewritten as

Σ_{t=1}^{N} e^{−j(ω1−ω2)t} Σ_{s=1}^{N} e^{j(ω1−ω2)s} = |Σ_{t=1}^{N} e^{j(ω1−ω2)t}|² = |(e^{j(ω1−ω2)N} − 1)/(e^{j(ω1−ω2)} − 1)|²

 = (|e^{j(ω1−ω2)N/2}|² / |e^{j(ω1−ω2)/2}|²) · |(e^{j(ω1−ω2)N/2} − e^{−j(ω1−ω2)N/2})/(e^{j(ω1−ω2)/2} − e^{−j(ω1−ω2)/2})|² = (sin((ω1 − ω2)N/2) / sin((ω1 − ω2)/2))². (17)

If we now insert (17) and (16) into (13), we get

Cov{φ̂_p(ω1), φ̂_p(ω2)} = σ⁴ + (σ⁴/N²)(sin((ω1 − ω2)N/2) / sin((ω1 − ω2)/2))² − σ⁴ = (σ⁴/N²)(sin((ω1 − ω2)N/2) / sin((ω1 − ω2)/2))². (18)


Then, the asymptotic covariance is obtained as

lim_{N→∞} Cov{φ̂_p(ω1), φ̂_p(ω2)} = lim_{N→∞} (σ⁴/N²)(sin((ω1 − ω2)N/2) / sin((ω1 − ω2)/2))² = { σ⁴, ω1 = ω2;  0, ω1 ≠ ω2. }

The variance of the periodogram spectral estimator does not go to zero even asymptotically; that is, the periodogram is not a consistent estimator. The inconsistency also holds more generally, for colored Gaussian noise (cf. [1, Sec. 2.4.2]). Note further that periodogram values at angular frequencies ω1 and ω2 separated by a (nonzero) multiple of 2π/N are uncorrelated. That is, the distance between neighboring uncorrelated samples decreases with increasing N, and the spectral estimate fluctuates more and more rapidly.
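A small Monte Carlo illustration of this behavior (my own sketch, assuming NumPy): for circularly symmetric complex white Gaussian noise, the sample variance of φ̂_p(ω) at a fixed frequency stays near σ⁴ no matter how large N is.

import numpy as np

rng = np.random.default_rng(2)
sigma2, omega0, n_mc = 1.0, 0.7, 5000

for N in (32, 128, 512):
    t = np.arange(1, N + 1)
    est = np.empty(n_mc)
    for i in range(n_mc):
        y = np.sqrt(sigma2 / 2) * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
        est[i] = np.abs(np.sum(y * np.exp(-1j * omega0 * t))) ** 2 / N
    print(N, est.mean(), est.var())   # mean ~ sigma^2 = 1, variance ~ sigma^4 = 1 for every N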

IV. AVERAGED PERIODOGRAM

The main problem with the periodogram is its high variance (inconsistency). A simple remedy is to average over a set of (ideally independent) estimates (cf. [2, Sec. 4.4]). Suppose that K independent data records, each of length L, are available, so that we can compute

φ̂_p^{(k)}(ω) ≜ (1/L) |Σ_{t=1}^{L} y_k(t)e^{−jωt}|², k = 1, . . . , K,

where y_k(t), t = 1, . . . , L, denotes the kth data record; equivalently, we can compute φ̂_c^{(k)}(ω) from (6). Then, the averaged periodogram estimator is defined as

φ̂_av(ω) ≜ (1/K) Σ_{k=1}^{K} φ̂_p^{(k)}(ω).

By the assumption that the K data records are independent, we have

E{φ̂_av(ω)} = E{φ̂_p(ω)}  and  Var{φ̂_av(ω)} = (1/K) Var{φ̂_p(ω)},

where φ̂_p(ω) refers to the original periodogram (5) using N samples. That is, the expectation remains the same, but the variance is decreased by a factor K. However, since each estimate φ̂_p^{(k)}(ω) is based on L < N samples, the resolution (1/L) is worse. Hence, for maximum resolution L should be chosen as large as possible, whereas to reduce the variance we should choose K = N/L large, or equivalently L small. Clearly there is a tradeoff between resolution and variance. In practice, we do not have K independent data records; rather, the N samples are segmented into K (correlated) non-overlapping blocks of length L. Then, the variance is in general reduced by less than a factor of K.
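A sketch of this segment-averaging scheme (my own function and parameter choices, assuming NumPy), together with a white-noise check that the variance drops roughly as 1/K:

import numpy as np

def averaged_periodogram(y, L, omega):
    # Split y into K = N//L non-overlapping blocks of length L, compute the
    # periodogram (5) of each block and average the K estimates.
    y = np.asarray(y)
    K = len(y) // L
    t = np.arange(1, L + 1)
    E = np.exp(-1j * np.outer(omega, t))
    phi = sum(np.abs(E @ y[k * L:(k + 1) * L]) ** 2 / L for k in range(K))
    return phi / K

# White-noise check at a fixed frequency: the variance drops roughly as 1/K.
rng = np.random.default_rng(3)
N, sigma2, omega0 = 1024, 1.0, np.array([0.7])
for L in (1024, 256, 64):                       # K = 1, 4, 16
    est = []
    for _ in range(2000):
        y = np.sqrt(sigma2 / 2) * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
        est.append(averaged_periodogram(y, L, omega0)[0])
    print(N // L, np.var(est))                  # approximately sigma^4 / K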

V. BLACKMAN-TUKEY SPECTRAL ESTIMATION

In what follows, we briefly discuss the Blackman-Tukey method, which aims at reducing the statistical variability of the spectrum estimate at the cost of spectral resolution (and bias). The high variance of the periodogram arises from the poor estimates of the autocovariance at extreme lags, and from the large number of small covariance errors that are summed. Moreover, we saw that the expectation of the periodogram can be seen as a windowing of the true spectrum. Windowing and truncating the sum (6) leads to the Blackman-Tukey estimator

φ̂_BT(ω) = Σ_{k=−(M−1)}^{M−1} w(k)r̂(k)e^{−jωk}, (19)


where M < N and w(k) is a real sequence satisfying the following conditions:

0 ≤ w(k) ≤ w(0) = 1,
w(−k) = w(k),
w(k) = 0, |k| > M.

The function w(k) weights the lags of the sample covariance, and is therefore called a lag window. In analogy with (12), we can rewrite (19) in the form

φ̂_BT(ω) = (1/(2π)) ∫_{−π}^{π} φ̂_p(ψ) W(ω − ψ) dψ, (20)

where W(ω) denotes the DFT of the lag window w(k).

Most windows in common use have a dominant, relatively narrow peak (main lobe) around ω = 0. Therefore, the Blackman-Tukey estimator (19) is a locally weighted average of the periodogram.
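A minimal sketch of (19) (my own helper function, assuming NumPy), using a Bartlett lag window of length M on top of the biased covariance estimate (8):

import numpy as np

def blackman_tukey(y, M, omega):
    # Blackman-Tukey estimate (19): biased covariance estimates (8) up to lag M-1,
    # weighted by a Bartlett lag window w(k) = 1 - |k|/M and Fourier transformed.
    y = np.asarray(y)
    N = len(y)
    r = np.array([np.sum(y[k:] * np.conj(y[:N - k])) for k in range(M)]) / N
    k = np.arange(-(M - 1), M)
    r_full = np.concatenate([np.conj(r[:0:-1]), r])   # r_hat(-k) = r_hat(k)^*
    w = 1 - np.abs(k) / M                             # Bartlett lag window of length M
    return np.real(np.exp(-1j * np.outer(omega, k)) @ (w * r_full))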

A. Bias Analysis

Since the periodogram is asymptotically unbiased, we have that

E{φ̂_BT(ω)} = (1/(2π)) ∫_{−π}^{π} E{φ̂_p(ψ)} W(ω − ψ) dψ

is asymptotically (M, N → ∞) unbiased. Moreover, the bias decreases with increasing M, since the spectral window then approaches a Dirac impulse. In analogy with the periodogram, the Blackman-Tukey spectral estimator will (for most lag windows) be able to resolve details down to about the bandwidth of the main lobe of the spectral window, which in this case is approximately 1/M. That is, the spectral resolution of the Blackman-Tukey estimator (19) is 1/M, owing to the truncation of the sum.

B. Variance Analysis

It can be shown that, for large data records, the variance of (19) is approximately of the order of M/N (see [2, Appendix 4B] and [1]).

Clearly there is a variance/resolution (and a variance/bias) tradeoff, since the spectral resolution is O(1/M) and the variance is O(M/N). To get good spectral resolution (or a small bias), M should be chosen large, since then the spectral window gets closer to a Dirac impulse. On the other hand, for a small variance M should be chosen small.

VI. EXAMPLES

In what follows, we show a few examples of the periodogram and the Blackman-Tukey spectral estimators. The examples are highly inspired by [2, Sec. 4.6]. Our example consists of the following sinusoidal signals in white noise:

y(t) = √10 · exp(j2π·0.25t) + √20 · exp(j2π·0.3t) + e(t), (21)

where e(t) is circularly symmetric complex Gaussian noise with unit variance.

Figure 3 shows the periodogram spectral estimates for N = 20. Figure 3(a) shows the estimates of 50 realizations in an overlaid format, and Figure 3(b) shows the mean of the 50 realizations. Figure 4 shows the corresponding results for the Blackman-Tukey estimator, using a Bartlett window of length M = 5. It should be noted that the periodogram can resolve the two peaks, which are separated exactly at the resolution limit 1/N = 1/20. However, for the Blackman-Tukey estimator both peaks appear as one wide peak, since the spectral resolution is only 1/M = 1/5. On the other hand, it should also be noted that the estimate varies more for the periodogram shown in Figure 3(a) than for the Blackman-Tukey estimate in Figure 4(a).

Figures 5 and 6 show the corresponding results when the data record length is increased to N = 100 and M = 20. It is worth noting that the Blackman-Tukey estimate is now nearly able to resolve the two sinusoids, since the resolution is 1/M = 1/20. Moreover, the variance of the estimates does not decrease with an increase of the data record length.
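The experiment is straightforward to reproduce; the following self-contained sketch (assuming NumPy; the seed and the frequency-grid size are arbitrary choices) generates one realization of (21) and computes the periodogram and the Blackman-Tukey estimate for N = 20 and M = 5:

import numpy as np

rng = np.random.default_rng(4)
N, M = 20, 5
t = np.arange(1, N + 1)
e = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)   # unit-variance CN noise
y = np.sqrt(10) * np.exp(2j * np.pi * 0.25 * t) + np.sqrt(20) * np.exp(2j * np.pi * 0.3 * t) + e

f = np.linspace(0, 1, 512, endpoint=False)        # frequency axis as in Figures 3-6
omega = 2 * np.pi * f

# Periodogram (5)
phi_p = np.abs(np.exp(-1j * np.outer(omega, t)) @ y) ** 2 / N

# Blackman-Tukey (19) with a Bartlett lag window of length M
r = np.array([np.sum(y[k:] * np.conj(y[:N - k])) for k in range(M)]) / N
k = np.arange(-(M - 1), M)
r_full = np.concatenate([np.conj(r[:0:-1]), r])
phi_bt = np.real(np.exp(-1j * np.outer(omega, k)) @ ((1 - np.abs(k) / M) * r_full))

# phi_p typically shows two peaks near f = 0.25 and 0.3, while phi_bt smears
# them into one broad peak (cf. Figures 3 and 4).
print(f[np.argmax(phi_p)], f[np.argmax(phi_bt)])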


Fig. 3. Periodogram spectral estimate of (21) for N = 20 (PSD in dB versus frequency). (a) 50 overlaid realizations. (b) Mean of 50 realizations.


Fig. 4. Blackman-Tukey spectral estimate of (21) for N = 20, using a Bartlett window of length M = 5 (PSD in dB versus frequency). (a) 50 overlaid realizations. (b) Mean of 50 realizations.


Fig. 5. Periodogram spectral estimate of (21) for N = 100 (PSD in dB versus frequency). (a) 50 overlaid realizations. (b) Mean of 50 realizations.


Fig. 6. Blackman-Tukey spectral estimate of (21) for N = 100, using a Bartlett window of length M = 20 (PSD in dB versus frequency). (a) 50 overlaid realizations. (b) Mean of 50 realizations.


APPENDIX

A. Circularly Symmetric White Noise

A sequence {e(t)} is called complex circularly symmetric white noise if it satisfies

E{e(t)e∗(s)} = σ²δ_{t,s},
E{e(t)e(s)} = 0.

These requirements can be rewritten as

σ²δ_{t,s} = E{e(t)e∗(s)} = E{(Re[e(t)] + jIm[e(t)])(Re[e(s)] − jIm[e(s)])}
 = E{Re[e(t)]Re[e(s)] + Im[e(t)]Im[e(s)]} + jE{Im[e(t)]Re[e(s)] − Re[e(t)]Im[e(s)]},

which is equivalent to

E{Re[e(t)]²} + E{Im[e(t)]²} = σ²,
E{Re[e(t)]Re[e(s)]} = −E{Im[e(t)]Im[e(s)]}, t ≠ s,
E{Im[e(t)]Re[e(s)]} = E{Re[e(t)]Im[e(s)]}, t ≠ s,

and

0 = E{e(t)e(s)} = E{(Re[e(t)] + jIm[e(t)])(Re[e(s)] + jIm[e(s)])}
 = E{Re[e(t)]Re[e(s)] − Im[e(t)]Im[e(s)]} + jE{Im[e(t)]Re[e(s)] + Re[e(t)]Im[e(s)]}

⇔ E{Re[e(t)]Re[e(s)]} = E{Im[e(t)]Im[e(s)]},
  E{Im[e(t)]Re[e(s)]} = −E{Re[e(t)]Im[e(s)]}.

Combining these equations yields the equivalent properties

E{Re[e(t)]Re[e(s)]} = (σ²/2) δ_{t,s},
E{Im[e(t)]Im[e(s)]} = (σ²/2) δ_{t,s},
E{Re[e(t)]Im[e(s)]} = 0.
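These properties suggest how to generate such noise in simulations: draw independent real and imaginary parts, each with variance σ²/2. A small empirical check (my own sketch, assuming NumPy):

import numpy as np

rng = np.random.default_rng(5)
sigma2, n = 2.0, 200000
e = np.sqrt(sigma2 / 2) * (rng.standard_normal(n) + 1j * rng.standard_normal(n))

print(np.mean(np.abs(e) ** 2))              # E{e(t)e*(t)}        ~ sigma^2 = 2
print(np.mean(e[1:] * np.conj(e[:-1])))     # E{e(t)e*(s)}, t != s, ~ 0 (whiteness)
print(np.mean(e ** 2))                      # E{e(t)e(t)}         ~ 0 (circularity)
print(np.mean(e.real * e.imag))             # E{Re[e(t)]Im[e(t)]} ~ 0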

REFERENCES

[1] P. Stoica and R. Moses, Introduction to Spectral Analysis. Prentice-Hall, 1997.
[2] S. M. Kay, Modern Spectral Estimation – Theory and Application. Prentice-Hall, 1988.
[3] P. Stoica and R. Moses. [Online]. Available: http://www2.ece.ohio-state.edu/~randy/SAtext/