16.548 Coding and Information Theory
Lecture 15: Space Time Coding and MIMO

Page 1: 16.548 Coding and Information Theory

1

16.548 Coding and Information Theory

Lecture 15: Space Time Coding and MIMO:

Page 2: 16.548 Coding and Information Theory

2

Credits

Page 3: 16.548 Coding and Information Theory

3

Wireless Channels

Page 4: 16.548 Coding and Information Theory

4

Signal Level in Wireless Transmission

Page 5: 16.548 Coding and Information Theory

5

Classification of Wireless Channels

Page 6: 16.548 Coding and Information Theory

6

Space time Fading, narrow beam

Page 7: 16.548 Coding and Information Theory

7

Space Time Fading: Wide Beam

Page 8: 16.548 Coding and Information Theory

8

Introduction to the MIMO Channel

Page 9: 16.548 Coding and Information Theory

9

Capacity of MIMO Channels

Page 10: 16.548 Coding and Information Theory

10

Page 11: 16.548 Coding and Information Theory

11

Single Input- Single Output systems (SISO)

y(t) = g · x(t) + n(t)

x(t): transmitted signal
y(t): received signal
g: channel transfer function
n(t): noise (AWGN, variance σ²)

Signal to noise ratio: ρ = |g|² Ex / σ²

Capacity: C = log2(1 + ρ)
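The SISO capacity formula can be checked numerically. A minimal sketch (the gain, symbol energy, and noise variance below are illustration values, not from the slides):

```python
import math

def siso_capacity(g, Ex, sigma2):
    """Shannon capacity C = log2(1 + rho) of a SISO link,
    where rho = |g|^2 * Ex / sigma2."""
    rho = (abs(g) ** 2) * Ex / sigma2
    return math.log2(1 + rho)

# Illustration values: unit-gain channel, 10 dB SNR
C = siso_capacity(g=1.0, Ex=10.0, sigma2=1.0)
print(f"C = {C:.3f} bits/s/Hz")
```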

Page 12: 16.548 Coding and Information Theory

12

Single Input – Multiple Output (SIMO) / Multiple Input – Single Output (MISO)

• Principle of diversity systems (transmitter/receiver)
• + : Higher average signal to noise ratio
      Robustness
• − : Process of diminishing returns
      Benefit reduces in the presence of correlation
• Maximal ratio combining > Equal gain combining > Selection combining

Page 13: 16.548 Coding and Information Theory

13

Idea behind diversity systems
• Use more than one copy of the same signal
• If one copy is in a fade, it is unlikely that all the others will be too
• C1×N = log2(1 + N·ρ) > C1×1
• C1×N more robust than C1×1

Page 14: 16.548 Coding and Information Theory

14

Background of Diversity Techniques

• A variety of diversity techniques have been proposed to combat the time-varying multipath fading channel in wireless communication
  – Time Diversity
  – Frequency Diversity
  – Space Diversity (mostly multiple receive antennas)
• Main intuitions of diversity:
  – The probability that all the signals suffer fading is less than the probability that a single signal suffers fading
  – Provide the receiver with multiple versions of the same Tx signal over independent channels
• Time Diversity
  – Use different time slots separated by an interval longer than the coherence time of the channel
  – Example: channel coding + interleaving
  – Shortcoming: introduces large delays when the channel is in slow fading

Page 15: 16.548 Coding and Information Theory

15

Diversity Techniques
• Improve the performance in a fading environment
  – Space Diversity
    • Spacing is important! (coherence distance)
  – Polarization Diversity
    • Using antennas with different polarizations for reception/transmission
  – Frequency Diversity
    • RAKE receiver, OFDM, equalization, etc.
    • Not effective over a frequency-flat channel
  – Time Diversity
    • Using channel coding and interleaving
    • Not effective over slow fading channels

Page 16: 16.548 Coding and Information Theory

16

RX Diversity in Wireless

Page 17: 16.548 Coding and Information Theory

17

Receive Diversity

Page 18: 16.548 Coding and Information Theory

18

Selection and Switch Diversity

Page 19: 16.548 Coding and Information Theory

19

Linear Diversity

Page 20: 16.548 Coding and Information Theory

20

Receive Diversity Performance

Page 21: 16.548 Coding and Information Theory

21

Transmit Diversity

Page 22: 16.548 Coding and Information Theory

22

Transmit Diversity with Feedback

Page 23: 16.548 Coding and Information Theory

23

TX diversity with frequency weighting

Page 24: 16.548 Coding and Information Theory

24

TX Diversity with antenna hopping

Page 25: 16.548 Coding and Information Theory

25

TX Diversity with channel coding

Page 26: 16.548 Coding and Information Theory

26

Transmit diversity via delay diversity

Page 27: 16.548 Coding and Information Theory

27

Transmit Diversity Options

Page 28: 16.548 Coding and Information Theory

28

MIMO Wireless Communications: Combining TX and RX Diversity

• Transmission over Multiple Input Multiple Output (MIMO) radio channels
• Advantages: improved space diversity and channel capacity
• Disadvantages: more complex, more radio stations, and requires channel estimation

[Block diagram: data symbols d → Space-Time Encoder (N data symbols + P pilot symbols) → L_t transmit antennas → wireless channel ("What a Big Cloud!") → L_r receive antennas → Space-Time Decoder → d_hat]

Page 29: 16.548 Coding and Information Theory

29

MIMO Model

• Matrix Representation
  – For a fixed T:  Y(N×T) = H(N×M) X(M×T) + W(N×T)

T: Time index
W: Noise

Page 30: 16.548 Coding and Information Theory

30

Part II: Space Time Coding

Page 31: 16.548 Coding and Information Theory

31

Multiple Input- Multiple Output systems (MIMO)

y = H x + n,  with y: N×1, H: N×M, x: M×1, n: N×1

H = [ H11 … H1M ]
    [  ⋮      ⋮  ]
    [ HN1 … HNM ]

• Average gain: E[|Hij|²] = 1
• Average signal to noise ratio: ρ = P_total / σ²

Page 32: 16.548 Coding and Information Theory

32

Shannon capacity

K = rank(H): what is its range of values? (K ≤ min(M, N))
Parameters that affect the system capacity
• Signal to noise ratio
• Distribution of eigenvalues (u) of H

C = log2 det(I + (Ex/σ²) H Hᴴ) = log2 det(I + (P/(M σ²)) H Hᴴ) = log2 det(I + (ρ/M) H Hᴴ)
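The log-det capacity formula is easy to evaluate numerically. A sketch with equal power across transmit antennas (the 4×4 i.i.d. Gaussian channel and ρ = 10 are illustration choices):

```python
import numpy as np

def mimo_capacity(H, rho):
    """C = log2 det(I + (rho/M) H H^H) for an N x M channel matrix H,
    with total power split equally over the M transmit antennas."""
    N, M = H.shape
    G = np.eye(N) + (rho / M) * (H @ H.conj().T)
    # slogdet is numerically safer than det for larger matrices
    sign, logdet = np.linalg.slogdet(G)
    return logdet / np.log(2)

rng = np.random.default_rng(0)
# i.i.d. complex Gaussian ("rich scattering") 4x4 channel
H = (rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))) / np.sqrt(2)
print(f"C = {mimo_capacity(H, rho=10.0):.2f} bits/s/Hz")
```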

Page 33: 16.548 Coding and Information Theory

33

Interpretation I: The parallel channels approach

• "Proof" of capacity formula
• Singular value decomposition of H: H = S·U·Vᴴ
• S, V: unitary matrices (VᴴV = I, S Sᴴ = I); U = diag(u_k), u_k singular values of H
• V / S: input/output eigenvectors of H
• Any input along v_i will be multiplied by u_i and will appear as an output along s_i

Page 34: 16.548 Coding and Information Theory

34

Vector analysis of the signals

1. The input vector x gets projected onto the v_i's: ⟨x, v_i⟩ · v_i
2. Each projection gets multiplied by a different gain u_i
3. Each appears along a different s_i: ⟨x, v_i⟩ · u_i · s_i

Page 35: 16.548 Coding and Information Theory

35

Capacity = sum of capacities

• The channel has been decomposed into K parallel subchannels

• Total capacity = sum of the subchannel capacities

• All transmitters send the same power: Ex = E_k

ρ_k = u_k² E_k / σ²

C = Σ_{k=1..K} C_k = Σ_{k=1..K} log2(1 + ρ_k) = Σ_{k=1..K} log2(1 + E_k u_k² / σ²)
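The parallel-channel decomposition can be verified numerically: summing the subchannel capacities from the singular values reproduces the log-det formula. A sketch (the 3×3 channel, power, and noise values are illustration choices):

```python
import numpy as np

rng = np.random.default_rng(1)
M = N = 3
Ek, sigma2 = 1.0, 0.5  # per-subchannel power and noise variance (illustration)
H = rng.standard_normal((N, M)) + 1j * rng.standard_normal((N, M))

# Singular values u_k of H define K parallel subchannels
u = np.linalg.svd(H, compute_uv=False)
C_sum = np.sum(np.log2(1 + Ek * u**2 / sigma2))

# Same capacity straight from the determinant formula
G = np.eye(N) + (Ek / sigma2) * (H @ H.conj().T)
C_det = np.log2(np.linalg.det(G).real)

print(C_sum, C_det)  # the two agree: det(I + a*H*H^H) = prod(1 + a*u_k^2)
```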

Page 36: 16.548 Coding and Information Theory

36

Interpretation II: The directional approach

• Singular value decomposition of H: H = S·U·Vᴴ
• Eigenvectors correspond to spatial directions (beamforming)

[Diagram: M transmit antennas beamforming along the v_i directions; N receive antennas combining along the s_i directions]

Page 37: 16.548 Coding and Information Theory

37

Example of directional interpretation

Page 38: 16.548 Coding and Information Theory

38

Page 39: 16.548 Coding and Information Theory

39

Space-Time Coding
• What is Space-Time Coding?
  – Space diversity at the antennas
  – Time diversity to introduce redundant data
• Alamouti scheme
  – Simple yet very effective
  – Space diversity at the transmitter end
  – Orthogonal block code design

Page 40: 16.548 Coding and Information Theory

40

Space Time Coded Modulation

Page 41: 16.548 Coding and Information Theory

41

Space Time Channel Model

Page 42: 16.548 Coding and Information Theory

42

Page 43: 16.548 Coding and Information Theory

43

STC Error Analysis

Page 44: 16.548 Coding and Information Theory

44

STC Error Analysis

Page 45: 16.548 Coding and Information Theory

45

Page 46: 16.548 Coding and Information Theory

46

Page 47: 16.548 Coding and Information Theory

47

STC Design Criteria

Page 48: 16.548 Coding and Information Theory

48

Page 49: 16.548 Coding and Information Theory

49

STC 4-PSK Example

Page 50: 16.548 Coding and Information Theory

50

STC 8-PSK Example

Page 51: 16.548 Coding and Information Theory

51

STC 16-QAM Example

Page 52: 16.548 Coding and Information Theory

52

STC Maximum Likelihood Decoder

Page 53: 16.548 Coding and Information Theory

53

STC Performance with perfect CSI

Page 54: 16.548 Coding and Information Theory

54

Page 55: 16.548 Coding and Information Theory

55

Page 56: 16.548 Coding and Information Theory

56

Delay Diversity

Page 57: 16.548 Coding and Information Theory

57

Delay Diversity ST code

Page 58: 16.548 Coding and Information Theory

58

Page 59: 16.548 Coding and Information Theory

59

Space Time Block Codes (STBC)

Page 60: 16.548 Coding and Information Theory

60

Decoding STBC

Page 61: 16.548 Coding and Information Theory

61

Page 62: 16.548 Coding and Information Theory

62

Page 63: 16.548 Coding and Information Theory

63

Block and Data Model
• 1×(N+P) block of information symbols broadcast from transmit antenna i:
  S_i(d, t)
• 1×(N+P) block of received information symbols taken from antenna j:
  R_j = h_ji S_i(d, t) + n_j
• Matrix representation: R = H S + N, where

  H = [ h_11   h_12  …  h_1Lt  ]      S = [ S_1(d, t)  ]
      [ h_21   h_22  …  h_2Lt  ]          [ S_2(d, t)  ]
      [  ⋮      ⋮          ⋮   ]          [     ⋮      ]
      [ h_Lr1  h_Lr2 … h_LrLt  ]          [ S_Lt(d, t) ]

The function (mapping) S at antenna i defines the Space-Time encoding process.
Assuming a single user and quasi-static, independent fading, n is AWGN.

Page 64: 16.548 Coding and Information Theory

64

Related Issues

[Block diagram: data symbols d → Space-Time Encoder S_i(d,t), i = 1, 2, …, K (N data symbols + P pilot symbols) → L_t transmit antennas → wireless channel ("What a Big Cloud!") → L_r receive antennas → Space-Time Decoder → d_hat]

• How to define the Space-Time mapping S_i(d,t) for the diversity/channel capacity trade-off?
• What is the optimum sequence for pilot symbols?
• How to get the "best estimated" Channel State Information (CSI) from the pilot symbols P?
• How to design the frame structure of data symbols (payload) and pilot symbols so that FER and BER are optimized?

Page 65: 16.548 Coding and Information Theory

65

Specific Example of STBC: Alamouti’s Orthogonal Code

• Let's consider two antennas i and i+1 at the transmitter side, at two consecutive time instants t and t+T:

               Ant. i   Ant. i+1
  Time t    :   d_0       d_1
  Time t+T  :  -d_1*      d_0*

• The above Space-Time mapping defines Alamouti's Code [1].
• A general frame design requires concatenation of blocks (each 2×2) of Alamouti code:

  D = [  d_0   d_1  |  d_2   d_3  | … ]
      [ -d_1*  d_0* | -d_3*  d_2* | … ]
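The 2×2 mapping above can be sketched as a small encoder; rows are antennas, columns are time slots. The example symbols are illustration values:

```python
import numpy as np

def alamouti_encode(d):
    """Map symbol pairs (d0, d1) onto 2 antennas x 2 time slots:
        time t  : antenna i sends d0,   antenna i+1 sends d1
        time t+T: antenna i sends -d1*, antenna i+1 sends d0*
    Returns a 2 x len(d) array: rows = antennas, columns = time."""
    d = np.asarray(d, dtype=complex)
    assert len(d) % 2 == 0, "symbols come in pairs"
    out = np.empty((2, len(d)), dtype=complex)
    for k in range(0, len(d), 2):
        d0, d1 = d[k], d[k + 1]
        out[:, k] = [d0, d1]
        out[:, k + 1] = [-np.conj(d1), np.conj(d0)]
    return out

X = alamouti_encode([1 + 1j, 1 - 1j])
print(X)
```

The orthogonality of the block is what makes decoding simple: X Xᴴ = (|d_0|² + |d_1|²) I.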

Page 66: 16.548 Coding and Information Theory

66

Estimated Channel State Information (CSI)

• Pilot Symbol Assisted Modulation (PSAM) [3] is used to obtain estimated Channel State Information (CSI)
• PSAM simply samples the channel at a rate greater than the Nyquist rate, so that reconstruction is possible
• Here is how it works…

Page 67: 16.548 Coding and Information Theory

67

Channel State Estimation

[Frame diagram, antenna L_t = i, frame size = 300, for a typical slow fading channel: pilot (P) and data (D) symbols arranged as an edge part, a uniform part of repeating pilot/data groups (e.g. 2P 3D), and a closing edge part; markers at symbol indices 1, 7, 13, 19, 25, 31, 37, …, 265, 271, 277, 295]

Page 68: 16.548 Coding and Information Theory

68

Estimated CSI (cont'd): Block diagram of the receiver

[r(t) plus noise n(t) → matched filter u*(−t) → A/D converter → r_k; the pilot symbol extractor feeds r_p_1, r_p_2, …, r_p_N to the channel state estimator (output h_hat); the delayed data samples r_k+1, r_k+2, …, r_k+K together with h_hat feed the ML decoder (output D_hat)]

Received signal: r(t) = Σ_{l=1..L_t} h_l(t) s_l(t) + n(t)

Page 69: 16.548 Coding and Information Theory

69

Channel State Estimation (cont.d)

[Same PSAM frame diagram as above, antenna L_t = i, frame size = 300]

• Pilot symbol insertion length P_ins = 6
• The receiver uses the N = 12 nearest pilots to obtain estimated CSI

Page 70: 16.548 Coding and Information Theory

70

Channel State Estimation (cont'd)
• Pilot symbols can be thought of as redundant data symbols
• The pilot symbol insertion length will not change the performance much, as long as we sample faster than the fading rate of the channel
• If the channel fades faster, more pilots need to be inserted

Page 71: 16.548 Coding and Information Theory

71

Estimated CSI, Space-time PSAM frame design

[PSAM frame diagrams for antennas L_t = i and L_t = i+1, frame size = 300, as above]

• The orthogonal pilot symbol matrix (pilots chosen from the QPSK constellation) is [4]:

  P = [ 1   1 ]
      [ 1  −1 ]

• Pilot symbol insertion length P_ins = 6
• The receiver uses the N = 12 nearest pilots to obtain estimated CSI
• Data = 228, Pilots = 72

Page 72: 16.548 Coding and Information Theory

72

Channel State Estimation (cont'd): MMSE estimation

• Use Wiener filtering, since it is a Minimum Mean Square Error (MMSE) estimator
• All random variables involved are jointly Gaussian, so the MMSE estimator becomes a linear minimum mean square estimator [2]:

  ĥ = E[h | r_p] = E[h r_pᴴ] Cov(r_p)⁻¹ r_p = W r_p

• The Wiener filter is defined as W = E[h r_pᴴ] Cov(r_p)⁻¹
• Note: Cov(r_p) = P Cov(h) Pᴴ + N_o I and Cov(h) = E[h hᴴ]
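The Wiener filter above can be sketched for the simplest case, a scalar channel observed through unit pilots. The pilot count, noise level, and pilot matrix here are illustration assumptions, not the lecture's frame design:

```python
import numpy as np

rng = np.random.default_rng(2)
Np, N0 = 12, 0.1                      # pilots per estimate, noise level (illustration)
P = np.ones((Np, 1), dtype=complex)   # hypothetical all-ones pilot matrix, scalar h
Ch = np.eye(1, dtype=complex)         # Cov(h) = E[h h^H]

# Wiener filter: W = E[h r_p^H] Cov(r_p)^{-1}
Crp = P @ Ch @ P.conj().T + N0 * np.eye(Np)  # Cov(r_p) = P Ch P^H + N0 I
Chr = Ch @ P.conj().T                        # E[h r_p^H] = Ch P^H
W = Chr @ np.linalg.inv(Crp)

# Simulate one block: true channel, noisy pilot observations, MMSE estimate
h = (rng.standard_normal() + 1j * rng.standard_normal()) / np.sqrt(2)
noise = np.sqrt(N0 / 2) * (rng.standard_normal(Np) + 1j * rng.standard_normal(Np))
r_p = P[:, 0] * h + noise
h_hat = (W @ r_p)[0]
print(abs(h - h_hat))  # small residual estimation error
```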

Page 73: 16.548 Coding and Information Theory

73

Block diagram for MRRC scheme with two Tx and one Rx

[Block diagram: tx antenna 0 and tx antenna 1 send the Alamouti pair (time t: d_0, d_1; time t+T: −d_1*, d_0*) over channels h_0(t), h_0(t+T) and h_1(t), h_1(t+T); at the single rx antenna, interference and noise n_0, n_1 are added; a channel estimator produces h_hat_0, h_hat_1, which feed the combiner and the maximum likelihood detector to give d_hat_0, d_hat_1]

Page 74: 16.548 Coding and Information Theory

74

Block diagram for MRRC scheme with two Tx and one Rx

• The received signals can then be expressed as

  r_0 = r(t)   =  h_0(t) d_0 + h_1(t) d_1 + n_0
  r_1 = r(t+T) = −h_0(t+T) d_1* + h_1(t+T) d_0* + n_1

• The combiner shown in the diagram above builds the following two estimated signals:

  d̂_0 = h_0*(t) r_0 + h_1(t+T) r_1*
  d̂_1 = h_1*(t) r_0 − h_0(t+T) r_1*
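With a quasi-static channel (h constant over the two slots) and noise omitted for clarity, the combiner recovers each symbol scaled by the total channel gain. A sketch (symbols and random channel values are illustration choices):

```python
import numpy as np

rng = np.random.default_rng(3)
d0, d1 = 1 + 1j, -1 + 1j  # transmitted symbols (illustration)
h0 = rng.standard_normal() + 1j * rng.standard_normal()  # quasi-static gains
h1 = rng.standard_normal() + 1j * rng.standard_normal()

# Received samples over the two Alamouti slots (noiseless for clarity)
r0 = h0 * d0 + h1 * d1                          # time t
r1 = -h0 * np.conj(d1) + h1 * np.conj(d0)       # time t+T

# Combiner outputs: cross terms cancel because of the code's orthogonality
d0_hat = np.conj(h0) * r0 + h1 * np.conj(r1)
d1_hat = np.conj(h1) * r0 - h0 * np.conj(r1)

gain = abs(h0) ** 2 + abs(h1) ** 2
print(d0_hat / gain, d1_hat / gain)  # back to d0, d1
```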

Page 75: 16.548 Coding and Information Theory

75

Maximum Likelihood Decoding Under QPSK Constellation

• The output of the combiner can be further simplified and expressed as follows:

  d̂_0 = [h_0(t) h_0*(t) + h_1(t+T) h_1*(t+T)] d_0 + h_0*(t) n_0 + h_1(t+T) n_1*
  d̂_1 = [h_1(t) h_1*(t) + h_0(t+T) h_0*(t+T)] d_1 + h_1*(t) n_0 − h_0(t+T) n_1*

• For example, under the QPSK constellation, decisions are made according to the axes.

Page 76: 16.548 Coding and Information Theory

76

Space-Time Alamouti Codes with Perfect CSI,BPSK Constellation

Page 77: 16.548 Coding and Information Theory

77

Space-Time Alamouti Codes with PSAM under QPSK Constellation

Page 78: 16.548 Coding and Information Theory

78

Space-Time Alamouti Codes with PSAM under QPSK Constellation

Page 79: 16.548 Coding and Information Theory

79

Performance metrics
• Eigenvalue distribution
• Shannon capacity
  – for constant SNR, or
  – for constant transmitted power
• Effective degrees of freedom (EDOF)
• Condition number
• Measures of comparison
  – Gaussian i.i.d. channel
  – "ideal" channel

Page 80: 16.548 Coding and Information Theory

80

Measures of comparison
• Gaussian channel: H_ij = x_ij + j·y_ij, with x, y i.i.d. Gaussian random variables. Problem: p_outage
• "Ideal" channel (max C): rank(H) = min(M, N) and |u_1| = |u_2| = … = |u_K|

Page 81: 16.548 Coding and Information Theory

81

Eigenvalue distribution

Ideally:
• As high a gain as possible
• As many eigenvectors as possible
• As orthogonal as possible

Limits:
• Power constraints
• System size
• Correlation

Page 82: 16.548 Coding and Information Theory

82

Example: Uncorrelated & correlated channels

Page 83: 16.548 Coding and Information Theory

83

Shannon capacity

• Capacity for a reference SNR (channel info only):

  C = log2 det(I + (ρ_ref/M) H_ref H_refᴴ)

• Capacity for constant transmitted power (channel + power roll-off info):

  C = log2 det(I + (Ex/σ²) H Hᴴ)

Page 84: 16.548 Coding and Information Theory

84

Building layout

[Floor plan: XMTR with 0°/90°/180°/270° orientations, RCVR (hall) and RCVR (lab); dimensions 4 m, 6 m, 3.3 m, 3.3 m, 2 m]

Page 85: 16.548 Coding and Information Theory

85

LOS conditions: higher average SNR, high correlation
Non-LOS conditions: lower average SNR, more scattering

[Same floor plan as above, showing XMTR and RCVR (lab)]

Page 86: 16.548 Coding and Information Theory

86

Example: C for reference SNR

Page 87: 16.548 Coding and Information Theory

87

Example: C for constant transmit pwr

Page 88: 16.548 Coding and Information Theory

88

Other metrics

• EDOF (Effective degrees of freedom)
  – Definition: δ = dC(2^δ · ρ_ref)/dδ evaluated at δ = 0
  – + : Intuition
  – − : Dependence on the reference SNR

• Condition number
  – Definition: u_max / u_min
  – + : Simplicity
  – − : No information on the intermediate eigenvalue distribution

Page 89: 16.548 Coding and Information Theory

89

From narrowband to wideband

• Wideband: delay spread >> symbol time

• − : Intersymbol interference
  + : Frequency diversity

• SISO channel impulse response: g(t) = Σ_{l=1..L} g_l δ(t − τ_l)

• SISO capacity: C = log2(1 + (Ex/σ²) |g|²), with |g|² = Σ_{l=1..L} |g_l|²

Page 90: 16.548 Coding and Information Theory

90

Matrix formulation of wideband case

h_ij(t) = Σ_{l=1..L} h_ij,l δ(t − τ_l)

[ y_1(t) ]   [ h_11 … h_1M ]   [ x_1(t) ]
[   ⋮    ] = [  ⋮        ⋮ ] * [   ⋮    ] + n(t)
[ y_N(t) ]   [ h_N1 … h_NM ]   [ x_M(t) ]

i.e. y(t) = H(t) * x(t) + n(t), where * denotes convolution.

Page 91: 16.548 Coding and Information Theory

91

Equivalent treatment in the frequency domain

• Wideband channel = many narrowband channels: H(t) ↔ H(f)

  C_NB = log2 det(I + (Ex/σ²) H Hᴴ),  with noise level σ² = N_o (BW)

  C_WB = ∫_bandwidth log2 det(I + (Ex/N_o) H(f) H(f)ᴴ) df

Page 92: 16.548 Coding and Information Theory

92

Extensions

• Optimal power allocation• Optimal rate allocation• Space-time codes• Distributed antenna systems

• Many, many, many more!

Page 93: 16.548 Coding and Information Theory

93

Optimal power allocation

• IF the transmitter knows the channel, it can allocate power so as to maximize capacity:

  C = Σ_{k=1..K} log2(1 + E_k u_k² / σ²),  subject to Σ_{k=1..K} E_k = P_total

• Solution: Waterfilling

  E_k = (ν − 1/λ_k)⁺,  λ_k = u_k² / σ²
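The waterfilling rule can be sketched directly: pick the water level ν so the allocated powers sum to the budget, dropping subchannels whose inverse gain sits above the water. The three subchannel gains and the unit power budget are illustration values:

```python
import numpy as np

def waterfill(lam, P_total):
    """Waterfilling sketch: E_k = (nu - 1/lambda_k)^+ with the water level
    nu chosen so that sum(E_k) = P_total; lam holds the subchannel gains
    lambda_k = u_k^2 / sigma^2, sorted here strongest first."""
    lam = np.sort(np.asarray(lam, dtype=float))[::-1]
    # Try pouring power into the m strongest subchannels, largest m first
    for m in range(len(lam), 0, -1):
        nu = (P_total + np.sum(1.0 / lam[:m])) / m
        E = nu - 1.0 / lam[:m]
        if E[-1] >= 0:  # weakest active subchannel still gets non-negative power
            return np.concatenate([E, np.zeros(len(lam) - m)]), nu
    raise ValueError("no feasible allocation")

E, nu = waterfill([4.0, 1.0, 0.25], P_total=1.0)
print(E, nu)  # the strongest subchannel gets the most power
```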

Page 94: 16.548 Coding and Information Theory

94

Illustration of waterfilling algorithm

E_k = (ν − 1/λ_k)⁺,  λ_k = u_k² / σ²

Stronger subchannels get the most power

Page 95: 16.548 Coding and Information Theory

95

Discussion on waterfilling
• Criterion: Shannon capacity maximization (all the SISO discussion on coding, constellation limitations, etc. is pertinent)
• The benefit depends on the channel, the available power, etc.: correlation and the available power determine how much waterfilling helps
• Limitations:
  – Waterfilling requires a feedback link
  – FDD/TDD
  – Channel state changes

Page 96: 16.548 Coding and Information Theory

96

Optimal rate allocation
• Similar to optimal power allocation
• Criterion: throughput (T) maximization

  C = Σ_{k=1..K} log2(1 + ρ_k)  (bps/Hz)
  T = Σ_{k=1..K} B_k  (b/Hz)

• B_k: bits per symbol (depends on the constellation size)
• Idea: for a given k, find the maximum B_k for a target probability of error P_e

Page 97: 16.548 Coding and Information Theory

97

Discussion on optimal rate allocation

• Possible limits on constellation sizes!
• Constellation sizes are quantized!!!
• The answer is different for different target probabilities of error
• Optimal power AND rate allocation schemes are possible, but complex

Page 98: 16.548 Coding and Information Theory

98

Distributed antenna systems

• Idea: put your antennas in different places
• + : lower correlation
  − : power imbalance, synchronization, coordination

Page 99: 16.548 Coding and Information Theory

99

Practical considerations

• Coding• Detection algorithms• Channel estimation • Interference

Page 100: 16.548 Coding and Information Theory

100

Detection algorithms
• Maximum likelihood / linear detection:

  y = H x + n
  x_est = H⁺ y,  H⁺ = (Hᴴ H)⁻¹ Hᴴ: pseudo-inverse of H

• Problem (ML): find the nearest neighbor among Q^M points (Q: constellation size, M: number of transmitters)
• VERY high complexity!!!
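A sketch of the linear (pseudo-inverse) detector, run noiseless for clarity; per-stream slicing replaces the exhaustive Q^M search. The QPSK constellation and channel dimensions are illustration choices:

```python
import numpy as np

rng = np.random.default_rng(4)
M, N = 2, 4  # 2 transmit streams, 4 receive antennas (illustration)
H = rng.standard_normal((N, M)) + 1j * rng.standard_normal((N, M))
x = np.array([1 + 1j, -1 - 1j])  # QPSK symbols
y = H @ x                        # noiseless receive for clarity

# H+ = (H^H H)^{-1} H^H, then slice each stream to the nearest QPSK point
H_pinv = np.linalg.inv(H.conj().T @ H) @ H.conj().T
x_est = H_pinv @ y
qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j])
x_hat = qpsk[np.argmin(np.abs(x_est[:, None] - qpsk[None, :]), axis=1)]
print(x_hat)
```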

Page 101: 16.548 Coding and Information Theory

101

Solution: BLAST algorithm

• BLAST: Bell Labs lAyered Space Time
• Idea: NON-LINEAR DETECTOR
  – Step 1: H⁺ = (Hᴴ H)⁻¹ Hᴴ
  – Step 2: Find the strongest signal (strongest = the one with the highest post-detection SNR)
  – Step 3: Detect it (nearest neighbor among Q)
  – Step 4: Subtract it
  – Step 5: If not all yet detected, go to step 2
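The five steps can be sketched as a successive-cancellation loop, in the V-BLAST style. This is a minimal noiseless sketch, not the full algorithm; the "highest post-detection SNR" ordering is implemented here as the smallest row norm of the pseudo-inverse, and the channel and constellation are illustration choices:

```python
import numpy as np

def vblast_detect(H, y, constellation):
    """Successive detection: recompute the pseudo-inverse of the remaining
    columns (step 1), pick the stream with the highest post-detection SNR,
    i.e. the smallest row norm of H+ (step 2), slice it (step 3), subtract
    its contribution (step 4), and repeat until done (step 5)."""
    H = H.astype(complex)
    y = y.astype(complex).copy()
    M = H.shape[1]
    x_hat = np.zeros(M, dtype=complex)
    remaining = list(range(M))
    while remaining:
        Hp = np.linalg.pinv(H[:, remaining])                 # step 1
        k = int(np.argmin(np.sum(np.abs(Hp) ** 2, axis=1)))  # step 2
        z = Hp[k] @ y                                        # nulling
        s = constellation[np.argmin(np.abs(constellation - z))]  # step 3
        col = remaining[k]
        x_hat[col] = s
        y -= H[:, col] * s                                   # step 4
        del remaining[k]                                     # step 5
    return x_hat

rng = np.random.default_rng(5)
H = rng.standard_normal((4, 2)) + 1j * rng.standard_normal((4, 2))
qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j])
x = np.array([1 - 1j, -1 + 1j])
x_hat = vblast_detect(H, H @ x, qpsk)
print(x_hat)
```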

Page 102: 16.548 Coding and Information Theory

102

Discussion on the BLAST algorithm• It’s a non-linear detector!!!

• Two flavors– V-BLAST (easier)– D-BLAST (introduces space-time coding)

• Achieves 50-60% of Shannon capacity

• Error propagation possible • Very complicated for wideband case

Page 103: 16.548 Coding and Information Theory

103

Coding limitations
• Capacity = maximum data rate that can be achieved over the channel with arbitrarily low probability of error
• SISO case:
  – Constellation limitations
  – Turbo coding can get you close to Shannon!!!
• MIMO case:
  – Constellation limitations as well
  – Higher complexity
  – Space-time codes: very few!!!!

Page 104: 16.548 Coding and Information Theory

104

Channel estimation

• The channel is not perfectly estimated because
  – it is changing (environment, user movement)
  – there is noise DURING the estimation
• An error in the channel transfer characteristics can hurt you
  – in the decoding
  – in the water-filling
• Trade-off: throughput vs. estimation accuracy
• What if the interference (as noise) is not white????

Page 105: 16.548 Coding and Information Theory

105

Interference

• Generalization of other-cell/same-cell interference from the SISO case
• Example: cellular deployment of MIMO systems
• Interference level depends on
  – frequency/code re-use scheme
  – cell size
  – uplink/downlink perspective
  – deployment geometry
  – propagation conditions
  – antenna types

Page 106: 16.548 Coding and Information Theory

106

Summary and conclusions
• MIMO systems are a promising technique for high data rates
• Their efficiency depends on the channel between the transmitters and the receivers (power and correlation)
• Practical issues need to be resolved
• Open research questions need to be answered