TELE4653 Digital Modulation & Coding: Block Codes
Wei Zhang ([email protected]), School of EET, UNSW


Page 1: Tele4653 l9

TELE4653 Digital Modulation & Coding: Block Codes

Wei Zhang ([email protected])

School of EET, UNSW

Page 2: Tele4653 l9

2010-May-10 TELE4653 - Block Codes 2

First, we will talk about:

Information Theory: concerned with the theoretical limits and potential of communication systems.

Entropy, Mutual Information, Channel Capacity, Gaussian Channels.

Page 3: Tele4653 l9


Information Theory

The amount of information contained in a message signal depends greatly on its uncertainty. Messages with a high probability of occurrence convey relatively little information; information is proportional to the uncertainty of the outcome.

Information theory provides a quantitative measure of the amount of information contained in message signals, and determines the data transfer capacity of the communication channel.

If there is a set of M symbols {x1, x2, …, xM}, the information content of symbol x_i, denoted by I(x_i), is defined as:

I(x_i) = log2 (1/P(x_i)) = -log2 P(x_i)

where P(x_i) is the probability of occurrence of symbol x_i. Unit: bit.

With:

I(x_i) = 0                       for P(x_i) = 1
I(x_i) ≥ 0                       for 0 ≤ P(x_i) ≤ 1
I(x_i) > I(x_j)                  for P(x_i) < P(x_j)
I(x_i x_j) = I(x_i) + I(x_j)     if x_i and x_j are independent

Page 4: Tele4653 l9


Information Theory

Average information, entropy

Suppose we have M different and independent symbols {X: x1, x2, …, xM}, with respective probabilities of occurrence {p1, p2, …, pM}. Consider a long sequence of symbols being generated and transmitted; the mean value of I(x_i) is:

H(X) = E[I(x_i)] = Σ_{i=1}^{M} P(x_i) I(x_i) = -Σ_{i=1}^{M} P(x_i) log2 P(x_i) = Σ_{i=1}^{M} p_i log2 (1/p_i)

where H(X) is called the entropy of source X. It is a measure of the average information content per source symbol. Unit: bit per symbol

When p_i = 1, H(X) = 0. When p_i → 0, H(X) → 0, since lim_{p_i→0} p_i log2 (1/p_i) = 0.

Maximum entropy occurs when all symbols are equally probable, i.e. p_i = 1/M for all i = 1, …, M:

H(X)_max = Σ_{i=1}^{M} (1/M) log2 M = log2 M

Thus, 0 ≤ H(X) ≤ log2 M.
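The entropy and its bounds can be verified numerically; a minimal sketch in Python (the function name `entropy` is illustrative, not from the lecture):

```python
import math

def entropy(probs):
    """H(X) = -sum p_i log2 p_i, with the convention 0 * log2(0) = 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A biased 4-symbol source carries less information per symbol
# than a uniform one, which attains the maximum log2(M) = 2 bits:
H_biased = entropy([0.5, 0.25, 0.125, 0.125])   # 1.75 bits/symbol
H_uniform = entropy([0.25] * 4)                  # log2(4) = 2 bits/symbol
print(H_biased, H_uniform)
```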

Page 5: Tele4653 l9


Information Theory

Example: When there are two symbols with probabilities of occurrence p and (1-p), the entropy is:

H = -p log2 p - (1-p) log2 (1-p)

Consider d/dx (log_a x) = (1/x) log_a e. Then

dH/dp = -log2 p - log2 e + log2 (1-p) + log2 e = log2 ((1-p)/p)

For the maximum value of entropy H, set dH/dp = 0:

log2 ((1-p)/p) = 0  ⇒  (1-p)/p = 1  ⇒  p = 1/2

Thus,

H_max = -(1/2) log2 (1/2) - (1/2) log2 (1/2) = 1 bit/symbol

[Figure: H versus p; H = 0 at p = 0 and p = 1, rising to the maximum H = 1 at p = 1/2.]

When p = 1, H = 0. When p → 0, H → 0.
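The derivation above can be checked numerically; a minimal sketch (the function name `binary_entropy` is illustrative):

```python
import math

def binary_entropy(p):
    """H = -p log2 p - (1-p) log2 (1-p), with the convention H(0) = H(1) = 0."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Scan a grid of p values: the maximum (1 bit) is attained at p = 1/2,
# matching the dH/dp = 0 result above.
grid = [k / 1000 for k in range(1001)]
p_star = max(grid, key=binary_entropy)
print(p_star, binary_entropy(p_star))
```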

Page 6: Tele4653 l9


Information Theory

Information Rate

If the source X generates messages at a rate of r symbols per second, the information rate R is defined as:

R = rH(X)   bits per second

Example: Consider a telegraph source having two symbols, dot and dash. The dot duration is 0.2 s. The dash duration is 3 times the dot duration. The probability of the dot's occurrence is twice that of the dash, and the time between symbols is 0.2 s. Calculate the information rate of the telegraph source.

Solution:

P(dot) = 2 P(dash) and P(dot) + P(dash) = 1  ⇒  P(dash) = 1/3, P(dot) = 2/3

Average information:

H = -P(dot) log2 P(dot) - P(dash) log2 P(dash) = -(2/3) log2 (2/3) - (1/3) log2 (1/3) = 0.92 bits/symbol

With t_dot = 0.2 s, t_dash = 0.6 s, t_space = 0.2 s, the average time per symbol is:

T_s = P(dot) t_dot + P(dash) t_dash + t_space = 0.53333 s/symbol

Average symbol rate: r = 1/T_s = 1.875 symbols/s

Average information rate: R = rH = 1.875 (0.92) = 1.725 bits/s
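The telegraph example can be reproduced directly (the slide rounds H to 0.92 before forming R, hence 1.725 bits/s; keeping full precision gives about 1.72 bits/s):

```python
import math

# Dot/dash probabilities and durations from the example.
p_dot, p_dash = 2 / 3, 1 / 3
t_dot, t_dash, t_space = 0.2, 0.6, 0.2   # seconds

H = -p_dot * math.log2(p_dot) - p_dash * math.log2(p_dash)   # bits/symbol
T_s = p_dot * t_dot + p_dash * t_dash + t_space              # s/symbol
r = 1 / T_s                                                  # symbols/s
R = r * H                                                    # bits/s
print(round(H, 4), round(T_s, 5), round(r, 3), round(R, 3))
```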

Page 7: Tele4653 l9


Information Theory

Channel Capacity

Consider an additive white Gaussian noise (AWGN) channel,

Y = X + n

where X is the channel input, Y the channel output, and n additive band-limited white Gaussian noise with zero mean and variance σ_n².

The capacity C_s of an AWGN channel is given by:

C_s = (1/2) log2 (1 + S/N)   bits/sample

If the channel bandwidth is B Hz, the output y(t) is also a band-limited signal completely characterized by its periodic sample values taken at the Nyquist rate 2B samples/second. Hence

C = 2B C_s = B log2 (1 + S/N)   bits/s   (Shannon-Hartley Law)

where S/N is the signal-to-noise ratio at the channel output, S is the signal power, and N is the total noise power within the channel bandwidth: N = ηB, where η/2 is the noise power spectral density (two-sided).

Page 8: Tele4653 l9


Information Theory

Given a source of M equally likely messages, with M > 1, which is generating information at a rate R, and given a channel with channel capacity C:

• If R ≤ C, there exists a coding technique such that the output of the source may be transmitted over the AWGN channel with an arbitrarily small probability of error in the received message.

• If R > C, the probability of error is close to unity for every possible set of M signal symbols, i.e., as M → ∞, Pe → 1.

C = B log2 (1 + S/N)   bits/s

The Shannon-Hartley theorem indicates that a noiseless Gaussian channel (S/N = ∞) has an infinite capacity. For a practical AWGN channel, the noise limits the channel capacity.

Channel Capacity

Page 9: Tele4653 l9


Information Theory

Channel Capacity

C = B log2 (1 + S/N)   bits/s

where N = ηB, and η/2 is the noise power spectral density (two-sided), i.e.

C = B log2 (1 + S/(ηB))   bits/s

Let λ = S/(ηB). Then

C = (S/η) (1/λ) log2 (1 + λ) = (S/η) log2 e · (1/λ) ln (1 + λ)

Consider B → ∞, i.e. λ → 0. Since lim_{λ→0} (1/λ) ln (1 + λ) = 1,

C_∞ = lim_{B→∞} C = (S/η) log2 e = 1.44 S/η

The channel capacity does not become infinite as the bandwidth becomes infinite (an increase in bandwidth also increases the total noise power N = ηB).

Trade-off between SNR and bandwidth:

e.g. If SNR = 7 and B = 4 kHz, C = 12 kbit/s. If SNR = 15 and B = 3 kHz, C = 12 kbit/s.
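Both numerical claims, the equal-capacity operating points and the 1.44 S/η wideband limit, can be checked with a short sketch (`capacity` is an illustrative helper name):

```python
import math

def capacity(snr, bandwidth_hz):
    """Shannon-Hartley: C = B log2(1 + S/N) in bits/s."""
    return bandwidth_hz * math.log2(1 + snr)

# The two operating points from the slide give the same capacity:
c1 = capacity(7, 4000)    # 4000 * log2(8)  = 12000 bits/s
c2 = capacity(15, 3000)   # 3000 * log2(16) = 12000 bits/s

# As B grows with S/eta fixed, C approaches (S/eta) * log2(e) = 1.44 S/eta:
S_over_eta = 1000.0
c_limit = S_over_eta / math.log(2)           # wideband limit
c_wide = capacity(S_over_eta / 1e9, 1e9)     # huge bandwidth, lambda -> 0
print(c1, c2, round(c_limit, 1), round(c_wide, 1))
```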

Page 10: Tele4653 l9


Information Theory

Example: An analog signal having 4-kHz bandwidth is sampled at 1.25 times the Nyquist rate, and each sample is quantized into one of 256 equally likely levels. Assume that the successive samples are statistically independent.

Nyquist rate = 2(4000) = 8000 samples/s
Symbol rate r = 8000(1.25) = 10^4 samples/s
Entropy H(X) = log2 256 = 8 bits/sample
Information rate R = rH(X) = 10^4 (8) = 80 kbits/s

For an S/N ratio of 20 dB (S/N = 100) and channel bandwidth B = 10 kHz:

Channel capacity C = B log2 (1 + S/N) = 10^4 log2 (1 + 100) = 66.6 kbits/s

Since R > C, error-free transmission is not possible.

For error-free transmission (R ≤ C) with an S/N ratio of 20 dB:

C = B log2 (1 + 100) ≥ R = 80 × 10^3

Required channel bandwidth: B ≥ (80 × 10^3) / log2 (101) ≈ 12 kHz
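The arithmetic of this example can be verified as follows (a sketch; variable names are illustrative):

```python
import math

# 4 kHz analog signal, sampled at 1.25x the Nyquist rate, 256 levels.
nyquist = 2 * 4000               # 8000 samples/s
r = nyquist * 1.25               # 10^4 samples/s
H = math.log2(256)               # 8 bits/sample (equally likely levels)
R = r * H                        # information rate, bits/s

snr = 10 ** (20 / 10)            # 20 dB -> S/N = 100
C = 10_000 * math.log2(1 + snr)  # capacity with B = 10 kHz

# Smallest bandwidth giving C >= R at the same SNR:
B_required = R / math.log2(1 + snr)
print(R, round(C), round(B_required))
```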

Page 11: Tele4653 l9


Next, we are going to talk about:

• Linear block codes
• The error detection and correction capability
• Encoding and decoding
• Hamming codes

Page 12: Tele4653 l9


Block diagram of a DCS

Transmitter: Format → Source encode → Channel encode → Pulse modulate → Bandpass modulate (digital modulation) → Channel

Receiver: Channel → Demodulate/Sample → Detect (digital demodulation) → Channel decode → Source decode → Format

Page 13: Tele4653 l9


Channel coding: transforming signals to improve communications performance by increasing the robustness against channel impairments (noise, interference, fading, ...).

Waveform coding: transforming waveforms into better waveforms.

Structured sequences: transforming data sequences into better sequences having structured redundancy. "Better" in the sense of making the decision process less subject to errors.

What is channel coding?

Page 14: Tele4653 l9


Error control techniques

Automatic Repeat reQuest (ARQ): full-duplex connection, error detection codes. The receiver sends feedback to the transmitter indicating whether an error was detected in the received packet (Negative Acknowledgement (NACK)) or not (Acknowledgement (ACK)). The transmitter retransmits the previously sent packet if it receives a NACK.

Forward Error Correction (FEC): simplex connection, error correction codes. The receiver tries to correct some errors.

Hybrid ARQ (ARQ+FEC): full-duplex connection, error detection and correction codes.

Page 15: Tele4653 l9


Repetition Codes

Transmit the message {0 0 1 0}.

Encoder R_n: repeat every bit n times. Then R3 gives:

{000 000 111 000}

Received bits: {000 100 110 000}

Decoder: majority vote.

Page 16: Tele4653 l9


Repetition Codes

The probability of error of R_n (where f is the bit error probability of the channel) is:

P_B = Σ_{j=(n+1)/2}^{n} C(n, j) f^j (1 - f)^{n-j}

What is P_B of R3?
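As a check on the formula, the slide's question for R3 can be answered numerically; for n = 3 the sum expands to 3f²(1-f) + f³, e.g. 0.028 for f = 0.1 (a sketch):

```python
from math import comb

def repetition_error_prob(n, f):
    """P_B = sum_{j=(n+1)/2}^{n} C(n,j) f^j (1-f)^(n-j), for odd n."""
    return sum(comb(n, j) * f ** j * (1 - f) ** (n - j)
               for j in range((n + 1) // 2, n + 1))

# For R3: decoder errs when 2 or 3 of the 3 copies are flipped.
f = 0.1
pb3 = repetition_error_prob(3, f)
print(pb3)
```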

Page 17: Tele4653 l9


Linear block codes

The information bit stream is chopped into blocks of k bits. Each block is encoded to a larger block of n bits. The coded bits are modulated and sent over channel. The reverse procedure is done at the receiver.

Data block (k bits) → Channel encoder → Codeword (n bits)

Code rate: R_c = k/n
Redundant bits: n - k

Page 18: Tele4653 l9


Linear block codes – cont’d

Encoding in (n,k) block code:

U = mG

(u1, u2, ..., un) = (m1, m2, ..., mk) G = m1 V1 + m2 V2 + ... + mk Vk

where U = (u1, u2, ..., un) is the codeword, m = (m1, m2, ..., mk) is the message, and V1, V2, ..., Vk are the rows of the generator matrix G. The rows of G are linearly independent.

Page 19: Tele4653 l9


Linear block codes – cont’d

Example: Block code (6,3)

    [V1]   [1 1 0 1 0 0]
G = [V2] = [0 1 1 0 1 0]
    [V3]   [1 0 1 0 0 1]

Message vector | Codeword
000            | 000000
100            | 110100
010            | 011010
110            | 101110
001            | 101001
101            | 011101
011            | 110011
111            | 000111
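The codeword table above can be regenerated by carrying out U = mG over GF(2); a minimal sketch in plain Python (helper names are illustrative):

```python
from itertools import product

# Generator matrix of the (6,3) example code, rows V1, V2, V3.
G = [
    [1, 1, 0, 1, 0, 0],   # V1
    [0, 1, 1, 0, 1, 0],   # V2
    [1, 0, 1, 0, 0, 1],   # V3
]

def encode(m, G):
    """Codeword U = m1*V1 + m2*V2 + m3*V3 (mod 2)."""
    n = len(G[0])
    return tuple(sum(m[i] * G[i][j] for i in range(len(G))) % 2
                 for j in range(n))

codebook = {m: encode(m, G) for m in product((0, 1), repeat=3)}
for m, u in sorted(codebook.items()):
    print(m, "->", u)
```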

Page 20: Tele4653 l9


Linear block codes – cont’d

Systematic block code (n,k): for a systematic code, the first (or last) k elements in the codeword are information bits.

G = [P | I_k]

where I_k is the k × k identity matrix and P is a k × (n-k) matrix.

U = (u1, u2, ..., un) = (p1, p2, ..., p_{n-k}, m1, m2, ..., mk)
                          parity bits           message bits

Page 21: Tele4653 l9


Linear block codes – cont’d

For any linear code we can find an (n-k) × n matrix H whose rows are orthogonal to the rows of G:

G H^T = 0

H is called the parity check matrix and its rows are linearly independent. For systematic linear block codes:

H = [I_{n-k} | P^T]
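For the (6,3) example code, G = [P | I3] with P read off from the generator matrix, so H = [I3 | P^T]. A quick numerical check (a sketch) that G H^T = 0 mod 2:

```python
# Generator and parity-check matrices of the (6,3) example code.
G = [[1, 1, 0, 1, 0, 0],
     [0, 1, 1, 0, 1, 0],
     [1, 0, 1, 0, 0, 1]]
# H = [I3 | P^T], where P is the left 3x3 block of G.
H = [[1, 0, 0, 1, 0, 1],
     [0, 1, 0, 1, 1, 0],
     [0, 0, 1, 0, 1, 1]]

# G H^T over GF(2): every entry should be 0.
GHT = [[sum(g[j] * h[j] for j in range(6)) % 2 for h in H] for g in G]
print(GHT)
```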

Page 22: Tele4653 l9


Linear block codes – cont’d

The Hamming weight of vector U, denoted by w(U), is the number of non-zero elements in U.

The Hamming distance between two vectors U and V, is the number of elements in which they differ.

The Hamming distance satisfies d(U, V) = w(U ⊕ V). The minimum distance of a block code is

d_min = min_{i≠j} d(U_i, U_j) = min_i w(U_i)

where the minimum weight is taken over the non-zero codewords.

Page 23: Tele4653 l9


Linear block codes – cont’d

Error detection capability is given by

e = d_min - 1

Error-correcting capability t of a code, which is defined as the maximum number of guaranteed correctable errors per codeword, is

t = ⌊(d_min - 1)/2⌋
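For the (6,3) example code these quantities can be computed directly, using the fact that for a linear code d_min equals the minimum weight of the non-zero codewords; a sketch:

```python
from itertools import product

G = [[1, 1, 0, 1, 0, 0],
     [0, 1, 1, 0, 1, 0],
     [1, 0, 1, 0, 0, 1]]

def encode(m):
    """U = mG over GF(2)."""
    return tuple(sum(m[i] * G[i][j] for i in range(3)) % 2 for j in range(6))

# Minimum weight over non-zero codewords = d_min for a linear code.
weights = [sum(encode(m)) for m in product((0, 1), repeat=3) if any(m)]
d_min = min(weights)
e = d_min - 1            # guaranteed error detection capability
t = (d_min - 1) // 2     # guaranteed error correction capability
print(d_min, e, t)
```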

Page 24: Tele4653 l9


Linear block codes – cont’d

For memoryless channels, the probability that the decoder commits an erroneous decoding is

P_B = Σ_{j=t+1}^{n} C(n, j) f^j (1 - f)^{n-j}

where f is the transition probability or bit error probability over the channel.

The decoded bit error probability is

P_b ≈ (1/n) Σ_{j=t+1}^{n} j C(n, j) f^j (1 - f)^{n-j}
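These two expressions can be evaluated for a t = 1 code such as the (7,4) Hamming code introduced on a later slide; a sketch with an assumed channel error probability f = 0.01:

```python
from math import comb

def block_error_prob(n, t, f):
    """P_B: probability of more than t channel errors in a block of n bits."""
    return sum(comb(n, j) * f ** j * (1 - f) ** (n - j)
               for j in range(t + 1, n + 1))

def bit_error_prob(n, t, f):
    """P_b ~ (1/n) sum_{j=t+1}^{n} j C(n,j) f^j (1-f)^(n-j)."""
    return sum(j * comb(n, j) * f ** j * (1 - f) ** (n - j)
               for j in range(t + 1, n + 1)) / n

f = 0.01
PB = block_error_prob(7, 1, f)
pb = bit_error_prob(7, 1, f)
print(PB, pb)   # decoded bit error rate is well below the raw f = 0.01
```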

Page 25: Tele4653 l9


Linear block codes – cont’d

Syndrome testing:

[Diagram: Data source → Format → Channel encoding → Modulation → Channel → Demodulation/Detection → Channel decoding → Format → Data sink, with m the message, U the transmitted codeword, and r the received vector.]

r = (r1, r2, ..., rn)   received codeword vector
e = (e1, e2, ..., en)   error pattern (vector)

r = U + e

S = rH^T = eH^T

S is the syndrome of r, corresponding to the error pattern e.

Page 26: Tele4653 l9


Linear block codes – cont’d

Error pattern | Syndrome
000000        | 000
000001        | 101
000010        | 011
000100        | 110
001000        | 001
010000        | 010
100000        | 100
010001        | 111

U = (101110) transmitted; r = (001110) received.
The syndrome of r is computed: S = rH^T = (100).
The error pattern corresponding to this syndrome is ê = (100000).
The corrected vector is estimated: Û = r + ê = (001110) + (100000) = (101110).
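The syndrome-decoding procedure on this slide can be sketched in Python. The lookup table below covers the zero and single-bit error patterns; the slide's table additionally assigns the leftover syndrome 111 to the double-error pattern 010001:

```python
# Parity-check matrix of the (6,3) example code, H = [I3 | P^T].
H = [[1, 0, 0, 1, 0, 1],
     [0, 1, 0, 1, 1, 0],
     [0, 0, 1, 0, 1, 1]]

def syndrome(v):
    """S = v H^T (mod 2)."""
    return tuple(sum(v[j] * row[j] for j in range(6)) % 2 for row in H)

# Table of correctable error patterns, keyed by syndrome.
patterns = [tuple(1 if j == i else 0 for j in range(6)) for i in range(6)]
table = {syndrome(e): e for e in patterns}
table[(0, 0, 0)] = (0,) * 6

U = (1, 0, 1, 1, 1, 0)       # transmitted codeword
r = (0, 0, 1, 1, 1, 0)       # received with an error in the first bit
S = syndrome(r)              # syndrome of r
e_hat = table[S]             # estimated error pattern
U_hat = tuple((a + b) % 2 for a, b in zip(r, e_hat))
print(S, e_hat, U_hat)
```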

Page 27: Tele4653 l9


Hamming codes

Hamming codes are a subclass of linear block codes. They are expressed as a function of a single integer m (m = 2, 3, ...):

Code length:                 n = 2^m - 1
Number of information bits:  k = 2^m - m - 1
Number of parity bits:       n - k = m
Error correction capability: t = 1

Example: (7,4) Hamming code (m = 3).

Page 28: Tele4653 l9


Hamming codes

Example: Systematic Hamming code (7,4)

                [1 1 0 | 1 0 0 0]
G = [P | I_4] = [1 1 1 | 0 1 0 0]
                [0 1 1 | 0 0 1 0]
                [1 0 1 | 0 0 0 1]
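As a check, the 16 codewords generated by this G can be enumerated and their minimum distance confirmed to be 3 (hence t = 1), consistent with the Hamming-code parameters; a sketch:

```python
from itertools import product

# Systematic (7,4) Hamming generator, G = [P | I4] as on the slide.
G = [[1, 1, 0, 1, 0, 0, 0],
     [1, 1, 1, 0, 1, 0, 0],
     [0, 1, 1, 0, 0, 1, 0],
     [1, 0, 1, 0, 0, 0, 1]]

def encode(m):
    """U = mG over GF(2)."""
    return tuple(sum(m[i] * G[i][j] for i in range(4)) % 2 for j in range(7))

codewords = [encode(m) for m in product((0, 1), repeat=4)]
# For a linear code, d_min = minimum weight of the non-zero codewords.
d_min = min(sum(c) for c in codewords if any(c))
print(len(codewords), d_min)
```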

Page 29: Tele4653 l9


Hamming codes

Representation of encoding for Hamming code (7,4)

[Figure: Venn-diagram representation of the (7,4) encoding, with the seven bits u1, ..., u7 placed in three overlapping parity circles.]

Page 30: Tele4653 l9


Example of the block codes

[Figure: bit error probability P_B versus Eb/N0 [dB], comparing block-coded transmission with uncoded QPSK and 8PSK.]