
Page 1: ELG3175 Introduction to Communication Systems — site.uottawa.ca/~yongacog/courses/elg3175/Lecture18-19-AY-Coding.pdf

Lecture 18-19

Introduction to Error Control Coding

ELG3175 Introduction to Communication Systems

Page 2:

Error Control Coding

Purpose

1. Convert a problematic channel (i.e. one where increasing SNR hardly matters) into a non-problematic channel.
2. Improve data reliability and/or provide coding gain.

Coding gain: the difference in Eb/N0 required to attain a particular error rate without coding and with coding.

Page 3:

Types of Error Control

1. Forward Error Correction (FEC): based on the received signal, receiver attempts to correct the most likely errors.

2. Automatic repeat request (ARQ): receiver detects errors but cannot correct them, hence asks for retransmissions.

3. Hybrid FEC-ARQ: receiver tries to correct, if it cannot correct, then asks for retransmissions.

Page 4:

Major Classes of FEC Techniques

1. Block coding
2. Convolutional coding
3. Combined modulation and coding (trellis coded modulation: TCM)

•  In the first two categories, redundant bits are added to the transmitted sequence while the signal constellation remains the same. Hence, in return for error control capability, more bandwidth is utilized.
•  In TCM, the same number of information bits per symbol is transmitted, but the redundancy is achieved by using a larger constellation. Thus, there is usually no need for bandwidth expansion.

Page 5:

Parity Bits

•  Suppose we wish to transmit m = [1001001].
•  Let us assume that the second bit is received in error: r = [1101001].
•  The receiver has no way of knowing that the second bit has been incorrectly detected, so we must accept the consequences of the detection error.
•  Suppose, before transmission, we add an even parity bit to the message to be transmitted: mc = [10010011].
•  Now, let us assume that the second bit is in error: r = [11010011]. There are now five 1's, which is not permitted. Therefore the error is detected and the receiver can request a retransmission.
•  The detection of the error was made possible by the addition of the parity bit.
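The parity mechanism above can be sketched in a few lines of Python (the helper names are ours, not from the slides):

```python
def add_even_parity(bits):
    """Append a parity bit so the total number of 1s is even."""
    return bits + [sum(bits) % 2]

def error_detected(word):
    """An even-parity word with an odd number of 1s must contain errors."""
    return sum(word) % 2 == 1

m = [1, 0, 0, 1, 0, 0, 1]
mc = add_even_parity(m)       # [1, 0, 0, 1, 0, 0, 1, 1]
r = mc.copy()
r[1] ^= 1                     # second bit received in error
print(error_detected(r))      # True: the single error is detected
```

Note that flipping two bits leaves the parity even, so a single parity bit detects only odd numbers of errors.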

Page 6:

Block Codes

•  The data is grouped into segments of k bits.
•  Each block of k bits is encoded to produce a block of n bits, where n > k. The encoder adds redundancy to the data to be transmitted.
•  The code rate is r = k/n.

m = 1011 → encoder → c = 1011100

Page 7:

Code rate & minimum distance

Code rate:

r = k/n = (# of information bits in a block) / (# of total bits in a block)

The bandwidth expansion is n/k = 1/r. The energy per channel bit (Ec) is related to the energy per information bit (Eb) through Ec = r·Eb.

Minimum distance (dmin): the minimum number of positions in which any 2 codewords differ.

Page 8:

Linear Block Codes

•  Let C be a code made up of the vectors {c1, c2, …, cK}.
•  C is a linear code if for any ci and cj in C, ci + cj is also in C.
•  Example: C = {c1 = 0000, c2 = 0110, c3 = 1001, c4 = 1111}.
•  c1 + cx = cx for x = 1, 2, 3 or 4.
•  cx + cx = c1.
•  c2 + c3 = c4, c3 + c4 = c2, c2 + c4 = c3.
•  C is a linear code.
•  C2 = {c1 = 0001, c2 = 0111, c3 = 1000, c4 = 1110}.
•  cx + cx = 0000, which is not in C2.
•  C2 is not linear.
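Closure under addition is easy to check exhaustively for small codes. A quick sketch (helper names are ours):

```python
from itertools import combinations

def xor_words(a, b):
    # bitwise addition over GF(2) on bit strings
    return ''.join('1' if x != y else '0' for x, y in zip(a, b))

def is_linear(code):
    """A binary code is linear iff it contains the all-zero word
    and is closed under XOR of any two codewords."""
    n = len(next(iter(code)))
    return '0' * n in code and all(xor_words(a, b) in code
                                   for a, b in combinations(code, 2))

print(is_linear({'0000', '0110', '1001', '1111'}))  # True
print(is_linear({'0001', '0111', '1000', '1110'}))  # False
```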

Page 9:

Hamming Weight

•  For codeword cx of code C, its Hamming weight is the number of symbols in cx that are not 0.
•  C = {0000, 0110, 1001, 1111}
•  HW{0000} = 0
•  HW{0110} = 2
•  HW{1001} = 2
•  HW{1111} = 4

Page 10:

Hamming Distance

•  The Hamming distance between codewords ci and cj of C is the number of positions in which they differ.

       0000  0110  1001  1111
0000    0     2     2     4
0110    2     0     4     2
1001    2     4     0     2
1111    4     2     2     0

•  ci + cj = 0 in the positions in which they are the same and ci + cj = 1 in the positions in which they differ. Therefore HD{ci, cj} = HW{ci + cj}.
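The identity HD{ci, cj} = HW{ci + cj} is easy to verify numerically:

```python
def hamming_weight(c):
    return c.count('1')

def hamming_distance(a, b):
    return sum(x != y for x, y in zip(a, b))

a, b = '0110', '1001'
# ci + cj over GF(2) is the bitwise XOR of the two words
s = ''.join('1' if x != y else '0' for x, y in zip(a, b))
print(hamming_distance(a, b), hamming_weight(s))  # 4 4
```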

Page 11:

Minimum Distance

•  A code’s minimum distance is the minimum Hamming distance between two different codewords in the code.
•  In our example, dmin = 2.
•  We saw previously that HD{ci, cj} = HW{ci + cj} = HW{cx}, where, in the case of linear block codes, cx is another codeword in C excluding the all-zero codeword.
  –  Therefore, for linear block codes, dmin = the minimum Hamming weight over all codewords in C excluding the all-zero codeword.
•  In our example, if we exclude codeword 0000, the remaining codewords are 0110, 1001 and 1111. The minimum Hamming weight is 2, therefore dmin = 2.

Page 12:

Basis of a linear block code

•  C is a linear block code.
•  Let us choose k linearly independent codewords, c1, c2, …, ck. None of these k codewords can be expressed as a linear combination of the others.
•  All codewords in C can then be expressed as a linear combination of these k codewords.
  –  The k codewords selected form the basis of code C.
•  cx = a1c1 + a2c2 + a3c3 + … + akck, where ai = 0 or 1 (binary block codes).
•  In our example, we can select 0110 and 1111, or 0110 and 1001, or 1001 and 1111.
•  Example: let us select c1 = 0110 and c2 = 1111 as the basis of the code.
  –  0000 = 0c1 + 0c2, 0110 = 1c1 + 0c2, 1001 = 1c1 + 1c2 and 1111 = 0c1 + 1c2.

Page 13:

Generator Matrix

cx = mxG, where the rows of G are the basis codewords:

G = [ c1 ]
    [ c2 ]
    [ ⋮  ]
    [ ck ]

Example:

G = [ 1 1 1 1 ]
    [ 0 1 1 0 ]

[00]G = [0000], [10]G = [1111], [01]G = [0110], [11]G = [1001]

The dimensions of G are k×n.
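Encoding by c = mG over GF(2) can be sketched as follows, assuming the two-row example generator with basis rows 1111 and 0110:

```python
# c = mG over GF(2); the rows of G are the basis codewords of the example.
G = [[1, 1, 1, 1],
     [0, 1, 1, 0]]

def encode(m, G):
    """Multiply message row vector m by G with arithmetic mod 2."""
    n = len(G[0])
    return [sum(m[i] * G[i][j] for i in range(len(G))) % 2 for j in range(n)]

for m in ([0, 0], [1, 0], [0, 1], [1, 1]):
    print(m, encode(m, G))
```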

Page 14:

Equivalent codes

•  The codes generated by G1 and G2 are equivalent if they generate the same codewords but with a different mapping to message words.
•  Example:

G1 = [ 0 1 1 0 ]     G2 = [ 1 0 0 1 ]
     [ 1 1 1 1 ]          [ 1 1 1 1 ]

m     mG1    mG2
00    0000   0000
01    1111   1111
10    0110   1001
11    1001   0110

Page 15:

Systematic codes

•  A code is systematic if the message bits can be found at the beginning of the codeword.
•  c = [m | p].
•  Gsyst = [Ik | P].
•  Any generator matrix can be transformed into Gsyst using linear transformations.

Gsyst = [ 1 0 0 1 ]
        [ 0 1 1 0 ]

m     c
00    0000
01    0110
10    1001
11    1111

Page 16:

Parity Check Matrix

•  A parity check matrix H is a matrix with the property cHᵀ = 0.
•  cHᵀ = 0 can be written as mGHᵀ = 0.
•  Therefore GHᵀ = 0.
•  We can find H from Gsyst: H = [Pᵀ | In−k].
•  H has dimensions (n−k)×n.

H = [ 0 1 1 0 ]
    [ 1 0 0 1 ]
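The defining property GHᵀ = 0 can be checked directly; a sketch using the systematic example matrices:

```python
# Verify G H^T = 0 over GF(2) for Gsyst = [I2|P] and H = [P^T|I2].
G = [[1, 0, 0, 1],
     [0, 1, 1, 0]]
H = [[0, 1, 1, 0],
     [1, 0, 0, 1]]

def gf2_mul_transpose(A, B):
    """Compute A · B^T with arithmetic mod 2."""
    return [[sum(a * b for a, b in zip(ra, rb)) % 2 for rb in B]
            for ra in A]

print(gf2_mul_transpose(G, H))  # [[0, 0], [0, 0]]
```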

Page 17:

Example Hamming (7,4) code

G = [ 1 1 0 1 0 0 0 ]
    [ 0 1 1 0 1 0 0 ]
    [ 1 1 1 0 0 1 0 ]
    [ 1 0 1 0 0 0 1 ]

Find all of the codewords, find dmin, find H.

Page 18:

A simple block code: (7,4) Hamming Code

Message    Codeword
0000       000 0000
1000       110 1000
0100       011 0100
1100       101 1100
0010       111 0010
1010       001 1010
0110       100 0110
1110       010 1110
0001       101 0001
1001       011 1001
0101       110 0101
1101       000 1101
0011       010 0011
1011       100 1011
0111       001 0111
1111       111 1111
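The exercise on the previous slide (find all codewords and dmin) can be checked numerically. The generator rows below are read off the table (message 1000 maps to 110 1000, etc.):

```python
from itertools import product

G = [[1, 1, 0, 1, 0, 0, 0],
     [0, 1, 1, 0, 1, 0, 0],
     [1, 1, 1, 0, 0, 1, 0],
     [1, 0, 1, 0, 0, 0, 1]]

codewords = set()
for m in product([0, 1], repeat=4):
    codewords.add(tuple(sum(m[i] * G[i][j] for i in range(4)) % 2
                        for j in range(7)))

# For a linear code, dmin is the minimum nonzero codeword weight.
dmin = min(sum(c) for c in codewords if any(c))
print(len(codewords), dmin)  # 16 3
```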

Page 19:

Decoding

•  The received word is r = c + e, where e is the error pattern.
•  For example, if c = (1 1 0 0 1 1 0 1) and r = (1 0 0 0 1 1 0 1), then e = (0 1 0 0 0 0 0 0).
•  Assume that errors occur independently with probability p < 0.5.
  –  Therefore, code bits are correctly detected with probability (1 − p).
•  Lower-weight error patterns are more probable than higher-weight ones.

Page 20:

Example

•  C = {(00000), (01011), (10110), (11101)}
•  r = (11111)
•  If c = (00000), then e = (11111), which occurs with probability p^5.
•  If c = (01011), then e = (10100), which occurs with probability p^2(1−p)^3.
•  If c = (10110), then e = (01001), which occurs with probability p^2(1−p)^3.
•  If c = (11101), then e = (00010), which occurs with probability p(1−p)^4 > p^2(1−p)^3 > p^5.
•  Therefore the receiver selects c = (11101) as the most likely transmitted codeword and outputs the message that corresponds to this codeword.
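For p < 0.5, maximizing the likelihood is the same as minimizing the Hamming distance, so the receiver's choice can be sketched as:

```python
C = ['00000', '01011', '10110', '11101']
r = '11111'

def hd(a, b):
    return sum(x != y for x, y in zip(a, b))

# Minimum-distance decoding = maximum-likelihood decoding when p < 0.5.
best = min(C, key=lambda c: hd(c, r))
print(best, hd(best, r))  # 11101 1
```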

Page 21:

Standard Array Decoding

•  A lookup table that maps received words to most likely transmitted codewords.
•  Each received word points to a memory address which holds the value of the most likely transmitted word.

00000 01011 10110 11101
00001 01010 10111 11100
00010 01001 10100 11111
00100 01111 10010 11001
01000 00011 11110 10101
10000 11011 00110 01101
10001 11010 00111 01100
11000 10011 01110 00101

Page 22:

How to Build Standard Array

•  Write out all possible received words.
•  Remove all codewords and place them at the top of the columns, with the all-zero codeword at the left side (the leftmost column corresponds to the error pattern).
•  Take the lowest-weight vector from the remaining words and place it in the left column. Add this vector to all codewords and place each result below the corresponding codeword.
  –  Remove all of these results from the list of all possible received words.
•  Repeat until the list of possible received words is exhausted.

Page 23:

Syndrome decoding

•  S = rHᵀ.
•  r = c + e, therefore S = (c + e)Hᵀ = cHᵀ + eHᵀ = eHᵀ.
•  All vectors in the same row of the standard array produce the same syndrome.
•  The syndrome points to a memory address which contains the most likely error pattern; the decoder then computes c = r + e.

Page 24:

Example

•  For our code:

G = [ 1 0 1 1 0 ]
    [ 0 1 0 1 1 ]

H = [ 1 0 1 0 0 ]
    [ 1 1 0 1 0 ]
    [ 0 1 0 0 1 ]

Page 25:

Example continued

•  Suppose r = (01001). Then

S = rHᵀ = (0 1 0)

where the rows of Hᵀ are (110), (011), (100), (010), (001), i.e. the columns of H.

•  The syndrome (010) matches the 4th column of H, indicating that the 4th bit is in error: e = (00010) and c = (01011).
Page 26:

Error correcting and Error Detecting Capabilities of a code

•  t = number of errors that the decoder can always correct.
•  J = number of errors that the decoder can always detect.
•  t = (dmin − 1)/2 (dmin odd) or (dmin − 2)/2 (dmin even).
•  J = dmin − 1.
•  We can have codes that both correct and detect errors; then t + j = dmin − 1, where j > t.

Page 27:

How to calculate coding gain

Ec/N0 → Demodulator → p → FEC Decoder → Pb

Identify the variables:
•  p: demodulated, hard-decision channel bit error probability (if coded, an intermediate quantity; if uncoded, p = Pb)
•  Pb: decoded bit error rate (the important quantity)
•  Eb: energy per information bit
•  Ec: energy per channel (info or parity) bit; Ec = r·Eb

Page 28:

Consider the (7,4) Hamming code, and let p = 0.01.
•  Prob. of no error = (1−p)^7 = 0.932
•  Prob. of 1 error = 7(1−p)^6 p = 0.066
•  Thus 99.8% of codewords are correct(ed).
•  Prob. of 2 errors = 21(1−p)^5 p^2 = 0.00197
•  Prob. of 3 errors = 35(1−p)^4 p^3 = 0.00003
•  Simplify by saying 2 errors occur with prob. 0.002.
•  2 errors at the decoder input will cause 3 errors at the output.

Page 29:

•  0.002, i.e. 0.2%, of all code words contain 2 errors. Therefore, out of the decoder, 0.002 of code words have 3 of 7 positions in error.
•  The probability of any position being in error is Pb = 0.002 × 3/7 = 0.00085.
•  Conclusion: p = 0.01 → Pb = 0.00085.
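The slide's numbers can be reproduced with the binomial distribution:

```python
from math import comb

p, n = 0.01, 7
# P[k] = probability of exactly k channel errors in a 7-bit codeword
P = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]
print(round(P[0], 3), round(P[1], 3), round(P[2], 5))  # 0.932 0.066 0.00199

# 2-error words leave 3 of 7 positions in error after decoding.
Pb = P[2] * 3 / 7
print(round(Pb, 5))  # 0.00085
```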

Page 30:

Coding Gain in AWGN

•  In AWGN with 2PSK,

p = Q(√(2E/N0))

where E is Ec if coded and Eb if uncoded, and Ec = r·Eb, where r is the code rate.

Page 31:

(a) Coded: p = 0.01 → Pb = 0.00085. p = 0.01 corresponds to Ec/N0 = 4.1 dB. Since Ec = r·Eb,

Eb/N0 = Ec/N0 (in dB) + 10·log10(7/4) = 6.5 dB

(b) Uncoded: from Pb = Q(√(2Eb/N0)), we find that Pb = 0.00085 requires an Eb/N0 of 6.9 dB.

Coding gain = 6.9 − 6.5 = 0.4 dB
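The calculation can be redone numerically with Q(x) = erfc(x/√2)/2; the bisection inverse below is our own helper, and the exact figures differ slightly from the slide's values, which are read from a rounded curve:

```python
from math import sqrt, erfc, log10

def Q(x):
    """Gaussian tail probability."""
    return 0.5 * erfc(x / sqrt(2))

def Q_inv(y, lo=0.0, hi=10.0):
    """Numerical inverse of Q by bisection (helper of ours, not from the slides)."""
    for _ in range(60):
        mid = (lo + hi) / 2
        lo, hi = (mid, hi) if Q(mid) > y else (lo, mid)
    return (lo + hi) / 2

r = 4 / 7
EcN0_dB = 10 * log10(Q_inv(0.01) ** 2 / 2)          # from p = Q(sqrt(2 Ec/N0))
EbN0_coded = EcN0_dB + 10 * log10(1 / r)            # Ec = r Eb
EbN0_uncoded = 10 * log10(Q_inv(0.00085) ** 2 / 2)  # Pb = Q(sqrt(2 Eb/N0))
print(round(EbN0_coded, 1), round(EbN0_uncoded, 1))
```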

Page 32:

Coding Gain in Fading Channel

Rayleigh fading with 2PSK: p ≈ 1/(4E/N0).

Uncoded system: p = Pb. For p = 0.00085, the required Eb/N0 = 294 (24.7 dB).

If coded, p = 0.01 leads to Pb = 0.00085:

0.01 = 1/(4·Ec/N0) → Ec/N0 = 25 (14 dB)

Eb/N0 = Ec/N0 (in dB) + 10·log10(7/4) = 16.4 dB

Coding gain = 24.7 − 16.4 = 8.3 dB
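The same fading-channel numbers fall out of the high-SNR approximation directly:

```python
from math import log10

# High-SNR Rayleigh approximation for 2PSK: p ~ 1/(4 E/N0).
def snr_for_p(p):
    return 1 / (4 * p)

EbN0_uncoded = 10 * log10(snr_for_p(0.00085))   # 294 -> 24.7 dB
EcN0_coded = 10 * log10(snr_for_p(0.01))        # 25 -> 14 dB
EbN0_coded = EcN0_coded + 10 * log10(7 / 4)     # 16.4 dB
print(round(EbN0_uncoded - EbN0_coded, 1))      # 8.3
```

The gain is far larger than in AWGN because in fading the error rate falls only inversely with SNR, so every dB saved by coding is worth much more.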

Page 33:

Performance of some block codes

Page 34:

A convolutional code operates on the source data using a sliding window. The encoder uses the current input as well as past inputs. A block diagram of a rate 1/2 convolutional code is shown below. In this example, there are 2 memory locations, so 2 past inputs and 1 present input affect the output; the number of input bits affecting the output is referred to as the constraint length of the convolutional code.

Convolutional Codes

Page 35:

A simple convolutional encoder

Page 36:

Code Tree

In a code tree, every path through the tree indicates a distinct code word. The figure on the next page shows the code tree for the same simple code. Branches with solid lines represent “0” encoder inputs and branches with dashed lines represent “1”. Nodes corresponding to states 00, 01, 10 and 11 are indicated by different symbols. The tail bits at the end terminate the code so that the encoder returns to the all-zero state. As the length of the code increases, the number of branches increases exponentially.

Page 37:

[Figure: code tree of the rate-1/2 encoder. Solid branches are “0” inputs, dashed branches are “1” inputs, and each branch is labeled with its 2-bit output (00, 01, 10 or 11). The tree starts at time 0 and ends with the tail sequence at time 6.]

Page 38:

Decoding Techniques

There are many convolutional code decoding techniques.

1. Viterbi decoding
  –  Provides the optimum performance. Convenient for constraint lengths of approximately up to 10.
2. Sequential decoding
  –  Nearly optimum performance. Good for long constraint length codes because its complexity is a weak function of constraint length.
3. Feedback decoding
  –  Not as powerful, but much simpler.

Page 39:

Feedback Decoding

[Figure: feedback-decoding example on the code trellis. States a, b, c, d are shown over time steps t1 to t4 with branch labels 00, 01, 10 and 11; the received sequence 11 00 01 00 is decoded in two steps.]

Page 40:

Soft Decisions versus Hard Decisions

In all the preceding discussions we assumed the received signal was 0 or 1. This is called using hard decisions. With hard decisions we usually try to choose the codeword nearest to the received vector in terms of minimum Hamming distance (the Hamming distance between two vectors is the number of positions in which they differ). Ideally we would like to compare the received signal vector with all possible codewords and choose the nearest one in the geometric sense (i.e. the one with the minimum Euclidean distance). This would require the demodulator to send the decoder not binary values but a continuum of values. Receiving the actual signal amplitudes makes the task of the decoder much more complex; nevertheless, the performance improves because more reliable information is used. Even if the received signal is quantized into a few amplitudes, the improvement is considerable. Soft decisions based on 3-bit quantization are illustrated on the next page.

Page 41:

Soft versus Hard (ctd)

At this point we should mention that using soft decisions is possible but difficult and complex for most block codes. On the other hand, the Viterbi and sequential decoding techniques for convolutional codes are readily amenable to soft decision decoding.

Page 42:

Interleaver

•  A burst of errors is spread in time by using an interleaver.

•  Types of interleaver –  Block –  Convolutional –  Pseudorandom

Page 43:

Position of an interleaver

FEC encoder → Interleaver → Channel → De-interleaver → FEC decoder

Page 44:

Interleaving

Order of incoming symbols (written in row by row):

 1  2  3  4  5  6  7
 8  9 10 11 12 13 14
15 16 17 18 19 20 21
22 23 24 25 26 27 28
29 30 31 32 33 34 35

Order of transmitted symbols (read out column by column):

1 8 15 22 29 2 9 16 23 30 3 10 17 24 31 4 11 18 25 32 5 12 19 26 33 6 13 20 27 34 7 14 21 28 35
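The 5-row by 7-column block interleaver above can be sketched in a few lines:

```python
# Block interleaver: write symbols in by rows, read them out by columns.
rows, cols = 5, 7
incoming = list(range(1, rows * cols + 1))

matrix = [incoming[r * cols:(r + 1) * cols] for r in range(rows)]
transmitted = [matrix[r][c] for c in range(cols) for r in range(rows)]
print(transmitted[:7])  # [1, 8, 15, 22, 29, 2, 9]
```

Consecutive transmitted symbols come from different rows, so a short burst of channel errors is spread across several codewords after de-interleaving.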