State Machines & Trellises
MIT 6.02 Fall 2009, Lecture 10
web.mit.edu/6.02/www/f2009/handouts/lectures/L10.pdf




6.02 Fall 2009 Lecture 10, Slide #1

6.02 Fall 2009 Lecture #10

•  state machines & trellises
•  path and branch metrics
•  Viterbi algorithm: add-compare-select
•  hard decisions vs. soft decisions

6.02 Fall 2009 Lecture 10, Slide #2

State Machines & Trellises

•  Example: k=3, rate ½ convolutional code
   –  G0 = 111: p0 = 1*x[n] ⊕ 1*x[n-1] ⊕ 1*x[n-2]
   –  G1 = 110: p1 = 1*x[n] ⊕ 1*x[n-1] ⊕ 0*x[n-2]

•  States labeled with x[n-1] x[n-2]

•  Arcs labeled with x[n]/p0p1

[Figure: state-transition diagram and one stage of the corresponding trellis. There are 2^(k-1) = 4 states; trellis columns are indexed by time, rows by state x[n-1]x[n-2]; the starting state is 00. The eight arcs, labeled x[n]/p0p1:

    from 00:  0/00 → 00    1/11 → 10
    from 01:  0/10 → 00    1/01 → 10
    from 10:  0/11 → 01    1/00 → 11
    from 11:  0/01 → 01    1/10 → 11 ]

(⊕ is addition mod 2, aka XOR)
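
The state machine translates directly into code. A minimal Python sketch of this encoder (the name conv_encode and its bit-packing layout are mine, not from the lecture):

    def conv_encode(msg_bits, G=(0b111, 0b110), K=3):
        """Rate-1/2 convolutional encoder for generators G0=111, G1=110.

        State is the last K-1 message bits (x[n-1] x[n-2]); each input bit
        emits one parity bit per generator (p0, p1), then shifts the state.
        """
        state = 0                                  # x[n-1] x[n-2]: starting state 00
        parity = []
        for bit in msg_bits:
            window = (bit << (K - 1)) | state      # bits x[n] x[n-1] x[n-2]
            for g in G:                            # one parity bit per generator
                parity.append(bin(window & g).count('1') % 2)   # XOR of tapped bits
            state = window >> 1                    # shift: new state is x[n] x[n-1]
        return parity

    # Message 1011 padded with K-1 = 2 zeros, as in the table on the next slide:
    print(conv_encode([1, 0, 1, 1, 0, 0]))   # -> 1,1, 1,1, 0,1, 0,0, 0,1, 1,0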

6.02 Fall 2009 Lecture 10, Slide #3

Example

•  Using k=3, rate ½ code from earlier slides

•  Received: 111011000110

•  Some errors have occurred…

•  What’s the 4-bit message?

•  Look for message whose xmit bits are closest to rcvd bits

Msg   Xmit*          d        Msg   Xmit*          d
0000  000000000000   7        1000  111110000000   4
0001  000000111110   8        1001  111110111110   5
0010  000011111000   8        1010  111101111000   7
0011  000011010110   4        1011  111101000110   2
0100  001111100000   6        1100  110001100000   5
0101  001111011110   5        1101  110001011110   4
0110  001101001000   7        1110  110010011000   6
0111  001100100110   6        1111  110010100110   3

(Rcvd = 111011000110 for every row; d = Hamming distance between Xmit and Rcvd)

Most likely: 1011

*Msg padded with 2 zeros before transmission
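
The table is just an exhaustive search, easy to express in code; a sketch reusing the hypothetical conv_encode helper above:

    from itertools import product

    def hamming(a, b):
        """Number of positions where two bit sequences differ."""
        return sum(x != y for x, y in zip(a, b))

    rcvd = [int(c) for c in "111011000110"]

    # Try all 16 four-bit messages, each padded with two zeros before encoding.
    best = min(product([0, 1], repeat=4),
               key=lambda m: hamming(conv_encode(list(m) + [0, 0]), rcvd))
    print(''.join(map(str, best)))   # -> 1011

This works, but it scores all 2^N candidate messages, which is hopeless for long streams; the Viterbi algorithm below recovers the same answer with work proportional to N.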

6.02 Fall 2009 Lecture 10, Slide #4

Viterbi Algorithm

•  Want: Most likely message sequence

•  Have: (possibly corrupted) received parity sequences

•  Viterbi algorithm for a given k and r:
   –  Works incrementally to compute most likely message sequence
   –  Uses two metrics

•  Branch metric: BM(xmit,rcvd) measures likelihood that transmitter sent xmit given that we’ve received rcvd.
   –  “Hard decision”: use digitized bits, compute Hamming distance between xmit and rcvd. Smaller distance is more likely if BER is small
   –  “Soft decision”: use received voltages (more later…)

•  Path metric: PM[s,i] for each state s of the 2^(k-1) transmitter states and bit time i where 0 ≤ i < len(message).
   –  PM[s,i] = most likely BM(xmit_m, received parity) over all message sequences m that leave transmitter in state s at time i
   –  PM[s,i+1] computed from PM[s,i] and p0[i],…,p(r-1)[i]


6.02 Fall 2009 Lecture 10, Slide #5

Hard-decision Branch Metric

•  BM = Hamming distance between expected parity bits and received parity bits

•  Compute BM for each transition arc in trellis

–  Example: received parity = 00

   –  BM(00,00) = 0   BM(01,00) = 1   BM(10,00) = 1   BM(11,00) = 2

•  Will be used in computing PM[s,i+1] from PM[s,i].

•  We’ll want most likely BM, which, since we’re using Hamming distance, means minimum BM. (A code sketch follows the figure.)

[Figure: one trellis stage, time i to i+1, with each arc labeled by its branch metric for received parity 00: BM 0 for arcs emitting 00, BM 1 for arcs emitting 01 or 10, BM 2 for arcs emitting 11.]
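
In code the hard-decision branch metric is one line; a sketch (the helper name is mine):

    def branch_metric(expected, received):
        """Hard-decision branch metric: Hamming distance between the parity bits
        an arc would emit and the parity bits actually received."""
        return sum(e != r for e, r in zip(expected, received))

    # Received parity 00, as in the example above:
    for exp in [(0, 0), (0, 1), (1, 0), (1, 1)]:
        print(exp, branch_metric(exp, (0, 0)))   # BM = 0, 1, 1, 2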

6.02 Fall 2009 Lecture 10, Slide #6

Computing PM[s,i+1]

Starting point: we’ve computed PM[s,i], shown graphically as label in trellis box for each state at time i.

Example: PM[00,i] = 1 means there was 1 bit error detected when comparing received parity bits to what would have been transmitted when sending the most likely message, considering all messages that leave the transmitter in state 00 at time i.

Q: What’s the most likely state s for the transmitter at time i?

A: state 00 (smallest PM[s,i])

[Figure: trellis stage from time i to i+1; each state’s box at time i is labeled with its path metric: PM[00,i]=1, PM[01,i]=3, PM[10,i]=3, PM[11,i]=2.]

6.02 Fall 2009 Lecture 10, Slide #7

Computing PM[s,i+1] cont’d.

Q: If the transmitter is in state s at time i+1, what state(s) could it have been in at time i?

A: For each state s, there are two predecessor states α and β in the trellis diagram

Example: for state 01, α=10 and β=11.

Any message sequence that leaves the transmitter in state s at time i+1 must have left the transmitter in state α or state β at time i.

[Figure: the same trellis stage (PM[00,i]=1, PM[01,i]=3, PM[10,i]=3, PM[11,i]=2), with the two arcs entering state 01, from predecessors α=10 and β=11, highlighted.]

6.02 Fall 2009 Lecture 10, Slide #8

Computing PM[s,i+1] cont’d.

Example cont’d: to arrive in state 01 at time i+1, either

1)  The transmitter was in state 10 at time i and the ith message bit was a 0. If that’s the case, the transmitter sent 11 as the parity bits and there were 2 bit errors since we received 00. Total bit errors = PM[10,i] + 2 = 5

OR

2)  The transmitter was in state 11 at time i and the ith message bit was a 0. If that’s the case, the transmitter sent 01 as the parity bits and there was 1 bit error since we received 00. Total bit errors = PM[11,i] + 1 = 3

Which is most likely?

[Figure: the same trellis stage; the box for state 01 at time i+1 is marked “?”, awaiting min(5, 3).]


6.02 Fall 2009 Lecture 10, Slide #9

Computing PM[s,i+1] cont’d.

Formalizing the computation:

PM[s,i+1] = min(PM[α,i] + BM[α→s], PM[β,i] + BM[β→s])

Example:

PM[01,i+1] = min(PM[10,i] + 2, PM[11,i] + 1)
           = min(3+2, 2+1) = 3

Notes:
1)  Remember which arc was min; saved arcs will form a path through trellis
2)  If both arcs have same sum, break tie arbitrarily (e.g., when computing PM[11,i+1])

(The add-compare-select step is sketched in code after the figure.)

[Figure: the trellis stage with both columns filled in: path metrics 1, 3, 3, 2 at time i and 1, 3, 3, 3 at time i+1 (states 00, 01, 10, 11); PM[11,i+1] is the tie broken arbitrarily.]
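
The add-compare-select step can be written out directly; a Python sketch (the preds table encodes the trellis arcs from Slide #2, branch_metric is the helper sketched earlier, and all the names are mine):

    # One add-compare-select (ACS) step for the k=3, rate-1/2 code.
    # preds[s] lists (predecessor state, parity bits emitted on that arc).
    preds = {
        0b00: [(0b00, (0, 0)), (0b01, (1, 0))],
        0b01: [(0b10, (1, 1)), (0b11, (0, 1))],
        0b10: [(0b00, (1, 1)), (0b01, (0, 1))],
        0b11: [(0b10, (0, 0)), (0b11, (1, 0))],
    }

    def acs(pm, rcvd_pair):
        """Compute PM[s,i+1] for all s from PM[s,i]; also return which
        predecessor was chosen for each state (needed later for traceback)."""
        new_pm, choice = {}, {}
        for s, options in preds.items():
            # add the branch metric to each predecessor's path metric...
            cands = [(pm[p] + branch_metric(bits, rcvd_pair), p)
                     for p, bits in options]
            # ...then compare, and select the smaller (min also breaks ties)
            new_pm[s], choice[s] = min(cands)
        return new_pm, choice

    pm = {0b00: 1, 0b01: 3, 0b10: 3, 0b11: 2}     # the example above
    print(acs(pm, (0, 0))[0])                      # -> {0: 1, 1: 3, 2: 3, 3: 3}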

6.02 Fall 2009 Lecture 10, Slide #10

Finding the Most-likely Path

•  Path metric: number of errors on most-likely path to given state (min of all paths leading to state)

•  Branch metric: for each arrow, the Hamming distance between received parity and expected parity

[Figure: the full six-stage trellis for the running example, every arc labeled with its expected parity bits. Starting path metrics: 0 for state 00, ∞ for the other states. Rcvd: 11 10 11 00 01 10.]

6.02 Fall 2009 Lecture 10, Slide #11

Viterbi Algorithm

•  Compute branch metrics for next set of parity bits
•  Compute path metric for next column (a code sketch follows the figure)
   –  add branch metric to path metric for old state
   –  compare sums for paths arriving at new state
   –  select path with smallest value (fewest errors, most likely)

[Figure: the trellis after processing the first received pair (11). Branch metrics for this stage: 2 for arcs emitting 00, 0 for 11, 1 for 01 or 10. Resulting path metrics: 2, ∞, 0, ∞ (states 00, 01, 10, 11).]
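
Iterating add-compare-select over the whole received stream gives the forward pass of the decoder; a sketch reusing the hypothetical acs and preds helpers above:

    import math

    def viterbi_forward(rcvd_pairs):
        """Run add-compare-select over every received parity pair; return the
        per-stage path metrics and chosen predecessors (for traceback)."""
        pm = {s: (0 if s == 0b00 else math.inf) for s in preds}  # start in state 00
        history = []
        for pair in rcvd_pairs:
            pm, choice = acs(pm, pair)
            history.append((pm, choice))
        return history

    rcvd_pairs = [(1, 1), (1, 0), (1, 1), (0, 0), (0, 1), (1, 0)]
    for pm, _ in viterbi_forward(rcvd_pairs):
        print({format(s, '02b'): m for s, m in pm.items()})
    # columns (states 00,01,10,11):
    #   2,inf,0,inf / 3,1,3,1 / 2,2,2,2 / 2,3,3,2 / 3,2,3,4 / 2,4,4,4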

6.02 Fall 2009 Lecture 10, Slide #12

Example (cont’d.)

•  After receiving 3 pairs of parity bits we can see that all ending states are equally likely

•  Power of convolutional code: use future information to constrain choices about most likely events in the past

[Figure: the trellis after three received pairs (11 10 11). Path metrics per column: 2, ∞, 0, ∞ then 3, 1, 3, 1 then 2, 2, 2, 2 (states 00, 01, 10, 11): all four ending states equally likely.]


6.02 Fall 2009 Lecture 10, Slide #13

Survivor Paths

•  Notice that some paths don’t continue past a certain state
   –  Will not participate in finding most-likely path: eliminate
   –  Remaining paths are called survivor paths
   –  When there’s only one path: we’ve got a message bit!

[Figure: the same three-stage trellis with the eliminated arcs removed; only the survivor paths into each state remain.]

6.02 Fall 2009 Lecture 10, Slide #14

Example (cont’d.)

•  When there are “ties” (sums of metrics are the same)
   –  Make an arbitrary choice about incoming path
   –  If state is not on most-likely path: choice doesn’t matter
   –  If state is on most-likely path: choice may matter and error correction has failed (mark state with underline to tell)

[Figure: the trellis after five received pairs (11 10 11 00 01). Path metrics: 2, 3, 3, 2 after the fourth pair (00), then 3, 2, 3, 4 after the fifth pair (01) (states 00, 01, 10, 11).]

6.02 Fall 2009 Lecture 10, Slide #15

Example (cont’d.)

•  When we reach end of received parity bits
   –  Each state’s path metric indicates how many errors have happened on most-likely path to state
   –  Most-likely final state has smallest path metric
   –  Ties mean end of message uncertain (but survivor paths may merge to a unique path earlier in message)

[Figure: the complete trellis. Final path metrics after the sixth pair (10): PM[00]=2, PM[01]=4, PM[10]=4, PM[11]=4, so state 00 is the most likely final state, with 2 errors on its path.]

6.02 Fall 2009 Lecture 10, Slide #16

Traceback

•  Use most-likely path to determine message bits (a traceback sketch in code follows the figure)
   –  Trace back through path: message in reverse order
   –  Message bit determined by high-order bit of each state (remember that came from message bit when encoding)
   –  Message in example: 101100 (w/ 2 transmission errors)

[Figure: the complete trellis with the most-likely path highlighted, ending in state 00 with path metric 2. Reading the high-order state bit along the path gives Msg: 1 0 1 1 0 0.]
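
Traceback is a walk backwards through the choices saved during the forward pass; a sketch continuing the hypothetical helpers above:

    def traceback(history):
        """Recover the message: start at the final state with the smallest
        path metric, then follow the saved predecessor choices backwards.
        The high-order bit of each state along the way is the message bit
        that drove the transition into it."""
        final_pm, _ = history[-1]
        state = min(final_pm, key=final_pm.get)    # most-likely final state
        msg = []
        for _, choice in reversed(history):
            msg.append(state >> 1)                 # high-order bit of the state
            state = choice[state]                  # step back along the saved arc
        return msg[::-1]                           # bits were collected in reverse

    history = viterbi_forward(rcvd_pairs)          # rcvd_pairs from the sketch above
    print(traceback(history))                      # -> [1, 0, 1, 1, 0, 0]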


6.02 Fall 2009 Lecture 10, Slide #17

Viterbi Algorithm Summary

•  Branch metrics measure the likelihood by comparing received parity bits to possible transmitted parity bits computed from possible messages.

•  Path metric PM[s,i] measures the likelihood of the transmitter being in state s at time i assuming the most likely message of length i that leaves the transmitter in state s.

•  Most likely message? The one that produces the most likely (smallest) final path metric PM[s,N].

•  At any given time there are 2^(k-1) most-likely messages we’re tracking → time complexity of algorithm grows exponentially with constraint length k.

6.02 Fall 2009 Lecture 10, Slide #18

Hard Decisions

•  As we receive each bit it’s immediately digitized to “0” or “1” by comparing it against a threshold voltage
   –  We lose the information about how “good” the bit is: a “1” at .9999V is treated the same as a “1” at .5001V

•  The branch metric used in the Viterbi decoder is the Hamming distance between the digitized received voltages and the expected parity bits
   –  This is called hard-decision Viterbi decoding

•  Throwing away information is (almost) never a good idea when making decisions
   –  Can we come up with a better branch metric that uses more information about the received voltages?

6.02 Fall 2009 Lecture 10, Slide #19

Soft Decisions

•  Let’s limit the received voltage range to [0.0,1.0]
   –  Veff = max(0.0, min(1.0, Vreceived))
   –  Voltages outside this range are “good” 0’s or 1’s

•  Define our “soft” branch metric as the Euclidean distance between received Veff and expected voltages (a sketch in code follows the figure)

•  Soft-decision decoder chooses path that minimizes sum of Euclidean distances between received and expected voltages
   –  Different branch metric but otherwise the same recipe

[Figure: the unit square in the (Vp0, Vp1) plane with corners (0.0,0.0), (1.0,0.0), (0.0,1.0), (1.0,1.0); the “soft” metric when the expected parity bits are 0,0 is the distance from the received voltage pair to the corner (0.0,0.0).]
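
The soft metric is as easy to compute as the hard one; a sketch (helper name mine; squared distance would work equally well for comparisons, but plain Euclidean distance matches the slide):

    def soft_branch_metric(expected_bits, rcvd_volts):
        """Soft-decision branch metric: Euclidean distance between the clamped
        received voltages and the ideal voltages for the expected parity bits."""
        v_eff = [max(0.0, min(1.0, v)) for v in rcvd_volts]   # clamp to [0.0, 1.0]
        return sum((float(b) - v) ** 2
                   for b, v in zip(expected_bits, v_eff)) ** 0.5

    # A noisy 0,0 is much closer to the 0,0 corner than to the 1,1 corner:
    print(soft_branch_metric((0, 0), (0.2, 0.1)))   # ~0.22
    print(soft_branch_metric((1, 1), (0.2, 0.1)))   # ~1.20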

6.02 Fall 2009 Lecture 10, Slide #20

Channel Coding Summary

•  Add redundant info to allow error detection/correction
   –  CRC to detect transmission errors (our safety net for catching undiscovered or uncorrectable errors)
   –  Block codes: multiple parity bits; RS codes: oversampled polynomials
   –  Convolutional codes: continuous streams of parity bits

•  Error correction
   –  Block codes: use parity errors to triangulate which bits are in error
   –  RS codes: use subsets of bits to vote for message, majority rules!
   –  Convolutional codes: use Viterbi algorithm to find most-likely message

[Figure: block diagram. Message bitstream → Channel Coding (in the Digital Transmitter) → bitstream with redundant information used for dealing with errors → Channel → redundant bitstream possibly with errors → Error Correction (in the Digital Receiver) → Recovered message bitstream.]