
Convolutional codes

Tomashevich Victor

Introduction

Convolutional codes map information to code bits sequentially by convolving a sequence of information bits with generator sequences.

A convolutional encoder encodes K information bits to N > K code bits at one time step.

Convolutional codes can be regarded as block codes for which the encoder has a certain structure such that we can express the encoding operation as a convolution.

Properties of convolutional codes

Consider a convolutional encoder. Input to the encoder is an information bit sequence u (partitioned into blocks of length K):

u = (u_0, u_1, ...),   u_i = (u_i^(1), u_i^(2), ..., u_i^(K))

The encoder outputs the code bit sequence x (partitioned into blocks of length N):

x = (x_0, x_1, ...),   x_i = (x_i^(1), x_i^(2), ..., x_i^(N))

The code rate is R = K/N.

Example: Consider a rate 1/2 convolutional code with K = 1 and N = 2 defined by the circuit:

[Encoder circuit: input u_i, one delay element, outputs x_i^(1) and x_i^(2) (figure)]

The sequences (x_0^(1), x_1^(1), ...) and (x_0^(2), x_1^(2), ...) are generated as follows:

x_i^(1) = u_i   and   x_i^(2) = u_i + u_{i-1}

Multiplexing between x_i^(1) and x_i^(2) gives the code bit sequence

x = ((x_0^(1), x_0^(2)), (x_1^(1), x_1^(2)), ...) = (x_0, x_1, ...)
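As a concrete illustration (not part of the original slides; the function name and list-of-bits interface are assumptions), this Python sketch implements the example encoder:

    # Rate-1/2 example encoder: x_i^(1) = u_i, x_i^(2) = u_i + u_{i-1} over GF(2),
    # with the delay register initialized to 0 (u_{-1} = 0).
    def encode_rate_half(u):
        x = []
        prev = 0                 # delay register holding u_{i-1}
        for ui in u:
            x1 = ui              # systematic bit x_i^(1)
            x2 = ui ^ prev       # parity bit x_i^(2) = u_i XOR u_{i-1}
            x.extend([x1, x2])   # multiplex the two output streams
            prev = ui
        return x

    print(encode_rate_half([1, 1, 0]))   # [1, 1, 1, 0, 0, 1], i.e. x = (11, 10, 01)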


The convolutional code is linear.

The encoding mapping is bijective.

Code bits generated at time step i are affected by information bits up to M time steps i-1, i-2, ..., i-M back in time. M is the maximal delay of information bits in the encoder.

Code memory is the (minimal) number of registers needed to construct an encoding circuit for the code.

Constraint length is the overall number of information bits affecting code bits generated at time step i: constraint length = code memory + K = MK + K = (M+1)K.

A convolutional code is systematic if the N code bits generated at time step i contain the K information bits.

Example: The rate 1/2 code defined by the circuit

[Encoder circuit with input u_i and outputs x_i^(1), x_i^(2) (figure)]

has delay M = 1, memory 1, constraint length 2, and it is systematic.

Example: The rate 2/3 code defined by the circuit

[Encoder circuit with inputs u_i^(1), u_i^(2) and outputs x_i^(1), x_i^(2), x_i^(3) (figure)]

has delay M = 1, memory 2, constraint length 4, and it is not systematic.

Tree

[Code tree of the rate 1/2 example code: starting from the root, each information bit u_0, u_1, u_2, ... selects one of two branches; the nodes carry the encoder states A and B, and the branches are labeled with the code bit pairs 00, 01, 10, 11 (figure)]

Trellis

The tree graph can be contracted to a directed graph called the trellis of the convolutional code, having at most S = 2^(MK) nodes at distance i = 0, 1, ... from the root.

The contents of the (at most) MK encoder registers are assigned the variables s_i^(j) in GF(2), j = 0, 1, ..., MK-1.

The vector s_i = (s_i^(0), s_i^(1), ..., s_i^(MK-1)) combining all register contents at time step i is called the state of the encoder at time step i.

The code bit block x_i is clearly a function of s_i and u_i only.

Example: The encoder of the rate 1/2 convolutional code has S = 2^1 = 2 different states. The state is given by s_i = u_{i-1}. The code bit block x_i at time step i is computed from s_i and u_i by

x_i^(1) = u_i   and   x_i^(2) = u_i + s_i

[Trellis with the two states A and B over several time steps; branches labeled with the code bit pairs 00, 11, 01, 10 for u_i = 0 and u_i = 1 (figure)]

Example: Constructing a trellis section

[Encoder circuit of the rate 1/2 code (figure)]

x_i^(1) = u_i   and   x_i^(2) = u_i + s_i

Two equations are required:

(1) How does s_{i+1} depend on u_{i-m} and possibly s_{i-m}, m >= 0:   s_{i+1} = u_i

(2) How does x_i depend on s_i and u_i:   x_i^(1) = u_i and x_i^(2) = u_i + s_i

The branches are labeled with u_i | x_i and are called state transitions, leading from a state s_i to a new state s_{i+1}.
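The two equations can be turned into a short sketch that enumerates all branches of one trellis section (illustrative Python; helper name assumed):

    # One trellis section of the rate-1/2 example code:
    # state s_i = u_{i-1}, outputs x_i^(1) = u_i, x_i^(2) = u_i + s_i, next state s_{i+1} = u_i.
    def trellis_section():
        branches = []
        for s in (0, 1):             # current state s_i
            for u in (0, 1):         # information bit u_i
                x = (u, u ^ s)       # code bit block x_i
                s_next = u           # next state s_{i+1}
                branches.append((s, s_next, u, x))
        return branches

    for s, s_next, u, (x1, x2) in trellis_section():
        print(f"{s} --{u}|{x1}{x2}--> {s_next}")   # prints the branches 0|00, 1|11, 0|01, 1|10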

Trellis section:

[One trellis section with states 0 and 1: from state 0 the branches are labeled 0|00 and 1|11, from state 1 they are labeled 0|01 and 1|10. Concatenating the sections over the time steps s_0, s_1, s_2, ..., s_{L-2}, s_{L-1} gives the complete trellis (figure)]


State diagram

Example: Trellis of the rate 1/2 convolutional code with

x_i^(1) = u_i   and   x_i^(2) = u_i + s_i

[One trellis section from state s_i to state s_{i+1}: states 0 and 1, branches labeled with the code bit pairs 00, 11 (from state 0) and 01, 10 (from state 1) for u_i = 0 and u_i = 1 (figure)]

State diagram:

[State diagram with the two states 0 and 1: self-loop at state 0 with output 00 (u_i = 0), transition 0 -> 1 with output 11 (u_i = 1), transition 1 -> 0 with output 01 (u_i = 0), self-loop at state 1 with output 10 (u_i = 1) (figure)]

Description with submatrices

Definition: A convolutional code is a set C of code bit sequences

x = (x_0, x_1, ..., x_i, ...),   x_i = (x_i^(1), x_i^(2), ..., x_i^(N)),   x_i^(j) in GF(2)

partitioned into length-N blocks.

There exist many encoders mapping information bit sequences

u = (u_0, u_1, ...),   u_i = (u_i^(1), u_i^(2), ..., u_i^(K)),   u_i^(j) in GF(2)

(partitioned into length-K blocks) onto the code bit sequences of C.

Example: The following two encoding circuits generate the same set of code word sequences:

[Two rate 2/3 encoder circuits with inputs u_i^(1), u_i^(2) and outputs x_i^(1), x_i^(2), x_i^(3) (figure)]

Generator matrix

x = u G,   where

G = | G_0  G_1  G_2  ...  G_M                    |
    |      G_0  G_1  G_2  ...  G_M               |
    |           G_0  G_1  G_2  ...  G_M          |
    |                ...                         |

so that

x_i = sum_{m=0}^{M} u_{i-m} G_m

The generated convolutional code has rate R = K/N, memory K*M, and constraint length K*(M+1).

Example: The rate 1/2 code is given by

x_i^(1) = u_i   and   x_i^(2) = u_i + s_i

G_0 governs how u_i affects x_i = (x_i^(1), x_i^(2)):   G_0 = (1 1)

G_1 governs how u_{i-1} affects x_i:   G_1 = (0 1)

(x_0^(1) x_0^(2), x_1^(1) x_1^(2), x_2^(1) x_2^(2), ...) = (u_0, u_1, u_2, ...) G,   where

G = | 11  01          |
    |     11  01      |
    |         11  01  |
    |             ... |
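A small sketch of the block-wise encoding rule x_i = sum_{m=0}^{M} u_{i-m} G_m with the submatrices of this example (illustrative Python with NumPy; function and variable names are assumptions, not from the slides):

    import numpy as np

    # Block-wise encoding x_i = sum_{m=0}^{M} u_{i-m} G_m over GF(2).
    # u_blocks: list of length-K bit vectors; G_sub: list of K x N submatrices G_0..G_M.
    def encode_with_submatrices(u_blocks, G_sub):
        M = len(G_sub) - 1
        N = G_sub[0].shape[1]
        x_blocks = []
        for i in range(len(u_blocks)):
            xi = np.zeros(N, dtype=int)
            for m in range(M + 1):
                if i - m >= 0:                       # u_{i-m} = 0 before time step 0
                    xi = (xi + u_blocks[i - m] @ G_sub[m]) % 2
            x_blocks.append(xi)
        return x_blocks

    # Rate-1/2 example: G_0 = (1 1), G_1 = (0 1), u = (1, 1, 0)
    G0, G1 = np.array([[1, 1]]), np.array([[0, 1]])
    u = [np.array([1]), np.array([1]), np.array([0])]
    print(encode_with_submatrices(u, [G0, G1]))      # blocks (1,1), (1,0), (0,1), i.e. x = (11, 10, 01)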

Description with polynomials

G(D) = | g_1^(1)(D)  g_1^(2)(D)  ...  g_1^(N)(D) |
       | g_2^(1)(D)  g_2^(2)(D)  ...  g_2^(N)(D) |
       |    ...                                  |
       | g_K^(1)(D)  g_K^(2)(D)  ...  g_K^(N)(D) |

with generator polynomials

g_i^(j)(D) = g_{i,0}^(j) + g_{i,1}^(j) D + g_{i,2}^(j) D^2 + ... + g_{i,M}^(j) D^M,   g_{i,m}^(j) in GF(2)

u(D) = (u^(1)(D), u^(2)(D), ..., u^(K)(D)),   where   u^(j)(D) = u_0^(j) + u_1^(j) D + ... = sum_i u_i^(j) D^i,   j = 1, 2, ..., K

x(D) = (x^(1)(D), x^(2)(D), ..., x^(N)(D)),   where   x^(j)(D) = x_0^(j) + x_1^(j) D + ... = sum_i x_i^(j) D^i,   j = 1, 2, ..., N

x(D) = u(D) G(D)

The polynomial coefficients are the entries of the submatrices: g_{i,m}^(j) = G_m(i, j), i = 1, ..., K, j = 1, ..., N, m = 0, ..., M.

The maximal delay is M = max deg(g_i^(j)(D)), taken over i = 1, ..., K and j = 1, ..., N.

Example: The rate 1/2 code is given by

x_i^(1) = u_i   and   x_i^(2) = u_i + s_i

so g_1^(1)(D) = 1 and g_1^(2)(D) = 1 + D, and G(D) = (g_1^(1)(D)  g_1^(2)(D)) = (1   1 + D).

From M = 1 it follows that deg(g_i^(j)(D)) <= 1.

The polynomial g_1^(1)(D) governs how u_{l-m}, m = 0, 1, affects x_l^(1):   g_1^(1)(D) = 1 (coefficients 1, 0)

The polynomial g_1^(2)(D) governs how u_{l-m}, m = 0, 1, affects x_l^(2):   g_1^(2)(D) = 1 + D (coefficients 1, 1)

u = (1, 1, 0, ...), u(D) = 1 + D:   x = u G yields x = (11, 10, 01, 00, ...)

x(D) = u(D) G(D) yields x(D) = (1 + D, 1 + D^2)
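Encoding with the polynomial description reduces to GF(2) polynomial multiplication, as in this sketch (illustrative Python; names assumed):

    # GF(2) polynomial multiplication; polynomials are coefficient lists [c_0, c_1, ...],
    # so 1 + D is written [1, 1].
    def poly_mul_gf2(a, b):
        c = [0] * (len(a) + len(b) - 1)
        for i, ai in enumerate(a):
            if ai:
                for j, bj in enumerate(b):
                    c[i + j] ^= bj
        return c

    # Rate-1/2 example: g_1^(1)(D) = 1, g_1^(2)(D) = 1 + D, u(D) = 1 + D
    u = [1, 1]
    print(poly_mul_gf2(u, [1]))      # [1, 1]    -> x^(1)(D) = 1 + D
    print(poly_mul_gf2(u, [1, 1]))   # [1, 0, 1] -> x^(2)(D) = 1 + D^2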

Punctured convolutional codes

A sequence of code bits is punctured by deleting some of the bits in the sequence according to a fixed rule.

In general, the puncturing of a rate K/N convolutional code is defined using N puncturing tables, one for each code bit x_i^(j), j = 1, ..., N, in a block x_i.

Each table contains p bits, where p is the puncturing period. If a bit is 1, the corresponding code bit is part of the punctured code; if the bit is 0, the corresponding code bit is not part of the punctured code.

For a sequence of code bit blocks x_i, i = 0, 1, ..., the puncturing tables are applied periodically. The N puncturing tables are combined into an N x p puncturing matrix P.

Example: The rate 1/2 convolutional code given by

G(D) = (1 + D^2   1 + D + D^2)

[Encoder circuit of this code (figure)]

The information sequence u = (0, 0, 1, 0, 0) is encoded to the mother code sequence x_NP = (00, 00, 11, 01, 11). The sequence x_NP is punctured using two different puncturing matrices:

P_1 = | 1 1 1 0 |        P_2 = | 1 1 1 0 |
      | 1 0 0 1 |               | 1 1 0 1 |

The puncturing period p is 4. Using P_1, 3 out of 4 code bits x_i^(1) and 2 out of 4 code bits x_i^(2) of the mother code are used; the others are discarded:

R = 1/2 * (4 + 4)/(3 + 2) = 4/5

and u is encoded to x = (00, 0X, 1X, X1, 11), where X marks punctured bits.

Using P_2, the rate of the punctured code is R = 1/2 * (4 + 4)/(3 + 3) = 2/3 and u is encoded to x = (00, 00, 1X, X1, 11).

Puncturing tables (puncturing period p = 4, columns 1, 2, 3, 4; top row for x_i^(1), bottom row for x_i^(2)):

P_1 = | 1 1 1 0 |        P_2 = | 1 1 1 0 |
      | 1 0 0 1 |               | 1 1 0 1 |

For u = (u_0, ..., u_4) = (0, 0, 1, 0, 0), puncturing with P_1 leaves the code bit sequence

x_0^(1) x_0^(2) | x_1^(1) | x_2^(1) | x_3^(2) | x_4^(1) x_4^(2) = 00 | 0 | 1 | 1 | 11

[Encoder of the rate 1/2 code punctured to a rate 4/5 code (top puncturing table) or a rate 2/3 code (bottom puncturing table) (figure)]
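A minimal sketch of the puncturing rule applied to this example (illustrative Python; the function name and tuple-based interface are assumptions):

    # Apply an N x p puncturing matrix P to a sequence of length-N code bit blocks;
    # a 0 in P deletes the corresponding bit, and the tables repeat with period p.
    def puncture(x_blocks, P):
        N, p = len(P), len(P[0])
        out = []
        for i, block in enumerate(x_blocks):
            for j in range(N):
                if P[j][i % p]:
                    out.append(block[j])
        return out

    # Mother code output for u = (0, 0, 1, 0, 0): x_NP = (00, 00, 11, 01, 11)
    x_np = [(0, 0), (0, 0), (1, 1), (0, 1), (1, 1)]
    P1 = [(1, 1, 1, 0), (1, 0, 0, 1)]     # rate 1/2 -> 4/5
    P2 = [(1, 1, 1, 0), (1, 1, 0, 1)]     # rate 1/2 -> 2/3
    print(puncture(x_np, P1))             # [0, 0, 0, 1, 1, 1, 1]
    print(puncture(x_np, P2))             # [0, 0, 0, 0, 1, 1, 1, 1]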

The rate R of a punctured code obtained from a rate R_0 = K/N mother code using the N x p puncturing matrix P is given as

R = R_0 * (N*p)/(# of 1s in P) = (K*p)/(# of 1s in P)
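For the two puncturing matrices of the example above, the formula can be checked directly (illustrative Python; helper name assumed):

    from fractions import Fraction

    # Punctured rate R = K*p / (number of 1s in P) for a rate K/N mother code.
    def punctured_rate(K, P):
        p = len(P[0])
        ones = sum(sum(row) for row in P)
        return Fraction(K * p, ones)

    print(punctured_rate(1, [(1, 1, 1, 0), (1, 0, 0, 1)]))   # 4/5
    print(punctured_rate(1, [(1, 1, 1, 0), (1, 1, 0, 1)]))   # 2/3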

With puncturing we can easily construct convolutional codes with arbitrary rational rate. However, punctured codes of rate R = K/N obtained from an optimized good mother code of memory m usually perform worse than unpunctured rate K/N, memory m optimized codes given by a K x N generator matrix G(D).

This performance gap increases with the number of punctured bits. The advantage of puncturing is that the decoding complexity is not altered, since the original trellis of the mother code can be used.

Consider a rate 1/3, memory 4 mother code given by the submatrices

G_0 = (111), G_1 = (010), G_2 = (011), G_3 = (101), and G_4 = (111)

Punctured codes derived from this mother code (puncturing period p = 8; each puncturing table lists the rows for x^(1), x^(2), x^(3)):

code#  rate         puncturing table (x^(1) | x^(2) | x^(3))    d_f   c_{d_f}
9      1/3  (8/24)  1111 1111 | 1111 1111 | 1111 1111            11      8
8      4/11 (8/22)  1111 1111 | 1111 1111 | 1110 1110             9     10
7      2/5  (8/20)  1111 1111 | 1111 1111 | 1100 1100             8      2
6      4/9  (8/18)  1111 1111 | 1111 1111 | 1000 1000             7      2
5      1/2  (8/16)  1111 1111 | 1111 1111 | 0000 0000             7     32
4      4/7  (8/14)  1111 1111 | 1110 1110 | 0000 0000             5      8
3      2/3  (8/12)  1111 1111 | 1010 1010 | 0000 0000             4      4
2      4/5  (8/10)  1111 1111 | 1000 1000 | 0000 0000             3     42
1      8/9          1111 0111 | 1000 1000 | 0000 0000             2      2

Decoding of convolutional codes

The Viterbi algorithm

[Rate 1/2 encoder with information bit u_j, code bits x_1j, x_2j, and two memory elements s_1j, s_2j (figure)]

Example (bits in +1/-1 notation): the information sequence

u = (+1, +1, +1, +1, +1, +1, +1)

is encoded, starting from the initial state (s_10, s_20) = (+1, +1), into the code sequence

x = (+1+1, +1+1, +1+1, +1+1, +1+1, +1+1, +1+1)

[One trellis section of the encoder: states (s_1j, s_2j) in {+1+1, -1+1, +1-1, -1-1}, transitions to (s_1,j+1, s_2,j+1), branches labeled u_j / x_1j x_2j, e.g. +1/+1+1, -1/-1-1, +1/+1-1, -1/-1+1 (figure)]

Hard decisions

The branch metric for a branch with code bits (x_1j^(m), x_2j^(m)) and received bits (y_1j, y_2j) is the correlation

lambda_j^(m) = x_1j^(m) * y_1j + x_2j^(m) * y_2j

Received sequence (one channel error):

y = (+1+1, -1+1, +1+1, +1+1, +1+1, +1+1, +1+1)

[Trellis for j = 0, ..., 7 with the branch metrics (-2, 0, +2) and the accumulated path metrics of the surviving paths (figure)]

[The same trellis with the maximum-metric path traced back (figure)]

Decision: û_j = +1 +1 +1 +1 +1 +1 +1, i.e. no error: the single channel error is corrected.
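A compact sketch of hard-decision Viterbi decoding with this correlation metric (illustrative Python; for brevity it uses the 2-state rate 1/2 example code from the earlier sections rather than the memory-2 encoder of these slides, so only a single-error pattern is shown; the add-compare-select step is the same):

    # 2-state rate-1/2 example code in +1/-1 notation: state index 0 <-> s = +1, 1 <-> s = -1;
    # branch outputs x_1j = u_j, x_2j = u_j * s_j; the next state holds u_j.
    def build_branches():
        branches = []
        for s_idx, s_val in ((0, +1), (1, -1)):
            for u in (+1, -1):
                x = (u, u * s_val)
                s_next = 0 if u == +1 else 1
                branches.append((s_idx, s_next, u, x))
        return branches

    # Viterbi decoding with the branch metric lambda_j = x_1j*y_1j + x_2j*y_2j.
    def viterbi(y_blocks, branches, n_states=2, start=0):
        metric = [float("-inf")] * n_states
        metric[start] = 0.0
        paths = [[] for _ in range(n_states)]
        for y1, y2 in y_blocks:
            new_metric = [float("-inf")] * n_states
            new_paths = [[] for _ in range(n_states)]
            for s, s_next, u, (x1, x2) in branches:
                cand = metric[s] + x1 * y1 + x2 * y2    # add the branch metric
                if cand > new_metric[s_next]:           # compare and select the survivor
                    new_metric[s_next] = cand
                    new_paths[s_next] = paths[s] + [u]
            metric, paths = new_metric, new_paths
        best = max(range(n_states), key=metric.__getitem__)
        return paths[best]

    # All-(+1) transmission with one channel error, as in the first hard-decision example:
    y = [(+1, +1), (-1, +1), (+1, +1), (+1, +1), (+1, +1), (+1, +1), (+1, +1)]
    print(viterbi(y, build_branches()))   # [1, 1, 1, 1, 1, 1, 1]: the error is corrected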

Second example, received sequence with two channel errors:

y = (+1+1, -1+1, +1-1, +1+1, +1+1, +1+1, +1+1)

[Trellis for j = 0, ..., 7 with branch metrics and accumulated path metrics for this received sequence (figure)]

[The same trellis with the maximum-metric path traced back (figure)]

Decision: û_j = +1 +1 +1 +1 +1 +1 +1, i.e. no error: both channel errors are corrected.

Third example, received sequence with three channel errors:

y = (+1+1, -1-1, -1+1, +1+1, +1+1, +1+1, +1+1)

[Trellis for j = 0, ..., 7 with branch metrics and accumulated path metrics for this received sequence (figure)]

[The same trellis with the maximum-metric path traced back (figure)]

Decision: û_j = +1 -1 -1 +1 +1 +1 +1, i.e. 2 decoding errors: with hard decisions, the three channel errors cannot be corrected.

Soft decisions

The branch metrics are now weighted with channel state information (CSI): each received bit y_ij is scaled by a reliability weight l_ij, with

l_ij = 2 for a GOOD channel state and l_ij = 1/2 for a BAD channel state,

so the branch metric becomes

lambda_j^(m) = l_1j * x_1j^(m) * y_1j + l_2j * x_2j^(m) * y_2j

CSI values: ((G,B), (B,B), (G,G), (G,B), (B,B), (G,G), (G,G))

Received sequence (as in the previous example, three channel errors):

y = (+1+1, -1-1, -1+1, +1+1, +1+1, +1+1, +1+1)
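A tiny sketch of the CSI-weighted branch metric (illustrative Python; the weights 2 and 0.5 are the GOOD/BAD values from the slides, the function name is assumed):

    # Reliability weights: 2 for a GOOD channel state, 1/2 for a BAD one.
    WEIGHT = {"G": 2.0, "B": 0.5}

    def soft_branch_metric(x, y, csi):
        """x, y: +-1 pairs (x_1j, x_2j), (y_1j, y_2j); csi: pair of 'G'/'B' labels."""
        return sum(WEIGHT[c] * xi * yi for xi, yi, c in zip(x, y, csi))

    # First block of the example: y_0 = (+1, +1), CSI = (G, B), candidate x = (+1, +1)
    print(soft_branch_metric((+1, +1), (+1, +1), ("G", "B")))   # 2.5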

[Trellis for j = 0, ..., 7 with CSI-weighted branch metrics (per-bit weights +2 for G, +0.5 for B) and the accumulated path metrics for the received sequence above (figure)]

[The same trellis with the maximum-metric path traced back (figure)]

Decision: û_j = +1 +1 +1 +1 +1 +1 +1, i.e. no error: with the channel state information, all three channel errors are corrected.