Iterative Soft-Decision Decoding of Algebraic-Geometric Codes

Li Chen, Associate Professor
School of Information Science and Technology, Sun Yat-sen University, Guangzhou, China
[email protected] | website: sist.sysu.edu.cn/~chenli

Institute of Network Coding and Department of Information Engineering, the Chinese University of Hong Kong, 1st of Aug, 2012



Outline Introduction (How to construct an algebraic-geometric code?)

Review on Koetter-Vardy list decoding (Challenges in the decoding)

Iterative soft-decision decoding (An iterative solution)

Geometric interpretation of the iterative decoding (An insight into the solution)

Complexity reduction decoding approaches (Some implementation advice)

Performance analysis (Advantage and cost)

Conclusions (An end & a beginning)

I. Introduction: The construction of an algebraic-geometric (AG) code

Based on an algebraic curve χ(x, y, z): identify its point of infinity p∞ and define the pole basis Φ.

Take one of the affine components, e.g., χ(x, y, 1), and find the affine points pj.

The Reed-Solomon (RS) code is the simplest AG code: it is constructed from the curve y = 0; its pole basis is Φ = {1, x, x², x³, x⁴, …}; its affine points are {x1, x2, x3, …, xn} = Fq \ {0}. Note: the length of the code cannot exceed the size of the finite field.

The generator matrix G; the parity-check matrix H.

I. Introduction: The Hermitian curve Hw(x, y, z) = x^(w+1) + y^w·z + y·z^w, defined over GF(w²)

The point of infinity; the pole basis; bivariate monomials and their pole orders.

Based on one of its affine components Hw(x, y, 1), determine the affine points pj = (xj, yj, 1) where xj^(w+1) + yj^w + yj = 0 and j = 1, 2, …, n.

Encoding of an (n, k) Hermitian code: given the message vector, the codeword is generated by evaluating the message polynomial at the affine points. Note n > q: the length of the code can exceed the size of the finite field!

I. Introduction: Example, construction of an (8, 4) Hermitian code

Defined over GF(4) = {0, 1, α, α²}; the Hermitian curve is H2(x, y, z) = x³ + y²z + yz², with point of infinity p∞ = (0, 1, 0).

One of its affine components: H2(x, y, 1) = x³ + y² + y; its pole basis Φ; its affine points:

p1 = (0, 0), p2 = (0, 1), p3 = (1, α), p4 = (1, α²), p5 = (α, α), p6 = (α, α²), p7 = (α², α), p8 = (α², α²)
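The point enumeration above can be reproduced in a few lines. This is a sketch, not the speaker's code: GF(4) is represented with α → 2 and α² → 3 in the polynomial basis over GF(2), and the pole basis {1, x, y, x²} used for encoding is an assumption (the standard w = 2 Hermitian ordering, with x and y assumed to have pole orders 2 and 3).

```python
# GF(4) as {0, 1, 2, 3} with 2 = alpha and 3 = alpha^2, where alpha^2 = alpha + 1.
# Addition in a binary extension field is bitwise XOR.

def gf4_add(a, b):
    return a ^ b

# Full multiplication table of GF(4).
GF4_MUL = [
    [0, 0, 0, 0],
    [0, 1, 2, 3],
    [0, 2, 3, 1],
    [0, 3, 1, 2],
]

def gf4_mul(a, b):
    return GF4_MUL[a][b]

def gf4_pow(a, e):
    r = 1
    for _ in range(e):
        r = gf4_mul(r, a)
    return r

# Affine points of H2(x, y, 1) = x^3 + y^2 + y over GF(4): 8 points as on the slide.
points = [(x, y) for x in range(4) for y in range(4)
          if gf4_add(gf4_pow(x, 3), gf4_add(gf4_pow(y, 2), y)) == 0]

# Encoding sketch: c_j = f(p_j) with f = m0*1 + m1*x + m2*y + m3*x^2,
# assuming the pole basis {1, x, y, x^2} (not stated explicitly in the slides).
def encode(msg):
    m0, m1, m2, m3 = msg
    return [gf4_add(gf4_add(m0, gf4_mul(m1, x)),
                    gf4_add(gf4_mul(m2, y), gf4_mul(m3, gf4_mul(x, x))))
            for (x, y) in points]
```

With this representation the enumeration returns exactly the eight points listed above, and the encoder is GF(4)-linear in the message symbols.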

I. Introduction

Advantage: over the same finite field, the Hermitian codes are longer than the RS codes.

Code length vs. field size:

  Code         F4    F16   F64   F256
  RS code       3     15    63    255
  Hermitian     8     64   512   4096

Disadvantage: it is not a Maximum Distance Separable (MDS) code, and the error-correction capability of a very high rate code almost vanishes.

II. Review on KV list decoding: Decoding philosophy evolution

Unique decoding (the Sakata algorithm with majority voting) → List decoding (the Guruswami-Sudan (GS) algorithm, the Koetter-Vardy (KV) algorithm)

II. Review on KV list decoding: Key processes are reliability transform (Π → M), interpolation (construct Q(x, y, z)), and factorisation (find the z-roots of Q).

Reliability transform and knowledge of M (example: an (8, 4) Hermitian code)

[Figure: encoding, channel, and reliability transform for the (8, 4) Hermitian code; columns are indexed by the points p1 to p8 (codeword symbols C1 to C8, received symbols R1 to R8), and the rows of Π by the symbols 0, 1, α, α².]

E.g., interpolation will be performed w.r.t. (p5, 1) with a multiplicity of 2.

The number of interpolation constraints is

Reliability-based codeword score and multiplicity-based codeword score: given a codeword,

Theorem 1: If (…), the codeword can be found by determining the z-roots of Q.
Theorem 2: If (…), the codeword can be found by determining the z-roots of Q.

The optimal decoding performance of the KV algorithm is dictated by Π.
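The two scores can be illustrated with a small sketch. The definition used here, summing over the n positions the matrix entry of the symbol actually in the codeword, is the standard KV form and is assumed, since the slide's formulas did not survive extraction; the function name and matrix layout are illustrative.

```python
# Codeword score sketch (assumed standard KV definition): for a reliability
# matrix Pi or a multiplicity matrix M, indexed [symbol_row][position], the
# score of codeword c is the sum over positions j of the entry for symbol c_j.
def codeword_score(matrix, codeword, alphabet):
    return sum(matrix[alphabet.index(c)][j] for j, c in enumerate(codeword))

# Toy 4-symbol, length-3 example (values are made up for illustration).
Pi = [[0.1, 0.7, 0.2],
      [0.2, 0.1, 0.6],
      [0.6, 0.1, 0.1],
      [0.1, 0.1, 0.1]]
s = codeword_score(Pi, [2, 0, 1], [0, 1, 2, 3])  # Pi[2][0] + Pi[0][1] + Pi[1][2]
```

The same function applies to M: replacing the probabilities with integer multiplicities gives the multiplicity-based score.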

II. Review on KV list decoding

[Figure: the example reliability matrix Π and multiplicity matrix M for the (8, 4) Hermitian code, rows indexed by the symbols 0, 1, α, α²; the reliability-based value evaluates to 2.14 and the multiplicity-based value to 5.]

II. Review on KV list decoding KV decoding performance of the (64, 39) Hermitian code

Challenge: Can we further improve the KV decoding performance for Hermitian codes?


III. Iterative Soft-Decision Decoding: Decoding stages

ABP: adaptive belief propagation, to improve the reliability of Π;
KV: Koetter-Vardy list decoding, to find the message vector.

Decoding block diagram: Π → ABP → Π' → KV

III. Iterative Soft-Decision Decoding: Binary image of the parity-check matrix H

Let σ(x) = σ0 + σ1x + ∙∙∙ + σβx^β be the primitive polynomial of GF(2^β).

The companion matrix of σ(x) is the β × β binary matrix that represents multiplication by a root of σ(x).

Example: in GF(4), σ(x) = 1 + x + x², and the companion matrix is C = [0 1; 1 1].
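A minimal sketch of the binary-image construction, assuming the usual companion-matrix convention (multiplication by a root α of σ(x) in the basis {1, α}); the exact row/column orientation may differ from the slides, and the small test matrix is made up for illustration.

```python
import numpy as np

# Companion matrix of sigma(x) = 1 + x + x^2 over GF(2): multiplication by
# alpha in the basis {1, alpha}, where alpha^2 = alpha + 1.
C = np.array([[0, 1],
              [1, 1]], dtype=int)
I = np.eye(2, dtype=int)
Z = np.zeros((2, 2), dtype=int)

# Map GF(4) = {0, 1, alpha, alpha^2} (encoded 0..3, with 2 = alpha) to
# 2x2 binary matrices; alpha^2 is represented by C^2 mod 2.
rep = {0: Z, 1: I, 2: C, 3: (C @ C) % 2}

def binary_image(H):
    """Expand a matrix over GF(4) into its binary image Hb by replacing each
    symbol with its 2x2 companion-matrix representation."""
    return np.block([[rep[h] for h in row] for row in H]) % 2

# Illustrative 2x2 matrix over GF(4); its binary image is 4x4.
Hb = binary_image([[1, 2], [3, 0]])
```

The sanity check C² = C + I (mod 2) mirrors the defining relation α² = α + 1.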

III. Iterative Soft-Decision Decoding: Is Hb suitable for BP decoding?

Density of the matrix: 53.125%; the number of short cycles: 279. We will have to reduce the density and eliminate some of the short cycles!

III. Iterative Soft-Decision Decoding: Bit reliability oriented Gaussian elimination on Hb

Assume each coded bit cj is BPSK modulated into symbol sj (j = 1, 2, …, N), and let y denote the received vector. The bit log-likelihood ratio (LLR) value is L(cj), and these values form the LLR vector.

The reliability of bit cj is determined by |L(cj)|:

E.g., Pr[c1 = 0 | y1] = 0.49, Pr[c1 = 1 | y1] = 0.51, so |L(c1)| = 0.04;
Pr[c2 = 0 | y2] = 0.93, Pr[c2 = 1 | y2] = 0.07, so |L(c2)| = 2.59.

Bit c2 is more reliable!

III. Iterative Soft-Decision Decoding: Bit reliability sorting

Sort the bits in terms of their reliabilities, yielding refreshed bit indices.

Let B be the set of bit indices, with complement Bc. E.g., based on the above sorting outcome, let B collect all the N - K least reliable bit indices.

We can then sort the LLR vector accordingly.
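The sorting step can be sketched as follows, assuming BPSK over AWGN so that L(cj) = 2yj/σ² (the standard LLR for this channel); the function and variable names are illustrative, not the speaker's.

```python
import numpy as np

# Bit-reliability sorting sketch: compute LLRs from the received vector,
# then sort bit indices by |L(c_j)| in ascending order, so the first
# num_unreliable indices (the N - K least reliable bits) form the set B.
def sort_bit_reliabilities(y, sigma, num_unreliable):
    llr = 2.0 * np.asarray(y, dtype=float) / sigma**2
    order = np.argsort(np.abs(llr))      # least reliable first
    B = order[:num_unreliable]
    Bc = order[num_unreliable:]
    return llr, B, Bc

# Toy received vector: bits near 0 are the least reliable.
llr, B, Bc = sort_bit_reliabilities([0.02, -1.3, 0.9, -0.1, 1.8, -0.4],
                                    sigma=1.0, num_unreliable=3)
```

Here B picks out the positions with the smallest received magnitudes, exactly the bits whose columns will be targeted by the Gaussian elimination.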

III. Iterative Soft-Decision Decoding: Perform Gaussian elimination w.r.t. the columns indicated by B, i.e.,

reduce column j1 to [1 0 0 ∙∙∙ 0]T;
reduce column j2 to [0 1 0 ∙∙∙ 0]T;
…
reduce column jN-K to [0 0 0 ∙∙∙ 1]T.

Gaussian elimination: Hb → Hb' (reduced density; fewer short cycles). The columns indexed by B form an identity submatrix over the N - K least reliable bits, and matrix Hb' is more suitable for BP decoding.
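A sketch of the column-targeted elimination over GF(2). The skip-on-dependence behaviour (moving on to the next index when a column has no usable pivot) is an assumption about how linearly dependent columns are handled, not something stated on the slides.

```python
import numpy as np

# Reliability-oriented Gaussian elimination over GF(2): try to reduce the
# columns indexed by B (least reliable bits first) to unit columns, one pivot
# row per column. A column with no pivot below the current row is skipped.
def gf2_eliminate(Hb, B):
    H = np.array(Hb, dtype=int) % 2
    m = H.shape[0]
    pivot_row = 0
    for j in B:
        if pivot_row >= m:
            break
        rows = np.nonzero(H[pivot_row:, j])[0]
        if rows.size == 0:
            continue                              # dependent column: skip it
        r = pivot_row + rows[0]
        H[[pivot_row, r]] = H[[r, pivot_row]]     # swap the pivot into place
        for i in range(m):                        # clear the rest of column j
            if i != pivot_row and H[i, j]:
                H[i] ^= H[pivot_row]              # GF(2) row addition is XOR
        pivot_row += 1
    return H

Hb2 = gf2_eliminate([[1, 1, 0, 1],
                     [0, 1, 1, 1],
                     [1, 0, 1, 0]], B=[0, 1, 2])
```

After the call, the successfully pivoted columns of B are unit vectors, which is the identity submatrix over the targeted bits described above.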

III. Iterative Soft-Decision Decoding: The conventional BP decoding based on Hb'

Let hij denote the entries of matrix Hb', and define the corresponding sets of check and variable node neighbours.

Initialization: form matrices V and U with entries vij and uij.

For each BP iteration:
Horizontal step (V → U);
Vertical step (U → V).

After a number of BP iterations, update the bit LLR values as L'(cj) = L(cj) + η · Lext(cj), where Lext is the extrinsic LLR and η is the damping factor.
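The two message-passing steps can be sketched with the textbook sum-product rules (the tanh rule at the check nodes) and the damped update L' = L + η·Lext. The slide's exact message formulas are not in the transcript, so this standard form is an assumption.

```python
import numpy as np

# Sum-product BP sketch on a binary parity-check matrix H. V holds
# variable-to-check messages v_ij, U holds check-to-variable messages u_ij,
# and eta damps the extrinsic LLR in the final update.
def bp_update(H, llr, iters=3, eta=0.5):
    H = np.asarray(H)
    llr = np.asarray(llr, dtype=float)
    m, n = H.shape
    V = H * llr                       # initialization: v_ij = L(c_j) on edges
    U = np.zeros((m, n))
    for _ in range(iters):
        # Horizontal step (V -> U): tanh rule at each check node.
        T = np.tanh(V / 2.0)
        for i in range(m):
            idx = np.nonzero(H[i])[0]
            for j in idx:
                p = 1.0
                for k in idx:
                    if k != j:
                        p *= T[i, k]
                p = min(max(p, -0.999999999), 0.999999999)
                U[i, j] = 2.0 * np.arctanh(p)
        # Vertical step (U -> V): channel LLR plus the other checks' messages.
        for j in range(n):
            idx = np.nonzero(H[:, j])[0]
            total = U[idx, j].sum()
            for i in idx:
                V[i, j] = llr[j] + total - U[i, j]
    # Damped output update: L'(c_j) = L(c_j) + eta * extrinsic LLR.
    ext = (U * H).sum(axis=0)
    return llr + eta * ext

# Tiny single-check (repetition) example: the strong bit pulls the weak one up.
out = bp_update([[1, 1]], [2.0, -0.5], iters=1, eta=0.5)
```

On this toy example the weak bit's LLR flips from -0.5 to +0.5, illustrating how the extrinsic information corrects an unreliable bit.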

III. Iterative Soft-Decision Decoding: The updated LLR vector can be formed from the L'(cj) values.

The updated bit LLR values are converted back into a posteriori probability (APP) values, which can then be used to generate the improved reliability matrix Π'.

Π' → reliability transform → M → interpolation → factorization
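The LLR-to-APP conversion can be sketched as follows, assuming the sign convention L(c) = ln(Pr[c = 0] / Pr[c = 1]) used in this talk (L ≥ 0 → c = 0), and assuming the two bits of a GF(4) symbol are conditionally independent so that their probabilities multiply.

```python
import math

# Convert one bit LLR back to APP values: Pr[c = 0] = 1 / (1 + e^(-L)).
def bit_app(llr):
    p0 = 1.0 / (1.0 + math.exp(-llr))
    return p0, 1.0 - p0

# One column of Pi': probabilities of the four GF(4) symbols, taking a symbol
# as a pair of bits (b_a, b_b) and multiplying the per-bit APPs.
def symbol_column(llr_bits):
    (p0a, p1a) = bit_app(llr_bits[0])
    (p0b, p1b) = bit_app(llr_bits[1])
    return [p0a * p0b, p0a * p1b, p1a * p0b, p1a * p1b]

col = symbol_column([1.2, -0.4])   # one symbol's two updated bit LLRs
```

Each column produced this way sums to one, as a reliability matrix column must; stacking the n columns gives Π' for the next KV attempt.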

III. Iterative Soft-Decision Decoding: A worked example, iterative decoding of the (8, 4) Hermitian code

Codeword (symbol wise); codeword (bit wise).

The received LLR vector is shown, highlighting the LLR values that give a wrong estimate of the bits.

The original reliability matrix Π gives scores 3.969 and 3.993; based on Theorem 1, KV decoding will fail!

(Sign convention: L(cj) ≥ 0 → cj = 0; L(cj) < 0 → cj = 1.)

III. Iterative Soft-Decision Decoding: Sort the bits in ascending order in terms of |L(cj)|, yielding

j1, j2, …, j16 = 7, 10, 0, 12, 8, 3, 11, 13, 2, 14, 15, 4, 1, 6, 9, 5

where the first N - K indices form B and the rest form Bc. Perform Gaussian elimination on the columns implied by B:

Before: density 53.125%, short cycles 279 → After: density 37.5%, short cycles 112.

III. Iterative Soft-Decision Decoding: Based on Hb', performing 3 BP iterations gives the updated LLR vector, from which the updated reliability matrix Π' is generated.

• For a 'wrong' LLR value, we would like to change its sign, or reduce its magnitude;
• For a 'right' LLR value, we would like to leave the sign unchanged and increase its magnitude.

The scores become 4.478 and 4.037; based on Theorem 1, KV decoding will succeed!

III. Iterative Soft-Decision Decoding: Why should Gaussian elimination be bit reliability oriented?

[Figure: Tanner graph of Hb', distinguishing reliable bits from unreliable bits and showing the updated values L'(c5) and L'(c7).]

III. Iterative Soft-Decision Decoding: How to improve the iterative decoding performance? It is possible that reliable bits are wrongly estimated by their LLR values. We can create different sets of bit indices B and let more bits' corresponding columns also fall into the identity submatrix of Hb'.

Example with the sorted bit indices being {7, 10, 0, 12, 8, 3, 11, 13, 2, 14, 15, 4, 1, 6, 9, 5}:

B(1): {7, 10, 0, 12, 8, 3, 11, 13, 2, 14, 15, 4, 1, 6, 9, 5} → Gaussian elimination
B(2): {3, 11, 13, 2, 14, 7, 10, 0, 12, 8, 15, 4, 1, 6, 9, 5} → Gaussian elimination
B(3): {15, 4, 1, 6, 9, 7, 10, 0, 12, 8, 3, 11, 13, 2, 14, 5} → Gaussian elimination

III. Iterative Soft-Decision Decoding: A revisit of the decoding block diagram

• Note that if there are multiple matrix adaptations, the next bit reliability sorting is performed based on the updated LLR vector;
• Multiple attempts of KV decoding result in an output list that contains all the message candidates; the maximum likelihood (ML) criterion is used to select one from the list.
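The ML selection step can be sketched as a nearest-codeword search, assuming the BPSK mapping 0 → +1, 1 → -1 (consistent with the sign convention L(cj) ≥ 0 → cj = 0) and minimum Euclidean distance to the received vector as the ML rule on the AWGN channel.

```python
import numpy as np

# ML selection sketch: among the candidate codewords from the KV output lists,
# pick the one whose BPSK image is closest to the received vector y in
# Euclidean distance (the ML choice for equiprobable codewords on AWGN).
def ml_select(candidates, y):
    y = np.asarray(y, dtype=float)

    def dist(bits):
        s = 1.0 - 2.0 * np.asarray(bits, dtype=float)   # 0 -> +1, 1 -> -1
        return float(np.sum((y - s) ** 2))

    return min(candidates, key=dist)

cands = [[0, 0, 1, 0], [1, 0, 1, 1]]
best = ml_select(cands, [0.9, 0.8, -1.1, 0.2])
```

In the toy call, the first candidate matches the signs of the received samples and is selected.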

IV. Geometric Interpretation: Insight into why we need matrix adaptations before the BP decoding

Normalize the LLR vector by mapping each L(cj) to a value Tj through the mapping function.

A graphical look into the two vectors: [Figures: the Tj values when the codeword is not found vs. when the codeword is found.]

• When a codeword is found, Tj = 1 for j = 1, 2, …, N.

IV. Geometric Interpretation: Objective of the BP decoding is to find the vector that minimizes the potential function.

The LLR update in the BP decoding can be seen as a T value update. Finding the estimated codeword using the BP algorithm can be seen as identifying the vertex at which the potential function is minimized.

IV. Geometric Interpretation: [Figure: the convergence behavior of the potential function of the (64, 39) Hermitian code, converging towards -100.]

V. Complexity Reduction: Decoding parameters

-- number of groups of unreliable bit indices;
-- number of matrix adaptations (Gaussian eliminations);
-- number of BP iterations.

There are three types of computations required by the decoding: binary operations (Gaussian eliminations), floating point operations (BP iterations), and finite field arithmetic operations (KV decodings). With the chosen iterative decoding parameters, the three operation counts scale multiplicatively with these parameters.

V. Complexity Reduction: Reduce the deployment of the KV decoding steps

ABP-KV decoding block diagram: Π → ABP → Π' → reliability transform → M → interpolation → factorization

We can try to assess the quality of the matrices Π' and M. If they are not good enough to result in a possibly successful decoding, the following KV decoding process will NOT be carried out.

V. Complexity Reduction: Reliability-based received word score and multiplicity-based received word score

Example for the (8, 4) Hermitian code: [matrices over the symbols 0, 1, α, α² shown; the reliability-based score evaluates to 6.7 and the multiplicity-based score to 15.]

V. Complexity Reduction: Recall the two theorems for successful KV decoding.

Theorem 1: If (…), KV can succeed.
Theorem 2: If (…), KV can succeed.

Lemma 3: If (…), KV cannot succeed.
Lemma 4: If (…), KV cannot succeed.

V. Complexity Reduction: Complexity reduction for ABP-KV decoding of the (64, 52) Hermitian code

Decoding parameters = (10, 5, 2); there are 50 KV decoding processes for each codeword frame.

V. Complexity Reduction: Other facilitated decoding approaches

Parallel decoding;
Output validation: once a valid output is found, the iterative decoding will be terminated.

VI. Performance Analysis: Decoding parameters are the KV decoding output list size (l) and the iterative decoding parameters. The (8, 4) Hermitian code over the AWGN channel:

VI. Performance Analysis The (64, 39) Hermitian code over the AWGN channel

VI. Performance Analysis The (64, 47) Hermitian code over the AWGN channel

VI. Performance Analysis The (64, 47) Hermitian code over the fast Rayleigh fading channel Coherent detection with the knowledge of CSI

VI. Performance Analysis Herm. (64, 47) vs. RS (15, 11), over the AWGN channel

VII. Conclusions

Revisit the construction of AG codes: pole basis + affine points;

Review the KV soft-decision list decoding algorithm: Π dependent;

Introduce an iterative soft-decision decoding algorithm for Hermitian codes: adaptive belief propagation + KV list decoding;

The ABP algorithm is bit reliability oriented → BP is also good for AG (and RS) codes;

Geometric interpretation → the necessity of performing parity-check matrix adaptation;

Complexity reduction: successive criteria to assess Π' and M; parallel decoding; output validation;

Performance analysis shows a significant performance gain can be achieved (compared with conventional algorithms and with RS codes).

Acknowledgement

National Natural Science Foundation of China
Project: Advanced coding technology for future storage devices; ID: 61001094; from 2011.1 to 2013.12.

Thank you!