Soft-In Soft-Out Decoding of Reed-Solomon Codes Based on Vardy and Be'ery's Decomposition

Thomas R. Halford, Vishakan Ponnampalam, Alex J. Grant and Keith M. Chugg

The work of T. R. Halford was supported in part by the Powell Foundation. The material in this correspondence was presented in part at ISIT 2003, Yokohama, Japan, June-July 2003 [1] and in [2]. T. R. Halford and K. M. Chugg are with the Communication Sciences Institute, University of Southern California, Los Angeles, CA 90089-2565 USA (email: [email protected], [email protected]). V. Ponnampalam is with IPWireless Ltd., Chippenham, Wiltshire SN15 1BN, UK (email: [email protected]). A. J. Grant is with the Institute for Telecommunications Research, University of South Australia, Mawson Lakes, SA 5095, Australia (email: [email protected]).

Abstract— This correspondence presents an optimal soft-in soft-out (SISO) decoding algorithm for the binary image of Reed-Solomon codes that is based on Vardy and Be'ery's optimal soft-in hard-out algorithm. A novel suboptimal list-based SISO decoder that exploits Vardy and Be'ery's decomposition is also presented. For those codes whose very high rate allows practical decoding with the proposed algorithms, the proposed suboptimal SISO significantly outperforms standard list-based decoding techniques in iteratively decoded systems.

Index Terms— Reed-Solomon (RS) codes, soft-in soft-out (SISO) decoding, graphical models.

I. INTRODUCTION

The ubiquity and utility of Reed-Solomon (RS) codes are well-established (see, for example, [3]). It has been shown that soft decision decoding (SDD) algorithms can achieve as much as 3 dB of additional coding gain on the additive white Gaussian noise (AWGN) channel in comparison to hard decision decoding algorithms; however, SDD algorithms are often much more complex [4]. There has thus been a great deal of recent interest in SDD algorithms for RS codes with practically realizable complexity.

Koetter and Vardy recently presented a soft-in hard-out (SIHO) RS decoder that achieves coding gains on the order of 1 dB compared to the Berlekamp-Massey algorithm with a moderate complexity increase [5]. Extensions of Koetter and Vardy's algorithm proposed by Parvaresh and Vardy [6] and El-Khamy, McEliece and Harel [7] improve upon these results. Liu and Lin recently presented a SIHO decoder for self-concatenated Reed-Solomon codes based on their binary image [8]. Soft-in hard-out decoding algorithms are not suitable for iterative decoding, however, and soft-in soft-out (SISO) decoders for RS codes are often desired.

Due to their non-binary nature, trellis representations of RS codes are in general prohibitively complex [9]. In this correspondence, an optimal SISO decoder for the binary image of RS codes is presented based on the SIHO decoding algorithms of Vardy and Be'ery [10] and Ponnampalam and Vucetic [11]. It is shown that Vardy and Be'ery's decomposition implies a cycle-free factor graph and thus an optimal SISO decoding algorithm [12]. As predicted by the Cut-Set Bound [13], [14], the proposed optimal algorithm is necessarily prohibitively complex for large codes; however, for small, high-rate RS codes the proposed algorithm has reasonable complexity.

Several authors have considered suboptimal SISO RS decoding algorithms. Fossorier and Lin's ordered statistics approach realizes a suboptimal SISO decoding algorithm for the binary image of RS codes [15]. Jiang and Narayanan recently presented a suboptimal SISO decoding algorithm for the binary image of RS codes [16]; however, although their algorithm provides soft outputs, there is no indication that these soft outputs are good in an iterative context. The present correspondence proposes a suboptimal list-based SISO decoding algorithm based on Vardy and Be'ery's decomposition. It is shown that for very high-rate RS codes, the proposed algorithm compares favorably to standard list decoding schemes in both complexity and performance.

The remainder of this correspondence is organized as follows. Section II reviews Vardy and Be'ery's decomposition and presents the proposed optimal SISO decoding algorithm. Section III presents the proposed suboptimal list-based SISO decoding algorithm and investigates its performance as a stand-alone decoder and as a constituent decoder in a turbo product code [17]. Section IV gives conclusions and suggests directions for future work.

II. OPTIMAL SISO DECODING OF RS CODES

A. The Vardy-Be'ery Decomposition

Let C_RS be an (n = 2^m − 1, k, d = n − k + 1) RS code defined over GF(2^m) with roots α, α^2, ..., α^(d−1), where α is primitive in GF(2^m). Associated with C_RS is the (n, k′ ≤ k, d′ ≥ d) binary Bose-Chaudhuri-Hocquenghem (BCH) code C_BCH with roots α, α^2, ..., α^(d−1) and their cyclotomic conjugates over GF(2). Let φ : GF(2^m) → (GF(2))^m be a GF(2)-linear map with basis {γ_1, γ_2, ..., γ_m}. Any element c_i ∈ GF(2^m) can be written:

    c_i = Σ_{j=1}^{m} γ_j c_i^(j), where c_i^(j) ∈ GF(2)     (1)

and φ thus defines the binary image of C_RS.

The SIHO algorithms of [10] and [11] were motivated by structural properties of the generator matrix of the binary image of C_RS. Vardy and Be'ery proved in [10] that a generator matrix of the binary image of C_RS can be found with the structure shown in Figure 1, where B is k′ × n and generates C_BCH. The first mk′ rows of this structure are block



diagonal. The last m(k − k′) rows of this structure are denoted glue vectors in [10]. The code generated by the glue vectors is denoted the glue code.
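Equation (1) can be made concrete with a small example. The sketch below expands GF(8) symbols into bits, assuming the polynomial basis γ_j = α^(j−1) and an integer coding of field elements in which bit j − 1 of c_i is c_i^(j); the correspondence itself leaves the basis {γ_1, ..., γ_m} general.

```python
M = 3  # GF(2^m) with m = 3, i.e. GF(8)

def phi(c):
    """phi : GF(8) -> GF(2)^3. Bit j of the integer-coded symbol c is the
    coefficient c^(j+1) of the assumed basis element gamma_(j+1) = alpha^j."""
    return [(c >> j) & 1 for j in range(M)]

def binary_image(codeword):
    """Binary image of a GF(8) vector: concatenate phi of every symbol."""
    return [bit for c in codeword for bit in phi(c)]
```

Summing γ_j c_i^(j) back up recovers c_i, which is exactly the statement of (1).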


Efficient Soft-In Soft-Out Decoding of Reed-Solomon Codes

Keith M. Chugg, Alex Grant, Thomas R. Halford and Vishakan Ponnampalam

I. OPTIMAL SISO DECODING OF RS CODES

A. The Vardy-Be'ery Decomposition

Let C_RS be an (n = 2^m − 1, k, d = n − k + 1) RS code defined over GF(2^m) with generator polynomial:

    g_RS(X) = Π_{i=1}^{d−1} (X + α^i)     (1)

where α is primitive in GF(2^m). Associated with C_RS is the (n, k′ ≤ k, d′ ≥ d) binary BCH code C_BCH generated by g_BCH(X) with roots α, α^2, ..., α^(d−1) and their cyclotomic conjugates over GF(2). Let φ : GF(2^m) → (GF(2))^m be a linear mapping that defines the binary image of C_RS. Vardy and Be'ery proved in [VaBe91] that a generator matrix of the binary image of C_RS can be found with the block diagonal structure pictured in Figure 1, where B generates C_BCH.

Fig. 1. Structure of the generator matrix of the binary image of C_RS, where B generates C_BCH.

Based on this structure, define codes B and L as:

    B = {b(X) | }     (2)

The diagonal blocks of Figure 1 correspond to B; codewords belonging to B are formed by interleaving BCH codewords. The glue vectors of Figure 1 correspond to L; codewords belonging to L are formed by interleaving some combination of coset leaders of C_BCH.

B. An Alternate Definition of the Glue Code

II. SUBOPTIMAL SISO DECODING OF RS CODES
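As a concrete check of the generator polynomial definition, the sketch below builds g_RS(X) for the (7, 5, 3) RS code over GF(8), and the generator of the associated (7, 4, 3) BCH code obtained by adjoining the cyclotomic conjugate α^4 of the roots α, α^2. The primitive polynomial x^3 + x + 1 and the integer coding of field elements are illustrative assumptions; the text leaves them unspecified.

```python
PRIM, M = 0b1011, 3  # GF(8) via the (assumed) primitive polynomial x^3 + x + 1

def gf_mul(a, b):
    """Carry-less multiply in GF(8), reducing modulo the primitive polynomial."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        b >>= 1
        a <<= 1
        if a & (1 << M):
            a ^= PRIM
    return r

def gf_pow(a, e):
    r = 1
    for _ in range(e):
        r = gf_mul(r, a)
    return r

def poly_mul(p, q):
    """Multiply polynomials with GF(8) coefficients (highest degree first)."""
    out = [0] * (len(p) + len(q) - 1)
    for i, pc in enumerate(p):
        for j, qc in enumerate(q):
            out[i + j] ^= gf_mul(pc, qc)
    return out

ALPHA = 0b010  # the primitive element alpha

def generator(root_exponents):
    """g(X) = product over the given exponents i of (X + alpha^i)."""
    g = [1]
    for i in root_exponents:
        g = poly_mul(g, [1, gf_pow(ALPHA, i)])
    return g

g_rs = generator([1, 2])      # roots alpha, alpha^2  ->  d = 3
g_bch = generator([1, 2, 4])  # plus the conjugate alpha^4
```

Here g_bch comes out as the coefficients of x^3 + x + 1, the minimal polynomial of α, so C_BCH in this small case is the (7, 4, 3) Hamming code.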


D. Simulation Results

Throughout this work, BPSK modulation over AWGN channels is assumed unless otherwise stated. Figure 3 compares the performance of the optimal SISO to the Berlekamp-Massey hard-decision decoder for the (7, 5, 3) Reed-Solomon code. At a BER of 10^−6, the optimal SISO outperforms hard decoding by nearly 2 dB.

Fig. 3. Bit error rate performance comparison of the optimal SISO decoder and Berlekamp-Massey decoding of the (7, 5, 3) Reed-Solomon code.

The power of the optimal SISO lies in its performance in iterative systems. Figure 4 illustrates the block diagram of a turbo-like code (TLC) formed by the serial concatenation of a (7, 5, 3) Reed-Solomon code and a differential encoder, transmitted over a 3-tap ISI channel. Note that the inner SISO treats the differential encoder and the ISI as a single code. Figure 5 illustrates the performance of this code for block lengths of 1050 and 5250 bits. [TODO: define the interleaver and state that this code exhibits the expected performance.]

Fig. 4. Block diagram of the TLC formed using the (7, 5, 3) RS code on an ISI channel, and the associated iterative detection network.

Fig. 5. Performance of the (7, 5, 3) RS TLC with channel coefficients {1/√2, 1/2, 1/2}.
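The transmitter side of the concatenation just described can be sketched as follows. The accumulator form of the differential encoder and the tap values are illustrative assumptions rather than specifications taken from the text.

```python
import numpy as np

def diff_encode(bits):
    """Differential encoder (binary accumulator): d_k = b_k XOR d_(k-1), d_(-1) = 0."""
    out, prev = [], 0
    for b in bits:
        prev ^= b
        out.append(prev)
    return out

def bpsk(bits):
    """Map bits {0, 1} to BPSK symbols {+1, -1}."""
    return 1.0 - 2.0 * np.asarray(bits, dtype=float)

def isi_channel(symbols, taps, noise_std, rng):
    """3-tap ISI channel: y_k = sum_i h_i * s_(k-i), plus AWGN of the given std."""
    clean = np.convolve(symbols, taps)[: len(symbols)]
    return clean + noise_std * rng.standard_normal(len(symbols))

# Example: differentially encode a short bit sequence and pass it through
# illustrative taps whose powers sum to one.
tx = bpsk(diff_encode([1, 0, 1, 1]))
rx = isi_channel(tx, [1 / np.sqrt(2), 0.5, 0.5], noise_std=0.1,
                 rng=np.random.default_rng(0))
```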

III. EXTENSIONS OF THE OPTIMAL SISO

In this section, the optimal SISO is extended to codes that are closely related to Reed-Solomon codes. Section III-A presents an optimal SISO for shortened RS codes. Section III-B presents an optimal SISO for punctured RS codes. Section III-C presents a SISO for extended RS codes. Section III-D presents the Vardy-Be'ery decomposition in a more general framework that acts as a recipe for creating linear codes with efficient SISO decoding algorithms.

A. Shortened RS Codes

B. Punctured RS Codes

C. Extended RS Codes

D. Code Constructions Based on the Vardy-Be'ery Decomposition

IV. SUBOPTIMAL SISO DECODING OF RS CODES

For high-rate Reed-Solomon codes (for example, the (n, n − 2, 3) and (n, n − 4, 5) code families), the complexity of the optimal SISO decoder presented in Section II is dominated by the size of the glue code. As an example, consider the (127, 125, 3) RS code: the parallel trellises have maximum state space size 128, while there are 2^14 valid coset configurations in the glue code. The decoding complexity can be dramatically reduced by considering only a subset of likely coset configurations rather than the exhaustive combination and marginalization described in Section II-C. Because the glue code is decoded suboptimally, the reduced-complexity SISO is not optimal. Sections IV-A, IV-B and IV-C detail algorithms for generating lists of likely coset configurations. Simulation results of the suboptimal SISO as both a standalone decoder and in iterative schemes are given in Section IV-D.
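The dimension k′ of the associated BCH code, and hence the glue-code dimension m(k − k′), follows from the cyclotomic conjugates of the roots. A minimal sketch of that computation, consistent with the (127, 125, 3) example above (where C_BCH is the (127, 120, 3) Hamming code):

```python
def cyclotomic_coset(s, n):
    """Cyclotomic coset of s modulo n over GF(2): {s, 2s, 4s, ...} mod n."""
    coset, x = [], s % n
    while x not in coset:
        coset.append(x)
        x = (2 * x) % n
    return sorted(coset)

def bch_dimension(m, d):
    """k' of the binary BCH code whose roots are alpha^1, ..., alpha^(d-1)
    together with all of their cyclotomic conjugates."""
    n = 2**m - 1
    root_exponents = set()
    for i in range(1, d):
        root_exponents.update(cyclotomic_coset(i, n))
    # deg g_BCH = number of distinct roots; k' = n - deg g_BCH
    return n - len(root_exponents)
```

For the (127, 125, 3) code this gives k′ = 120, so k − k′ = 5 binary coset choices per component trellis.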

A. List Decoding of the Glue Code

Let M be the glue code corresponding to the RS code C_RS defined as per (6). Let M′ be the binary image of this code with generator matrix G′_M and parity check matrix H′_M.
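The specific list-generation algorithms of Sections IV-A through IV-C are not detailed in this excerpt. As a generic illustration of the underlying idea (not the authors' method), joint coset configurations can be enumerated in order of total metric with a lazy best-first search over per-trellis coset rankings:

```python
import heapq

def best_configurations(metrics, num):
    """Enumerate joint coset configurations of m component trellises in order
    of total metric (smaller is better). `metrics` is a list of m lists, each
    sorted ascending, of (metric, coset_label) pairs. Returns up to `num`
    (total_metric, configuration) pairs via a lazy heap expansion."""
    m = len(metrics)
    start = (sum(metrics[i][0][0] for i in range(m)), (0,) * m)
    heap, seen, out = [start], {start[1]}, []
    while heap and len(out) < num:
        total, idx = heapq.heappop(heap)
        out.append((total, tuple(metrics[i][idx[i]][1] for i in range(m))))
        # Expand each neighbor that advances one trellis to its next-best coset.
        for i in range(m):
            if idx[i] + 1 < len(metrics[i]):
                nxt = idx[:i] + (idx[i] + 1,) + idx[i + 1:]
                if nxt not in seen:
                    seen.add(nxt)
                    delta = metrics[i][idx[i] + 1][0] - metrics[i][idx[i]][0]
                    heapq.heappush(heap, (total + delta, nxt))
    return out
```

In an actual glue-code list decoder, each candidate configuration would additionally be checked for membership in the glue code (e.g. against H′_M) before being placed on the list.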

B. Special Case: The (n, n − 2, 3) RS Codes

C. Special Case: The (n, n − 4, 5) RS Codes

D. Simulation Results

V. COMPARISON TO THE KOETTER-VARDY SIHO

VI. CONCLUSIONS AND FUTURE WORK


APPENDIX

REFERENCES

[1] O. Aitsab and R. Pyndiah, "Performance of Reed-Solomon block turbo code," in Proc. Globecom Conf., London, 1996, pp. 121–125.
[2] ——, "Performance of concatenated Reed-Solomon/convolutional codes with iterative decoding," in Proc. Globecom Conf., Phoenix, AZ, 1997, pp. 934–938.
[3] K. M. Chugg, A. Anastasopoulos, and X. Chen, Iterative Detection: Adaptivity, Complexity Reduction, and Applications. Kluwer Academic Publishers, 2001.
[4] G. D. Forney, Jr., Concatenated Codes. Cambridge, Massachusetts: MIT Press, 1966.
[5] T. R. Halford, "Optimal soft-in soft-out decoding of Reed-Solomon codes," Communication Sciences Institute, USC, Los Angeles, CA, Tech. Rep. CSI-04-05-03, April 2003.
[6] R. Koetter, "Fast generalized minimum distance decoding of algebraic geometric and Reed-Solomon codes," IEEE Trans. Information Theory, vol. 42, pp. 721–737, May 1996.
[7] R. Koetter and A. Vardy, "Algebraic soft-decision decoding of Reed-Solomon codes," IEEE Trans. Information Theory, vol. 49, pp. 2809–2825, November 2003.
[8] V. Ponnampalam and A. Grant, "An efficient SISO algorithm for Reed-Solomon codes," in Proc. 4th Australian Comm. Theory Workshop, February 2003.
[9] J. G. Proakis, Digital Communications, 3rd ed. New York: McGraw-Hill, 1995.
[10] A. Vardy and Y. Be'ery, "Bit-level soft-decision decoding of Reed-Solomon codes," IEEE Trans. Communications, vol. 39, pp. 440–444, March 1991.
[11] N. Wiberg, "Codes and decoding on general graphs," Ph.D. dissertation, Linköping University, Sweden, 1996.
[12] S. B. Wicker and V. K. Bhargava, Reed-Solomon Codes and Their Applications. John Wiley & Sons, 1999.
[13] J. K. Wolf, "Efficient maximum-likelihood decoding of linear block codes using a trellis," IEEE Trans. Information Theory, vol. 24, pp. 76–80, 1978.

3

D. Simulation Results

Throughout this work BPSK modulation over AWGN chan-

nels is assumed unless otherwise stated. Figure 3 compares the

performance of the optimal SISO to Berlekamp-Massey hard-

decision decoder for the (7, 5, 3) Reed-Solomon code. At aBER of 10!6, the optimal SISO outperforms hard decoding

by nearly 2 dB.

Fig. 3. Bit error rate performance comparison of the optimal SISO decoderand Berlekamp-Massey decoding of the (7, 5, 3) Reed-Solomon code.

The power of the optimal SISO lies in its performance in

iterative systems. Figure 4 illustrates the block diagram of

a Turbo-like code (TLC) formed by the serial concatenation

of a (7, 5, 3) Reed-Solomon code and a differential encodertransmitted over a 3-tap ISI channel. Note that the inner SISO

treats the differential encoder and ISI as a single code. Figure

5 illustrates the performance of this code for block lengths of

1050 and 5250 bits. I need a sentence defining the interleaver

and one stating that this code exhibits the performance we

expect.

Fig. 4. Block diagram of TLC formed using the (7, 5, 3) RS code in ISIchannel and the associated iterative detection network.

Fig. 5. Performance of the (7, 5, 3) RS TLC with channel coeffecients“1!2, 12 , 1

2

”.

III. EXTENSIONS OF THE OPTIMAL SISO

In this Section, the optimal SISO is extended to codes

that are closely related to Reed-Solomon codes. Section III-A

presents an optimal SISO for shortened RS codes. Section III-

B presents an optimal SISO for punctured RS codes. Section

III-C presents a SISO for extended RS codes. Section III-D

presents the Vardy-Be’ery decomposition in a more general

framework that acts as a recipe for creating linear codes with

efficient SISO decoding algorithms.

A. Shortened RS Codes

B. Punctured RS Codes

C. Extended RS Codes

D. Code Constructions Based on the Vardy-Be’ery Decompo-

sition

IV. SUBOPTIMAL SISO DECODING OF RS CODES

For high rate Reed-Solomon codes (for example, the (n, n!2, 3) and (n, n ! 4, 5) code families) the complexity of theoptimal SISO decoder presented in Section II is dominated

by the size of the glue code. As an example, consider the

(127, 125, 3) RS code: the parallel trellises have maximumstate space size 128 while there are 214 valid coset config-

urations in the glue code. The decoding complexity can be

dramatically reduced by considering only a subset of likely

coset configurations rather than the exhaustive combination

and marginalization described in Section II-C. Because the

glue decode is decoded suboptimally, the reduced-complexity

SISO is not optimal. Sections IV-A, IV-B and IV-C detail

algorithms for generating lists of likely coset configurations.

Simulation results of the suboptimal SISO as both a standalone

decoder and in iterative schemes are given in Section IV-D.

A. List Decoding of the Glue Code

LetM be the glue code corresponding to the RS code CRS

defined as per (6). Let M" be the binary image of this codewith generator matrix G"

M and parity check matrix H"M.

B. Special Case: The (n, n! 2, 3) RS CodesC. Special Case: The (n, n! 4, 5) RS CodesD. Simulation Results

V. COMPARISON TO THE KOETTER-VARDY SIHO

VI. CONCLUSIONS AND FUTURE WORK

mnmkmk"

m(k ! k")

APPENDIX

REFERENCES

[1] O. Aitsab and R. Pyndiah, “Performance of Reed-Solomon block turbocode,” in Proc. Globecom Conf., London, 1996, pp. 121–125.

[2] ——, “Performance of concatenated Reed-Solomon/convolutional codeswith iterative,” in Proc. Globecom Conf., Phoenix, AZ, 1997, pp. 934–938.

[3] K. M. Chugg, A. Anastasopoulos, and X. Chen, Iterative Detection:

Adaptivity, Complexity Reduction, and Applications. Kluwer AcademicPublishers, 2001.

[4] G. D. Forney, Jr., Concatenated Codes. Cambridge, Massachusetts:MIT Press, 1966.

[5] T. R. Halford, “Optimal soft-in soft-out decoding of Reed-Solomoncodes,” Communication Sciences Institute, USC, Los Angeles, CA,Tech. Rep. CSI-04-05-03, April 2003.

[6] R. Koetter, “Fast generalized minimum distance decoding of algebraicgeometric and Reed-Solomon codes,” IEEE Trans. Information Theory,vol. 42, pp. 721–737, May 1996.

[7] R. Koetter and A. Vardy, “Algebraic soft-decision decoding of Reed-Solomon codes,” IEEE Trans. Information Theory, vol. 49, pp. 2809–2825, November 2003.

[8] V. Ponnampalam and A. Grant, “An efficient siso algorithm for Reed-Solomon codes,” in Proc. 4th Australian Comm. Theory Workshop,February 2003.

[9] J. G. Proakis, Digital Communications, 3rd ed. New York: McGraw-Hill, 1995.

[10] A. Vardy and Y. Be’ery, “Bit level soft-decision decoding of Reed-Solomon codes,” IEEE Trans. Commununication, vol. 39, pp. 440–444,March 1991.

[11] N. Wiberg, “Codes and decoding on general graphs,” Ph.D. dissertation,Linkoping University (Sweden), 1996.

[12] S. B. Wicker and V. K. Bhargava, Reed-Solomon Codes and Their

Applications. John Wiley & Sons, 1999.[13] J. K. Wolf, “Efficient maximum-likelihood decoding of linear block

codes using a trellis,” IEEE Trans. Information Theory, vol. 24, pp.76–80, 1978.

3

D. Simulation Results

Throughout this work BPSK modulation over AWGN chan-

nels is assumed unless otherwise stated. Figure 3 compares the

performance of the optimal SISO to Berlekamp-Massey hard-

decision decoder for the (7, 5, 3) Reed-Solomon code. At aBER of 10!6, the optimal SISO outperforms hard decoding

by nearly 2 dB.

Fig. 3. Bit error rate performance comparison of the optimal SISO decoderand Berlekamp-Massey decoding of the (7, 5, 3) Reed-Solomon code.

The power of the optimal SISO lies in its performance in

iterative systems. Figure 4 illustrates the block diagram of

a Turbo-like code (TLC) formed by the serial concatenation

of a (7, 5, 3) Reed-Solomon code and a differential encodertransmitted over a 3-tap ISI channel. Note that the inner SISO

treats the differential encoder and ISI as a single code. Figure

5 illustrates the performance of this code for block lengths of

1050 and 5250 bits. I need a sentence defining the interleaver

and one stating that this code exhibits the performance we

expect.

Fig. 4. Block diagram of TLC formed using the (7, 5, 3) RS code in ISIchannel and the associated iterative detection network.

Fig. 5. Performance of the (7, 5, 3) RS TLC with channel coeffecients“1!2, 12 , 1

2

”.

III. EXTENSIONS OF THE OPTIMAL SISO

In this Section, the optimal SISO is extended to codes

that are closely related to Reed-Solomon codes. Section III-A

presents an optimal SISO for shortened RS codes. Section III-

B presents an optimal SISO for punctured RS codes. Section

III-C presents a SISO for extended RS codes. Section III-D

presents the Vardy-Be’ery decomposition in a more general

framework that acts as a recipe for creating linear codes with

efficient SISO decoding algorithms.

A. Shortened RS Codes

B. Punctured RS Codes

C. Extended RS Codes

D. Code Constructions Based on the Vardy-Be’ery Decompo-

sition

IV. SUBOPTIMAL SISO DECODING OF RS CODES

For high-rate Reed-Solomon codes (for example, the (n, n−2, 3) and (n, n−4, 5) code families), the complexity of the optimal SISO decoder presented in Section II is dominated by the size of the glue code. As an example, consider the (127, 125, 3) RS code: the parallel trellises have maximum state space size 128, while there are 2¹⁴ valid coset configurations in the glue code. The decoding complexity can be dramatically reduced by considering only a subset of likely coset configurations rather than the exhaustive combination and marginalization described in Section II-C. Because the glue code is decoded suboptimally, the reduced-complexity SISO is not optimal. Sections IV-A, IV-B and IV-C detail algorithms for generating lists of likely coset configurations. Simulation results for the suboptimal SISO, both as a standalone decoder and in iterative schemes, are given in Section IV-D.
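The list-based idea can be sketched concretely. Assuming per-trellis log-domain coset beliefs, a candidate list is formed by keeping the K valid coset configurations with the best additive (max-log) scores; for the toy (7, 5, 3) code the valid set is small enough to score exhaustively, whereas the algorithms of Sections IV-A through IV-C generate candidates without exhaustive enumeration. The following is our own illustrative Python sketch, not the authors' implementation:

```python
import heapq
from math import log

# Valid coset configurations of the (7, 5, 3) example, as listed in (6)
# (indices refer to the coset-leader labelling of (5)).
VALID = [(0, 0, 0), (1, 6, 4), (2, 7, 3), (3, 1, 7),
         (4, 5, 6), (5, 3, 2), (6, 2, 5), (7, 4, 1)]

def top_k_configs(log_beliefs, k):
    """Return the k valid coset configurations with the largest summed
    per-trellis log-beliefs (min-sum / max-log style list generation)."""
    scored = ((sum(lb[c] for lb, c in zip(log_beliefs, cfg)), cfg)
              for cfg in VALID)
    return heapq.nlargest(k, scored)

# Hypothetical soft inputs: log-beliefs over the 8 cosets for each of
# the 3 trellises, with the evidence favouring configuration (1, 6, 4).
lb = [[log(0.02)] * 8 for _ in range(3)]
lb[0][1] = lb[1][6] = lb[2][4] = log(0.86)
result = top_k_configs(lb, 2)
```

The glue-code marginalization of Section II-C is then restricted to the configurations on this list, which is what makes the resulting SISO suboptimal.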

A. List Decoding of the Glue Code

Let M be the glue code corresponding to the RS code C_RS defined as per (6). Let M′ be the binary image of this code, with generator matrix G′_M and parity check matrix H′_M.

B. Special Case: The (n, n−2, 3) RS Codes

C. Special Case: The (n, n−4, 5) RS Codes

D. Simulation Results

V. COMPARISON TO THE KOETTER-VARDY SIHO

VI. CONCLUSIONS AND FUTURE WORK


APPENDIX

REFERENCES

[1] O. Aitsab and R. Pyndiah, "Performance of Reed-Solomon block turbo code," in Proc. Globecom Conf., London, 1996, pp. 121–125.

[2] ——, "Performance of concatenated Reed-Solomon/convolutional codes with iterative decoding," in Proc. Globecom Conf., Phoenix, AZ, 1997, pp. 934–938.

[3] K. M. Chugg, A. Anastasopoulos, and X. Chen, Iterative Detection: Adaptivity, Complexity Reduction, and Applications. Kluwer Academic Publishers, 2001.

[4] G. D. Forney, Jr., Concatenated Codes. Cambridge, MA: MIT Press, 1966.

[5] T. R. Halford, "Optimal soft-in soft-out decoding of Reed-Solomon codes," Communication Sciences Institute, USC, Los Angeles, CA, Tech. Rep. CSI-04-05-03, April 2003.

[6] R. Koetter, "Fast generalized minimum distance decoding of algebraic geometric and Reed-Solomon codes," IEEE Trans. Information Theory, vol. 42, pp. 721–737, May 1996.

[7] R. Koetter and A. Vardy, "Algebraic soft-decision decoding of Reed-Solomon codes," IEEE Trans. Information Theory, vol. 49, pp. 2809–2825, November 2003.

[8] V. Ponnampalam and A. Grant, "An efficient SISO algorithm for Reed-Solomon codes," in Proc. 4th Australian Comm. Theory Workshop, February 2003.

[9] J. G. Proakis, Digital Communications, 3rd ed. New York: McGraw-Hill, 1995.

[10] A. Vardy and Y. Be'ery, "Bit-level soft-decision decoding of Reed-Solomon codes," IEEE Trans. Communications, vol. 39, pp. 440–444, March 1991.

[11] N. Wiberg, "Codes and decoding on general graphs," Ph.D. dissertation, Linköping University, Sweden, 1996.

[12] S. B. Wicker and V. K. Bhargava, Reed-Solomon Codes and Their Applications. John Wiley & Sons, 1999.

[13] J. K. Wolf, "Efficient maximum-likelihood decoding of linear block codes using a trellis," IEEE Trans. Information Theory, vol. 24, pp. 76–80, 1978.


Fig. 1. Structure of the generator matrix of the binary image of C_RS, where B generates C_BCH.

The structure of Figure 1 implies that a codeword in the binary image of C_RS is formed by interleaving m codewords drawn from C_BCH and adding a glue code codeword. Formally, define codes B and L over GF(2^m)¹:

B = { b(X) : b(X) = Σ_{j=1}^{m} β_j c^(j)(X), c^(j)(X) ∈ C_BCH }   (2)

L = { l(X) : l(X) = Σ_{j=1}^{m} β_j l^(j)(X), l^(j)(X) ∈ E, and l(γ) = 0 for γ ∈ {α, …, α^{d−1}} }   (3)

where E is the set of coset leaders of C_BCH. The diagonal blocks of Figure 1 correspond to the binary image of B; codewords belonging to B are formed by interleaving BCH codewords. The glue vectors of Figure 1 correspond to the binary image of L; codewords belonging to L are formed by interleaving some combination of coset representatives of C_BCH. The RS code C_RS is a binary linear combination of B and L.

Example: (7, 5, 3) RS Code: Let C(7,5,3) be the (7, 5, 3) RS code defined over GF(8) with generator polynomial g(7,5,3)(X) = (X + α)(X + α²), and let φ : GF(8) → GF(2)³ have basis {1, α, α²}. Specifically, with this basis: 1 → 100, α → 010 and α² → 001. The associated BCH code, C(7,4,3), has roots α, α² and α⁴, and thus has dimension 4 and minimum distance 3. A generator matrix for the binary image of C(7,5,3) with the structure shown in Figure 1 is:

G(7,5,3) =
[ 1 1 0 1 0 0 0 | 0 0 0 0 0 0 0 | 0 0 0 0 0 0 0 ]
[ 0 1 1 0 1 0 0 | 0 0 0 0 0 0 0 | 0 0 0 0 0 0 0 ]
[ 0 0 1 1 0 1 0 | 0 0 0 0 0 0 0 | 0 0 0 0 0 0 0 ]
[ 0 0 0 1 1 0 1 | 0 0 0 0 0 0 0 | 0 0 0 0 0 0 0 ]
[ 0 0 0 0 0 0 0 | 1 1 0 1 0 0 0 | 0 0 0 0 0 0 0 ]
[ 0 0 0 0 0 0 0 | 0 1 1 0 1 0 0 | 0 0 0 0 0 0 0 ]
[ 0 0 0 0 0 0 0 | 0 0 1 1 0 1 0 | 0 0 0 0 0 0 0 ]
[ 0 0 0 0 0 0 0 | 0 0 0 1 1 0 1 | 0 0 0 0 0 0 0 ]
[ 0 0 0 0 0 0 0 | 0 0 0 0 0 0 0 | 1 1 0 1 0 0 0 ]
[ 0 0 0 0 0 0 0 | 0 0 0 0 0 0 0 | 0 1 1 0 1 0 0 ]
[ 0 0 0 0 0 0 0 | 0 0 0 0 0 0 0 | 0 0 1 1 0 1 0 ]
[ 0 0 0 0 0 0 0 | 0 0 0 0 0 0 0 | 0 0 0 1 1 0 1 ]
[ 1 0 0 0 0 0 0 | 0 0 0 0 1 0 0 | 0 0 1 0 0 0 0 ]
[ 0 1 0 0 0 0 0 | 0 0 0 0 0 1 0 | 0 0 0 1 0 0 0 ]
[ 0 0 1 0 0 0 0 | 0 0 0 0 0 0 1 | 0 0 0 0 1 0 0 ]
  (4)

¹Throughout this correspondence, codewords are described interchangeably as n-tuples, c = (c₀, c₁, …, c_{n−1}), and as polynomials in an indeterminate X, c(X) = Σ_{i=0}^{n−1} c_i X^i.

If the coset leaders of C(7,4,3) are labelled:

ε₀ = (0, 0, 0, 0, 0, 0, 0)   ε₄ = (0, 0, 1, 0, 0, 0, 0)
ε₁ = (1, 0, 0, 0, 0, 0, 0)   ε₅ = (0, 0, 0, 0, 0, 0, 1)
ε₂ = (0, 1, 0, 0, 0, 0, 0)   ε₆ = (0, 0, 0, 0, 1, 0, 0)
ε₃ = (0, 0, 0, 1, 0, 0, 0)   ε₇ = (0, 0, 0, 0, 0, 1, 0)
  (5)

then the coset configurations that satisfy (3) are:

(ε₀, ε₀, ε₀), (ε₁, ε₆, ε₄), (ε₂, ε₇, ε₃), (ε₃, ε₁, ε₇),
(ε₄, ε₅, ε₆), (ε₅, ε₃, ε₂), (ε₆, ε₂, ε₅), (ε₇, ε₄, ε₁).   (6)

To illustrate that the glue vectors of (4) generate the coset configurations of (6), consider the binary sum of the three glue vectors:

(1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 0, 0, 1, 1, 1, 0, 0)

and note that:

(1, 1, 1, 0, 0, 0, 0) ∈ C(7,4,3) + ε₇
(0, 0, 0, 0, 1, 1, 1) ∈ C(7,4,3) + ε₄   (7)
(0, 0, 1, 1, 1, 0, 0) ∈ C(7,4,3) + ε₁

The coset configuration corresponding to the sum of the glue vectors is thus (ε₇, ε₄, ε₁). □
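As a sanity check on the example, the short Python sketch below (our own illustration; names such as `gf2_span` are ours, not the authors') de-interleaves the binary sum of the three glue rows of (4) and recovers the weight-one coset leaders appearing in (7):

```python
import itertools

# Diagonal blocks of (4): generator of the (7, 4, 3) binary BCH
# (Hamming) code, i.e. cyclic shifts of g(X) = 1 + X + X^3.
G_BCH = [
    (1, 1, 0, 1, 0, 0, 0),
    (0, 1, 1, 0, 1, 0, 0),
    (0, 0, 1, 1, 0, 1, 0),
    (0, 0, 0, 1, 1, 0, 1),
]

def gf2_span(rows):
    """All GF(2) linear combinations of the given rows."""
    out = set()
    for coeffs in itertools.product((0, 1), repeat=len(rows)):
        w = [0] * len(rows[0])
        for c, r in zip(coeffs, rows):
            if c:
                w = [(a + b) % 2 for a, b in zip(w, r)]
        out.add(tuple(w))
    return out

C_BCH = gf2_span(G_BCH)

# Binary sum of the three glue rows of (4).
glue_sum = (1, 1, 1, 0, 0, 0, 0,
            0, 0, 0, 0, 1, 1, 1,
            0, 0, 1, 1, 1, 0, 0)

# De-interleave into three length-7 pieces; the coset leader of each
# piece is the minimum-weight word in its coset piece + C_BCH.
pieces = [glue_sum[7 * j:7 * (j + 1)] for j in range(3)]
leaders = []
for p in pieces:
    leader = min((tuple((a + b) % 2 for a, b in zip(p, c)) for c in C_BCH),
                 key=sum)
    leaders.append(leader)
```

Each piece has a unique weight-one leader because every nonzero coset of the (7, 4, 3) Hamming code contains exactly one weight-one word; the recovered leaders are ε₇, ε₄ and ε₁, matching (7).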

B. An Alternate Definition of the Glue Code

Let {ε₀, ε₁, …, ε_{|E|−1}} = E be the set of coset leaders of C_BCH and let {s₀, s₁, …, s_{|E|−1}} = S be the corresponding set of syndromes, where |E| = 2^{n−k′}. The map between coset leaders and syndromes is defined by:

s_i = ε_i H_BCH^⊤   (8)

where H_BCH is an (n − k′) × n parity check matrix for C_BCH and ⊤ denotes matrix transposition.

Let T be the code obtained from L by replacing each coset leader l^(j) by the corresponding syndrome t^(j) = l^(j) H_BCH^⊤:

T = { t(X) : t(X) = Σ_{j=1}^{m} β_j t^(j)(X), t^(j)(X) ∈ S, and t(γ) = 0 for γ ∈ {α, …, α^{d−1}} }   (9)


where {β₁, β₂, …, β_m} defines the mapping φ : GF(2^m) → (GF(2))^m and is a subset of a basis for the mapping φ′ : GF(2^{n−k′}) → (GF(2))^{n−k′}. Codewords in T are thus formed by interleaving m binary syndrome vectors to form an m-tuple of symbols drawn from GF(2^{n−k′}). The code defined by (9) has length m and is defined over GF(2^{n−k′}).

The codes L and T are clearly closely related; the term glue code is used to denote L and T interchangeably throughout this correspondence. Codewords l(X) ∈ L and t(X) ∈ T must both satisfy l(γ) = t(γ) = 0 for γ ∈ {α, …, α^{d−1}}. When considering the generation or encoding of a codeword, the L representation of the glue code is more useful, as was demonstrated in the example of Section II-A. As will be seen in Sections II-C and III, however, when considering the decoding of a codeword, the T representation of the glue code is more useful.

Example: (7, 5, 3) RS Code: The glue code T of the (7, 5, 3) RS code is defined over GF(2^{n−k′}) = GF(8), has roots {α, α²} and is thus the (7, 5, 3) RS code shortened to length 3, with generator matrix:

G_T = [ α³  α⁴  1 ]   (10)

over GF(8).

over GF (8). The columns of the binary image of GT can berearranged so that the bits from each syndrome are groupedtogether yielding a generator matrix for the resulting binarycode T " that is systematic in t(1)(X):

t(1)0 t(1)1 t(1)2 t(2)0 t(2)1 t(2)2 t(3)0 t(3)1 t(3)2

GT ! =

1 1 0 0 0 1 1 0 0 10 1 0 1 1 1 1 1 00 0 1 1 0 1 0 1 1

2(11)

Eight syndrome 3-tuples, or syndrome configurations, are generated by G_T′:

(s₀, s₀, s₀), (s₁, s₆, s₄), (s₂, s₇, s₃), (s₃, s₁, s₇),
(s₄, s₅, s₆), (s₅, s₃, s₂), (s₆, s₂, s₅), (s₇, s₄, s₁).   (12)

The set of syndrome configurations in (12) is in one-to-one correspondence with the set of coset configurations in (6) via (8). Note that not every syndrome 3-tuple corresponds to a codeword in T; for example, (s₁, s₁, s₁) ∉ T. Syndrome configurations in T are denoted valid syndrome configurations; the corresponding coset configurations in L are denoted valid coset configurations. □
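The correspondence between (11) and (12) can be checked mechanically. The sketch below (our own illustrative Python, not the authors' code) tabulates the binary images of the powers of α in the basis {1, α, α²}, reads off the syndromes of the weight-one coset leaders of (5), enumerates the row space of G_T′, and maps each codeword back to a syndrome configuration:

```python
import itertools

# Binary images of a^0, ..., a^6 in GF(8) with basis {1, a, a^2},
# where a^3 = 1 + a.
ALPHA = [(1, 0, 0), (0, 1, 0), (0, 0, 1), (1, 1, 0),
         (0, 1, 1), (1, 1, 1), (1, 0, 1)]

# Leader eps_i of (5) has its single 1 at position pos[i]; by (8) its
# syndrome is the corresponding column of H_BCH, i.e. the image of
# a^pos[i].
pos = {1: 0, 2: 1, 3: 3, 4: 2, 5: 6, 6: 4, 7: 5}
s = {0: (0, 0, 0)}
for i, p in pos.items():
    s[i] = ALPHA[p]

# Systematic generator (11) of the binary glue code T'.
G_T = [
    (1, 0, 0, 0, 1, 1, 0, 0, 1),
    (0, 1, 0, 1, 1, 1, 1, 1, 0),
    (0, 0, 1, 1, 0, 1, 0, 1, 1),
]

def gf2_combos(rows):
    """Yield every GF(2) linear combination of the rows."""
    for coeffs in itertools.product((0, 1), repeat=len(rows)):
        w = [0] * len(rows[0])
        for c, r in zip(coeffs, rows):
            if c:
                w = [(x + y) % 2 for x, y in zip(w, r)]
        yield tuple(w)

# Split each codeword into (t1, t2, t3) and look up the syndrome index
# of each 3-bit piece.
inv = {v: k for k, v in s.items()}
configs = {tuple(inv[w[3 * j:3 * j + 3]] for j in range(3))
           for w in gf2_combos(G_T)}
```

The resulting index set reproduces exactly the eight valid syndrome configurations of (12).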

C. RS Code Factor Graph

The generator matrix structure seen in Figure 1 implies a cycle-free factor graph for RS codes. The RS factor graph consists of m parallel n-stage binary trellises and an additional glue node, as illustrated in Figure 2, where variables are represented by circular vertices, state variables by double circles and local constraints by square vertices. The binary trellises correspond to C_BCH and are constructed using the Wolf method [13]. The final trellis stage is a 2^{n−k′}-ary variable node corresponding to the cosets, or equivalently the syndromes, of C_BCH. The node connecting the final trellis stages corresponds to the glue code.

Coded bits are labeled {c_i^(j)} for i = 0, …, n−1 and j = 1, …, m, and uncoded bits are similarly labeled {a_i^(j)} for i = 0, …, k−1 and j = 1, …, m. If there is no a priori soft information on the uncoded bits, then the corresponding sections of the factor graph are ignored. If there is a priori soft information on the uncoded bits and a systematic encoder is used, then the equality constraints in the corresponding sections of the factor graph enforce a_i^(j) = c_i^(j) for i = 0, …, k − 1 and j = 1, …, m.

The RS factor graph is cycle free. Given a proper activation schedule, the standard message passing algorithm thus performs optimal decoding [3]. One such activation schedule that ensures optimality is the inward-outward recursion described above.
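The glue-node update at the heart of this message passing can be sketched in the sum-product semiring: assuming probability-domain messages over the 8 cosets arrive from each of the m = 3 trellises, the outgoing message to one trellis marginalizes the product of the other trellises' beliefs over the valid coset configurations of (6). This is our own illustrative Python sketch, not the authors' implementation:

```python
# Valid coset configurations of the (7, 5, 3) example, as listed in (6)
# (indices refer to the coset-leader labelling of (5)).
VALID = [(0, 0, 0), (1, 6, 4), (2, 7, 3), (3, 1, 7),
         (4, 5, 6), (5, 3, 2), (6, 2, 5), (7, 4, 1)]

def glue_node(msgs_in):
    """Sum-product glue-node update: the outgoing message to trellis j,
    evaluated at coset c, sums the product of the other trellises'
    incoming coset beliefs over all valid configurations whose j-th
    entry equals c."""
    m = len(msgs_in)
    out = [[0.0] * 8 for _ in range(m)]
    for cfg in VALID:
        for j in range(m):
            p = 1.0
            for i in range(m):
                if i != j:
                    p *= msgs_in[i][cfg[i]]
            out[j][cfg[j]] += p
    return out

# Toy evidence: trellises 1 and 2 strongly favour cosets 1 and 6.
uniform = [1.0 / 8] * 8
m1 = [0.02] * 8
m1[1] = 0.86
m2 = [0.02] * 8
m2[6] = 0.86
out = glue_node([m1, m2, uniform])
best = max(range(8), key=lambda c: out[2][c])
```

Because (ε₁, ε₆, ε₄) is the only valid configuration consistent with the evidence, the glue node pushes the third trellis toward coset 4; this is how the glue code couples the otherwise independent BCH trellises.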


V. COMPARISON TO THE KOETTER-VARDY SIHO

[Draft notes: This section will tabulate the complexity of the proposed SISOs and compare, for the (n, n−2) and (n, n−4) code families, the complexity-performance tradeoffs between the proposed SISO and the Koetter-Vardy algorithm (KVA); plots comparing the two are included in this draft. One point to stress is that the KVA does not lend itself to a SISO extension, nor to an application of the Chase algorithm: although the KVA is a list decoder, the list of candidate codewords it produces often contains only one or two entries, as has been shown in the literature.]

VI. CONCLUSIONS AND FUTURE WORK

[Draft notes: One conclusion to draw is that the top-down approach, i.e., decomposition of RS codes, has been exhausted, and that less complex suboptimal SISOs must be motivated by the bottom-up approach, i.e., Tanner graphs.]

APPENDIX

[Draft notes: Two appendices are planned: 1) an explanation of the shortest path algorithm used; 2) a development of the complexity measures used for the Koetter-Vardy algorithm.]

x0,0 (9)

x0,1 (10)

x0,m!1 (11)

x1,0 (12)

x1,1 (13)

x1,m!1 (14)

xn!1,0 (15)

xn!1,1 (16)

xn!1,m!1 (17)

a0,0 (18)

a0,1 (19)

a0,m!1 (20)

a1,0 (21)

a1,1 (22)

a1,m!1 (23)

an!1,0 (24)

an!1,1 (25)

an!1,m!1 (26)

. . . (27)

... (28)

3

The RS factor graph is cycle free. Given a proper activation

schedule the standard message passing algorithm thus per-

forms optimal decoding [3]. One such activation schedule that

ensures optimality is the inward-outward recursion described

above.

D. Simulation Results

Throughout this work BPSK modulation over AWGN chan-

nels is assumed unless otherwise stated. Figure 3 compares the

performance of the optimal SISO to Berlekamp-Massey hard-

decision decoder for the (7, 5, 3) Reed-Solomon code. At aBER of 10!6, the optimal SISO outperforms hard decoding by2.5 dB. The power of the optimal SISO lies in its performance

Fig. 3. Bit error rate performance comparison of the optimal SISO decoderand Berlekamp-Massey decoding of the (7, 5, 3) Reed-Solomon code.

in iterative systems. Figure 4 illustrates the block diagram of

a TLC formed by the serial concatenation of a (7, 5, 3) Reed-Solomon code and a differential encoder transmitted over a 3-

tap ISI channel. Note that the inner SISO treats the differential

encoder and ISI as a single code. Figure 5 illustrates the

performance of this code.

Fig. 4. Block diagram of TLC formed using the (7, 5, 3) RS code in ISIchannel and the associated iterative detection network.

Fig. 5. Performance of the (7, 5, 3) RS TLC with channel coeffecients(1, 1, 1).

III. EXTENSIONS OF THE OPTIMAL SISO

This section is TBD. This section will include all of the

obvious extensions of the RS code that Alex and Tom talked

about: shortened RS codes, lengthening etc. I thought it

might also be useful to present Visha’s code construction and

then say that this type of SISO can be applied to any such

constructed code.

IV. SUBOPTIMAL SISO DECODING OF RS CODES

V. COMPARISON OF THE KOETTER-VARDY SIHO

This section is mostly TBD. Here I wanted to tabulate the

complexity of our SISOs and then compare, in the case of

(n, n!2) and (n, n!4) codes the complexity vs. performancetradeoffs between our SISO and the Koetter-Vardy algorithm. I

have included a few plots in this draft that show our SISO vs.

the KVA. One thing to stress is that the KVA does not lend itself

to a SISO extension - I talked to Ralf Koetter about this at CTW

or an application of the Chase algorithm because although the

KVA is a list decoder, the list of candidate codewords produced

by the algorithm is often just 1 or 2 (this has been shown in

the literature).

VI. CONCLUSIONS AND FUTURE WORK

This section is TBD. One thing to say maybe is that the

top down approach - ie decomposition of RS codes - has been

exhausted and that less complex suboptimal SISOs must be

motivated by the bottom up approach - ie Tanner graphs.

APPENDIX

I think two appendices are in order: 1) an explanation

of the shortest path algorithm used 2) a development of the

complexity measures used for the Koetter-Vardy algorithm - I

worked on this measure last summer for Trellisware

x0,0 (9)

x0,1 (10)

x0,m!1 (11)

x1,0 (12)

x1,1 (13)

x1,m!1 (14)

xn!1,0 (15)

xn!1,1 (16)

xn!1,m!1 (17)

a0,0 (18)

a0,1 (19)

a0,m!1 (20)

a1,0 (21)

a1,1 (22)

a1,m!1 (23)

an!1,0 (24)

an!1,1 (25)

an!1,m!1 (26)

. . . (27)

... (28)

3

The RS factor graph is cycle free. Given a proper activation

schedule the standard message passing algorithm thus per-

forms optimal decoding [3]. One such activation schedule that

ensures optimality is the inward-outward recursion described

above.

D. Simulation Results

Throughout this work BPSK modulation over AWGN chan-

nels is assumed unless otherwise stated. Figure 3 compares the

performance of the optimal SISO to Berlekamp-Massey hard-

decision decoder for the (7, 5, 3) Reed-Solomon code. At aBER of 10!6, the optimal SISO outperforms hard decoding by2.5 dB. The power of the optimal SISO lies in its performance

Fig. 3. Bit error rate performance comparison of the optimal SISO decoderand Berlekamp-Massey decoding of the (7, 5, 3) Reed-Solomon code.

in iterative systems. Figure 4 illustrates the block diagram of

a TLC formed by the serial concatenation of a (7, 5, 3) Reed-Solomon code and a differential encoder transmitted over a 3-

tap ISI channel. Note that the inner SISO treats the differential

encoder and ISI as a single code. Figure 5 illustrates the

performance of this code.

Fig. 4. Block diagram of TLC formed using the (7, 5, 3) RS code in ISIchannel and the associated iterative detection network.

Fig. 5. Performance of the (7, 5, 3) RS TLC with channel coeffecients(1, 1, 1).

III. EXTENSIONS OF THE OPTIMAL SISO

This section is TBD. This section will include all of the

obvious extensions of the RS code that Alex and Tom talked

about: shortened RS codes, lengthening etc. I thought it

might also be useful to present Visha’s code construction and

then say that this type of SISO can be applied to any such

constructed code.

IV. SUBOPTIMAL SISO DECODING OF RS CODES

V. COMPARISON OF THE KOETTER-VARDY SIHO

This section is mostly TBD. Here I wanted to tabulate the

complexity of our SISOs and then compare, in the case of

(n, n!2) and (n, n!4) codes the complexity vs. performancetradeoffs between our SISO and the Koetter-Vardy algorithm. I

have included a few plots in this draft that show our SISO vs.

the KVA. One thing to stress is that the KVA does not lend itself

to a SISO extension - I talked to Ralf Koetter about this at CTW

or an application of the Chase algorithm because although the

KVA is a list decoder, the list of candidate codewords produced

by the algorithm is often just 1 or 2 (this has been shown in

the literature).

VI. CONCLUSIONS AND FUTURE WORK

This section is TBD. One thing to say maybe is that the

top down approach - ie decomposition of RS codes - has been

exhausted and that less complex suboptimal SISOs must be

motivated by the bottom up approach - ie Tanner graphs.

APPENDIX

I think two appendices are in order: 1) an explanation

of the shortest path algorithm used 2) a development of the

complexity measures used for the Koetter-Vardy algorithm - I

worked on this measure last summer for Trellisware

x0,0 (9)

x0,1 (10)

x0,m!1 (11)

x1,0 (12)

x1,1 (13)

x1,m!1 (14)

xn!1,0 (15)

xn!1,1 (16)

xn!1,m!1 (17)

a0,0 (18)

a0,1 (19)

a0,m!1 (20)

a1,0 (21)

a1,1 (22)

a1,m!1 (23)

an!1,0 (24)

an!1,1 (25)

an!1,m!1 (26)

. . . (27)

... (28)

3

The RS factor graph is cycle free. Given a proper activation

schedule the standard message passing algorithm thus per-

forms optimal decoding [3]. One such activation schedule that

ensures optimality is the inward-outward recursion described

above.

D. Simulation Results

Throughout this work BPSK modulation over AWGN chan-

nels is assumed unless otherwise stated. Figure 3 compares the

performance of the optimal SISO to Berlekamp-Massey hard-

decision decoder for the (7, 5, 3) Reed-Solomon code. At aBER of 10!6, the optimal SISO outperforms hard decoding by2.5 dB. The power of the optimal SISO lies in its performance

Fig. 3. Bit error rate performance comparison of the optimal SISO decoderand Berlekamp-Massey decoding of the (7, 5, 3) Reed-Solomon code.

in iterative systems. Figure 4 illustrates the block diagram of

a TLC formed by the serial concatenation of a (7, 5, 3) Reed-Solomon code and a differential encoder transmitted over a 3-

tap ISI channel. Note that the inner SISO treats the differential

encoder and ISI as a single code. Figure 5 illustrates the

performance of this code.

Fig. 4. Block diagram of TLC formed using the (7, 5, 3) RS code in ISIchannel and the associated iterative detection network.

Fig. 5. Performance of the (7, 5, 3) RS TLC with channel coeffecients(1, 1, 1).

III. EXTENSIONS OF THE OPTIMAL SISO

This section is TBD. This section will include all of the

obvious extensions of the RS code that Alex and Tom talked

about: shortened RS codes, lengthening etc. I thought it

might also be useful to present Visha’s code construction and

then say that this type of SISO can be applied to any such

constructed code.

IV. SUBOPTIMAL SISO DECODING OF RS CODES

V. COMPARISON OF THE KOETTER-VARDY SIHO

This section is mostly TBD. Here I wanted to tabulate the

complexity of our SISOs and then compare, in the case of

(n, n!2) and (n, n!4) codes the complexity vs. performancetradeoffs between our SISO and the Koetter-Vardy algorithm. I

have included a few plots in this draft that show our SISO vs.

the KVA. One thing to stress is that the KVA does not lend itself

to a SISO extension - I talked to Ralf Koetter about this at CTW

or an application of the Chase algorithm because although the

KVA is a list decoder, the list of candidate codewords produced

by the algorithm is often just 1 or 2 (this has been shown in

the literature).

VI. CONCLUSIONS AND FUTURE WORK

This section is TBD. One thing to say maybe is that the

top down approach - ie decomposition of RS codes - has been

exhausted and that less complex suboptimal SISOs must be

motivated by the bottom up approach - ie Tanner graphs.

APPENDIX

I think two appendices are in order: 1) an explanation

of the shortest path algorithm used 2) a development of the

complexity measures used for the Koetter-Vardy algorithm - I

worked on this measure last summer for Trellisware

x0,0 (9)

x0,1 (10)

x0,m!1 (11)

x1,0 (12)

x1,1 (13)

x1,m!1 (14)

xn!1,0 (15)

xn!1,1 (16)

xn!1,m!1 (17)

a0,0 (18)

a0,1 (19)

a0,m!1 (20)

a1,0 (21)

a1,1 (22)

a1,m!1 (23)

an!1,0 (24)

an!1,1 (25)

an!1,m!1 (26)

. . . (27)

... (28)

3

The RS factor graph is cycle free. Given a proper activation

schedule the standard message passing algorithm thus per-

forms optimal decoding [3]. One such activation schedule that

ensures optimality is the inward-outward recursion described

above.

D. Simulation Results

Throughout this work BPSK modulation over AWGN chan-

nels is assumed unless otherwise stated. Figure 3 compares the

performance of the optimal SISO to Berlekamp-Massey hard-

decision decoder for the (7, 5, 3) Reed-Solomon code. At aBER of 10!6, the optimal SISO outperforms hard decoding by2.5 dB. The power of the optimal SISO lies in its performance

Fig. 3. Bit error rate performance comparison of the optimal SISO decoderand Berlekamp-Massey decoding of the (7, 5, 3) Reed-Solomon code.

in iterative systems. Figure 4 illustrates the block diagram of

a TLC formed by the serial concatenation of a (7, 5, 3) Reed-Solomon code and a differential encoder transmitted over a 3-

tap ISI channel. Note that the inner SISO treats the differential

encoder and ISI as a single code. Figure 5 illustrates the

performance of this code.

Fig. 4. Block diagram of TLC formed using the (7, 5, 3) RS code in ISIchannel and the associated iterative detection network.

Fig. 5. Performance of the (7, 5, 3) RS TLC with channel coeffecients(1, 1, 1).

III. EXTENSIONS OF THE OPTIMAL SISO

This section is TBD. This section will include all of the

obvious extensions of the RS code that Alex and Tom talked

about: shortened RS codes, lengthening etc. I thought it

might also be useful to present Visha’s code construction and

then say that this type of SISO can be applied to any such

constructed code.

IV. SUBOPTIMAL SISO DECODING OF RS CODES

V. COMPARISON OF THE KOETTER-VARDY SIHO

This section is mostly TBD. Here I wanted to tabulate the

complexity of our SISOs and then compare, in the case of

(n, n!2) and (n, n!4) codes the complexity vs. performancetradeoffs between our SISO and the Koetter-Vardy algorithm. I

have included a few plots in this draft that show our SISO vs.

the KVA. One thing to stress is that the KVA does not lend itself

to a SISO extension - I talked to Ralf Koetter about this at CTW

or an application of the Chase algorithm because although the

KVA is a list decoder, the list of candidate codewords produced

by the algorithm is often just 1 or 2 (this has been shown in

the literature).

VI. CONCLUSIONS AND FUTURE WORK

This section is TBD. One thing to say maybe is that the

top down approach - ie decomposition of RS codes - has been

exhausted and that less complex suboptimal SISOs must be

motivated by the bottom up approach - ie Tanner graphs.

APPENDIX

I think two appendices are in order: 1) an explanation

of the shortest path algorithm used 2) a development of the

complexity measures used for the Koetter-Vardy algorithm - I

worked on this measure last summer for Trellisware

x0,0 (9)

x0,1 (10)

x0,m!1 (11)

x1,0 (12)

x1,1 (13)

x1,m!1 (14)

xn!1,0 (15)

xn!1,1 (16)

xn!1,m!1 (17)

a0,0 (18)

a0,1 (19)

a0,m!1 (20)

a1,0 (21)

a1,1 (22)

a1,m!1 (23)

an!1,0 (24)

an!1,1 (25)

an!1,m!1 (26)

. . . (27)

... (28)

3

The RS factor graph is cycle free. Given a proper activation

schedule the standard message passing algorithm thus per-

forms optimal decoding [3]. One such activation schedule that

ensures optimality is the inward-outward recursion described

above.

D. Simulation Results

Throughout this work BPSK modulation over AWGN chan-

nels is assumed unless otherwise stated. Figure 3 compares the

performance of the optimal SISO to Berlekamp-Massey hard-

decision decoder for the (7, 5, 3) Reed-Solomon code. At aBER of 10!6, the optimal SISO outperforms hard decoding by2.5 dB. The power of the optimal SISO lies in its performance

Fig. 3. Bit error rate performance comparison of the optimal SISO decoderand Berlekamp-Massey decoding of the (7, 5, 3) Reed-Solomon code.

in iterative systems. Figure 4 illustrates the block diagram of a TLC formed by the serial concatenation of a (7, 5, 3) Reed-Solomon code and a differential encoder transmitted over a 3-tap ISI channel. Note that the inner SISO treats the differential encoder and the ISI as a single code. Figure 5 illustrates the performance of this code.

Fig. 4. Block diagram of the TLC formed using the (7, 5, 3) RS code on an ISI channel and the associated iterative detection network.

Fig. 5. Performance of the (7, 5, 3) RS TLC with channel coefficients (1, 1, 1).



V. COMPARISON TO THE KOETTER-VARDY SIHO

For the (n, n-2) and (n, n-4) code families, the complexity versus performance tradeoffs of the proposed SISOs can be compared with those of the Koetter-Vardy algorithm (KVA). One important distinction is that the KVA does not lend itself to a SISO extension or to an application of the Chase algorithm: although the KVA is a list decoder, the list of candidate codewords produced by the algorithm is often of size just one or two, as has been shown in the literature.





A factor graph for the binary image of C is constructed with n_3 parallel Wolf trellises connected by a glue node corresponding to G. The Wolf trellises are pruned so that only paths corresponding to the cosets in T remain. Optimal SISO decoding of the binary image of C is achieved using the inward-outward activation schedule described in Section II-C.

For specific choices of C_1 and C_2 there may exist factor graphs of lower complexity based on generator matrix trellises. For these codes, the n_3 parallel Wolf trellises can be replaced by n_3 · 2^{k_1-k_2} generator matrix trellises because one generator matrix trellis is required for each coset. As an example, consider a construction where C_2 is the (16, 11, 5) extended Hamming code. The Wolf trellis for this code has 32 states while there exists an 8-state generator matrix trellis [?]; it may be more efficient to decode over n_3 · 2^{k_1-k_2} 8-state trellises in parallel than over n_3 32-state trellises in parallel.

IV. SUBOPTIMAL SISO DECODING OF RS CODES

For high-rate Reed-Solomon codes (for example, the (n, n-2, 3) and (n, n-4, 5) code families) the complexity of the optimal SISO decoder presented in Section II is dominated by the size of the glue code. As an example, consider the (127, 125, 3) RS code: the parallel trellises have maximum state space size 128, while there are 2^{14} valid coset configurations in the glue code. The decoding complexity can be dramatically reduced by considering only a subset of likely coset configurations rather than the exhaustive combination and marginalization described in Section II-C. Because the glue code is decoded suboptimally, the reduced-complexity SISO is not optimal. Sections IV-A, IV-B and IV-C detail algorithms for generating lists of likely coset configurations. Simulation results for the suboptimal SISO, both as a standalone decoder and in iterative schemes, are given in Section IV-D.

A. List Decoding of the Glue Code

Let M be the glue code corresponding to the RS code C_RS defined as per (6). Let M' be the binary image of this code with generator matrix G'_M and parity check matrix H'_M.

B. Special Case: The (n, n-2, 3) RS Codes

C. Special Case: The (n, n-4, 5) RS Codes

D. Simulation Results


REFERENCES

[1] O. Aitsab and R. Pyndiah, "Performance of Reed-Solomon block turbo code," in Proc. Globecom Conf., London, 1996, pp. 121–125.

[2] ——, "Performance of concatenated Reed-Solomon/convolutional codes with iterative decoding," in Proc. Globecom Conf., Phoenix, AZ, 1997, pp. 934–938.

[3] S. Crozier, "New high-spread high-distance interleavers for turbo-codes," in Proc. 20th Biennial Symp. Comm., Kingston, ON, Canada, 2000, pp. 3–7.

[4] G. D. Forney, Jr., Concatenated Codes. Cambridge, MA: MIT Press, 1966.

[5] T. R. Halford, "Optimal soft-in soft-out decoding of Reed-Solomon codes," Communication Sciences Institute, USC, Los Angeles, CA, Tech. Rep. CSI-04-05-03, April 2003.

[6] R. Koetter, "Fast generalized minimum distance decoding of algebraic geometric and Reed-Solomon codes," IEEE Trans. Information Theory, vol. 42, pp. 721–737, May 1996.

[7] R. Koetter and A. Vardy, "Algebraic soft-decision decoding of Reed-Solomon codes," IEEE Trans. Information Theory, vol. 49, pp. 2809–2825, November 2003.

[8] V. Ponnampalam and A. Grant, "An efficient SISO algorithm for Reed-Solomon codes," in Proc. 4th Australian Comm. Theory Workshop, February 2003.

[9] J. G. Proakis, Digital Communications, 3rd ed. New York: McGraw-Hill, 1995.

[10] A. Vardy and Y. Be'ery, "Bit level soft-decision decoding of Reed-Solomon codes," IEEE Trans. Communications, vol. 39, pp. 440–444, March 1991.

[11] N. Wiberg, "Codes and decoding on general graphs," Ph.D. dissertation, Linköping University, Sweden, 1996.

[12] S. B. Wicker and V. K. Bhargava, Reed-Solomon Codes and Their Applications. John Wiley & Sons, 1999.

[13] J. K. Wolf, "Efficient maximum-likelihood decoding of linear block codes using a trellis," IEEE Trans. Information Theory, vol. 24, pp. 76–80, 1978.

[Figure 2 labels: information bits a_j^{(i)}, coded bits c_j^{(i)}, Wolf parity check trellises, uncoded bits for systematic trellises, glue code constraint node.]

Fig. 2. Reed-Solomon code factor graph based on the Vardy-Be'ery decomposition.

In [10], Vardy and Be'ery noted that generator matrices with the structure shown in Figure 1 can be found for any code containing a subfield subcode. Accordingly, there exist factor graph representations similar to that shown in Figure 2 for codes containing subfield subcodes. Specifically, such factor graphs can be found for shortened and extended RS codes [19].

D. Optimal SISO Decoding

Using the factor graph shown in Figure 2, the following (non-unique) message passing schedule ensures optimal SISO decoding. An inward recursion is performed on each trellis in parallel, corresponding to the forward recursion of the standard forward-backward algorithm (FBA) on a trellis. The forward state messages for the final state then act as soft input to an optimal SISO decoding of the glue code. The backward state metrics of the trellises are initialized with the soft output of the glue code SISO decoder. The outward recursion on each trellis is then performed in parallel, corresponding to the backward recursion of the standard FBA. After the forward and backward metrics are computed at each trellis state,


soft-out information on coded (and possibly uncoded) bits is obtained as per the FBA.

Optimal SISO decoding of C_RS requires optimal SISO decoding of the glue code. One such optimal SISO decoder uses a trellis. When |T| is small, a trellis need not be used and the SISO decoding of the glue code proceeds as follows. The metric associated with each valid syndrome configuration t ∈ T is computed by combining soft input information on the individual syndromes. The soft output on each syndrome is then found by marginalizing over all configurations consistent with that syndrome. This process is denoted exhaustive combination and marginalization.

E. Complexity of the Optimal SISO

The complexity of SISO decoding of RS codes using a Wolf trellis grows as the number of trellis states [20]:

O(2^{m(n-k)})

For most RS codes, the complexity of the proposed optimal SISO decoder is dominated by the complexity of the glue code SISO decoder. If the glue code is decoded via exhaustive combination and marginalization, then the complexity of the glue code SISO grows as the number of glue codewords: 2^{m(k-k')}. If the glue code is decoded with a trellis, then the complexity of the glue code SISO grows as the number of states in the glue code trellis representation: 2^{m(n-k)}. The complexity of the proposed optimal SISO decoder thus grows as:

O(min(2^{m(n-k)}, 2^{m(k-k')}))

For RS codes where k - k' ≪ n - k, the proposed optimal SISO decoder is much less complex than trellis decoding. This set includes the (7, 3, 5), (7, 5, 3), (15, 9, 7), (15, 11, 5) and (15, 13, 3) codes. For larger RS codes, the complexity of the Wolf trellis and proposed SISO decoders both grow as O(2^{m(n-k)}) and neither decoder is practically realizable, in accordance with the Cut-Set Bound [13], [14]. Moreover, the Cut-Set Bound precludes the existence of any practically realizable optimal SISO decoding algorithm for large RS codes.

A more specific complexity comparison is made by first estimating the number of real operations required by the FBA on an N-stage, S-state Wolf trellis. The forward and backward recursions each require 1 add-compare-select operation, or 3 real operations, per state, per stage [11]. The completion step requires 2 log2 S real operations per stage. Exhaustive combination and marginalization of a length m code requires 2m real operations per codeword. Table I compares the complexity of the proposed optimal SISO and Wolf trellis decoders for a number of RS codes. Complexity is given as the base-2 logarithm of real operations per codeword.
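The operation counts above can be turned into a small calculator. The sketch below assumes N = nm binary stages and takes S to be the maximum state-space size 2^{m(n-k)} (the true Wolf trellis state profile varies by position, so this is only an upper-bound estimate); with these assumptions it reproduces the Wolf trellis column of Table I to the stated precision.

```python
from math import log2

def wolf_trellis_ops(m, n, k):
    """Estimated real operations for the FBA on the Wolf trellis of
    the binary image of an (n, k) RS code over GF(2^m): 3 ops per
    state per stage for each of the forward and backward recursions,
    plus 2*log2(S) completion ops per stage."""
    S = 2 ** (m * (n - k))   # maximum state-space size (upper bound)
    N = n * m                # binary trellis stages
    return N * (6 * S + 2 * log2(S))

def glue_comb_marg_ops(m, k, k_prime):
    """Exhaustive combination and marginalization of the length-m glue
    code: 2m real ops per codeword, 2^(m(k-k')) glue codewords."""
    return 2 * m * 2 ** (m * (k - k_prime))
```

For example, the (7, 5, 3) code (m = 3) yields a base-2 logarithm of about 13.0 for the Wolf trellis, matching Table I.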

III. SUBOPTIMAL SISO DECODING OF RS CODES

For high-rate RS codes, the complexity of the proposed optimal SISO decoder is dominated by the complexity of the optimal glue code SISO decoder. Practically realizable suboptimal SISO decoding algorithms for high-rate RS codes can be

TABLE I
COMPLEXITY COMPARISON OF PROPOSED OPTIMAL SISO AND WOLF TRELLIS DECODERS FOR VARIOUS REED-SOLOMON CODES. COMPLEXITY IS MEASURED BY THE BASE-2 LOGARITHM OF REAL OPERATIONS PER CODEWORD.

m   n    k    k'   Wolf Trellis Complexity   Proposed SISO Complexity
3   7    3    1    19.0                      13.1
3   7    5    4    13.0                      10.0
4   15   9    5    32.5                      19.8
4   15   11   7    24.5                      19.2
4   15   13   11   16.5                      13.0
5   31   25   16   39.9                      38.8
5   31   27   21   29.9                      28.2
5   31   29   26   19.9                      17.5

developed by replacing the optimal glue code SISO decoder by a suboptimal glue code SISO decoder. This correspondence proposes such a decoder that uses a list-based glue code SISO decoder.

List-based SISO decoders for linear block codes have been examined extensively in the literature (see, for example, [21] and the references therein). List-based SISO decoders first produce a list of likely codewords K ⊆ C and then perform the marginalization described in Section II-D over K rather than over all codewords of C. The complexity of list-based SISO decoders depends on K = |K| and can thus be controlled.
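Marginalization over a list rather than the full code can be sketched as follows, again under min-sum assumptions and without the clipping and extrinsic-subtraction details a practical decoder would need.

```python
INF = float("inf")

def list_based_soft_out(candidates):
    """Min-sum soft output from a codeword list K.

    candidates: list of (metric, codeword_bits) pairs.
    For each bit position, returns the difference between the best
    metric among list entries with that bit equal to 1 and the best
    with it equal to 0.  Positions where the list contains only one
    bit value yield +/-inf and would be clipped in practice."""
    n = len(candidates[0][1])
    best = [[INF, INF] for _ in range(n)]
    for metric, bits in candidates:
        for j, b in enumerate(bits):
            best[j][b] = min(best[j][b], metric)
    return [best[j][1] - best[j][0] for j in range(n)]
```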

A. Generic Glue Code List Generation

The generation of K is the most difficult design aspect of list-based SISO decoders [21]. The following presents a generic approach to list generation for the glue code.

As described in Section II-B, codewords of T are BCH syndrome m-tuples (or configurations). Specifically, let w be the syndrome configuration:

w = (s_w^{(1)}, ..., s_w^{(m)}) ∈ S^m   (13)

where S is the set of BCH syndromes. Associated with w is the syndrome configuration metric:

M[w] = Σ_{i=1}^{m} MI[s_w^{(i)}]   (14)

where MI[s_w^{(i)}] is the final state metric corresponding to the syndrome s_w^{(i)} in the i-th parallel BCH trellis². Recall from Section II-B that the set of all syndrome configurations is a superset of the glue code T. Algorithm 1 generates a list of likely codewords by generating a list of likely syndrome configurations and discarding those configurations not contained in T. The next shortest path algorithm described in [23] is used to obtain the likely syndrome configurations; this algorithm was used successfully for list detection in multiple access channels in [24].

²Note that min-sum or min*-sum processing is assumed and combination is achieved via addition of metrics rather than multiplication of probabilities. Metrics are negative logarithms of probabilities [22].


Input: MI[s^{(i)}] for all s ∈ S and i = 1, ..., m.
Output: List of glue codewords with the K shortest syndrome configuration metrics.
K ← ∅;
codewords_found ← 0;
repeat
    w ← syndrome configuration with next smallest syndrome configuration metric;
    if w ∈ T then
        K ← K ∪ {w};
        codewords_found ← codewords_found + 1;
    end
until codewords_found = K;

Algorithm 1: Generic glue code list generation.
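A concrete realization of Algorithm 1 follows. The reference [23] describes a k-shortest-paths algorithm; the sketch below instead uses a simpler lazy best-first enumeration over per-trellis metric-sorted lists (a standard technique that also emits configurations in nondecreasing total metric), so it should be read as an illustration of the control flow rather than the authors' search procedure.

```python
import heapq

def glue_list_min_metric(MI, T, K):
    """Algorithm 1 sketch: emit glue codewords in nondecreasing
    syndrome configuration metric until K are found.

    MI: list of m dicts mapping syndrome -> final-state metric.
    T:  set of valid configurations (m-tuples).  K: list size.
    Returns a list of (metric, configuration) pairs."""
    m = len(MI)
    order = [sorted(d, key=d.get) for d in MI]   # syndromes sorted by metric
    cost = [[MI[i][s] for s in order[i]] for i in range(m)]
    start = (sum(c[0] for c in cost), (0,) * m)
    heap, seen, out = [start], {start[1]}, []
    while heap and len(out) < K:
        total, idx = heapq.heappop(heap)
        w = tuple(order[i][idx[i]] for i in range(m))
        if w in T:                               # keep only valid configurations
            out.append((total, w))
        for i in range(m):                       # children: advance one coordinate
            if idx[i] + 1 < len(order[i]):
                child = idx[:i] + (idx[i] + 1,) + idx[i + 1:]
                if child not in seen:
                    seen.add(child)
                    nxt = total - cost[i][idx[i]] + cost[i][idx[i] + 1]
                    heapq.heappush(heap, (nxt, child))
    return out
```

Note how the search may expand many more than K configurations before K valid codewords are found, which is exactly the inefficiency Section III-B addresses.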

B. Glue Code List Generation for (n, n-2, 3) RS Codes

Algorithm 1 is inefficient because, in order to generate a list of K codewords, many more than K syndrome configurations must be generated. The following presents a reduced-complexity list generation algorithm for the glue codes corresponding to the (n, n-2, 3) RS codes that exploits the algebraic structure of T. The authors have developed similar reduced-complexity glue code list generation algorithms for the glue codes corresponding to the (n, n-4, 5) RS codes and the (15, 9, 7) code [19]; these are omitted for the sake of brevity.

Let C_RS be an (n = 2^m - 1, n-2, 3) RS code with roots α and α², where α is primitive in GF(2^m). Since α^{2^m} = α in GF(2^m), the union of the sets of cyclotomic conjugates of α and α² over GF(2) is {α, α², ..., α^{2^{m-1}}}. The associated BCH code C_BCH thus has dimension:

k' = k - (m - 2) = n - m   (15)

For the (n, n-2, 3) RS codes, the glue code T is, therefore, a length m code defined over GF(2^m) with roots {α, α²}. Since |T| = 2^{m(k-k')} = 2^{m(m-2)}, the dimension of the glue code is m - 2 and T is a shortened (n, n-2, 3) RS code.

As per the example of Section II-B, let G_T generate T

over GF(2^m) and let G_{T'} be the binary image of G_T with columns reordered so that bits from each syndrome are grouped together. For the specific (n, n-2, 3) RS codes considered in this correspondence, binary generator matrices that are systematic in the bits corresponding to t^{(1)}, ..., t^{(m-2)} were obtained.

Define a syndrome sub-configuration, w|_{m-2}, as the first m - 2 elements of a syndrome configuration w ∈ S^m:

w|_{m-2} = (s_w^{(1)}, ..., s_w^{(m-2)}) ∈ S^{m-2}   (16)

Associated with w|_{m-2} is the syndrome sub-configuration metric:

M[w|_{m-2}] = Σ_{i=1}^{m-2} MI[s_w^{(i)}]   (17)

where MI[s_w^{(i)}] is defined as per Section III-A. Algorithm 2 generates a list of likely codewords by generating a list of likely syndrome sub-configurations and exploits the structure of T to determine a glue code codeword corresponding to each sub-configuration. Note that only K sub-configurations need be generated since each sub-configuration generates a single codeword. Also note that the lists of codewords produced by Algorithms 1 and 2 may not be identical since Algorithm 2 uses metrics from only m - 2 trellises in its shortest path search.

Input: MI[s^{(i)}] for all s ∈ S and i = 1, ..., m-2.
Output: List of glue codewords with the K shortest syndrome sub-configuration metrics.
K ← ∅;
for k = 1, ..., K do
    w|_{m-2} ← syndrome sub-configuration with next smallest syndrome sub-configuration metric;
    w'|_{m-2} ← binary image of w|_{m-2};
    t' ← w'|_{m-2} G_{T'};
    t ← glue code codeword corresponding to t';
    K ← K ∪ {t};
end

Algorithm 2: List generation for (n, n-2, 3) RS codes.
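The distinguishing step of Algorithm 2 is the GF(2) encoding of each sub-configuration's binary image through the systematic generator G_{T'}. The sketch below makes that step concrete; the sub-configuration search reuses the same lazy best-first enumeration as before, and the mapping `to_bits` and the generator rows are hypothetical placeholders, since the actual binary images depend on the code at hand.

```python
import heapq

def list_gen_n_minus_2(MI_sub, to_bits, G_rows, K):
    """Algorithm 2 sketch for the (n, n-2, 3) glue codes.

    MI_sub:  m-2 dicts mapping syndrome -> final-state metric.
    to_bits: hypothetical map from a syndrome sub-configuration to its
             binary image (the information bits fed to G_T').
    G_rows:  rows of a (hypothetical) binary generator matrix G_T'.
    Exactly K sub-configurations are expanded; each yields one glue
    codeword, returned here as its binary image t'."""
    mm = len(MI_sub)
    order = [sorted(d, key=d.get) for d in MI_sub]
    cost = [[MI_sub[i][s] for s in order[i]] for i in range(mm)]
    heap = [(sum(c[0] for c in cost), (0,) * mm)]
    seen, out = {heap[0][1]}, []
    while heap and len(out) < K:
        total, idx = heapq.heappop(heap)
        w_sub = tuple(order[i][idx[i]] for i in range(mm))
        u = to_bits(w_sub)                    # binary image w'|_{m-2}
        t = [0] * len(G_rows[0])
        for j, bit in enumerate(u):           # t' = w'|_{m-2} G_T' over GF(2)
            if bit:
                t = [a ^ b for a, b in zip(t, G_rows[j])]
        out.append((total, tuple(t)))
        for i in range(mm):                   # next-best sub-configurations
            if idx[i] + 1 < len(order[i]):
                child = idx[:i] + (idx[i] + 1,) + idx[i + 1:]
                if child not in seen:
                    seen.add(child)
                    nxt = total - cost[i][idx[i]] + cost[i][idx[i] + 1]
                    heapq.heappush(heap, (nxt, child))
    return out
```

Unlike Algorithm 1, no membership test is needed: every expanded sub-configuration encodes to a valid codeword, so exactly K expansions suffice.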

Since RS codes are MDS over GF(2^m), Algorithm 2 can readily be adapted to generate a list of likely RS codewords by considering configurations of GF(2^m) symbols and the corresponding symbol-level soft information. The resulting list-based SISO decoder is denoted the standard list-based SISO in the following section since it does not exploit the Vardy-Be'ery decomposition.

C. Simulation Results and Discussion

In this section, the performance of the proposed suboptimal SISO decoder is compared to that of both the proposed optimal SISO decoder and the standard list-based SISO decoder. Note that the list generation algorithms described in Section III-B are used rather than Algorithm 1. Binary antipodal signaling over AWGN channels is assumed throughout.

Figure 3 compares the performance of the three algorithms when used as stand-alone decoders for the (15, 13, 3) and (15, 11, 5) codes. The glue codes of the (15, 13, 3) and (15, 11, 5) codes contain 256 and 65536 codewords, respectively. A negligible performance loss is incurred by the proposed suboptimal decoders when respective glue code list sizes of 32 and 1024 are used. Observe that the standard list-based SISO decoder with list size 1024 incurs a 0.5 dB loss with respect to the proposed algorithms. Generating 1024 codewords of the (15, 11, 5) glue code, which is a length 4 code over GF(256), is substantially less complex than generating 1024 codewords of the (15, 11, 5) code over GF(16).

The proposed suboptimal SISO was also compared to the proposed optimal SISO for the (31, 29, 3) and (63, 61, 3) codes. It was found that respective glue code list sizes of 64 and 128 were required in order to approximate optimal performance for these codes.


[Figure 3: codeword error rate versus Eb/No (dB) over 3.5-7 dB, error rates 10^-4 to 10^-2; curves for (15,13) Optimal, (15,13) Suboptimal (32), (15,13) Std. List (64), (15,11) Optimal, (15,11) Suboptimal (1024) and (15,11) Std. List (1024).]

Fig. 3. Codeword error rate performance comparison of the proposed optimal SISO, proposed suboptimal SISO and standard list-based SISO for the (15, 13, 3) and (15, 11, 5) RS codes. List sizes appear in parentheses.

In order to investigate the quality of soft-out information produced by the proposed suboptimal SISO decoder, its performance was compared to that of the proposed optimal and standard list-based SISO decoders in an iteratively decoded system. Specifically, a rate 13/17 (15, 13, 3) × (15, 13, 3) Reed-Solomon turbo product code was considered with input block length 2704 bits. A high-spread pseudo-random bit-level interleaver was constructed using the real-relaxation optimization method described in [25].

Figure 4 illustrates the performance of the iterative turbo product decoder after 10 iterations using five different RS SISO decoders: the proposed optimal SISO, the proposed suboptimal SISO with glue code list sizes 32 and 64, and the standard list-based SISO with list sizes 256 and 1024. With a list size of 64, the decoder employing the suboptimal SISO incurs a negligible loss with respect to the decoder employing the optimal SISO. With a list size of 256, the decoder employing the standard list-based SISO performs approximately 0.3 dB worse than the decoder employing the optimal SISO at a bit error rate of 10^{-4}; increasing the list size to 1024 narrows, but does not close, this performance gap. Generating 64 codewords of the (15, 13, 3) glue code, which is a length 4 code defined over GF(16), is much less complex than generating 1024 codewords of the (15, 13, 3) code over GF(16).

IV. CONCLUSIONS AND FUTURE WORK

In this correspondence, an optimal soft-in soft-out decoding algorithm for Reed-Solomon codes has been proposed. The proposed optimal SISO decoder employs a cycle-free graphical representation that is an alternative to conventional trellis-based decoding. As predicted by the Cut-Set Bound, the proposed optimal SISO is of reasonable complexity only for

[Figure 4: bit error rate versus Eb/No (dB) over 2.4-3.4 dB, error rates 10^-5 to 10^-2; curves for Optimal, Suboptimal (32), Suboptimal (64), Std. List (256) and Std. List (1024).]

Fig. 4. Bit error rate performance comparison of the proposed optimal SISO, proposed suboptimal SISO and standard list-based SISO for the rate 13/17 (15, 13, 3) × (15, 13, 3) RS turbo product code. Ten decoding iterations were performed. List sizes appear in parentheses.

small, high-rate codes. Suboptimal SISO decoding algorithms for RS codes were thus motivated.

A suboptimal SISO decoder for high-rate RS codes that exploits Vardy and Be'ery's decomposition of the binary image of RS codes [10] was also proposed. This suboptimal SISO was found to outperform a standard list-based SISO decoding algorithm as a stand-alone decoder and as a constituent SISO decoder in a turbo product code. Furthermore, the complexity of glue code list generation is less than that of standard list generation for the RS code because the glue codes have length m whereas the full RS codes have length 2^m - 1. An interesting area for future work is to investigate the use of soft-in soft-out ordered statistics decoding [15] of the glue code and to compare the resulting suboptimal SISO with a SISO employing ordered statistics decoding of the full RS codes.

The proposed suboptimal algorithm is of practically realizable complexity only for very high-rate codes. However, very high-rate RS codes are highly relevant as component codes in iteratively decodable systems. The Cut-Set Bound precludes the existence of practically realizable optimal SISO decoders for large Reed-Solomon codes, and the search for effective graphical models with cycles, and thus effective suboptimal SISO decoders, for RS codes remains an interesting open problem.

V. ACKNOWLEDGEMENTS

The first author wishes to acknowledge the support of TrellisWare Technologies Inc., where a portion of this work was completed as part of an internship in the summer of 2003.

REFERENCES

[1] V. Ponnampalam and A. Grant, "An efficient SISO for Reed-Solomon codes," in Proc. IEEE Symposium on Information Theory, Yokohama, Japan, 2003.


[2] T. R. Halford, “Optimal soft-in soft-out decoding of Reed-Solomoncodes,” Communication Sciences Institute, USC, Los Angeles, CA,Tech. Rep. CSI-04-05-03, April 2003.

[3] S. B. Wicker and V. K. Bhargava, Reed-Solomon Codes and TheirApplications. John Wiley & Sons, 1999.

[4] J. G. Proakis, Digital Communications, 3rd ed. New York: McGraw-Hill, 1995.

[5] R. Koetter and A. Vardy, “Algebraic soft-decision decoding of Reed-Solomon codes,” IEEE Trans. Information Theory, vol. 49, pp. 2809–2825, November 2003.

[6] F. Parvaresh and A. Vardy, “Multiplicity assignments for algebraicsoft-decoding of Reed-Solomon codes,” in Proc. IEEE Symposium onInformation Theory, July 2003, p. 205.

[7] M. El-Khamy, R. J. McEliece, and J. Harel, “Performance enhance-ments for algebraic soft decision decoding of Reed-Solomon codes,” inProc. IEEE Symposium on Information Theory, 2004, p. 419.

[8] C. Y. Liu and S. Lin, “Turbo encoding and decoding of Reed-Solomoncodes through binary decomposition and self-concatenation,” IEEETrans. Commununication, vol. 52, no. 9, pp. 1484–1493, September2004.

[9] S. K. Shin and P. Sweeney, “Soft decision decoding of Reed-Solomoncodes using trellis methods,” IEE Proceedings-Communications, vol.141, no. 5, pp. 303–308, October 1994.

[10] A. Vardy and Y. Be’ery, “Bit level soft-decision decoding of Reed-Solomon codes,” IEEE Trans. Commununication, vol. 39, pp. 440–444,March 1991.

[11] V. Ponnampalam and B. Vucetic, “Soft decision decoding of Reed-Solomon codes,” IEEE Trans. Commununication, vol. 50, pp. 1758–1768, November 2002.

[12] F. R. Kschischang, B. J. Frey, and H.-A. Loeliger, “Factor graphs andthe sum-product algorithm,” IEEE Trans. Information Theory, vol. 47,no. 2, pp. 498–519, 2001.

[13] N. Wiberg, “Codes and decoding on general graphs,” Ph.D. dissertation,Linkoping University (Sweden), 1996.

[14] G. D. Forney, Jr., “Codes on graphs: Normal realizations,” IEEETrans. Information Theory, vol. 47, no. 2, pp. 520–548, February 2001.

[15] M. P. C. Fossorier and S. Lin, “Soft-input soft-output decoding of linearblock codes based on ordered statistics,” in Proc. Globecom Conf.,Sydney, Australia, November 1998, pp. 2828–2833.

[16] J. Jiang and K. R. Narayanan, “Iterative soft decision decoding of Reed-Solomon codes based on adaptive parity check matrices,” in Proc. IEEESymposium on Information Theory, Chicago, Illinois, June 2004, p. 261.

[17] R. M. Pyndiah, “Near optimum decoding of product codes: block turbocodes,” IEEE Trans. Commununication, vol. 46, pp. 1003–1010, August1998.

[18] J. K. Wolf, “Efficient maximum-likelihood decoding of linear blockcodes using a trellis,” IEEE Trans. Information Theory, vol. 24, pp.76–80, 1978.

[19] T. R. Halford, “Systematic extraction of good cyclic graphical modelsfor inference,” Communication Sciences Institute, USC, Los Angeles,CA, Ph.D. Dissertation Proposal, Tech. Rep. CSI-05-01-02, November2004.

[20] S. Lin, T. Kasami, T. Fujiwara, and M. Fossorier, Trellises and Trellis-Based Decoding Algorithms for Linear Block Codes. Norwell, MS:Kluwer Academic Publishers, 1998.

[21] P. A. Martin, D. P. Taylor, and M. P. C. Fossorier, “Soft-input soft-output list-based decoding algorithm,” IEEE Trans. Information Theory,vol. 52, no. 2, pp. 252–262, February 2004.

[22] K. M. Chugg, A. Anastasopoulos, and X. Chen, Iterative Detection:Adaptivity, Complexity Reduction, and Applications. Kluwer AcademicPublishers, 2001.

[23] D. Eppstein, “Finding the k shortest paths,” Dept. Info. and Comp. Sci.,University of California, Irvine, CA, Tech. Rep., March 1997.

[24] A. B. Reid, A. J. Grant, and P. D. Alexander, “List detection for multi-access channels,” in Proc. Globecom Conf., vol. 2, November 2002, pp.1083–1087.

[25] S. Crozier, “New high-spread high-distance interleavers for turbo-codes,” in Proc. 20th Biennial Symp. Comm., Kingston, ON, Canada, 2000, pp. 3–7.

Thomas R. Halford (S’01) received the B.A.Sc. degree in Engineering Physics from Simon Fraser University, Burnaby, Canada in 2001. He is currently a doctoral candidate at the University of Southern California where his research focuses primarily on graphical models of codes. He spent the summer of 2005 visiting the Natural Language Processing Group at IBM’s T. J. Watson Research Center.

Vishakan Ponnampalam (S’97–M’05) received B.E. and Ph.D. degrees from the University of Sydney in 1996 and 2001, respectively. Since 2003, Vishakan has been a Technical Associate with IPWireless Ltd., UK. Prior to that, he held Research Engineer roles at Cambridge Positioning Systems in Australia and the Center for Wireless Communications in Singapore. His research interests include forward error correction, multiuser detection, MIMO processing and channel estimation applied to wireless communication systems.

Alex J. Grant (S’93–M’96–SM’03) received the B.E. and Ph.D. degrees from the University of South Australia in 1993 and 1996 respectively. In 1997, he was a post-doctoral research fellow at the Laboratory for Signal and Information Processing, Swiss Federal Institute of Technology (ETH), Zurich. Since March 1998 he has been with the Institute for Telecommunication Research (ITR), where he is now Research Professor of Information Theory. Alex’s research interests include information theory and its applications to multiple user and other communications problems. Alex served as Chairman for the IEEE SA/ACT/Vic Sections Information Theory Society Joint Chapter (2000–2003). He served as technical co-chair for the 2001 IEEE Information Theory Workshop and is general co-chair for the 2005 IEEE International Symposium on Information Theory.

Keith M. Chugg (S’88–M’95) received the B.S. degree (high distinction) in Engineering from Harvey Mudd College, Claremont, CA in 1989 and the M.S. and Ph.D. degrees in Electrical Engineering (EE) from the University of Southern California (USC), Los Angeles, CA in 1990 and 1995, respectively. During the 1995–1996 academic year he was an Assistant Professor with the Electrical and Computer Engineering Dept., The University of Arizona, Tucson, AZ. In 1996 he joined the EE Dept. at USC, where he is currently an Associate Professor.

His research interests are in the general areas of signaling, detection, and estimation for digital communication and data storage systems. He is also interested in architectures for efficient implementation of the resulting algorithms. Along with his former Ph.D. students, A. Anastasopoulos and X. Chen, he is co-author of the book Iterative Detection: Adaptivity, Complexity Reduction, and Applications, published by Kluwer Academic Publishers. He is a co-founder of TrellisWare Technologies, Inc., where he is Chief Scientist. He has served as an associate editor for the IEEE Transactions on Communications and was Program Co-Chair for the Communication Theory Symposium at Globecom 2002. He received the Fred W. Ellersick award for the best unclassified paper at MILCOM 2003.