Turbo Source Coding


Page 1: Turbo Source Coding


Institute of Communication Systems and Data Processing

Prof. Dr.-Ing. Peter Vary

Turbo Source Coding

Laurent Schmalen and Peter Vary

FlexCode Public Seminar

June 16, 2008

Page 2: Turbo Source Coding


Outline

• FlexCode in a nutshell

• Entropy coding

• Turbo coding and decoding

• Application of Turbo codes as source codes

• A joint source-channel coding scheme with iterative decoding for compression

• Possible Application to FlexCode

Page 3: Turbo Source Coding


Who?

Orange (France Telecom)

RWTH Aachen

KTH

Nokia

Ericsson

Page 4: Turbo Source Coding


The Problem

• Heterogeneity of networks increasing

• Networks inherently variable (mobile users)

• But:

– Coders not designed for specific environment

– Coders inflexible (codebooks and FEC)

– Feedback channel underutilized

Page 5: Turbo Source Coding

Adaptation and Coding

[Figure: coders ranked from rigid to flexible; labels include PCM, CELP with FEC, AMR, and FlexCode, with attributes such as fixed model, fixed quantizer, set of models, and adaptive coding]

Page 6: Turbo Source Coding

Conventional Transmission

• Transmitter

• Receiver

[Block diagrams of the transmitter and receiver of the conventional transmission chain]

Page 7: Turbo Source Coding


Lossless Source Coding

• Prominent examples of lossless entropy coders

– Huffman coding

– Lempel-Ziv coding

– Arithmetic coding

• Example: Huffman code

– AACABA → 0 0 110 0 10 0

– No bit pattern is a prefix of another

– Unambiguous decoding

  Symbol S   P(S)    Bit pattern
  A          0.5     0
  B          0.3     10
  C          0.15    110
  D          0.05    111
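To make the prefix-free property concrete, the following minimal Python sketch (illustrative only, not part of the slides) encodes and decodes the example sequence with the codebook from the table above:

```python
# Minimal sketch of prefix-free encoding/decoding with the Huffman codebook above.
CODEBOOK = {"A": "0", "B": "10", "C": "110", "D": "111"}

def encode(symbols, codebook=CODEBOOK):
    return "".join(codebook[s] for s in symbols)

def decode(bits, codebook=CODEBOOK):
    inverse = {code: symbol for symbol, code in codebook.items()}
    decoded, buffer = [], ""
    for bit in bits:
        buffer += bit
        if buffer in inverse:          # prefix-free: the first match is unambiguous
            decoded.append(inverse[buffer])
            buffer = ""
    return "".join(decoded)

assert encode("AACABA") == "001100100"   # matches the slide: 0 0 110 0 10 0
assert decode("001100100") == "AACABA"
```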


Page 10: Turbo Source Coding


Lossless Source Coding

• Usability of lossless source codes

• High sensitivity against transmission errors

– Huffman code: synchronization loss

– Arithmetic code: selection of a wrong interval, complete decoding failure

• Very strong channel codes required

– Error floor, i.e., rare bit errors leading to decoding failures

• Iterative source-channel decoding schemes for entropy codes

– High complexity

– Difficult to apply to arithmetic coding [Guionnet 04]

Page 11: Turbo Source Coding


Lossless Source Coding

• Wanted:

– A flexible compression scheme (entropy coder) which

• Has performance similar to known compression schemes

• Is robust against transmission errors

• Can instantaneously adapt to varying channel conditions by trading compression ratio against error robustness

– Analogy between channel codes and source codes

• A good channel code is often also a good source code

• Use of LDPC codes for compression [Caire 03]

– Can Turbo codes be used for compression?

Page 12: Turbo Source Coding


Turbo Codes, Concept

Concatenated Encoding

parallel scheme [Berrou 93]

serial scheme [Benedetto 98]

Page 13: Turbo Source Coding


Turbo Codes, Concept

Concatenated Encoding

parallel scheme [Berrou 93]

serial scheme [Benedetto 98]

Iterative Turbo Decoding

advantages by iterative feedback!


Page 15: Turbo Source Coding


Turbo Coding & Decoding

• Turbo Code Encoder and Decoder [Berrou 93]

[Block diagram of the Turbo encoder and decoder: each constituent block is an (almost) arbitrary channel encoder, an interleaver connects the two branches, and the coded bits are transmitted over the channel]


Page 17: Turbo Source Coding


Turbo Coding & Decoding

• Turbo Decoding (1st iteration, 1st step)

[Figure: soft output probability densities of Decoder 1 and Decoder 2]

Page 18: Turbo Source Coding


Turbo Coding & Decoding

• Turbo Decoding (1st iteration, 1st step)

[Figure: soft output probability densities of Decoder 1 and Decoder 2]

Transmission over AWGN channel: noisy symbols = data symbols + additive Gaussian noise
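As an illustration of this figure, here is a minimal sketch (assuming BPSK mapping 0 → +1, 1 → −1 and a receiver that knows the noise standard deviation sigma; names are illustrative) of how the noisy symbols and the channel LLRs used by the decoders could be produced:

```python
import numpy as np

rng = np.random.default_rng(0)

def awgn_channel_llr(bits, sigma):
    """BPSK over AWGN: x = 1 - 2*b, y = x + n; channel LLR = 2*y / sigma^2 for equiprobable bits."""
    x = 1.0 - 2.0 * np.asarray(bits, dtype=float)    # data symbols (+1 / -1)
    y = x + sigma * rng.standard_normal(x.shape)     # noisy symbols after additive Gaussian noise
    llr = 2.0 * y / sigma**2                         # log P(y | b=0) / P(y | b=1)
    return y, llr

bits = rng.integers(0, 2, size=10)
noisy, llr = awgn_channel_llr(bits, sigma=0.8)
```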


Page 21: Turbo Source Coding


Turbo Coding & Decoding

• Turbo Decoding (1st iteration, 2nd step)

[Figure: soft output probability densities of Decoder 1 and Decoder 2]


Page 23: Turbo Source Coding

Turbo Coding & Decoding

• Turbo Decoding (2nd iteration, 1st step)

[Figure: soft output probability densities of Decoder 1 and Decoder 2]


Page 26: Turbo Source Coding

Turbo Coding & Decoding

• Turbo Decoding (2nd iteration, 2nd step)

[Figure: soft output probability densities of Decoder 1 and Decoder 2]


Page 29: Turbo Source Coding

Turbo Coding & Decoding

• Turbo Decoding (n-th iteration, 1st step)

[Figure: soft output probability densities of Decoder 1 and Decoder 2]


Page 31: Turbo Source Coding

Turbo Coding & Decoding

• Turbo Decoding (n-th iteration, 2nd step)

[Figure: soft output probability densities of Decoder 1 and Decoder 2; the densities become increasingly concentrated as the iterations proceed]
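The animation summarized above illustrates the turbo principle: two constituent decoders repeatedly exchange extrinsic information, and the soft outputs become more reliable from iteration to iteration. The following structural sketch of that decoding loop assumes two soft-in/soft-out decoder functions siso1 and siso2 (returning extrinsic LLRs) and an interleaver permutation pi; all names are illustrative and not taken from the slides:

```python
import numpy as np

def turbo_decode(llr_ch1, llr_ch2, siso1, siso2, pi, iterations=10):
    """Generic parallel turbo decoding loop: exchange extrinsic LLRs between two
    SISO decoders; pi is the interleaver permutation (an index array)."""
    ext1 = np.zeros_like(llr_ch1)
    ext2 = np.zeros_like(llr_ch1)                    # a priori info for decoder 1
    for _ in range(iterations):
        ext1 = siso1(llr_ch1, apriori=ext2)          # decoder 1: channel values + feedback
        ext2_int = siso2(llr_ch2, apriori=ext1[pi])  # decoder 2 works on interleaved bits
        ext2 = np.empty_like(ext2_int)
        ext2[pi] = ext2_int                          # de-interleave before feeding back
    return llr_ch1 + ext1 + ext2                     # a posteriori LLRs of the info bits
```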


Page 33: Turbo Source Coding


Turbo Principle - Interleaver Design

• Block interleaver vs. random interleaver

– Example: propagation of a single information bit

– Block interleaver (50 x 50): [figure]

– Random interleaver: [figure]

• Better information distribution with random interleavers

• Careful interleaver design required
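A minimal sketch (illustrative only) of the two interleaver types compared above, each expressed as a permutation of bit positions:

```python
import numpy as np

def block_interleaver(rows=50, cols=50):
    """Write the bits row by row into a (rows x cols) array and read them out column by column."""
    idx = np.arange(rows * cols).reshape(rows, cols)
    return idx.T.reshape(-1)                 # deterministic, regular permutation

def random_interleaver(length, seed=0):
    """Pseudo-random permutation of the same length (the seed must be known at the receiver)."""
    return np.random.default_rng(seed).permutation(length)

pi_block = block_interleaver()               # 50 x 50 block interleaver from the slide
pi_random = random_interleaver(50 * 50)
# Neighbouring input bits remain on a regular grid after block interleaving,
# whereas the random interleaver spreads them irregularly over the whole block.
```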


Page 35: Turbo Source Coding


Turbo Source Coding

• Can Turbo codes be used for entropy coding?

• Yes!

– Turbo codes for compressing binary memoryless sources [Garcia-Frias 02]

– Only a fraction of the output bits is transmitted, such that the overall coding rate is > 1

– The decoder has to take the source statistics into account (unequal distribution of bits)

– Source statistics can be estimated at the decoder
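Since the source bits are biased, the decoder can exploit P(bit = 1) as a priori knowledge. A minimal sketch (the standard textbook definition, not the specific estimator of [Garcia-Frias 02]; names are illustrative) of the resulting a priori LLR and a simple way to estimate the bit statistics:

```python
import numpy as np

def apriori_llr(p1):
    """A priori LLR of a biased binary source: L_a = ln(P(bit=0) / P(bit=1))."""
    return np.log((1.0 - p1) / p1)

def estimate_p1(bits):
    """Crude estimate of P(bit = 1) from already available (decoded) bits."""
    return float(np.mean(bits))

print(apriori_llr(0.1))    # strongly biased source: large positive LLR favouring bit 0
```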


Page 38: Turbo Source Coding

Transmission with Turbo Source Coding

• Transmitter

• Receiver

Mapping of quantizer reproduction levels to bit patterns:

0 → 000
1 → 001
2 → 010
...
7 → 111
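A minimal sketch of this natural binary index assignment for Q = 8 reproduction levels (illustrative only):

```python
def index_to_bits(index, m=3):
    """Natural binary mapping of a quantizer reproduction level index to an m-bit pattern."""
    return [(index >> k) & 1 for k in reversed(range(m))]

assert index_to_bits(0) == [0, 0, 0]
assert index_to_bits(2) == [0, 1, 0]
assert index_to_bits(7) == [1, 1, 1]
```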

Page 39: Turbo Source Coding


Turbo Source Coding

• Turbo coder, channel coding rate 1/3 (1 bit → 3 bits)

• Before puncturing:

• After puncturing (compression ratio 0.5, 1 bit → 0.5 bit)


Page 42: Turbo Source Coding


Turbo Source Coding

• Puncturing unit corresponds to a binary erasure channel

• Adapt puncturing w.r.t. source statistics

• Theoretical minimum rate: source entropy

• Realization of puncturing

– Regular puncturing

– Pseudo-random puncturing

• Puncturing has to be known at the receiver

• Adaptively increase the number of transmitted bits with increasing channel noise
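A minimal sketch of the puncturing step (the pseudo-random variant, seeded so that the receiver can reproduce the pattern); the rate-1/3 encoder itself is omitted and all names are illustrative:

```python
import numpy as np

def puncture(coded_bits, compression_ratio, seed=0):
    """Keep only a fraction of the rate-1/3 encoder output.

    coded_bits        : encoder output, 3 bits per source bit
    compression_ratio : transmitted bits per *source* bit, e.g. 0.5
    """
    n_source = len(coded_bits) // 3
    n_keep = int(round(compression_ratio * n_source))
    rng = np.random.default_rng(seed)                 # pattern must be known at the receiver
    keep = np.sort(rng.choice(len(coded_bits), size=n_keep, replace=False))
    return np.asarray(coded_bits)[keep], keep         # punctured positions act as erasures

coded = np.random.default_rng(1).integers(0, 2, size=300)   # stand-in for encoder output
transmitted, pattern = puncture(coded, compression_ratio=0.5)
print(len(transmitted))                                      # 50 bits for 100 source bits
```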

Page 43: Turbo Source Coding


Turbo Source Coding

• Simulation results for a binary source [Garcia-Frias 02]

• Comparison with standard Unix compression tools compress, gzip, and bzip2

Achieved compression ratios vs. source entropy H(X):

  H(X)     Turbo   compress   gzip   bzip2
  0.7219   0.87    0.98       0.80   0.96
  0.6098   0.75    0.83       0.72   0.83
  0.4690   0.58    0.65       0.60   0.67
  0.2864   0.38    0.41       0.41   0.44
  0.0808   0.16    0.16       0.16   0.15

Page 44: Turbo Source Coding


Turbo Source Coding

• Advantages:

– Robustness against transmission errors

– If the channel quality drops, the puncturing can be changed in order to transmit more parity bits

– High flexibility by adapting the puncturing on the fly

• Disadvantages:

– Higher computational cost than conventional entropy coding schemes

– Lossless compression is not guaranteed; the Turbo decoder might not decode every bit correctly

– Difficult to adapt to varying parameter statistics for use in speech and audio codecs

Page 45: Turbo Source Coding


Lossless Turbo Source Coding

• Lossless Turbo source coding [Hagenauer 04]

• Adapt the puncturing such that lossless decoding is possible

• Analysis-by-Synthesis encoder:

– Change the puncturing and test whether the data can be decoded without error

– Small amount of side information required
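A structural sketch of such an analysis-by-synthesis loop (only the control flow suggested by the slide, not the exact algorithm of [Hagenauer 04]; encode, puncture, and decode are assumed helper functions and all names are illustrative):

```python
def encode_lossless(source_bits, encode, puncture, decode, rate_schedule):
    """Decremental redundancy: reduce the number of transmitted bits step by step and
    keep the lowest rate for which a local decoding test is still error-free."""
    coded = encode(source_bits)
    best = coded                                     # fallback: transmit everything
    for rate in rate_schedule:                       # e.g. a decreasing list of rates
        candidate, pattern = puncture(coded, rate)
        if decode(candidate, pattern) == list(source_bits):
            best = candidate                         # still lossless, try an even lower rate
        else:
            break                                    # first failure: keep the previous result
    return best
```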

Page 46: Turbo Source Coding


Non-Binary Sources

• Turbo Source Coding only for binary sources

• Feasible only if the bit pattern after source coding has low entropy H(X) < 1

• Extension towards non-binary sources

– Utilization of non-binary Turbo codes [Zhao 02]

– Utilization of special binary LDPC codes [Zhong 05]

– Utilization of non-binary LDPC codes [Potluri 07]

• Joint source-channel coding approach

"However, any redundancy in the source will usually help if it is utilized at the receiving point. [...] redundancy will help combat noise." (Shannon, 1948)

Page 47: Turbo Source Coding


Iterative Source-Channel Decoding

• Approach: enhance the robustness of the transmission of variable length codes

• Iterative Source-Channel Decoding (ISCD) of variable length codes (VLC) [Guyader 01]

– Combats adverse effects of channel noise

– Exploits the structure and redundancy of variable length codes

– Achieves near-capacity system performance

– Works quite well for Huffman codes

– Difficult to adapt to arithmetic codes

– Very high computational complexity

– Limited flexibility!


Page 49: Turbo Source Coding


Iterative Source-Channel Decoding

• Iterative Source-Channel Decoding (ISCD) for fixed-length codes (FLC) [Adrat 01]

– Combats adverse effects of channel noise

– Exploits residual source redundancy

– Achieves near-capacity overall system performance [Clevorn 06]

• Can also be used effectively for compression [Thobaben 08]

– Leave all redundancy in the source symbols

– Utilization of redundant bit mappings

– Puncture the output of the convolutional code in order to obtain compression

Page 50: Turbo Source Coding


Turbo Codes, Concept

Concatenated Encoding

parallel scheme [Berrou 93]

serial scheme [Benedetto 98]

Iterative Turbo Decoding

advantages by iterative feedback!



Page 54: Turbo Source Coding

Transmission with ISCD

• Transmitter

• Receiver

Mapping of quantizer reproduction levels to bit patterns with additional redundancy, e.g., a parity check bit:

0 → 000 0
1 → 001 1
2 → 010 1
...
7 → 111 1
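A minimal sketch of this redundant index assignment: the natural 3-bit mapping of the earlier sketch extended by one even-parity bit, which reproduces the patterns listed above (illustrative only):

```python
def index_to_bits_spc(index, m=3):
    """Natural binary mapping plus a single (even) parity check bit appended."""
    bits = [(index >> k) & 1 for k in reversed(range(m))]
    return bits + [sum(bits) % 2]

assert index_to_bits_spc(0) == [0, 0, 0, 0]
assert index_to_bits_spc(1) == [0, 0, 1, 1]
assert index_to_bits_spc(2) == [0, 1, 0, 1]
assert index_to_bits_spc(7) == [1, 1, 1, 1]
```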


Page 56: Turbo Source Coding


Soft Decision Source Decoding

Exploitation of residual redundancy for quality improvement

Calculate extrinsic feedback information using source statistics of order 0 and 1 and bit mapping redundancy [Adrat 01]

[Block diagram with blocks Quantization & Bit mapping, Channel Decoder, and Soft Decision Source Decoder; exploited redundancy: 2D a priori knowledge (parameter correlation), 1D a priori knowledge (parameter distribution), and bit pattern redundancy (e.g., parity check bits)]
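A minimal sketch of the soft decision source decoding step for a single parameter, combining the bit LLRs delivered by the channel decoder with 1D a priori knowledge (the parameter distribution); the 2D extension using parameter correlation works analogously. The sign convention (positive LLR favours bit 0) and all names are assumptions for illustration:

```python
import numpy as np

def symbol_posteriors(llr, p_prior, bit_patterns):
    """Posterior probabilities of the Q quantizer levels of one parameter.

    llr          : LLRs of the M bits of this parameter (positive favours bit 0)
    p_prior      : 1D a priori distribution P(level), length Q
    bit_patterns : (Q, M) array with the bit pattern assigned to each level
    """
    p_bit0 = 1.0 / (1.0 + np.exp(-np.asarray(llr, dtype=float)))     # P(bit = 0) per position
    p_bits = np.where(np.asarray(bit_patterns) == 0, p_bit0, 1.0 - p_bit0)
    post = np.asarray(p_prior) * np.prod(p_bits, axis=1)             # channel info x source prior
    return post / post.sum()

# Example with the parity-extended mapping of the earlier sketch (Q = 8, M = 4):
patterns = np.array([[(i >> 2) & 1, (i >> 1) & 1, i & 1, bin(i).count("1") % 2] for i in range(8)])
prior = np.full(8, 1.0 / 8)
print(symbol_posteriors([2.0, -1.0, 0.5, 1.5], prior, patterns))
```

In the iterative loop, only the extrinsic part of this information (obtained by removing the channel contribution of each bit) would be fed back to the channel decoder.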

Page 57: Turbo Source Coding


ISCD Source Compression

• Simulation example

• System components

– Scalar quantization with Q levels

– Single parity check code index assignment

– R > 1 convolutional code, random puncturing [Thobaben 07]

[Block diagram of the encoder: K parameters are quantized and mapped to bits, which are encoded and punctured to the transmitted bits]

Page 58: Turbo Source Coding


Iterative Source-Channel Decoding

• Simulation results for simple experiment

– Gauss-Markov source (AR(1) process)

– Lloyd-Max Quantization with Q = 16 levels

– 25 decoding iterations

– Unoptimized standard system components!

Achieved compression ratios vs. source correlation:

  ρ      H(Ut | Ut-1) / 4   ISCD   compress   gzip   bzip2
  0.95   0.55               0.62   0.73       0.71   0.62
  0.9    0.66               0.74   0.87       0.80   0.72
  0.85   0.72               0.78   0.94       0.83   0.78
  0.8    0.76               0.83   0.99       0.86   0.82
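For the experiment above, a minimal sketch of the Gauss-Markov (AR(1)) test source with unit variance and correlation coefficient rho (illustrative only; Lloyd-Max quantization and the coder are not shown):

```python
import numpy as np

def gauss_markov(n, rho, seed=0):
    """u[k] = rho * u[k-1] + sqrt(1 - rho^2) * w[k] with w ~ N(0, 1), so Var(u) stays 1."""
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(n)
    u = np.zeros(n)
    for k in range(1, n):
        u[k] = rho * u[k - 1] + np.sqrt(1.0 - rho ** 2) * w[k]
    return u

u = gauss_markov(100_000, rho=0.9)
print(np.corrcoef(u[:-1], u[1:])[0, 1])    # close to the chosen rho = 0.9
```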

Page 59: Turbo Source Coding


Possible Application in FlexCode

• Entropy coding for constrained entropy quantization

• Utilization of entropy coding, e.g., arithmetic coding

• Can be replaced by the presented compression scheme

[Figure: entropy coding stage operating on the quantizer reproduction levels]

Page 60: Turbo Source Coding


Possible Application in FlexCode

• Entropy coding for constrained entropy quantization

• Flexible adaptation to varying channel conditions by adapting the puncturing

Page 61: Turbo Source Coding


Conclusions

• Channel codes can be used for compression

• Turbo codes for compressing binary sources

• Iterative Source-Channel Decoding for fixed-length codes

• Joint Source-Channel Coding with Iterative Decoding for Source Compression

• Promising results already with unoptimized system components

Page 62: Turbo Source Coding


References

[Guyader 01] A. Guyader, E. Fabre, C. Guillemot, and M. Robert, "Joint source-channel turbo decoding of entropy-coded sources," IEEE Journal on Sel. Areas in Comm., vol. 19, no. 9, pp. 1680–1696, Sept. 2001.

[Adrat 01] M. Adrat, J.-M. Picard, P. Vary, "Soft-Bit Source Decoding Based on the Turbo Principle," IEEE Vehicular Technology Conference (VTC-Fall), Atlantic City, Oct. 2001.

[Potluri 07] M. Potluri, S. Chilumuru, S. Kambhampati, K. R. Namuduri, "Distributed Source Coding Using Non-Binary LDPC Codes for Sensor Network Applications," Canadian Workshop on Information Theory, June 2007.

[Zhong 05] W. Zhong and J. García-Frías, "Compression of Non-Binary Sources Using LDPC Codes," CISS, Baltimore, Maryland, March 2005.

[Zhao 02] Y. Zhao, J. Garcia-Frias, "Data Compression of Correlated Non-Binary Sources Using Punctured Turbo Codes," IEEE Data Compression Conference (DCC), 2002.

[Hagenauer 04] J. Hagenauer, J. Barros, A. Schaefer, "Lossless Turbo Source Coding with Decremental Redundancy," ITG Conference on Source and Channel Coding (SCC), Erlangen, 2004.

[Garcia-Frias 02] J. Garcia-Frias, Y. Zhao, "Compression of Binary Memoryless Sources Using Punctured Turbo Codes," IEEE Comm. Letters, vol. 6, no. 9, September 2002.

[Benedetto 98] S. Benedetto, D. Divsalar, G. Montorsi, F. Pollara, "Analysis, Design, and Iterative Decoding of Double Serially Concatenated Codes with Interleavers," IEEE Journal on Sel. Areas in Comm., vol. 16, no. 2, February 1998.

[Berrou 93] C. Berrou, A. Glavieux, P. Thitimajshima, "Near Shannon Limit Error-Correcting Coding and Decoding: Turbo Codes," International Conference on Communications, Geneva, Switzerland, May 1993.

[Caire 03] G. Caire, S. Shamai, and S. Verdú, "Universal Data Compression with LDPC Codes," Third International Symposium on Turbo Codes and Related Topics, Brest, France, September 2003.

[Guionnet 04] T. Guionnet and C. Guillemot, "Soft and joint source-channel decoding of quasi-arithmetic codes," EURASIP Journal on Applied Signal Processing, vol. 2004, no. 3, pp. 393–411, Mar. 2004.

Page 63: Turbo Source Coding


References

[Thobaben 07] R. Thobaben, "A New Transmitter Concept for Iteratively-Decoded Source-Channel Coding Schemes," IEEE SPAWC, Helsinki, June 2007.

[Thobaben 08] R. Thobaben, L. Schmalen, P. Vary, "Joint Source-Channel Coding with Inner Irregular Codes," International Symposium on Information Theory (ISIT), Toronto, Canada, July 2008.

[Clevorn 06] T. Clevorn, L. Schmalen, P. Vary, "On the Optimum Performance Theoretically Attainable for Scalarly Quantized Correlated Sources," International Symposium on Information Theory and its Applications (ISITA), Seoul, Korea, Oct. 2006.

Page 64: Turbo Source Coding


Institute of Communication Systems and Data Processing

Prof. Dr.-Ing. Peter Vary

Turbo Source Coding

Laurent Schmalen and Peter Vary

http://www.flexcode.eu