
1 / 23

Digital Communications III (ECE 154C)

Introduction to Coding and Information Theory

Tara Javidi

These lecture notes were originally developed by the late Prof. J. K. Wolf.

UC San Diego

Spring 2014

Discrete Memoryless Channels

Outline:

• Discrete Memoryless Channel: Basic Definition
• Capacity of DMC
• Channel Coding
• Capacity: Examples
• Computing Capacity

2 / 23

Discrete Memoryless Channel


3 / 23

X_1 ... X_N −→ [DMC] −→ Y_1 ... Y_N

P_{Y_1 \ldots Y_N | X_1 \ldots X_N}(y_1, \ldots, y_N | x_1, \ldots, x_N) = P_{Y|X}(y_1|x_1) P_{Y|X}(y_2|x_2) \cdots P_{Y|X}(y_N|x_N)

From the memoryless property of the channel, the "Single Input – Single Output" representation is sufficiently informative:

X −→ [DMC] −→ Y

P_{Y|X}(y|x)
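To make the definition concrete, here is a small simulation sketch (not part of the original notes; the transition matrix is an assumed example, a BSC with crossover probability 0.1). It represents a DMC by its single-letter transition matrix P_{Y|X} and applies it to a block of N inputs, drawing each output independently given its own input:

```python
import numpy as np

rng = np.random.default_rng(0)

# A DMC is fully described by its transition matrix P[x, y] = P_{Y|X}(y|x).
# Assumed example: a binary symmetric channel with crossover probability 0.1.
P = np.array([[0.9, 0.1],
              [0.1, 0.9]])

def dmc(x_block, P, rng):
    """Pass a block x_1, ..., x_N through the DMC.

    Memorylessness means each y_n is drawn from the row P[x_n, :],
    independently of all other channel uses.
    """
    return np.array([rng.choice(P.shape[1], p=P[x]) for x in x_block])

x = np.array([0, 1, 1, 0, 1])
print(dmc(x, P, rng))   # e.g. [0 1 1 0 1], with occasional flipped digits
```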

Channel Capacity: Fundamental Limits


4 / 23

Maximum Mutual Information


5 / 23

=⇒ An important property of a DMC is its mutual information, I(X;Y).

=⇒ But in order to calculate I(X;Y) we need to know P_{X,Y}(x,y) = P_{Y|X}(y|x) P_X(x).

=⇒ Thus, in order to calculate I(X;Y) one has to specify an input distribution P_X(x).

=⇒ The capacity, C, of a DMC is the maximum I(X;Y) that can be achieved over all input distributions:

C = \max_{P_X(x)} I(X;Y)
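As a concrete illustration (a sketch, not part of the original notes), I(X;Y) for a given input distribution can be computed directly from P_X and the channel matrix P_{Y|X}; the BSC below with p = 0.1 is just an assumed example:

```python
import numpy as np

def mutual_information(p_x, p_y_given_x, base=2.0):
    """I(X;Y) for input pmf p_x and channel matrix p_y_given_x[x, y] = P_{Y|X}(y|x)."""
    p_x = np.asarray(p_x, dtype=float)
    W = np.asarray(p_y_given_x, dtype=float)
    p_xy = p_x[:, None] * W                 # joint pmf P_{X,Y}(x, y)
    p_y = p_xy.sum(axis=0)                  # output pmf P_Y(y)
    mask = p_xy > 0                         # convention: 0 log 0 = 0
    ratio = p_xy[mask] / (p_x[:, None] * p_y)[mask]
    return float(np.sum(p_xy[mask] * np.log(ratio)) / np.log(base))

# Assumed example: BSC with crossover probability 0.1 and a uniform input
p = 0.1
bsc = np.array([[1 - p, p],
                [p, 1 - p]])
print(mutual_information([0.5, 0.5], bsc))  # ≈ 0.531 bits; C is the maximum of this over P_X
```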

Channel Capacity: Intuition


6 / 23

• Recall that I(X;Y) = H(X) - H(X|Y).
• Recall the interpretation that
  ◦ For any random variable X, H(X) measures the randomness or uncertainty about X.
  ◦ Z_y = X|_{Y=y} is a new random variable with (conditional) pmf P_{X|Y}(x|Y = y).
  ◦ H(X|Y) is the entropy of this random variable averaged over the choices of Y = y.
• Mutual information is nothing but the average reduction in the randomness about X after Y is observed.
• Channel capacity is the maximum such reduction in uncertainty when one can design X:

C = \max_{P_X(x)} I(X;Y)

Multiple Input – Multiple Output


7 / 23

=⇒ One can show that for a DMC

\max_{P_{X_1 \cdots X_N}(x_1, \ldots, x_N)} \frac{1}{N} I(X_1, \ldots, X_N; Y_1, \ldots, Y_N) = \max_{P_X(x)} I(X;Y)

=⇒ How?

Channel Coding and Capacity


8 / 23

Coding For A Binary Input DMC


9 / 23

W −→ [Channel Encoder] −→ x (m-vector) −→ [Binary-Input DMC] −→ y (m-vector) −→ [Channel Decoder] −→ Ŵ

• Code Rate: R < 1 (at most one digit transmitted per channel use)

• Message: An integer, denoted by W, between 1 and 2^{mR} (equivalently, a binary vector of length mR)

Assumption: Messages occur with equal probabilities,

P[W = i] = \frac{1}{2^{mR}}, \quad for all i = 1, 2, \ldots, 2^{mR}

Coding For A Binary Input DMC


10 / 23

W −→ [Channel Encoder] −→ x (m-vector) −→ [Binary-Input DMC] −→ y (m-vector) −→ [Channel Decoder] −→ Ŵ

• Block Code (Binary): A collection of 2^{mR} binary vectors of length m

• Channel Decoder: Chooses the most likely code word (or equivalently the most likely message) based upon the received vector y.

• Error Probability: Probability that the message produced by the decoder is not equal to the original message, i.e. P{Ŵ ≠ W}
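To illustrate the decoding rule (a sketch, not part of the original notes; the tiny code book, block length m = 3, and crossover probability are made-up examples), a maximum-likelihood decoder for a binary block code over a BSC picks the codeword with the highest likelihood, which for p < 1/2 is the codeword closest to y in Hamming distance:

```python
import numpy as np

# Hypothetical example: two codewords of length m = 3
codebook = np.array([[0, 0, 0],
                     [1, 1, 1]])

def ml_decode(y, codebook, p):
    """Index of the most likely codeword for received vector y over a BSC(p).

    The likelihood of codeword c is p^d * (1-p)^(m-d), where d is the Hamming
    distance between c and y; for p < 1/2 this is maximized by the codeword
    at minimum Hamming distance from y.
    """
    y = np.asarray(y)
    d = (codebook != y).sum(axis=1)                  # Hamming distances
    m = codebook.shape[1]
    loglik = d * np.log(p) + (m - d) * np.log(1 - p)
    return int(np.argmax(loglik))

print(ml_decode([0, 1, 0], codebook, p=0.1))   # -> 0, i.e. the codeword 000
```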

Channel Coding Theorem


11 / 23

W −→ [Channel Encoder] −→ x (m-vector) −→ [Binary-Input DMC] −→ y (m-vector) −→ [Channel Decoder] −→ Ŵ

Channel Coding Theorem:

Let R < C (base 2). For m large enough, there exists an encoder and a decoder such that P{Ŵ ≠ W} < ε for any ε > 0.

Computation of Capacity for Well-known Channels


12 / 23

Capacity of BSC


13 / 23

EXAMPLE: Binary Symmetric Channel (BSC)

Placeholder Figure A

I_2(X;Y) = H_2(Y) - H_2(Y|X)

H_2(Y|X) = \sum_{x=0}^{1} \sum_{y=0}^{1} P_X(x) P_{Y|X}(y|x) \log_2 \frac{1}{P_{Y|X}(y|x)}

         = \sum_{x=0}^{1} P_X(x) \left( (1-p) \log_2 \frac{1}{1-p} + p \log_2 \frac{1}{p} \right)

         = h_2(p)

where h_2(p) := (1-p) \log_2 \frac{1}{1-p} + p \log_2 \frac{1}{p}.

Capacity of BSC


14 / 23

EXAMPLE: Binary Symmetric Channel (BSC)

On the other hand, H_2(Y) \leq \log_2 2 with equality iff P[Y=0] = P[Y=1] = 1/2.

But note that if P_X(0) = P_X(1) = 1/2, then P_Y(0) = P_Y(1) = 1/2.

In other words,

C = \max_{P_X(x)} I(X;Y) = \log_2 2 - h_2(p) = 1 - h_2(p).
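A quick numerical check of the formula C = 1 - h_2(p) (a sketch, not part of the original notes):

```python
import numpy as np

def h2(p):
    """Binary entropy in bits, with h2(0) = h2(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return float(-p * np.log2(p) - (1 - p) * np.log2(1 - p))

def bsc_capacity(p):
    """Capacity of a BSC with crossover probability p, in bits per channel use."""
    return 1.0 - h2(p)

print(bsc_capacity(0.0))   # 1.0: a noiseless binary channel
print(bsc_capacity(0.1))   # ≈ 0.531
print(bsc_capacity(0.5))   # 0.0: the output is independent of the input
```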

Capacity of BEC


15 / 23

EXAMPLE: Binary Erasure Channel (BEC)

Placeholder Figure A

I_2(X;Y) = H_2(X) - H_2(X|Y)

Assume P[X = 0] = α and P[X = 1] = 1 - α (so that P_Y(2) = p, P_Y(0) = α(1-p), and P_Y(1) = (1-α)(1-p), where output 2 denotes the erasure).

Hence, H_2(X) = α \log_2 \frac{1}{α} + (1-α) \log_2 \frac{1}{1-α} = h_2(α).

On the other hand,

H_2(X|Y) = \sum_{y=0}^{2} P_Y(y) \underbrace{\sum_{x=0}^{1} P_{X|Y}(x|y) \log_2 \frac{1}{P_{X|Y}(x|y)}}_{H(X|Y=y)}

Capacity of BEC


16 / 23

Also,

H(X|Y=0) = \sum_{x=0}^{1} P_{X|Y}(x|0) \log_2 \frac{1}{P_{X|Y}(x|0)} = 0

H(X|Y=1) = \sum_{x=0}^{1} P_{X|Y}(x|1) \log_2 \frac{1}{P_{X|Y}(x|1)} = 0

H(X|Y=2) = \sum_{x=0}^{1} P_{X|Y}(x|2) \log_2 \frac{1}{P_{X|Y}(x|2)} = α \log_2 \frac{1}{α} + (1-α) \log_2 \frac{1}{1-α} = h_2(α)

Hence, H_2(X|Y) = p\, h_2(α), and

C = \max_α \left[ h_2(α) - p\, h_2(α) \right] = (1-p) \max_α h_2(α) = 1 - p.
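As a sanity check (a sketch, not part of the original notes), one can evaluate I(X;Y) = (1-p) h_2(α) on a grid of input distributions and confirm that the maximum is 1 - p, attained at α = 1/2; p = 0.3 is an assumed example value:

```python
import numpy as np

def h2(a):
    """Binary entropy in bits, elementwise, with the convention 0 log 0 = 0."""
    a = np.asarray(a, dtype=float)
    out = np.zeros_like(a)
    m = (a > 0) & (a < 1)
    out[m] = -a[m] * np.log2(a[m]) - (1 - a[m]) * np.log2(1 - a[m])
    return out

p = 0.3                                   # erasure probability (assumed example)
alpha = np.linspace(0.0, 1.0, 10001)      # candidate input distributions, P[X = 0] = alpha
I = (1 - p) * h2(alpha)                   # I(X;Y) = h2(alpha) - p*h2(alpha), from the slide
best = np.argmax(I)
print(alpha[best], I[best])               # -> 0.5, 0.7   (i.e. C = 1 - p)
```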

Capacity of a Z-Channel


17 / 23

EXAMPLE:

Placeholder Figure A

Recall I_2(X;Y) = H_2(X) - H_2(X|Y) = H_2(Y) - H_2(Y|X).

Assume P[X = 0] = α and P[X = 1] = 1 - α.

Hence, H_2(X) = α \log_2 \frac{1}{α} + (1-α) \log_2 \frac{1}{1-α} = h_2(α).

It is easier here to work with the second form. For this Z-channel, input 0 is flipped with probability 1/2 while input 1 is received noiselessly, so H_2(Y|X) = α\, h_2(1/2) and P_Y(0) = α/2. Therefore

I_2(X;Y) = H_2(Y) - H_2(Y|X) = h_2(α/2) - α\, h_2(1/2)

         = \frac{α}{2} \log_2 \frac{2}{α} + \left(1 - \frac{α}{2}\right) \log_2 \frac{1}{1 - α/2} - α\, h_2(1/2)

Capacity of a Z-Channel


18 / 23

Equivalently, in nats,

I_e(X;Y) = \frac{α}{2} \ln \frac{2}{α} + \left(1 - \frac{α}{2}\right) \ln \frac{1}{1 - α/2} - α\, h_e(1/2).

We maximize I_e(X;Y) over α (the same α maximizes I_2(X;Y)):

0 = \frac{d I_e(X;Y)}{dα} = \frac{1}{2} \ln \frac{2}{α} - \frac{α}{2} \cdot \frac{1}{α} + \frac{1}{2} \ln\left(1 - \frac{α}{2}\right) + \left(1 - \frac{α}{2}\right) \frac{1/2}{1 - α/2} - \underbrace{h_e(1/2)}_{\ln 2}

  = \frac{1}{2} \ln 2 - \frac{1}{2} \ln α - \frac{1}{2} + \frac{1}{2} + \frac{1}{2} \ln\left(1 - \frac{α}{2}\right) - \ln 2

This means

\frac{1}{2} \ln \frac{α}{1 - α/2} = -\frac{1}{2} \ln 2 = \frac{1}{2} \ln \frac{1}{2} \;\Longrightarrow\; \frac{1 - α/2}{α} = 2 \;\Longrightarrow\; α = \frac{2}{5}

C_2 = I_2(X;Y)\big|_{α=2/5} = \frac{1}{5} \log_2 5 + \frac{4}{5} \log_2 \frac{5}{4} - \frac{2}{5} = \log_2 5 - 2 ≈ 0.3219
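A numerical cross-check (a sketch, not part of the original notes) that α = 2/5 maximizes I_2(X;Y) = h_2(α/2) - α h_2(1/2) and that the maximum is log_2 5 - 2 ≈ 0.3219 bits:

```python
import numpy as np

def h2(a):
    """Binary entropy in bits, elementwise, with the convention 0 log 0 = 0."""
    a = np.asarray(a, dtype=float)
    out = np.zeros_like(a)
    m = (a > 0) & (a < 1)
    out[m] = -a[m] * np.log2(a[m]) - (1 - a[m]) * np.log2(1 - a[m])
    return out

alpha = np.linspace(0.0, 1.0, 100001)     # P[X = 0] = alpha
I = h2(alpha / 2) - alpha * 1.0           # I2(X;Y) from the slide; h2(1/2) = 1 bit
best = np.argmax(I)
print(alpha[best], I[best])               # -> 0.4, ≈ 0.3219  (= log2(5) - 2)
```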

Capacity of a 5-ary Symmetric Channel


19 / 23

EXAMPLE:

Placeholder Figure A

C = \max_{P_X(x)} I(X;Y) = I(X;Y)\big|_{P[X=x]=\frac{1}{5},\; x=0,1,2,3,4}

  = \left[ H(Y) - H(Y|X) \right]_{P[X=x]=\frac{1}{5},\; x=0,1,2,3,4}

(By the symmetry of the channel, the uniform input distribution achieves the maximum.)

If the inputs are equally likely, the outputs are equally likely. Thus,

H(Y) = \log 5.

So what remains is to compute H(Y|X)\big|_{P[X=x]=\frac{1}{5},\; x=0,1,2,3,4}.

Capacity of a 5-ary Symmetric Channel


20 / 23

For any value of the input, there are two possible outputs, and they are equally likely. In other words,

H(Y|X = x) = \log 2, \quad x = 0, \ldots, 4.

Hence,

H(Y|X) = \log 2.

And

C = \log 5 - \log 2 = \log \frac{5}{2}

Note: one can also measure the capacity in 5-ary digits, in which case it is simply C_5 = \log_5 \frac{5}{2}.
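A small numerical check (a sketch, not part of the original notes). It assumes the transition structure implied above and by the code table on the next slide: each input x produces output x or x+1 (mod 5) with probability 1/2 each:

```python
import numpy as np

# Transition matrix: input x goes to output x or x+1 (mod 5), each with probability 1/2
P = np.zeros((5, 5))
for x in range(5):
    P[x, x] = 0.5
    P[x, (x + 1) % 5] = 0.5

p_x = np.full(5, 1 / 5)                        # uniform (capacity-achieving) input
p_y = p_x @ P                                  # output pmf: also uniform
H_Y = -np.sum(p_y * np.log2(p_y))              # = log2(5)
logP = np.log2(P, where=P > 0, out=np.zeros_like(P))
H_Y_given_X = -np.sum(p_x[:, None] * P * logP) # = log2(2)
C = H_Y - H_Y_given_X
print(C)                   # ≈ 1.3219 bits = log2(5/2) per channel use
print(C / np.log2(5))      # ≈ 0.5693 5-ary digits = log5(5/2) per channel use
```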

A Simple Code for the 5-ary Symmetric Channel


21 / 23

Code Words        Outputs
00        −→      00  01  10  11
12        −→      12  13  22  23
24        −→      24  20  34  30
31        −→      31  32  41  42
43        −→      43  44  03  04

The decoder works in reverse (and never gets confused!).

There are 5 code words, and each code word can represent one 5-ary digit, or \log_2 5 binary digits. Since each code word takes 2 uses of the channel, the rate of the code is 1/2 = 0.5 5-ary digits (\frac{1}{2} \log_2 5 = 1.1609 binary digits) per channel use.

Now note that this rate is below the capacity of the channel: \frac{1}{2} \log_2 5 = 1.1609 \leq \log_2 \frac{5}{2} = 1.322 (or equivalently, 1/2 \leq \log_5 \frac{5}{2}).
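For concreteness, here is a tiny encoder/decoder sketch for this code (not part of the original notes). It relies on the property shown in the table: each channel use turns the sent digit x into x or x+1 (mod 5), and the output sets of the five codewords do not overlap, so the decoder is never confused:

```python
# The five codewords: each is a pair of 5-ary digits (2 channel uses per message)
CODEWORDS = [(0, 0), (1, 2), (2, 4), (3, 1), (4, 3)]

def encode(message):
    """Map a message index in {0, ..., 4} to its two-digit codeword."""
    return CODEWORDS[message]

def decode(received):
    """Return the unique message whose codeword can produce the received pair.

    Each received digit is either the sent digit c or c+1 (mod 5), and the
    output sets of distinct codewords are disjoint, so at most one codeword matches.
    """
    y1, y2 = received
    for message, (c1, c2) in enumerate(CODEWORDS):
        if y1 in (c1, (c1 + 1) % 5) and y2 in (c2, (c2 + 1) % 5):
            return message
    raise ValueError("received pair is not reachable from any codeword")

# All four possible outputs of codeword 24 decode back to message 2
print([decode(y) for y in [(2, 4), (2, 0), (3, 4), (3, 0)]])   # -> [2, 2, 2, 2]
```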

Computation of Capacity


22 / 23

Finding the Capacity Using A Computer


23 / 23

• For an arbitrary DMC, one cannot, in general, use analytic techniques to find the channel capacity.

• Instead, one can use a computer to search for the input distribution, P_X(x), that maximizes I(X;Y).

• There is a special algorithm, called the Blahut–Arimoto algorithm, for doing this. For simple cases, a brute-force search can be used.

EXAMPLE:

Placeholder Figure A

1. Use a computer to evaluate I(X;Y) = f(α)
2. Optimize with respect to α
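The notes only name the algorithm, so the following is a minimal sketch of the standard Blahut–Arimoto alternating maximization (an illustrative version, not taken from these notes), applied to the Z-channel from the earlier example:

```python
import numpy as np

def blahut_arimoto(W, iters=500):
    """Estimate the capacity (in bits) of a DMC with transition matrix W[x, y] = P(Y=y|X=x).

    Alternates between the posterior q(x|y) induced by the current input pmf r(x)
    and the input pmf that maximizes the resulting lower bound.
    Returns (capacity_estimate, input_pmf)."""
    W = np.asarray(W, dtype=float)
    r = np.full(W.shape[0], 1.0 / W.shape[0])          # start from the uniform input pmf
    for _ in range(iters):
        q = r[:, None] * W
        q /= q.sum(axis=0, keepdims=True)              # q[x, y] = P(X=x | Y=y)
        with np.errstate(divide="ignore", invalid="ignore"):
            t = np.where(W > 0, W * np.log(q), 0.0)    # terms of sum_y W(y|x) ln q(x|y)
        r = np.exp(t.sum(axis=1))
        r /= r.sum()                                   # updated input pmf
    p_y = r @ W
    with np.errstate(divide="ignore", invalid="ignore"):
        kl = np.where(W > 0, W * np.log2(W / p_y), 0.0).sum(axis=1)
    return float(np.sum(r * kl)), r                    # I(X;Y) at the final input pmf

# Z-channel from the earlier example: input 0 is flipped with probability 1/2, input 1 is noiseless
Z = np.array([[0.5, 0.5],
              [0.0, 1.0]])
C, r = blahut_arimoto(Z)
print(C, r)   # -> ≈ 0.3219 bits and input pmf ≈ [0.4, 0.6], matching the hand calculation
```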