
Correspondence

Correlated Sources Help Transmission Over an Arbitrarily Varying Channel

Rudolf Ahlswede and Ning Cai

Abstract—It is well known that the deterministic code capacity (for the average error probability criterion) of an arbitrarily varying channel (AVC) either equals its random code capacity or zero. Here it is shown that if two components of a correlated source are additionally available to the sender and receiver, respectively, the capacity always equals its random code capacity.

Index Terms—Arbitrarily varying channel, common randomness, positivity of average error capacity, side information.

I. INTRODUCTION AND RESULT

A well-known dichotomy [1] in the behavior of the deterministic code capacity (for the average error probability criterion) $C(\mathcal{W})$ for an arbitrarily varying channel (AVC) with a set of transmission matrices $\mathcal{W}$ is that either $C(\mathcal{W}) = 0$ or else $C(\mathcal{W}) = C_R(\mathcal{W})$, the random code capacity

$$\min_{W \in \overline{\mathcal{W}}} C(W)$$

where $C(W)$ is the ordinary capacity of the discrete memoryless channel (DMC) $W$ and $\overline{\mathcal{W}}$ is the convex closure of $\mathcal{W}$. It can happen that $C_R(\mathcal{W}) > 0$ and $C(\mathcal{W}) = 0$.

We prove here that in the presence of a correlated source $(U^n, V^n)_{n=1}^{\infty}$ (which is independent of the message) with $I(U \wedge V) > 0$, and when the sender has access to $U^n$ and the receiver has access to $V^n$, it is not possible that $C_R(\mathcal{W}) > 0$ and $C(\mathcal{W}) = 0$.

Clearly, if $(U^n)_{n=1}^{\infty}$ and $(V^n)_{n=1}^{\infty}$ have positive common randomness $\mathrm{CR}(U, V)$, then the result is readily established by the elimination technique of [1]. Here the common randomness $\mathrm{CR}(U, V)$ of a correlated source $(U^n, V^n)_{n=1}^{\infty}$ is defined as the maximal real $R$ such that for all $\varepsilon > 0$ and sufficiently large $n$ there exist functions

$$\Phi\colon \mathcal{U}^n \to \{1, 2, \cdots, 2^{n(R - \varepsilon)}\}$$

and

$$\Psi\colon \mathcal{V}^n \to \{1, 2, \cdots, 2^{n(R - \varepsilon)}\}$$

with $\Pr(\Phi(U^n) \neq \Psi(V^n)) < \varepsilon$ (cf. [2]).

So the interesting case arises when

a) the common randomness $\mathrm{CR}(U, V)$ of the correlated source is $0$, and

b) $C(\mathcal{W}) = 0$ but $C_R(\mathcal{W}) > 0$.

Now we know not only that a positive $\mathrm{CR}(U, V)$ can help to make the channel capacity positive, but also, from [3], that a positive channel capacity $C(W)$ can be used to make the common randomness positive. However, we are confronted with the situation where both quantities are $0$!

Actually, we can always find binary-valued functions $f$ and $g$ so that $(f(U_t), g(V_t))_{t=1}^{\infty}$ satisfies $I(f(U) \wedge g(V)) > 0$ if $I(U \wedge V) > 0$.
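As a numerical illustration of this binarization claim (ours, not from the paper), the following Python sketch exhaustively searches binary quantizers $f$ and $g$ of a joint distribution $P_{UV}$ and checks that some pair already has positive mutual information; the distribution `P_UV` below is a made-up example.

```python
import itertools
import numpy as np

def mutual_information(P):
    """Mutual information I(X ^ Y) in bits for a joint pmf matrix P."""
    Px = P.sum(axis=1, keepdims=True)
    Py = P.sum(axis=0, keepdims=True)
    mask = P > 0
    return float((P[mask] * np.log2(P[mask] / (Px @ Py)[mask])).sum())

def best_binary_quantizers(P_UV):
    """Search all binary maps f: U -> {0,1}, g: V -> {0,1} and return
    the largest achievable I(f(U) ^ g(V))."""
    a, b = P_UV.shape
    best = 0.0
    for f in itertools.product([0, 1], repeat=a):
        for g in itertools.product([0, 1], repeat=b):
            Q = np.zeros((2, 2))
            for u in range(a):
                for v in range(b):
                    Q[f[u], g[v]] += P_UV[u, v]
            best = max(best, mutual_information(Q))
    return best

# Hypothetical ternary correlated source with I(U ^ V) > 0.
P_UV = np.array([[0.20, 0.05, 0.05],
                 [0.05, 0.20, 0.05],
                 [0.05, 0.05, 0.30]])
print(mutual_information(P_UV))        # I(U ^ V) > 0
print(best_binary_quantizers(P_UV))    # some binary pair keeps it > 0
```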

Manuscript received December 25, 1995; revised December 12, 1996. The material in this correspondence was presented in part at the Information Theory Meeting, Oberwolfach, Germany, February 1996, and presented in full at the World Congress of the Bernoulli Society, Vienna, Austria, August 1996.

The authors are with the Fakultät für Mathematik, Universität Bielefeld, Postfach 100131, D-33501 Bielefeld, Germany.

Publisher Item Identifier S 0018-9448(97)03450-0.

So we can assume that we are given a binary correlated source with alphabets $\mathcal{U} = \mathcal{V} = \{0, 1\}$. We obtain the following result.

Theorem: For an AVC $\mathcal{W}$ let the sender observe $U^n = U_1, \cdots, U_n$ and let the receiver observe $V^n = V_1, \cdots, V_n$, where $(U^n, V^n)_{n=1}^{\infty}$ is a memoryless correlated source (which is independent of the message) with a generic pair of RV's $(U, V)$ having mutual information $I(U \wedge V) > 0$. Then the capacity $C(\mathcal{W}, (U, V))$ for deterministic codes and the average error criterion equals the random code capacity $C_R(\mathcal{W})$.

II. CODE CONCEPTS AND AN AUXILIARY CHANNEL

An $(n, M, \lambda)$ code is a system

$$\{(g_m^n(u^n), D_m(v^n))_{m=1}^{M} : u^n \in \mathcal{U}^n,\ v^n \in \mathcal{V}^n\}$$

with

$$g_m^n(u^n) \in \mathcal{X}^n, \quad \text{for } u^n \in \mathcal{U}^n$$

$$D_m(v^n) \subset \mathcal{Y}^n, \quad \text{for } v^n \in \mathcal{V}^n$$

$$D_m(v^n) \cap D_{m'}(v^n) = \emptyset, \quad \text{for } m \neq m'$$

and

$$\frac{1}{M} \sum_{m=1}^{M} \sum_{u^n, v^n} P_{UV}^n(u^n, v^n)\, W^n(D_m(v^n) \mid g_m^n(u^n), s^n) > 1 - \lambda \qquad (2.1)$$

for $s^n \in \mathcal{S}^n$, if $\{W(\cdot \mid \cdot, s) : s \in \mathcal{S}\} = \mathcal{W}$. The corresponding capacity is denoted by $C(\mathcal{W}, (U, V))$.
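To make the code concept concrete, here is a small sketch (our illustration; the channel, source, and code are all made-up) that evaluates the left-hand side of (2.1) for a trivial $n = 1$, $M = 2$ code, checking it against every state:

```python
import numpy as np

# Hypothetical AVC with two states: W[s][x][y].
W = np.array([[[0.9, 0.1], [0.1, 0.9]],
              [[0.7, 0.3], [0.3, 0.7]]])

# Hypothetical binary correlated source P_{UV}(u, v).
P_UV = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

# A toy n = 1, M = 2 code: encoder g_m(u) may depend on the sender's u,
# decoding sets D_m(v) may depend on the receiver's v and are disjoint in m.
g = {0: lambda u: 0, 1: lambda u: 1}            # message m -> input letter
D = {0: {0: {0}, 1: {0}}, 1: {0: {1}, 1: {1}}}  # D[m][v] subset of Y

M = 2
for s in range(2):  # criterion (2.1) must hold for every state s
    avg = 0.0
    for m in range(M):
        for u in range(2):
            for v in range(2):
                avg += P_UV[u, v] * sum(W[s, g[m](u), y] for y in D[m][v])
    avg /= M
    print(f"s={s}: average success probability = {avg:.3f}")
```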

It turns out that it suffices to restrict the class of encoding functions $g_m^n$ as follows:

$$g_m^n(u^n) = (g_{m,1}(u_1), \cdots, g_{m,n}(u_n)). \qquad (2.2)$$

It is, therefore, natural to consider an associated AVC

$$\tilde{\mathcal{W}} = \{\tilde{W}(\cdot, \cdot \mid \cdot, s) : s \in \mathcal{S}\}$$

with input letters $g\colon \mathcal{U} \to \mathcal{X}$ and output letters $(y, v)$. Indeed, we set

$$\tilde{W}(y, v \mid g, s) = P_V(v) \sum_{u=0}^{1} P_{U \mid V}(u \mid v)\, W(y \mid g(u), s),$$

$$\text{for } y \in \mathcal{Y},\ v \in \{0, 1\},\ s \in \mathcal{S}. \qquad (2.3)$$
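For concreteness, here is a short Python sketch (ours, not from the paper) that builds the auxiliary channel $\tilde{W}(y, v \mid g, s)$ of (2.3) from a given AVC $\{W(\cdot \mid \cdot, s)\}$ and source statistics; all numerical values are hypothetical.

```python
import itertools
import numpy as np

# Hypothetical AVC: W[s][x][y], two states, binary input, binary output.
W = np.array([[[0.9, 0.1], [0.1, 0.9]],    # state s = 0
              [[0.6, 0.4], [0.4, 0.6]]])   # state s = 1

# Hypothetical binary correlated source (U, V).
P_V = np.array([0.5, 0.5])
P_U_given_V = np.array([[0.9, 0.1],        # P(u | v = 0)
                        [0.2, 0.8]])       # P(u | v = 1)

# Input letters of the auxiliary AVC are functions g: {0,1} -> {0,1}.
G = list(itertools.product([0, 1], repeat=2))   # g encoded as (g(0), g(1))

def W_tilde(y, v, g, s):
    """Auxiliary channel (2.3): output (y, v), input letter g, state s."""
    return P_V[v] * sum(P_U_given_V[v, u] * W[s, g[u], y] for u in range(2))

# Sanity check: each (g, s) column is a probability distribution on (y, v).
for g in G:
    for s in range(2):
        total = sum(W_tilde(y, v, g, s) for y in range(2) for v in range(2))
        assert abs(total - 1.0) < 1e-12
```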

Using (2.2) and (2.3) we can rewrite the left-hand side of (2.1) as

$$\frac{1}{M} \sum_{m=1}^{M} \sum_{v^n} P_V^n(v^n) \sum_{u^n} P_{U \mid V}^n(u^n \mid v^n)\, W^n(D_m(v^n) \mid g_m^n(u^n), s^n)$$

$$= \frac{1}{M} \sum_{m=1}^{M} \tilde{W}^n\Bigl(\bigcup_{v^n \in \mathcal{V}^n} \bigl(D_m(v^n) \times \{v^n\}\bigr) \Bigm| g_m^n, s^n\Bigr)$$

$$= \frac{1}{M} \sum_{m=1}^{M} \tilde{W}^n(\tilde{D}_m \mid g_m^n, s^n)$$

where

$$\tilde{D}_m = \bigcup_{v^n \in \mathcal{V}^n} \bigl(D_m(v^n) \times \{v^n\}\bigr). \qquad (2.4)$$

We thus have shown (by going from (2.4) backwards) that

$$C(\mathcal{W}, (U, V)) \geq C(\tilde{\mathcal{W}}). \qquad (2.5)$$

Similarly, one can show that equality actually holds here, but this is not used in the sequel.



III. PROOF OF THEOREM

Clearly, if $C(\mathcal{W}, (U, V))$ is positive, then we can use a code with block length $O(\log n)$ in the sense of (2.1) to get the common randomness needed to operate the correlated code of rate $\approx C_R(\mathcal{W})$ gained by the elimination technique.

On the other hand, if $C(\mathcal{W}, (U, V)) = 0$, then by (2.5) $C(\tilde{\mathcal{W}}) = 0$ as well. This implies by [6] that $\tilde{\mathcal{W}}$ is symmetrizable in the sense of [5]. It remains to be seen that this, in turn, implies the existence of a $W \in \overline{\mathcal{W}}$ with $C(W) = 0$ and, therefore, $C_R(\mathcal{W}) = 0$.

Let $\mathcal{X} = \{0, 1, \cdots, a-1\}$, let $\mathcal{G}$ be the set of functions from $\mathcal{U}$ to $\mathcal{X}$, and let

$$\mathcal{G}^* = \{g^*\} \cup \{g_i : 0 \leq i \leq a-1\} \subset \mathcal{G}$$

where

$$g^*(0) = g^*(1) = 0$$

and

$$g_i(u) = i + u \bmod a, \quad \text{for } u \in \{0, 1\}. \qquad (3.1)$$

Now by symmetrizability of $\tilde{\mathcal{W}}$ there is a stochastic matrix $\sigma\colon \mathcal{G} \to \mathcal{S}$ with

$$\sum_s \sigma(s \mid g^*)\, \tilde{W}(y, v \mid g_i, s) = \sum_s \sigma(s \mid g_i)\, \tilde{W}(y, v \mid g^*, s)$$

which, written out via (2.3) and using $g^*(u) = 0$, reads

$$P_V(v) \sum_{u=0}^{1} P_{U \mid V}(u \mid v) \sum_s \sigma(s \mid g^*)\, W(y \mid g_i(u), s) = P_V(v) \sum_{u=0}^{1} P_{U \mid V}(u \mid v) \sum_s \sigma(s \mid g_i)\, W(y \mid 0, s) \qquad (3.2)$$

for all $v = 0, 1$, $i \in \mathcal{X}$, and $y \in \mathcal{Y}$.

Cancel $P_V(v)$ and consider (3.2) with $v = 0$ and $v = 1$. Then

$$\sum_{u=0}^{1} P_{U \mid V}(u \mid 0) \sum_s \sigma(s \mid g^*)\, W(y \mid g_i(u), s) = \sum_{u=0}^{1} P_{U \mid V}(u \mid 0) \sum_s \sigma(s \mid g_i)\, W(y \mid 0, s) \qquad (3.3)$$

$$\sum_{u=0}^{1} P_{U \mid V}(u \mid 1) \sum_s \sigma(s \mid g^*)\, W(y \mid g_i(u), s) = \sum_{u=0}^{1} P_{U \mid V}(u \mid 1) \sum_s \sigma(s \mid g_i)\, W(y \mid 0, s). \qquad (3.4)$$

Clearly, the right-hand sides of both (3.3) and (3.4) equal $\sum_s \sigma(s \mid g_i)\, W(y \mid 0, s)$.

We evaluate these equations by inserting the values of the $g_i$ and get, with the convention $i \oplus j = i + j \bmod a$,

$$P_{U \mid V}(0 \mid 0) \sum_s \sigma(s \mid g^*)\, W(y \mid i, s) + P_{U \mid V}(1 \mid 0) \sum_s \sigma(s \mid g^*)\, W(y \mid i \oplus 1, s) = \sum_s \sigma(s \mid g_i)\, W(y \mid 0, s) \qquad (3.5)$$

$$P_{U \mid V}(0 \mid 1) \sum_s \sigma(s \mid g^*)\, W(y \mid i, s) + P_{U \mid V}(1 \mid 1) \sum_s \sigma(s \mid g^*)\, W(y \mid i \oplus 1, s) = \sum_s \sigma(s \mid g_i)\, W(y \mid 0, s). \qquad (3.6)$$

With the abbreviations

$$\alpha = \sum_s \sigma(s \mid g_i)\, W(y \mid 0, s), \qquad z_0 = \sum_s \sigma(s \mid g^*)\, W(y \mid i, s)$$

and

$$z_1 = \sum_s \sigma(s \mid g^*)\, W(y \mid i \oplus 1, s)$$

we get, therefore, the system of two equations

$$P_{U \mid V}(0 \mid 0)\, z_0 + P_{U \mid V}(1 \mid 0)\, z_1 = \alpha$$
$$P_{U \mid V}(0 \mid 1)\, z_0 + P_{U \mid V}(1 \mid 1)\, z_1 = \alpha. \qquad (3.7)$$

Since $I(U \wedge V) > 0$ implies $P_{U \mid V}(\cdot \mid 0) \neq P_{U \mid V}(\cdot \mid 1)$, and two distinct probability vectors of length two cannot be proportional, we get

$$\det \begin{pmatrix} P_{U \mid V}(0 \mid 0) & P_{U \mid V}(1 \mid 0) \\ P_{U \mid V}(0 \mid 1) & P_{U \mid V}(1 \mid 1) \end{pmatrix} \neq 0$$

and (3.7) has the unique solution $z_0 = z_1 = \alpha$. Hence, for all $i \in \mathcal{X}$ and all $y \in \mathcal{Y}$

$$\sum_s \sigma(s \mid g^*)\, W(y \mid i, s) = \sum_s \sigma(s \mid g^*)\, W(y \mid i \oplus 1, s)$$

and

$$W(\cdot \mid \cdot) \triangleq \sum_s \sigma(s \mid g^*)\, W(\cdot \mid \cdot, s) \in \overline{\mathcal{W}}$$

has identical rows. Therefore, $C(W) = 0 = C_R(\mathcal{W})$. Q.E.D.
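As a quick numerical sanity check of the determinant step (ours, with made-up numbers): whenever the two conditional distributions differ, the system (3.7) is regular and indeed forces $z_0 = z_1 = \alpha$.

```python
import numpy as np

# Hypothetical conditional distributions P_{U|V}(. | 0) and P_{U|V}(. | 1);
# distinct rows, each summing to 1, as guaranteed by I(U ^ V) > 0.
A = np.array([[0.7, 0.3],
              [0.4, 0.6]])
alpha = 0.25                      # any common right-hand side value

# Distinct probability-vector rows => nonzero determinant.
assert abs(np.linalg.det(A)) > 1e-12

z = np.linalg.solve(A, np.array([alpha, alpha]))
print(z)                          # -> [0.25, 0.25], i.e., z0 = z1 = alpha
```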

Remark: Inspection of the proof shows that we actually proved the following.

If $U \to V \to Y$ forms a Markov chain and $U$ and $V$ are binary, then $I(U \wedge V) > 0$ and $I(U \wedge Y) = 0$ imply $I(V \wedge Y) = 0$.

Indeed,

$$P_{UVY}(U = u, V = v, Y = y) = P_U(u)\, P_{V \mid U}(v \mid u)\, P_{Y \mid V}(y \mid v)$$

and

$$I(U \wedge Y) = 0$$

imply that for all $y$

$$P_{V \mid U}(0 \mid 0)\, P_{Y \mid V}(y \mid 0) + P_{V \mid U}(1 \mid 0)\, P_{Y \mid V}(y \mid 1) = P_Y(y)$$
$$P_{V \mid U}(0 \mid 1)\, P_{Y \mid V}(y \mid 0) + P_{V \mid U}(1 \mid 1)\, P_{Y \mid V}(y \mid 1) = P_Y(y). \qquad (3.8)$$

Suppose that $I(U \wedge V) > 0$; then $P_{V \mid U}(\cdot \mid 0) \neq P_{V \mid U}(\cdot \mid 1)$ and, therefore, (3.8) has the unique solution

$$P_{Y \mid V}(y \mid 0) = P_{Y \mid V}(y \mid 1) = P_Y(y), \quad \text{for all } y$$

i.e., $I(V \wedge Y) = 0$.
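The lemma in the remark can also be checked empirically in its contrapositive form: for binary $U, V$ with $I(U \wedge V) > 0$, positive $I(V \wedge Y)$ forces positive $I(U \wedge Y)$. The following Monte Carlo sketch (ours, with randomly drawn chains) illustrates this.

```python
import numpy as np

def mutual_information(P):
    """I(X ^ Y) in bits for a joint pmf matrix P (rows: x, cols: y)."""
    Px = P.sum(axis=1, keepdims=True)
    Py = P.sum(axis=0, keepdims=True)
    mask = P > 0
    return float((P[mask] * np.log2(P[mask] / (Px @ Py)[mask])).sum())

rng = np.random.default_rng(0)
for _ in range(10000):
    # Random binary Markov chain U -> V -> Y with hypothetical parameters.
    P_U = rng.dirichlet(np.ones(2))
    P_V_given_U = rng.dirichlet(np.ones(2), size=2)   # rows indexed by u
    P_Y_given_V = rng.dirichlet(np.ones(2), size=2)   # rows indexed by v
    P_UV = P_U[:, None] * P_V_given_U                 # joint of (U, V)
    P_VY = P_UV.sum(axis=0)[:, None] * P_Y_given_V    # joint of (V, Y)
    P_UY = P_UV @ P_Y_given_V                         # joint of (U, Y)
    # Contrapositive of the lemma: I(U^V) > 0 and I(V^Y) > 0 => I(U^Y) > 0.
    if mutual_information(P_UV) > 1e-6 and mutual_information(P_VY) > 1e-6:
        assert mutual_information(P_UY) > 1e-12
```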

REFERENCES

[1] R. Ahlswede, "Elimination of correlation in random codes for arbitrarily varying channels," Z. Wahrscheinlichkeitstheorie und verw. Geb., vol. 44, pp. 159–185, 1978.

[2] R. Ahlswede and I. Csiszár, "Common randomness in information theory and cryptography, Part I: Secret sharing," IEEE Trans. Inform. Theory, vol. 39, no. 4, pp. 1121–1132, 1993.

[3] R. Ahlswede and I. Csiszár, "Common randomness in information theory and cryptography, Part II," to be published in IEEE Trans. Inform. Theory.

[4] R. Ahlswede and V. Balakirsky, "Identification under random processes," preprint 95-098, SFB 343 "Diskrete Strukturen in der Mathematik," Universität Bielefeld. Also in Probl. Pered. Inform. (Special Issue honoring M. S. Pinsker), vol. 32, no. 1, pp. 144–160, Jan.–Mar. 1996.

[5] T. Ericson, "Exponential error bounds for random codes in the arbitrarily varying channel," IEEE Trans. Inform. Theory, vol. 31, pp. 42–48, 1985.

[6] I. Csiszár and P. Narayan, "The capacity of the arbitrarily varying channel revisited: Positivity, constraints," IEEE Trans. Inform. Theory, vol. 34, no. 2, pp. 181–193, 1988.