
Can ‘finite’ be more than ‘infinite’ in distributed source coding

S. Sandeep Pradhan (joint work with Farhad Shirani)

University of Michigan, Ann Arbor, 2014

Distributed Information Coding

- Proliferation of Internet, wireless, and sensor-network applications
- Supported by distributed information processing
- Information-theoretic perspective


1: Distributed Field Gathering

2: Broadcast and Interference Networks

[Figure: two mobile transmitters and two mobile receivers in a wireless broadcast/interference network]

3: Streaming over the Internet

[Figure: an encoder sends packets 1, 2, ..., n over an (n, k) erasure channel; the decoder reconstructs X̂ from any k received packets]

Information and Coding Theory: Tradition

Information theory:

- Develop efficient communication strategies
- No constraints on memory/computation for encoding/decoding
- Obtain performance limits that are independent of technology

Coding theory:

- Approach these limits using algebraic codes (e.g., linear codes)
- Fast encoding and decoding algorithms
- Objective: practical implementability of optimal communication systems

Point-to-point Data Compression (lossy)

Start with the binary symmetric source (BSS): X is i.i.d. Be(1/2).

- Wish to compress with Hamming distortion: dH(x, x̂) = 1 if x ≠ x̂, and 0 otherwise.
- Minimum rate of compression for a given distortion δ:

  R(δ) = min I(X; X̂) over all P(X̂|X) with E[dH(X, X̂)] ≤ δ, which equals 1 − h(δ).

- Single-letter relation between the source and its quantized version:

[Figure: forward test channel X̂ = X + N and backward test channel X = X̂ + N]

- Test channel: N ~ Be(δ), and + is addition modulo 2.
- Achieved using random, infinite-dimensional vector quantization.

Point-to-point Communication: Biased Source

Binary source with bias p (BBS(p)): X is i.i.d. Be(p).

- Wish to compress with Hamming distortion.
- Minimum rate of compression for a given distortion δ: R(δ) = h(p) − h(δ).

[Figure: forward test channel p(x̂|x) and backward test channel X = X̂ + N]

- N ~ Be(δ), and + is addition modulo 2.
- Achieved using random, infinite-dimensional vector quantization.
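
The two closed-form expressions above are easy to evaluate and sanity-check. Below is a minimal Python sketch (the function names are mine, not from the talk) that computes R(δ) = h(p) − h(δ) and verifies the backward test channel for the symmetric case by Monte Carlo.

    import numpy as np

    def hb(q):
        """Binary entropy in bits; clipped so hb(0) = hb(1) = 0."""
        q = np.clip(q, 1e-12, 1 - 1e-12)
        return -q * np.log2(q) - (1 - q) * np.log2(1 - q)

    def rate_distortion_binary(p, delta):
        """R(delta) = h(p) - h(delta) for an IID Be(p) source, Hamming distortion."""
        if delta >= min(p, 1 - p):
            return 0.0
        return float(hb(p) - hb(delta))

    print(rate_distortion_binary(0.5, 0.11))   # symmetric case: 1 - h(0.11), ~0.5

    # Backward test channel for p = 1/2: draw Xhat ~ Be(1/2) and N ~ Be(delta);
    # then X = Xhat + N (mod 2) is Be(1/2) and E[dH(X, Xhat)] = delta.
    rng = np.random.default_rng(0)
    delta = 0.11
    xhat = rng.integers(0, 2, size=1_000_000)
    noise = (rng.random(1_000_000) < delta).astype(int)
    x = xhat ^ noise
    print(x.mean(), (x != xhat).mean())        # ~0.5 and ~0.11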

Many-to-one transformation: Quantization

[Figure: the set of all n-length sequences, colored by a 3-bit quantization]

- Sequences that get the same color are NEARBY.
- X̂ⁿ = f(Xⁿ), i.e., deterministically related.
- But X̂i is related to Xi probabilistically: P(X̂i | Xi).

Many-to-one transformation: Binning

[Figure: the set of all n-length sequences, colored by a 2-bit binning]

- Sequences that get the same color are FAR APART.

More is Better

- More is thought to be better in information theory and coding theory.
- BOON of dimensionality: there is more space in higher dimensions.
- Proven to be true for point-to-point communication.
- A lot of effort in constructing codes of large block-lengths.
- Even more effort in trying to encode and decode them.
- Where else is more better?

More is Better

[A run of figure-only slides; the figures did not survive the transcript.]

Four Basic Problems in Information Theory

- Multiple access channel: Ahlswede-Liao region, '71 (wireless uplink)
- Distributed source coding: Berger-Tung region, '77 (sensor networks)
- Broadcast channel: Marton's region, '79 (wireless downlink)
- Multiple description coding: Zhang-Berger region, '87 (streaming)
- Until recently we did not know whether these regions are tight.
- Wagner et al. ['11] proved that the Berger-Tung region is not tight, using a continuity argument.

Lossy Distributed Source Coding

- Compression of correlated sources in a distributed setting:

[Figure: Encoder 1 observes Y1 and Encoder 2 observes Y2; each sends a message to a common decoder, which outputs Ŷ2]

- Restrict to reconstruction of one source with distortion.
- Window into the world of network information theory.
- Single-letter achievable rate-distortion region [Berger-Tung '77]:
  - independent, random (unstructured), infinite-dimensional quantization.
- Let Ui denote the quantized version of Yi.
- Curse: the long Markov chain U1 − Y1 − Y2 − U2.

Berger-Tung: Double Quantization

[Figure: Quantizer 1 maps Y1 to U1 and Quantizer 2 maps Y2 to U2; each quantizer is followed by a binning stage producing the message M1 or M2]

- Quantization rates: I(Y1; U1) and I(Y2; U2).
- Sources are correlated ⟹ quantized vectors are correlated.
- Bin the quantized vectors to exploit the correlation.
- Boon: a rate rebate of I(U1; U2).

Berger-Tung: Quantization + Binning

[Figure: the set of all n-length sequences under 3-bit quantization plus 2-bit binning]

- Total rate: I(Y1; U1) + I(Y2; U2) − I(U1; U2) = I(Y1, Y2; U1, U2).
- BT: R1 + R2 = I(Y1, Y2; U1, U2), optimized subject to U1 − Y1 − Y2 − U2.
- Centralized: R1 + R2 = I(Y1, Y2; U1, U2), optimized over everything.
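
The total-rate identity in the first bullet can be verified numerically. Here is a short Python sketch (my own construction; the doubly symmetric binary source and the BSC test channels are illustrative choices, not the talk's) that builds P(y1, y2, u1, u2) = P(y1, y2) P(u1|y1) P(u2|y2), i.e., the long Markov chain, and checks that I(Y1;U1) + I(Y2;U2) − I(U1;U2) = I(Y1, Y2; U1, U2).

    import itertools
    import numpy as np

    def mi(pxy):
        """Mutual information in bits from a joint pmf given as a 2-D array."""
        px = pxy.sum(axis=1, keepdims=True)
        py = pxy.sum(axis=0, keepdims=True)
        m = pxy > 0
        return float((pxy[m] * np.log2(pxy[m] / (px @ py)[m])).sum())

    rho = 0.1                                  # doubly symmetric binary source
    p_y = np.array([[(1 - rho) / 2, rho / 2],
                    [rho / 2, (1 - rho) / 2]])
    bsc = lambda d: np.array([[1 - d, d], [d, 1 - d]])
    W1, W2 = bsc(0.2), bsc(0.25)               # test channels P(Ui | Yi)

    P = np.zeros((2, 2, 2, 2))                 # P(y1, y2, u1, u2), long Markov chain
    for y1, y2, u1, u2 in itertools.product(range(2), repeat=4):
        P[y1, y2, u1, u2] = p_y[y1, y2] * W1[y1, u1] * W2[y2, u2]

    lhs = mi(P.sum(axis=(1, 3))) + mi(P.sum(axis=(0, 2))) - mi(P.sum(axis=(0, 1)))
    rhs = mi(P.reshape(4, 4))                  # (Y1, Y2) against (U1, U2)
    print(lhs, rhs)                            # agree up to floating-point error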

Quantize+Bin

- Quantize+bin is ubiquitous in communications and signal processing.

BT: Independent, Random, Infinite-dimensional quantizers

Why is the BT rate region not optimal?
Is it because of independent and random (unstructured) quantization?

- What if we quantize the two sources using identical linear codes?
- Lemma: if Y1 ≠ Y2, then as n → ∞ the two quantization noises become independent.
- So, as long as the dimension (block-length) → ∞, it does not matter whether the quantizers are (i) independent or identical, (ii) unstructured or linear.
- You cannot escape the curse with the wand of linear codes.

BT: Independent, random infinite-length quantizers

- As the block-length becomes large, most of the volume is near the walls.
- An infinitesimal perturbation will take you to the next Voronoi region.

[Figure: Y1 and Y2 fall in adjacent Voronoi cells, so Q(Y1) ≠ Q(Y2)]

Quantize and Bin: Why is the BT rate region not optimal?

- Is it because of infinite block-length quantization?
- Answer: YES, in the following sense.
- When the quantizers are identical, for a fixed distortion:
  - as the block-length increases, the correlation between the quantized versions becomes weaker.
- For example: Y1 and Y2 jointly Gaussian, distortion = 0.363.

[Figure: quantized source correlation vs. source correlation (0.991 to 1), for block lengths 1, 4, 8, and 14; longer blocks give lower quantized correlation]
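
The effect in the plot can be reproduced qualitatively with a rough Monte Carlo sketch. Everything below is my own construction, not the talk's exact experiment: it fixes the codebook rate (1 bit/sample) rather than the distortion, uses identical random Gaussian codebooks at both encoders, and estimates the per-letter correlation between the two quantized outputs as the block length grows.

    import numpy as np

    rng = np.random.default_rng(1)
    rho, rate, trials = 0.995, 1.0, 2000       # source correlation, bits/sample, blocks

    def quantize(Y, codebook):
        """Nearest-neighbor quantization of each row of Y against the codebook."""
        d2 = ((Y[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
        return codebook[d2.argmin(axis=1)]

    for n in [1, 2, 4, 8]:
        codebook = rng.standard_normal((2 ** int(rate * n), n))  # same book for both
        Y1 = rng.standard_normal((trials, n))
        Y2 = rho * Y1 + np.sqrt(1 - rho ** 2) * rng.standard_normal((trials, n))
        U1, U2 = quantize(Y1, codebook), quantize(Y2, codebook)
        print(n, np.corrcoef(U1.ravel(), U2.ravel())[0, 1])

On typical runs the printed correlation drops as n grows, consistent with the inverse correlation effect described next.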

Inverse Correlation Effect

- Correlation is inversely related to block-length for a fixed distortion.

Finite-length quantizers are better than infinite-length quantizers:

- able to transfer the correlation from the sources to the quantized versions more efficiently,
- i.e., free from the curse of the long Markov chain.
- As the block-length increases, this efficiency goes down, until you hit the long Markov chain.
- Short block-length: correlation transfer efficiency ↑, source representation efficiency ↓.

Inverse Correlation Effect

Infinite-length quantizers are better than finite-length quantizers:

- Large block-length: correlation transfer efficiency ↓, source representation efficiency ↑.
- There is a sweet spot for the block-length where the overall efficiency is maximum.
- This is an artifact of the quantize-and-bin strategy.

Common Information (Gacs-Korner-Witsenhausen)

- A random variable X such that X = fi(Yi) for i ∈ {1, 2}, with H(X) > 0.
- Example of CI: Y1 = (X, Y1'), Y2 = (X, Y2') for some Y1', Y2'.
- CI-based coding [Wagner-Kelly-Altug '09]:
  - Quantize the CI at both encoders using the same code.
  - Encoders cooperate in sending the quantized version.
  - Treat the quantized version as side information + BT strategy.
- Breaks the long Markov chain using the CI.
- Reduces to the BT strategy when the CI is trivial.

Example [Wagner-Kelly-Altug '09]

- Let Y1 = X + E and Y2 = (X, Z), where X ~ Be(1/2), E ~ Be(ε), and Z ~ Be(p) are mutually independent.
- If ε = 0 then X is a CI.
- The decoder wants to reconstruct X + Z with distortion D.

CI is present: ε = 0

- Here is a block diagram of the CI scheme (ε = 0):

[Figure: block diagram of the CI scheme]

- Encoder 1: X = W + Nδ ⟹ X + Z = W + Nδ + Z, so R1 = 1 − hb(δ).
- Uncertainty at the decoder: Nδ + Z.
- Encoder 2: Nδ + Z = Q + Nδ1 ⟹ R2 = hb(δ ∗ p) − hb(δ1), with D = δ1.

(Here Nδ ~ Be(δ) and Nδ1 ~ Be(δ1) are quantization noises, and ∗ denotes binary convolution: a ∗ b = a(1 − b) + b(1 − a).)

CI is absent: ε ≠ 0

- If ε ≠ 0: Encoder 2 cannot reconstruct W.
- Instead it can only reconstruct W̃, where W̃ − (X + E) − X − W forms a Markov chain.
- ⟹ a discontinuity of (R1, R2, D) as a function of ε.
- The actual rate-distortion region (the performance limit) is continuous in ε.

New Result: Theorem

For the binary one-help-one problem, the following rate-distortion region is achievable for any positive integer n:

  R1 ≥ 1 − hb(δ) + θn                                            (1)
  R2 ≥ hb(p ∗ δ) − hb(δ1)                                        (2)
  D ≤ δ1 ∗ ((1 − (1 − ε)^n)(δ + (ε / (1 − (1 − ε)^n)) ∗ δ))      (3)

- This is a single-letter characterization.
- Here θn = (1/2)(log n)/n + O(1/n).
- If nε ≪ 1, then the distortion is close to δ1 ∗ (nε(δ + (1/n) ∗ δ)).
- The region is continuous in ε and contains the CI scheme when ε = 0.
- To get this performance via the BT approach, we need multi-letterization.
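
The region (1)-(3) can be evaluated directly. A minimal Python sketch, under two stated assumptions of mine: θn is replaced by its leading term (1/2)(log2 n)/n (the O(1/n) term is dropped, base-2 log assumed), and every ∗ is read as binary convolution.

    import numpy as np

    def hb(q):
        q = min(max(q, 1e-12), 1 - 1e-12)
        return -q * np.log2(q) - (1 - q) * np.log2(1 - q)

    def conv(a, b):
        """Binary convolution: a * b = a(1 - b) + b(1 - a)."""
        return a * (1 - b) + b * (1 - a)

    def achievable_point(n, delta, delta1, p, eps):
        theta_n = 0.5 * np.log2(n) / n          # leading term of theta_n only
        R1 = 1 - hb(delta) + theta_n
        R2 = hb(conv(p, delta)) - hb(delta1)
        if eps == 0:                            # CI case: distortion is just delta1
            return R1, R2, delta1
        a = 1 - (1 - eps) ** n                  # P(E(1:n) != 0)
        D = conv(delta1, a * (delta + conv(eps / a, delta)))
        return R1, R2, D

    # Illustrative parameters (delta = 1/8 matches the Hamming example later).
    print(achievable_point(n=7, delta=0.125, delta1=0.1, p=0.3, eps=1e-10))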

New Coding Approach

- Here is a block diagram of the scheme:

[Figure: Encoder 1 quantizes X + E with the finite code C^(n)_f and sends V; Encoder 2 quantizes X with C^(n)_f, forms S, interleaves it with π, and quantizes with the long code C^(m)_r to get Q; the decoder de-interleaves with π⁻¹ and outputs the reconstruction of X + Z]

- n is finite and m is infinitely large.
- 3 components: C^(n)_f, C^(m)_r, and π.

Specifics of Coding

- C^(n)_f is an n-length code for quantizing a BSS to distortion δ, with rate R(n, δ) = 1 − hb(δ) + θn. [Kostina-Verdu '12]
- C^(m)_r is an m-length code for quantizing a BBS(p ∗ δ) to distortion δ1, with rate approaching hb(p ∗ δ) − hb(δ1). [Shannon '59]
- Interleavers πi ∈ Sn (permutations of n letters), i ∈ [1 : m].

Encoders

- Encoder 1:
  - Upon receiving the sequence (X + E)(1:m, 1:n), computes
    V(i, 1:n) = argmin over cⁿ ∈ C^(n)_f of dH(cⁿ, (X + E)(i, 1:n))
    and transmits the index of V(i, 1:n) in C^(n)_f.
- Encoder 2:
  - Upon receiving the sequence X(1:m, 1:n), calculates
    Ṽ(i, 1:n) = argmin over cⁿ ∈ C^(n)_f of dH(cⁿ, X(i, 1:n)).
  - Calculates S(i, 1:n) = (Ṽ + X + Z)(i, 1:n).
  - Lets S̃(i, 1:n) = πi(S(i, 1:n)), and quantizes each column S̃(1:m, j) using C^(m)_r to get Q(1:m, j).
  - Transmits the index of Q(1:m, j) in C^(m)_r.
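
As a concrete toy instance of the two encoders: the Python sketch below is my own, with the Hamming(7,4) code standing in for C^(n)_f, the interleaving/long-code stage omitted, and all parameter values illustrative. It checks by simulation that the nearest-codeword quantization noise has rate δ ≈ 1/8 and that P(S = 1) ≈ p ∗ δ.

    import itertools
    import numpy as np

    rng = np.random.default_rng(0)

    # Hamming(7,4) codebook used as the finite quantizer C^(7)_f.
    G = np.array([[1, 0, 0, 0, 0, 1, 1],
                  [0, 1, 0, 0, 1, 0, 1],
                  [0, 0, 1, 0, 1, 1, 0],
                  [0, 0, 0, 1, 1, 1, 1]])
    codebook = np.array([(np.array(msg) @ G) % 2
                         for msg in itertools.product([0, 1], repeat=4)])

    def quantize(x):
        """Nearest codeword in Hamming distance (unique: the code is perfect)."""
        return codebook[(codebook ^ x).sum(axis=1).argmin()]

    m, n, p, eps = 20000, 7, 0.3, 0.01
    X = rng.integers(0, 2, (m, n))
    E = (rng.random((m, n)) < eps).astype(int)
    Z = (rng.random((m, n)) < p).astype(int)

    V  = np.array([quantize(row) for row in X ^ E])  # Encoder 1 quantizes X + E
    Vt = np.array([quantize(row) for row in X])      # Encoder 2 quantizes X
    S  = Vt ^ X ^ Z                                  # pre-interleaving sequence

    print((Vt ^ X).mean())               # quantization noise rate, ~1/8
    print(S.mean())                      # P(S = 1), ~ p * delta = 0.35
    print((V != Vt).any(axis=1).mean())  # blocks where V and Vt differ, <= 1-(1-eps)^n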

Observation

- Note that each column S̃(1:m, j) is a DMS:

[Figure: the m × n array of interleaved symbols, with one column highlighted]

- The distribution of S̃(1:m, j) is Be(p ∗ δ):

  P(S̃(i, j) = 1) = P(X(i, πi(j)) + Ṽ(i, πi(j)) + Z(i, πi(j)) = 1)
                 = p ∗ P(X(i, πi(j)) + Ṽ(i, πi(j)) = 1)
                 = p ∗ (1/n) Σ_{j'=1}^{n} E[wH(X(i, j') + Ṽ(i, j'))]
                 = p ∗ δ

Decoder

- Decoder:
  - Calculates Q̃(i, 1:n) = πi⁻¹(Q(i, 1:n)).
  - Declares (Q̃ + V)(1:m, 1:n) as the reconstruction.
- Calculating the distortion (T is the C^(m)_r quantization noise, of rate δ1):

  D = (1/mn) E{dH((X + Z)(1:m, 1:n), (Q̃ + V)(1:m, 1:n))}
    = (1/mn) E{wH((V + Ṽ + T)(1:m, 1:n))}
    = δ1 ∗ ((1/mn) E{wH((V + Ṽ)(1:m, 1:n))})

- Note that Ṽ(i, 1:n) = V(i, 1:n) if E(i, 1:n) = 0. Hence

  D = δ1 ∗ ((1/n) E{wH((V + Ṽ)(1, 1:n)) | E(1, 1:n) ≠ 0} P(E(1, 1:n) ≠ 0))

Decoder

- Simplifying the previous equations, we get:

  D ≤ δ1 ∗ ((1 − (1 − ε)^n)(δ + (ε / (1 − (1 − ε)^n)) ∗ δ))

Theorem
The new rate-distortion region strictly contains the BT rate region.

Hamming Codes

- Using a Hamming code of length 2^r − 1:

  R1 = 1 − r/(2^r − 1)
  R2 = hb((1/2^r) ∗ p) − hb(δ1)
  D ≤ δ1 ∗ ((1 − (1 − ε)^n)((ε / (1 − (1 − ε)^(2^r − 1))) ∗ 1/(2^r − 1) + 1/(2^r − 1))),  with n = 2^r − 1

- A good scheme for BT seems to be to time-share between the following points, to avoid double quantization:

  r1 = 1,  r2 = hb(p) − hb(δ1)
  r1' = 0, r2' = 1 − hb(δ1)

- Also, the CI scheme for ε = 0 gives an outer bound.
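
Plugging the Hamming parameters into the formulas above is a one-liner per bound; a short Python sketch (same hedges as before: ∗ read as binary convolution, function names mine):

    import numpy as np

    def hb(q):
        q = min(max(q, 1e-12), 1 - 1e-12)
        return -q * np.log2(q) - (1 - q) * np.log2(1 - q)

    def conv(a, b):                       # binary convolution
        return a * (1 - b) + b * (1 - a)

    def hamming_point(r, p, delta1, eps):
        n = 2 ** r - 1                    # Hamming code length
        R1 = 1 - r / n
        R2 = hb(conv(1 / 2 ** r, p)) - hb(delta1)
        a = 1 - (1 - eps) ** n            # P(E(1:n) != 0)
        D = conv(delta1, a * (conv(eps / a, 1 / n) + 1 / n))
        return R1, R2, D

    # Parameters of the numerical comparison on the next slide.
    print(hamming_point(r=3, p=0.3, delta1=0.1, eps=1e-10))

    # BT time-sharing corner points (r1, r2) and (r1', r2'):
    p, delta1 = 0.3, 0.1
    print((1.0, hb(p) - hb(delta1)), (0.0, 1 - hb(delta1)))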

Numerical Results

- Comparison between the three bounds (δ1 = 0.1, p = 0.3, ε = 10^−10).

Conclusions

- As the block-length of quantization is increased:
  - correlation transfer efficiency decreases,
  - source representation efficiency increases.
- There is a sweet spot for the block-length n where the overall efficiency is maximum.
- To get this performance in the BT framework, we need multi-letterization (n-letter).