Interactive Channel Capacity
Gillat Kol (IAS), joint work with Ran Raz (Weizmann + IAS)

TRANSCRIPT

Page 1

Gillat Kol (IAS)

joint work with Ran Raz (Weizmann + IAS)

Interactive Channel Capacity

Page 2

“A Mathematical Theory of Communication”

Claude Shannon 1948

An exact formula for the channel capacity of any noisy channel

Page 3

• ε-noisy channel: Each bit is flipped with prob ε (independently)

• Alice wants to send an n-bit message to Bob. How many bits does Alice need to send over the ε-noisy channel, so Bob can retrieve it w.p. 1-o(1)?

– Is the blow-up even constant?

Shannon: Channel Capacity

[Figure: binary symmetric channel: 0 and 1 pass through correctly w.p. 1-ε and are flipped w.p. ε. A sends n bits to B over the noiseless channel vs. ? bits over the ε-noisy channel.]

Page 4

• ε-noisy channel: Each bit is flipped with prob ε (independently)

• Alice wants to send an n-bit message to Bob. How many bits does Alice need to send over the ε-noisy channel, so Bob can retrieve it w.p. 1-o(1)?

• [Shannon ‘48]: # bits ≈ n / (1-H(ε)) (a numeric sketch follows below)
– Entropy function: H(ε) = -ε·log(ε) - (1-ε)·log(1-ε)
– Matching upper and lower bounds

Shannon: Channel Capacity

[Figure: binary symmetric channel with crossover probability ε]
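The numeric sketch referenced above: a small, self-contained illustration (my own; the function names and example values are not from the slides) of the binary entropy H(ε) and the resulting blow-up n / (1-H(ε)) for one-way transmission.

```python
import math

def binary_entropy(eps: float) -> float:
    """H(eps) = -eps*log2(eps) - (1-eps)*log2(1-eps)."""
    if eps in (0.0, 1.0):
        return 0.0
    return -eps * math.log2(eps) - (1 - eps) * math.log2(1 - eps)

def shannon_bits(n: int, eps: float) -> float:
    """Approximate number of bits needed over the eps-noisy channel: n / (1 - H(eps))."""
    return n / (1 - binary_entropy(eps))

# Sending a 1000-bit message over a channel that flips each bit with probability 0.01:
print(shannon_bits(1000, 0.01))  # ~1088 bits, i.e. a constant blow-up of about 1.09
```

So for one-way transmission the blow-up is indeed a constant (about 1.09 for ε = 0.01); the question raised next is whether the same holds for interactive conversations.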

Page 5

• Alice and Bob want to have an n-bit-long conversation. How many bits do they need to send over the ε-noisy channel, so both can retrieve the transcript w.p. 1-o(1)?

Us: Interactive Channel Capacity

[Figure: A and B hold an n-bit conversation over the noiseless channel vs. ? bits over the ε-noisy channel]

Page 6

Communication Complexity

• Setting: Alice has input x, Bob has input y. They want to compute f(x,y) (f is publicly known)

• Communication Complexity of f: The least number of bits they need to communicate

– Deterministic, CC(f): ∀x,y, compute f(x,y) w.p. 1
– Randomized, RCC(f): ∀x,y, compute f(x,y) w.p. 1-o(1); players share a random string
– Noisy, CC_ε(f): ∀x,y, compute f(x,y) w.p. 1-o(1); players communicate over the ε-noisy channel and share a random string

Page 7

• Def: Interactive Channel Capacity

– RCC(f) = Randomized CC (over the noiseless channel)

– CC_ε(f) = Noisy CC (over the ε-noisy channel)

* Results hold when we use CC(f) instead of RCC(f)

* Results hold for worst case & average case RCC(f), CC_ε(f)

C(ε) = liminf_{n→∞} min_{f : RCC(f)=n} ( n / CC_ε(f) )

Def: Interactive Channel Capacity

Page 8

• Def: Interactive Channel Capacity

– RCC(f) = Randomized CC (over the noiseless channel)

– CC_ε(f) = Noisy CC (over the ε-noisy channel)

• For f(x,y) = x (msg transmission), we get Channel Capacity
– Interactive Channel Capacity ≤ Channel Capacity
– In the interactive case, an error in the first bit may cause the whole conversation to be meaningless. We may need to “encode” every bit separately.

C(ε) = liminf_{n→∞} min_{f : RCC(f)=n} ( n / CC_ε(f) )

Def: Interactive Channel Capacity

Page 9

• [Schulman ’92]:
– Theorem: If RCC(f) = n then CC_ε(f) ≤ O(n)

• Corollary: C(ε) > 0
– Open Question: Is Interactive Channel Capacity = Channel Capacity?

• Many other works [Sch, BR, B, GMS, BK, BN, FGOS, …]:
– Simulation of any communication protocol with adversarial noise
– Large constants, never made explicit

Previous Works

C(ε) = liminf_{n→∞} min_{f : RCC(f)=n} ( n / CC_ε(f) )

Page 10

Our Results

Theorem 1 (Upper Bound): C(ε) ≤ 1 - Ω(√H(ε)) for small ε: Interactive Channel Capacity is strictly smaller than Channel Capacity (1 - H(ε))

Theorem 2 (Lower Bound): C(ε) ≥ 1 - O(√H(ε)) (in the case of alternating turns)

C(ε) = liminf_{n→∞} min_{f : RCC(f)=n} ( n / CC_ε(f) )

Page 11

Channel Types

• Synchronous Channel (this work): Exactly one player sends a bit at each time step

• Asynchronous Channel: If both send bits at the same time these bits are lost

• Two channels: Each player sends a bit at any time


Page 12

Channel Types

• Synchronous Channel (this work): Exactly one player sends a bit at each time step
– The order of turns in a protocol is pre-determined (independent of the inputs, randomness, noise). Otherwise players may send bits at the same time
– Alternating turns is a special case

• Asynchronous Channel: If both send bits at the same time these bits are lost

• Two channels: Each player sends a bit at any time


Page 13

Proof of Upper Bound: C(ε) ≤ 1 - Ω(√H(ε))

Page 14

• Example f with CC_ε > RCC: the 2^k-ary Pointer Jumping Game

• Parameters:
– 2^k-ary tree, depth d
– k = O(1), d → ∞
– ε = log k / k²

• Alice owns odd layers, Bob owns even layers

• Pointer Jumping Game:
– Inputs: Each player gets an edge going out of every node he owns
– Goal: Find the leaf reached (see the code sketch below)

Pointer Jumping

[Figure: tree of degree 2^k and depth d]

C(ε) = liminf_{n→∞} min_{f : RCC(f)=n} ( n / CC_ε(f) )
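The code sketch referenced above: a minimal rendering of the pointer jumping game itself (the data representation and names are my own; the slides describe the game only verbally). Alice's input fixes the outgoing edge at every odd-layer node, Bob's at every even-layer node, and the answer is the leaf reached from the root.

```python
from typing import Dict, Tuple

# Nodes of the 2^k-ary tree of depth d are addressed by their path from the root:
# () is the root, (3,) is its 4th child, (3, 0) that node's 1st child, and so on.
Node = Tuple[int, ...]

def pointer_jumping(alice: Dict[Node, int], bob: Dict[Node, int], d: int) -> Node:
    """Follow the players' pointers for d steps and return the leaf reached."""
    node: Node = ()
    for layer in range(1, d + 1):          # the root's edge (layer 1) belongs to Alice
        edge = alice[node] if layer % 2 == 1 else bob[node]
        node = node + (edge,)
    return node

# Tiny example with k = 1 (a binary tree) and depth d = 2:
alice = {(): 1}                  # Alice owns layer 1: from the root, go to child 1
bob = {(0,): 0, (1,): 1}         # Bob owns layer 2
print(pointer_jumping(alice, bob, d=2))  # (1, 1)
```

Over a noiseless channel the players can simply alternate sending the k-bit index of the chosen child at the current node, which is why RCC ≈ dk on the next slide.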

Page 15

• Example f with CC_ε > RCC: the 2^k-ary Pointer Jumping Game

• Parameters:
– 2^k-ary tree, depth d
– k = O(1), d → ∞
– ε = log k / k²

• Outline of upper bound:
– Clearly, RCC(f) ≈ dk
– We prove CC_ε(f) ≥ d·(k + Ω(log k)) – involved proof!

RCC lower bounds are typically up to a constant. We are interested in the second-order terms.

C(ε) ≤ 1 - Ω(log k / k) = 1 - Ω(√(ε·log(1/ε))) = 1 - Ω(√H(ε)) (numeric check below)

Pointer Jumping

[Figure: tree of degree 2^k and depth d]

C(ε) = liminf_{n→∞} min_{f : RCC(f)=n} ( n / CC_ε(f) )
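The numeric check referenced above (my own; the values of k are only illustrative): with ε = log k / k², the loss term log k / k is within a constant factor of √H(ε).

```python
import math

def binary_entropy(eps: float) -> float:
    return -eps * math.log2(eps) - (1 - eps) * math.log2(1 - eps)

for k in (2**6, 2**10, 2**14):
    eps = math.log2(k) / k**2        # the choice of epsilon on the slide
    loss = math.log2(k) / k          # the Omega(log k / k) loss term
    # The ratio stays roughly constant (about 0.74 here), so log k / k = Theta(sqrt(H(eps)))
    print(k, loss / math.sqrt(binary_entropy(eps)))
```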

Page 16

Bounding CC_ε(PJG) - The Idea

“Any good PJG protocol does the following:”

• Alice starts by sending the first edge (k bits)
– w.p. ≈ εk a bit was flipped

• Case 1: Alice sends additional bits to correct the first edge
– Even if a single error occurred and Alice knows its index, she needs to send the index ⇒ log k bit waste

• Case 2: Bob sends the next edge (k bits)
– w.p. ≈ εk these k bits are wasted, as Bob had the wrong first edge ⇒ in expectation, εk² = log k bit waste

• In both cases, sending the first edge costs k + Ω(log k)!
– ε was chosen to balance the 2 losses (numeric check below)

[Figure: tree of degree 2^k]

ε = log k / k²
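The numeric check referenced above (a back-of-the-envelope of my own; the value of k is illustrative): with ε = log k / k², the two losses are equal.

```python
import math

k = 1024                       # illustrative value
eps = math.log2(k) / k**2      # eps = log k / k^2, as on the slide

case1_waste = math.log2(k)     # Case 1: sending the index of a flipped bit, ~log k bits
case2_waste = (eps * k) * k    # Case 2: w.p. ~eps*k the next k bits are wasted

print(case1_waste, case2_waste)  # both equal log2(k) = 10: the two losses balance
```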

Page 17

• Let players exchange the first 1.25k bits of the protocol. t1 = #bits out of the first 1.25k bits sent by Alice

(well defined due to pre-determined order of turns)

• Case 1 (Alice sends additional bits to correct the first edge): corresponds to t1 ≥ k + 0.5·log k

• Case 2 (Bob sends the next edge): corresponds to t1 < k + 0.5·log k

Bounding CC_ε(PJG) - More Formal

ε = log k / k²

Page 18

• After the exchange of the first 1.25k bits, we “voluntarily” reveal the first edge to Bob.

• The players now play a new PJG of depth d-1. We need to show that sending the first edge of the new PJG also costs k + Ω(log k).

• Challenge: In the new PJG, some info about the players’ inputs may already be known

– How do we measure the players’ progress?


Bounding CC_ε(PJG) - Why is the actual proof challenging?

Page 19

Proof of Lower Bound: C(ε) ≥ 1 - O(√H(ε))

Page 20

Simulation

• Parameters (same):
– k = O(1)
– ε = log k / k²

• Given a communication protocol P, we simulate P over the ε-noisy channel using a recursive protocol:
– The basic step simulates k steps of P
– The i-th inductive step simulates k^(i+1) steps of P

C(ε) = liminf_{n→∞} min_{f : RCC(f)=n} ( n / CC_ε(f) )

Page 21

Simulating Protocol - Basic Step

• Simulating Protocol (Basic Step):

– Players run k steps of P. Alice observes transcript Ta, and Bob transcript Tb

– Players run an O(logk) bit consistency check of Ta,Tb using hash functions, each bit sent many times

– A player that finds an inconsistency starts over and removes this step’s bits from his transcript (see the code sketch below)

[Figure: k bits of protocol P, followed by an O(log k)-bit consistency check; on inconsistency, start over]
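The code sketch referenced above: a minimal rendering of the Basic Step, under simplifications of my own (a toy shared-randomness fingerprint in place of the slide's hash functions, a plain repetition code for the check bits, and only Bob doing the comparison); it illustrates the structure rather than the actual protocol.

```python
import random

def toy_hash(bits, seed, out_len):
    """Toy stand-in for the hash functions on the slide: a short fingerprint of a bit
    string, derived from shared randomness (the seed)."""
    rng = random.Random(hash((seed, tuple(bits))))
    return [rng.randint(0, 1) for _ in range(out_len)]

def noisy_send(bits, eps, rng):
    """The eps-noisy channel: each bit is flipped independently with probability eps."""
    return [b ^ (rng.random() < eps) for b in bits]

def basic_step(run_k_steps, Ta, Tb, k, eps, seed, rng, reps=7):
    """One Basic Step: k steps of P, then an O(log k)-bit consistency check.

    run_k_steps(Ta, Tb) plays the next k steps of P over the noisy channel and returns
    the bits each player recorded; these may differ because of channel errors."""
    a_bits, b_bits = run_k_steps(Ta, Tb)
    check_len = max(1, k.bit_length())                 # O(log k) check bits
    # Alice sends a fingerprint of her transcript; each check bit is repeated reps times
    # over the noisy channel and majority-decoded on Bob's side.
    fp_a = toy_hash(Ta + a_bits, seed, check_len)
    fp_at_bob = [int(2 * sum(noisy_send([bit] * reps, eps, rng)) > reps) for bit in fp_a]
    if fp_at_bob != toy_hash(Tb + b_bits, seed, check_len):
        return Ta, Tb, True                            # inconsistency: redo this step
    return Ta + a_bits, Tb + b_bits, False             # step accepted
```

On the slide the check is symmetric and either player may trigger the restart; the one-sided version above only keeps the sketch short.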

Page 22

Simulating Protocol - Inductive Step

• Simulating Protocol (first inductive step):
– Players run the Basic Step k consecutive times. Alice observes transcript Ta, and Bob transcript Tb (players may go out of sync, but due to the alternating turns they know who should speak next)
– Players run an O(log²k)-bit consistency check of Ta, Tb using hash functions, each bit sent many times
– A player that finds an inconsistency starts over and removes this step’s bits from his transcript (a sketch of the recursion follows below)

[Figure: the Basic Step repeated k times, followed by an O(log²k)-bit consistency check; on inconsistency, start over]
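The recursion sketch referenced above (my own rendering; resynchronization is ignored, and the check length O(log^(i+1) k) for levels beyond the first is an assumption extrapolating the O(log k) / O(log²k) pattern stated on the slides).

```python
def inductive_step(level, simulate_below, consistency_check, k):
    """Level-i block: run the level below k times, then a longer consistency check.

    simulate_below() runs one block of the level below (the Basic Step when level == 1)
    and returns the two players' transcripts for that block.
    consistency_check(level, Ta, Tb) exchanges roughly O(log^(level+1) k) check bits
    (assumed pattern, see the lead-in) and reports whether the transcripts agree."""
    while True:
        Ta, Tb = [], []
        for _ in range(k):                    # k blocks of the previous level
            a_block, b_block = simulate_below()
            Ta += a_block
            Tb += b_block
        if consistency_check(level, Ta, Tb):  # transcripts agree: this block is done
            return Ta, Tb
        # otherwise both players drop this block's bits and redo the level
```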

Page 23

Analysis: Correctness

• The final protocol simulates P with probability 1-o(1):

– If an error occurred or the players went out of sync, they will eventually fix it, as the consistency check checks the whole transcript so far and is done with larger and larger parameters

Page 24

Analysis: Waste in Basic Step

• Length of consistency check: O(log k) bits

• Probability to start over: O(εk)

• Total waste (in expectation): O(log k) + O(εk)·O(k) = O(log k) bits

– ε was chosen to balance the 2 losses
• Fraction of bits wasted: O(log k / k) = O(√(ε·log(1/ε))) = O(√H(ε)) (numeric check below)

ε = log k / k²

[Figure: k bits of protocol P, followed by an O(log k)-bit consistency check; on inconsistency, start over]
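The numeric check referenced above (my own, with an illustrative k): the check length and the expected restart cost are both about log k, so the wasted fraction per Basic Step is O(log k / k).

```python
import math

k = 1024                                         # illustrative value
eps = math.log2(k) / k**2                        # eps = log k / k^2

check_len = math.log2(k)                         # O(log k) check bits per Basic Step
restart_prob = eps * k                           # O(eps*k) chance of starting over
expected_waste = check_len + restart_prob * k    # O(log k) + O(eps*k)*O(k) = O(log k)

print(expected_waste, expected_waste / k)        # ~2*log k wasted bits; fraction ~log k / k
```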

Page 25

Analysis: Waste in First Inductive Step

• Length of consistency check: O(log²k) bits
• Probability to start over (≈ prob of an undetected error in one of the k Basic Steps): << O(1/k^10)
• Total waste (in expectation): O(log²k) + O(1/k^10)·O(k²) = O(log²k) bits
• Fraction of bits wasted: O(log²k / k²) << O(log k / k), negligible compared to the basic step!
– Waste in next inductive steps is even smaller (see the sum below)

[Figure: the Basic Step repeated k times, followed by an O(log²k)-bit consistency check]

ε = log k / k²
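The sum referenced above (my own check; it extrapolates the per-level waste pattern of the first two levels): summing the wasted fraction over all levels is still dominated by the Basic Step, which is what yields the rate 1 - O(log k / k) = 1 - O(√H(ε)).

```python
import math

k = 1024                    # illustrative value
levels = 6                  # enough levels to see the geometric decay

# Fraction of bits wasted at level i: roughly (log k / k)^(i+1)
# (level 0 is the Basic Step: log k / k; level 1: log^2 k / k^2; and so on).
fractions = [(math.log2(k) / k) ** (i + 1) for i in range(levels)]

print(fractions[0])    # ~0.0098: the Basic Step's share
print(sum(fractions))  # ~0.0099: the total is still O(log k / k)
```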

Page 26

Thank You!