
Interactive Channel Capacity

Gillat Kol (IAS), joint work with Ran Raz (Weizmann + IAS)


Presentation Transcript


  1. Interactive Channel Capacity Gillat Kol (IAS) joint work with Ran Raz (Weizmann + IAS)

  2. “A Mathematical Theory of Communication” • Claude Shannon, 1948 • An exact formula for the channel capacity of any noisy channel

  3. Shannon: Channel Capacity • ε-noisy channel: Each bit is flipped with prob ε (independently) • Alice wants to send an n-bit message to Bob. How many bits does Alice need to send over the ε-noisy channel, so Bob can retrieve the message w.p. 1-o(1)? Is the blow-up even constant? [Diagram: binary symmetric channel, 0→0 and 1→1 each w.p. 1-ε; A sends n bits to B over a noiseless channel vs. ? bits over the ε-noisy channel]
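The ε-noisy channel on this slide is the binary symmetric channel; a minimal Python sketch (the function name `bsc` and the sample message are mine):

```python
import random

def bsc(bits, eps, rng=None):
    """Binary symmetric channel: flip each bit independently w.p. eps."""
    rng = rng or random.Random(0)
    return [b ^ (rng.random() < eps) for b in bits]

msg = [1, 0, 1, 1, 0, 0, 1, 0]
received = bsc(msg, eps=0.25)   # some bits of msg may arrive flipped
```

With `eps=0` the channel is noiseless and with `eps=1` every bit is flipped; the interesting regime is small constant ε.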

  4. Shannon: Channel Capacity • ε-noisy channel: Each bit is flipped with prob ε (independently) • Alice wants to send an n-bit message to Bob. How many bits does Alice need to send over the ε-noisy channel, so Bob can retrieve the message w.p. 1-o(1)? • [Shannon ’48]: # bits ≈ n / (1 - H(ε)) • Entropy function: H(ε) = -ε log(ε) - (1-ε) log(1-ε) • Matching upper and lower bounds • Channel Capacity = 1 - H(ε)
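Shannon's formula from the slide can be written down directly (a sketch; `shannon_cost` is my name for the bound):

```python
from math import log2

def H(eps):
    """Binary entropy: H(eps) = -eps*log2(eps) - (1-eps)*log2(1-eps)."""
    if eps in (0.0, 1.0):
        return 0.0
    return -eps * log2(eps) - (1 - eps) * log2(1 - eps)

def shannon_cost(n, eps):
    """[Shannon '48]: sending an n-bit message over the eps-noisy channel
    takes ~ n / (1 - H(eps)) bits; the channel capacity is 1 - H(eps)."""
    return n / (1 - H(eps))
```

For example, H(1/2) = 1, so at ε = 1/2 the capacity 1 - H(ε) drops to 0 and no information gets through.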

  5. Us: Interactive Channel Capacity • Alice and Bob want to have an n-bit-long conversation. How many bits do they need to send over the ε-noisy channel, so both can retrieve the transcript w.p. 1-o(1)? [Diagram: n bits between A and B over a noiseless channel vs. ? bits over the ε-noisy channel]

  6. Communication Complexity • Setting: Alice has input x, Bob has input y. They want to compute f(x,y) (f is publicly known) • Communication Complexity of f: The least number of bits they need to communicate • Deterministic, CC(f): ∀x,y, compute f(x,y) w.p. 1 • Randomized, RCC(f): ∀x,y, compute f(x,y) w.p. 1-o(1); players share a random string • Noisy, CCε(f): ∀x,y, compute f(x,y) w.p. 1-o(1); players communicate over the ε-noisy channel; players share a random string
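To make the randomized measure RCC concrete, here is a sketch of the classic public-randomness protocol for Equality: per trial Alice sends one bit, the inner product of her input with a shared random vector (the function name and the `trials` parameter are mine):

```python
import random

def equality_protocol(x, y, shared_seed, trials=32):
    """Decide f(x,y) = [x == y]. Per trial, Alice sends the single bit
    <x, r> mod 2 for a shared random r; Bob compares it with <y, r> mod 2.
    If x != y, each trial catches the difference w.p. 1/2, so the error
    probability is 2^-trials; total communication is `trials` bits."""
    rng = random.Random(shared_seed)            # the shared random string
    for _ in range(trials):
        r = [rng.randrange(2) for _ in range(len(x))]
        alice_bit = sum(xi * ri for xi, ri in zip(x, r)) % 2
        bob_bit = sum(yi * ri for yi, ri in zip(y, r)) % 2
        if alice_bit != bob_bit:
            return False                        # definitely x != y
    return True                                 # equal w.h.p.
```

Equality thus has RCC = O(1) bits for constant error, versus Ω(n) deterministically, which is why the slide distinguishes CC(f) from RCC(f).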

  7. Def: Interactive Channel Capacity • RCC(f) = Randomized CC (over the noiseless channel) • CCε(f) = Noisy CC (over the ε-noisy channel) • C(ε) = the limit of the ratio RCC(f) / CCε(f), over functions f with RCC(f) → ∞ • *Results hold when we use CC(f) instead of RCC(f) • *Results hold for worst case & average case

  8. Def: Interactive Channel Capacity • RCC(f) = Randomized CC (over the noiseless channel) • CCε(f) = Noisy CC (over the ε-noisy channel) • For f(x,y) = x (msg transmission), we get Channel Capacity ⇒ Interactive Channel Capacity ≤ Channel Capacity • In the interactive case, an error in the first bit may cause the whole conversation to be meaningless. We may need to “encode” every bit separately.

  9. Previous Works • [Schulman ’92]: • Theorem: If RCC(f) = n then CCε(f) ≤ O(n) • Corollary: C(ε) > 0 • Open Question: Is Interactive Channel Capacity = Channel Capacity? • Many other works [Sch, BR, B, GMS, BK, BN, FGOS…]: Simulation of any communication protocol with adversarial noise; large constants, never made explicit

  10. Our Results • Theorem 1 (Upper Bound): C(ε) ≤ 1 - Ω(√H(ε)) • For small ε: Interactive Channel Capacity is strictly smaller than Channel Capacity (1 - H(ε)) • Theorem 2 (Lower Bound): C(ε) ≥ 1 - O(√H(ε)) (in the case of alternating turns)

  11. Channel Types • Synchronous Channel: Exactly one player sends a bit at each time step (this work) • Asynchronous Channel: If both send bits at the same time, these bits are lost • Two Channels: Each player can send a bit at any time

  12. Channel Types • Synchronous Channel: Exactly one player sends a bit at each time step (this work) • The order of turns in a protocol is pre-determined (independent of the inputs, randomness, noise); otherwise players may send bits at the same time • Alternating turns is a special case • Asynchronous Channel: If both send bits at the same time, these bits are lost • Two Channels: Each player can send a bit at any time

  13. Proof of Upper Bound: C(ε) ≤ 1 - Ω(√H(ε))

  14. Pointer Jumping • Example f with CCε > RCC: 2^k-Pointer Jumping Game • Parameters: • 2^k-ary tree (deg = 2^k), depth d • k = O(1), d → ∞ • ε = log k / k² • Alice owns odd layers, Bob owns even layers • Pointer Jumping Game: • Inputs: Each player gets an edge going out of every node he owns • Goal: Find the leaf reached

  15. Pointer Jumping • Example f with CCε > RCC: 2^k-Pointer Jumping Game • Parameters: • 2^k-ary tree (deg = 2^k), depth d • k = O(1), d → ∞ • ε = log k / k² • Outline of upper bound: • Clearly, RCC(f) ≤ dk • We prove CCε(f) ≥ d(k + Ω(log k)) – involved proof! • RCC lower bounds are typically up-to a constant; we are interested in the second order terms • ⇒ C(ε) ≤ 1 - Ω(log k / k) = 1 - Ω(√(ε log(1/ε))) = 1 - Ω(√H(ε))
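The RCC(f) ≤ dk direction is just the obvious alternating protocol: at each layer the owner of the current node announces its k-bit outgoing edge. A sketch with toy inputs (the encoding of nodes as tuples and all names are my own):

```python
from itertools import product

def play_pjg(alice_edges, bob_edges, depth, k):
    """Alternating-turns protocol: at layer i, that layer's owner sends
    the k-bit edge out of the current node (Alice: odd layers, Bob: even).
    Total communication: depth * k bits."""
    node, transcript = (), []
    for layer in range(1, depth + 1):
        edges = alice_edges if layer % 2 == 1 else bob_edges
        child = edges[node]                                  # in [0, 2^k)
        transcript += [(child >> i) & 1 for i in range(k)]   # send k bits
        node += (child,)
    return transcript, node                                  # node = the leaf

# Toy inputs: an outgoing edge for every node in layers 1..d of the 2^k-ary tree
k, d = 2, 3
alice_edges, bob_edges = {}, {}
for layer in range(1, d + 1):
    for node in product(range(1 << k), repeat=layer - 1):
        owner = alice_edges if layer % 2 == 1 else bob_edges
        owner[node] = (sum(node) + layer) % (1 << k)

transcript, leaf = play_pjg(alice_edges, bob_edges, d, k)
```

The hard part of the slide is the other direction: showing that over the ε-noisy channel every layer must cost k + Ω(log k) bits, not k.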

  16. Bounding CCε(PJG) - The Idea (deg = 2^k, ε = log k / k²) • “Any good PJG protocol does the following:” • Alice starts by sending the first edge (k bits); w.p. ≈ εk a bit was flipped • Case 1: Alice sends additional bits to correct the first edge. Even if a single error occurred and Alice knows its index, she needs to send the index ⇒ log k bit waste • Case 2: Bob sends the next edge (k bits); w.p. ≈ εk these k bits are wasted, as Bob had the wrong first edge ⇒ in expectation, εk² = log k bit waste • In both cases, sending the first edge costs k + Ω(log k)! • ε was chosen to balance the 2 losses: ε = log k / k²
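The balancing act on this slide is simple arithmetic and can be sanity-checked numerically (k = 1024 is an arbitrary choice of mine):

```python
from math import log2

k = 1024
eps = log2(k) / k**2       # the slide's noise rate

# Case 1: resending the index of one flipped bit wastes log k bits.
case1_waste = log2(k)
# Case 2: w.p. ~ eps*k Bob's k-bit edge answers a wrong first edge,
# so the expected waste is eps * k * k bits.
case2_waste = eps * k * k

assert case1_waste == case2_waste == log2(k)   # eps balances the two losses
```

A larger ε would make Case 2 dominate, a smaller one Case 1; ε = log k / k² equalizes them at log k wasted bits per edge.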

  17. Bounding CCε(PJG) - More Formal (ε = log k / k²) • Let players exchange the first 1.25k bits of the protocol • t1 = # bits out of the first 1.25k bits sent by Alice (well defined due to pre-determined order of turns) • Case 1: Alice sends additional bits to correct the first edge; corresponds to t1 ≥ k + 0.5 log k • Case 2: Bob sends the next edge; corresponds to t1 < k + 0.5 log k

  18. Bounding CCε(PJG) - Why is the actual proof challenging? • After the exchange of the first 1.25k bits, we “voluntarily” reveal the first edge to Bob. The players now play a new PJG of depth d-1. • We need to show that sending the first edge of the new PJG also costs k + Ω(log k). • Challenge: In the new PJG, some info about the players’ inputs may already be known. How do we measure the players’ progress?

  19. Proof of Lower Bound: C(ε) ≥ 1 - O(√H(ε))

  20. Simulation • Parameters (same): k = O(1), ε = log k / k² • Given a communication protocol P, we simulate P over the ε-noisy channel using a recursive protocol: • The basic step simulates k steps of P • The i-th inductive step simulates k^(i+1) steps of P

  21. Simulating Protocol - Basic Step • Players run k steps of P. Alice observes transcript Ta, and Bob transcript Tb • Players run an O(log k) bit consistency check of Ta, Tb using hash functions, each bit sent many times • A player that finds an inconsistency starts over and removes this step’s bits from his transcript
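One way to realize the basic step's hash comparison in code (a sketch, not the paper's construction: `zlib.crc32` stands in for the hash family, and `REPEATS`/`NBITS` are my repetition and check-length parameters):

```python
import zlib

REPEATS = 5          # each check bit is repeated to survive channel noise
NBITS = 8            # O(log k) check bits

def check_bits(transcript):
    """Hash a transcript into NBITS bits, each repeated REPEATS times."""
    h = zlib.crc32(bytes(transcript))
    bits = [(h >> i) & 1 for i in range(NBITS)]
    return [b for b in bits for _ in range(REPEATS)]

def majority_decode(received):
    """Recover each check bit by majority vote over its REPEATS copies."""
    return [int(sum(received[i * REPEATS:(i + 1) * REPEATS]) > REPEATS // 2)
            for i in range(NBITS)]

# Bob compares Alice's (noisy) check bits with his own; on a mismatch,
# the real protocol restarts the basic step.
ta = [1, 0, 1, 1]                 # Alice's view of the k-step transcript
noisy = check_bits(ta)
noisy[0] ^= 1                     # the channel flips one repeated copy...
assert majority_decode(noisy) == majority_decode(check_bits(ta))  # ...survives
```

Repeating each check bit makes the O(log k)-bit check itself robust to a few flips, which is why its length, not its reliability, dominates the waste.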

  22. Simulating Protocol - Inductive Step • Simulating Protocol (first inductive step): • Players run the Basic Step k consecutive times. Alice observes transcript Ta, and Bob transcript Tb (players may go out of sync, but due to the alternating turns they know who should speak next) • Players run an O(log²k) bit consistency check of Ta, Tb using hash functions, each bit sent many times • A player that finds an inconsistency starts over and removes this step’s bits from his transcript

  23. Analysis: Correctness • The final protocol simulates P with probability 1-o(1): • If an error occurred or the players went out of sync, they will eventually fix it, as the consistency check checks the whole transcript so far and is done with larger and larger parameters

  24. Analysis: Waste in Basic Step (ε = log k / k²) • Length of consistency check: O(log k) bits • Probability to start over: O(εk) • Total waste (in expectation): O(log k) + O(εk)·O(k) = O(log k) bits • ε was chosen to balance the 2 losses • Fraction of bits wasted: O(log k / k) = O(√(ε log(1/ε))) = O(√H(ε))
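The last equality uses H(ε) ≈ ε log(1/ε) for small ε; a quick numeric check that log k / k and √(ε log(1/ε)) agree up to a constant factor (k = 1024 is my arbitrary choice):

```python
from math import log2, sqrt

k = 1024
eps = log2(k) / k**2

fraction_wasted = log2(k) / k            # O(log k / k) waste per basic step
sqrt_H = sqrt(eps * log2(1 / eps))       # ~ sqrt(H(eps)) for small eps

ratio = fraction_wasted / sqrt_H         # Theta(1), as the slide claims
assert 0.5 < ratio < 1.5
```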

  25. Analysis: Waste in First Inductive Step (ε = log k / k²) • Length of consistency check: O(log²k) bits • Probability to start over: ≪ O(1/k^10) (prob of an undetected error in one of the k Basic Steps) • Total waste (in expectation): O(log²k) + O(1/k^10)·O(k²) = O(log²k) bits • Fraction of bits wasted: O(log²k / k²) ≪ O(log k / k), negligible compared to the basic step! • Waste in next inductive steps is even smaller

  26. Thank You!
