
Rate-distortion Theory for Secrecy Systems


Presentation Transcript


  1. Rate-distortion Theory for Secrecy Systems Paul Cuff Electrical Engineering Princeton University

  2. Information Theory • Channel Coding • Source Coding • Secrecy [Diagram: channel and source blocks, each paired with secrecy]

  3. Source Coding • Describe an information signal (source) with a message. [Diagram: Information → Encoder → Message → Decoder → Reconstruction]

  4. Entropy • If X^n is i.i.d. according to p_X • R > H(X) is necessary and sufficient for lossless reconstruction [Diagram: space of X^n sequences; enumerate the typical set]
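
The slide's claim can be checked numerically: a minimal sketch (the Bernoulli(0.3) source and n = 100 are illustrative assumptions, not from the slides) computing H(X) and the approximate size 2^{nH(X)} of the typical set that a rate R > H(X) must enumerate.

```python
import math

def entropy(p):
    """Shannon entropy H(X) in bits for a distribution given as a dict symbol -> probability."""
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

# Illustrative source: Bernoulli(0.3), so H(X) ≈ 0.881 bits.
pX = {0: 0.7, 1: 0.3}
H = entropy(pX)

# For large n, roughly 2^{n H(X)} typical sequences out of 2^n total,
# so any rate R > H(X) bits/symbol suffices to index the typical set.
n = 100
print(f"H(X) = {H:.3f} bits")
print(f"typical set size ≈ 2^{n * H:.1f} of 2^{n} sequences")
```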

  5. Many Methods • For lossless source coding, the encoding method is not so important • It should simply use the full entropy of the bits

  6. Single Letter Encoding (method 1) • Encode each Xi separately • Under the constraints of decodability, Huffman codes are optimal • Expected length is within one bit of entropy • Encode tuples of symbols to get closer to the entropy limit
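
Method 1 can be sketched concretely. This toy Huffman construction (the four-symbol dyadic distribution is an illustrative assumption) tracks only codeword lengths, which is enough to verify the slide's claim that the expected length lies within one bit of the entropy.

```python
import heapq
import math

def huffman_lengths(p):
    """Codeword lengths of a binary Huffman code for distribution p (dict symbol -> probability)."""
    heap = [(q, [s]) for s, q in p.items()]
    heapq.heapify(heap)
    lengths = {s: 0 for s in p}
    while len(heap) > 1:
        q1, s1 = heapq.heappop(heap)
        q2, s2 = heapq.heappop(heap)
        for s in s1 + s2:  # each merge adds one bit to every symbol in the merged subtree
            lengths[s] += 1
        heapq.heappush(heap, (q1 + q2, s1 + s2))
    return lengths

p = {'a': 0.5, 'b': 0.25, 'c': 0.125, 'd': 0.125}
L = huffman_lengths(p)
avg = sum(p[s] * L[s] for s in p)
H = -sum(q * math.log2(q) for q in p.values())
print(avg, H)  # H <= avg < H + 1; for a dyadic distribution they coincide
```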

  7. Random Binning (method 2) • Assign to each X^n sequence a random bit sequence (hash function), e.g. 0100110101011 [Diagram: space of X^n sequences, each labeled with a random bit string]
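
A minimal sketch of method 2, using a cryptographic hash as a stand-in for the random bin assignment (the choice of SHA-256 and the 13-bit rate are illustrative assumptions). A decoder would then search the typical set for the unique typical sequence landing in the received bin.

```python
import hashlib

def bin_index(x_seq, rate_bits):
    """Random binning via a hash: map a source sequence to one of 2**rate_bits bins."""
    digest = hashlib.sha256(bytes(x_seq)).digest()
    return int.from_bytes(digest, 'big') % (1 << rate_bits)

x = [0, 1, 1, 0, 1, 0, 0, 1]
label = bin_index(x, 13)
print(format(label, '013b'))  # a 13-bit bin label for this sequence
```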

  8. Linear Transformation (method 3) • The message is the source X^n multiplied by a random matrix J [Diagram: Source X^n → Random Matrix J → Message]
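
Method 3 can be sketched as a mod-2 matrix multiply (the dimensions and fixed seed are illustrative assumptions). Because the map is linear over GF(2), hashing the XOR of two sequences gives the XOR of their messages.

```python
import random

def random_linear_hash(x, m, seed=0):
    """Compress the n-bit source x to m bits via a random binary matrix J: message = J x mod 2."""
    rng = random.Random(seed)
    n = len(x)
    J = [[rng.randrange(2) for _ in range(n)] for _ in range(m)]
    return [sum(J[i][j] * x[j] for j in range(n)) % 2 for i in range(m)]

x = [1, 0, 1, 1, 0, 0, 1, 0]
msg = random_linear_hash(x, m=5)
```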

  9. Summary • For lossless source coding, the structure of the communication doesn’t matter much [Plot: information gathered rises to H(X^n) as message bits are received]

  10. Lossy Source Coding • What if the decoder must reconstruct with less than complete information? • Error probability will be close to one • Distortion as a performance metric

  11. Poor Performance • Random binning and random linear transformations are useless! [Plot: distortion E d(X,Y) vs. message bits received, with a time-sharing line; Massey conjecture: time sharing is optimal for linear codes]

  12. Puzzle • Describe an n-bit random sequence • Allow 1 bit of distortion • Send only 1 bit

  13. Rate Distortion Theorem • [Shannon] • Choose p(y|x): R(D) = min I(X;Y), minimized over all p(y|x) satisfying E d(X,Y) ≤ D
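
For the special case of a Bernoulli(1/2) source under Hamming distortion, Shannon's formula evaluates in closed form to R(D) = 1 − h(D), where h is the binary entropy function. A minimal sketch (the test points are illustrative):

```python
import math

def h2(p):
    """Binary entropy function in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def rate_distortion_bsc(D):
    """R(D) = 1 - h(D) for a Bernoulli(1/2) source under Hamming distortion; zero for D >= 1/2."""
    return max(0.0, 1.0 - h2(min(D, 0.5)))

print(rate_distortion_bsc(0.0))   # 1.0: lossless needs the full entropy
print(rate_distortion_bsc(0.11))  # ≈ 0.5: one permitted bit-flip in nine halves the rate
```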

  14. Structure of Useful Partial Information • Coordination (given source P_X, construct Y^n ~ P_{Y|X}) • Empirical • Strong

  15. Empirical Coordination Codes • Codebook • Random subset of Yn sequences • Encoder • Find the codeword that has the right joint first-order statistics with the source
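
The encoder's rule above — pick the codeword whose empirical joint distribution with the source is closest to the target P_X P_{Y|X} — can be sketched as follows (the BSC(0.1)-style target, sequence length, and codebook size are illustrative assumptions):

```python
import random
from collections import Counter

def joint_type(x, y):
    """Empirical joint distribution (first-order statistics) of two equal-length sequences."""
    n = len(x)
    c = Counter(zip(x, y))
    return {pair: c[pair] / n for pair in c}

def tv_distance(t, target):
    """Total-variation distance between two joint types given as dicts."""
    keys = set(t) | set(target)
    return 0.5 * sum(abs(t.get(k, 0) - target.get(k, 0)) for k in keys)

# Illustrative target: X ~ Bernoulli(1/2), Y obtained from X through a BSC(0.1),
# so P(x, y) = 0.45 if x == y and 0.05 otherwise.
target = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}

rng = random.Random(1)
x = [rng.randrange(2) for _ in range(200)]
codebook = [[rng.randrange(2) for _ in range(200)] for _ in range(64)]

# Encoder: choose the codeword whose joint type with x best matches the target.
best = min(codebook, key=lambda y: tv_distance(joint_type(x, y), target))
```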

  16. Strong Coordination • A black box acts like a memoryless channel P_{Y|X} • X and Y are an i.i.d. multisource [Diagram: Source → black box with communication resources → Output]

  17. Strong Coordination • Synthetic channel P_{Y|X} • Related to: • Reverse Shannon Theorem [Bennett et al.] • Quantum Measurements [Winter] • Communication Complexity [Harsha et al.] • Strong Coordination [Cuff-Permuter-Cover] • Generating Correlated R.V.s [Anantharam, Gohari, et al.] [Diagram: Node A (source) sends a message to Node B (output); both share common randomness]

  18. Structure of Strong Coord. [Diagram, with common randomness K]

  19. Information Theoretic Security

  20.–22. Wiretap Channel [Wyner 75]

  23.–25. Confidential Messages [Csiszár, Körner 78]

  26. Merhav 2008

  27. Villard-Piantanida 2010

  28. Other Examples of “rate-equivocation” theory • Gunduz-Erkip-Poor 2008 • Lia-H. El-Gamal 2008 • Tandon-Ulukus-Ramchandran 2009 • …

  29. Rate-distortion theory (secrecy)

  30. Achievable Rates and Payoff Given [Schieler, Cuff 2012 (ISIT)]

  31. How to Force High Distortion • Randomly assign bins • Each bin contains exponentially many sequences • Adversary only knows the bin • Adversary has no knowledge of the particular sequence, only knowledge of its bin

  32. Causal Disclosure

  33. Causal Disclosure (case 1)

  34. Causal Disclosure (case 2)

  35. Example • Source distribution is Bernoulli(1/2). • Payoff: One point if Y=X but Z≠X.

  36. Rate-payoff Regions

  37. General Disclosure • Causal or non-causal

  38. Strong Coord. for Secrecy • Channel synthesis [Diagram: Node A (information) → Node B (action); adversary mounts an attack] • Not an optimal use of resources!

  39. Strong Coord. for Secrecy • Channel synthesis [Diagram: Node A (information) → Node B (action); adversary mounts an attack] • Reveal auxiliary U^n “in the clear”

  40. Payoff-Rate Function • Maximum achievable average payoff • Markov relationship: Theorem:

  41. Structure of Secrecy Code [Diagram, with key K]

  42. Intermission • Equivocation next

  43. Log-loss Distortion • The reconstruction space of Z is the set of probability distributions • d(x, z) = log 1/z(x)

  44. Best Reconstruction Yields Entropy • The expected log-loss is minimized by reporting the true (conditional) distribution, and the minimum equals the (conditional) entropy
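
This slide's claim can be verified numerically: a minimal sketch (the Bernoulli(0.3) source and the uniform alternative are illustrative assumptions) showing that reporting the true distribution achieves expected log-loss exactly H(X), while any other reported distribution does worse.

```python
import math

def expected_log_loss(p, z):
    """E[log 1/z(X)] in bits when X ~ p and the reconstruction is the distribution z."""
    return sum(p[x] * -math.log2(z[x]) for x in p if p[x] > 0)

p = {0: 0.7, 1: 0.3}
H = expected_log_loss(p, p)                      # reporting z = p gives exactly H(X)
worse = expected_log_loss(p, {0: 0.5, 1: 0.5})   # any other z incurs extra loss D(p||z)
print(H, worse)
```

The gap between the two values is the Kullback-Leibler divergence D(p‖z), which is zero only at z = p.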

  45. Log-loss (disclose X causally)

  46. Log-loss (disclose Y causally)

  47. Log-loss (disclose X and Y)

  48. Result 1 from Secrecy R-D Theory

  49. Result 2 from Secrecy R-D Theory
