
The Porosity of Additive Noise Sequences




  1. Vinith Misra, Tsachy Weissman {vinith, tsachy}@stanford.edu The Porosity of Additive Noise Sequences ISIT, 7/5/2012

  2. Outline • Universality and porosity • Related work • Main results • Converse ideas • Predictability and practicality

  3. (Lossless) Source Coding • Source model is known. • The optimal compression rate (the entropy rate) is achievable.
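To make the known-model baseline concrete, here is a minimal sketch (our illustration, not from the slides) of the optimal rate for an i.i.d. source: the Shannon entropy in bits per symbol.

```python
import math

def entropy_rate(p):
    """Shannon entropy (bits/symbol) of an i.i.d. source with distribution p.
    For a known i.i.d. source this is the optimal lossless compression rate."""
    return -sum(q * math.log2(q) for q in p if q > 0)

print(entropy_rate([0.5, 0.5]))  # fair binary source: 1.0 bits/symbol
print(entropy_rate([0.9, 0.1]))  # biased source compresses below 1 bit
```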

  4. Universal Source Coding • Encoder and decoder can *adapt*. • Strong sense of universality: optimal compression for *every* source model.

  5. Channel Coding • Relevant component: the channel model. • Encoder/decoder can be optimized for the given model.

  6. Universal Channel Coding • Channel model unknown. • Fixed encoder, adaptive *decoder*. • Universality not as strong as in source coding.

  7. Feedback to the rescue? • Encoder and decoder can adapt. • Stronger form of universality? • (More fundamental channel coding problem?)

  8. Modulo-additive channels • Stronger analogy with source coding. • Source process <-> Noise process. • More general: individual noise sequence.
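The channel model of slide 8 is easy to state in code. A hedged sketch (function name is ours): the output is the input plus an arbitrary individual noise sequence, modulo the alphabet size.

```python
def mod_additive_channel(x, z, M):
    """Modulo-additive channel: y_i = (x_i + z_i) mod M.
    The noise z is an arbitrary (individual) sequence, not a stochastic model."""
    return [(xi + zi) % M for xi, zi in zip(x, z)]

x = [0, 1, 1, 0, 1]   # channel inputs
z = [0, 0, 1, 1, 0]   # an individual binary noise sequence
y = mod_additive_channel(x, z, 2)
print(y)              # the decoder sees y = x XOR z
```

This is where the source/channel analogy bites: recovering x from y is exactly the problem of separating out the noise sequence z, which plays the role of the individual source sequence in Lempel-Ziv compression.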

  9. “Porosity” • How rapidly can the encoder/decoder communicate over a given noise sequence? • The best possible rate defines the porosity of that noise sequence.

  10. Individual sequence properties • Compressibility (Lempel/Ziv) • Predictability (Feder/Merhav/Gutman) • Denoisability (Weissman et al.)
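The first of these properties, LZ compressibility, has a short operational form: incrementally parse the sequence into distinct phrases and normalize the phrase count. A sketch under the usual LZ78 conventions (our illustration; the slides state no code):

```python
import math

def lz78_compressibility(s):
    """Estimate the compressibility of an individual sequence via LZ78
    incremental parsing: c distinct phrases give the estimate
    c * log2(c) / n, which asymptotically upper-bounds what any
    finite-state compressor can achieve (Lempel/Ziv)."""
    phrases, current, c = set(), "", 0
    for ch in s:
        current += ch
        if current not in phrases:   # close off a new distinct phrase
            phrases.add(current)
            c += 1
            current = ""
    if current:                      # count a trailing partial phrase
        c += 1
    n = len(s)
    return c * math.log2(c) / n if c > 1 else 0.0

print(lz78_compressibility("01" * 500))  # highly structured: small estimate
```

The talk's parallel is that the same per-sequence quantity that limits finite-state compression of a source sequence governs, through porosity, finite-state communication against a noise sequence.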

  11. LZ Parallel #1: Individual sequences • LZ: individual source sequence. • Porosity: individual noise sequence.

  12. LZ Parallel #2: Finite-state encoding/decoding • LZ: finite-state source encoder and decoder. • Porosity: finite-state channel encoder and decoder.

  13. LZ Parallel #3: Finite-state converse • LZ: an FSM can compress no better than the compressibility. • Porosity: an FSM can transmit no faster than the porosity.

  14. LZ Parallel #4: Universal achievability schemes • LZ, two universal achievabilities: a sequence of FS schemes, or a single infinite-state scheme. • Porosity, two universal achievabilities: a sequence of FS schemes, or a single infinite-state scheme (Lomnitz/Feder).

  15. Outline • Universality and porosity • Previous work • Main results • Converse and achievability ideas • Predictability and practicality

  16. Lomnitz/Feder (2011) • Competitive universality introduced. • Rate-adaptive (infinite-state) scheme achieves the porosity. • No “iterated fixed-blocklength” scheme does better.

  17. Shayevitz/Feder (2009) • Achieves first-order “empirical capacity.” • Can generalize to m-order empirical capacity. • Related: Eswaran/Sarwate/Sahai/Gastpar [2010].
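For a binary channel, the first-order empirical capacity mentioned here has a simple closed form: one minus the binary entropy of the empirical flip fraction of the noise. A minimal sketch (function name is ours):

```python
import math

def empirical_capacity(z):
    """First-order empirical capacity of a binary noise sequence:
    1 - H(q), where q is the empirical fraction of ones in z.
    Shayevitz/Feder-style feedback schemes target rates of this form."""
    q = sum(z) / len(z)
    h = 0.0 if q in (0.0, 1.0) else -q * math.log2(q) - (1 - q) * math.log2(1 - q)
    return 1.0 - h

z = [0] * 90 + [1] * 10        # noise flips 10% of the samples
print(empirical_capacity(z))   # ~0.531 bits per channel use
```

Higher-order versions replace the single frequency q with m-th order empirical statistics of z, which is the generalization the slide refers to.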

  18. Outline • Universality and porosity • Previous work • Main results • Converse ideas • Predictability and practicality

  19. Finite-state (FS) schemes

  20. Converse • Suppose an FS scheme achieves rate R with vanishing error, with positive probability. • Then R can be no greater than the porosity of the noise sequence, i.e., no finite-state scheme transmits faster than porosity.

  21. Achievability • There exists a sequence of FS schemes whose rates approach the porosity for all noise sequences.

  22. Outline • Universality and porosity • Previous work • Main results • Converse ideas • Predictability and practicality

  23. Channel Coding into Source Coding

  26. Deterministic into Stochastic

  27. Cthulhu vs. Shannon • Cannot beat 1 bit per sample. • Entropy code.

  28. Outline • Universality and porosity • Previous work • Main results • Converse ideas • Predictability and practicality

  29. A linear-complexity alternative • LZ-based universal predictor (Feder et al. [92]). • 1st-order S/F scheme.
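The engine of this alternative is sequential prediction of the noise. As an illustrative stand-in for the LZ-based predictor of Feder et al. (which is more involved), a minimal count-based Markov predictor shows the idea; all names here are ours, not the paper's.

```python
from collections import defaultdict

def markov_predict(seq, order=1):
    """Sketch of a sequential predictor (a simple count-based Markov
    predictor, NOT Feder et al.'s LZ-based construction): guess the symbol
    seen most often after the current length-`order` context, then update.
    Returns the fraction of correct predictions."""
    counts = defaultdict(lambda: defaultdict(int))
    correct = 0
    for i in range(order, len(seq)):
        ctx = tuple(seq[i - order:i])
        if counts[ctx]:                               # have we seen this context?
            guess = max(counts[ctx], key=counts[ctx].get)
            correct += (guess == seq[i])
        counts[ctx][seq[i]] += 1                      # learn sequentially
    return correct / (len(seq) - order)

z = [0, 1] * 200                  # perfectly predictable after a short warm-up
print(markov_predict(z))          # close to 1.0
```

Like the LZ predictor, this runs in time linear in the sequence length per fixed order, which is the practicality point the slide is making.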

  31. Performance? • Shayevitz/Feder rate guarantee, stated in terms of the predictability of the noise sequence. • (Closing the gap to porosity?)
