
Shifted Codes



  1. Best viewed on-screen in slide-show mode. Shifted Codes. Sachin Agarwal, Deutsche Telekom A.G., Laboratories, Ernst-Reuter-Platz 7, 10587 Berlin, Germany. Joint work with Andrew Hagedorn and Ari Trachtenberg at Boston University.

  2. Outline • Motivation & Problem Definition • Background • Rateless Codes • Digital Fountain Codes • Shifted Codes • Motivation – Inefficiency of LT codes • Construction of Shifted Codes • Analysis – Communication and Computation Complexity • Experimental Comparison • LT vs. Shifted Codes • Constrained Sensors – Deployment on TMotes • Discussion and Round-up


  4. Partial Information • Transmission channel with erasures: the transmitter sends input symbols, and the receiver obtains only a subset of them as received symbols


  10. Partial Information • Multiple receivers (Receiver 1, Receiver 2, Receiver 3) may suffer different erasures from the same transmitter • Given that multiple receivers hold different partial information, how can all of them be brought up to full information efficiently over a broadcast channel?

  11. Another Example: Partial Information • A broadcaster holds the latest version of the information, while multiple mobile devices (Mobile device 1, 2, 3) may hold outdated information • Examples: mobile databases, sensor network information aggregation, RSS updates for devices

  12. Problem Definition • Given an encoding host with k input symbols and a decoding host that already has n out of the k input symbols, the goal is to efficiently determine the remaining k−n input symbols at the decoding host. • The encoding host has no information about which k−n input symbols are missing at the decoding host. • Different decoding hosts may be missing different input symbols. • Efficiency • Communication complexity – the information transmitted from the encoding host to the decoding host should be close in size to the k−n missing input symbols themselves • Computational complexity – the algorithm must be computationally tractable

  13. Information Theoretic Lower Bound • Known result • At a minimum, the encoding host must send only a little less than the exact contents of the missing input symbols to the decoding host, i.e., on the order of (k−n)·b bits. • Intuition • The decoding host is missing k−n input symbols, each of b bits • This is a special case of set reconciliation • k – number of input symbols; n – number of input symbols known a priori at the decoding host; b – field size of each symbol


  15. Rateless Codes • Definition • “A class of erasure codes with the property that a potentially limitless sequence of encoding symbols can be generated from a given set of source symbols such that the original source symbols can be recovered from any subset of the encoding symbols of size equal to or only slightly larger than the number of source symbols. ” • Wikipedia.org • Examples • Random Linear Codes • LT Codes • Raptor Codes • Shifted Codes • …
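Of the examples above, random linear codes are the simplest to sketch. The following toy illustration (my own code and naming, not any production implementation) treats each encoded symbol as a random GF(2) linear combination, i.e. an XOR of a random subset of input symbols, and decodes by Gaussian elimination:

```python
import random

def rlc_encode(symbols, rng):
    """One random linear combination over GF(2): XOR of a random nonempty subset."""
    mask = 0
    while mask == 0:                       # skip the useless all-zero combination
        mask = rng.getrandbits(len(symbols))
    value = 0
    for i in range(len(symbols)):
        if mask >> i & 1:
            value ^= symbols[i]
    return mask, value                     # bit i of mask set <=> symbols[i] included

def rlc_decode(k, equations):
    """Gauss-Jordan elimination over GF(2); masks are bit-vectors of coefficients."""
    pivots = {}                            # pivot bit position -> (mask, value)
    for mask, val in equations:
        for p, (pm, pv) in pivots.items():
            if mask >> p & 1:              # reduce against existing pivot rows
                mask, val = mask ^ pm, val ^ pv
        if mask == 0:
            continue                       # linearly dependent: no new information
        p = mask.bit_length() - 1
        for q, (qm, qv) in pivots.items():
            if qm >> p & 1:                # clear the new pivot bit from old rows
                pivots[q] = (qm ^ mask, qv ^ val)
        pivots[p] = (mask, val)
    if len(pivots) < k:
        return None                        # not yet full rank: need more symbols
    return [pivots[i][1] for i in range(k)]

# Example with input symbols A=7, B=13, C=42 (bit i of a mask selects symbol i)
eqs = [(0b011, 7 ^ 13), (0b010, 13), (0b111, 7 ^ 13 ^ 42)]
print(rlc_decode(3, eqs))                  # [7, 13, 42]
```

Any k linearly independent combinations suffice, which is what makes the code rateless; the price is the cubic cost of Gaussian elimination, which the LT codes discussed next avoid.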

  16. Rateless Codes – Encoding • Used for content distribution over error-prone channels • The k input symbols (here A, B, C) are combined into at least k encoded symbols, e.g. E1 = A+B, E2 = B, E3 = A+B+C, E4 = A+C • The edges between input and encoded symbols are chosen randomly, based on a probability density function over degrees

  17. Rateless Codes – Decoding • The received encoded symbols (e.g. E1 = A+B, E2 = B, E3 = A+B+C, E4 = A+C) form a system of linear equations over the k input symbols • The system is solved by Gaussian elimination or belief propagation • Irrespective of which encoded symbols are lost in the communication channel, as long as sufficiently many encoded symbols are received, decoding retrieves all k input symbols
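The belief-propagation decoder can be sketched in a few lines of Python (a toy illustration with invented names, not the authors' implementation): repeatedly find an encoded symbol with exactly one unknown neighbor, recover that neighbor, and substitute it into the remaining equations.

```python
def bp_decode(k, encoded):
    """Belief propagation: repeatedly 'peel' encoded symbols that have exactly
    one still-unknown input symbol among their XOR-ed neighbors."""
    decoded = {}                                  # input index -> recovered value
    pending = [(set(idxs), val) for idxs, val in encoded]
    progress = True
    while progress and len(decoded) < k:
        progress = False
        for idxs, val in pending:
            unknown = idxs - decoded.keys()
            if len(unknown) == 1:                 # a releasable (degree-1) symbol
                i = unknown.pop()
                for j in idxs - {i}:
                    val ^= decoded[j]             # remove known neighbors (XOR)
                decoded[i] = val
                progress = True
    return decoded                                # partial if decoding stalls

# The slide's example: E1=A+B, E2=B, E3=A+B+C, E4=A+C with A=7, B=13, C=42
enc = [({0, 1}, 7 ^ 13), ({1}, 13), ({0, 1, 2}, 7 ^ 13 ^ 42), ({0, 2}, 7 ^ 42)]
print(bp_decode(3, enc))
```

Here E2 releases B, which turns E1 into a degree-1 equation for A, which in turn releases C from E3; the fourth equation is never needed. Keeping this chain of releases going is exactly what the degree distribution discussed next is designed for.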

  18. Decoding Using Belief Propagation • The decoding host collects slightly more than k encoded symbols, decodes them, and recovers all k input symbols • Encoded symbols received beyond those needed for decoding are redundant

  19. Digital Fountain Codes – LT Codes • A class of rateless erasure codes invented by Michael Luby¹ • Computationally practical (compared to Random Linear Codes): a fast decoding algorithm based on belief propagation instead of Gaussian elimination • Form the outer code for Raptor codes³, which have linear decoding computational complexity • Designed for the case when no input symbols are available at the decoding host initially • Asymptotic properties²: the expected number of encoded symbols required for successful decoding is k + O(√k · ln²(k/δ)), and the expected decoding computational complexity is O(k · ln(k/δ)), where k is the number of input symbols • ²Assuming a constant probability of decoding failure δ • ¹Michael Luby, “LT codes,” in The 43rd Annual IEEE Symposium on Foundations of Computer Science, 2002, pp. 271–282. • ³Amin Shokrollahi, “Raptor codes,” IEEE Transactions on Information Theory, vol. 52, no. 6, 2006, pp. 2551–2567.

  20. Digital Fountain Codes – LT Codes’ Robust Soliton Probability Distribution • Under the robust soliton distribution μ_k, the probability of an encoded symbol having degree d is μ_k(d) • The distribution has the property of releasing degree-1 symbols at a controlled, near-constant rate throughout the decoding process • Figure: the LT code distribution with parameters k = 1000, c = 0.01, δ = 0.5
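The robust soliton distribution can be written out explicitly from Luby's paper. The following sketch uses Luby's published formulas; the function and variable names are mine:

```python
import math

def robust_soliton(k, c=0.01, delta=0.5):
    """Robust soliton distribution mu_k over degrees 1..k (Luby, FOCS 2002).
    Returns a list where entry d-1 is the probability of degree d."""
    # Ideal soliton component rho
    rho = [0.0] * (k + 1)
    rho[1] = 1.0 / k
    for d in range(2, k + 1):
        rho[d] = 1.0 / (d * (d - 1))
    # Extra component tau: boosts low degrees and adds a spike near degree k/R,
    # which is what keeps degree-1 symbols being released throughout decoding
    R = c * math.log(k / delta) * math.sqrt(k)
    tau = [0.0] * (k + 1)
    spike = int(round(k / R))
    for d in range(1, min(spike, k + 1)):
        tau[d] = R / (d * k)
    if 1 <= spike <= k:
        tau[spike] = R * math.log(R / delta) / k
    # Normalize: mu(d) = (rho(d) + tau(d)) / beta
    beta = sum(rho) + sum(tau)
    return [(rho[d] + tau[d]) / beta for d in range(1, k + 1)]

mu = robust_soliton(1000, c=0.01, delta=0.5)   # the slide's parameters
print(round(sum(mu), 6))                        # 1.0: a valid distribution
```

With the slide's parameters (k = 1000, c = 0.01, δ = 0.5) the mass is concentrated on very low degrees, with a visible spike near k/R, matching the plotted distribution.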


  22. Inefficiency of LT Codes for our Problem • n out of the k input symbols are known a priori at the decoding host • Yet the decoding host must still collect slightly more than k encoded symbols to decode, so many of the received encoded symbols are redundant

  23. Inefficiency of LT Codes for our Problem • The number of these redundant encoded symbols grows with the ratio of input symbols known at the decoder (n) to the total input symbols (k) • If n input symbols are known a priori, then an additional LT-encoded symbol provides no new information to the decoding host whenever all of the input symbols it covers are already known – a probability which quickly approaches 1 as n → k
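This growth is easy to check with a small Monte Carlo sketch (entirely illustrative: the toy degree distribution and the names below are mine, not from the talk). An encoded symbol is redundant for the decoder when every input symbol it covers is already known:

```python
import random

def redundancy_rate(k, n, degree_probs, trials=20000, seed=1):
    """Estimate P(a fresh encoded symbol covers only already-known inputs)."""
    rng = random.Random(seed)
    known = set(range(n))                  # w.l.o.g. the first n inputs are known
    degrees = range(1, len(degree_probs) + 1)
    redundant = 0
    for _ in range(trials):
        d = rng.choices(degrees, weights=degree_probs)[0]
        neighbors = rng.sample(range(k), d)   # the input symbols XOR-ed together
        if all(i in known for i in neighbors):
            redundant += 1
    return redundant / trials

# A toy low-degree distribution (degrees 1..4), mimicking the robust soliton's
# concentration on small degrees; redundancy rises sharply as n -> k
probs = [0.3, 0.4, 0.2, 0.1]
for n in (500, 900, 990):
    print(n, redundancy_rate(1000, n, probs))
```

Because low degrees dominate, a symbol's few neighbors are very likely all known once n is close to k, which is exactly the inefficiency the shifted construction targets.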

  24. Intuitive Fix • The n known input symbols serve the function of degree-1 encoded symbols, disproportionately skewing the degree distribution for LT encoding • We thus propose to shift the Robust Soliton distribution to the right in order to compensate for these additional, functionally degree-1 symbols • Questions • 1) How? • 2) By how much?

  25. Shifted Code Construction • Definition • The shifted robust soliton distribution μ_{k,n} is obtained from the robust soliton distribution by scaling every degree up by a factor of roughly k/(k−n) • Intuition • The n known input symbols at the decoding host reduce the degree of each encoding symbol by an expected fraction n/k, and the shift compensates for exactly this reduction
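To make the construction concrete, here is a toy sketch of degree shifting (my own illustration: the rounding choice and the paper's formal definition may differ). Each degree d of a base distribution is scaled up by k/(k−n), so that after the decoder peels away its n known input symbols the effective degrees land back near the base distribution:

```python
def shift_distribution(mu, k, n):
    """Scale each degree d of a base degree distribution (mu[d-1] = P(degree d))
    up by k/(k-n). After the decoder removes its n known input symbols, an
    encoded symbol's degree shrinks by an expected factor (k-n)/k, returning
    it to roughly the base distribution."""
    scale = k / (k - n)
    shifted = {}                            # degree -> probability (sparse)
    for d, p in enumerate(mu, start=1):
        sd = min(k, round(d * scale))       # rounding and clipping: my choice
        shifted[sd] = shifted.get(sd, 0.0) + p
    return shifted

# Toy base distribution over degrees 1..3, shifted for k=10, n=7
base = [0.5, 0.3, 0.2]
print(shift_distribution(base, 10, 7))      # {3: 0.5, 7: 0.3, 10: 0.2}
```

Note that the result is sparse: many degrees receive probability 0, which matches the observation on slide 26 about the shifted code distribution.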

  26. Shifted Code Distribution • Figure: the LT code distribution and the proposed Shifted code distribution, with parameters k = 1000, c = 0.01, δ = 0.5; the number of input symbols known at the decoding host is set to n = 900 for the Shifted code distribution • Under the shifted code distribution, encoded symbols of some degrees occur with probability 0

  27. Shifted Code – Communication Complexity • Lemma IV.2 • A decoder that knows n of k input symbols needs (k−n) + O(√(k−n) · ln²((k−n)/δ)) encoding symbols under the shifted distribution to decode all k input symbols with probability at least 1−δ. • Proof • Once the n known input symbols are removed from the decoding graph, the encoded symbols cover only the remaining k−n input symbols. The expression then follows from Luby’s analysis.

  28. Shifted Code – Average Degree of Encoded Symbol • Lemma IV.3 • The average degree of an encoding node under the μ_{k,n} distribution is roughly k/(k−n) times the average degree under the robust soliton distribution • Proof • The proof follows from the definitions, since a node with degree d in the robust soliton distribution corresponds to a node with degree roughly d·k/(k−n) in the shifted code distribution. • From Luby’s analysis, the average degree of an LT encoded symbol is O(ln(k/δ)).

  29. Shifted Codes – Computational Complexity • Lemma IV.4* • For a fixed probability of decoding failure δ, the expected number of edges R removed from the decoding graph upon knowledge of n input symbols at the decoding host is R = O(n ln(k − n)) • Theorem IV.5 • For a fixed probability of decoding failure δ, the number of operations needed to decode using a shifted code is O(k ln(k − n)) • Proof • Sum R from Lemma IV.4 and the computational complexity of (LT) decoding the unknown k−n input symbols • *Proof described in: S. Agarwal, A. Hagedorn and A. Trachtenberg, “Rateless Codes Under Partial Information”, Information Theory and Applications Workshop, UCSD, San Diego, 2008


  31. Experimental Comparison – LT Codes vs. Shifted Codes • Benefit: for k = 1000 and n = 900, the decoding host needs to download about 700 encoded symbols using conventional LT codes, but only about 180 encoded symbols using shifted codes • Figure: Y-axis – number of encoded symbols required at the mobile device to obtain the whole data set; X-axis – number of input symbols n available a priori at the mobile device • The experiment was repeated 100 times; error bars show the standard deviation

  32. Experimental Comparison – Constrained Sensors, Deployment on TMotes • Figures: total time to encode and total time to decode, as measures of computational complexity

  33. More Data: Communication Savings

  34. More Data: Communication Savings Normalized

  35. More Data: Time Savings, Normalized

  36. Distribution Shifting • Even when the estimate of n at the encoding host is not accurate, the shifted distribution decodes input symbols much more quickly than standard LT codes


  38. Many Applications • Broadcasting coded updates to synchronize databases • Adapting LT codes when partial information has been delivered • Continuous shifting of the distribution • Using the partial information in case of unsuccessful decoding (when only some of the input symbols were decoded) • Efficient erasure correction when channel characteristics are already known • For example, input symbols can be first sent as plain-text, and then depending on the estimate of number of lost input symbols, shifted-coded symbols can be transmitted • Heterogeneous channel data delivery • Application in gossip protocols, particularly in later iterations • Sensor networks - data aggregation, routing information, etc. • Restoring storage media that are partially erased • …

  39. Conclusions & Future Work • Conclusions • A generalization of LT codes for the case when some of the input symbols are already available at the decoding host • Many applications • Future Work • By adopting Raptor code concepts (an inner code), shifted codes can be made more efficient • Analytical expressions for distribution shifting • Application-specific shifted code design • “Shifting” other rateless codes

  40. Further Reading • S. Agarwal, A. Hagedorn and A. Trachtenberg, “Rateless Codes Under Partial Information”, Information Theory and Applications Workshop, UCSD, San Diego, 2008 • S. Agarwal (Deutsche Telekom A.G.), “Method and System for Constructing and Decoding Rateless Codes with Partial Information”, European Patent Application EP 07 023 243.4 • Michael Luby, “LT codes,” in The 43rd Annual IEEE Symposium on Foundations of Computer Science, 2002, pp. 271–282. • Amin Shokrollahi, “Raptor codes,” IEEE Transactions on Information Theory, vol. 52, no. 6, 2006, pp. 2551–2567.
