
  1. Binary Error Correcting Network Codes. Qiwen Wang, Sidharth Jaggi, Shuo-Yen Robert Li, Institute of Network Coding (INC), The Chinese University of Hong Kong. October 19, 2011, IEEE Information Theory Workshop 2011, Paraty, Brazil

  2. Outline: 1. Motivation. 2. Model. 3. Main Results. 4. Discussion. 5. Conclusion.

  3. Motivation: Challenges of NC over Noisy Networks. 1. Varying noise level (p-ε, p+ε).

  4. Motivation: Challenges of NC over Noisy Networks. 1. Varying noise level. 2. Errors propagate through mix-and-forward.

  5. Motivation: Challenges of NC over Noisy Networks. 1. Varying noise level. 2. Errors propagate through mix-and-forward. 3. Coding kernels unknown a priori. [Figure: example 9-node network with local coding kernels f_{i,j} labeling the edges, e.g. [f_{1,3}, f_{1,5}], [f_{2,4}, f_{2,7}], [f_{3,6}, f_{4,6}], [f_{6,8}, f_{6,9}].]

  6. Network Model & Code Model. [Figure: Alice communicates to Bob across a network with mincut = C.]

  7. Network Model & Code Model. [Figure: Alice sends batches of C packets, each of length n, to Bob across a network with mincut = C.]

  8. Network Model & Code Model. Alice's encoder α produces a Cm x n binary codeword matrix; Bob's decoder β operates on the received Cm x n binary matrix.

  9. Finite Field & Binary Field I. One packet: n symbols over GF(2^m) ↔ mn bits ↔ an m x n binary matrix.
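
To make the packet-to-matrix correspondence concrete, here is a minimal Python sketch (not from the talk; the sizes, bit ordering, and the helper name packet_to_matrix are illustrative assumptions): each of the n symbols of a packet over GF(2^m) becomes a length-m bit column, so the packet becomes an m x n binary matrix carrying mn bits.

    import numpy as np

    m, n = 4, 6  # illustrative sizes: symbols from GF(2^4), packets of 6 symbols

    def packet_to_matrix(symbols, m):
        """Write each GF(2^m) symbol (stored as an int < 2^m) as a
        length-m bit column; the packet becomes an m x n binary matrix."""
        return np.array([[(s >> i) & 1 for s in symbols] for i in range(m)],
                        dtype=np.uint8)

    packet = [0b0011, 0b1010, 0b0111, 0b0001, 0b1111, 0b0100]  # n symbols
    X = packet_to_matrix(packet, m)
    assert X.shape == (m, n)  # m*n = mn bits in total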

  10. Finite Field & Binary Field II. A symbol over GF(2^m) corresponds to an m x m binary matrix T: the product TS of such matrices implements multiplication over GF(2^m) as multiplication over the binary field.
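
A minimal sketch of this symbol-as-matrix correspondence, assuming GF(2^4) with the primitive polynomial x^4 + x + 1 (an illustrative choice, as are all helper names below): the m x m binary matrix representing a symbol a has columns a*x^j, and multiplying by a over GF(2^m) agrees with multiplying bit vectors by that matrix over GF(2).

    import numpy as np

    M, POLY = 4, 0b10011  # GF(2^4) with primitive polynomial x^4 + x + 1 (assumed)

    def gf_mul(a, b):
        """Carry-less multiplication modulo POLY: multiplication in GF(2^M)."""
        r = 0
        while b:
            if b & 1:
                r ^= a
            a <<= 1
            if a >> M:
                a ^= POLY
            b >>= 1
        return r

    def bits(s):
        return np.array([(s >> i) & 1 for i in range(M)], dtype=np.uint8)

    def symbol_matrix(a):
        """Columns are a * x^j, so symbol_matrix(a) @ bits(s) = bits(a*s) over GF(2)."""
        return np.column_stack([bits(gf_mul(a, 1 << j)) for j in range(M)])

    a, s = 0b0110, 0b1011
    T = symbol_matrix(a)
    assert np.array_equal(T @ bits(s) % 2, bits(gf_mul(a, s)))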

  11. Transfer Matrix. Noiseless network: Y = T X, where T is the Cm x Cm binary transfer matrix, X is Alice's Cm x n transmitted matrix, and Y is Bob's Cm x n received matrix.
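
As a sketch of the noiseless relation Y = TX (with a random illustrative T rather than one induced by a concrete topology), the whole network reduces to one matrix product over GF(2):

    import numpy as np

    rng = np.random.default_rng(0)
    C, m, n = 2, 4, 6                                    # illustrative parameters
    T = rng.integers(0, 2, (C*m, C*m), dtype=np.uint8)   # network transfer matrix
    X = rng.integers(0, 2, (C*m, n), dtype=np.uint8)     # Alice's batch of C packets
    Y = (T @ X) % 2                                      # Bob's observation, noiseless case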

  12. Noise Model: Worst-case Bit-flip Error. Bit flips may fall on Link A, on Link B, or be split between them: errors can be arbitrarily distributed, with an upper bound of a fraction p of all bits, so the worst possible damage can happen to the received packets.

  13. Noise Model: Worst-case Bit-flip Error. Worst-case bit-flip error matrix Z (Em x n): no more than pEmn 1s, arbitrarily distributed; E is the number of edges in the network.

  14. Noise Model: Worst-case Bit-flip Error. The first m rows of the Em x n matrix Z are the error bits on the 1st edge (Edge 1); each of the E edges owns one such block of m rows. As before, Z contains no more than pEmn 1s, arbitrarily distributed.
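
A minimal sketch of the adversary's budget under illustrative parameters: Z is Em x n, its rows are grouped m at a time per edge, and the only constraint is at most pEmn ones, placed wherever the adversary likes.

    import numpy as np

    rng = np.random.default_rng(1)
    E, m, n, p = 5, 4, 6, 0.05                 # illustrative parameters
    budget = int(p * E * m * n)                # at most pEmn bit flips in total

    Z = np.zeros((E*m, n), dtype=np.uint8)
    flips = rng.choice(E*m*n, size=budget, replace=False)  # adversary's placement
    Z[np.unravel_index(flips, Z.shape)] = 1
    assert Z.sum() <= p * E * m * n
    # rows i*m .. (i+1)*m - 1 of Z are the error bits injected on edge i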

  16. Impulse Response Matrix. Noisy network: Y = T X + T̂ Z, where T̂ is the Cm x Em binary impulse response matrix from the E edges to the destination, Z is the Em x n error matrix, and T, X, Y are as before (Cm x Cm, Cm x n, Cm x n).
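
Continuing the sketch (again with random illustrative matrices rather than ones derived from a real network), the noisy relation is two GF(2) matrix products:

    import numpy as np

    rng = np.random.default_rng(2)
    C, E, m, n = 2, 5, 4, 6                               # illustrative parameters
    T  = rng.integers(0, 2, (C*m, C*m), dtype=np.uint8)   # transfer matrix
    Th = rng.integers(0, 2, (C*m, E*m), dtype=np.uint8)   # impulse response matrix T^
    X  = rng.integers(0, 2, (C*m, n), dtype=np.uint8)     # Alice's codeword
    Z  = rng.integers(0, 2, (E*m, n), dtype=np.uint8)     # errors (budget check omitted)
    Y  = (T @ X + Th @ Z) % 2                             # what Bob actually receives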

  17. Transform Metric. In Y = T X + T̂ Z, each column of Y is the corresponding column of T X plus the GF(2) sum of the columns of T̂ selected by the corresponding binary column of Z (a bit pattern such as 00101...0010).

  18. Transform Metric. For each column i, d_i is the minimum number of columns of T̂ that need to be added to TX(i) to obtain Y(i); the distance between TX and Y is the sum of the d_i over all n columns. Claim: this is a distance metric.
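
A brute-force sketch of the per-column distance d_i (exponential in the number of columns of T̂, so only feasible for tiny illustrative parameters; the function name is hypothetical): it searches for the smallest set of columns of T̂ whose GF(2) sum takes TX(i) to Y(i).

    import numpy as np
    from itertools import combinations

    def column_distance(Th, tx_col, y_col):
        """Minimum number of columns of Th summing (over GF(2)) to
        y_col - tx_col; brute force, feasible only for tiny Em."""
        diff = tx_col ^ y_col
        if not diff.any():
            return 0
        cols = Th.shape[1]
        for k in range(1, cols + 1):
            for subset in combinations(range(cols), k):
                if np.array_equal(Th[:, list(subset)].sum(axis=1) % 2, diff):
                    return k
        return None  # unreachable if the columns of Th span the space

    # d(TX, Y) is then the sum of column_distance over the n columns.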

  19. Hamming-type Upper Bound. Theorem 1: For all p less than C/(2Em), an upper bound on the achievable rate of any code over the worst-case binary-error network is 1 - (E/C)H(p), where H(.) is the binary entropy function.

  20. Hamming-type Upper Bound Proof (sketch) • The total number of Cm x n binary matrices (the volume of the big square) is 2^{Cmn}. • Lower bound on the volume of the balls of radius pEmn: consider those Z's where every column has pEm 1s in it; distinct such Z result in distinct T̂Z. • The number of distinct T̂Z is at least (Em choose pEm)^n ~ 2^{EmnH(p)}. • The upper bound on the size of any codebook is therefore 2^{Cmn} / 2^{EmnH(p)}. • Asymptotically in n, the Hamming-type upper bound on the rate is 1 - (E/C)H(p).
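
A small numeric sketch of this bound, using the 1 - (E/C)H(p) form derived above with illustrative parameters:

    from math import log2

    def H(p):  # binary entropy function
        return 0.0 if p in (0.0, 1.0) else -p*log2(p) - (1-p)*log2(1-p)

    C, E, m, p = 3, 10, 8, 0.01
    assert p < C / (2*E*m)        # Theorem 1's condition on p
    print(1 - (E/C) * H(p))       # Hamming-type upper bound on the rate (~0.73 here)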

  21. Coherent/Non-coherent NC. Coherent NC: the receiver knows the internal coding coefficients, and hence knows T and T̂. However, the random linear coding coefficients are usually chosen on the fly. Non-coherent NC: the coding coefficients, and hence T and T̂, are unknown in advance; this is the more realistic setting.

  25. GV-type Lower Bound. Theorem 2: Coherent GV-type network codes achieve a rate of at least 1 - (E/C)H(2p). Theorem 3: Non-coherent GV-type network codes achieve a rate of at least 1 - (E/C)H(2p) - o(1), asymptotically the same as the coherent case.

  26. GV-type Lower Bound Proof of Thm 2 (sketch) • Need an upper bound on the volume of the balls of radius 2pEmn, instead of the lower bound on their volume as in Thm 1 (sphere packing vs. covering). • The number of different Y, or equivalently of different T̂Z, can be bounded above by the number of different Z, which equals the sum over i ≤ 2pEmn of (Emn choose i). • The summation can be bounded from above by (2pEmn + 1)(Emn choose 2pEmn) ~ 2^{EmnH(2p)}. • Lower bound on the size of the codebook: 2^{Cmn} / 2^{EmnH(2p)}. • Asymptotically in n, the rate of coherent GV-type NC is at least 1 - (E/C)H(2p). [Figure: codewords TX(1), TX(2), TX(3), each surrounded by a ball of radius 2pEmn.]
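
The same numeric sketch for the GV-type rate, which replaces H(p) with H(2p) because packed codewords must avoid balls of radius 2pEmn (illustrative parameters):

    from math import log2

    def H(p):  # binary entropy function
        return 0.0 if p in (0.0, 1.0) else -p*log2(p) - (1-p)*log2(1-p)

    C, E, m, p = 3, 10, 8, 0.01
    print(1 - (E/C) * H(p))       # Hamming-type upper bound   (~0.73 here)
    print(1 - (E/C) * H(2*p))     # GV-type achievable rate    (~0.53 here)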

  27. GV-type Lower Bound Proof of Thm 3 (sketch) • Crucial difference with the proof of Thm 2: the process of choosing codewords. • Consider all possible values of T̂, at most 2^{CmEm} of them (and hence of T, since it comprises the columns of T̂ for a specific subset of C edges). • The number of potential codewords that can be chosen for the codebook is at least 2^{Cmn} / (2^{CmEm} 2^{EmnH(2p)}), which equals 2^{Cmn(1 - (E/C)H(2p)) - CmEm}. • Asymptotically in n, this leads to the same rate of 1 - (E/C)H(2p) as coherent NC in Theorem 2.
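
A sketch of why the extra union over impulse-response matrices is asymptotically free: the 2^{CmEm} possible T̂ cost a fixed CmEm bits of codebook size, so the rate penalty CmEm/(Cmn) = Em/n vanishes as the block length n grows (illustrative parameters):

    C, E, m = 3, 10, 8
    for n in (10**3, 10**5, 10**7):
        penalty = (C*m * E*m) / (C*m * n)   # = Em/n, rate lost to not knowing T^
        print(n, penalty)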

  28. Scale of Parameters. Claim: For all p less than min{C/(2Em), (1/2)H^{-1}(C/E)}, the Hamming-type and GV-type bounds hold. Proof • Theorem 1 (Hamming-type upper bound) requires that p < C/(2Em). • For the GV-type bound in Thm 2 and Thm 3 to give non-negative rates, 1 - (E/C)H(2p) ≥ 0, i.e. p ≤ (1/2)H^{-1}(C/E). • When p is small, both requirements are met.
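
A sketch computing both thresholds numerically (binary-search inverse of the binary entropy; parameters illustrative), to see the range of p for which both bounds are meaningful:

    from math import log2

    def H(p):  # binary entropy function
        return 0.0 if p == 0 else -p*log2(p) - (1-p)*log2(1-p)

    def H_inv(y, lo=0.0, hi=0.5):
        """Inverse of H on [0, 1/2] by bisection (H is increasing there)."""
        for _ in range(60):
            mid = (lo + hi) / 2
            lo, hi = (mid, hi) if H(mid) < y else (lo, mid)
        return lo

    C, E, m = 3, 10, 8
    hamming_cond = C / (2*E*m)       # Theorem 1's requirement on p
    gv_cond = H_inv(C/E) / 2         # keeps 1 - (E/C)H(2p) non-negative
    print(min(hamming_cond, gv_cond))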

  29. Conclusion. Worst-case bit-flip error model. Hamming-type upper bound: 1 - (E/C)H(p). Coherent/non-coherent GV-type lower bound: 1 - (E/C)H(2p). GV-type codes: end-to-end nature; complexity polynomial in block length.

  30. Future Directions • Efficient coding schemes • Other binary noise models • Combining link-by-link codes with our end-to-end codes

  31. Thank you!
