
Hybrid Dense/Sparse Matrices in Compressed Sensing Reconstruction


Presentation Transcript


  1. Hybrid Dense/Sparse Matrices in Compressed Sensing Reconstruction. Ilya Poltorak, Dror Baron, Deanna Needell. The work has been supported by the Israel Science Foundation and the National Science Foundation.

  2. CS Measurement • Replace samples by a more general encoder based on a few linear projections (inner products) of the sparse signal. [Figure: sparse signal x, measurement matrix Φ, measurements y; K = # non-zeros]
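
To make the measurement model concrete, here is a minimal NumPy sketch of y = Φx for a strictly sparse x; all sizes and the Gaussian matrix are illustrative choices, not taken from the talk:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (not from the talk): signal length N, K non-zeros, M measurements.
N, K, M = 1000, 50, 220

# Strictly sparse signal: K randomly placed real-valued non-zeros.
x = np.zeros(N)
x[rng.choice(N, size=K, replace=False)] = rng.standard_normal(K)

# Generic dense, unstructured measurement matrix (the expensive case discussed next).
Phi = rng.standard_normal((M, N))

# CS measurement: M linear projections (inner products) replace N samples.
y = Phi @ x
```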

  3. Caveats • Input x strictly sparse w/ real values • Noiseless measurements • noise can be addressed (later) • Assumptions relevant to content distribution (later)

  4. Why is Decoding Expensive? • Culprit: a dense, unstructured measurement matrix. [Figure: sparse signal, measurements, non-zero entries]

  5. Sparse Measurement Matrices (dense later!) • LDPC measurement matrix (sparse) • Only {-1, 0, +1} in Φ • Each row of Φ contains L randomly placed non-zeros • Fast matrix-vector multiplication • fast encoding & decoding. [Figure: sparse signal, measurements, non-zero entries]
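
A sketch of one way to build such a sparse, LDPC-style matrix with L non-zeros per row drawn from {-1, +1}, stored in SciPy's sparse format so the matrix-vector product is fast; the construction details here are illustrative and not necessarily the exact ensemble from the talk:

```python
import numpy as np
from scipy import sparse

rng = np.random.default_rng(1)
N, M, L = 1000, 220, 20            # illustrative sizes; L non-zeros per row

rows = np.repeat(np.arange(M), L)
cols = np.concatenate([rng.choice(N, size=L, replace=False) for _ in range(M)])
vals = rng.choice([-1.0, 1.0], size=M * L)

# Entries of Phi are restricted to {-1, 0, +1}; each row has exactly L non-zeros.
Phi = sparse.csr_matrix((vals, (rows, cols)), shape=(M, N))

# Fast encoding: the sparse product costs O(M*L) operations instead of O(M*N).
x = np.zeros(N)
x[rng.choice(N, size=50, replace=False)] = rng.standard_normal(50)
y = Phi @ x
```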

  6. Example • Measurements y = [0, 1, 1, 4]ᵀ, unknown signal x = [?, ?, ?, ?, ?, ?]ᵀ, and
      Φ = [0 1 1 0 0 0]
          [0 0 0 1 1 0]
          [1 1 0 0 1 0]
          [0 0 0 0 1 1]

  7. Example • What does a zero measurement imply? • Hint: x is strictly sparse • y = [0, 1, 1, 4]ᵀ, x = [?, ?, ?, ?, ?, ?]ᵀ, Φ as above

  8. Example • Graph reduction! • y = [0, 1, 1, 4]ᵀ, x = [?, 0, 0, ?, ?, ?]ᵀ, Φ as above

  9. Example • What do matching measurements imply? • Hint: the non-zeros in x are real numbers • y = [0, 1, 1, 4]ᵀ, x = [?, 0, 0, ?, ?, ?]ᵀ, Φ as above

  10. Example • What is the last entry of x? • y = [0, 1, 1, 4]ᵀ, x = [0, 0, 0, 0, 1, ?]ᵀ, Φ as above (the last measurement gives x5 + x6 = 4 and x5 = 1, so x6 = 3)

  11. Noiseless Algorithm [Luby & Mitzenmacher 2005] [Sarvotham, Baron, & Baraniuk 2006] [Zhang & Pfister 2008] • Initialize • Phase 1: zero measurements • Phase 2: matching measurements • Phase 3: singleton measurements • Done? If not, iterate (typically 2-3 iterations) • Arrange output
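
The sketch below illustrates the graph-reduction idea in the noiseless case. It implements only the zero-measurement and singleton rules (Phases 1 and 3) on a dense array for readability; the matching rule of Phase 2 and the efficient sparse data structures are omitted, so this is a simplified illustration rather than the authors' algorithm:

```python
import numpy as np

def graph_reduction_decode(Phi, y, max_iters=10):
    """Simplified noiseless decoder: zero and singleton rules only."""
    M, N = Phi.shape
    x = np.full(N, np.nan)                      # NaN marks unresolved entries
    for _ in range(max_iters):
        progress = False
        for m in range(M):
            support = np.flatnonzero(Phi[m])
            unresolved = [j for j in support if np.isnan(x[j])]
            if not unresolved:
                continue
            known = sum(Phi[m, j] * x[j] for j in support if not np.isnan(x[j]))
            residual = y[m] - known
            if np.isclose(residual, 0.0):
                # Zero rule: for strictly sparse real-valued x, a zero residual
                # means (with probability 1) every unresolved entry is zero.
                for j in unresolved:
                    x[j] = 0.0
                progress = True
            elif len(unresolved) == 1:
                # Singleton rule: exactly one unknown left in this row.
                j = unresolved[0]
                x[j] = residual / Phi[m, j]
                progress = True
        if not progress:
            break
    return x
```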

  12. Numbers (4 seconds) • N=40,000 • 5% non-zeros • M=0.22N • L=20 ones per row • Only 2-3 iterations. [Figure: reconstruction after iteration #1]

  13. Challenge • With these measurements, parts of the signal are still not reconstructed • How do we recover the rest of the signal?

  14. Solution: Hybrid Dense/Sparse Matrix • With the sparse measurements, parts of the signal are still not reconstructed • Add extra dense measurements • The residual of the signal is recovered from the dense rows restricted to the residual columns. [Figure: hybrid matrix with residual columns highlighted]
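
Continuing the construction sketch from slide 5 (it reuses the names Phi, rng, N, x defined there), the hybrid matrix simply stacks a small block of extra dense rows below the sparse block; Gaussian rows are used here only as one plausible choice:

```python
from scipy import sparse

M_dense = 30                                       # illustrative number of extra dense rows
Phi_dense = rng.standard_normal((M_dense, N))      # extra dense measurements (Gaussian here)
Phi_hybrid = sparse.vstack([Phi, sparse.csr_matrix(Phi_dense)], format="csr")
y_hybrid = Phi_hybrid @ x                          # sparse + dense measurements of the same x
```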

  15. Sudocodes with Two-Part Decoding [Sarvotham, Baron, & Baraniuk 2006] • Sudocodes (related to sudoku) • Graph reduction solves most of the CS problem • Residual solved via matrix inversion. [Figure: sudo decoder followed by matrix inversion on the residual columns]

  16. Contribution 1: Two-Part Reconstruction • Many CS algorithms for sparse matrices [Gilbert et al., Berinde & Indyk, Sarvotham et al.] • Many CS algorithms for dense matrices [Cormode & Muthukrishnan, Candes et al., Donoho et al., Gilbert et al., Milenkovic et al., Berinde & Indyk, Zhang & Pfister, Hale et al., …] • Solve each part with the appropriate algorithm. [Figure: sparse solver, then residual via dense solver on the residual columns]
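
A sketch of the two-part reconstruction under an assumed interface: the sparse solver (for example the graph-reduction sketch above) returns NaN for unresolved entries, and the extra dense measurements then determine those entries through a small least-squares solve over the residual columns. The function name and interface are illustrative, not from the paper:

```python
import numpy as np

def two_part_decode(Phi_sparse, y_sparse, Phi_dense, y_dense, sparse_solver):
    # Part 1: the sparse solver resolves most entries; NaN marks the residual.
    x = sparse_solver(Phi_sparse, y_sparse)
    resolved = ~np.isnan(x)
    residual_cols = np.flatnonzero(~resolved)

    # Part 2: subtract the known contribution from the dense measurements and
    # solve for the residual columns only (matrix inversion / least squares).
    rhs = y_dense - Phi_dense[:, resolved] @ x[resolved]
    x_res, *_ = np.linalg.lstsq(Phi_dense[:, residual_cols], rhs, rcond=None)
    x[residual_cols] = x_res
    return x
```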

  17. Runtimes (K=0.05N, M=0.22N)

  18. Theoretical Results [Sarvotham, Baron, & Baraniuk 2006] • Fast encoder and decoder • sub-linear decoding complexity • caveat: constructing data structure • Distributed content distribution • sparsified data • measurements stored on different servers • any M measurements suffice • Strictly sparse signals, noiseless measurements

  19. Contribution 2: Noisy Measurements • Results can be extended to noisy measurements • Part 1 (zero measurements): declare zero when |y_m| < ε • Part 2 (matching): declare matching when |y_i - y_j| < ε • Part 3 (singleton): unchanged
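
In code, the noisy variants amount to replacing the exact tests with thresholded ones; a tiny sketch, where eps is a small threshold whose choice relative to the noise level / SNR is not specified here:

```python
def is_zero_measurement(y_m, eps):
    # Part 1: treat a small-magnitude measurement as a zero measurement.
    return abs(y_m) < eps

def is_matching(y_i, y_j, eps):
    # Part 2: treat nearly equal measurements as matching.
    return abs(y_i - y_j) < eps
```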

  20. Problems with Noisy Measurements • Multiple iterations alias noise into the next iteration! • Use one iteration • Requires a small threshold ε (large SNR) • Contribution 3: Provable reconstruction • deterministic & random variants

  21. Summary • Hybrid Dense/Sparse Matrix • Two-part reconstruction • Simple (cute?) algorithm • Fast • Applicable to content distribution • Expandable to measurement noise

  22. THE END
