
Enhancing Sparsity by Reweighted L-1 Minimization


Presentation Transcript


  1. Enhancing Sparsity by Reweighted L-1 Minimization. Authors: Emmanuel Candès, Michael Wakin, and Stephen Boyd. A review by Jeremy Watt.

  2. The Basis pursuit problem: minimize ||x||_1 subject to y = Φx

  3. The Basis pursuit problem and the related Lasso problem: minimize ||y − Φx||_2^2 + λ||x||_1

  4. The Basis pursuit problem • The L-1 norm ||x||_1 measures both cardinality and magnitude, so it is sensitive to large coefficients (outliers in some sense) • The L-0 "norm" ||x||_0 measures cardinality only

  5. Example: failure of L-1 recovery

  6. Reweighted L-1 Basis Pursuit: minimize Σ_i w_i |x_i| subject to y = Φx

  7. Ideal weightings: compensate for magnitudes • Say we knew the true signal x0 and its support • We want to recover x0 from y = Φx0 • So the ideal weightings are w_i = 1 / |x0_i| when x0_i ≠ 0, and w_i = ∞ otherwise

  8. Ideal weightings • With these weights, only the support of x0 can enter the model (off-support coefficients are infinitely penalized) • The weighting nullifies the true magnitudes, so the weighted L-1 objective counts only cardinality • Any solution must stay feasible w.r.t. the true support

  9. Algorithm for general problem • Initialize: t = 0 and w_i^(0) = 1 for all i • Solve the weighted L-1 problem: x^(t) = argmin_x Σ_i w_i^(t) |x_i| subject to y = Φx • Update the weights: w_i^(t+1) = 1 / (|x_i^(t)| + ε) • Terminate on convergence or after a fixed number of iterations; otherwise increment t and repeat

  10. Algorithm for general problem • Early iterations may find inaccurate signal estimates, but largest signal coefficients are likely to be identified as nonzero. • Once these locations are identified, their influence is downweighted in order to allow more sensitivity for identifying the remaining small but nonzero signal coefficients.
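The loop on slides 9–10 can be sketched in a few lines. This is a minimal illustration, not the authors' code: it assumes scipy is available and encodes each weighted L-1 step as a linear program via the standard bound-variable trick (u_i ≥ |x_i|); the function name and defaults are my own.

```python
import numpy as np
from scipy.optimize import linprog

def reweighted_l1(Phi, y, n_iter=4, eps=0.1):
    """Iteratively reweighted L-1 basis pursuit (sketch).

    Each iteration solves  min sum_i w_i |x_i|  s.t.  Phi x = y,
    posed as an LP in z = [x; u] with the constraints |x_i| <= u_i.
    """
    m, n = Phi.shape
    w = np.ones(n)                      # w^(0) = 1: first pass is plain basis pursuit
    I = np.eye(n)
    # Inequalities  x - u <= 0  and  -x - u <= 0  encode u_i >= |x_i|.
    A_ub = np.block([[I, -I], [-I, -I]])
    b_ub = np.zeros(2 * n)
    A_eq = np.hstack([Phi, np.zeros((m, n))])   # Phi x = y, u unconstrained here
    bounds = [(None, None)] * n + [(0, None)] * n
    x = None
    for _ in range(n_iter):
        c = np.concatenate([np.zeros(n), w])    # objective: sum_i w_i u_i
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=y,
                      bounds=bounds, method="highs")
        x = res.x[:n]
        w = 1.0 / (np.abs(x) + eps)     # downweight large coefficients
    return x
```

The first iteration reproduces standard basis pursuit; subsequent iterations act out the description above, reducing the penalty on coefficients already identified as large.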

  11. Simulated example • Signal length n = 512 • Number of spikes (nonzeros) = 130 • Measurement matrix has independent normal entries • ε = 0.1 • 2 iterations of the algorithm sufficed for perfect recovery

  12. TV Minimization Image Reconstruction Data: y = sampled Fourier coefficients of the image, Φ = sampled Fourier matrix Goal: Reconstruct the original image x Leverage: Image gradient sparsity, i.e. minimize ||x||_TV subject to y = Φx

  14. Reweighted TV Minimization: minimize Σ_i w_i |(Dx)_i| subject to y = Φx, where D is a discrete gradient operator

  15. Same Reweighted Algorithm
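To make "same reweighted algorithm" concrete, here is a 1-D analogue of reweighted TV, again as a hedged sketch of my own (scipy assumed; the 2-D image case in the paper uses a 2-D gradient, but the structure of the loop is identical): the L-1 penalty is simply moved from x to the differences Dx, and the weights are updated from |Dx| instead of |x|.

```python
import numpy as np
from scipy.optimize import linprog

def reweighted_tv_1d(Phi, y, n_iter=4, eps=0.1):
    """Reweighted TV for a 1-D signal (sketch):
    min sum_i w_i |x_{i+1} - x_i|  s.t.  Phi x = y."""
    m, n = Phi.shape
    D = np.diff(np.eye(n), axis=0)      # first-difference operator, shape (n-1, n)
    w = np.ones(n - 1)
    # Variables z = [x; u] with u_i >= |(D x)_i|.
    A_ub = np.block([[D, -np.eye(n - 1)], [-D, -np.eye(n - 1)]])
    b_ub = np.zeros(2 * (n - 1))
    A_eq = np.hstack([Phi, np.zeros((m, n - 1))])
    bounds = [(None, None)] * n + [(0, None)] * (n - 1)
    x = None
    for _ in range(n_iter):
        c = np.concatenate([np.zeros(n), w])    # objective: sum_i w_i u_i
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=y,
                      bounds=bounds, method="highs")
        x = res.x[:n]
        w = 1.0 / (np.abs(D @ x) + eps)  # downweight large jumps
    return x
```

The only changes from the basis-pursuit version are the operator D inside the penalty and the weight update; this is the sense in which it is the "same" reweighted algorithm.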

  16. Results: standard TV reconstruction vs. reweighted TV reconstruction

  17. Concluding thoughts on reweighted L-1 minimization • An attempt to counteract the "magnitude problem" of the L-1 norm (large coefficients dominate, like outliers in some sense) • The same motivation leads to Iteratively Reweighted Least Squares • Often substantially better recovery results than standard L-1 • Generalizes to other sparsity problems (e.g., TV minimization) • Deeper justification for its efficacy: it is a Majorization-Minimization algorithm for an alternative sparse recovery problem

  18. For more details and experiments see the paper or talk to me!

  19. Epilogue: Majorization-Minimization (MM) justification • Standard epigraph trick: replace each |x_i| in the objective with a bound variable u_i ≥ |x_i| • This makes the objective linear (removes the nonsmoothness) • Adds linear inequality constraints to the model

  20. Epilogue: Majorization-Minimization (MM) justification In the MM approach: • Majorize the objective function (use a first-order approximation at the current iterate) • Form a sub-problem with this majorizing objective and the original constraints • Solve a series of such sub-problems to solve the original problem • In our case the t-th sub-problem takes the reweighted L-1 form
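The MM derivation sketched on slides 19–20 can be written out. The alternative sparse recovery problem is the concave log-sum surrogate (with the same ε > 0 as in the algorithm):

```latex
\min_{x}\ \sum_{i}\log\bigl(|x_i|+\epsilon\bigr)\quad\text{subject to}\quad y=\Phi x .
```

Since the logarithm is concave, its first-order approximation at the current iterate $x^{(t)}$ majorizes it:

```latex
\log\bigl(|x_i|+\epsilon\bigr)\ \le\ \log\bigl(|x_i^{(t)}|+\epsilon\bigr)
  +\frac{|x_i|-|x_i^{(t)}|}{|x_i^{(t)}|+\epsilon}.
```

Minimizing the majorizer (and dropping terms constant in $x$) gives the sub-problem

```latex
x^{(t+1)}=\arg\min_{x}\ \sum_{i}\frac{|x_i|}{|x_i^{(t)}|+\epsilon}
  \quad\text{subject to}\quad y=\Phi x ,
```

which is exactly the reweighted L-1 step with weights $w_i^{(t)} = 1/(|x_i^{(t)}|+\epsilon)$.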
