
Sparse and Overcomplete Data Representation


Presentation Transcript


  1. Sparse and Overcomplete Data Representation. Michael Elad, The CS Department, The Technion – Israel Institute of Technology, Haifa 32000, Israel. Israel Statistical Association 2005 Annual Meeting, Tel-Aviv University (Dan David bldg.), May 17th, 2005.

  2. Welcome to Sparseland – Agenda
  • 1. A Visit to Sparseland: Motivating Sparsity & Overcompleteness
  • 2. Answering the 4 Questions: How & why should this work?

  3. Data Synthesis in Sparseland
  • A fixed dictionary D of size N×K: every column in D is a prototype data vector (atom).
  • A sparse & random vector α is generated with few non-zeros, in random locations and with random values.
  • Multiplying by D gives a data vector x = Dα from the model M.
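
As a concrete illustration of this synthesis model, here is a minimal Python sketch; the dimensions, the sparsity level L, and the Gaussian distributions are illustrative assumptions rather than values from the talk.

```python
# A minimal sketch of Sparseland data synthesis, assuming a random Gaussian
# dictionary D (N x K) with normalized columns; sizes and distributions are
# illustrative choices, not taken from the talk.
import numpy as np

rng = np.random.default_rng(0)
N, K, L = 30, 60, 4          # signal dimension, number of atoms, non-zeros per vector

# Fixed dictionary: K atoms of length N, columns normalized to unit norm.
D = rng.standard_normal((N, K))
D /= np.linalg.norm(D, axis=0)

def synthesize(num_signals):
    """Generate Sparseland vectors x = D @ alpha with L-sparse random alpha."""
    X = np.empty((N, num_signals))
    A = np.zeros((K, num_signals))
    for j in range(num_signals):
        support = rng.choice(K, size=L, replace=False)   # random locations
        A[support, j] = rng.standard_normal(L)           # random values
        X[:, j] = D @ A[:, j]
    return X, A

X, A = synthesize(1000)
```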

  4. M Multiply by D Sparseland Data is Special • Simple:Every generated vector is built as a linear combination of fewatoms from our dictionaryD • Rich:A general model: the obtained vectors are a special type mixture-of-Gaussians (or Laplacians). Sparse and Overcomplete Data Representation

  5. M • Assume that x is known to emerge from . . M • How about “Given x, find the α that generated it in ” ? M T Transforms in Sparseland ? • We desire simplicity, independence, and expressiveness. Sparse and Overcomplete Data Representation

  6. Difficulties with the Transform – 4 Major Questions
  • Is the recovered α the one that generated x? Under which conditions?
  • Are there practical ways to get it?
  • How effective are those ways?
  • How would we get D?

  7. Why Is It Interesting? Sparseland is HERE
  Several recent trends from signal/image processing are worth looking at:
  • From JPEG to JPEG2000: from the (L2-norm) KLT to wavelets and non-linear approximation – Sparsity.
  • From Wiener to robust restoration: from the L2-norm (Fourier) to L1 (e.g., TV, Beltrami, wavelet shrinkage, ...) – Sparsity.
  • From unitary to richer representations: frames, shift-invariance, bilateral, steerable, curvelet – Overcompleteness.
  • Approximation theory: non-linear approximation – Sparsity & Overcompleteness.
  • ICA and related models – Independence and Sparsity.

  8. Agenda
  • 1. A Visit to Sparseland: Motivating Sparsity & Overcompleteness
  • 2. Answering the 4 Questions: How & why should this work?

  9. M Multiply by D Suppose we can solve this exactly Why should we necessarily get ? It might happen that eventually . Question 1 – Uniqueness? Sparse and Overcomplete Data Representation

  10. Matrix "Spark" [Donoho & Elad ('02)]
  • Definition: Given a matrix D, Spark{D} is the smallest number of columns from D that are linearly dependent.
  • Example (the matrix shown on the slide): Rank = 4, while Spark = 3.
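
Computing the Spark is combinatorial in general; the brute-force sketch below (my own illustration, not an algorithm from the talk) makes the definition concrete for small matrices, including a small example whose rank is 4 while its Spark is 3.

```python
# Brute-force illustration of the Spark definition (exponential cost;
# only sensible for very small matrices).
import numpy as np
from itertools import combinations

def spark(D, tol=1e-10):
    """Smallest number of columns of D that are linearly dependent."""
    n_cols = D.shape[1]
    for k in range(1, n_cols + 1):
        for cols in combinations(range(n_cols), k):
            if np.linalg.matrix_rank(D[:, cols], tol=tol) < k:  # dependent subset
                return k
    return np.inf   # all columns independent (convention for this sketch)

# Example: rank is 4, yet three columns (e1, e2, e1+e2) are dependent, so Spark = 3.
D = np.array([[1, 0, 0, 0, 1.0],
              [0, 1, 0, 0, 1.0],
              [0, 0, 1, 0, 0.0],
              [0, 0, 0, 1, 0.0]])
print(np.linalg.matrix_rank(D), spark(D))   # 4 3
```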

  11. Uniqueness Rule [Donoho & Elad ('02)]
  • Suppose the problem (P0) has been solved somehow.
  • Uniqueness: if we found a representation that satisfies the spark-based sparsity bound spelled out below, then it is necessarily unique (the sparsest).
  • This result implies that if M generates vectors using a "sparse enough" α, the solution of (P0) will find it exactly.
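
The sparsity bound in this rule appears as an image on the original slide; the standard statement of the Donoho–Elad uniqueness result is:

```latex
% Uniqueness rule: a sufficiently sparse representation is the unique
% solution of (P0).
\text{If } D\alpha = x \ \text{ and } \ \|\alpha\|_0 < \tfrac{1}{2}\,\mathrm{Spark}\{D\},
\ \text{ then } \alpha \text{ is the unique sparsest representation of } x .
```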

  12. Question 2 – Practical P0 Solver?
  • Given x = Dα from the model M, are there reasonable (practical) ways to find α?

  13. Matching Pursuit (MP) [Mallat & Zhang (1993)]
  • MP is a greedy algorithm that finds one atom at a time.
  • Step 1: find the one atom that best matches the signal.
  • Next steps: given the previously found atoms, find the next one that best fits the residual ...
  • The Orthogonal MP (OMP) is an improved version that re-evaluates the coefficients after each round.
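
A compact sketch of OMP as described above; the least-squares re-fit implements the coefficient re-evaluation mentioned on the slide, while stopping after a fixed number of atoms L is my own illustrative choice (a residual-norm threshold is equally common). Normalized dictionary columns are assumed.

```python
# Minimal Orthogonal Matching Pursuit: greedily pick the atom most correlated
# with the current residual, then re-fit all chosen coefficients by least
# squares (the "orthogonal" step).
import numpy as np

def omp(D, x, L):
    """Return an L-sparse coefficient vector alpha with x ~= D @ alpha."""
    K = D.shape[1]
    residual = x.copy()
    support = []
    alpha = np.zeros(K)
    for _ in range(L):
        # Atom whose correlation with the residual is largest (columns assumed normalized).
        idx = int(np.argmax(np.abs(D.T @ residual)))
        if idx not in support:
            support.append(idx)
        # Re-evaluate all coefficients on the current support.
        coefs, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)
        alpha[:] = 0.0
        alpha[support] = coefs
        residual = x - D @ alpha
    return alpha
```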

  14. Basis Pursuit (BP) [Chen, Donoho, & Saunders (1995)]
  • Instead of solving (P0) (minimize ||α||0 subject to x = Dα), solve instead (P1): minimize ||α||1 subject to x = Dα.
  • The newly defined problem is convex.
  • It has a Linear Programming structure.
  • Very efficient solvers can be deployed:
  • Interior point methods [Chen, Donoho, & Saunders ('95)],
  • Sequential shrinkage for a union of ortho-bases [Bruce et al. ('98)],
  • Methods based on shrinkage, if computing Dα and DTx is fast [Elad ('05)].
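
To make the Linear Programming structure concrete, here is a minimal sketch that casts (P1) as an LP via the standard positive/negative split alpha = u - v; the choice of scipy.optimize.linprog as the solver is mine for illustration, whereas the talk points to interior-point and shrinkage-based methods.

```python
# Basis Pursuit cast as a linear program.
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(D, x):
    """min ||alpha||_1  s.t.  D @ alpha = x, via the split alpha = u - v, u, v >= 0."""
    N, K = D.shape
    c = np.ones(2 * K)                       # objective: sum(u) + sum(v) = ||alpha||_1
    A_eq = np.hstack([D, -D])                # equality constraint: D @ (u - v) = x
    res = linprog(c, A_eq=A_eq, b_eq=x, bounds=(0, None), method="highs")
    u, v = res.x[:K], res.x[K:]
    return u - v
```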

  15. Question 3 – Approximation Quality?
  • How effective are MP/BP in finding the sparse α that generated x?

  16. Mutual Coherence
  • Assume normalized columns and compute the Gram matrix DTD.
  • The Mutual Coherence µ is the largest entry in absolute value outside the main diagonal of DTD.
  • The Mutual Coherence is a property of the dictionary (just like the "Spark"). The smaller it is, the better the dictionary.
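
A direct translation of this definition into a few lines of Python (my own sketch):

```python
# Mutual coherence: normalize the columns, form the Gram matrix D^T D,
# and take the largest off-diagonal entry in absolute value.
import numpy as np

def mutual_coherence(D):
    Dn = D / np.linalg.norm(D, axis=0)       # normalized columns
    G = np.abs(Dn.T @ Dn)                    # |D^T D|
    np.fill_diagonal(G, 0.0)                 # ignore the main diagonal
    return G.max()
```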

  17. BP and MP Equivalence [Donoho & Elad ('02), Gribonval & Nielsen ('03), Tropp ('03), Temlyakov ('03)]
  • Equivalence: given a vector x with a representation x = Dα, and assuming α is sparse enough relative to the mutual coherence µ (the bound spelled out below), BP and MP are guaranteed to find the sparsest solution.
  • MP is typically inferior to BP!
  • The above result corresponds to the worst case.
  • Average-performance results are available too, showing much better bounds [Donoho ('04), Candes et al. ('04), Elad and Zibulevsky ('04)].
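
The sparsity threshold in this equivalence theorem is also an image in the original slide; the standard coherence-based bound from the cited papers reads:

```latex
% Coherence-based equivalence bound: below this sparsity level both
% BP and (O)MP are guaranteed to recover the sparsest solution.
\|\alpha\|_0 \;<\; \frac{1}{2}\left(1 + \frac{1}{\mu(D)}\right)
\quad\Longrightarrow\quad
\text{BP and MP find the sparsest solution of } D\alpha = x .
```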

  18. M Multiply by D Skip? Question 4 – Finding D? • Given these P examples and a fixed size [NK] dictionary D: • Is D unique?(Yes) • How to find D? • Train!! The K-SVD algorithm Sparse and Overcomplete Data Representation

  19. Training: The Objective
  • Arrange the examples as the columns of X and their representations as the columns of A, so that X ≈ DA.
  • Each example is a linear combination of atoms from D.
  • Each example has a sparse representation with no more than L atoms.
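
Written out (the formula appears only as a figure on the original slide), the training objective that the K-SVD algorithm targets is:

```latex
% Dictionary training: fit all examples (columns of X) while keeping each
% representation (column of A) at most L-sparse.
\min_{D,\,A}\ \|X - D A\|_F^2
\quad \text{subject to} \quad
\|\alpha_j\|_0 \le L \ \text{ for every column } \alpha_j \text{ of } A .
```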

  20. The K-SVD Algorithm [Aharon, Elad, & Bruckstein ('04)]
  • Initialize D.
  • Sparse Coding: use MP or BP to represent the examples X over the current D.
  • Dictionary Update: update D column-by-column by an SVD computation.
  • Iterate between the two stages.
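
A compact sketch of the two K-SVD stages, reusing the omp() function sketched above for the sparse-coding step; initializing D from randomly chosen examples and running a fixed number of iterations are illustrative choices, not details from the slide.

```python
# K-SVD sketch: alternate sparse coding (OMP) with a column-by-column
# dictionary update via a rank-1 SVD of the residual.
import numpy as np

def ksvd(X, K, L, n_iter=10, seed=0):
    rng = np.random.default_rng(seed)
    N, P = X.shape
    # Initialize the dictionary from randomly chosen, normalized examples.
    D = X[:, rng.choice(P, size=K, replace=False)].astype(float)
    D /= np.linalg.norm(D, axis=0) + 1e-12

    for _ in range(n_iter):
        # Stage 1: sparse coding of every example over the current D.
        A = np.column_stack([omp(D, X[:, j], L) for j in range(P)])

        # Stage 2: update the dictionary column by column.
        for k in range(K):
            users = np.nonzero(A[k, :])[0]           # examples that use atom k
            if users.size == 0:
                continue
            A[k, users] = 0.0                         # remove atom k's contribution
            E = X[:, users] - D @ A[:, users]         # residual without atom k
            U, s, Vt = np.linalg.svd(E, full_matrices=False)
            D[:, k] = U[:, 0]                         # best rank-1 fit: new atom
            A[k, users] = s[0] * Vt[0, :]             # and its coefficients
    return D, A
```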

  21. Today We Discussed
  • 1. A Visit to Sparseland: Motivating Sparsity & Overcompleteness
  • 2. Answering the 4 Questions: How & why should this work?

  22. Summary
  • Sparsity and overcompleteness are important ideas that can be used in designing better tools in data/signal/image processing.
  • There are difficulties in using them! We are working on resolving those difficulties:
  • Performance of pursuit algorithms,
  • Speedup of those methods,
  • Training the dictionary,
  • Demonstrating applications,
  • ...
  • The dream? Future transforms and regularizations will be data-driven, non-linear, overcomplete, and sparsity-promoting.
