Randomized Dimensionality Reduction with Applications to Signal Processing and Communications
Richard Baraniuk, Rice University


Presentation Transcript


  1. Randomized Dimensionality Reduction with Applications to Signal Processing and Communications Richard Baraniuk Rice University

  2. The Digital Universe • Size: 281 billion gigabytes generated in 2007; digital bits > stars in the universe; growing by a factor of 10 every 5 years; > Avogadro’s number (6.02 x 10^23) in 15 years • Growth fueled by multimedia data: audio, images, video, surveillance cameras, sensor nets, … • In 2007, digital data generated > total storage; by 2011, ½ of the digital universe will have no home [Source: IDC Whitepaper “The Diverse and Exploding Digital Universe”, March 2008]

  3. Act I What’s wrong with today’s multimedia sensor systems? Why go to all the work to acquire massive amounts of multimedia data only to throw much/most of it away? Act II A way out: dimensionality reduction (compressive sensing) enables the design of radically new sensors and systems Act III Compressive sensing in action: new cameras, imagers, ADCs, … Act IV Intersections with information theory: new problems, new solutions

  4. Sense by Sampling sample

  5. Sense by Sampling too much data! sample

  6. Sense then Compress sample compress JPEG JPEG2000 … decompress

  7. Sparsity large wavelet coefficients (blue = 0) pixels

  8. Sparsity large wavelet coefficients (blue = 0) large Gabor (TF) coefficients pixels wideband signal samples frequency time
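The compressibility claim above can be checked numerically. Below is a minimal Python sketch (not part of the presentation) that uses a DCT in place of a wavelet or Gabor dictionary; the test signal, the choice of K, and all variable names are illustrative.

    # Keep only the K largest transform coefficients of a piecewise-smooth signal
    # and measure how little is lost: this is the sparsity/compressibility idea.
    import numpy as np
    from scipy.fft import dct, idct

    N = 1024
    t = np.linspace(0, 1, N)
    x = np.piecewise(t, [t < 0.3, (t >= 0.3) & (t < 0.7), t >= 0.7],
                     [lambda t: np.sin(8 * np.pi * t), 1.0, lambda t: t ** 2])

    c = dct(x, norm='ortho')              # transform coefficients
    K = 64
    small = np.argsort(np.abs(c))[:-K]    # indices of all but the K largest
    c_K = c.copy()
    c_K[small] = 0.0                      # hard-threshold to a K-sparse approximation

    x_K = idct(c_K, norm='ortho')
    print("relative error keeping", K, "of", N, "coefficients:",
          np.linalg.norm(x - x_K) / np.linalg.norm(x))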

  9. What’s Wrong with this Picture? • Why go to all the work to acquire N samples only to discard all but K pieces of data? sample compress decompress

  10. What’s Wrong with this Picture? • nonlinear processing, nonlinear signal model (union of subspaces) • linear processing, linear signal model (bandlimited subspace) sample compress decompress

  11. Compressive Sensing • Directly acquire “compressed” data via dimensionality reduction • Replace samples by more general “measurements” compressive sensing recover

  12. Compressive Sensing Theory: A Geometrical Perspective

  13. Sampling • Signal is K-sparse in some basis/dictionary • WLOG assume sparse in the space domain sparse signal (K nonzero entries)

  14. Sampling • Signal is K-sparse in some basis/dictionary • WLOG assume sparse in the space domain • Sampling: sparse signal (K nonzero entries) mapped to measurements
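A minimal restatement of the sampling model above, in standard compressive-sensing notation (the symbols y, Phi, x, M, N, K are the usual ones, not taken from the slide images, which did not survive the transcript):

\[
y = \Phi x, \qquad \Phi \in \mathbb{R}^{M \times N}, \quad M \ll N, \quad \|x\|_0 = K ,
\]

i.e., M linear measurements y of an N-dimensional signal x that has only K nonzero entries.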

  15. Compressive Sampling • When data is sparse/compressible, can directly acquire a condensed representation with no/little information loss through linear dimensionality reduction sparse signal, measurements, nonzero entries

  16. How Can It Work? • Projection not full rank … and so loses information in general • Ex: infinitely many signals map to the same measurements (null space)

  17. How Can It Work? • Projection not full rank … and so loses information in general • But we are only interested in sparse vectors

  18. How Can It Work? • Projection not full rank … and so loses information in general • But we are only interested in sparse vectors • The projection is then effectively M x K

  19. How Can It Work? • Projection not full rank … and so loses information in general • But we are only interested in sparse vectors • Design the projection so that each of its M x K submatrices is full rank (ideally close to an orthobasis)

  20. Geometrical Situation • Sparse signal: all but K coordinates are zero • Q: structure of the set of all K-sparse vectors? sparse signal (K nonzero entries)

  21. Geometrical Situation Sparse signal: all but K coordinates are zero Model: union of K-dimensional subspaces aligned w/ coordinate axes (highly nonlinear!) sparse signal (K nonzero entries)

  22. Stable Embedding An information-preserving projection preserves the geometry of the set of sparse signals How to do this? Ensure the projection preserves the K-dim subspaces

  23. Restricted Isometry Property RIP of order 2K implies the projection preserves distances between any two K-sparse vectors (each is K-sparse; their difference is 2K-sparse) RIP: any selection of 2K columns is close to an orthobasis

  24. Restricted Isometry Property RIP of order 2K implies the projection preserves distances between any two K-sparse vectors (each is K-sparse; their difference is 2K-sparse) RIP: any selection of 2K columns is close to an orthobasis Unfortunately, this is a combinatorial, NP-complete design problem
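The RIP condition itself appeared only as an equation image on slides 23-24; its standard statement, with the usual restricted isometry constant written here as delta_2K, is

\[
(1 - \delta_{2K})\,\|z\|_2^2 \;\le\; \|\Phi z\|_2^2 \;\le\; (1 + \delta_{2K})\,\|z\|_2^2
\quad \text{for every } 2K\text{-sparse } z ,
\]

so that for two K-sparse vectors x_1, x_2 the difference z = x_1 - x_2 is 2K-sparse and its length is nearly preserved.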

  25. Insight from the 80’s [Kashin, Gluskin] • Draw the projection matrix at random • iid Gaussian • iid Bernoulli … • Then it has the RIP with high probability, provided the number of measurements M is large enough
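The "provided …" condition on slide 25 was also an image; the standard requirement is on the order of M >= c K log(N/K) measurements. The Python sketch below (illustrative constants and names, not from the presentation) draws an iid Gaussian matrix of that size and checks empirically that it nearly preserves the norms of random K-sparse vectors.

    # Draw an iid Gaussian measurement matrix and check RIP-like behavior empirically.
    import numpy as np

    rng = np.random.default_rng(0)
    N, K = 1024, 20
    M = int(4 * K * np.log(N / K))       # M ~ c K log(N/K); c = 4 is an illustrative choice

    Phi = rng.normal(0.0, 1.0 / np.sqrt(M), size=(M, N))   # iid N(0, 1/M) entries

    ratios = []
    for _ in range(200):
        x = np.zeros(N)
        x[rng.choice(N, K, replace=False)] = rng.normal(size=K)
        ratios.append(np.linalg.norm(Phi @ x) / np.linalg.norm(x))

    print(f"M = {M}; ||Phi x|| / ||x|| ranges over [{min(ratios):.2f}, {max(ratios):.2f}]")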

  26. One Proof Technique • Consider one K-dim subspace • Cover the unit sphere in the subspace with a point cloud • Apply the Johnson-Lindenstrauss Lemma to show isometry for all points in the cloud: O(log Q) random projections preserve inter-point distances in a cloud of Q points whp • Extend to the entire K-dim subspace • Extend via union bound to all subspaces [Baraniuk, Davenport, DeVore, Wakin, Constructive Approximation, 2008]
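For reference, the Johnson-Lindenstrauss step invoked above says, in its usual form, that a random projection into

\[
M = O\!\left(\varepsilon^{-2} \log Q\right)
\]

dimensions preserves all pairwise distances in a cloud of Q points to within a factor 1 +/- epsilon with high probability; the covering argument then extends this from the point cloud to the whole K-dimensional subspace and, by a union bound, to all \(\binom{N}{K}\) such subspaces.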

  27. Randomized Dim. Reduction • Measurements = random linear combinations of the entries of the signal • No information loss for sparse vectors whp sparse signal (K nonzero entries), measurements

  28. CS Signal Recovery • Goal: recover signal from measurements • Problem: random projection not full rank (ill-posed inverse problem) • Solution: exploit the sparse/compressible geometry of the acquired signal

  29. CS Signal Recovery • Random projection not full rank • Recovery problem: given the measurements, find the signal • Null space • Search in the null space for the “best” estimate according to some criterion • ex: least squares (N-M)-dim hyperplane at random angle

  30. Signal Recovery • Recovery: given the measurements (ill-posed inverse problem), find the signal (sparse) • Optimization: least squares • Closed-form solution: pseudoinverse

  31. Signal Recovery Recovery: given the measurements (ill-posed inverse problem), find the signal (sparse) Optimization: least squares Closed-form solution: pseudoinverse Wrong answer!

  32. Signal Recovery Recovery: given the measurements (ill-posed inverse problem), find the signal (sparse) Optimization: least squares Closed-form solution: pseudoinverse Wrong answer!
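The optimization and closed-form solution referred to on slides 30-32 were equation images; the natural least-squares reading of slide 29's criterion, in standard notation, is

\[
\hat{x} = \arg\min_{x} \|x\|_2 \quad \text{s.t.} \quad \Phi x = y ,
\qquad
\hat{x} = \Phi^{\top}\!\left(\Phi \Phi^{\top}\right)^{-1} y \ \ (\Phi \text{ full row rank}),
\]

which is the minimum-energy point on the translated null space; it is almost never sparse, hence the "wrong answer".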

  33. Signal Recovery Recovery: given the measurements (ill-posed inverse problem), find the signal (sparse) Optimization: “find sparsest vector in translated null space”

  34. Signal Recovery Recovery: given the measurements (ill-posed inverse problem), find the signal (sparse) Optimization: “find sparsest vector in translated null space” Correct! But an NP-complete algorithm
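Written out in the standard notation (the slide's own formula was an image), the "sparsest vector in the translated null space" program is

\[
\hat{x} = \arg\min_{x} \|x\|_0 \quad \text{s.t.} \quad \Phi x = y ,
\]

where the l0 "norm" counts nonzero entries; this gives the correct answer but requires a search over all supports, hence NP-complete in general.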

  35. Signal Recovery Recovery: given the measurements (ill-posed inverse problem), find the signal (sparse) Optimization: convexify the optimization [Candès, Romberg, Tao; Donoho]

  36. Signal Recovery Recovery: given the measurements (ill-posed inverse problem), find the signal (sparse) Optimization: convexify the optimization Correct! Polynomial-time algorithm
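Convexifying replaces the l0 count with the l1 norm (basis pursuit), min ||x||_1 s.t. Phi x = y, which is a linear program. The Python sketch below (toy sizes and illustrative names, not the presentation's own code) solves it with scipy's LP solver by splitting x into nonnegative positive and negative parts.

    # Basis pursuit: min ||x||_1 subject to Phi x = y, posed as a linear program.
    import numpy as np
    from scipy.optimize import linprog

    rng = np.random.default_rng(1)
    N, K, M = 256, 8, 64

    x = np.zeros(N)
    x[rng.choice(N, K, replace=False)] = rng.normal(size=K)   # K-sparse signal
    Phi = rng.normal(size=(M, N)) / np.sqrt(M)                # random measurements
    y = Phi @ x

    # Split x = u - v with u, v >= 0, so ||x||_1 = sum(u + v) and the problem is an LP.
    c = np.ones(2 * N)
    A_eq = np.hstack([Phi, -Phi])
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None), method="highs")
    x_hat = res.x[:N] - res.x[N:]

    print("relative recovery error:", np.linalg.norm(x - x_hat) / np.linalg.norm(x))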

  37. Summary: CS • Encoding: measurements = random linear combinations of the entries of the signal • Decoding: recover the signal from the measurements via optimization sparse signal (K nonzero entries), measurements

  38. CS Hallmarks • Stable: acquisition/recovery process is numerically stable • Asymmetrical (most processing at decoder): conventional: smart encoder, dumb decoder; CS: dumb encoder, smart decoder • Democratic: each measurement carries the same amount of information; robust to measurement loss and quantization (“digital fountain” property) • Random measurements are encrypted • Universal: same random projections / hardware can be used for any sparse signal class (generic)

  39. Universality • Random measurements can be used for signals sparse in any basis

  40. Universality • Random measurements can be used for signals sparse in any basis

  41. Universality • Random measurements can be used for signals sparse in any basis sparse coefficient vector (K nonzero entries)
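The universality argument of slides 39-41, with the dropped equations restored in the standard notation (Phi, Psi, alpha are the usual symbols, not from the slide images): if the signal is sparse in some orthobasis Psi, i.e. x = Psi alpha with alpha K-sparse, then

\[
y = \Phi x = (\Phi \Psi)\,\alpha ,
\]

and for an iid Gaussian Phi the product Phi Psi is again an iid Gaussian matrix (rotation invariance), so the same random measurements work for any sparsity basis.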

  42. Compressive Sensing In Action

  43. Gerhard Richter, 4096 Farben / 4096 Colours, 1974. 254 cm x 254 cm, lacquer on canvas. Catalogue Raisonné: 359. Museum Collection: Staatliche Kunstsammlungen Dresden (on loan). Sales history: 11 May 2004, Christie's New York, Post-War and Contemporary Art (Evening Sale), Lot 34, US$3,703,500

  44. Compressive Sensing In Action: Cameras

  45. “Single-Pixel” CS Camera scene, single photon detector, image reconstruction or processing, DMD, random pattern on DMD array w/ Kevin Kelly

  46. “Single-Pixel” CS Camera scene, single photon detector, image reconstruction or processing, DMD, random pattern on DMD array … • Flip the mirror array M times to acquire M measurements • Sparsity-based (linear programming) recovery
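A minimal simulation of the acquisition loop described on slides 45-46 (illustrative only; not the Rice hardware or its code): each of the M mirror flips yields one photodetector reading, the inner product of the scene with a pseudorandom 0/1 DMD pattern.

    # Simulate M single-pixel measurements of a toy 64x64 scene.
    import numpy as np

    rng = np.random.default_rng(2)
    n = 64
    scene = np.zeros((n, n))
    scene[16:48, 16:48] = 1.0                      # simple block "image"

    M = 1300                                       # mirror flips = measurements (cf. slide 47)
    patterns = rng.integers(0, 2, size=(M, n * n)) # random 0/1 mirror states
    y = patterns @ scene.ravel()                   # one detector reading per pattern

    # Recovery would then apply a sparsity-based solver (e.g., the basis-pursuit
    # sketch above) to y and the pattern matrix.
    print(y.shape)                                 # -> (1300,)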

  47. First Image Acquisition target 65536 pixels 1300 measurements (2%) 11000 measurements (16%)

  48. Utility? Fairchild 100 Mpixel CCD, single photon detector, DMD

  49. CS Low-Light Imaging with PMT true color low-light imaging 256 x 256 image with 10:1 compression [Nature Photonics, April 2007]
