
Distributed Target Localization via Spatial Sparsity -or- Array Processing Meets Compressive Sensing

Volkan Cevher, Marco Duarte, Richard Baraniuk


Presentation Transcript


  1. Distributed Target Localization via Spatial Sparsity -or- Array Processing Meets Compressive Sensing Volkan Cevher, Marco Duarte, Richard Baraniuk

  2. Array Processing • Goal: Localize targets by fusing measurements from an array of sensors • collect time signal data • communicate signals to central fusion center • solve an optimization problem

  3. Array Processing • Goal: Localize targets by fusing measurements from an array of sensors • collect time signal data • communicate signals to central fusion center • solve an optimization problem

  4. Array Processing • Goal: Localize targets by fusing measurements from an array of sensors • collect time signal data • requires potentially high-rate (Nyquist) sampling • communicate signals to central fusion center • potentially large communication burden • solve an optimization problem • ex: maximum-likelihood estimation (MLE)

  5. Review of Compressive Sensing Theory

  6. Digital Data Acquisition • Foundation: Shannon sampling theorem: “if you sample densely enough (at the Nyquist rate), you can perfectly reconstruct the original analog data” (in time or in space)

  7. Sensing by Sampling • Long-established paradigm for digital data acquisition • uniformly sample data at Nyquist rate (2x Fourier bandwidth)

  8. Sensing by Sampling • Long-established paradigm for digital data acquisition • uniformly sample data at Nyquist rate (2x Fourier bandwidth): too much data!

  9. Sensing by Sampling • Long-established paradigm for digital data acquisition • uniformly sample data at Nyquist rate (2x Fourier bandwidth) • compress data [pipeline: sample → compress (JPEG, JPEG2000, …) → transmit/store → receive → decompress]

  10. Sparsity / Compressibility [figure: image pixels → few large wavelet coefficients (blue = 0); wideband signal samples in time → few large Gabor (time-frequency) coefficients]
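To make the sparsity idea above concrete, here is a small illustration of my own (not from the slides): a signal that occupies every time sample yet has only a handful of significant coefficients in the Fourier basis.

```python
import numpy as np

# A signal that is dense in time but sparse in frequency: a sum of three
# sinusoids at exact DFT bin frequencies (frequencies chosen arbitrarily).
N = 256
t = np.arange(N)
freqs = [5, 37, 80]
x = sum(np.cos(2 * np.pi * f * t / N) for f in freqs)

X = np.fft.rfft(x)  # one-sided spectrum
large = int(np.sum(np.abs(X) > 1e-6 * np.abs(X).max()))
print(f"{N} time samples, but only {large} significant Fourier coefficients")
```

All 256 time samples are nonzero, yet only 3 Fourier coefficients carry the signal: this is exactly the kind of structure compression (and, later, compressive sensing) exploits.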

  11. Sample / Compress • Long-established paradigm for digital data acquisition • uniformly sample data at Nyquist rate • compress data [pipeline: sample → compress (sparse/compressible wavelet transform) → transmit/store → receive → decompress]

  12. What’s Wrong with this Picture? • Why go to all the work to acquire N samples only to discard all but K pieces of data? [pipeline: sample → compress (sparse/compressible wavelet transform) → transmit/store → receive → decompress]

  13. What’s Wrong with this Picture? • sampling: linear processing, linear signal model (bandlimited subspace) • compression: nonlinear processing, nonlinear signal model (union of subspaces)

  14. Compressive Sensing • Directly acquire “compressed” data • Replace samples by more general “measurements” [pipeline: compressive sensing → transmit/store → receive → reconstruct]

  15. Sampling • Signal x is K-sparse in basis/dictionary Ψ • WLOG assume x is sparse in the space domain: only K of its N entries are nonzero

  16. Sampling • Signal x is K-sparse in basis/dictionary Ψ • WLOG assume x is sparse in the space domain (K nonzero entries) • Samples: measurements y = Φx

  17. Compressive Sampling • When data is sparse/compressible, can directly acquire a condensed representation with no/little information loss through linear dimensionality reduction: M measurements y = Φx of a K-sparse signal x
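A minimal sketch of this acquisition step (sizes and names are my own, not the slides'): a K-sparse length-N signal condensed into M random linear measurements, M far below N.

```python
import numpy as np

# K-sparse signal x in R^N, measured through a random M x N Gaussian
# matrix Phi: y = Phi @ x gives M << N condensed measurements.
rng = np.random.default_rng(1)
N, K, M = 512, 8, 64

x = np.zeros(N)
support = rng.choice(N, size=K, replace=False)
x[support] = rng.standard_normal(K)            # K nonzero entries

Phi = rng.standard_normal((M, N)) / np.sqrt(M)  # random measurement matrix
y = Phi @ x                                     # linear dimensionality reduction

print(f"{M} measurements of a length-{N}, {K}-sparse signal")
```

Note that the acquisition itself is a single matrix-vector product: no Nyquist-rate sampling followed by compression, just M linear functionals of x.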

  18. How Can It Work? • Projection Φ is not full rank … and so loses information in general • Ex: infinitely many x’s map to the same y

  19. How Can It Work? • Projection Φ is not full rank … and so loses information in general • But we are only interested in sparse vectors: only K columns of Φ are active

  20. How Can It Work? • Projection Φ is not full rank … and so loses information in general • But we are only interested in sparse vectors • Φ is effectively M×K: only the K columns hit by the nonzero entries matter

  21. How Can It Work? • Projection Φ is not full rank … and so loses information in general • But we are only interested in sparse vectors • Design Φ so that each of its M×K submatrices is full rank

  22. How Can It Work? • Goal: Design Φ so that each of its M×2K submatrices is full rank • the difference between two K-sparse vectors is 2K-sparse in general • this preserves the information in K-sparse signals • Restricted Isometry Property (RIP) of order 2K

  23. Unfortunately… • Goal: Design Φ so that each of its M×2K submatrices is full rank (Restricted Isometry Property – RIP) • Unfortunately, this is a combinatorial, NP-complete design problem

  24. Insight from the 80’s [Kashin, Gluskin] • Draw Φ at random • iid Gaussian • iid Bernoulli … • Then Φ has the RIP with high probability as long as M = O(K log(N/K)) • M×2K submatrices are full rank • stable embedding for sparse signals • extends to compressible signals in ℓp balls
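The full-rank-submatrix claim can be spot-checked empirically (a sanity check, not a proof, with sizes chosen by me): draw a Gaussian Φ and test random M×2K column subsets for rank 2K.

```python
import numpy as np

# Empirically check that random M x 2K submatrices of an iid Gaussian
# matrix are full rank, as the slide asserts happens with high probability.
rng = np.random.default_rng(2)
N, K, M = 256, 5, 40          # M comfortably above 2K = 10
Phi = rng.standard_normal((M, N))

trials = 200
full_rank = 0
for _ in range(trials):
    cols = rng.choice(N, size=2 * K, replace=False)
    if np.linalg.matrix_rank(Phi[:, cols]) == 2 * K:
        full_rank += 1
print(f"{full_rank}/{trials} random M x 2K submatrices were full rank")
```

For Gaussian entries this holds with probability 1 for any fixed subset; the deeper RIP statement (uniform near-isometry over all subsets at once) is what requires M on the order of K log(N/K).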

  25. Compressive Data Acquisition • Measurements y = random linear combinations of the entries of x • WHP, Φ does not distort the structure of sparse signals • no information loss

  26. CS Signal Recovery • Goal: Recover signal x from measurements y • Problem: Random projection Φ is not full rank (ill-posed inverse problem) • Solution: Exploit the sparse/compressible geometry of the acquired signal

  27. CS Signal Recovery • Recovery: given y = Φx (ill-posed inverse problem), find x (sparse) • ℓ2 minimization: fast, wrong • ℓ0 minimization: correct, slow • ℓ1 minimization: correct, efficient; a linear program, requiring only mild oversampling in the number of measurements [Candes, Romberg, Tao; Donoho]
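The ℓ1 route can be written as a small linear program (basis pursuit). Here is a sketch, with the split x = u - v and all problem sizes chosen by me, solved with SciPy's general-purpose LP solver:

```python
import numpy as np
from scipy.optimize import linprog

# Basis pursuit: min ||x||_1  s.t.  Phi @ x = y, posed as an LP by
# splitting x = u - v with u, v >= 0, so ||x||_1 = sum(u) + sum(v).
rng = np.random.default_rng(3)
N, K, M = 128, 4, 40                      # mild oversampling over K

x_true = np.zeros(N)
support = rng.choice(N, size=K, replace=False)
x_true[support] = rng.standard_normal(K)

Phi = rng.standard_normal((M, N)) / np.sqrt(M)
y = Phi @ x_true

c = np.ones(2 * N)                        # objective: sum(u) + sum(v)
A_eq = np.hstack([Phi, -Phi])             # constraint: Phi @ (u - v) = y
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None))
x_hat = res.x[:N] - res.x[N:]

print("recovery error:", np.linalg.norm(x_hat - x_true))
```

With M well above K log(N/K), the LP recovers x exactly (to solver tolerance), even though the system Φx = y is badly underdetermined.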

  28. Universality • Random measurements can be used for signals sparse in any basis

  29. Universality • Random measurements can be used for signals sparse in any basis

  30. Universality • Random measurements can be used for signals sparse in any basis: x = Ψα, where the sparse coefficient vector α has K nonzero entries

  31. Compressive Sensing • Directly acquire “compressed” data • Replace N samples by M random projections [pipeline: random measurements → transmit/store → receive → linear program]

  32. Back to Array Processing

  33. Problem Setup • Discretize space into a localization grid with N grid points • fixes the localization resolution • the P sensors do not have to be on grid points

  34. Problem Setup • Discretize space into a localization grid with N grid points • Construct a localization dictionary • reference sensor i measures a signal • for all grid positions n = 1, …, N: assume the target is at grid position n • for all sensors j = 1, …, P: use the Green’s function to estimate the signal sensor j would measure if the target were at position n
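The dictionary construction above can be sketched in a few lines. Everything concrete here is my own invention for illustration (the grid, sensor positions, propagation speed, and source signature), using the simplest free-space Green's function: a pure delay plus 1/r attenuation.

```python
import numpy as np

# Toy localization dictionary: one column per grid point, each column the
# stacked signals the P sensors would record for a target at that point.
rng = np.random.default_rng(4)
c_prop = 340.0                # propagation speed (m/s, acoustic; assumed)
fs = 1000                     # sample rate (Hz; assumed)
T = 128                       # samples recorded per sensor

grid = 5.0 * np.array([[gx, gy] for gx in range(4) for gy in range(4)])
sensors = np.array([[0.0, 0.0], [20.0, 0.0], [0.0, 20.0]])   # P = 3
N, P = len(grid), len(sensors)

s = rng.standard_normal(T)    # reference source signature

def predicted(n, j):
    """Signal sensor j would measure if the target sat at grid point n."""
    r = np.linalg.norm(grid[n] - sensors[j])
    delay = int(round(r / c_prop * fs))           # travel time in samples
    out = np.zeros(T)
    out[delay:] = s[: T - delay] / max(r, 1.0)    # shift and attenuate
    return out

# Stack the per-sensor predictions into one dictionary column per grid point
Psi = np.column_stack(
    [np.concatenate([predicted(n, j) for j in range(P)]) for n in range(N)]
)
print("localization dictionary shape:", Psi.shape)   # (P*T, N)
```

Each column is the full multi-sensor "fingerprint" of one candidate location, which is what makes localization expressible as sparse approximation on the next slides.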

  35. Signal Observation Model [figure: localization grid; actual sensor measurements]

  36. Signal Observation Model [figure: localization grid; localization dictionary; actual sensor measurements]

  37. Localization as Sparse Approximation [figure: localization grid; actual sensor measurements; true target location]

  38. Localization as Sparse Approximation [figure: localization grid; actual sensor measurements; true target location] • given the measurements, the target location can be recovered via a linear program
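A self-contained toy version of this recovery step follows. The slides solve it with a linear program; for a single target, simply correlating the measurement against the dictionary columns is a minimal stand-in, and all the geometry below (delays per grid point, signature) is invented for illustration.

```python
import numpy as np

# Toy single-target localization: the measurement y is one (noisy)
# dictionary column; find which grid point explains it best.
rng = np.random.default_rng(5)
T, N = 128, 25                          # samples per column, grid points

s = rng.standard_normal(T)              # target signature
delays = rng.permutation(40)[:N]        # distinct delay per grid point
Psi = np.zeros((T, N))
for n, d in enumerate(delays):
    Psi[d:, n] = s[: T - d]             # delayed copies of the signature

true_n = 17                             # true target grid index
y = Psi[:, true_n] + 0.05 * rng.standard_normal(T)   # noisy measurement

# Normalize columns, then pick the best-matching grid point
Psi_n = Psi / np.linalg.norm(Psi, axis=0)
n_hat = int(np.argmax(np.abs(Psi_n.T @ y)))
print("estimated grid index:", n_hat)
```

This works because the delayed signatures are mutually incoherent (the random signature has a fast-decaying autocorrelation), exactly the dictionary condition the next slide states.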

  39. Valid Dictionaries • Sparse approximation works when the columns of the dictionary are mutually incoherent [Vandergheynst, Gribonval, et al.] • True when the target signal has a fast-decaying autocorrelation • Extends to multiple targets with small cross-correlations • Correlations control the localization resolution

  40. Multiple Targets [figure: localization grid; actual sensor measurements; 2 true target locations]

  41. Typical Correlation Functions [figure panels: ACFs of Toyota Prius, Isuzu Rodeo, Chevy Camaro; CCFs of Rodeo vs. Prius, Rodeo vs. Camaro, Camaro vs. Prius]

  42. Enter Compressive Sensing • Since the localization vector is sparse, acquire and transmit just a few compressive measurements to the fusion center

  43. Enter CS: ELVIS • Since the localization vector is sparse, acquire and transmit just a few compressive measurements to the fusion center • ELVIS: Enhanced Localization Via Incoherence and Sparsity

  44. ELVIS • Goal: Localize targets by fusing measurements from an array of sensors • collect time signal data • requires potentially high-rate (Nyquist) sampling • communicate signals to central fusion center • potentially large communication burden • solve an optimization problem

  45. Synthetic Results

  46. Field Data Results Field example: a 5-vehicle convoy, 2 HMMWVs and 3 commercial SUVs.

  47. Summary • Compressive sensing • integrates sensing, compression, processing • exploits signal sparsity/compressibility information • enables new sensing modalities, architectures, systems • ELVIS: CS for array processing and localization • array signals can form an incoherent dictionary • sub-Nyquist sampling at each sensor • communication bandwidth usage scales logarithmically with the number of sensors and/or desired resolution • democratic compressive measurements robust to quantization (even 1-bit), noise, and packet loss • universality of compressive measurements enables design/deployment of inexpensive generic sensing hardware

  48. dsp.rice.edu/cs
