
Presentation Transcript


  1. ADD TO TALK • RON STUFF

  2. New Theory and Algorithms for Scalable Data Fusion Richard Baraniuk, Volkan Cevher (Rice University), Ron DeVore (Texas A&M University), Martin Wainwright (University of California, Berkeley), Michael Wakin (Colorado School of Mines)

  3. Networked Sensing Goals • sense • communicate • fuse • infer (detect, recognize, etc.) • predict • actuate/navigate [figure: sensors linked via network infrastructure to human intelligence]

  4. Networked Sensing Challenges • growing volumes of sensor data • increasingly diverse data • diverse and changing operating conditions • increasing mobility [figure: network infrastructure, human intelligence]

  5. Research Challenges • Sheer amount of data that must be acquired, communicated, and processed: J sensors, N samples/pixels per sensor • Amount of data grows as O(JN) • can lead to communication and computation collapse • Must fuse diverse data types

  6. Research Program • Thrust 1: Scalable data models • Thrust 2: Randomized dimensionality reduction • Thrust 3: Scalable inference algorithms • Thrust 4: Scalable data fusion • Thrust 5: Scalable learning algorithms

  7. Thrust 1: Scalable Data Models • Unifying theme: low-dimensional signal structure • Sparse signal models • Graphical models • Manifold models • Exploit geometry of these models

  8. 1. Sparse Models [figure: image pixels and their large wavelet coefficients (blue = 0); K-sparse signals live on a union of K-dim subspaces]

  9. 2. Graphical Models

  10. 3. Manifold Models • Image articulation manifold (IAM) • Manifold dimension L = # imaging parameters • If images are smooth, then the manifold is smooth [figure: articulation parameter space]

  11. Thrust 2: Randomized Dimensionality Reduction • Measure y = Φx, where Φ is a random M×N matrix with M ≪ N • Goal: preserve the information in x from the measurements y • One avenue: stable embedding • Key question: how small can M be? [figure: signal from a sparse, graphical, or manifold model; its randomized measurements]
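A minimal sketch of the stable-embedding idea (sizes are illustrative; a Gaussian Φ is one standard choice, not necessarily the one used in the talk): a random M×N matrix approximately preserves pairwise distances between sparse signals.

```python
# Empirical check that a random Gaussian matrix approximately preserves
# pairwise distances between sparse signals, i.e. acts as a stable embedding.
import numpy as np

rng = np.random.default_rng(0)
N, M, K = 1024, 100, 5          # ambient dim, measurements, sparsity

def sparse_signal():
    x = np.zeros(N)
    x[rng.choice(N, K, replace=False)] = rng.standard_normal(K)
    return x

Phi = rng.standard_normal((M, N)) / np.sqrt(M)   # random measurement matrix

x1, x2 = sparse_signal(), sparse_signal()
d_orig = np.linalg.norm(x1 - x2)
d_proj = np.linalg.norm(Phi @ (x1 - x2))
print(f"distance ratio ||Phi(x1-x2)|| / ||x1-x2|| = {d_proj / d_orig:.3f}")
# For M = O(K log(N/K)) this ratio concentrates near 1 with high probability.
```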

  12. Sparse Models [figure: union of K-dim subspaces]

  13. Sparse Models [figure: union of K-dim subspaces]

  14. Sparse Models • Stable embedding ⇔ restricted isometry property (RIP) from compressive sensing • Stability whp if M = O(K log(N/K)) [figure: union of K-dim subspaces]
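As a sanity check on that measurement bound, here is a hedged sketch of compressive acquisition and recovery; scikit-learn's orthogonal matching pursuit stands in for whatever recovery algorithm the deck assumes.

```python
# Compressive measurement of a K-sparse signal, then recovery via OMP.
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(1)
N, K = 1024, 10
M = int(2 * K * np.log(N / K))           # M = O(K log(N/K)) measurements

x = np.zeros(N)
x[rng.choice(N, K, replace=False)] = rng.standard_normal(K)

Phi = rng.standard_normal((M, N)) / np.sqrt(M)
y = Phi @ x                              # compressive measurements

omp = OrthogonalMatchingPursuit(n_nonzero_coefs=K, fit_intercept=False).fit(Phi, y)
print("relative recovery error:",
      np.linalg.norm(omp.coef_ - x) / np.linalg.norm(x))
```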

  15. Single-Pixel Camera • M randomized measurements from N mirrors • target: N = 65536 pixels • M = 1300 measurements (2%) • M = 11000 measurements (16%)
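A toy simulation of the single-pixel architecture (all sizes and patterns here are illustrative, not the hardware's actual parameters): each measurement is one photodetector reading of the scene against a random ±1 mirror pattern.

```python
# Simulate single-pixel acquisition: inner products of the scene with
# random +/-1 DMD mirror patterns.
import numpy as np

rng = np.random.default_rng(2)
n = 64                                   # 64x64 "scene", N = 4096 pixels
scene = np.zeros((n, n))
scene[20:40, 20:40] = 1.0                # toy target
x = scene.ravel()

M = 400                                  # M << N randomized measurements
patterns = rng.choice([-1.0, 1.0], size=(M, x.size))  # mirror patterns
y = patterns @ x                         # one photodetector reading per pattern
print(y.shape)                           # (400,) -- the camera's entire output
```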

  16. Graphical Models • Example: K-sparse signals [figure: union of K-dim subspaces]

  17. Graphical Models • Example: K-sparse signals with correlations • Rules out some/many subspaces • Stability whp with M as low as O(K) [figure: reduced union of K-dim subspaces]

  18. Ex: Clustered Signals • Model the clustering of significant pixels in the space domain using an Ising Markov random field • Example: recovery of background-subtracted video from randomized measurements [figure: target vs. Ising-model recovery vs. CoSaMP recovery vs. LP (FPC) recovery]

  19. Manifold Models • Can stably embed a compact, smooth L-dimensional manifold whp if M grows linearly in L and only logarithmically in N • Recall that the manifold dimension L is very small for many apps (# imaging parameters) • Constants scale with the manifold's • condition number (curvature) • volume

  20. Thrust 3: Scalable Inference • Many applications involve signal inference, not reconstruction: detection < classification < estimation < reconstruction • Good news: RDR supports efficient learning, inference, and processing directly on the compressive measurements • Random projections ~ sufficient statistics for signals with concise geometrical structure
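To make "inference directly on compressive measurements" concrete, here is a hedged sketch (template counts, dimensions, and the nearest-neighbor rule are illustrative): classification happens entirely in the M-dimensional measurement space, with no reconstruction.

```python
# Nearest-neighbor classification performed on randomized measurements,
# never reconstructing the N-dimensional signals.
import numpy as np

rng = np.random.default_rng(3)
N, M = 4096, 64
Phi = rng.standard_normal((M, N)) / np.sqrt(M)

templates = rng.standard_normal((10, N))         # 10 class templates
x = templates[7] + 0.1 * rng.standard_normal(N)  # noisy instance of class 7

y = Phi @ x                                      # compressive measurements
proj = templates @ Phi.T                         # project templates once, offline
label = np.argmin(np.linalg.norm(proj - y, axis=1))
print("estimated class:", label)                 # 7, with high probability
```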

  21. Classification • Simple object classification problem • AWGN: nearest-neighbor classifier • Common issue: L unknown articulation parameters • Common solution: matched filter, i.e., find the nearest neighbor under all articulations

  22. Matched Filter Geometry • Classification with L unknown articulation parameters • Images are points in R^N • Classify by finding the closest target template to the data for each class (distance or inner product) [figure: data point and target templates (points) from a generative model or training data]

  23. Matched Filter Geometry • Detection/classification with L unknown articulation parameters • Images are points in R^N • Classify by finding the closest target template to the data • As the template's articulation parameter changes, the templates map out an L-dim nonlinear manifold • Matched filter classification = closest manifold search [figure: data point and articulation parameter space]

  24. Smashed Filter • Recall: stable manifold embedding whp using M random measurements • Enables parameter estimation and MF detection/classification directly on the randomized measurements • Recall that L is very small in many applications (# articulations)
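A hedged sketch of the smashed-filter search (a 1-D circular-shift articulation stands in for the talk's shift/rotation manifold; all sizes are illustrative): sweep the template over the articulation grid and match in the measurement domain.

```python
# Smashed-filter style parameter estimation: matched filtering over an
# articulation grid, carried out on compressive measurements.
import numpy as np

rng = np.random.default_rng(4)
N, M = 512, 40
Phi = rng.standard_normal((M, N)) / np.sqrt(M)

base = np.exp(-0.5 * ((np.arange(N) - N // 2) / 10.0) ** 2)  # template pulse

def articulate(shift):                 # articulation = circular shift (L = 1)
    return np.roll(base, shift)

true_shift = 137
y = Phi @ articulate(true_shift) + 0.05 * rng.standard_normal(M)

shifts = np.arange(N)
errs = [np.linalg.norm(y - Phi @ articulate(s)) for s in shifts]
print("estimated shift:", shifts[int(np.argmin(errs))])     # ~137
```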

  25. Example: Matched Filter • Naïve approach: take M CS measurements, recover the N-pixel image from them (expensive), then apply a conventional matched filter

  26. Smashed Filter • Worldly approach: take M CS measurements and run the matched filter directly on them (inexpensive)

  27. Smashed Filter • Random shift and rotation (L=3 dim. manifold) • WG noise added to the measurements • Goals: identify the most likely shift/rotation parameters and the most likely class [figure: classification rate (%) and avg. shift estimate error vs. number of measurements M, at increasing noise levels]

  28. Thrust 4: Scalable Data Fusion • Sparse signal models • multi-signal sparse models [Wakin, next talk] • Manifold models • joint manifold models [next] • Graphical models

  29. Manifold-based Fusion • Example: network of J cameras observing an articulating object • Each camera's images lie on an L-dim manifold in R^N • How can we efficiently fuse imagery from the J cameras to solve an inference problem while minimizing network communication?

  30. Multisensor Fusion • Fusion: stack the corresponding image vectors taken at the same time • The fused images still lie on an L-dim manifold, the “joint manifold” in R^{JN}

  31. Joint Manifolds • Given J submanifolds of R^N that are • L-dimensional • homeomorphic (we can continuously map between any pair) • define the joint manifold as their concatenation in R^{JN}
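A minimal sketch of that construction (sizes are illustrative): a point on the joint manifold is just the J component vectors stacked, which can then be measured as one long vector.

```python
# Form a point on the joint manifold by concatenating the J cameras'
# image vectors, then take randomized measurements of the stacked vector.
import numpy as np

rng = np.random.default_rng(5)
J, N, M = 3, 2025, 200
images = [rng.standard_normal(N) for _ in range(J)]   # same scene, J views

x_joint = np.concatenate(images)                      # point in R^{JN}
Phi = rng.standard_normal((M, J * N)) / np.sqrt(M)
y = Phi @ x_joint                                     # fused measurements
print(x_joint.shape, y.shape)                         # (6075,) (200,)
```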

  32. Joint Manifolds: Properties • The joint manifold inherits properties from its component manifolds: • compactness • smoothness • volume • condition number (curvature) • These translate into algorithm performance gains • Bounds are often loose in practice (good news)

  33. Multisensor Fusion via JM+RDR • Can take randomized measurements of the stacked images and process or make inferences on them [figure: performance with unfused RDR, and with unfused data and no RDR]

  34. Multisensor Fusion via JM+RDR • Can compute the randomized measurements in-place • ex: accumulate them as we transmit to the collection/processing point, as sketched below
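One hedged reading of "in-place" computation (the block partition of Φ and the hop-by-hop accumulation are illustrative assumptions, not the talk's specified protocol): split the measurement matrix into per-sensor blocks so each node only adds its own contribution to the packet as it passes.

```python
# Compute y = Phi [x_1; ...; x_J] in-place: each sensor j along the
# routing path adds Phi_j @ x_j to the accumulating packet.
import numpy as np

rng = np.random.default_rng(6)
J, N, M = 3, 2025, 200
xs = [rng.standard_normal(N) for _ in range(J)]
blocks = [rng.standard_normal((M, N)) / np.sqrt(M) for _ in range(J)]

packet = np.zeros(M)
for Phi_j, x_j in zip(blocks, xs):     # hop along the routing path
    packet += Phi_j @ x_j              # each sensor's local contribution

# Same result as measuring the stacked vector at a fusion center:
y_central = np.hstack(blocks) @ np.concatenate(xs)
print(np.allclose(packet, y_central))  # True
```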

  35. Simulation Results • J=3 CS cameras, each at N=320×240 resolution • M=200 random measurements per camera • Two classes: • truck w/ cargo • truck w/ no cargo • Goal: classify a test image [figure: sample images from class 1 and class 2]

  36. Simulation Results • J=3 CS cameras, each at N=320×240 resolution • M=200 random measurements per camera • Two classes: • truck w/ cargo • truck w/ no cargo • Smashed filtering: • independent • majority vote • joint manifold [figure: classification performance of the three schemes]

  37. “Real World” Experiment [figure: manifold learned from data vs. manifold learned from RDR]

  38. “Real World” Experiment [figure: joint manifold learned from data vs. joint manifold learned from RDR]

  39. Thrust 5: Scalable Learning • Sparse signal models • learning new sparse dictionaries • Manifold models • Manifold lifting [Wakin, next talk] • Manifold learning as high-dimensional function estimation [DeVore] • Graphical model learning

  40. Graphical Models

  41. Graphical Model Learning • Learn a Gaussian graphical model by learning its inverse covariance matrix [Wainwright] • Learn the best-fitting sparse model (in terms of number of edges) via L1 optimization, as sketched below • Provably consistent
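A hedged sketch of that L1 approach, using scikit-learn's graphical lasso as a stand-in for the estimator cited in the talk (the chain-graph ground truth and the penalty value are illustrative):

```python
# Sparse inverse-covariance (Gaussian graphical model) learning via
# L1-penalized maximum likelihood.
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(7)
p = 5
# Ground truth: a chain graph -> tridiagonal precision (inverse covariance)
prec = np.eye(p) + 0.4 * (np.eye(p, k=1) + np.eye(p, k=-1))
cov = np.linalg.inv(prec)
X = rng.multivariate_normal(np.zeros(p), cov, size=2000)

model = GraphicalLasso(alpha=0.05).fit(X)
edges = np.abs(model.precision_) > 1e-3   # nonzeros = edges of learned graph
print(edges.astype(int))                  # recovers the chain structure
```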

  42. Summary • Re-think data acquisition/processing pipeline • Exploit low-dimensional geometrical structure of • sparse signal models • graphical signal models • manifold signal models • Scalable algorithms via randomized dim. reduction • Progress to date: • multi-signal sparse models • smashed filter for inference • joint manifold model for fusion • manifold lifting • graphical model learning dsp.rice.edu

  43. Hierarchical Graphical Models

  44. Summary • Scalable distributed sensing requires a re-think of the entire sensing and data processing pipeline • New data representation: random encoding • preserves info in a wide range of data types • acts as a source/channel fountain code • supports efficient processing and inference algorithms • supports efficient fusion from multiple sensors • supports a range of actuation/navigation strategies • scalable in resolution N and number of sensors J • secure dsp.rice.edu/cs

  45. Manifold Learning • Given training points in R^N, learn the mapping to the underlying L-dimensional articulation manifold • ISOMAP, LLE, HLLE, … • Example: • images of a rotating teapot (L=1) • articulation space = circle

  46. Compressive Manifold Learning • The ISOMAP algorithm is based on geodesic distances between points • Random measurements preserve these distances • Theorem: if M is sufficiently large, the ISOMAP residual variance in the projected domain is bounded by an additive error factor [figure: translating disk manifold (L=2); embeddings from the full data (N=4096) and from M = 100, 50, 25 measurements]
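A hedged sketch of that claim (a synthetic 1-D pulse manifold stands in for the translating-disk example; all dimensions are illustrative): run ISOMAP on the raw data and on its random projections, then compare the recovered embeddings.

```python
# ISOMAP on randomized measurements instead of raw pixels; geodesic
# structure survives because pairwise distances are preserved.
import numpy as np
from sklearn.manifold import Isomap

rng = np.random.default_rng(8)
N, M, n_pts = 4096, 100, 300

# Toy articulation manifold: a 1-D family of shifted Gaussian pulses
t = np.arange(N)
shifts = np.linspace(500, 3500, n_pts)
X = np.exp(-0.5 * ((t[None, :] - shifts[:, None]) / 50.0) ** 2)

Phi = rng.standard_normal((M, N)) / np.sqrt(M)
Y = X @ Phi.T                                   # random projections

emb_full = Isomap(n_components=1).fit_transform(X)
emb_proj = Isomap(n_components=1).fit_transform(Y)
r = np.corrcoef(emb_full.ravel(), emb_proj.ravel())[0, 1]
print(f"correlation of embeddings: {abs(r):.3f}")   # close to 1
```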

  47. Manifold Learning via Joint Manifolds • Goal: learn the embedding of a 2D translating ellipse (with noise) • N = 45×45 = 2025 pixels, J = 20 views at different angles

  48. Manifold Learning via Joint Manifolds • Goal: learn the embedding of a 2D translating ellipse (with noise) • N = 45×45 = 2025 pixels, J = 20 views • [figure: embeddings learned separately for each view]
