
Design in Imaging: From Compressive to Comprehensive Sensing

Design in Imaging: From Compressive to Comprehensive Sensing. Lior Horesh, joint work with E. Haber, L. Tenorio, A. Conn, U. Mello, and J. Fohring. IBM TJ Watson Research Center. SIAM Imaging Science 2012, Philadelphia, PA, May 2012.


Presentation Transcript


  1. Design in Imaging: From Compressive to Comprehensive Sensing  Lior Horesh, joint work with E. Haber, L. Tenorio, A. Conn, U. Mello, J. Fohring. IBM TJ Watson Research Center. SIAM Imaging Science 2012, Philadelphia, PA, May 2012

  2. Introduction

  3. Exposition: ill-posed inverse problems • Aim: infer the model • Given: experimental design y, measurements, observation model • Ill-posedness: naïve inversion fails... • Need to fill in the missing information
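A toy numerical sketch (not from the talk; the smoothing operator F, the model, and the noise level are all hypothetical) shows why naïve inversion fails: the forward matrix is severely ill-conditioned, so solving F m = d directly amplifies even tiny noise far beyond the data error.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D smoothing (blurring) observation operator: each datum is a
# local average of the model, which makes the inverse problem ill-posed.
n = 64
x = np.linspace(0.0, 1.0, n)
F = np.exp(-((x[:, None] - x[None, :]) ** 2) / (2 * 0.03 ** 2))
F /= F.sum(axis=1, keepdims=True)

m_true = np.sin(2 * np.pi * x)                   # model we wish to recover
d = F @ m_true + 1e-3 * rng.standard_normal(n)   # data with tiny noise

m_naive = np.linalg.solve(F, d)                  # naïve inversion
print(np.linalg.cond(F))                         # huge condition number
print(np.linalg.norm(m_naive - m_true))          # error far exceeds the noise level
```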

  4. How to improve model recovery? • How can we... • Improve the observation model? • Extract more information in the measurement procedure? • Define appropriate distance measures / noise models? • Incorporate more meaningful a priori information? • Provide more efficient optimization schemes?

  5. Evolution of inversion: from simulation to design • Forward problem (simulation, description) • Given: model m & observation model • Simulate: data d • Inverse problem (estimation, prediction) • Given: data d & observation model • Infer: model m (and uncertainties) • Design (prescription) • Given: inversion scheme & observation model • Find: the ‘best’ experimental settings y, regularization S, …

  6. Why design?

  7. Which experimental design is best? (Figure: Design 1, Design 2, Design 3)

  8. Design questions – Diffuse Optical Tomography • Where should light sources and optodes be placed? • In what sequence should they be activated? • What laser source intensities should be used? [Arridge 1999]

  9. Design questions – Electromagnetic inversion • What frequencies should be used? • What acquisition trajectory should be considered? [Newman 1996, Haber & Ascher 2000]

  10. Design questions – Seismic inversion • How can simultaneous sources be used? [Claerbout 2000]

  11. Design questions – Limited Angle Tomography • How many projections should we collect, and at what angles? • How accurate should the projection data be?

  12. What can we design?

  13. Design experimental layout Stonehenge 2500 B.C.

  14. Design experimental process Galileo Galilei 1564-1642

  15. Respect experimental constraints… French nuclear test, Mururoa, 1970

  16. How to design?

  17. Ill- vs. well-posed optimal experimental design • Previous work • Well-posed problems - well established [Fedorov 1997, Pukelsheim 2006] • Ill-posed problems - under-researched [Curtis 1999, Bardow 2008] • Many practical problems in engineering and the sciences are ill-posed • What makes ill-posed problems so special?

  18. Optimality criteria in well-posed problems • For linear inversion, employ the Tikhonov regularized least squares solution • Bias-variance decomposition • For over-determined problems • A-optimal design problem
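A schematic NumPy sketch of the two ingredients on this slide, assuming the simplest choices (identity regularization operator, i.i.d. noise); `tikhonov_solve` and `a_criterion` are illustrative names, not the talk's code:

```python
import numpy as np

def tikhonov_solve(F, d, alpha):
    """Tikhonov regularized least squares: argmin_m ||F m - d||^2 + alpha ||m||^2."""
    n = F.shape[1]
    return np.linalg.solve(F.T @ F + alpha * np.eye(n), F.T @ d)

def a_criterion(F, sigma=1.0):
    """A-optimality for an over-determined design: the average variance,
    sigma^2 * trace((F^T F)^{-1}); smaller is better."""
    return sigma ** 2 * np.trace(np.linalg.inv(F.T @ F))

# Halving every observation's sensitivity quadruples the average variance.
rng = np.random.default_rng(0)
F = rng.standard_normal((20, 5))
print(a_criterion(0.5 * F) / a_criterion(F))   # -> 4.0 (up to round-off)
```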

  19. Optimality criteria in well-posed problems • Optimality criteria of the information matrix • A-optimality → average variance • D-optimality → uncertainty ellipsoid • E-optimality → minimax • Almost a complete alphabet… [Stone, DeGroot and Bernardo]

  20. The problems... • Ill-posedness → controlling variance alone reduces the error only mildly [Johansen 1996] • Non-linearity → bias-variance decomposition is impossible • What strategy can be used? • Proposition 1 (common practice so far): trial and error…

  21. Experimental design by trial and error • Pick a model • Run the observation model for different experimental designs and get data • Invert and compare the recovered models • Choose the experimental design that provides the best model recovery
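The four steps can be sketched as code; `recover` is a hypothetical Tikhonov inverter, and `forward` a hypothetical map from design settings y to an observation matrix:

```python
import numpy as np

def recover(F, d, alpha=1e-2):
    """Hypothetical Tikhonov inverter used inside the loop."""
    n = F.shape[1]
    return np.linalg.solve(F.T @ F + alpha * np.eye(n), F.T @ d)

def trial_and_error(designs, forward, m_pick, noise=1e-2, seed=0):
    """The four slide steps: simulate data for each candidate design,
    invert, and keep the design whose reconstruction is closest to m_pick."""
    rng = np.random.default_rng(seed)
    errors = []
    for y in designs:
        F = forward(y)                                        # observation model for y
        d = F @ m_pick + noise * rng.standard_normal(F.shape[0])
        errors.append(np.linalg.norm(recover(F, d) - m_pick))
    return designs[int(np.argmin(errors))], errors
```

For example, with `forward = lambda s: s * np.eye(4)` (a design that only scales the observations), the loop picks the stronger design, since weak observations are dominated by the regularization.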

  22. The problems... • Ill-posedness → controlling variance alone reduces the error only mildly [Johansen 1996] • Non-linearity → bias-variance decomposition is impossible • What other strategy can be used? • Proposition 2: minimize bias and variance together • How to define the optimality criterion?

  23. Optimality criteria for design • Loss • Mean square error • Bayes risk • Depends on the noise • Depends on the unknown model • Depends on the unknown model • Computationally infeasible

  24. Optimality criteria for design • Bayes empirical risk • Assume a set of authentic model examples is available • Discrepancy between training and recovered models [Vapnik 1998] • How can y and S be regularized? • Regularized Bayesian empirical risk
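A minimal Monte Carlo sketch of the empirical-risk idea for a linear observation model; the scalar `design_penalty` is a hypothetical stand-in for the regularization of y and S discussed on the slide:

```python
import numpy as np

def empirical_risk(F, models, alpha=1e-2, noise=1e-2,
                   beta=0.0, design_penalty=0.0, seed=0):
    """Monte Carlo estimate of a regularized Bayesian empirical risk:
    average squared discrepancy between training models and their Tikhonov
    reconstructions, plus beta times a design-regularization term."""
    rng = np.random.default_rng(seed)
    n = F.shape[1]
    # Linear reconstruction operator for this (linear) observation model.
    A = np.linalg.inv(F.T @ F + alpha * np.eye(n)) @ F.T
    risk = 0.0
    for m in models:
        d = F @ m + noise * rng.standard_normal(F.shape[0])
        risk += np.linalg.norm(A @ d - m) ** 2
    return risk / len(models) + beta * design_penalty
```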

  25. Other designers / key players • This is one doctrine • Other interesting choices were developed by • Y. Marzouk et al., MIT (2011, 2012) • A. Curtis et al., University of Edinburgh (1999, 2010) • D. Coles & M. Prange, Schlumberger (2008, 2012) • S. Körkel et al., Heidelberg University (2011) • A. Bardow, RWTH Aachen University (2008, 2009)

  26. Differentiable Observation Sparsity Controlled Design • Assume: a fixed number of observations • Design preference: a small number of sources / receivers is activated • The observation model • Regularized risk [Horesh, Haber & Tenorio 2011]

  27. Differentiable Observation Sparsity Controlled Design • Total number of observations may be large • Derivatives of the forward operator w.r.t. y are difficult… • Effective when activation of each source and receiver is expensive [Horesh, Haber & Tenorio 2011]

  28. Weights Formulation Sparsity Controlled Design • Assume: a predefined set of candidate experimental setups is given • Design preference: a small number of observations [Haber, Horesh & Tenorio 2010]

  29. Weights Formulation Sparsity Controlled Design • Let a discretization of the design space be given [Pukelsheim 1994] • The (candidate) observation operator is weighted • w is the inverse of the standard deviation • w > 0 - reasonable standard deviation - conduct the experiment • w = 0 - infinite standard deviation - do not conduct the experiment [Haber, Horesh & Tenorio 2010]

  30. Weights Formulation Sparsity Controlled Design • Solution: more observations give better recovery • Desired solution: many w's are 0 • Add a penalty to promote sparsity in observation selection • Fewer degrees of freedom • No explicit access to the observation operator needed
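A sketch of the weights formulation for a linear observation model, assuming identity-type Tikhonov regularization; the L1 term is the sparsity-promoting penalty from the slide, and all function names are illustrative:

```python
import numpy as np

def weighted_recovery(F, d, w, alpha=1e-2):
    """Tikhonov recovery under the weighted observation operator diag(w) F;
    w_k = 0 removes observation k entirely."""
    n = F.shape[1]
    W2 = np.diag(w ** 2)
    return np.linalg.solve(F.T @ W2 @ F + alpha * np.eye(n), F.T @ W2 @ d)

def design_objective(w, F, models, beta, alpha=1e-2, noise=1e-2, seed=0):
    """Empirical risk over training models plus an L1 penalty that drives
    many weights to zero (sparse observation selection)."""
    rng = np.random.default_rng(seed)
    risk = 0.0
    for m in models:
        d = F @ m + noise * rng.standard_normal(F.shape[0])
        risk += np.linalg.norm(weighted_recovery(F, d, w, alpha) - m) ** 2
    return risk / len(models) + beta * np.sum(np.abs(w))
```

With redundant candidate observations (e.g. duplicated rows of F), dropping the duplicates barely changes the risk but halves the penalty, so the sparse weight vector wins.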

  31. The optimization problems • Leads to a stochastic bi-level optimization problem • Direct formulation • Weights formulation [Haber, Horesh & Tenorio 2010, 2011]

  32. The optimization problem(s) • In general, a (multi-level) optimization problem • Difficult to solve (although not impossible) [Alexandrov & Dennis 1994] • To simplify, make assumptions on • F - linear, nonlinear • S - quadratic, matrix form • y - discrete, continuous • Important - sufficient improvement can be “good enough”

  33. How many samples are needed? • The optimist - choose a single m and design for it • The pessimist - assume that m is the worst it can be • The realist - choose an ensemble of m's and average • A Bayesian estimate of the frequentist risk [Rubin 1984]

  34. Optimal Design in Practice

  35. Linear Design

  36. Linear Design • Assume • Then

  37. Linear Design • Assume m has a second moment • Then • Best linear design

  38. Case study I: Linear experimental design

  39. The best (linear) experiment • Considering a discretization of the design space • We obtain

  40. Borehole Ray Tomography – Observation model • Use 32² × 3 = 3072 rays • Goal: choose 500 optimal observations
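For a selection task of this kind (500 of 3072 candidate rays), one illustrative heuristic (not necessarily the method used in the talk, which optimizes the weights formulation) is greedy selection under an A-optimality surrogate, made cheap by a rank-one (Sherman-Morrison) update identity:

```python
import numpy as np

def greedy_a_optimal(F, k, alpha=1e-2):
    """Pick k of F's candidate rows by greedily minimizing the surrogate
    trace((F_S^T F_S + alpha I)^{-1}): at each step, add the candidate row
    whose rank-one update reduces the trace the most."""
    n = F.shape[1]
    M = alpha * np.eye(n)        # regularized information matrix of selected rows
    selected = []
    for _ in range(k):
        Minv = np.linalg.inv(M)
        gains = np.full(F.shape[0], -np.inf)
        for i in range(F.shape[0]):
            if i in selected:
                continue
            v = Minv @ F[i]
            # Sherman-Morrison: trace decrease from adding row i is
            # ||M^{-1} f||^2 / (1 + f^T M^{-1} f).
            gains[i] = (v @ v) / (1.0 + F[i] @ v)
        best = int(np.argmax(gains))
        selected.append(best)
        M += np.outer(F[best], F[best])
    return selected
```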

  41. Borehole Ray Tomography – Experimental design assessment • Assess design performance on unseen models (cross-validation) • (Figure: test model, non-optimal design, optimal design)

  42. Non-linear Design

  43. Non-linear design • Use stochastic optimization - approximate by sampling [Shapiro, Dentcheva & Ruszczynski 2009]

  44. Non-linear design • Solution through sensitivity calculation • The sensitivities

  45. Non-linear design • The reduced gradient • Can use steepest descent / L-BFGS / truncated Gauss-Newton to solve the problem
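An illustrative sample-average-approximation loop tying the last three slides together; the talk's reduced gradient comes from analytic sensitivities, and the central finite difference here is only a stand-in for it:

```python
import numpy as np

def saa_design(forward, models, y0, steps=25, lr=0.1,
               alpha=1e-2, noise=1e-2, seed=0):
    """Sample average approximation of the design risk, minimized by steepest
    descent on the design parameters y. `forward(y)` is a hypothetical map
    from design settings to an observation matrix."""
    rng = np.random.default_rng(seed)
    F0 = forward(np.asarray(y0, dtype=float))
    # Freeze one noise realization per training model so the sampled risk is
    # a deterministic function of the design y.
    eps = [noise * rng.standard_normal(F0.shape[0]) for _ in models]

    def risk(y):
        F = forward(y)
        n = F.shape[1]
        A = np.linalg.inv(F.T @ F + alpha * np.eye(n)) @ F.T
        return sum(np.linalg.norm(A @ (F @ m + e) - m) ** 2
                   for m, e in zip(models, eps)) / len(models)

    y, h, history = np.asarray(y0, dtype=float), 1e-4, []
    for _ in range(steps):
        history.append(risk(y))
        # Central finite difference standing in for the reduced gradient.
        g = np.array([(risk(y + h * e) - risk(y - h * e)) / (2 * h)
                      for e in np.eye(len(y))])
        y = y - lr * g
    return y, history
```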

  46. Case study II: Non-linear experimental design

  47. Impedance Tomography – Observation model • Governing equations • Following finite element discretization • Given: model and design settings • Find: data, n < k • Design: find the optimal source-receiver configuration

  48. Impedance Tomography – Designs comparison • (Figure: naive design, true model, optimized design) [Horesh, Haber & Tenorio 2011]

  49. Magnetotellurics Tomography – Observation model • Governing equations • Following finite volume discretization • Given: model and design settings (frequency) • Find: data • Design: find an optimal set of frequencies

  50. Magnetotellurics Tomography – Designs comparison • (Figure: test model, naive design, optimal linearized design, optimized non-linear design) [Haber, Horesh & Tenorio 2008, 2010]
