
Curve Sampling and Conditional Simulation

Ayres Fan

John W. Fisher III

Alan S. Willsky

MIT Stochastic Systems Group

Outline
  • Overview
  • Curve evolution
  • Markov chain Monte Carlo
  • Curve sampling
  • Conditional simulation
Overview
  • Curve evolution attempts to find a curve C (or curves Ci) that best segments an image (according to some model)
  • The goal is to minimize an energy functional E(C) (viewed as a negative log-likelihood)
  • A local minimum is found using gradient descent
Sampling instead of optimization
  • Draw multiple samples from a probability distribution p (e.g., uniform, Gaussian)
  • Advantages:
    • Naturally handles multi-modal distributions
    • Avoids local minima
    • Provides higher-order statistics (e.g., variances)
    • Enables conditional simulation
Outline
  • Overview
  • Curve evolution
  • Markov chain Monte Carlo
  • Curve sampling
  • Conditional simulation
Planar curves
  • A curve is a function C: [0,1] → ℝ²
  • We wish to minimize an energy functional with a data fidelity term and a regularization term: E(C) = E_data(C) + α ∮_C ds
  • This results in a gradient flow ∂C/∂t = −∇E(C)
  • We can write any flow in terms of the normal: ∂C/∂t = f N
Euclidean curve shortening flow
  • Let’s examine E(C) = ∮_C ds (the curve length)
  • E is minimized when C is short
  • The gradient flow should move the curve in the direction that decreases its length the fastest
  • Applying Euler-Lagrange, we see ∂C/∂t = κN (curvature flow)
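As a concrete illustration (not from the slides), the curvature flow can be sketched on a discretized closed curve, where the discrete Laplacian of the vertex positions approximates κN:

```python
import numpy as np

def curvature_flow_step(pts, dt=0.1):
    """One explicit step of discrete curve-shortening flow.

    pts: (N, 2) vertices of a closed polygon. The discrete Laplacian
    (neighbor midpoint minus the vertex) approximates kappa * N, the
    curvature times the normal.
    """
    laplacian = 0.5 * (np.roll(pts, 1, axis=0) + np.roll(pts, -1, axis=0)) - pts
    return pts + dt * laplacian

def perimeter(pts):
    """Total length of the closed polygon."""
    return np.linalg.norm(np.roll(pts, -1, axis=0) - pts, axis=1).sum()

# A noisy circle: the flow smooths it while shrinking its length.
theta = np.linspace(0, 2 * np.pi, 100, endpoint=False)
rng = np.random.default_rng(0)
r = 1.0 + 0.1 * rng.standard_normal(theta.size)
curve = np.stack([r * np.cos(theta), r * np.sin(theta)], axis=1)

p0 = perimeter(curve)
for _ in range(50):
    curve = curvature_flow_step(curve)
```

Each step decreases the perimeter, matching the interpretation of κN as the steepest-descent direction for curve length.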
Level-Set Methods
  • A curve is a function C: [0,1] → ℝ² (infinite dimensional)
  • A natural implementation approach is to use marker points on the boundary (snakes)
    • Reinitialization issues
    • Difficulty handling topological change
  • Level-set methods instead evolve a surface (one dimension higher than the curve) whose zero level set is the curve (Osher and Sethian)
Embedding the curve
  • Force the level set function ψ to be zero on the curve: ψ(C(s, t), t) = 0
  • The chain rule gives us ψ_t + ∇ψ · C_t = 0, so for C_t = f N (with N = ∇ψ/|∇ψ|) we get ψ_t = −f |∇ψ|
References
  • Snakes: Active contour models, Kass, Witkin, Terzopoulos 1987
  • Geodesic active contours, Caselles, Kimmel, Sapiro 1995
  • Region competition, Zhu and Yuille, 1996
  • Mumford-Shah approach, Tsai, Yezzi, Willsky 2001
  • Active contours without edges, Chan and Vese, 2001
Outline
  • Overview
  • Curve evolution
  • Markov chain Monte Carlo
  • Curve sampling
  • Conditional simulation
General MAP Model
  • For segmentation:
    • x is a curve
    • y is the observed image (can be vector)
    • S is a shape model
    • Data model is usually i.i.d. given the curve
  • We wish to sample from p(x|y;S), but cannot do so directly
Markov Chain Monte Carlo
  • Class of sampling methods that iteratively generate candidates based on previous iterate (forming a Markov chain)
  • Examples include Gibbs sampling, Metropolis-Hastings
  • Instead of sampling from p(x|y;S), sample from a proposal distribution q and keep samples according to an acceptance rule a
Metropolis algorithm
  • Metropolis algorithm:
    1. Start with x0
    2. At time t, generate a candidate x̃t from q(· | xt−1)
    3. Set xt = x̃t with probability min(1, p(x̃t)/p(xt−1)); otherwise set xt = xt−1
    4. Go back to step 2
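A minimal sketch of the Metropolis algorithm on a 1-D bimodal target, illustrating why sampling (unlike gradient descent) visits both modes; the target density and step size are illustrative choices, not from the slides:

```python
import numpy as np

def metropolis(log_p, x0, n_steps, step=1.0, seed=0):
    """Metropolis sampler with a symmetric Gaussian random-walk proposal.

    log_p: unnormalized log target density. Because the proposal is
    symmetric, acceptance reduces to min(1, p(cand) / p(x_prev)).
    """
    rng = np.random.default_rng(seed)
    x = x0
    samples = np.empty(n_steps)
    for t in range(n_steps):
        cand = x + step * rng.standard_normal()
        if np.log(rng.uniform()) < log_p(cand) - log_p(x):
            x = cand          # accept candidate
        samples[t] = x        # on rejection, keep the previous iterate
    return samples

# Bimodal target: equal mixture of unit Gaussians at -2 and +2.
log_p = lambda x: np.logaddexp(-0.5 * (x - 2) ** 2, -0.5 * (x + 2) ** 2)
s = metropolis(log_p, x0=0.0, n_steps=20000)
```

Over a long run the chain hops between the two modes, so both are represented in the samples.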
Asymptotic Convergence
  • Two main requirements to ensure iterates converge to samples:
    • q spans support of p
      • This means that any element with non-zero probability must be able to be generated by a sequence of samples from q
    • Detailed balance is achieved
Metropolis-Hastings
  • The Metropolis update rule only results in detailed balance when q is symmetric: q(x′ | x) = q(x | x′)
  • When this is not true, we need to evaluate the Hastings ratio:

    r = [p(x̃t) q(xt−1 | x̃t)] / [p(xt−1) q(x̃t | xt−1)]

and keep a candidate with probability min(1, r)

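A sketch of the Hastings correction with a deliberately asymmetric proposal: a multiplicative log-normal step on a positive-valued target. The Exp(1) target is an illustrative choice, not from the slides; for this proposal the correction q(x | x̃)/q(x̃ | x) works out to x̃/x.

```python
import numpy as np

def mh_positive(log_p, x0, n_steps, sigma=0.5, seed=0):
    """Metropolis-Hastings with an asymmetric multiplicative proposal.

    Candidate: x' = x * exp(sigma * xi), xi ~ N(0, 1), i.e. a log-normal
    centered (in log space) at x. Its density carries a 1/x' factor, so
    the Hastings ratio picks up the correction q(x | x')/q(x' | x) = x'/x.
    """
    rng = np.random.default_rng(seed)
    x = x0
    samples = np.empty(n_steps)
    for t in range(n_steps):
        cand = x * np.exp(sigma * rng.standard_normal())
        log_r = log_p(cand) - log_p(x) + np.log(cand) - np.log(x)
        if np.log(rng.uniform()) < log_r:   # keep with probability min(1, r)
            x = cand
        samples[t] = x
    return samples

# Target: Exp(1) on x > 0, with unnormalized log density log p(x) = -x.
s = mh_positive(lambda x: -x, x0=1.0, n_steps=50000)
```

Dropping the x̃/x correction (i.e., using plain Metropolis with this proposal) would bias the chain, since the proposal is not symmetric.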
Outline
  • Overview
  • Curve evolution
  • Markov chain Monte Carlo
  • Curve sampling
  • Conditional simulation
Basic model
  • Modified Chan-Vese energy functional
  • The data model can be viewed as i.i.d. inside and outside the curve, with a different Gaussian distribution in each region
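A sketch of a Chan-Vese-style energy for a region given by a binary mask; the `alpha` weight and the boundary-length estimate are illustrative simplifications, not the exact functional from the slides:

```python
import numpy as np

def chan_vese_energy(y, mask, alpha=1.0):
    """Chan-Vese-style energy for a binary region mask.

    Data term: squared deviation from the region means inside and outside
    (the i.i.d.-Gaussian view of the data model). Regularization: alpha
    times a boundary-length estimate (count of sign changes between
    4-neighbors of the mask).
    """
    inside, outside = y[mask], y[~mask]
    data = ((inside - inside.mean()) ** 2).sum() + ((outside - outside.mean()) ** 2).sum()
    m = mask.astype(float)
    length = np.abs(np.diff(m, axis=0)).sum() + np.abs(np.diff(m, axis=1)).sum()
    return data + alpha * length

# Synthetic image: bright square on a dark background plus noise.
rng = np.random.default_rng(0)
y = rng.standard_normal((64, 64)) * 0.1
y[16:48, 16:48] += 1.0

true_mask = np.zeros((64, 64), dtype=bool)
true_mask[16:48, 16:48] = True
wrong_mask = np.zeros((64, 64), dtype=bool)
wrong_mask[0:32, 0:32] = True
```

The mask matching the bright square separates the two Gaussian populations cleanly, so it scores a lower energy than a misplaced mask.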
MCMC Curve Sampling
  • Generate a perturbation on the curve: C′(s) = C(s) + f(s) N(s)
  • Sample f by adding smooth random fields: f = h ∗ n, where n is white noise and h is a smoothing kernel
  • The kernel h controls the degree of smoothness in the field
  • Note that for portions where f is negative, shocks may develop (the so-called “prairie fire” model)
  • Implement using white noise and circular convolution
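The white-noise-plus-circular-convolution construction can be sketched as follows; the periodic Gaussian kernel and its bandwidth are illustrative choices:

```python
import numpy as np

def smooth_field(n_pts, bandwidth, rng):
    """Smooth random field on a closed curve: white noise circularly
    convolved (via the FFT) with a periodic Gaussian kernel."""
    noise = rng.standard_normal(n_pts)
    s = np.arange(n_pts)
    d = np.minimum(s, n_pts - s)                 # circular distance
    h = np.exp(-0.5 * (d / bandwidth) ** 2)
    h /= h.sum()
    # Circular convolution f = h * n computed in the Fourier domain.
    return np.real(np.fft.ifft(np.fft.fft(h) * np.fft.fft(noise)))

rng = np.random.default_rng(0)
f = smooth_field(256, bandwidth=10.0, rng=rng)

# Perturb a discretized circle along its outward normal: C' = C + f N.
theta = np.linspace(0, 2 * np.pi, 256, endpoint=False)
C = np.stack([np.cos(theta), np.sin(theta)], axis=1)
N = C.copy()                   # for the unit circle, N(s) = C(s)
C_prime = C + f[:, None] * N
```

The circular convolution keeps the field periodic, so the perturbed curve stays closed; a wider kernel gives smoother perturbations.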
Coverage/Detailed balance
  • It is easy to show that we can go from any curve C1 to any other curve C2 (e.g., by shrinking to a point)
  • For detailed balance, we need to compute the probability of generating C′ from C (and vice versa)
  • If we assume that the normals are approximately the same, then f = f′ and q is symmetric
Smoothness issues
  • While detailed balance ensures asymptotic convergence, convergence may take a very long time
  • Here, smooth curves have non-zero probability under q but are very unlikely to occur
  • The accumulation of perturbations can be viewed as Σi h ∗ ni (h is the smoothing kernel, ni is white noise)
  • Solution: make q more likely to move towards high-probability regions of p
Adding mean force
  • We can add deterministic elements to f (i.e., a mean to q)
  • The average behavior should then be to move towards higher-probability areas of p
  • In the limit, setting f to be the gradient flow of the energy functional results in always accepting the perturbation
  • We keep the random field but give it a non-zero mean: f = μ + h ∗ n
Detailed balance redux
  • Before we just assumed that q was symmetric and used Metropolis updates
  • Now we need to evaluate q and use Metropolis-Hastings updates
  • Probability of going from C to C´ is the probability of generating f (which is Gaussian) and the reverse is the probability of f ´ (also Gaussian)
Detailed balance continued
  • The relationship between f and f′ is complicated because the normal function changes
  • Various levels of exactness:
    • Assume the normals are approximately unchanged
    • Infinitesimal approximation (ignore tangential components)
    • Trace along the curve (technical issues)
Further research
  • We would like q to naturally generate smooth curves
    • Different structure for the perturbation
  • Speed is always an issue for MCMC approaches
    • Multiresolution perturbations
    • Parameterized perturbations (efficient basis)
    • Hierarchical models
  • How to properly use samples?
Outline
  • Overview
  • Curve evolution
  • Markov chain Monte Carlo
  • Curve sampling
  • Conditional simulation
User Information
  • In many problems, the model admits many reasonable solutions
  • Currently, user input is largely limited to initialization
  • We can use user information to reduce the number of reasonable solutions
    • Regions of inclusion or exclusion
    • Partial segmentations
  • Can help with both convergence speed and accuracy
  • Interactive segmentation
Conditional simulation
  • With conditional simulation, we are given the values on a subset of the variables
  • We then wish to generate sample paths that fill in the remainder of the variables (e.g., simulating Brownian motion)
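The Brownian-motion example can be sketched as a Brownian bridge: sample an unconditioned path and add the linear correction that pins both endpoints (valid because the process is Gaussian):

```python
import numpy as np

def brownian_bridge(t, a, b, rng):
    """Sample Brownian motion on the time grid t conditioned on the
    endpoint values W(t[0]) = a and W(t[-1]) = b."""
    dt = np.diff(t)
    # Unconditioned path: cumulative sum of independent Gaussian increments.
    w = np.concatenate([[0.0], np.cumsum(np.sqrt(dt) * rng.standard_normal(dt.size))])
    # Linear correction pinning both endpoints to the observed values.
    s = (t - t[0]) / (t[-1] - t[0])
    return w + (a - w[0]) * (1 - s) + (b - w[-1]) * s

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 101)
path = brownian_bridge(t, a=0.0, b=2.0, rng=rng)
```

Every sample path interpolates the fixed endpoint values while remaining random in between, which is exactly the conditional-simulation setting.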
Simulating curves
  • Say we are given Cs, a subset of C (with some uncertainty associated with it)
  • We wish to sample Cu, the unknown part of the curve
  • One way to view this is as sampling from (schematically) p(Cu | Cs, y) ∝ p(y | C) p(Cs | Cu) p(Cu)
  • The difficulty is evaluating the middle term, as this theoretically requires integrating over p(C)
Simplifying Cases
  • Under special cases, evaluation of this conditional term is tractable:
    • When C is low-dimensional (analytical or Monte Carlo integration is possible)
    • When Cs is assumed to be exact
    • When p(C) has a special form (e.g., independence)
    • When we are willing to approximate
Basic model in 3D
  • Energy functional with a data term and surface-area regularization
  • With a slice-based model, the regularization term can be written slice-by-slice
Zero-order hold approximation
  • Approximate the volume as piecewise-constant “cylinders”
  • The resulting surface area contains terms related to the curve length within each slice and to the difference between neighboring slices
  • This is an upper bound on the correct surface area
Overall regularization term
  • Adding everything together (self potentials and edge potentials) yields the overall regularization term
2.5D Approach
  • In the 3D world, there is a natural (or built-in) partition of volumes into slices
  • Assume a Markov relationship among the slices
  • We then have local potentials (e.g., PCA) and edge potentials (coupling between slices)
  • This naturally lends itself to a local Metropolis-Hastings approach (iterating over the slices)
2.5D Model
  • We can model this as a simple chain structure with pairwise interactions
  • This admits the factorization p(c1, …, cn) ∝ Πi φi(ci) · Πi ψi(ci, ci+1)
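A small numerical check of the chain factorization, with hypothetical random potentials: the conditional for one slice given only its neighbors (local potentials) matches the conditional computed from the full joint, which is the Markov property the local sampler relies on.

```python
import numpy as np

rng = np.random.default_rng(0)
K, n = 3, 4                                   # K states per slice, n slices
phi = rng.uniform(0.5, 1.5, (n, K))           # self potentials phi_i(c_i)
psi = rng.uniform(0.5, 1.5, (n - 1, K, K))    # edge potentials psi_i(c_i, c_{i+1})

def joint(c):
    """Unnormalized chain probability: product of self and edge potentials."""
    p = np.prod([phi[i, c[i]] for i in range(n)])
    p *= np.prod([psi[i, c[i], c[i + 1]] for i in range(n - 1)])
    return p

# Conditional for slice 1 built from its local potentials only.
c = [0, 0, 2, 1]
local = np.array([phi[1, k] * psi[0, c[0], k] * psi[1, k, c[2]] for k in range(K)])
local /= local.sum()

# Brute-force conditional from the full joint: must agree (Markov chain).
full = np.array([joint([c[0], k, c[2], c[3]]) for k in range(K)])
full /= full.sum()
```

This locality is what makes slice-by-slice Metropolis-Hastings updates cheap: each update only touches a slice's own potentials and its two edges.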
Partial segmentations
  • Assume that we are given segmentations of every other slice
  • We now want to sample surfaces conditioned on these fixed slices
  • Markovianity tells us that c2 and c4 are independent conditioned on c3
Log probability for c2
  • We can then construct the probability for c2 conditioned on its neighbors using the potential functions defined previously
Results
  • Neighbor segmentations
  • Our result (cyan) with expert segmentation (green)

Larger spacing
  • Apply a local Metropolis-Hastings algorithm, sampling on a slice-by-slice basis
  • Theory shows that asymptotic convergence is unchanged
  • Unfortunately, larger spacing acts like weaker regularization
  • There are currently issues with poor data models that need to be resolved
Full 3D
  • In medical imaging, we have multiple slice orientations (axial, sagittal, coronal)
  • In seismic imaging, both vertical and horizontal shape structure are expected
  • With a 2.5D approach, this introduces complexity into the graph structure
Incorporating perpendicular slices
  • The perpendicular slice c is now coupled to all of the horizontal slices
  • c only gives information about a subset of each slice
  • Edge potentials are specified accordingly (e.g., coupling c to the overlapping portion of each horizontal slice)
Other extensions
  • Additional features can be added to the base model:
    • Uncertainty on the expert segmentations
    • Shape models (semi-local)
    • Exclusion/inclusion regions
    • Topological change (through level sets)
Applications
  • Currently looking for applications to demonstrate utility of curve sampling
    • Multi-modal distributions
    • Morphology changes
    • Confidence levels
Interface
  • How do we present the information to the user?
    • Pixel or curve statistics?
    • Aggregate statistics or present samples?
  • How do we receive information from the user?
    • Must be readily incorporated into the simulation model
    • Easy to specify