Statistics and shape analysis

Presentation Transcript


Statistics and Shape Analysis

By Marc Sobel


Shape similarity

  • Humans recognize shapes via both local and global features.

  • (i) Matching local features between shapes (such as curvature or distance to the centroid) can be modeled statistically by building statistics and parameters that reflect the quality of the match.

  • (ii) Matching the relationship between global features of the shapes (e.g., are they both apples or not?).


Incorporating both local and global features in shape matching

  • How can we incorporate both local and global features in shape matching?

  • An obvious paradigm is to model global features as governed by priors, and local features given global features as a likelihood.
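This hierarchical decomposition can be sketched in code. A minimal sketch, with illustrative function names not taken from the slides: the posterior score of a candidate correspondence set factors into a global-feature prior and a local-feature likelihood.

```python
def log_posterior(correspondences, log_prior_global, log_lik_local):
    """Posterior score for a candidate set of correspondences:
    global features enter through the prior, and local features
    (given the global ones) enter through the likelihood."""
    return log_prior_global(correspondences) + log_lik_local(correspondences)
```

Any concrete prior (e.g., a contiguity prior) and likelihood (e.g., a feature-matching kernel) can be plugged in for the two callables.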


Definitions and Notation

  • Let u1,…,un be the vertices of one shape and v1,…,vm the vertices of another. We would like to build correspondences between the vertices that properly reflect the relationship between the shapes. We use the notation (ui,vj) for a correspondence of this type, and the term “particle” for a set of such correspondences.

  • Let Xi,l be the l’th local feature measure for vertex i of the first shape and Yj,l the l’th local feature measure for vertex j of the second shape. For now assume these feature measures are observed.

  • We would like to build a particle that reflects the local and global features of interest.
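The objects just defined can be represented directly. A minimal sketch with synthetic data (the sizes and random features are placeholders, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, L = 5, 6, 2                    # vertex counts and number of local features

X = rng.normal(size=(n, L))          # X[i, l]: l-th local feature of vertex u_i
Y = rng.normal(size=(m, L))          # Y[j, l]: l-th local feature of vertex v_j

# A correspondence is a pair (i, j); a particle is a set of correspondences.
particle = {(0, 1), (2, 3), (4, 5)}
```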


Contiguity: An important global feature.

  • If one shape results from the other via rotation and scaling, then the order of shape 1 correspondence points should match the order of shape 2 correspondence points: i.e., if (i1,j1) is one correspondence and (i2,j2) is another, then either i1<i2 and j1<j2, or i1>i2 and j1>j2. We can incorporate this into a prior.
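The ordering condition is easy to test in code. A sketch, assuming vertex indices are distinct within a particle:

```python
def is_contiguous(particle):
    """Check the contiguity condition: for any two correspondences
    (i1, j1) and (i2, j2), either i1 < i2 and j1 < j2, or
    i1 > i2 and j1 > j2 (first indices assumed distinct)."""
    pairs = sorted(particle)                    # sort by first index i
    js = [j for _, j in pairs]
    return (all(a < b for a, b in zip(js, js[1:])) or
            all(a > b for a, b in zip(js, js[1:])))
```

A contiguity prior can then place zero (or very small) mass on particles failing this check.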


Notation:

  • We have that:


Simple Likelihood

  • Based on the observed features we form weight statistics:

  • Let W denote the weight matrix associated with the features.

  • Therefore given that a correspondence ‘C’ belongs in the ‘true’ set of correspondences, we write the simple likelihood in the form,
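The slide's weight statistic is an equation image that did not transcribe, so its exact form is unknown. As a hedged illustration, one common choice is a Gaussian kernel on the local-feature discrepancy, with the likelihood of a correspondence normalized over all candidates:

```python
import numpy as np

def weight_matrix(X, Y, sigma=1.0):
    """W[i, j]: weight of correspondence (i, j). The slide's exact statistic
    is not in the transcript; a Gaussian kernel on the feature
    discrepancy ||X_i - Y_j|| is assumed here."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def simple_likelihood(W, c):
    """Likelihood that correspondence c = (i, j) belongs to the 'true'
    set, normalized over all candidate correspondences."""
    i, j = c
    return W[i, j] / W.sum()
```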


Complicated Likelihoods

  • At stage t, putting ω as the parameter, we define the likelihood:


Simple and Complicated Priors

  • Model a prior for all sets of correspondences which are strongly contiguous:

  • a) a simple prior (we use ω for the weight variable)

  • b) I] a prior giving more weight to diagonals than other correspondences.

  • II] we can define such a prior sequentially based on the fact that


Complicated Prior

  • Put

  • With ‘DIAG[i,j]’ referring to the positively oriented diagonal to which (i,j) belongs.
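The prior's defining equation is missing from the transcript. As an illustrative sketch only: pairs with the same value of j − i share a positively oriented diagonal, and one assumed functional form rewards particles concentrated on few diagonals.

```python
def diag_index(i, j):
    """All correspondences with the same value of j - i lie on the
    same positively oriented diagonal DIAG[i, j]."""
    return j - i

def diagonal_prior(particle, rho=2.0):
    """Unnormalized prior that up-weights particles concentrated on
    few diagonals; rho > 1 sets the strength (assumed form, not the
    slide's actual equation)."""
    n_diags = len({diag_index(i, j) for i, j in particle})
    return rho ** (len(particle) - n_diags)
```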


Simulating the Posterior Distribution: Simple Prior

  • We would like to simulate the posterior distribution of contiguous correspondences. We do this by calculating the weights:
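The weight equation itself is not in the transcript. A minimal sketch of the generic weight-and-resample step it feeds into, assuming each particle is scored by prior times likelihood:

```python
import numpy as np

def posterior_step(particles, log_prior, log_lik, rng):
    """One sampling-importance-resampling step: weight each particle
    by prior x likelihood, then resample the population in proportion
    to the normalized weights."""
    logw = np.array([log_prior(p) + log_lik(p) for p in particles])
    w = np.exp(logw - logw.max())       # stabilize before exponentiating
    w /= w.sum()
    idx = rng.choice(len(particles), size=len(particles), p=w)
    return [particles[k] for k in idx], w
```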


Simulating the Posterior Distribution: Complicated Prior

  • Here we simulate:


A Simpler Model

  • Define the posterior probabilities:

  • For parameter λ, described below.


Weights for the simpler model

  • The weights for the simpler model are particularly easy:

  • Letting λ tend to infinity at a suitable rate, we obtain convergence to the MAP estimator of the simple particle filter.
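This λ → ∞ behavior can be seen in a small sketch (the tempered-weight form here is an assumption, since the slide's weight formula is missing): as λ grows, the weight vector concentrates on the highest-scoring particle, i.e. the MAP estimate.

```python
import numpy as np

def annealed_weights(scores, lam):
    """Tempered particle weights w_k proportional to exp(lam * score_k).
    As lam -> infinity, all mass concentrates on the highest-scoring
    particle, recovering the MAP estimator."""
    s = lam * (np.asarray(scores) - np.max(scores))   # shift for stability
    w = np.exp(s)
    return w / w.sum()
```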


Shape Similarity: A More Complicated Model Employing Curvature and Distance Parameters

  • We have:


Simple Likelihood

  • Based on the observed features we form weight parameters:

  • Let W denote the weight matrix associated with the features.

  • Therefore given that a correspondence ‘C’ belongs in the ‘true’ set of correspondences, we write the likelihood in the form,


Particle Likelihood

  • We write the likelihood in the form:


Particle Prior

  • We assume standard priors for the mu’s and nu’s. We also assume a prior for the set of contiguous correspondences.

  • The particle is updated as follows: define,
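The defining equation for the update did not transcribe. As a placeholder illustration only: one simple sequential proposal extends a particle (assumed to be in increasing order) with a uniformly chosen correspondence that preserves contiguity.

```python
import numpy as np

rng = np.random.default_rng(1)

def extend_particle(particle, n, m):
    """One sequential update: propose a new correspondence uniformly
    from the pairs that keep the particle contiguous (an assumed
    proposal; the slide's update equation is not in the transcript)."""
    last_i, last_j = max(particle) if particle else (-1, -1)
    candidates = [(i, j) for i in range(last_i + 1, n)
                         for j in range(last_j + 1, m)]
    if not candidates:
        return particle
    return particle | {candidates[rng.integers(len(candidates))]}
```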


Particle Prior

  • At stage t we have particles,

  • Their weights are given by:

