
A Graphical Model For Simultaneous Partitioning And Labeling



Presentation Transcript


  1. A Graphical Model For Simultaneous Partitioning And Labeling Philip Cowans & Martin Szummer AISTATS, Jan 2005 Cambridge

  2. Motivation – Interpreting Ink Hand-drawn diagram Machine interpretation

  3. Graph Construction • Vertices V and edges E form the graph G. • Vertices are grouped into parts. • Each part is assigned a label. [figure: example graph with vertices, edges, and labeled parts]

  4. Labeled Partitions • We assume: parts are contiguous, and the graph is triangulated. • We’re interested in probability distributions over labeled partitions conditioned on observed data.

  5. Conditional Random Fields • CRFs (Lafferty et al.) provide a joint labeling of graph vertices. • Idea: define parts to be contiguous regions with the same label. • But… • A large number of labels is needed. • Symmetry problems / bias. [figure: parts encoded by numbered labels +1, +2, +3, −1]

  6. A Better Approach… • Extend the CRF framework to work directly with labeled partitions. • Complexity is improved – don’t need to deal with so many labels. • No symmetry problem – we’re working directly with the representation in which the problem is posed.

  7. Projection • Projection maps labeled partitions onto smaller subgraphs. • If G ⊆ V, then the projection of Y onto G is the unique labeled partition of G that is ‘consistent’ with Y.
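As a concrete sketch of the projection operation (in Python, with a hypothetical representation of a labeled partition as a list of (part, label) pairs — the transcript does not fix a data structure):

```python
def project(labeled_partition, subset):
    """Project a labeled partition Y onto a vertex subset G ⊆ V.

    Each part is intersected with the subset; empty intersections drop
    out. The result is the unique labeled partition of the subset that
    is consistent with Y.
    """
    result = []
    for part, label in labeled_partition:
        restricted = part & subset
        if restricted:
            result.append((frozenset(restricted), label))
    return result


# Example: project a two-part labeled partition onto vertices {2, 3, 4}.
Y = [(frozenset({1, 2, 3}), "container"), (frozenset({4, 5}), "connector")]
print(project(Y, {2, 3, 4}))
# → [(frozenset({2, 3}), 'container'), (frozenset({4}), 'connector')]
```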

  8. Notation

  9. Potentials

  10. The Model • Unary: • Pairwise:
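The slide's equations are not preserved in the transcript; the sketch below only illustrates the assumed exponential-family shape of the unary and pairwise terms (as in standard CRFs). The feature vectors, weight indexing, and label names are illustrative, not the paper's actual parameterization.

```python
import math

def unary_potential(w, features, label):
    """phi_i: depends on one vertex's observed features and the label
    of the part containing that vertex (assumed log-linear form)."""
    return math.exp(sum(wk * fk for wk, fk in zip(w[label], features)))

def pairwise_potential(v, features, same_part, label_pair):
    """phi_ij: depends on an edge's features, whether its two endpoints
    share a part, and the labels involved (assumed log-linear form)."""
    key = (same_part, label_pair)
    return math.exp(sum(vk * fk for vk, fk in zip(v[key], features)))
```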

  11. The Model

  12. Training • Train by finding MAP weights on example data with a Gaussian prior (BFGS). • We require the value and gradient of the log posterior, which in turn require normalization and marginalization.
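A toy stand-in for this training loop, using plain gradient ascent in place of BFGS and a unary-only binary model (all names and data here are illustrative; the point is the value-and-gradient structure of the log posterior with a Gaussian prior):

```python
import math

def log_posterior_and_grad(w, data, sigma2=1.0):
    """Toy conditional model p(y=1|x) = sigmoid(w·x), Gaussian prior on w."""
    lp = -sum(wi * wi for wi in w) / (2 * sigma2)   # Gaussian prior term
    grad = [-wi / sigma2 for wi in w]
    for x, y in data:
        s = sum(wi * xi for wi, xi in zip(w, x))
        p = 1.0 / (1.0 + math.exp(-s))              # needs normalization
        lp += math.log(p if y == 1 else 1.0 - p)
        for i, xi in enumerate(x):
            grad[i] += (y - p) * xi                 # empirical minus expected
    return lp, grad

# A few steps of gradient ascent (stand-in for BFGS on the same objective).
data = [([1.0, 0.0], 1), ([0.0, 1.0], 0)]
w = [0.0, 0.0]
for _ in range(200):
    _, g = log_posterior_and_grad(w, data)
    w = [wi + 0.5 * gi for wi, gi in zip(w, g)]
```

In the paper's model the analogous gradient involves marginals over labeled partitions, which is why efficient marginalization (slide 14 onward) matters.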

  13. Prediction • New data is processed by finding the most probable labeled partition. • This is the same as normalization with the summation replaced by a maximization.

  14. Inference • These operations require summation or maximization over all possible labeled partitions. • The number of terms grows super-exponentially with the size of G. • Efficient computation is possible using message passing, as the distribution factors. • Proof based on Shenoy & Shafer (1990).
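To see the growth rate, the number of labeled partitions of n vertices with L labels is Σₖ S(n,k)·Lᵏ, where S(n,k) are Stirling numbers of the second kind. The sketch below ignores the paper's contiguity constraint (which prunes this count on a sparse graph), so it is an upper bound on a complete graph, but it shows the super-exponential blow-up that brute-force summation would face:

```python
def num_labeled_partitions(n, num_labels):
    """Count labeled partitions of n items: sum_k S(n, k) * L^k.

    S(n, k) = Stirling numbers of the second kind, computed by the
    recurrence S(n, k) = k*S(n-1, k) + S(n-1, k-1). Note: this ignores
    any contiguity constraint, so it upper-bounds the count on a graph.
    """
    S = {(0, 0): 1}
    for m in range(1, n + 1):
        for k in range(1, m + 1):
            S[(m, k)] = k * S.get((m - 1, k), 0) + S.get((m - 1, k - 1), 0)
    return sum(S[(n, k)] * num_labels ** k for k in range(1, n + 1))


# Two vertices, two labels: {ab} with 2 labels, or {a}{b} with 2*2 labelings.
print(num_labeled_partitions(2, 2))   # → 6
```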

  15. Message Passing [figure: example graph with vertices 1–9]

  16. Message Passing • Junction tree constructed from cliques of the original graph [figure: cliques {1,7,8}, {2,9}, {1,2,3,4}, {2,3,4,5}, {4,5,6}]. • An ‘upstream’ message summarizes the contribution from ‘upstream’ to the sum for a given configuration of the separator.

  17. Message Passing [figure: messages passed between cliques of the junction tree]

  18. Message Update Rule • Update messages (for summation) according to • Marginals found using • Z can be found explicitly
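The update equations themselves are lost from the transcript. As a simplified sketch of the sum-product structure they follow (assumption: plain per-vertex labels on a chain rather than full labeled partitions — in the paper, messages range over labeled partitions of the separator, but the recursion has the same shape), with toy potentials:

```python
from itertools import product

LABELS = [0, 1]

def unary(i, y):
    return 1.5 if y == i % 2 else 1.0    # toy potential, illustrative only

def pairwise(y, z):
    return 2.0 if y == z else 1.0        # toy potential, illustrative only

def chain_Z(n):
    """Normalization constant Z via forward message passing: each message
    summarizes the 'upstream' sum for one value of the separator."""
    msg = [unary(0, y) for y in LABELS]
    for i in range(1, n):
        msg = [unary(i, y) * sum(m * pairwise(z, y)
                                 for m, z in zip(msg, LABELS))
               for y in LABELS]
    return sum(msg)                      # Z found explicitly at the root

def brute_Z(n):
    """The same quantity by explicit enumeration (exponential cost)."""
    total = 0.0
    for ys in product(LABELS, repeat=n):
        w = 1.0
        for i, y in enumerate(ys):
            w *= unary(i, y)
        for i in range(n - 1):
            w *= pairwise(ys[i], ys[i + 1])
        total += w
    return total
```

Replacing each sum with a max yields the most probable configuration instead of Z, which is the prediction procedure of slide 13.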

  19. Complexity

  20. Experimental Results • We tested the algorithm on hand-drawn ink collected using a Tablet PC. • The task is to partition the ink fragments into perceptual objects, and label them as containers or connectors. • The training data set was 40 diagrams from 17 subjects, with a total of 2157 fragments. • 3 random splits (20 training and 20 test examples).

  21. Example 1

  22. Example 1

  23. Example 2

  24. Example 2

  25. Example 3

  26. Example 3

  27. Labeling Results • Labeling error: fraction of fragments labeled incorrectly. • Grouping error: fraction of edges locally incorrect (the prediction disagrees with ground truth on whether the edge’s endpoints share a part).
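The two metrics are straightforward to state in code (representation is assumed: labels as parallel lists, partitions as vertex-to-part-id maps):

```python
def labeling_error(pred_labels, true_labels):
    """Fraction of fragments labeled incorrectly."""
    wrong = sum(p != t for p, t in zip(pred_labels, true_labels))
    return wrong / len(true_labels)

def grouping_error(edges, pred_part, true_part):
    """Fraction of edges locally incorrect: an edge (i, j) is wrong when
    prediction and ground truth disagree on whether i and j share a part."""
    wrong = sum((pred_part[i] == pred_part[j]) != (true_part[i] == true_part[j])
                for i, j in edges)
    return wrong / len(edges)


# Example: one of two fragments mislabeled; both edges locally incorrect.
print(labeling_error(["container", "connector"], ["container", "container"]))
print(grouping_error([(0, 1), (1, 2)],
                     {0: 0, 1: 0, 2: 1},   # predicted grouping
                     {0: 0, 1: 1, 2: 1}))  # true grouping
```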

  28. Conclusions • We have presented a conditional model defined over labeled partitions of an undirected graph. • Efficient exact inference is possible in our model using message passing. • Labeling and grouping simultaneously can improve labeling performance. • Our model performs well when applied to the task of parsing hand-drawn ink diagrams.

  29. Acknowledgements Thanks to: Thomas Minka, Yuan Qi and Michel Gangnet for useful discussions and for providing software; Hannah Pepper for collecting our ink database.
