
Stacked Hierarchical Labeling

Dan Munoz, Drew Bagnell, Martial Hebert


Presentation Transcript


  1. Stacked Hierarchical Labeling Dan Munoz Drew Bagnell Martial Hebert

  2. The Labeling Problem Sky Bldg Tree Fgnd Road Input Our Predicted Labels

  3. The Labeling Problem

  4. The Labeling Problem • Needed: better representation & interactions • Ohta ‘78

  5. Using Regions Input Ideal Regions Slide from T. Malisiewicz

  6. Using Regions Input Actual Regions Slide from T. Malisiewicz

  7. Using Regions + Interactions big regions small regions Image Representation Ideal Prob. Graphical Model • High-order • Expressive interactions

  8. Using Regions + Interactions big regions small regions Image Representation Actual PGM • Restrictive interactions • Still NP-hard

  9. Learning with Approximate Inference Simple Random Field Learning Path • PGM learning requires exact inference • Otherwise, may diverge Kulesza and Pereira ’08

  10. PGM Approach Input PGM Inference Output

  11. Our Approach f1 Input Output fN … Sequence of simple problems Cohen ’05, Daume III ’06

  12. A Sequence of Simple Problems … Stacked Hierarchical Labeling f1 Input fN Output • Training simple modules to get the desired output • No searching in exponential space • Not optimizing any joint distribution/energy • Not necessarily a bad thing! Kulesza & Pereira ‘08
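The chained modules f1…fN can be sketched as a simple loop in which each stage sees the original input plus the previous stage's prediction. A minimal sketch; the toy modules `f1`/`f2` are hypothetical stand-ins, not the paper's classifiers:

```python
import numpy as np

def run_stacked_sequence(x, modules):
    """Apply a sequence of simple predictors f1..fN, feeding each
    module the input plus the previous module's prediction."""
    prev = None
    for f in modules:
        prev = f(x, prev)
    return prev

# Hypothetical toy modules: start from a uniform guess, then refine it.
f1 = lambda x, prev: np.full_like(x, 0.5)        # uninformed prior
f2 = lambda x, prev: 0.5 * prev + 0.5 * (x > 0)  # refine using the input
out = run_stacked_sequence(np.array([-1.0, 2.0]), [f1, f2])
```

No joint energy is optimized here; each module only has to solve its own simple prediction problem.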

  13. Our Contribution • An effective PGM alternative for labeling • Training a hierarchical procedure of simple problems • Naturally analyzes multiple scales • Robust to imperfect segmentations • Enables more expressive interactions • Beyond pair-wise smoothing

  14. Related Work big regions small regions • Learning with multi-scale configurations • Joint probability distribution Bouman ‘94, Feng ‘02, He ’04 Borenstein ‘04, Kumar ’05 • Joint score/energy Tu ‘03, S.C. Zhu ‘06, L. Zhu ‘08 Munoz ‘09, Gould ’09, Ladicky ’09 • Mitigating the intractable joint optimization • Cohen ’05, Daume III ’06, Kou ‘07, Tu ’08, Ross ‘10

  15. 1 2 3 . . . . . .

  16. 1 In this work, the segmentation tree is given; we use the technique from Arbelaez ’09 2 3 . . . . . .

  17. Segmentation Tree (Arbelaez ’09) 2 3 4 1

  18. Segmentation Tree (Arbelaez ’09) Label Coarse To Fine 2 3 4 1 Parent sees big picture Naturally handles scales

  19. Segmentation Tree (Arbelaez ’09) 2 3 4 1 f4 f2 f3 f1 • Parent sees big picture • Naturally handles scales • Break into simple tasks • Predict label mixtures
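The coarse-to-fine pass over the segmentation tree can be sketched as a recursive walk in which each child region is classified with its parent's prediction as context. The tree layout and `toy_classify` below are hypothetical illustrations, not the paper's classifiers:

```python
def label_tree(root, children, classify):
    """classify(node, parent_probs) -> label mixture for `node`;
    each child is classified with its parent's prediction as context."""
    results = {}
    def recurse(node, parent_probs):
        probs = classify(node, parent_probs)
        results[node] = probs
        for child in children.get(node, []):
            recurse(child, probs)  # child sees its parent's prediction
    recurse(root, None)
    return results

# Toy classifier: the root is biased toward sky; children smooth a flat
# prior toward their parent's mixture.
def toy_classify(node, parent_probs):
    prior = {"sky": 0.5, "road": 0.5}
    if parent_probs is None:
        return {"sky": 0.8, "road": 0.2}
    return {k: 0.5 * prior[k] + 0.5 * parent_probs[k] for k in prior}

tree = {1: [2, 3], 2: [4]}  # hypothetical segmentation tree
labels = label_tree(1, tree, toy_classify)
```

Because the parent's broad-scale mixture flows down the tree, coarse context naturally influences fine regions without any joint inference.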

  20. Handling Real Segmentation Segmentation Map Input fi predicts mixture of labels for each region

  21. Actual Predicted Mixtures P(Building) P(Tree) P(Fgnd) (brighter → higher probability)

  22. Training Overview f1 f2 How to train each module fi? How to use previous predictions? How to train the hierarchical sequence?

  23. Training Overview f1 f2 How to train each module fi? How to use previous predictions? How to train the hierarchical sequence?

  24. Modeling Heterogeneous Regions • Count true labels Pr present in each region r • Train a model Q to match each Pr • Logistic Regression • minQ H(P,Q) → Weighted Logistic Regression • Image features: texture, color, etc. (Gould ’08)
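Matching each region's true-label proportions Pr by minimizing the cross-entropy H(P, Q) amounts to multiclass logistic regression with soft (weighted) targets. A minimal gradient-descent sketch on made-up toy data; the features and label set are hypothetical, not the texture/color features from Gould ’08:

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def fit_mixture_logreg(X, P, lr=0.5, steps=500):
    """Multiclass logistic regression trained against soft targets P
    (per-region label proportions), i.e. minimizing H(P, Q)."""
    n, d = X.shape
    W = np.zeros((d, P.shape[1]))
    for _ in range(steps):
        Q = softmax(X @ W)
        W -= lr * X.T @ (Q - P) / n  # gradient of the cross-entropy
    return W

X = np.eye(2)              # one indicator feature per toy region
P = np.array([[0.8, 0.2],  # region 0: 80% building, 20% tree
              [0.3, 0.7]]) # region 1: 30% building, 70% tree
W = fit_mixture_logreg(X, P)
Q = softmax(X @ W)
```

Soft targets let heterogeneous regions (e.g. a segment covering both building and tree) contribute fractional counts instead of a single hard label.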

  25. Training Overview f1 f2 How to train each module fi? How to use previous predictions? How to train the hierarchical sequence?

  26. Using Parent Predictions Parent regions Child regions • Use broader context in the finer regions • Allow finer regions access to all parent predictions • Create & append 3 types of context features • Kumar ’05, Sofman ’06, Shotton ’06, Tu ‘08

  27. Parent Context Parent Child Refining the parent

  28. Detailed In Paper • Σ over regions • Image-wise (co-occurrence) • Spatial Neighborhood (center-surround)
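The three context-feature types listed on the slide (the parent's own mixture, an image-wide average for co-occurrence, and a neighborhood average for center-surround) can be sketched roughly as follows; the names and data layout are hypothetical, and the exact feature definitions are detailed in the paper:

```python
import numpy as np

def context_features(parent_probs, region_id, neighbors):
    """Build context features for one region from the previous level's
    label mixtures. parent_probs: {region_id: probability vector}."""
    own = parent_probs[region_id]                          # parent's mixture
    image = np.mean(list(parent_probs.values()), axis=0)   # co-occurrence
    nbr = np.mean([parent_probs[n] for n in neighbors[region_id]], axis=0)
    return np.concatenate([own, image, nbr - own])         # center-surround

probs = {0: np.array([1.0, 0.0]), 1: np.array([0.0, 1.0])}
nbrs = {0: [1], 1: [0]}
f0 = context_features(probs, 0, nbrs)
```

Appending these features to each child region's descriptor is what lets finer levels use the broader context the parent saw.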

  29. Training Overview f1 f2 How to train each module fi? How to use previous predictions? How to train the hierarchical sequence?

  30. Approach #1 f3 f1 f2 f4 • Train each module independently • Use ground truth context features • Problem: Cascades of Errors • Modules depend on perfect context features • Observe no mistakes during training → Propagate mistakes during testing

  31. Approach #2 f3 f1 f2 f4 • Solution: Train in feed-forward manner • Viola-Jones ‘01, Kumar ‘05, Wainwright ’06, Ross ‘10

  32. Training Feed-Forward A fl LogReg B (Parameters) C

  33. Training Feed-Forward fl A fl B fl C

  34. Cascades of Overfitting F.F. Train Confusions F.F. Test Confusions Stacking Test Confusions • Solution: Stacking • Wolpert ’92, Cohen ’05 • Similar to x-validation • Don’t predict on data used for training

  35. Stacking A fl A B LogReg C

  36. Stacking fl A A

  37. Stacking A fl B B LogReg C

  38. Stacking fl A A fl B B

  39. Stacking fl A A fl B B fl C C
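The stacking scheme sketched in slides 35–39 replaces feed-forward training's optimistic context features with held-out predictions: each sample's context comes from a model trained on the other folds, much like cross-validation. A minimal sketch with hypothetical `fit`/`predict` callables standing in for the logistic-regression modules:

```python
import numpy as np

def stacked_predictions(X, y, folds, fit, predict):
    """Stacking (Wolpert '92): every sample's prediction comes from a
    model that never saw that sample during training."""
    out = np.zeros(len(X))
    for held_out in np.unique(folds):
        train = folds != held_out
        model = fit(X[train], y[train])          # fit on the other folds
        out[~train] = predict(model, X[~train])  # predict the held-out fold
    return out

# Toy learner: predict the training-set mean everywhere.
fit = lambda X, y: y.mean()
predict = lambda m, X: np.full(len(X), m)

X = np.arange(4.0)
y = np.array([1.0, 1.0, 3.0, 3.0])
folds = np.array([0, 0, 1, 1])
out = stacked_predictions(X, y, folds, fit, predict)
```

Because no sample is predicted by a model that trained on it, the context features passed to the next level reflect realistic test-time errors rather than overfit training-set accuracy.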

  40. Learning to Fix Mistakes • Level 5: person part of incorrect segment • Level 6: person segmented, but relies on parent • Level 7: person fixes previous mistake (Segments / Current Output)

  41. Level 1/8 Predictions Segmentation

  42. Level 1/8 Predictions P(Foreground) Segmentation 15% P(Building) P(Tree) P(Road) 12% 18% 31%

  43. Level 1/8 Predictions P(Foreground) Segmentation Current Output 15% Road P(Building) P(Tree) P(Road) 12% 18% 31%

  44. Level 2/8 Predictions P(Foreground) Segmentation P(Building) P(Tree) P(Road)

  45. Level 2/8 Predictions P(Foreground) Segmentation Current Output P(Building) P(Tree) P(Road)

  46. Level 3/8 Predictions P(Foreground) Segmentation Current Output P(Building) P(Tree) P(Road)

  47. Level 4/8 Predictions P(Foreground) Segmentation Current Output P(Building) P(Tree) P(Road)

  48. Level 5/8 Predictions P(Foreground) Segmentation Current Output P(Building) P(Tree) P(Road)

  49. Level 6/8 Predictions P(Foreground) Segmentation Current Output P(Building) P(Tree) P(Road)

  50. Level 7/8 Predictions P(Foreground) Segmentation Current Output P(Building) P(Tree) P(Road)
