
Sliding Windows – Silver Bullet or Evolutionary Deadend?


Presentation Transcript


  1. Sliding Windows – Silver Bullet or Evolutionary Deadend? Alyosha Efros, Bastian Leibe, Krystian Mikolajczyk Sicily Workshop, Syracusa, 23.09.2006

  2. What is a Sliding Window Approach?
  • Search over space and scale
  • Detection as subwindow classification problem
  • “In the absence of a more intelligent strategy, any global image classification approach can be converted into a localization approach by using a sliding-window search.”
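To make the two bullets above concrete, here is a minimal numpy sketch (not from the slides) of the search over space and scale. The classifier `score_window`, the window size, the stride, and the scale step are assumed placeholders for whatever fixed-size model is plugged in.

```python
import numpy as np

def _resize_nearest(image, scale):
    """Nearest-neighbour downsample by `scale` (>1 shrinks); numpy only, for self-containment."""
    h = max(1, int(image.shape[0] / scale))
    w = max(1, int(image.shape[1] / scale))
    ys = (np.arange(h) * scale).astype(int)
    xs = (np.arange(w) * scale).astype(int)
    return image[ys[:, None], xs]

def sliding_window_detect(image, score_window, win=(64, 64),
                          stride=8, scale_step=1.25, threshold=0.5):
    """Return (x, y, scale, score) for every subwindow scoring above threshold."""
    win_h, win_w = win
    detections = []
    scale = 1.0
    while True:
        img = _resize_nearest(image, scale)
        if img.shape[0] < win_h or img.shape[1] < win_w:
            break
        # search over space at this scale
        for y in range(0, img.shape[0] - win_h + 1, stride):
            for x in range(0, img.shape[1] - win_w + 1, stride):
                s = score_window(img[y:y + win_h, x:x + win_w])  # subwindow classification
                if s > threshold:
                    # map window coordinates back to the original resolution
                    detections.append((x * scale, y * scale, scale, s))
        scale *= scale_step  # search over scale by shrinking the image, keeping the model fixed-size
    return detections
```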

  3. Task: Object Localization in Still Images
  • What options do we have to choose from?
  • Sliding window approaches
    • Classification problem
    • [Papageorgiou&Poggio,’00], [Schneiderman&Kanade,’00], [Viola&Jones,’01], [Mikolajczyk et al.,’04], [Torralba et al.,’04], [Dalal&Triggs,’05], [Wu&Nevatia,’05], [Laptev,’06],…
  • Feature-transform based approaches
    • Part-based generative models, typically with a star topology
    • [Fergus et al.,’03], [Leibe&Schiele,’04], [Fei-Fei et al.,’04], [Felzenszwalb&Huttenlocher,’05], [Winn&Criminisi,’06], [Opelt et al.,’06], [Mikolajczyk et al.,’06],…
  • Massively parallel NN architectures
    • e.g. convolutional NNs
    • [LeCun et al.,’98], [Osadchy et al.,’04], [Garcia et al.,??],…
  • “Smart segmentation” based approaches
    • Localization based on robustified bottom-up segmentation
    • [Todorovic&Ahuja,’06], [Roth&Ommer,’06]

  4. Sliding-Window Approaches
  • Pros:
    • Can draw from vast stock of ML methods.
    • Independence assumption between subwindows.
      • Makes classification easier.
      • Process can be parallelized.
    • Simple technique, can be tried out very easily.
      • No translation/scale invariance required in model.
    • There are methods to do it very fast.
      • Cascades with AdaBoost/SVMs (sketched below)
    • Good detection performance on many benchmark datasets.
      • e.g. face detection, VOC challenges
    • Direct control over search range (e.g. on ground plane).
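The cascade bullet is what makes the per-window cost affordable in practice. Below is a hedged sketch of the early-rejection idea only; the stage functions and thresholds are hypothetical placeholders, not values from the slides.

```python
def cascade_score(patch, stages, rejection_thresholds):
    """Each stage adds to a running score; reject as soon as it drops below that stage's threshold."""
    total = 0.0
    for stage, reject_below in zip(stages, rejection_thresholds):
        total += stage(patch)       # e.g. a few boosted stumps, or a linear SVM on cheap features
        if total < reject_below:
            return float("-inf")    # early rejection: most background windows exit here
    return total                    # only survivors receive a full confidence score
```

Plugged in as the `score_window` of the sliding-window loop above, most background windows exit after the first cheap stage, so only a tiny fraction reaches the expensive later stages.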

  5. Sliding-Window Approaches
  • Cons:
    • Can draw from vast stock of ML methods… as long as they can be evaluated in a few ms.
    • Need to evaluate many subwindows (100,000s).
      ⇒ Needs very fast & accurate classification
      ⇒ Many training examples required, often limited to low training resolution.
      ⇒ Can only deal with relatively small occlusions.
    • Still need to fuse resulting detections (see the NMS sketch below)
      ⇒ Hard/suboptimal from binary classification output
    • Classification task often ill-defined
      • How to label half a car?
    • Difficult to deal with changing aspect ratios
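The fusion step called "hard/suboptimal from binary classification output" above is in practice usually some variant of non-maximum suppression. A standard greedy version is sketched here as a reference point; this is my sketch, not the authors' method.

```python
import numpy as np

def non_max_suppression(boxes, scores, iou_thresh=0.5):
    """Greedy NMS: keep the highest-scoring box, drop boxes that overlap it strongly, repeat.
    boxes: (N, 4) array of [x1, y1, x2, y2]; scores: (N,) confidences. Returns kept indices."""
    order = np.argsort(scores)[::-1]
    keep = []
    while order.size > 0:
        i = order[0]
        keep.append(i)
        # intersection of the best remaining box with the rest
        xx1 = np.maximum(boxes[i, 0], boxes[order[1:], 0])
        yy1 = np.maximum(boxes[i, 1], boxes[order[1:], 1])
        xx2 = np.minimum(boxes[i, 2], boxes[order[1:], 2])
        yy2 = np.minimum(boxes[i, 3], boxes[order[1:], 3])
        inter = np.maximum(0, xx2 - xx1) * np.maximum(0, yy2 - yy1)
        area_i = (boxes[i, 2] - boxes[i, 0]) * (boxes[i, 3] - boxes[i, 1])
        areas = (boxes[order[1:], 2] - boxes[order[1:], 0]) * (boxes[order[1:], 3] - boxes[order[1:], 1])
        iou = inter / (area_i + areas - inter)
        order = order[1:][iou < iou_thresh]   # discard near-duplicates of the kept box
    return keep
```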

  6. Duality to Feature-Based Approaches…
  [Figure: Hough votes → binned accumulator array → candidate maxima → refinement (MSME), shown as volumes over (x, y, s)]
  • How to find maxima in the Hough space efficiently?
    • Maxima search = coarse-to-fine sliding window stage!
  • Main differences:
    • All features evaluated upfront (instead of in cascade).
    • Generative model instead of discriminative classifier.
    • Maxima search already performs detection fusion.
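A sketch of what the figure and the first bullet describe: weighted votes are binned into a coarse (x, y, scale) accumulator, which is then scanned for local maxima, i.e. the coarse sliding-window-style stage the slide points out. The refinement step (MSME, presumably mean-shift mode estimation) would follow and is omitted; bin size and thresholds below are illustrative assumptions.

```python
import numpy as np

def hough_candidate_maxima(votes, image_shape, n_scales, bin_size=10, min_votes=5.0):
    """Accumulate weighted votes in a binned (scale, y, x) array and return candidate maxima.
    votes: iterable of (x, y, scale_index, weight)."""
    H = int(np.ceil(image_shape[0] / bin_size))
    W = int(np.ceil(image_shape[1] / bin_size))
    acc = np.zeros((n_scales, H, W))
    for x, y, s, w in votes:
        acc[s, int(y // bin_size), int(x // bin_size)] += w
    # candidate maxima: bins at least as strong as all neighbours and above a minimum mass
    candidates = []
    for s in range(n_scales):
        for i in range(1, H - 1):
            for j in range(1, W - 1):
                v = acc[s, i, j]
                neighbourhood = acc[max(0, s - 1):s + 2, i - 1:i + 2, j - 1:j + 2]
                if v >= min_votes and v == neighbourhood.max():
                    candidates.append((j * bin_size, i * bin_size, s, v))
    return candidates   # refinement of these candidates would follow here
```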

  7. So What is Left to Oppose?
  • Feature-based vs. Window-based?
  • (Almost) exclusive use of discriminative methods
  • Low training resolutions
  • How to deal with changing aspect ratios?

  8. 1. Feature-based vs. Window-based
  • May be mainly an implementation trade-off
    • Few, localized features ⇒ feature-based evaluation better
    • Many, dense features ⇒ window-based evaluation better
    • Noticed already by e.g. [Schneiderman,’04]
  • The trade-offs may change as your method develops…
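A back-of-the-envelope way to read the trade-off above (the counts and dimensions are made-up assumptions): compare the work done when iterating over features versus iterating over windows.

```python
def evaluation_costs(n_features, n_windows, votes_per_feature=10, dims_per_window=512):
    """Rough operation counts for the two evaluation strategies (illustrative numbers only)."""
    feature_based = n_features * votes_per_feature   # sparse interest points: each casts a few votes
    window_based = n_windows * dims_per_window       # dense subwindow scan: each evaluates a descriptor
    return feature_based, window_based

# e.g. 200 interest points vs. 100,000 subwindows with a 512-dim descriptor:
# evaluation_costs(200, 100_000) -> (2000, 51200000), so feature-based wins here;
# with very dense features the comparison can flip.
```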

  9. 2. Exclusive Use of Discriminative Methods [Leibe & Schiele,’04]
  [Figure: recognition pipeline with a generative model inside — interest points → matched codebook entries → probabilistic voting → 3D voting space (x, y, s; continuous) → backprojection of maxima → backprojected hypotheses → p(figure) probabilities → segmentation]

  10. Generative Models for Sliding Windows
  • Continuous confidence scores
    • Smoother maxima in hypothesis space
    • Coarser sampling possible

  11. Generative Models for Sliding Windows
  • Continuous confidence scores
    • Smoother maxima in hypothesis space
    • Coarser sampling possible
  • Backprojection capability
    • Determine a hypothesis’s support in the image
    • Resolve overlapping cases

  12. Generative Models for Sliding Windows
  • Continuous confidence scores
    • Smoother maxima in hypothesis space
    • Coarser sampling possible
  • Backprojection capability
    • Determine a hypothesis’s support in the image
    • Resolve overlapping cases
  • Easier to deal with partial occlusion
    • Part-based models
    • Reasoning about missing parts
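A hedged sketch of the properties listed above: if each feature casts a soft, generative vote for the object position, the hypothesis score is continuous (so maxima are smooth and sampling can be coarser), the per-feature contributions can be backprojected to show a hypothesis's support, and near-zero contributions point to missing or occluded parts. The Gaussian vote model and all names are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def score_hypothesis(hypothesis, features, sigma=8.0):
    """hypothesis: (x, y) object-centre candidate; features: list of (fx, fy, weight) votes.
    Returns (continuous_score, support), where support holds each feature's contribution."""
    hx, hy = hypothesis
    support = []
    score = 0.0
    for fx, fy, w in features:
        # soft agreement between the feature's voted centre and the hypothesis
        d2 = (fx - hx) ** 2 + (fy - hy) ** 2
        contrib = w * np.exp(-d2 / (2 * sigma ** 2))
        score += contrib
        support.append(contrib)
    # backprojection: features with large contrib explain the hypothesis;
    # parts whose expected features contribute nothing are candidates for occlusion reasoning
    return score, support
```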

  13. Sliding Windows for Generative Models
  • Apply cascade idea to generative models
    • Discriminative training
    • Evaluate most promising features first

  14. Sliding Windows for Generative Models
  [Figure: search corridor in the (x, y, s) search space]
  • Apply cascade idea to generative models
    • Discriminative training
    • Evaluate most promising features first
  • Direct control over search range (see the ground-plane sketch below)
    • Only need to evaluate positions in search corridor
    • Only need to consider subset of features
    • Easier to adapt to different geometry (e.g. curved ground surface)
  ⇒ Should combine discriminative and generative elements!
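One concrete reading of the search-corridor bullets: with a calibrated camera, a flat-ground assumption, and a known object height, candidate foot points and window heights are generated only where a standing object could actually appear, instead of scanning the whole (x, y, s) space. The pinhole/ground-plane model below is my assumption, not the slides' formulation, and the parameter values are made up.

```python
def ground_plane_corridor(image_w, image_h, horizon_y, focal_px,
                          camera_height_m=1.2, object_height_m=1.7, stride=8):
    """Yield (x, y_foot, window_height_px) candidates along a flat-ground search corridor."""
    for y_foot in range(horizon_y + 1, image_h, stride):
        # depth implied by the foot point's offset below the horizon (pinhole camera, flat ground)
        depth = focal_px * camera_height_m / (y_foot - horizon_y)
        win_h = int(focal_px * object_height_m / depth)   # expected object height in pixels at this depth
        for x in range(0, image_w, stride):
            yield (x, y_foot, win_h)
```

A curved ground surface, as mentioned in the last sub-bullet, would simply replace the depth formula with a lookup along the known surface geometry.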

  15. 3. Low Training Resolutions
  • Many current sliding-window detectors operate on tiny images
    • Viola&Jones: 24×24 pixels
    • Torralba et al.: 32×32 pixels
    • Dalal&Triggs: 64×96 pixels (notable exception)
  • Main reasons
    • Training efficiency (exhaustive feature selection in AdaBoost)
    • Evaluation speed
    • Want to recognize objects at small scales
  • But…
    • Limited information content available at those resolutions
    • Not enough support to compensate for occlusions!

  16. 4. Changing Aspect Ratios
  • Sliding window requires fixed window size
    • Basis for learning efficient cascade classifier
  • How to deal with changing aspect ratios?
    • Fixed window size ⇒ wastes training dimensions
    • Adapted window size ⇒ difficult to share features
    • “Squashed” views [Dalal&Triggs] ⇒ need to squash test image, too
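For the "squashed views" option, a minimal sketch of what squashing means in practice: every crop, whatever its aspect ratio, is anisotropically resized to the fixed window the classifier expects, which is exactly why candidate windows at test time must be distorted the same way. The nearest-neighbour resize is only for self-containment.

```python
import numpy as np

def squash_to_window(crop, win_h=64, win_w=64):
    """Anisotropically resize a 2D crop to (win_h, win_w), ignoring its original aspect ratio."""
    ys = np.linspace(0, crop.shape[0] - 1, win_h).astype(int)
    xs = np.linspace(0, crop.shape[1] - 1, win_w).astype(int)
    return crop[ys[:, None], xs]
```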

  17. What is wrong with sliding window? Search complexity?

  18. Is there anything that cannot be done with sliding window?
