Fast Memory-Efficient Generalized Belief Propagation


Aim: To reduce the time and memory requirements of Generalized Belief Propagation.

Belief Propagation
• Sites of the MRF are clustered into regions.
• Regions pass messages to subregions until convergence.

Loopy Belief Propagation (LBP)
• Regions of size 2.
• Inaccurate Bethe approximation.
• Computationally inexpensive.

Generalized Belief Propagation (GBP)
• Regions of arbitrary size S.
• Accurate Kikuchi approximation.
• Computationally expensive.

Robust Truncated Model (RTM)
• Pairwise potentials ψ(xi, xj) are truncated (truncation factor = 0 in the experiments shown), so most label pairs do not contribute to the message.
• nL denotes the number of labels and nC the number of contributing label pairs.

Fast LBP
• Message M = max over xi of ψ(xi, xj) × LB(xi), where LB(xi) is the local belief of label xi.
• Results: 100 random MRFs for varying nC/nL; the highest-LB label is selected. Both time and memory requirements are reduced.

Fast GBP
• Message M = max over xi of ψ(xi, xj) × ψ(xi, xk) × LB(xi, xj) × LB(xi, xk), split into terms T1, T2 and T3.
• The same label xi of site i is used to compute the terms T2 and T3 (proof in the paper).
• Term T1 takes O(nL/nC) less time than computing message M directly.

Memory-Efficient GBP
• Bipartite graphs: divide the regions of the MRF into two sets A and B. A message within A depends only on messages from B (and vice versa), so the number of stored messages can be halved.
• Divide the MRF into smaller MRFs which can be solved one at a time. The number of stored messages is reduced by O((nL/nC)^(S-1)).

Results
Subgraph matching
• Match G1 = (V1, E1) against G2 = (V2, E2).
• 1000 synthetic pairs of graphs, with 7% noise added.
• Labels selected by highest LB(xi, xj) and highest LB(xi, xk).
Object recognition
• Model combines a part likelihood P (texture) with a spatial prior Q.
• ROC curves over 450 positive and 2400 negative images.
• Time = 16 sec; memory = 0.5 MB.
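The Fast LBP message exploits the truncated model: when ψ(xi, xj) equals a constant for all but a few contributing label pairs, the max over all truncated pairs collapses to that constant times the single best local belief, computed once. The sketch below illustrates this idea; the function name, the `psi_contrib` dictionary layout, and the `psi_floor` constant are illustrative assumptions, not the paper's notation.

```python
import numpy as np

def fast_lbp_message(local_belief, psi_contrib, psi_floor):
    """Max-product message M(xj) = max_{xi} psi(xi, xj) * LB(xi),
    assuming psi(xi, xj) == psi_floor for every label pair except
    the few contributing ones (a reading of the truncated model).
    psi_contrib: dict mapping xj -> list of (xi, psi_value) pairs.
    """
    n_labels = len(local_belief)
    # All truncated pairs share psi_floor, so their max over xi is
    # psi_floor times the best local belief -- computed only once.
    floor_term = psi_floor * max(local_belief)
    message = np.full(n_labels, floor_term)
    # Only the contributing pairs per xj need explicit evaluation,
    # giving O(nL + nC) work instead of O(nL^2).
    for xj, pairs in psi_contrib.items():
        for xi, psi in pairs:
            message[xj] = max(message[xj], psi * local_belief[xi])
    return message
```

A brute-force max over a full nL × nL potential table gives the same message, which is a quick way to sanity-check the shortcut.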
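The bipartite halving idea can be illustrated with ordinary max-product BP on an even-length ring, which is bipartite (even sites = A, odd sites = B): only messages directed into the A sites are stored, and messages into B sites are rebuilt on the fly from the stored set. This is a hypothetical sketch of the storage trick under a shared symmetric potential, not the paper's GBP implementation.

```python
import numpy as np

def ring_bp_half_memory(unary, psi, n_iters=10):
    """Max-product BP on an even-length ring MRF, storing only the
    messages directed into the even ('A') sites. Messages into odd
    ('B') sites are recomputed on the fly, halving message storage.
    unary: (n, L) array of site potentials; psi: (L, L), symmetric.
    """
    n, L = unary.shape
    assert n % 2 == 0, "ring must have even length to be bipartite"
    # S[i, d]: stored message into even site i from its neighbour
    # in direction step (d=0 -> i-1, d=1 -> i+1); odd rows unused.
    S = np.ones((n, 2, L))
    for _ in range(n_iters):
        S_new = np.ones_like(S)
        for i in range(0, n, 2):
            for d, step in ((0, -1), (1, +1)):
                j = (i + step) % n        # odd neighbour sending to i
                k = (i + 2 * step) % n    # even site beyond j
                # Rebuild k -> j on the fly from the stored message
                # into k (this is the message we chose not to store).
                m_kj = (psi * (unary[k] * S[k, d])[:, None]).max(axis=0)
                # Then compute j -> i, which is what we do store.
                S_new[i, d] = (psi * (unary[j] * m_kj)[:, None]).max(axis=0)
        S = S_new
    # Max-marginal beliefs; only the even (A) sites are meaningful here.
    beliefs = unary.copy()
    for i in range(0, n, 2):
        beliefs[i] *= S[i, 0] * S[i, 1]
    return beliefs
```

With no coupling (a uniform psi) the belief at each stored site reduces to its unary potential up to a constant, which makes the sketch easy to check.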