
Max-Sum Exact Inference


Presentation Transcript


  1. Probabilistic Graphical Models, Inference, MAP: Max-Sum Exact Inference

  2. Product → Summation: taking logs turns the max-product MAP objective max ∏_k φ_k into the max-sum objective max Σ_k θ_k with θ_k = log φ_k, since log is monotone and preserves the maximizer.
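To make this concrete, here is a minimal sketch (not from the deck) with one binary variable and two made-up factors, showing that max-product and max-sum pick the same maximizer:

```python
import math

# Two made-up factors over a single binary variable X, purely for illustration.
phi1 = {0: 0.6, 1: 0.4}
phi2 = {0: 0.2, 1: 0.9}

# Max-product objective: argmax_x phi1(x) * phi2(x)
map_product = max(phi1, key=lambda x: phi1[x] * phi2[x])

# Max-sum objective on log-factors theta_k = log phi_k: argmax_x theta1(x) + theta2(x)
theta1 = {x: math.log(v) for x, v in phi1.items()}
theta2 = {x: math.log(v) for x, v in phi2.items()}
map_sum = max(theta1, key=lambda x: theta1[x] + theta2[x])

assert map_product == map_sum == 1  # same maximizer; log-space also avoids numerical underflow
```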

  3. Max-Sum Elimination in Chains: chain A - B - C - D - E (figure; one variable crossed out as it is eliminated).

  4. Factor Summation: two log-factors are combined by adding them over the union of their scopes, ψ(X ∪ Y) = θ1(X) + θ2(Y).
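The factor-summation step can be sketched in Python as below. The representation (a factor as a tuple of a variable scope and a table of log-values) and the helper name factor_sum are illustrative choices, not anything defined in the deck; variables are assumed binary:

```python
from itertools import product

DOMAIN = (0, 1)  # assumed binary variables, for illustration only

def factor_sum(f1, f2):
    """psi = theta1 + theta2 over the union of the two scopes.

    A factor is (scope, table): scope is a tuple of variable names, table maps
    tuples of values (in scope order) to log-values.
    """
    scope1, table1 = f1
    scope2, table2 = f2
    scope = tuple(dict.fromkeys(scope1 + scope2))  # union of scopes, order preserved
    table = {}
    for assignment in product(DOMAIN, repeat=len(scope)):
        ctx = dict(zip(scope, assignment))
        table[assignment] = (table1[tuple(ctx[v] for v in scope1)]
                             + table2[tuple(ctx[v] for v in scope2)])
    return scope, table
```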

  5. Factor Maximization: a variable is maxed out of a log-factor, ψ(Y) = max_X θ(X, Y).
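A matching sketch of factor maximization in the same illustrative representation, maxing one variable out of a log-factor:

```python
def factor_max(f, var):
    """psi(Y) = max over `var` of theta(var, Y), same (scope, table) representation."""
    scope, table = f
    idx = scope.index(var)
    new_scope = scope[:idx] + scope[idx + 1:]
    new_table = {}
    for assignment, value in table.items():
        reduced = assignment[:idx] + assignment[idx + 1:]  # drop the maxed-out variable
        new_table[reduced] = max(new_table.get(reduced, float("-inf")), value)
    return new_scope, new_table
```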

  6. Max-Sum Elimination in Chains: the same chain with two variables eliminated (crossed out in the figure).

  7. Max-Sum Elimination in Chains: the same chain with four variables eliminated, leaving a max-marginal over the remaining one.
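Combining the two operations gives max-sum variable elimination on the chain. The sketch below reuses the factor_sum and factor_max helpers above, with random placeholder values for the pairwise log-factors θ1(A,B), ..., θ4(D,E); in a real model these would come from the model's potentials:

```python
import random
from itertools import product

random.seed(0)

def random_pairwise(x, y):
    # Placeholder log-factor over two binary variables; values are arbitrary.
    return ((x, y), {a: random.uniform(-1.0, 0.0) for a in product(DOMAIN, repeat=2)})

theta = [random_pairwise("A", "B"), random_pairwise("B", "C"),
         random_pairwise("C", "D"), random_pairwise("D", "E")]

# Eliminate A, then B, then C, then D; what remains is the max-marginal over E.
current = theta[0]
for var, next_factor in zip("ABCD", theta[1:] + [None]):
    current = factor_max(current, var)              # max out the eliminated variable
    if next_factor is not None:
        current = factor_sum(current, next_factor)  # absorb the next chain factor

scope, max_marginal_E = current
print(scope, max_marginal_E)  # ('E',) and the best achievable log-score for each value of E
```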

  8. Max-Sum in Clique Trees: cliques 1: {A,B}, 2: {B,C}, 3: {C,D}, 4: {D,E} with sepsets B, C, D. The messages are:
  δ1→2(B) = max_A θ1(A,B)
  δ2→1(B) = max_C [θ2(B,C) + δ3→2(C)]
  δ2→3(C) = max_B [θ2(B,C) + δ1→2(B)]
  δ3→2(C) = max_D [θ3(C,D) + δ4→3(D)]
  δ3→4(D) = max_C [θ3(C,D) + δ2→3(C)]
  δ4→3(D) = max_E θ4(D,E)
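Assuming the message definitions written out above, the same helpers can run one pass in each direction over this clique tree; the names d12, d21, ... are just shorthand for δ1→2, δ2→1, and so on, and theta holds the placeholder log-factors from the previous sketch:

```python
t1, t2, t3, t4 = theta  # theta1(A,B), theta2(B,C), theta3(C,D), theta4(D,E)

d12 = factor_max(t1, "A")                   # delta_1->2(B) = max_A theta1(A,B)
d23 = factor_max(factor_sum(t2, d12), "B")  # delta_2->3(C) = max_B [theta2 + delta_1->2]
d34 = factor_max(factor_sum(t3, d23), "C")  # delta_3->4(D) = max_C [theta3 + delta_2->3]
d43 = factor_max(t4, "E")                   # delta_4->3(D) = max_E theta4(D,E)
d32 = factor_max(factor_sum(t3, d43), "D")  # delta_3->2(C) = max_D [theta3 + delta_4->3]
d21 = factor_max(factor_sum(t2, d32), "C")  # delta_2->1(B) = max_C [theta2 + delta_3->2]

# Belief at clique 2 = theta2 plus all incoming messages; this is its max-marginal over {B,C}.
belief_clique2 = factor_sum(factor_sum(t2, d12), d32)
```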

  9. Convergence of Message Passing (same clique tree and messages as slide 8)
  • Once Ci receives a final message from all neighbors except Cj, then δi→j is also final (will never change)
  • Messages from leaves are immediately final
  • In this chain, the leaf messages δ1→2(B) and δ4→3(D) are therefore final immediately

  10. Simple Example: chain A - B - C (figure).

  11. Simple Example: clique tree with cliques 1: {A,B} and 2: {B,C}, connected by sepset B (figure).

  12. Max-Sum BP at Convergence
  • Each clique belief contains a max-marginal
  • Cliques are necessarily calibrated
  • Calibrated cliques agree on the max-marginals of their sepsets
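A concrete instance of the simple example from slides 10-11 illustrates these claims. The log-factor values below are made up, and the helpers are the factor_sum/factor_max sketches from earlier; the assertion checks that the two clique beliefs agree on the max-marginal over the sepset B:

```python
# Chain A - B - C, cliques 1:{A,B} and 2:{B,C}, sepset B; illustrative log-values.
th1 = (("A", "B"), {(0, 0): 0.0, (0, 1): -1.0, (1, 0): -2.0, (1, 1): 0.5})
th2 = (("B", "C"), {(0, 0): -0.5, (0, 1): 0.0, (1, 0): 1.0, (1, 1): -1.0})

m12 = factor_max(th1, "A")  # delta_1->2(B)
m21 = factor_max(th2, "C")  # delta_2->1(B)

b1 = factor_sum(th1, m21)   # belief at clique 1: max-marginal over {A,B}
b2 = factor_sum(th2, m12)   # belief at clique 2: max-marginal over {B,C}

# Calibration: both beliefs give the same max-marginal over the sepset B.
assert factor_max(b1, "A") == factor_max(b2, "C")
```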

  13. Decoding a MAP Assignment
  • Easy if the MAP assignment is unique
  • There is a single maximizing assignment at each clique
  • Its value is the score of the MAP assignment
  • Due to calibration, the choices at all cliques must agree

  14. Decoding a MAP assignment
  • If the MAP assignment is not unique, we may have multiple choices at some cliques
  • Arbitrary tie-breaking may not produce a MAP assignment

  15. Decoding a MAP assignment
  • If the MAP assignment is not unique, we may have multiple choices at some cliques
  • Arbitrary tie-breaking may not produce a MAP assignment
  • Two options:
    • Slightly perturb the parameters to make the MAP assignment unique
    • Use a traceback procedure that incrementally builds a MAP assignment, one variable at a time (a sketch follows this slide)
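Here is a minimal traceback sketch on the two-clique example above (b1, b2, and DOMAIN as defined there): pick a maximizer at the first clique, then complete the assignment at the second clique consistently with the B value already chosen, rather than breaking ties independently at each clique:

```python
scope1, tab1 = b1                                      # scope ("A", "B")
a_star, b_star = max(tab1, key=tab1.get)               # maximizer of the {A,B} max-marginal

scope2, tab2 = b2                                      # scope ("B", "C")
c_star = max(DOMAIN, key=lambda c: tab2[(b_star, c)])  # best C given the B already decoded

map_assignment = {"A": a_star, "B": b_star, "C": c_star}
print(map_assignment)  # {'A': 1, 'B': 1, 'C': 0} for the illustrative factors above
```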

  16. Summary
  • The same clique tree algorithm used for sum-product can also be used for max-sum
  • The result is a max-marginal at each clique C: for each assignment c to C, the score of the best completion of c
  • Decoding the MAP assignment:
    • If the MAP assignment is unique, the max-marginals allow easy decoding
    • Otherwise, some additional effort is required

  17. END
