
PGM 2003/04 Tirgul 6: Clique/Junction Tree Inference



  1. PGM 2003/04 Tirgul 6: Clique/Junction Tree Inference

  2. Undirected graph representation • At each stage of the procedure, we have an algebraic term that we need to evaluate • In general this term is of the form ∑x ∏i fi(Zi), where the Zi are sets of variables • We now draw a graph where there is an undirected edge X--Y if X,Y are arguments of some factor • that is, if X,Y are in some Zi • Note: this is the Markov network that describes the distribution over the variables we have not yet eliminated
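
As a concrete illustration (not part of the original slides), here is a minimal Python sketch that builds this undirected graph from the factor scopes Zi. The names `markov_edges` and `factor_scopes` are invented for this example; the scopes listed are those of the Asia factors used in the following slides.

```python
from itertools import combinations

def markov_edges(factor_scopes):
    """Add an undirected edge X--Y whenever X and Y appear together
    in the scope Z_i of some factor."""
    edges = set()
    for scope in factor_scopes:
        for x, y in combinations(sorted(scope), 2):
            edges.add((x, y))
    return edges

# Scopes of the initial Asia factors: P(V), P(S), P(T|V), P(L|S),
# P(B|S), P(A|T,L), P(X|A), P(D|A,B)
scopes = [{"V"}, {"S"}, {"V", "T"}, {"S", "L"}, {"S", "B"},
          {"T", "L", "A"}, {"A", "X"}, {"A", "B", "D"}]
print(sorted(markov_edges(scopes)))
```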

  3. Undirected Graph Representation • Consider the “Asia” example • The initial factors are P(V) P(S) P(T|V) P(L|S) P(B|S) P(A|T,L) P(X|A) P(D|A,B) • thus, the undirected graph connects exactly the pairs of variables that appear together in a factor [figure: the resulting undirected graph over V, S, T, L, B, A, X, D] • In this case this graph is just the moralized graph
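
For comparison, a short sketch of moralization itself: connect every pair of parents that share a child (“marry” them), then drop edge directions. The `moralize` helper is illustrative; the parent lists are read off the Asia CPTs shown on slides 47-48.

```python
from itertools import combinations

def moralize(parents):
    """parents: dict mapping each node to its list of parents.
    Returns the edge set of the moralized (undirected) graph."""
    edges = set()
    for child, pa in parents.items():
        for p in pa:                      # keep each parent-child edge, undirected
            edges.add(frozenset((p, child)))
        for p, q in combinations(pa, 2):  # "marry" parents sharing a child
            edges.add(frozenset((p, q)))
    return edges

asia = {"V": [], "S": [], "T": ["V"], "L": ["S"], "B": ["S"],
        "A": ["T", "L"], "X": ["A"], "D": ["A", "B"]}
print(sorted(tuple(sorted(e)) for e in moralize(asia)))
```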

  4. Undirected Graph Representation • Now we eliminate t, getting the new factor fT(V,A,L) = ∑t P(t|V) P(A|t,L) • The corresponding change in the graph: T is removed, and its neighbors V, A and L become pairwise connected [figure: the graph before and after eliminating T]
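
A sketch of a single sum-product elimination step. The (scope, table) factor representation is invented here for illustration; applied to P(T|V) and P(A|T,L) it produces the factor fT over (A, L, V) named above. Only the V=true rows of P(T|V) appear on slide 47, so V's domain is restricted accordingly rather than inventing the missing entries.

```python
from itertools import product

def eliminate(factors, var, domains):
    """One variable-elimination step: multiply all factors whose scope
    mentions `var`, then sum `var` out.  A factor is (scope, table) with
    table mapping assignment tuples (in scope order) to floats."""
    touching = [f for f in factors if var in f[0]]
    rest = [f for f in factors if var not in f[0]]
    scope = sorted(set().union(*(set(f[0]) for f in touching)) - {var})
    table = {}
    for assign in product(*(domains[v] for v in scope)):
        ctx = dict(zip(scope, assign))
        total = 0.0
        for val in domains[var]:
            ctx[var] = val
            term = 1.0
            for f_scope, f_table in touching:   # product of touching factors
                term *= f_table[tuple(ctx[v] for v in f_scope)]
            total += term                       # sum over var
        table[assign] = total
    return rest + [(tuple(scope), table)]

domains = {"V": [True], "T": [False, True], "L": [False, True], "A": [False, True]}
p_t_given_v = (("T", "V"), {(False, True): 0.95, (True, True): 0.05})
# P(A|T,L) is the deterministic OR from slide 48
p_a_given_tl = (("T", "L", "A"),
                {(t, l, a): float(a == (t or l))
                 for t in (False, True) for l in (False, True)
                 for a in (False, True)})
print(eliminate([p_t_given_v, p_a_given_tl], "T", domains)[-1])
```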

  5. Example • Want to compute P(L, V = t, S = f, D = t) • Moralizing

  6. Example • Want to compute P(L, V = t, S = f, D = t) • Moralizing • Setting evidence

  7. Example • Want to compute P(L, V = t, S = f, D = t) • Moralizing • Setting evidence • Eliminating x • New factor fx(A)

  8. Example • Want to compute P(L, V = t, S = f, D = t) • Moralizing • Setting evidence • Eliminating x • Eliminating a • New factor fa(b,t,l)

  9. Example • Want to compute P(L, V = t, S = f, D = t) • Moralizing • Setting evidence • Eliminating x • Eliminating a • Eliminating b • New factor fb(t,l)

  10. Example • Want to compute P(L, V = t, S = f, D = t) • Moralizing • Setting evidence • Eliminating x • Eliminating a • Eliminating b • Eliminating t • New factor ft(l)

  11. Elimination in Undirected Graphs • Generalizing, we see that we can eliminate a variable X by: 1. For all Y, Z such that Y--X and Z--X, add an edge Y--Z 2. Remove X and all edges adjacent to it • This procedure creates a clique that contains all the neighbors of X • After step 1 we have a clique that corresponds to the intermediate factor (before marginalization) • The cost of the step is exponential in the size of this clique (see the sketch below)
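
A minimal sketch of this two-step graph operation, on an adjacency-set representation assumed for illustration:

```python
def eliminate_node(adj, x):
    """Eliminate x from an undirected graph given as {node: set-of-neighbors}.
    Step 1: connect every pair of x's neighbors (fill-in edges).
    Step 2: remove x and all edges adjacent to it.
    Returns the clique created, whose size governs the cost of the step."""
    nbrs = set(adj[x])
    for y in nbrs:
        adj[y] |= nbrs - {y}   # step 1: the neighbors become a clique
    for y in nbrs:
        adj[y].discard(x)      # step 2: detach x ...
    del adj[x]                 # ... and remove it
    return nbrs | {x}
```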

  12. Undirected Graphs • The process of eliminating nodes from an undirected graph gives us a clue to the complexity of inference • To see this, we will examine the graph that contains all of the edges we added during the elimination

  13. Example • Want to compute P(D) • Moralizing

  14. Example • Want to compute P(D) • Moralizing • Eliminating v • Multiply to get f'v(v,t) • Result fv(t)

  15. Example • Want to compute P(D) • Moralizing • Eliminating v • Eliminating x • Multiply to get f'x(a,x) • Result fx(a)

  16. Example • Want to compute P(D) • Moralizing • Eliminating v • Eliminating x • Eliminating s • Multiply to get f's(l,b,s) • Result fs(l,b)

  17. Example • Want to compute P(D) • Moralizing • Eliminating v • Eliminating x • Eliminating s • Eliminating t • Multiply to get f't(a,l,t) • Result ft(a,l)

  18. Example • Want to compute P(D) • Moralizing • Eliminating v • Eliminating x • Eliminating s • Eliminating t • Eliminating l • Multiply to get f'l(a,b,l) • Result fl(a,b)

  19. Example • Want to compute P(D) • Moralizing • Eliminating v • Eliminating x • Eliminating s • Eliminating t • Eliminating l • Eliminating a, b • Multiply to get f'a(a,b,d) • Result f(d)

  20. Expanded Graphs • The resulting graph is the induced graph (for this particular ordering) • Main property: • Every maximal clique in the induced graph corresponds to an intermediate factor in the computation • Every factor stored during the process is a subset of some maximal clique in the graph • These facts are true for any variable elimination ordering on any network

  21. Induced Width • The size of the largest clique in the induced graph is thus an indicator of the complexity of variable elimination • This quantity is called the induced width of a graph for the specified ordering • Finding a good ordering for a graph is equivalent to finding the minimal induced width of the graph (see the sketch below)
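
A sketch that computes this quantity for a given ordering by replaying the elimination. Following the slide's usage, "induced width" here is the size of the largest clique created (some texts subtract one):

```python
def induced_width(adj, ordering):
    """Replay elimination in the given order on {node: neighbor-set};
    return the size of the largest clique created along the way."""
    adj = {v: set(nb) for v, nb in adj.items()}  # work on a copy
    largest = 0
    for x in ordering:
        nbrs = adj[x]
        largest = max(largest, len(nbrs) + 1)    # clique = x plus its neighbors
        for y in nbrs:
            adj[y] |= nbrs - {y}                 # fill-in edges
            adj[y].discard(x)
        del adj[x]
    return largest
```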

  22. Consequence: Elimination on Trees • Suppose we have a tree • A network where each variable has at most one parent • All the factors involve at most two variables • Thus, the moralized graph is also a tree

  23. Elimination on Trees • We can maintain the tree structure by eliminating extreme (leaf) variables of the tree

  24. Elimination on Trees • Formally, for any tree, there is an elimination ordering with induced width = 1 • Thm: Inference on trees is linear in the number of variables (see the sketch below)
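
A sketch of the intuition behind the theorem: eliminating a current leaf never adds fill-in edges, so every step touches a clique of only two nodes. The example tree is an assumed reading of the figure on slide 22.

```python
def leaf_first_ordering(adj):
    """On a tree {node: neighbor-set}, always eliminate a current leaf.
    No fill-in edges are ever added, so elimination is linear in the
    number of variables."""
    adj = {v: set(nb) for v, nb in adj.items()}
    order = []
    while adj:
        x = min(adj, key=lambda v: len(adj[v]))  # a leaf (or the last node)
        for y in adj[x]:
            adj[y].discard(x)
        del adj[x]
        order.append(x)
    return order

tree = {"A": {"B", "C"}, "B": {"A", "D", "E"}, "C": {"A", "F", "G"},
        "D": {"B"}, "E": {"B"}, "F": {"C"}, "G": {"C"}}
print(leaf_first_ordering(tree))
```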

  25. PolyTrees • A polytree is a network where there is at most one path from one variable to another • Thm: Inference in a polytree is linear in the representation size of the network • This assumes a tabular CPT representation • Can you see how the argument would work?

  26. General Networks • What do we do when the network is not a polytree? • If the network has a cycle, the induced width for any ordering is greater than 1

  27. Example • Eliminating A, B, C, D, E, … [figure: the induced graph after each elimination step]

  28. Example • Eliminating H, G, E, C, F, D, B, A [figure: the induced graph after each elimination step]

  29. General Networks • From graph theory: • Thm: Finding an ordering that minimizes the induced width is NP-hard • However: • There are reasonable heuristics for finding “relatively” good orderings (see the min-fill sketch below) • There are provable approximations to the best induced width • If the graph has a small induced width, there are algorithms that find it in polynomial time
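
One such heuristic, sketched on the same adjacency-set representation used above. Min-fill is a standard choice; it is greedy, not optimal.

```python
from itertools import combinations

def min_fill_ordering(adj):
    """Greedy min-fill: repeatedly eliminate the node whose elimination
    requires the fewest fill-in edges among its neighbors."""
    adj = {v: set(nb) for v, nb in adj.items()}
    order = []
    while adj:
        def fill_cost(x):
            return sum(1 for y, z in combinations(adj[x], 2) if z not in adj[y])
        x = min(adj, key=fill_cost)
        for y, z in combinations(adj[x], 2):   # add the fill-in edges
            adj[y].add(z); adj[z].add(y)
        for y in adj[x]:
            adj[y].discard(x)                  # remove x
        del adj[x]
        order.append(x)
    return order
```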

  30. Chordal Graphs • Recall: elimination ordering → undirected chordal graph • Maximal cliques in the graph are factors in the elimination • Factors in the elimination are cliques in the graph • Complexity is exponential in the size of the largest clique in the graph

  31. Cluster Trees • Variable elimination → graph of clusters • Nodes in the graph are annotated by the variables in a factor • Clusters: circles, corresponding to multiplication • Separators: boxes, corresponding to marginalization [figure: cluster tree with clusters T,V / A,L,T / B,L,S / A,L,B / X,A / A,B,D and separators T / A,L / B,L / A,B / A]

  32. Properties of cluster trees • The cluster graph must be a tree • Only one path between any two clusters • A separator is labeled by the intersection of the labels of the two neighboring clusters • Running intersection property: all separators on the path between two clusters contain their intersection (a check is sketched below)
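
The running intersection property is equivalent to requiring that, for each variable, the clusters containing it form a connected subtree. A sketch that checks this; the clusters and edges below are a plausible encoding of the tree drawn on these slides.

```python
def has_running_intersection(clusters, edges):
    """For every variable, the clusters containing it must form a
    connected subtree of the cluster tree."""
    adj = {i: set() for i in range(len(clusters))}
    for i, j in edges:
        adj[i].add(j); adj[j].add(i)
    for v in set().union(*clusters):
        holding = {i for i, c in enumerate(clusters) if v in c}
        start = next(iter(holding))
        seen, frontier = {start}, [start]
        while frontier:                     # BFS restricted to `holding`
            i = frontier.pop()
            for j in adj[i] & holding:
                if j not in seen:
                    seen.add(j); frontier.append(j)
        if seen != holding:
            return False
    return True

clusters = [frozenset("TV"), frozenset("ALT"), frozenset("BLS"),
            frozenset("ABL"), frozenset("AX"), frozenset("ABD")]
edges = [(0, 1), (1, 3), (2, 3), (3, 5), (4, 5)]
print(has_running_intersection(clusters, edges))  # True
```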

  33. Cluster Trees & Chordal Graphs • Combining the two representations we get that: • Every maximal clique in the chordal graph is a cluster in the tree • Every separator in the tree is a separator in the chordal graph

  34. Cluster Trees & Chordal Graphs • Observation: if a cluster is not a maximal clique, then it must be adjacent to one that is a superset of it • We might as well work with a cluster tree where each cluster is a maximal clique

  35. Cluster Trees & Chordal Graphs Thm: • If G is a chordal graph, then it can be embedded in a tree of cliques such that: • Every clique in G is a subset of at least one node in the tree • The tree satisfies the running intersection property

  36. Elimination in Chordal Graphs • A separator S divides the remaining variables in the graph into two groups • Variables in each group appear on one “side” of the cluster tree • Examples: • {A,B}: {L, S, T, V} & {D, X} • {A,L}: {T, V} & {B, D, S, X} • {B,L}: {S} & {A, D, T, V, X} • {A}: {X} & {B, D, L, S, T, V} • {T}: {V} & {A, B, D, L, S, X}

  37. Elimination in Cluster Trees • Let X and Y be the partition induced by S • Observation: eliminating all variables in X results in a factor fX(S) • Proof: since S is a separator, only variables in S are adjacent to variables in X • Note: the same factor would result, regardless of elimination ordering [figure: subtrees x and y joined by separator S, with factors fX(S) and fY(S)]

  38. Recursive Elimination in Cluster Trees • How do we compute fX(S)? • By recursive decomposition along the cluster tree • Let X1 and X2 be the disjoint partitioning of X − C implied by the separators S1 and S2 • Eliminate X1 to get fX1(S1) • Eliminate X2 to get fX2(S2) • Eliminate the variables in C − S to get fX(S) (a sketch of the recursion follows)
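
A structural sketch of this recursion. The factor operations `multiply` and `sum_out` are assumed callables (their concrete table implementation is omitted); `clusters[i]` is the variable set of cluster i and `potential[i]` its own factor. The cache argument anticipates the dynamic programming of the next slides.

```python
def message(i, j, neighbors, clusters, potential, multiply, sum_out, cache=None):
    """Factor f_X(S) sent from cluster i toward its neighbor j, where
    S = clusters[i] & clusters[j].  Recursively combine i's own potential
    with the messages from i's other neighbors, then eliminate C - S."""
    if cache is None:
        cache = {}
    if (i, j) not in cache:
        f = potential[i]
        for k in neighbors[i] - {j}:           # recurse into subtrees behind i
            f = multiply(f, message(k, i, neighbors, clusters, potential,
                                    multiply, sum_out, cache))
        sep = clusters[i] & clusters[j]
        cache[(i, j)] = sum_out(f, clusters[i] - sep)  # eliminate C - S
    return cache[(i, j)]                       # dynamic programming: reuse
```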

  39. Elimination in Cluster Trees (or Belief Propagation revisited) • Assume we have a cluster tree • Separators: S1,…,Sk • Each Si determines two sets of variables Xi and Yi, s.t. • Si ∪ Xi ∪ Yi = {X1,…,Xn} • All paths from clusters containing variables in Xi to clusters containing variables in Yi pass through Si • We want to compute fXi(Si) and fYi(Si) for all i

  40. Elimination in Cluster Trees Idea: • Each of these factors can be decomposed as an expression involving some of the others • Use dynamic programming to avoid recomputation of factors

  41. Example [figure: the cluster tree with clusters T,V / A,L,T / B,L,S / A,L,B / X,A / A,B,D and its separators]

  42. Dynamic Programming • We now have the tools to solve the multi-query problem • Step 1: Inward propagation • Pick a cluster C • Compute all factors, eliminating from the fringes of the tree toward C • This computes all “inward” factors associated with separators

  43. Dynamic Programming • We now have the tools to solve the multi-query problem • Step 1: Inward propagation • Step 2: Outward propagation • Compute all factors on separators going outward from C to the fringes

  44. Dynamic Programming • We now have the tools to solve the multi-query problem • Step 1: Inward propagation • Step 2: Outward propagation • Step 3: Computing beliefs on clusters • To get the belief on a cluster C', multiply: • the CPDs that involve only variables in C' • the factors on the separators adjacent to C', using the proper direction • This simulates the result of eliminating all variables except those in C', using pre-computed factors (a sketch of all three steps follows)
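
A sketch of the three steps, reusing the `message` function from slide 38 (with `multiply` and `sum_out` still assumed callables). With a shared cache, every directed message per separator is computed exactly once; the recursion itself orders the work as the inward and outward passes.

```python
def calibrate(neighbors, clusters, potential, multiply, sum_out):
    """Compute both directed messages per separator (steps 1-2), then the
    belief on every cluster as its own potential times all incoming
    messages (step 3)."""
    cache = {}
    for i in neighbors:
        for j in neighbors[i]:     # each (i, j) message computed only once
            message(i, j, neighbors, clusters, potential,
                    multiply, sum_out, cache)
    beliefs = {}
    for i in neighbors:            # step 3: belief on each cluster
        b = potential[i]
        for j in neighbors[i]:
            b = multiply(b, cache[(j, i)])
        beliefs[i] = b
    return beliefs
```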

  45. Complexity • Time complexity: • Each traversal of the tree costs the same as standard variable elimination • The total computation cost is twice that of standard variable elimination • Space complexity: • Need to store partial results • Requires two factors for each separator • Space requirements can be up to 2n times those of variable elimination

  46. The “Asia” network with evidence • We want to compute P(L | D=t, V=t, S=f) [figure: the Asia network over Visit to Asia, Smoking, Tuberculosis, Lung Cancer, Bronchitis, Abnormality in Chest, X-Ray, Dyspnea]

  47. Initial factors with evidence • We want to compute P(L | D=t, V=t, S=f)

  P(T|V), with VisitToAsia = true:
    Tuberculosis = false: 0.95
    Tuberculosis = true: 0.05

  P(B|S), with Smoking = false:
    Bronchitis = false: 0.7
    Bronchitis = true: 0.3

  P(L|S), with Smoking = false:
    LungCancer = false: 0.99
    LungCancer = true: 0.01

  P(D|B,A), with Dyspnea = true:
    Bronchitis = false, AbnormalityInChest = false: 0.1
    Bronchitis = true, AbnormalityInChest = false: 0.8
    Bronchitis = false, AbnormalityInChest = true: 0.7
    Bronchitis = true, AbnormalityInChest = true: 0.9

  48. Initial factors with evidence (cont.)

  P(A|L,T) (AbnormalityInChest is the deterministic OR of Tuberculosis and LungCancer):
    Tuberculosis = false, LungCancer = false: AbnormalityInChest = false: 1, = true: 0
    Tuberculosis = true, LungCancer = false: AbnormalityInChest = false: 0, = true: 1
    Tuberculosis = false, LungCancer = true: AbnormalityInChest = false: 0, = true: 1
    Tuberculosis = true, LungCancer = true: AbnormalityInChest = false: 0, = true: 1

  P(X|A):
    AbnormalityInChest = false: X-Ray = false: 0.95, X-Ray = true: 0.05
    AbnormalityInChest = true: X-Ray = false: 0.02, X-Ray = true: 0.98
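
A sketch showing how such evidence-reduced factors can be produced mechanically from CPT tables, using the illustrative (scope, table) representation from earlier. Only the rows shown on slide 47 are included in the example.

```python
def reduce_factor(scope, table, evidence):
    """Keep only rows consistent with the evidence and drop the observed
    variables from the scope."""
    keep = [i for i, v in enumerate(scope) if v not in evidence]
    new_scope = tuple(scope[i] for i in keep)
    new_table = {
        tuple(assign[i] for i in keep): val
        for assign, val in table.items()
        if all(assign[i] == evidence[v]
               for i, v in enumerate(scope) if v in evidence)
    }
    return new_scope, new_table

# P(T|V) reduced by the evidence V=true, matching slide 47
scope = ("T", "V")
table = {(False, True): 0.95, (True, True): 0.05}  # rows shown on the slide
print(reduce_factor(scope, table, {"V": True}))
# -> (('T',), {(False,): 0.95, (True,): 0.05})
```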

  49. Step 1: Initial clique values • CT = P(T|V) (cluster T,V) • CT,L,A = P(A|L,T) (cluster T,L,A) • CB,L = P(L|S) P(B|S) (cluster B,L,S) • CX,A = P(X|A) (cluster X,A) • CB,L,A = 1 (cluster B,L,A) • CB,A = 1 (cluster D,B,A) • “Dummy” separators: these are the intersections between adjacent nodes in the junction tree and help in defining the inference messages (see below)

  50. Step 2: Update from leaves • ST = CT (from cluster T,V) • SB,L = CB,L (from cluster B,L,S) • SA = CX,A (from cluster X,A)
