
236372 - Bayesian Networks

Clique tree algorithm. Presented by Sergey Vichik.


Presentation Transcript


  1. 236372 - Bayesian Networks. Clique tree algorithm. Presented by Sergey Vichik.

  2. Algorithm sequence
  [Pipeline figure: Bayesian network → (moralization) → Markov graph → (add edges) → Chordal graph → (find cliques) → Cliques graph → (create clique tree) → Clique tree → (enter evidence) → Tree with evidence → (inference)]
  • Translate the BN to a Markov graph (moralization)
  • Add edges to create a chordal graph
  • Find the cliques
  • Construct the clique tree
  • Enter evidence
  • Calculate a posteriori probabilities (inference)

  3. BN to Markov graph (moralization)
  • For every node, add an edge between each pair of its parents, then drop the edge directions; a sketch follows below.
  [Figure: the example BN over A, B, C, D, E, F, G and its moralized Markov graph]
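A minimal sketch of the moralization step, assuming the BN is given as a dictionary mapping each node to the list of its parents (the parent sets below are illustrative, not taken from the slide):

```python
from itertools import combinations

def moralize(parents):
    """Turn a BN into its Markov (moral) graph: keep every parent-child
    edge as an undirected edge and 'marry' every pair of co-parents."""
    nodes = set(parents)
    for ps in parents.values():
        nodes.update(ps)
    adj = {v: set() for v in nodes}
    for child, ps in parents.items():
        for p in ps:                       # undirected parent-child edges
            adj[child].add(p)
            adj[p].add(child)
        for u, v in combinations(ps, 2):   # edges between co-parents
            adj[u].add(v)
            adj[v].add(u)
    return adj

# Hypothetical parent sets over the slide's node names A..G.
bn = {"C": ["A"], "G": ["A"], "D": ["A", "B"], "E": ["C", "D"], "F": ["E"]}
markov = moralize(bn)   # e.g. A and B become neighbours (parents of D)
```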

  4. Create a chordal graph
  • Add edges to form a chordal graph (every cycle of length four or more must have a chord); a sketch follows below.
  [Figure: the example Markov graph over A, B, C, D, E, F, G with the added chord edges]
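One common way to obtain a chordal graph is to fix an elimination order and, for each eliminated node, connect all of its remaining neighbours (fill-in edges). A sketch, assuming the adjacency-dictionary representation used above; choosing a good order is a separate problem and is left to the caller:

```python
def triangulate(adj, order):
    """Return a chordal supergraph of `adj` ({node: set(neighbours)}) by
    eliminating nodes in `order` and adding fill-in edges between the
    remaining neighbours of each eliminated node."""
    work = {v: set(nbrs) for v, nbrs in adj.items()}     # graph being eliminated
    chordal = {v: set(nbrs) for v, nbrs in adj.items()}  # result: original + fill-in
    for v in order:
        nbrs = list(work[v])
        for i, u in enumerate(nbrs):
            for w in nbrs[i + 1:]:
                work[u].add(w); work[w].add(u)           # fill-in edge
                chordal[u].add(w); chordal[w].add(u)
        for u in nbrs:                                   # remove v from the graph
            work[u].discard(v)
        del work[v]
    return chordal

# Example call, using some fixed (not necessarily optimal) order:
# chordal = triangulate(markov, order=["F", "E", "G", "B", "D", "C", "A"])
```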

  5. Cliques graph
  • Find all cliques and connect them to form a clique graph.
  • Two cliques are connected if they share a variable; the edge weight is the number of shared variables.
  [Figure: cliques AGC, ABD, ADC, CDE, EF of the example chordal graph; e.g. the ADC-AGC edge has weight 2 (shares A, C) and the CDE-EF edge has weight 1 (shares E)]
  A sketch of building the weighted clique graph follows below.
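A sketch of building the weighted clique graph: every pair of cliques with a non-empty intersection gets an edge whose weight is the number of shared variables (the separator size). The cliques themselves are simply written down from the slide's example:

```python
from itertools import combinations

def clique_graph(cliques):
    """Return weighted edges (weight, i, j) between every pair of cliques
    that share at least one variable; weight = size of the intersection."""
    edges = []
    for (i, ci), (j, cj) in combinations(enumerate(cliques), 2):
        shared = ci & cj
        if shared:
            edges.append((len(shared), i, j))
    return edges

# The cliques of the example chordal graph.
cliques = [{"A", "G", "C"}, {"A", "B", "D"}, {"A", "D", "C"},
           {"C", "D", "E"}, {"E", "F"}]
edges = clique_graph(cliques)   # e.g. (2, 0, 2) for AGC-ADC, (1, 3, 4) for CDE-EF
```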

  6. Running intersection property
  • Many trees may be embedded in the cliques graph. Which tree should we choose?
  • Let's try the chain {ADC}-{AGC}-{ABD}-{CDE}-{EF}.
  • Following this order of cliques, perform an elimination: eliminate F, then eliminate E.
  • To continue we now need to eliminate either C or D, but eliminating either of them creates an extra edge (CB or GD), enlarging the probability tables.
  • The required property: if a variable x is contained in cliques Y and Z, it must be contained in every clique on the path from Y to Z. A sketch of checking this property follows below.
  [Figure: the chain of cliques with its edge weights, and the example graph over A, B, C, D, E, F, G]
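A small sketch of checking the running intersection property on a candidate tree: for every variable, the cliques containing it must form a connected subtree. The chain from the slide fails the check (for example, C appears in ADC and CDE, but not in ABD, which lies between them on the chain):

```python
from collections import deque

def satisfies_rip(cliques, tree_edges):
    """Check the running intersection property: for every variable x, the
    cliques that contain x must induce a connected subtree.
    `cliques` is a list of sets, `tree_edges` a list of (i, j) index pairs
    of a spanning tree over the cliques."""
    adj = {i: set() for i in range(len(cliques))}
    for i, j in tree_edges:
        adj[i].add(j)
        adj[j].add(i)
    for x in set().union(*cliques):
        holders = {i for i, c in enumerate(cliques) if x in c}
        start = next(iter(holders))
        seen, queue = {start}, deque([start])
        while queue:                       # BFS restricted to cliques holding x
            u = queue.popleft()
            for v in adj[u] & holders:
                if v not in seen:
                    seen.add(v)
                    queue.append(v)
        if seen != holders:                # not connected -> RIP violated
            return False
    return True

# The chain {ADC}-{AGC}-{ABD}-{CDE}-{EF} from the slide violates the property.
cliques = [{"A", "D", "C"}, {"A", "G", "C"}, {"A", "B", "D"},
           {"C", "D", "E"}, {"E", "F"}]
chain = [(0, 1), (1, 2), (2, 3), (3, 4)]
assert not satisfies_rip(cliques, chain)
```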

  7. Clique tree construction
  • A maximal (maximum-weight) spanning tree of the clique graph is the required clique tree.
  • Proof: at the end of the lecture.
  [Figure: the clique graph with the spanning tree edges ADC-AGC (2), ADC-ABD (2), ADC-CDE (2) and CDE-EF (1) selected]
  A maximum spanning tree sketch follows below.
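A sketch of the spanning-tree step using Kruskal's algorithm with the heaviest edges first (a union-find over clique indices); the edge format matches the clique_graph sketch above:

```python
def maximum_spanning_tree(n_cliques, weighted_edges):
    """Kruskal's algorithm, taking edges (weight, i, j) in order of
    decreasing weight; with separator-size weights the resulting maximal
    spanning tree is a valid clique tree."""
    parent = list(range(n_cliques))

    def find(x):                       # union-find with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    tree = []
    for w, i, j in sorted(weighted_edges, reverse=True):
        ri, rj = find(i), find(j)
        if ri != rj:                   # edge joins two components: keep it
            parent[ri] = rj
            tree.append((i, j))
    return tree

# Example call, reusing the earlier snippets:
# tree = maximum_spanning_tree(len(cliques), clique_graph(cliques))
```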

  8. Enter evidence
  • Entering evidence is equivalent to removing the evidence variables and recalculating the affected probability functions.
  • General approach: build the clique tree without the evidence, and then recalculate the affected cliques. In practice this means reducing the tables to the observed values:
  • f(E, F) -> f(E, F = f).
  A reduction sketch follows below.
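A sketch of the table reduction f(E, F) -> f(E, F = f), assuming a factor is stored as a scope (tuple of variable names) plus a table mapping full assignments to values:

```python
def reduce_factor(scope, table, evidence):
    """Keep only the rows of `table` that agree with the evidence.
    `scope` is a tuple of variable names, `table` maps assignment tuples
    (aligned with `scope`) to probabilities, `evidence` maps observed
    variables to their observed values."""
    positions = {v: scope.index(v) for v in evidence if v in scope}
    return {a: p for a, p in table.items()
            if all(a[idx] == evidence[v] for v, idx in positions.items())}

# Example: observe F = "f1" in the factor f(E, F).
scope = ("E", "F")
f_EF = {("e0", "f0"): 0.1, ("e0", "f1"): 0.9,
        ("e1", "f0"): 0.7, ("e1", "f1"): 0.3}
f_EF_reduced = reduce_factor(scope, f_EF, {"F": "f1"})
# -> {("e0", "f1"): 0.9, ("e1", "f1"): 0.3}
```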

  9. Efficient calculation of probabilities
  • mij is the message sent from clique i to clique j.
  • Clique i sends mij only after receiving the messages from all of its neighbours excluding j (leaves can send immediately).
  • Store the messages on the edges for efficient updates.
  [Figure: the clique tree with ADC in the centre, connected to AGC, ABD and CDE over the separators AC, AD and CD, and CDE connected to EF over the separator E]
  A message-computation sketch follows below.
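A sketch of computing a single message mij. Factors are represented as (scope, table) pairs as in the previous sketch; the incoming messages (from all neighbours of i except j) are assumed to have been computed already:

```python
from itertools import product

def multiply(f1, f2):
    """Pointwise product of two factors given as (scope, table)."""
    s1, t1 = f1
    s2, t2 = f2
    scope = tuple(dict.fromkeys(s1 + s2))            # union of the two scopes
    domains = {}
    for s, t in ((s1, t1), (s2, t2)):                # collect variable domains
        for assignment in t:
            for var, val in zip(s, assignment):
                domains.setdefault(var, set()).add(val)
    table = {}
    for values in product(*(sorted(domains[v]) for v in scope)):
        row = dict(zip(scope, values))
        k1 = tuple(row[v] for v in s1)
        k2 = tuple(row[v] for v in s2)
        if k1 in t1 and k2 in t2:
            table[values] = t1[k1] * t2[k2]
    return scope, table

def marginalize(factor, keep):
    """Sum the factor down to the variables in `keep`."""
    scope, table = factor
    new_scope = tuple(v for v in scope if v in keep)
    out = {}
    for assignment, p in table.items():
        key = tuple(val for var, val in zip(scope, assignment) if var in keep)
        out[key] = out.get(key, 0.0) + p
    return new_scope, out

def message(clique_factor, incoming_messages, separator):
    """mij: the clique's own factor times all incoming messages except the
    one from j, marginalized onto the separator S_ij."""
    f = clique_factor
    for m in incoming_messages:
        f = multiply(f, m)
    return marginalize(f, set(separator))
```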

  10. Calculation of a marginal probability
  • Select a clique that contains the variable.
  • Multiply the clique's factor by all incoming messages.
  • Marginalize to the required variable.
  • The joint distribution factors over the cliques: P = f(ADC) f(AGC) f(ABD) f(CDE) f(EF).
  [Figure: the same clique tree with separators AC, AD, CD, E]
  A marginal-query sketch follows below.
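A sketch of the marginal query, assuming the incoming messages have already been extended to the selected clique's scope (i.e. they are tables keyed by full clique assignments); this assumption keeps the example compact:

```python
def marginal(clique_scope, clique_table, incoming_messages, query_var):
    """Multiply the clique's table by all incoming messages, sum out every
    variable except `query_var`, and normalize.
    `incoming_messages` are dicts keyed by the same assignment tuples as
    `clique_table` (an assumption made to keep the sketch short)."""
    idx = clique_scope.index(query_var)
    out = {}
    for assignment, p in clique_table.items():
        for msg in incoming_messages:
            p *= msg[assignment]
        out[assignment[idx]] = out.get(assignment[idx], 0.0) + p
    total = sum(out.values())
    return {value: p / total for value, p in out.items()}
```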

  11. Maximum spanning tree is a clique tree: proof (1/3)
  1) From graph theory, the cycle property: for any cycle C in the graph, if the weight of an edge e of C is strictly smaller than the weights of all other edges of C, then e cannot belong to a maximum spanning tree (MST).
  2) For any chordal graph, a clique tree exists.
  • Every chordal graph has a perfect elimination order.

  12. Proof (2/3)
  Cut set: the set of all edges connecting two partitions of the graph.
  • Take a clique tree (CT) that shares as many edges as possible with the MST but still differs from it.
  • For the purpose of contradiction, assume that the edge (K1,K2) of the MST is not an edge of the CT.
  • Take the cut set associated with (K1,K2): removing (K1,K2) splits the MST into two parts, and the cut set consists of all edges of the full clique graph connecting the two parts.
  • Since the CT is a spanning tree, it must cross this cut with some edge (K3,K4) different from (K1,K2); (K3,K4) is in the CT but not in the MST.
  [Figure: the cut separating K1, K3 from K2, K4, with (K1,K2) in the MST and (K3,K4) in the CT but not in the MST]

  13. Proof (3/3)
  • From the properties of a clique tree: K1∩K2 ⊆ K3∩K4 (the path in the CT from K1 to K2 crosses the cut through (K3,K4), and the running intersection property forces every variable of K1∩K2 into K3 and K4).
  • If K1∩K2 ⊊ K3∩K4, then (K3,K4) is heavier than (K1,K2), which contradicts the fact that the MST is maximal.
  • Therefore K1∩K2 = K3∩K4.
  • We can replace (K3,K4) with (K1,K2) in the CT and still remain with a clique tree:
  • Take any K5 and K6 that belong to different sides of the cut. From the properties of the CT, K5∩K6 ⊆ K3∩K4, and thus K5∩K6 ⊆ K1∩K2.
  • Therefore the clique (running) intersection property still holds.
  • This contradicts the assumption that the CT sharing the most edges with the MST is still different from it, and it gives an algorithm for making them equal.
  In summary: CT => K1∩K2 ⊆ K3∩K4; MST maximality => |K1∩K2| ≥ |K3∩K4|; together => K1∩K2 = K3∩K4. ∎
  [Figure: the same cut, with (K1,K2) in the MST and (K3,K4) in the CT but not in the MST]
