
Chap. 4 Decision Graphs


Presentation Transcript


  1. Chap. 4 Decision Graphs. Statistical Genetics Forum. Bayesian Networks and Decision Graphs, Finn V. Jensen. Presented by Ken Chen, Genome Sequencing Center.

  2. [Figure: small network with nodes T, Sleepy, Fever, Flu, A] • Using the probabilities provided by the network to support decision-making • Test decisions: look for more evidence • Action decisions

  3. One action. Example: the poker game, with decision options Call and Fold. [Figure: model with chance nodes FC, SC, OH0, OH1, OH2, MH, BH, decision node D, and utility node U]

  4. One action in general. [Figure: decision node D attached to the network] Goal: find the decision D = d that maximizes EU(D | e).
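
A minimal sketch of this maximization, assuming a single chance variable X and two hypothetical decision options; all names and numbers below are invented for illustration, not taken from the book:

```python
# Choose a single action by maximum expected utility,
#   EU(d | e) = sum_x U(x, d) * P(x | d, e).
# Decision options, states, and all numbers are hypothetical.

# P(X | D = d, e): a distribution over the states of X for each option d.
p_x_given_d = {
    "action1": {"x1": 0.2, "x2": 0.8},
    "action2": {"x1": 0.6, "x2": 0.4},
}

# U(X, D): utility of each (state, option) combination.
utility = {
    ("x1", "action1"): 10.0, ("x2", "action1"): -2.0,
    ("x1", "action2"): 6.0,  ("x2", "action2"): 1.0,
}

def expected_utility(d):
    """EU(d | e) = sum over states x of U(x, d) * P(x | d, e)."""
    return sum(utility[(x, d)] * p for x, p in p_x_given_d[d].items())

for d in p_x_given_d:
    print(d, expected_utility(d))
print("best:", max(p_x_given_d, key=expected_utility))
```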

  5. 4.2 Utilities. [Figure: decision node Action with chance nodes GA and DSS and utility nodes UGA and UDSS] Example: management of effort. • Decision options: • Gd: keep pace in GA, follow DSS superficially • SB: slow down in both courses • Dg: keep pace in DSS, follow GA superficially • Game 1: maximize the sum of the expected marks • In general: maximize the sum of the expected utilities
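
A small sketch of "maximize the sum of the expected utilities" with two utility nodes, echoing the UGA/UDSS structure above; the mark distributions are invented for illustration only:

```python
# P(mark | decision) for each course; marks act as the states of GA and DSS.
p_ga = {
    "Gd": {0: 0.1, 5: 0.2, 10: 0.7},   # keep pace in GA
    "SB": {0: 0.2, 5: 0.4, 10: 0.4},
    "Dg": {0: 0.4, 5: 0.4, 10: 0.2},   # follow GA superficially
}
p_dss = {
    "Gd": {0: 0.3, 5: 0.5, 10: 0.2},
    "SB": {0: 0.2, 5: 0.4, 10: 0.4},
    "Dg": {0: 0.1, 5: 0.2, 10: 0.7},
}

def expected_mark(dist):
    return sum(mark * p for mark, p in dist.items())

# EU(decision) = E[UGA | decision] + E[UDSS | decision]
def eu(decision):
    return expected_mark(p_ga[decision]) + expected_mark(p_dss[decision])

for d in ("Gd", "SB", "Dg"):
    print(d, eu(d))
print("best:", max(("Gd", "SB", "Dg"), key=eu))
```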

  6. 4.3 Value of information. [Figure: nodes T, H, A, U]

  7. Non-utility value functions • When there is no proper model for actions and utilities, the reason for testing is to decrease the uncertainty of the hypothesis
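
A sketch of one common non-utility value function, the (negative) entropy of P(H | e): a test is scored by the expected entropy reduction over its outcomes. All distributions below are hypothetical:

```python
import math

def entropy(dist):
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Current belief in the hypothesis H.
p_h = {"flu": 0.3, "no flu": 0.7}

# Predictive distribution of the test and the posteriors each outcome would give.
p_t = {"pos": 0.35, "neg": 0.65}
p_h_given_t = {
    "pos": {"flu": 0.75, "no flu": 0.25},
    "neg": {"flu": 0.06, "no flu": 0.94},
}

expected_posterior_entropy = sum(p_t[t] * entropy(p_h_given_t[t]) for t in p_t)
value_of_test = entropy(p_h) - expected_posterior_entropy
print(f"expected entropy reduction from the test: {value_of_test:.3f} bits")
```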

  8. Nonmyopic data request. [Figure: decision tree over Test1 (T1) and Test2 (T2) for the infected-milk problem, with states clean/infected of Inf, actions pour/discard, and expected utilities such as 99.94, 99.74, 97.74, -0.06, and -0.26 at the branches]

  9. Decision Tree. [Figure: a decision node D with options action1, …, actionn, each leading to a chance node X with states x1, …, xn and probabilities P(X = xi | o), and utility nodes U at the leaves] • Non-leaf nodes are decision nodes or chance nodes, and the leaves are utility nodes • Complete: from a chance node there must be a link for each possible state, and from a decision node there must be a link for each possible decision option
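
A minimal sketch of such a tree as a data structure, with a completeness check; it assumes a single decision variable and a single chance variable, and all names and numbers are hypothetical:

```python
# A node is a dict:
#   {"type": "decision", "options": {option: subtree, ...}}
#   {"type": "chance",   "branches": {state: (probability, subtree), ...}}
# and a leaf is simply a number (a utility).

tree = {
    "type": "decision",
    "options": {
        "action1": {"type": "chance",
                    "branches": {"x1": (0.2, 10.0), "x2": (0.8, -2.0)}},
        "action2": {"type": "chance",
                    "branches": {"x1": (0.2, 4.0), "x2": (0.8, 4.0)}},
    },
}

def is_complete(node, decision_options, chance_states):
    """Completeness: every decision node has a link per option and every
    chance node has a link per state (with probabilities summing to 1)."""
    if not isinstance(node, dict):          # utility leaf
        return True
    if node["type"] == "decision":
        return (set(node["options"]) == set(decision_options) and
                all(is_complete(sub, decision_options, chance_states)
                    for sub in node["options"].values()))
    probs = [p for p, _ in node["branches"].values()]
    return (set(node["branches"]) == set(chance_states) and
            abs(sum(probs) - 1.0) < 1e-9 and
            all(is_complete(sub, decision_options, chance_states)
                for _, sub in node["branches"].values()))

print(is_complete(tree, {"action1", "action2"}, {"x1", "x2"}))  # True
```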

  10. A car start problem • Possible faults: • Spark Plug (SP), prob = 0.3 • Ignition System (IS), prob = 0.2 • Others, prob = 0.5 • Actions: • SP: fixes SP, 4 min • IS: fixes IS with prob = 0.5, 2 min • T: test, OK iff IS is OK, 0.5 min • RS: fixes everything, 15 min • Goal: • Have the car fixed as soon as possible

  11. [Figure: decision tree for the car start problem over the fault, the test T, and the repair actions; branch values include 10.5, 12.5, 14.5, 25.5, 26, 27.5, and 28 minutes] P(SP fix | T=OK) = P(SP | T=OK) = P(SP | !IS) = P(SP) / (P(SP) + P(others)) = 0.3 / 0.8 = 0.38. P(IS fix) = P(IS) · P(fix | IS) = 0.2 · 0.5 = 0.1.
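
The two probability computations above, reproduced as a short script using the fault priors from slide 10:

```python
p_fault = {"SP": 0.3, "IS": 0.2, "Other": 0.5}

# The test T is OK iff the ignition system is OK, so observing T = OK
# rules out IS and we renormalise over the remaining faults.
p_sp_given_t_ok = p_fault["SP"] / (p_fault["SP"] + p_fault["Other"])
print(f"P(SP | T=OK) = {p_sp_given_t_ok:.2f}")          # 0.38

# The IS action repairs the car only if IS is the fault (prob 0.2)
# and the repair then succeeds (prob 0.5).
p_is_fix = p_fault["IS"] * 0.5
print(f"P(IS action fixes the car) = {p_is_fix:.2f}")   # 0.10
```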

  12. Solving Decision Trees. [Figure: the same decision tree with expected repair times propagated from the leaves toward the root; values such as 15.43, 16.27, and 16.96 appear at internal nodes]
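
A sketch of the standard average-out-and-fold-back evaluation: chance nodes average over their children, decision nodes take the best child (here the minimum, since the values are repair times). The tiny tree below is hypothetical and is not the full car start tree from the slides:

```python
def roll_back(node):
    """Return the optimal expected cost of a (sub)tree."""
    if not isinstance(node, dict):                    # leaf: a cost in minutes
        return node
    if node["type"] == "decision":
        return min(roll_back(child) for child in node["options"].values())
    # chance node: probability-weighted average of the children
    return sum(p * roll_back(child) for p, child in node["branches"].values())

tree = {
    "type": "decision",
    "options": {
        "RS": 15.0,                                   # a repair that always works
        "SP then RS": {"type": "chance",
                       "branches": {"fixed": (0.3, 4.0),
                                    "not fixed": (0.7, 19.0)}},
    },
}

print(roll_back(tree))   # 14.5: trying SP first beats going straight to RS
```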

  13. Coalesced decision trees • Decision trees grow exponentially with the number of decisions and chance variables • When a decision tree contains identical subtrees, they can be collapsed
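
A sketch of what coalescing amounts to computationally: if a subtree's value depends only on the current state (here, simply the set of remaining actions), identical subtrees are shared by memoizing on that state instead of expanding them again. Costs and probabilities are hypothetical, and repairs are treated as independent for simplicity:

```python
from functools import lru_cache

COST = {"A1": 1.0, "A2": 2.0, "A3": 4.0}     # hypothetical action costs
P_FIX = {"A1": 0.2, "A2": 0.4, "A3": 0.9}    # hypothetical repair probabilities,
                                             # treated as independent here

@lru_cache(maxsize=None)
def best_expected_cost(remaining):
    """Minimal expected cost to fix the device given the remaining actions.
    Memoising on the frozenset of remaining actions is what makes
    identical subtrees coalesce instead of being expanded repeatedly."""
    if not remaining:
        return 15.0                          # fall back to a full repair
    return min(
        COST[a] + (1 - P_FIX[a]) * best_expected_cost(remaining - {a})
        for a in remaining
    )

print(best_expected_cost(frozenset(COST)))
```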

  14. 4.5 Decision-Theoretic Troubleshooting • A fault causing a device to malfunction is identified and eliminated through a sequence of troubleshooting steps • A troubleshooting problem can be represented and solved as a decision tree (over actions and questions) • As decision trees risk becoming intractably large, we look for ways of pruning them

  15. Action sequences • Each action Ai has outcomes Ai = yes (the problem is fixed) and Ai = no • Cost of action Ai: Ci(ε), where ε denotes the evidence collected so far • Action sequence s = <A1, …, An>: repeatedly perform the next action until the problem is fixed or the last action has been performed • Expected cost of repair, ECR(s)
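
A sketch of the ECR computation, ECR(s) = Σi Ci · P(A1 = no, …, Ai-1 = no): each action's cost is paid only if everything before it failed. It is applied here to the car start numbers (SP, IS, RS) under a simplified single-fault reading in which the success events are mutually exclusive; this reading ignores the test T and will not reproduce the exact figures on slide 12:

```python
def ecr(actions):
    """actions: list of (cost, p_fix) pairs in the order they are performed.
    p_fix is the unconditional probability that the action fixes the device;
    successes are assumed mutually exclusive (single-fault setting)."""
    total, p_still_broken = 0.0, 1.0
    for cost, p_fix in actions:
        total += cost * p_still_broken      # pay the cost only if still broken
        p_still_broken -= p_fix             # exclusive successes
    return total

# SP: 4 min, fixes with 0.3; IS: 2 min, fixes with 0.2*0.5 = 0.1;
# RS: 15 min, always works, so it fixes with the remaining 0.6.
sequence = [(4.0, 0.3), (2.0, 0.1), (15.0, 0.6)]
print(ecr(sequence))                        # 14.4 minutes
```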

  16. Local optimality of the optimal sequence (dynamic programming) • Consider two neighboring actions Ai and Ai+1: in an optimal sequence, exchanging them cannot decrease the ECR, which gives a purely local condition on their costs and repair probabilities • The pruned tree has eight non-RS links, compared to 32 in a coalesced decision tree for the same problem
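
A numeric illustration of the swap argument in the single-fault setting: exchanging two adjacent actions changes the ECR only through the local term Ci·q(i+1) - C(i+1)·qi, which is what forces the ordering by efficiency qi / Ci. Numbers are hypothetical:

```python
def ecr(actions):
    total, p_broken = 0.0, 1.0
    for cost, q in actions:                  # q: unconditional P(action fixes)
        total += cost * p_broken
        p_broken -= q
    return total

seq = [(4.0, 0.3), (2.0, 0.1), (15.0, 0.6)]
i = 0                                        # swap positions i and i+1
swapped = seq[:i] + [seq[i + 1], seq[i]] + seq[i + 2:]

ci, qi = seq[i]
cj, qj = seq[i + 1]
print(ecr(seq) - ecr(swapped))               # direct difference in ECR
print(ci * qj - cj * qi)                     # local formula, equal up to rounding
```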

  17. The greedy approach. [Figure: faults F1, F2, F3, F4 with associated actions A1, A2, A3] • Always choose the action with the highest efficiency. Not necessarily optimal! • Proposition 4.2: conditions under which the greedy approach is optimal: • n faults F1, …, Fn and n actions A1, …, An • Exactly one of the faults is present • Each action has a specific probability of repairing its own fault: pi = P(Ai = yes | Fi), and P(Ai = yes | Fj) = 0 if i ≠ j • The cost Ci of an action does not depend on the performance of previous actions • Theorem 4.2: let s be an action sequence fulfilling the conditions in Proposition 4.2 and ordered according to decreasing initial efficiencies. Then s is an optimal action sequence.
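
A sketch of the greedy ordering under the conditions of Proposition 4.2: sort actions by initial efficiency P(Fi)·pi / Ci and accumulate the ECR. The fault priors, repair probabilities, and costs below are hypothetical:

```python
actions = {                     # name: (P(Fi), pi = P(Ai=yes | Fi), Ci)
    "A1": (0.30, 1.0, 4.0),
    "A2": (0.20, 0.5, 2.0),
    "A3": (0.50, 1.0, 15.0),
}

def efficiency(name):
    p_fault, p_repair, cost = actions[name]
    return p_fault * p_repair / cost

order = sorted(actions, key=efficiency, reverse=True)

ecr, p_broken = 0.0, 1.0
for name in order:
    p_fault, p_repair, cost = actions[name]
    ecr += cost * p_broken             # pay the cost only if still broken
    p_broken -= p_fault * p_repair     # exclusive successes (one fault present)
print(order, ecr)
```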

  18. Influence Diagram • A compact representation of a decision tree • Now seen more as a decision tool extending Bayesian networks • Syntax: • There is a directed path comprising all decision nodes • The utility nodes have no children • The decision nodes and the chance nodes have a finite set of mutually exclusive states • The utility nodes have no states • To each chance node A is attached a conditional probability table P(A | pa(A)) • To each utility node U is attached a real-valued function over pa(U)
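
A sketch of checking the two structural constraints (utility nodes have no children; there is a directed path comprising all decision nodes) on a small hypothetical diagram given as an adjacency dict:

```python
edges = {                        # node -> children
    "FC": ["OH0"], "OH0": ["OH1"], "D1": ["OH1"],
    "OH1": ["OH2", "D2"], "D2": ["U"], "OH2": ["U"], "U": [],
}
decisions, utilities = ["D1", "D2"], ["U"]

def reachable(src, dst):
    stack, seen = [src], set()
    while stack:
        n = stack.pop()
        if n == dst:
            return True
        if n not in seen:
            seen.add(n)
            stack.extend(edges.get(n, []))
    return False

ok_utilities = all(not edges.get(u) for u in utilities)

# Decisions must be temporally ordered: sort them by how many decisions each
# can reach, then check that each one reaches the next in that order.
order = sorted(decisions, reverse=True,
               key=lambda d: sum(reachable(d, e) for e in decisions))
ok_decision_path = all(reachable(a, b) for a, b in zip(order, order[1:]))

print("utility nodes childless:", ok_utilities)
print("directed path through all decisions:", ok_decision_path)
```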

  19. [Figure: the poker model shown both as a Bayesian network and as an influence diagram, with chance nodes OFC, OSC, OH0, OH1, OH, BH, MH0, MH1, MH, MFC, MSC, decision node D, and utility node U] No-forgetting: the decision maker remembers the past observations and decisions.

  20. Solution to influence diagrams • Similar to solving a decision tree • Can be done more efficiently by exploiting the structure of the influence diagram (Chapter 7)

  21. Information blocking. [Figure: a fishing model over five years with volume variables V1-V5, tests T1-T5, fishing-volume decisions FV1-FV5, and utility nodes U1-U5; the domain of the policy for FV5 has 10^9 elements] Introduce variables/links which, when observed, d-separate most of the past from the present decision.
