Sensitivity Analysis, Modeling, Inference And More SamIam


Architecture of the Decision Aid: a Player / Decision Maker faces a Real-World Situation; the decision aid comprises a Situation Model (Darwiche, Dechter), an Inference Engine (Darwiche, Dechter, Hopkins), an Interface (Kellman, Roth), and Causal Queries (Pearl, Hopkins).

Presentation Transcript


1. Sensitivity Analysis, Modeling, Inference And More (SamIam)

3. When do numbers really matter? Hei Chan and Adnan Darwiche, UAI-01.
Approximating MAP using stochastic local search. James Park and Adnan Darwiche, UAI-01.
Using recursive decomposition to compute elimination orders, jointrees and dtrees. Adnan Darwiche and Mark Hopkins, ECSQARU-01.
Recursive conditioning. Adnan Darwiche, Artificial Intelligence, 2001.

6. Current Users/Evaluators
HRL Labs (diagnosis)
TRW IIT (reasoning about adversary intentions)
Bizrate.com (e-commerce)
UCLA Human Perception Lab

7. SamIam Features
Java-based.
Implements: the differential approach, sensitivity analysis, MAP computations, and innovative decomposition algorithms.
Will integrate: anyspace algorithms, approximate algorithms, and causal reasoning algorithms.

8. Sensitivity Analysis

9. Tuning network parameters
Evidence e = {report, ~smoke}. Currently, Pr(tampering | e) = 0.5; we want Pr(tampering | e) to be 0.65. SamIam recommends: increase Pr(tampering) from 0.02 to 0.036, or decrease Pr(report | ~leaving) from 0.01 to 0.005.

10. Tuning network parameters
SamIam calculates the minimal change to each network parameter that enforces a constraint of one of the following forms:
DIFFERENCE: Pr(y|e) – Pr(z|e) >= ε
RATIO: Pr(y|e) / Pr(z|e) >= ε

11. How we do it…
Difference: Pr(y|e) – Pr(z|e) >= ε

12. The sensitivity of probabilistic queries to parameter changes
We want to understand the sensitivity of a query Pr(y|e) to changes in a meta-parameter τx|u. A bound on the partial derivative ∂Pr(y|e)/∂τx|u:
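The bound in the cited Chan–Darwiche UAI-01 paper states |∂Pr(y|e)/∂τx|u| <= Pr(y|e)(1 – Pr(y|e)) / (Pr(x|u)(1 – Pr(x|u))). A minimal sketch of evaluating it; the function name is illustrative, not SamIam's API:

```python
def derivative_bound(p_query: float, p_param: float) -> float:
    """Upper bound on |d Pr(y|e) / d tau_x|u| as a function of the
    current query value Pr(y|e) and parameter value Pr(x|u)."""
    return (p_query * (1.0 - p_query)) / (p_param * (1.0 - p_param))

# The bound blows up as the parameter approaches 0 or 1, and is
# smallest for extreme queries combined with parameters near 0.5.
print(derivative_bound(0.5, 0.5))  # 1.0
print(derivative_bound(0.5, 0.1))  # ~2.78
```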

13. Plot of the bound of the partial derivative

14. Some observations…
An example where the bound on the derivative is tight.
An example network in which the derivative tends to infinity.
An example where an infinitesimal absolute change in a parameter induces a non-infinitesimal absolute change in some query.
An example where the relative change in the query is not bounded by the relative parameter change.

15. Effects of infinitesimal parameter changes
Assume that τx|u <= 0.5. We apply an infinitesimal change Δτx|u to the meta-parameter τx|u, leading to a change ΔPr(y|e). The relative change in the query Pr(y|e) is bounded by:
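One way to recover such a bound (my derivation from the derivative bound above, not necessarily the exact form on the slide): to first order, ΔPr(y|e) ≈ ∂Pr(y|e)/∂τx|u · Δτx|u, so |ΔPr(y|e)| / Pr(y|e) <= (1 – Pr(y|e)) |Δτx|u| / (Pr(x|u)(1 – Pr(x|u))), and with τx|u <= 0.5 this is at most twice the relative parameter change |Δτx|u| / Pr(x|u):

```python
def relative_change_bound(p_query: float, p_param: float, delta: float) -> float:
    """First-order bound on the relative query change |dPr(y|e)|/Pr(y|e)
    induced by an infinitesimal parameter change `delta`, derived from
    the derivative bound; assumes the parameter is at most 0.5."""
    assert p_param <= 0.5
    return (1.0 - p_query) * abs(delta) / (p_param * (1.0 - p_param))

# With the parameter at most 0.5, the bound never exceeds twice the
# relative parameter change |delta| / p_param:
b = relative_change_bound(0.5, 0.2, 0.01)
assert b <= 2 * 0.01 / 0.2
```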

16. Effects of arbitrary parameter changes
Odds of x|u: O(x|u) = Pr(x|u) / (1 – Pr(x|u)).
Odds of y|e: O(y|e) = Pr(y|e) / (1 – Pr(y|e)).
O’(x|u) and O’(y|e) denote the new odds after applying an arbitrary change to the meta-parameter τx|u.

17. Effects of arbitrary parameter changes
If the change in τx|u is positive, then:
If the change in τx|u is negative, then:
Combining both results, we have:
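The combined result in the cited paper is the log-odds guarantee |ln O’(y|e) – ln O(y|e)| <= |ln O’(x|u) – ln O(x|u)|. A sketch that turns this guarantee into an interval for the new query value (illustrative names, not SamIam's API):

```python
import math

def odds(p):      return p / (1.0 - p)
def from_odds(o): return o / (1.0 + o)

def query_interval(p_param, p_param_new, p_query):
    """Interval guaranteed to contain the new query value Pr'(y|e) after
    changing the parameter from p_param to p_param_new, using
    |ln O'(y|e) - ln O(y|e)| <= |ln O'(x|u) - ln O(x|u)|."""
    d = abs(math.log(odds(p_param_new)) - math.log(odds(p_param)))
    o = odds(p_query)
    return from_odds(o * math.exp(-d)), from_odds(o * math.exp(d))

# No change in the parameter pins the query to its current value:
lo, hi = query_interval(0.3, 0.3, 0.7)
assert math.isclose(lo, 0.7) and math.isclose(hi, 0.7)

# A genuine change widens the interval around the current query value:
lo, hi = query_interval(0.02, 0.036, 0.5)
assert lo < 0.5 < hi
```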

18. Applications
Given a parameter Pr(x|u) and an applied change, we can calculate an upper bound on the change in the query Pr(y|e). Conversely, given a query Pr(y|e) and a desired change, we can calculate a lower bound on the change in Pr(x|u) that is needed. This can be done in constant time and serves as a preliminary recommendation.
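The second direction can be sketched by inverting the log-odds bound: any parameter change that achieves the desired query change must move the parameter's log-odds at least as far as the query's log-odds must move (a sketch of the constant-time preliminary recommendation, not SamIam's actual procedure):

```python
import math

def odds(p):      return p / (1.0 - p)
def from_odds(o): return o / (1.0 + o)

def required_parameter_values(p_param, p_query, p_query_target):
    """Candidate new parameter values (decrease, increase) whose
    log-odds change equals the log-odds change needed in the query;
    achieving the target requires moving at least this far."""
    d = abs(math.log(odds(p_query_target)) - math.log(odds(p_query)))
    o = odds(p_param)
    return from_odds(o * math.exp(-d)), from_odds(o * math.exp(d))

# To push a query from 0.5 toward 0.65, a parameter currently at 0.02
# must move at least as far (in log-odds) as one of these two values:
down, up = required_parameter_values(0.02, 0.5, 0.65)
assert down < 0.02 < up
```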

19. Applications
Bring the COA’s probability of success to >= 0.90.
Bring the enemy’s COA probability of failure to >= 0.99.
Reliability…
What are the most influential “sensors” in predicting the probability of intention X?

20. “Minimal” change
The same relative odds change in different network parameters gives the same possible effects on network queries. Therefore, the relative odds change can be adopted as the measure of a “minimal” change, instead of the absolute or relative change, and used to choose between different recommended parameter changes.

21. Changes that (don’t) matter
[Plot for a query with Pr(y|e) = 0.6]

23. Elimination orders, jointrees and dtrees

24. Elimination orders, jointrees and dtrees

26. Hypergraph partitioning

27. An algorithm using hypergraph partitioning to construct dtrees
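A dtree is a full binary tree whose leaves are the network's families (a variable plus its parents). The cited Darwiche–Hopkins construction recursively bipartitions the hypergraph whose nodes are families, using a hypergraph partitioner to minimize the variables shared across each cut; the sketch below substitutes a naive split-in-half "partitioner" just to show the recursion shape:

```python
def build_dtree(families):
    """Recursively split a list of families (var, parents) into a binary
    tree. A real implementation would choose each split with a hypergraph
    partitioner minimizing shared variables; here we naively halve the
    list, which gives a valid dtree but not a good one."""
    if len(families) == 1:
        return families[0]                    # leaf = one CPT family
    mid = len(families) // 2                  # naive "partition"
    return (build_dtree(families[:mid]), build_dtree(families[mid:]))

# Families of a small chain network A -> B -> C:
families = [("A", ()), ("B", ("A",)), ("C", ("B",))]
dtree = build_dtree(families)
```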

29. Statistics for ISCAS ’85 Benchmark Circuits

30. Results for Suite of Belief Networks

31. Conclusions
Theoretically, we have shown how methods for recursively decomposing DAGs can be used to construct elimination orders, dtrees, and jointrees.
Practically, we have proposed and evaluated the use of a state-of-the-art system for hypergraph partitioning to recursively decompose DAGs and, hence, to construct elimination orders, dtrees, and jointrees.
Currently looking into other methods for constructing dtrees: graph aggregation, and dtrees of undirected graphs.

33. Possible elimination orders for MPE, Pr(e)
Order   Width
ABC     2
ACB     2
BAC     1
BCA     1
CAB     1
CBA     1
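The widths in the table are consistent with, for example, a moral graph over A, B, C with edges A–B and A–C (an assumption; the transcript does not include the example network). A sketch of computing the induced width of an elimination order:

```python
def induced_width(edges, order):
    """Width of an elimination order: the max number of neighbors a
    variable has in the induced graph at the moment it is eliminated."""
    nbrs = {}
    for u, v in edges:
        nbrs.setdefault(u, set()).add(v)
        nbrs.setdefault(v, set()).add(u)
    width = 0
    for x in order:
        ns = nbrs.pop(x, set())
        width = max(width, len(ns))
        for u in ns:                 # connect x's remaining neighbors
            nbrs[u].discard(x)
            nbrs[u] |= ns - {u}
    return width

edges = [("A", "B"), ("A", "C")]
widths = {o: induced_width(edges, o)
          for o in ["ABC", "ACB", "BAC", "BCA", "CAB", "CBA"]}
# Reproduces the table: eliminating A first connects B and C.
assert widths == {"ABC": 2, "ACB": 2, "BAC": 1,
                  "BCA": 1, "CAB": 1, "CBA": 1}
```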

34. Possible elimination orders for MAP of B,C
Order   Width
ABC     2
ACB     2
BAC     1
BCA     1
CAB     1
CBA     1

35. Unconstrained vs Constrained Width

36. Local Search
Local search works as follows:
Start from an initial guess.
Iteratively improve the estimate by moving to a better neighbor.
It requires the ability to efficiently compute the score of each neighbor.

37. Local Search for MPE
The state space consists of all complete network instantiations. The neighbors of an instantiation w are all instantiations w – X, x in which the value of some variable X is changed from its value in w to x. The score of state w is Pr(w, e). The score of each neighbor can be computed locally in constant time.

38. Local Search for MAP
The state space consists of all instantiations of the MAP variables. The neighbors of an instantiation w are all instantiations w – X, x in which the value of some MAP variable X is changed from its value in w to x. The score of state w is Pr(w, e). To be useful, this requires an efficient method for computing Pr(w – X, x, e) for all X in W.

39. Computing Neighbor Scores Efficiently
Pr(w – X, x, e) can be computed for all neighbors in the same time complexity as computing Pr(w, e). This can be done using differential inference, or using fast retraction in jointrees.
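The three slides above can be sketched end to end: hill climbing over complete instantiations of binary variables, rescoring a neighbor by touching only the CPT factors that mention the flipped variable. The toy network representation is my own for illustration; differential inference and fast retraction are what make the rescoring efficient in SamIam:

```python
# Toy network A -> B: var -> (parents, CPT), where the CPT maps
# (parent values..., var value) -> probability. Variables are binary.
net = {
    "A": ((), {(0,): 0.4, (1,): 0.6}),
    "B": (("A",), {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.8}),
}

def factor(var, w):
    parents, cpt = net[var]
    return cpt[tuple(w[p] for p in parents) + (w[var],)]

def score(w):                         # Pr(w) for a complete instantiation
    s = 1.0
    for var in net:
        s *= factor(var, w)
    return s

def mpe_hill_climb(start, fixed=()):
    """Greedy local search: flip one free (binary) variable at a time,
    moving to the best neighbor; `fixed` holds the evidence variables."""
    w, s = dict(start), score(start)
    improved = True
    while improved:
        improved = False
        for var in net:
            if var in fixed:
                continue
            # Only factors mentioning `var` change when `var` flips.
            touched = [v for v in net if v == var or var in net[v][0]]
            old = 1.0
            for v in touched:
                old *= factor(v, w)
            w2 = dict(w, **{var: 1 - w[var]})
            new = 1.0
            for v in touched:
                new *= factor(v, w2)
            if s * new / old > s:     # local rescoring of the neighbor
                w, s, improved = w2, s * new / old, True
    return w, s

# MPE given evidence B = 1, starting from A = 0:
w, s = mpe_hill_climb({"A": 0, "B": 1}, fixed=("B",))
```

With random restarts over many starting points (as on the next slide), the same loop becomes the random-restart strategy.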

40. Methods
Search strategies: hill climbing with random restarts; taboo search.
Initialization strategies: random; MPE; individual maximum likelihood (ML); sequential.

41. Solution Quality

42. Evaluations Required

44. When do numbers really matter? Hei Chan and Adnan Darwiche, UAI-01.
Approximating MAP using stochastic local search. James Park and Adnan Darwiche, UAI-01.
Using recursive decomposition to compute elimination orders, jointrees and dtrees. Adnan Darwiche and Mark Hopkins, ECSQARU-01.
Recursive conditioning. Adnan Darwiche, Artificial Intelligence, 2001.
