CS498-EA Reasoning in AI Lecture #14

Presentation Transcript


  1. CS498-EA Reasoning in AI Lecture #14 Professor: Eyal Amir Fall Semester 2009 * Some slides due to Fei-Fei Li (Stanford U)

  2. Summary So Far in Our Class • We saw motivating applications • We discussed two methods for propositional-logical reasoning • We studied properties of graphical models of probability distributions • We learned 2 kinds of probabilistic inference methods in graphical models • We examined 2 methods for learning parameters of graphical models

  3. The Road Ahead in Our Class • Variational Approximations • Models and inference with dynamic (temporal) systems: logical, probabilistic • More expressive representations and inference: • First-Order Logic (FOL) • Relational/First-Order Probabilistic Models • Semantic Web and Description Logics • Cross-cutting issues

  4. Before we Continue… • Applications of methods we’ve learned • Review ideas and techniques • Reinvigorate our search for more methods…

  5. Memories from Lecture 2… • Applications of reasoning in AI • Econometrics • Social Networks • Verification of Circuits and Programs • Natural Language Processing • Robotics • Vision • Computer Security

  6. Econometrics Example: A Recession Model of a country • What is the probability of recession when a bank (bm) goes into bankruptcy? • Recession: Recession of a country, in [0,1] • Market[X]: Quarterly market (X) index • Loss[X,Y]: Loss of a bank (Y) in a market (X) • Revenue[Y]: Revenue of a bank (Y)
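The query on this slide can be sketched as a tiny two-node model. The structure (bankruptcy influences recession) follows the slide, but every probability below is an illustrative assumption, and Recession is simplified to a binary event rather than a value in [0,1]:

```python
# Toy two-node model: Bankrupt(bm) -> Recession.
# All numbers are illustrative assumptions, not from the lecture.
p_bankrupt = 0.05                              # P(Bankrupt(bm) = true)
p_rec_given_b = {True: 0.40, False: 0.08}      # P(Recession | Bankrupt(bm))

# The conditional query P(Recession | Bankrupt(bm)=true) is just a table
# lookup here; the marginal P(Recession) sums out Bankrupt(bm):
p_recession = sum(p_rec_given_b[b] * (p_bankrupt if b else 1 - p_bankrupt)
                  for b in (True, False))
print(round(p_recession, 4))  # 0.40*0.05 + 0.08*0.95 = 0.096
```

In the lecture's relational setting the same query would range over many banks and markets; this sketch only shows the mechanics of the single-evidence case.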

  7. Experiments

  8. Experiments

  9. Social Networks Example: school friendships and their effects • Nodes: Friend(A,B), Friend(A,C), Friend(B,C); Attr(A), Attr(B), Attr(C); Measurement(A), Measurement(B), Measurement(C) • Shorthand for Friend(·,·), Attr(·), and Measurement(·) potential functions
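The Friend/Attr structure above can be sketched as a small pairwise Markov network. The "friends tend to share attributes" reading and all potential values below are illustrative assumptions, not the lecture's actual model:

```python
import itertools

# Toy binary Markov network over three students A, B, C, mirroring the slide:
# pairwise potentials tie Friend(.,.) to agreement of Attr(.) values.
people = ["A", "B", "C"]
edges = [("A", "B"), ("A", "C"), ("B", "C")]

def pair_potential(friend, same_attr):
    # Assumed potential: friends are 3x more likely to share an attribute.
    if friend:
        return 3.0 if same_attr else 1.0
    return 1.0

def unnormalized(attr, friend):
    score = 1.0
    for (u, v) in edges:
        score *= pair_potential(friend[(u, v)], attr[u] == attr[v])
    return score

# Marginal P(Attr(A) == Attr(B)) when everyone is friends, by enumeration.
friend = {e: True for e in edges}
Z = agree = 0.0
for bits in itertools.product([0, 1], repeat=3):
    attr = dict(zip(people, bits))
    w = unnormalized(attr, friend)
    Z += w
    if attr["A"] == attr["B"]:
        agree += w
print(round(agree / Z, 3))  # 0.833: friendship correlates the attributes
```

With no friendships all potentials are 1 and the same marginal is 0.5, which is the point of the model: the Friend(·,·) relations induce correlations among attributes.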

  10. [Figure: ground network over the people {lia, joe, tom, bob, ann, val}: per-person variables h and b (h_lia, b_lia, …) connected through pairwise potentials f, one for each ordered pair of people]

  11. Scaling-Up: Computing Pr(f(x,y)) [Figure 5: computation time]

  12. Application: Hardware Verification [Circuit diagram: inputs x1, x2, x3; intermediate signals f1–f4 through NOT, AND, and OR gates; output f5] Question: Can we set this boolean circuit to TRUE? f5(x1,x2,x3) = a function of the input signals

  13. Application: Hardware Verification [Same circuit diagram, with the query SAT(f5)?] Question: Can we set this boolean circuit to TRUE? f5(x1,x2,x3) = f3 ∧ f4 = ¬f1 ∧ (f2 ∨ x3) = ¬(x1 ∧ x2) ∧ (¬x2 ∨ x3) Satisfying assignment: M[x1]=FALSE, M[x2]=FALSE, M[x3]=FALSE
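The SAT question on this slide is small enough to settle by brute-force enumeration. The gate structure below is my reading of the slide's diagram (f1 = x1 AND x2, f2 = NOT x2, f3 = NOT f1, f4 = f2 OR x3, output f5 = f3 AND f4), so treat it as an assumption:

```python
from itertools import product

def f5(x1, x2, x3):
    # Assumed gate structure read off the slide's circuit diagram.
    f1 = x1 and x2        # AND gate
    f2 = not x2           # NOT gate
    f3 = not f1           # NOT gate
    f4 = f2 or x3         # OR gate
    return f3 and f4      # output AND gate

# Enumerate all 2^3 input assignments and keep the satisfying ones.
models = [m for m in product([False, True], repeat=3) if f5(*m)]
print(models[0])  # (False, False, False): the all-FALSE model satisfies f5
```

For real circuits with hundreds of thousands of gates this enumeration is hopeless, which is why the course's propositional reasoning methods (DPLL-style SAT solving) matter here.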

  14. Hardware Verification • Questions in logical circuit verification • Equivalence of circuits • Arrival of the circuit at a state (requires a temporal model, which gets propositionalized) • Achieving an output from the circuit
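The first question on this slide, equivalence of circuits, reduces to SAT via the standard "miter" construction: two circuits are equivalent iff their XOR is unsatisfiable. A minimal sketch, using two illustrative three-input circuits of my own (not the ones from the slides):

```python
from itertools import product

def c1(x1, x2, x3):
    return (x1 and x2) or x3

def c2(x1, x2, x3):
    # A De Morgan rewrite of c1; equivalence is what we want to verify.
    return not ((not x1 or not x2) and not x3)

def equivalent(f, g, n):
    # Miter check: f XOR g is unsatisfiable iff f and g agree on all inputs.
    return not any(f(*m) != g(*m) for m in product([False, True], repeat=n))

print(equivalent(c1, c2, 3))  # True: the rewrite preserves the function
```

In practice the miter f XOR g is encoded as CNF and handed to a SAT solver instead of being enumerated, but the reduction is exactly this one.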

  15. Natural-Language Processing • Logical semantics • Probabilistic choice between meanings • Inference over time

  16. Vision: Variability within a category Intrinsic Deformation

  17. Constellation model of object categories Burl, Leung, Weber, Welling, Fergus, Fei-Fei, Perona, et al.

  18. Goal

  19. Goal Burl, Leung, et al. ’96 ’98 Weber, Welling, et al. ’98 ’00, Fergus, et al. ‘03

  20. Use prior knowledge of other objects Goal • Estimate uncertainties in models • Do full Bayesian learning • Reduce the number of training examples

  21. Variational Approximation Outline • Motivation • Outline of the Variational Approximation approach • Loopy Belief Propagation • Variational methodology • Sequential approach • Block approach

  22. Variational Inference (in three easy steps…) • Choose a family of variational distributions Q(H). • Use the Kullback-Leibler divergence KL(Q||P) as a measure of ‘distance’ between P(H|V) and Q(H). • Find the Q that minimises the divergence.
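The three steps above can be sketched on a toy posterior P(H|V) over two binary hidden variables, with a fully factorized (mean-field) family Q(h1,h2) = q1(h1)·q2(h2) and a crude grid search for step 3. All numbers are illustrative assumptions:

```python
import itertools
import math

# Step 0 (given): a correlated "posterior" P(H|V) over H = (h1, h2).
P = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def kl_qp(q1, q2):
    # Step 2: KL(Q || P) = sum_h Q(h) log(Q(h) / P(h)), with the factorized
    # Q from step 1, parameterized by q1 = Q(h1=1) and q2 = Q(h2=1).
    kl = 0.0
    for h1, h2 in itertools.product([0, 1], repeat=2):
        q = (q1 if h1 else 1 - q1) * (q2 if h2 else 1 - q2)
        if q > 0:
            kl += q * math.log(q / P[(h1, h2)])
    return kl

# Step 3: minimise the divergence over the family (grid search stands in
# for the coordinate updates used in real variational inference).
grid = [i / 100 for i in range(1, 100)]
q1, q2 = min(itertools.product(grid, grid), key=lambda qs: kl_qp(*qs))
print(q1, q2)  # the symmetric P makes the factorized optimum symmetric
```

The factorized Q cannot represent the correlation in P, so the minimiser is the best independent approximation; that gap between the family and the true posterior is exactly what the variational approach trades away for tractability.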
