Inverse Resolution


Presentation Transcript


  1. Inverse Resolution CMSC 671 - Principles of AI Mike Smith 2001/12/04

  2. Inverse Resolution Why invert resolution? Wasn't resolution hard enough? • We can work resolution graphs backwards • We can learn theories from examples • We can use background knowledge to help • Inverse resolution can be "lifted" to FOL • We can capture knowledge beyond attributes • We can interpret the resulting theories

  3. Inverse Resolution – Learning Framework • Deductive framework: T entails E • Break T into B and H • Inductive framework: B ∧ H entails E • Build a set of resolution trees backwards from the roots • New leaves not in the prior knowledge form the hypothesis Legend: T = Theory, B = Background Knowledge, H = Hypothesis, E = Examples
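
To make the inductive reading concrete, here is a minimal sketch in Python: ground definite clauses are represented as (head, body-set) pairs and entailment is checked by naive forward chaining. The helper name is illustrative, and a ground instance of the daughter hypothesis stands in for the first-order clause learned on the later slides.

# Minimal sketch: check B ∧ H ⊨ E by naive forward chaining over ground
# definite clauses. Atoms are plain strings; a clause is (head, body-set).
# The ground hypothesis below is only an illustration.

def entails(clauses, goal):
    facts = {head for head, body in clauses if not body}   # unit clauses
    changed = True
    while changed:
        changed = False
        for head, body in clauses:
            if head not in facts and body and body <= facts:
                facts.add(head)
                changed = True
    return goal in facts

B = [("parent(ann,mary)", frozenset()),
     ("female(mary)", frozenset())]
H = [("daughter(mary,ann)",
      frozenset({"female(mary)", "parent(ann,mary)"}))]
E = "daughter(mary,ann)"

print(entails(B, E))      # False: background alone does not cover the example
print(entails(B + H, E))  # True: B ∧ H entails E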

  4. Inverting Resolution • Four Rules • Absorption • Identification • Intra-construction • Inter-construction

  5. Absorption Given q <- A and p <- A,B, we can create a new clause p <- q,B by absorbing the conjunction of atoms (A) in the premise into the single atom (q) that is the conclusion of the other clause. Resolving q <- A with p <- q,B re-derives the original clause p <- A,B.
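
A minimal propositional sketch of the absorption step, assuming clauses are (head, frozenset-of-body-atoms) pairs; the function name and representation are illustrative, not from the slides.

# Absorption: from q <- A and p <- A,B construct p <- q,B.
# A clause is (head, frozenset of body atoms); atoms are plain strings here.

def absorb(c1, c2):
    q, a = c1                 # c1 is q <- A
    p, body2 = c2             # c2 is p <- A,B
    if not a <= body2:
        raise ValueError("A must occur in the body of the second clause")
    return (p, (body2 - a) | {q})   # p <- q,B

new_clause = absorb(("q", frozenset({"A"})),
                    ("p", frozenset({"A", "B"})))
print(new_clause)   # p <- q,B  (printed as ('p', frozenset({...})))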

  6. Absorption – Example B: parent(ann,mary); female(mary); father(henry,jane) <- parent(henry,jane). E: daughter(mary,ann); grandfather(henry,john) <- parent(henry,jane), parent(jane,john); grandfather(henry,john) <- parent(henry,jane), male(henry). Absorption #1: from parent(ann,mary) and the example daughter(mary,ann), with inverse substitution θ⁻¹ = {ann/Y}, construct daughter(mary,Y) <- parent(Y,mary). Absorption #2: from female(mary) and that clause, with θ⁻¹ = {mary/X}, construct daughter(X,Y) <- female(X), parent(Y,X).
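
As a sanity check on this example, the short sketch below applies the substitution {X -> mary, Y -> ann} (the inverse of the θ⁻¹ steps above) to the learned clause daughter(X,Y) <- female(X), parent(Y,X) and confirms that its body is covered by the background facts; the atom representation and helper name are illustrative.

# Atoms as (predicate, args) tuples; apply a substitution and check that the
# instantiated body of the learned clause is covered by the background facts.

def substitute(atom, theta):
    pred, args = atom
    return (pred, tuple(theta.get(v, v) for v in args))

head = ("daughter", ("X", "Y"))
body = [("female", ("X",)), ("parent", ("Y", "X"))]
theta = {"X": "mary", "Y": "ann"}

background = {("parent", ("ann", "mary")), ("female", ("mary",))}
ground_body = {substitute(atom, theta) for atom in body}

print(ground_body <= background)     # True: the body is satisfied
print(substitute(head, theta))       # ('daughter', ('mary', 'ann')) = the example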

  7. Identification Given p <- A,B and p <- A,q, we can create the new clause q <- B: because A,B and A,q lead to the same conclusion p, B can be identified with q. Resolving p <- A,q with q <- B re-derives the original clause p <- A,B.
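
A minimal propositional sketch of identification over the same (head, body-set) clause representation; the function name is illustrative.

# Identification: from p <- A,B and p <- A,q construct q <- B.

def identify(c1, c2):
    p1, body1 = c1            # c1 is p <- A,B
    p2, body2 = c2            # c2 is p <- A,q
    assert p1 == p2, "both clauses must share the conclusion p"
    (q,) = body2 - body1      # the single atom q standing in for B
    a = body2 - {q}           # the shared conjunction A
    return (q, body1 - a)     # q <- B

print(identify(("p", frozenset({"A", "B"})),
               ("p", frozenset({"A", "q"}))))   # q <- B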

  8. Intra-Construction Given p <- A,B and p <- A,C, construct a clause that represents the similarity between the two clauses, p <- A,q (with q a new atom); the clauses q <- B and q <- C then come from applying the identification rule. Resolving p <- A,q with q <- B and with q <- C re-derives the original clauses p <- A,B and p <- A,C.
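
A minimal propositional sketch of intra-construction in the same style, assuming the invented atom is simply a fresh name passed in; names are illustrative.

# Intra-construction: from p <- A,B and p <- A,C construct
# q <- B, p <- A,q and q <- C, where q is a newly invented atom.

def intra_construct(c1, c2, new_atom="q"):
    p1, body1 = c1
    p2, body2 = c2
    assert p1 == p2, "both clauses must share the conclusion p"
    a = body1 & body2                   # the shared conjunction A
    return [(new_atom, body1 - a),      # q <- B
            (p1, a | {new_atom}),       # p <- A,q
            (new_atom, body2 - a)]      # q <- C

for clause in intra_construct(("p", frozenset({"A", "B"})),
                              ("p", frozenset({"A", "C"}))):
    print(clause)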

  9. Intra-Construction – Example B: parent(ann,mary); female(mary); father(henry,jane) <- parent(henry,jane). E: daughter(mary,ann); grandfather(henry,john) <- parent(henry,jane), parent(jane,john); grandfather(henry,john) <- parent(henry,jane), male(henry). Applying intra-construction to the two grandfather clauses invents a new predicate q, giving grandfather(henry,john) <- parent(henry,jane), q(henry,jane) together with q(henry,jane) <- parent(henry,jane) and q(henry,jane) <- male(henry). Recognizing q as father yields grandfather(henry,john) <- parent(henry,jane), father(henry,jane) with father(henry,jane) <- parent(henry,jane) and father(henry,jane) <- male(henry).

  10. Inter-Construction Given p <- A,B and q <- A,C, note the common conjunction of atoms A and construct a clause r <- A (where r is a new atom). The remaining two clauses, p <- r,B and q <- r,C, are the result of applying the absorption rule. Resolving r <- A with them re-derives the original clauses p <- A,B and q <- A,C.
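
A minimal propositional sketch of inter-construction in the same style; the invented atom r is just a fresh name and all names are illustrative.

# Inter-construction: from p <- A,B and q <- A,C construct
# r <- A, p <- r,B and q <- r,C, where r is a newly invented atom.

def inter_construct(c1, c2, new_atom="r"):
    p, body1 = c1
    q, body2 = c2
    a = body1 & body2                        # the shared conjunction A
    return [(new_atom, a),                   # r <- A
            (p, (body1 - a) | {new_atom}),   # p <- r,B
            (q, (body2 - a) | {new_atom})]   # q <- r,C

for clause in inter_construct(("p", frozenset({"A", "B"})),
                              ("q", frozenset({"A", "C"}))):
    print(clause)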

  11. Using Inverse Resolution • Inductive Logic Programming (ILP) • ILP = Inductive Methods + Logic Programming • Two Major Induction Methods • Inverse Resolution • Top-Down Learning Methods

  12. ILP Systems

  13. Inductive Logic Programming Common Applications • Life Sciences / Molecular Biology • Predict 3D Protein Structures from Amino Acid Sequences • Predict Therapeutic Efficacy of Drugs • Predict Mutagenesis of Compounds • Natural Language • Learning Part-of-Speech Tagging • Learning Parsers

  14. References • Camacho. (1994). The Use of Background Knowledge in Inductive Logic Programming. http://citeseer.nj.nec.com/camacho94use.html • Muggleton. (199?). Inductive Logic Programming. http://www.cs.york.ac.uk/mlg/ilp.html • Russell & Norvig. (1995). Artificial Intelligence: A Modern Approach. • van der Poel. (2000). Inductive Logic Programming - Theory. http://ww.kbs.twi.tudelft.nl/Education/Cyberles/Trondheim/ILP/html/ilp_th_01introd.html • Wang. (2000). Parallel Inductive Logic in Data Mining. http://citeseer.nj.nec.com/wang00parallel.html • Weber. (1996). ILP Systems on the ILPnet Systems Repository. http://www-ai.ijs.si/ilpnet/irenefinal.ps

  15. Questions?
