
Intelligent Systems (AI-2), Computer Science CPSC 422, Lecture 32, Apr 2, 2014



  1. Intelligent Systems (AI-2), Computer Science CPSC 422, Lecture 32, Apr 2, 2014. Slide source: Pedro Domingos, UW.

  2. Lecture Overview • Statistical Relational Models (for us, aka Hybrid) • Recap: Markov Networks and log-linear models • Markov Logic

  3. Statistical Relational Models Goals: • Combine (subsets of) logic and probability into a single language (R&R system) • Develop efficient inference algorithms • Develop efficient learning algorithms • Apply to real-world problems. Reference: L. Getoor & B. Taskar (eds.), Introduction to Statistical Relational Learning, MIT Press, 2007.

  4. Plethora of Approaches • Knowledge-based model construction [Wellman et al., 1992] • Stochastic logic programs [Muggleton, 1996] • Probabilistic relational models [Friedman et al., 1999] • Relational Markov networks [Taskar et al., 2002] • Bayesian logic [Milch et al., 2005] • Markov logic [Richardson & Domingos, 2006] • And many others!

  5. Prob. Rel. Models vs. Markov Logic

  6. Lecture Overview • Statistical Relational Models (for us, aka Hybrid) • Recap: Markov Networks and log-linear models • Markov Logic • Markov Logic Network (MLN)

  7. Parameterization of Markov Networks • Factors define the local interactions (like CPTs in Bnets) • What about the global model? What do you do with Bnets?
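A concrete illustration of a factor as a table, in Python (a minimal sketch; the variables and numbers are hypothetical, not from the slides):

    # A factor over two binary variables X1, X2 is a non-negative table,
    # much like a CPT except its entries need not sum to 1.
    # The values below are illustrative only.
    factor = {
        (False, False): 4.5,
        (False, True):  4.5,
        (True,  False): 1.0,
        (True,  True):  3.0,
    }
    print(factor[(True, False)])  # local compatibility of X1=T, X2=F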

  8. How do we combine local models? • As in Bnets, by multiplying them!
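A sketch of this in equation form (the standard Markov network factorization over clique factors):

    % Joint distribution: normalized product of the clique factors phi_k;
    % Z is the partition function that makes the distribution sum to 1.
    P(X = x) = \frac{1}{Z} \prod_k \phi_k(x_{\{k\}}),
    \qquad Z = \sum_x \prod_k \phi_k(x_{\{k\}})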

  9. Markov Networks • Undirected graphical models (example network: Smoking, Cancer, Asthma, Cough) • Factors/potential functions defined over cliques, e.g., a potential over (Smoking, Cancer)

  10. Markov Networks: log-linear model (example network: Smoking, Cancer, Asthma, Cough) • Log-linear model: P(x) = (1/Z) exp{ Σ_i w_i f_i(x) }, where w_i is the weight of feature i and f_i is feature i

  11. Lecture Overview • Statistical Relational Models (for us, aka Hybrid) • Recap: Markov Networks • Markov Logic

  12. Markov Logic: Intuition (1) • A logical KB is a set of hard constraints on the set of possible worlds

  13. Markov Logic: Intuition (1) • A logical KB is a set of hard constraints on the set of possible worlds • Let's make them soft constraints: when a world violates a formula, it becomes less probable, not impossible

  14. Markov Logic: Intuition (2) • The more formulas in the KB a possible world satisfies, the more likely it should be • Give each formula a weight • By design, if a possible world satisfies a formula, its log probability should go up proportionally to the formula weight.
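In equation form, the design goal reads as follows (a sketch; n_i(w) counts the true groundings of formula i in world w, anticipating slide 27):

    % Each additional satisfied grounding of formula i adds w_i to the
    % log-probability of world w (up to the constant log Z).
    \log P(w) = \sum_i w_i \, n_i(w) - \log Z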

  15. Markov Logic: Definition • A Markov Logic Network (MLN) is a set of pairs (F, w) where • F is a formula in first-order logic • w is a real number • Together with a set C of constants, it defines a Markov network with • One binary node for each grounding of each predicate in the MLN • One feature/factor for each grounding of each formula F in the MLN, with the corresponding weight w (Grounding: substituting vars with constants)
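A minimal Python sketch of the node-creation step (illustrative only, not a real MLN engine; predicate names follow the upcoming Friends & Smokers example):

    from itertools import product

    # One binary node per grounding of each predicate: substitute every
    # tuple of constants for the predicate's variables.
    constants = ["A", "B"]
    predicates = {"Smokes": 1, "Friends": 2, "Cancer": 1}  # name -> arity

    nodes = [
        (pred, args)
        for pred, arity in predicates.items()
        for args in product(constants, repeat=arity)
    ]
    print(nodes)  # ('Smokes', ('A',)), ('Friends', ('A', 'A')), ...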

  16. Example: Friends & Smokers • Smoking causes cancer. • Friends have similar smoking habits.

  17. Example: Friends & Smokers • ∀x Smokes(x) ⇒ Cancer(x) • ∀x,y Friends(x,y) ⇒ (Smokes(x) ⇔ Smokes(y))

  18. Example: Friends & Smokers • 1.5 ∀x Smokes(x) ⇒ Cancer(x) • 1.1 ∀x,y Friends(x,y) ⇒ (Smokes(x) ⇔ Smokes(y))

  19. Example: Friends & Smokers Two constants: Anna (A) and Bob (B)

  20. MLN nodes Two constants: Anna (A) and Bob (B) • One binary node for each grounding of each predicate in the MLN (Grounding: substituting vars with constants): Smokes(A), Smokes(B), Cancer(A), Cancer(B) • Any nodes missing?

  21. MLN nodes (complete) Two constants: Anna (A) and Bob (B) • One binary node for each grounding of each predicate in the MLN: Smokes(A), Smokes(B), Cancer(A), Cancer(B), Friends(A,A), Friends(A,B), Friends(B,A), Friends(B,B)

  22. MLN features Two constants: Anna (A) and Bob (B) • Edge between two nodes iff the corresponding ground predicates appear together in at least one grounding of one formula (Grounding: substituting vars with constants) • Nodes: Smokes(A), Smokes(B), Cancer(A), Cancer(B), Friends(A,A), Friends(A,B), Friends(B,A), Friends(B,B) • Which edge should not be there?

  23. MLN features Two constants: Anna (A) and Bob (B) • Edge between two nodes iff the corresponding ground predicates appear together in at least one grounding of one formula
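A small Python sketch that enumerates those edges for the two example formulas (formulas as in the Friends & Smokers example; the atom representation is an assumption of this sketch):

    from itertools import product, combinations

    # Two ground atoms are linked iff they appear together in at least
    # one grounding of at least one formula.
    constants = ["A", "B"]
    groundings = []
    for x, y in product(constants, repeat=2):
        # groundings of Friends(x,y) => (Smokes(x) <=> Smokes(y))
        groundings.append({("Friends", (x, y)), ("Smokes", (x,)), ("Smokes", (y,))})
    for x in constants:
        # groundings of Smokes(x) => Cancer(x)
        groundings.append({("Smokes", (x,)), ("Cancer", (x,))})

    edges = {frozenset(pair) for g in groundings for pair in combinations(g, 2)}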

  24. MLN features Two constants: Anna (A) and Bob (B) • The resulting ground network over the nodes Smokes(A), Smokes(B), Cancer(A), Cancer(B), Friends(A,A), Friends(A,B), Friends(B,A), Friends(B,B)

  25. MLN: parameters • For each formula i we have a factor for each of its groundings: φ_i(x) = exp{ w_i f_i(x) }, where f_i(x) = 1 if the ground formula is true in x and 0 otherwise (so the factor is e^{w_i} when the formula is satisfied and 1 when it is violated)

  26. MLN: prob. of possible world Two constants: Anna (A) and Bob (B) (ground network over the Smokes, Cancer, and Friends nodes)

  27. MLN: prob. of possible world • Probability of a world pw: P(pw) = (1/Z) exp{ Σ_i w_i n_i(pw) }, where w_i is the weight of formula i and n_i(pw) is the number of true groundings of formula i in pw
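A brute-force Python sketch of this formula for the two-constant example (weights 1.5 and 1.1 as on slide 18; the representation and the enumeration strategy are assumptions of this sketch):

    import math
    from itertools import product

    constants = ["A", "B"]
    atoms = [("Smokes", (x,)) for x in constants] \
          + [("Cancer", (x,)) for x in constants] \
          + [("Friends", (x, y)) for x in constants for y in constants]

    def n_smoking_causes_cancer(world):
        # number of true groundings of Smokes(x) => Cancer(x)
        return sum((not world[("Smokes", (x,))]) or world[("Cancer", (x,))]
                   for x in constants)

    def n_friends_smoke_alike(world):
        # number of true groundings of Friends(x,y) => (Smokes(x) <=> Smokes(y))
        return sum((not world[("Friends", (x, y))])
                   or (world[("Smokes", (x,))] == world[("Smokes", (y,))])
                   for x in constants for y in constants)

    formulas = [(1.5, n_smoking_causes_cancer), (1.1, n_friends_smoke_alike)]

    def score(world):
        # unnormalized log-probability: sum_i w_i * n_i(world)
        return sum(w * n(world) for w, n in formulas)

    # Enumerate all 2^8 worlds to get Z, then P(pw) = exp(score(pw)) / Z
    worlds = [dict(zip(atoms, vals))
              for vals in product([False, True], repeat=len(atoms))]
    Z = sum(math.exp(score(w)) for w in worlds)
    print(math.exp(score(worlds[0])) / Z)  # probability of the all-false world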

  28. Learning Goals for today's class You can: • Describe the intuitions behind the design of Markov Logic • Define and build a Markov Logic Network • Justify and apply the formula for computing the probability of a possible world

  29. Next class on Wed • Markov Logic: relation to FOL • Inference (MAP and conditional probability) • Assignment 4 due on Mon (last class) • Marked midterms and assignments 1-2 still available

  30. Relation to First-Order Logic • Example on page 17 • Infinite weights → first-order logic • Satisfiable KB, positive weights → satisfying assignments = modes of distribution • Markov logic allows contradictions between formulas

  31. Relation to Statistical Models • Special cases: Markov networks, Markov random fields, Bayesian networks, log-linear models, exponential models, max. entropy models, Gibbs distributions, Boltzmann machines, logistic regression, hidden Markov models, conditional random fields • Obtained by making all predicates zero-arity • Markov logic allows objects to be interdependent (non-i.i.d.)

  32. MAP Inference • Problem: find the most likely state of the world given evidence: argmax_y P(y | x), where x is the evidence and y the query

  33. MAP Inference • Problem: find the most likely state of the world given evidence: argmax_y (1/Z_x) exp{ Σ_i w_i n_i(x, y) }

  34. MAP Inference • Problem: find the most likely state of the world given evidence: argmax_y Σ_i w_i n_i(x, y)

  35. MAP Inference • Problem: find the most likely state of the world given evidence • This is just the weighted MaxSAT problem • Use a weighted SAT solver (e.g., MaxWalkSAT [Kautz et al., 1997])

  36. The MaxWalkSAT Algorithm
  for i ← 1 to max-tries do
    solution = random truth assignment
    for j ← 1 to max-flips do
      if Σ weights(sat. clauses) > threshold then
        return solution
      c ← random unsatisfied clause
      with probability p
        flip a random variable in c
      else
        flip variable in c that maximizes Σ weights(sat. clauses)
  return failure, best solution found
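A runnable Python sketch of the algorithm as stated above (the clause representation, parameter defaults, and the default threshold are assumptions of this sketch, not part of the original algorithm statement):

    import random

    def maxwalksat(clauses, n_vars, max_tries=10, max_flips=1000,
                   threshold=None, p=0.5):
        # clauses: list of (weight, literals); a literal is (var_index,
        # wanted_value), and a clause is satisfied if any literal matches.
        if threshold is None:
            threshold = sum(w for w, _ in clauses)  # assume: full weight

        def sat_weight(a):
            return sum(w for w, lits in clauses
                       if any(a[v] == val for v, val in lits))

        best, best_score = None, float("-inf")
        for _ in range(max_tries):
            a = [random.random() < 0.5 for _ in range(n_vars)]  # random assignment
            for _ in range(max_flips):
                score = sat_weight(a)
                if score > best_score:
                    best, best_score = a[:], score
                if score >= threshold:
                    return a  # threshold reached
                unsat = [lits for w, lits in clauses
                         if not any(a[v] == val for v, val in lits)]
                if not unsat:
                    break
                lits = random.choice(unsat)  # random unsatisfied clause c
                if random.random() < p:
                    v = random.choice(lits)[0]  # flip a random variable in c
                else:
                    def after_flip(var):  # total sat. weight if var were flipped
                        a[var] = not a[var]
                        s = sat_weight(a)
                        a[var] = not a[var]
                        return s
                    v = max((var for var, _ in lits), key=after_flip)
                a[v] = not a[v]
        return best  # failure: best assignment found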

  37. Computing Probabilities • P(Formula | MLN, C) = ? • Brute force: sum probs. of worlds where the formula holds • MCMC: sample worlds, check whether the formula holds • P(Formula1 | Formula2, MLN, C) = ? • Discard worlds where Formula2 does not hold • In practice: more efficient alternatives
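A brute-force Python sketch of both queries, reusing the worlds and score from the slide-27 sketch (enumeration only; as the slide says, real systems use MCMC and smarter alternatives; the example query is hypothetical):

    import math

    def prob(holds, worlds, score):
        # P(Formula) = total probability of the worlds where it holds
        num = sum(math.exp(score(w)) for w in worlds if holds(w))
        den = sum(math.exp(score(w)) for w in worlds)
        return num / den

    def cond_prob(f1, f2, worlds, score):
        # discard worlds where Formula2 does not hold, then renormalize
        kept = [w for w in worlds if f2(w)]
        return prob(f1, kept, score)

    # Example query: P(Cancer(A) | Smokes(A))
    p = cond_prob(lambda w: w[("Cancer", ("A",))],
                  lambda w: w[("Smokes", ("A",))],
                  worlds, score)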

  38. Directed Models vs. Undirected Models • Directed: edge Parent → Child, parameterized by the CPT P(Child | Parent) • Undirected: an edge between Friend1 and Friend2, parameterized by the potential φ(Friend1, Friend2)

  39. Undirected Probabilistic Logic Models • Upgrade undirected propositional models to the relational setting • Markov Nets → Markov Logic Networks • Markov Random Fields → Relational Markov Nets • Conditional Random Fields → Relational CRFs

  40. Markov Logic Networks (Richardson & Domingos) • Soften logical clauses • A first-order clause is a hard constraint on the world • Soften the constraints so that when a constraint is violated, the world is less probable, not impossible • Higher weight → stronger constraint • Weight of ∞ → first-order logic • Probability(World S) = (1/Z) exp{ Σ_i weight_i × numberTimesTrue(f_i, S) }

  41. Example: Friends & Smokers Two constants: Anna (A) and Bob (B) (ground network over the Smokes, Cancer, and Friends nodes)

  42. Alphabetic Soup => Endless Possibilities • Formalisms: Probabilistic Relational Models (PRM), Bayesian Logic Programs (BLP), PRISM, Stochastic Logic Programs (SLP), Independent Choice Logic (ICL), Markov Logic Networks (MLN), Relational Markov Nets (RMN), CLP-BN, Relational Bayes Nets (RBN), Probabilistic Logic Programs (PLP), ProbLog, ... • Application domains: web data (web), biological data (bio), social network analysis (soc), bibliographic data (cite), epidemiological data (epi), communication data (comm), customer networks (cust), collaborative filtering problems (cf), trust networks (trust), ... • Courses: Fall 2003 - Dietterich @ OSU, Spring 2004 - Page @ UW, Spring 2007 - Neville @ Purdue, Fall 2008 - Pedro @ CMU

  43. Recent Advances in SRL Inference • Preprocessing for inference: FROG - Shavlik & Natarajan (2009) • Lifted exact inference: lifted variable elimination - Poole (2003), Braz et al. (2005), Milch et al. (2008); lifted VE + aggregation - Kisynski & Poole (2009) • Sampling methods: MCMC techniques - Milch & Russell (2006); logical particle filter - Natarajan et al. (2008), Zettlemoyer et al. (2007); lazy inference - Poon et al. (2008) • Approximate methods: lifted first-order belief propagation - Singla & Domingos (2008); counting belief propagation - Kersting et al. (2009); MAP inference - Riedel (2008) • Bounds propagation: anytime belief propagation - Braz et al. (2009)

  44. Conclusion • Inference is the key issue in several SRL formalisms • FROG: keeps the count of unsatisfied groundings; order-of-magnitude reduction in number of groundings; compares favorably to Alchemy in different domains • Counting BP: BP + grouping nodes sending and receiving identical messages; conceptually easy, scalable BP algorithm; applications to challenging AI tasks • Anytime BP: incremental shattering + box propagation; only the most necessary fraction of the model considered and shattered; status: implementation and evaluation
