

1. Lecture 24 of 41
Review: Classical and Modern Planning
Monday, 18 October 2004
William H. Hsu
Department of Computing and Information Sciences, KSU
http://www.kddresearch.org
http://www.cis.ksu.edu/~bhsu
Reading: Chapter 13, Russell and Norvig 2e

2. Lecture Outline
• Today's Reading
  • Chapter 13, Russell and Norvig 2e
  • References: Readings in Planning – Allen, Hendler, and Tate
• Next Week: Chapter 14, Russell and Norvig 2e
• Previously: Logical Representations
• Today and Wednesday: Introduction to Reasoning under Uncertainty
  • Conditional planning, concluded
  • Monitoring
• Friday and Next Week: Introduction to Uncertain Reasoning
  • Uncertainty in AI
  • Need for uncertain representation
  • Soft computing: probabilistic, neural, fuzzy, other representations
  • Probabilistic knowledge representation
    • Views of probability
    • Justification

3. Review: How Things Go Wrong (adapted from slides by S. Russell, UC Berkeley)

4. Review: Solutions (adapted from slides by S. Russell, UC Berkeley)

5. Example: Preconditions for Remaining Plan (adapted from slides by S. Russell, UC Berkeley)

6. Example: Replanning (adapted from slides by S. Russell, UC Berkeley)

7. Review: Uncertain Reasoning Requirements (adapted from slides by S. Russell, UC Berkeley)

8. Methods for Handling Uncertainty (adapted from slides by S. Russell, UC Berkeley)

9. Probability (adapted from slides by S. Russell, UC Berkeley)

10. The Probabilistic (Bayesian) Framework
• Framework: Interpretations of Probability [Cheeseman, 1985] (contrasted in the sketch below)
  • Bayesian subjectivist view
    • A measure of an agent's belief in a proposition
    • Proposition denoted by a random variable (sample space: its range)
    • e.g., Pr(Outlook = Sunny) = 0.8
  • Frequentist view: probability is the frequency of observations of an event
  • Logicist view: probability is inferential evidence in favor of a proposition
• Some Applications
  • HCI: learning natural language; intelligent displays; decision support
  • Approaches: prediction; sensor and data fusion (e.g., bioinformatics)
• Prediction: Examples
  • Measure relevant parameters: temperature, barometric pressure, wind speed
  • Make a statement of the form Pr(Tomorrow's-Weather = Rain) = 0.5
  • College admissions: Pr(Acceptance) = p
    • Plain beliefs: unconditional acceptance (p = 1) or categorical rejection (p = 0)
    • Conditional beliefs: depend on the reviewer (use a probabilistic model)
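
The frequentist and subjectivist views above can be contrasted concretely. The following is a minimal Python sketch with made-up observation counts; the function names and the uniform Beta(1, 1) prior are illustrative assumptions, not part of the lecture:

    def frequentist_estimate(successes: int, trials: int) -> float:
        """Frequentist view: probability as observed relative frequency."""
        return successes / trials

    def bayesian_posterior_mean(successes: int, trials: int,
                                alpha: float = 1.0, beta: float = 1.0) -> float:
        """Subjectivist view: degree of belief updated from a Beta(alpha, beta) prior.

        For Bernoulli observations the posterior is Beta(alpha + successes,
        beta + failures); its mean serves as the updated belief.
        """
        failures = trials - successes
        return (alpha + successes) / (alpha + successes + beta + failures)

    # 8 sunny days out of 10 observed (hypothetical data)
    print(frequentist_estimate(8, 10))      # 0.8, matching Pr(Outlook = Sunny) = 0.8
    print(bayesian_posterior_mean(8, 10))   # 0.75 under a uniform Beta(1, 1) prior

With more observations the two estimates converge; the choice of prior matters most for small samples, which is where encoding an agent's prior belief pays off.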

11. Terminology
• Introduction to Reasoning under Uncertainty
  • Probability foundations
    • Definitions: subjectivist, frequentist, logicist
    • The three Kolmogorov axioms
  • Bayes's Theorem (written out below)
    • Prior probability of an event
    • Joint probability of an event
    • Conditional (posterior) probability of an event
  • Maximum A Posteriori (MAP) and Maximum Likelihood (ML) Hypotheses
    • MAP hypothesis: highest conditional probability given the observations (data)
    • ML hypothesis: highest likelihood of generating the observed data
    • ML estimation (MLE): estimating parameters to find the ML hypothesis
• Bayesian Inference: Computing Conditional Probabilities (CPs) in a Model
• Bayesian Learning: Searching Model (Hypothesis) Space using CPs
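
For reference, the definitions named above in their standard forms; the symbols h (hypothesis), D (data), and H (hypothesis space) follow the usual textbook convention rather than the slide itself:

    % The three Kolmogorov axioms, for events A, B in a sample space \Omega
    \Pr(A) \ge 0, \qquad \Pr(\Omega) = 1, \qquad
    \Pr(A \cup B) = \Pr(A) + \Pr(B) \quad \text{for disjoint } A, B

    % Bayes's Theorem: posterior from likelihood, prior, and evidence
    \Pr(h \mid D) = \frac{\Pr(D \mid h)\,\Pr(h)}{\Pr(D)}

    % MAP and ML hypotheses; with a uniform prior \Pr(h), they coincide
    h_{\mathrm{MAP}} = \arg\max_{h \in H} \Pr(D \mid h)\,\Pr(h), \qquad
    h_{\mathrm{ML}} = \arg\max_{h \in H} \Pr(D \mid h)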

12. Summary Points
• Introduction to Probabilistic Reasoning
  • Framework: using probabilistic criteria to search the hypothesis space H
  • Probability foundations
    • Definitions: subjectivist, objectivist; Bayesian, frequentist, logicist
    • Kolmogorov axioms
• Bayes's Theorem
  • Definition of conditional (posterior) probability
  • Product rule
• Maximum A Posteriori (MAP) and Maximum Likelihood (ML) Hypotheses
  • Bayes's Rule and MAP
  • Uniform priors: allow use of MLE to generate MAP hypotheses (checked in the sketch below)
  • Relation to version spaces and candidate elimination
• Next Week: Chapter 14, Russell and Norvig
• Later: Bayesian learning: MDL, BOC, Gibbs, Simple (Naïve) Bayes
  • Categorizing text and documents; other applications
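
The point that uniform priors let MLE stand in for MAP can be checked directly. Below is a minimal Python sketch over a hypothetical three-hypothesis space of coin biases, with made-up flip data:

    import math

    hypotheses = [0.3, 0.5, 0.7]    # candidate values of Pr(heads), one per hypothesis
    data = [1, 1, 0, 1, 1]          # observed flips (1 = heads); hypothetical

    def log_likelihood(theta: float, flips: list[int]) -> float:
        """log Pr(D | h) for independent Bernoulli flips with bias theta."""
        return sum(math.log(theta if x else 1.0 - theta) for x in flips)

    def map_hypothesis(prior: dict[float, float]) -> float:
        """argmax over h of Pr(D | h) * Pr(h), computed in log space."""
        return max(hypotheses,
                   key=lambda h: log_likelihood(h, data) + math.log(prior[h]))

    uniform = {h: 1.0 / len(hypotheses) for h in hypotheses}
    ml = max(hypotheses, key=lambda h: log_likelihood(h, data))
    print(ml, map_hypothesis(uniform))   # 0.7 0.7: identical under a uniform prior

Because log Pr(h) is the same constant for every h under a uniform prior, it cannot change the argmax, so the MAP and ML hypotheses coincide.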
