Learning As Search
Presentation Transcript

Overview
  • Inductive learning as search
  • Current best hypothesis search
  • Least commitment search
Learning as search
  • An interesting view of learning
  • Helps in understanding the learning process, including generalization and specialization
  • Brings concepts such as the hypothesis space into sharper focus
  • We will see two more learning algorithms
  • An elegant instructional tool, with some applications as well
The Setting
  • The target function is a logical sentence
  • In our examples: propositional logic
  • Task: find a hypothesis, or hypotheses, for a goal predicate consistent with all examples seen so far
  • Example: the goal predicate would be WillWait(instance) in the restaurant example
  • Each hypothesis proposes an expression for the goal predicate
  • Hypothesis space: the set of all hypotheses an algorithm entertains, e.g. decision trees
  • Having seen no examples, any hypothesis could be correct:

H1 ∨ H2 ∨ … ∨ Hn

  • The goal predicate g() may or may not hold for each example
  • False positive example i for a hypothesis h():
    • If g(i) is false, but h(i) is true
  • False negative example i for a hypothesis h():
    • If g(i) is true, but h(i) is false
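The false-positive and false-negative definitions above translate directly into code. A minimal sketch, assuming a hypothesis h and the goal predicate g are modeled as Boolean functions of an example; the restaurant-style example data is made up for illustration:

```python
def false_positive(h, g, i):
    """h(i) is true, but the goal predicate g(i) is actually false."""
    return h(i) and not g(i)

def false_negative(h, g, i):
    """h(i) is false, but the goal predicate g(i) is actually true."""
    return g(i) and not h(i)

def consistent(h, g, examples):
    """h is consistent: no false positives or negatives on the examples."""
    return all(h(i) == g(i) for i in examples)

# Illustrative (made-up) restaurant examples: suppose WillWait holds
# exactly when the restaurant is not full.
g = lambda i: not i["full"]          # the true goal predicate
h = lambda i: i["hungry"]            # a candidate hypothesis
i1 = {"hungry": True, "full": True}
print(false_positive(h, g, i1))      # -> True: h says wait, g says don't
```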
Current-Best-Hypothesis Search
  • Local search (iterative improvement)
  • Start with a hypothesis and keep it consistent with examples seen:
    • If a false positive is seen, then specialize
    • If a false negative is seen, then generalize
    • (try to keep the hypothesis consistent with the previous examples)
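The loop above can be sketched as follows. This is a minimal illustration, not the full algorithm: backtracking is omitted, and generalize(h, i) / specialize(h, i) are assumed to be supplied operators that return candidate replacement hypotheses.

```python
def current_best_hypothesis(examples, h, g, generalize, specialize):
    """One pass of current-best-hypothesis search (no backtracking).

    generalize/specialize are assumed given; each returns candidate
    replacement hypotheses for h in the light of example i."""
    seen = []
    for i in examples:
        seen.append(i)
        if h(i) and not g(i):          # false positive: specialize
            candidates = specialize(h, i)
        elif g(i) and not h(i):        # false negative: generalize
            candidates = generalize(h, i)
        else:
            continue                   # h already classifies i correctly
        # keep the first candidate consistent with all examples seen so far
        h = next((c for c in candidates
                  if all(c(e) == g(e) for e in seen)), None)
        if h is None:
            raise ValueError("dead end: a full algorithm would backtrack")
    return h
```

On a toy domain of threshold hypotheses (h(i) means i ≥ t), generalizing lowers the threshold to cover a missed positive and specializing raises it to exclude a false positive.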
Specialization and Generalization
  • Specialization (Restriction): Add a conjunction or drop a disjunction
  • Generalization (Relaxation): Add a disjunction or drop a conjunction
  • There could be other changes depending on the representation language
  • The choices of initial hypothesis and of which specialization or generalization to apply are nondeterministic (use heuristics)
  • Problems:
    • What heuristics to use
    • Could be inefficient (need to check consistency with all examples), and perhaps backtrack or restart
    • Handling noise
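For a conjunction of propositional literals, the add-a-conjunct and drop-a-conjunct operators above are easy to write down. A hedged sketch, with hypotheses as sets of literal names (the literal names are made up for illustration):

```python
def generalizations_by_dropping(h):
    """Drop one conjunct: every way of removing a literal (generalize)."""
    return [h - {lit} for lit in h]

def specializations_by_adding(h, vocabulary):
    """Add one conjunct: every way of adding an unused literal (specialize)."""
    return [h | {lit} for lit in vocabulary if lit not in h]
```

Dropping a conjunct is a generalization because more instances satisfy the shorter conjunction; adding one is a specialization for the symmetric reason.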
Least-Commitment Search
  • Also called the version space (v.s.) algorithm
  • Keep the whole version space = the set of all consistent hypotheses
  • As each example is seen, drop the inconsistent hypotheses (shrink the v.s.)
  • Return all the consistent hypotheses
  • But there is an enormous number of hypotheses…
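The naive version of this idea (enumerate every hypothesis up front, then eliminate) can be sketched in a few lines, assuming the hypothesis space is small enough to list explicitly:

```python
def version_space(hypotheses, g, examples):
    """Naive least-commitment learning: start with every hypothesis and
    drop each one the moment it disagrees with an observed example."""
    vs = list(hypotheses)
    for i in examples:
        vs = [h for h in vs if h(i) == g(i)]   # shrink the version space
    return vs
```

With threshold hypotheses over a small integer range, only the hypothesis matching the true threshold survives all the examples. The enormous (often infinite) size of realistic hypothesis spaces is exactly why the explicit list must be replaced by a boundary representation, as the next slide shows.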
Represent the Frontier
  • Solution: don’t maintain the version space explicitly;

Keep only the boundary of the version space

  • There is a partial ordering on the hypothesis space under the specializes/generalizes relationship
  • Maintain only the most general and the most specific hypotheses of the version space
Version Space Maintenance
  • S: the set of most specific hypotheses in the v.s. (initially False)
  • G: the set of most general hypotheses in the v.s. (initially True)
  • For each new example and each h in S or G:
    • if the example is a false negative for h in S, generalize h
    • if the example is a false positive for h in S, throw h out of S!
    • Symmetrically for h in G
  • Generalization or specialization can create multiple hypotheses (children)
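The S/G maintenance above can be sketched for one concrete representation: conjunctions over discrete attributes, where a hypothesis is a tuple with one entry per attribute holding a concrete value, '?' (any value), or None (part of the empty hypothesis, which matches nothing). For this language the S boundary is a single hypothesis; the attribute domains are assumed given, and boundary pruning is kept minimal.

```python
def matches(h, x):
    return all(hv is not None and (hv == '?' or hv == xv)
               for hv, xv in zip(h, x))

def more_general_or_equal(h, s):
    return all(hv == '?' or hv == sv for hv, sv in zip(h, s))

def min_generalize(s, x):
    """Least generalization of s that covers positive example x."""
    if all(v is None for v in s):                  # empty hypothesis
        return tuple(x)
    return tuple('?' if sv != xv else sv for sv, xv in zip(s, x))

def min_specializations(g, x, domains):
    """Minimal specializations of g that exclude negative example x."""
    return [g[:k] + (v,) + g[k + 1:]
            for k, gv in enumerate(g) if gv == '?'
            for v in domains[k] if v != x[k]]

def candidate_elimination(examples, domains):
    n = len(domains)
    S = (None,) * n                                # most specific (False)
    G = [('?',) * n]                               # most general (True)
    for x, positive in examples:
        if positive:
            # false negative for S -> generalize it;
            # false positive for a member of G -> throw it out of G
            if not matches(S, x):
                S = min_generalize(S, x)
            G = [g for g in G if matches(g, x)]
        else:
            # false positive for a member of G -> replace it with its
            # minimal specializations that still cover S
            # (a false positive for S itself would mean no consistent
            # conjunction exists)
            new_G = []
            for g in G:
                if matches(g, x):
                    new_G.extend(h for h in min_specializations(g, x, domains)
                                 if more_general_or_equal(h, S))
                else:
                    new_G.append(g)
            G = list(dict.fromkeys(new_G))         # drop duplicates
    return S, G
```

On a tiny two-attribute domain, one positive and one negative example leave a specific boundary of one hypothesis and a general boundary of two: exactly the "multiple children" behavior the slide mentions.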
Stopping Condition
  • If no hypotheses are left but examples remain, then what?
  • If the examples are exhausted and one hypothesis is left, great!
  • What if multiple hypotheses remain?
  • Advantages over Current-Best-Hypothesis search?
  • Disadvantages (in general)?
  • The Current-Best-Hypothesis and Version-Space algorithms are examples of incremental, or online, learning algorithms