
Artificial Intelligence 15-381 Heuristic Search Methods


Presentation Transcript


  1. Artificial Intelligence 15-381: Heuristic Search Methods Jaime Carbonell jgc@cs.cmu.edu 17 January 2002 Today's Agenda • Island search & complexity analysis • Heuristics and evaluation functions • Heuristic search methods • Admissibility and A* search • B* search (time permitting) • Macro-operators in search (time permitting)

  2. Complexity of Search • Definitions • Let depth d = length(min(s-Path(S0, SG))) - 1 • Let branching factor b = Ave(|Succ(Si)|) • Let backward branching factor B = Ave(|Succ⁻¹(Si)|); usually b = B, but not always • Let C(&lt;method&gt;, b, d) = max number of Si visited C(&lt;method&gt;, b, d) = worst-case time complexity C(&lt;method&gt;, b, d) ≥ worst-case space complexity

  3. Complexity of Search • Breadth-First Search Complexity • C(BFS, b, d) = Σ_{i=0..d} b^i = O(b^d) • C(BBFS, B, d) = Σ_{i=0..d} B^i = O(B^d) • C(BiBFS, b, d) = 2 Σ_{i=0..d/2} b^i = O(b^{d/2}), if b = B • Suppose we have k evenly-spaced islands in s-Path(S0, SG); then: C(IBFS, b, d) = (k+1) Σ_{i=0..d/(k+1)} b^i = O(b^{d/(k+1)}) C(BiIBFS, b, d) = 2(k+1) Σ_{i=0..d/(2k+2)} b^i = O(b^{d/(2k+2)})
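
The island-search savings above are easy to see numerically. Below is a small illustrative sketch (not from the lecture) that evaluates the three sums for example values of b, d, and k; the specific numbers are arbitrary.

    # Worst-case node counts for plain, bidirectional, and island-driven BFS,
    # following the sums on the slide. Values of b, d, k are illustrative only.
    def nodes_bfs(b, d):
        # sum_{i=0..d} b^i
        return sum(b**i for i in range(d + 1))

    def nodes_bidirectional_bfs(b, d):
        # two frontiers, each to depth d/2 (assumes forward and backward b are equal)
        return 2 * sum(b**i for i in range(d // 2 + 1))

    def nodes_island_bfs(b, d, k):
        # k evenly spaced islands cut the path into k+1 segments of depth d/(k+1)
        return (k + 1) * sum(b**i for i in range(d // (k + 1) + 1))

    if __name__ == "__main__":
        b, d, k = 3, 12, 2
        print("BFS:           ", nodes_bfs(b, d))              # 797,161 nodes
        print("Bidirectional: ", nodes_bidirectional_bfs(b, d))  # 2,186 nodes
        print("Island (k=2):  ", nodes_island_bfs(b, d, k))      # 363 nodes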

  4. Heuristics in AI Search • Definition A Heuristic is an operationally-effective nugget of information on how to direct search in a problem space. Heuristics are only approximately correct. Their purpose is to minimize search on average.

  5. Common Types of Heuristics • "If-then" rules for state-transition selection • Macro-operator formation [discussed later] • Problem decomposition [e.g. hypothesizing islands on the search path] • Estimation of distance between Scurr and SG (e.g. Manhattan, Euclidean, topological distance) • Value function on each Succ(Scurr) • cost(path(S0, Scurr)) + E[cost(path(Scurr, SG))] • Utility: value(S) – cost(S)
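
As one concrete instance of the distance-estimation heuristics above, here is a minimal sketch (not from the lecture) of the Manhattan-distance heuristic for the 8-puzzle; the tuple state encoding and goal layout are assumptions made for the example.

    # Manhattan-distance heuristic for the 8-puzzle.
    # State: a tuple of 9 tiles read row by row, 0 marking the blank (an assumption).
    GOAL = (1, 2, 3, 4, 5, 6, 7, 8, 0)

    def manhattan(state, goal=GOAL):
        # Sum over tiles of |row offset| + |column offset| to the tile's goal slot.
        total = 0
        for idx, tile in enumerate(state):
            if tile == 0:                      # the blank does not count
                continue
            gidx = goal.index(tile)
            total += abs(idx // 3 - gidx // 3) + abs(idx % 3 - gidx % 3)
        return total

    print(manhattan((1, 2, 3, 4, 5, 6, 7, 0, 8)))   # one tile one column away -> 1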

  6. Heuristic Search Value function: E(o-Path(S0, Scurr), Scurr, SG) Since S0 and SG are constant, we abbreviate E(Scurr) General Form: 1. Quit if done (with success or failure), else: 2. s-Queue := F(Succ(Scurr), s-Queue) 3. Snext := Argmax[E(s-Queue)] 4. Go to 1, with Scurr := Snext
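
A minimal sketch of this general form in Python follows; successors, is_goal, the evaluation function E, and the queue-merging function F are all assumed to be supplied by the problem, and the step numbers in the comments mirror the list above.

    def heuristic_search(start, is_goal, successors, E, F, max_steps=10_000):
        current, queue = start, []
        for _ in range(max_steps):
            if is_goal(current):
                return current                       # step 1: quit with success
            queue = F(successors(current), queue)    # step 2: merge successors into s-Queue
            if not queue:
                return None                          # step 1: quit with failure
            current = max(queue, key=E)              # step 3: S_next = argmax E over s-Queue
            queue.remove(current)                    # step 4: loop with S_curr := S_next
        return None

Different choices of F instantiate the methods on the next slides: returning only Succ(Scurr) gives hill-climbing, while appending to and re-sorting the whole queue gives best-first search.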

  7. Heuristic Search Steepest-Ascent Hill-Climbing • F(Succ(Scurr), s-Queue) = Succ(Scurr) • No history stored in s-Queue, hence: space complexity = max(b) [= O(1) if b is bounded] • Quintessential greedy search Max-Gradient Search • "Informed" depth-first search • Snext := Argmax[E(Succ(Scurr))] • But if Succ(Snext) is null, then backtrack • Alternative: backtrack if E(Snext) < E(Scurr)
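
A minimal sketch of steepest-ascent hill-climbing, assuming successors() and E() are provided by the problem; it keeps no history and stops at the first local maximum, which is exactly the greedy behavior described above.

    def hill_climb(start, successors, E, max_steps=1_000):
        current = start
        for _ in range(max_steps):
            succs = successors(current)
            if not succs:
                return current                 # dead end: nothing left to climb to
            best = max(succs, key=E)
            if E(best) <= E(current):
                return current                 # local maximum: no successor improves E
            current = best                     # greedy step to the best successor
        return current

Max-gradient search differs only in the stopping rule: it keeps moving to Argmax[E(Succ(Scurr))] and backtracks when the successor set is empty (or, in the alternative, when E decreases).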

  8. Beyond Greedy Search • Best-First Search
     BestFS(Scurr, SG, s-Queue):
       IF Scurr = SG, return SUCCESS
       FOR Si in Succ(Scurr):
         Insertion-sort(<Si, E(Si)>, s-Queue)
       IF s-Queue = Null, return FAILURE
       ELSE return BestFS(FIRST(s-Queue), SG, TAIL(s-Queue))
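
The recursive pseudocode maps naturally onto a priority queue; below is a minimal iterative sketch, assuming successors() and E() are supplied and treating lower E as better. A visited set is added to avoid re-expanding states, which the slide's pseudocode leaves implicit.

    import heapq
    from itertools import count

    def best_first_search(start, is_goal, successors, E):
        tie = count()                                  # tie-breaker for equal E values
        queue = [(E(start), next(tie), start)]         # the insertion-sorted s-Queue
        visited = set()
        while queue:
            _, _, state = heapq.heappop(queue)         # FIRST(s-Queue)
            if is_goal(state):
                return state                           # SUCCESS
            if state in visited:
                continue
            visited.add(state)
            for s in successors(state):                # insertion-sort the successors
                heapq.heappush(queue, (E(s), next(tie), s))
        return None                                    # FAILURE: s-Queue exhausted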

  9. Beyond Greedy Search • Best-First Search (cont.) • F(Succ(Scurr), s-Queue) = Sort(Append(Succ(Scurr), Tail(s-Queue)), E(Si)) • Full-breadth search • "Ragged" fringe expansion • Does BestFS guarantee optimality?

  10. Beyond Greedy Search • Beam Search • Best-few BFS • Beam-width parameter • Uniform fringe expansion • Does Beam Search guarantee optimality?
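
Beam search is best-first search with the fringe truncated after every expansion. A minimal sketch, again assuming successors() and E() (higher E = better here) come from the problem:

    def beam_search(start, is_goal, successors, E, beam_width=3, max_steps=1_000):
        beam = [start]
        for _ in range(max_steps):
            candidates = []
            for state in beam:
                if is_goal(state):
                    return state
                candidates.extend(successors(state))
            if not candidates:
                return None
            # Uniform fringe expansion: keep only the beam_width best candidates.
            beam = sorted(candidates, key=E, reverse=True)[:beam_width]
        return None

Because states outside the beam are discarded, beam search (like hill-climbing) cannot guarantee optimality.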

  11. A* Search • Cost Function Definitions • Let g(Scurr) = actual cost to reach Scurr from S0 • Let h(Scurr) = estimated cost from Scurr to SG • Let f(Scurr) = g(Scurr) + h(Scurr)

  12. A* Search Definitions • Optimality Definition A solution is optimal if it conforms to the minimal-cost path between S0 and SG. If operator costs are uniform, then the optimal solution = shortest path. • Admissibility Definition A heuristic is admissible with respect to a search method if it guarantees finding the optimal solution first, even when its value is only an estimate.

  13. A* Search Preliminaries • Admissible Heuristics for BestFS • (1) "Always expand the node with min(g(Scurr)) first." If a solution is found, keep expanding any Si in s-Queue where g(Si) < g(SG) • (2) Find a solution any which way, then BestFS over the intermediate Si of that solution, as follows: • If g(S1curr) >= g(SG) from the previous solution, quit • Else if g(S1G) < g(SG), Sol := Sol1, and redo (1).

  14. A*: Better Admissible Heuristics Observations on admissible heuristics • Admissible heuristics based only on look-back (e.g. on g(S)) can lead to massive inefficiency! • Can we do better? • Can we look forward (e.g. beyond g(Scurr)) too? • Yes, we can!

  15. A*: Better Admissible Heuristics • The A* Criterion If h(Scurr) always equals or underestimates the true remaining cost, then f(Scurr) is admissible with respect to Best-First Search. • A* Search A* Search = BestFS with admissible f = g + h under the admissibility constraints above.
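
A minimal A* sketch under this criterion, assuming successors() yields (next_state, step_cost) pairs and h() is the admissible (never-overestimating) estimate; the names are illustrative, not the lecture's code.

    import heapq
    from itertools import count

    def a_star(start, is_goal, successors, h):
        tie = count()
        frontier = [(h(start), next(tie), 0, start)]        # entries: (f, tie, g, state)
        best_g = {start: 0}
        while frontier:
            f, _, g, state = heapq.heappop(frontier)
            if is_goal(state):
                return g                                    # optimal cost when h is admissible
            if g > best_g.get(state, float("inf")):
                continue                                    # stale queue entry, skip it
            for nxt, cost in successors(state):
                g2 = g + cost
                if g2 < best_g.get(nxt, float("inf")):
                    best_g[nxt] = g2
                    heapq.heappush(frontier, (g2 + h(nxt), next(tie), g2, nxt))
        return None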

  16. A* Optimality Proof Goal and Path Proofs • Let SG be the optimal goal state, and s-Path(S0, SG) be the optimal solution. • Consider an A* search tree rooted at S0 with a second goal SG2 on the fringe. • Must prove f(SG2) >= f(SG) and that g(path(S0, SG)) is minimal (optimal). • The text proves optimality by contradiction.

  17. A* Optimality Proof • Simpler Optimality Proof for A* • Assume s-Queue is sorted by f. • Pick a sub-optimal SG2: g(SG2) > g(SG) • Since h(SG2) = h(SG) = 0, f(SG2) > f(SG) • Because s-Queue is sorted by f, SG is selected before SG2

  18. B* Search • Ideas • Admissible heuristics for mono- and bi-polar search • "Eliminates" horizon problem in game-trees [more later] • Definitions • Let Best(S) = Always optimistic eval fn. • Let Worst(S) = Always pessimistic eval fn. • Hence: Worst(S) < True-eval(S) < Best(S)

  19. Basic B* Search • Basic B* Search B*(S) is defined as: If there is an Si in SUCC(Scurr) s.t. for all other Sj in SUCC(Scurr), W(Si) > B(Sj), then select Si; else ProveBest(SUCC(Scurr)) OR DisproveRest(SUCC(Scurr)) • Difficulties in B* • Guaranteeing eternal pessimism in W(S) (eternal optimism is somewhat easier) • Switching between ProveBest and DisproveRest • Usually W(S) << True-eval(S) << B(S), so it is often not possible to achieve W(Si) > B(Sj)
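
The selection test itself is easy to state in code. A minimal sketch, assuming the Worst() and Best() interval bounds are supplied by the evaluator: a successor is provably best once its pessimistic bound beats every rival's optimistic bound.

    def b_star_select(successors, worst, best):
        # Return the successor proven best by its bounds, or None if more search is needed.
        for si in successors:
            if all(worst(si) > best(sj) for sj in successors if sj is not si):
                return si
        return None   # no separation yet: ProveBest / DisproveRest must tighten the bounds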

  20. Macro-operators in Search • Linear Macros • Cached sequence of instantiated operators: If: S0 --opi--> S1 --opj--> S2, Then: S0 --opi,j--> S2 • Alternative notation: if opj(opi(S0)) = S2, then opi,j(S0) = S2 • Macros can have any length, e.g. opi,j,k,l,m,n • Key question: do linear macros reduce search?
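
A linear macro is just operator composition, cached so it applies in one step. A minimal sketch with operators modeled as plain functions on states (an illustrative assumption, not the lecture's representation):

    def make_macro(*ops):
        # Cache a fixed operator sequence op_i, op_j, ... as one composed operator.
        def macro(state):
            for op in ops:
                state = op(state)
            return state
        return macro

    op_i = lambda s: s + 1          # toy operators on an integer "state"
    op_j = lambda s: s * 2
    op_ij = make_macro(op_i, op_j)
    assert op_ij(3) == op_j(op_i(3)) == 8

Whether such macros reduce search depends on the trade-off between shallower solutions and a larger effective branching factor, which is the key question the slide raises.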

  21. Macro-operators in Search • Disjunctive Macros • Iterative Macros [Slide diagram: a disjunctive macro branching over alternative operator sequences (op1 through op7), and an iterative macro opi,j / opk,l,m,n that loops on a condition Cond(s-Hist, SG) with YES/NO exits before applying opo,p,q]
