
Informed (Heuristic) Search Algorithms


Presentation Transcript


  1. Informed (Heuristic) Search Algorithms

  2. Homework #1 assigned; due 10/4, before Exam 1.

  3. Evaluating heuristic functions. [Figure: a search graph with nodes A, B, C, D and goal G, annotated with two candidate heuristics (pink and green) and the nodes N1–N4 generated during search.] Is the pink heuristic admissible? No (see node A). Is the green heuristic admissible? No (see node A).
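As a concrete illustration of the admissibility check on this slide, here is a minimal Python sketch. The node names and cost values (`h_star`, `h_candidate`) are made-up placeholders, not the numbers from the slide's figure.

```python
# Minimal sketch: a heuristic h is admissible iff h(n) <= h*(n) for every node n,
# where h*(n) is the true cost of the cheapest path from n to a goal.
# Node names and values below are illustrative placeholders, not the slide's figures.

h_star = {"A": 9.0, "B": 8.9, "C": 8.8, "D": 25.0, "G": 0.0}       # assumed optimal costs
h_candidate = {"A": 7.0, "B": 8.6, "C": 8.7, "D": 25.0, "G": 0.0}  # assumed heuristic

def is_admissible(h, h_star):
    """Return True if h never overestimates the true cost to the goal."""
    return all(h[n] <= h_star[n] for n in h_star)

print(is_admissible(h_candidate, h_star))  # True for these placeholder values
```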

  4. If h2(n) ≥ h1(n) for all n (and both are admissible), we say that h2 dominates h1, or h2 is more informed than h1.

  5. Uniform-cost and A*. [Figure comparing the search behavior of UCS and A*.]

  6. Admissibility/Informedness. [Figure: heuristic value vs. search nodes, ordering the heuristics h1, h2, h3, max(h2, h3), h4, h5 relative to h*.]
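One practical consequence of the ordering shown here: the pointwise maximum of admissible heuristics is itself admissible and at least as informed as each component. A minimal sketch with made-up placeholder heuristics (`h2`, `h3` below are not the slide's):

```python
# Sketch: if h2 and h3 are both admissible, then max(h2(n), h3(n)) is also
# admissible and at least as informed as either one. The heuristics below are
# placeholders, not the h1..h5 from the slide's figure.

def combine_max(*heuristics):
    """Combine admissible heuristics by taking their pointwise maximum."""
    def h(node):
        return max(h_i(node) for h_i in heuristics)
    return h

h2 = lambda n: 2.0   # assumed admissible estimate
h3 = lambda n: 3.0   # assumed admissible estimate
h_combined = combine_max(h2, h3)
print(h_combined("A"))   # 3.0
```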

  7. Consistency (not required for HW and exam purposes). Consistency implies admissibility, but admissibility does not imply consistency.
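For reference, consistency can be checked edge by edge: h(n) ≤ c(n, n') + h(n') for every successor n'. A minimal sketch, with an assumed graph representation and placeholder values:

```python
# Sketch (assumed representation): consistency requires h(n) <= c(n, n') + h(n')
# for every edge (n, n') with step cost c. 'graph' maps each node to a list of
# (successor, step_cost) pairs; the structure and values are illustrative.

graph = {"A": [("B", 1.0), ("C", 2.0)], "B": [("G", 9.0)], "C": [("G", 9.0)], "G": []}
h = {"A": 7.0, "B": 8.6, "C": 8.7, "G": 0.0}

def is_consistent(graph, h):
    return all(h[n] <= cost + h[succ]
               for n, edges in graph.items()
               for succ, cost in edges)

print(is_consistent(graph, h))  # True for these placeholder values
```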

  8. Evaluation functions: UCS uses f(n) = g(n); A* uses f(n) = g(n) + h(n); greedy best-first search uses f(n) = h(n).
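Since the three strategies differ only in the evaluation function used to order the frontier, a single best-first search routine can implement all of them. A minimal sketch; the graph, heuristic values, and the helper `best_first_search` are illustrative assumptions, not code from the course.

```python
# A single best-first search routine; UCS, A*, and greedy differ only in f.
# The graph and heuristic below are illustrative placeholders.
import heapq

def best_first_search(start, goal, successors, f):
    """successors(node) -> iterable of (child, step_cost); f(g, node) -> priority."""
    frontier = [(f(0.0, start), 0.0, start, [start])]
    best_g = {}
    while frontier:
        _, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return path, g
        if node in best_g and best_g[node] <= g:
            continue
        best_g[node] = g
        for child, cost in successors(node):
            g2 = g + cost
            heapq.heappush(frontier, (f(g2, child), g2, child, path + [child]))
    return None, float("inf")

graph = {"A": [("B", 1.0), ("C", 2.0)], "B": [("G", 9.0)], "C": [("G", 9.0)], "G": []}
h = {"A": 7.0, "B": 8.6, "C": 8.7, "G": 0.0}

ucs_f    = lambda g, n: g           # UCS:    f(n) = g(n)
astar_f  = lambda g, n: g + h[n]    # A*:     f(n) = g(n) + h(n)
greedy_f = lambda g, n: h[n]        # Greedy: f(n) = h(n)

print(best_first_search("A", "G", lambda n: graph[n], astar_f))  # (['A', 'B', 'G'], 10.0)
```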

  9. IDA*
• Basically IDDFS, except that instead of the iterations being defined in terms of depth, we define them in terms of f-value.
• Start with the f cutoff equal to the f-value of the root node.
• Loop:
  - Generate and search all nodes whose f-values are less than or equal to the current cutoff, using depth-first search within each iteration.
  - Keep track of the node N' that has the smallest f-value still larger than the current cutoff; let this f-value be next-largest-f-value.
  - If the search finds a goal node, terminate. If not, set cutoff = next-largest-f-value and go back to Loop.
Properties: linear memory. Number of iterations in the worst case = b^d (happens when all nodes have distinct f-values).
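A minimal Python sketch of the loop described above. The graph and heuristic are placeholders, and the on-path cycle check is an added assumption for graphs (a pure tree search would not need it).

```python
# IDA* sketch: repeatedly run a depth-first search bounded by an f-value cutoff,
# raising the cutoff to the smallest f-value that exceeded it in the last pass.
import math

def ida_star(start, goal, successors, h):
    def dfs(node, g, cutoff, path):
        f = g + h(node)
        if f > cutoff:
            return None, f                      # report the exceeding f-value
        if node == goal:
            return path, f
        next_cutoff = math.inf
        for child, cost in successors(node):
            if child in path:                   # avoid cycles on the current path
                continue
            found, value = dfs(child, g + cost, cutoff, path + [child])
            if found is not None:
                return found, value
            next_cutoff = min(next_cutoff, value)
        return None, next_cutoff

    cutoff = h(start)                           # f-value of the root node
    while True:
        found, value = dfs(start, 0.0, cutoff, [start])
        if found is not None:
            return found
        if value == math.inf:                   # nothing left beyond the cutoff
            return None
        cutoff = value                          # next-largest-f-value

graph = {"A": [("B", 1.0), ("C", 2.0)], "B": [("G", 9.0)], "C": [("G", 9.0)], "G": []}
h = {"A": 7.0, "B": 8.6, "C": 8.7, "G": 0.0}
print(ida_star("A", "G", lambda n: graph[n], lambda n: h[n]))  # ['A', 'B', 'G']
```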

  10. Who will give you admissible h(n)?

  11. Relaxed problems. A problem with fewer restrictions on the actions is called a relaxed problem. Heuristics can be obtained from relaxed problems: the more relaxed the problem, the easier the heuristic is to compute, but the less accurate it is. For the 8-puzzle: if we assume a tile can move directly to its place, the distance is the number of misplaced tiles; if we assume a tile can move only one position at a time, the distance is the sum of Manhattan distances. Both are sketched below.
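A sketch of both relaxed-problem heuristics for the 8-puzzle. The state encoding (a 9-tuple read row by row, with 0 marking the blank) and the function names are assumptions for illustration.

```python
# Two relaxed-problem heuristics for the 8-puzzle.
GOAL = (1, 2, 3, 4, 5, 6, 7, 8, 0)

def misplaced_tiles(state, goal=GOAL):
    """Relaxation: a tile may jump directly to its place -> count misplaced tiles."""
    return sum(1 for s, g in zip(state, goal) if s != 0 and s != g)

def manhattan_distance(state, goal=GOAL):
    """Relaxation: a tile may slide one square at a time, even onto occupied
    squares -> sum of each tile's Manhattan distance from its goal square."""
    goal_pos = {tile: (i // 3, i % 3) for i, tile in enumerate(goal)}
    total = 0
    for i, tile in enumerate(state):
        if tile == 0:
            continue
        r, c = i // 3, i % 3
        gr, gc = goal_pos[tile]
        total += abs(r - gr) + abs(c - gc)
    return total

state = (1, 2, 3, 4, 0, 6, 7, 5, 8)          # placeholder example state
print(misplaced_tiles(state), manhattan_distance(state))  # 2 2
```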

  12. Different levels of abstraction for shortest-path problems on the plane. The obstacles in the shortest-path problem can be abstracted in a variety of ways. The more the abstraction, the cheaper it is to solve the problem in the abstract space; the less the abstraction, the more "informed" the heuristic cost (i.e., the closer the abstract path length is to the actual path length). [Figure: paths from I to G under the "disappearing-act abstraction" (hD), the "circular abstraction" (hC), and the "polygonal abstraction" (hP), compared with the actual h*.] Why are we inscribing the obstacles rather than circumscribing them?

  13. How informed should the heuristic be? [Figure: total cost incurred in search, broken into the cost of computing the heuristic and the cost of searching with the heuristic, plotted across the heuristics h0, hD, hC, hP, h* as the level of abstraction is reduced (i.e., the heuristic becomes more and more concrete).]
• It is not always clear where the total minimum occurs.
• Old wisdom was that the global minimum was closer to the cheaper heuristics.
• Current insight is that it may well be far from the cheaper heuristics for many problems.
