
CSC 450 - AI Local search Algorithms



  1. CSC 450 - AI Local search Algorithms

  2. Outline • Understanding Local search algorithms • Hill-climbing search • Simulated annealing search • Local beam search • Genetic algorithms

  3. Previously • Addressed a single category of problems: • observable, deterministic, known environments where the solution is a sequence of actions. • In this chapter, we look at what happens when these assumptions are relaxed • thereby getting closer to the real world.

  4. This part • covers algorithms that perform purely local search in the state space • evaluating and modifying one or more current states rather than systematically exploring paths from an initial state. • These algorithms are suitable for problems in which all that matters is the solution state, not the path cost to reach it. • The family of local search algorithms includes methods inspired by statistical physics (simulated annealing) and evolutionary biology (genetic algorithms).

  5. Local search Algorithms • Basic search algorithms explore the search space systematically • keeping one or more paths in memory and recording which alternatives have been explored; when a goal is found, the path to that goal also constitutes a solution to the problem. • In many problems, however, the path to the goal is irrelevant. • For example, in the 8-queens problem, what matters is the final configuration of queens, not the order in which they are added. • Local search algorithms are a different class of algorithms • ones that do not worry about paths at all.

  6. Local search Algorithms • Local search algorithms operate using a single current node (rather than multiple paths) and generally move only to neighbors of that node. Typically, the paths followed by the search are not retained. • Two key advantages: • (1) they use very little memory—usually a constant amount; and • (2) they can often find reasonable solutions in large or infinite (continuous) state spaces for which systematic algorithms are unsuitable. • Are useful for solving pure optimization problems, in which the aim is to find the best state according to an objective function.

  7. Understanding local search algorithms • A landscape has both "location" (defined by the state) and "elevation" (defined by the value of the heuristic cost function or objective function). • If elevation corresponds to cost, then the aim is to find the lowest valley—a global minimum; if elevation corresponds to an objective function, then the aim is to find the highest peak—a global maximum.

  8. Understanding local search algorithms • Local search algorithms explore the landscape. • A complete local search algorithm always finds a goal if one exists; • An optimal algorithm always finds a global minimum/maximum.

  9. Part 2 Algorithms

  10. Hill climbing search • The hill-climbing search algorithm is simply a loop that continually moves in the direction of increasing value—that is, uphill. • Terminates when it reaches a "peak" where no neighbor has a higher value. • The algorithm does not maintain a search tree, • the current node need only record the state and the value of the objective function. • Hill climbing does not look ahead beyond the immediate neighbors of the current state.
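
To make this loop concrete, here is a minimal steepest-ascent sketch in Python. The successors() and objective() callables are placeholder names assumed to be supplied by the problem; they are not defined in the slides.

```python
# Steepest-ascent hill climbing (sketch). successors() and objective()
# are placeholders assumed to be supplied by the problem.

def hill_climbing(initial, successors, objective):
    current = initial
    while True:
        neighbors = list(successors(current))
        if not neighbors:
            return current
        best = max(neighbors, key=objective)
        # A "peak": no neighbor has a strictly higher value, so stop here.
        if objective(best) <= objective(current):
            return current
        current = best
```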

  11. Hill-climbing search • "Like climbing Everest in thick fog with amnesia"

  12. Hill-climbing search • Problem: depending on initial state, can get stuck in local maxima

  13. Example: n-queens • Put n queens on an n × n board with no two queens on the same row, column, or diagonal. • To illustrate hill climbing, we will use the 8-queens problem.

  14. The 8-queens problem • Local search algorithms use a complete-state formulation, where each state has 8 queens on the board, one per column. • The successors of a state are all possible states generated by moving a single queen to another square in the same column • (so each state has 8 × 7 = 56 successors). • The heuristic cost function h is the number of pairs of queens that are attacking each other, either directly or indirectly. • The global minimum of this function is zero, which occurs only at perfect solutions (no attacks).
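
As a rough sketch of this formulation, the heuristic and the successor set can be written as below, assuming a state is a tuple in which state[c] holds the row of the queen in column c (a representation choice, not something fixed by the slides).

```python
from itertools import combinations

def attacking_pairs(state):
    """h: number of pairs of queens attacking each other (same row or same diagonal)."""
    return sum(1 for (c1, r1), (c2, r2) in combinations(enumerate(state), 2)
               if r1 == r2 or abs(r1 - r2) == abs(c1 - c2))

def successors(state):
    """All states reached by moving one queen within its own column (8 * 7 = 56 of them)."""
    n = len(state)
    for col in range(n):
        for row in range(n):
            if row != state[col]:
                yield state[:col] + (row,) + state[col + 1:]
```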

  15. The 8-queens problem • h = number of pairs of queens that are attacking each other, either directly or indirectly • h = 17 for the state shown on this slide

  16. The 8-queens problem • A local minimum with h = 1, close to the optimum solution

  17. Hill climb advantages • The relative simplicity of the algorithm • makes it a popular first choice amongst optimizing algorithms. • It is used widely in artificial intelligence, for reaching a goal state from a starting node. • Hill climbing can often produce a better result than other algorithms when the amount of time available to perform a search is limited, • such as with real-time systems. • It is an anytime algorithm: it can return a valid solution even if it's interrupted at any time before it ends.

  18. Hill climb Disadvantages • Unfortunately, hill climbing often gets stuck for the following reasons: • Local maxima • Plateau • Ridges • In each case, the algorithm reaches a point at which no progress is being made.

  19. Hill Climbing: Disadvantages Local maxima • A state that is better than all of its neighbours, but not better than some other states far away. • Is a peak that is higher than each of its neighboring states but lower than the global maximum.

  20. Hill Climbing: Disadvantages Plateau A flat area of the search space in which all neighbouring states have the same value.

  21. Hill Climbing: Disadvantages Ridge The orientation of the high region, compared to the set of available moves, makes it impossible to climb up. However, two moves executed serially may increase the height.

  22. Hill Climbing: Disadvantages • Hill-climbing algorithms can often fail to find a goal when one exists because they can get stuck. • Ways Out • Backtrack to some earlier node and try going in a different direction. • Stochastic hill climbing chooses at random from among the uphill moves. • Random restart. "If at first you don't succeed, try, try again." (a sketch follows below) • Lecture 1 end
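
A hedged sketch of random-restart hill climbing for n-queens, reusing the hypothetical hill_climbing, successors, and attacking_pairs helpers sketched earlier; the restart count is an arbitrary choice.

```python
import random

def random_restart_queens(restarts=50, n=8):
    """Random-restart hill climbing for n-queens (sketch): retry from random states."""
    best = None
    for _ in range(restarts):
        start = tuple(random.randrange(n) for _ in range(n))
        result = hill_climbing(start, successors,
                               objective=lambda s: -attacking_pairs(s))
        if best is None or attacking_pairs(result) < attacking_pairs(best):
            best = result
        if attacking_pairs(best) == 0:   # perfect solution found
            break
    return best
```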

  23. Simulated annealing

  24. Simulated Annealing: the idea • A hill-climbing algorithm is incomplete • It gets stuck • In contrast, a purely random walk—that is, moving to a successor chosen uniformly at random from the set of successors—is complete but extremely inefficient. • Therefore, it seems reasonable to combine hill climbing with a random walk in some way that yields both efficiency and completeness. • Simulated annealing is such an algorithm.

  25. Simulated annealing search • Idea: escape local maxima by allowing some "bad" moves but gradually decrease their frequency.

  26. Properties of simulated annealing search • One can prove: if T decreases slowly enough, then simulated annealing search will find a global optimum with probability approaching 1. • Widely used in VLSI layout, airline scheduling, etc.
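
The slides do not give pseudocode for the algorithm itself, so the sketch below follows the standard schedule-based formulation; the geometric cooling schedule and its parameters are assumptions, not taken from the slides.

```python
import math
import random

def simulated_annealing(initial, successors, objective,
                        t0=1.0, cooling=0.995, t_min=1e-3):
    """Simulated annealing (sketch) with an assumed geometric cooling schedule.

    Uphill moves are always accepted; downhill moves are accepted with
    probability exp(delta / T), so "bad" moves become rarer as T decreases.
    """
    current = initial
    t = t0
    while t > t_min:
        neighbor = random.choice(list(successors(current)))
        delta = objective(neighbor) - objective(current)
        if delta > 0 or random.random() < math.exp(delta / t):
            current = neighbor
        t *= cooling
    return current
```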

  27. Local beam search

  28. Local beam search • Keep track of k states rather than just one • Start with k randomly generated states • At each iteration, all the successors of all k states are generated • If any one is a goal state, stop; else select the k best successors from the complete list and repeat. • In a local beam search, useful information is passed among the parallel search threads. In effect, the states that generate the best successors say to the others, "Come over here, the grass is greener!"
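
A compact sketch of this loop, assuming a hypothetical is_goal predicate alongside the same placeholder successors() and objective() callables; the iteration cap is an added safeguard.

```python
def local_beam_search(initial_states, successors, objective, is_goal, max_iters=1000):
    """Local beam search (sketch): keep the k best states from the pooled successors."""
    states = list(initial_states)
    k = len(states)
    for _ in range(max_iters):
        pool = [s for state in states for s in successors(state)]
        if not pool:
            return max(states, key=objective)
        goals = [s for s in pool if is_goal(s)]
        if goals:
            return goals[0]
        states = sorted(pool, key=objective, reverse=True)[:k]
    return max(states, key=objective)
```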

  29. Local beam search • In its simplest form, local beam search can suffer from a lack of diversity among the k states • they can quickly become concentrated in a small region of the state space, making the search little more than an expensive version of hill climbing • This problem can be alleviated by stochastic beam search • Instead of choosing the best k from the pool of candidate successors, it chooses k successors at random, with the probability of choosing a given successor being an increasing function of its value. • Resemblance to the process of natural selection
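
One way the stochastic selection step could look, as a sketch that assumes objective values are non-negative so they can serve directly as sampling weights.

```python
import random

def stochastic_beam_select(pool, objective, k):
    """Choose k successors at random, weighted by objective value (assumed >= 0)."""
    weights = [objective(s) for s in pool]
    return random.choices(pool, weights=weights, k=k)
```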

  30. Genetic algorithms GA

  31. Genetic algorithms GA • A successor state is generated by combining two parent states • GA is a variant of stochastic beam search in which successor states are generated by combining two parent states rather than by modifying a single state. • Start with k randomly generated states (population) • A state is represented as a string over a finite alphabet (often a string of 0s and 1s) • Evaluation function (fitness function): higher values for better states • Produce the next generation of states by selection, crossover, and mutation
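
A sketch of how one generation might be produced for the 8-queens encoding (each individual a tuple of eight row numbers). The fitness-proportional selection, single-point crossover, and mutation rate are assumptions; the slides only name the three operators.

```python
import random

def next_generation(population, fitness, mutation_rate=0.05):
    """One GA generation (sketch): fitness-proportional selection, single-point
    crossover, and per-gene mutation over 8-queens individuals (tuples of rows 0-7)."""
    weights = [fitness(ind) for ind in population]
    children = []
    for _ in range(len(population)):
        p1, p2 = random.choices(population, weights=weights, k=2)   # selection
        cut = random.randrange(1, len(p1))                          # crossover point
        child = p1[:cut] + p2[cut:]
        child = tuple(random.randrange(8) if random.random() < mutation_rate else gene
                      for gene in child)                            # mutation
        children.append(child)
    return children
```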

  32. Genetic algorithms • Fitness function: number of non-attacking pairs of queens (min = 0, max = 8 × 7/2 = 28) • Selection probability is proportional to fitness: 24/(24+23+20+11) = 31%, 23/(24+23+20+11) = 29%, etc.
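
The percentages are fitness-proportional selection probabilities. A quick check of the arithmetic, using the four fitness values quoted on the slide:

```python
fitnesses = [24, 23, 20, 11]                     # fitness of the four individuals
total = sum(fitnesses)                           # 78
print([f"{f / total:.0%}" for f in fitnesses])   # ['31%', '29%', '26%', '14%']
```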

  33. Genetic algorithms GA

  34. Genetic algorithms GA • Like stochastic beam search, genetic algorithms combine an uphill tendency with random exploration and exchange of information among parallel search threads. • The primary advantage, if any, of genetic algorithms comes from the crossover operation. • The crossover operation is able to combine large blocks of letters that have evolved independently to perform useful functions, thus raising the level of granularity at which the search operates.
