
MAE 552 – Heuristic Optimization Lecture 4 January 30, 2002

Learn about neighborhood search in problem solving, which involves exploring points in the search space that are close to a given point. Discover how to define and evaluate a neighborhood to find optimal solutions.




  1. MAE 552 – Heuristic Optimization Lecture 4 January 30, 2002

  2. http://coool.mines.edu/report/node3.html

  3. Basics of problem Solving-Evaluation Function • For every real-world problem the evaluation function is chosen by the designer. • It should of course indicate, for instance, that a solution that meets the objective is better than one that does not. • Its choice should also take into account factors such as the computational complexity of the problem. • Often the objective function suggests a good evaluation function. • Objective: Minimize Stress → Evaluation Function: Stress

  4. Basics of problem Solving-Evaluation Function • Other times you cannot derive a useful evaluation function from the objective. • In the SAT problem the objective is to find a set of boolean (TRUE, FALSE) variables that satisfies a logical statement (makes it TRUE). • All wrong candidate solutions return FALSE, which tells you nothing about how to improve the solution.
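A standard remedy (not shown on the slide) is to grade partial success instead of returning a single TRUE/FALSE, for example by counting satisfied clauses. A minimal Python sketch, using a made-up 3-clause instance:

```python
import itertools

# Hypothetical 3-clause instance in CNF: each clause is a list of
# (variable_index, wanted_value) literals; a clause is satisfied if
# any one of its literals matches the assignment.
clauses = [[(0, True), (1, False)],
           [(1, True), (2, True)],
           [(0, False), (2, False)]]

def eval_sat(assignment, clauses):
    """Count satisfied clauses -- a graded score, unlike the raw
    TRUE/FALSE objective, so the search can tell near-misses apart."""
    return sum(any(assignment[var] == val for var, val in literals)
               for literals in clauses)

# Every assignment now gets a score between 0 and len(clauses).
scores = {a: eval_sat(a, clauses)
          for a in itertools.product([True, False], repeat=3)}
```

An assignment that satisfies all three clauses scores len(clauses), while a "wrong" assignment still reveals how close it came.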

  5. Basics of problem Solving-Defining a Search Problem • When you design an evaluation function you need to consider that for many problems the only solutions of interest are the subset that are feasible (satisfy the constraints). • The feasible space can be defined as F, where F ⊆ S. • A search problem can then be defined as: • Given a search space S and its feasible part F ⊆ S, find x ∈ F such that • eval(x) ≤ eval(y) for all y ∈ F (THIS IS THE DEF. OF A GLOBAL OPT, assuming minimization) • Note that the objective does not appear at all in the formulation!! • If your evaluation function does not correspond with the objective you will be searching for the answer to the wrong problem.

  6. Basics of problem Solving-Defining a Search Problem • A point x that satisfies this condition is called a global solution. • Finding a global solution can be difficult, and in some cases it is impossible to prove that one has been found. • It would be easier if we could limit the search to a smaller area of S. • This fact underlies many search techniques.

  7. Basics of problem Solving-Neighborhood Search • If we concentrate on the area of S ‘near’ to some point in the search space we can more easily look in this ‘neighborhood’. • N(x) is the set of all points in the search space that are ‘close’ to the given point x: • N(x) = {y ∈ S : dist(x, y) ≤ ε}
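The distance-based definition above can be sketched directly; the radius eps is an illustrative choice, not fixed by the slide:

```python
import math

def in_neighborhood(x, y, eps=0.5):
    """N(x) = { y in S : dist(x, y) <= eps }, here with Euclidean
    distance; eps is an assumed tuning parameter."""
    return math.dist(x, y) <= eps

# Points within 0.5 of (0, 0) are neighbors; points farther away are not.
near = in_neighborhood((0.0, 0.0), (0.3, 0.3))
far = in_neighborhood((0.0, 0.0), (1.0, 1.0))
```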

  8. Basics of problem Solving-Neighborhood Search • For a continuous NLP the Euclidean distance can be used to define a neighborhood. • For the TSP a 2-swap neighborhood can be defined as all of the candidates that would result from swapping two cities in a given tour. • A solution x (a permutation of n=5 cities) • 1-2-3-4-5 has n(n-1)/2 neighbors including • 1-3-2-4-5 (swapping cities 2 and 3) • 5-2-3-4-1 (swapping cities 1 and 5) • etc.
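The 2-swap neighborhood is easy to enumerate; a short Python sketch using the 5-city tour from the slide:

```python
from itertools import combinations

def two_swap_neighbors(tour):
    """All tours obtained by swapping the positions of two cities --
    the 2-swap neighborhood described on the slide."""
    neighbors = []
    for i, j in combinations(range(len(tour)), 2):
        t = list(tour)
        t[i], t[j] = t[j], t[i]   # swap cities at positions i and j
        neighbors.append(tuple(t))
    return neighbors

nbrs = two_swap_neighbors((1, 2, 3, 4, 5))   # n(n-1)/2 = 10 neighbors
```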

  9. Basics of problem Solving-Neighborhood Search • Example: quadratic objective with no constraints, F = x² + 3, with the current point xc marked on the curve.

  10. Basics of problem Solving-Neighborhood Search • Min F = x² + 3 • Step 1: Define a neighborhood around the current point xc. • N(x): xc − ε ≤ x ≤ xc + ε

  11. Basics of problem Solving-Neighborhood Search • F = x² + 3 • Step 2: Sample a candidate solution x1 from the neighborhood and evaluate it. • If F(x1) > F(xc), reject the point and choose another.

  12. Basics of problem Solving-Neighborhood Search • F = x² + 3 • If F(x1) < F(xc), accept the point and replace the current point xc with x1.

  13. Basics of problem Solving-Neighborhood Search • F = x² + 3 • Step 3: Create a new neighborhood around the new xc and repeat the process.
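The three steps above can be sketched as a single loop; eps, the iteration budget, and the RNG seed are illustrative choices, not part of the slides:

```python
import random

def neighborhood_search(f, x0, eps=0.5, iters=200, seed=0):
    """Minimize f by repeatedly sampling the interval
    [xc - eps, xc + eps] and accepting only improvements."""
    rng = random.Random(seed)
    xc = x0
    for _ in range(iters):
        x1 = rng.uniform(xc - eps, xc + eps)   # Steps 1-2: sample N(xc)
        if f(x1) < f(xc):                      # accept only improvements
            xc = x1                            # Step 3: recenter N(.)
    return xc

# The slide's example: F = x^2 + 3, starting from xc = 4.0.
x_best = neighborhood_search(lambda x: x**2 + 3, x0=4.0)
```

Because only improving moves are accepted, the search walks down the quadratic bowl toward x = 0.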

  14. Basics of problem Solving-Neighborhood Search • Most realistic problems are considerably more difficult than a quadratic bowl problem. • The evaluation function defines a response surface that describes the topography of the search space with many hills and valleys.

  15. Basics of problem Solving-Neighborhood Search • Finding the best peak or the lowest valley is like trying to navigate a mountain range in the dark with only a small lamp. • Your decisions must be made using local information. You can sample points in a local area and then decide where to walk next.

  16. Basics of problem Solving-Neighborhood Search • If you decide to always go uphill then you will reach a peak but not necessarily the highest peak. • You may need to walk downhill in order to eventually reach the highest peak in the space.

  17. Basics of problem Solving-Local Optima • With the notion of a neighborhood we can define the idea of a local optimum. • A potential solution x ∈ F is a local optimum if and only if: • eval(x) ≤ eval(y) for all y ∈ N(x) • If N(x) is small then it is relatively easy to search it for the best solution, but it is also easy to get trapped in a local optimum. • If N(x) is large then the visibility of the design space increases and the chances of finding the global optimum increase. • A large N(x) also leads to more computational expense.

  18. Basics of problem Solving -Local Optima • With a small neighborhood N(x), only a local optimum may be found.

  19. Basics of problem Solving -Local Optima • With a large neighborhood N(x) the global optimum is more likely to be found, but at high computational expense. • The size of the neighborhood should fit the problem!!!!

  20. Formal Implementation of Neighborhood Search -Hill Climbing Methods • Basic hill climbing methods utilize the concept of a neighborhood search and iterative improvement to find local optima. • During each iteration the best solution is selected from the neighborhood N(x) and is used to replace the current solution. • If there are no better solutions in N(x), then a local optimum has been reached and a new design point is selected at random to start the next iteration. • Hill climbing methods are VERY dependent on the starting point of the algorithm and the size of the neighborhood. • Always go uphill (or downhill in the case of minimization).

  21. Hill Climbing Procedure

  Begin
    Set t = 0
    Repeat
      local = FALSE
      Select a current point vc at random
      Evaluate vc
      Repeat
        Select all points in the neighborhood of vc
        Select the point vn from this set with the best value of the evaluation function eval
        If eval(vn) is better than eval(vc)
          Then vc = vn
          Else local = TRUE
      Until local
      t = t + 1
      If best is unset or eval(vc) is better than best
        Then best = eval(vc)
    Until t = MAX_ITERATIONS
  End
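A Python rendering of the procedure, applied to a toy 1-D maximization problem; the neighbors and sample helpers, the restart budget, and the example function are assumptions of this sketch:

```python
import random

def hill_climb(f, neighbors, sample, restarts=20, seed=0):
    """Restart hill climbing following the slide's procedure: from a
    random start, repeatedly move to the best neighbor until no
    neighbor improves, then restart (maximization assumed here)."""
    rng = random.Random(seed)
    best_x, best_f = None, float("-inf")
    for _ in range(restarts):
        vc = sample(rng)
        while True:
            vn = max(neighbors(vc), key=f)   # best point in N(vc)
            if f(vn) > f(vc):
                vc = vn                      # climb
            else:
                break                        # local optimum reached
        if f(vc) > best_f:
            best_x, best_f = vc, f(vc)
    return best_x, best_f

# Toy problem: maximize -(x - 7)^2 on the integers 0..20;
# neighbors are the adjacent integers.
f = lambda x: -(x - 7) ** 2
result = hill_climb(f,
                    neighbors=lambda x: [max(x - 1, 0), min(x + 1, 20)],
                    sample=lambda rng: rng.randint(0, 20))
```

On this unimodal toy problem every restart climbs to x = 7; on a multimodal surface, different starting points would reach different local optima.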

  22. Disadvantages of Hill Climbers • They usually terminate at solutions that are only locally optimal. • There is no information about how far the discovered local optimum is from the global optimum or from other local optima. • The optimum that is found depends on the initial configuration. • In general, it is not possible to provide an upper bound on the computation time.

  23. Advantages of Hill Climbers • They are very easy to apply!!!!!

  24. Balancing Local and Global Search • Effective search techniques balance exploitation and exploration. • Exploitation is the process of using the best solution as a jumping-off point for finding an improved solution. • Exploration is the process of exploring new areas of the search space. • Hill climbing methods utilize exploitation by effectively using the current best point, but they can neglect a large portion of the search space.

  25. Pure Random Search • Pure random search utilizes all exploration and no exploitation. It explores the space thoroughly, but forgoes exploiting promising areas of the design space.

  26. Random Search Procedure

  Begin
    Set t = 0
    Select an initial point vo at random
    Evaluate vo
    Set best_x = vo
    Set best_f = eval(vo)
    Repeat
      Select a point vc at random
      t = t + 1
      Evaluate vc
      If eval(vc) is better than best_f
        Then Set best_f = eval(vc)
             Set best_x = vc
    Until t = MAX_ITERATIONS
  End
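A Python rendering of the procedure, here minimizing the earlier example F = x² + 3; the sampling range, iteration budget, and seed are illustrative choices:

```python
import random

def random_search(f, sample, iters=1000, seed=0):
    """Pure random search as in the procedure above: draw independent
    points and keep the best seen (minimization assumed here)."""
    rng = random.Random(seed)
    best_x = sample(rng)
    best_f = f(best_x)
    for _ in range(iters):
        vc = sample(rng)
        if f(vc) < best_f:        # pure exploration: best_x is never
            best_x, best_f = vc, f(vc)   # used to pick the next point
    return best_x, best_f

x, fx = random_search(lambda x: x**2 + 3,
                      sample=lambda rng: rng.uniform(-10, 10))
```

Note that, unlike the hill climber, no candidate is generated from the current best point; every draw is independent.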
