
Heuristic Optimization Methods






Presentation Transcript


  1. Heuristic Optimization Methods Scatter Search

  2. Agenda • Scatter Search (SS) • For Local Search based Metaheuristics: • SA based on ideas from nature • TS based on problem-solving and learning • For population based Metaheuristics: • GA based on ideas from nature • SS based on problem-solving and learning • Nature works, but usually very slowly • Being clever is better than emulating nature?

  3. The following is a presentation previously given at the ICS 2003 conference... Scatter Search: Methodology and Applications Manuel Laguna, University of Colorado; Rafael Martí, University of Valencia

  4. Based on … Laguna, M. and R. Martí (2003) Scatter Search: Methodology and Implementations in C, Kluwer Academic Publishers, Boston.

  5. Scatter Search Methodology

  6. Metaheuristic • A metaheuristic refers to a master strategy that guides and modifies other heuristics to produce solutions beyond those that are normally generated in a quest for local optimality. • A metaheuristic is a procedure that has the ability to escape local optimality

  7. Typical Search Trajectory

  8. Metaheuristic Classification • x/y/z Classification • x = A (adaptive memory) or M (memoryless) • y = N (systematic neighborhood search) or S (random sampling) • z = 1 (one current solution) or P (population of solutions) • Some Classifications • Tabu Search (A/N/1) • Genetic Algorithms (M/S/P) • Scatter Search (M/N/P)

  9. Scatter Search (basic flow): the Diversification Generation Method and the Improvement Method fill the population P (repeat until |P| = PSize); the Reference Set Update Method builds the RefSet; then the Subset Generation Method, the Solution Combination Method, and the Improvement Method feed new trial solutions back into the Reference Set Update Method; stop when no more new solutions enter the RefSet.
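The five-method template above can be sketched as a short driver loop. This is an illustrative Python skeleton, not the book's C implementation: the five callables (diversify, improve, update_refset, gen_subsets, combine) are placeholders for the problem-specific methods named on the slide.

```python
import random

def scatter_search(diversify, improve, update_refset, gen_subsets, combine,
                   psize=10, b=5):
    """Sketch of the basic scatter search template (illustrative only)."""
    # Diversification Generation + Improvement: build P of size psize
    P = []
    while len(P) < psize:
        s = improve(diversify())
        if s not in P:                      # keep P free of duplicates
            P.append(s)
    # Reference Set Update: select b reference solutions from P
    refset = update_refset(P, b)
    new_solutions = True
    while new_solutions:                    # stop if no more new solutions
        new_solutions = False
        for subset in gen_subsets(refset):
            for trial in combine(subset):
                trial = improve(trial)
                updated = update_refset(refset + [trial], b)
                if updated != refset:       # trial entered the RefSet
                    refset, new_solutions = updated, True
    return refset
```

With numeric solutions, a quality-only update, all-pairs subsets, and midpoint combination, the loop terminates once combinations stop improving the RefSet.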

  10. Scatter Search with Rebuilding (flow): as in the basic design, but when the RefSet admits no more new solutions, the Diversification Generation Method rebuilds P and the search continues; stop when MaxIter is reached.

  11. Tutorial • Unconstrained Nonlinear Optimization Problem

  12. Diversification Generation Method: each variable's range [-10, +10] is divided into four subranges ([-10, -5], [-5, 0], [0, +5], [+5, +10]); the probability of selecting a subrange depends on a frequency count, and a value is then drawn within the chosen subrange.
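A minimal sketch of this generator for a single variable. The inverse-frequency weighting (rarely used subranges become more likely) follows the spirit of the Laguna and Martí design, but the function name and the exact 1/(1 + count) weighting are illustrative assumptions:

```python
import random

def diversification_generator(lo=-10.0, hi=10.0, nsub=4, freq=None):
    """Draw one variable value: pick a subrange using frequency counts,
    then sample uniformly inside it (illustrative sketch)."""
    if freq is None:
        freq = [0] * nsub
    width = (hi - lo) / nsub
    # weight each subrange by 1/(1 + count): less-used subranges win more often
    weights = [1.0 / (1 + c) for c in freq]
    k = random.choices(range(nsub), weights=weights)[0]
    freq[k] += 1                            # update the frequency memory
    return random.uniform(lo + k * width, lo + (k + 1) * width)
```

Sharing one `freq` list across calls gives the memory effect: every subrange ends up visited.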

  13. Diverse Solutions

  14. Improvement Method: the Nelder-Mead simplex method (Nelder and Mead, 1965)

  15. Reference Set Update Method (Initial RefSet) • The objective function value measures quality: b1 high-quality solutions • A min-max criterion over Euclidean distances measures diversity: b2 diverse solutions • Together they form a RefSet of size b = b1 + b2
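The two-phase construction can be sketched as follows, assuming the objective f is to be minimized and solutions are real-valued vectors; `build_refset` and `euclid` are illustrative names. Each diverse pick maximizes the distance to its nearest RefSet member (the min-max criterion):

```python
import math

def euclid(x, y):
    """Euclidean distance between two solution vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)))

def build_refset(population, f, b1, b2):
    """Initial RefSet: b1 best-quality solutions, then b2 diverse ones
    chosen by the min-max distance criterion (sketch; f is minimized)."""
    pool = sorted(population, key=f)
    refset = pool[:b1]                      # quality phase
    rest = pool[b1:]
    for _ in range(b2):                     # diversity phase
        # candidate whose nearest RefSet member is farthest away
        best = max(rest, key=lambda s: min(euclid(s, r) for r in refset))
        refset.append(best)
        rest.remove(best)
    return refset
```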

  16. Initial RefSet High-Quality Solutions Diverse Solutions

  17. Subset Generation Method • All pairs of reference solutions that include at least one new solution • The method generates (b² - b)/2 pairs from the initial RefSet
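A sketch of this rule: generate every pair with at least one solution flagged as new. In the initial RefSet every solution is new, so all (b² - b)/2 pairs are produced; the name `generate_pairs` and the boolean-flag representation are illustrative.

```python
def generate_pairs(refset, new_flags):
    """All pairs of reference solutions containing at least one new solution.
    new_flags[i] marks whether refset[i] entered since the last round."""
    b = len(refset)
    return [(refset[i], refset[j])
            for i in range(b) for j in range(i + 1, b)
            if new_flags[i] or new_flags[j]]
```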

  18. Combination Method

  19. Alternative Combination Method

  20. Reference Set Update Method (dynamic): the RefSet of size b is kept sorted by quality from best (1) to worst (b); a new trial solution that is better than the current worst replaces it and is inserted in its proper position, yielding the updated RefSet.

  21. Static Update: new trial solutions are collected in a pool; the updated RefSet consists of the best b solutions from RefSet ∪ Pool, sorted by quality from best (1) to worst (b).
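The static update is a one-liner in spirit: merge, sort by quality, truncate. A sketch assuming hashable solutions and a minimized objective f:

```python
def static_update(refset, pool, f, b):
    """Static Reference Set Update: keep the best b solutions from
    RefSet union Pool, sorted by objective value (f minimized)."""
    merged = sorted(set(refset) | set(pool), key=f)
    return merged[:b]
```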

  22. RefSet after Update

  23. Additional Strategies • Reference Set • Rebuilding • Multi-tier • Subset Generation • Subsets of size > 2 • Combination Method • Variable number of solutions

  24. Rebuilding: the Diversification Generation Method and the Reference Set Update Method rebuild the diverse portion (b2) of the RefSet, while the b1 high-quality solutions are retained.

  25. 2-Tier RefSet: a solution produced by the Solution Combination Method and the Improvement Method first tries to enter the quality tier (b1); if it fails, it then tries the diversity tier (b2).

  26. 3-Tier RefSet: as in the 2-tier design (try the b1 tier first; if it fails, try the b2 tier), with a third tier (b3) in which departing solutions are tried.

  27. Subset Generation • Subset Type 1: all 2-element subsets. • Subset Type 2: 3-element subsets derived from the 2-element subsets by augmenting each 2-element subset to include the best solution not in this subset. • Subset Type 3: 4-element subsets derived from the 3-element subsets by augmenting each 3-element subset to include the best solution not in this subset. • Subset Type 4: the subsets consisting of the best i elements, for i = 5 to b.
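The four subset types above can be sketched directly, assuming the RefSet is already sorted best-first by quality. Duplicate subsets across types (e.g., {1,2}+3 and {1,3}+2 both yielding {1,2,3}) are not removed in this sketch:

```python
def generate_subsets(refset):
    """The four subset types; refset is assumed sorted best-first."""
    b = len(refset)
    # Type 1: all 2-element subsets
    type1 = [[refset[i], refset[j]] for i in range(b) for j in range(i + 1, b)]
    def augment(subsets):
        # add the best solution (first in best-first order) not in the subset
        return [s + [next(x for x in refset if x not in s)] for s in subsets]
    type2 = augment(type1)                  # 3-element subsets
    type3 = augment(type2)                  # 4-element subsets
    type4 = [refset[:i] for i in range(5, b + 1)]  # best-i subsets, i = 5..b
    return type1, type2, type3, type4
```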

  28. Subsets of Size > 2

  29. Variable Number of Solutions: with the RefSet sorted by quality from best (1) to worst (b), the Combination Method generates more trial solutions from high-quality subsets (e.g., 5 solutions) than from mid-range subsets (3 solutions) or low-quality subsets (1 solution).

  30. Hybrid Approaches • Use of Memory • Tabu Search mechanisms for intensification and diversification • GRASP Constructions • Combination Methods • GA Operators • Path Relinking

  31. Multiobjective Scatter Search • This is a fruitful research area • Many multiobjective evolutionary approaches exist (Coello, et al. 2002) • SS can use similar techniques developed for MOEA (multiobjective evolutionary approaches)

  32. Multiobjective EA Techniques • Independent Sampling • Search on f(x) = Σi wi fi(x) • Change weights and rerun • Criterion Selection • Divide reference set into k subsets • Admission to the ith subset is according to fi(x)
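Independent sampling amounts to scalarizing the objectives with a weight vector and rerunning the search per vector. A tiny illustrative helper (name and shape assumed):

```python
def weighted_sum(objectives, weights):
    """Return the scalarized objective f(x) = sum_i w_i * f_i(x);
    rerunning the search with different weights traces different
    trade-off solutions."""
    return lambda x: sum(w * f(x) for w, f in zip(weights, objectives))
```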

  33. Advanced Designs • Reference Set Update • Dynamic / Static • 2 Tier / 3 Tier • Subset Generation • Use of Memory • Explicit Memory • Attributive Memory • Path Relinking

  34. An Example: The Linear Ordering Problem • Given a matrix of weights E = {eij}m×m, the LOP consists of finding a permutation p of the columns (and rows) in order to maximize the sum of the weights in the upper triangle: maximize cE(p) = Σi<j ep(i)p(j) • Applications • Triangulation for Input-Output Economic Tables • Aggregation of individual preferences • Classifications in Sports

  35. An Instance: for a 4×4 matrix E, the permutation p = (1,2,3,4) gives cE(p) = 12+5+3+2+6+9 = 37, while the optimal permutation p* = (3,4,1,2) gives cE(p*) = 9+8+3+11+4+12 = 47.
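The two sums above fix most entries of the instance matrix; the entries e21 and e43 are not recoverable from the slide, so the values below are placeholders (they do not affect either sum). A sketch of the LOP evaluation:

```python
# Matrix reconstructed from the slide's two sums; e21 = 1 and e43 = 2
# are assumed placeholder values (not given on the slide).
E = [
    [ 0, 12,  5,  3],
    [ 1,  0,  2,  6],   # e21 = 1 is an assumed value
    [ 8,  3,  0,  9],
    [11,  4,  2,  0],   # e43 = 2 is an assumed value
]

def c_E(p):
    """Sum of upper-triangle weights after permuting rows/columns by p
    (p uses 1-based sector labels, as on the slide)."""
    return sum(E[p[i] - 1][p[j] - 1]
               for i in range(len(p)) for j in range(i + 1, len(p)))
```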

  36. Diversification Generator • Use of problem structure to create methods in order to achieve a good balance between quality and diversity. • Quality • Deterministic constructive method • Diversity • Random Generator • Systematic Generators (Glover, 1998) • GRASP constructions. • The method randomly selects from a short list of the most attractive sectors. • Use of Memory • Modifying a measure of attractiveness proposed by Becker with a frequency-based memory measure that discourages sectors from occupying positions that they have frequently occupied.

  37. Diversity vs. Quality • Compare the different generators • Create a set of 100 solutions with each one d = Standardized Diversity C = Standardized Quality

  38. Improvement Method • INSERT_MOVE(pj, i) consists of deleting pj from its current position j and inserting it in position i • Apply a first-improvement strategy • scan the list of sectors for the first sector whose move results in an improvement • MoveValue = CE(p') - CE(p) • Example: CE(p') = 78 + (1 - 4) + (6 - 0) + (2 - 6) + (13 - 4) = 78 + 8 = 86
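The insert move and the first-improvement scan can be sketched as below for any objective to be maximized; the function names and the exact scan order (positions left to right) are illustrative assumptions:

```python
def insert_move(p, j, i):
    """INSERT_MOVE: remove the element at position j, reinsert at position i."""
    q = list(p)
    q.insert(i, q.pop(j))
    return q

def first_improvement(p, value):
    """Repeatedly apply the first insert move that improves `value`
    (a sketch of the slide's first-improvement strategy; maximization)."""
    improved = True
    while improved:
        improved = False
        for j in range(len(p)):
            for i in range(len(p)):
                if i == j:
                    continue
                q = insert_move(p, j, i)
                if value(q) > value(p):     # MoveValue = c(p') - c(p) > 0
                    p, improved = q, True
                    break
            if improved:
                break
    return p
```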

  39. Solution Combination Method • The method scans (from left to right) each reference permutation. • Each reference permutation votes for its first element that is still not included in the combined permutation (its "incipient element"). • The voting determines the next element to enter the first still-unassigned position of the combined permutation. • The vote of a given reference solution is weighted according to the incipient element's position. • Example, with (3,1,2,_,_) under construction: (3,1,4,2,5) votes for 4, (1,4,3,5,2) votes for 4, and (2,1,3,5,4) votes for 5; element 4 wins and the solution becomes (3,1,2,4,_).
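The voting rule can be sketched as below. The slide only says the vote is weighted by the incipient element's position; the inverse-position weight 1/(pos + 1) used here is an assumed choice, and ties fall to the earliest-voted element:

```python
def combine_by_voting(refs):
    """Build one combined permutation: each reference permutation votes for
    its first element not yet placed (its 'incipient element'), weighted by
    the inverse of that element's position (assumed weighting)."""
    n = len(refs[0])
    combined = []
    while len(combined) < n:
        votes = {}
        for r in refs:
            # incipient element: first entry of r not yet in the combination
            pos, elem = next((i, e) for i, e in enumerate(r) if e not in combined)
            votes[elem] = votes.get(elem, 0.0) + 1.0 / (pos + 1)
        combined.append(max(votes, key=votes.get))
    return combined
```

On the slide's three reference permutations this reproduces the example step: with (3,1,2) placed, element 4 out-votes element 5.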

  40. Experiments with LOLIB • 49 Input-Output Economic Tables

  41. Another Example: A Commercial SS Implementation • OptQuest Callable Library (by OptTek) • Like other context-independent methods, it separates the search method from the solution evaluation.

  42. OptQuest-based Applications: the OptQuest engine acts as the solution generator; the user's application acts as the solution evaluator.

  43. Feasibility and Evaluation: the OptQuest engine generates a new solution; the user implementation checks feasibility, evaluates the solution, and returns the result to OptQuest.

  44. Comparison with Genocop • Average on 28 hard nonlinear instances

  45. Conclusions • The development of metaheuristics usually entails a fair amount of experimentation (“skill comes from practice”). • Code objectives: • Quick Start • Benchmark • Advanced Designs • Scatter Search provides a flexible “framework” to develop solving methods.

  46. Metaheuristic Classification • x/y/z Classification • x = A (adaptive memory) or M (memoryless) • y = N (systematic neighborhood search) or S (random sampling) • z = 1 (one current solution) or P (population of solutions) • Some Classifications • Tabu Search (A/N/1) • Genetic Algorithms (M/S/P) • Scatter Search (M/N/P)

  47. Some Classifications, along two axes (local search vs. population; systematic vs. randomized): Tabu Search A/N/1 (local search, systematic); Simulated Annealing M/S/1 (local search, randomized); Scatter Search M/N/P (population, systematic); Genetic Algorithm M/S/P (population, randomized).

  48. About the Classifications • Our four main methods (SA, TS, GA, SS) all lie far from the center: they are either very randomized or very systematic • Other methods combine some element of randomized with some element of systematic behaviour • Most implementations mix the ingredients: there is an element of local search in population-based methods (e.g., Memetic Algorithms), and an element of randomness in some systematic approaches (such as a random tabu tenure in TS) • The classifications highlight the differences between methods, but there are also many similarities

  49. GA vs. SS (1) • GA has a "long" history: proposed in the 1970s and immediately popular • It was not initially used for optimization • It gradually morphed into a methodology whose major concern is the solution of optimization problems • The concepts and principles of SS were also proposed early (in the 1970s), but were not popularized until the 1990s • The SS template most often used is from 1998 • SS was originally proposed to solve Integer Programming problems

  50. GA vs. SS (2) • GA is based on natural processes (genetics, "survival of the fittest", and imitation of nature) • SS is based on strategic ideas for how to use adaptive memory • Some TS concepts are critically linked with SS
