
Introduction to Scatter Search



  1. Introduction to Scatter Search ENGG*6140 – Paper Review Presented by: Jason Harris & Stephen Coe M.Sc. Candidates University of Guelph

  2. Outline • Introduction • Scatter Search Template • Diversification Generation Method • Improvement Method • Reference Set Update Method • Subset Generation Method • Solution Combination Method • Example (0-1 Knapsack Problem) • Comparison with GA

  3. Introduction • Evolutionary method • Fundamental concepts were first introduced in the 1970s and are based on formulations from the 1960s • The original proposal was put forward by Fred Glover in 1977; the method then received little attention until about 1990 • Uses strategies that both diversify and intensify solutions • Solutions are generated using combination strategies rather than probabilistic learning approaches

  4. Scatter Search Foundations • Useful information about the solution is typically contained in a diverse collection of elite solutions • Combination strategies incorporate both diversification (extrapolation) and intensification (interpolation) • Combining multiple solutions enhances the opportunity to exploit information contained in the union of elite solutions

  5. Introduction [Diagram: reference solutions A, B and C; trial solutions 1–4 generated by convex and non-convex linear combinations] These solutions are in a raw form; in most scatter searches they are subject to heuristic improvement.

  6. Scatter Search Notation RefSet – reference set of solutions; b – size of the reference set; xi – the ith solution in the reference set (x1 is the best and xb the worst); P – set of solutions generated by the diversification generation method; PSize – size of the population of diverse solutions; s – a subset of reference solutions; sSize – size of the subset of reference solutions

  7. Scatter Search Template [Flow diagram: Diversification Generation Method → Improvement Method → P → Reference Set Update Method → RefSet → Subset Generation Method → subsets s → Solution Combination Method → Improvement Method, looping back to the Reference Set Update Method until no new reference solutions are added]

  8. Diversification Generation Method • The idea behind the diversification generation method is to generate a collection of diverse solutions • The quality of these solutions is not important at this stage • Generation methods are often customized to specific problems • PSize is usually set to the maximum of 100 and 5*b • The method can be totally deterministic or partially random

  9. Improvement Method • Must be able to handle both feasible and infeasible solutions • It is possible to generate multiple instances of the same solution • Generally employs local searches not unlike those previously introduced in this class (e.g., steepest descent) • This is the only component that is not strictly necessary for implementing the scatter search algorithm

  10. Subset Generation Method • Construct subsets by building subsets of Type 1, Type 2, Type 3 and Type 4 • For a RefSet of size b there are approximately (3b-7)*b/2 subset combinations • The number of subsets can be reduced by considering just one layer of subsets, reducing computational time

  11. Solution Combination Method • Generally problem specific, because it is directly tied to the solution representation • Can generate more than one solution, and can depend on the quality of the solutions being combined • Can also generate infeasible solutions • If a subset was already combined in a previous iteration, the combination need not be repeated

  12. Reference Set Update Method • Objective is to generate a collection of both high-quality solutions and diverse solutions • The number of solutions included in the RefSet is usually less than 20 • Consists of the b1 best solutions from the preceding step (solution combination or diversification generation) • Consists of the b2 solutions that have the largest Euclidean distance from the current RefSet solutions • Multiple techniques are employed to update the reference set (static, dynamic, 2-tier, etc.)

  13. Example (0-1 Knapsack) Maximize: 11x1 + 10x2 + 9x3 + 12x4 + 10x5 + 6x6 + 7x7 + 5x8 + 3x9 + 8x10 (coefficients represent the profit of each item) Subject to: 33x1 + 27x2 + 16x3 + 14x4 + 29x5 + 30x6 + 31x7 + 33x8 + 14x9 + 18x10 ≤ 100 (coefficients represent the weight of each item) xi ∈ {0,1} for i = 1, …, 10
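The instance above can be encoded and checked with a short Python snippet (the function name evaluate is ours, not from the slides):

```python
# Profit and weight coefficients of the 0-1 knapsack instance above.
profits = [11, 10, 9, 12, 10, 6, 7, 5, 3, 8]
weights = [33, 27, 16, 14, 29, 30, 31, 33, 14, 18]
CAPACITY = 100

def evaluate(x):
    """Return (profit, weight) of a 0-1 selection vector x."""
    profit = sum(p * xi for p, xi in zip(profits, x))
    weight = sum(w * xi for w, xi in zip(weights, x))
    return profit, weight

# A solution is feasible when its weight does not exceed CAPACITY.
print(evaluate([0, 1, 1, 1, 0, 1, 1, 1, 0, 1]))  # (57, 169): infeasible
```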

  14. Diversification Generator Our goal is to generate a diverse population from a random seed, e.g. x = (0,0,0,0,0,0,0,0,0,0). Select h ≤ n-1; we will choose h = 5. Set x′(1+hk) = 1 - x(1+hk) for k = 0, 1, …, ⌊(n-1)/h⌋ (positions that are not visited keep the value of the original seed x). Another set of diverse solutions is generated based on the complement of x′: x″i = 1 - x′i
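A minimal sketch of this generator in Python (0-indexed, so the slides' position 1+hk becomes index hk; the function name diversify is ours):

```python
def diversify(seed, h):
    """Flip every h-th bit of the seed (positions 1, 1+h, 1+2h, ... in the
    slides' 1-based indexing) and also return the complement of the result."""
    x = list(seed)
    for k in range(0, len(seed), h):  # indices 0, h, 2h, ...
        x[k] = 1 - x[k]
    x_comp = [1 - bit for bit in x]   # the complement doubles the output
    return x, x_comp

seed = [0] * 10
print(diversify(seed, 5))  # ([1,0,0,0,0,1,0,0,0,0], [0,1,1,1,1,0,1,1,1,1])
print(diversify(seed, 4))  # with h=4 the complement is (0,1,1,1,0,1,1,1,0,1)
```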

  15. Diversification Generator [Table of generated diverse solutions; e.g. solution 9 = (0,1,1,1,0,1,1,1,0,1) with objective 57 and weight 169]

  16. Improvement Method Profit/weight ratios: x1 = 0.333, x2 = 0.370, x3 = 0.563, x4 = 0.857, x5 = 0.345, x6 = 0.200, x7 = 0.226, x8 = 0.152, x9 = 0.214, x10 = 0.444. Starting solution (0,1,1,1,0,1,1,1,0,1): Objective = 57, Weight = 169 (infeasible).

  17. Improvement Method Remove x8 (lowest selected ratio, 0.152): (0,1,1,1,0,1,1,0,0,1), Objective = 52, Weight = 136 (still infeasible).

  18. Improvement Method Remove x6 (0.200): (0,1,1,1,0,0,1,0,0,1), Objective = 46, Weight = 106 (still infeasible).

  19. Improvement Method Remove x7 (0.226): (0,1,1,1,0,0,0,0,0,1), Objective = 39, Weight = 75 (feasible).

  20. Improvement Method Try adding x5 (highest unselected ratio, 0.345): (0,1,1,1,1,0,0,0,0,1), Objective = 49, Weight = 104 (infeasible).

  21. Improvement Method Reject x5 and revert: (0,1,1,1,0,0,0,0,0,1), Objective = 39, Weight = 75.

  22. Improvement Method Try adding x1 (0.333): (1,1,1,1,0,0,0,0,0,1), Objective = 50, Weight = 108 (infeasible).

  23. Improvement Method Reject x1 and revert: (0,1,1,1,0,0,0,0,0,1), Objective = 39, Weight = 75.

  24. Improvement Method Try adding x7 (0.226): (0,1,1,1,0,0,1,0,0,1), Objective = 46, Weight = 106 (infeasible).

  25. Improvement Method Reject x7 and revert: (0,1,1,1,0,0,0,0,0,1), Objective = 39, Weight = 75.

  26. Improvement Method Add x9 (0.214): (0,1,1,1,0,0,0,0,1,1), Objective = 42, Weight = 89 (feasible; keep).

  27. Improvement Method Try adding x6 (0.200): (0,1,1,1,0,1,0,0,1,1), Objective = 48, Weight = 119 (infeasible).

  28. Improvement Method Reject x6 and revert: (0,1,1,1,0,0,0,0,1,1), Objective = 42, Weight = 89 — the most improved feasible version of solution 9.

  29. Improvement Method [Summary table of improved solutions not captured in the transcript]
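The repair-then-improve walk above can be sketched as a small Python routine (a reconstruction of the slides' ratio rule, not the authors' code; all names are ours):

```python
# The 0-1 knapsack instance from the example slides.
profits = [11, 10, 9, 12, 10, 6, 7, 5, 3, 8]
weights = [33, 27, 16, 14, 29, 30, 31, 33, 14, 18]
CAPACITY = 100

def improve(x):
    """Drop selected items with the lowest profit/weight ratio until the
    solution is feasible, then try to add unselected items in decreasing
    ratio order, keeping only additions that stay within capacity."""
    x = list(x)
    ratio = [p / w for p, w in zip(profits, weights)]
    weight = sum(w for w, bit in zip(weights, x) if bit)
    while weight > CAPACITY:  # repair phase: remove worst-ratio item
        i = min((i for i, bit in enumerate(x) if bit), key=ratio.__getitem__)
        x[i], weight = 0, weight - weights[i]
    for i in sorted(range(len(x)), key=ratio.__getitem__, reverse=True):
        if not x[i] and weight + weights[i] <= CAPACITY:  # improve phase
            x[i], weight = 1, weight + weights[i]
    return x

print(improve([0, 1, 1, 1, 0, 1, 1, 1, 0, 1]))
# [0, 1, 1, 1, 0, 0, 0, 0, 1, 1]: profit 42, weight 89, as on slide 28
```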

  30. Reference Set Update Elite solutions chosen for the RefSet: solution 1 = (0,1,1,1,0,0,0,0,1,1), solution 2 = (1,0,1,1,1,0,0,0,0,0), solution 8 = (0,1,1,1,1,0,0,0,1,0)

  31. Reference Set Update In the previous slide the elite solutions were taken as 1, 2 and 8. Diversity also needs to be incorporated into the reference set, by taking the solutions that are farthest from the best solutions. Distance between solution 1 = (0,1,1,1,0,0,0,0,1,1) and solution 8 = (0,1,1,1,1,0,0,0,1,0): 0+0+0+0+1+0+0+0+0+1 = 2. The most diverse solutions added are 3 = (1,0,0,1,0,0,1,0,0,1) and 7 = (0,1,0,1,0,1,0,0,0,1).
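One common way to realize this diversity step, sketched under the assumption that the distance between binary solutions is the bit-difference count shown above (function names are ours):

```python
def distance(a, b):
    """Number of positions in which two 0-1 vectors differ."""
    return sum(ai != bi for ai, bi in zip(a, b))

def most_diverse(pool, refset, b2):
    """Greedily pick the b2 pool solutions whose minimum distance
    to the current reference set is largest (max-min diversity)."""
    refset, chosen = list(refset), []
    for _ in range(b2):
        best = max(pool, key=lambda x: min(distance(x, r) for r in refset))
        chosen.append(best)
        refset.append(best)  # new member also repels later picks
        pool = [x for x in pool if x != best]
    return chosen

s1 = [0, 1, 1, 1, 0, 0, 0, 0, 1, 1]
s8 = [0, 1, 1, 1, 1, 0, 0, 0, 1, 0]
print(distance(s1, s8))  # 2, matching the slide
```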

  32. Subset Generation Type 1 subsets: (1,2) (1,8) (1,3) (1,7) (2,8) (2,3) (2,7) (8,3) (8,7) (3,7); Type 2 subsets: (1,2,8) (1,3,8) (1,7,8) (2,3,8) (2,7,8) (3,7,8); Type 3 subsets: (1,2,8,3) (1,7,8,2) (3,7,8,1); Type 4 subset: (1,2,8,3,7)
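The Type 1 subsets above are simply all pairs of reference solutions; Types 2-4 grow these pairs with additional solutions. A sketch of the pair enumeration, using the labels from the slide:

```python
from itertools import combinations

refset = [1, 2, 8, 3, 7]  # labels of the five reference solutions

# Type 1: every 2-element subset of the reference set.
type1 = list(combinations(refset, 2))
print(len(type1))  # 10 pairs, matching the slide
```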

  33. Solution Combination [Score computation for the combined solutions not captured in the transcript]

  34. Solution Combination If score(i) > 0.5, set x′i = 1; if score(i) ≤ 0.5, set x′i = 0. Resulting solution: x′ = (0, 1, 0, 1, 0, 0, 0, 0, 0, 1)
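A hedged sketch of such a combination rule: the slides do not show how score(i) is computed, so we assume a merit-weighted vote over the bits of the subset's solutions (the function and parameter names are ours):

```python
def combine(solutions, merits):
    """Round a merit-weighted average of each bit: bit i of the offspring
    is 1 exactly when score(i) > 0.5 (the assumed scoring rule)."""
    total = sum(merits)
    n = len(solutions[0])
    offspring = []
    for i in range(n):
        score = sum(m * x[i] for m, x in zip(merits, solutions)) / total
        offspring.append(1 if score > 0.5 else 0)
    return offspring

a = [0, 1, 1, 1, 0, 0, 0, 0, 1, 1]
b = [0, 1, 0, 1, 0, 1, 0, 0, 0, 1]
print(combine([a, b], [2.0, 1.0]))  # bits kept only where the vote > 0.5
```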

  35. Scatter Search Comparison to GA • RefSet size vs. generation size • In Scatter Search, all solutions participate in combination • Evolution of the population is controlled by deterministic rules • Local search procedures are integral to Scatter Search • Scatter Search is not generally limited to combining two “parent” solutions • The initial population is not constructed in a random manner

  36. Scatter Search Applications

  37. References
Cung, V., T. Mautor, P. Michelon and A. Tavares (1997), “A Scatter Search Based Approach for the Quadratic Assignment Problem,” Proceedings of the IEEE-ICEC'97 Conference, Indianapolis, April 13-16.
Glover, F., A. Løkketangen and D. Woodruff (1999), “Scatter Search to Generate Diverse MIP Solutions,” in OR Computing Tools for Modeling, Optimization and Simulation: Interfaces in Computer Science and Operations Research, M. Laguna and J.L. González-Velarde (Eds.), Kluwer Academic Publishers, pp. 299-317. http://www-bus.colorado.edu/faculty/glover/ssdiversemip.pdf (last accessed March 24, 2003)
Glover, F., M. Laguna and R. Martí (2000), “Scatter Search,” to appear in Theory and Applications of Evolutionary Computation: Recent Trends, A. Ghosh and S. Tsutsui (Eds.), Springer-Verlag. http://leeds.colorado.edu/Faculty/Laguna/articles/ss2.pdf (last accessed March 24, 2003)
Glover, F., M. Laguna and R. Martí (2000), “Fundamentals of Scatter Search and Path Relinking,” Control and Cybernetics, 29 (3), pp. 653-684. http://leeds.colorado.edu/Faculty/Laguna/articles/ss3.pdf (last accessed March 24, 2003)
Glover, F., M. Laguna and R. Martí (2002), “Fundamentals of Scatter Search and Path Relinking: Foundations and Advanced Designs,” to appear in New Optimization Techniques in Engineering, Godfrey Onwubolu (Ed.). http://leeds.colorado.edu/faculty/glover/aassprad.pdf (last accessed March 24, 2003)
Laguna, M. (2002), “Scatter Search,” in Handbook of Applied Optimization, P.M. Pardalos and M.G.C. Resende (Eds.), Oxford University Press, pp. 183-193. http://www-bus.colorado.edu/Faculty/Laguna/articles/ss1.pdf (last accessed March 24, 2003)
Laguna, M. and R. Martí (2003), Scatter Search: Methodology and Implementations in C, Kluwer Academic Publishers, Boston, 312 pp.
