
Memetic Algorithms



Presentation Transcript


  1. Memetic Algorithms Dr. N. Krasnogor Interdisciplinary Optimisation Laboratory Automated Scheduling, Optimisation and Planning Research Group School of Computer Science & Information Technology University of Nottingham www.cs.nott.ac.uk/~nxk

  2. Outline of the Talk • Evolutionary Algorithms Revisited • MAs’ Motivation (General Versus Problem Specific Solvers) • MAs design issues • A Few Exemplar Applications • Related Methods and Advanced Topics • Putting it all Together • Conclusions, Q&A

  3. Based on:
  • N. Krasnogor. Handbook of Natural Computation, chapter Memetic Algorithms. Natural Computing. Springer Berlin / Heidelberg, 2009.
  • J. Bacardit and N. Krasnogor. Performance and efficiency of memetic Pittsburgh learning classifier systems. Evolutionary Computation, 17(3), 2009.
  • Q.H. Quang, Y.S. Ong, M.H. Lim, and N. Krasnogor. Adaptive cellular memetic algorithm. Evolutionary Computation, 17(3), 2009.
  • N. Krasnogor and J.E. Smith. Memetic algorithms: the polynomial local search complexity theory perspective. Journal of Mathematical Modelling and Algorithms, 7:3-24, 2008.
  • M. Tabacman, J. Bacardit, I. Loiseau, and N. Krasnogor. Learning classifier systems in optimisation problems: a case study on fractal travelling salesman problems. In Proceedings of the International Workshop on Learning Classifier Systems, volume (to appear) of Lecture Notes in Computer Science. Springer, 2008.
  • N. Krasnogor and J.E. Smith. A tutorial for competent memetic algorithms: model, taxonomy and design issues. IEEE Transactions on Evolutionary Computation, 9(5):474-488, 2005.
  • W.E. Hart, N. Krasnogor, and J.E. Smith, editors. Recent Advances in Memetic Algorithms, volume 166 of Studies in Fuzziness and Soft Computing. Springer Berlin Heidelberg New York, 2004. ISBN 3-540-22904-3.
  • N. Krasnogor. Self-generating metaheuristics in bioinformatics: the protein structure comparison case. Genetic Programming and Evolvable Machines, 5(2):181-201, 2004.
  • N. Krasnogor and S. Gustafson. A study on the use of “self-generation” in memetic algorithms. Natural Computing, 3(1):53-76, 2004.
  • M. Lozano, F. Herrera, N. Krasnogor, and D. Molina. Real-coded memetic algorithms with crossover hill-climbing. Evolutionary Computation, 12(3):273-302, 2004.
  • All material available at www.cs.nott.ac.uk/~nxk/publications.html

  4. Evolutionary Computation: Most Important Metaphors
  • Evolution ↔ Problem Solving
  • Environment ↔ Problem
  • Individual ↔ Candidate/Feasible Solution
  • Fitness ↔ Solution quality, i.e. objective value
  • Natural Selection ↔ Simulated pruning of bad solutions
  • Fitness determines survival and reproduction likelihood ↔ Objective value determines the chances of generating related (but not necessarily identical) solutions
  A.E. Eiben and J.E. Smith, Introduction to Evolutionary Computing. Hybridisation with other techniques: Memetic Algorithms.

  5. Simulated Evolution Consists of: • A population containing a diverse set of individuals (i.e. solutions to an optimisation problem) • Those features that make solutions good under a specific objective function tend to proliferate • Variation is introduced into the population by means of mutation, crossover and, for MAs, local search • Selection culls the population, throwing out bad solutions and keeping the most promising ones
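A minimal Python sketch of that loop; the helpers (fitness, crossover, mutate) are user-supplied assumptions for illustration, not code from the talk:

```python
import random

def evolve(pop, fitness, crossover, mutate, generations=100):
    """Generational loop: variation introduces diversity, selection culls it."""
    for _ in range(generations):
        offspring = []
        for _ in range(len(pop)):
            p1, p2 = random.sample(pop, 2)               # pick two distinct parents
            offspring.append(mutate(crossover(p1, p2)))  # variation
        # survivor selection: keep the most promising of parents + offspring
        pop = sorted(pop + offspring, key=fitness, reverse=True)[:len(pop)]
    return max(pop, key=fitness)
```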

  6. The Metaphor of “Adaptive Landscape” (Wright, 1932) (1) • Solutions (i.e. individuals) are represented by n properties plus their quality value. • One can imagine the space of all solutions and their quality values represented as a graph (i.e. landscape) in n+1 dimensions. • In this metaphor, a specific individual is seen as a point in the landscape. • Each point in the landscape has neighbouring points, which are those solutions that are somehow related to that point, implying a neighbourhood. • It is called an “adaptive” landscape because the quality value function, F(), depends on the features and properties of the individual: the better those are, the higher (for maximisation; lower for minimisation) the value.

  7. An example: Maximisation Problem with Hill Climber (2) [Figure: a 3-D fitness landscape over two solution features (Prop 1, Prop 2), with Objective Value as the vertical axis; a hill climber ascends towards a peak]
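For concreteness, a first-improvement hill climber of the kind the figure depicts, sketched in Python; neighbours is an assumed function enumerating the solutions related to the current point:

```python
def hill_climb(solution, f, neighbours):
    """Climb until no neighbour improves the objective (a local optimum)."""
    improved = True
    while improved:
        improved = False
        for candidate in neighbours(solution):
            if f(candidate) > f(solution):   # maximisation
                solution, improved = candidate, True
                break                        # first improvement: restart from the new point
    return solution
```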

  8. An example: Maximisation Problem with EA (3) [Figure: the same two-feature landscape (Prop 1, Prop 2), now searched by a population-based EA]

  9. Evolutionary Algorithms in Context • There are several opinions about the use of metaheuristics in optimisation • For the majority of problems, a problem-specific algorithm could: • work better than a generic algorithm on a large set of instances, • but be very limited on a different domain, • and not work well at all on some instances • One important research challenge is to build frameworks that are robust across a set of problems, delivering good enough/cheap enough/soon enough solutions to a variety of problems and instances

  10. Outline of the Talk • Evolutionary Algorithms Revisited • MAs’ Motivation (General Versus Problem Specific Solvers) • MAs design issues • A Few Exemplar Applications • Related Methods and Advanced Topics • Putting it all Together • Conclusions, Q&A

  11. EAs as problem solvers: the pre-90s view [Figure: method performance plotted against the scale of all problems: a problem-specific method (the “Rolls-Royce” for problem P) peaks sharply at P; a metaheuristic (the “Ford Model T” of problem solving) performs moderately across all problems; random search is uniformly poor] A.E. Eiben and J.E. Smith, Introduction to Evolutionary Computing. Hybridisation with other techniques: Memetic Algorithms.

  12. Evolutionary Algorithms and domain knowledge • Fashionable after the 90s: adding problem-specific information into EAs by means of specialised crossover, mutation, representations and local search • Result: the performance curve deforms, making EAs better on some problems and worse on others, as the amount of problem-specific knowledge is varied. A.E. Eiben and J.E. Smith, Introduction to Evolutionary Computing. Hybridisation with other techniques: Memetic Algorithms.

  13. Michalewicz’s Interpretation [Figure: method performance against the scale of all problems for four hybrid EAs (EA 1 to EA 4), each incorporating a different amount of domain knowledge and hence peaking with a different width around problem P] A.E. Eiben and J.E. Smith, Introduction to Evolutionary Computing. Hybridisation with other techniques: Memetic Algorithms.

  14. So What Are Memetic Algorithms? MAs are a carefully orchestrated interplay between (stochastic) global search and (stochastic) local search algorithms. N. Krasnogor. Handbook of Natural Computation, chapter Memetic Algorithms. Natural Computing. Springer Berlin / Heidelberg, 2009.

  15. So What Are Memetic Algorithms? Adding Domain Knowledge to EAs • Memetic Algorithms (MAs) were originally inspired by: • models of adaptation in natural systems that combine evolutionary adaptation of a population of individuals (GAs) • WITH • individual learning (LS) within a lifetime (others consider the LS a development stage); learning took the form of (problem-specific) local search • PLUS • R. Dawkins’ concept of a meme, which represents a unit of cultural evolution that can exhibit refinement, hence the local search can be adaptive. BUT WHY?

  16. An example: Maximisation Problem with EA (3) [Figure: the same two-feature landscape (Prop 1, Prop 2) searched by an EA, repeated from slide 8]

  17. The Canonical MA. At design time lots of issues arise. From Eiben & Smith’s “Introduction to Evolutionary Computing”. [Figure: pseudocode of the canonical MA]
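The pseudocode shown on this slide did not survive extraction; the following Python sketch of the usual canonical-MA template (after Eiben & Smith) is a reconstruction under assumed helper names (init, local_search, etc.), applying local search Lamarckian-style to each new individual:

```python
import random

def memetic_algorithm(init, fitness, crossover, mutate, local_search,
                      pop_size=50, generations=100):
    """Canonical MA template: an EA loop in which every new individual
    is improved by local search before entering the population."""
    pop = [local_search(init()) for _ in range(pop_size)]  # seeded and improved
    for _ in range(generations):
        offspring = []
        for _ in range(pop_size):
            p1, p2 = random.sample(pop, 2)           # parent selection
            child = mutate(crossover(p1, p2))        # variation
            offspring.append(local_search(child))    # individual "learning"
        # survivor selection: keep the best pop_size of parents + offspring
        pop = sorted(pop + offspring, key=fitness, reverse=True)[:pop_size]
    return max(pop, key=fitness)
```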

  18. Memetic Algorithms: the issues involved. Motivation • There are several reasons why it is worthwhile hybridising: • Complex problems can be partially decomposable, and different subproblems may be better solved by different methods: • EAs could be used as pre/post-processors • Subproblem-specific information can be placed into variation operators or into local searchers • In some cases there are exact/approximate methods for subproblems • Well-established theoretical results show that generally good black-box optimisers do not exist. This is why successful EAs are usually found in “hybridised form”, with domain knowledge added in • EAs are good at exploring the search space but find it difficult to zoom in on good solutions • Problems have constraints associated with solutions, and heuristics/local search are used to repair solutions found by the EA • If heuristic/local search strategies in MAs are “first-class citizens”, then one can raise the level of generality of the algorithm without sacrificing performance by letting the MA choose which local search to use

  19. A conservation of competence principle applies: the better an algorithm is at solving one specific instance (class), the worse it is at solving a different instance (class) [Wolpert et al.]. It cannot be expected that a black-box metaheuristic will suit all problem classes and instances all the time; that is, it is theoretically impossible to have ready-made, off-the-shelf solvers that are both general and good for all problems. MAs and hyperheuristics are good algorithmic templates that aid in the balancing act of successfully and cheaply combining general, off-the-shelf, reusable solvers (EAs) with add-on instance (class) specific features.

  20. What happens inside an MA? [Cartoon: a population of agents inside an Evolutionary Algorithm; one agent announces “This is my solution to your problem, I used strategy X”, while an offspring reflects “When I grow up I’ll need to decide whose problem solving strategy to use”, illustrating memetics]

  21. Outline of the Talk • Evolutionary Algorithms Revisited • MAs’ Motivation (General Versus Problem Specific Solvers) • MAs design issues • A Few Exemplar Applications • Related Methods and Advanced Topics • Putting it all Together • Conclusions, Q&A

  22. Memetic Algorithms: the issues involved. Baldwinism vs Lamarckianism • Lamarckian: traits acquired by an individual during its lifetime can be transmitted to its offspring • e.g. replace the individual with a fitter neighbour • Baldwinian: traits acquired by an individual cannot be transmitted to its offspring • e.g. the individual receives the fitness (but not the genotype) of a fitter neighbour
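A minimal sketch contrasting the two schemes, with hypothetical helpers f (fitness) and local_search:

```python
def lamarckian_eval(individual, f, local_search):
    """Lamarckian: the locally improved genotype replaces the original,
    so acquired traits are passed on to offspring."""
    improved = local_search(individual)
    return improved, f(improved)

def baldwinian_eval(individual, f, local_search):
    """Baldwinian: the genotype is kept unchanged, but the individual is
    credited with the fitness reachable by learning (local search)."""
    improved = local_search(individual)
    return individual, f(improved)
```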

  23. Baldwin’s “filter” [Figure: the raw fitness landscape and the smoothed landscape produced by Baldwin’s “filter”]

  24. Memetic Algorithms: the issues involved. Diversity. The loss of diversity is especially problematic in MAs, as the LS tends to focus excessively on a few good solutions. If the MA runs LS all the way to local optima, it becomes important to constantly identify new local optima. If the MA uses partial LS, you could still be navigating around the basins of attraction of a few solutions.

  25. Memetic Algorithms: the issues involved. Diversity • There are various ways to improve diversity (assuming that’s what one wants!): • if the population is seeded, only do so partially • instead of applying LS to every individual, choose whom to apply it to • use variation operators that ensure diversity (e.g. assortative mating) • include a diversity weight in the local search strategy • modify the selection operator to prevent duplicates • use archives • modify the acceptance criterion in the local search, as on the next slide:

  26. Memetic Algorithms: the issues involved. Diversity. The following modified Monte Carlo (MC) acceptance exploits solutions (zooms in) when the population is diverse, and explores (zooms out) when the population has converged. The temperature T of the MC is redefined at each generation, and a new solution is accepted via a Metropolis-like criterion (the formulas appeared as images on the slide): • when the population is diverse, T tends to 0 and only improvements are accepted • when the population has converged, T goes to infinity and both better and worse solutions are accepted (explores)
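Since the slide's own formulas were lost in extraction, the following is only a plausible reconstruction consistent with the behaviour described above (temperature inversely proportional to the population's fitness spread); the constant k and the exact spread measure are assumptions:

```latex
% Hypothetical reconstruction; the slide's formulas are images and were lost.
% Temperature grows as the population converges (the fitness spread shrinks):
T_g = \frac{k}{f_{\max}^{\,g} - f_{\mathrm{avg}}^{\,g}}
% Metropolis-like acceptance of a candidate s' against the current solution s:
P(\text{accept } s') =
  \begin{cases}
    1 & \text{if } f(s') \geq f(s) \\
    e^{\,(f(s') - f(s))/T_g} & \text{otherwise}
  \end{cases}
```

With this form, a diverse population gives a small T_g, so the exponential vanishes for worsening moves and only improvements pass; a converged population gives a huge T_g, so the exponential approaches 1 and worse solutions are also accepted.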

  27. Memetic Algorithms: the issues involved. Operators Choice • The choice of LS/heuristic is one of the most important steps in the design of an MA • (1) Local searchers induce search landscapes, and there have been various attempts to characterise these. Kallel et al. and Merz et al. have shown that the choice of LS can have a dramatic impact on the efficiency and effectiveness of the MA • (2) Krasnogor formally proved that, to reduce the worst-case run time of MAs, LS move operators must induce search graphs complementary to (or disjoint from) those of crossover and mutation • (3) Krasnogor and Smith have also shown that the optimal choice of LS operator is not only problem and instance dependent but also dependent on the state of the overall search carried out by the underlying EA • The obvious way to implement (2) and (3) is to use multiple local searchers within an MA (multimeme algorithms), and we will see that the obvious way of including feedback like that suggested by (1) is to use self-generated multiple local searchers (self-generating MAs, aka co-evolving MAs)

  28. Memetic Algorithms: the issues involved. Fitness Landscapes [Figure: fitness-landscape illustrations; thanks to P. Merz!]

  29. Multiple Local Searchers
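In Krasnogor's multimeme scheme, each individual carries a meme, an index into the pool of available local searchers, inherited from its parents and occasionally mutated. A minimal sketch of that step, with assumed names and an assumed innovation rate:

```python
import random

def multimeme_offspring(child, parent_meme, local_searchers, innovation_rate=0.2):
    """One multimeme step: the child inherits its parent's meme (an index
    into the pool of local searchers) and, with a small innovation rate,
    mutates it so that unused local searchers keep being tried."""
    meme = parent_meme
    if random.random() < innovation_rate:
        meme = random.randrange(len(local_searchers))  # meme mutation
    return local_searchers[meme](child), meme
```

Selection then acts on the combined (solution, meme) pair, so memes that produce fitter individuals proliferate with them.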

  30. Memetic Algorithms: the issues involved. Use of Knowledge • The use of knowledge is essential for the success of a search method • There are essentially two stages when knowledge is used: • At design time: e.g., in the form of local searchers/heuristics, specific variation operators, initialisation biases, etc. • At run time: • using tabu-like mechanisms to avoid revisiting points (implicit) • using adaptive operators that bias search towards unseen/promising regions of the search space (implicit) • creating new operators on-the-fly, e.g., self-generating or co-evolving MAs (explicit) • With appropriate data-mining techniques we can turn implicit knowledge into explicit knowledge and feed it back into the design process! (Deb calls this innovization)

  31. Outline of the Talk • Evolutionary Algorithms Revisited • MAs’ Motivation (General Versus Problem Specific Solvers) • MAs design issues • A Few Exemplar Applications • Related Methods and Advanced Topics • Putting it all Together • Conclusions, Q&A

  32. Showcase Applications. The Maximum Diversity Problem. Katayama & Narihisa solve the MDP by means of a sophisticated MA. The MDP: the problem consists of selecting, out of a set of N elements, the M elements that maximise a certain diversity measure Dij summed over all selected pairs.
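The slide does not show the formal statement; in the standard binary formulation (added here for completeness, with x_i = 1 iff element i is selected) the MDP reads:

```latex
\max \; \sum_{i=1}^{N-1} \sum_{j=i+1}^{N} D_{ij}\, x_i x_j
\quad \text{subject to} \quad \sum_{i=1}^{N} x_i = M, \qquad x_i \in \{0, 1\}
```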

  33. Showcase Applications The Maximum Diversity Problem • This problem is at the core of various important real-world applications: • Immigration and admission policies • Committee formation • Curriculum design • Portfolio selection • Combinatorial chemical libraries • etc

  34. Showcase Applications. Protein Structure Prediction (by us) [Figure: the protein structure hierarchy: primary structure, secondary structure, tertiary structure]

  35. Showcase Applications. Protein Structure Prediction. Krasnogor, Krasnogor & Smith, Krasnogor & Pelta, and Smith have used MAs to study the fundamentals of the algorithmics behind PSP in simplified models.

  36. Showcase Applications. Protein Structure Prediction. A standard MA template is used, except that multiple memes, which promote diversity by means of fuzzy rules, are employed.

  37. Showcase Applications. Protein Structure Prediction [Figure: membership functions for “acceptable” solutions; two distinct “acceptability” concepts, one promoting diversity and one promoting improvements]

  38. Showcase Applications. Protein Structure Prediction [Figure: new optimal solutions found by the MA]

  39. Showcase Applications. Optimal Engine Calibration • The OEC problem is paradigmatic of many industrial problems • Within it, several combinatorial optimisation subproblems occur: • Optimal Design of Experiments • Optimal Test Bed Schedule • Look-up Table Calculation

  40. Showcase Applications. Optimal Engine Calibration [Figure: the engine calibration workflow, by P. Merz]

  41. Showcase Applications. Optimal Engine Calibration [Figure: the standard MA template used]

  42. Showcase Applications. Circuit Partitioning • CP is the task of dividing a circuit into smaller parts. It’s an important component of the VLSI layout problem: • the division permits the fabrication of circuits in physically distinct components • by dividing we conquer: the resulting circuits can fit fabrication norms, and complexity is reduced • it can reduce heat dissipation, energy consumption, etc. [The slide shows the formal statement: a minimisation objective and a constraint]
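The slide's objective and constraint are images and were lost; a standard bipartitioning formulation consistent with the annotations (an assumption, not necessarily the slide's exact statement) is: minimise the number of edges cut by the partition of the vertex set into V1 and V2, subject to a balance constraint with tolerance ε:

```latex
\min_{\pi} \; \bigl|\{(u, v) \in E : \pi(u) \neq \pi(v)\}\bigr|
\quad \text{subject to} \quad \bigl|\, |V_1| - |V_2| \,\bigr| \leq \varepsilon
```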

  43. Showcase Applications. Circuit Partitioning [Figure: a graphical example, from S. Areibi’s chapter]

  44. Showcase Applications. The Maximum Diversity Problem. The MA has various features: distinct repair & LS, GRASP for initialisation, a diversification phase, and accelerated LS.

  45. Outline of the Talk • Evolutionary Algorithms Revisited • MAs’ Motivation (General Versus Problem Specific Solvers) • MAs design issues • A Few Exemplar Applications • Related Methods and Advanced Topics • Putting it all Together • Conclusions, Q&A

  46. Related Methodologies. Teams of Heuristics • Variable Neighbourhood Search: under this approach a number of different neighbourhood structures are systematically explored, trying to improve the current solution while avoiding poor local optima (see the sketch below) • A-Teams of Heuristics: in A-Teams, a set of constructive, improvement and destructive heuristics is used asynchronously to improve solutions • Hyperheuristics: the main concept behind a hyperheuristic is that of managing the application of other heuristics adaptively, with the purpose of improving solutions
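A basic VNS skeleton in Python, with hypothetical helpers (shake draws a random solution from the k-th neighbourhood of the current one; local_search descends to a local optimum):

```python
def vns(solution, f, shake, local_search, k_max, iterations=100):
    """Basic Variable Neighbourhood Search: systematically switch among
    k_max neighbourhood structures, returning to the first one after
    every improvement, so that poor local optima can be escaped."""
    for _ in range(iterations):
        k = 1
        while k <= k_max:
            candidate = local_search(shake(solution, k))  # perturb in N_k, then descend
            if f(candidate) > f(solution):
                solution, k = candidate, 1   # improvement: restart from N_1
            else:
                k += 1                       # no improvement: try a larger neighbourhood
    return solution
```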

  47. Methodologies. Cooperative Local Search (Landa Silva & Burke) [Diagram: the search cycle of each individual in the population: an individual searches, gets stuck, invokes the cooperation mechanism (sharing moves, solution parts, centralised control, etc.), finds something to do, and gets unstuck] Note that this differs from teams of heuristics in that here the cooperation is made explicit.

  48. Methodologies. Off-line & In-line Operators Discovery. All the previous methodologies clearly benefit the end user, as they have been shown to provide improvements in robustness, quality, etc. But what do we do if we do not have, or do not know of, good heuristics that could be used by, e.g., A-Teams, VNS or CLS? Also, why don’t we use the information the algorithm produces to better understand the search landscape and make new knowledge explicit, capturing this knowledge in new operators?

  49. Methodologies. On-the-fly Operators Discovery • Two alternatives: • Off-line: Whitley and Watson did it successfully for TS, and Kallel et al. for other methods. Tabacman et al. demonstrated it for TSP • In-line: Krasnogor, Krasnogor & Gustafson, J.E. Smith and others for MAs on PSP, PSC and other problems
