
Genetic Algorithms

ŠIAULIAI UNIVERSITY. Genetic Algorithms. Gražvydas Felinskas, LITHUANIA. Šiauliai University, Faculty of Mathematics and Informatics, Department of Informatics. 2009-10-27.


Presentation Transcript


  1. ŠIAULIAI UNIVERSITY Genetic Algorithms Gražvydas Felinskas LITHUANIA Šiauliai University Faculty of Mathematics and Informatics Department of Informatics 2009-10-27

  2. “Genetic Algorithms are good at taking large, potentially huge search spaces and navigating them, looking for optimal combinations of things, solutions you might not otherwise find in a lifetime.” - Salvatore Mangano Computer Design, May 1995 Genetic Algorithms (GA) // Wendy Williams – Metaheuristic Algorithms //

  3. Some concepts (definitions) Heuristic (pronounced /hjʊˈrɪstɨk/, from the Greek "Εὑρίσκω" for "find" or "discover") is an adjective for experience-based techniques that help in problem solving, learning and discovery. A heuristic method is particularly used to rapidly come to a solution that is hoped to be close to the best possible answer, or 'optimal solution'. Heuristics are "rules of thumb", educated guesses, intuitive judgments or simply common sense. //wikipedia//

  4. Some concepts (definitions) In computer science, a heuristic algorithm, or simply a heuristic, is an algorithm that is able to produce an acceptable solution to a problem in many practical scenarios, in the fashion of a general heuristic, but for which there is no formal proof of its correctness. Alternatively, it may be correct, but may not be proven to produce an optimal solution, or to use reasonable resources. Heuristics are typically used when there is no known method to find an optimal solution, under the given constraints (of time, space etc.) or at all. //wikipedia//

  5. Some concepts (definitions) A metaheuristic is a heuristic method for solving a very general class of computational problems by combining user-given black-box procedures — usually heuristics themselves — in the hope of obtaining a more efficient or more robust procedure. The name combines the Greek prefix "meta“ ("beyond", here in the sense of "higher level") and "heuristic" (from ευρισκειν, heuriskein, "to find"). //wikipedia//

  6. History – Timeline • 1952: first works on stochastic optimization methods • 1965: Rechenberg conceives the first algorithm using evolution strategies • 1970: Hastings conceives the Metropolis-Hastings algorithm, which can sample any probability density function • 1975: John Holland proposes the first genetic algorithms • 1980: Smith describes genetic programming • 1983: based on Hastings's work, Kirkpatrick, Gelatt and Vecchi conceive simulated annealing • 1986: first mention of the term "meta-heuristic" by Fred Glover, during the conception of tabu search • 1989: Goldberg publishes one of the best known books on genetic algorithms • 1991: the ant colony algorithms are proposed by Marco Dorigo • 1995: Kennedy and Eberhart conceive particle swarm optimization • 2000: first interactive genetic algorithms • 2005: Karaboga proposes the Artificial Bee Colony Algorithm ... //wikipedia//

  7. Metaheuristics classification scheme • M. Laguna* suggested the x/y/z metaheuristics classification scheme. This classification is based on three parameters of a method: • x = A (adaptive memory) or M (memoryless), • y = N (systematic neighborhood search) or S (random sampling), • z = 1 (one current solution) or P (population of solutions). • For example, several metaheuristics can be attributed to the following classes: • Tabu search – A/N/1, • Simulated annealing method – M/S/1, • Genetic Algorithms – M/S/P, • Scatter Search – M/N/P. • * Laguna M., homepage, (2006). <http://leeds-faculty.colorado.edu/laguna>

  8. Classes of Search Techniques // Wendy Williams – Metaheuristic Algorithms //

  9. General Introduction to GA’s • Genetic algorithms (GA’s) are a technique for solving problems that require optimization • GA’s are a subclass of Evolutionary Computing • Directed search algorithms based on the mechanics of biological evolution • GA’s are based on Darwin’s theory of evolution • Biological background: Origin of species, Natural selection // Peter Spijker //

  10. Nature Vs Computer - Mapping // Abhishek Sharma //

  11. GA Quick Overview • Typically applied to: discrete optimization • Attributed features: not too fast; a good heuristic for combinatorial problems • Special features: traditionally emphasizes combining information from good parents (crossover); many variants, e.g., reproduction models, operators

  12. History of GA’s • Originator: Holland J. H., (1975). Adaptation in natural and artificial systems: an introductory analysis with applications to biology, control, and artificial intelligence. Ann Arbor, MI: Univ. of Michigan Press. - explored the concept of using mathematically-based artificial evolution as a method to conduct a structured search for solutions to complex problems. • Goldberg D.E., (1989). Genetic Algorithm in Search, Optimization and Machine Learning, Addison-Wesley Publishing Company, Inc., Reading, Massachusetts. - suggested applications for genetic algorithms in a wide range of engineering fields.

  13. DEFINITION OF THE GA • The genetic algorithm is a probabilistic search algorithm that iteratively transforms a set (called a population) of mathematical objects (typically fixed-length binary character strings), each with an associated fitness value, into a new population of offspring objects using the Darwinian principle of natural selection and using operations that are patterned after naturally occurring genetic operations, such as crossover and mutation. // John R. Koza //

  14. IMPRACTICALITY OF RANDOM OR ENUMERATIVE SEARCH • Imagine a search space of all bit strings of length L: • 81-bit problems are small for GA • However, even if L is as small as 81, 2^81 ≈ 10^24 — already comparable to the number of nanoseconds since the beginning of the universe 15 billion years ago (about 10^27).

  15. PROBABILISTIC SELECTION BASED ON FITNESS • Better individuals are preferred • Best is not always picked • Worst is not necessarily excluded • Nothing is guaranteed • Mixture of greedy exploitation and adventurous exploration

  16. PROBABILISTIC STEPS • The initial population is typically random • Probabilistic selection based on fitness • Best is not always picked • Worst is not necessarily excluded • Random picking of mutation and crossover points • Often, there is a probabilistic scenario as part of the fitness measure

  17. GAs: The main components 1. Genetic representation (encoding) 2. Method for generating the initial population 3. Evaluation function 4. Reproduction selection scheme 5. Genetic operators (crossover, mutation) 6. Mechanism for creating successive generations 7. Stopping criteria 8. GA parameter settings (practice and art) // W.Williams, T. Tunnukij, C. Hicks //

  18. More terms: gene, chromosome, population, parents, children, …

  19. Simple Genetic Algorithm paradigm { initialize population; evaluate population; while TerminationCriteriaNotSatisfied { select parents for reproduction; perform recombination (crossover) and mutation; evaluate population; } }
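The paradigm on slide 19 can be sketched as a short, runnable program. This is a minimal illustration, not the presentation's own code: the function name, parameter values (population size, rates, generation budget) and the OneMax fitness function are all assumptions chosen for the sketch. It combines operators that later slides describe (tournament selection, 1-point crossover, bit-flip mutation) and uses a fixed generation count as the termination criterion.

```python
import random

def run_ga(fitness, chrom_len=20, pop_size=30, pc=0.7, pm=0.05,
           generations=100, seed=0):
    """Minimal generational GA sketch (hypothetical parameter values):
    tournament selection, 1-point crossover, bit-flip mutation."""
    rng = random.Random(seed)
    # initialize population with random bitstrings
    pop = [[rng.randint(0, 1) for _ in range(chrom_len)]
           for _ in range(pop_size)]
    for _ in range(generations):          # termination: fixed generation budget
        nxt = []
        while len(nxt) < pop_size:
            # select parents for reproduction (tournament of size 2)
            p1 = max(rng.sample(pop, 2), key=fitness)
            p2 = max(rng.sample(pop, 2), key=fitness)
            c1, c2 = p1[:], p2[:]
            if rng.random() < pc:         # recombination (1-point crossover)
                cut = rng.randrange(1, chrom_len)
                c1, c2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            for c in (c1, c2):            # bit-flip mutation
                for i in range(chrom_len):
                    if rng.random() < pm:
                        c[i] = 1 - c[i]
            nxt += [c1, c2]
        pop = nxt[:pop_size]              # evaluate/replace the population
    return max(pop, key=fitness)

best = run_ga(fitness=sum)  # OneMax: maximise the number of 1-bits
```

After a hundred generations on this toy problem the best chromosome is close to all ones; the same skeleton works for any fitness function over fixed-length bitstrings.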

  20. Genetic Algorithm – Search space • Most often we are looking for the best solution in a specific subset of solutions • This subset is called the search space • Every point in the search space is a possible solution • Therefore every point has a fitness value, depending on the problem definition • GA’s are used to search the search space for the best solution, e.g. a minimum • Difficulties are the local minima and the starting point of the search

  21. Genetic Algorithm – Basic algorithm • Starting with a subset of n randomly chosen solutions from the search space (i.e. chromosomes). This is the population. • This population is used to produce a next generation of individuals by reproduction • Individuals with a higher fitness have more chance to reproduce (i.e. natural selection)

  22. Genetic Algorithm – Coding • Chromosomes are encoded by bitstrings • Every bitstring therefore is a solution, but not necessarily the best solution • The way bitstrings can be coded differs from problem to problem

  23. Methodology Associated with GAs (flowchart): Begin → Initialize population (T = 0) → Evaluate solutions → Stopping criteria satisfied? If no: Selection → Crossover → Mutation → T = T + 1 (go to next step) and evaluate again; if yes: Stop. //http://obitko.com/tutorials/genetic-algorithms/ //

  24. The GA Cycle of Reproduction (diagram): parents are drawn from the population for reproduction; their children undergo modification (modified children), then evaluation (evaluated children), and are integrated into the population, while deleted members are discarded. // W.Williams //

  25. Population Chromosomes could be: • Bit strings (0101 ... 1100) • Real numbers (43.2 -33.1 ... 0.0 89.2) • Permutations of elements (E11 E3 E7 ... E1 E15) • Lists of rules (R1 R2 R3 ... R22 R23) • Program elements (genetic programming) • ... any data structure ...

  26. Reproduction (population → parents → children): Parents are selected at random, with selection chances based on chromosome evaluations (fitness).

  27. Selection • Main idea: better individuals get higher chance • Chances proportional to fitness • Implementation: roulette wheel technique • Assign to each individual a part of the roulette wheel • Spin the wheel n times to select n individuals • Example (pie chart): fitness(A) = 3 → 3/6 = 50%, fitness(B) = 1 → 1/6 ≈ 17%, fitness(C) = 2 → 2/6 ≈ 33%
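The roulette wheel technique can be sketched in a few lines. This is an illustrative implementation, not from the slides; the function name and the sampling loop at the bottom are assumptions. Each individual owns a slice of the wheel proportional to its fitness, and one "spin" picks the individual whose slice contains the random point.

```python
import random

def roulette_select(population, fitnesses, rng=random):
    """Fitness-proportionate (roulette wheel) selection sketch:
    individual i gets a wheel slice of size fitness_i / total."""
    total = sum(fitnesses)
    spin = rng.uniform(0, total)          # one spin of the wheel
    cumulative = 0.0
    for individual, fit in zip(population, fitnesses):
        cumulative += fit
        if spin <= cumulative:
            return individual
    return population[-1]                 # guard against float round-off

# Slide example: fitness(A)=3, fitness(B)=1, fitness(C)=2
rng = random.Random(1)
pop, fits = ["A", "B", "C"], [3, 1, 2]
picks = [roulette_select(pop, fits, rng=rng) for _ in range(6000)]
# Observed proportions approach 50% / 17% / 33% over many spins.
```

Spinning the wheel n times, as the slide says, is just calling this function n times.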

  28. Tournament Selection - widely applied • Informal procedure: • Pick k members at random, then select the best of these • Repeat to select more individuals • Probability of selecting i will depend on: • Rank of i • Size of sample k (higher k increases selection pressure) • Whether the fittest contestant always wins (deterministic) or this happens with probability p
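The informal procedure above translates directly into code. This sketch is an assumption-laden illustration (the function name, defaults k=3 and p=1.0, and the demo population are all invented here): it draws k contestants, ranks them, and awards the win to the best with probability p, falling through to weaker contestants otherwise.

```python
import random

def tournament_select(population, fitness, k=3, p=1.0, rng=random):
    """Tournament selection sketch: pick k members at random and
    return the best of them with probability p (p=1.0 => deterministic)."""
    contestants = rng.sample(population, k)
    contestants.sort(key=fitness, reverse=True)   # best first
    for contestant in contestants:
        if rng.random() < p:
            return contestant
    return contestants[-1]

rng = random.Random(7)
pop = list(range(100))                    # toy problem: fitness = the value itself
winners = [tournament_select(pop, fitness=lambda x: x, k=5, rng=rng)
           for _ in range(1000)]
# Larger k pushes winners toward the top ranks (higher selection pressure).
```

Raising k from 2 to 5 in this demo visibly shifts the winners' average upward, which is exactly the "higher k increases selection pressure" point.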

  29. Chromosome Modification • Modifications are stochastically triggered • Operator types are: • Mutation • Crossover (recombination) children modification modified children

  30. Crossover P1 (0 1 1 0 1 0 0 0) × P2 (1 1 0 1 1 0 1 0) → C1 (0 1 1 1 1 0 1 0), C2 (1 1 0 0 1 0 0 0). Crossover is a critical feature of genetic algorithms: • It greatly accelerates search in early evolution of a population • It leads to effective combination of subsolutions on different chromosomes.

  31. Mutation Before: (1 0 1 1 0 1 1 0) After: (0 1 1 0 0 1 1 0) • Causes movement in the search space (local or global) • Restores lost information to the population • Mutation prevents the algorithm from being trapped in a local minimum

  32. Mutations, crossovers, … • Different GAs realizations use different: • Representations • Mutations • Crossovers • Selection mechanisms

  33. 1-point crossover • Choose a random point on the two parents • Split parents at this crossover point • Create children by exchanging tails • pc typically in range (0.6, 0.9)
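The three steps above (choose a point, split, exchange tails), gated by the crossover probability pc, can be sketched as follows; the function name and the pc default are assumptions of this illustration, not from the slides.

```python
import random

def one_point_crossover(p1, p2, pc=0.7, rng=random):
    """1-point crossover sketch: with probability pc, split both parents
    at a random point and exchange tails; otherwise copy the parents."""
    if rng.random() >= pc:
        return p1[:], p2[:]                  # no crossover: clone parents
    cut = rng.randrange(1, len(p1))          # never cut at the very ends
    return p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]

# Parents from the slide 30 example, forcing crossover with pc=1.0:
rng = random.Random(0)
p1 = [0, 1, 1, 0, 1, 0, 0, 0]
p2 = [1, 1, 0, 1, 1, 0, 1, 0]
c1, c2 = one_point_crossover(p1, p2, pc=1.0, rng=rng)
```

Whatever cut point is drawn, every gene position of the children holds the two parents' genes at that position, just redistributed, which is the defining property of the operator.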

  34. n-point crossover • Choose n random crossover points • Split along those points • Glue parts, alternating between parents • Generalisation of 1 point

  35. Uniform crossover • Assign 'heads' to one parent, 'tails' to the other • Flip a coin for each gene of the first child • Make an inverse copy of the gene for the second child • Inheritance is independent of position
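The coin-flip description above can be written out as a short sketch (function name and demo parents are this illustration's assumptions): one flip per gene decides which parent the first child inherits from, and the second child always takes the opposite gene, so inheritance is independent of position.

```python
import random

def uniform_crossover(p1, p2, rng=random):
    """Uniform crossover sketch: a coin flip per gene decides which parent
    the first child inherits from; the second child gets the other gene."""
    c1, c2 = [], []
    for g1, g2 in zip(p1, p2):
        if rng.random() < 0.5:        # 'heads': child 1 keeps parent 1's gene
            c1.append(g1)
            c2.append(g2)
        else:                         # 'tails': the genes are swapped
            c1.append(g2)
            c2.append(g1)
    return c1, c2

# With all-zeros and all-ones parents the children are exact complements:
rng = random.Random(3)
c1, c2 = uniform_crossover([0] * 8, [1] * 8, rng=rng)
```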

  36. Mutation • Alter each gene independently with a probability pm • pm is called the mutation rate • Typically between 1/pop_size and 1/chromosome_length
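Per-gene mutation is a one-liner; this sketch (function name assumed for the illustration) flips each bit independently with probability pm, and the demo uses pm = 1/chromosome_length from the typical range mentioned above, which flips about one gene per chromosome on average.

```python
import random

def mutate(chromosome, pm, rng=random):
    """Bit-flip mutation sketch: each gene flips independently
    with probability pm (the mutation rate)."""
    return [1 - g if rng.random() < pm else g for g in chromosome]

rng = random.Random(5)
child = mutate([1, 0, 1, 1, 0, 1, 1, 0], pm=1 / 8, rng=rng)
# pm = 0 leaves the chromosome unchanged; pm = 1 flips every gene.
```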

  37. Crossover OR mutation? • Which one is better / necessary / primary? • Answer (at least, rather wide agreement): • it depends on the problem, but • in general, it is good to have both • each plays a different role • a mutation-only EA is possible; a crossover-only EA would not work.

  38. Crossover OR mutation? Exploration: Discovering promising areas in the search space, i.e. gaining information on the problem Exploitation: Optimising within a promising area, i.e. using information There is co-operation AND competition between them • Crossover is explorative, it makes a big jump to an area somewhere “in between” two (parent) areas • Mutation is exploitative, it creates random small diversions, thereby staying near (in the area of ) the parent

  39. Evaluation (modified children → evaluated children) • The evaluator decodes a chromosome and assigns it a fitness measure • The evaluator is the only link between a classical GA and the problem it is solving

  40. Deletion • Generational GA: the entire population is replaced with each iteration • Steady-state GA: a few members are replaced each generation; discarded members leave the population

  41. Issues for GA Practitioners • Choosing basic implementation issues: • representation • population size, mutation rate, ... • selection, deletion policies • crossover, mutation operators • Termination Criteria • Evaluation function for decoding solution

  42. Traveling Salesman Problem (example with route lists)

  43. Initial Population for TSP (5,3,4,6,2) (2,4,6,3,5) (4,3,6,5,2) (2,3,4,6,5) (4,3,6,2,5) (3,4,5,2,6) (3,5,4,6,2) (4,5,3,6,2) (5,4,2,3,6) (4,6,3,2,5) (3,4,2,6,5) (3,6,5,2,4)

  44. Select Parents (5,3,4,6,2) (2,4,6,3,5) (4,3,6,5,2) (2,3,4,6,5) (4,3,6,2,5) (3,4,5,2,6) (3,5,4,6,2) (4,5,3,6,2) (5,4,2,3,6) (4,6,3,2,5) (3,4,2,6,5) (3,6,5,2,4) Try to pick the better ones.

  45. Create Off-Spring – 1 point (5,3,4,6,2) (2,4,6,3,5) (4,3,6,5,2) (2,3,4,6,5) (4,3,6,2,5) (3,4,5,2,6) (3,5,4,6,2) (4,5,3,6,2) (5,4,2,3,6) (4,6,3,2,5) (3,4,2,6,5) (3,6,5,2,4) (3,4,5,6,2)

  46. Create More Offspring (5,3,4,6,2) (2,4,6,3,5) (4,3,6,5,2) (2,3,4,6,5) (4,3,6,2,5) (3,4,5,2,6) (3,5,4,6,2) (4,5,3,6,2) (5,4,2,3,6) (4,6,3,2,5) (3,4,2,6,5) (3,6,5,2,4) (3,4,5,6,2) (5,4,2,6,3)

  47. Mutate (5,3,4,6,2) (2,4,6,3,5) (4,3,6,5,2) (2,3,4,6,5) (4,3,6,2,5) (3,4,5,2,6) (3,5,4,6,2) (4,5,3,6,2) (5,4,2,3,6) (4,6,3,2,5) (3,4,2,6,5) (3,6,5,2,4) (3,4,5,6,2) (5,4,2,6,3)

  48. Mutate (5,3,4,6,2) (2,4,6,3,5) (4,3,6,5,2) (2,3,4,6,5) (2,3,6,4,5) (3,4,5,2,6) (3,5,4,6,2) (4,5,3,6,2) (5,4,2,3,6) (4,6,3,2,5) (3,4,2,6,5) (3,6,5,2,4) (3,4,5,6,2) (5,4,2,6,3)

  49. Eliminate (5,3,4,6,2) (2,4,6,3,5) (4,3,6,5,2) (2,3,4,6,5) (2,3,6,4,5) (3,4,5,2,6) (3,5,4,6,2) (4,5,3,6,2) (5,4,2,3,6) (4,6,3,2,5) (3,4,2,6,5) (3,6,5,2,4) (3,4,5,6,2) (5,4,2,6,3) Tend to kill off the worst ones.

  50. Integrate (5,4,2,6,3) (5,3,4,6,2) (2,4,6,3,5) (3,4,5,6,2) (2,3,6,4,5) (3,4,5,2,6) (3,5,4,6,2) (4,5,3,6,2) (5,4,2,3,6) (4,6,3,2,5) (3,4,2,6,5) (3,6,5,2,4)
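For route lists like those in slides 43-50, plain 1-point crossover and bit-flip mutation would produce invalid tours (duplicated or missing cities), so permutation-preserving operators are used instead. The slides do not name specific operators; the sketch below shows one common pair as an assumption of this illustration: order crossover (OX) and swap mutation, demonstrated on two parents from the slide 43 population.

```python
import random

def order_crossover(p1, p2, rng=random):
    """Order crossover (OX) sketch: copy a random slice from parent 1,
    then fill the remaining positions with parent 2's cities in order,
    so every city appears exactly once in the child."""
    n = len(p1)
    a, b = sorted(rng.sample(range(n), 2))
    child = [None] * n
    child[a:b + 1] = p1[a:b + 1]
    fill = [city for city in p2 if city not in child]
    for i in range(n):
        if child[i] is None:
            child[i] = fill.pop(0)
    return child

def swap_mutation(route, rng=random):
    """Swap mutation sketch: exchange two randomly chosen cities,
    which keeps the route a valid permutation."""
    i, j = rng.sample(range(len(route)), 2)
    route = route[:]
    route[i], route[j] = route[j], route[i]
    return route

# Parents taken from the slide 43 population:
rng = random.Random(2)
child = order_crossover([5, 3, 4, 6, 2], [2, 4, 6, 3, 5], rng=rng)
mutant = swap_mutation(child, rng=rng)
```

Both operators return valid tours over the same five cities, which is the property that makes a GA on route lists work at all.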
