
Algoritmi Evolutivi

Algoritmi Evolutivi. Andrea G. B. Tettamanzi. Lecture 1, 23 April 2002. Contents of the lectures: taxonomy and history; evolutionary algorithms basics; theoretical background.


Presentation Transcript


  1. Algoritmi Evolutivi Andrea G. B. Tettamanzi

  2. Lecture 1, 23 April 2002

  3. Contents of the Lectures • Taxonomy and History; • Evolutionary Algorithms basics; • Theoretical Background; • Outline of the various techniques: plain genetic algorithms, evolutionary programming, evolution strategies, genetic programming; • Practical implementation issues; • Evolutionary algorithms and soft computing; • Selected applications from the biological and medical area; • Summary and Conclusions.

  4. Bibliography

      • Th. Bäck. Evolutionary Algorithms in Theory and Practice. Oxford University Press, 1996
      • L. Davis. The Handbook of Genetic Algorithms. Van Nostrand Reinhold, 1991
      • D. B. Fogel. Evolutionary Computation. IEEE Press, 1995
      • D. E. Goldberg. Genetic Algorithms in Search, Optimization and Machine Learning. Addison-Wesley, 1989
      • J. Koza. Genetic Programming. MIT Press, 1992
      • Z. Michalewicz. Genetic Algorithms + Data Structures = Evolution Programs. Springer-Verlag, 3rd ed., 1996
      • H.-P. Schwefel. Evolution and Optimum Seeking. Wiley & Sons, 1995
      • J. Holland. Adaptation in Natural and Artificial Systems. MIT Press, 1992

  5. Taxonomy (1) — Stochastic optimization methods include Monte Carlo methods, Tabu Search, simulated annealing, and Evolutionary Algorithms; the evolutionary family comprises Genetic Algorithms, Evolution Strategies, Genetic Programming, and Evolutionary Programming.

  6. Taxonomy — Distinctive traits of an evolutionary algorithm: • it operates on an appropriate encoding of the solutions; • it maintains at every instant a population of candidate solutions; • it requires no regularity conditions (e.g. differentiability); • it follows probabilistic transition rules.

  7. History (1) — John H. Holland, University of Michigan, Ann Arbor, '60s; I. Rechenberg and H.-P. Schwefel, TU Berlin, '60s; L. Fogel, UC San Diego, '60s; John Koza, Stanford University, '80s.

  8. History (2)

      1859  Charles Darwin: inheritance, variation, natural selection
      1957  G. E. P. Box: random mutation & selection for optimization
      1958  Fraser, Bremermann: computer simulation of evolution
      1964  Rechenberg, Schwefel: mutation & selection
      1966  Fogel et al.: evolving automata - "evolutionary programming"
      1975  Holland: crossover, mutation & selection - "reproductive plan"
      1975  De Jong: parameter optimization - "genetic algorithm"
      1989  Goldberg: first textbook
      1991  Davis: first handbook
      1992  Koza: evolving LISP programs - "genetic programming"

  9. Evolutionary Algorithms Basics • what an EA is (the Metaphor) • object problem and fitness • the Ingredients • schemata • implicit parallelism • the Schema Theorem • the building blocks hypothesis • deception

  10. The metaphor

      EVOLUTION                   PROBLEM SOLVING
      Environment            ↔    Problem to be solved
      Individual             ↔    Candidate solution
      Fitness (adaptation)   ↔    Solution quality

  11. Object problem and Fitness — A genotype s is decoded by a mapping M into a candidate solution of the object problem; the quality of that solution is measured by the fitness function f.

  12. The ingredients of an evolutionary algorithm — a population of (appropriately encoded) solutions, the "DNA" of each solution; the passage from generation t to generation t + 1 through reproduction; selection (survival of the fittest); mutation; recombination.

  13. The evolutionary cycle — Population → Selection → Parents → Recombination / Reproduction → Mutation → Offspring → Replacement → Population.

  14. Pseudocode

      generation = 0;
      SeedPopulation(popSize);        // at random or from a file
      while (!TerminationCondition()) {
          generation = generation + 1;
          CalculateFitness();         // ... of the new genotypes
          Selection();                // select the genotypes that will reproduce
          Crossover(pcross);          // mate a fraction pcross of them on average
          Mutation(pmut);             // mutate all the offspring with Bernoulli
                                      //   probability pmut over genes
      }

  15. A Sample Genetic Algorithm • The MAXONE problem • Genotypes are bit strings • Fitness-proportionate selection • One-point crossover • Flip mutation (transcription error)

  16. The MAXONE Problem — Problem instance: a string of l binary cells. Fitness: f(x) = Σ_i x_i, the number of ones in the string. Objective: maximize the number of ones in the string.
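A minimal Python sketch of the MAXONE encoding: genotypes as bit lists and fitness as the count of ones (the length l = 10 matches the later examples; the function names are illustrative).

```python
import random

L = 10  # string length; the worked examples in the slides use l = 10

def random_genotype(l=L):
    """A genotype is a list of l binary cells."""
    return [random.randint(0, 1) for _ in range(l)]

def fitness(g):
    """MAXONE fitness: the number of ones in the bit string."""
    return sum(g)
```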

  17. Fitness-Proportionate Selection — Probability of individual x being selected: P(x) = f(x) / Σ_y f(y), the sum running over the whole population. Implementation: "roulette wheel".
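Roulette-wheel selection can be sketched in Python (a minimal version; the helper name roulette_select is illustrative): each individual occupies a slice of the wheel proportional to its fitness, and a uniform random spin picks one.

```python
import random

def roulette_select(population, fitness, rng=random):
    """Fitness-proportionate ("roulette wheel") selection:
    individual i is chosen with probability f(i) / sum_j f(j)."""
    total = sum(fitness(g) for g in population)
    r = rng.uniform(0, total)          # spin the wheel
    cumulative = 0.0
    for g in population:
        cumulative += fitness(g)
        if r <= cumulative:
            return g
    return population[-1]              # guard against round-off
```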

  18. One-Point Crossover — the two parents are cut at a common, randomly chosen crossover point and the tails are swapped, producing two offspring. (Figure: two 10-bit parents and the two resulting offspring.)

  19. Mutation — each gene of the offspring is flipped independently with Bernoulli probability pmut ("transcription errors"), e.g. 1011001101 → 1011101100 (two genes flipped).
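One-point crossover and flip mutation, as used in the sample GA, can be sketched in Python (function names are illustrative):

```python
import random

def one_point_crossover(p1, p2, rng=random):
    """Cut both parents at the same random point and swap the tails."""
    point = rng.randint(1, len(p1) - 1)
    return p1[:point] + p2[point:], p2[:point] + p1[point:]

def mutate(g, pmut, rng=random):
    """Flip each gene independently with Bernoulli probability pmut."""
    return [1 - bit if rng.random() < pmut else bit for bit in g]
```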

  20. Example: Selection

      0111011011   f = 7   Cf = 7    P = 0.125
      1011011101   f = 7   Cf = 14   P = 0.125
      1101100010   f = 5   Cf = 19   P = 0.089
      0100101100   f = 4   Cf = 23   P = 0.071
      1100110011   f = 6   Cf = 29   P = 0.107
      1111001000   f = 5   Cf = 34   P = 0.089
      0110001010   f = 4   Cf = 38   P = 0.071
      1101011011   f = 7   Cf = 45   P = 0.125
      0110110000   f = 4   Cf = 49   P = 0.071
      0011111101   f = 7   Cf = 56   P = 0.125

      Random sequence: 43, 1, 19, 35, 15, 22, 24, 38, 44, 2

  21. Example: Recombination & Mutation

      0111011011 × 0111011011 → 0111111011   f = 8
      0111011011 × 0111011011 → 0111011011   f = 7
      110|1100010 × 010|0101100 → 1100101100   f = 5
      010|0101100 × 110|1100010 → 0101100010   f = 4
      1|100110011 × 1|100110011 → 1100110011   f = 6
      1|100110011 × 1|100110011 → 1000110011   f = 5
      0110001010 × 0110001010 → 0110001010   f = 4
      1101011011 × 1101011011 → 1101011011   f = 7
      011000|1010 × 110101|1011 → 0110001011   f = 5
      110101|1011 × 011000|1010 → 1101011010   f = 6

      TOTAL = 57
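Putting the pieces together, a minimal Python version of the whole sample GA following the earlier pseudocode (the parameter values pop_size, pcross, pmut and the generation count are illustrative, not taken from the slides):

```python
import random

def run_ga(l=10, pop_size=10, pcross=0.7, pmut=0.05,
           generations=50, seed=0):
    """Minimal MAXONE GA: fitness-proportionate selection,
    one-point crossover, per-gene flip mutation."""
    rng = random.Random(seed)
    fitness = sum                      # MAXONE: count the ones
    pop = [[rng.randint(0, 1) for _ in range(l)]
           for _ in range(pop_size)]

    def select():
        # roulette wheel over the current population
        total = sum(fitness(g) for g in pop)
        r = rng.uniform(0, total)
        c = 0
        for g in pop:
            c += fitness(g)
            if r <= c:
                return g
        return pop[-1]

    for _ in range(generations):
        offspring = []
        while len(offspring) < pop_size:
            p1, p2 = select(), select()
            if rng.random() < pcross:  # mate pcross of them on average
                pt = rng.randint(1, l - 1)
                p1, p2 = p1[:pt] + p2[pt:], p2[:pt] + p1[pt:]
            # flip each gene with Bernoulli probability pmut
            offspring += [[1 - b if rng.random() < pmut else b for b in g]
                          for g in (p1, p2)]
        pop = offspring[:pop_size]
    return max(pop, key=fitness)
```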

  22. Lecture 2, 24 April 2002

  23. Schemata

      Don't-care symbol: *
      Order of a schema: o(S) = number of fixed positions
      Defining length: δ(S) = distance between the first and last fixed position
      A schema S matches 2^(l − o(S)) strings of length l;
      a string of length l is matched by 2^l schemata.
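The schema quantities o(S) and δ(S) and the matching relation can be sketched in Python, with '*' as the don't-care symbol (function names are illustrative):

```python
def order(schema):
    """o(S): number of fixed (non-'*') positions."""
    return sum(c != '*' for c in schema)

def defining_length(schema):
    """delta(S): distance between the first and last fixed position."""
    fixed = [i for i, c in enumerate(schema) if c != '*']
    return fixed[-1] - fixed[0] if fixed else 0

def matches(schema, string):
    """A schema matches a string iff they agree on every fixed position."""
    return all(s == '*' or s == c for s, c in zip(schema, string))
```

For example, the schema "1*0*1" has order 3, defining length 4, and matches 2^(5 − 3) = 4 strings of length 5.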

  24. Implicit Parallelism — In a population of n individuals of length l, between 2^l and n·2^l schemata are processed, n^3 of which are processed usefully, i.e. are not disrupted by crossover and mutation (Holland 1989). But see A. Bertoni & M. Dorigo (1993), "Implicit Parallelism in Genetic Algorithms", Artificial Intelligence 61(2), pp. 307–314.

  25. Fitness of a schema — f(γ): fitness of string γ; q_x(γ): fraction of strings equal to γ in population x; q_x(S): fraction of strings matched by schema S in population x; f(S): average fitness of the strings matched by S.

  26. The Schema Theorem — Let {X_t}, t = 0, 1, ... be the populations at successive generations. For a schema S,

      E[q_{X_{t+1}}(S)] ≥ q_{X_t}(S) · (f(S)/f̄) · [1 − p_c·δ(S)/(l − 1)] · (1 − p_m)^{o(S)},

      where f̄ is the average population fitness, p_c the crossover probability and p_m the per-gene mutation probability. Suppose the factor multiplying q_{X_t}(S) is a constant greater than 1: then the schema grows geometrically, i.e. above-average individuals increase exponentially!

  27. The Schema Theorem (proof)

  28. The Building Blocks Hypothesis ‘‘An evolutionary algorithm seeks near-optimal performance through the juxtaposition of short, low-order, high-performance schemata — the building blocks’’

  29. Deception — when the building-block hypothesis does not hold: for some schema S that does not match the optimum, f(S) is higher than the fitness of the competing schemata that do. Example: optimum * = 1111111111, building blocks S1 = 111******* and S2 = ********11, hence S = 111*****11; a deceptive fitness function instead rewards the complementary schema 000*****00.

  30. Remedies to deception • Prior knowledge of the objective function • Non-deceptive encoding • Inversion: the semantics of genes is not positional • Underspecification & overspecification: "Messy Genetic Algorithms"

  31. Theoretical Background • Theory of random processes; • Convergence in probability; • Open question: rate of convergence.

  32. Events — A sample space Ω contains the elementary outcomes ω; events such as A and B are subsets of Ω. (Figure: a Venn-style diagram of Ω with events A and B.)

  33. Random variables — A random variable X maps each outcome ω ∈ Ω to a real number X(ω). (Figure: the mapping from Ω to the real line.)

  34. Stochastic processes — A stochastic process is a sequence of random variables {X_t}, t = 0, 1, ..., each with its own probability distribution.

  35. EAs as Random Processes — The evolutionary process is a trajectory in a probability space: the "random numbers" drive the process, and each population is a sample of size n.

  36. Markov chains — A stochastic process is a Markov chain iff its state depends only on the previous state, i.e., for every t, P(X_{t+1} | X_t, X_{t−1}, ..., X_0) = P(X_{t+1} | X_t). (Figure: a three-state chain over states A, B, C with its transition probabilities.)
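A Markov chain can be simulated directly from its transition matrix. The three-state matrix below is hypothetical (the exact probabilities on the slide's diagram are not recoverable); what matters is the Markov property: the next state is drawn from a distribution that depends only on the current state.

```python
import random

# Hypothetical 3-state transition matrix (each row sums to 1);
# the values are NOT the ones from the slide's figure.
P = {
    'A': {'A': 0.25, 'B': 0.75, 'C': 0.0},
    'B': {'A': 0.4,  'B': 0.0,  'C': 0.6},
    'C': {'A': 0.3,  'B': 0.7,  'C': 0.0},
}

def step(state, rng=random):
    """One Markov step: sample the next state from row P[state]."""
    r = rng.random()
    c = 0.0
    for nxt, p in P[state].items():
        c += p
        if r < c:
            return nxt
    return state  # guard against floating-point round-off

def trajectory(start, n, seed=0):
    """A length-n trajectory of the chain from a given start state."""
    rng = random.Random(seed)
    states = [start]
    for _ in range(n):
        states.append(step(states[-1], rng))
    return states
```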

  37. Abstract Evolutionary Algorithm — The generation step is built from stochastic functions select, mate, cross, mutate and insert; their composition is the transition function that maps population X_t to population X_t+1.

  38. Convergence to Optimum — Theorem: if {X_t(ω)}, t = 0, 1, ... is monotone and homogeneous, x_0 is given, and the optimum is reachable from x_0, then the process converges to the optimum with probability 1. Theorem: if select and mutate are generous, the neighborhood structure is connective, and the transition functions T_t(ω), t = 0, 1, ... are i.i.d. and elitist, then the algorithm converges to the optimum in probability.

  39. Lecture 3, 7 May 2002

  40. Outline of various techniques • Plain Genetic Algorithms • Evolutionary Programming • Evolution Strategies • Genetic Programming

  41. Plain Genetic Algorithms • Individuals are bit strings • Mutation as transcription error • Recombination is crossover • Fitness proportionate selection

  42. Evolutionary Programming • Individuals are finite-state automata • Used to solve prediction tasks • State-transition table modified by uniform random mutation • No recombination • Fitness depends on the number of correct predictions • Truncation selection

  43. Evolutionary Programming: Individuals — A finite-state automaton (Q, q0, A, Σ, δ, ω): set of states Q; initial state q0; set of accepting states A; alphabet of symbols Σ; transition function δ: Q × Σ → Q; output mapping function ω: Q × Σ → Σ. (Figure: a three-state automaton with states q0, q1, q2, arcs labelled input/output, and its state-transition table.)
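A Mealy-style automaton of this kind can be sketched in Python as a dictionary-based transition table. The states, symbols and table entries below are a hypothetical example, not the ones in the slide's figure.

```python
# (state, input) -> (next state, output); hypothetical example values
delta = {
    ('q0', 'a'): ('q1', 'b'),
    ('q0', 'b'): ('q0', 'c'),
    ('q1', 'a'): ('q2', 'a'),
    ('q1', 'b'): ('q0', 'a'),
    ('q2', 'a'): ('q0', 'c'),
    ('q2', 'b'): ('q1', 'b'),
}

def run(automaton, inputs, start='q0'):
    """Feed a symbol sequence through a Mealy-style automaton,
    collecting the output emitted on each transition."""
    state, outputs = start, []
    for sym in inputs:
        state, out = automaton[(state, sym)]
        outputs.append(out)
    return outputs
```

In the evolutionary-programming setting each output is read as a prediction of the next input symbol, and the fitness of the automaton counts the correct predictions.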

  44. Evolutionary Programming: Fitness — The automaton reads an input sequence (e.g. a b c a b c a b ...); at every step its output is taken as a prediction of the next input symbol. If the prediction is correct, f(individual) is incremented by 1.

  45. Evolutionary Programming: Selection — Variant of stochastic q-tournament selection: each individual γ is compared with q randomly chosen opponents γ_1, ..., γ_q and gets score(γ) = #{i | f(γ) > f(γ_i)}. Individuals are ordered by decreasing score and the first half is selected (truncation selection).
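The scoring scheme can be sketched in Python (a minimal version; q and the function names are illustrative):

```python
import random

def tournament_scores(population, fitness, q=10, seed=0):
    """Stochastic q-tournament: each individual meets q random
    opponents and scores one point per opponent it beats; the
    best-scoring half of the population survives (truncation)."""
    rng = random.Random(seed)
    scores = []
    for ind in population:
        opponents = [rng.choice(population) for _ in range(q)]
        scores.append(sum(fitness(ind) > fitness(o) for o in opponents))
    ranked = sorted(zip(scores, range(len(population))), reverse=True)
    return [population[i] for _, i in ranked[:len(population) // 2]]
```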

  46. Evolution Strategies • Individuals are n-dimensional vectors of reals • Fitness is the objective function • Mutation distribution can be part of the genotype (standard deviations and covariances evolve with solutions) • Multi-parent recombination • Deterministic selection (truncation selection)

  47. Evolution Strategies: Individuals — An individual is a triple (x, σ, θ): the candidate solution x, the standard deviations σ and the rotation angles θ of its mutation distribution.

  48. Evolution Strategies: Mutation — Self-adaptation: the step sizes are mutated log-normally, σ′_i = σ_i · exp(τ′·N(0,1) + τ·N_i(0,1)), and then the solution is perturbed, x′_i = x_i + σ′_i·N_i(0,1). Hans-Paul Schwefel suggests τ′ = 1/√(2n) and τ = 1/√(2√n).
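A sketch of this self-adaptive mutation in Python, using Schwefel's suggested learning rates; the split into one global Gaussian draw shared by all coordinates plus one per-coordinate draw follows the standard log-normal rule (rotation angles are omitted for simplicity):

```python
import math
import random

def es_mutate(x, sigma, rng=random):
    """Log-normal self-adaptation of step sizes, then a Gaussian
    perturbation of each coordinate with its own step size."""
    n = len(x)
    tau_global = 1.0 / math.sqrt(2.0 * n)            # tau'
    tau_local = 1.0 / math.sqrt(2.0 * math.sqrt(n))  # tau
    g = rng.gauss(0.0, 1.0)  # one global draw, shared by all coordinates
    new_sigma = [s * math.exp(tau_global * g + tau_local * rng.gauss(0, 1))
                 for s in sigma]
    new_x = [xi + si * rng.gauss(0, 1)
             for xi, si in zip(x, new_sigma)]
    return new_x, new_sigma
```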

  49. Genetic Programming • Program induction • LISP (historically), math expressions, machine language, ... • Applications: • optimal control; • planning; • sequence induction; • symbolic regression; • modelling and forecasting; • symbolic integration and differentiation; • inverse problems

  50. Genetic Programming: The Individuals — a subset of LISP S-expressions, e.g. (OR (AND (NOT d0) (NOT d1)) (AND d0 d1)). (Figure: the corresponding expression tree with OR at the root.)
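Such an individual can be represented as a nested tuple and evaluated recursively; a minimal Python sketch, with tuples standing in for LISP S-expressions:

```python
def evaluate(expr, env):
    """Recursively evaluate a boolean S-expression given as nested
    tuples; leaf strings are terminal names looked up in env."""
    if isinstance(expr, str):
        return env[expr]
    op, *args = expr
    vals = [evaluate(a, env) for a in args]
    if op == 'OR':
        return any(vals)
    if op == 'AND':
        return all(vals)
    if op == 'NOT':
        return not vals[0]
    raise ValueError(op)

# the individual from the slide: (OR (AND (NOT d0) (NOT d1)) (AND d0 d1))
tree = ('OR', ('AND', ('NOT', 'd0'), ('NOT', 'd1')),
              ('AND', 'd0', 'd1'))
```

Note that this particular tree computes d0 == d1 (it is true exactly when the two inputs agree).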
