
Genetic Algorithms: A Search & Optimization Tool


Presentation Transcript


  1. Genetic Algorithms: A Search & Optimization Tool

  2. Genetic Algorithm • A search and optimization algorithm. • Based on principles of Natural Selection and Genetics. • Proposed by John Holland of the University of Michigan in 1975 to study the phenomenon of adaptation as it occurs in nature.

  3. Problems with Traditional Methods • The search space is often complicated, and one does not know where to look for the solution or where to start from; this is where GAs can help. • Traditional methods often require domain knowledge of the problem, which might not be readily available. • Many traditional methods are sensitive to the initial guess; given an inappropriate guess, the method may not converge to the solution.

  4. Natural Selection • A process called natural selection 'selects' the individuals best adapted to the environment. • The fittest survive longest. • Characteristics, encoded in genes, are transmitted to offspring and tend to propagate into new generations. • In sexual reproduction, the chromosomes of offspring are a mix of their parents'. • An offspring's characteristics are partly inherited from its parents and partly the result of new genes created during the reproduction process.

  5. Terminology • A chromosome, often encoded as a bit string, represents a candidate solution in the population. • Genes are single bits or short blocks of adjacent bits that encode a particular element of the candidate solution. • Alleles are the 0s and 1s at each position of a bit string.
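To make the terminology concrete, here is a minimal Python sketch (not from the slides): a chromosome as a list of bits, with a hypothetical two-gene layout; the gene boundaries and the decoding scheme are illustrative assumptions.

```python
# Hypothetical example: a 10-bit chromosome whose "genes" are slices of
# adjacent bits and whose "alleles" are the individual 0/1 values.
chromosome = [0, 1, 1, 0, 1, 1, 0, 0, 0, 1]   # one candidate solution

gene_a = chromosome[0:5]   # assumed layout: gene A = bits 0-4
gene_b = chromosome[5:10]  #                 gene B = bits 5-9

def decode(gene):
    """Interpret a gene's alleles (0s and 1s) as an unsigned integer."""
    value = 0
    for bit in gene:
        value = value * 2 + bit
    return value

print(decode(gene_a), decode(gene_b))   # 13 17
```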

  6. Nature to Computer Mapping

  7. Elements of a Genetic Algorithm • A Population of chromosomes. • A Fitness Function. • Genetic Operators - Selection - Crossover - Mutation

  8. Genetic Operators • Selection: This operator selects chromosomes in the population for reproduction. The fitter the chromosome, the more times it is likely to be selected to reproduce. • Crossover: This operator randomly chooses a locus and exchanges the subsequences before and after that locus between two chromosomes to create two offspring. • Mutation: This operator randomly flips some of the bits in a chromosome.
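As a hedged illustration of these three operators, the following Python sketch implements fitness-proportional (roulette-wheel) selection, one-point crossover, and bit-flip mutation on list-of-bits chromosomes; the function names and the use of Python's random module are my own choices, not part of the slides.

```python
import random

def roulette_select(population, fitnesses):
    """Selection: fitter chromosomes are chosen proportionally more often."""
    pick = random.uniform(0, sum(fitnesses))
    running = 0.0
    for chromosome, fit in zip(population, fitnesses):
        running += fit
        if running >= pick:
            return chromosome
    return population[-1]

def one_point_crossover(parent1, parent2):
    """Crossover: choose a random locus and swap the tails of the two parents."""
    point = random.randint(1, len(parent1) - 1)
    return parent1[:point] + parent2[point:], parent2[:point] + parent1[point:]

def bit_flip_mutation(chromosome, p_m):
    """Mutation: flip each bit independently with probability p_m."""
    return [1 - bit if random.random() < p_m else bit for bit in chromosome]
```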

  9. Basic Algorithm • Initialise and evaluate a population • While (termination condition not met) do • Select sub-population based on fitness • Produce offspring of the population using crossover • Mutate offspring stochastically • Select survivors based on fitness

  10. A Sample Example Problem: Find the value of x which maximises the function f(x) = x² for integer x in the range [0..31]. Solution: • Choose a population of 4 individuals (small by GA standards), with values chosen at random. Sample population: 01101, 11000, 01000, 10011 • Fitness values: 169, 576, 64, 361 (Average 293)
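The fitness values on this slide can be reproduced in a few lines of Python: decode each 5-bit string to an integer and score it with the f(x) = x² objective described above.

```python
population = ["01101", "11000", "01000", "10011"]

def fitness(chromosome):
    x = int(chromosome, 2)   # interpret the 5-bit string as an integer in [0, 31]
    return x * x             # f(x) = x^2

scores = [fitness(c) for c in population]
print(scores)                     # [169, 576, 64, 361]
print(sum(scores) / len(scores))  # 292.5, reported as ~293 on the slide
```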

  11. A Sample Example Contd. (1) • Reproduction (e.g. using a roulette wheel) • Selection probabilities: 14.4%, 49.2%, 5.5%, 30.9% • Selection for crossover: 01101 and 11000 • Crossover after bit 4: 0110|0 and 1100|1 (offspring 01100 and 11001) • Another selection: 11000 and 10011 • Crossover after bit 2: 11|011 and 10|000 (offspring 11011 and 10000) • In both cases the bit position is chosen at random
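A small Python check (my own, not from the slides) reproduces the roulette-wheel percentages and the first crossover on this slide.

```python
# Roulette-wheel shares: each string's fraction of the total fitness.
fitness_values = [169, 576, 64, 361]
total = sum(fitness_values)                                   # 1170
print([round(100 * f / total, 1) for f in fitness_values])    # [14.4, 49.2, 5.5, 30.9]

# One-point crossover of the first selected pair after bit 4.
p1, p2 = "01101", "11000"
point = 4
print(p1[:point] + p2[point:], p2[:point] + p1[point:])       # 01100 11001
```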

  12. A Sample Example Contd. (2) • New population after first generation: 01100, 11001, 11011, 10000 • Fitness values: 144, 625, 729, 256 (Avg. 439) • Previous generation: best 576, Avg. 293 • The next generation starts from this population

  13. A Sample Example Contd. (3) • Mutation: flip each bit with a small probability • e.g. p_m = 0.001 • Expected number of mutated bits per individual is 0.005 (number of bits × p_m) • Expected number of mutated bits in the whole population per generation is 0.02 (since there are 4 individuals) • Eventually the run converges, with best individual 11111 and a fitness of 961
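The mutation arithmetic on this slide is a simple expectation calculation; a short sketch using the slide's values makes it explicit.

```python
p_m, bits, pop_size = 0.001, 5, 4
per_individual = bits * p_m                 # 0.005 expected bit flips per chromosome
per_population = pop_size * per_individual  # 0.02 expected bit flips per generation
# For small p_m this also approximates the probability of at least one flip
# in a chromosome: 1 - (1 - p_m) ** bits ≈ 0.00499
print(per_individual, per_population)
```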

  14. Algorithm (Revisited...) 1. Start with a randomly generated population of n chromosomes, each m bits long. 2. Calculate the fitness f(x) of each chromosome x in the population. 3. Repeat the following steps: a) Select a pair of parent chromosomes from the current population. b) With probability Pc, cross over the pair at a randomly chosen point to form two offspring. c) Mutate the two offspring at each locus with probability Pm, and place the resulting chromosomes in the new population. 4. Replace the current population with the n most fit chromosomes. 5. Go to step 2.
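Putting these steps together, here is a minimal runnable sketch of the algorithm for the x² example, assuming roulette-wheel selection and a "keep the n fittest of parents plus offspring" reading of step 4; the parameter values (population size, Pc, Pm, number of generations) are illustrative choices, not taken from the slides.

```python
import random

BITS, POP_SIZE, PC, PM, GENERATIONS = 5, 20, 0.7, 0.01, 50   # assumed parameters

def fitness(chromosome):
    return int(chromosome, 2) ** 2            # f(x) = x^2 on [0, 31]

def select(population):
    """Roulette-wheel selection: probability proportional to fitness."""
    weights = [fitness(c) + 1e-9 for c in population]  # epsilon guards an all-zero population
    return random.choices(population, weights=weights, k=1)[0]

def crossover(p1, p2):
    """With probability Pc, one-point crossover; otherwise copy the parents."""
    if random.random() < PC:
        point = random.randint(1, BITS - 1)
        return p1[:point] + p2[point:], p2[:point] + p1[point:]
    return p1, p2

def mutate(chromosome):
    """Flip each bit independently with probability Pm."""
    return "".join(b if random.random() >= PM else "10"[int(b)] for b in chromosome)

population = ["".join(random.choice("01") for _ in range(BITS)) for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    offspring = []
    while len(offspring) < POP_SIZE:
        child1, child2 = crossover(select(population), select(population))
        offspring += [mutate(child1), mutate(child2)]
    # Step 4 (one reading): keep the n fittest of parents and offspring combined.
    population = sorted(population + offspring, key=fitness, reverse=True)[:POP_SIZE]

best = max(population, key=fitness)
print(best, fitness(best))                    # typically converges to 11111, fitness 961
```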

  15. Algorithm contd.. • Each iteration of this process is called a Generation; a run typically uses 50 to 500 generations. • The entire set of generations is called a Run. • At the end of a run there are often one or more highly fit chromosomes in the population. • This simple procedure forms the basis for most applications of GAs. • The success of the algorithm depends on details such as the population size and the crossover and mutation probabilities.

  16. Search Space • The set of all possible individuals (solutions) defines the search space. • One measure of the complexity of the problem is the size of the search space. • Crossover and mutation implement a pseudo-random walk through the search space. • Walk is random because crossover and mutation are non-deterministic. • Walk is directed in the sense that the algorithm aims to maximise quality of solutions using a fitness function which measures the fitness of an individual.

  17. Theoretical Foundations • John Holland's Schema Theorem: a schema is a similarity template describing a subset of strings with similarities at certain positions. • Building Block Hypothesis: schemata with high fitness and small defining length are called building blocks. Building blocks combine to form bigger and better building blocks and, eventually, the optimal solution(s).
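As an illustration of the schema idea (my own sketch, not from the slides), a schema such as 1***1 uses '*' as a "don't care" symbol; its defining length is the distance between its outermost fixed positions.

```python
def matches(schema, string):
    """True if the string agrees with the schema at every fixed (non-'*') position."""
    return all(s == '*' or s == b for s, b in zip(schema, string))

def defining_length(schema):
    """Distance between the first and last fixed positions of the schema."""
    fixed = [i for i, s in enumerate(schema) if s != '*']
    return fixed[-1] - fixed[0] if fixed else 0

print(matches("1***1", "11001"), matches("1***1", "11000"))  # True False
print(defining_length("1***1"), defining_length("*11**"))    # 4 1
```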

  18. GAs Vs. other Search & Optimization Methods • GAs work with a population of candidate solutions, not a single point. • GAs work with a coding of the parameters rather than the parameters themselves. • GAs do not require domain knowledge (gradient information etc.) and use only the payoff (fitness) information. • GAs are stochastic methods, i.e. they use probabilistic rather than deterministic transition rules. • GAs apply to a wide variety of problems rather than working only in a restricted domain.

  19. GAs Vs. other Search & Optimization Methods • Multiple solutions can be obtained without extra effort. • GAs are implicitly parallel and can be implemented on parallel machines. • GAs are quite successful at locating the regions containing the optimal solution(s), if not the optimum solution itself. • GAs can be applied to problems involving a large time domain.

  20. Related Fields • Evolutionary Strategies • Genetic Programming • Genetic Engineering

  21. Some Applications of GA • Optimization • Automatic Programming • Machine Learning • Economics • Immune systems • Ecology • Population genetics • Evolution and learning • Social systems • Bioinformatics • Neural Networks & Fuzzy Logic

  22. References Books • Introduction to Genetic Algorithms, M. Mitchell, MIT Press, 1996. • Genetic Algorithms in Search, Optimization and Machine Learning, D. E. Goldberg, Addison-Wesley, 1989. • An Introduction to Genetic Algorithms for Scientists and Engineers, D. A. Coley, World Scientific, 1999. • Optimization for Engineering Design: Algorithms and Examples, Kalyanmoy Deb, PHI.
