
Advanced Topics in Evolutionary Algorithms



Presentation Transcript


  1. Advanced Topics in Evolutionary Algorithms: Meta-GP crossover operator evolver for the Torcs car setup optimization problem. Mati Bot & Shimi Azrad

  2. Preparation for the contest • Use the simpleGA from the site • With one change: the crossover operator • We need to find the best crossover operator offline, before the contest (a sketch of a GA with a pluggable crossover operator follows below).
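A minimal Python sketch of that idea (this is not the contest's simpleGA; the fitness function, population size and selection scheme here are placeholder assumptions, and the pluggable crossover argument is the only point being illustrated):

    import random

    # Illustrative GA (not the contest's simpleGA): the crossover operator is
    # the single component we swap out between experiments.

    def uniform_crossover(p1, p2):
        """Baseline crossover: each gene comes from either parent at random."""
        picks = [random.random() < 0.5 for _ in p1]
        c1 = [a if keep else b for keep, a, b in zip(picks, p1, p2)]
        c2 = [b if keep else a for keep, a, b in zip(picks, p1, p2)]
        return c1, c2

    def run_ga(fitness, crossover, n_genes=12, pop_size=20, generations=100):
        """Run a simple GA; only `crossover` changes between experiments."""
        pop = [[random.random() for _ in range(n_genes)] for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=fitness, reverse=True)
            parents = pop[: pop_size // 2]          # truncation selection
            children = []
            while len(children) < pop_size - len(parents):
                children.extend(crossover(*random.sample(parents, 2)))
            pop = parents + children[: pop_size - len(parents)]
        return max(pop, key=fitness)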

  3. XO Meta-GP: evolving crossover operators • Objective: try to evolve a better XO operator for the Torcs car setup optimization problem. • Why did we think it should work? The parameters are arranged in an array, and it is rational to think that some of the parameters are related to each other (wheels and friction, for example).

  4. How it works: a population of crossover operators • Fitness function for an XO operator: • Do N times: • Start the Torcs simulation • Run the SimpleGA with the XO operator • Close Torcs • Start Torcs • Take the best solution found by the SimpleGA and test it in the simulation for a longer time • Close Torcs • The fitness is the average distance raced over all N solutions, just like the evaluation of the participants in the contest (a sketch of this evaluation loop follows below).
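A Python sketch of this meta-level fitness function. The callables run_simple_ga_in_torcs and race_distance_in_torcs are hypothetical wrappers standing in for however Torcs is launched, the SimpleGA is run with the given XO operator, and the distance raced is read back; they are not part of any real Torcs API.

    def xo_fitness(xo_operator, n_runs, run_simple_ga_in_torcs, race_distance_in_torcs):
        """Meta-level fitness of a crossover operator: the average distance
        raced over N independent SimpleGA runs that use this operator.

        Hypothetical helpers (assumed, not a real Torcs API):
        - run_simple_ga_in_torcs(xo): start Torcs, run the SimpleGA with the
          given crossover operator, close Torcs, return the best car setup.
        - race_distance_in_torcs(setup): start Torcs, race the setup for a
          longer time, close Torcs, return the distance raced.
        """
        distances = []
        for _ in range(n_runs):
            best_setup = run_simple_ga_in_torcs(xo_operator)
            distances.append(race_distance_in_torcs(best_setup))
        # Averaging over the N runs mirrors how contest entries are evaluated.
        return sum(distances) / len(distances)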

  5. Partition crossover operator • Basic assumption: the relations between the parameters are transitive, so the related parameters form a partition of the parameter set. • Encoding: each parameter is assigned a block label, e.g. 0 1 1 2 0 2 0 1 3 2 3 3; parameters that share a label belong to the same building block (see the sketch below).
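A small Python sketch of this encoding, assuming a building block is simply the set of parameter indices that share a label:

    from collections import defaultdict

    def blocks_from_labels(labels):
        """Group parameter indices by their block label.

        Example: [0, 1, 1, 2, 0, 2, 0, 1, 3, 2, 3, 3]
                 -> {0: [0, 4, 6], 1: [1, 2, 7], 2: [3, 5, 9], 3: [8, 10, 11]}
        """
        blocks = defaultdict(list)
        for index, label in enumerate(labels):
            blocks[label].append(index)
        return dict(blocks)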

  6. Example of a crossover [Figure: two 12-parameter parents, labelled 0 and 1, and the two offspring produced by one application of the partition crossover] • For each building block b: • Randomly choose x from {0, 1} • Copy b from parent x to offspring 0 • Copy b from parent (1 - x) to offspring 1 • Example of the selections for x: 1, 0, 1, 0 (one per building block). A code sketch follows below.
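A Python sketch of the crossover step described above, under the same assumption about building blocks (partition_crossover is an illustrative name, not the authors' code):

    import random
    from collections import defaultdict

    def partition_crossover(parent0, parent1, labels, rng=random):
        """Block-wise crossover: each building block is inherited as a unit."""
        blocks = defaultdict(list)
        for index, label in enumerate(labels):
            blocks[label].append(index)

        child0, child1 = list(parent0), list(parent1)
        parents = (parent0, parent1)
        for block in blocks.values():
            x = rng.randint(0, 1)              # randomly choose x from {0, 1}
            for i in block:
                child0[i] = parents[x][i]      # copy block b from parent x to offspring 0
                child1[i] = parents[1 - x][i]  # copy block b from parent (1 - x) to offspring 1
        return child0, child1

With the 12-parameter labels from slide 5 and the draws 1, 0, 1, 0 for the four blocks, offspring 0 takes blocks 0 and 2 from parent 1 and blocks 1 and 3 from parent 0, while offspring 1 does the opposite.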

  7. Result: we can observe that the evolved operator may have found some rational relations between the parameters.

  8. Results over 50 runs: • Our XO: average 16681, stddev 3141 • SimpleGA: average 14402, stddev 3371 • Approximately a 15% improvement!
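As a consistency check on the reported averages: (16681 - 14402) / 14402 ≈ 0.158, i.e. roughly a 15-16% relative improvement in average distance raced, matching the slide's approximate figure.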

  9. Typical runs of the genetic algorithms, usually around fitness 400. [Plot of typical runs; series: Our XO, SimpleGA]

  10. Future work • Test the algorithm with bigger values of N • Try to find other applications for this method of evolving XO operators • Compare the results with other algorithms • …and more!

  11. Raw data: the best distance raced from a single GA process of each type, 50 runs for each.
