Genetic Programming and Artificial Neural Networks

COSC 4V82
Michael Samborski
16 November 2012

Overview
  • Artificial Neural Network Review
  • The First to Try It
  • Developmental Approaches
  • Good Ideas Always Come Back
  • Comparison to Other Evolutionary ANN techniques
Artificial Neural Networks (ANNs)
  • Structure of neurons and links
    • Each neuron sums its incoming weighted links, applies an activation function, and produces an output
  • Take inputs and produce outputs
  • Learn by adjusting the weights on the links that inputs pass through
  • Hidden layers between the input and output layers are necessary to handle problems that are not linearly separable
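The structure above can be sketched in a few lines: each neuron sums its weighted incoming links, applies an activation function, and emits an output, with layers chained together. This is a minimal illustration, not code from the presentation; the step activation, layer sizes, and weights are all made up for the example.

```python
# Minimal feedforward ANN sketch: a neuron sums its weighted incoming
# links, applies an activation function, and produces an output.
# (Illustrative only; weights and sizes are arbitrary.)

def step(s):
    """Threshold activation: fire if the weighted sum is positive."""
    return 1.0 if s > 0.0 else 0.0

def neuron(inputs, weights, bias):
    s = sum(w * x for w, x in zip(weights, inputs)) + bias
    return step(s)

def forward(inputs, layers):
    """Each layer is a list of (weights, bias) pairs, one per neuron."""
    acts = inputs
    for layer in layers:
        acts = [neuron(acts, w, b) for w, b in layer]
    return acts

# A tiny 2-2-1 network with arbitrary weights:
net = [
    [([0.5, -0.4], 0.1), ([0.9, 0.2], -0.3)],  # hidden layer (2 neurons)
    [([1.0, -1.0], 0.0)],                      # output layer (1 neuron)
]
print(forward([1.0, 0.0], net))
```

Training such a network means searching for good link weights; the configuration (number of layers and neurons) is what the next slide calls out as the hard part.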
The Problem with ANNs
  • A wide variety of problems leads to a wide variety of network configurations
  • There is no good way to know in advance which configuration to use
  • Finding a good configuration by trial and error is a lengthy process
  • You are never sure you have the optimal network setup
  • Is there a better way?
  • GAs were used with some success, but a new player had just arrived on the AI scene
GP and ANNs
  • Surprise, surprise: Koza was one of the first to try it
  • Used a direct representation of an ANN in GP language form
  • The tree organizes itself into a single-rooted structure that can be read as a single-output ANN of arbitrary shape
  • No regular layering of the tree
    • Nodes could appear anywhere in the tree, with no guaranteed layer structure except for the output layer
Koza’s ANN GP Language
  • F = {P, W, +, -, *, /}
    • P is the linear activation function of a neuron
    • W multiplies all the branches coming into it
    • Restrictions
      • The root of the tree must be P
      • All branches coming into a P must be Ws
      • Any subtree below an arithmetic operation may only contain further arithmetic operations or terminals
    • A LIST function could be used to give multiple outputs
      • Only used as the root, and could only have P branches coming into it
  • T = {D0, D1, R}
    • Two inputs and an ephemeral random constant
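The language above can be sketched as a small interpreter. The encoding below (nested tuples) is my own, not Koza's S-expression machinery, and the exact behavior of P is an assumption: the slides call it a linear activation, and I implement it here as a step that fires when the summed branches exceed zero. The protected division is a common GP convention, also assumed.

```python
# Hypothetical evaluator for a Koza-style GP-ANN tree, encoded as nested
# tuples: ('P', child, ...), ('W', child, child), ('+', a, b), etc.
# Terminals are 'D0'/'D1' (inputs) or a float (ephemeral constant R).
# Assumption: P fires when its summed branches exceed 0; the slides only
# say "linear activation function", so the threshold is a guess.

import operator

ARITH = {'+': operator.add, '-': operator.sub, '*': operator.mul,
         '/': lambda a, b: a / b if b else 1.0}  # protected division

def evaluate(node, inputs):
    if isinstance(node, (int, float)):       # ephemeral constant R
        return float(node)
    if isinstance(node, str):                # input terminal D0 or D1
        return float(inputs[int(node[1])])
    op, *children = node
    if op == 'P':                            # neuron: sum branches, threshold
        s = sum(evaluate(c, inputs) for c in children)
        return 1.0 if s > 0.0 else 0.0
    if op == 'W':                            # weight: multiply its branches
        result = 1.0
        for c in children:
            result *= evaluate(c, inputs)
        return result
    return ARITH[op](evaluate(children[0], inputs),
                     evaluate(children[1], inputs))

# A hand-written tree obeying the restrictions: a single neuron (root P,
# all branches Ws) that computes OR of D0 and D1.
or_tree = ('P', ('W', 1.0, 'D0'), ('W', 1.0, 'D1'))
for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, evaluate(or_tree, x))
```

Note how the restrictions keep the tree interpretable as a network: Ps are neurons, the Ws feeding a P are its weighted links, and the arithmetic subtrees below a W compute the weight value itself.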
The XOR GP Tree
  • When Koza applied this to XOR, the resulting tree worked 100% of the time
  • It translated to a 2-2-1 network
  • Easy problem, easy tree
  • What about a more complex problem?
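To illustrate why a 2-2-1 network suffices for XOR, here is one such network written out directly with threshold units. The weights are hand-picked for clarity; they are not the values Koza's GP actually evolved.

```python
# A 2-2-1 threshold network computing XOR. Weights are hand-picked for
# illustration, not the evolved values from Koza's experiment.

def step(s):
    return 1 if s > 0 else 0

def xor_net(d0, d1):
    h1 = step(d0 + d1 - 0.5)    # hidden 1: OR(d0, d1)
    h2 = step(d0 + d1 - 1.5)    # hidden 2: AND(d0, d1)
    return step(h1 - h2 - 0.5)  # output: OR and not AND -> XOR

for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, xor_net(*x))       # 0, 1, 1, 0
```

The hidden layer is essential here: no single threshold unit can separate XOR's classes, which is exactly the non-linear-separability point from the ANN review slide.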
Koza's Full Adder (Cont'd)
  • This full adder tree also performed at 100%
  • Over 102 runs with a population size of 500, a 100%-correct solution was found 86% of the time within 51 generations
  • Koza's idea seemed to have promise, and as Koza's ideas typically do, GP for ANNs snowballed
Taking a Developmental Approach
  • Frederic Gruau used developmental GP, performing operations on the nodes of an ANN to create new ones
    • This led to highly connected graphs with few ways to change individual edges
  • Sean Luke and Lee Spector took Gruau's idea but applied it to edges instead of nodes
    • This led to less connected graphs whose edges and nodes were easier to change
  • Unfortunately, Gruau's report was inaccessible behind a paywall, and Luke and Spector released only a preliminary report, so no data on how well these approaches really performed was available
Ideas Come Full Circle
  • D. Rivero et al., upon reading the Gruau and the Luke and Spector papers, thought they had a better way
  • Use a language with a function to represent a neuron, a function to represent an input neuron, and a function to act as the tree root, taking in all the output neurons and listing them
  • Sounds familiar, doesn’t it?
Koza’s GPANN v2
  • Koza’s paper was never referenced in Rivero et al., 2005
    • Likely they came upon the idea organically, just like Koza did
  • While Koza’s tests of his GPANN could be considered toy problems, this time it was tested much more rigorously
  • While not explicitly stated, their activation function was likely more complex than the linear one Koza used
  • Attempted 4 different problems from the UCI database
Their Results
  • D. Rivero et al. didn’t compare themselves to Gruau or to Luke and Spector, but did compare their ANNs to the results found by Cantú-Paz and Kamath in a paper comparing many different evolutionary ANN techniques
  • On these problems their method performed at least as well, and better in most cases
  • All the other algorithms evaluated individuals in a step separate from the design process
  • By combining the two, Rivero et al. achieved shorter training times and used much less computational power
References

  • Koza, John R., and James P. Rice. "Genetic generation of both the weights and architecture for a neural network." In Neural Networks, 1991. IJCNN-91-Seattle International Joint Conference on, vol. 2, pp. 397-404. IEEE, 1991.
  • Gruau, Frédéric. "Genetic micro programming of neural networks." In Kinnear, Jr., K. E., ed., Advances in Genetic Programming, chapter 24, pp. 495-518. MIT Press, 1994.
  • Luke, Sean, and Lee Spector. "Evolving graphs and networks with edge encoding: Preliminary report." In Late Breaking Papers at the Genetic Programming 1996 Conference, pp. 117-124. Stanford, CA: Stanford University, 1996.
  • Rivero, Daniel, Julián Dorado, Juan R. Rabuñal, Alejandro Pazos, and Javier Pereira. "Artificial neural network development by means of genetic programming with graph codification." Transactions on Engineering, Computing and Technology 16 (2006): 209-214.