AI & Machine Learning Libraries

By Logan Kearsley.



Presentation Transcript


  1. By Logan Kearsley AI & Machine Learning Libraries

  2. Purpose The purpose of this project is to design a system that combines the capabilities of multiple types of AI and machine learning systems, such as neural networks and subsumption architectures, to produce a more flexible and versatile hybrid system.

  3. Goals The end goal is to produce a set of basic library functions and architecture descriptions for easy manipulation of the AI/ML subsystems (particularly neural networks), then use those to build an AI system capable of teaching itself how to complete tasks specified by a human-defined heuristic and of altering learned behaviors to cope with changes in its operational environment with minimal human intervention.

  4. Other Projects • No other directly similar projects are known. • Builds on previous work on multilayer perceptrons and subsumption architecture. • Differs in trying to find ways to combine the different approaches to AI.

  5. Design & Programming • Modular / Black Box Design • The end user should be able to put together a working AI system with minimal knowledge of how the internals work • Programming done in C

  6. Testing • Perceptron Neural Nets • Forced learning: make sure it will learn arbitrary input-output mappings after a certain number of exposures • Subsumption Architecture • Simple test problems: does it run the right code for each sub-problem?

  7. Algorithms • Perceptrons: • Delta-rule learning: weights are adjusted based on the difference between the net's current output and the target output • Matrix simulation: weights are stored in an I (# of inputs) by O (# of outputs) matrix for each layer, rather than simulating each neuron individually. • Subsumption Architecture: • Scheduler takes a list of function pointers to task-specific functions • Task functions return an output or null • Highest-priority non-null task has its output executed each iteration

  8. Algorithms • Perceptron structure: individual neurons vs. a weight matrix (diagram)

  9. Algorithms • Subsumption Architecture (diagram)

  10. Problems • Back-Propagation is really confusing!

  11. Results & Conclusions • Single-layer perceptron works well • Capable of learning arbitrary mappings, but not an arbitrary combination of them • Multi-layer nets should learn arbitrary combinations, but the learning algorithm for hidden layers is confusing • Can't re-use all of the same single-layer functions • Plan Change • Originally, wanted to create a working system • Now, the project goal is to produce useful function libraries; working systems are just for testing the code
