
Towards an Implementation of a Theory of Visual Learning in the Brain


Presentation Transcript


  1. Towards an Implementation of a Theory of Visual Learning in the Brain Shamit Patel CMSC 601 May 2, 2011

  2. The Problem • To develop a working theory of learning in the human neocortex and implement it in software • Goal is for the learning algorithm to match or exceed human-level accuracy in visual pattern recognition and other hierarchical inference tasks

  3. Hypothesis • My hypothesis is that the brain learns through a feedback loop of sensing and reacting. I call this theory SensoReaction. • The brain essentially learns through experience • Feedback is the crucial ingredient of intelligence because it allows the brain to refine its predictions toward the correct answer

  4. Motivation • Medical image processing • Quality control • Surveillance • Ultimately, we would like to build machines that operate on the same neurocomputational mechanisms as the human brain

  5. From Von Neumann Architecture to Neural Architecture of the Brain Image source: http://bluebrain.epfl.ch/files/content/sites/bluebrain/files/bluebrain-neuron.jpg Image source: http://en.wikipedia.org/wiki/File:Von_Neumann_architecture.svg

  6. Related Work • Numenta’s Hierarchical Temporal Memory (HTM) model • Riesenhuber and Poggio’s HMAX model • Fukushima’s Neocognitron model

  7. The Human Neocortex Image source: http://www.ncbi.nlm.nih.gov/books/NBK10870/bin/ch26f3.jpg

  8. Hierarchical Temporal Memory Image source: http://upload.wikimedia.org/wikipedia/en/8/87/HTM_Hierarchy_example.png

  9. Hierarchical Temporal Memory • Directly based on the structure and computational properties of the human neocortex [1] • Four main tasks of HTM: learning, inference, prediction, and behavior [1] • Strength: Efficiency due to hierarchical structure [1] • Weakness: Needs lots of training data
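
The efficiency point is easiest to see with a toy example. The sketch below is not Numenta's implementation; it only illustrates how a hierarchy keeps each node small: every node memorizes the patterns that appear in its own patch and reports a single pattern index upward, so higher nodes work with compact names rather than raw input. The Node class and helper functions are hypothetical.

```python
class Node:
    """One region of the hierarchy: memorizes the patterns seen in its patch."""
    def __init__(self):
        self.patterns = []

    def learn(self, pattern):
        if pattern not in self.patterns:
            self.patterns.append(pattern)
        return self.patterns.index(pattern)      # compact "name" passed upward

    def infer(self, pattern):
        # Index of the closest memorized pattern (Hamming distance);
        # assumes learn() has been called at least once.
        return min(range(len(self.patterns)),
                   key=lambda i: sum(a != b for a, b in zip(self.patterns[i], pattern)))

def learn_image(children, top, patches):
    """Each child learns its own patch; the parent learns only the tuple of child names."""
    top.learn(tuple(child.learn(patch) for child, patch in zip(children, patches)))

def infer_image(children, top, patches):
    return top.infer(tuple(child.infer(patch) for child, patch in zip(children, patches)))
```

With four children that each see a quarter of the image, the top node only ever stores 4-tuples of small integers, which is where the efficiency of the hierarchical structure comes from.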

  10. HMAX Image source: http://riesenhuberlab.neuro.georgetown.edu/hmaxSchemeCD.jpg

  11. HMAX • Models the behavior of the ventral visual stream [2] • Fundamental operations: (1) Weighted linear sum for aggregating simple features into complex ones, (2) Highly nonlinear “MAX” operation that computes output based on most active input [2] • Strengths: Efficiency and invariance to position and size of input pattern [2] • Weakness: Poor generalization to objects of different classes [2]
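
The two operations can be written down in a few lines; the NumPy sketch below uses random weights and placeholder shapes rather than the tuned filters and pooling ranges of the actual HMAX model.

```python
import numpy as np

def s_layer(inputs, weights):
    """Simple-cell stage: weighted linear sum that combines afferent
    responses into a more complex feature (a template match)."""
    return inputs @ weights                      # (n_positions, n_features)

def c_layer(responses, pool_size):
    """Complex-cell stage: nonlinear MAX over a pool of positions, so the
    output follows only the most active input in each pool."""
    n = (len(responses) // pool_size) * pool_size
    pooled = responses[:n].reshape(-1, pool_size, responses.shape[-1])
    return pooled.max(axis=1)                    # position invariance within each pool

# Toy usage: 8 positions x 4 afferents, 3 features, pooling over 4 positions.
x = np.random.rand(8, 4)
w = np.random.rand(4, 3)
print(c_layer(s_layer(x, w), pool_size=4).shape)  # (2, 3)
```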

  12. Neocognitron Image source: http://www.scholarpedia.org/wiki/images/9/9d/ScholarFig1.gif

  13. Neocognitron • Self-organized via unsupervised learning [3] • S-cells have modifiable (learned) input connections, while C-cells make the response tolerant to shifts in position and distortions in the shape and size of the input pattern [3] • Strength: Unsupervised learning means we don’t need labeled data • Weakness: Poor generalization to objects of different classes
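
For a sense of how the self-organization works, here is a hedged sketch of a winner-take-all update for S-cells: only the cell that responds most strongly to a stimulus has its weights reinforced, so feature detectors emerge without labels. The rule and constants are simplified stand-ins, not Fukushima's exact equations.

```python
import numpy as np

def update_s_cells(weights, stimulus, lr=0.1):
    """Competitive (winner-take-all) unsupervised update for one S-cell layer.
    weights: (n_cells, n_inputs) array; stimulus: (n_inputs,) array."""
    responses = weights @ stimulus                      # one response per S-cell
    winner = np.argmax(responses)                       # strongest responder
    weights[winner] += lr * stimulus                    # reinforce the winner only
    weights[winner] /= np.linalg.norm(weights[winner])  # keep the weights bounded
    return weights
```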

  14. Approach • Implementation of HTM system • Integration of SensoReaction algorithm into the HTM system • Training the HTM system on temporal image data • Testing the HTM system on novel input patterns • Statistical analysis of results

  15. Implementation of HTM system • I have already implemented a considerable part of the HTM system, including the overall structure of the network and most of the training functionality • Remaining work consists of implementing inference and integrating SensoReaction into the system

  16. Integration of SensoReaction algorithm into HTM system • SensoReaction is a feedback propagation mechanism that allows predictions to be propagated down the hierarchy for correction • Algorithm will be integrated into the HTM system by first introducing feedback connections between every pair of successive layers in the network. Then, predictions will be passed down the hierarchy via these feedback connections.
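
A minimal sketch of what that feedback pass could look like, assuming each layer keeps its most recent bottom-up belief as a vector of the same length as the prediction; the blending rule is only a placeholder, since the slides specify just that predictions travel down the feedback connections and are corrected on the way.

```python
def feedback_pass(layers, top_prediction, alpha=0.5):
    """Send a prediction down the hierarchy (top layer first), correcting it
    at each layer against that layer's bottom-up belief."""
    prediction = top_prediction
    for layer in reversed(layers):                       # layers ordered bottom-to-top
        prediction = [alpha * p + (1 - alpha) * b        # placeholder correction rule
                      for p, b in zip(prediction, layer["belief"])]
        layer["corrected_prediction"] = prediction
    return prediction                                    # corrected prediction at the input layer

# Hypothetical usage with two layers whose beliefs are 3-element vectors.
layers = [{"belief": [0.2, 0.5, 0.3]}, {"belief": [0.1, 0.8, 0.1]}]
print(feedback_pass(layers, top_prediction=[0.0, 1.0, 0.0]))
```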

  17. Training the HTM system • Present hundreds of streams of temporal image data to the input layer • Allow the system to build its internal representations • Training will consist of: (1) memorizing patterns, (2) building the Markov graphs, and (3) forming the temporal groups
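
The three training steps map onto a short routine. The sketch below is a toy single-node version: it memorizes patterns, counts temporal transitions into a Markov graph, and uses a greedy rule as a stand-in for the real temporal-grouping step.

```python
from collections import defaultdict

def train_node(stream):
    """stream: an iterable of hashable patterns presented in temporal order."""
    patterns, index = [], {}
    graph = defaultdict(lambda: defaultdict(int))   # Markov graph of transition counts
    prev = None
    for pattern in stream:                          # 1) memorize patterns
        if pattern not in index:
            index[pattern] = len(patterns)
            patterns.append(pattern)
        cur = index[pattern]
        if prev is not None:
            graph[prev][cur] += 1                   # 2) count temporal transitions
        prev = cur

    groups = {}                                     # 3) form temporal groups (greedy stand-in):
    for src in range(len(patterns)):                #    join each pattern to the group of its
        if graph[src]:                              #    most frequent successor
            succ = max(graph[src], key=graph[src].get)
            groups[src] = groups.get(succ, succ)
        else:
            groups[src] = src
    return patterns, graph, groups
```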

  18. Evaluation/Testing the HTM system • Present thousands of noisy input patterns to the HTM network • Observe the classification accuracy of the HTM system • SensoReaction algorithm comes into play here by making predictions, passing them down the hierarchy, correcting them, and passing them back up
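
A sketch of that testing loop, assuming binary input patterns and a classify callable that stands in for whichever inference routine is under test (standard HTM, or HTM with the SensoReaction feedback pass).

```python
import random

def evaluate(classify, test_set, noise_rate=0.1, trials=1000):
    """test_set: list of (pattern, label) pairs with binary tuple patterns.
    Returns classification accuracy over noisy presentations."""
    correct = 0
    for _ in range(trials):
        pattern, label = random.choice(test_set)
        noisy = tuple(bit ^ (random.random() < noise_rate)   # flip each bit with prob. noise_rate
                      for bit in pattern)
        correct += (classify(noisy) == label)
    return correct / trials
```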

  19. Statistical Analysis of Results • Classification accuracy of HTM system with SensoReaction will be compared with classification accuracy of standard HTM system • Two-sample t-test will be used to compare the classification accuracies of the two systems
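
The comparison maps directly onto SciPy's independent two-sample t-test; the accuracy lists below are placeholders for the per-run results of the two systems.

```python
from scipy import stats

# Placeholder per-run classification accuracies for the two systems.
htm_only = [0.71, 0.69, 0.74, 0.72, 0.70]
htm_with_sensoreaction = [0.76, 0.74, 0.78, 0.75, 0.77]

# Two-sample (independent) t-test on the per-run accuracies.
t_stat, p_value = stats.ttest_ind(htm_with_sensoreaction, htm_only)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")   # a small p suggests the difference is not due to chance
```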

  20. Feasibility of Approach • SensoReaction is feasible because it is essentially based on how the neocortex processes feedback • Feedback can only improve the classification accuracy because prior experience is taken into account

  21. Conclusion • Feedback is the critical piece of intelligence • Brain learns through constant sensing and reacting • Ultimate goal is to build machines that work on the same computational principles as the brain

  22. References • [1] Numenta, Inc. (2010, December 10). Hierarchical Temporal Memory including HTM cortical learning algorithms (Version 0.2). Retrieved from http://www.numenta.com/htm-overview/education/HTM_CorticalLearningAlgorithms.pdf

  23. References • [2] Riesenhuber, M., & Poggio, T. (1999, November). Hierarchical models of object recognition in cortex. Nature Neuroscience, 2(11), 1019-1025. Retrieved from http://cbcl.mit.edu/publications/ps/nn99.pdf

  24. References • [3] Fukushima, K. (1980). Neocognitron: A self-organizing neural network model for a mechanism of pattern recognition unaffected by shift in position. Biological Cybernetics, 36, 193-202. Retrieved from http://lrn.no-ip.info/other/books/neural/Neocognitron/1980%20Neocognitron%20A%20Self-organizing%20Neural%20Network%20Model%20for%20a%20Mechanism%20of%20Pattern%20Recognition%20Unaffected%20by%20Shift%20in%20Position.pdf

  25. Questions?
