Associative Pattern Memory (APM) Larry Werth July 14, 2007


Presentation Transcript


  1. Associative Pattern Memory (APM), Larry Werth, July 14, 2007

  2. Introduction and Background of APM
  • Human Associative Pattern Memory
  • Computer-Implemented APM
  • Basis for Two Successful Startup Companies
  • Six Patents Granted and Others Pending
  • Successful Implementation of NKS

  3. Objective of My Presentation
  • Describe the APM Concept & Implementation
  • Describe Its Advantages / Features
  • Identify Types of Applications
  • Describe Its Current Status and Future Goals

  4. Origin of Concept
  • Randomly Connected Neural Network Models
  • The State Sequence Terminates in a Cycle
  • Randomly Map Each State to an Input Pattern
  • The Sampled Pattern Value and the Current State Determine the Next State
  • The Ultimate Cycle Represents the Input Pattern
  • Cycles Form the Basis of the APM (a minimal transition sketch follows below)
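The transition rule above can be made concrete with a short sketch in C++, the language the library described on slide 15 is written in. The array names, sizes, and the use of binary (0/1) pattern samples are illustrative assumptions rather than the library's actual interface: each state owns a random sample point in the input pattern, and the sampled value selects one of two random next-state tables.

    // Minimal sketch of the APM state-transition idea; names and sizes are assumptions.
    #include <cstdint>
    #include <random>
    #include <vector>

    constexpr std::size_t kNumStates   = 1'000'000;   // N
    constexpr std::size_t kPatternSize = 64 * 64;     // sampled input pattern (bits)

    // Each state samples one fixed, randomly chosen location in the input pattern.
    std::vector<std::size_t>   pattern_address(kNumStates);
    // Two next-state tables: one used when the sampled bit is 0, one when it is 1.
    std::vector<std::uint32_t> next_state_0(kNumStates), next_state_1(kNumStates);

    void init_random_maps(std::mt19937& rng) {
        std::uniform_int_distribution<std::size_t>   pat(0, kPatternSize - 1);
        std::uniform_int_distribution<std::uint32_t> st(0, kNumStates - 1);
        for (std::size_t s = 0; s < kNumStates; ++s) {
            pattern_address[s] = pat(rng);   // random sample point for this state
            next_state_0[s]    = st(rng);    // random successor if the sampled bit is 0
            next_state_1[s]    = st(rng);    // random successor if the sampled bit is 1
        }
    }

    // One step: the sampled pattern value and the current state determine the next
    // state, so a fixed input pattern drives the sequence into a cycle that represents it.
    std::uint32_t step(std::uint32_t state, const std::vector<std::uint8_t>& pattern) {
        return pattern[pattern_address[state]] ? next_state_1[state] : next_state_0[state];
    }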

  5. Cycle Properties: Randomly Connected DFAs

     Total            Expected #      Expected      Expected #     Fraction
     States           Terminal        Number of     Transition     Terminal
     (N)              States (S)      Cycles (C)    States (T)     States (F)
     ------------------------------------------------------------------------
     100                      12       3                     7     .12
     1,000                    40       4                    20     .040
     10,000                  125       5                    63     .0125
     100,000                 396       6                   198     .00396
     1,000,000             1,253       8                   627     .001253
     10,000,000            3,963       9                 1,982     .000396
     100,000,000          12,533      10                 6,267     .0001253
     1,000,000,000        39,632      11                19,817     .0000396
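The table's expected values can be checked empirically. The sketch below builds one random mapping (a single random successor per state), peels away states that cannot lie on a cycle by repeatedly removing states with in-degree zero, and counts the surviving cycle states and distinct cycles. This is a verification sketch, not part of the APM library; a single seed scatters around the expected values, and averaging several seeds approaches the N = 100,000 row above.

    // Empirical check of the cycle statistics for a random mapping on N states.
    #include <cstdio>
    #include <random>
    #include <vector>

    int main() {
        const int N = 100'000;                       // total states
        std::mt19937 rng(42);
        std::uniform_int_distribution<int> st(0, N - 1);

        std::vector<int> next(N), indeg(N, 0);
        for (int s = 0; s < N; ++s) { next[s] = st(rng); ++indeg[next[s]]; }

        // Peel: a state that is never reached (in-degree 0) cannot be on a cycle,
        // and removing it may expose its successor as unreachable in turn.
        std::vector<int> stack;
        for (int s = 0; s < N; ++s) if (indeg[s] == 0) stack.push_back(s);
        std::vector<bool> removed(N, false);
        while (!stack.empty()) {
            int s = stack.back(); stack.pop_back();
            removed[s] = true;
            if (--indeg[next[s]] == 0) stack.push_back(next[s]);
        }

        // Whatever remains lies on cycles; count cycle states and distinct cycles.
        int cyclic_states = 0, cycles = 0;
        std::vector<bool> visited(N, false);
        for (int s = 0; s < N; ++s) {
            if (removed[s] || visited[s]) continue;
            ++cycles;
            for (int t = s; !visited[t]; t = next[t]) { visited[t] = true; ++cyclic_states; }
        }
        std::printf("N=%d  cycle states=%d  cycles=%d  fraction=%.5f\n",
                    N, cyclic_states, cycles, (double)cyclic_states / N);
    }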

  6. Conceptual Implementation of APM
  [Block diagram, not reproduced here. Labeled elements: Input Pattern Array, State Array, Pattern Address, Current State Address, Next State Address, Pattern Value, Next State Array (Value = 0), Next State Array (Value = 1), Response Array. Train Pattern: Write to Cycle Addresses. Respond to Pattern: Read From Cycle Addresses.]
  • State Array: Filled with Random Pattern Addresses
  • Next State Arrays: Filled with Random State Addresses
  • Response Array: Assigned Responses to Patterns
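A hedged illustration of the train/respond flow in the diagram: training writes a label into response memory at every cycle address visited, and responding re-runs the same sequence and votes over the labels read back. The starting state, the burn-in and record step counts, and the label convention are assumptions made for this sketch; it reuses the same randomly filled tables as the transition sketch under slide 4.

    // Training and recall sketch following the block diagram above.
    #include <cstdint>
    #include <map>
    #include <vector>

    constexpr std::size_t kNumStates = 1'000'000;

    std::vector<std::size_t>   pattern_address(kNumStates);               // state -> sample point
    std::vector<std::uint32_t> next_state_0(kNumStates), next_state_1(kNumStates);
    std::vector<std::int32_t>  response(kNumStates, -1);                  // response memory, -1 = untrained

    std::uint32_t step(std::uint32_t s, const std::vector<std::uint8_t>& p) {
        return p[pattern_address[s]] ? next_state_1[s] : next_state_0[s];
    }

    // Train: run long enough to be on the cycle, then write the desired
    // response label at every cycle address visited.
    void train(const std::vector<std::uint8_t>& pattern, std::int32_t label,
               int burn_in = 20'000, int record = 5'000) {
        std::uint32_t s = 0;
        for (int i = 0; i < burn_in; ++i) s = step(s, pattern);
        for (int i = 0; i < record; ++i) { response[s] = label; s = step(s, pattern); }
    }

    // Respond: revisit the cycle and take a majority vote over the stored labels.
    std::int32_t respond(const std::vector<std::uint8_t>& pattern,
                         int burn_in = 20'000, int read = 5'000) {
        std::uint32_t s = 0;
        for (int i = 0; i < burn_in; ++i) s = step(s, pattern);
        std::map<std::int32_t, int> votes;
        for (int i = 0; i < read; ++i) {
            if (response[s] >= 0) ++votes[response[s]];   // untrained entries abstain
            s = step(s, pattern);
        }
        std::int32_t best = -1; int best_count = 0;
        for (const auto& [label, count] : votes)
            if (count > best_count) { best_count = count; best = label; }
        return best;   // -1 means nothing recognized
    }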

  7. Solution to Multiple Cycles
  • Introduce a Refractory Period (one possible enforcement is sketched below)
  • A State Cannot Occur Again Until After a Specified Number of Steps
  • Establishes a Minimum Cycle Length
  • Assures One Cycle Per Input Pattern, Independent of the Initial State
  • The Input Pattern Is Represented by a Single Sequence of Random Addresses in Memory
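The presentation does not spell out how the refractory rule is enforced, so the sketch below shows only one possible mechanism, assumed for illustration: each state remembers when it last fired, and a candidate successor that fired within the last 3,700 steps is skipped in favor of the next non-refractory state. The patented implementation may use a different fallback rule.

    // One way to enforce "a state cannot occur again until after a specified
    // number of steps"; the skip-ahead fallback is an assumption for illustration.
    #include <cstdint>
    #include <vector>

    constexpr std::uint32_t kNumStates  = 1'000'000;
    constexpr std::int64_t  kRefractory = 3'700;     // minimum cycle length

    std::vector<std::int64_t> last_visit(kNumStates, -kRefractory);

    std::uint32_t apply_refractory(std::uint32_t candidate, std::int64_t now) {
        // Probe forward until a state is found that has not fired in the last
        // kRefractory steps, guaranteeing no state recurs within that window.
        while (now - last_visit[candidate] < kRefractory)
            candidate = (candidate + 1) % kNumStates;
        last_visit[candidate] = now;
        return candidate;
    }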

  8. Minimum Cycle Length Example
  • Number of States: 1,000,000
  • Minimum Cycle Length: 3,700
  • Probability of a Second Cycle of Length 3,700: about 1 in 1,000,000
  • Reasoning: a second cycle requires 3,700 further draws from the 1,000,000 states to all miss the 3,700 states already on the first cycle, i.e. (1 - 3,700/1,000,000)^3,700, roughly 1.1 x 10^-6 (computed below).
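The 1-in-1,000,000 figure can be reproduced directly from the slide's reasoning:

    // Probability that 3,700 fresh draws from 1,000,000 states all miss the
    // 3,700 states already on the first cycle.
    #include <cmath>
    #include <cstdio>

    int main() {
        double p_miss_once     = 1.0 - 3700.0 / 1'000'000.0;
        double p_second_cycle  = std::pow(p_miss_once, 3700);   // ~= exp(-3700^2 / 1e6)
        std::printf("%.2e\n", p_second_cycle);                  // prints ~1.1e-06
    }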

  9. Response/Recognition Capacity
  • During Training, Desired Responses Are Written to Cycle Addresses in Response Memory
  • Problem: Response Memory Fills Up Quickly
  • Any Cycle Address Has Memory of Previous Input Sample Values
  • Do Not Need to Use All Cycle Addresses
  • Solution: Vertical Sensors

  10. Vertical Sensor Cycle Detection
  [Diagram, not reproduced here. Captions: "Vertical Sensors Detect the Presence or Absence of Cycle State Addresses"; "Upper Memory Plane Forms a New Input Pattern Based on Sensor Status"; "Plane With Cycle".]

  11. Vertical Sensor Implementation
  • Number of States: 1,000,000
  • Minimum Cycle Length: 3,700
  • One in 270 Addresses Is on a Cycle
  • Vertical Sensor Field Size: 135 Addresses
  • Probability That a Field Contains a Cycle Address: about .5
  • Each Vertical Sensor Determines the Bit Status of a Hash Value That Addresses Response Memory (a hash sketch follows below)
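A sketch of how the sensor bits could be assembled into a hash code, under the assumption that the state space is split into consecutive fields of 135 addresses and that a chosen group of sensors supplies the bits of the code; the actual field layout and code width are not given in the presentation.

    // Vertical-sensor hash sketch: each field of 135 addresses yields one bit,
    // and a group of sensor bits forms a hash code into response memory.
    #include <cstdint>
    #include <vector>

    constexpr std::size_t kNumStates = 1'000'000;
    constexpr std::size_t kFieldSize = 135;   // ~.5 probability of holding a cycle address
    constexpr std::size_t kNumFields = (kNumStates + kFieldSize - 1) / kFieldSize;

    // cycle_addresses: the states visited once the sequence has settled on its cycle.
    std::vector<std::uint8_t> sensor_bits(const std::vector<std::uint32_t>& cycle_addresses) {
        std::vector<std::uint8_t> bits(kNumFields, 0);
        for (std::uint32_t a : cycle_addresses)
            bits[a / kFieldSize] = 1;   // a sensor fires if its field holds any cycle address
        return bits;
    }

    // Pack a chosen group of sensors (at most 32 here) into one hash code,
    // e.g. 16 sensors give a 16-bit address into a response table.
    std::uint32_t hash_code(const std::vector<std::uint8_t>& bits,
                            const std::vector<std::size_t>& sensor_group) {
        std::uint32_t code = 0;
        for (std::size_t i = 0; i < sensor_group.size(); ++i)
            code |= static_cast<std::uint32_t>(bits[sensor_group[i]]) << i;
        return code;
    }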

  12. Fuzzy Hash
  • Similar Input Patterns Produce Similar Cycles
  • Similar Input Patterns Generate the Same or Similar Hash Codes
  • Multiple Independent Hash Codes Are Generated by One Cycle (One Input Pattern)
  • A Voting Mode for Response Identification Contributes to Fuzzy Recognition (sketched below)
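A minimal voting sketch, assuming one response table per independent hash code; untrained entries abstain and the label with the most votes wins.

    // Vote across the response tables addressed by several independent hash codes.
    #include <cstdint>
    #include <map>
    #include <vector>

    // response_tables[k][code] holds the label trained for hash code `code` in
    // table k, or -1 if that entry is untrained.
    std::int32_t vote(const std::vector<std::vector<std::int32_t>>& response_tables,
                      const std::vector<std::uint32_t>& hash_codes) {
        std::map<std::int32_t, int> votes;
        for (std::size_t k = 0; k < hash_codes.size(); ++k) {
            std::int32_t label = response_tables[k][hash_codes[k]];
            if (label >= 0) ++votes[label];          // only trained entries vote
        }
        std::int32_t best = -1; int best_count = 0;
        for (const auto& [label, count] : votes)
            if (count > best_count) { best_count = count; best = label; }
        return best;   // -1 means no table recognized the pattern
    }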

  13. Advantages of Using Cycles
  • Creates a Fuzzy Hash
  • Simple and Fast Implementation
  • Common Language for Different Pattern Types
  • Spatial and Temporal Integration to Form New, Higher-Level Input Patterns
  • Automatic Segmentation of Time-Varying Patterns

  14. Applications
  • Actual Applications: Hand-Printed Character Recognition, Machine Vision, Video Compression, Financial Pattern Forecasting
  • Signal Processing – Vector Quantization
  • Video Surveillance – Smart Cameras
  • Video Object Tracking
  • Stereo Vision

  15. Current Status and Objectives
  • Software Library Written in C/C++
  • Objective: A General-Purpose Tool for Pattern Recognition Development
  • Looking for a Business Partner
  • Software Will Be Available on Our Web Site: www.netwerth.net
