
Path finding Framework using HRR


Presentation Transcript


  1. Path finding Framework using HRR: Algorithm and associated equations. Surabhi Gupta ’11. Advisor: Prof. Audrey St. John.

  2. Roadmap • Circular Convolution • Associative Memory • Path finding algorithm

  3. Hierarchical environment • Locations are hierarchically clustered. [Figure: leaf locations a–r grouped into X1–X6 (scale 1), then into Y1 and Y2 (scale 2), under a single top-level location Z]

  4. Tree representation • The scale of a location corresponds to its height in the tree structure. • A node of the tree can be queried directly, without pointer following. • Maximum number of goal searches = height of the tree.

  5. Circular Convolution: Holographic Reduced Representations

  6. Circular Convolution (HRR) • Developed by Tony Plate in 1991 • Binding (encoding) operation – Convolution • Decoding operation – Involution followed by convolution

  7. Basic Operations • Binding • Merge

  8. Binding (encoding) • C = A ⊛ B • The bound trace resembles neither constituent: C ≁ A, C ≁ B.

  9. Circular Convolution (⊛) • (A ⊛ B)_j = Σ_{k=0..n−1} A_k · B_{(j−k) mod n} • Elements are summed along the trans-diagonals (Plate, 1991).
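
A minimal NumPy sketch of the binding operation (the function name cconv is mine, not from the slides); the FFT form below is mathematically equivalent to summing along the trans-diagonals:

```python
import numpy as np

def cconv(a, b):
    # Circular convolution: c_j = sum_k a_k * b_{(j-k) mod n}.
    # Computed in O(n log n) via the convolution theorem.
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))
```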

  10. Involution • Involution is the approximate inverse under circular convolution: a*_i = a_{(−i) mod n}.
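
Involution is just a permutation of the elements, so it costs O(n); a sketch using the standard HRR definition:

```python
def involution(a):
    # a*_i = a_{(-i) mod n}: keep a_0 and reverse the remaining elements.
    return np.concatenate(([a[0]], a[1:][::-1]))
```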

  11. Decoding • Given a trace C = A ⊛ B, decode with the involution of the cue: A* ⊛ C ≈ B.
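
Putting the two operations together: to recover B from a trace C = A ⊛ B, convolve C with the involution of the cue A. A quick check with vectors drawn as on slide 19 (dimension 2048, entries from N(0, 1/2048)):

```python
def decode(trace, cue):
    # Involution followed by convolution: cue* (*) (cue (*) b) ≈ b.
    return cconv(involution(cue), trace)

n = 2048
rng = np.random.default_rng(0)
a, b = rng.normal(0.0, np.sqrt(1.0 / n), (2, n))
c = cconv(a, b)
b_hat = decode(c, a)
# b_hat is a noisy copy of b: its cosine with b is well above chance (~1/sqrt(n)).
print(np.dot(b_hat, b) / (np.linalg.norm(b_hat) * np.linalg.norm(b)))
```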

  12. Basic Operations • Binding • Merge

  13. Merge • Traces are merged by superposition. • Similarity between traces is measured with the normalized dot product.
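
A sketch of both operations, assuming (my reading; the slide names only the normalized dot product explicitly) that merge is superposition, i.e. vector addition followed by normalization:

```python
def merge(*traces):
    # Superpose traces and renormalize so magnitudes stay comparable.
    s = np.sum(traces, axis=0)
    return s / np.linalg.norm(s)

def similarity(u, v):
    # Normalized dot product (cosine) between two traces.
    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
```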

  14. Properties • Commutativity: a ⊛ b = b ⊛ a • Distributivity: a ⊛ (b + c) = a ⊛ b + a ⊛ c (shown with sufficiently long vectors) • Associativity: a ⊛ (b ⊛ c) = (a ⊛ b) ⊛ c
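
A direct numerical check of the three properties, continuing the snippets above:

```python
x, y, z = rng.normal(0.0, np.sqrt(1.0 / n), (3, n))
assert np.allclose(cconv(x, y), cconv(y, x))                     # commutativity
assert np.allclose(cconv(x, y + z), cconv(x, y) + cconv(x, z))   # distributivity
assert np.allclose(cconv(x, cconv(y, z)), cconv(cconv(x, y), z)) # associativity
```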

  15. Associative Memory: Recall and retrieval of locations

  16. Framework [Figure: the hierarchical environment of slide 3, with leaves a–r, scale-1 locations X1–X6, scale-2 locations Y1 and Y2, and top-level location Z]

  17. Assumptions • Perfect tree: each leaf has the same depth. • Locations within a scale are fully connected, e.g. a, b and c, or X4, X5 and X6. • Each constituent contributes equally to the scale location (no bias). [Figure: hierarchy with scale locations X1–X6, Y1, Y2, Z and the leaves a and p]

  18. Associative Memory • Consists of a list of locations. • Takes a location as input and returns the most similar location from the list. What do we store?
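
A minimal sketch of such a clean-up memory (the class and method names are mine), using the normalized dot product from slide 13 as the similarity measure:

```python
class AssociativeMemory:
    # Stores named location vectors; retrieval returns the stored
    # location most similar to a (possibly noisy) query vector.
    def __init__(self):
        self.items = {}

    def store(self, name, vec):
        self.items[name] = vec

    def retrieve(self, query):
        # Name of the stored item with the highest cosine to the query.
        return max(self.items, key=lambda k: similarity(self.items[k], query))
```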

  19. Scales • Locations a–r are each 2048-element vectors with entries drawn from a normal distribution N(0, 1/2048). • Higher scales: recursive auto-convolution of constituents.

  20. Constructing scales • X1 is built from its constituents a, b and c by auto-convolution and superposition.
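
One concrete reading of "recursive auto-convolution of constituents", offered as an assumption rather than the deck's exact equation: bind each constituent with itself and superpose the results:

```python
def scale_up(constituents):
    # Assumed construction: X1 = a (*) a + b (*) b + c (*) c, renormalized.
    s = np.sum([cconv(v, v) for v in constituents], axis=0)
    return s / np.linalg.norm(s)

a_, b_, c_ = rng.normal(0.0, np.sqrt(1.0 / n), (3, n))
X1 = scale_up([a_, b_, c_])
```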

  21. Across-scale sequences • Encoded between each location and the corresponding locations at higher scales.
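
A hedged sketch of one way such a sequence could be stored: bind each location to the location one scale above it and superpose the pairs (the pairing scheme is my assumption; the slide states only what the sequence connects):

```python
def across_scale_sequence(chain):
    # chain lists a location and its ancestors, e.g. [a, X1, Z];
    # store the superposition of adjacent pairs: a (*) X1 + X1 (*) Z.
    pairs = [cconv(lo, hi) for lo, hi in zip(chain, chain[1:])]
    s = np.sum(pairs, axis=0)
    return s / np.linalg.norm(s)
```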

  22. Path finding algorithm: quite different from standard graph search algorithms…

  23. Path finding algorithm • Start: if Start == Goal, stop. • Otherwise, go to a higher scale and search for the goal. • If the goal is not found at this scale, go higher and search again. • If the goal is found at this scale, retrieve the scales corresponding to the goal. • Move towards the Goal.
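
The flowchart as a minimal Python sketch. The helper names (goal_in_scale, parent_scale, retrieve_goal_scales, move_toward) are hypothetical stand-ins for the memory queries detailed on the following slides, not names from the deck:

```python
def find_path(start, goal, goal_in_scale, parent_scale,
              retrieve_goal_scales, move_toward):
    if start == goal:
        return
    scale = start
    # Climb to higher scales until the goal is found at the current scale.
    while not goal_in_scale(scale, goal):
        scale = parent_scale(scale)
    # Retrieve the scales corresponding to the goal, then walk toward it.
    goal_scales = retrieve_goal_scales(goal)
    move_toward(start, goal_scales)
```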

  24. Retrieving the next scale • If at scale-0, query the AS memory to retrieve the AS sequence; else use the sequence retrieved in a previous step. • Query the L memory with the trace decoded from the AS sequence.

  25. Retrieving the next scale • Query the L memory with the trace decoded from the AS sequence.

  26. Path finding algorithm • Start: if Start == Goal, stop. • Otherwise, go to a higher scale and search for the goal. • If the goal is not found at this scale, go higher and search again. • If the goal is found at this scale, retrieve the scales corresponding to the goal. • Move towards the Goal.

  27. Locating the goal • For example: location X1 and goal c.

  28. Locating the goal • Goal: p • p is not contained in X1. [Figure: hierarchy; p lies outside X1]
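
Under the constituent construction assumed after slide 20, containment can be tested directly: decoding a scale with the goal returns something close to the goal only when the goal is one of the scale's constituents. The threshold here is an illustrative choice, not a value from the slides:

```python
def goal_in_scale(scale_vec, goal_vec, threshold=0.1):
    # If scale = ... + goal (*) goal + ..., then goal* (*) scale ≈ goal.
    return similarity(decode(scale_vec, goal_vec), goal_vec) > threshold

p_ = rng.normal(0.0, np.sqrt(1.0 / n), n)   # a location outside X1
print(goal_in_scale(X1, c_))   # True: c is a constituent of X1
print(goal_in_scale(X1, p_))   # False: p is not contained in X1
```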

  29. Path finding algorithm • Start: if Start == Goal, stop. • Otherwise, go to a higher scale and search for the goal. • If the goal is not found at this scale, go higher and search again. • If the goal is found at this scale, retrieve the scales corresponding to the goal. • Move towards the Goal.

  30. Goal not found at Y1 [Figure: hierarchy; the goal p is not under Y1]

  31. Goal found at Z! [Figure: hierarchy; Z, the top-level location, contains the goal p]

  32. Path finding algorithm • Start: if Start == Goal, stop. • Otherwise, go to a higher scale and search for the goal. • If the goal is not found at this scale, go higher and search again. • If the goal is found at this scale, retrieve the scales corresponding to the goal. • Move towards the Goal.

  33. Decoding scales • The same decoding operation (involution followed by convolution) is applied.

  34. Decoding scales • Using the retrieved scales

  35. Path finding algorithm • Start: if Start == Goal, stop. • Otherwise, go to a higher scale and search for the goal. • If the goal is not found at this scale, go higher and search again. • If the goal is found at this scale, retrieve the scales corresponding to the goal. • Move towards the Goal.

  36. Moving to the Goal [Figure: the hierarchical environment of slide 3]

  37. To work on • Relax the assumption of a perfect tree. • Relax the assumption of a fully connected graph within a scale location.

  38. References • Kanerva, P. (2002). Distributed representations. In Encyclopedia of Cognitive Science. • Plate, T. A. (1991). Holographic Reduced Representations: Convolution algebra for compositional distributed representations. In J. Mylopoulos & R. Reiter (Eds.), Proceedings of the 12th International Joint Conference on Artificial Intelligence (pp. 30–35). San Mateo, CA: Morgan Kaufmann.
