Algorithm and associated equations
Path finding Framework using HRR

Surabhi Gupta ’11

Advisor: Prof. Audrey St. John


Roadmap

  • Circular Convolution

  • Associative Memory

  • Path finding algorithm


Hierarchical environment

  • Locations are hierarchically clustered

[Figure: hierarchical environment tree — root Z; second-scale locations Y1, Y2; first-scale locations X1–X6; leaf locations a–r]


Tree representation

  • The scale of a location corresponds to its height in the tree structure.

  • Any node of the tree can be queried directly, without pointer following

  • Maximum number of goal searches = height of the tree


Circular Convolution

Holographic Reduced Representations


Circular Convolution (HRR)

  • Developed by Tony Plate in 1991

  • Binding (encoding) operation – Convolution

  • Decoding operation – Involution followed by convolution


Basic Operations

  • Binding

  • Merge


Binding - encoding

C = A ⊛ B

  • The bound vector is dissimilar to its constituents: C ≁ A and C ≁ B.


Circular Convolution (⊛)

  • Elements are summed along the trans-diagonals (Plate, 1991).
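The trans-diagonal sum can be written out directly, and the convolution theorem gives an equivalent O(n log n) computation via the FFT. A minimal NumPy sketch (the dimension and seed are illustrative):

```python
import numpy as np

def cconv(a, b):
    """Circular convolution: c_j = sum_k a_k * b_{(j-k) mod n}."""
    n = len(a)
    return np.array([sum(a[k] * b[(j - k) % n] for k in range(n))
                     for j in range(n)])

def cconv_fft(a, b):
    """Same operation in O(n log n) via the convolution theorem."""
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

rng = np.random.default_rng(0)
a = rng.normal(0, 1 / np.sqrt(16), 16)
b = rng.normal(0, 1 / np.sqrt(16), 16)
assert np.allclose(cconv(a, b), cconv_fft(a, b))
```

The FFT form is the one used in practice; the direct loop is only there to make the trans-diagonal definition concrete.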


Involution

  • Involution, written A*, is the approximate inverse of A under convolution: (A*)_i = A_{-i mod n}.
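In index form, the involution keeps element 0 and reverses the rest of the vector; it is its own inverse. A short sketch:

```python
import numpy as np

def involution(a):
    """Involution d_i = a_{(-i) mod n}: keep a[0], reverse the tail."""
    return np.concatenate(([a[0]], a[:0:-1]))

x = np.array([0., 1., 2., 3.])
print(involution(x))                              # [0. 3. 2. 1.]
assert np.allclose(involution(involution(x)), x)  # self-inverse
```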


Decoding

  • To decode C = A ⊛ B, convolve C with the involution of A: A* ⊛ C ≈ B.

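Putting the two operations together: convolving the bound vector with A's involution returns a noisy but recognizable copy of B, even though A and B themselves are unrelated. A sketch (dimension and seed are arbitrary):

```python
import numpy as np

def cconv(a, b):
    """Circular convolution via the FFT."""
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

def involution(a):
    """Approximate inverse under convolution."""
    return np.concatenate(([a[0]], a[:0:-1]))

n = 2048
rng = np.random.default_rng(1)
a = rng.normal(0, 1 / np.sqrt(n), n)
b = rng.normal(0, 1 / np.sqrt(n), n)

c = cconv(a, b)                    # bind
b_hat = cconv(involution(a), c)    # decode: noisy copy of b
cos = b @ b_hat / (np.linalg.norm(b) * np.linalg.norm(b_hat))
assert cos > 0.5                   # decoded vector clearly resembles b
assert abs(a @ b) < 0.2            # while a and b are near-orthogonal
```

The decoded vector is only similar to B, not equal to it, which is why a clean-up (associative) memory is needed later.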

Basic Operations

  • Binding

  • Merge


Merge

  • Normalized Dot product
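The slide names only the normalized dot product; in HRR terms the merge of several items is their normalized superposition, and the normalized dot product is the similarity measure used to compare vectors. A sketch under that reading (the function names are mine):

```python
import numpy as np

def merge(*items):
    """Superpose items and normalize the result."""
    s = np.sum(items, axis=0)
    return s / np.linalg.norm(s)

def similarity(u, v):
    """Normalized dot product (cosine similarity)."""
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

n = 2048
rng = np.random.default_rng(2)
a, b, c = rng.normal(0, 1 / np.sqrt(n), (3, n))

m = merge(a, b, c)
assert similarity(m, a) > 0.4       # m resembles each constituent...
assert abs(similarity(a, b)) < 0.2  # ...which are mutually dissimilar
```

Unlike binding, merging preserves similarity to the constituents; that distinction is what makes the two operations complementary.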


Properties

  • Commutativity: A ⊛ B = B ⊛ A

  • Distributivity: A ⊛ (B + C) = A ⊛ B + A ⊛ C (demonstrated with sufficiently long vectors)

  • Associativity: A ⊛ (B ⊛ C) = (A ⊛ B) ⊛ C
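All three identities can be checked numerically (they are exact algebraic properties of circular convolution; vector length matters for the similarity statistics of HRRs, not for these identities):

```python
import numpy as np

def cconv(a, b):
    """Circular convolution via the FFT."""
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

rng = np.random.default_rng(3)
a, b, c = rng.normal(0, 1, (3, 256))

assert np.allclose(cconv(a, b), cconv(b, a))                      # commutative
assert np.allclose(cconv(a, b + c), cconv(a, b) + cconv(a, c))    # distributive
assert np.allclose(cconv(a, cconv(b, c)), cconv(cconv(a, b), c))  # associative
```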


Associative Memory

Recall and retrieval of locations


Framework

[Figure: the same hierarchical tree — root Z; second-scale locations Y1, Y2; first-scale locations X1–X6; leaf locations a–r]


Assumptions

  • Perfect tree – each leaf has the same depth

  • Locations within a scale are fully connected, e.g. a, b, c; X4, X5, X6; etc.

  • Each constituent has the same contribution to the scale location (no bias).

[Figure: tree with first-scale locations X1–X6 under Y1 and Y2, root Z; leaves a (under X1) and p (under X6) highlighted]


Associative Memory

  • Consists of a list of locations

  • Takes a location as input and returns the most similar location from the list.

What do we store?
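The recall step described above can be sketched as a clean-up memory: store named location vectors and return the stored item most similar to a (possibly noisy) query. The class and names here are illustrative, not the deck's implementation:

```python
import numpy as np

class AssociativeMemory:
    """List of stored locations; recall returns the most similar one."""

    def __init__(self):
        self.names, self.vecs = [], []

    def add(self, name, vec):
        self.names.append(name)
        self.vecs.append(vec / np.linalg.norm(vec))

    def recall(self, query):
        q = query / np.linalg.norm(query)
        sims = np.stack(self.vecs) @ q      # normalized dot products
        return self.names[int(np.argmax(sims))]

n = 2048
rng = np.random.default_rng(4)
items = {name: rng.normal(0, 1 / np.sqrt(n), n) for name in "abc"}

mem = AssociativeMemory()
for name, vec in items.items():
    mem.add(name, vec)

noisy = items["b"] + 0.5 * rng.normal(0, 1 / np.sqrt(n), n)
assert mem.recall(noisy) == "b"   # cleans up a noisy query
```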


Scales

  • Locations a–r are each 2048-dimensional vectors with elements drawn from a normal distribution N(0, 1/2048).

  • Higher scales are built by recursive auto-convolution of their constituents.
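The slides do not spell out the construction, but one natural reading of "recursive auto-convolution" is that an item is raised one scale by convolving it with itself, and a scale location superposes the raised copies of its constituents (e.g. X1 from a, b, c). The following is a sketch of that reading, an assumption on my part rather than the deck's exact formula:

```python
import numpy as np

def cconv(a, b):
    """Circular convolution via the FFT."""
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

def raise_scale(v):
    """Assumed: auto-convolution lifts an item one scale."""
    return cconv(v, v)

def sim(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

n = 2048
rng = np.random.default_rng(5)
a, b, c = rng.normal(0, 1 / np.sqrt(n), (3, n))

# Assumed reading: X1 superposes its auto-convolved constituents.
X1 = raise_scale(a) + raise_scale(b) + raise_scale(c)
assert sim(X1, raise_scale(a)) > 0.4   # X1 contains a's raised copy
```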


Constructing scales

[Figure: X1 is constructed from its constituents a, b and c]

Across Scale sequences

  • Between each location and corresponding locations at higher scales.

[Figure: across-scale sequence linking a, b and c to X1]


Path finding algorithm

Quite different from standard graph search algorithms…


Path finding algorithm

[Flowchart: from Start, test Start == Goal?; if the goal is not found at the current scale, go to a higher scale and search for the goal; once the goal is found at a scale, retrieve the scales corresponding to the goal and move towards the Goal]


Retrieving the next scale

  • At scale 0, query the AS memory to retrieve the across-scale (AS) sequence; otherwise, use the sequence retrieved in a previous step.

  • Query the L memory with


Retrieving the next scale

  • Query the L memory with


Path finding algorithm

[Flowchart: from Start, test Start == Goal?; if the goal is not found at the current scale, go to a higher scale and search for the goal; once the goal is found at a scale, retrieve the scales corresponding to the goal and move towards the Goal]


Locating the goal

  • For example: a location, and goal: c


Locating the goal

  • Goal: p

  • Not contained in X1

[Figure: goal p lies under X6, outside X1's subtree]


Path finding algorithm

[Flowchart: from Start, test Start == Goal?; if the goal is not found at the current scale, go to a higher scale and search for the goal; once the goal is found at a scale, retrieve the scales corresponding to the goal and move towards the Goal]


Goal not found at Y1

[Figure: searching under Y1 — the goal p is not found at this scale]


Goal found at Z!

[Figure: at the root Z, the goal p is found]


Path finding algorithm

[Flowchart: from Start, test Start == Goal?; if the goal is not found at the current scale, go to a higher scale and search for the goal; once the goal is found at a scale, retrieve the scales corresponding to the goal and move towards the Goal]


Decoding scales

  • Same decoding operation


Decoding scales

  • Using the retrieved scales


Path finding algorithm

[Flowchart: from Start, test Start == Goal?; if the goal is not found at the current scale, go to a higher scale and search for the goal; once the goal is found at a scale, retrieve the scales corresponding to the goal and move towards the Goal]


Moving to the Goal

[Figure: the full hierarchical tree — moving from the current location towards the goal p]


To work on

  • Relax the assumption of a perfect tree.

  • Relax the assumption of a fully connected graph within a scale location.


References

  • Kanerva, P. (2002). Distributed representations. Encyclopedia of Cognitive Science, 59.

  • Plate, T. A. (1991). Holographic reduced representations: Convolution algebra for compositional distributed representations. In J. Mylopoulos & R. Reiter (Eds.), Proceedings of the 12th International Joint Conference on Artificial Intelligence (pp. 30-35). San Mateo, CA: Morgan Kaufmann.