
Path finding Framework using HRR


Surabhi Gupta ’11

Advisor: Prof. Audrey St. John

Algorithm and associated equations


- Circular Convolution
- Associative Memory
- Path finding algorithm

- Locations are hierarchically clustered

[Figure: hierarchical tree of locations. Root Z; scale-1 locations Y1 and Y2; scale-0 locations X1–X6; leaf locations a–r, three under each scale-0 location.]

- The scale of a location corresponds to its height in the tree structure.
- Any node of the tree can be queried directly, without pointer following.
- Maximum number of goal searches = height of the tree.

Holographic Reduced Representations

- Developed by Tony Plate in 1991
- Binding (encoding) operation – Convolution
- Decoding operation – Involution followed by convolution

- Binding: C = A ⊛ B (circular convolution). The bound vector is dissimilar to both constituents: C ≁ A and C ≁ B.
- Merge: vectors are superimposed by addition; the result remains similar to each constituent.
- In circular convolution, elements are summed along the trans-diagonals (Plate, 1991).
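The trans-diagonal sum can be written out directly. A minimal sketch in Python with NumPy (the dimension and seed here are arbitrary choices for illustration):

```python
import numpy as np

def cconv(a, b):
    """Circular convolution: c[k] = sum_j a[j] * b[(k - j) mod n],
    i.e. elements summed along the trans-diagonals."""
    n = len(a)
    return np.array([sum(a[j] * b[(k - j) % n] for j in range(n))
                     for k in range(n)])

def cosine(x, y):
    return float(x @ y / (np.linalg.norm(x) * np.linalg.norm(y)))

n = 256
rng = np.random.default_rng(0)
A = rng.normal(0, np.sqrt(1 / n), n)
B = rng.normal(0, np.sqrt(1 / n), n)
C = cconv(A, B)  # the binding C = A (*) B is dissimilar to both A and B
```

The O(n²) sum is mathematically equivalent to an elementwise product in the Fourier domain, which is what makes high-dimensional vectors practical.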

- Involution is the approximate inverse: decoding is involution followed by convolution, A* ⊛ (A ⊛ B) ≈ B, where the involution A* is defined by A*[i] = A[(−i) mod n].
- Similarity is measured with the normalized dot product.
- Circular convolution obeys:
  - Commutativity: A ⊛ B = B ⊛ A
  - Distributivity: A ⊛ (B + C) = A ⊛ B + A ⊛ C (shown by sufficiently long vectors)
  - Associativity: A ⊛ (B ⊛ C) = (A ⊛ B) ⊛ C
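These operations are easy to check numerically. A sketch using the FFT form of circular convolution (dimension and seed are arbitrary):

```python
import numpy as np

def cconv(a, b):
    # FFT form of circular convolution (equivalent to the trans-diagonal sum)
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

def involution(a):
    # A*[i] = A[(-i) mod n]: element 0 stays in place, the rest reverses
    return np.concatenate(([a[0]], a[1:][::-1]))

def cosine(x, y):
    return float(x @ y / (np.linalg.norm(x) * np.linalg.norm(y)))

n = 1024
rng = np.random.default_rng(1)
A = rng.normal(0, np.sqrt(1 / n), n)
B = rng.normal(0, np.sqrt(1 / n), n)
C = cconv(A, B)                   # bind
B_hat = cconv(involution(A), C)   # decode: involution, then convolution
# B_hat is a noisy but recognizable copy of B (high normalized dot product)
```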

Recall and retrieval of locations

[Figure: the hierarchical tree of locations (root Z; Y1, Y2; X1–X6; leaves a–r), repeated.]

- Perfect tree: each leaf has the same depth.
- Locations within a scale are fully connected, e.g. a, b, and c; X4, X5, and X6.
- Each constituent contributes equally to the scale location (no bias).

[Figure: compressed view of the tree showing leaf a, scale-0 locations X1–X6, scale-1 locations Y1 and Y2, root Z, and leaf p.]

Associative memory

- Consists of a list of locations.
- Given an input location, returns the most similar location from the list.
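A minimal sketch of such a memory. The class name and the use of the normalized dot product as the similarity measure are assumptions consistent with the rest of the deck:

```python
import numpy as np

class CleanupMemory:
    """Stores a list of named location vectors; given a (possibly noisy)
    query vector, returns the name of the most similar stored location."""
    def __init__(self, items):  # items: dict of name -> vector
        self.names = list(items)
        self.matrix = np.stack([items[name] for name in self.names])

    def cleanup(self, query):
        # normalized dot product against every stored vector
        sims = self.matrix @ query / (
            np.linalg.norm(self.matrix, axis=1) * np.linalg.norm(query))
        return self.names[int(np.argmax(sims))]

n = 512
rng = np.random.default_rng(2)
locations = {name: rng.normal(0, np.sqrt(1 / n), n) for name in "abcdef"}
memory = CleanupMemory(locations)

# a corrupted copy of "c" still cleans up to "c"
noisy = locations["c"] + 0.5 * rng.normal(0, np.sqrt(1 / n), n)
recovered = memory.cleanup(noisy)
```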

What do we store?

- Locations a–r are each 2048-element vectors with entries drawn from a normal distribution N(0, 1/2048).
- Higher scales: recursive auto-convolution of constituents.
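Generating the leaf vectors is straightforward. The construction of the higher-scale location below assumes that "recursive auto-convolution of constituents" means summing each constituent convolved with itself; that reading is an assumption, not confirmed by the slides:

```python
import numpy as np

def cconv(a, b):
    # circular convolution via FFT
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

def involution(a):
    # approximate inverse for decoding
    return np.concatenate(([a[0]], a[1:][::-1]))

n = 2048
rng = np.random.default_rng(3)
# leaf locations a-r: entries drawn from N(0, 1/2048)
leaves = {name: rng.normal(0, np.sqrt(1 / n), n)
          for name in "abcdefghijklmnopqr"}

# assumed construction of the scale-0 location X1 from constituents a, b, c
X1 = sum(cconv(leaves[k], leaves[k]) for k in "abc")

# querying X1 with a constituent's involution recovers a noisy copy of it
a_hat = cconv(involution(leaves["a"]), X1)
```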

X1 = a ⊛ a + b ⊛ b + c ⊛ c

- Associations are stored between each location and the corresponding locations at higher scales (e.g. between each of a, b, c and X1).

Quite different from standard graph search algorithms…

[Flowchart: Is Start == Goal? If not, search for the goal at the current scale. If the goal is not found at this scale, go to a higher scale and search again. If the goal is found at this scale, retrieve the scales corresponding to the goal and move towards the goal.]
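As a plain (non-HRR) illustration of the control flow above, here is the same search on an explicit tree. The grouping of leaves under X1–X6 is an assumption made for the example:

```python
# Toy version of the hierarchical search: climb scales until the goal's
# subtree is reached, then descend toward the goal.
CHILDREN = {"X1": "abc", "X2": "def", "X3": "ghi",
            "X4": "jkl", "X5": "mno", "X6": "pqr",      # assumed grouping
            "Y1": ["X1", "X2", "X3"], "Y2": ["X4", "X5", "X6"],
            "Z": ["Y1", "Y2"]}
PARENT = {child: node for node, kids in CHILDREN.items() for child in kids}

def ancestors(x):
    chain = [x]
    while chain[-1] in PARENT:
        chain.append(PARENT[chain[-1]])
    return chain

def find_path(start, goal):
    """Climb until the goal is 'found at this scale' (a shared ancestor),
    then move towards the goal, mirroring the flowchart."""
    up, down = ancestors(start), ancestors(goal)
    common = next(a for a in up if a in set(down))   # lowest shared scale
    climb = up[:up.index(common) + 1]                # go to higher scales
    descend = down[:down.index(common)][::-1]        # move towards the goal
    return climb + descend
```

For example, `find_path("a", "p")` climbs a → X1 → Y1 → Z and descends Y2 → X6 → p; the number of upward searches is bounded by the height of the tree, matching the earlier claim.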

- If at scale-0, query the AS memory to retrieve the AS sequence; otherwise, use the sequence retrieved in a previous step.
- Query the L memory with the retrieved sequence.


- For example: with a start location in X1 and goal c, the goal is found at the same scale.
- With goal p, the goal is not contained in X1, so the search moves to a higher scale.

[Figure: compressed tree view; goal p is not contained in X1.]


[Figure: compressed tree view, repeated.]


- The same decoding operation is applied, using the retrieved scales, to move towards the goal.


[Figure: the full hierarchical tree of locations, repeated.]

Future work

- Relax the assumption of a perfect tree.
- Relax the assumption of a fully connected graph within a scale location.

References

- Kanerva, P. (2002). Distributed representations. In Encyclopedia of Cognitive Science.
- Plate, T. A. (1991). Holographic reduced representations: Convolution algebra for compositional distributed representations. In J. Mylopoulos & R. Reiter (Eds.), Proceedings of the 12th International Joint Conference on Artificial Intelligence (pp. 30–35). San Mateo, CA: Morgan Kaufmann.