Generic sensory prediction

Generic Sensory Prediction

Bill Softky

Telluride Neuromorphic Engineering Workshop

Summer 2011


----------------- Abstract trends -----------------

Predictive feedback

Feedforward “compression”

----------------- raw sensory stream ---------------


Today: ONE compressor.

Use the white images to predict the moving green ones


  • Axioms

  • Trans-modality: light, sound, tactile

  • Temporal

  • Unsupervised

  • Spatiotemporal compression

  • Strictly non-linear problem

  • Fake data for ground-truth validation


  • Tricks

  • Reversible piece-wise linear interpolation/extrapolation

  • Represent sub-manifold

  • Compress space and time separately

  • Sparse

  • CPU-intensive (for now)

  • “Hello World” reference implementation


The sensory input space

  • Low noise

  • High-dim: 8x8 = 64-pixel vector

  • Continuous motion 360 degrees

  • Constant speed

  • Toroidal boundary conditions

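The toy input described above (an 8 × 8 pixel array, a dot moving at constant speed with toroidal wrap-around) is easy to synthesize for ground-truth validation. A minimal sketch; the Gaussian-dot rendering and every parameter value here are my assumptions, not the talk's actual generator:

```python
import numpy as np

def moving_dot_frames(n_frames=500, size=8, speed=0.25, angle_deg=30.0):
    """Frames of a Gaussian dot drifting at constant speed on a torus
    (toroidal boundary conditions), flattened to 64-dim pixel vectors."""
    theta = np.deg2rad(angle_deg)
    vx, vy = speed * np.cos(theta), speed * np.sin(theta)
    grid = np.arange(size)
    frames = np.empty((n_frames, size * size))
    x = y = 0.0
    for t in range(n_frames):
        # toroidal (wrap-around) distance from every pixel to the dot centre
        dx = np.minimum(np.abs(grid - x), size - np.abs(grid - x))
        dy = np.minimum(np.abs(grid - y), size - np.abs(grid - y))
        frames[t] = np.exp(-(dx[None, :] ** 2 + dy[:, None] ** 2) / 2.0).ravel()
        x = (x + vx) % size
        y = (y + vy) % size
    return frames

X = moving_dot_frames()
print(X.shape)  # (500, 64)
```

Because the hidden (x, y) trajectory is known exactly, any reconstruction or prediction can be scored against ground truth.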


How to learn this unsupervised?

  • Discover/interpolate/extrapolate low-dim manifold

  • Discover/predict temporal evolution

  • Generalize across speeds


Intrinsic generating structure

  • Points generated from 2-d (x, y) + toroidal manifold

  • HIGHLY nonlinear



Using “Isomap” to discover manifolds

1. Points on continuous low-dim manifold embedded in N-dim

2. i) inter-point distance matrix Dij

ii) convert to via-neighbor (geodesic) Dij

iii) pick top few Principal Components (u, v) as axes

3. Result: matched lists of N-dim and low-dim coordinates for each point: (x1, x2, x3, x4, … x64) → (u, v)
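The three Isomap steps above can be sketched with scikit-learn's `Isomap` standing in for a hand-rolled implementation. The toy torus data and all parameters here are assumptions for illustration, not the talk's actual setup:

```python
import numpy as np
from sklearn.manifold import Isomap

# Toy substitute for the sensory stream: points on a hidden 2-D torus,
# embedded in 64 dimensions (the two angles play the role of the
# intrinsic generating coordinates).
rng = np.random.default_rng(0)
ang = rng.uniform(0, 2 * np.pi, size=(400, 2))
flat = np.column_stack([np.cos(ang[:, 0]), np.sin(ang[:, 0]),
                        np.cos(ang[:, 1]), np.sin(ang[:, 1])])
X = np.tile(flat, (1, 16))          # 4-dim torus coordinates copied into 64-dim

# i)-iii): inter-point distances, via-neighbor (geodesic) distances,
# then the top components as low-dim axes
iso = Isomap(n_neighbors=10, n_components=4)
U = iso.fit_transform(X)            # matched low-dim coordinate for each point
print(U.shape)                      # (400, 4)
```

A flat torus needs four embedding dimensions, which matches the 4-dim low-dim side of the pearl table on the later slides.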


Isomap discovers toroidal point-cloud


Manifold stored by 30-1000 “parallel pearl pair” table

64-dim ↔ 4-dim


Parallel paired pearl-polygon projection (“interpolation”)

Find 3 closest high-dim pearls

On their triangle, interpolate to closest match

Project to corresponding low-dim mix (same convex weights)


Bi-directional: same scheme low-dim to high-dim!

“Pseudo-inversion”? “Cleaning up”?
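The pearl-polygon projection on this slide, runnable in both directions, might look like the sketch below. The helper name is hypothetical, and the convex (barycentric) solve is approximated by affine least squares with clipping, which is my assumption rather than the talk's method:

```python
import numpy as np

def project_via_pearls(q, pearls_hi, pearls_lo, k=3):
    """Project a query point through the paired-pearl table: find its k
    closest pearls on one side, fit near-convex weights on their simplex,
    and apply the same weights to the paired pearls on the other side.
    Swapping the two tables runs the map in the reverse direction."""
    dists = np.linalg.norm(pearls_hi - q, axis=1)
    idx = np.argsort(dists)[:k]                 # k closest pearls
    P = pearls_hi[idx]
    # affine least squares (weights sum to 1), then clip + renormalise
    # as a cheap stand-in for a true convex (barycentric) solve
    A = np.vstack([P.T, np.ones(k)])
    b = np.concatenate([q, [1.0]])
    w, *_ = np.linalg.lstsq(A, b, rcond=None)
    w = np.clip(w, 0.0, None)
    w /= w.sum()
    return w @ pearls_lo[idx]
```

With k = 3 this is the triangle interpolation on the slide; called with the high-dim and low-dim tables swapped, it gives the "pseudo-inversion" / "cleaning up" direction.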



Dim-reduction recipe doesn’t matter:

Isomap ~ Locally Linear Embedding (“LLE”)


Reconstruction fidelity varies by…

  • # pearls

  • Manifold & sensory dimension

Why?


Scaling heuristic: minimum “pearls per axis”

  • (low-D + 1) points define local interpolation (continuous plane/polygon)

  • # axes = {25, 64, 121}

  • Min # pearls = (low-D + 1) × (#axes)


#pearls > min-pearls → good reconstruction


EXTRAPOLATION fidelity = 64-dim dot product: actual vs. “constant velocity” extrapolation

[Figure: actual trajectory vs. “constant velocity” extrapolation]


For prediction, measure extrapolation fidelity:
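One way to read the fidelity measure above: a 64-dim dot product between the actual next frame and a constant-velocity extrapolation. Normalizing the vectors first (so perfect prediction scores 1.0) is my assumption:

```python
import numpy as np

def extrapolation_fidelity(actual, predicted):
    """Normalised 64-dim dot product; 1.0 means a perfect prediction."""
    a = actual / np.linalg.norm(actual)
    p = predicted / np.linalg.norm(predicted)
    return float(a @ p)

def constant_velocity_extrapolation(x_prev, x_curr):
    """Baseline: linearly continue the last frame-to-frame change."""
    return x_curr + (x_curr - x_prev)
```

The constant-velocity baseline is the reference curve on the slide's plot; a learned predictor should score at least as well.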


Scaling redux: minimum “pearls per axis” … now a curved saddle (not a plane) for a continuous derivative

  • (low-D + 3) points define local saddle

  • # axes = {25, 64, 121}

  • Min # pearls = (low-D + 3) × (#axes)
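Both heuristics — (low-D + 1) support points per local linear patch, (low-D + 3) per curved saddle, one patch per axis — reduce to one line of arithmetic. `min_pearls` is a hypothetical helper name:

```python
def min_pearls(low_d, n_axes, curved=False):
    """Minimum-pearl heuristic: a local linear patch needs (low_d + 1)
    support points, a curved saddle needs (low_d + 3); one patch per axis."""
    return (low_d + (3 if curved else 1)) * n_axes

# e.g. 64 axes and a 2-D manifold -> 192 pearls (plane) or 320 (saddle)
for n_axes in (25, 64, 121):
    print(n_axes, min_pearls(2, n_axes), min_pearls(2, n_axes, curved=True))
```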


#pearls > min-pearls → good reconstruction

[Figure: reconstruction fidelity ≈ 0.97 – 1.0]


Local “motion” extrapolation needs state+direction

Bi-linear “Reichardt detector”: A × B → D

Now: tri-linear mapping: A × B × C → D

[Figure: Reichardt-detector pair (A, B → D) extended to a triplet (A, B, C → D)]

Cross/outer product → tri-linear vector

A × B × C = 4 × 4 × 4 = 64-dim (equal time-intervals between A, B, C)

[Figure: 4 × 4 × 4 grid of product terms Ai · Bj · Ck]

Accumulate linear “transition matrix”

A × B × C → D

4 × 4 × 4 = 64-dim in → 4-dim out

(like a 4th-rank tensor, 3rd-order Markov)

Accumulate every outer-product pair {A × B × C, D}
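A sketch of accumulating {A × B × C, D} pairs into a linear 64-dim → 4-dim map. Solving the accumulated pairs by least squares at the end is a stand-in for whatever running accumulation rule the talk actually uses; class and method names are my own:

```python
import numpy as np

def triple_feature(a, b, c):
    """Flattened outer product A x B x C: three 4-dim states -> 64-dim."""
    return np.einsum('i,j,k->ijk', a, b, c).ravel()

class TransitionMatrix:
    """Collect {A x B x C, D} pairs and fit a linear 64-dim -> 4-dim map."""
    def __init__(self):
        self.feats, self.targets = [], []

    def accumulate(self, a, b, c, d):
        self.feats.append(triple_feature(a, b, c))
        self.targets.append(d)

    def solve(self):
        # least-squares fit of D ~ M @ (A x B x C); M has shape (4, 64)
        F = np.asarray(self.feats)
        D = np.asarray(self.targets)
        M, *_ = np.linalg.lstsq(F, D, rcond=None)
        return M.T

    def predict(self, M, a, b, c):
        return M @ triple_feature(a, b, c)
```

The fitted M acts as the 4th-rank tensor on the slide: linear in the 64-dim tri-linear feature, hence tri-linear in (A, B, C).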


Make one prediction for state D(t)

  • Choose many recent triplets with different DT

  • Use all recent history

  • Average the transition-matrix outputs to predict D(t)

[Figure: triplets (A1, B1, C1), (A2, B2, C2), (A3, B3, C3) at spacings DT1, DT2, DT3, each fed through the transition matrix to predict D(t)]
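The averaging step can be sketched as below, with `predictor(a, b, c)` standing in for the transition-matrix lookup; the specific triplet spacings and history handling are assumptions:

```python
import numpy as np

def predict_next(history, predictor, dts=(1, 2, 3)):
    """For each spacing DT, form the recent triplet
    (X[t-3*DT], X[t-2*DT], X[t-DT]) and ask the predictor for D(t);
    return the average of all available predictions."""
    t = len(history)
    preds = []
    for dt in dts:
        if t >= 3 * dt:
            a, b, c = history[t - 3 * dt], history[t - 2 * dt], history[t - dt]
            preds.append(predictor(a, b, c))
    return np.mean(preds, axis=0)
```

Every triplet targets the same state D(t), so on smooth motion the per-DT predictions agree and averaging mostly cancels noise.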





“Speed invariance”

  • Learn on one “speed”

  • Assume transitions apply to all speeds

  • Rescale DT by d/dt(raw distance)

[Figure: dist(X(t), X(t−Δt)) vs. Δt for fast and slow inputs; panels for learned speed, double speed, and half speed]
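Rescaling DT by the measured raw-input speed might look like the sketch below. Treating the learning-time speed as a known constant passed in by the caller is my assumption:

```python
import numpy as np

def effective_dt(history, dt_learned, speed_learned, eps=1e-9):
    """Rescale the learned time-step by the current raw-input speed,
    measured as the mean frame-to-frame distance dist(X(t), X(t-1))
    over recent history: double the speed -> half the effective DT."""
    steps = np.linalg.norm(np.diff(np.asarray(history), axis=0), axis=1)
    return dt_learned * speed_learned / (steps.mean() + eps)
```

With this rescaling, transitions learned at one speed transfer to double- and half-speed input, as in the slide's panels.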



Future Directions

  • Echo-cancelling (“go backwards in time”)

  • Sudden onset

  • Multiple objects

  • Control

  • Hierarchy

    Current needs:

  • Cool demo problems w/ “ground truth”

  • Haptic? Rich structure?

  • Helpers!

