
Lyapunov Functions and Memory


Presentation Transcript


  1. Lyapunov Functions and Memory Justin Chumbley

  2. Why do we need more than linear analysis? • What is Lyapunov theory? • Its components? • What does it bring? • Application: episodic learning/memory

  3. Linearized stability of non-linear systems: failures • Is a steady state with purely imaginary eigenvalues stable? • Theorem 8 doesn't say • What are the size and nature of the basin of attraction? • cf. linearization covers only a small neighborhood of the ss • Lyapunov theory offers • A geometric interpretation of state-space trajectories

  4. Important geometric concepts (in 2-D for convenience) • State function: a scalar function U of the state, with continuous partial derivatives • A landscape: define a landscape with the steady state at the bottom of a valley

  5. Positive definite state function: U = 0 at the steady state and U > 0 at every other point in a neighborhood of it

  6. e.g. • A unique singular point at 0 • But U is not unique

  7. • U defines the valley • Do state trajectories travel downhill? I.e. what is the temporal change of a positive definite state function along trajectories? • Time is implicit in U: along a trajectory, dU/dt = (∂U/∂x)(dx/dt) + (∂U/∂y)(dy/dt) by the chain rule

  8. e.g. • N-dimensional case: dU/dt = Σi (∂U/∂xi)(dxi/dt) = ∇U · f(x), for a system dx/dt = f(x)
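A quick numerical check of the chain-rule formula above; the system and the choice of U are assumptions for illustration, not taken from the slides. It integrates a damped oscillator with ode45 and compares the finite-difference slope of U along the trajectory with ∇U · f:

% Assumed demo system: x' = y, y' = -x - 0.5*y, with U = x^2 + y^2.
f = @(t, z) [z(2); -z(1) - 0.5*z(2)];
tspan = linspace(0, 20, 2001);
[t, z] = ode45(f, tspan, [1; 1]);
U = z(:,1).^2 + z(:,2).^2;
% Chain rule: dU/dt = 2x*x' + 2y*y', which simplifies to -y^2 here
chain = -z(:,2).^2;
% Finite-difference slope of U along the trajectory
fd = gradient(U, t(2) - t(1));
fprintf('max |finite diff - chain rule| = %.2g\n', max(abs(fd - chain)));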

  9. Lyapunov functions and asymptotic stability • Intuition • Like water flowing down a valley, all trajectories in a neighborhood approach the singular point as t → ∞

  10. The example U satisfies condition (a): dU/dt ≤ 0 along trajectories, so the steady state is stable

  11. Ch. 8 Hopf bifurcation • Van der Pol model for a heart-beat • Analyzed at the bifurcation point (where the linearized eigenvalues are purely imaginary) • At this point (0,0) is the only steady state • Linearized analysis can't be applied (pure imaginary eigenvalues) • But: a positive definite state function still has time derivatives along trajectories

  12. U satisfies condition (b) • So dU/dt < 0 along trajectories • Except on the x and y axes, where dU/dt = 0 • But when x = 0, trajectories immediately move on to points where dU/dt < 0 • So U is a Lyapunov function for the system • The ss at (0,0) is asymptotically stable • Conclusion: we have proven stability where linearization fails
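The heart-beat equations themselves appear only on the slide images, so the sketch below uses an assumed van der Pol-type system at its bifurcation point (linearized eigenvalues ±i at the origin) to reproduce the slide-12 argument numerically:

% Assumed stand-in system: x' = y, y' = -x - y^3.
% With U = x^2 + y^2, dU/dt = -2*y^4 <= 0, vanishing on the x-axis;
% but any trajectory on that axis (x ~= 0) moves straight off it,
% so the origin is asymptotically stable even though linearization
% (pure imaginary eigenvalues) is silent.
f = @(t, z) [z(2); -z(1) - z(2)^3];
[t, z] = ode45(f, [0 200], [1.5; 0]);   % start on the x-axis
U = z(:,1).^2 + z(:,2).^2;
fprintf('U(0) = %.3f, U(end) = %.3f\n', U(1), U(end));
plot(t, U); xlabel('t'); ylabel('U');   % decays toward 0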

  13. Another failure of Theorem 8 • Points 'sufficiently close' to an asymptotically stable steady state go there as t → ∞ • But U is defined at ALL points of the valley in which the ss lies! • Intuition: any trajectory starting within the valley flows to the ss.

  14. Formally • There may be many steady states and basins • Assume we have a U for one of them • It delimits a region R within which Theorem 12 holds • A constraint U < K defines a subregion within the basin

  15. Key concept: a closed contour U = K (or spheroid surface in 3-D and higher) that encloses the ss • As long as this region lies within R, Theorem 12 guarantees that all points in it go to the steady state • K = the highest point on the valley walls from which nothing can flow out • The region U < K is a lower bound on the basin (and depends on the choice of U too!)
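A one-dimensional illustration of this sublevel-set bound; the bistable system below is an assumption for the sketch, not the slides' example:

% Assumed system: x' = x - x^3, stable steady states at x = +/-1.
% For the ss at x = 1, take U(x) = (x - 1)^2, so along trajectories
% dU/dt = 2*(x - 1)*(x - x^3) = -2*x*(1 + x)*(x - 1)^2 <= 0 for x > 0.
% So R = {x > 0}, and the largest sublevel set {U < K} inside R has
% K = 1, giving the basin lower bound (0, 2); the true basin is (0, inf).
x = linspace(-0.5, 3, 3501);
dUdt = 2*(x - 1).*(x - x.^3);
K = 1;
inBound = (x - 1).^2 < K;                 % the sublevel set {U < K}
assert(all(dUdt(inBound) <= 1e-12));      % U decreases throughout it
fprintf('basin lower bound: (%g, %g)\n', 1 - sqrt(K), 1 + sqrt(K));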

  16. Where does U come from? • No general rule. • Another example: divisive feedback
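Since there is no general recipe, one practical tactic is to screen candidate functions numerically: evaluate dU/dt = ∇U · f on a grid of states and inspect its sign. A minimal sketch with an assumed linear system (the slides' divisive-feedback equations are only on the images):

% Assumed test system (a damped rotation): x' = -x + y, y' = -x - y.
[X, Y] = meshgrid(linspace(-2, 2, 201));
fx = -X + Y;
fy = -X - Y;
% Candidate U = x^2 + y^2, with gradient (2x, 2y):
dU = 2*X.*fx + 2*Y.*fy;                   % equals -2*(x^2 + y^2) here
fprintf('candidate decreases everywhere on the grid: %d\n', all(dU(:) <= 0));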


  18. Memory • Declarative • Episodic • Semantic • Procedural • …

  19. Episodic memory (then learning) • Computational level: one-shot pattern learning & robust recognition (generalize over inputs and discriminate) • Learn to generalize/discriminate appropriately, given our uncertainty (statistics) • p(f, x)? p(f | x)? … e.g. regression/discriminant • Algorithmic level: use stable dynamic equilibria • φ(x) is the steady state of system m, given initial condition x • not smooth generalization (over inputs) • Dynamics • Implementation-level constraints • Anatomical: hippocampal CA3 network • Physiological: Hebbian

  20. The system m • 16×16 pyramidal cells • Completely connected but not self-connected • 1 interneuron for feedback inhibition • If R is a rate, then the rate of change of R is a sigmoidal function of the PSP • No self-connection • The weights are pre-learnt • The PSP includes inputs: a subset x of neurons is exogenously stimulated • What is φ(x)? • σ = semi-saturation constant, τ = time constant
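A minimal sketch of these dynamics. The rate equation follows the slide's description (sigmoidal dependence on the PSP, full connectivity without self-connections, one feedback-inhibition term), but every constant, the Naka-Rushton form of the sigmoid, and the outer-product weights are assumptions of mine:

% tau * dR/dt = -R + S(P),  S(P) = M*P^2 / (sigma^2 + P^2)  (assumed form)
N = 256;                          % 16*16 pyramidal cells
M = 100; sigma = 30; tau = 10;    % max rate, semi-saturation, time constant
pat = double(rand(N, 1) < 0.2);   % one stored binary pattern (assumed)
W = 0.05 * (pat * pat');          % pre-learnt symmetric weights (assumed)
W(1:N+1:end) = 0;                 % completely connected, no self-connections
g = 0.01;                         % feedback-inhibition gain (assumed)
input = 20 * pat;                 % exogenous drive to the subset x
R = zeros(N, 1); dt = 0.1;
for step = 1:3000                 % Euler integration
    P = max(W*R - g*sum(R) + input, 0);   % net PSP, rectified
    S = M * P.^2 ./ (sigma^2 + P.^2);     % sigmoidal rate function
    R = R + (dt/tau) * (-R + S);
end
% phi(x) is the steady state reached from initial condition x:
fprintf('mean rate on/off pattern: %.1f / %.1f\n', ...
    mean(R(pat > 0)), mean(R(pat == 0)));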

  21. Aim • Understand generalization/discrimination • Strategy • Input in the basin will be 'recognized' • i.e. identified with the stored pattern (asymptotically) • Lyapunov theory assesses the basins of attraction • Notation: …

  22. Theorem 14

  23. For reference • Can be generalized to higher order


  25. Pattern recognition (matlab)
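The transcript names the MATLAB demo but not its contents. A plausible reconstruction, continuing the assumed model from the slide-20 sketch: present a degraded cue and check that the network settles back to the full stored pattern, i.e. that the cue lay inside the pattern's basin of attraction:

% Recognition test (sketch, same assumed parameters as before).
N = 256; M = 100; sigma = 30; tau = 10; g = 0.01; dt = 0.1;
pat = double(rand(N, 1) < 0.2);
W = 0.05 * (pat * pat');  W(1:N+1:end) = 0;
on = find(pat);
cue = pat;  cue(on(1:round(0.3*numel(on)))) = 0;  % silence 30% of the pattern
R = zeros(N, 1);  input = 20 * cue;
for step = 1:3000
    P = max(W*R - g*sum(R) + input, 0);
    R = R + (dt/tau) * (-R + M*P.^2./(sigma^2 + P.^2));
end
recalled = R > M/2;                        % threshold the steady state
fprintf('stored pattern recovered exactly: %d\n', isequal(recalled, pat > 0));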

  26. Hebb rule • Empirical results • Implicate cortical and hippocampal NMDA receptors • 100-200 ms window for co-occurrence • Presynaptic Glu and postsynaptic depolarisation by back-propagation from the postsynaptic axon (Mg ion removal) → chemical events change the synapse

  27. For simplicity… • M = max firing rate • Both pre and post must be firing above half maximum • The synapse changes to a fixed value k when modified • Synaptic change is irreversible • All pairs are symmetrically coupled
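A sketch of this simplified rule (the function and variable names are mine; the slide gives only the verbal description):

% One-shot Hebb rule: R is the N-by-1 vector of rates during a stimulus.
% Pairs firing above half the maximum rate M jump irreversibly to the
% fixed weight k; coupling stays symmetric and never self-connects.
function W = hebbUpdate(W, R, M, k)
    active = R > M/2;                 % pre and post above half maximum
    W(active & active') = k;          % symmetric co-activity -> weight k
    W(1:size(W, 1) + 1:end) = 0;      % no self-connections
end

Because entries are only ever overwritten with the same k, repeated calls are irreversible and order-independent, matching the bullets above.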

  28. Learning (matlab) • One stimulus • Multiple stimuli
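A sketch of how the learning demo might proceed under the same assumptions: clamp the rates onto each stimulus in turn and apply the one-shot rule, superimposing all patterns in a single weight matrix:

% Learning demo (sketch): imprint several assumed random sparse stimuli.
N = 256; M = 100; k = 0.05;
W = zeros(N);
pats = double(rand(N, 3) < 0.2);          % three stimuli (assumed)
for s = 1:size(pats, 2)
    active = M * pats(:, s) > M/2;        % rates clamped onto stimulus s
    W(active & active') = k;              % one-shot, irreversible
end
W(1:N+1:end) = 0;                          % no self-connections
% The stored patterns are now candidate attractors of the slide-20 dynamics.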

  29. Pros and limitations of Lyapunov theory • More general stability analysis • Basins of attraction • Elegance and power • But: no algorithm for finding U • And U is not unique: each choice gives only a lower bound on the basin
