Lecture 1: MATHEMATICS OF THE BRAIN, with an emphasis on the problem of a universal learning computer (ULC) and a universal learning robot (ULR). Victor Eliashberg, Consulting Professor, Stanford University, Department of Electrical Engineering.
1. User understanding.
2. Repairman understanding.
3. Programmer (educator) understanding.
4. Systems developer understanding.
5. Salesman understanding.
1. BIOLOGICALLY-INSPIRED ENGINEERING (bionics)
Formulate biologically-inspired engineering / mathematical problems. Try to solve these problems in the most efficient engineering way.
This approach has had great success in engineering: the universal programmable computer vs. the human computer, the car vs. the horse, the airplane vs. the bird.
It hasn’t met with similar success in simulating human cognitive functions.
2. SCIENTIFIC / ENGINEERING (reverse engineering = hacking)
Formulate biologically-inspired engineering or mathematical hypotheses. Study the implications of these hypotheses and try to falsify the hypotheses. That is, try to eliminate biologically impossible ideas!
We believe this approach has a better chance to succeed in the area of brain-like computers and intelligent robots than the first one. Why?
So far the attempts to define the concepts of learning and intelligence per se as engineering/mathematical concepts have led to less interesting problems than the original biological problems.
12 cranial nerves; ~10^10 neurons in each hemisphere
31 pairs of nerves; ~10^7 neurons
The shown cerebellar network has ~10^11 granule (Gr) cells and ~2.5×10^7 Purkinje (Pr) cells. There are around 10^5 synapses between the T-shaped axons of Gr cells and the dendrites of a single Pr cell.
Memory is stored in such matrices
Cerebellum: N = 2.5×10^7 × 10^5 = 2.5×10^12, B ≈ 2.5 TB.
Neocortex: N = 10^10 × 10^4 = 10^14, B ≈ 100 TB.
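These estimates are simple products; a quick sanity check, assuming (as the slide implicitly does) one byte of storage per synapse:

```python
# Back-of-the-envelope storage estimates from the slide, assuming
# one byte per synapse (an illustrative assumption).
cerebellum_synapses = 2.5e7 * 1e5  # ~2.5e7 Purkinje cells x ~1e5 synapses each
neocortex_synapses = 1e10 * 1e4    # ~1e10 neurons x ~1e4 synapses each

bytes_per_synapse = 1
TB = 1e12

print(f"Cerebellum: {cerebellum_synapses * bytes_per_synapse / TB:.1f} TB")  # 2.5 TB
print(f"Neocortex:  {neocortex_synapses * bytes_per_synapse / TB:.0f} TB")   # 100 TB
```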
External system (W,D)
Computing system, B, simulating the work of the human nervous system
Sensorimotor devices, D
Human-like robot (D,B)
External world, W
B(t) is a formal representation of B at time t, where t = 0 is the beginning of learning. B(0) is an untrained brain. B(0) = (H(0), g(0)), where
H(0) = H is the representation of the brain hardware,
g(0) is the representation of initial knowledge (state of LTM)
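The tuple notation B(t) = (H(t), g(t)) can be captured by a small sketch; the class and field names below are illustrative assumptions, since the slides give only the tuple notation:

```python
from dataclasses import dataclass, field
from typing import Any, Dict

@dataclass
class Brain:
    """B(t) = (H(t), g(t)): hardware H plus LTM state g at learning time t."""
    H: Any                                            # brain hardware representation
    g: Dict[str, Any] = field(default_factory=dict)   # state of long-term memory
    t: int = 0                                        # learning time; t = 0 is untrained

def untrained(H) -> Brain:
    # B(0) = (H(0), g(0)): hardware plus initial knowledge (here: empty)
    return Brain(H=H, g={}, t=0)
```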
CONCEPT OF FORCED MOTOR TRAINING
External system (W,D)
During training, motor signals (M) can be controlled by the Teacher or by the learner (AM). Sensory signals (S) are received from the external system (W,D).
Working memory and circuitry
TWO TYPES OF LEARNING
Mental computations (thinking) as an interaction between motor control and working memory (EROBOT.EXE)
Motor and sensory areas of the neocortex
Working memory, episodic memory, and mental imagery
Primary sensory and motor areas, association areas
Association fibers (neural busses)
SYSTEM-THEORETICAL BACKGROUND
Type 0: Turing machines (the highest computing power)
Type 1: Context-sensitive grammars
Type 2: Context-free grammars
Type 3: Finite-state machines
Type 4: Combinatorial machines
(the lowest computing power)
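The hierarchy above can be made concrete with a minimal sketch (my own example, not from the slides): a Type-3 finite-state acceptor for the regular language (ab)*, which needs no memory beyond its current state.

```python
# A Type-3 machine: a finite-state acceptor for the regular language (ab)*.
def accepts_ab_star(s: str) -> bool:
    state = 0                     # 0: expecting 'a', 1: expecting 'b'
    for ch in s:
        if state == 0 and ch == 'a':
            state = 1
        elif state == 1 and ch == 'b':
            state = 0
        else:
            return False          # no transition defined: reject
    return state == 0             # accept only in the start/finish state

# By contrast, a^n b^n (a Type-2, context-free language) cannot be accepted
# by any finite-state machine: counting unbounded n needs read/write memory.
```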
Fundamental constraint associated with the general levels of computing power
Traditional ANN models fall below the red line. Symbolic systems go above the red line, but they require a read/write memory buffer. The brain has no such buffer.
Fundamental problem: How can the human brain achieve the highest level of computing power without a memory buffer?
Type 4: Combinatorial machines
a b c b a c
General structure of universal programmable
systems of different types
PROM stands for Programmable Read-Only Memory.
In psychological terms, PROM can be thought of as Long-Term Memory (LTM). The letter G implies the notion of synaptic Gain.
Type 3: Finite-state machines
Type 0: Turing machines (state machines coupled with a read/write external memory)
Memory buffer, e.g., a tape
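The "state machine coupled with a tape" idea can be sketched with a toy example of my own (not from the lecture): a machine that increments a binary number written least-significant-bit first on its tape, using only local reads and writes.

```python
# Type-0 sketch: a finite-state controller coupled with a read/write tape.
# This toy machine increments a binary number stored LSB-first on the tape.
def increment_lsb_first(tape: list) -> list:
    tape = tape + [0]              # blank cell so a final carry can always land
    head = 0
    while tape[head] == 1:         # carry propagates over the run of 1s
        tape[head] = 0
        head += 1
    tape[head] = 1                 # write the carried 1
    while tape and tape[-1] == 0:  # trim trailing blank cells
        tape.pop()
    return tape

# 3 = [1, 1] (LSB first)  ->  4 = [0, 0, 1]
```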
Basic architecture of a primitive E-machine
Data inputs to ILTM
INPUT LONG-TERM MEMORY (ILTM)
DECODING, INPUT LEARNING
E-STATES (dynamic STM and ITM)
MODULATION, NEXT E-STATE PROCEDURE
Modulated (biased) similarity function
Data inputs to OLTM
Selected subset of active locations of OLTM
OUTPUT LONG-TERM MEMORY (OLTM)
ENCODING, OUTPUT LEARNING
Data outputs from OLTM
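The read cycle outlined above (similarity over ILTM locations, modulation by E-states, selection of OLTM locations) might be sketched as follows. All names, the dot-product similarity, and the decay dynamics are illustrative assumptions of mine, not the author's definitions.

```python
import numpy as np

def e_machine_step(x, iltm, oltm, e_state, decay=0.5, gain=1.0):
    """One read cycle of a primitive E-machine (illustrative sketch).

    iltm/oltm: arrays of stored input/output vectors (paired locations);
    e_state: per-location residual excitation (dynamic STM/ITM).
    """
    sim = iltm @ x                    # raw similarity with each ILTM location
    biased = sim + gain * e_state     # modulated (biased) similarity function
    winner = int(np.argmax(biased))   # select the most excited location
    e_state = decay * e_state         # E-states decay over time
    e_state[winner] += 1.0            # winner keeps residual excitation
    return oltm[winner], e_state      # selected OLTM location drives the output
```

Residual E-states give recently selected locations an advantage on later cycles, a buffer-free form of working memory in the spirit of the slide.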
The brain as a complex E-machine
A GLANCE AT THE SENSORIMOTOR DEVICES
VISION
EYE
EYE MOVEMENT CONTROL
AUDITORY AND VESTIBULAR SENSORS
AUDITORY PREPROCESSING
~4,000 inner hair cells; ~12,000 outer hair cells
OTHER STUFF
EMOTIONS (1)
EMOTIONS (2)
SPINAL MOTOR CONTROL