CogSci C131/Psych C123 Computational Models of Cognition

Tom Griffiths

Presentation Transcript
slide3

Computation

Cognition

slide4

[Diagram: two systems, each computing an output from an input]

Cognitive science

  • The study of intelligent systems
  • Cognition as information processing
slide5

Computational modeling

[Diagram: the two systems above, each shown as a computation mapping input to output]

Look for principles that characterize both computation and cognition

two goals
Two goals
  • Cognition:
    • explain human cognition (and behavior) in terms of the underlying computation
  • Computation:
    • gain insight into how to solve some challenging computational problems
computational problems

human cognition sets the standard

Computational problems
  • Easy:
    • arithmetic, algebra, chess
  • Difficult:
    • learning and using language
    • sophisticated senses: vision, hearing
    • similarity and categorization
    • representing the structure of the world
    • scientific investigation
three approaches
Three approaches

Rules and symbols

Networks, features, and spaces

Probability and statistics

logic
Logic

All As are Bs

All Bs are Cs

All As are Cs

Aristotle

(384-322 BC)

the mathematics of reason
The mathematics of reason

Thomas Hobbes

(1588-1679)

Rene Descartes

(1596-1650)

Gottfried Leibniz

(1646-1716)

modern logic
Modern logic

P → Q

P

∴ Q

George Boole

(1815-1864)

Gottlob Frege

(1848-1925)

computation
Computation

Alan Turing

(1912-1954)

logic15

Logic

[Diagram: a logical system pairs Facts (P → Q, P) with Inference Rules that derive new facts (Q); the facts are taken to describe The World]

categorization
Categorization

cat ≡ small ∧ furry ∧ domestic ∧ carnivore

logic18

Logic

[Diagram, repeated: Facts (P → Q, P) plus Inference Rules derive new facts (Q) about The World]
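
As a concrete illustration of this picture, here is a minimal forward-chaining sketch in Python (my own, not from the slides): facts plus if-then rules, applied repeatedly, derive new facts. The rules, including a features-to-category reading of the cat rule above, are made up for the example.

```python
# Minimal forward-chaining sketch (illustrative): facts plus if-then
# rules, applied repeatedly, derive new facts.

facts = {"small", "furry", "domestic", "carnivore"}

# Each rule is (premises, conclusion). The first reads the categorization
# rule from the earlier slide as features -> category; the second is made up.
rules = [
    ({"small", "furry", "domestic", "carnivore"}, "cat"),
    ({"cat"}, "mammal"),
]

changed = True
while changed:
    changed = False
    for premises, conclusion in rules:
        if premises <= facts and conclusion not in facts:
            facts.add(conclusion)   # modus ponens: from P -> Q and P, conclude Q
            changed = True

print(facts)   # includes the derived facts 'cat' and 'mammal'
```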

early ai systems

Early AI systems…

[Diagram: a Workspace containing Facts and Goals; Operations act on the Workspace; the system connects to The World through Observations and Actions]

rules and symbols
Rules and symbols
  • Perhaps we can consider thought a set of rules, applied to symbols…
    • generating infinite possibilities with finite means
    • characterizing cognition as a “formal system”
  • This idea was applied to:
    • deductive reasoning (logic)
    • language (generative grammar)
    • problem solving and action (production systems)
language

Language

“a set (finite or infinite) of sentences, each finite in length and constructed out of a finite set of elements”

[Diagram: the language L as a region within the set of all sequences]

This is a good sentence → 1
Sentence bad this is → 0

“linguistic analysis aims to separate the grammatical sequences which are sentences of L from the ungrammatical sequences which are not”

a context free grammar

A context free grammar

S  → NP VP
NP → T N
VP → V NP
T  → the
N  → man, ball, …
V  → hit, took, …

[Parse tree: S rewrites to NP VP; NP → T N → “the man”; VP → V NP, with V → “hit” and NP → T N → “the ball”]
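
A small sketch (not from the slides) of how a grammar like the one above generates sentences; the dictionary encoding and the generate function are my own, and productions are chosen uniformly at random since this grammar carries no probabilities.

```python
import random

# The grammar from the slide, as a dictionary from nonterminals to
# lists of possible right-hand sides.
grammar = {
    "S":  [["NP", "VP"]],
    "NP": [["T", "N"]],
    "VP": [["V", "NP"]],
    "T":  [["the"]],
    "N":  [["man"], ["ball"]],
    "V":  [["hit"], ["took"]],
}

def generate(symbol="S"):
    """Rewrite a symbol by recursively expanding nonterminals."""
    if symbol not in grammar:                # terminal: it is already a word
        return [symbol]
    rhs = random.choice(grammar[symbol])     # choose one production
    return [word for sym in rhs for word in generate(sym)]

print(" ".join(generate()))                  # e.g. "the man hit the ball"
```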

rules and symbols24
Rules and symbols
  • Perhaps we can consider thought a set of rules, applied to symbols…
    • generating infinite possibilities with finite means
    • characterizing cognition as a “formal system”
  • This idea was applied to:
    • deductive reasoning (logic)
    • language (generative grammar)
    • problem solving and action (production systems)
  • Big question: what are the rules of cognition?
computational problems25

human cognition sets the standard

Computational problems
  • Easy:
    • arithmetic, algebra, chess
  • Difficult:
    • learning and using language
    • sophisticated senses: vision, hearing
    • similarity and categorization
    • representing the structure of the world
    • scientific investigation
inductive problems

“In solving a problem of this sort, the grand thing is to be able to reason backward. That is a very useful accomplishment, and a very easy one, but people do not practice it much.”

Inductive problems
  • Drawing conclusions that are not fully justified by the available data
    • e.g. detective work
  • Much more challenging than deduction!
challenges for symbolic approaches
Challenges for symbolic approaches
  • Learning systems of rules and symbols is hard!
    • some people who think of human cognition in these terms end up arguing against learning…
the poverty of the stimulus
The poverty of the stimulus
  • The rules and principles that constitute the mature system of knowledge of language are actually very complicated
  • There isn’t enough evidence to identify these principles in the data available to children

Therefore

  • Acquisition of these rules and principles must be a consequence of the genetically determined structure of the language faculty
the poverty of the stimulus29
The poverty of the stimulus

Learning language requires strong constraints on the set of possible languages

These constraints are “Universal Grammar”

challenges for symbolic approaches30
Challenges for symbolic approaches
  • Learning systems of rules and symbols is hard!
    • some people who think of human cognition in these terms end up arguing against learning…
  • Many human concepts have fuzzy boundaries
    • notions of similarity and typicality are hard to reconcile with binary rules
  • Solving inductive problems requires dealing with uncertainty and partial knowledge
three approaches31
Three approaches

Rules and symbols

Networks, features, and spaces

Probability and statistics

similarity
Similarity

What determines similarity?

representations
Representations

What kind of representations are used by the human mind?

representations34

Representations

How can we capture the meaning of words?

Semantic networks

Semantic spaces

computing with spaces

Computing with spaces

[Diagram: cats and dogs plotted as points in a space of perceptual features x1 and x2; a network computes an output y from x1 and x2, with +1 = cat and -1 = dog, and an error comparing y to the correct label]
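
A minimal sketch of the kind of unit the diagram suggests: the output y is a weighted combination of the perceptual features, thresholded to +1 (cat) or -1 (dog). The weights and feature values below are made up.

```python
import numpy as np

# Illustrative numbers only: y = sign(w . x), with +1 read as "cat"
# and -1 as "dog".
w = np.array([0.8, -0.5])     # one weight per perceptual feature
x = np.array([1.0, 0.2])      # feature values for a single stimulus

y = np.sign(w @ x)            # +1 = cat, -1 = dog
print("cat" if y > 0 else "dog")
```
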
networks features and spaces
Networks, features, and spaces
  • Artificial neural networks can represent any continuous function…
problems with simple networks

Problems with simple networks

[Diagram: in the (x1, x2) plane, AND and OR can each be separated by a single line, but XOR cannot; a one-layer network computing y from x1 and x2 therefore fails on XOR]

Some kinds of data are not linearly separable

a solution multiple layers

A solution: multiple layers

[Diagram: a network with an input layer (x1, x2), a hidden layer (z1, z2), and an output layer (y)]
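
A sketch of why the extra layer helps, using hand-picked weights of my own rather than anything from the slide: the hidden units compute OR and AND of the inputs, and the output unit combines them into XOR.

```python
import numpy as np

def step(a):
    """Threshold unit: 1 if the input is positive, else 0."""
    return (a > 0).astype(float)

# Hand-picked weights (illustrative): z1 computes OR, z2 computes AND,
# and the output computes "OR and not AND", which is XOR.
W_hidden = np.array([[1.0, 1.0],     # z1 = step(x1 + x2 - 0.5)  -> OR
                     [1.0, 1.0]])    # z2 = step(x1 + x2 - 1.5)  -> AND
b_hidden = np.array([-0.5, -1.5])
w_out = np.array([1.0, -1.0])        # y  = step(z1 - z2 - 0.5)
b_out = -0.5

for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    z = step(W_hidden @ np.array(x) + b_hidden)
    y = step(w_out @ z + b_out)
    print(x, "->", int(y))           # prints 0, 1, 1, 0
```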

networks features and spaces40
Networks, features, and spaces
  • Artificial neural networks can represent any continuous function…
  • Simple algorithms for learning from data
    • fuzzy boundaries
    • effects of typicality
general purpose learning mechanisms
General-purpose learning mechanisms

Gradient descent on the error E:

Δw_ij = −η ∂E/∂w_ij        (η is the learning rate)

the delta rule

The Delta Rule

For an output y = g(Σ_j w_j x_j), for any function g with derivative g′:

Δw_i = η · (t − y) · g′(Σ_j w_j x_j) · x_i

where t is the target (+1 = cat, -1 = dog), (t − y) is the error, and g′(Σ_j w_j x_j) · x_i is the influence of input x_i on the output.

[Diagram: the network computing y from perceptual features x1 and x2]
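
A minimal sketch of delta-rule learning on the cat/dog setup, taking g to be the identity (so g′ = 1); the training items, targets, and learning rate are made up for illustration.

```python
import numpy as np

# Made-up training items: two perceptual features each,
# with target t = +1 for cats and t = -1 for dogs.
X = np.array([[0.9, 0.2],
              [0.8, 0.1],
              [0.2, 0.9],
              [0.1, 0.7]])
t = np.array([1.0, 1.0, -1.0, -1.0])

w = np.zeros(2)
eta = 0.1                                # learning rate

for epoch in range(50):
    for x_i, t_i in zip(X, t):
        y = w @ x_i                      # output y = g(sum_j w_j x_j), g = identity
        w += eta * (t_i - y) * x_i       # delta rule: error times input

print(w, np.sign(X @ w))                 # learned weights and predicted labels
```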

networks features and spaces43
Networks, features, and spaces
  • Artificial neural networks can represent any continuous function…
  • Simple algorithms for learning from data
    • fuzzy boundaries
    • effects of typicality
  • A way to explain how people could learn things that look like rules and symbols…
simple recurrent networks

Simple recurrent networks

(Elman, 1990)

[Diagram: the input layer (x1, x2) and a set of context units feed the hidden layer (z1, z2), which feeds the output layer; after each step the hidden layer activations are copied back into the context units]
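
A minimal sketch of the forward pass in a network of this kind (after Elman, 1990): at each step the hidden layer sees the current input together with the context units, which hold a copy of the previous hidden activations. The layer sizes and random weights are arbitrary, and no learning is shown.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 2, 3, 2
W_xh = rng.normal(size=(n_hidden, n_in))      # input layer   -> hidden layer
W_ch = rng.normal(size=(n_hidden, n_hidden))  # context units -> hidden layer
W_hy = rng.normal(size=(n_out, n_hidden))     # hidden layer  -> output layer

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

context = np.zeros(n_hidden)                  # context units start empty
for x in [np.array([1.0, 0.0]), np.array([0.0, 1.0]), np.array([1.0, 1.0])]:
    hidden = sigmoid(W_xh @ x + W_ch @ context)
    output = sigmoid(W_hy @ hidden)
    context = hidden.copy()                   # the "copy" link back to the context units
    print(output)
```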

networks features and spaces46
Networks, features, and spaces
  • Artificial neural networks can represent any continuous function…
  • Simple algorithms for learning from data
    • fuzzy boundaries
    • effects of typicality
  • A way to explain how people could learn things that look like rules and symbols…
  • Big question: how much of cognition can be explained by the input data?
challenges for neural networks
Challenges for neural networks
  • Being able to learn anything can make it harder to learn specific things
    • this is the “bias-variance tradeoff”
what happened
What happened?
  • The set of 8th degree polynomials contains almost all functions through 10 points
  • Our data are some true function, plus noise
  • Fitting the noise gives us the wrong function
  • This is called overfitting
    • while it has low bias, this class of functions results in an algorithm that has high variance (i.e. is strongly affected by the observed data)
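
A sketch of the kind of demonstration this slide summarizes: ten noisy observations of a simple true function, fit by a low-degree polynomial and by a degree-8 polynomial. The particular function and noise level are my own choices.

```python
import numpy as np

rng = np.random.default_rng(1)

def true_f(x):                       # the "true function" (my choice): a straight line
    return 2 * x - 1

x = np.linspace(0, 1, 10)
y = true_f(x) + rng.normal(scale=0.3, size=10)   # 10 noisy observations

x_test = np.linspace(0, 1, 200)                  # fresh points for generalization

for degree in (1, 8):
    coeffs = np.polyfit(x, y, degree)
    train_err = np.mean((np.polyval(coeffs, x) - y) ** 2)
    test_err = np.mean((np.polyval(coeffs, x_test) - true_f(x_test)) ** 2)
    print(degree, round(train_err, 3), round(test_err, 3))

# The degree-8 fit drives training error toward zero but typically shows much
# larger test error: it has fit the noise (low bias, high variance).
```
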
the moral
The moral
  • General purpose learning mechanisms do not work well with small amounts of data

(the most flexible algorithm isn’t always the best)

  • To make good predictions from small amounts of data, you need algorithms with bias that matches the problem being solved
  • This suggests a different approach to studying induction…
    • (what people learn as n → 0, rather than n → ∞)
challenges for neural networks55
Challenges for neural networks
  • Being able to learn anything can make it harder to learn specific things
    • this is the “bias-variance tradeoff”
  • Neural networks allow us to encode constraints on learning in terms of neurons, weights, and architecture, but is this always the right language?
three approaches56
Three approaches

Rules and symbols

Networks, features, and spaces

Probability and statistics

probability
Probability

Gerolamo Cardano (1501-1576)

probability58
Probability

Pierre-Simon Laplace (1749-1827)

Thomas Bayes (1701-1763)

bayes rule

Bayes’ rule

How rational agents should update their beliefs in the light of data

P(h | d) = P(d | h) P(h) / Σ_h′ P(d | h′) P(h′)

Posterior probability: P(h | d)
Likelihood: P(d | h)
Prior probability: P(h)
The denominator sums over the space of hypotheses.

h: hypothesis

d: data
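
A small worked sketch of the update, using a made-up two-hypothesis space: multiply each prior by the likelihood of the data and renormalize by the sum over hypotheses.

```python
# Bayes' rule over a two-hypothesis space (numbers made up):
# d = "the animal purrs", h in {cat, dog}.
prior = {"cat": 0.5, "dog": 0.5}           # P(h)
likelihood = {"cat": 0.9, "dog": 0.05}     # P(d | h)

evidence = sum(likelihood[h] * prior[h] for h in prior)             # sum over hypotheses
posterior = {h: likelihood[h] * prior[h] / evidence for h in prior}

print(posterior)   # P(cat | d) ~ 0.947, P(dog | d) ~ 0.053
```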

cognition as statistical inference
Cognition as statistical inference
  • Bayes’ theorem tells us how to combine prior knowledge with data
    • a different language for describing the constraints on human inductive inference
prior over functions
Prior over functions

[Figure: samples of functions drawn from priors with k = 8, a second parameter equal to 5, and a third parameter set to 1, 0.3, 0.1, and 0.01]

cognition as statistical inference63
Cognition as statistical inference
  • Bayes’ theorem tells us how to combine prior knowledge with data
    • a different language for describing the constraints on human inductive inference
  • Probabilistic approaches also help to describe learning
probabilistic context free grammars

Probabilistic context free grammars

S  → NP VP   1.0
NP → T N     0.7
NP → N       0.3
VP → V NP    1.0
T  → the     0.8
T  → a       0.2
N  → man     0.5
N  → ball    0.5
V  → hit     0.6
V  → took    0.4

[Parse tree for “the man hit the ball”, with each branch labeled by the probability of the rule used]

P(tree) = 1.0 × 0.7 × 1.0 × 0.8 × 0.5 × 0.6 × 0.7 × 0.8 × 0.5
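
The product above, spelled out; the pairing of factors with rules is my reading of the parse tree.

```python
from math import prod

# One factor per rule application in the parse of "the man hit the ball".
rule_probs = [1.0,   # S  -> NP VP
              0.7,   # NP -> T N
              1.0,   # VP -> V NP
              0.8,   # T  -> the
              0.5,   # N  -> man
              0.6,   # V  -> hit
              0.7,   # NP -> T N
              0.8,   # T  -> the
              0.5]   # N  -> ball

print(prod(rule_probs))   # P(tree) ~ 0.04704
```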

probability and learnability
Probability and learnability
  • Any probabilistic context free grammar can be learned from a sample from that grammar as the sample size becomes infinite
  • Priors trade off with the amount of data that needs to be seen to believe a hypothesis
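
A small sketch of that tradeoff with made-up numbers: a hypothesis that starts with a lower prior probability needs more supporting observations before the posterior comes to favor it.

```python
# Illustrative numbers: h1 = "coin is biased, P(heads) = 0.9",
# h0 = "coin is fair, P(heads) = 0.5". Observe n heads in a row and
# ask how many are needed before the posterior for h1 exceeds 0.95.

def posterior_h1(n_heads, prior_h1):
    like_h1 = 0.9 ** n_heads
    like_h0 = 0.5 ** n_heads
    numerator = like_h1 * prior_h1
    return numerator / (numerator + like_h0 * (1 - prior_h1))

for prior_h1 in (0.5, 0.01):        # neutral prior vs strong prior against h1
    n = 0
    while posterior_h1(n, prior_h1) < 0.95:
        n += 1
    print(prior_h1, n)              # lower prior -> more data needed
```
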
cognition as statistical inference66
Cognition as statistical inference
  • Bayes’ theorem tells us how to combine prior knowledge with data
    • a language for describing the constraints on human inductive inference
  • Probabilistic approaches also help to describe learning
  • Big question: what do the constraints on human inductive inference look like?
challenges for probabilistic approaches
Challenges for probabilistic approaches
  • Computing probabilities is hard… how could brains possibly do that?
  • How well do the “rational” solutions from probability theory describe how people think in everyday life?
three approaches68
Three approaches

Rules and symbols

Networks, features, and spaces

Probability and statistics