Computational Cognitive Modelling

COGS 511-Lecture 8

Computational Models of Analogy Making

COGS 511

Related Readings


  • Forbus et al. (1995). MAC/FAC: A Model of Similarity-Based Retrieval. Cognitive Science, 19.

  • Hummel and Holyoak (1997). Distributed Representations of Structure. Psychological Review.

    Both reprinted in Polk and Seifert (2002).

    Optional and Further Readings

  • French (2002). The Computational Modeling of Analogy Making. Trends in Cognitive Sciences 6(5).

  • Gentner and Markman (2003). Analogy-Based Reasoning and Metaphor. In Arbib (ed). Handbook of Brain Theory and Neural Networks.

  • Salvucci and Anderson (2001). Integrating Analogical Mapping and General Problem Solving: the Path-mapping Theory. Cognitive Science 25, 67-110.

Analogy Making

  • Mapping between two domains, a source (a.k.a. base) and a target, based on perceived relational commonalities between domains that may be dissimilar on the surface.

  • Hall’s essentials:

    • Recognition (Retrieval) of a source, given a target description

    • Establishment, elaboration and evaluation of the mapping between the two

    • Transfer of information from source to target

    • Consolidation (i.e. learning) of the outcome: generalizing from specific cases; developing general mental schemata

      (other processes suggested: dynamic representation building)

Benchmark Phenomena of Analogy: Experimental Evidence

  • Relational Similarity: alignment of relational structure; e.g., "The atom is like the solar system"

  • Structured Pattern Matching

    • Parallel connectivity: matching relations have matching arguments

    • One-to-one correspondence: the same item in the base cannot be aligned with multiple items in the target, or vice versa.

  • Systematicity: Analogies seek connected systems of matching relations rather than isolated relational matches

  • Candidate Inferences: Analogical inferences are generated via structural completion
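These constraints are mechanical enough to check in a few lines. A minimal sketch, assuming a toy encoding of facts as (relation, arg1, arg2) triples; this representation and the helper are illustrative, not taken from any of the models discussed below:

```python
# Check two structure-mapping constraints on a proposed correspondence:
# one-to-one mapping and parallel connectivity.

def is_structurally_consistent(base_facts, target_facts, mapping):
    """mapping: dict from base object -> target object."""
    # One-to-one: no two base items may map to the same target item.
    if len(set(mapping.values())) != len(mapping):
        return False
    # Parallel connectivity: matching relations must have matching
    # arguments, i.e. mapping the arguments of a base relation must
    # yield a relation that exists in the target.
    for rel, a, b in base_facts:
        if a in mapping and b in mapping:
            if (rel, mapping[a], mapping[b]) not in target_facts:
                return False
    return True

solar = {("attracts", "sun", "planet"), ("revolves", "planet", "sun")}
atom  = {("attracts", "nucleus", "electron"),
         ("revolves", "electron", "nucleus")}

print(is_structurally_consistent(
    solar, atom, {"sun": "nucleus", "planet": "electron"}))  # True
print(is_structurally_consistent(
    solar, atom, {"sun": "electron", "planet": "electron"}))  # False (not 1-1)
```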

Benchmark Phenomena of Analogy

  • Alignable Differences: corresponding but nonidentical elements are more salient; e.g., atoms have electrons and solar systems have planets (an alignable difference), whereas solar systems have asteroids and atoms do not (a nonalignable difference)

  • Flexibility in Interactive Interpretation

    • The same item can take part in many comparisons, with different aspects of its representation participating in each; e.g., US politics in Iraq: World War 2 or Vietnam? (different bases); "A lake is a mirror" vs. "Meditation is a mirror" (different targets)

  • Flexibility in Multiple Interpretations

    • Multiple interpretations of the same comparison, each structurally consistent: "Cameras are like tape recorders"

  • Cross-mapping

    • Object similarities suggest different correspondences than relational similarities do; people can compute both alignments and, depending on the features available, may prefer one over the other.

Symbolic Models

  • Early Models (1960s):

    • Argus (Reitman): proportional analogies via conceptual networks, e.g. bear:pig :: chair:? (foot, table, coffee, strawberry)

    • ANALOGY (Evans): proportional analogies in the domain of geometric figures; able to build high-level representations from low-level descriptions of the figures and to discover rules.

  • Structure Mapping Theory (SMT) (Gentner, 1983)

    • Emphasis on structural (relational) similarity and systematicity for mapping and making inferences

    • Structure-Mapping Engine (SME): computational implementation of SMT

    • MAC/FAC: a model of analogical retrieval intended to be coupled with SME as its retrieval front end.

    • Later SME-based models, such as SEQL (Kuehne, 2000), have been applied to infant categorization.

Connectionist Models

  • ACME (Holyoak and Thagard, 1989):

    • Localist constraint-satisfaction network in which structural similarity, semantic similarity, and pragmatic importance (relatedness to goals, pre-known mappings) determine the constraints that help consistent nodes become active. ARCS is the companion retrieval model for ACME; in ARCS, retrieval is dominated by semantic similarity.

  • STAR (Halford, 1994; Wilson 2001)

    • STAR-1: distributed connectionist model for proportional analogies, based on tensor products. STAR-2 has been used to model children's analogy-making capabilities.

  • DRAMA (Eliasmith and Thagard, 2001); Adaptive Resonance Theory-based analogy making (Jani and Levine, 2000)
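The tensor-product idea behind STAR can be illustrated with a toy numpy sketch: bind pairs of orthonormal vectors with outer products, superimpose the bindings, and retrieve a missing partner by multiplying the cue into the tensor. STAR-1 actually uses higher-rank tensors that include the relation symbol; this rank-2 version shows only the core binding trick:

```python
import numpy as np

# Orthonormal concept vectors (one-hot for clarity).
mother, daughter, woman, girl = np.eye(4)

# Bind each pair with an outer product and superimpose the
# bindings in a single memory tensor.
memory = np.outer(mother, daughter) + np.outer(woman, girl)

# Proportional analogy mother:daughter :: woman:?  Multiplying the
# cue into the tensor retrieves its partner, because the cue is
# orthogonal to the vectors of the other binding.
answer = woman @ memory
print(np.allclose(answer, girl))  # True
```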

Hybrid Models

  • COPYCAT and related architectures (TABLETOP, METACAT) (Hofstadter, 1984, 1995):

    • COPYCAT solves letter-string proportional analogies; it has a working memory, a long-term memory, and a procedural memory of nondeterministic, parallel codelets.

  • AMBR (Kokinov, 1994; Kokinov and Petrov 2001):

    • Solves problems by analogy within a cognitive architecture called DUAL, based on parallel, interacting, context-sensitive hybrid microagents.

Metaphors and Analogy

  • Metaphors are nonliteral assertions of likeness: “A cloud is like a sponge” or “A cloud is a sponge”.

  • Novel metaphors are processed similarly to analogies, but they need not be based on relations; e.g., the metaphor above can be interpreted via an attribute of a sponge, such as fluffiness.

  • Conventional metaphors (e.g. Some people are sheep) are more likely to be interpreted as class inclusions.

  • Metaphors can carry an emotional aspect rather than the explanatory-predictive functions of analogies.

  • There are few computational models of metaphor; see Budiu and Anderson (2000, 2002) and Narayanan (1999).

MAC/FAC: A Model of Similarity-Based Retrieval

  • Models of analogical retrieval should explain the fact that remindings are often based on surface similarities, whereas mapping and transfer rely on structural similarity

    • Experiments with the "Karla the Hawk" stories on memory access and subjective soundness (how well inferences carry from one story to another).

  • Focus on

    • Accessing a similar base situation

    • Creating a mapping from the base to the target

    • Evaluating the mapping

The MAC/FAC Model

  • Two-stage model: the MAC stage applies a computationally cheap but non-structural filter; the FAC stage performs a more expensive but more accurate structural match on items passed by MAC

  • Input: a pool of memory items and a probe (the target)

  • Output: an item from memory and a comparison of it with the probe

  • Each stage has parallel matchers and a selector
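The two-stage flow can be sketched as a filter-then-rank pipeline. The function names, selector widths, and scoring callables below are illustrative assumptions, not part of the published model:

```python
def mac_fac(memory, probe, cheap_score, structural_score,
            mac_width=3, fac_width=1):
    """Two-stage retrieval: a cheap non-structural MAC filter,
    then an expensive structural FAC match on the survivors."""
    # MAC: score every memory item in parallel with the cheap matcher,
    # and let the selector keep only the best few.
    mac_out = sorted(memory, key=lambda item: cheap_score(item, probe),
                     reverse=True)[:mac_width]
    # FAC: run the expensive structural matcher only on the MAC output.
    return sorted(mac_out, key=lambda item: structural_score(item, probe),
                  reverse=True)[:fac_width]

# Toy usage: strings as "memory items", character overlap as the cheap
# score, exact equality as a stand-in for structural match quality.
memory = ["karla", "zebra", "carla", "stone"]
cheap = lambda item, probe: len(set(item) & set(probe))
structural = lambda item, probe: int(item == probe)
print(mac_fac(memory, "carla", cheap, structural))  # ['carla']
```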

The FAC Stage

  • Based on the Structure-Mapping Engine (SME), which has already been tested for consistency with psychological evidence.

  • Computing a set of global interpretations, each of which has

    • A set of correspondences

    • A structural evaluation

    • A set of candidate inferences

  • Sensitive to both attributes and relations

  • The selector returns up to three very close matches.

The MAC Stage

  • Selector similar to FAC Stage

  • Matchers are based on content vectors: a special type of feature vector recording how many times each functor in the domain (relations, attributes, etc.) occurs in a given description.

  • Each memory item has a content vector stored with it; each probe's content vector is calculated and compared with the stored ones.

  • Similarity with connectionist frameworks
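A content vector and the cheap MAC comparison can be sketched directly. The Counter-based encoding and plain dot product below are a simplification that omits the paper's normalization details:

```python
from collections import Counter

def content_vector(description):
    """Count how many times each functor occurs in a description
    given as (functor, *args) expressions."""
    return Counter(functor for functor, *_ in description)

def mac_score(v1, v2):
    """Dot product over functor counts: the cheap, non-structural
    MAC estimate of similarity."""
    return sum(v1[f] * v2[f] for f in v1)

story = [("cause", "e1", "e2"), ("attack", "karla", "hunter"),
         ("cause", "e2", "e3")]
probe = [("cause", "x1", "x2"), ("flee", "hunter", "karla")]
print(mac_score(content_vector(story), content_vector(probe)))  # 2
```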

Cognitive Simulation Experiments

  • Variations on Karla the Hawk stories

  • Psychological experiments: 20 base stories, 12 distractors, and 5×4 probe stories, with variations classified as Literal Similarity (LS), Analogy (AN), Surface Similarity (SF), and Common First-Order Relation, mostly events (FOR); subjects were asked to write down the stories the probes reminded them of.

  • Computational simulations: 9 stories with 4 variants each, encoded in predicate logic; 9 distractors.

  • Three variations of simulations

  • Same qualitative ordering of stories that pass the MAC/FAC stages as in human subjects: LS > SF > AN > FOR

  • FAC accepts nearly half of MAC outputs.

  • Sensitivity analysis of the model to different factors

    • Not sensitive to normalization procedure for content vectors

    • Content vectors must include both attributes and relational information

    • Not sensitive to selector widths within a reasonable range

(Forbus et al., 1995)

Comparative Evaluation

  • Comparison with ARCS, the localist, constraint-satisfaction model of analogical retrieval (Thagard et al.)

    • No user-marked pragmatic effects in MAC/FAC

    • No cheap, initial estimation in ARCS

    • Different similarity metrics; no special status for relational similarity in ACME and ARCS, where lexical similarity is calculated via WordNet.

    • MAC/FAC can operate on ARCS datasets with similar results but not vice versa.

    • ARCS exhibits at least two reversals of the ordinal ordering on the Karla the Hawk stories



  • No Modeling of Retrieval Failure or Decay

  • No Explicit Effect of Goal Structures

  • Is the MAC stage psychologically plausible? Cf. feeling-of-knowing studies

  • Scaling up content vectors

  • More between-item and iterative processing as avenues for development

  • Able to accommodate e.g. expert-novice differences in analogy

  • Needs to be embedded in larger performance models

Empirical Phenomena for Evaluation of Models (Hummel and Holyoak, 1997)

  • Access and its Relationship to Mapping

    • Semantic similarity has a greater impact on access than on mapping

    • Isomorphism has less impact on access than on mapping

    • Close analogs and schemas are easier to access than far analogs

    • Access is competitive

    • A familiar analog is accessed more readily

Empirical Phenomena for Evaluation of Models (Hummel and Holyoak, 1997)

  • Analogical Mapping

    • Isomorphism

    • Semantic Similarity

    • Pragmatic Centrality

    • Multiple Possible Mappings for One Analog

    • Correct Initial Correspondence Facilitates Subsequent Mappings

    • Difficulty finding mappings for unnatural analogy problems

    • It is possible to map predicates with different numbers of arguments

LISA: Learning and Inference with Schemas and Analogies (Hummel and Holyoak, 1997)

  • Structure-sensitive connectionist model

  • Dynamic Binding: Units representing case roles are temporally bound to fillers of these roles.

  • Semantic units in LTM; semantic similarity as the dot product of the vectors corresponding to concepts

  • Units which fire in synchrony are bound together; they fire out of synchrony otherwise

    • Limited working-memory capacity: 4-6 groups can be simultaneously active yet mutually out of synchrony

    • One-level restriction: cannot deal well with embedding or with different role bindings

    • Neuroscientifically plausible?

    • Performs similarly to human analogical reasoning
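The dot-product similarity over semantic units can be shown in a few lines of numpy; the feature inventory here is invented for illustration and is not LISA's actual unit set:

```python
import numpy as np

# Concepts as binary vectors over shared semantic units
# (the features are invented for illustration).
units = ["animal", "canine", "feline", "pet", "wild"]
dog  = np.array([1, 1, 0, 1, 0])
cat  = np.array([1, 0, 1, 1, 0])
wolf = np.array([1, 1, 0, 0, 1])

# Semantic similarity as the dot product over shared units.
print(int(dog @ cat), int(dog @ wolf), int(cat @ wolf))  # 2 2 1
```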

Hummel and Holyoak (1997)

LISA: Learning and Inference with Schemas and Analogies (Hummel and Holyoak, 1997)

  • Drivers: either the base or the target is chosen as the driver; role-argument bindings in the driver activate distributed semantic representations, which in turn activate localist concept nodes in the other domain.

  • Working memory allows the role bindings of higher-order relations to precede the role bindings of their arguments, letting the correspondences of those higher-order relations constrain the correspondences of the arguments


Probability Problem (Hummel and Holyoak, 1997)

  • Based on permutation-combination problems given to college students

  • E.g., assigning cars to mechanics for repair, or assigning computers to students for study

  • First +/-: similarity of story line

  • Second +/-: similarity of roles and animacy (humans/artifacts); 0 = neutral

  • Semantic similarity positively influences analogical mapping when similar objects play similar roles

  • Semantic similarity negatively influences analogical mapping when similar objects play different roles

Hummel and Holyoak (1997)

Soap Opera Experiments and Model (Hummel and Holyoak, 1997)

  • Pragmatic Centrality: Pragmatic focus affects mapping

  • Two similar soap opera plots: romantic, professional, cheating relations

  • Plot extension and Mapping task

  • CP: consistent with pragmatic emphasis; IP: inconsistent with pragmatic emphasis; CC: consistent with cheating; IC: inconsistent with cheating

Hummel and Holyoak (1997)

Pragmatic Centrality on Ambiguous Maps (Hummel and Holyoak, 1997)

  • Two science-fiction stories with economic and military relations

  • Planet P1

    • Richer(Aflu,Barebrute)

    • Stronger(Barebrute,Compak)

  • Planet P2

    • Richer(Grainwell,Hungerall)

    • Stronger(Millpower,Mightless)

      Barebrute is ambiguous in the mapping: the Richer relation aligns it with Hungerall, while the Stronger relation aligns it with Millpower

Hummel and Holyoak (1997)

Main Results (Hummel and Holyoak, 1997)

  • Structurally consistent relational mappings

  • Cross-mappings possible

  • One interpretation per run, but variance is possible with respect to the activation order of nodes in the driver (this order is decided by the modeler, though)

  • Scaling up?

  • Cannot make inferences

  • Coarse qualitative and quantitative evaluation metrics

Path-Mapping Theory (Salvucci and Anderson, 2001)

  • Integrating Analogical Mapping and General Problem Solving

  • Understanding Analogy within a General Theory of Cognition, i.e. ACT-R

  • Finer fits to quantitative criteria (latency, correctness) than other theories of analogical mapping

  • Accounts for lower-level behaviour such as typing and eye movements, and incorporates incremental mapping

Components of the Theory

  • Representation

    • Analogies (analogs)

      • Objects

      • Relations

      • Roles: higher order conceptual structures linking objects and relations

        • Pointer to parent relation and its type

        • Pointer to child relation/object and its type

        • The role the object fills in the relation

          E.g., the object ss-sun serves the center role in the relation ss-revolves

Path Mapping

  • Path mapping maps a single source object to a single target object by finding analogous paths between the objects and their highest-order parent relations (i.e., root relations).

  • Implemented as a set of production rules (7 productions, plus additional productions in different models)

  • Different analogical mapping tasks will interleave path mapping with different organizational knowledge.

  • Alternative paths may exist, but only one is chosen at a time. Further mappings need not go all the way up to the root level and can take shortcuts.
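A hedged sketch of the core idea: describe each object by the sequence of roles on the path from its root relation down to it, then map objects whose role paths match. The triple encoding below is an illustrative assumption, not the ACT-R chunk representation the theory actually uses:

```python
# Roles as (parent_relation, role_name, child) triples; a path is the
# sequence of role names from a root relation down to an object.
def paths_to_objects(roles, root):
    """Enumerate (object, role-path) pairs reachable from a root relation."""
    out = []
    def walk(node, path):
        children = [(r, c) for p, r, c in roles if p == node]
        if not children:                 # leaf: an object
            out.append((node, tuple(path)))
        for role, child in children:
            walk(child, path + [role])
    walk(root, [])
    return out

# Solar system: ss-planet orbits ss-sun; atom: a-electron orbits a-nucleus.
source = [("ss-revolves", "orbiter", "ss-planet"),
          ("ss-revolves", "center", "ss-sun")]
target = [("a-revolves", "orbiter", "a-electron"),
          ("a-revolves", "center", "a-nucleus")]

# Map source objects to target objects that have identical role paths.
tpaths = {path: obj for obj, path in paths_to_objects(target, "a-revolves")}
mapping = {obj: tpaths[path]
           for obj, path in paths_to_objects(source, "ss-revolves")
           if path in tpaths}
print(mapping)  # {'ss-planet': 'a-electron', 'ss-sun': 'a-nucleus'}
```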

Related Parameters

  • Chunk Activations

  • Chunk Mismatches: degree to which chunks are dissimilar.

  • Production Matches and Utilities

Salvucci and Anderson (2001)

Accounting for the Existing Literature


Salvucci and Anderson (2001)

PP: Probability Problem

Values in parentheses are model predictions


Salvucci and Anderson (2001)

PP: Probability Problem (cont'd)

  • 1000 simulations; R=0.93

  • 5 common parameters (either estimated or preset across all models) + one additional parameter for miscellaneous error (0.36?)

  • Additional productions: setting the mapping subgoal and producing additional errors


SO: Soap Opera

1000 simulations, R = 0.98; one additional parameter for emphasized chunks; five additional productions


Salvucci and Anderson (2001)

AM: Attribute Mapping



Bill is intelligent.

Bill is tall.

Tom is timid.

Tom is tall.

Steve is intelligent.


Fido is clever.

Blackie is friendly.

Blackie is frisky.

Rover is clever.

Rover is friendly.

Ordering and Number of Similar Attributes Affect Ease of Mapping


Other Models

  • Sharing (Sh)

  • Country Mapping (CM)

  • Karla the Hawk Stories (KH)

    No quantitative fits given...


Story Mapping

  • Experiments and Modelling

  • Collection and modelling of eye-tracking and typing data (pressing three keys was required; key-time ratios)

  • Two stories; mapping task; study and mapping phases

  • One-story (the source story was removed during mapping) vs. two-story conditions

  • Significant effects of both adjacency and condition (the adjacency effect is present in the one-story condition and also affects mapping time)


Model for Story Mapping

Salvucci and Anderson (2001)

Results and Discussion

  • "Excellent" qualitative and quantitative fits to the experimental data; how the eye-movement and typing-time fits were obtained is not explained; 8 additional parameters.

  • Models both the incremental nature of analogical mapping and the effects of memory access and adjacency on mapping success.


Overall Evaluation

  • Different effects and processes within a single phenomenon

  • Value of qualitative fits (?)

  • Value for hybrid models


Lecture 9

  • Next week: Computational Models of Consciousness (Maia and Cleeremans)
