
Foundations of Cognitive Science




  1. Foundations of Cognitive Science Rutgers Center for Cognitive Science January 2002

  2. Cognitive Science is • The coming together of components of 4 disciplines: Psychology (cognitive) Linguistics (syntax & semantics) Computer Science (AI) Philosophy (of mind) • Around a shared commitment to…

  3. Computational Theory of Mind • The mind (brain) computes… • representations of selected aspects of the world • In doing this, it processes information

  4. History • Cognitive Science arose along with computer science • As an alternative to the behaviorist framework that preceded it • In behaviorist framework, mind does not compute and it does not base actions on representations • Experience causes it to rewire itself (plasticity) so as to behave more adaptively

  5. A Representation Is: • System of symbols • Representing system homomorphic to the represented system • Symbol system mediates interactions with the represented system

  6. A Symbol • Refers to something • Is subject to combinatorial operations within the symbol system

  7. Homomorphism • Having the same (mathematical) form • Example: Ohm’s law (I = V/R) has the same form as the law relating speed (S), force (F), and viscous resistance (R) in a mechanical system (S = F/R) • Thus, electrical circuits can be used to represent mechanical systems (as they were in analog computers)
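The shared form can be sketched in code: because both laws have the form flow = drive/resistance, a single function can serve as the representing system for either physical system. A minimal Python sketch (the function name and numbers are illustrative, not from the slides):

```python
# Ohm's law and the mechanical speed law share one form: flow = drive / resistance.
# Because the forms match, one computation represents both systems.

def flow(drive, resistance):
    """I = V/R under the electrical reading; S = F/R under the mechanical one."""
    return drive / resistance

# Electrical reading: 12 V across 4 ohms -> 3 A
current = flow(12.0, 4.0)

# Mechanical reading: 12 N of force against viscous resistance 4 -> speed 3
speed = flow(12.0, 4.0)

assert current == speed  # same form, so the same symbol manipulation serves both
```

This is exactly the sense in which analog computers used circuits to represent mechanical systems: one representing system, two represented systems.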

  8. A Representation Requires • Processes that establish reference (map from elements in the represented system to symbols in the representing system and vice versa) • Operations that combine symbols to yield other symbols, i.e., computational operations

  9. In a Successful (Valid) Representation • Something is true in the represented system iff (if and only if) the corresponding symbolic statement is true in the representing system

  10. Example • We have correctly measured the tensile strength of a cable and the weight of a load • Measurement = mapping from non-numerical entities like tensile strengths and weights to numbers, i.e., symbols • The cable lifts the load iff the weight of the load is less than the tensile strength (the numerical relation < has the same form as the physical relation lifts/breaks)
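The cable example can be written out directly; the numerical comparison stands in for the physical outcome. The strength and weight values below are illustrative assumptions, not figures from the slides:

```python
# Measurement maps non-numerical entities (strengths, weights) to numbers, i.e. symbols.
# Illustrative measured values (assumed for this sketch), in newtons:
tensile_strength_n = 5000.0   # measured tensile strength of the cable
load_weight_n = 3200.0        # measured weight of the load

# Validity of the representation: the symbolic relation `<` holds
# iff the corresponding physical relation (the cable lifts the load) holds.
cable_lifts = load_weight_n < tensile_strength_n
assert cable_lifts  # 3200 N < 5000 N, so this load is lifted
```

The point of the slide is that the truth of the numerical statement tracks the truth of the physical one; that tracking is what makes the measurement a valid representation.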

  11. In a Computer… • Bit patterns (e.g., 1001110 = states of banks of switches) are the symbols • Digital transducers (of, for example, temperature) map from, in this case, temperatures to the symbols (bit patterns) • The CPU performs combinatorial operations (e.g., arithmetic operations like + & -) with the bit patterns: two bit patterns combine in the CPU to yield a third bit pattern
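Slide 11's claim that two bit patterns combine in the CPU to yield a third can be shown concretely. A small Python sketch using the slide's example pattern (the second pattern is an illustrative assumption):

```python
# Bit patterns are the symbols; a combinatorial operation yields a third pattern.
a = 0b1001110          # the slide's example bit pattern (78 in decimal)
b = 0b0000101          # a second, assumed pattern (5 in decimal)

total = a + b          # arithmetic combination performed on the symbols

# The result is itself a bit pattern: 78 + 5 = 83 = 1010011 in binary
assert format(total, '07b') == '1010011'
```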

  12. Why Emergence of Cognitive Science Tied to Emergence of Computer Science • With Alan Turing (and others) computation becomes an object of mathematical thought rather than merely a tool • It becomes a physical concept • No longer an inherently mental (in sense of non-physical) process • Made it possible for materialists to think about the brain in computational terms

  13. Summary on Representations-1 • Cognitive sciences all ask, What representation do we (should we) have, and how is it (should it be) computed, and how does it determine action? • A representation is a functioning homomorphism between a represented system and a representing system • The elements in the representing system are symbols

  14. Summary on Representations-2 • Reference processes (e.g., perception, measurement) establish mappings from elements in the represented system to symbols; they establish the referents of the symbols • Operations performed on the symbols (aka computations) predict relations or events in the represented system

  15. Summary on Representations-3 • The translation of symbols into actions in or on the represented system uses control variables (parameters) and decision variables; both are symbols • The validity (efficacy) of the representation is jointly dependent on the reference processes and the computational processes

  16. Information • Defined and quantified by Shannon (1948) • The amount of information conveyed by a signal is proportional to the logarithm of the reduction in the receiver’s uncertainty • The lantern signal reduced Paul Revere’s uncertainty about the redcoats’ route by a factor of two (one bit = log₂2)
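The log-of-reduction measure on slide 16 can be computed directly. A minimal sketch (the function name is an assumption introduced here):

```python
import math

# Shannon (1948): information in bits = log2 of the factor by which
# the receiver's uncertainty is reduced.
def bits(alternatives_before, alternatives_after):
    return math.log2(alternatives_before / alternatives_after)

# Paul Revere: two equally likely routes collapse to one,
# a factor-of-two reduction = 1 bit.
assert bits(2, 1) == 1.0

# A signal that picks one of 8 equally likely alternatives carries 3 bits.
assert bits(8, 1) == 3.0
```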

  17. Information Conveyed by Neural Temperature Signal • Uncertainty reduction = width of prob. density function before signal, divided by width after

  18. Subjectivity of Information • The information communicated by a signal cannot be defined in the absence of knowledge of the receiver’s uncertainty • Changing receiver’s prior knowledge changes the information communicated w/o changing the signal itself • All signaling presupposes prior knowledge by receiver of possible states of the variable signaled

  19. Summary on Information • It is quantifiable • The measurement of communicated information requires knowledge of the receiver’s state of uncertainty prior to the receipt of the information-bearing signal • Its measurement rests on probabilistic considerations, because uncertainty is quantified by probabilities

  20. Decision Processes • Symbols translated into actions by control variables and decision variables • Control variable=parameter of an action, e.g., the direction to be run • Decision variable, symbol used in making a decision (binary choice): decision is “yes” iff the value of the decision variable exceeds a criterion (the decision criterion)
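Slide 20's distinction between the two kinds of symbols can be sketched as code; the criterion value, names, and the running-direction example are illustrative (the direction example echoes slide 20's "direction to be run"):

```python
# Decision variable: compared against a criterion to yield a binary choice.
# Control variable: a parameter of the resulting action.
DECISION_CRITERION = 0.7   # assumed threshold for this sketch

def decide_and_act(decision_variable, direction_deg):
    """Decision is 'yes' iff the decision variable exceeds the criterion.
    direction_deg is a control variable: it parameterizes the action (where to run)."""
    if decision_variable > DECISION_CRITERION:
        return ('run', direction_deg)
    return ('stay', None)

assert decide_and_act(0.9, 45) == ('run', 45)   # above criterion -> act
assert decide_and_act(0.4, 45) == ('stay', None) # below criterion -> don't
```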

  21. Statistical Decision Theory • Information carried by symbols (signals) ambiguous for two reasons • Noise: the same signal can be generated by a range of values of the input variable • Collapsing: radically different states of the world can collapse to produce identical signals

  22. Two Aspects of Decision Theory • Signal detection theory: optimizing decisions in the face of the unavoidable uncertainties about the true values of variables (caused by noise) • Bayesian inference: factoring into the decision prior information about the relative likelihoods of alternatives that can produce identical signals

  23. Signal Detection Theory • Nearby stimuli produce overlapping signal distributions • Overlap measured by d′ = the number of standard deviations by which the means are separated

  24. Signal Detection Theory (cont.) • Trade-off between true-positives and false alarms inevitable! • Relative percentage of the two kinds of responses depends on placement of decision criteria (“bias”)
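The trade-off on slides 23–24 can be made numerical with Gaussian signal distributions. The means, standard deviation, and criterion values below are illustrative assumptions, not values from the slides:

```python
import math

def norm_cdf(x, mu=0.0, sigma=1.0):
    # Normal CDF via the error function (standard identity)
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

# Assumed overlapping distributions: noise ~ N(0, 1), signal ~ N(1.5, 1)
mu_noise, mu_signal, sigma = 0.0, 1.5, 1.0
d_prime = (mu_signal - mu_noise) / sigma   # separation in SD units

def rates(criterion):
    hit = 1.0 - norm_cdf(criterion, mu_signal, sigma)   # true positives
    fa  = 1.0 - norm_cdf(criterion, mu_noise, sigma)    # false alarms
    return hit, fa

# Moving the criterion trades one error for the other: lowering it
# raises hits AND false alarms together.
hit_low,  fa_low  = rates(0.25)   # low criterion: a "jumpy" decision maker
hit_high, fa_high = rates(1.25)   # high criterion: a "conservative" one
assert hit_low > hit_high and fa_low > fa_high
```

No placement of the criterion escapes the trade-off; only the relative mix of hits and false alarms changes, which is the point of slides 25–27.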

  25. Placement Depends on Pay-Off Matrix • [pay-off matrix shown on slide] This matrix produces a low criterion = a “jumpy” decision maker

  26. Placement Depends on Pay-Off Matrix • [pay-off matrix shown on slide] This one produces a high criterion = a “conservative” decision maker

  27. Signal Detection Summary • Noise is inescapable and makes signal distributions overlap • The closer the means are, and the greater the noise, the more they overlap • Overlap makes trade-off between desirable and undesirable outcomes of decisions inescapable • Where decision maker sets decision criterion determines trade-off • Placement determined by relative values of desirable and undesirable outcomes

  28. Bayesian Inference • Bayes’ “theorem” is: P(x|y) = P(y|x)P(x) / P(y)

  29. Bayes’ Theorem • Consequence of the definitions of conditional frequencies (probabilities) • Def of simple (joint) probability: P(x,y) = P(x|y)P(y) = P(y|x)P(x) • Def of conditional probability: P(x|y) = P(x,y)/P(y)

  30. Ergo (by substitution) • P(x|y) = P(y|x)P(x) / P(y)

  31. Interpretations • Bayes’: A formula for evaluating the strength of the evidence that an observation provides for an hypothesis • Equivalently, a formula for determining the probability of a given state of the world given: 1) a sensory signal (y); 2) prior knowledge of the function relating that signal to state of the world [P(y|x)]; 3) a prior estimate of the probability of that state of the world [P(x)]
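The recipe on slide 31 can be checked numerically: given a likelihood function and a prior, Bayes' theorem yields the probability of the world-state from the sensory signal. The probabilities below are illustrative assumptions:

```python
# Assumed illustrative numbers (not from the slides):
p_x = 0.01                # prior estimate P(x) of the state of the world
p_y_given_x = 0.9         # P(y|x): probability of the signal given that state
p_y_given_not_x = 0.05    # probability of the same signal given any other state

# Overall probability of the signal, P(y), by total probability:
p_y = p_y_given_x * p_x + p_y_given_not_x * (1.0 - p_x)

# Bayes' theorem: posterior P(x|y) = P(y|x) P(x) / P(y)
p_x_given_y = p_y_given_x * p_x / p_y

# Consistency check from the definition of conditional probability:
# the joint P(x,y) computed either way must agree.
assert abs(p_x_given_y * p_y - p_y_given_x * p_x) < 1e-12
```

Note how the small prior keeps the posterior modest even with a strong likelihood, which anticipates the "Note That" slides that follow.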

  32. Bayes’ Interpretation • P(h|s) = P(s|h)P(h) / P(s): the posterior probability of the hypothesis equals the likelihood of the experience given the hypothesis, times the prior probability of the hypothesis, divided by the overall probability of the experience

  33. Note That • When the prior is zero, the posterior is, too. If something is impossible, then it isn’t so, no matter what your experience.

  34. Note That • If the experience happens all the time, then it cannot be strong evidence for the hypothesis

  35. Note That • If the experience has no connection to the hypothesis, then P(s|h) = P(s), so P(h|s) = P(h); the experience says nothing about the hypothesis

  36. Note That • If P(h) is high, then P(h|s) cannot be much higher; an experience that confirms an already likely expectation is not very informative

  37. Note That • If P(s|h) = 0, then P(h|s) = 0; if the hypothesis says it can’t happen, but it does, then it’s curtains for the hypothesis

  38. Note That • If P(s|h) = 1, then the lower P(s) is, the stronger the evidence for h provided by the experience s (confirmation of counterintuitive predictions is strong evidence).

  39. Finally, Note That • P(s|h)/P(s) = P(h|s)/P(h); an experience can only be strong evidence for an hypothesis if the prior probability of the hypothesis is low
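The identity on slide 39, and the bound behind its moral, can be verified with a quick numerical check (the probabilities are illustrative assumptions):

```python
# Assumed illustrative numbers for checking P(s|h)/P(s) = P(h|s)/P(h):
p_h = 0.2
p_s_given_h = 0.6
p_s = 0.3

p_h_given_s = p_s_given_h * p_h / p_s   # Bayes' theorem

# The identity: both ratios measure the same evidential strength.
assert abs(p_s_given_h / p_s - p_h_given_s / p_h) < 1e-12

# The moral: since P(h|s) <= 1, the evidence ratio P(h|s)/P(h)
# is bounded by 1/P(h), so it can be large only when the prior is small.
assert p_h_given_s / p_h <= 1.0 / p_h
```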

  40. Summary of Bayesian Inference • Normative model of how information should be processed • Controversy about when it is applicable (Can prior probabilities be specified?) • Controversy about whether mind does process information in this way even when it is in principle applicable
