
RESCORLA-WAGNER MODEL




  1. RESCORLA-WAGNER MODEL

  2. What are some characteristics of a good model? Variables well described and manipulable. Accounts for known results and is able to predict non-trivial results of new experiments. Dependent variable(s) predicted in at least relative magnitude and direction. Parsimonious (i.e., minimum assumptions for maximum effectiveness).

  3. STEPS IN MODEL BUILDING • IDENTIFICATION: WHAT’S THE QUESTION? • ASSUMPTIONS: WHAT’S IMPORTANT; WHAT’S NOT? • CONSTRUCTION: MATHEMATICAL FORMULATION • ANALYSIS: SOLUTIONS • INTERPRETATION: WHAT DOES IT MEAN? • VALIDATION: DOES IT ACCORD WITH KNOWN DATA? • IMPLEMENTATION: CAN IT PREDICT NEW DATA?

  4. PRINCIPAL THEORETICAL VARIABLE: ASSOCIATIVE STRENGTH, V

  5. ASSUMPTIONS 1. When a CS is presented, its associative strength, VCS, may increase (CS+), decrease (CS−), or remain unchanged. 2. The asymptotic strength (λ) of the association depends on the magnitude (I) of the UCS: λ = f(UCS intensity). 3. A given UCS can support only a certain level of associative strength, λ. 4. In a stimulus compound, the total associative strength is the algebraic sum of the associative strengths of the components. [ex. T: tone, L: light. VT+L = VT + VL] 5. The change in associative strength, ΔV, on any trial is proportional to the difference between the present associative strength, VCS, and the asymptotic associative strength, λ.
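Assumption 5 is the model's learning rule, ΔV = k(λ − V), with k a learning-rate (salience) parameter and λ the UCS-determined asymptote. A minimal sketch in Python (the values k = 0.2 and λ = 100 are illustrative, not from the slides):

```python
def rw_trial(V, k, lam):
    """One Rescorla-Wagner trial: V changes by k * (lam - V)."""
    return V + k * (lam - V)

# Illustrative parameters: learning rate k and UCS-determined asymptote lam.
V, k, lam = 0.0, 0.2, 100.0
history = []
for _ in range(20):
    V = rw_trial(V, k, lam)
    history.append(V)
# V rises in ever-smaller steps toward, but never past, the asymptote lam.
```

The negatively accelerated curve this produces is the standard R-W acquisition curve.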

  6. Accounting for Known Data [figure: acquisition curves; learning-rate parameter k]

  7. Overshadowing [figure: compound conditioning with saliences k = .1 and k = .6]
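Overshadowing falls out of assumption 4 plus the learning rule: on a compound trial each component is updated from the total compound strength, so the more salient CS captures more of the shared λ. A sketch using the slide's saliences (k = .6 vs .1; λ = 100 is an assumed asymptote):

```python
def compound_trial(V, k, lam):
    """Update every component of a compound from the TOTAL strength."""
    err = lam - sum(V.values())    # one shared prediction error
    for cs in V:
        V[cs] += k[cs] * err       # split in proportion to salience

V = {"tone": 0.0, "light": 0.0}
k = {"tone": 0.6, "light": 0.1}    # saliences from the slide
for _ in range(30):
    compound_trial(V, k, lam=100.0)
# The salient tone overshadows the light: V["tone"] ends six times V["light"].
```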

  8. Accounting for Known Data

  9. Accounting for Known Data • Un-Blocking • What value can we change to allow associative strength to accrue to CS2?

  10. Accounting for Known Data (V1 = 90, V2 = 0, k = .2)
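Blocking and un-blocking both follow from the shared error term. With the slide's values (V1 = 90, V2 = 0, k = .2, plus an assumed λ = 100), a pretrained CS1 leaves almost no error for CS2 to learn from; raising λ (a larger UCS), as slide 9 asks, restores learning. A sketch:

```python
def compound_trial(V1, V2, k, lam):
    """One CS1+CS2 trial: both CSs share the error (lam - V1 - V2)."""
    dV = k * (lam - (V1 + V2))
    return V1 + dV, V2 + dV

# Blocking: CS1 pretrained to 90, then 10 compound trials at lam = 100.
V1, V2 = 90.0, 0.0
for _ in range(10):
    V1, V2 = compound_trial(V1, V2, k=0.2, lam=100.0)
# V2 stays small: CS1 already predicts the UCS.

# Un-blocking: same start, but a larger UCS (lam = 150) re-creates error.
U1, U2 = 90.0, 0.0
for _ in range(10):
    U1, U2 = compound_trial(U1, U2, k=0.2, lam=150.0)
# Now CS2 (here U2) accrues substantial strength.
```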

  11. Novel Predictions • Overexpectation effect • CS1+UCS… • CS2+UCS… • What happens when you present CS1+CS2?
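The overexpectation effect is a genuinely novel prediction: train CS1 and CS2 to asymptote separately, then present them as a compound with the same UCS. The compound predicts 2λ, the error goes negative, and both strengths fall. A sketch (λ = 100 and k = .2 are illustrative values):

```python
lam, k = 100.0, 0.2
V1 = V2 = lam                 # each CS separately trained to asymptote

for _ in range(10):
    err = lam - (V1 + V2)     # compound over-predicts: err starts at -100
    V1 += k * err
    V2 += k * err

# Both strengths DROP even though every compound trial is reinforced.
```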

  12. Weaknesses of Models • Errors of commission • Errors of omission

  13. Weaknesses of the R-W Model • Extinction curve is not the mirror image of the acquisition curve • Facilitated re-acquisition • What about time? • Spontaneous recovery • CS pre-exposure effect (latent inhibition) (see the review by Miller et al., 1995)

  14. Importance of Context • UCS pre-exposure effect • Present UCS by itself • Then pair UCS with CS • Explained by R-W, how?
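One R-W reading of the UCS pre-exposure effect, along the lines the slide hints at: UCS-alone presentations condition the context (CTX), and the later CS-UCS pairings are really CS+CTX trials, so the pretrained context blocks the CS. A sketch (all parameter values are assumptions for illustration):

```python
lam, k_cs, k_ctx = 100.0, 0.2, 0.1

# Phase 1: UCS presented alone -- only the context is there to learn.
V_ctx = 0.0
for _ in range(30):
    V_ctx += k_ctx * (lam - V_ctx)

# Phase 2: CS-UCS pairings, which occur in the conditioned context.
V_cs = 0.0
for _ in range(10):
    err = lam - (V_cs + V_ctx)
    V_cs += k_cs * err
    V_ctx += k_ctx * err

# Control: the same 10 pairings, but with no UCS pre-exposure.
C_cs, C_ctx = 0.0, 0.0
for _ in range(10):
    err = lam - (C_cs + C_ctx)
    C_cs += k_cs * err
    C_ctx += k_ctx * err

# Pre-exposure retards conditioning: V_cs ends far below C_cs.
```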

  15. Probability Space • Random Control: 0 < P(UCS|CS) = P(UCS|~CS) • Inhibitory CS: 0 < P(UCS|CS) < P(UCS|~CS) • Conditioning: 0 < P(UCS|~CS) < P(UCS|CS)
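The three regions of the probability space can be written as a tiny classifier; the rules below transcribe the slide's inequalities (the function name is ours):

```python
def contingency(p_ucs_given_cs, p_ucs_given_not_cs):
    """Classify a procedure by comparing P(UCS|CS) with P(UCS|~CS)."""
    if p_ucs_given_cs > p_ucs_given_not_cs:
        return "conditioning"
    if p_ucs_given_cs < p_ucs_given_not_cs:
        return "inhibitory CS"
    return "random control"

# e.g. contingency(0.8, 0.2) -> "conditioning"
```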

  16. Combining Conditional Probability and Rescorla-Wagner • How can R-W account for no conditioning to the CS in the following: • 0 < P(UCS|CS) = P(UCS|~CS) • Hint: the CS is a compound (CS + CTX) • V(CS + CTX) = VCS + VCTX

  17. Contiguity in the R-W Model If P(UCS|CS) = P(UCS|~CS), we really have CS = CS + CTX and ~CS = CTX. Then: P(UCS|CS + CTX) = P(UCS|CTX). By the R-W summation axiom, V(CS + CTX) = VCS + VCTX. But the context is reinforced on both kinds of trial, so VCTX = V~CS, and at asymptote V(CS + CTX) = VCTX; subtracting, VCS = V(CS + CTX) − VCTX = 0. → No significant conditioning occurs to the CS.
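The slide's conclusion can be checked numerically: interleave reinforced CS+CTX trials with equally reinforced CTX-alone trials (a deterministic stand-in for P(UCS|CS) = P(UCS|~CS)), and the context soaks up the associative strength while the CS ends near zero. Parameter values are illustrative:

```python
lam, k_cs, k_ctx = 100.0, 0.2, 0.2
V_cs, V_ctx = 0.0, 0.0

for _ in range(200):
    # CS+CTX trial, reinforced:
    err = lam - (V_cs + V_ctx)
    V_cs += k_cs * err
    V_ctx += k_ctx * err
    # CTX-alone trial, equally reinforced:
    V_ctx += k_ctx * (lam - V_ctx)

# The context approaches lam; the CS ends with essentially no strength.
```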
