B. F. Skinner's Molecular Interpretations

Presentation Transcript

  1. B. F. Skinner's Molecular Interpretations TxABA Houston Saturday, 3/5/05 Jack Michael, WMU

  2. Molecular is usually contrasted with molar. Molecular interpretation: behavior is due to consequences over a brief period following each response (any long-term effects are due to the accumulation of these immediate effects). Molar interpretation: behavior is due directly to consequences over the long term. • This distinction is currently introduced in learning textbooks (e.g., Catania, 1998) in terms of (a) the reinforcement for avoidance responding without a warning stimulus, and (b) why response frequency "matches" reinforcement (rfmt) frequency--the matching law. These issues arose around 1960-61.

  3. The Skinner interpretations that I present today occurred before the molecular-molar contrast became explicit (Are Theories of Learning Necessary?, 1950; Science and Human Behavior, 1953). Still, Skinner was much concerned with the events immediately following reinforcement. Here is the statement in Schedules of Reinforcement (Ferster & Skinner, 1957, p. 3): • Under a given schedule of rfmt, it can be shown that at the moment of reinforcement a given set of stimuli [including those resulting from the recent behavior of the organism] will usually prevail. . . Reinforcement occurs in the presence of such stimuli, and the future behavior of the organism is in part controlled by them or by similar stimuli according to a well-established principle of operant discrimination.

  4. [Diagram: pigeon operant chamber--pecking key, key lights, aperture light, food aperture, grain hopper.] Rfmt unavailable: aperture light off, grain hopper down where the grain cannot be accessed.

  5. [Diagram: pigeon operant chamber, grain hopper up.] Rfmt available: aperture light on, grain hopper up. After 3 sec the light goes off and the hopper goes back down.

  6. [Diagram: pigeon operant chamber, grain hopper down again.] Rfmt unavailable: aperture light off, grain hopper down where the grain cannot be accessed.

  7. Fixed-interval scallop: low rate immediately after reinforcement, then increasing up to the time the next rfmt is due, then low again after rfmt, and so on. [Cumulative record: total responses vs. time for FI 10 min, with reinforcements at 10', 20', and 30'.] Why the low rate after rfmt? Because the bird knows that it will not get any grain for pecking just after reinforcement? That is the cognitive explanation. Skinner's explanation is in terms of stimulus control: the stimulus conditions immediately after rfmt--food dust on the beak, residual effects of rapid head movements, etc.--have become an S∆ (S-delta) for pecking, because responses have never been reinforced in the presence of those stimulus conditions.
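The scallop pattern can be sketched as a toy simulation. The linear ramp in response rate across the interval is an illustrative assumption made here for the sketch, not a model from the talk; the point is only that the low rate just after reinforcement and the high rate near the end of the interval produce the scalloped cumulative record.

```python
# Toy simulation of fixed-interval (FI) responding, illustrating the
# "scallop": low response rate just after reinforcement (post-rfmt
# stimuli acting as S-delta), rising as the next reinforcement is due.
# The linear rate ramp is an assumption for illustration only.

def fi_scallop(interval_s=600, step_s=10, peak_rate=1.0):
    """Return per-step response counts across one FI interval.

    Rate is assumed to ramp linearly from 0 responses/sec just after
    reinforcement up to peak_rate responses/sec at the end of the interval.
    """
    counts = []
    t = 0
    while t < interval_s:
        rate = peak_rate * (t / interval_s)  # hypothetical linear ramp
        counts.append(rate * step_s)
        t += step_s
    return counts

counts = fi_scallop()
first_minute = sum(counts[:6])   # responses in the minute after rfmt
last_minute = sum(counts[-6:])   # responses in the minute before next rfmt
```

Summing the counts over the first and last minutes of the interval shows the asymmetry that gives the cumulative curve its scalloped shape.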

  8. Spontaneous Recovery: What is it? In 30-min extinction sessions on successive days, the response rate at the start of extinction session 2 is greater than at the end of extinction session 1, greater at the start of session 3 than at the end of session 2, and so on. [Figure: within-session response curves for Days 1-3, each falling over the 30 minutes but starting above the previous day's ending rate.]

  9. Spontaneous Recovery (cont'd.): Why does it happen? Some theories (Pavlov, Hull, others) contended that responding without reinforcement generates a form of inhibition (a hypothesized neurochemical substance, or a hypothetical entity of some sort), which dissipates with the passage of time.* Skinner offers another explanation based on operant stimulus control (Skinner, 1950, p. 85). *For a thorough treatment of spontaneous recovery and its relation to the concept of inhibition, see Catania, 1998.

  10. Stimulus Change Decrement: After an operant function-altering operation (reinforcement, extinction, punishment, recovery from punishment, and others), the changed function is seen at its maximum value when the stimulus conditions are exactly the same as during the function-altering operation. Any change from those conditions results in a decrement in the changed function. When the changed function is an increase in responding due to reinforcement, then a stimulus change results in less behavior than if the stimuli were the same as during reinforcement. When the changed function is a decrease due to extinction or punishment, then a stimulus change results in more behavior than if the stimuli were the same as during extinction or punishment.
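The two directions of the decrement described above can be made concrete with a minimal sketch. The linear interpolation by stimulus similarity, and all the numbers, are illustrative assumptions of mine, not quantities from the talk.

```python
# A minimal sketch of "stimulus change decrement": a changed response
# function is at its maximum when test stimuli match those present
# during the function-altering operation; any mismatch moves responding
# back toward its pre-operation level. Linear interpolation by
# similarity is an illustrative assumption.

def responding_after_change(baseline, trained_level, similarity):
    """Interpolate between the pre-operation baseline and the trained level.

    similarity: 1.0 = test stimuli identical to the training stimuli;
    0.0 = completely novel stimuli.
    """
    return baseline + similarity * (trained_level - baseline)

# Reinforcement raised responding from 5 to 50 responses/min:
after_rfmt_same = responding_after_change(5, 50, 1.0)
after_rfmt_changed = responding_after_change(5, 50, 0.4)

# Extinction lowered responding from 50 to 2 responses/min:
after_ext_same = responding_after_change(50, 2, 1.0)
after_ext_changed = responding_after_change(50, 2, 0.4)
```

With these assumed numbers, a stimulus change after reinforcement yields less behavior than the unchanged condition, while the same change after extinction yields more--exactly the two cases the slide distinguishes.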

  11. Demonstration of stimulus change decrement with respect to extinction. • Nine pigeons were given a history of variable-ratio (VR) reinforcement for pecking a yellow triangle. • In the session shown, the triangle was yellow for the first 30 minutes, with more than 1100 responses per bird.* • When extinction started, the key color was changed to red. • After 15 minutes the color was changed back to yellow. [Figure: cumulative responses (0-1800) over 60 minutes--VR rfmt, then extinction.] *Group data: the curve is based on the responses of all 9 birds.

  12. Spontaneous recovery analogy: a hypothetical experiment that is more like the actual spontaneous recovery situation. • Phase 1: the pigeon is placed in the experimental chamber with a red ceiling light flashing, • but the light fades to off in 2 minutes. • Key pecking gets VI 30-sec rfmt in both the presence and the absence of the flashing red light. [Figures: intensity of the flashing red light fading to off over the first 2 min, and responses over the 10-min session under VI 30" rfmt.]

  13. Spontaneous recovery analogy (cont'd.) • Phase 2: an extinction session, with the flashing red light on at the beginning of the session but fading rapidly to off in 2 min, as during reinforcement sessions. • There is thus only 2 min of extinction in the flashing light before it goes off, • then about 8 min of extinction with the light off. • Then, after 10 minutes, the flashing light is turned on again. • Responding recovers, as in Skinner's procedure.

  14. Spontaneous recovery analogy (cont'd.) • At the beginning of reinforcement sessions there are residual stimulus effects from being removed from the home cage, transported to the experimental chamber, etc. • These rapidly fade to off, just like the flashing red light in the hypothetical experiment. • During an extinction session there is only a small extinction history in the presence of these residual handling stimuli. • There is much more extinction in their absence. • Thus more behavior occurs when they are again present at the beginning of the next extinction session than was occurring at the end of the previous session in their absence.
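The stimulus-control account above can be sketched as a two-context toy model: extinction is tracked separately for the handling-residual context (the first minutes of each session) and the residual-absent context (the rest). The exponential decay per extinguished minute is an illustrative assumption, not a quantity from the talk.

```python
# Toy stimulus-control model of spontaneous recovery: response strength
# is extinguished separately in two stimulus contexts. Each session
# starts in the "residual" (handling-stimuli) context and then shifts
# to the "absent" context, which therefore accumulates far more
# extinction. The per-minute decay factor is an invented illustration.

DECAY = 0.9  # assumed multiplicative decrement per minute of extinction

def run_sessions(n_sessions=3, session_min=30, residual_min=2):
    """Return (start, end) response strengths for each extinction session."""
    strength = {"residual": 1.0, "absent": 1.0}
    records = []
    for _ in range(n_sessions):
        start = strength["residual"]  # handling residuals present at start
        strength["residual"] *= DECAY ** residual_min
        strength["absent"] *= DECAY ** (session_min - residual_min)
        end = strength["absent"]      # residuals long gone by session's end
        records.append((start, end))
    return records

records = run_sessions()
```

Because the residual context gets only 2 minutes of extinction per session while the absent context gets 28, each session starts above where the previous one ended: the model reproduces "recovery" with no inhibition-like entity at all.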

  15. Spontaneous Recovery (still cont'd.): How about "late" spontaneous recovery? Pigeons were given 1-hour reinforcement sessions, then very brief extinction sessions, until no responding occurred in the brief sessions. When a session then lasted the usual duration, responding occurred in the later part of the session. Extinction had occurred in the presence of the stimuli of having just been placed in the chamber, but there had been no extinction in the presence of the stimuli of having been in the chamber for a while (Kendall, 1965). This interpretation is somewhat controversial: some studies (e.g., Welker & McAuley, 1978) support it, but some (e.g., Thomas & Sherman, 1986) do not. Still, the analysis illustrates Skinner's broad interpretation of the stimulus, and his concern for environment-behavior details.

  16. Objections. In the introduction to About Behaviorism (1974, pp. 4-5), Skinner lists 20 common objections to behaviorism or to the science of behavior, all of which, he asserts and later argues, are wrong. Objections 10 and 11 pertain to the move from the animal laboratory to human behavior.

  17. These are cited when we refer to the animal research literature as a basis for solving a problem in human behavior. 10. It works with animals but not with people, therefore its picture of human behavior is confined to those features which humans share with animals. 11. Its achievements under laboratory control cannot be duplicated in daily life.

  18. "The Analysis of Complex Cases"(Ch. 14, SHB) A popular way to resist a behavioral approach is to cite a common event that contradicts a behavioral principle (with, I think, the hope that the whole behavioral thing will go away). Such examples often depend on failure to recognize the multiple control of behavior. One type of multiple control consists in an independent variable having more than one effect on behavior.

  19. For example, a single occurrence of an aversive stimulus may: • elicit unconditioned respondent behavior (a painful S elicits heart-rate changes, GSR, pupillary dilation, etc.); • respondently condition the organism so that a neutral S will have effects similar to those of the aversive S (make the neutral S into a CS for autonomic and perceptual Rs); • evoke any behavior that has in the past terminated similar aversive stimuli (function as an MO/EO); • decrease the future frequency of any behavior that immediately precedes the occurrence of the aversive S (function as a punisher).

  20. The Principle of Satiation. With some forms of reinforcement, the strength of the motivating/establishing operation is decreased as a function of consumption of, or contact with, the reinforcer, and behavior evoked by that MO/EO becomes less frequent. Food ingestion results in a decrease in food-reinforced behavior. The critic says, "But here is an example--giving a small child a piece of candy--where satiation doesn't work, so the principle must be invalid (and perhaps we can forget about all this behavioral stuff!)."

  21. Giving a Child a Piece of Candy A person gives a small piece of candy to a child who is playing happily by himself (the person has given candy before). Much objectionable behavior emerges--asking for more candy, crying if it is not provided, perhaps a temper tantrum. We appear to have increased the relevant establishing operation, although our definition of satiation implies that we should have decreased it.

  22. But the sight & taste of candy is a stimulus condition with another effect besides satiation. It also functions as an SD (discriminative stimulus) for further asking. More than one piece of candy has usually been available at a time. Now assume that repeated candy-seeking is unsuccessful, a situation which evokes emotional behavior.

  23. Discriminative (SD), satiating, and emotional effects can be separated by never giving more than one piece of candy at a time. Then one piece will not be an SD for further asking, and the emotional behavior will have extinguished (or never been reinforced in the first place). It should then be possible to demonstrate a small decrease in the evocative strength of the establishing operation.
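The multiple-control analysis in slides 21-23 can be sketched as a simple additive decomposition. Treating the evocative strength of asking as a sum of EO, SD, and emotional components--and every magnitude used--is my illustrative assumption, not an analysis from the talk.

```python
# Illustrative decomposition of the candy example: one piece of candy
# slightly satiates (weakening the EO) but, given the usual history,
# also acts as an SD for further asking and occasions emotional
# behavior when asking fails. Additivity and all magnitudes are
# invented for illustration.

def asking_strength(eo, sd_effect, emotional):
    """Net evocative strength of candy-asking as a sum of components."""
    return eo + sd_effect + emotional

before = asking_strength(eo=0.5, sd_effect=0.0, emotional=0.0)

# After one piece, given the usual history (more candy usually available):
after_usual = asking_strength(eo=0.45, sd_effect=0.4, emotional=0.2)

# After the "one piece at a time" history of slide 23: candy is no
# longer an SD for more, emotional behavior has extinguished, and only
# the small satiation effect remains.
after_one_at_a_time = asking_strength(eo=0.45, sd_effect=0.0, emotional=0.0)
```

With the usual history the SD and emotional components swamp the small satiation decrement, so asking increases; with the separating history only the satiation decrement remains, matching the prediction on slide 23.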

  24. Social Behavior (Ch. 19 of SHB). Many social scientists believe that human social behavior requires its own special science. From his behavioral perspective, Skinner argues that no social phenomena emerge that cannot be understood in terms of the way one person's behavior is affected by the behavior of another person, and vice versa: as a conditioned eliciting stimulus (CS), an operant discriminative stimulus (SD), a conditioned reinforcer (Sr), a conditioned punisher (Sp), and a conditioned establishing operation (CEO). The relations may be based on very complex contingency histories, but the contingencies don't differ in principle from those of the nonsocial environment.

  25. To counteract this view, social behavior is often described that seems beyond the scope of behavior analysis--for example, catching someone's eye. The surprising power of an apparently trivial event is the common experience of catching someone's eye (in a flirtation, under amusing circumstances, at a moment of common guilt, and so on--Skinner's examples). The change in behavior which follows may be considerable. This has led to the belief that some nonphysical 'understanding' passes between persons. But the rfmt history offers an alternative explanation: this is a stimulus that is very important because of the contingencies in which it is involved.

  26. Catching Someone's Eye (cont'd.) • Our behavior may be very different in the presence or absence of a particular person. • When we simply see such a person in a crowd, our available repertoire immediately changes. • If, in addition, we catch his eye, we fall under the control of an even more restrictive stimulus--he is not only present, he is watching us. • When we catch his eye, we also know that he knows that we are looking at him. A much narrower repertoire of behavior is under the control of this specific stimulus: if we behave in a way which he censures, it will be not only in opposition to his wishes, but brazen. • It may also be important that "we know that he knows that we know that he is looking at us," and so on. • But there is nothing involved other than the current environment and the organism's history with similar environments.

  27. A few more examples of Skinner's concern for details

  28. • Operant conditioning as a possible self-control technique? Science and Human Behavior, pp. 237-238. • The remarkable properties of language that are related to its indirect reinforcement. Verbal Behavior, pp. 204-206. • Conditioned perceptual responses. Science and Human Behavior, pp. 266-275. • Discrete and continuous repertoires. Science and Human Behavior, pp. 116-119.

  29. Review • Spontaneous recovery was analyzed in terms of stimuli related to handling. • Giving a child a single piece of candy--offered as a criticism of the principle of satiation--was analyzed in terms of multiple control. • Catching someone's eye--a social stimulus of surprising power--was analyzed in terms of reinforcement contingencies. • All exemplify Skinner's interpretation of behavior in terms of the details of environment-behavior relations. Thanks for your attention.

  30. References • Catania, A. C. (1998). Learning (4th ed.). New Jersey: Prentice Hall. • Ferster, C. B., & Skinner, B. F. (1957). Schedules of reinforcement. New York: Appleton-Century-Crofts. • Kendall, S. F. (1965). Spontaneous recovery after extinction with periodic time-outs. Psychonomic Science, 2, 117-118. • Skinner, B. F. (1950). Are theories of learning necessary? Psychological Review, 57, 193-216. (Page references are to Cumulative Record, Definitive Edition, 1999.) • Skinner, B. F. (1953). Science and human behavior. New York: Macmillan. • Skinner, B. F. (1957). Verbal behavior. New York: Appleton-Century-Crofts. • Skinner, B. F. (1974). About behaviorism. New York: Knopf. • Thomas, D. R., & Sherman, L. (1986). An assessment of the role of handling cues in "spontaneous recovery" after extinction. Journal of the Experimental Analysis of Behavior, 46, 305-314. • Welker, R. L., & McAuley, K. (1978). Reduction in resistance to extinction and spontaneous recovery as a function of changes in transportational and contextual stimuli. Animal Learning and Behavior, 6, 451-457.

  31. If you would like copies of the PowerPoint slides, go to the TxABA web site (www.unt.edu/behv/txaba), follow the link called Conference Handouts, and download them. Or, if it is easier, email me and I will send the PowerPoint presentation as an attachment to a reply. I will also attempt to answer any questions you might have regarding today's presentation. Email address: <jack.michael@wmich.edu>