
Reductive and Representational Explanation in Synthetic Neuroethology


Presentation Transcript


  1. Reductive and Representational Explanation in Synthetic Neuroethology Pete Mandik Assistant Professor of Philosophy Coordinator, Cognitive Science Laboratory William Paterson University, New Jersey

  2. Collaborators • Michael Collins, City University of New York Graduate Center • Alex Vereschagin, William Paterson University

  3. My Thesis • Even for the simplest cases of intelligent behavior, the best explanations are both reductive and representational

  4. Overview • Mental representation in folk-psychological explanation • Mental representation in non-humans • The problem of chemotaxis • Modeling the neural control of chemotaxis • What the representations are

  5. Mental reps in folk-psych • George is opening the fridge because: • George desires that he drinks some beer • George sees that the fridge is in front of him • George remembers that he put some beer in the fridge • George’s psychological states cause his behavior • George’s psychological states have representational content

  6. Mental reps in non-human animals • Rats and maze learning • After finding the platform the first time, rats remember its location and can swim straight to it on subsequent trials from novel starting positions. • Rats not only represent the location, but compute the shortest path.

  7. Mental reps in non-human animals • Ducks’ representation of rate of return • Every day two naturalists go out to a pond where some ducks are overwintering and station themselves about 30 yards apart. Each carries a sack of bread chunks. Each day a randomly chosen one of the naturalists throws a chunk every 5 seconds; the other throws every 10 seconds. After a few days experience with this drill, the ducks divide themselves in proportion to the throwing rates; within 1 minute after the onset of throwing, there are twice as many ducks in front of the naturalist that throws at twice the rate of the other. One day, however, the slower thrower throws chunks twice as big. At first the ducks distribute themselves two to one in favor of the faster thrower, but within 5 minutes they are divided fifty-fifty between the two “foraging patches.” … Ducks and other foraging animals can represent rates of return, the number of items per unit time multiplied by the average size of an item. • (Gallistel 1990; emphasis mine)
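
A worked version of that arithmetic, with an illustrative chunk size c and rates expressed in chunks per second (rate of return = items per unit time × average item size):

$$r_{\text{fast}} = \tfrac{1}{5}\,c, \qquad r_{\text{slow}} = \tfrac{1}{10}\,c \;\Rightarrow\; 2{:}1 \text{ split}; \qquad r'_{\text{slow}} = \tfrac{1}{10}\cdot 2c = \tfrac{1}{5}\,c \;\Rightarrow\; 50/50 \text{ split}.$$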

  8. Positive Chemotaxis • Movement toward the source of a chemical stimulus

  9. 2-D food finding • 2-Sensor Chemophile: • Steering muscles orient the creature toward the stimulus • Perception of the stimulus being to the right is fully determined by differential sensor activity • Sensors → Brain → Steering Muscles
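
A minimal Python sketch of the two-sensor scheme; the gradient function, sensor offset, and gain here are illustrative placeholders rather than part of the original model. Each sensor samples the gradient on its own side, and the difference between the two readings alone fixes which way the creature turns.

    import math

    def concentration(x, y, source=(0.0, 0.0)):
        # Illustrative chemical gradient: strength falls off with distance from the source.
        return 1.0 / (1.0 + math.hypot(x - source[0], y - source[1]))

    def steer_two_sensor(x, y, heading, offset=0.1, gain=2.0):
        # Sample the gradient at the left and right sensor positions.
        left = concentration(x + offset * math.cos(heading + math.pi / 2),
                             y + offset * math.sin(heading + math.pi / 2))
        right = concentration(x + offset * math.cos(heading - math.pi / 2),
                              y + offset * math.sin(heading - math.pi / 2))
        # Differential sensor activity fully determines the turn:
        # a stronger left reading turns the creature left, and vice versa.
        return heading + gain * (left - right)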

  10. 1-D food finding • 1-Sensor “Lost” Creature • left/right stimulus location underdetermined by sensor activity • only proximity perceived • Adding memory can help • Sensor → Brain → Steering Muscles
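
A matching sketch of the single-sensor case (again with illustrative names, not the actual network): the instantaneous reading gives only proximity, so the creature has to remember its previous reading and turn whenever the concentration is falling.

    import math

    def concentration(x, y, source=(0.0, 0.0)):
        # Same illustrative gradient as in the two-sensor sketch above.
        return 1.0 / (1.0 + math.hypot(x - source[0], y - source[1]))

    def steer_one_sensor(x, y, heading, previous_c, gain=1.5):
        # One sensor: the left/right location of the stimulus is underdetermined;
        # only proximity (the current concentration) is perceived.
        c = concentration(x, y)
        # Memory disambiguates: a falling reading means the heading is bad, so turn;
        # a rising reading means keep going straight.
        if c < previous_c:
            heading += gain * (previous_c - c)
        return heading, c  # c is stored as the memory for the next time step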

  11. Things to Note: • Note that single-sensor gradient navigation is a “representation hungry” problem • Note the folk-psychological explanation of how a human would solve the problem • Note, in what follows, the resemblance to the explanation of the worm’s solution

  12. C. elegans • Caenorhabditis elegans

  13. C. elegans

  14. C. elegans • Ferrée and Lockery (1999). “Computational Rules for Chemotaxis in the Nematode C. elegans.” Journal of Computational Neuroscience 6, 263-277.

  15. C. elegans

  16. C. elegans

  17. C. elegans

  18. The Extracted Rule:

  19. Zeroth Order • The simulations were run keeping only the terms up to the zeroth order: • This rule failed to produce chemotaxis for any initial position.

  20. First Order • Next the simulations were run keeping all terms up to the first order: • This rule accurately reproduced the successful chemotaxis performed by the network model.
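
The equation images from slides 18-20 are not preserved in this transcript. Purely as a schematic reconstruction (the coefficients a0 and a1 are placeholders, not Ferrée and Lockery's notation): the extracted rule expresses the turning rate as a series in the sensed concentration, so the zeroth-order truncation keeps only a constant bias, which fails, while keeping the first-order term adds sensitivity to how the concentration is changing over time, which succeeds:

$$\frac{d\theta}{dt} \;\approx\; \underbrace{a_0}_{\text{zeroth order}} \;+\; \underbrace{a_1\,\frac{dC}{dt}}_{\text{first order}} \;+\; \cdots$$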

  21. Problems • Remains open. . . • How the network controllers are working • What the networks themselves are representing and computing • Whether the networks are utilizing memory

  22. Framsticks • 3-D Artificial Life simulator • By Maciej Komosinski and Szymon Ulatowski • Poznan University of Technology, Poland • http://www.frams.poznan.pl/

  23. Framsticks

  24. Framsticks nematodes

  25. Memory in Chemotaxis • Experimental Set Up • 3 orientation networks: Feed-forward, Recurrent, and Blind • five runs each, for 240 million steps • mutations allowed only for neural weights • fitness defined as lifetime distance • Initial weights: Evolved CPGs with un-evolved (zero weights) orienting networks
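
A generic Python sketch of that evolutionary setup, not Framsticks' own scripting; a simple hill climber stands in for the actual algorithm, and names such as simulate_lifetime are hypothetical. Only the neural weights mutate, and fitness is the distance travelled over a lifetime.

    import random

    def mutate_weights(weights, sigma=0.05):
        # Mutations are restricted to the neural weights; morphology and topology stay fixed.
        return [w + random.gauss(0.0, sigma) for w in weights]

    def evolve(initial_weights, simulate_lifetime, steps=1000):
        # simulate_lifetime(weights) -> lifetime distance travelled (the fitness measure).
        best = initial_weights
        best_fitness = simulate_lifetime(best)
        for _ in range(steps):
            candidate = mutate_weights(best)
            fitness = simulate_lifetime(candidate)
            if fitness > best_fitness:  # keep the mutant only if it travels farther
                best, best_fitness = candidate, fitness
        return best, best_fitness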

  26. Results

  27. What the representations are • States of neural activation isomorphic to and causally correlated with environmental states • Sensory states • Memory states • Motor-command states

  28. Representation and Isomorphism • Isomorphism • One to one mapping between structures • structure = set of elements plus set of relations on those elements
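
In the usual formal rendering of the definition above (the notation is illustrative, not from the slides): given structures A = (X, R) and B = (Y, S), a bijection f : X → Y is an isomorphism just in case, for all x1, ..., xn in X,

$$R(x_1, \dots, x_n) \;\Longleftrightarrow\; S\bigl(f(x_1), \dots, f(x_n)\bigr).$$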

  29. Representation and Isomorphism • Representation • Primarily: a relation between isomorphic structures • Secondarily: a relation between elements and/or relations in one structure and those in another

  30. Isomorphisms between multiple structures • Of the many structures a given structure is isomorphic to, which one does it represent? • The range of choices will be narrowed by the causal networks the structure is embedded in

  31. For further investigation • States of desire/motivation • Clearer in models of action selection, not intrinsic to the stimulus-orientation networks • Modeling representational error and falsity • Error and falsity are distinct, but this is clearer in non-assertoric attitudes

  32. Summing up • Single-sensor chemotaxis is a “representation hungry” problem • Even explanations of adaptive behaviors as simple as chemotaxis benefit from psychological state ascriptions

  33. Summing up • The psychological states in question are identical to neural states • The neural states in question are causally explanatory of intelligent behavior in virtue of isomorphisms between structures of neural activations and structures of environmental features

  34. Summing up • Therefore… • Even for the simplest cases of intelligent behavior, the best explanations are both reductive and representational

  35. THE END
