
Levels of Description & Mental Modules








  1. Levels of Description & Mental Modules

  2. Follow-up on questions raised last week: 1) Would Searle allow that a computer that modelled the neurochemical processes of the brain might have real intentionality (i.e. might really think)? No, because only the brain has the right kind of causal powers. See the Scientific American article by Searle.

  3. 2) According to a functional definition of a process, does the same input always yield the same output? Not necessarily; there might be a range of appropriate outputs, e.g. a pseudo-random generator (roulette wheel, dice) or the chatbot ALICE.
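The point above can be sketched in code. A minimal Python illustration (the function name is hypothetical, not from the slides): the functional specification of a die roll fixes a *range* of appropriate outputs, so identical calls need not return identical values, yet every return value satisfies the specification.

```python
import random

def roll_die(rng=random):
    """Functionally specified process: any value in 1..6 counts as an
    appropriate output, so the same input (a request for a roll) need
    not yield the same output on every call."""
    return rng.randint(1, 6)

# Two calls with identical input may differ, but both satisfy
# the functional specification:
a, b = roll_die(), roll_die()
assert 1 <= a <= 6 and 1 <= b <= 6
```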

  4. Levels of Description Processes can be explained at different levels of description E.g. How does a car work? Explain to a user vs. explain to a mechanic vs. explain to a physics student

  5. Three levels of description: 1) Environmental level (also called intentional level): what does the thing do? What are the externally observable inputs and outputs (engineering: black box)? What are its capacities and limitations? 2) Computational level (also called design level or algorithmic level): how does the thing work? What method is used to give certain outputs to certain inputs? What is the internal organization (engineering: white box)? What are the inputs and outputs of its internal states? 3) Physical level: how are the processes physically realized? How can its workings be described as the actions of physical laws on physical materials?

  6. Example: multiplication by calculator vs. mind

  Level                 Calculator                       Mind
  Environmental         multiplication: 14 x 5 = 70      multiplication: 14 x 5 = 70
  Computational         repetitive addition:             memorized multiplication
                        14 + 14 + 14 + 14 + 14           table, do carrying: 14 x 5
  Physical              silicon chip, 1s and 0s          neurons firing
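The two computational-level descriptions in the table can be made concrete. A sketch in Python (function names are illustrative): both procedures have the same environmental-level behaviour, multiplication, but different internal methods.

```python
def multiply_repeated_addition(a, b):
    """Calculator-style: the computational level is repeated addition."""
    total = 0
    for _ in range(b):
        total += a
    return total

# Mind-style: a memorized single-digit table plus carrying
# (sketch: b is assumed to be a single digit).
TABLE = {(i, j): i * j for i in range(10) for j in range(10)}

def multiply_by_table(a, b):
    """Same environmental-level behaviour, different computational level."""
    digits, carry = [], 0
    for d in reversed(str(a)):              # process 14 as 4, then 1
        prod = TABLE[(int(d), b)] + carry   # table lookup
        digits.append(prod % 10)
        carry = prod // 10                  # carrying
    if carry:
        digits.append(carry)
    return int("".join(str(d) for d in reversed(digits)))

assert multiply_repeated_addition(14, 5) == multiply_by_table(14, 5) == 70
```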

  7. Each level of description supervenes on the level below. What a calculator does (e.g. multiplication) depends on its program (e.g. repetitive addition), which depends on its physical make-up. What the brain does (i.e. think) depends on its organization and computational processes, which depend upon its physical make-up.

  8. Cognitive science and the three levels Cognitive science focuses on the computational level of description. Why? The environmental level is the view from outside: we can see what the brain does; what we want to know is how it does it. Still, learning the brain’s capacities and limitations helps us figure out what processes are involved. The physical level is not very interesting, because the mind could be realized in a different physical form (multiple realizability). Also, the physical level is too fine-grained and too complicated: one can hardly meaningfully explain a brain in terms of the action of molecules, or even the firing of individual neurons. However, learning about the physical level does help us understand the capacities and limitations of the brain.

  9. Levels within levels But there is not just one computational level; the distinction between levels is sometimes unclear, and there are levels within levels. The organization of the internal functional modules is a computational-level description of the whole brain. The organization of smaller functional units within a functional module is a computational-level description of that module. The organization of sub-routines within a functional unit is a computational-level description of that functional unit.

  10. Each module, functional unit or sub-routine can itself be described at an environmental level, a computational level or a physical level, e.g. what does this functional unit do (in relation to other functional units of the brain/module), how does it do it, and how is its operation physically realized? Attempts to describe the brain at a computational level can zoom in or zoom out to focus on different levels of detail.

  11. Homuncular functionalism One attempt to describe the functional architecture of the brain. Homunculus: little man in the brain. Originally a characterization of Descartes’ idea that the mind was situated in the brain (like a little man) doing the thinking, receiving inputs and sending out outputs. Also called the Cartesian theatre fallacy (“fallacy” because of the infinite regress that arises when we again consider the brain of the little man, and so on).

  12. The cognitive-science use of the word “homunculus” refers to autonomous functional units of the brain. Homuncular functionalism is the attempt to describe the operation of the brain in terms of (computational-level descriptions of) the interaction of smaller and smaller units, until the most basic units can be explained in simple, mechanistic terms (ideally, at a physical level of description). An undischarged homunculus is any homunculus that is posited but whose internal workings are not broken down into more basic levels of explanation. Undischarged homunculi are a problem for any description of the brain.
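The idea of discharging homunculi can be sketched as nested functions (a toy analogy, not from the slides): each "homunculus" is explained by delegating to simpler ones, until a primitive, mechanistic step remains and nothing intelligent is left unexplained.

```python
# Top-level homunculus: "multiply" -- discharged into "add".
def multiply(a, b):
    total = 0
    for _ in range(b):
        total = add(total, a)   # delegates to a simpler homunculus
    return total

# "add" -- discharged into "increment".
def add(x, y):
    for _ in range(y):
        x = increment(x)
    return x

# "increment" -- a primitive, mechanistic step; nothing intelligent
# remains to be explained, so the homunculus is fully discharged.
def increment(x):
    return x + 1

assert multiply(6, 7) == 42
```

If "increment" were left unexplained and treated as itself intelligent, it would be an undischarged homunculus.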

  13. We want to avoid resorting to “miracles” in explaining the mind.

  14. Modularity of Mind • A computational-level description of the mind involves describing the mind’s functional architecture, i.e. how the different functions of the mind are carried out by different structures of the brain. • The mind can be described as organized into different functional modules, i.e. functionally distinct faculties.

  15. Phrenology Precursor to theory of mental modules Popular in 19th century Specific mental faculties associated with particular locations in the brain Mental abilities and personality could be read off from bumps on the head

  16. The modern theory of mental modules was put forward by Jerry Fodor in 1983. • Different parts of the brain are specialized in performing different types of function. • Modules: a division of labor in the brain. • Unlike in phrenology, the modules do not have to occupy a specific location – they can be spread out over different areas of the brain.

  17. Perception modules The clearest example of modularity in the brain. “Basic” modules. Vision, hearing, smell, touch and taste perception are each handled by distinct specialized areas of the brain.

  18. Characteristics of modules According to Fodor, a specialized faculty of the mind must meet the following criteria in order to count as a true module: • Domain specificity • Inaccessibility • Informational encapsulation (modules need not refer to other psychological systems in order to operate) • Automaticity • Speed • Innateness • Fixed neural architecture

  19. 1) Domain specificity: modules are specialized and only operate on one kind of input, e.g. the vision module only operates on visual input – it does not respond to input from other sensory organs or from other parts of the brain. 2) Inaccessibility: you cannot perceive the internal workings of a module, e.g. you cannot look into your mind and find out how your visual system works. You can only perceive the output, e.g. you just see a scene in front of you.

  20. 3) Informational encapsulation: a module cannot be affected by input from other parts of the brain. Your visual perception is not affected by what you hear or feel or know. • Illustrated by the persistence of optical illusions: your visual system does not correct an optical illusion, even when you know that it is an illusion. • E.g. the Müller-Lyer illusion

  21. More illusions See animated version at: http://library.thinkquest.org/05aug/01744/shepard_tabletop_illusion.htm

  22. 4) Automatic: modules perform their function automatically and it is impossible to turn off the function, e.g. if someone brings a hot fried chicken into class, you cannot decide not to smell it (unfortunately) 5) Fast: modules work extremely fast, so, for example, you appear to sense things immediately. You are unaware of any delay between opening your eyes and seeing the world. 6) Innate: the capacities of a module are inborn, and not developed through experience 7) Fixed neural architecture: there are particular neural systems associated with particular modules

  23. Modules within modules Modules can very often be broken down into sub-modules, e.g. the visual system appears to contain the following modules: • Color processing module • Form processing module • Motion processing module

  24. Evidence for modularity 1) Neural imaging: fMRI (functional Magnetic Resonance Imaging) scans show that distinct structures of the brain are active when subjects are engaged in certain tasks. E.g. regions in the lateral temporal association cortex light up when subjects engage in an object recognition task. [Image: an fMRI scan]

  25. 2) Brain injuries. An injury (or lesion) in a certain region of the brain results in characteristic problems, e.g. an injury in the right temporal lobe results in the inability to recognize objects (visual agnosia). A patient with this problem can describe the visual appearance of an object and yet not know what it is. In “The Man who Mistook his Wife for a Hat” by Oliver Sacks, a man with visual agnosia was handed a rose. He described the object as “a convoluted red form with a linear green attachment”, but was unable to recognize it as a rose.

  26. 3) Dissociation. Single dissociation: a lesion in brain structure A disrupts function X but not function Y. This allows one to infer that functions X and Y are partially independent. [Grailog diagram, using a ‘blank node’ for the unnamed lesion instance: classes Lesion, Brain Structure and Function are pairwise-disjoint subclasses of Thing; the lesion instance is located (hasLocation) in A, disrupts X, and – under negation – does not disrupt Y; X and Y are unequal instances of Function.]

  27. Double dissociation: a lesion in brain structure A disrupts function X but not function Y; a lesion in brain structure B disrupts function Y but not function X. This allows one to infer that functions X and Y are mostly independent. [Grailog diagram: analogous to the single-dissociation diagram.]
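The two inference patterns can be stated precisely in code. A minimal sketch in Python (the data and function names are hypothetical, chosen to mirror the A/B/X/Y notation above):

```python
# Hypothetical lesion data: which functions each lesion disrupts.
disrupts = {
    "A": {"X"},   # a lesion in A disrupts X but not Y
    "B": {"Y"},   # a lesion in B disrupts Y but not X
}

def single_dissociation(data, s, x, y):
    """Lesion in structure s disrupts function x but not function y."""
    return x in data[s] and y not in data[s]

def double_dissociation(data, s1, s2, x, y):
    """s1 disrupts x but not y, AND s2 disrupts y but not x."""
    return (single_dissociation(data, s1, x, y)
            and single_dissociation(data, s2, y, x))

assert single_dissociation(disrupts, "A", "X", "Y")
assert double_dissociation(disrupts, "A", "B", "X", "Y")
```

A double dissociation is simply two complementary single dissociations, which is why it licenses the stronger conclusion that X and Y are mostly independent.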

  28. Face recognition module The form recognition module itself contains at least one sub-module: the face recognition module. Double dissociation: some people have a deficit (called prosopagnosia or face blindness) in function X = face recognition, but are good at function Y = object recognition. (People with face blindness can recognize friends and acquaintances by other visual cues, e.g. haircut, glasses, body shape, etc.) Fewer people have a deficit in function Y = object recognition, but are good at function X = face recognition. A brain region A = the face area, and a region that could be called B = the object area, might be involved in these deficits. Noam Sagiv: Understanding Face Blindness http://sciencereview.berkeley.edu/articles.php?issue=1&article=briefs_3

  29. Modules vs. Central Processing According to Fodor, the brain is divided into two types of functional unit: modules and the central processing system. Modules are automatic, fast-acting and unconscious (parts of Freud’s Id?). The central processing system is slow, voluntary and conscious (analogous to Freud’s Ego?). Modules present the results of their internal processing to the central processing system. The central processing system has access to the inputs of many systems, and takes care of the logical relations between the various contents, inputs and outputs. The operation of the central processing system is what you experience: you see and hear the results of the visual and auditory modules, compare these percepts to input from your memory, your imagination, etc., and form conclusions, make decisions, and so on.

  30.
  Modules                         Central processing
  Domain specific                 Global
  Inaccessible                    Accessible
  Informationally encapsulated    Not informationally encapsulated
  Fast                            Slow
  Automatic                       Voluntary
  Innate                          Affected by learning

  31. Ontologies (Formal) ontology: a shared knowledge conceptualization (using a logical formalization). Overview: http://en.wikipedia.org/wiki/Strong_ontology Special cases: • Taxonomy: only subclass-superclass conceptualizations • Partonomy: only subpart-superpart conceptualizations
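The difference between the two special cases can be shown with a toy example. A sketch in Python (the terms and relation names are illustrative, not taken from any particular ontology): a taxonomy and a partonomy over the same domain are kept as separate relations, because "is a" and "is part of" answer different questions.

```python
# Taxonomy: subclass -> superclass ("is a")
sub_class_of = {
    "frontal lobe": "brain lobe",
    "parietal lobe": "brain lobe",
}

# Partonomy: part -> whole ("is part of")
part_of = {
    "frontal lobe": "cerebral cortex",
    "cerebral cortex": "forebrain",
}

def ancestors(relation, term):
    """Walk a relation up to its root; works for either ontology kind."""
    chain = []
    while term in relation:
        term = relation[term]
        chain.append(term)
    return chain

# A frontal lobe *is a* brain lobe, but is *part of* the cortex:
assert ancestors(sub_class_of, "frontal lobe") == ["brain lobe"]
assert ancestors(part_of, "frontal lobe") == ["cerebral cortex", "forebrain"]
```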

  32. Brain ontologies An OWL ontology enriched with rules for brain anatomical structures was developed at the University of Rennes, France Ammar Mechouche, Christine Golbreich, Bernard Gibaud: Semantic description of brain MRI images http://image.ntua.gr/swamm2006/resources/paper13.pdf Within the Foundational Model of Anatomy (FMA) Ontology, a Protege partonomy for brain anatomy was developed at the University of Washington, Seattle, USA Jose Mejino et al.: Challenges in Reconciling Different Views of Neuroanatomy in a Reference Ontology of Anatomy http://sigpubs.biostr.washington.edu/archive/00000214/

  33. Brain ontologies (cont.) Exploration of the Foundational Model of Anatomy’s brain partonomy (hasPart relation): http://www.na-mic.org/Wiki/index.php/Image:Fma.JPG

  34. Brain ontologies (cont.) BrainML (http://www.brainml.org) is an evolving standard XML metaformat to exchange neuroscience data and models. It includes a partonomy for neural structure / anatomy: http://www.brainml.org/viewVocabulary.do?versionID=786

  CNS [central nervous system]
    brain
      forebrain
        cerebral cortex
          frontal lobe/area
            CMA [cingulate motor area]
            dorsal prefrontal cortex
            F1/MI [primary motor area]
            F2/F7 PMd [premotor area (dorsal)]
            F3/F6 SMA/preSMA [(pre-)supplementary motor area]
            F4/F5 PMv [premotor area (ventral)]
            lateral prefrontal cortex
              FEF [frontal eye fields]
            medial prefrontal cortex
              MEF/SEF [medial eye field, supplemental eye field]
            orbitofrontal cortex
          parietal lobe/area
          . . .
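A fragment of a partonomy like the one above can be represented and queried directly. A sketch in Python (the nested-dict encoding and function names are my own, not BrainML's format; only a few of the listed structures are included):

```python
# Fragment of a brain partonomy as a nested dict; leaves are empty dicts.
partonomy = {
    "CNS": {
        "brain": {
            "forebrain": {
                "cerebral cortex": {
                    "frontal lobe/area": {
                        "lateral prefrontal cortex": {"FEF": {}},
                        "medial prefrontal cortex": {"MEF/SEF": {}},
                    },
                    "parietal lobe/area": {},
                },
            },
        },
    },
}

def contains(tree, part):
    """True if `part` appears anywhere in `tree`."""
    return any(name == part or contains(sub, part)
               for name, sub in tree.items())

def has_part(tree, whole, part):
    """Transitive hasPart: is `part` anywhere beneath `whole`?"""
    for name, sub in tree.items():
        if name == whole:
            return contains(sub, part)
        if has_part(sub, whole, part):
            return True
    return False

assert has_part(partonomy, "cerebral cortex", "FEF")
assert not has_part(partonomy, "parietal lobe/area", "FEF")
```

Transitivity of hasPart falls out of the recursive search: the cortex hasPart FEF because the cortex contains the lateral prefrontal cortex, which contains FEF.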

  35. Readings Optional readings: Sterelny, Kim, The Representational Theory of Mind, Chapter 2, pgs. 19-41. More optional readings: Pinker, Steven, The Language Instinct, Ch. 3 “Mentalese”, pgs. 55-82. Review by Mark Alford, 2000: http://alford.fastmail.us/pinker.html Dennett, Daniel, Brainstorms, Ch. 6 “A Cure for the Common Code”, pgs. 90-108. http://cognet.mit.edu/library/books/view?isbn=0262540371
