
INTRODUCTION TO ARTIFICIAL INTELLIGENCE


Presentation Transcript


  1. INTRODUCTION TO ARTIFICIAL INTELLIGENCE • Massimo Poesio • LECTURE 8: Concepts in the brain

  2. USING BRAIN DATA TO IDENTIFY CATEGORY DISTINCTIONS • Studies of brain-damaged patients have provided useful insights into the organization of conceptual knowledge in the brain • Some patients are unable to identify or name man-made objects, while others are unable to identify or name natural kinds (such as animals) • Warrington & Shallice 1984, Caramazza & Shelton 1998 • fMRI has been used to identify these distinctions in healthy participants as well • E.g., Haxby et al 2000, Martin & Chao 2003 • See, e.g., Mahon & Caramazza 2011, Martin 2007 for reviews

  3. Warrington & Shallice 1984 • Warrington and Shallice (1984) reported a patient called JBR who following an acute lesion to the left temporal lobe (as a result of herpes encephalitis) had a selective deficit when asked to name pictures from just one semantic category – living things. • By contrast JBR was able to name non-living objects very well including those with low frequency names such as ‘accordion’ that were matched for the number of letters in the name and the visual complexity of the object. • Other patients have shown opposite pattern

  4. Evidence from semantic category deficits • Category-specific deficits (e.g., Warrington & McCarthy, 1983, 1987; Warrington & Shallice, 1984; Gainotti & Silveri, 1996) • Patients show impairments in processing living things vs. man-made objects and vice versa. • Interesting exceptions: fruits, vegetables & other foods; musical instruments • Modality-specific deficits • Patients are unable to name visually presented objects, but can name them from other modalities and can access other semantic information about visually presented stimuli (Beauvois, 1982) • Other visual processing is fine.

  5. A PET Study on categories (Nature 1996)

  6. Study • 16 adults (8M, 8F) participated in a PET (positron emission tomography) study • PET involves injecting the subject with a positron-emitting radioactive tracer • Regions with more metabolic activity absorb more of the tracer and thus emit more positrons • Positron-electron collisions yield gamma rays, which are detected • Increases in rCBF (regional cerebral blood flow) were measured while subjects viewed line drawings of animals and tools.

  7. The experiment • Subjects looked at pictures of animals and tools and named them silently • They also looked at noise patterns (baseline 1) • And at novel nonsense objects (baseline 2) • Each stimulus was presented for 180 ms, followed by a fixation cross for 1820 ms • Drawings were controlled for name frequency and category typicality

  8. [Brain activation figure: left middle temporal gyrus, ACC, premotor cortex]

  9. [Brain activation figure: calcarine sulcus]

  10. Conclusions • Both animal and tool naming activate the ventral temporal lobe • Tools differentially activate the ACC, premotor cortex and left middle temporal region (known to be related to processing action words) • Naming animals differentially activated the left medial occipital lobe (early visual processing) • The object categories appear to be represented in distributed circuits that activate the different salient aspects of each category.

  11. REPRESENTATION OF CONCEPTS IN THE BRAIN: COMPETING HYPOTHESES • Unitary Content Hypothesis • Semantic information is stored in an abstract, amodal format organized by category. • Multiple Semantics Hypothesis • Semantic information is stored in many modality-specific semantic subsystems. Information in each subsystem is stored in a modality specific format. • Our intuitive sense of information being organized by categories is based on strong connections between related parts of these modality-specific semantic systems.

  12. Unitary Content Hypothesis (UCH) (Caramazza et al., 1990; Caramazza & Shelton, 1998; Riddoch et al., 1988; Pylyshyn, 1973)

  13. Multiple Semantics Hypothesis (MSH) (Paivio, 1971; Beauvois, 1982; Shallice, 1987, 1988; McCarthy & Warrington, 1988)

  14. Representation of words in semantic memory: the Functional Web hypothesis • A word is represented in the cortex as a functional web • Spread over a wide area of cortex • Includes perceptual information as well as specifically conceptual information • For nominal concepts, the conceptual information is mainly in the angular gyrus (?), for some in the middle temporal gyrus (?), for some in the supramarginal gyrus • Plus phonological information

  15. Example: the concept DOG • We know what a dog looks like – visual information, in the occipital lobe • We know what its bark sounds like – auditory information, in the temporal lobe • We know what its fur feels like – somatosensory information, in the parietal lobe • All of the above constitute perceptual information, are subwebs with many nodes each, and have to be interconnected into a larger web, along with further web structure for conceptual information

  16. Building a model of a functional web: first steps • Each node in this diagram represents the cardinal node* of a subweb of properties • [Diagram: cardinal nodes labelled C, T, M, V] • *Cardinal node: to be defined in a moment!

  17. Add phonological recognition • Example: FORK • Labels for properties: C – Conceptual, M – Motor, P – Phonological image, T – Tactile, V – Visual • [Diagram: cardinal nodes C, T, M, P, V] • These are all cardinal nodes – each is supported by a subweb • P is the phonological image of the spoken form [fork] (in Wernicke’s area)

  18. Add node in primary auditory area • Example: FORK • Labels for properties: C – Conceptual, M – Motor, P – Phonological image, PA – Primary Auditory, T – Tactile, V – Visual • [Diagram: cardinal nodes C, T, M, P, PA, V] • Primary Auditory: the cortical structures in primary auditory cortex that are activated when the ears receive the vibrations of the spoken form [fork]

  19. Add node for phonological production • Example: FORK • Labels for properties: C – Conceptual, M – Motor, P – Phonological image, PA – Primary Auditory, PP – Phonological Production, T – Tactile, V – Visual • [Diagram: cardinal nodes C, T, M, P, PP, PA, V; the arcuate fasciculus links P and PP]

  20. Part of the functional web for DOG (showing cardinal nodes only) • Each node shown here is the cardinal node of a subweb – for example, V is the cardinal node of the visual subweb • [Diagram: cardinal nodes C, T, M, P, PP, PA, V]

  21. An activated functional web (with two subwebs partly shown) • Labels: C – Cardinal concept node, M – Memories, PA – Primary auditory, PP – Phonological production, PR – Phonological recognition, T – Tactile, V – Visual • [Diagram: activated web, with the visual-features subweb and one other subweb partly shown]
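The functional-web picture in slides 14–21 is essentially a graph: each word gets a set of cardinal nodes (one per modality-specific subweb) plus links along which activation can spread. The sketch below encodes the DOG example that way; the region assignments and the linking scheme (everything routed through C, with P linked to PA and PP) are illustrative assumptions, not claims about actual cortical wiring.

```python
# Illustrative sketch of a functional web as a graph of cardinal nodes.
# Node labels follow the slides (C, V, T, M, P, PA, PP); regions and links
# are assumptions made for illustration only.
from dataclasses import dataclass, field

@dataclass
class CardinalNode:
    label: str                                   # e.g. "V" for the visual subweb
    region: str                                  # cortical area assumed to host the subweb
    features: set = field(default_factory=set)   # members of the subweb

def build_dog_web():
    nodes = {
        "C":  CardinalNode("C",  "association cortex", {"animal", "pet"}),
        "V":  CardinalNode("V",  "occipital lobe",     {"four legs", "fur", "tail"}),
        "PA": CardinalNode("PA", "primary auditory",   {"sound of spoken [dog]"}),
        "P":  CardinalNode("P",  "Wernicke's area",    {"phonological image [dog]"}),
        "PP": CardinalNode("PP", "Broca's area",       {"articulation of [dog]"}),
        "T":  CardinalNode("T",  "parietal lobe",      {"feel of fur"}),
        "M":  CardinalNode("M",  "motor cortex",       {"petting", "throwing a ball"}),
    }
    # Undirected links between cardinal nodes; activation can spread along them.
    links = {("C", "V"), ("C", "T"), ("C", "M"), ("C", "P"),
             ("P", "PA"), ("P", "PP")}            # P-PP roughly the arcuate fasciculus
    return nodes, links

def spread_activation(start, links, steps=2):
    """Return the cardinal nodes reachable from `start` in `steps` hops."""
    active = {start}
    for _ in range(steps):
        active |= {b for a, b in links if a in active}
        active |= {a for a, b in links if b in active}
    return active

nodes, links = build_dog_web()
# Hearing the spoken word activates PA first; spreading activation then
# reaches the phonological image, the cardinal concept node, and so on.
print({n: nodes[n].region for n in spread_activation("PA", links)})
```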

  22. FROM WORDNET TO BRAINNET • Neural evidence, unlike the evidence used to compile dictionaries and WordNet, and like the evidence one gathers from corpora and certain behavioral experiments, is entirely objective (although it can be subjective in the sense of differing from subject to subject) • The objective of our research is to combine evidence from brain data, from corpora, and from behavioral experiments (all of which is rather noisy) to develop a new architecture for conceptual knowledge: BrainNet

  23. A CASE STUDY: ABSTRACT CONCEPTS • Until recently, most work on concepts in CL / neuroscience / psychology focused on concrete concepts • But the type of conceptual knowledge that really challenges traditional assumptions about its organization is ‘abstract concepts’ – or, to be more precise, the set of categories of non-concrete concepts • Events / actions • States • ‘Urabstract’ concepts: LAW, JUSTICE, ART • We are carrying out explorations of abstract knowledge using fMRI (Anderson et al 2012a, 2012b, 2013, submitted)

  24. THEORIES OF ABSTRACT CONCEPTS IN AI AND COGNITIVE (NEURO)SCIENCE • In CL/AI: TAXONOMIC organization for both abstract and concrete concepts • ‘UPPER ONTOLOGIES’, e.g., DOLCE • In psychology: ‘concreteness’ scale • Best known in cognitive neuroscience: Paivio’s DUAL CODE theory (Paivio, 1986) • CONCRETE: verbal system & visual system • ABSTRACT: verbal system only • Schwanenflugel & Akin 1994: CONTEXT AVAILABILITY • Barsalou’s SCENARIO-BASED MODEL (Barsalou, 1999): abstract knowledge organized around SCENARIOS

  25. The DOLCE UPPER ONTOLOGY
      PT Particular
        ED Endurant
          PED Physical Endurant
            M Amount of Matter
            F Feature
            POB Physical Object
              APO Agentive Physical Object
              NAPO Non-agentive Physical Object
          NPED Non-physical Endurant
            NPOB Non-physical Object
              MOB Mental Object
              SOB Social Object
                ASO Agentive Social Object
                  SAG Social Agent
                  SC Society
                NASO Non-agentive Social Object
          AS Arbitrary Sum
        PD Perdurant
          EV Event
            ACH Achievement
            ACC Accomplishment
          STV Stative
            ST State
            PRO Process
        Q Quality
          TQ Temporal Quality
            TL Temporal Location
          PQ Physical Quality
            SL Spatial Location
          AQ Abstract Quality
        AB Abstract
          Fact
          Set
          R Region
            TR Temporal Region
              T Time Interval
            PR Physical Region
              S Space Region
            AR Abstract Region
      (… further subcategories omitted)
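To make the taxonomic reading used in CL/AI concrete, here is a small sketch that encodes a fragment of the hierarchy above as a child-to-parent map and walks it to find a category's ancestors. The encoding is illustrative and covers only part of the tree.

```python
# A fragment of the DOLCE upper ontology as a child -> parent map.
# Only a subset of the categories from the slide is encoded here.
DOLCE_PARENT = {
    "ED": "PT", "PD": "PT", "Q": "PT", "AB": "PT",
    "PED": "ED", "NPED": "ED", "AS": "ED",
    "POB": "PED", "APO": "POB", "NAPO": "POB",
    "EV": "PD", "STV": "PD",
    "ACH": "EV", "ACC": "EV", "ST": "STV", "PRO": "STV",
    "TQ": "Q", "PQ": "Q", "AQ": "Q",
    "R": "AB", "TR": "R", "PR": "R", "AR": "R",
}

def ancestors(cat):
    """Walk up the taxonomy from `cat` to the root PT (Particular)."""
    chain = []
    while cat in DOLCE_PARENT:
        cat = DOLCE_PARENT[cat]
        chain.append(cat)
    return chain

print(ancestors("ST"))            # ['STV', 'PD', 'PT'] -> a State is a Perdurant
print("PD" in ancestors("ACC"))   # True: Accomplishments are Perdurants too
```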

  26. THE OBJECTIVES OF OUR EXPERIMENT • Identify the representation in the brain of a variety of WordNet categories exemplifying both concrete and abstract concepts (abstract words chosen by inspecting the words rated as most abstract in the De Rosa et al 2005 norms) • Really abstract: ATTRIBUTE, COMMUNICATION, EVENT, LOCATION, ‘URABSTRACT’ • A category of concrete objects: TOOLS • A complex category: SOCIAL-ROLE • Comparing two types of classification: • TAXONOMIC (as in WordNet) • DOMAIN (cf. Barsalou’s hypothesis about abstract concepts being ‘situated’) • Two domains: LAW and MUSIC • Using WordNet Domains

  27. STIMULI

  28. STIMULI, 2: URABSTRACTS

  29. STIMULI, 3: SOCIAL ROLES

  30. THE OBJECTIVES OF OUR EXPERIMENT (recap of slide 26)

  31. ABSTRACT CONCEPTS: DATA COLLECTION AND ANALYSIS • 7 right-handed native speakers of Italian • Task: • Words presented in white on a grey screen for 10 sec • Fixation cross in between, for 7 sec • Subjects had to think of a situation in which the word applied • Scanner: 4T Bruker MedSpec MRI scanner, EPI pulse sequence • TR = 1000 ms, TE = 33 ms, 26° flip angle • Voxel dimensions 3 mm × 3 mm × 5 mm • Preprocessing: UCL’s Statistical Parametric Mapping (SPM) software • Data corrected for head motion • Classification: a single-layer neural network
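The slide only names the classifier (a single-layer neural network); as a rough illustration, the sketch below trains such a classifier – a single softmax layer with a cross-entropy loss – on stand-in voxel data. The array shapes, learning rate and number of iterations are placeholders, not the study's actual settings.

```python
import numpy as np

# Hypothetical data: one row per trial (word presentation), one column per voxel.
rng = np.random.default_rng(0)
n_trials, n_voxels, n_classes = 70, 1000, 7      # e.g. 7 taxonomic classes
X = rng.normal(size=(n_trials, n_voxels))        # stand-in for preprocessed fMRI features
y = rng.integers(0, n_classes, size=n_trials)    # stand-in for class labels

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Single-layer network: weight matrix W and bias b, trained by gradient descent.
W = np.zeros((n_voxels, n_classes))
b = np.zeros(n_classes)
Y = np.eye(n_classes)[y]                         # one-hot targets
lr = 0.01
for _ in range(200):
    P = softmax(X @ W + b)                       # forward pass
    grad = (P - Y) / n_trials                    # gradient of the cross-entropy loss
    W -= lr * X.T @ grad
    b -= lr * grad.sum(axis=0)

pred = np.argmax(softmax(X @ W + b), axis=1)
print("training accuracy:", (pred == y).mean())  # a real analysis scores held-out trials
```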

  32. MAIN QUESTIONS • Can the taxonomic and domain classes be distinguished from the fMRI data? • Is there a difference in classification accuracy between taxonomy and domain? • Can the taxonomic and domain classes be predicted across participants?
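For the third question, the usual test is leave-one-participant-out cross-validation: train on all participants but one, test on the held-out one, and rotate. Below is a minimal scikit-learn sketch of that scheme; the data arrays and the logistic-regression classifier are placeholders, and real cross-participant decoding additionally requires the voxel features to be aligned to a common space.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

# Hypothetical data: 7 participants x 70 trials each, 1000 voxel features per trial.
rng = np.random.default_rng(0)
n_subj, n_trials, n_voxels = 7, 70, 1000
X = rng.normal(size=(n_subj * n_trials, n_voxels))
y = rng.integers(0, 7, size=n_subj * n_trials)       # taxonomic class labels
groups = np.repeat(np.arange(n_subj), n_trials)      # which participant each trial came from

# Train on 6 participants, test on the 7th, rotating through all of them.
clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, X, y, groups=groups, cv=LeaveOneGroupOut())
print("per-participant accuracy:", scores.round(2), "mean:", scores.mean().round(2))
```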

  33. RESULTS WITHIN PARTICIPANTS (CATEGORY DISTINCTIONS) • All categorical distinctions can be predicted above chance • There are significant differences between categories

  34. RESULTS WITHIN PARTICIPANTS (DOMAIN)

  35. WITHIN-PARTICIPANTS RESULTS SUMMARY • Both taxonomic and domain distinctions can be discriminated with accuracy well above chance • Easiest categories to recognize: TOOL, ATTRIBUTE, LOCATION • Then: SOCIAL-ROLE, COMMUNICATION • Main confusions: COMMUNICATION / EVENT
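The "main confusions" are read off a confusion matrix of predicted against true categories; here is a quick sketch of how one is computed (the trial labels below are made up for illustration).

```python
from sklearn.metrics import confusion_matrix

labels = ["tool", "attribute", "location", "social-role",
          "communication", "event", "urabstract"]
# Made-up true/predicted labels for a handful of trials, for illustration only.
y_true = ["communication", "event", "tool", "attribute", "communication", "event"]
y_pred = ["event", "communication", "tool", "attribute", "event", "event"]

cm = confusion_matrix(y_true, y_pred, labels=labels)
# Rows = true class, columns = predicted class; off-diagonal cells are confusions.
# E.g. cm[labels.index("communication"), labels.index("event")] counts
# communication trials that were classified as event.
print(cm)
```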

  36. CATEGORY LOCALIZATION IN THE BRAIN • [Brain maps. Red: Attribute, Blue: Tool, Green: Location; overlaps: R+G = Yellow, G+B = Cyan, R+B = Pink, R+G+B = White]

  37. [Brain maps. Panel 1 – Red: Social-role, Green: Communication, Blue: Event. Panel 2 – Red: Social-role, Green: Attribute, Blue: Urabstract. Overlaps: R+G = Yellow, G+B = Cyan, R+B = Pink, R+G+B = White]

  38. CROSS-PARTICIPANTS RESULTS SUMMARY • The concrete categories TOOL and LOCATION can be predicted across participants; ATTRIBUTE can also be classified significantly above chance, but the less concrete classes become conflated with ATTRIBUTE • In general, DOMAIN can be predicted across participants; however, domain membership is classified much better within the most abstract taxonomic classes (ATTRIBUTE, COMMUNICATION and URABSTRACT)

  39. TAXONOMIC / DOMAIN ORGANIZATION (stimuli by taxonomic class and domain; Italian word with English gloss)
      ATTRIBUTE
        LAW: giurisdizione (jurisdiction), cittadinanza (citizenship), impunita' (impunity), legalita' (legality), illegalita' (illegality)
        MUSIC: sonorita' (sonority), ritmo (rhythm), melodia (melody), tonalita' (tonality), intonazione (pitch)
      COMMUNICATION
        LAW: divieto (prohibition), verdetto (verdict), ordinanza (decree), addebito (accusation), ingiunzione (injunction)
        MUSIC: canzone (song), pentagramma (stave), ballata (ballad), ritornello (refrain), sinfonia (symphony)
      EVENT
        LAW: arresto (arrest), processo (trial), reato (crime), furto (theft), assoluzione (acquittal)
        MUSIC: concerto (concert), recital (recital), assolo (solo), festival (festival), spettacolo (show)
      SOCIAL-ROLE
        LAW: giudice (judge), ladro (thief), imputato (defendant), testimone (witness), avvocato (lawyer)
        MUSIC: musicista (musician), cantante (singer), compositore (composer), chitarrista (guitarist), tenore (tenor)
      TOOL
        LAW: manette (handcuffs), toga (robe), manganello (truncheon), cappio (noose), grimaldello (skeleton key)
        MUSIC: violino (violin), tamburo (drum), tromba (trumpet), metronomo (metronome), radio (radio)
      LOCATION
        LAW: tribunale (court/tribunal), carcere (prison), questura (police station), penitenziario (penitentiary), patibolo (gallows)
        MUSIC: palco (stage), auditorium (auditorium), discoteca (disco), conservatorio (conservatory), teatro (theatre)
      URABSTRACT
        LAW: giustizia (justice), liberta' (liberty), legge (law), corruzione (corruption), refurtiva (loot)
        MUSIC: musica (music), blues (blues), jazz (jazz), canto (singing), punk (punk)

  40. WHAT THE DATA SUGGESTS

  41. READINGS • Binder & Desai (2011), The neurobiology of semantic memory, Trends in Cognitive Sciences (on the website)
