
In Defense of Contextual Vocabulary Acquisition: How to Do Things with Words in Context

William J. Rapaport
Department of Computer Science & Engineering, Department of Philosophy, and Center for Cognitive Science
State University of New York at Buffalo, Buffalo, NY 14260


Presentation Transcript


  1. In Defense of Contextual Vocabulary Acquisition: How to Do Things with Words in Context. William J. Rapaport, Department of Computer Science & Engineering, Department of Philosophy, and Center for Cognitive Science, State University of New York at Buffalo, Buffalo, NY 14260. http://www.cse.buffalo.edu/~rapaport/CVA/

  2. “The meaning of things lies not in themselves but in our attitudes toward them.” − Antoine de Saint-Exupéry, Wisdom of the Sands (1948)

  3. “The meaning of things lies not in themselves but in our attitudes toward them.” [the slide repeats the quotation with “words” substituted for “things”]

  4. Terminology: Meaning of “Meaning” • “the meaning of a word” vs. “a meaning for a word” • “the” ⇒ single meaning • “of” ⇒ meaning belongs to word • “a” ⇒ many possible meanings • depending on textual context, reader’s prior knowledge, etc. • “for” ⇒ reader constructs meaning & gives it to word

  5. Overview • CVA project: • computational theory of how to “figure out” “a meaning for” an “unfamiliar word” from “context”. • Current status: • Have theory • Have computational implementation • Know that people do it. • Possibly best explanation of how we learn vocabulary • given # of words known (~45K) & # of years to learn them (~18) = ~2.5K words/year • but only taught ~10% in 12 school years • 2 groups of researchers say CVA can’t be done (well) • This talk: Why they’re wrong.
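The slide’s arithmetic, written out as a worked equation (all figures are the slide’s; the per-year teaching rate just spreads its ~10% estimate over 12 school years):

\[
\frac{45{,}000\ \text{words}}{18\ \text{years}} \approx 2{,}500\ \text{words/year},
\qquad
\frac{0.10 \times 45{,}000\ \text{words}}{12\ \text{school years}} \approx 375\ \text{words/year taught}.
\]

That leaves roughly 2,100 words/year that must be acquired some other way, presumably from context.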

  6. Contextual Vocabulary Acquisition • CVA = active, deliberate acquisition of a meaning for a word in a text by reasoning from “context” • context = • “internalized” co-text • roughly: reader’s mental model of surrounding words in text • “integrated” with reader’s prior knowledge • including language knowledge • including previous hypotheses about word’s meaning • but not including external sources (dictionary, humans)

  7. What does ‘brachet’ mean?

  8. (From Malory’s Morte D’Arthur [page # in brackets]) 1. There came a white hart running into the hall with a white brachet next to him, and thirty couples of black hounds came running after them. [66] • As the hart went by the sideboard, the white brachet bit him. [66] • The knight arose, took up the brachet and rode away with the brachet. [66] • A lady came in and cried aloud to King Arthur, “Sire, the brachet is mine”. [66] • There was the white brachet which bayed at him fast. [72] 18. The hart lay dead; a brachet was biting on his throat, and other hounds came behind. [86]

  9. Computational CVA • Based on Karen Ehrlich’s CS Ph.D. dissertation (1995) • Implemented in the SNePS knowledge-representation, reasoning, & acting (KRRA) system • KB: SNePS representation of reader’s prior knowledge • I/P: SNePS representation of word & co-text • Processing: • inferences drawn/belief revision during text input • N & V definition algorithms deductively search this “belief-revised, integrated” KB (the context) for definitional information • O/P: Definition frame • slots (features): classes, structure, actions, properties, etc. • fillers (values): info gleaned from context (= integrated KB)
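A minimal sketch of the definition-frame idea in Python (the actual system is implemented in SNePS/Lisp; every name below is illustrative, not the real implementation): slots are definitional features, fillers are whatever the search of the integrated KB turns up.

  from dataclasses import dataclass, field

  @dataclass
  class DefinitionFrame:
      """Hypothesized definition: slots = features, fillers = values gleaned from context."""
      word: str
      class_inclusions: list = field(default_factory=list)    # e.g., ["animal"]
      possible_actions: list = field(default_factory=list)    # e.g., ["bite buttock", "bay"]
      possible_properties: list = field(default_factory=list) # e.g., ["white", "small"]
      similar_items: list = field(default_factory=list)       # e.g., ["mammal", "pony"]

      def report(self) -> str:
          row = lambda xs: ", ".join(xs) if xs else "?"
          return (f"Definition of {self.word}:\n"
                  f" Class Inclusions: {row(self.class_inclusions)}\n"
                  f" Possible Actions: {row(self.possible_actions)}\n"
                  f" Possible Properties: {row(self.possible_properties)}\n"
                  f" Possibly Similar Items: {row(self.similar_items)}")

  print(DefinitionFrame("brachet", ["phys obj"], [], ["white"],
                        ["animal", "mammal", "deer", "horse", "pony", "dog"]).report())

The fillers above reproduce the slide-11 output; as later sentences are read, the same slots simply accumulate better fillers.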

  10. Cassie learns what “brachet” means:
Background info about: harts, animals, King Arthur, etc.
No info about: brachets
Input: formal-language (SNePS) version of simplified English
A hart runs into King Arthur’s hall.
• In the story, B12 is a hart.
• In the story, B13 is a hall.
• In the story, B13 is King Arthur’s.
• In the story, B12 runs into B13.
A white brachet is next to the hart.
• In the story, B14 is a brachet.
• In the story, B14 has the property “white”.
• Therefore, brachets are physical objects. (deduced while reading; Cassie believes that only physical objects have color)
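The deduction on this slide can be sketched as a single forward-chaining rule over ground assertions (a toy Python triple store; the real system uses SNePS inference, and the relation names here are made up):

  # Beliefs built up while reading, as (subject, relation, object) triples.
  kb = {
      ("B14", "isa", "brachet"),
      ("B14", "property", "white"),
      ("white", "isa", "color"),   # prior knowledge
  }

  def apply_color_rule(kb):
      """Prior-knowledge rule: only physical objects have color.
      So if x has a color property, x's class is a kind of physical object."""
      new = set()
      for (x, rel, prop) in kb:
          if rel == "property" and (prop, "isa", "color") in kb:
              for (x2, rel2, cls) in kb:
                  if x2 == x and rel2 == "isa":
                      new.add((cls, "ako", "phys obj"))   # ako = a-kind-of
      return kb | new

  kb = apply_color_rule(kb)
  assert ("brachet", "ako", "phys obj") in kb   # Cassie's new belief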

  11. --> (defineNoun "brachet") Definition of brachet: Class Inclusions: phys obj, Possible Properties: white, Possibly Similar Items: animal, mammal, deer, horse, pony, dog, I.e., a brachet is a physical object that can be white and that might be like an animal, mammal, deer, horse, pony, or dog

  12. A hart runs into King Arthur’s hall. A white brachet is next to the hart. The brachet bites the hart’s buttock. --> (defineNoun "brachet") Definition of brachet: Class Inclusions: animal, Possible Actions: bite buttock, Possible Properties: white, Possibly Similar Items: mammal, pony,

  13. A hart runs into King Arthur’s hall. A white brachet is next to the hart. The brachet bites the hart’s buttock. The knight picks up the brachet. The knight carries the brachet. --> (defineNoun "brachet") Definition of brachet: Class Inclusions: animal, Possible Actions: bite buttock, Possible Properties: small, white, Possibly Similar Items: mammal, pony,

  14. A hart runs into King Arthur’s hall. A white brachet is next to the hart. The brachet bites the hart’s buttock. The knight picks up the brachet. The knight carries the brachet. The lady says that she wants the brachet. --> (defineNoun "brachet") Definition of brachet: Class Inclusions: animal, Possible Actions: bite buttock, Possible Properties: valuable, small, white, Possibly Similar Items: mammal, pony,

  15. A hart runs into King Arthur’s hall. A white brachet is next to the hart. The brachet bites the hart’s buttock. The knight picks up the brachet. The knight carries the brachet. The lady says that she wants the brachet. The brachet bays at Sir Tor. [background knowledge: only hunting dogs bay] --> (defineNoun "brachet") Definition of brachet: Class Inclusions: hound, dog, Possible Actions: bite buttock, bay, hunt, Possible Properties: valuable, small, white, I.e., a brachet is a hound (a kind of dog) that can bite, bay, and hunt, and that may be valuable, small, and white.

  16. General Comments • System’s behavior ≈ human protocols • System’s definition ≈ OED’s definition: A brachet is “a kind of hound which hunts by scent”

  17. Implementation • SNePS (Stuart C. Shapiro & SNeRG): • Intensional, propositional semantic-network knowledge-representation & reasoning system • Formula-based & path-based reasoning • I.e., logical inference & generalized inheritance • SNeBR belief revision system • Used for revision of definitions • SNaLPS natural-language input/output • “Cassie”: computational cognitive agent

  18. How It Works • SNePS represents: • background knowledge + text information in a single, consolidated semantic network • Algorithms deductively search network for slot-fillers for definition frame • Search is guided by desired slots • E.g., prefers general info over particular info, but takes what it can get

  19. Noun Algorithm Find or infer: • Basic-level class memberships (e.g., “dog”, rather than “animal”) • else most-specific-level class memberships • else names of individuals • Properties of Ns (else, of individual Ns) • Structure of Ns (else …) • Functions of Ns (else …) • Acts that Ns perform (else …) • Agents that perform acts w.r.t. Ns & the acts they perform (else …) • Ownership • Synonyms Else do: “syntactic/algebraic manipulation” • “Al broke a vase” ⇒ a vase is something Al broke • Or: a vase is a breakable physical object
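A sketch of the noun algorithm’s else-chain in Python, continuing the toy triple store from the slide-10 sketch (every relation name is hypothetical; the real algorithm deductively traverses SNePS networks, preferring general information but taking what it can get):

  def find(kb, subj, rel):
      """All objects related to subj by rel in the toy triple-store KB."""
      return [o for (s, r, o) in kb if s == subj and r == rel]

  def define_noun(kb, noun):
      """Each slot falls back to more specific information only when
      nothing more general can be found or inferred."""
      frame = {
          "classes": (find(kb, noun, "basic-level-class")
                      or find(kb, noun, "most-specific-class")
                      or find(kb, noun, "named-individual")),
          "properties": find(kb, noun, "property"),
          "structure":  find(kb, noun, "part"),
          "functions":  find(kb, noun, "function"),
          "acts":       find(kb, noun, "act"),
          "ownership":  find(kb, noun, "owner"),
          "synonyms":   find(kb, noun, "synonym"),
      }
      if not any(frame.values()):
          # Last resort, "syntactic/algebraic manipulation":
          # "Al broke a vase" => a vase is something Al broke.
          frame["gloss"] = f"a {noun} is something the text predicates things of"
      return frame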

  20. Verb Algorithm • Infer: • properties of V • superclasses of V • transitivity information • similar actions (& delete dissimilar actions) • Conceptual-Dependency category • info about manner of V (“from”/“to”, transfer kind, instrument) • causes & effects • Also return class membership of: • agent • object • indirect object • instrument • [Also: preliminary work on adjective algorithm]
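The verb algorithm admits the same kind of sketch (again with hypothetical relation names, and reducing Conceptual-Dependency categorization to a stored label):

  def find(kb, subj, rel):   # same helper as in the noun sketch
      return [o for (s, r, o) in kb if s == subj and r == rel]

  def define_verb(kb, verb):
      """Collect information about the act itself, then the classes of
      its case-role fillers (agent, object, indirect object, instrument)."""
      frame = {
          "superclasses": find(kb, verb, "superclass"),
          "transitivity": find(kb, verb, "transitivity"),
          "cd-category":  find(kb, verb, "conceptual-dependency"),
          "causes":       find(kb, verb, "cause"),
          "effects":      find(kb, verb, "effect"),
      }
      for role in ("agent", "object", "indirect-object", "instrument"):
          fillers = find(kb, verb, role)
          frame[role + "-class"] = [c for f in fillers for c in find(kb, f, "class")]
      return frame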

  21. Ongoing Research:From Algorithm to Curriculum • more robust algorithms • better N coverage needed • much better V coverage needed • no general adjective/adverb coverage yet • need “internal” context (morphology, etc.) • need NL interface • need acting component • need curriculum • CVA taught, but not well (emphasis on “guessing”) • we have explicit, teachable theory of how to do CVA • joint work w/ Michael Kibby, UB/LAI/Reading Clinic

  22. State of the Art: Vocabulary Learning Some dubious contributions: • Mueser 1984: “Practicing Vocabulary in Context” • BUT: “context” = definition!! • Clarke & Nation 1980: a “strategy” (algorithm?): • Look at word & context; determine POS • Look at grammatical context • E.g., “who does what to whom”? • Look at wider context • [E.g., search for Sternberg-like clues] • Guess the word; check your guess

  23. CVA: From Algorithm to Curriculum • “guess the word” = “then a miracle occurs” • Surely, we computer scientists can “be more explicit”!

  24. Terminology: “Guessing”? • Does reader … • “guess” a meaning?! • not computational! • “deduce” a meaning? • too restrictive; ignores other kinds of inference • “infer” a meaning? • too vague; ignores other kinds of reasoning (cf. Herbert Simon) • “figure out” a meaning? • just vague enough? • My preference: • The reader computes a meaning!

  25. Terminology: Co(n)text • “co-text” or “textual context” = surrounding words • “context” or “wide context” = • internalized co-text … • … integrated with … • … reader’s prior knowledge • “internalized” ≈ “mental model of” • involves local interpretation (cf. McKoon & Ratcliff) • pronoun resolution, simple inferences (e.g., proper names) • & global interpretation (“full” use of available PK) • can involve misinterpretation (see later slide) • “integrated” via belief revision: • new beliefs added by inference from text + prior knowledge • old beliefs removed (usually from prior knowledge base)
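A toy sketch of the integration-via-belief-revision step (Python; contradiction detection is reduced here to a direct negation check, whereas SNeBR does full assumption-based revision):

  def negate(p):
      """Toy negation: ('not', p) <-> p."""
      return p[1] if p[0] == "not" else ("not", p)

  def integrate(kb, new_beliefs):
      """Add beliefs internalized/inferred from the text; retract any prior
      belief they directly contradict (the old belief usually loses)."""
      for belief in new_beliefs:
          kb.discard(negate(belief))   # remove the contradicted old belief
          kb.add(belief)
      return kb

  kb = {("not", ("brachets", "bay"))}         # a mistaken prior belief
  kb = integrate(kb, {("brachets", "bay")})   # text evidence wins
  assert ("brachets", "bay") in kb and ("not", ("brachets", "bay")) not in kb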

  26.–32. [Diagram sequence: two regions, the reader’s Prior Knowledge (propositions PK1–PK4) and the Text (sentences T1, T2, T3). Each sentence Ti is internalized into the knowledge base as I(Ti); inference over I(Ti) plus prior knowledge adds new propositions (P5, P6, …), yielding the belief-revised (“B-R”) integrated KB.]

  33. Note: All “contextual” reasoning is done in this “context”: the B-R integrated KB.

  34. On Misinterpretation • Sign seen on truck parked outside of cafeteria at Student Union: Mills Wedding and Specialty Cakes

  35. On Misinterpretation • Sign seen on truck parked outside of cafeteria at Student Union: Mills Welding and Specialty Gases

  36. CVA as Science & Detection • CVA = hypothesis generation & testing • scientific task: • develop theory of word meaning • not guessing, but… • “In science, guessing is called ‘hypothesis formation’ ” (Loui) • detective work: • finding clues • not “who done it”, but “what does it mean” • susceptible to revision upon further evidence

  37. 2 Problematic Assumptions • CVA assumes that: • reader is consciously aware of the unfamiliar word • reader notes its unfamiliarity • CVA assumes that, between encounters: • reader remembers the word • reader remembers hypothesized meaning

  38. I. Are All Contexts Created Equal? • Beck, Isabel L.; McKeown, Margaret G.; & McCaslin, Ellen S. (1983), • “Vocabulary Development: Not All Contexts Are Created Equal” • Elementary School Journal 83(3): 177-181. • “it is not true that every context is an appropriate or effective instructional means for vocabulary development”

  39. Role of Prior Knowledge • Beck et al.: • co-text “can give clues to the word’s meaning” • But “clue” is relative: • clues need other info to be interpreted as clues • Implication A1: • textual clues need to be supplemented with other information to compute a meaning. • Supplemental info = reader’s prior knowledge • has to be available to reader • will be idiosyncratic

  40. Do Words Have Unique, Correct Meanings? • Beck et al. (& others) assume: • A2: A word has a unique meaning • A3: A word has a correct meaning • Contra “unique”: A word’s meaning varies with: • co-text • reader(’s prior knowledge) • time of reading • “Correct” is a red herring (in any case, it’s fishy): • Possibly, words have author-intended meanings • but these need not be determined by co(n)text • Misunderstandings are universally unavoidable • Perfect understanding/dictionary definition not needed • “satisficing” understanding for passage comprehension suffices • reader always has opportunity of revising definition hypothesis

  41. Beck et al.’sCategories of Textual Contexts • What kinds of co-texts are helpful? • But keep in mind that we have different goals: • Beck et al.: • use co-text to teach “correct” word meanings • CCVA: • use context to compute word meaning for understanding

  42. Beck et al.’s Textual Context Categories: Top-Level Kinds of Co-Text • Pedagogical co-texts: • artificially constructed, designed for teaching • only example is for a verb: • “All the students made very good grades on the tests, so their teacher commended them for doing so well.” • Natural co-texts: • “not intended to convey the meaning of a word” • 4 kinds (actually, a continuum)

  43. Beck et al.’s Textual Context Categories 1. Misdirective (Natural) Co-Texts • “seem to direct student to incorrect meaning for a word” • sole example: • “Sandra had won the dance contest and the audience’s cheers brought her to the stage for an encore. ‘Every step she takes is so perfect and graceful,’ Ginny said grudgingly, as she watched Sandra dance.” • [[grudgingly]] =? admiringly • Is this a natural context? • Is this all there is to it? • A4: Co-texts have a fixed, usually small size • But larger co-text might add information • Prior knowledge can widen the co(n)text • ‘grudgingly’ is an adverb! • A5: All words are equally easy to learn • But N easier than V, V easier than Adj/Adv! (Granger, Gentner, Gleitman, …) • A6: Only 1 co-text can be used. • But later co-texts can assist in refining meaning

  44. Beck et al.’s Textual Context Categories 2. Nondirective (Natural) Co-Texts • “of no assistance in directing the reader toward any particular meaning for a word” • sole example is for an adjective: • “Dan heard the door open and wondered who had arrived. He couldn’t make out the voices. Then he recognized the lumbering footsteps on the stairs and knew it was Aunt Grace.” • But: • Is it natural? • What about larger co-text? • An adjective! • Of no assistance? (see next slide)

  45. Syntactic Manipulation • Do misdirective & nondirective contexts yield no (or only incorrect) information? • Cf. algebraic manipulation (brings x into focus): • 2x + 1 = 7 • x = (7 − 1)/2 = 6/2 = 3 • Syntactic manipulation (brings hard word into focus): • “ ‘Every step she takes is so perfect and graceful,’ Ginny said grudgingly.” • ‘Grudgingly’ is the way that Ginny said “…” • So, ‘grudgingly’ is a way of saying something • In particular, ‘grudgingly’ is a way of (apparently) praising someone’s performance • “he recognized the lumbering footsteps on the stairs” • ‘lumbering’ is a property of footsteps on stairs
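A toy pattern-matching sketch of the syntactic manipulation (a fixed regex template in Python, illustrative only; the real system manipulates semantic-network representations, not strings):

  import re

  def focus_adverb(sentence):
      """Bring the hard adverb into focus:
      'X said "Q" ADV-ly'  =>  'ADV-ly is a way of saying "Q"'."""
      m = re.match(r'(?P<speaker>\w+) said (?P<quote>".*?") (?P<adv>\w+ly)\.?$', sentence)
      if m:
          return f'{m.group("adv")} is a way of saying {m.group("quote")}'
      return None

  print(focus_adverb('Ginny said "Every step she takes is so perfect and graceful" grudgingly.'))
  # -> grudgingly is a way of saying "Every step she takes is so perfect and graceful"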

  46. Beck et al.’s Textual Context Categories 3. General (Natural) Co-Texts • “provide enough information for reader to place word in a general category” • sole example is for an adjective: • “Joe and Stan arrived at the party at 7:00. By 9:30 the evening seemed to drag for Stan. But Joe really seemed to be having a good time at the party. ‘I wish I could be as gregarious as he is,’ thought Stan.” • Same problems, but: • adjective is contrasted with Stan’s attitude • contrasts are good (so are parallel constructions)

  47. Beck et al.’s Textual Context Categories 4. Directive (Natural) Co-Texts • “seem likely to lead the student to a specific, correct meaning for a word” • sole example is for a noun: • “When the cat pounced on the dog, he leapt up, yelping, and knocked over a shelf of books. The animals ran past Wendy, tripping her. She cried out and fell to the floor. As the noise and confusion mounted, Mother hollered upstairs, ‘What’s all the commotion?’ ” • Natural? Long! • Noun! • Note that the sole example of a directive context is a noun, suggesting that it might be the word that makes a context directive

  48. Beck et al.’s Experiment • S’s given passages from basal readers • Researchers categorized co-texts & blacked out words • S’s asked to “fill in the blanks with the missing words or reasonable synonyms” • Results confirm 4 co-text types • Independently of results, there are methodological questions: • Are basal readers natural contexts? • How large were co-texts? • Instruction on how to do CVA? • A7: CVA “comes naturally”, so needs no training • A8: Fill-in-the-blank tasks are a form of CVA • No, they’re not! (see next slide)

  49. Beck et al.’s Experiment: CVA, Neologisms, & Fill-in-the-Blank • Serious methodological problem for all of us: • What if S knows the unknown word? • Filter out such S’s and words? • hard to do; what about testing familiar words? • Replace word with made-up “neologism”? • must be carefully chosen • Replace word with blank? • both kinds of replacement mislead S to find “correct missing/hidden word” • ≠ CVA! • Our (imperfect) solution: • use plausible-sounding neologism • tell S it’s like a foreign word with no English equivalent, hence need a descriptive phrase

  50. Beck et al.’s Conclusion • “less skilled readers … receive little benefit from” CVA • A9: CVA can only help in learning correct meanings. • But: • CVA uses same techniques as general reading comprehension: • careful, slow reading • careful analysis of text • directed search for information useful for computing a meaning • application of relevant prior knowledge • application of reasoning for purpose of extracting information from text • CVA, if properly taught & practiced, can improve general reading comprehension
