
Anaphora, Discourse and Information Structure



Presentation Transcript


  1. Anaphora, Discourse and Information Structure Oana Postolache oana@coli.uni-sb.de EGK Colloquium April 29, 2004

  2. Overview • Anaphora Resolution • Discourse (parsing) • Balkanet • Information Structure Joint work with Prof. Dan Cristea & Prof. Dan Tufis; Univ. of Iasi

  3. Anaphora Resolution “If an incendiary bomb drops next to you, don’t lose your head. Put it in a bucket and cover it with sand.” Ruslan Mitkov (p.c.)

  4. Anaphora Resolution “Anaphora represents the relation between a term (named anaphor) and another (named antecedent), when the interpretation of the anaphor is somehow determined by the interpretation of the antecedent”. Barbara Lust, Introduction to Studies of Anaphora Acquisition, D. Reidel, 1986

  5. Anaphora Resolution Types • Coreference resolution: the anaphor and the antecedent refer to the same entity in the real world. Three blind mice, three blind mice. See how they run! See how they run! • Functional anaphora resolution: the anaphor and the antecedent refer to two distinct entities that are in a certain relation. When the car stopped, the driver got scared. Halliday & Hasan 1976

  6. Types of Coreference • Pronominal coreference: The butterflies were dancing in the air. They offered an amazing coloured show. • Common nouns with different lemmas: Amenophis the IVth's wife was looking through the window. The beautiful queen was sad. • Common nouns with different lemmas and number: A patrol was marching in the street. The soldiers were very well trained. • Proper names: The President of the U.S. gave a very touching speech. Bush talked about the antiterrorist war. • Appositions: Mrs. Parsons, the wife of a neighbour on the same floor, was looking for help. • Nominal predicates: Maria is the best student of the whole class. • Function-value coreference: The visitors agreed on the ticket price. They concluded that $100 was not that much.

  7. RARE – Robust Anaphora Resolution Engine (architecture diagram: the input text is processed by RARE, parameterised by one of several AR-models (AR-model1, AR-model2, AR-model3), to produce coreference chains)

  8. RARE: Two main principles • 1. Coreferential relations are semantic, not textual. (diagram: referential expressions a and b are linked by a coreferential anaphoric relation on the text layer; on the semantic layer, a proposes a center and b evokes the same center)

  9. RARE: Two main principles • 2. Processing is incremental. (diagram: on an intermediate projection layer, REa projects PSa and REb projects PSb; PSa proposes a center on the semantic layer, which PSb later evokes)

  10. Terminology • text layer: reference expressions (REa, REb, REc, REd, REx) • projection layer: projected structures (PSx) • semantic layer: discourse entities (DE1, …, DEj, …, DEm)

  11. What is an AR-model? • knowledge sources • primary attributes • heuristics/rules • domain of referential accessibility (diagram: the three layers of slide 10, with the AR-model components operating between them)

  12. Primary attributes • Morphological (number, lexical gender, person) • Syntactic (REs as constituents of a syntactic tree, quality of being adjunct, embedded or complement of a preposition, inclusion or not in an existential construction, syntactic patterns in which the RE is involved) • Semantic and lexical (RE’s head position in a conceptual hierarchy, animacy, sex/natural gender, concreteness, inclusion in a synonymy class, semantic roles) • Positional (RE’s offset in the text, inclusion in a discourse unit) • Surface realisation (zero/clitic/full/reflexive/possessive/ demonstrative/reciprocal pronoun, expletive “it”, bare noun, indefinite NP, definite NP, proper noun) • Other (domain concept, frequency of the term in the text, occurrence of the term in a heading)
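The attribute inventory above can be pictured as a simple record attached to each projected structure. The following sketch is purely illustrative: the field names and types are assumptions for exposition, not RARE's actual data model.

```python
# Illustrative record for a projected structure (PS) carrying a subset of the
# primary attributes listed on slide 12. All names here are assumptions.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ProjectedStructure:
    # Morphological
    number: Optional[str] = None        # e.g. "sg" / "pl"
    gender: Optional[str] = None
    person: Optional[int] = None
    # Surface realisation
    realisation: Optional[str] = None   # e.g. "definite_np", "full_pronoun"
    # Positional
    offset: int = 0                     # RE's offset in the text
    # Semantic / lexical
    animate: Optional[bool] = None
    # Other
    in_heading: bool = False            # occurrence of the term in a heading
```

A knowledge source (next slide) would be responsible for filling in these fields.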

  13. Knowledge sources • A knowledge source: a (virtual) processor able to assign values to attributes on the projection layer • Minimum set: POS-tagger + shallow parser

  14. Matching Rules • Certifying Rules (applied first): certify without ambiguity a possible candidate. • Demolishing Rules (applied afterwards): rule out a possible candidate. • Scored Rules: increase/decrease a resolution score associated with a pair <PS, DE>.
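The interaction of the three rule families could look roughly like this. This is a hedged sketch only: the rule functions and candidate representation are assumed, not RARE's actual implementation.

```python
# Sketch of the three rule families from slide 14 (illustrative, not RARE's code):
# certifying rules settle the antecedent outright, demolishing rules discard
# candidates, and scored rules accumulate a resolution score per <PS, DE> pair.

def resolve(ps, candidates, certifying, demolishing, scored):
    # 1. Certifying rules (applied first): an unambiguous match ends the search.
    for rule in certifying:
        for de in candidates:
            if rule(ps, de):
                return de
    # 2. Demolishing rules (applied afterwards): filter out impossible candidates.
    survivors = [de for de in candidates
                 if not any(rule(ps, de) for rule in demolishing)]
    # 3. Scored rules: sum score contributions and pick the best survivor.
    scores = {id(de): sum(rule(ps, de) for rule in scored) for de in survivors}
    return max(survivors, key=lambda de: scores[id(de)], default=None)
```

Each rule here is just a predicate (or score function) over a `<PS, DE>` pair, e.g. a demolishing rule for number disagreement.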

  15. Domain of referential accessibility • Filter and order the candidate discourse entities: • a. Linearly (Dorrepaal, Mitkov, ...) • b. Hierarchically (Grosz & Sidner; Cristea, Ide & Romary, ...)

  16. The engine • for_each RE in RESequence: • projection(RE) • proposing/evoking(PS) • completion(DE,PS) • re-evaluation

  17. The engine: Projection • for_each RE in RESequence: • projection(RE) • proposing/evoking(PS) • completion(DE,PS) • re-evaluation (diagram: REx on the text layer is projected as PSx onto the projection layer, using the knowledge sources to fill in its primary attributes)

  18. The engine: Proposing • for_each RE in RESequence: • projection(RE) • proposing/evoking(PS) • completion(DE,PS) • re-evaluation (diagram: heuristics/rules and the domain of referential accessibility determine whether PSx evokes an existing DE, e.g. DEn, or proposes a new one)

  19. The engine: Proposing (2) • for_each RE in RESequence: • projection(RE) • proposing/evoking(PS) • apply certifying rules • apply demolishing rules • apply scored rules • sort candidates in desc. order of scores • use thresholds to: • propose a new DE • link the current PS to an existing DE • postpone decision • completion(DE,PS) • re-evaluation
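The threshold step listed on slide 19 might look like this in outline. The threshold values and the tuple encoding are illustrative assumptions, not RARE's actual settings.

```python
# Sketch of the threshold decision from slide 19: after sorting candidates in
# descending order of score, two cut-offs (values assumed for illustration)
# choose between linking, proposing a new DE, or postponing the decision.

LINK_THRESHOLD = 0.7      # at or above: link the current PS to the best DE
NEW_DE_THRESHOLD = 0.3    # below: propose a brand-new DE

def decide(scored_candidates):
    """scored_candidates: list of (de, score) pairs, in any order."""
    ranked = sorted(scored_candidates, key=lambda pair: pair[1], reverse=True)
    if not ranked or ranked[0][1] < NEW_DE_THRESHOLD:
        return ("new_de", None)                # no plausible antecedent
    if ranked[0][1] >= LINK_THRESHOLD:
        return ("link", ranked[0][0])          # confident match
    return ("postpone", ranked[0][0])          # ambiguous: defer to re-evaluation
```

Postponed decisions would then be revisited in the re-evaluation phase of the engine.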

  20. The engine: Completion • for_each RE in RESequence: • projection(RE) • proposing/evoking(PS) • completion(DE,PS) • re-evaluation (diagram: the attributes of PSx are merged into the evoked discourse entity DEn)

  21. The engine: Completion (2) • for_each RE in RESequence: • projection(RE) • proposing/evoking(PS) • completion(DE,PS) • re-evaluation (diagram: the state of the three layers after completion)

  22. The engine: Re-evaluation • for_each RE in RESequence: • projection(RE) • proposing/evoking(PS) • completion(DE,PS) • re-evaluation (diagram: the three layers during re-evaluation)

  23. The engine: Re-evaluation (2) • for_each RE in RESequence: • projection(RE) • proposing/evoking(PS) • completion(DE,PS) • re-evaluation (diagram: the final state of the three layers)

  24. The Coref Corpus • 4 chapters from George Orwell’s novel “1984”, totalling approx. 19,500 words. • Preprocessed using a POS-tagger & an FDG-parser. • The NPs were automatically extracted from the FDG structure (some manual corrections were necessary, as well as adding other types of referential expressions). • Manual annotation of the coreferential links (each text was assigned to two annotators). • Inter-annotator agreement: as low as 60%. • Our annotation is conformant with MUC & ACE.

  25. The Coref Corpus

  26. Evaluation • Success Rate = #correctly solved anaphors / #all anaphors • For the four texts we obtained values between 60% and 70%. (cf. Mitkov 2000)
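The success-rate measure above is straightforward to compute:

```python
# Success rate as defined on slide 26: correctly solved anaphors over all anaphors.
def success_rate(correct, total):
    return correct / total if total else 0.0
```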

  27. Road Map • Anaphora Resolution • Discourse (parsing) • Balkanet • Information Structure

  28. Discourse Parsing • Input: plain text • Goal: automatically obtain a discourse structure of the text (resembling RST trees), then apply Veins Theory to produce focused summaries. Cristea, Ide & Romary 1998

  29. Veins Theory: Quick Intro • Head expression: the sequence of the most important units within the corresponding span of text • Vein expression: the sequence of units that are required to understand the span of text covered by the node, in the context of the whole discourse (diagram: a five-unit discourse tree with a head H and a vein V expression at every node, e.g. H=1 3 5, V=1 3 5 at the root) Cristea, Ide & Romary 1998
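The head-expression part of Veins Theory can be sketched as below: the head of a leaf is the unit itself, and the head of an internal node concatenates the heads of its nuclear children (consistent with the H= annotations in the slide's example tree). Vein computation additionally depends on sibling order and is omitted here; the tuple-based tree encoding is an assumption for illustration.

```python
# Head expressions in the spirit of Veins Theory (Cristea, Ide & Romary 1998).
# Tree encoding (assumed): a leaf is (label,); an internal node is
# (children, nuclearity_flags), where nuclearity_flags[i] is True iff
# children[i] is a nucleus.

def head(node):
    if len(node) == 1:                  # leaf: a single discourse unit
        return [node[0]]
    children, nuclear = node
    result = []
    for child, is_nucleus in zip(children, nuclear):
        if is_nucleus:                  # only nuclear children contribute
            result.extend(head(child))
    return result
```

For a nucleus-satellite node the head is the nucleus's head; for a multinuclear node the heads of all nuclei are concatenated.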

  30. Focused Summaries • We call a focused summary on an entity X a coherent excerpt presenting how X is involved in the story that constitutes the content of the text. • It is given by the vein expression of the unit to which X belongs.

  31. The method (pipeline diagram): plain text → FDG parser; the NP Detector and the AR-engine produce coref-tagged text; the segments detector and the sentence tree extractor produce s-trees; the Discourse Parser combines them into the discourse structure, from which Veins Theory extracts the focused summary.

  32. The method: FDG parser • Conexor FDG parser, http://www.connexor.com/m_syntax.html

  33. The method: NP Detector • Extracts NPs from the FDG structure.

  34. The method: AR-engine • RARE (described above).

  35. The method: segments detector • Detects the boundaries of clauses, based on learning methods. • Georgiana Puscasu (2004): A Multilingual Method for Clause Splitting.

  36. The method: sentence tree extractor • Proposes one or more tree structure(s) at the sentence level. • The leaves are the clauses previously detected. • Uses the FDG structure and the cue-phrases.

  37. The method (pipeline diagram repeated)

  38. The Discourse Parser • We have trees for each sentence; • The goal is to incrementally integrate these trees into a single structure corresponding to the entire text. • The current tree is inserted at each node on the right frontier; each resulting structure is scored considering: • the coreference links • Centering Theory • Veins Theory Cristea, Postolache, Pistol (2004): Summarization through Discourse Structure (submitted to Coling)
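The right-frontier insertion can be illustrated as follows, with discourse trees as plain nested lists (leaves are unit labels) and the scoring function left abstract; this is a minimal sketch, not the parser's actual data structures.

```python
# Sketch of right-frontier attachment (slide 38): the incoming sentence tree
# may be adjoined at every node on the path from the root down the rightmost
# children; each resulting tree is then scored and the best one kept.
import copy

def attach_all(doc_tree, sent_tree):
    """One candidate per right-frontier node: that node is wrapped into a new
    binary node [old_subtree, sent_tree]."""
    results = []
    depth, node = 0, doc_tree
    while True:
        candidate = copy.deepcopy(doc_tree)
        parent, target = None, candidate
        for _ in range(depth):               # walk down to the attachment point
            parent, target = target, target[-1]
        wrapped = [target, copy.deepcopy(sent_tree)]
        if parent is None:
            candidate = wrapped              # attached at the root
        else:
            parent[-1] = wrapped
        results.append(candidate)
        if not isinstance(node, list) or not node:
            break                            # reached the rightmost leaf
        node = node[-1]
        depth += 1
    return results

def best_tree(candidates, score):
    """T* = argmax over candidate trees of score(Ti) (scoring left abstract)."""
    return max(candidates, key=score)
```

A real scoring function would combine the coreference links, Centering Theory, and Veins Theory, as the slide states.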

  39. The Discourse Parser • At the end of the process: a set of trees Ti corresponding to the input text, each with a score • T* = argmax score(Ti) • compute Veins(T*) • extract the summary

  40. Discussion & Evaluation • We do obtain coherent summaries automatically! • How to evaluate? • We have 90 summaries made by humans... • Construct a gold summary out of the 90 summaries and compare it with the system output? • Compare the system output with all 90 summaries and take the best result?

  41. Road Map • Anaphora Resolution • Discourse (parsing) • Balkanet • Information Structure

  42. Information Structure • Many approaches to IS: • Prague School approach; • Formal account of English intonation; • Integrating different means of IS realization within one grammar framework; • Formal semantics of focus; • Formal semantics of topic; • Integrating IS within a theory of discourse interpretation; • IS-sensitive discourse context updating. Sgall et al.; Steedman; Kruijff; Krifka, Rooth; Hendriks; Vallduvi, Kruijff-Korbayova

  43. Information Structure • Goals: • Improve/create/enlarge a corpus annotated for IS (and not only); • Investigate means of continuing the annotation (at least partially) automatically; • Investigate how the (major) NLP tasks can benefit from IS; • Find correlations between different features; • Build a system that detects IS.

  44. Summary • Anaphora Resolution: RARE • Discourse Parsing: Veins Theory • Balkanet: multilingual WordNet • Information Structure

  45. References • Postolache, Oana. 2004. “A Coreference Resolution Model on Excerpts from a Novel”. ESSLLI’04, to appear. • Postolache, Oana. 2004. “RARE: Robust Anaphora Resolution Engine”. M.Sc. thesis, Univ. of Iasi.
