
Automatically Generating Fictional and Factual Narratives



Presentation Transcript


  1. Automatically Generating Fictional and Factual Narratives Birte Lönneker-Rodman Écritures de l’histoire, écritures de la fiction Colloque EHESS-CNRS, Paris, France March 16, 2006

  2. Overview • Automatically Generated Narratives • Models of Narrative and Generation • Structure and Acquisition of Knowledge • Author and Narrator in Generation • Fiction-Specific Language Use • Conclusion

  3. Automatically Generated Narratives

  4. Example I: Tale-Spin output, abbreviated • Once upon a time George Ant lived near a patch of ground. There was a nest in an ash tree. Wilma Bird lived in the nest. […] George was very thirsty. George wanted to get near some water. George walked from his patch of ground across the meadow […] to a river bank. George fell into the water. […] George wanted to get near the meadow. George couldn't get near the meadow…

  5. Example I, continued • … Wilma wanted to get near George. Wilma grabbed George with her claw. Wilma took George from the river […] to the meadow. George was devoted to Wilma. George owed everything to Wilma. Wilma let go of George. George fell to the meadow. The end. • (Meehan 1979; Meehan 1981)

  6. Example II: Newspaper reports • Today in Paris, a policeman was killed and four others wounded by anarchists who exploded a remote controlled bomb under the truck in which they were going from their office to a restaurant. The bomb contained two kilos of dynamite. • Anarchists killed a policeman and wounded four others today in Paris. They exploded a remote controlled bomb under the truck in which they were going […]. The bomb contained […]. • (Danlos 1987)

  7. Classification Criteria for Generators • Main purpose • Utilitarian (money) • Cognitive (theory) • Experimental (fun, discovery) • Genre/text type • Fictional (Story Generation) • Factual (non-fictional) (Natural Language Generation) • Interactivity • Narrative theory • "Level" (granularity) of generated output

  8. Models of Narrative and Generation

  9. Two-Level and Four-Level Models of Narrative • Two-level model: Histoire vs. Discours • Four-level model: Occurrences (Geschehen) → (selection) → Story (Geschichte) → Narrative (Erzählung) → Presentation of Narrative (Präsentation der Erzählung) • (Schmid 1982; 2005) [diagram aligning the two models]

  10. Two-Level and Four-Level Models of Natural Language Generation • Two-level model (McKeown/Swartout 1988): Deep Generation → Surface Generation • Four-level model (Reiter/Dale 2000, adapted): Content Determination → Document Structuring (together: Document Planning) → Microplanning → Surface Realization • [diagram aligning the two models]
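
  As a rough illustration of the pipeline distinction on the preceding slide, the sketch below separates document planning, microplanning, and surface realization in the spirit of the Reiter/Dale stages; the weather-report domain, function names, and templates are invented for illustration and are not taken from any cited system.

    # A minimal sketch of a staged NLG pipeline (assumed example domain).

    def document_planning(data):
        # Content determination + document structuring:
        # decide WHAT to say and in which order.
        return [("report_temp", data["temp"]), ("report_wind", data["wind"])]

    def microplanning(messages):
        # Choose words and sentence templates for each message.
        lexicalized = []
        for kind, value in messages:
            if kind == "report_temp":
                lexicalized.append(f"the temperature reached {value} degrees")
            else:
                lexicalized.append(f"winds blew at {value} km/h")
        return lexicalized

    def surface_realization(phrases):
        # Produce punctuated output sentences.
        return " ".join(p.capitalize() + "." for p in phrases)

    data = {"temp": 21, "wind": 35}
    print(surface_realization(microplanning(document_planning(data))))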

  11. Structure and Acquisition of Knowledge

  12. Knowledge Base and Memory • Generators need some knowledge (world knowledge, data, information, …) to start with (static representation) • Just read this data out (not really "Generation") • Instantiate, combine, and transform this knowledge; keep track of changes in the narrated world (Memory; dynamic representation) • Receive information on changes in the world from "host application" • meteorological service • hand-held Digital Assistant (Callaway et al. 2005) • battle simulator (Maybury 1999)
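
  The static/dynamic split described above can be pictured with a minimal sketch: a set of starting facts plus a memory that records changes in the narrated world. The Memory class, the triple notation, and the George/meadow facts are assumptions of this sketch, loosely echoing the Tale-Spin example, not code from any cited generator.

    # A minimal sketch of static knowledge plus a dynamic memory of changes.

    class Memory:
        def __init__(self, initial_facts):
            self.facts = set(initial_facts)   # static starting knowledge
            self.history = []                 # dynamic record of changes

        def update(self, removed=(), added=()):
            # Apply one change in the narrated world and keep track of it.
            self.facts -= set(removed)
            self.facts |= set(added)
            self.history.append((tuple(removed), tuple(added)))

    world = Memory({("george", "at", "meadow"), ("george", "state", "thirsty")})
    world.update(removed=[("george", "at", "meadow")],
                 added=[("george", "at", "river_bank")])
    print(world.facts)
    print(world.history)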

  13. Knowledge Representation in a Graph • Narrative functions as nodes: Kidnapping, Tormenting at night, Call for help, Announcement of misfortune, Release of hero, Start of counteraction vs. villain, Fight in an open field, Departure of hero • (Maranda 1985) [graph diagram linking the functions]
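
  A minimal sketch of how such a plot graph could be stored and traversed; the node labels are the functions recovered from the slide, but the edges and the single-path traversal are assumptions made only for illustration (Maranda's own representation may differ).

    # A minimal sketch of plot knowledge as a directed graph (assumed edges).

    plot_graph = {
        "Kidnapping": ["Tormenting at night"],
        "Tormenting at night": ["Call for help"],
        "Call for help": ["Announcement of misfortune"],
        "Announcement of misfortune": ["Release of hero"],
        "Release of hero": ["Start of counteraction vs. villain"],
        "Start of counteraction vs. villain": ["Fight in an open field"],
        "Fight in an open field": ["Departure of hero"],
        "Departure of hero": [],
    }

    def walk(graph, node):
        # Follow the first outgoing edge from each node to produce one plot line.
        path = [node]
        while graph[node]:
            node = graph[node][0]
            path.append(node)
        return path

    print(" -> ".join(walk(plot_graph, "Kidnapping")))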

  14. Representation of Goals and Plans • Example: Persuade someone by either asking them, giving them food, or threatening them • (DE PERSUADE (ACTOR AGENT ACTION RESULT) • (GOAL-EVAL ACTOR ACTION) • (APPEND (LIST ASK-PLAN) • (GEN-PLANS 'FOOD (GET-ISA 'FOOD AGENT) BARGAIN-PLAN) • (LIST THREAT-PLAN] • (Meehan 1981:237)
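
  For readers less used to Lisp, the following is a rough Python paraphrase of the quoted PERSUADE definition: ask, bargain with each kind of food the agent is known to accept, or threaten. The dictionary-based stand-in for the GET-ISA lookup and the character names are hypothetical; this is not Meehan's code.

    # A rough paraphrase of PERSUADE: enumerate candidate persuasion plans.

    def persuade_plans(actor, agent, action, isa):
        """Return candidate plans for persuading `agent` to do `action`."""
        plans = [("ASK-PLAN", actor, agent, action)]
        # One bargain plan per kind of food the agent is known to accept
        # (stand-in for the GET-ISA 'FOOD lookup on the slide).
        for food in isa.get("FOOD", {}).get(agent, []):
            plans.append(("BARGAIN-PLAN", actor, agent, action, food))
        plans.append(("THREAT-PLAN", actor, agent, action))
        return plans

    isa = {"FOOD": {"wilma_bird": ["worm", "berry"]}}
    for plan in persuade_plans("george_ant", "wilma_bird", "carry_home", isa):
        print(plan)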

  15. Messages from "Host Application" • Battle event messages: timestamp, message, sender • (2774438460 (OCA 100 BEGIN MISSION EXECUTION) WOC-50 TACTICAL-FIGHTER-WING) • (2774438460 (50-TACTICAL-FIGHTER-WING DISPENSE 4 F-16 AIRCRAFT) OCA100) • (2774439140 (MOBILE-SAM2 FIRE A MISSILE AT OCA100) CLOCK) • (2774439140 (MOBILE-SAM1 FIRE A MISSILE AT OCA100) CLOCK) • […] • (Maybury 1999)
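
  A minimal sketch of how such (timestamp, message, sender) triples could be collected and grouped by time before narration; the tuples reuse the sample messages from the slide, but the grouping logic is an assumption of this sketch, not Maybury's system.

    # A minimal sketch: order and group timestamped event messages.

    from collections import defaultdict

    messages = [
        (2774438460, "OCA 100 BEGIN MISSION EXECUTION", "WOC-50 TACTICAL-FIGHTER-WING"),
        (2774438460, "50-TACTICAL-FIGHTER-WING DISPENSE 4 F-16 AIRCRAFT", "OCA100"),
        (2774439140, "MOBILE-SAM2 FIRE A MISSILE AT OCA100", "CLOCK"),
        (2774439140, "MOBILE-SAM1 FIRE A MISSILE AT OCA100", "CLOCK"),
    ]

    # Group simultaneous events so a narrator could verbalize them together.
    by_time = defaultdict(list)
    for timestamp, text, sender in sorted(messages):
        by_time[timestamp].append((sender, text))

    for timestamp, events in by_time.items():
        print(timestamp, events)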

  16. Filling the Knowledge Base • Types (classes) of objects and events and relations between them: Ontology • Manual encoding • Semi-automated acquisition from texts (Web) • Tokens (instances) of objects and events • Manual input • Data from host application • Data from existing texts (Information Extraction) • Dynamic generation in planning FI, NONFI FI, NONFI NONFI NONFI FI
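
  The type/token distinction above can be illustrated with a small sketch: an ontology of classes and relations versus instantiated objects and events. All class, relation, and instance names here are invented for illustration and do not come from any cited system.

    # A minimal sketch of types (ontology classes) versus tokens (instances).

    ontology = {
        "Character": {"is_a": "Entity"},
        "Bird": {"is_a": "Character", "can": ["fly", "grab"]},
        "Ant": {"is_a": "Character", "can": ["walk"]},
        "Rescue": {"is_a": "Event", "roles": ["rescuer", "rescued"]},
    }

    instances = [
        {"id": "wilma", "type": "Bird"},
        {"id": "george", "type": "Ant"},
        {"id": "ev1", "type": "Rescue", "rescuer": "wilma", "rescued": "george"},
    ]

    def isa_chain(ontology, cls):
        # Follow is_a links upward to show where a class sits in the ontology.
        chain = [cls]
        while ontology.get(cls, {}).get("is_a"):
            cls = ontology[cls]["is_a"]
            chain.append(cls)
        return chain

    print(isa_chain(ontology, "Bird"))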

  17. Author and Narrator in Generation

  18. Authorship: Three main viewpoints FI • Generation shows that there is no author • Balpe 1995 • Authorship distributed • Knowledge encoder • Programmer of process algorithms • Interactor (e.g. provider of individual data, post-editor) • Lenoble 1995 • "Computer as author" • if no human intervention required when creating texts • otherwise: "authoring aid" • Reiter and Dale 2000 FI NONFI

  19. Narrator in Generation Narrator function always filled by generator? • Heterodiegetic • "Today in Paris, a policeman was killed and four others wounded by anarchists […]" (Danlos) • "George wanted to […] George was devoted to […]" (Tale-Spin) • Heterodiegetic, overt • "It pisses me off that a few shiftless students were out to make trouble on Beinecke plaza one day: they built a shanty town, Winny Mandela city, because they wanted […]" (PAULINE, Hovy 1988) • You-narrative • " […] It seems that you were mainly interested in the winter activities […] since you spent most of the time in front of the January and February frescos […]" (Callaway et al. 2005) NONFI FI NONFI NONFI

  20. Fiction-Specific Language Use in Generation

  21. Narrative Prose Generation • Character-to-character dialogue • Typographic marks • Speaker references himself, hearer, others, … • Interjections • Embedding the dialogue • Reference communicative action • utterance manner, … • Discourse markers • Now, it happened that […]. So, after following her […], whereupon he said, "Good day, Little Red Riding Hood, […]". • (Callaway and Lester 2002) FI, NONFI FI, NONFI FI NONFI
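
  A minimal sketch of embedding character-to-character dialogue in prose, touching several of the features listed above (typographic marks, speaker and hearer reference, an embedding verb, a discourse marker); the templates and the marker list are assumptions of this sketch, not Callaway and Lester's rules.

    # A minimal sketch of dialogue embedding with a discourse marker.

    import random

    DISCOURSE_MARKERS = ["Now,", "So,", "Then"]

    def embed_dialogue(speaker, hearer, utterance, manner="said"):
        marker = random.choice(DISCOURSE_MARKERS)
        # Wrap the utterance in quotation marks and attach the communicative act.
        return f'{marker} {speaker} {manner} to {hearer}, "{utterance}"'

    print(embed_dialogue("the wolf", "Little Red Riding Hood",
                         "Good day, Little Red Riding Hood.", manner="said"))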

  22. Conclusion • Generation confirms fictional vs. non-fictional (factual) distinction, but brings us back to the same old questions • Automatically generated factual narratives: reference to real world as precise as measuring instruments (observations) • No "perceptual" facet of perspective, but still: • Selection and inferences rely on world knowledge of generator • What about … • … Maybury's simulation? • … Inputting "fake" facts into Danlos's news report generator? • … Inputting "real" actors into Meehan's Tale-Spin?
