
Semantic and Pragmatic Processing with GETARUNS




Presentation Transcript


  1. Semantic and Pragmatic Processing with GETARUNS Rodolfo Delmonte Department of Language Sciences Università Ca’ Foscari Email: delmont@unive.it Website: http://project.cgm.unive.it

  2. Text 3 by Sanford and Garrod
  At the restaurant. John went into a restaurant. There was a table in the corner. The waiter took the order. The atmosphere was warm and friendly. He began to read his book.

  3. SHARED TASK TEXT LEVEL PARAMETERS
  • Text level semantic representation
  • Implicit argument recovery (CNI, DNI, INI)
  • Standard Implicature recovery
  • Anaphora resolution (sentence and discourse level)
  • Centering and Topic Hierarchy (see Sidner)
  • Spatiotemporal location inferencing
  • Possessional and other semantic relation inference
  • Idiomatic expression reconstruction
  • WSD

  4. F-structure representation

  5. [figure]

  6. DISCOURSE MODEL in SS
  A FACT is an Infon(Index, Relation(Property), List of Arguments with Semantic Roles, Polarity (1 affirmative, 0 negation), Temporal Location Index, Spatial Location Index)
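The infon structure above can be sketched as a plain record; this is an illustrative reconstruction, not GETARUNS's actual (Prolog) implementation, and all field names are assumptions:

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Infon:
    index: str                        # unique fact identifier, e.g. "infon4"
    relation: str                     # relation or property name, e.g. "isa"
    arguments: List[Tuple[str, str]]  # (semantic_role, argument_id) pairs
    polarity: int                     # 1 = affirmative, 0 = negated
    temporal_loc: str                 # temporal location index
    spatial_loc: str                  # spatial location index

# Encoding the fact fact(infon4, isa, [ind:id2, class:restaurant], 1, id1, univ):
f = Infon("infon4", "isa", [("ind", "id2"), ("class", "restaurant")],
          1, "id1", "univ")
```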

  7. Discourse Structure
  root:new(1-1)
  clause:1-1
  topics:[expected:id3:'John']
  main_fact:go([id3:'John', id2:restaurant], 1, id2)
  ref_int:tint(tes(f1_t31), [])
  temp_rel:contains(tes(f1_t31), tr(f1_t31))
  disc_rel:narration
  disc_seg:1-[1]
  disc_dom:objective
  p_o_view:narrator

  8. Standard Implicatures
  • A standard implicature is a conversational implicature based on an addressee's assumption that the speaker is being cooperative by directly observing the conversational maxims.
  • In the following exchange, A assumes that B is being cooperative, truthful, adequately informative, relevant, and clear. Thus, A can infer that B thinks A can get fuel at the garage:
  • A: I’ve just run out of petrol.
  • B: Oh; there’s a garage just around the corner.

  9. Standard Implicatures
  At the restaurant. John went into a restaurant. There was a table in the corner.
  • From sentence 1 we infer that John was inside the restaurant
  • People (John) in restaurants look for a table to sit at
  • Restaurants have corners – a meronymic relation
  • The main location has been asserted, and John is the Main Topic of discourse
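The inferences above can be sketched as simple forward-chaining rules over asserted facts; this is a toy illustration of the idea, not GETARUNS's inference machinery, and the fact names are hypothetical:

```python
# Facts asserted from sentence 1, "John went into a restaurant".
facts = {("go_into", "john", "restaurant1")}

def infer(facts):
    """One pass of forward chaining over the implicature rules above."""
    new = set(facts)
    for rel, a, b in facts:
        if rel == "go_into":
            new.add(("inside", a, b))           # entering implies being inside
            new.add(("has_part", b, "corner"))  # restaurants have corners (meronymy)
            new.add(("looks_for", a, "table"))  # restaurant patrons look for a table
    return new

model = infer(facts)
```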

  10. CNI, DNI, INI
  • There are three cases of Null Instantiation, according to the FrameNet framework
  • The text presents a case of Indefinite Null Instantiation in the take_order sentence
  • Whenever a waiter takes an order, there must be someone ordering
  • In this case GETARUNS produces an existential in the f-structure, with a GOAL semantic role, which is tentatively bound to the current Main Topic of discourse
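The INI treatment described above can be sketched as follows; this is a hedged reconstruction in Python (GETARUNS is implemented in Prolog), and the function, argument names and id scheme are all assumptions for illustration:

```python
def resolve_ini(pred_args, main_topic, counter=[100]):
    """If the GOAL argument is unexpressed (INI), introduce an existential
    discourse referent and tentatively bind it to the current Main Topic."""
    if "goal" not in pred_args:
        counter[0] += 1
        existential = f"id{counter[0]}"      # fresh existential referent
        pred_args["goal"] = existential
        # tentative binding of the existential to the discourse Main Topic
        return pred_args, (existential, "bound_to", main_topic)
    return pred_args, None

# "The waiter took the order": the orderer (GOAL) is unexpressed.
args, binding = resolve_ini({"agent": "id_waiter", "theme": "id_order"},
                            main_topic="id3_john")
```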

  11. Subjective Domains
  • The text continues with what S&G call a “psychological atmosphere statement”.
  • Again, the Centering and Topic Hierarchy allows us to derive who could be the holder of the Point of View
  • This has been made possible by considering the presence of an INI in the previous statement
  • “John thinks” is the intended governing predicate of the utterance

  12. [figure]

  13. Subjective Point of View
  • The presence of a Subject of Point of View allows the correct treatment of Anaphora Resolution
  • “His” is bound by internal Pronominal Binding
  • “He” is left unbound, being a matrix Subject
  • The system's high-level modules take care of external pronouns by searching for the SPoV
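The two-level binding strategy above can be sketched as follows; this is an illustrative toy, not the system's actual modules, and the role labels are hypothetical:

```python
def bind_pronouns(pronouns, sentence_subjects, spov):
    """Bind possessives sentence-internally; leave matrix subjects for the
    higher-level modules, which resolve them against the current SPoV."""
    bindings = {}
    for pron, role in pronouns:
        if role == "possessive" and sentence_subjects:
            bindings[pron] = sentence_subjects[0]  # internal Pronominal Binding
        elif role == "matrix_subject":
            bindings[pron] = spov                  # external resolution via SPoV
    return bindings

# "He began to read his book.": "his" binds to the subject "he";
# "he" itself is resolved against the Subject of Point of View (John).
b = bind_pronouns([("he", "matrix_subject"), ("his", "possessive")],
                  sentence_subjects=["he"], spov="id3_john")
```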

  14. [figure]

  15. SHARED TASK SENTENCE LEVEL PARAMETERS
  • Predicate Argument Structure
  • Attachment correctness
  • Anaphora and reference resolution
  • Word sense disambiguation
  • Quantification
  • Negation, modals, conditionals, disjunctions
  • Tense and aspect
  • Plurals
  • Comparison phrases
  • Time expressions
  • Measurement expressions
  • Question interpretation
  • Clarity to a naive reader

  16. Text One
  • THROW is understood as an event that takes place from a CLIFF and with a SPEED. However, the SPEED is HORIZONTAL but the CLIFF is not HIGH – this relation has been missed. The OBJECT falls from the height of the same CLIFF.

  17. Text One
  entity(ind,id2,9,facts([
    fact(infon111, coincide, [arg:id24, arg:id29], 1, tes(sn59_t13), id20),
    fact(infon4, isa, [ind:id2, class:object], 1, id1, univ),
    fact(infon5, inst_of, [ind:id2, class:thing], 1, univ, univ),
    fact(id9, throw, [theme_unaff:id2, agent:id8], 1, tes(sn42_t11), univ),
    fact(id17, fall, [actor:id2, modal:id16], 1, tes(f1_t12), univ),
    fact(id29, take, [actor:id26, theme_aff:id2], 1, tes(finf1_t13), id20)])).

  18. Text Two
  • The main topic is CANCER. From the Discourse World we know that CANCER is CAUSED by a VIRUS and that RESEARCHERs have been LOOKing for other CANCERs, which receive a different semantic identifier but inherit all the properties.

  19. Text Two
  entity(class,id28,2,facts([
    in(infon79, id28, id3),
    fact(infon75, cause, [ind:id28], 1, id25, id26),
    fact(infon76, of, [arg:id28, specif:id28], 1, univ, univ),
    fact(infon77, inst_of, [ind:id28, class:stato], 1, univ, univ),
    fact(infon78, isa, [ind:id28, class:cancer], 1, id25, id26),
    fact(*, inst_of, [ind:id28, class:stato], 1, univ, univ),
    fact(*, isa, [ind:id28, class:cancer], 1, id1, univ),
    fact(*, cause, [theme_aff:id28, agent:id2], 1, tes(f2_t21), univ),
    fact(*, isa, [arg:id28, arg:cancer], 1, id25, id26),
    fact(*, look, [actor:id27, locat:id28], 1, tes(f3_t23), id26)])).

  20. Text Two
  • The VIRUS is understood as the AGENT
  entity(ind,id2,11,facts([
    fact(infon4, isa, [ind:id2, class:virus], 1, id1, univ),
    fact(infon5, inst_of, [ind:id2, class:animal], 1, univ, univ),
    fact(id4, cause, [theme_aff:id3, agent:id2], 1, tes(f2_t21), univ),
    fact(infon82, isa, [arg:id2, arg:virus], 1, id25, id26),
    fact(id29, cause, [agent:id2], 1, tes(f2_t23), id26)])).

  21. Text Two
  • The system also understands that those EVENTs were KNOWn for some time, as shown by id8, which is bound in the discourse by means of THAT to the event id4 listed above.

  22. Text Two
  entity(ind,id8,1,facts([
    fact(infon21, prop, [arg:id8, disc_set:[id4:cause:[theme_aff:id3, agent:id2]]], 1, id6, id7),
    fact(infon31, isa, [arg:id8, arg:that], 1, id6, id7),
    fact(id12, know, [tema_nonaff:id8, actor:id11], 1, tes(f2_t22), id7)])).

  23. Text Two
  • However, the system has not bound IT to THAT, so we do not know what LEADs to a vaccine, nor do we know what prevents what. All occurrences of IT are unbound.

  24. Text Four
  • The text is not completely and consistently represented, but most of the relations are fully understood. In particular, consider THEY in the third sentence, which is rightly bound to the SET of two trainers asserted in the Discourse World. The school is always coindexed. The last sentence contains a first person plural pronoun WE, which is interpreted as being coindexed with the narrator, but also, wrongly, with the location of the text.

  25. Text Five
  • The text is not completely and consistently represented, but most of the relations are fully understood. We still know a lot about the main Entities, the PROPELLANT and the NITROCELLULOSE.

  26. Text Five
  entity(ind,id19,8,facts([
    fact(infon42, inst_of, [ind:id19, class:sub], 1, univ, univ),
    fact(infon43, isa, [ind:id19, class:propellant], 1, id18, nil),
    fact(infon44, isa, [arg:id19, arg:propellant], 1, id18, univ),
    fact(id20, explode, [agent:id19], 1, tes(f1_t53), univ),
    fact(infon108, isa, [arg:id19, arg:propellant], 1, id30, univ),
    fact(id38, use, [theme_aff:id19, actor:id37], 1, tes(f2_t55), univ),
    fact(id41, make, [theme_aff:id19, actor:id40, loc_origin:id31], 1, tes(sn32_t55), univ),
    fact(id20, explode, [agent:id19], 1, tes(f1_t53), univ),
    fact(infon50, sub, [prop:id20], 1, id18, univ)])).

  27. Text Five
  entity(ind,id32,1.2,facts([
    in(infon91, id32, id31),
    fact(infon89, inst_of, [ind:id32, class:sub], 1, univ, univ),
    fact(infon90, isa, [ind:id32, class:nitrocellulose], 1, id30, nil),
    fact(*, nitrocellulose, [ind:id32], 1, id30, nil),
    fact(*, produce, [ind:id32], 1, id30, nil),
    fact(*, repackage, [ind:id32], 1, id30, nil),
    fact(*, of, [arg:id32, specif:id31], 1, univ, univ),
    fact(*, of, [arg:id32, specif:id31], 1, univ, univ),
    fact(*, of, [arg:id32, specif:id31], 1, univ, univ),
    fact(*, inst_of, [ind:id32, class:col], 1, univ, univ),
    fact(*, isa, [ind:id32, class:chunk], 1, id30, nil),
    fact(*, make, [theme_aff:id19, actor:id40, loc_origin:id32], 1, tes(sn32_t55), univ)])).

  28. Text Five
  • The relation intervening between CHUNKS and NITROCELLULOSE lends transitivity to the EVENTs taking place, so that both are involved in REPACKAGE, PRODUCE and MAKE.
  • We also know that a CREWMAN was OPERATING at a center and that the GUN CREW was KILLed by an unknown AGENT, id26.
  • We know that the EVENTs happened during WORLD_WAR_II. Also notice that the IT subject of SUSPECT is correctly computed as an expletive.

  29. Text Six
  • Here two of the sentences are parsed by the partial system. However, the main relations are well understood. The FARM and the COMMUNITY provide FOOD and EARN REVENUE.
  • Most of the sentences are parsed by the partial system. However, questions can be asked and get a reply, even though the generator does not handle uncountable nouns like MONEY properly.

  30. Text Seven
  • The most difficult text is fully parsed but not satisfactorily semantically represented. We only know a few things, and they are all unrelated.
  • There is no way to relate WIND to TURBINE and to ENERGY in a continuous way.
  • I assume that scientific language requires a different setup of semantic inference rules, which can only be appropriately specified in a domain ontology.

  31. BLUE
  ;;; (3.2) "There was a table in the corner."
  ;;; Intermediate logical form (LF):
  (NP ((VAR _X2 "the" "corner") (VAR _X1 "a" "table" (PP "in" _X2))) (_X1))
  ;;; Final semantic representation:
  isa(corner01,corner_n1), isa(table01,table_n1), is-inside(table01,corner01).

  32. BLUE
  ;;; (3.5) "He began to read his book."
  ;;; Intermediate logical form (LF):
  (DECL ((VAR _X1 "he" NIL)) (S (PAST) _X1 "begin" (DECL ((VAR _X2 "his" "book")) (S (TO) _X1 "read" _X2))))
  ;;; Final semantic representation:
  isa(he01,person_n1), isa(book01,book_n1), isa(read01,read_v1), isa(begin01,begin_v1), agent(begin01,he01), object(begin01,[agent(read01,he01), object(read01,book01)]).
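The nesting in BLUE's final representation above, where "begin" takes the embedded "read" event as its object and shares its agent, can be sketched as a plain nested record; this is an illustrative encoding, not BLUE's code:

```python
# Nested event structure for "He began to read his book."
begin_event = {
    "pred": "begin01", "sense": "begin_v1",
    "agent": "he01",
    "object": {                       # embedded controlled event
        "pred": "read01", "sense": "read_v1",
        "agent": "he01",              # controller shared with the matrix agent
        "object": "book01",
    },
}
```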

  33. BLUE
  • Only a sentence-by-sentence semantic representation
  • Correct PAS, except for sentence 2
  • No quantification for indefinites
  • No temporal interpretation (tense, aspect, time adjuncts)
  • No anaphora resolution
  • Incomplete WSD
  • Shared Task Score: fair

  34. BOXER
  • Some textual semantic representation
  • Correct PAS
  • No temporal interpretation
  • Anaphora resolution
  • Quantification
  • Possessional relation assertion
  • No WSD
  • Shared Task Score: excellent

  35. GETARUNS

  36. LXGRAM
  • Only a sentence-by-sentence semantic representation
  • Correct PAS, except for sentence 2
  • No temporal interpretation
  • No anaphora resolution (only possessive)
  • Quantification
  • No possessional relation assertion
  • No Word Sense Disambiguation
  • Shared Task Criteria: fair

  37. ONTOSEM
  • Only a sentence-by-sentence semantic representation
  • Propositional Attitude
  • Correct PAS
  • No temporal interpretation (time location of table?)
  • No anaphora resolution (Animal??)
  • No quantification
  • Possessional relation assertion
  • Word Sense Disambiguation (Dining-Table, Order, etc.)
  • Shared Task Criteria: excellent

  38. TEXTCAP
  • Only a sentence-by-sentence semantic representation
  • Correct PAS, except for sentence 2
  • No temporal interpretation
  • No anaphora resolution
  • No quantification
  • Possessional relation assertion
  • No Word Sense Disambiguation
  • Shared Task Criteria: fair

  39. TRIPS
  • Only a sentence-by-sentence semantic representation
  • Speech Act specification
  • Correct PAS (wrong attachment of sentence 2)
  • Anaphora resolution
  • Possessional relation assertion
  • No quantification
  • No temporal interpretation
  • Word Sense Disambiguation (Dining-Table, Order, etc.)
  • Shared Task Criteria: excellent
