
INTRODUCTION




  1. INTRODUCTION • Most of the interesting language applications require obtaining a representation of the meaning of sentences.

  2. Predicate-argument structure • The predicate-argument structure describes the semantic relations that hold among the entities appearing in the sentence: who does what to whom, how, where, why?

  3. Predicate-argument structure • I eat sushi → PRED: eat; ARG1: I; ARG2: sushi.
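
Read as data, a predicate-argument structure is just a predicate plus a labeled set of fillers. A minimal Python sketch of that idea (an illustrative encoding; the slides prescribe no concrete representation):

```python
# Minimal, illustrative encoding of a predicate-argument structure.
from dataclasses import dataclass, field

@dataclass
class Proposition:
    pred: str                                  # the predicate, e.g. "eat"
    args: dict = field(default_factory=dict)   # ARG label -> filler

# "I eat sushi" -> PRED: eat; ARG1: I; ARG2: sushi
p = Proposition(pred="eat", args={"ARG1": "I", "ARG2": "sushi"})
print(p)   # Proposition(pred='eat', args={'ARG1': 'I', 'ARG2': 'sushi'})
```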

  4. A more complex example • In complex sentences we have more than one proposition. • Mary loves the man who bought the blue car • P1: Mary loves the man. • P2: The man bought the car. • P3: blue car.

  5. What is phrase structure for? • To obtain the predicate-argument structure it is necessary to first compute the phrase structure, or at least a dependency structure. • Phrase structure and its corresponding ‘rules’ are what allow language to be compositional.

  6. But is this true?

  7. Outline • History (to understand the goals of the theories) • Why phrase structures? (problems) • Why dependency grammars? (problems) • Why a probabilistic approach? (brute force vs. theory) • Current state of our model and future research.

  8. A preliminary question: how can results be improved? • By increasing the training size? • With more efficient statistical methods? • Or by improving the theories?

  9. History: Grammars as computational theories

  10. Grammars as computational theories • Cognition is computation. • A grammar is a form of computation.

  11. Computational theories (Marr, 1982) • What is the goal of the computation? • Why is it appropriate? • What is the logic of the strategy by which it can be carried out?

  12. Chomsky’s Goal • A syntactic theory has as its goal to explain the capacity of speakers to judge as acceptable (or ‘generate’) well-formed sentences and to rule out ill-formed ones.

  13. Justification • Syntax is independent of semantics. • Speakers can judge as ill- or well-formed new sentences that they have never heard before.

  14. What is the origin of phrases? • It is not semantic: an NP is not an NP because it corresponds to a semantic argument. • It is an NP on the basis of purely syntactic features, i.e. regularities in the distribution of words in sentences. • Tests determine what is and what is not a constituent (a phrase).

  15. Constituency Tests • “Tests of constituency are basic components of the syntactician’s toolbox. By investigating which strings of words can and cannot be moved, deleted, coordinated or stand in coreference relations, it is possible to draw inferences about the internal structure of sentences.” (Phillips, 1998, p. 1)

  16. Chomsky assumed that, given the independence of syntax, a theory of syntax can be developed without a semantic theory and ignoring the mapping process, following only the well-formedness goal.

  17. Mapping Goal • A syntactic theory has as its goal to explain the capacity of native speakers to map sentences into the corresponding conceptual representations and vice versa.

  18. Mapping Goal • The mapping goal tries to figure out how linguistic expressions can be mapped onto the corresponding propositional representations in the simplest and most direct way.

  19. Mapping Goal • (3.a) IBMP gave the company the patent. • (3.b) PRED: gave; ARG1: IBMP; ARG2: the patent; ARG3: the company. • (4.a) Low prices. • (4.b) PRED: low; ARG1: prices.

  20. Well-Formedness Goal • (3.a) IBMP gave the company the patent. • (3.b)** IBMP company gave the the patent. • (4.a) Low prices. • (4.b)** Prices low

  21. Direct mapping • The carpenter gave the nurse the book. → PRED: gave; ARG1: the carpenter; ARG2: the book; ARG3: the nurse.
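
For a fixed ditransitive pattern like this one, the direct mapping can be stated as a purely positional rule, with no tree built first. A hedged sketch (the function and the fixed word pattern are assumptions for illustration; real input would need NP chunking, not a pre-split clause):

```python
# Direct positional mapping for a ditransitive clause of the form
# "<NP1> gave <NP2> <NP3>" -> PRED: gave; ARG1: NP1; ARG2: NP3; ARG3: NP2.
def map_ditransitive(np1: str, verb: str, np2: str, np3: str) -> dict:
    return {"PRED": verb, "ARG1": np1, "ARG2": np3, "ARG3": np2}

print(map_ditransitive("the carpenter", "gave", "the nurse", "the book"))
# {'PRED': 'gave', 'ARG1': 'the carpenter', 'ARG2': 'the book', 'ARG3': 'the nurse'}
```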

  22. The mapping can be direct in simple expressions • This is true for simple sentences. • Culicover, Peter W. and Andrzej Nowak. Dynamical Grammar. Volume Two of Foundations of Syntax. Oxford University Press, 2003. • Roger Schank and collaborators in the 1970s.

  23. CoNLL-style semantic role annotation (sentence 5941): “Mr. Nakamura cites the case of a customer who wants to build a giant tourism complex in Baja and has been trying for eight years to get around Mexican restrictions on foreign ownership of beachfront property.” One column per predicate: • cite: A0 = Mr. Nakamura; A1 = the case ... beachfront property. • want: A0 = a customer; A1 = to build ... in Baja. • build: A0 = a customer; A1 = a giant tourism complex; AM-LOC = in Baja. • try: A0 = a customer; AM-TMP = for eight years; A1 = to get ... beachfront property. • get (around): A0 = a customer; A1 = Mexican restrictions on foreign ownership of beachfront property. • The relative pronoun who appears as R-A0 in the want, build, try and get columns, and the single span “a customer” fills A0 in all four of those propositions: the 4 non-local dependencies (NLDs) flagged in the annotation.

  24. Direct mapping • For Culicover, in more complex sentences this is not possible. • Mary loves the man who bought the blue car • P1: Mary loves the man. • P2: The man bought the car. • P3: blue car.

  25. Direct mapping • Is it really not possible? • Mary loves the man who bought the blue car • P1: PRED: loves; ARG1: Mary; ARG2: the man. • P2: PRED: bought; ARG1: the man; ARG2: the car. • P3: PRED: blue; ARG1: car.

  26. Why phrase structures? • Why dependency grammars?

  27. They are not necessary • Compositionality can be achieved without computing phrase structure. • A direct mapping to the predicate-argument structure can be performed without computing either a dependency structure or a phrase structure. • This considerably simplifies the parsing process and the treatment of ambiguity.

  28. Topics to examine in depth • Why phrase structures? • Why dependency grammars? • Why a probabilistic approach? • (at least the quick-and-dirty “brute force” version)

  29. D-SemMap V1.0

  30. Vectors and propositions • A proposition can be represented by a vector of features (Hinton, 1981). • In order to represent the proposition, the vector is divided into “slots”. • Each element of the proposition is represented in one slot.
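
One way to picture this slot scheme is as a fixed-width vector whose regions are the slots. A sketch under assumed simplifications (four slots and a tiny closed feature inventory; Hinton's 1981 distributed representations are far richer):

```python
import numpy as np

FEATURES = ["action", "human", "artifact", "entity"]  # illustrative inventory
N_SLOTS = 4                                           # SLOT 0 .. SLOT 3

def encode(slots):
    """Concatenate one feature vector per slot into a single
    fixed-width proposition vector."""
    vec = np.zeros(N_SLOTS * len(FEATURES))
    for i, feats in enumerate(slots):
        for f in feats:
            vec[i * len(FEATURES) + FEATURES.index(f)] = 1.0
    return vec

# "Mary drives a bus": SLOT 0 = action (drives), SLOT 1 = human (Mary),
# SLOT 2 = artifact/entity (a bus), SLOT 3 = empty.
v = encode([["action"], ["human"], ["artifact", "entity"], []])
print(v.reshape(N_SLOTS, len(FEATURES)))
```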

  31. Vectors and propositions • Module 1 (POS): “Mary drives a bus” → SLOT 0: V MA (drives); SLOT 1: N (Mary); SLOT 2: DT N (a bus); SLOT 3: empty. Types & Backs. • Module 2 (semantic classes): “Mary drives a bus” → SLOT 0: action; SLOT 1: human; SLOT 2: artifact, entity; SLOT 3: empty. Types & Backs.

  32. [Figure: MODULE 1 network architecture. Input layer: the input word plus the current contents of Slot 0, Slot 1, Slot 2, Slot 3 and Type S; output layer: Back, Test & Subcat. Related history-based/deterministic parsers: Magerman (1994), Ratnaparkhi (1999), Yamada (2003), Nivre (2004).]

  33. [Figure: step-by-step trace of MODULE 1 parsing “The carpenter bought a shirt with a credit card”. Each word is assigned to a slot by a PUT operation (PUT 0 for the predicate slot; PUT 1, PUT 2, PUT 3 for the argument slots); the input layer shows the POS tags accumulated in Slots 0-3 (V MA PE PA, DT N C, IIN N C), and subcategorization triggers backtracking.]

  34. Module 2 supervises argument position • MODULE 1: P1) PRED: V MA PE PA (bought); ARG1: N PR (Mary); ARG2: DT N C (a shirt); ARG3: IIN N C (with pockets). • MODULE 2: P1) PRED: get, transfer, give, pay; ARG1: entity, person; ARG2: entity, object, artifact (shirt); ARG3: artifact, part-of-dress. • Subcategorization and selectional restrictions. • Parsing strategy: attach first to the current proposition.
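
The supervision step can be pictured as a compatibility test between the semantic classes of a candidate filler and the selectional restrictions the predicate places on that slot. A minimal sketch; the frame contents and class names here are invented for illustration, not the model's actual inventory:

```python
# Hypothetical selectional-restriction check in the spirit of Module 2.
BOUGHT_FRAME = {
    "ARG1": {"entity", "person"},
    "ARG2": {"entity", "object", "artifact"},
    "ARG3": {"instrument", "means-of-payment"},   # e.g. "with a credit card"
}

def accepts(frame: dict, arg: str, filler_classes: set) -> bool:
    # Accept the attachment when the filler shares at least one
    # semantic class with the restrictions on that slot.
    return bool(filler_classes & frame.get(arg, set()))

# "bought ... with a credit card": a plausible ARG3 of the verb.
print(accepts(BOUGHT_FRAME, "ARG3", {"artifact", "means-of-payment"}))  # True
# "bought ... with pockets": pockets are part-of-dress, not an instrument.
print(accepts(BOUGHT_FRAME, "ARG3", {"artifact", "part-of-dress"}))     # False
```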

  35. Binding problem

  36. “Mary bought a shirt with pockets” • Operations: BACK, CLEARP, IZ_IN0. • MODULE 1: P1) PRED: V MA PE PA (bought); ARG1: N PR (Mary); ARG2: DT N C (a shirt); ARG3: IIN N C (with pockets). • MODULE 2: P1) PRED: get, transfer, give, pay; ARG1: entity, person; ARG2: entity, object, artifact; ARG3: artifact, part-of-dress. • Parsing strategy: attach first to the current proposition.

  37. “Mary bought a shirt with pockets” • MODULE 1: P1) PRED: V MA PE (bought); ARG1: N PR (Mary); ARG2: DT N C (a shirt); ARG3: empty. P2) PRED: empty; ARG1: DT N C (a shirt); ARG2: IIN N C (with pockets); ARG3: empty. • MODULE 2: P1) PRED: get, transfer, pay, accept ...; ARG1: entity, person, ...; ARG2: entity, object, artifact, shirt; ARG3: empty. P2) PRED: empty; ARG1: entity, object, artifact, shirt; ARG2: artifact, part-of-dress; ARG3: empty.
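
The sequence on slides 34-37 amounts to an attach-first-then-backtrack policy: try the current proposition, and if the check fails, open a new proposition headed by the preceding noun. A sketch of that control decision, reusing the hypothetical `accepts` check and `BOUGHT_FRAME` from the slide 34 example:

```python
def attach(pp_classes: set, verb_frame: dict, noun_frame: dict) -> str:
    """Attach-first strategy with backtracking (illustrative)."""
    if accepts(verb_frame, "ARG3", pp_classes):
        return "P1: PP stays as ARG3 of the verb"
    if accepts(noun_frame, "ARG2", pp_classes):
        return "P2: backtrack, PP modifies the noun in a new proposition"
    return "no licensed attachment"

NOUN_FRAME = {"ARG2": {"artifact", "part-of-dress"}}      # "shirt with X"
print(attach({"part-of-dress"}, BOUGHT_FRAME, NOUN_FRAME))
# P2: backtrack, PP modifies the noun in a new proposition
print(attach({"means-of-payment"}, BOUGHT_FRAME, NOUN_FRAME))
# P1: PP stays as ARG3 of the verb
```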

  38. PARSING COMPLEX SENTENCES

  39. Elementary expressions

  40. “a blue shirt” • MODULE 1: PRED: JJ (blue) (SLOT 0); ARG1: DT N C (a shirt) (SLOT 1). • MODULE 2: PRED: colour, blue (SLOT 0); ARG1: entity, object, artifact (SLOT 1). • “the government’s minister” • MODULE 1: ARG1: DT N C (the minister) (SLOT 1); ARG2: N C POS (government’s) (SLOT 2); TYPE: POS (SLOT type). • MODULE 2: ARG1: entity, person ... (SLOT 1); ARG2: entity, person ... (SLOT 2); TYPE: POS (SLOT type).

  41. Complex sentences • A complex sentence is any sentence that is formed by more than one elementary expression. • A complex sentence requires more than one proposition for its semantic representation.
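
In the representation sketched in the introduction, this definition is direct: a complex sentence is a list of more than one `Proposition`, whose arguments may share fillers. An illustrative continuation of that earlier sketch:

```python
# "Mary loves the man who bought the blue car" as three propositions,
# reusing the illustrative Proposition type from the introduction.
the_man, the_car = "the man", "the car"
propositions = [
    Proposition(pred="loves",  args={"ARG1": "Mary",  "ARG2": the_man}),
    Proposition(pred="bought", args={"ARG1": the_man, "ARG2": the_car}),
    Proposition(pred="blue",   args={"ARG1": the_car}),
]
print(len(propositions) > 1)   # True: a complex sentence
```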

  42. [Figure: MODULE 1 architecture again. Input layer: input word, Slots 0-3, Type S; output layer: Back, Test & Subcat.]

  43. Non-invariant solution • [Figure: the complete sentence structure is encoded at once in the input layer and manipulated through operations with vectors.]

  44. Invariant solution (a kind of shift-and-reduce parser) • [Figure: the input layer (input word, Slots 0-3, Type S) acts as the focus of attention (current context); a STACK holds the stored context; output layer: Back, Test & Subcat.]
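
The invariant solution behaves like a shift-reduce parser in which only the focus of attention is visible and suspended material lives on the stack. A toy sketch of that control structure (the opener/closer tests stand in for the network's real decisions, which the slides do not specify):

```python
# Minimal shift/reduce-style loop: the focus of attention holds the
# current context; the stack stores suspended context (illustrative).
def parse(words, is_clause_opener, is_clause_closer):
    stack, focus = [], []          # stored context vs. focus of attention
    propositions = []
    for w in words:
        if is_clause_opener(w):    # e.g. a relative pronoun: push context
            stack.append(focus)
            focus = []
        focus.append(w)            # "shift" the word into the current context
        if is_clause_closer(w) and stack:
            propositions.append(focus)   # "reduce": close the subclause
            focus = stack.pop()          # restore the suspended context
    propositions.append(focus)
    return propositions

words = "Mary loves the man who bought the blue car".split()
print(parse(words, lambda w: w == "who", lambda w: w == "car"))
# [['who', 'bought', 'the', 'blue', 'car'], ['Mary', 'loves', 'the', 'man']]
```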

  45. [Figure: MODULE 1 with the stack added: the same architecture as slide 44, with the focus of attention (current context) feeding the network and the stored context on the STACK.]

  46. Concentric models (Cowan, 1988, 1995, 1999; Oberauer, 2002) • [Figure: long-term memory (LTM), the activated part of LTM, and the focus of attention shown as concentric regions.]

  47. [Figure: neurons whose receptive fields are invariant to translation and scale, in higher visual areas (inferotemporal cortex), contrasted with retinotopic visual neurons (as found in V1 and V2); covert attention selects one letter (“A”) among distractors (“J”, “L”).]
