
CPE 641 Natural Language Processing


Presentation Transcript


  1. CPE 641 Natural Language Processing HPSG II Asst. Prof. Nuttanart Facundes, Ph.D.

  2. HPSG • Highly structured representation of grammatical categories, encoded as typed feature structures. • A set of descriptive constraints on the modeled categories expressing linguistic generalizations.
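
  As a rough illustration of the first point, the sketch below builds a feature structure and unifies it with a descriptive constraint, using NLTK's FeatStruct. This is only an approximation: FeatStruct is untyped, and the attribute names (HEAD, VAL, SUBJ) are illustrative rather than the exact feature geometry assumed in the slides.

      from nltk.featstruct import FeatStruct

      # A grammatical category sketched as a feature structure (untyped here,
      # whereas HPSG proper uses typed feature structures).
      verb = FeatStruct("[HEAD=[POS='verb', VFORM='fin'], VAL=[SUBJ=[PER=3]]]")

      # A descriptive constraint: "the subject is singular".
      constraint = FeatStruct("[VAL=[SUBJ=[NUM='sg']]]")

      # Unification merges compatible descriptions (returns None on conflict).
      print(verb.unify(constraint))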

  3. HPSG Theory Consists of: • A lexicon licensing basic words • Lexical rules licensing derived words • Schemata licensing phrases • Statements about word/constituent order • Other rules/principles

  4. Basic Lexical Entry
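
  The entry itself appears as a figure in the original slides. As a stand-in, here is a hedged sketch of what a basic lexical entry for the verb "walks" might look like when written out; the PHON/CAT/CONT layout follows common textbook practice and is an assumption, not the slide's exact figure.

      from nltk.featstruct import FeatStruct

      # Illustrative lexical entry for "walks": phonology, syntactic category,
      # and semantic content.  Details vary across HPSG presentations.
      walks = FeatStruct(
          PHON="walks",
          CAT=FeatStruct(HEAD=FeatStruct(POS="verb", VFORM="fin"),
                         SUBJ=FeatStruct(HEAD=FeatStruct(POS="noun", CASE="nom"))),
          CONT=FeatStruct(RELN="walk", WALKER="x1"),
      )
      print(walks)  # pretty-prints an AVM-like matrix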

  5. Phrasal Structures
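
  The phrase structures on this slide are also shown as figures. In their spirit, the sketch below captures the core head-driven idea: a headed phrase shares its HEAD value with its head daughter (the Head Feature Principle), modelled here with a shared FeatStruct variable. Feature names are illustrative.

      from nltk.featstruct import FeatStruct

      # Head Feature Principle, roughly: the mother's HEAD value is identical
      # to the HEAD value of its head daughter (shared variable ?h).
      headed_phrase = FeatStruct("[HEAD=?h, HEAD_DTR=[HEAD=?h]]")
      verbal_head = FeatStruct("[HEAD_DTR=[HEAD=[POS='verb', VFORM='fin']]]")

      # After unification the mother's HEAD is filled in via the shared value.
      print(headed_phrase.unify(verbal_head))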

  6. Declarative characterization of natural language • HPSG uses an ‘ontology’ – a declaration of what exists • It uses a ‘type hierarchy’, which defines, for each type, which attributes are appropriate and which values those attributes may take.
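
  A minimal sketch of the ontology/type-hierarchy idea, using plain Python classes as a stand-in for grammar types (a real HPSG system would use a dedicated type-definition language; the type and attribute names below are illustrative assumptions).

      from dataclasses import dataclass

      # Toy type hierarchy: each type declares which attributes are appropriate
      # for it; subtypes inherit those attributes and may add their own.
      @dataclass
      class Sign:                  # supertype of everything the grammar licenses
          phon: str
          synsem: dict

      @dataclass
      class Word(Sign):            # sign -> word (no extra attributes)
          pass

      @dataclass
      class Phrase(Sign):          # sign -> phrase (adds a DAUGHTERS-like attribute)
          daughters: tuple = ()

      walks = Word(phon="walks", synsem={"HEAD": {"POS": "verb"}})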

  7. Expressing the theory • HPSG theory is specified in a formal description language, conventionally written as attribute-value matrices (AVMs).

  8. Semantics in HPSG • Encoded as the value of the CONTENT feature • Based on ‘event semantics’
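
  A hedged sketch of an event-style CONTENT value for "John walks": the verb contributes a relation, an event variable, and a role filled by the subject's index. The feature names (RELN, EVENT, WALKER) are illustrative and differ between HPSG presentations.

      # Illustrative CONTENT value for "John walks" under event semantics:
      # roughly walk(e) with John as the walker.
      content = {
          "RELN": "walk",       # the semantic relation contributed by the verb
          "EVENT": "e1",        # the event variable ('event semantics')
          "WALKER": "x_john",   # role argument, identified with the subject's index
      }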

  9. Minimal Recursion Semantics (MRS) • MRS was developed to provide meaning representations that do not force scope ambiguities to be resolved. • MRS is also useful for keeping track of how much meaning remains to be encoded or decoded when generating sentences.

  10. MRS representation • MRS representation of John makes pizza
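
  The representation itself is a figure in the slides. As an approximation, the flat bag of elementary predications for "John makes pizza" might look like the sketch below; the predicate names loosely follow ERG-style conventions, and the handle/variable labels are assumptions, so the slide's exact figure may differ.

      # Approximate EP bag for "John makes pizza" (e2 = the making event,
      # x1 = John, x6 = pizza; h* are handles labelling each EP).
      eps = [
          {"LBL": "h1",  "PRED": "proper_q", "ARG0": "x1", "RSTR": "h2", "BODY": "h3"},
          {"LBL": "h4",  "PRED": "named",    "ARG0": "x1", "CARG": "John"},
          {"LBL": "h5",  "PRED": "_make_v",  "ARG0": "e2", "ARG1": "x1", "ARG2": "x6"},
          {"LBL": "h7",  "PRED": "udef_q",   "ARG0": "x6", "RSTR": "h8", "BODY": "h9"},
          {"LBL": "h10", "PRED": "_pizza_n", "ARG0": "x6"},
      ]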

  11. MRS represents the meaning of an expression as a set of Elementary Predications (EPs). • An EP consists of a semantic relation and the argument(s) associated with that relation.
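
  The EP definition on this slide maps naturally onto a small data structure: a relation name plus the arguments associated with it. A minimal sketch (the class and field names are assumptions):

      from dataclasses import dataclass

      @dataclass
      class EP:
          """One Elementary Predication: a semantic relation and its arguments."""
          relation: str   # e.g. "_make_v"
          args: dict      # e.g. {"ARG0": "e2", "ARG1": "x1", "ARG2": "x6"}

      make = EP(relation="_make_v", args={"ARG0": "e2", "ARG1": "x1", "ARG2": "x6"})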

  12. MRS features & values
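
  The feature/value table on this slide is also a figure. For orientation, the top-level features standardly used in MRS (following Copestake et al.'s formulation) are a top handle, an index, the bag of EPs, and handle constraints; the sketch below assembles them for the "John makes pizza" EPs from slide 10, with the handle and variable labels again assumed for illustration.

      # Top-level MRS features and illustrative values:
      mrs = {
          "LTOP": "h0",                           # the top handle
          "INDEX": "e2",                          # the sentence's main event variable
          "RELS": eps,                            # the bag of EPs (see slide 10)
          "HCONS": [("h2", "qeq", "h4"),          # handle constraints ('qeq') that
                    ("h8", "qeq", "h10")],        # underspecify quantifier scope
      }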
