
NAACL Workshop on MTE

Presentation Transcript


  1. NAACL Workshop on MTE
     3rd in a Series of MTE adventures
     3 June 2001

  2. Structure of Workshop
  • One-day workshop
  • Constrained by this – less flexibility in choice
  • First part of day – review of MTE and ISLE
  • Second part of day – hands-on exercise
  • Much more constrained than this workshop

  3. First Exercise
  • Information Extraction and MT
  • Why named entity scoring alone is not sufficient
  • What else might go into the scoring process
  • Users will be given
    • Data set (pre-translated)
    • Templates to fill out
    • Different source texts from different engines
  • Score the ability to populate templates
  • Mark where information came from
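A minimal sketch of how template-population scoring could be computed for this exercise, assuming each filled template is a dict mapping slot names to extracted strings and is compared against a gold template by exact match. The slot names, data, and scoring function below are illustrative assumptions, not the workshop's actual protocol.

    from typing import Dict, Tuple

    def score_template(filled: Dict[str, str], gold: Dict[str, str]) -> Tuple[float, float, float]:
        """Slot-level precision, recall, and F1 for one filled template,
        using exact string match against the gold template."""
        correct = sum(1 for slot, value in filled.items() if gold.get(slot) == value)
        precision = correct / len(filled) if filled else 0.0
        recall = correct / len(gold) if gold else 0.0
        f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
        return precision, recall, f1

    # Hypothetical slots and values, purely for illustration.
    filled = {"event": "meeting", "location": "Geneva", "date": "3 June"}
    gold = {"event": "meeting", "location": "Geneva", "date": "4 June"}
    print(score_template(filled, gold))  # approximately (0.667, 0.667, 0.667)

Averaging such per-template scores over the data set would give one figure per MT engine, which is the kind of number the exercise could then compare across engines.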

  4. Second Exercise
  • Question-Answering as a follow-on to MT process:
    • Given a set of questions and related documents
    • Answer the questions
  • Scoring ability of system to support task
  • Marking text as to where information came from
  • Looking at the types of information (where versus when) preserved / lost
  • Inspired by the TREC Q/A tasks
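Similarly, a minimal sketch of answer scoring for the question-answering exercise, assuming each question has one or more acceptable gold answer strings and a system answer counts as correct if it contains any of them, loosely in the spirit of TREC-style answer keys. The matching rule and example data are assumptions, not the workshop's specification.

    from typing import Dict, List

    def qa_accuracy(system_answers: Dict[str, str],
                    gold_answers: Dict[str, List[str]]) -> float:
        """Fraction of questions whose system answer contains at least one
        acceptable gold answer string (case-insensitive substring match)."""
        correct = 0
        for qid, acceptable in gold_answers.items():
            answer = system_answers.get(qid, "").lower()
            if any(g.lower() in answer for g in acceptable):
                correct += 1
        return correct / len(gold_answers) if gold_answers else 0.0

    # Hypothetical questions and answers, purely for illustration.
    system = {"q1": "The meeting took place in Geneva.", "q2": "No answer found."}
    gold = {"q1": ["Geneva"], "q2": ["4 June 2001", "June 4, 2001"]}
    print(qa_accuracy(system, gold))  # 0.5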

  5. Analysis of Exercises
  • Given the number of participants, can:
    • Look for inter-annotator agreement
    • Look for stronger correlation of scores
    • Look at specific linguistic constructs
  • Expanding / commenting on ISLE
    • According to experiment results
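A minimal sketch of the kind of analysis listed above, assuming two annotators label the same items (for inter-annotator agreement via Cohen's kappa) and two lists of per-engine scores are compared (for correlation via Pearson's r). The slide does not say which statistics were actually used, so both choices and the example data are assumptions.

    from collections import Counter

    def cohens_kappa(labels_a, labels_b):
        """Cohen's kappa for two annotators' labels over the same items."""
        n = len(labels_a)
        observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
        freq_a, freq_b = Counter(labels_a), Counter(labels_b)
        expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
        return (observed - expected) / (1 - expected)

    def pearson_r(xs, ys):
        """Pearson correlation between two score lists (e.g. metric vs. human)."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = sum((x - mx) ** 2 for x in xs) ** 0.5
        sy = sum((y - my) ** 2 for y in ys) ** 0.5
        return cov / (sx * sy)

    # Hypothetical data: two annotators judging template slots as ok/wrong,
    # and two sets of per-engine scores to correlate.
    print(cohens_kappa(["ok", "ok", "wrong", "ok"], ["ok", "wrong", "wrong", "ok"]))  # 0.5
    print(pearson_r([0.62, 0.48, 0.71], [0.58, 0.50, 0.69]))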

  6. Some Specifics
  • Language pairs
    • Chinese > English
    • Arabic > English
    • Spanish > English
  • Follow-up reporting for later
