
LING 581: Advanced Computational Linguistics


Presentation Transcript


  1. LING 581: Advanced Computational Linguistics Lecture Notes February 9th

  2. Administrivia • Thanks for everyone’s presentations last time. • Next time: presentations of today’s homework exercise. • Did people go to Chomsky’s lecture on language the day before yesterday? http://education.azpm.org/azconnection/video-segment/2012/2/9/119-what-is-special-about-language/

  3. EVALB • Did everyone install and test EVALB? ../evalb -p sample.prm sample.gld sample.tst
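For anyone who still needs to set it up, here is a minimal sketch of building evalb and running it on its bundled sample files. The directory names are assumptions (the ../evalb path above suggests the sample files sit in a subdirectory of the EVALB distribution); adjust to wherever you unpacked it:

    cd EVALB && make        # compile evalb.c into the evalb binary
    cd sample
    ../evalb -p sample.prm sample.gld sample.tst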

  4. EVALB Results:

      Sent.                         Matched Bracket  Cross         Correct Tag
     ID  Len. Stat.  Recal   Prec.  Bracket gold test Bracket Words  Tags  Accracy
    ============================================================================
      1    4    0   100.00  100.00     4     4    4     0      4      4    100.00
      2    4    0    75.00   75.00     3     4    4     0      4      4    100.00
      3    4    0   100.00  100.00     4     4    4     0      4      3     75.00
      4    4    0    75.00   75.00     3     4    4     0      4      3     75.00
      5    4    0    75.00   75.00     3     4    4     0      4      4    100.00
      6    4    0    50.00   66.67     2     4    3     1      4      4    100.00
      7    4    0    25.00  100.00     1     4    1     0      4      4    100.00
      8    4    0     0.00    0.00     0     4    0     0      4      4    100.00
      9    4    0   100.00   80.00     4     4    5     0      4      4    100.00
     10    4    0   100.00   50.00     4     4    8     0      4      4    100.00
     11    4    2     0.00    0.00     0     0    0     0      4      0      0.00
     12    4    1     0.00    0.00     0     0    0     0      4      0      0.00
     13    4    1     0.00    0.00     0     0    0     0      4      0      0.00
     14    4    2     0.00    0.00     0     0    0     0      4      0      0.00
     15    4    0   100.00  100.00     4     4    4     0      4      4    100.00
     16    4    1     0.00    0.00     0     0    0     0      4      0      0.00
     17    4    1     0.00    0.00     0     0    0     0      4      0      0.00
     18    4    0   100.00  100.00     4     4    4     0      4      4    100.00
     19    4    0   100.00  100.00     4     4    4     0      4      4    100.00
     20    4    1     0.00    0.00     0     0    0     0      4      0      0.00
     21    4    0   100.00  100.00     4     4    4     0      4      4    100.00
     22   44    0   100.00  100.00    34    34   34     0     44     44    100.00
     23    4    0   100.00  100.00     4     4    4     0      4      4    100.00
     24    5    0   100.00  100.00     4     4    4     0      4      4    100.00
    ============================================================================
                    87.76   90.53    86    98   95    16    108    106     98.15

    === Summary ===

    -- All --
    Number of sentence        =   24
    Number of Error sentence  =    5
    Number of Skip sentence   =    2
    Number of Valid sentence  =   17
    Bracketing Recall         =  87.76
    Bracketing Precision      =  90.53
    Complete match            =  52.94
    Average crossing          =   0.06
    No crossing               =  94.12
    2 or less crossing        = 100.00
    Tagging accuracy          =  98.15

    -- len<=40 --
    Number of sentence        =   23
    Number of Error sentence  =    5
    Number of Skip sentence   =    2
    Number of Valid sentence  =   16
    Bracketing Recall         =  81.25
    Bracketing Precision      =  85.25
    Complete match            =  50.00
    Average crossing          =   0.06
    No crossing               =  93.75
    2 or less crossing        = 100.00
    Tagging accuracy          =  96.88
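To read the per-sentence numbers: recall is matched brackets over gold brackets, precision is matched brackets over test (parser) brackets. Sentence 6 above matches 2 of its 4 gold brackets (recall = 2/4 = 50.00) while proposing 3 brackets in total (precision = 2/3 = 66.67), and the bracket that conflicts with the gold tree is counted in the Cross Bracket column. The summary figures are computed over the totals for the valid sentences: recall = 86/98 = 87.76 and precision = 86/95 = 90.53.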

  5. EVALB • The paper behind evalb (Black et al., 1991): http://www.aclweb.org/anthology-new/H/H91/H91-1060.pdf

  6. EVALB • Motivation: grammars differ (9 systems) • Differences: • treatment of punctuation • auxiliary verb elements • infinitival “to” • Null (or empty) nodes
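These differences are what the evalb parameter file is there to neutralize. A sketch of how to peek at it: COLLINS.prm ships with evalb, but the specific lines shown in the comment are from memory, so treat them as an assumption and check your copy:

    grep -E '^(DELETE_LABEL|EQ_LABEL)' COLLINS.prm
    # expect lines along these lines:
    #   DELETE_LABEL TOP        ignore the added root bracket
    #   DELETE_LABEL -NONE-     ignore null/empty elements
    #   DELETE_LABEL ,          ignore punctuation preterminals
    #   EQ_LABEL ADVP PRT       treat the two labels as equivalent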

  7. EVALB • Example (Brown corpus): Doesn’t have to be a valid sentence of English after we do the reduction • Step 1:

  8. EVALB • After step 1: • Step 2:

  9. EVALB • After step 2: • Step 3: Gold standard

  10. EVALB

  11. EVALB • From the EVALB README:
    [5] HOW TO CREATE A GOLDFILE FROM THE PENN TREEBANK
    The gold and parsed files are in a format similar to this:
      (TOP (S (INTJ (RB No)) (, ,) (NP (PRP it)) (VP (VBD was) (RB n't) (NP (NNP Black) (NNP Monday))) (. .)))
    To create a gold file from the treebank:
      tgrep -wn '/.*/' | tgrep_proc.prl
    will produce a goldfile in the required format. ("tgrep -wn '/.*/'" prints parse trees, "tgrep_process.prl" just skips blank lines).
    For example, to produce a goldfile for section 23 of the treebank:
      tgrep -wn '/.*/' | tail +90895 | tgrep_process.prl | sed 2416q > sec23.gold
  • One line per parse
  • You don’t have or need the ancient program tgrep…
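The next slide shows how to get the same one-tree-per-line format with tsurgeon. As an alternative sketch (file names made up), a few lines of perl can do the flattening directly, since PTB .mrg files escape literal parentheses as -LRB-/-RRB-, so bracket depth alone identifies tree boundaries:

    # flatten multi-line .mrg trees: emit a newline whenever depth returns to 0
    perl -ne '
      for my $c (split //) {
        next if $c eq "\n";
        $depth++ if $c eq "(";
        print $c;
        if ($c eq ")") { $depth--; print "\n" if $depth == 0; }
      }
    ' wsj_0927.mrg > wsj_0927.gold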

  12. EVALB • ./tsurgeon.sh -treeFile wsj_0927.mrg -s
( (S (NP-SBJ-1 (NNP H.) (NNP Marshall) (NNP Schwarz)) (VP (VBD was) (VP (VBN named) (S (NP-SBJ (-NONE- *-1)) (NP-PRD (NP (NP (NN chairman)) (CC and) (NP (NN chief) (JJ executive) (NN officer))) (PP (IN of) (NP (NP (NNP U.S.) (NNP Trust) (NNP Corp.)) (, ,) (NP (NP (DT a) (JJ private-banking) (NN firm)) (PP (IN with) (NP (NP (NNS assets)) (PP (IN under) (NP (NN management))) (PP (IN of) (NP (QP (IN about) ($ $) (CD 17) (CD billion)) (-NONE- *U*)))))))))))) (. .)))
( (S (NP-SBJ (NP (NNP Mr.) (NNP Schwarz)) (, ,) (ADJP (NP (CD 52) (NNS years)) (JJ old)) (, ,)) (VP (MD will) (VP (VB succeed) (NP (NNP Daniel) (NNP P.) (NNP Davison)) (NP-TMP (NNP Feb.) (CD 1)) (, ,) (SBAR-TMP (RB soon) (IN after) (S (NP-SBJ (NNP Mr.) (NNP Davison)) (VP (VBZ reaches) (NP (NP (NP (DT the) (NN company) (POS 's)) (JJ mandatory) (NN retirement) (NN age)) (PP (IN of) (NP (CD 65))))))))) (. .)))
( (S (NP-SBJ-1 (NP (NNP Mr.) (NNP Schwarz)) (, ,) (SBAR (WHNP-2 (WP who)) (S (NP-SBJ (-NONE- *T*-2)) (VP (VBZ is) (NP-PRD (NP (NN president)) (PP (IN of) (NP (NNP U.S.) (NNP Trust))))))) (, ,)) (VP (MD will) (VP (VB be) (VP (VBN succeeded) (NP (-NONE- *-1)) (PP-LOC (IN in) (NP (DT that) (NN post))) (PP (IN by) (NP-LGS (NP (NNP Jeffrey) (NNP S.) (NNP Maurer)) (, ,) (NP (CD 42)) (, ,) (SBAR (WHNP-3 (WP who)) (S (NP-SBJ (-NONE- *T*-3)) (VP (VBZ is) (NP-PRD (NP (JJ executive) (NN vice) (NN president)) (PP (IN in) (NP (NP (NN charge)) (PP (IN of) (NP (NP (DT the) (NN company) (POS 's)) (NN asset-management) (NN group)))))))))))))) (. .)))
( (S (NP-SBJ (NP (NNP U.S.) (NNP Trust)) (, ,) (NP (NP (DT a) (JJ 136-year-old) (NN institution)) (SBAR (WHNP-2 (WDT that)) (S (NP-SBJ (-NONE- *T*-2)) (VP (VBZ is) (NP-PRD (NP (CD one)) (PP (IN of) (NP (NP (DT the) (JJS earliest) (NN high-net) (JJ worth) (NNS banks)) (PP-LOC (IN in) (NP (DT the) (NNP U.S.)))))))))) (, ,)) (VP (VBZ has) (VP (VBN faced) (NP (NP (VBG intensifying) (NN competition)) (PP (IN from) (NP (NP (JJ other) (NNS firms)) (SBAR (WHNP-3 (WDT that)) (S (NP-SBJ (-NONE- *T*-3)) (VP (VBP have) (VP (VP (VBN established) (NP (-NONE- *RNR*-1))) (, ,) (CC and) (VP (ADVP-MNR (RB heavily)) (VBN promoted) (NP (-NONE- *RNR*-1))) (, ,) (NP-1 (NP (JJ private-banking) (NNS businesses)) (PP (IN of) (NP (PRP$ their) (JJ own))))))))))))) (. .)))
  • You can then redirect standard output to a file …

  13. EVALB Example • cat ~/research/TREEBANK_3/parsed/mrg/wsj/23/*.mrg > wsj_23.mrg • Puts section 23 in one file (still multi-line trees) • ./tsurgeon.sh -treeFile wsj_23.mrg -s > wsj_23.gold • File wsj_23.gold contains one tree per line
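Once both the gold file and the parser output for section 23 are in the one-tree-per-line format, scoring is a single evalb call. A sketch, with wsj_23.tst as a made-up name for the Bikel Collins output file:

    ./evalb -p COLLINS.prm wsj_23.gold wsj_23.tst > wsj_23.evalb
    sed -n '/=== Summary ===/,$p' wsj_23.evalb    # recall, precision, tagging accuracy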

  14. Homework • WSJ corpus: sections 00 through 24 • Evaluation: on section 23 • Training: normally sections 02-21 (20 sections) • How do the Bikel Collins parser’s precision and recall vary with the amount of training data? • Randomly pick 1, 2, 3, … up to 20 sections to train on… • Score each run with evalb and plot the results (a data-preparation sketch follows) • Present your results next time
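A minimal sketch of the data-preparation step for one point on that graph, reusing the Treebank path from slide 13; the variable names and output file names are made up:

    N=5
    SECTIONS=$(seq -w 2 21 | shuf -n $N)       # pick N random training sections from 02-21
    echo "training on sections: $SECTIONS"
    for s in $SECTIONS; do
      cat ~/research/TREEBANK_3/parsed/mrg/wsj/$s/*.mrg
    done > train-$N.mrg
    # train on train-$N.mrg, parse section 23, score with evalb,
    # then plot bracketing recall/precision against N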

  15. Bikel Collins Training • Relevant WSJ PTB files
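A sketch of the training command, written from memory of Dan Bikel’s parser distribution; the script name, argument order (Java heap size in MB, settings file, training trees) and the settings file name are assumptions to check against the README in your copy:

    cd dbparser
    ./bin/train 800 settings/collins.properties ../train-5.mrg
    # the trainer writes a derived-events file next to its input,
    # e.g. ../train-5.mrg.obj.gz, which is what the parser loads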

  16. Bikel Collins Parsing • Parsing command • The trained wsj_XX.obj.gz file • Input file format (sentences)
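A matching sketch of the parsing step, again from memory of the Bikel distribution and therefore an assumption to verify against its README: heap size, settings file, the trained .obj.gz file, then the sentence file, with one sentence per line:

    ./bin/parse 400 settings/collins.properties ../train-5.mrg.obj.gz ../wsj-23.txt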

  17. Bikel Collins Parsing • You can extract the sentences in section 23 for parsing yourself, or you can download wsj-23.txt from the course webpage
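For the do-it-yourself route, the sentences can be recovered from the one-tree-per-line gold file of slide 13, since every leaf has the shape (TAG word). A sketch, not from the slides; note that PTB tokenization and escapes such as -LRB- are kept as-is:

    # print the word of each (TAG word) leaf, skipping -NONE- empty elements
    perl -ne '
      my @words;
      while (/\(([^()\s]+) ([^()\s]+)\)/g) {
        push @words, $2 unless $1 eq "-NONE-";
      }
      print "@words\n";
    ' wsj_23.gold > wsj-23.txt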
