
Carol Beer (Little Britain)




Presentation Transcript


  1. Carol Beer (Little Britain)

  2. Computer says “no”

  3. Question Answering • Lecture 1 (Today): Introduction; History of QA; Architecture of a QA system; Evaluation. • Lecture 2 (Friday): Question Classification; NLP techniques for question analysis; POS-tagging; Parsing; Semantic analysis; WordNet. • Lecture 3 (Next Monday): Retrieving Answers; Document pre-processing; Tokenisation; Stemming; Lemmatisation; Named Entity Recognition; Anaphora Resolution; Matching; Use of knowledge resources; Reranking; Sanity checking.

  4. What is Question Answering?

  5. Information Pinpointing • Information required: Average number of car accidents per year in Sweden. • Two ways of getting this information: • Ask Google or a similar search engine (good luck!) • Ask a QA system the question: What’s the rate of car accidents in Sweden?

  6. QA vs IR • Traditional method for information access: IR (Information Retrieval) • Think of IR as finding the “right book in a library” • Think of QA as a “librarian giving you the book and opening it on the page with the information you’re looking for”

  7. QA vs IE • Traditional method for information access: IE (Information Extraction) • Think of IE as finding answers to a pre-defined question (i.e., a template) • Think of QA as asking any question you like

  8. What is Question Answering? • Questions in natural language, not queries! • Answers, not documents!

  9. Why do we need QA? • Information overload problem • Accessing information using traditional methods such as IR and IE is limited • QA is increasingly important because: • The size of available information grows • There is duplicate information • There is false information • More and more “computer illiterates” are accessing electronically stored information

  10. Information Avalanche • Available information is growing*: • 1999: 250 MB per person on earth • 2002: 800 MB per person on earth • People want specific information • * source: M. de Rijke 2005

  11. People ask Questions* • * source: M. de Rijke 2005

  12. Why is QA hard? (1/3) • Questions are expressed in natural language (such as English or Italian) • Unlike formal languages, natural languages allow a great deal of flexibility • Example: • What is the population of Rome? • How many people live in Rome? • What’s the size of Rome? • How many inhabitants does Rome have?

  13. Why is QA hard? (2/3) • Answers are expressed in natural language (such as English or Italian) • Unlike formal languages, natural languages allow a great deal of flexibility • Example: • …is estimated at 2.5 million residents… • …current population of Rome is 2,817,000… • …Rome housed over 1 million inhabitants…

  14. Why is QA hard? (3/3) • Answers could be spread across different documents • Examples: • Which European countries produce wine? [Document A contains information about Italy, and document B about France] • What does Bill Clinton’s wife do for a living? [Document A explains that Bill Clinton’s wife is Hillary Clinton, and Document B tells us that she’s a politician]

  15. History of QA (de Rijke &amp; Webber 2003) • QA is by no means a new area! • Simmons (1965) reviews 15 implemented and working systems • Many ingredients of today’s QA systems are rooted in these early approaches • Database-oriented, domain-dependent systems, as opposed to today’s systems that work on large sets of unstructured texts

  16. Examples of early QA systems • BASEBALL (Green et al. 1963): Answers English questions about scores, locations and dates of baseball games • LUNAR (Woods 1977): Accesses chemical data on lunar material compiled during the Apollo missions • PHLIQA1 (Scha et al. 1980): Answers short questions against a database of computer installations in Europe

  17. Recent work in QA • Since the 1990s research in QA has by and large focused on open-domain applications • Recently interest in restricted-domain QA has increased, in particular in commercial applications • Banking, entertainment, etc.

  18. Architecture of a QA system • [Diagram: the question enters Question Analysis, which sends a query to IR over the corpus and passes an answer-type and a question representation forward; IR returns documents/passages to Document Analysis, which builds a passage representation; Answer Extraction matches the question and passage representations to produce answers]
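The data flow on this slide can be sketched in Python. All function names and bodies below are hypothetical stubs (not the lecture's code); they illustrate only what each component consumes and produces.

```python
# Sketch of the three components and the data flowing between them.
# Every name and stub body here is a hypothetical illustration.

def analyse_question(question):
    """Question Analysis: question -> IR query, expected answer type,
    and a (formal) question representation."""
    stopwords = {"how", "long", "is", "the"}           # toy stopword list
    query = " ".join(w for w in question.lower().rstrip("?").split()
                     if w not in stopwords)
    return query, "MEASURE", {"predicate": "length", "focus": query}

def analyse_documents(passages):
    """Document Analysis: retrieved passages -> formal passage representations."""
    return [{"source": p, "facts": []} for p in passages]

def extract_answers(answer_type, question_repr, passage_reprs):
    """Answer Extraction: match question and passage representations and
    return a ranked list of candidate answers (ranking left as a stub)."""
    return list(passage_reprs)

query, answer_type, q_repr = analyse_question("How long is the river Thames?")
passages = ["NYT199802-31", "APW199805-12", "NYT200011-07"]  # slide 27's documents
answers = extract_answers(answer_type, q_repr, analyse_documents(passages))
```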

  19. Question Analysis • Input: Natural Language Question • Output: Expected Answer Type; (Formal) Representation of Question • Techniques used: Machine learning, parsing

  20. Document Analysis • Input: Documents or Passages • Output: (Formal) Representation of Passages that might contain the answer • Techniques used: Tokenisation, Named Entity Recognition, Parsing

  21. Answer Retrieval • Input: Expected Answer Type; Question (formal representation); Passages (formal representation) • Output: Ranked list of answers • Techniques used: Matching, Re-ranking, Validation

  22. Example Run • [architecture diagram from slide 18, repeated as the backdrop for the following slides]

  23. Example Run • Question: How long is the river Thames? [architecture diagram]

  24. Example Run • IR query: length river thames [architecture diagram]

  25. Example Run • Answer-type: MEASURE [architecture diagram]

  26. Example Run • Question representation: Answer(x) &amp; length(y,x) &amp; river(y) &amp; named(y,thames) [architecture diagram]

  27. Example Run • Retrieved documents/passages: A: NYT199802-31, B: APW199805-12, C: NYT200011-07 [architecture diagram]

  28. Example Run • Passage representations: A: 30(u) &amp; mile(u) &amp; length(v,u) &amp; river(v); B: 60(z) &amp; centimeter(z) &amp; height(v,z) &amp; dog(v); C: 230(u) &amp; kilometer(u) &amp; length(x,u) &amp; river(x) [architecture diagram]

  29. Example Run • Extracted answers: A: 30 miles; B: 60 centimeter; C: 230 kilometer [architecture diagram]

  30. Evaluating QA systems • International evaluation campaigns for QA systems (open domain QA): • TREC (Text Retrieval Conference): http://trec.nist.gov/ • CLEF (Cross Language Evaluation Forum): http://clef-qa.itc.it/ • NTCIR (NII Test Collection for IR Systems): http://www.slt.atr.jp/CLQA/

  31. TREC-QA (organised by NIST) • Annual event, started in 1999 • Difficulty of the QA task increased over the years: • 1999: Answers in snippets, ranked list of answers; • 2005: Exact answers, only one answer. • Three types of questions: • Factoid questions • List questions • Definition questions

  32. QA@CLEF • CLEF is the “European edition” of TREC • Monolingual (non-English) QA • Bulgarian (BG), German (DE), Spanish (ES), Finnish (FI), French (FR), Italian (IT), Dutch (NL), Portuguese (PT) • Cross-Lingual QA • Questions posed in source language, answer searched in documents of target language • All combinations possible

  33. Open-Domain QA • QA at TREC is considered “Open-Domain” QA • Document collection is the AQUAINT Corpus (over a million documents) • Questions can be about anything • Restricted-Domain QA • Documents describe a specific domain • Detailed questions • Less redundancy of answers!

  34. TREC-type questions • Factoid questions • Where is the Taj Mahal? • List questions • What actors have played Tevye in ‘Fiddler on the Roof’? • Definition/biographical questions • What is a golden parachute? • Who is Vlad the Impaler?

  35.–40. What is a correct answer? • Example Factoid Question: When did Franz Kafka die? • Possible Answers, with the judgement revealed for each in turn: • Kafka died in 1923. [Incorrect] • Kafka died in 1924. [Inexact: under-informative] • Kafka died on June 3, 1924 from complications related to tuberculosis. [Inexact: over-informative] • Ernest Watz was born June 3, 1924. [Unsupported] • Kafka died on June 3, 1924. [Correct]

  41. Answer Accuracy • Answer Accuracy = (# correct answers) / (# questions)
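In code, this metric is a one-liner (a minimal sketch):

```python
def answer_accuracy(judgements):
    """Answer accuracy: fraction of questions whose returned answer was
    judged correct. `judgements` holds one boolean per question."""
    return sum(judgements) / len(judgements)

answer_accuracy([True, True] + [False] * 8)  # 2 correct out of 10 -> 0.2
```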

  42. Correct answers to list questions • Example List Question: Which European countries produce wine? • System A: France, Italy • System B: Scotland, France, Germany, Italy, Spain, Iceland, Greece, the Netherlands, Japan, Turkey, Estonia

  43. Evaluation metrics for list questions • Precision (P): P = (# answers judged correct &amp; distinct) / (# answers returned) • Recall (R): R = (# answers judged correct &amp; distinct) / (# known correct answers) • F-Score (F): F = 2·P·R / (P + R)
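A minimal sketch of these three metrics, tried on System A's answers from slide 44. The gold list below is a hypothetical stand-in for the assessors' list of 8 accepted answers (only its size matters for the slide's numbers):

```python
def list_question_scores(returned, known_correct):
    """Precision, recall, and F-score for a list question.

    returned:      the answers a system gave (may contain duplicates)
    known_correct: the gold list of accepted answers
    """
    correct = set(returned) & set(known_correct)   # correct & distinct
    p = len(correct) / len(returned) if returned else 0.0
    r = len(correct) / len(known_correct)
    f = 2 * p * r / (p + r) if p + r else 0.0
    return p, r, f

# Hypothetical gold list of 8 wine-producing countries:
GOLD = ["France", "Italy", "Germany", "Spain", "Greece",
        "Portugal", "Austria", "Hungary"]
p, r, f = list_question_scores(["France", "Italy"], GOLD)  # 1.0, 0.25, 0.4
```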

  44. Correct answers to list questions • Example List Question: Which European countries produce wine? • System A: France, Italy (P = 1.00, R = 0.25, F = 0.40) • System B: Scotland, France, Germany, Italy, Spain, Iceland, Greece, the Netherlands, Japan, Turkey, Estonia (P = 0.64, R = 0.88, F = 0.74)

  45. Other evaluation metrics • System A: Ranked answers (Accuracy = 0.2) • System B: Ranked answers (Accuracy = 0.1) • [tables of ranked answer lists not preserved in the transcript]

  46. Mean Reciprocal Rank (MRR) • Score for an individual question: • The reciprocal of the rank at which the first correct answer is returned • 0 if no correct response is returned • The score for a run: • Mean over the set of questions in the test

  47. MRR in action • System A: MRR = (0.2 + 1 + 1 + 0.2)/10 = 0.24 • System B: MRR = (0.5 + 0.33 + 0.5 + 0.25 + 1 + 0.5 + 0.5 + 0.5)/10 ≈ 0.41
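System A's figure can be reproduced from the ranks at which it first returns a correct answer (ranks inferred here from the reciprocals 0.2, 1, 1, 0.2 on the slide; questions with no correct answer contribute 0):

```python
def mean_reciprocal_rank(first_correct_ranks, n_questions):
    """MRR: mean over all questions of 1/rank of the first correct answer;
    a question with no correct answer (None) contributes 0."""
    return sum(1 / r for r in first_correct_ranks if r is not None) / n_questions

# System A: first correct answers at ranks 5, 1, 1, 5 for four of ten questions:
mean_reciprocal_rank([5, 1, 1, 5, None, None, None, None, None, None], 10)  # 0.24
```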

  48. Open-Domain Question Answering • TREC QA Track • Factoid questions • List questions • Definition questions • State-of-the-Art • Hard problem • Only a few systems with good results

  49. Friday • QA Lecture 2: • Question Classification • NLP techniques for question analysis • POS-tagging • Parsing • Semantic analysis • Use of lexical resources such as WordNet

  50. Question Classification (preview) • How many islands does Italy have? • When did Inter win the Scudetto? • What are the colours of the Lithuanian flag? • Where is St. Andrews located? • Why does oil float in water? • How did Frank Zappa die? • Name the Baltic countries. • Which seabird was declared extinct in the 1840s? • Who is Noam Chomsky? • List names of Russian composers. • Edison is the inventor of what? • How far is the moon from the sun? • What is the distance from New York to Boston? • How many planets are there? • What is the exchange rate of the Euro to the Dollar? • What does SPQR stand for? • What is the nickname of Totti? • What does the Scottish word “bonnie” mean? • Who wrote the song “Paranoid Android”?
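As a taste of Friday's topic, the questions above could be sorted into expected answer types by a naive keyword classifier. This is a toy sketch with illustrative rules and type labels of my own choosing, not the approach the lecture will present:

```python
import re

# Toy pattern-based question classifier; the rules and answer-type labels
# are illustrative assumptions, not the lecture's method.
RULES = [
    (r"^how many\b", "NUMBER"),
    (r"^how far\b|^what is the distance\b", "MEASURE"),
    (r"^when\b", "DATE"),
    (r"^where\b", "LOCATION"),
    (r"^who\b", "PERSON"),
    (r"^why\b|^how did\b", "REASON/MANNER"),
]

def classify(question):
    q = question.lower()
    for pattern, answer_type in RULES:
        if re.search(pattern, q):
            return answer_type
    return "OTHER"

classify("How many islands does Italy have?")  # NUMBER
classify("Where is St. Andrews located?")      # LOCATION
```

Real systems instead learn such mappings from annotated questions (the "machine learning" listed under Question Analysis on slide 19), since surface patterns alone misclassify questions like "Edison is the inventor of what?".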
