
Reconciling facts: how to check the consistency of facts created from web crawling



Presentation Transcript


  1. Reconciling facts: how to check the consistency of facts created from web crawling Dr Rob Stacey True Knowledge Ltd.

  2. True Knowledge
  • Open-domain question answering
  • Semantic query language
  • Structured and unstructured knowledge acquisition
  • >300 million facts
  • 20k+ classes
  • Billions of inferred facts

  3. Answering Questions Who was prime minister of the UK when Bernie Ecclestone was a teenager?

  4. Answering Questions

  5. Local time questions What is the time in Covent Garden now?

  6. Local time questions

  7. Is Madonna married?

  8. Answering a question: query processing
  • Triple representation: [london] [is an instance of] [city]
  • Temporal knowledge represented by “facts about facts”: [fact: [“123”]] [applies for timeperiod] [<1970 onwards>]
  • Richness within entity representation: “parametered” objects
    • [integer: [“8128”]]
    • [group: [london]; [san francisco]]
  * Actually 4 with negative relation
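The triple representation and the “fact about a fact” pattern on this slide can be sketched in a few lines of Python. This is a minimal illustration, not True Knowledge's actual data model; the class and field names are invented for the example.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Fact:
    """Hypothetical triple with an id, so other facts can refer to it."""
    fact_id: str
    subject: str
    relation: str
    obj: str

# A plain triple: [london] [is an instance of] [city]
f1 = Fact("123", "london", "is an instance of", "city")

# Temporal knowledge as a "fact about a fact": the subject of the
# meta-fact refers to another fact by its id, and the object is a
# time period rather than an ordinary entity.
f2 = Fact("456", "fact:123", "applies for timeperiod", "<1970 onwards>")

print(f1.subject, f1.relation, f1.obj)
print(f2.subject, f2.relation, f2.obj)
```

Giving every fact an id is what makes the temporal annotation possible: the meta-fact is just another triple whose subject happens to name a fact instead of an entity.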

  9. Mining the facts Achieving 96% accuracy with the freetext of Wikipedia

  10. Reconciliation: what system assessment can do
  • Accept incoming knowledge
  • Contradict knowledge
  • Make knowledge superfluous
  Uses user assessments and scoring to determine which facts are believed

  11. How assessment works
  • Run a negative version of the query: [married] ~[applies to] [madonna]
  • If the query result is unknown, the fact is new to the knowledge base
  • If the result is no, the fact is either superfluous or an endorsement
  • If the result is yes, there is a contradiction
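The three-way assessment above can be sketched as a small Python function. The knowledge base and query mechanics here are stand-ins invented for illustration; only the decision logic follows the slide.

```python
UNKNOWN, NO, YES = "unknown", "no", "yes"

def query_negative(kb, subject, relation, obj):
    """Run the negated query ~[relation] over a toy knowledge base
    holding sets of asserted-true and asserted-false triples."""
    triple = (subject, relation, obj)
    if triple in kb["true"]:
        return NO       # the positive fact holds, so its negation is false
    if triple in kb["false"]:
        return YES      # the negation is asserted: a contradiction looms
    return UNKNOWN      # the knowledge base knows nothing either way

def assess(kb, subject, relation, obj):
    """Classify an incoming fact per the assessment rules on the slide."""
    result = query_negative(kb, subject, relation, obj)
    if result == UNKNOWN:
        return "new"                          # fact is new to the KB
    if result == NO:
        return "superfluous or endorsement"   # fact already supported
    return "contradiction"                    # result == YES
```

For example, asserting [married] [applies to] [madonna] against a knowledge base that already believes the opposite would return "contradiction", handing the decision over to assessment scoring.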

  12. Fact already proven
  The assertion may simply be an existing fact; if so, more weight is added to the truth of that fact.
  If the fact is different then it is superfluous to the system, though still valid, as it removes the need for inference.

  13. Contradiction
  • One of two facts must be wrong
  • Assessment scoring decides which fact to believe
  • The loser is contradicted and not believed or used in query processing
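Resolving a contradiction by score might look like the sketch below. The scoring input (a count of endorsing user assessments) is invented for illustration; the slide only says that assessment scoring picks the winner.

```python
def resolve_contradiction(fact_a, fact_b, score):
    """Return (believed, contradicted): the higher-scoring fact wins.

    `score` is any callable mapping a fact to a numeric belief score;
    here it is a hypothetical count of endorsing user assessments.
    """
    if score(fact_a) >= score(fact_b):
        return fact_a, fact_b
    return fact_b, fact_a

# Hypothetical endorsement counts for two contradictory facts.
endorsements = {"madonna is married": 2, "madonna is not married": 7}
believed, loser = resolve_contradiction(
    "madonna is married", "madonna is not married", endorsements.get
)
```

The loser is not deleted: per the slide, it remains in the system but is marked contradicted, so query processing ignores it.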

  14. Thanks & Questions. Dr Rob Stacey - True Knowledge Ltd.
