
Logical Foundation of the Semantic Web






  1. Logical Foundation of the Semantic Web. Lecturer: Zhisheng Huang, Vrije Universiteit Amsterdam, The Netherlands, huang@cs.vu.nl. Teaching assistant: Wei Hu, Southeast University, whu@seu.edu.cn

  2. Schedule

  3. Lecture 5: Ontology Management and Reasoning (I) • Reasoning and Management of Ontologies • Reasoning with Inconsistent Ontologies • Reasoning with Multi-version Ontologies • Ontology Revision and Ontology Evolution • Conclusion and Discussion

  4. Ontology Reasoning and Inconsistency Management

  5. A core Semantic Web research project: SEKT • Semantically Enabled Knowledge Technologies (SEKT) • A European research and development project launched under the EU Sixth Framework Programme.

  6. Duration and Partners • Three-year project: January 2004 – December 2006. • 13 partners: • Companies: BT (British Telecom), Empolis GmbH, iSOCO (Spain), Kea-pro GmbH, Ontoprise, Sirma AI EOOD (Bulgaria), (+ Siemens) • Universities: Jozef Stefan Institute (Slovenia), Univ. Karlsruhe (Germany), Univ. Sheffield (U.K.), Univ. Innsbruck (Austria), Univ. Autonoma Barcelona (Spain), Vrije Universiteit Amsterdam (The Netherlands)

  7. Case Studies • Legal Domain (iSOCO) • Telecom Domain (BT) • Siemens

  8. SEKT Activities and Relationships

  9. Core Tasks: WP3

  10. SEKT WP3 Architecture

  11. Inconsistency and the Semantic Web • The Semantic Web is characterized by • scalability, • distribution, and • multi-authorship • All these may introduce inconsistencies.

  12. Ontologies will be inconsistent • Because of: • mistreatment of defaults • polysemy • migration from another formalism • integration of multiple sources • … • (“Semantic Web as a wake-up call for KR”)

  13. Example: Inconsistency by mistreatment of default rules. MadCow Ontology • Cow ⊑ Vegetarian • MadCow ⊑ Cow • MadCow ⊑ ∃Eat.BrainofSheep • Sheep ⊑ Animal • Vegetarian ⊑ ∀Eat.¬(Animal ⊔ PartofAnimal) • Brain ⊑ PartofAnimal • ...... • theMadCow ∈ MadCow • ...

  14. Example: Inconsistency through migration from another formalism. DICE Ontology • Brain ⊑ CentralNervousSystem • Brain ⊑ BodyPart • CentralNervousSystem ⊑ NervousSystem • BodyPart ⊑ ¬NervousSystem

  15. Inconsistency and Explosion • Classical entailment is explosive: P, ¬P |= Q. Any formula is a logical consequence of a contradiction. • Conclusions derived from an inconsistent ontology by standard reasoning may therefore be completely meaningless.
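The explosion principle can be checked mechanically. A minimal sketch (not part of the lecture material): a brute-force propositional entailment test over all valuations shows that the contradictory premise set {P, ¬P} entails an unrelated atom Q vacuously, because it has no model at all.

```python
# Brute-force propositional entailment: premises entail the conclusion
# iff every valuation satisfying all premises satisfies the conclusion.
from itertools import product

def entails(premises, conclusion, atoms):
    """premises/conclusion are functions from a valuation dict to bool."""
    for values in product([False, True], repeat=len(atoms)):
        v = dict(zip(atoms, values))
        if all(p(v) for p in premises) and not conclusion(v):
            return False
    return True

P = lambda v: v["P"]
notP = lambda v: not v["P"]
Q = lambda v: v["Q"]

# {P, ¬P} has no model, so it vacuously entails Q, and ¬Q, and anything.
print(entails([P, notP], Q, ["P", "Q"]))                     # True
print(entails([P, notP], lambda v: not v["Q"], ["P", "Q"]))  # True
```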

  16. Why DL reasoning cannot escape the explosion • Derivation checking is usually reduced to satisfiability checking: • Σ |= φ iff Σ ∪ {¬φ} is not satisfiable. • Tableau algorithms are approaches based on satisfiability checking. • Σ is inconsistent => Σ is not satisfiable => Σ ∪ {¬φ} is not satisfiable (for any φ).

  17. Two main approaches to deal with inconsistency • Inconsistency Diagnosis and Repair • Ontology Diagnosis (Schlobach and Cornet 2003) • Reasoning with Inconsistency • Paraconsistent logics • Limited inference (Levesque 1989) • Approximate reasoning (Schaerf and Cadoli 1995) • Resource-bounded inference (Marquis et al. 2003) • Belief revision based on relevance (Chopra et al. 2000)

  18. What an inconsistency reasoner is expected to do • Given an inconsistent ontology, return meaningful answers to queries. • General solution: use non-standard reasoning to deal with inconsistency • Σ |= φ: the standard inference relation • Σ |≈ φ: a non-standard inference relation

  19. Reasoning with inconsistent ontologies: Main Idea Starting from the query, • select a consistent sub-theory by using a relevance-based selection function. • apply standard reasoning on the selected sub-theory to find meaningful answers. • If this cannot give a satisfying answer, the selection function relaxes the relevance degree to extend the consistent sub-theory for further reasoning.

  20. New formal notions are needed • Accepted: Σ |≈ φ and not Σ |≈ ¬φ • Rejected: not Σ |≈ φ and Σ |≈ ¬φ • Overdetermined: Σ |≈ φ and Σ |≈ ¬φ • Undetermined: neither Σ |≈ φ nor Σ |≈ ¬φ • Soundness: only classically justified results • Meaningfulness: sound and never overdetermined

  21. Some Formal Definitions • Soundness: Σ |≈ φ => ∃Σ' ⊆ Σ (Σ' consistent and Σ' |= φ). • Meaningfulness: sound and consistent, i.e. Σ |≈ φ => not (Σ |≈ ¬φ). • Local completeness w.r.t. a consistent Σ' ⊆ Σ: Σ' |= φ => Σ |≈ φ. • Maximality: locally complete w.r.t. a maximal consistent subset Σ'. • Local soundness w.r.t. a consistent Σ' ⊆ Σ: Σ |≈ φ => Σ' |= φ.

  22. Selection Functions Given an ontology T and a query φ, a selection function s(T,φ,k) returns a subset of the ontology at each step k > 0.

  23. General framework • Use a selection function s(T,φ,k), with s(T,φ,k) ⊆ s(T,φ,k+1) • Start with k = 0: s(T,φ,0) |= φ or s(T,φ,0) |= ¬φ? • Increase k until s(T,φ,k) |= φ or s(T,φ,k) |= ¬φ • Abort when • undetermined at the maximal k • overdetermined at some k
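The framework above can be sketched as a small driver loop. This is an illustrative sketch, not PION's actual API: `select` and `entails` are parameters standing in for a relevance-based selection function and a DL reasoner, and the toy stand-ins below treat axioms as literals with membership as entailment.

```python
def neg(q):
    """Toy negation on string literals: '~' prefix toggles polarity."""
    return q[1:] if q.startswith("~") else "~" + q

def answer(ontology, query, select, entails, max_k):
    """Linear extension: grow the selected sub-theory until the query
    (or its negation) is entailed, or the step limit is reached."""
    for k in range(max_k + 1):
        subset = select(ontology, query, k)
        pos, negative = entails(subset, query), entails(subset, neg(query))
        if pos and negative:
            return "overdetermined"
        if pos:
            return "accepted"
        if negative:
            return "rejected"
    return "undetermined"

# Toy stand-ins: the selection function just takes the first k axioms.
axioms = ["a", "b", "~c"]
select = lambda T, q, k: set(T[:k])
entails = lambda S, q: q in S

print(answer(axioms, "b", select, entails, max_k=3))   # accepted
print(answer(axioms, "c", select, entails, max_k=3))   # rejected
print(answer(axioms, "d", select, entails, max_k=3))   # undetermined
```

The monotonicity requirement s(T,φ,k) ⊆ s(T,φ,k+1) is what makes the "first answer found" well-defined: once the sub-theory entails φ, every later sub-theory would too.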

  24. Inconsistency Reasoning Processing: Linear Extension

  25. Proposition: Linear Extension • Never overdetermined • May be undetermined • Always sound • Always meaningful • Always locally complete • May not be maximal • Always locally sound

  26. Direct Relevance and k-Relevance • Direct relevance (0-relevance): two formulas φ and ψ share a common name: C(φ) ∩ C(ψ) ≠ ∅ ∨ R(φ) ∩ R(ψ) ≠ ∅ ∨ I(φ) ∩ I(ψ) ≠ ∅. • k-relevance: there exist formulas ψ0, ψ1, …, ψk such that φ and ψ0, ψ0 and ψ1, …, ψk and ψ are directly relevant.

  27. Relevance-based Selection Functions • s(T,φ,0) = ∅ • s(T,φ,1) = {ψ ∈ T : ψ is directly relevant to φ} • s(T,φ,k) = {ψ ∈ T : ψ is directly relevant to s(T,φ,k-1)}
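A minimal sketch of this syntactic selection function, under the simplifying assumption that each axiom is modelled as the frozenset of concept/role/individual names it mentions (this representation is illustrative, not PION's):

```python
def directly_relevant(names_a, names_b):
    """Direct (0-)relevance: the two sides share at least one name."""
    return bool(names_a & names_b)

def select(ontology, query_names, k):
    """s(T, query, k): axioms reachable in at most k relevance steps."""
    selected, frontier = set(), set(query_names)
    for _ in range(k):
        new = {ax for ax in ontology
               if ax not in selected and directly_relevant(ax, frontier)}
        if not new:
            break                       # fixpoint reached early
        selected |= new
        frontier |= set().union(*new)   # names of newly added axioms
    return selected

# Fragment of the MadCow example, axioms as name sets.
T = [frozenset({"MadCow", "Cow"}),
     frozenset({"Cow", "Vegetarian"}),
     frozenset({"Sheep", "Animal"})]

print(select(T, {"MadCow"}, 1))  # only the MadCow/Cow axiom
print(select(T, {"MadCow"}, 2))  # adds the Cow/Vegetarian axiom
```

Note that s(T,φ,k) ⊆ s(T,φ,k+1) holds by construction, as the general framework requires.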

  28. PION Prototype PION: Processing Inconsistent ONtologies http://wasp.cs.vu.nl/sekt/pion

  29. An Extended DIG Description Logic Interface for Prolog (XDIG) • A logic programming infrastructure for the Semantic Web • Similar to SOAP • Application independent, platform independent • Support for DIG clients and DIG servers.

  30. XDIG • As a DIG client, the Prolog programs can call any external DL reasoner which supports the DIG DL interface. • As a DIG server, the Prolog programs can serve as a DL reasoner, which can be used to support additional reasoning processing, like inconsistency reasoning, multi-version reasoning, and inconsistency diagnosis and repair.

  31. XDIG package • The XDIG package and the source code are now available for public download at the website: http://wasp.cs.vu.nl/sekt/dig/ • The package includes five examples of how XDIG can be used to develop extended DL reasoners.

  32. Answer Evaluation • Intended Answer (IA): PION answer = intuitive answer • Cautious Answer (CA): PION answer is ‘undetermined’, but the intuitive answer is ‘accepted’ or ‘rejected’. • Reckless Answer (RA): PION answer is ‘accepted’ or ‘rejected’, but the intuitive answer is ‘undetermined’. • Counter-Intuitive Answer (CIA): PION answer is ‘accepted’ but the intuitive answer is ‘rejected’, or vice versa.

  33. Preliminary Tests with Syntactic-relevance Selection Function

  34. Observation • Intended answers include many undetermined answers. • Some counter-intuitive answers • Reasonably good performance

  35. Intensive Tests on PION • Evaluation and testing of PION with several realistic ontologies: • Communication Ontology • Transportation Ontology • MadCow Ontology • Each ontology has been tested with thousands of queries under different selection functions.

  36. Summary • We proposed a general framework for reasoning with inconsistent ontologies • based on selecting ever-increasing consistent subsets • the choice of selection function is crucial • query-based selection functions are flexible enough to find intended answers • simple syntactic selection works surprisingly well

  37. Extension • Semantic Relevance Based Selection Functions • K-extension • Variants of over-determined processing strategies • Integrating with the diagnosis approach

  38. Using Semantic Distances for Reasoning with Inconsistent Ontologies • Google distances are used to develop semantic relevance functions for reasoning with inconsistent ontologies. • Assumption: the more frequently two concepts appear in the same web pages, the more semantically similar (relevant) they are.

  39. Google Distances (Cilibrasi and Vitanyi 2004) • Google distance is measured in terms of the co-occurrence of two search terms on the Web via the Google search engine. • The Normalized Google Distance (NGD) measures similarity/light-weight semantic relevance: • NGD(x,y) = (max{log f(x), log f(y)} − log f(x,y)) / (log M − min{log f(x), log f(y)}) • where f(x) is the number of Google hits for x, f(x,y) is the number of Google hits for the pair of search terms x and y, and M is the number of web pages indexed by Google.
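The NGD formula transcribes directly into code. A small sketch, where the hit counts in the example are made-up placeholders rather than real Google counts:

```python
from math import log

def ngd(fx, fy, fxy, M):
    """Normalized Google Distance from hit counts f(x), f(y), f(x,y)
    and the index size M."""
    return ((max(log(fx), log(fy)) - log(fxy))
            / (log(M) - min(log(fx), log(fy))))

# A term co-occurring with itself as often as it occurs has distance 0;
# rarer co-occurrence pushes the distance up.
print(ngd(1000, 1000, 1000, 10**9))  # 0.0
print(ngd(1000, 1000, 100, 10**9) > 0)  # True
```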

  40. Semantic Distances • Define semantic distances (SD) between two formulas in terms of semantic distances between two concepts/roles/individuals (NGD)

  41. Postulates for Semantic Distances

  42. Semantic Distances The semantic distance between two formulas is measured as the ratio of the summed distance over the differences between the two formulas to the maximal distance between two formulas.

  43. Proposition • The semantic distance SD satisfies the properties Range,Reflexivity, Symmetry, Maximum Distance, and Intermediate Values.

  44. Example: MadCow NGD(MadCow, Grass)=0.7229 NGD(MadCow, Sheep)=0.6120

  45. Implementation: PION PION: Processing Inconsistent ONtologies http://wasp.cs.vu.nl/sekt/pion

  46. Answer Evaluation • Intended Answer (IA): query answer = intuitive answer • Cautious Answer (CA): query answer is ‘undetermined’, but the intuitive answer is ‘accepted’ or ‘rejected’. • Reckless Answer (RA): query answer is ‘accepted’ or ‘rejected’, but the intuitive answer is ‘undetermined’. • Counter-Intuitive Answer (CIA): query answer is ‘accepted’ but the intuitive answer is ‘rejected’, or vice versa.

  47. Syntactic approach vs. Semantic approach: quality of query answers

  48. Syntactic approach vs. Semantic approach: Time Performance

  49. Summary • The run-time of the semantic approach is much better than the syntactic approach, while the quality remains comparable. • The semantic approach can be parameterised so as to stepwise further improve the run-time with only a very small drop in quality.
