
Ontology, ontologies and ontological reasoning 3: ontological reasoning

Joost Breuker, Leibniz Center for Law, University of Amsterdam. Overview: Semantic Web and OWL; use of ontologies; reasoning with ontologies; TRACS: testing the Dutch Traffic Regulation.


Presentation Transcript


  1. Ontology, ontologies and ontological reasoning 3: ontological reasoning

    Joost Breuker Leibniz Center for Law University of Amsterdam
  2. Overview Semantic Web and OWL Use of ontologies Reasoning with ontologies TRACS: testing the Dutch Traffic Regulation HARNESS: DL-based legal assessment Frameworks and the limits of DL based reasoning Problem solving and reasoning
  3. What the Semantic Web is intended for Dream part 2 “In communicating between people using the Web, computers and networks have as their job to enable the information space, and otherwise get out of their way. But doesn’t it make sense to also bring computers more onto action, to put their analytic power to work. In part two of the dream, that is just what they do. The first step is putting data on the web in a form that machines can naturally understand, or converting it to that form. This creates what I call a Semantic Web -- a web of data that can be processed directly or indirectly by machines.” [p191]
  4. A decade later (W3C): infrastructural standards for SW Semantics are represented by ontologies Ontologies are represented by a KR formalism Note: ontologies were specifications in KE (using Ontolingua; CML, cf UML in SE) On top of a layer cake of data-handling formalisms KR formalism is intended for reasoning Even suitable for blind trust (OWL-DL is decidable)
  5. Legal ontologies (from Nuria Casellas, 2008/9)
  6. Legal ontologies (from Nuria Casellas, 2008/9)
  7. HOWEVER, in practice Not one of these ontologies is used for reasoning Use: information management (documents) That is also what the current Semantic Web efforts are about (not only in legal domains) Core ontologies (reuse?) Why use OWL?
  8. OWL: DL-based knowledge representation OWL-DL is a unique result of 40 years of research in AI about KR Semantic networks, KL-ONE, CLASSIC, LOOM, … Concept-oriented representation Very suitable for ontologies vs rule-based KR End of the 1980s: logical foundations A KR formalism defines what can be correctly and completely inferred On the basis of the semantics (model theory) of the formalism Problem: finding a subset of predicate logic that is decidable (and computationally tractable)
  9. History of KR (Hoekstra, 2009)
  10. OWL’s semantic web context [diagram]
  11. OWL’s further requirements/problems …besides the expressivity/decidability trade-off The RDF layer (OO based) was a serious obstacle Its expressiveness was incompatible with DL research Ian Horrocks, Peter F. Patel-Schneider, and Frank van Harmelen. From SHIQ and RDF to OWL: The making of a web ontology language. Journal of Web Semantics, pages 7–26, 2003. No unique naming assumption USA/EU team: KR & KA community NB: improved expressivity in OWL 2! Still: OWL is not self-evident for novices…
  12. Reasoning with OWL Main inference: classification on the basis of properties of concepts Reasoner (inference engine): `classifier’ Complete for DL-based KR Rule-based reasoners are not complete E.g. Prolog, unless a `closed world assumption’ is made For the Web this assumption cannot hold! T(erminology)-Box: ontology (knowledge) Classes (concepts, universals) and properties (relations, attributes, features, …) A(ssertions)-Box: some situation (information) Individuals (instances) with properties
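The T-Box/A-Box split above can be sketched in a few lines of plain Python. This is a toy with invented class names and properties; a real DL classifier such as Pellet handles far more (quantifiers, inverse roles, open-world reasoning):

```python
# Toy sketch of T-Box/A-Box classification (names are invented):
# each T-Box class is defined by a set of required (property, value)
# conditions; an A-Box individual is classified under every class
# whose conditions it satisfies.

tbox = {
    "Vehicle": {("has_wheels", True)},
    "MotorVehicle": {("has_wheels", True), ("has_engine", True)},
}

abox = {
    "car1": {("has_wheels", True), ("has_engine", True)},
    "bike1": {("has_wheels", True)},
}

def classify(individual):
    """Return every T-Box class whose definition the individual satisfies."""
    props = abox[individual]
    return sorted(c for c, conds in tbox.items() if conds <= props)

print(classify("car1"))   # ['MotorVehicle', 'Vehicle']
print(classify("bike1"))  # ['Vehicle']
```

The point of the sketch is only the direction of inference: class definitions live in the T-Box, individuals in the A-Box, and classification connects the two.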
  13. When is REASONING with ontologies indicated (...except for consistency checking etc.) In understanding/modeling situations Situation = events and states of entities in space (and over time) Three main modes Text understanding (stories, cases) NB: except expository discourse… Scene understanding (robotics) Problem solving (knowledge engineering) Understanding the problem description
  14. Modelling situations Situations described as instances of concepts and relations: A-Box E.g. mereological (topological) and/or dependency relations Initial model An ontology describing the concepts and properties is applied to infer the implied information: = classifying the A-Box -> implied information is made explicit (`added’) static: description of a situation dynamic: explaining changes between situations Joost Breuker
  15. Situation 1 plus a `process’: structural (topological) descriptions of objects in space [diagram]
  16. Should predict this. Inferred: positions, breaking, etc. [diagram]
  17. Or: inferring explanations Given Situation 1 and Situation 2, we should be able to infer the processes involved This is a causal explanation NB: teleological explanation: There must have been a Situation 0 with an intention to throw a ball (or worse: to get rid of an ugly teapot..etc.)
  18. A-Box: events & states of objects (floor, desk, teapot, ball; T-1, T-2) [diagram] Joost Breuker, SIKS course, May 2006
  19. Identifying implied processes [diagram]
  20. Identifying causation [diagram]
  21. More detailed: even limiting causal effects of collisions. Why does the desk not move? [diagram]
  22. When is REASONING with ontologies required When all possible situations have to be modeled Typical examples: model-based & qualitative reasoning systems Testing system models …legal case assessment All possible combinations -> completeness & consistency, e.g. OWL-DL NB: in knowledge systems, situations are usually modeled implicitly in user-system dialogues: Asking the user (values of/presence of) parameters Heuristics; human limitations in handling combinatorics
  23. For instance: TRACS (1990 – 1994) Testing a new Dutch traffic code (RVV-90) art. 3 Vehicles should keep to the right art. 6 Two bicycles may ride next to each other art. 33 A trailer should have lights at the back Questions Consistent? Complete? In what respect different from RVV-66 (old one)? These can only be answered when we can model all possible situations distinguished by this code
  24. Traffic participants: a part of the ontology (`world knowledge’) traffic-participant pedestrian driver driver of motor vehicle bicyclist autocyclist motorcycle driver bus driver lorry driver car driver
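One plausible reading of this taxonomy as a Python sketch: direct subclass links plus a transitive subsumption check, the basic T-Box inference a classifier performs. The placement of `bicyclist` and `autocyclist` is our guess from the slide's layout:

```python
# Direct subclass links (one reading of the slide; placement of
# bicyclist and autocyclist is a guess from the slide layout).
subclass_of = {
    "pedestrian": "traffic-participant",
    "driver": "traffic-participant",
    "driver of motor vehicle": "driver",
    "bicyclist": "driver",
    "autocyclist": "driver of motor vehicle",
    "motorcycle driver": "driver of motor vehicle",
    "bus driver": "driver of motor vehicle",
    "lorry driver": "driver of motor vehicle",
    "car driver": "driver of motor vehicle",
}

def subsumes(general, specific):
    """Transitive subsumption: walk the subclass chain upwards."""
    while specific is not None:
        if specific == general:
            return True
        specific = subclass_of.get(specific)
    return False

print(subsumes("traffic-participant", "car driver"))     # True
print(subsumes("driver of motor vehicle", "bicyclist"))  # False
```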
  25. Simple example of ontological reasoning Ontology (T-Box) Subsumes (Physical_object, Car) Right_of (Physical_object, Physical_object) Inv (Right_of, Left_of) Case description (A-Box) Is-a (car1, Car) Is-a (car2, Car) Right_of (car1, car2) Classifier (e.g. Pellet) -> Left_of (car2, car1) (A-Box) …simple as that, but necessary
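The slide's inference can be reproduced with one saturation rule, a sketch of what a classifier does with the Inv(Right_of, Left_of) axiom:

```python
# Sketch of the slide's example: an inverse-property axiom
# Inv(Right_of, Left_of) lets a classifier add the implied
# Left_of assertion to the A-Box.

inverses = {"Right_of": "Left_of", "Left_of": "Right_of"}

# A-Box as a set of (property, subject, object) triples.
abox = {("Right_of", "car1", "car2")}

def saturate(abox):
    """Add the inverse of every asserted property, one of the
    many inference rules a DL classifier applies."""
    inferred = {(inverses[p], o, s) for (p, s, o) in abox if p in inverses}
    return abox | inferred

print(saturate(abox))  # now also contains ('Left_of', 'car2', 'car1')
```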
  26. Architecture of TRACS (Breuker & den Haan, 94) [diagram: world knowledge base, regulation knowledge base, meta-legal knowledge base; situation generator, regulation applier, validator, conflict resolver; applicable rules, consistently applicable rules, trespassed/non-trespassed rules]
  27. Btw: some surprising results Tram on tramway Car on bicycle lane
  28. Just a prototype… About 10^5 possible combinations Analysis of redundancy (symmetry) Still: too many for humans to inspect! But: Differences with old regulation Differences with foreign regulations (ontology the same?) …political decisions…
  29. HARNESS: OWL 2 DL also for normative reasoning Normative reasoning simultaneously with ontological reasoning using OWL-DL Estrella, 6th framework, 2006-2008 http://www.estrellaproject.org/ Saskia van de Ven, Joost Breuker, Rinke Hoekstra, Lars Wortel, and Abdallah El-Ali. Automated legal assessment in OWL 2. In Legal Knowledge and Information Systems. Jurix 2008: The 21st Annual Conference, Frontiers in Artificial Intelligence and Applications. IOS Press, December 2008. András Förhécz and György Strausz, Legal Assessment Using Conjunctive Queries, Proceedings LOAIT 2009
  30. Representing norms in OWL 2 DL Norm: Generic case description is a conjunction of generic situation (σ) descriptions Generic case description is a class (Σ) A deontic qualification (P, O, F) is associated with Σ Case description is an individual (C) Description is itself composed of classes/individuals as defined in the ontology!
  31. Watch this….
  32. What did you see? Event 1 (Saskia entering, shows ID) Event 2 (Joost entering, shows ID) Event 3 (Radboud entering) (nb : Radboud is president of Jurix)
  33. JURIX 2009 Regulation (= set of norms) For entering a U-building, identification is required The President does not need an identification to enter a U-building
  34. Generic situations and generic case For (entering a U-building), (an identification) is required The (President) does not need (an identification) to (enter a U-building) Step 1: Modeled as: Σ1  σ1∧σ2∧σ3 Σ2  σ4∧σ2(∧~σ3)
  35. Generic situations and generic case For (entering a U-building), (an identification) is required [for each person] The (President) does not need (an identification) to (enter a U-building) Step 1: Modeled as: Σ1  σ1∧σ2∧σ3 Σ2  σ4∧σ2(∧~σ3)
  36. Step 2: Adding deontic qualification to the norms Permitted(Σ1) σ1∧σ2∧σ3 (1) Obliged(Σ1) Forbidden(Σ1) σ1∧σ2(∧~σ3) (this is a `design pattern’ which separates the conditions (person, entering, and identity) from a forbidden generic case) (2) Permitted(Σ2) σ4∧σ2(∧~σ3) President
  37. Step 3: Normative assessment: classifying C(ase) Case: President Radboud enters U-building C: {σ4, σ2} Classifying C: Σ1 subsumes Σ2 (exception) C is Disallowed-by Σ1 C is Allowed-by Σ2 Etc., etc. This is not viewed as a logical conflict by Pellet, because this individual is classified by two different norms (classes) HARNESS selects the subsumed norm (Σ2)
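A hedged sketch of this assessment step, with our own condition names standing in for σ1..σ4: a case matches every norm class whose generic situations it instantiates, and the subsumed, more specific norm (the exception) is selected, as the slide says HARNESS does:

```python
# Toy version of HARNESS's step 3 (condition names are ours).
# A norm is a deontic qualification plus a set of generic situations;
# negated situations are encoded as explicit "not_..." tokens.

norms = {
    # entering a U-building without showing identification is forbidden:
    "N1": ("Forbidden", {"s2_entering", "not_s3_id"}),
    # the President may enter without identification (the exception):
    "N2": ("Permitted", {"s4_president", "s2_entering", "not_s3_id"}),
}

def assess(case):
    """Classify the case under all matching norms; prefer the most
    specific one (lex specialis: the subsumed norm wins)."""
    matching = [n for n, (_, conds) in norms.items() if conds <= case]
    winner = max(matching, key=lambda n: len(norms[n][1]))
    return winner, norms[winner][0]

# President Radboud enters without showing identification:
case = {"s4_president", "s2_entering", "not_s3_id"}
print(assess(case))  # ('N2', 'Permitted')
```

In HARNESS the preference is computed from class subsumption by the DL reasoner; the condition-count comparison here is only a stand-in for that subsumption test.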
  38. Experimental user interface (Protégé plug-in) Compliance Violation
  39. An important advantage Three knowledge bases: domain ontology (T-Box) norms (T-Box) case description (individuals & properties; A-Box) OWL-DL reasoner (Pellet) `classifies’ case in terms of concepts and of norms simultaneously in an intertwined fashion Hybrid or only-rule-based solutions cannot preserve all (inferred) information of the ontology as Pellet/OWL 2 does
  40. Knowledge, ontology and meaning There is more to knowledge than ontology Ontology (terminology) provides the basic units for understanding Regular combinations: patterns of concepts Scripts & frames: experience, heuristics, associations Meaningful experience can only be based upon understanding! Synthetic learning vs (further) abstraction
  41. What’s further new Monotonic and deductive: unique & against accepted wisdom Exceptions do not lead to conflict Advantage: reasoning is sound and complete (trust) No rule formalism allows this with the same expressiveness Full use of OWL 2 DL’s expressiveness No loss in translation Disadvantage: modeling in DL is found to be more intellectually demanding than modeling in rules Obligation design pattern is not very intuitive
  42. A serious problem in the use of (OWL-) DL DL representations are `variable free’ (most) rule formalisms have variables Moreover: in OWL names of individuals are not taken as identifiers of individuals (no unique naming assumption) It is not possible to track changes of a particular individual A-Box: colour(block1,red); colour(block1,blue) OWL: …there are (now) two block1’s!
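A sketch of why this bites, in our own toy representation: if `colour` is a functional property, a DL reasoner without the unique-name assumption does not report a clash for the slide's two assertions; it would instead conclude that the asserted values (or the two block1's) co-refer, so a change of state over time cannot be tracked:

```python
# Toy illustration of the no-unique-naming problem (names are ours).
# For a functional property, two values asserted for the same subject
# are not contradictory in OWL: the reasoner merges them (owl:sameAs)
# rather than detecting a state change.

functional = {"colour"}
abox = [("colour", "block1", "red"), ("colour", "block1", "blue")]

def inferred_sameas(abox, functional):
    """Collect the value sets a reasoner would merge via owl:sameAs."""
    by_key = {}
    for p, s, o in abox:
        if p in functional:
            by_key.setdefault((p, s), set()).add(o)
    return [vals for vals in by_key.values() if len(vals) > 1]

print(inferred_sameas(abox, functional))  # [{'red', 'blue'}]
```

A rule system with variables and unique names would instead see two distinct facts about one tracked individual, which is exactly the contrast the slide draws.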
  43. A serious problem in the use of (OWL-) DL (2) Also: it is (almost) impossible to `enforce’ identity of individuals in OWL Example: transaction
  44. OWL’s restriction on the form of graphs `diamond of individuals’ vs what OWL allows: trees
  45. An approximate solution: a special design pattern Constraining the identity: Rinke Hoekstra and Joost Breuker. Polishing diamonds in OWL2. In Aldo Gangemi and Jérôme Euzenat, editors, Proceedings of the 16th International Conference on Knowledge Engineering and Knowledge Management (EKAW 2008), LNAI/LNCS. Springer Verlag, October 2008.) Rinke Hoekstra. Ontology Representation - Design Patterns and Ontologies that Make Sense, volume 197 of Frontiers of Artificial Intelligence and Applications. IOS Press, Amsterdam, June 2009.
  46. The DL view has a limited scope Excellent for axiomatic grounding of the terms that form the lowest level of granularity of a knowledge base More complex knowledge structures (frameworks) will require also rules `Hybrid’ solution: “In the hybrid approach there is a strict separation between the ordinary predicates, which are basic rule predicates and ontology predicates, which are only used as constraints in rule antecedents. Reasoning is done by interfacing an existing rule reasoner with an existing ontology reasoner” Problem: rule formalism has to be `DL-safe’ OWL/rule combination still (W3C) research issue
  47. Frameworks: complex knowledge structures Stereotypical pattern of relationships Dependency and/or part-of structures (`causal’, `mereological’; `how’) Ontology as background (`what’) Learning by experience Recurring events or structures Justification Plans <-> rituals 1970s: Scripts (Schank), Frames (Minsky) Restaurant; House
  48. She is the … ?
  49. Three types of frameworks Situational frameworks Dependencies between events/actions E.g. scripts, business processes, etc. Mereological frameworks Structures, configurations of objects `topo-mereology’ Epistemological frameworks Dependencies between roles in reasoning Problem solving methods Hypotheses -> evidence -> conclusion Valente’s FOLaw
  50. Reasoning with frameworks As frameworks usually have lots of restrictions on the identity of objects (e.g. agents, roles), the approximate solutions may become very problematic, even impossible So we will also need rules Preferably in a hybrid architecture… We can split up the reasoning in 2 steps Static situation modeling by ontological reasoning (classification) Inferring all properties of a collection of entities Modeling change by frameworks …and if necessary: iterate…
  51. Qualitative reasoning: ontologies and frameworks Qualitative = quantities on an ordinal scale E.g.: neg-max, negative, zero, positive, pos-max Points, intervals Also: rates (increase, decrease) Special calculus Prediction of behaviour of a system on the basis of a situation description Structural description plus initial values: `scenario’ Behaviour is derived from Processes (changes) triggered by a set of conditions Model fragments (= a framework)
  52. QR - 2 E.g. heat-exchange-1 (conduction) Conditions: Tobj1 > Tobj2, share(surfobj1, surfobj2), phase(obj1, solid), etc. Process: Tobj1 decreases towards Tobj2; max = Tobj1 = Tobj2 Heat-exchange-2 (convection; Boyle) Conditions: … phase(obj1, liquid or gas)… Simplified: if the conditions match the situation, a transition is made from one state to the next Ambiguities due to coarse grain size Branching in causal chains (behaviour graph)
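The heat-exchange fragment can be mimicked with a toy qualitative simulation. Scale values and function names are ours, and real QR engines such as GARP use influences and a much richer calculus; this only shows the condition-triggered transition idea:

```python
# Toy qualitative heat exchange: while the condition T_obj1 > T_obj2
# holds, T_obj1 decreases and T_obj2 increases, one ordinal step per
# transition, until the process condition no longer holds. For an
# even ordinal gap the run ends exactly at T_obj1 = T_obj2 (the
# slide's "max"); scale values are invented for illustration.

SCALE = ["min", "low", "medium", "high", "max"]

def heat_exchange(t1, t2):
    """Run qualitative transitions; return the sequence of states."""
    i1, i2 = SCALE.index(t1), SCALE.index(t2)
    states = [(t1, t2)]
    while i1 > i2:                 # process condition: T_obj1 > T_obj2
        i1, i2 = i1 - 1, i2 + 1    # influences: decrease / increase
        states.append((SCALE[i1], SCALE[i2]))
    return states

print(heat_exchange("high", "low"))
# [('high', 'low'), ('medium', 'medium')]
```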
  53. GARP-3, Qualitative reasoning architecture situation description Where is the ontology?
  54. (called: GARP-3, simple)
  55. GARP and Ontologies GARP is written in Prolog No real distinction between ontology (entities) and model fragments (processes) However: as ontologies become widely available on the Web, cast in OWL, GARP now has OWL import/export Clearer distinction See Ken Forbus, Qualitative Modelling. In Frank van Harmelen, Vladimir Lifschitz and Bruce Porter (Eds), Handbook of Knowledge Representation, Amsterdam, Elsevier, 2008
  56. And now for a movie http://hcs.science.uva.nl/QRM/Garp3NNR.mov