Association in Level 2 Fusion - PowerPoint PPT Presentation

Presentation Transcript

  1. Association in Level 2 Fusion
     • Mieczyslaw M. Kokar
     • Christopher J. Matheus
     • Jerzy A. Letkowski
     • Kenneth Baclawski
     • Paul Kogut
     SPIE

  2. Overview
     • Data association in Level 1 and Level 2
     • Ontologies
     • Reasoning process
     • Examples of reasoning about associations
     • Useful OWL constructs
     • Confidence of association
     • Conclusion

  3. Level 1 Data Association
     • Data to object
       - Measurement to object
       - Measurement to track
     • Object to object
       - Track to track
       - ID to ID
     • Approach
       - Define a quantitative measure of distance
       - Minimize the distance
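The "define a distance, minimize it" approach can be sketched in a few lines. The gating threshold and the greedy nearest-neighbor strategy below are illustrative assumptions, not a method from the slides:

```python
import math

def associate(measurements, tracks, gate=5.0):
    """Greedy nearest-neighbor data association: assign each
    measurement to the closest track within the gating distance."""
    assignments = {}
    for m_id, m_pos in measurements.items():
        best_track, best_dist = None, gate
        for t_id, t_pos in tracks.items():
            d = math.dist(m_pos, t_pos)  # Euclidean distance
            if d < best_dist:
                best_track, best_dist = t_id, d
        assignments[m_id] = best_track  # None if no track lies inside the gate
    return assignments

measurements = {"m1": (1.0, 1.1), "m2": (9.0, 9.2), "m3": (40.0, 40.0)}
tracks = {"t1": (1.0, 1.0), "t2": (9.0, 9.0)}
print(associate(measurements, tracks))
```

This quantitative, distance-based style is exactly what the next slide argues is unavailable at Level 2, where the data are textual.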

  4. Level 2 Data Association Problem
     • Data may be text
     • Identifying "the same" individuals is called "co-reference"
     • No quantitative properties of objects
     • Text-processing tools are used, possibly translating the text to logic
     • Logic-based tools (reasoners or theorem provers) are used for deriving conclusions
     • Why not use them for data association?

  5. Ontology
     • An explicit specification of a conceptualization: the objects, concepts, and other entities that are assumed to exist in some area of interest and the relationships that hold among them (Genesereth & Nilsson, 1997)
     • Definitions associate the names of entities in the universe of discourse (e.g., classes, relations, functions, or other objects) with human-readable text describing what the names mean, and with formal axioms that constrain the interpretation and well-formed use of these terms. An ontology is a statement of a logical theory. (Gruber)
     • An agent commits to an ontology if its observable actions are consistent with the definitions in the ontology (knowledge level).
     • A common ontology defines the vocabulary with which queries and assertions are exchanged among agents.

  6. An Ontology (in UML)

  7. Identifying Associations in Level 2 Fusion
     [Diagram: text documents and an OWL ontology feed the AeroSWARM ontological annotator; the resulting annotations, together with a query, are checked by the ConsVISor consistency checker, which returns consistent, inconsistent, or unknown.]

  8. Ontology Fragment in OWL

     <owl:Class rdf:ID="Person">
       <rdfs:label xml:lang="en">Person</rdfs:label>
       <rdfs:subClassOf>
         <owl:Restriction>
           <owl:onProperty rdf:resource="#personName"/>
           <owl:cardinality rdf:datatype="http://www.w3.org/2000/10/XMLSchema#nonNegativeInteger">1</owl:cardinality>
         </owl:Restriction>
       </rdfs:subClassOf>
       <rdfs:subClassOf>
         <owl:Restriction>
           <owl:onProperty rdf:resource="#personAge"/>
           <owl:cardinality rdf:datatype="http://www.w3.org/2000/10/XMLSchema#nonNegativeInteger">1</owl:cardinality>
         </owl:Restriction>
       </rdfs:subClassOf>
     </owl:Class>

     <owl:Class rdf:ID="Leader">
       <rdfs:label xml:lang="en">Leader</rdfs:label>
       <rdfs:subClassOf rdf:resource="#Person"/>
       <rdfs:subClassOf>
         <owl:Restriction>
           <owl:onProperty rdf:resource="#leaderOf"/>
           <owl:cardinality rdf:datatype="http://www.w3.org/2000/10/XMLSchema#nonNegativeInteger">1</owl:cardinality>
         </owl:Restriction>
       </rdfs:subClassOf>
     </owl:Class>

     <owl:Class rdf:ID="Organization">
       <rdfs:label xml:lang="en">Organization</rdfs:label>
       <rdfs:subClassOf>
         <owl:Restriction>
           <owl:onProperty rdf:resource="#ledBy"/>
           <owl:cardinality rdf:datatype="http://www.w3.org/2000/10/XMLSchema#nonNegativeInteger">1</owl:cardinality>
         </owl:Restriction>
       </rdfs:subClassOf>
     </owl:Class>

  9. Ontology (cont.)

     <owl:DatatypeProperty rdf:ID="personName">
       <rdfs:domain rdf:resource="#Person"/>
       <rdfs:range rdf:resource="http://www.w3.org/2000/10/XMLSchema#string"/>
     </owl:DatatypeProperty>
     <owl:DatatypeProperty rdf:ID="personAge">
       <rdfs:domain rdf:resource="#Person"/>
       <rdfs:range rdf:resource="http://www.w3.org/2000/10/XMLSchema#string"/>
     </owl:DatatypeProperty>
     <owl:DatatypeProperty rdf:ID="orgName">
       <rdfs:domain rdf:resource="#Organization"/>
       <rdfs:range rdf:resource="http://www.w3.org/2000/10/XMLSchema#string"/>
     </owl:DatatypeProperty>
     <owl:DatatypeProperty rdf:ID="orgType">
       <rdfs:domain rdf:resource="#Organization"/>
       <rdfs:range rdf:resource="http://www.w3.org/2000/10/XMLSchema#string"/>
     </owl:DatatypeProperty>
     <owl:ObjectProperty rdf:ID="leaderOf">
       <rdfs:domain rdf:resource="#Leader"/>
       <rdfs:range rdf:resource="#Organization"/>
     </owl:ObjectProperty>
     <owl:ObjectProperty rdf:ID="ledBy">
       <owl:inverseOf rdf:resource="#leaderOf"/>
     </owl:ObjectProperty>

     (Note: the string-valued properties are declared as DatatypeProperty, since their ranges are literals rather than individuals.)

  10. Annotation

      <Leader rdf:ID="A1">
        <personName rdf:datatype="http://www.w3.org/2000/10/XMLSchema#string">Osama</personName>
        <personAge rdf:datatype="http://www.w3.org/2000/10/XMLSchema#string">old</personAge>
        <leaderOf rdf:resource="#X"/>
      </Leader>
      <Leader rdf:ID="A2">
        <personName rdf:datatype="http://www.w3.org/2000/10/XMLSchema#string">Osama</personName>
        <personAge rdf:datatype="http://www.w3.org/2000/10/XMLSchema#string">old</personAge>
        <leaderOf rdf:resource="#X"/>
        <owl:sameAs rdf:resource="#A1"/>
      </Leader>
      <Leader rdf:ID="B1">
        <personName rdf:datatype="http://www.w3.org/2000/10/XMLSchema#string">Strossen</personName>
        <personAge rdf:datatype="http://www.w3.org/2000/10/XMLSchema#string">mature</personAge>
        <leaderOf rdf:resource="#Y"/>
      </Leader>
      <Organization rdf:ID="X">
        <orgName rdf:datatype="http://www.w3.org/2000/10/XMLSchema#string">al-Queda</orgName>
        <orgType rdf:datatype="http://www.w3.org/2000/10/XMLSchema#string">terrorist</orgType>
        <ledBy rdf:resource="#A1"/>
        <headquarters rdf:datatype="http://www.w3.org/2000/10/XMLSchema#string">Unknown</headquarters>
      </Organization>
      <Organization rdf:ID="Y">
        <orgName rdf:datatype="http://www.w3.org/2000/10/XMLSchema#string">ACLU</orgName>
        <orgType rdf:datatype="http://www.w3.org/2000/10/XMLSchema#string">non-profit</orgType>
        <ledBy rdf:resource="#B1"/>
        <headquarters rdf:datatype="http://www.w3.org/2000/10/XMLSchema#string">USA</headquarters>
      </Organization>

  11. Association Through Reasoning
      • The OWL language names individuals using Uniform Resource Identifiers (URIs).
      • Characteristics of individuals (such as position or velocity) are defined using properties.
      • An OWL property that specifies a characteristic of an individual is called a DatatypeProperty.
      • Relations between individuals are also specified using properties; an OWL property of this kind is called an ObjectProperty.
      • Two special OWL properties can be used for explicit data association: sameAs and differentFrom.
      • The AllDifferent construct asserts that all the URIs in a given set refer to mutually different individuals.
      • OWL properties are binary.
      • OWL properties are many-to-many unless constraints are specified on them.
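In the notation of the earlier ontology fragment, these three constructs look as follows. The individual C1 is hypothetical, added only to make the AllDifferent list non-trivial:

```xml
<!-- Explicit association: A2 and A1 name the same individual -->
<Leader rdf:about="#A2">
  <owl:sameAs rdf:resource="#A1"/>
</Leader>

<!-- Explicit dissociation: B1 and A1 name different individuals -->
<Leader rdf:about="#B1">
  <owl:differentFrom rdf:resource="#A1"/>
</Leader>

<!-- Pairwise distinctness for a whole set of URIs -->
<owl:AllDifferent>
  <owl:distinctMembers rdf:parseType="Collection">
    <Leader rdf:about="#A1"/>
    <Leader rdf:about="#B1"/>
    <Leader rdf:about="#C1"/>
  </owl:distinctMembers>
</owl:AllDifferent>
```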

  12. Use of ConsVISor
      (a rule-based consistency verification tool; http://vistology.com/consvisor)
      • Ground truth:
        - Two names represent the same individual, and the ontology has support for this.
        - Two names represent two different individuals, and the ontology has support for this.
      • Hypotheses (statements added to an annotation):
        - The names represent the same individual in the world (expressed in OWL as sameAs).
        - The names represent different individuals in the world (expressed in OWL as differentFrom).
      • ConsVISor's decision: a given annotation (with the added hypothesis) is either "consistent" or "inconsistent".

  13. Association and Consistency
      • Consider the case where ConsVISor returns consistent for both hypotheses. Since it is consistent to believe either that the two names are the same or that they are different, no association claim can be made (there is not enough information in the annotations).
      • The converse case, in which both hypotheses produce inconsistent results, should never occur unless there is an inconsistency in the underlying ontology.
      • The interesting cases occur when one hypothesis is consistent and the other is inconsistent:
        - If the sameAs hypothesis is consistent and the differentFrom hypothesis is inconsistent, the two names must be co-references.
        - Conversely, if the sameAs hypothesis is inconsistent and the differentFrom hypothesis is consistent, the two references cannot refer to the same individual.
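The four cases can be summarized as a small decision function; the verdict labels are illustrative, not terminology from the slides:

```python
def association_verdict(sameas_consistent, differentfrom_consistent):
    """Map the consistency checker's verdicts on the two
    hypotheses to an association decision."""
    if sameas_consistent and differentfrom_consistent:
        return "undetermined"           # not enough information in the annotations
    if sameas_consistent:
        return "same individual"        # differentFrom inconsistent: co-reference forced
    if differentfrom_consistent:
        return "different individuals"  # sameAs inconsistent: cannot co-refer
    return "ontology error"             # both inconsistent: should never occur

print(association_verdict(True, False))
```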

  14.
      Ground Truth                    Hypothesis       ConsVISor's Decision   Annotation Example
      Two individuals are the same    <sameAs>         consistent             http://vistology.com/exp/1.owl
                                      <sameAs>         inconsistent           Impossible!
                                      <differentFrom>  consistent             http://vistology.com/exp/3.owl
                                      <differentFrom>  inconsistent           http://vistology.com/exp/4.owl
      Two individuals are different   <sameAs>         consistent             http://vistology.com/exp/5.owl
                                      <sameAs>         inconsistent           http://vistology.com/exp/6.owl
                                      <differentFrom>  consistent             http://vistology.com/exp/7.owl
                                      <differentFrom>  inconsistent           Impossible!

  15. Reasoning about Associations

      <Leader rdf:ID="A1">
        <personName rdf:datatype="http://www.w3.org/2000/10/XMLSchema#string">Osama</personName>
        <personAge rdf:datatype="http://www.w3.org/2000/10/XMLSchema#string">old</personAge>
        <leaderOf rdf:resource="#X"/>
      </Leader>
      <Leader rdf:ID="B1">
        <personName rdf:datatype="http://www.w3.org/2000/10/XMLSchema#string">Strossen</personName>
        <personAge rdf:datatype="http://www.w3.org/2000/10/XMLSchema#string">mature</personAge>
        <leaderOf rdf:resource="#Y"/>
        <owl:sameAs rdf:resource="#A1"/>
      </Leader>
      <Organization rdf:ID="X">
        <orgName rdf:datatype="http://www.w3.org/2000/10/XMLSchema#string">al-Queda</orgName>
        <orgType rdf:datatype="http://www.w3.org/2000/10/XMLSchema#string">terrorist</orgType>
        <ledBy rdf:resource="#A1"/>
      </Organization>
      <Organization rdf:ID="Y">
        <orgName rdf:datatype="http://www.w3.org/2000/10/XMLSchema#string">ACLU</orgName>
        <orgType rdf:datatype="http://www.w3.org/2000/10/XMLSchema#string">non-profit</orgType>
        <ledBy rdf:resource="#B1"/>
      </Organization>

      ConsVISor's decision: inconsistent

  16. Useful OWL Constructs
      • FunctionalProperty and InverseFunctionalProperty
        - If a property is functional, then a given individual can be related to at most one other individual or value.
        - If a property is inverse functional, then an individual or value can be used by at most one individual.
        - For example, one specifies that leadership is functional by asserting that ledBy is an instance of FunctionalProperty, or that its inverse leaderOf is an instance of InverseFunctionalProperty. If either is specified and two individual names are both known to refer to the leader of the same organization, then those names must represent the same individual.
      • Cardinality constraints
        - A cardinality constraint restricts the number of individuals (or values) that a given individual can be related to.
        - A maxCardinality constraint specifies an upper bound on this number; a minCardinality constraint specifies a lower bound; a cardinality constraint specifies the exact number.
        - The maxCardinality and cardinality constraints are especially useful for proving that two individual names refer to the same individual.
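Against the sample ontology's ledBy/leaderOf properties, these constructs might be written as follows (a sketch, not part of the original ontology fragment):

```xml
<!-- ledBy is functional: each organization is led by at most one individual -->
<owl:ObjectProperty rdf:about="#ledBy">
  <rdf:type rdf:resource="http://www.w3.org/2002/07/owl#FunctionalProperty"/>
</owl:ObjectProperty>

<!-- equivalently, its inverse leaderOf is inverse functional -->
<owl:ObjectProperty rdf:about="#leaderOf">
  <rdf:type rdf:resource="http://www.w3.org/2002/07/owl#InverseFunctionalProperty"/>
</owl:ObjectProperty>

<!-- the same bound via a maxCardinality restriction on the class -->
<owl:Class rdf:about="#Organization">
  <rdfs:subClassOf>
    <owl:Restriction>
      <owl:onProperty rdf:resource="#ledBy"/>
      <owl:maxCardinality rdf:datatype="http://www.w3.org/2000/10/XMLSchema#nonNegativeInteger">1</owl:maxCardinality>
    </owl:Restriction>
  </rdfs:subClassOf>
</owl:Class>
```

With any of these in place, asserting ledBy to both A1 and B1 on the same organization entails that A1 and B1 are the same individual, which is what makes the sameAs/differentFrom hypothesis testing above decidable.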

  17. Useful OWL Constructs (cont.)
      • disjointWith: two classes specified to be disjoint cannot have any instances (individuals) in common.
      • complementOf: an extreme form of disjointness.
      • oneOf: members of a class must all come from a given list of instances.
      • unionOf: a class is the union of some other classes.
      • subClassOf, equivalentClass
      • subPropertyOf, equivalentProperty
      • domain and range
      • SymmetricProperty and TransitiveProperty

  18. Confidence
      • There is no notion of uncertainty in OWL (so far).
      • OWL reasoners have no way to deal with uncertainty, even if an ontology has an explicit way of incorporating uncertainty into its annotations.
      • ConsVISor can provide the annotations involved in the determination of an inconsistency (a trace).
      • Using an external process, it is possible in some cases to combine the uncertainties of the implicated annotations into a statement about the uncertainty of an association claim.
        - E.g., map the ontology/annotation into a Bayesian net.
      • Work in progress.

  19. Conclusion
      • Showed examples of reasoning about co-reference in OWL annotations:
        - Merge two annotation files.
        - Insert a sameAs or differentFrom hypothesis.
        - Use the ConsVISor consistency checker.
        - Consistent means the association is plausible; inconsistent is evidence against it.
      • OWL is becoming a de facto standard.
      • Discussed useful OWL constructs.
      • Confidence: work in progress (Bayesian nets).
      • Future work: using ConsVISor in a system for generating and testing association hypotheses.