Knowledge Acquisition and Problem Solving - PowerPoint PPT Presentation


    Presentation Transcript
    1. CS 785 Fall 2004 Knowledge Acquisition and Problem Solving Semantic Networks and Object Ontologies Gheorghe Tecuci tecuci@gmu.edu http://lac.gmu.edu/ Learning Agents Center and Computer Science Department George Mason University

    2. Overview Semantic networks Ontology representation Reasoning with an ontology Development and maintenance of an ontology Sample ontology for COG analysis Ontology development tools Exercises and readings

    3. Semantic networks The underlying idea of semantic networks is to represent knowledge in the form of a graph in which the nodes represent objects, situations, or events, and the arcs represent the relationships between them.
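The graph described above can be sketched directly as a set of (node, relation, node) triples. This is a minimal illustration only; the particular nodes (Clyde, elephant, nest-1, borrowed from the slides' running example) and the helper function are assumptions for the sketch, not part of the slides.

```python
# A semantic network as a set of (node, relation, node) triples.
# Nodes represent objects or concepts; arcs represent relationships.
triples = {
    ("Clyde", "instance_of", "elephant"),
    ("elephant", "subconcept_of", "animal"),
    ("Clyde", "owns", "nest-1"),
}

def related(subject, relation):
    """Return all nodes linked to `subject` by an arc labeled `relation`."""
    return {o for (s, r, o) in triples if s == subject and r == relation}
```

With this representation, following an arc is just a set comprehension over the triples, e.g. `related("Clyde", "instance_of")` yields `{"elephant"}`.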

    4. Semantic networks with binary relations What does this semantic network express?

    5. Representing non-binary predicates What does this semantic network express? How can one encode the additional information that Clyde owned nest-1 from Spring 90 to Fall 90? A binary relation cannot encode this additional information. We need to represent ownership as a node rather than a relation.

    6. Representing non-binary predicates ownership(owner, ownee, start-time, end-time) A semantic network representing "ownership" as a node.
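The reification on slide 6 can be sketched as follows: the 4-ary relation ownership(owner, ownee, start-time, end-time) becomes a node connected to its arguments by binary relations. The node name "ownership-1" and the helper are illustrative assumptions.

```python
# Reifying a non-binary predicate: one "ownership" node with four
# binary relations, instead of a single 4-ary relation.
triples = set()

def add_ownership(name, owner, ownee, start, end):
    """Record one ownership event as a node plus binary relations."""
    triples.update({
        (name, "instance_of", "ownership"),
        (name, "owner", owner),
        (name, "ownee", ownee),
        (name, "start-time", start),
        (name, "end-time", end),
    })

# The slide's example: Clyde owned nest-1 from Spring 90 to Fall 90.
add_ownership("ownership-1", "Clyde", "nest-1", "Spring 90", "Fall 90")
```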

    7. Exercise Represent the following sentences as a semantic network: Birds are animals. Birds have feathers, fly, and lay eggs. Albatross is a bird. Donald is a bird. Tracy is an albatross.

    8. Overview Semantic networks Ontology representation Reasoning with an ontology Development and maintenance of an ontology Sample ontology for COG analysis Ontology development tools Exercises and readings

    9. Ontology representation 1. What is an ontology 2. Sample application domain 3. Concepts, instances, and generalization 4. Object features 5. Definition of instances and concepts

    10. 1. What is an ontology Every knowledge-based agent has a conceptualization or a model (i.e. an abstract, simplified view) of its world which consists of representations of the objects, concepts, and other entities that are assumed to exist, and the relationships that hold among them. An ontology is a specification of the terms that are used to represent the agent’s world. In an ontology, definitions associate the names of entities in the agent’s world (e.g., classes, individual objects, relations, tasks) with human-readable text describing what the names mean, and formal axioms that constrain the interpretation and use of these terms.

    11. Why is the definition of an agent’s ontology important? Enables an agent to communicate with other agents, because they share a common vocabulary (terms) which they both understand. Enables knowledge sharing and reuse among agents. Ontological commitment: Agreement among several agents to use a shared vocabulary in a coherent and consistent manner.

    12. Object ontology We define an object ontology as a hierarchical description of the objects from the agent’s domain, specifying their properties and relationships. It includes both descriptions of types of objects (called concepts) and descriptions of specific objects (called instances).

    13. The generality of the object ontology An object ontology is characteristic of an entire application domain, such as the military or medicine. In the military domain the object ontology will include descriptions of military units and of military equipment. These descriptions are likely to be needed in almost any specific military application. Because building the object ontology is a very complex task, it makes sense to reuse these descriptions when developing a knowledge base for another military application, rather than starting from scratch.

    14. 2. Sample application domain: Center of Gravity Analysis. Identify COG candidates: identify potential primary sources of moral or physical strength, power and resistance from: Government, Military, People, Economy, Alliances, etc. Test COG candidates: test each identified COG candidate to determine whether it has all the necessary critical capabilities: Which are the critical capabilities? Are the critical requirements of these capabilities satisfied? If not, eliminate the candidate. If yes, do these capabilities have any vulnerability?

    15. 3. Concepts, instances, and generalization A concept is a representation of a set of instances. Example: state_government represents the set of all entities that are governments of states; this set includes “government_of_US_1943” and “government_of_Britain_1943”, each linked to state_government by instance_of. Provide another example of a concept.

    16. Characterization of instances An instance is a representation of a particular entity in the application domain. Example: the node government_of_US_1943 represents the entity called “government_of_US_1943”; it and government_of_Britain_1943 are each linked to state_government by instance_of. “instance_of” is the relationship between an instance and the concept to which it belongs. Provide another example of an instance and of a concept.

    17. Intuitive definition of generalization Intuitively, a concept P is said to be more general than (or a generalization of) another concept Q if and only if the set of instances represented by P includes the set of instances represented by Q. Example: state_government is more general than democratic_government and totalitarian_government, and democratic_government is in turn more general than representative_democracy and parliamentary_democracy. “subconcept_of” is the relationship between a concept and a more general concept (e.g., democratic_government subconcept_of state_government).

    18. Intuitive definition of generalization (cont.) What are the possible relationships between two concepts A and B, from a generalization point of view? - A is more general than B - B is more general than A - There is no generalization relationship between A and B Provide examples of concepts A and B in each of these three situations.

    19. Intuitive definition of generalization (cont.) How could one prove that A is more general than B? Show that all the instances of B are also instances of A. Is this always a practical procedure? No, if A and B have an infinite number of instances. How can one prove that A is not more general than B? Show that B contains an instance which is not an instance of A. Is this a more practical procedure? Yes, even for concepts with an infinite number of instances.
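Slide 19's two proof procedures can be sketched as set operations, assuming finite, explicitly enumerated instance sets (the slide itself notes that the subset check is impractical for infinite extensions, while a single counterexample suffices for refutation):

```python
# Proving and refuting generality over finite instance sets.
def more_general(instances_a, instances_b):
    """A is more general than B iff B's instances are a subset of A's."""
    return instances_b <= instances_a

def counterexample(instances_a, instances_b):
    """Return an instance of B that is not an instance of A (refuting
    'A is more general than B'), or None if no such instance exists."""
    return next(iter(instances_b - instances_a), None)
```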

    20. A generalization hierarchy governing_body ad_hoc_ governing_body established_ governing_body other_type_of_ governing_body state_government group_governing_body feudal_god_ king_government other_state_ government dictator other_ group_ governing_ body democratic_ government monarchy deity_figure representative_ democracy parliamentary_ democracy government_ of_Italy_1943 democratic_ council_ or_board autocratic_ leader totalitarian_ government government_ of_US_1943 government_ of_Britain_1943 chief_and_ tribal_council theocratic_ government police_ state military_ dictatorship fascist_ state religious_ dictatorship theocratic_ democracy communist_ dictatorship religious_ dictatorship government_ of_Germany_1943 government_ of_USSR_1943

    21. Partially learned concepts and version spaces Question: What is the concept represented by the following two positive examples, P = {government_of_US_1943, government_of_Britain_1943}, and the following negative example, N = {GMU_governing_body_2004}? (The slide shows the relevant hierarchy fragment: governing_body, established_governing_body, state_government, democratic_government, representative_democracy, parliamentary_democracy, totalitarian_government.)

    22. Partially learned concepts and version spaces Question: What is the concept represented by the two positive examples P = {government_of_US_1943, government_of_Britain_1943} and the negative example N = {GMU_governing_body_2004}? Version Space (VS) upper bound: {state_government}; lower bound: {democratic_government}.
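The bounds on slide 22 can be sketched over the slide-21 hierarchy: a candidate concept must cover both positive examples and exclude the negative one. This is a hedged sketch that assumes the negative example GMU_governing_body_2004 falls under established_governing_body, which is consistent with the bounds the slide states.

```python
# Version-space bounds from the slide-21 generalization hierarchy.
parent = {
    "democratic_government": "state_government",
    "state_government": "established_governing_body",
    "established_governing_body": "governing_body",
}

def ancestors(concept):
    """The concept itself plus all of its superconcepts."""
    chain = [concept]
    while concept in parent:
        concept = parent[concept]
        chain.append(concept)
    return chain

# democratic_government is the most specific concept covering both
# positive examples, so every candidate is one of its ancestors ...
candidates = set(ancestors("democratic_government"))
# ... minus the concepts that also cover the negative example
# (assumed to be an established governing body):
version_space = candidates - {"established_governing_body", "governing_body"}
```

The surviving concepts are exactly the slide's bounds: state_government (upper) and democratic_government (lower).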

    23. Object descriptions The objects in the application domain may be described in terms of their properties and their relationships with each other. Example: government_of_Britain_1943 instance_of parliamentary_democracy; government_of_Britain_1943 has_as_head_of_government Winston_Churchill; government_of_Britain_1943 has_as_legislative_body parliament_of_Britain_1943. Similarly, a general object or concept can be described as being a subconcept of an even more general concept and having additional features.

    24. Feature definition An object feature is itself characterized by several features, which include: documentation, domain, and range. The domain is the concept that represents the set of objects that could have that feature. The range is the set of possible values of the feature. Example: has_as_head_of_government; documentation: "Indicates the head of a governing body"; subconcept_of: has_as_political_leader; domain: governing_body; range: person.
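A feature definition like the one above can be sketched as a record with documentation, domain, and range, plus a check that a stated fact respects them. The flat `types` map below stands in for the full concept hierarchy; it and `valid_fact` are simplifying assumptions for the sketch.

```python
# A feature definition (slide 24) and a domain/range check for facts.
feature_defs = {
    "has_as_head_of_government": {
        "documentation": "Indicates the head of a governing body",
        "subconcept_of": "has_as_political_leader",
        "domain": "governing_body",
        "range": "person",
    },
}
# Direct types of instances (a flat stand-in for the hierarchy).
types = {
    "government_of_Britain_1943": "governing_body",
    "Winston_Churchill": "person",
}

def valid_fact(subject, feature, value):
    """True if subject/value fall in the feature's domain/range."""
    d = feature_defs[feature]
    return types.get(subject) == d["domain"] and types.get(value) == d["range"]
```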

    25. Partially learned feature Example: Allied_Forces_1943 has_as_strategic_goal Unconditional_surrender_of_European_Axis_1943. has_as_strategic_goal; documentation: "Indicates the strategic goal of an entity"; domain: plausible upper bound: agent, plausible lower bound: force; range: plausible upper bound: agent_goal, plausible lower bound: force_goal.

    26. Feature hierarchy has_as_controlling_leader (D: agent, R: person); has_as_religious_leader (D: governing_body, R: person); has_as_god_king (D: governing_body, R: person); has_as_monarch (D: governing_body, R: person); has_as_military_leader (D: governing_body, R: person); has_as_political_leader (D: governing_body, R: person); has_as_commander_in_chief (D: force, R: person); has_as_head_of_government (D: governing_body, R: person); has_as_head_of_state (D: governing_body, R: person).

    27. Exercise Consider the following feature hierarchy: Feature 1 (domain: 1D1, range: 1R1); Feature B (domain: BD1, range: BR1); Feature A (domain: AD1, range: AR1), with subfeatures Feature A1 and Feature A2 (domain: A2D1, range: A2R1). Is there any relationship between: BD1 and 1D1? A2D1 and 1D1? AD1 and BD1? 1D1 and 1R1?

    28. 5. Definition of instances and concepts When designing a knowledge base, one has to first specify some basic concepts, as well as the features that may characterize the instances and the concepts from the application domain. Once basic concepts and features are specified, one can define new concepts and instances as logical expressions of the specified concepts and features.

    29. Basic representation unit conceptk ISA concepti FEATURE1 value1 . . . FEATUREn valuen This is a necessary definition of ‘conceptk’. It defines ‘conceptk’ as being a subconcept of ‘concepti’ and having additional features. This means that if ‘concepti’ represents the set Ci of instances, then ‘conceptk’ represents a subset Ck of Ci. The elements of Ck have the features ‘FEATURE1’,..., ‘FEATUREn’ with the values ‘value1’,..., ‘valuen’, respectively.
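The basic representation unit above can be sketched as nested dictionaries, one entry per concept, with its ISA link and feature/value pairs. The concept and feature names below are borrowed from earlier slides purely for illustration.

```python
# Slide 29's basic representation unit: conceptk ISA concepti,
# plus FEATURE/value pairs that characterize conceptk's instances.
ontology = {
    "state_government": {
        "isa": "established_governing_body",
        "features": {},
    },
    "democratic_government": {
        "isa": "state_government",
        "features": {
            "has_as_dominant_psychosocial_factor": "will of the people",
        },
    },
}
```

Because the definition is necessary, every instance of democratic_government is also an instance of state_government and carries the listed feature values.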

    30. Overview Semantic networks Ontology representation Reasoning with an ontology Development and maintenance of an ontology Sample ontology for COG analysis Ontology development tools Exercises and readings

    31. Reasoning with an ontology 1. Transitivity of “instance_of” and “subconcept_of” 2. Inheritance 3. Concept expressions 4. Ontology matching 5. Rules as ontology-based representations

    32. 1. Transitivity of “instance_of” and “subconcept_of” Example chain: government_of_US_1943 instance_of representative_democracy; representative_democracy subconcept_of democratic_government; democratic_government subconcept_of state_government; state_government subconcept_of established_governing_body. By transitivity, government_of_US_1943 is also an instance of each of the more general concepts.
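The transitivity on slide 32 can be sketched as a walk up the subconcept_of chain starting from an instance's direct class; the dictionaries mirror the slide's chain.

```python
# Transitive membership: an instance belongs to every concept
# reachable from its direct class via subconcept_of links.
subconcept_of = {
    "representative_democracy": "democratic_government",
    "democratic_government": "state_government",
    "state_government": "established_governing_body",
}
instance_of = {"government_of_US_1943": "representative_democracy"}

def is_instance_of(instance, concept):
    """Follow instance_of once, then the subconcept_of chain upward."""
    node = instance_of.get(instance)
    while node is not None:
        if node == concept:
            return True
        node = subconcept_of.get(node)
    return False
```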

    33. 2. Inheritance Properties associated with concepts in a hierarchy are assumed to be true of all subconcepts and instances. Example: democratic_government has_as_dominant_psychosocial_factor “will of the people”; its subconcept representative_democracy and the instance government_of_US_1943 (which has_as_political_leader President_Roosevelt) inherit this property.

    34. Exceptions to default inheritance Properties associated with concepts in a hierarchy are assumed to be true of all subconcepts and instances. How can we deal with exceptions (i.e. sub-concepts or instances that do not have the inherited property)? Explicitly override the inherited property, as illustrated in the following slide.

    35. Exceptions to default inheritance: illustration democratic_government has_as_dominant_psychosocial_factor “will of the people”, inherited by representative_democracy and by government_of_US_1943 (has_as_political_leader President_Roosevelt). parliamentary_democracy, however, has_as_dominant_psychosocial_factor {“will of the people”, “will of the parliament”}, and its instance government_of_Britain_1943 (has_as_political_leader Winston_Churchill) explicitly overrides this with has_as_dominant_psychosocial_factor “will of the people”.
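Default inheritance with explicit overrides (slides 33-35) can be sketched as a feature lookup where the closest explicitly stated value wins, and lookup otherwise walks up the instance_of/subconcept_of links. The values mirror the slides; the lookup function itself is an illustrative assumption.

```python
# Default inheritance with an explicit override on the instance.
subconcept_of = {
    "parliamentary_democracy": "democratic_government",
    "representative_democracy": "democratic_government",
}
instance_of = {"government_of_Britain_1943": "parliamentary_democracy"}
features = {
    "democratic_government": {
        "has_as_dominant_psychosocial_factor": "will of the people"},
    "parliamentary_democracy": {
        "has_as_dominant_psychosocial_factor": (
            "will of the people", "will of the parliament")},
    # explicit override on the instance, as on slide 35:
    "government_of_Britain_1943": {
        "has_as_dominant_psychosocial_factor": "will of the people"},
}

def get_feature(obj, feature):
    """Closest stated value wins; otherwise inherit from superconcepts."""
    node = obj
    while node is not None:
        if feature in features.get(node, {}):
            return features[node][feature]
        node = instance_of.get(node) or subconcept_of.get(node)
    return None
```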

    36. Multiple inheritance An object (instance or concept) may inherit properties from several super-concepts. How can we deal with the inheritance of contradictory properties? Explicitly state the property of the object, instead of inheriting it.

    37. 3. Concept expressions One can define more complex concepts as logical expressions involving the basic concepts from the object ontology. In the following expression, for instance, ?O1 represents a force that has as industrial factor an industrial capacity that generates essential war materiel from the strategic perspective of a multi-member force that includes ?O1. ?O1 is force, has_as_industrial_factor ?O2; ?O2 is industrial_capacity, generates_essential_war_materiel_from_the_strategic_perspective_of ?O3; ?O3 is multi_member_force, has_as_member ?O1.

    38. 4. Ontology matching Ontology matching allows one to answer complex questions about the knowledge represented in the ontology, as illustrated in the following: Question: Is there any force ?O1 that has as industrial factor an industrial capacity that generates essential war materiel from the strategic perspective of a multi-member force that includes ?O1? Answer: Yes, US_1943 is a force that has as industrial factor industrial_capacity_of_US_1943 that generates essential war materiel from the strategic perspective of the Allied_Forces_1943 which is a multi-member force that includes US_1943.
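The matching described on slide 38 can be sketched as pattern matching over a triple store: patterns contain ?-prefixed variables, and matching binds them consistently across all patterns. The fact set is a simplified excerpt of the slide's example; the matcher itself is an illustrative assumption, not the system's actual algorithm.

```python
# A tiny ontology matcher: find consistent bindings for ?-variables.
facts = {
    ("US_1943", "has_as_industrial_factor", "industrial_capacity_of_US_1943"),
    ("industrial_capacity_of_US_1943",
     "generates_essential_war_materiel_from_the_strategic_perspective_of",
     "Allied_Forces_1943"),
    ("Allied_Forces_1943", "has_as_member", "US_1943"),
}

def match(patterns, bindings=None):
    """Yield variable bindings satisfying all (s, r, o) patterns."""
    bindings = dict(bindings or {})
    if not patterns:
        yield bindings
        return
    s, r, o = patterns[0]
    for fs, fr, fo in facts:
        if fr != r:
            continue
        b = dict(bindings)
        ok = True
        for term, val in ((s, fs), (o, fo)):
            if term.startswith("?"):
                if b.setdefault(term, val) != val:
                    ok = False  # variable already bound to something else
            elif term != val:
                ok = False      # constant term must match exactly
        if ok:
            yield from match(patterns[1:], b)
```

Running the slide's question as a query binds ?O1 to US_1943, ?O2 to industrial_capacity_of_US_1943, and ?O3 to Allied_Forces_1943.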

    39. Ontology matching: example Question: Is there any force ?O1 that has as industrial factor an industrial capacity that generates essential war materiel from the strategic perspective of a multi-member force that includes ?O1? Answer (variable bindings): ?O1 = US_1943; ?O2 = industrial_capacity_of_US_1943; ?O3 = Allied_forces_1943. Relevant ontology fragment: US_1943 instance-of single_state_force (a subconcept of single_member_force, itself a subconcept of force); US_1943 has_as_industrial_factor industrial_capacity_of_US_1943, an instance of industrial_capacity; industrial_capacity_of_US_1943 generates_essential_war_materiel_from_the_strategic_perspective_of Allied_forces_1943; Allied_forces_1943 instance-of equal_partner_multi_state_alliance (a subconcept of multi_state_alliance, itself a subconcept of multi_state_force) and has_as_member US_1943.

    40. 5. Rules as ontology-based representations of PSS IF: Identify and test a strategic COG candidate corresponding to the economy of a force which is an industrial economy. The industrial economy is ?O1. Condition: ?O1 is industrial_economy; ?O2 is industrial_capacity, generates_essential_war_materiel_from_the_strategic_perspective_of ?O3; ?O3 is multi_state_force, has_as_member ?O4; ?O4 is force, has_as_economy ?O1, has_as_industrial_factor ?O2. (In words: ?O1 is an industrial economy; ?O2 is an industrial capacity which generates essential war materiel from the strategic perspective of ?O3, a multi-state force which has as member ?O4, a force which has as economy ?O1 and as industrial factor ?O2.) THEN: Identify a strategically critical element as a COG candidate with respect to an industrial economy. The strategically critical element is ?O2; the industrial economy is ?O1. Test a strategically critical element which is a strategic COG candidate with respect to an industrial economy. The strategically critical element is ?O2; the industrial economy is ?O1. This rule will be applicable only if the current ontology contains instances of the concepts ?O1, ?O2, ?O3, and ?O4 represented in the condition. A rule is an ontology-based representation of a problem solving step (PSS).

    41. Overview Semantic networks Ontology representation Reasoning with an ontology Development and maintenance of an ontology Sample ontology for COG analysis Ontology development tools Exercises and readings

    42. Ontology maintenance Maintaining the consistency of the object ontology is a complex knowledge engineering activity because the object and feature definitions interact in complex ways. Example: Deleting an object concept requires the updating of all the knowledge base elements that refer to it (e.g. the rules that contain it in their conditions; the features that contain it in their ranges or domains; the concepts that inherit its features).

    43. Potential consequence of editing operations: illustration Initial state: concepts A, B, C, with B subconcept_of A and C subconcept_of B; feature f has domain A. Modified state: if we delete the link from B to A, then C can no longer have the feature f. Explain why. Because C is no longer in the domain of f.
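Slide 43's editing hazard can be sketched concretely: a domain check that walks the subconcept_of chain succeeds for C before the edit and fails after the link from B to A is deleted. The names A, B, C, and f come from the slide's schematic; the check function is an illustrative assumption.

```python
# The domain of a feature propagates down subconcept_of links, so
# deleting a link can silently shrink the feature's domain.
subconcept_of = {"B": "A", "C": "B"}
feature_domain = {"f": "A"}

def in_domain(concept, feature):
    """True if `concept` is (a subconcept of) the domain of `feature`."""
    node = concept
    while node is not None:
        if node == feature_domain[feature]:
            return True
        node = subconcept_of.get(node)
    return False

assert in_domain("C", "f")   # initial state: C may have feature f
del subconcept_of["B"]       # editing operation: delete the B-to-A link
```

After the deletion, `in_domain("C", "f")` is False, which is exactly why the editor must update every knowledge base element that refers to a deleted link or concept.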

    44. Steps in ontology development • 1. Define the basic concepts and their organization into a hierarchical structure (the generalization hierarchy). • 2. Define the generic features, using the previously defined concepts to specify their domains and ranges. • 3. Use the defined concepts and features to define instances. • 4. Extend the ontology with new concepts, features, and instances. • Repeat steps 1-4 until the ontology is judged to be complete enough.

    45. Steps in ontology development: illustration 1. Define basic concepts 2. Use the basic concepts to define features 3. Use the defined concepts and features to define instances.

    46. General features of semantic networks and ontologies Representational adequacy: high for binary relationships. Inferential adequacy: good for certain types of inferential procedures. Inferential efficiency: very high; the structure used for representing knowledge is also a guide for the retrieval of the knowledge. Acquisitional efficiency: very low.

    47. Overview Semantic networks Ontology representation Reasoning with an ontology Development and maintenance of an ontology Sample ontology for COG analysis Ontology development tools Exercises and readings

    48. The object hierarchy Overall organization Scenario Forces and goals Economic factors Geographic factors Military factors Political factors Other objects

    49. Overall organization of the COG object ontology (concepts linked by subconcept_of): object; resource_or_infrastructure_element; agent_goal; scenario; agent; strategic_cog_relevant_factor; force_goal; force; other_relevant_factor; strategic_goal; operational_goal; theater_strategic_goal; psychosocial_factor; political_factor; demographic_factor; military_factor; economic_factor; historic_factor; geographic_factor; international_factor.

    50. Scenario Example instance: Sicily_1943, an instance of war_scenario (a subconcept of scenario). brief_description: “WWII Allied invasion of Sicily in 1943”. has_as_opposing_force: Allied_Forces_1943; has_as_opposing_force: European_Axis_1943. description: “The Allied decision to invade Sicily following the successful operation in North Africa was a critical element of World War II [WWII]. The commitment of such a large force to continue operations in the Mediterranean theater meant that the cross-channel invasion of Europe would be delayed. American military leaders strongly favored the cross-channel invasion at the earliest possible opportunity. This meant giving this invasion force first priority for troops, shipping and equipment. The British favored an indirect approach that would see a major effort continue in the Mediterranean. The Allies settled on the Mediterranean approach at the Casablanca conference in January 1943 and began planning for Operation Husky, the invasion of Sicily. Situated ninety miles off the north coast of Africa and two and one-half miles from the toe of the Italian peninsula, Sicily was both a natural bridge between Africa and Europe and a barrier dividing the Mediterranean Sea. It was an unsinkable air and naval fortress from which Axis forces interdicted Allied sea lines of communications through the Mediterranean. …”