Expert Systems Presented by Mohammad Saniee December 2, 2003 Department of Computer Engineering Sharif University of Technology
Expert Systems
A branch of Artificial Intelligence that makes extensive use of specialized knowledge to solve problems at the level of a human expert. Sibling branches of AI: Natural Language Understanding, Vision, Robotics, Neural Networks.
Why do we need Expert Systems?
• Increased availability
• Permanence
• Reduced danger
• Reduced cost
• Multiple expertise
• Increased reliability
• Explanation facility
• Fast response
• Steady, unemotional & complete response
• Intelligent tutor
Expert System Building Process
• Selecting a specific domain
• Scoping the project – the purpose/functionality of the expert system
• Identifying human resources such as the domain expert, knowledge engineer, etc.
• Knowledge acquisition
• Designing the user interface
• Implementing the expert system
• Maintenance and update of the knowledge base/system
Expert System Components
• Working Memory – A global database of facts used by the system
• Knowledge Base – Contains the domain knowledge
• Inference Engine – The brain of the expert system; makes logical deductions based upon the knowledge in the KB
• User Interface – A facility for the user to interact with the expert system
• Explanation Facility – Explains the reasoning of the system to the user
• Knowledge Acquisition Facility – An automatic way to acquire knowledge
Expert System Structure
(Block diagram: the User Interface connects the user to the Inference Engine, which draws on the Working Memory and the Knowledge Base; the Explanation Facility and the Knowledge Acquisition Facility are attached to the engine.)
Knowledge Types
• The knowledge base of an expert system contains both factual and heuristic knowledge.
• Factual knowledge is knowledge of the task domain that is widely shared, typically found in textbooks or journals, and commonly agreed upon by those knowledgeable in the particular field.
  • The capital of Italy is Rome
  • A day consists of 24 hours
  • Bacteria type A causes flu type B
• Heuristic knowledge is the less rigorous, more experiential, more judgmental knowledge of performance.
  • For instance, in a medical expert system – if the patient has spots, it's probably chickenpox
  • In a mechanical troubleshooting system – if the engine doesn't turn over, check the battery
Knowledge Representation
• Knowledge representation formalizes and organizes the knowledge. The two most widely used representations are:
• Production Rules: A rule consists of an IF part and a THEN part (also called a condition and an action). If the IF part of the rule is satisfied, the THEN part can be concluded, or its problem-solving action taken. Rule-based expert systems use this representation, e.g.,
  IF the stain of the organism is gram-negative
  AND the morphology of the organism is rod
  AND the aerobicity of the organism is anaerobic
  THEN there is strongly suggestive evidence (0.8) that the class of the organism is Enterobacteriaceae.
• Frames or units: A unit is an assemblage of associated symbolic knowledge about the represented entity. Typically, a unit consists of a list of properties of the entity and associated values for those properties.
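As a sketch, the two representations can be written down as Python dictionaries; the field names and the matching check below are illustrative choices, not the syntax of any particular expert-system shell:

```python
# A production rule: an IF part (conditions) and a THEN part (action).
# This one mirrors the MYCIN-style rule above, including its certainty factor.
rule = {
    "if": ["stain is gram-negative",
           "morphology is rod",
           "aerobicity is anaerobic"],
    "then": ("class is Enterobacteriaceae", 0.8),  # conclusion + strength of evidence
}

# A frame/unit: a list of properties of the represented entity with their values.
organism_frame = {
    "stain": "gram-negative",
    "morphology": "rod",
    "aerobicity": "anaerobic",
}

# The rule fires when every IF clause is satisfied by the entity's properties.
facts = {f"{prop} is {value}" for prop, value in organism_frame.items()}
fires = all(cond in facts for cond in rule["if"])
```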
Rule-Based Expert Systems
Expert systems that represent domain knowledge using production rules. Two types of rule-based systems:
• Forward chaining systems
• Backward chaining systems
Forward Chaining Systems
Forward chaining systems support chaining of IF-THEN rules to form a line of reasoning. The chaining starts from a set of conditions and moves toward a conclusion.
Question: Does employee John get a computer?
Rule: If John is an employee, he gets a computer.
Fact: John is an employee.
Conclusion: John gets a computer.
Forward Chaining
• The rules are of the form: left-hand side (LHS) ==> right-hand side (RHS).
• The execution cycle is:
  • Select a rule whose left-hand-side conditions match the current state as stored in the working memory.
  • Execute the right-hand side of that rule, thus somehow changing the current state.
  • Repeat until no rules apply.
Forward Chaining
• Facts are represented in a working memory which is continually updated.
• Rules represent possible actions to take when specified conditions hold on items in the working memory.
• The conditions are usually patterns that must match items in the working memory, while the actions usually involve adding or deleting items from the working memory.
Forward Chaining (example)
First we'll look at a very simple set of rules (capital letters indicate variables):
1. IF (lecturing X) AND (marking-practicals X) THEN ADD (overworked X)
2. IF (month February) THEN ADD (lecturing Alison)
3. IF (month February) THEN ADD (marking-practicals Alison)
4. IF (overworked X) OR (slept-badly X) THEN ADD (bad-mood X)
5. IF (bad-mood X) THEN DELETE (happy X)
6. IF (lecturing X) THEN DELETE (researching X)
Initial working memory: (month February) (happy Alison) (researching Alison)
Rules 2 & 3 apply; let's assume rule 2 is chosen, giving:
(lecturing Alison) (month February) (happy Alison) (researching Alison)
Now rules 3 & 6 apply; assume rule 3 is chosen. The cycle continues, and we end up with:
(bad-mood Alison) (overworked Alison) (marking-practicals Alison) (lecturing Alison) (month February)
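The execution cycle can be sketched in Python on this exact rule set. The OR in rule 4 is split into two separate rules, and the tuple encoding and matcher are our own illustrative choices, not the syntax of a real production-system shell:

```python
# Facts are tuples; strings starting with "?" in rule patterns are variables.

def match(pattern, fact, bindings):
    """Unify one condition pattern with one fact; return extended bindings or None."""
    if len(pattern) != len(fact):
        return None
    b = dict(bindings)
    for p, f in zip(pattern, fact):
        if isinstance(p, str) and p.startswith("?"):
            if b.setdefault(p, f) != f:  # variable already bound to something else
                return None
        elif p != f:
            return None
    return b

def find_bindings(conditions, facts, bindings=None):
    """Find one binding set satisfying all conditions (AND semantics)."""
    if not conditions:
        return bindings if bindings is not None else {}
    for fact in facts:
        b = match(conditions[0], fact, bindings or {})
        if b is not None:
            result = find_bindings(conditions[1:], facts, b)
            if result is not None:
                return result
    return None

def substitute(pattern, b):
    return tuple(b.get(p, p) for p in pattern)

# (conditions, action, action pattern); rule 4's OR is split into two rules.
RULES = [
    ([("lecturing", "?x"), ("marking-practicals", "?x")], "ADD", ("overworked", "?x")),
    ([("month", "February")], "ADD", ("lecturing", "Alison")),
    ([("month", "February")], "ADD", ("marking-practicals", "Alison")),
    ([("overworked", "?x")], "ADD", ("bad-mood", "?x")),
    ([("slept-badly", "?x")], "ADD", ("bad-mood", "?x")),
    ([("bad-mood", "?x")], "DELETE", ("happy", "?x")),
    ([("lecturing", "?x")], "DELETE", ("researching", "?x")),
]

def forward_chain(initial_facts):
    """Fire matching rules until no rule changes the working memory."""
    wm = set(initial_facts)
    changed = True
    while changed:
        changed = False
        for conditions, action, pattern in RULES:
            b = find_bindings(conditions, wm)
            if b is None:
                continue
            fact = substitute(pattern, b)
            if action == "ADD" and fact not in wm:
                wm.add(fact)
                changed = True
            elif action == "DELETE" and fact in wm:
                wm.remove(fact)
                changed = True
    return wm

wm = forward_chain([("month", "February"), ("happy", "Alison"), ("researching", "Alison")])
```

Running this reaches the same final working memory as the hand trace above, though the intermediate firing order depends on how conflicts are resolved (here: first rule in list order).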
Example of a Forward Chaining System
• XCON
  • Developed by DEC to configure computers.
  • Starts with the data about the customer order and works forward toward a configuration based on that data.
  • Written in the OPS5 (forward-chaining rule-based) language.
Backward Chaining Systems
If the conclusion is known (the goal to be achieved) but the path to that conclusion is not, then reasoning backwards is called for, and the method is backward chaining.
• The consequent part of a rule specifies combinations of facts (goals) to be matched against the working memory.
• The condition part of the rule is then used as a set of further sub-goals to be proven/satisfied.
Backward Chaining example
Question: Does employee John get a computer?
Statement: John gets a computer.
Rule: If an employee is a programmer, then he gets a computer.
Backward chaining: Check the rule base to see what has to be true for John to get a computer: he must be a programmer. Is it a fact that John is a programmer? If so, he gets a computer.
Backward Chaining
• Start with a goal state.
• The system first checks whether the goal matches the initial facts given. If it does, the goal succeeds. If it doesn't, the system looks for rules whose conclusions match the goal.
• One such rule is chosen, and the system then tries to prove the facts in the preconditions of the rule using the same procedure, setting these as new goals to prove.
• The system needs to keep track of what goals it must prove to establish its main hypothesis.
Backward Chaining (example)
1. IF (lecturing X) AND (marking-practicals X) THEN (overworked X)
2. IF (month February) THEN (lecturing Alison)
3. IF (month February) THEN (marking-practicals Alison)
4. IF (overworked X) THEN (bad-mood X)
5. IF (slept-badly X) THEN (bad-mood X)
6. IF (month February) THEN (weather cold)
7. IF (year 1993) THEN (economy bad)
Initial facts: (month February) (year 1993)
Goal to be proved: (bad-mood Alison)
The goal is not satisfied by the initial facts. Rules 4 & 5 apply; assume rule 4 is chosen.
New goal: (overworked Alison). Rule 1 applies.
New goals: (lecturing Alison) and (marking-practicals Alison), which rules 2 and 3 reduce to the known fact (month February), so the original goal succeeds.
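The same trace can be sketched as a small recursive prover in Python; the encoding (tuples for facts, "?"-prefixed variables) is an illustrative choice:

```python
# Rules as (conclusion, premises); strings starting with "?" are variables.
RULES = [
    (("overworked", "?x"), [("lecturing", "?x"), ("marking-practicals", "?x")]),
    (("lecturing", "Alison"), [("month", "February")]),
    (("marking-practicals", "Alison"), [("month", "February")]),
    (("bad-mood", "?x"), [("overworked", "?x")]),
    (("bad-mood", "?x"), [("slept-badly", "?x")]),
    (("weather", "cold"), [("month", "February")]),
    (("economy", "bad"), [("year", "1993")]),
]

FACTS = {("month", "February"), ("year", "1993")}

def match(conclusion, goal):
    """Match a rule conclusion (may contain variables) against a ground goal."""
    if len(conclusion) != len(goal):
        return None
    bindings = {}
    for c, g in zip(conclusion, goal):
        if isinstance(c, str) and c.startswith("?"):
            if bindings.setdefault(c, g) != g:
                return None
        elif c != g:
            return None
    return bindings

def substitute(pattern, bindings):
    return tuple(bindings.get(p, p) for p in pattern)

def prove(goal):
    """A goal succeeds if it is a known fact, or if some rule concludes it
    and all of that rule's premises can be proven as sub-goals."""
    if goal in FACTS:
        return True
    for conclusion, premises in RULES:
        b = match(conclusion, goal)
        if b is not None and all(prove(substitute(p, b)) for p in premises):
            return True
    return False
```

`prove(("bad-mood", "Alison"))` succeeds via rules 4, 1, 2 and 3, exactly as in the hand trace.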
Conflict Resolution (I)
• Conflict resolution is the method used when more than one rule matches the facts asserted. There are several approaches:
• First in, first served
  • Fire the first rule that matches the contents of the working memory (the facts asserted).
• Last in, first served
  • The rule applied is the last rule that matched.
• Prioritization
  • The rule to apply is selected based on priorities set on the rules, with priority information usually provided by an expert or knowledge engineer.
Conflict Resolution (II)
• Specificity – The rule applied is usually the most specific rule, or the rule that matches the most facts.
• Recency – The rule applied is the rule that matches the most recently derived facts.
• Fired Rules – Involves not applying rules that have already been used.
Conflict Resolution (example)
A simple rule set (capital letters indicate variables):
1. IF (lecturing X) AND (marking-practicals X) THEN ADD (overworked X)
2. IF (month February) THEN ADD (lecturing Alison)
3. IF (month February) THEN ADD (marking-practicals Alison)
4. IF (overworked X) OR (slept-badly X) THEN ADD (bad-mood X)
5. IF (bad-mood X) THEN DELETE (happy X)
6. IF (lecturing X) THEN DELETE (researching X)
7. IF (marking-practicals X) THEN ADD (needs-rest X)
Working memory: (month February) (researching Alison) (overworked Alison)
• First in, first served: apply rule 2.
• Last in, first served: apply rule 3, giving (month February) (researching Alison) (overworked Alison) (marking-practicals Alison).
• Recency: apply the rule that matches the most recently derived fact – rule 7.
• Fired rules: don't fire the same rule again.
• Specificity: if two rules match but one of them matches more facts, choose that rule.
• Prioritization: if priorities are assigned to these rules, the highest-priority rule is fired.
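A sketch of how several of these strategies would pick from the set of matched rules. The "agenda" format below (rule id, number of conditions matched, priority, timestamp of the newest matched fact) is hypothetical, chosen only to make each strategy a one-liner:

```python
# Candidate rules whose conditions all currently match ("the agenda").
# Fields: (rule_id, num_conditions_matched, priority, newest_fact_timestamp)
agenda = [
    ("rule-2", 1, 5, 1),   # matched first, supported by the oldest fact
    ("rule-3", 1, 9, 2),   # highest priority
    ("rule-7", 2, 1, 3),   # most specific, matches the newest fact
]

first_in_first_served = agenda[0][0]                     # fire in match order
by_priority    = max(agenda, key=lambda r: r[2])[0]      # prioritization
by_specificity = max(agenda, key=lambda r: r[1])[0]      # most conditions wins
by_recency     = max(agenda, key=lambda r: r[3])[0]      # newest supporting fact wins
```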
Uncertainty
• The expert system must deal with the uncertainty that comes from the individual rules, conflict resolution, and incompatibilities among the rules. Certainty factors can be assigned to the rules, as in MYCIN.
Uncertainty in MYCIN
• Rules contain certainty factors (CFs).
• They make inexact inferences on a confidence scale of -1.0 to 1.0.
  • 1.0 represents complete confidence that a fact is true.
  • -1.0 represents complete confidence that it is false.
• CFs measure the association between the premise and action clauses of each rule. When a production rule succeeds because its premise clauses are true in the current context, the CFs of the component clauses (which indicate how strongly each clause is believed) are combined, and the resulting CF is used to modify the CF specified in the action clause.
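For the positive-evidence case, MYCIN's CF arithmetic can be sketched as follows; the helper names and the numeric example are illustrative, while the two formulas (weakest-premise scaling, and combination of independent positive evidence) follow the standard CF scheme:

```python
def rule_cf(premise_cfs, action_cf):
    """CF a rule contributes: the weakest premise CF, scaled by the rule's own CF."""
    return min(premise_cfs) * action_cf

def combine(cf1, cf2):
    """Combine two positive CFs for the same conclusion from independent rules."""
    return cf1 + cf2 * (1 - cf1)

# Two hypothetical rules supporting the same conclusion:
cf_a = rule_cf([0.9, 0.8], 0.8)   # min(0.9, 0.8) * 0.8 = 0.64
cf_b = rule_cf([0.5], 0.6)        # 0.5 * 0.6 = 0.30
total = combine(cf_a, cf_b)       # 0.64 + 0.30 * (1 - 0.64) = 0.748
```

Note that `combine` is commutative and keeps the result below 1.0, so adding more weak positive evidence raises confidence without ever reaching certainty.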
Explanation Facilities
• Explain the reasoning process used to arrive at a conclusion.
• Provide the user with a means of understanding the system's behavior.
• This is important because a consultation with a human expert will often require some explanation; many people would not accept the answers of an expert without some form of justification.
• E.g., a medical expert providing a diagnosis/treatment of a patient is expected to explain the reasoning behind his/her conclusions: the uncertain nature of this type of decision may demand a detailed explanation so that the patient concerned is aware of any risks, alternative treatments, etc.
Expert System Tools (I)
• PROLOG
  • A programming language that uses backward chaining.
• ART (Inference Corporation)
  • In 1984, Inference Corporation developed the Automated Reasoning Tool (ART), a forward-chaining system.
• CLIPS (NASA)
  • NASA took the forward-chaining capabilities and syntax of ART and released the "C Language Integrated Production System" (CLIPS) into the public domain.
• ART-IM (Inference Corporation)
  • Following the distribution of NASA's CLIPS, Inference Corporation implemented a forward-chaining-only derivative of ART/CLIPS called ART-IM.
• OPS5 (Carnegie Mellon University)
  • The first AI language used for a production system (XCON).
• Eclipse (The Haley Enterprise, Inc.)
  • A C/C++ inference engine that supports both forward and backward chaining.
Expert System Tools (II)
• Expert System Shells
  • Provide mechanisms for knowledge representation, reasoning, and explanation, e.g., EMYCIN.
• Knowledge Acquisition Tools
  • Programs that interact with experts to extract domain knowledge; they support inputting knowledge and maintaining knowledge-base consistency and completeness, e.g., MOLE, SALT.
Expert System Examples
• MYCIN (1972–80)
  • An interactive program that diagnoses certain infectious diseases, prescribes antimicrobial therapy, and can explain its reasoning in detail.
• PROSPECTOR
  • Provides advice on mineral exploration.
• XCON
  • Configures VAX computers.
• DENDRAL (1965–83)
  • A rule-based expert system that analyzes molecular structure. Using a plan-generate-test search paradigm and data from mass spectrometry and other sources, DENDRAL proposes plausible candidate structures for new or unknown chemical compounds.
Limitations
• Narrow domain
• Limited focus
• Inability to learn
• Maintenance problems
• Development cost