Template knowledge models
Presentation Transcript

  1. Template knowledge models: reusing knowledge model elements

  2. Lessons • Knowledge models partially reused in new applications • Type of task = main guide for reuse • Catalog of task templates • small set in this book • see also other repositories

  3. The need for reuse • prevents "re-inventing the wheel" • cost- and time-efficient • decreases complexity • quality assurance

  4. Task template • reusable combination of model elements • (provisional) inference structure • typical control structure • typical domain schema from task point-of-view • specific for a task type • supports top-down knowledge modeling

  5. A typology of tasks • range of task types is limited • advantage of knowledge engineering (KE) compared to general software engineering (SE) • background: cognitive science/psychology • several task typologies have been proposed in the literature • the typology here is based on the notion of “system”

  6. The term “system” • abstract term for object to which a task is applied. • in technical diagnosis: artifact or device being diagnosed • in elevator configuration: elevator to be designed • does not need to exist (yet)

  7. Analytic versus synthetic tasks • analytic tasks • system pre-exists • it is typically not completely "known" • input: some data about the system • output: some characterization of the system • synthetic tasks • system does not yet exist • input: requirements about the system to be constructed • output: constructed system description

  8. Task hierarchy

  9. Structure of template description in catalog • General characterization • typical features of a task • Default method • roles, sub-functions, control structure, inference structure • Typical variations • frequently occurring refinements/changes • Typical domain-knowledge schema • assumptions about underlying domain-knowledge structure

  10. Classification • establish the correct class for an object • object should be available for inspection • "natural" objects • examples: rock classification, apple classification • terminology: object, class, attribute, feature • one of the simplest analytic tasks; many methods • other analytic tasks are sometimes reduced to a classification problem, especially diagnosis

  11. Classification: pruning method • generate all classes to which the object may belong • specify an object attribute • obtain the value of the attribute • remove all classes that are inconsistent with this value • iterate until a single candidate class remains

  12. Classification: inference structure

  13. Classification: method control

    while new-solution generate(object -> candidate) do
        candidate-classes := candidate union candidate-classes;
    end while
    while new-solution specify(candidate-classes -> attribute) and
          length candidate-classes > 1 do
        obtain(attribute -> new-feature);
        current-feature-set := new-feature union current-feature-set;
        for-each candidate in candidate-classes do
            match(candidate + current-feature-set -> truth-value);
            if truth-value = false
            then candidate-classes := candidate-classes subtract candidate;
        end for-each
    end while
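
A minimal Python sketch may help make this control structure concrete. It is an illustration under assumed representations, not code from the catalog: each class model is a plain dict of expected attribute values, and obtain is a callback (e.g. a user prompt) that supplies the value of an attribute.

    def prune_classify(class_models, obtain):
        candidates = set(class_models)        # generate: all classes are candidates
        features = {}                         # attribute values obtained so far

        def unseen_attribute():
            # specify: any attribute of a remaining candidate not yet asked about
            for cls in candidates:
                for attr in class_models[cls]:
                    if attr not in features:
                        return attr
            return None

        while len(candidates) > 1:
            attr = unseen_attribute()
            if attr is None:
                break                         # no discriminating attribute left
            features[attr] = obtain(attr)     # obtain the attribute value
            # match: drop classes inconsistent with the new feature;
            # a class that says nothing about the attribute stays consistent
            candidates = {c for c in candidates
                          if class_models[c].get(attr, features[attr]) == features[attr]}
        return candidates

For instance, prune_classify({"granite": {"grain": "coarse"}, "basalt": {"grain": "fine"}}, lambda a: input(a + "? ")) asks for the grain size and returns the class(es) consistent with the answer.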

  14. Classification: method variations • Limited candidate generation • Different forms of attribute selection • decision tree • information theory • user control • Hierarchical search through class structure
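
The "information theory" variation of attribute selection can be read as: let specify pick the attribute whose value is expected to prune the candidate set the most. A hedged Python sketch, assuming examples are (feature-dict, class) pairs (a hypothetical format, not the book's):

    import math
    from collections import Counter

    def entropy(classes):
        counts = Counter(classes)
        total = sum(counts.values())
        return -sum((n / total) * math.log2(n / total) for n in counts.values())

    def information_gain(examples, attribute):
        # expected reduction in class entropy from learning this attribute
        base = entropy([c for _, c in examples])
        remainder = 0.0
        for value in {f[attribute] for f, _ in examples}:
            subset = [c for f, c in examples if f[attribute] == value]
            remainder += len(subset) / len(examples) * entropy(subset)
        return base - remainder

    # specify() then selects:
    # best = max(attributes, key=lambda a: information_gain(examples, a))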

  15. Classification: domain schema

  16. Rock classification

  17. Nested classification

  18. Rock classification prototype

  19. Assessment • find decision category for a case based on domain-specific norms. • typical domains: financial applications (loan application), community service • terminology: case, decision, norms • some similarities with monitoring • differences: • timing: assessment is more static • different output: decision versus discrepancy

  20. Assessment: abstract & match method • Abstract the case data • Specify the norms applicable to the case • e.g. “rent-fits-income”, “correct-household-size” • Select a single norm • Compute a truth value for the norm with respect to the case • See whether this leads to a decision • Repeat norm selection and evaluation until a decision is reached

  21. Assessment: inference structure (inferences: abstract, specify, select, evaluate, match; roles: case, abstracted case, norms, norm, norm value, decision)

  22. Assessment: method control

    while new-solution abstract(case-description -> abstracted-case) do
        case-description := abstracted-case;
    end while
    specify(abstracted-case -> norms);
    repeat
        select(norms -> norm);
        evaluate(abstracted-case + norm -> norm-value);
        evaluation-results := norm-value union evaluation-results;
    until has-solution match(evaluation-results -> decision);
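
The same control can be sketched in Python. This is an illustration under loud assumptions, not the book's code: norms are predicates over a case dict, abstraction rules return a rewritten case or None, and the match step is simplified to "reject on the first failed norm, accept when all hold".

    def assess(case, abstraction_rules, norms):
        changed = True
        while changed:                            # abstract until no rule fires
            changed = False
            for rule in abstraction_rules:
                abstracted = rule(case)
                if abstracted is not None and abstracted != case:
                    case, changed = abstracted, True
        results = {}
        for name, norm in norms.items():          # select one norm at a time
            results[name] = norm(case)            # evaluate against the case
            if not results[name]:                 # match: failed norm -> decision
                return "reject", results
        return "accept", results

With hypothetical norms such as {"rent-fits-income": lambda c: c["rent"] <= 0.3 * c["income"]} this returns a decision plus the evaluation results that justify it.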

  23. Assessment control: UML activity-diagram notation (abstract loops while [more abstractions], then specify norms on [no more abstractions]; select norm, evaluate and match repeat while [match fails: no decision] until [match succeeds: decision found])

  24. Assessment: method variations • norms might be case-specific • cf. housing application • case abstraction may not be needed • knowledge-intensive norm selection • random, heuristic, statistical • can be key to efficiency • sometimes dictated by human expertise • only acceptable if done in a way understandable to experts

  25. Assessment: domain schema

  26. Claim handling for unemployment benefits

  27. Decision rules for claim handling

  28. Diagnosis • find the fault that causes a system to malfunction • example: diagnosis of a copier • terminology: • complaint/symptom, hypothesis, differential, finding(s)/evidence, fault • nature of the fault varies • state, chain, component • should have some model of system behavior • default method: simple causal model • sometimes reduced to a classification task • direct associations between symptoms and faults • automation feasible in technical domains

  29. Diagnosis: causal covering method • Find candidate causes (hypotheses) for the complaint using a causal network • Select a hypothesis • Specify an observable for this hypothesis and obtain its value • Verify each hypothesis to see whether it is consistent with the new finding • Continue this process until a single hypothesis is left or no more observables are available

  30. Diagnosis: inference structure

  31. Diagnosis: method control

    while new-solution cover(complaint -> hypothesis) do
        differential := hypothesis add differential;
    end while
    repeat
        select(differential -> hypothesis);
        specify(hypothesis -> observable);
        obtain(observable -> finding);
        evidence := finding add evidence;
        for-each hypothesis in differential do
            verify(hypothesis + evidence -> result);
            if result = false
            then differential := differential subtract hypothesis;
        end for-each
    until length differential =< 1 or "no observables left";
    faults := differential;
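
A small Python sketch of causal covering, again with assumed representations rather than the book's code: the causal network is a dict from a node to the effects it can produce, cover is reachability to the complaint, and expected maps (hypothesis, observable) pairs to the value that hypothesis predicts (absent pairs count as consistent with anything).

    def covers(network, cause, complaint):
        frontier, seen = [cause], set()
        while frontier:                       # reachability in the causal network
            node = frontier.pop()
            if node == complaint:
                return True
            if node not in seen:
                seen.add(node)
                frontier.extend(network.get(node, []))
        return False

    def diagnose(network, complaint, observables, expected, obtain):
        # cover: every node that can produce the complaint enters the differential
        differential = {c for c in network if covers(network, c, complaint)}
        for obs in observables:               # specify the next observable
            if len(differential) <= 1:
                break
            finding = obtain(obs)             # obtain its value
            # verify: keep hypotheses whose prediction matches the finding
            differential = {h for h in differential
                            if expected.get((h, obs), finding) == finding}
        return differential                   # the remaining fault candidates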

  32. Diagnosis: method variations • inclusion of abstractions • simulation methods • see literature on model-based diagnosis • Benjamins' library of diagnosis methods

  33. Diagnosis: domain schema

  34. Monitoring • analyze an ongoing process to find out whether it behaves according to expectations • terminology: • parameter, norm, discrepancy, historical data • main features: • dynamic nature of the system • cyclic task execution • output is "just" a discrepancy => no explanation • often: coupling of monitoring and diagnosis • output of monitoring is input to diagnosis

  35. Monitoring: data-driven method • starts when new findings are received • for a finding, a parameter and a norm value are specified • comparison of the finding with the norm generates a difference description • this difference is classified as a discrepancy, using data from previous monitoring cycles

  36. Monitoring: inference structure

  37. Monitoring: method control

    receive(new-finding);
    select(new-finding -> parameter);
    specify(parameter -> norm);
    compare(norm + new-finding -> difference);
    classify(difference + historical-data -> discrepancy);
    historical-data := new-finding add historical-data;
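
One monitoring cycle is small enough to sketch in Python. The names are hypothetical: a finding is a (parameter, value) pair, norms maps a parameter to (expected value, tolerance), and the classify step is reduced to a persistence check against earlier cycles.

    def monitoring_cycle(finding, norms, history):
        parameter, value = finding                  # receive + select
        norm, tolerance = norms[parameter]          # specify the norm
        difference = value - norm                   # compare
        # classify: only a persistent difference counts as a discrepancy
        recent = [d for p, d in history if p == parameter][-2:]
        discrepancy = (abs(difference) > tolerance
                       and all(abs(d) > tolerance for d in recent))
        history.append((parameter, difference))     # update historical data
        return discrepancy

The caller invokes this once per incoming finding; history plays the role of historical-data in the control structure above.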

  38. Monitoring: method variations • model-driven monitoring • system has the initiative • typically executed at regular points in time • example: software project management • classification function treated as a task in its own right • apply classification method • add a data-abstraction inference

  39. Prediction • analytic task with some synthetic features • analyzes current system behavior to construct a description of the system state at a future point in time • example: weather forecasting • often a sub-task in diagnosis • also found in knowledge-intensive modules of teaching systems, e.g. for physics • inverse: retrodiction (e.g. the big-bang theory)

  40. Synthesis • Given a set of requirements, construct a system description that fulfills these requirements

  41. “Ideal” synthesis method • Operationalize requirements • preferences and constraints • Generate all possible system structures • Select sub-set of valid system structures • obey constraints • Order valid system structures • based on preferences
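
For tiny search spaces the ideal method can be written down directly, which also shows why it does not scale. A Python sketch with hypothetical names: choices maps each component to its alternatives, constraints are predicates, and preference scores a design (lower is better).

    from itertools import product

    def ideal_synthesis(choices, constraints, preference):
        keys = list(choices)
        # generate all possible system structures (exponential in general)
        structures = [dict(zip(keys, combo))
                      for combo in product(*(choices[k] for k in keys))]
        # select the valid sub-set: structures obeying every constraint
        valid = [s for s in structures if all(c(s) for c in constraints)]
        # order the valid structures by preference, best-preferred first
        return sorted(valid, key=preference)

The exponential generate step is the reason practical synthesis methods, such as propose & revise below, interleave generation with verification and revision instead.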

  42. Synthesis: inference structure

  43. Design • synthetic task • system to be constructed is physical artifact • example: design of a car • can include creative design of components • creative design is too hard a nut to crack for current knowledge technology • sub-type of design which excludes creative design => configuration design

  44. Configuration design • given predefined components, find an assembly that satisfies the requirements + obeys the constraints • example: configuration of an elevator or a PC • terminology: component, parameter, constraint, preference, requirement (hard & soft) • form of design that is well suited for automation, although computationally demanding

  45. Elevator configuration: knowledge base reuse

  46. Configuration: propose & revise method • simple basic loop: • propose a design extension • verify the new design • if verification fails, revise the design • specific domain-knowledge requirements • revise strategies • method can also be used for other synthetic tasks • assignment with backtracking • skeletal planning

  47. Configuration: method decomposition

  48. Configuration: method control

    operationalize(requirements -> hard-reqs + soft-reqs);
    specify(requirements -> skeletal-design);
    while new-solution propose(skeletal-design + design + soft-reqs -> extension) do
        design := extension union design;
        verify(design + hard-reqs -> truth-value + violation);
        if truth-value = false
        then
            critique(violation + design -> action-list);
            repeat
                select(action-list -> action);
                modify(design + action -> design);
                verify(design + hard-reqs -> truth-value + violation);
            until truth-value = true;
    end while
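
A hedged Python sketch of this loop, with assumed representations rather than the book's code: each design parameter has a proposer (a function from the partial design to a value), constraints maps a name to a predicate, and fixes maps a violated constraint to alternative (parameter, value) repairs, standing in for the critique/select/modify knowledge.

    def propose_and_revise(parameters, proposers, constraints, fixes):
        design = {}
        for p in parameters:
            design[p] = proposers[p](design)               # propose an extension
            violated = next((c for c in constraints
                             if not constraints[c](design)), None)   # verify
            while violated is not None:                    # critique + revise
                for param, value in fixes.get(violated, []):
                    design[param] = value                  # modify the design
                    if constraints[violated](design):      # re-verify this constraint
                        break
                else:
                    raise ValueError("no fix repairs " + violated)
                violated = next((c for c in constraints
                                 if not constraints[c](design)), None)
        return design

Unlike the full method, this sketch does not order fixes or guard against fixes that oscillate between violations; real propose & revise applications rely on domain-specific fix strategies for that.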

  49. Configuration: method variations • perform verification plus revision only once a value has been proposed for all design elements • can have a large impact on the competence of the method • avoid the use of fix knowledge • fixes are search heuristics to navigate the potentially extensive space of alternative designs • alternative: chronological backtracking

  50. Configuration: domain schema