
Hybrid Systems


Presentation Transcript


  1. Hybrid Systems Two examples of the Combination of Rule-Based Systems and neural nets By Pieter Buzing

  2. Plan • Introduction: • Knowledge Based System vs Neural Net • Basic hybrid technique • Fu’s system • KBANN system • Comparison (rule improvement, semantics) • Conclusions

  3. Introduction • Knowledge Based System • Neural Network • Characteristics KBS & NN • Basic hybrid technique

  4. Knowledge Based System • Rule base and fact base • Facts → Conclusions • Certainty Factors in [-1, 1] • Example rule: IF smart AND ambitious THEN rich (CF = 0.7) • Given: CF(smart) = 0.8, CF(ambitious) = 0.5 • Conclude: CF(rich) = 0.7 × min(0.8, 0.5) = 0.35 (see the sketch below)
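The CF calculation on this slide fits in a few lines. A minimal Python sketch, assuming the MYCIN-style combination stated above (rule CF times the weakest antecedent CF); the function name is mine:

```python
def infer_cf(rule_cf, antecedent_cfs):
    """Propagate a certainty factor through one rule: rule CF times the weakest antecedent."""
    return rule_cf * min(antecedent_cfs)

# IF smart AND ambitious THEN rich (CF = 0.7)
cf_rich = infer_cf(0.7, [0.8, 0.5])
print(cf_rich)  # 0.7 * min(0.8, 0.5) = 0.35
```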

  5. Neural Network • Nodes and connections • Layers: input, hidden and output nodes • Aim: find the right weight for each connection • Trained with examples: minimize the error (a minimal training sketch follows below)
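A minimal sketch of the training loop described here, assuming a standard one-hidden-layer network with sigmoid units trained by gradient descent on squared error; the network size, learning rate, and XOR data are illustrative choices, not taken from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])     # example inputs
y = np.array([[0.], [1.], [1.], [0.]])                     # target outputs (XOR)

W1, b1 = rng.normal(scale=0.5, size=(2, 4)), np.zeros(4)   # input -> hidden weights
W2, b2 = rng.normal(scale=0.5, size=(4, 1)), np.zeros(1)   # hidden -> output weights
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for _ in range(10000):
    h = sigmoid(X @ W1 + b1)               # hidden activations
    out = sigmoid(h @ W2 + b2)             # network output
    d_out = (out - y) * out * (1 - out)    # output-layer error signal
    d_hid = (d_out @ W2.T) * h * (1 - h)   # error backpropagated to the hidden layer
    W2 -= 0.5 * h.T @ d_out; b2 -= 0.5 * d_out.sum(0)
    W1 -= 0.5 * X.T @ d_hid; b1 -= 0.5 * d_hid.sum(0)
```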

  6. Characteristics

  7. Basic hybrid technique Initialize the neural network with domain knowledge, so that both the architecture and the initial weights are grounded in the rule base. Use the following mapping (one possible reading is sketched below):
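The mapping itself appeared as a table on the slide; the sketch below is one plausible reading of it (all rule and concept names are invented): observable facts become input units, intermediate conclusions hidden units, final conclusions output units, and each rule seeds an initial connection weight.

```python
from collections import defaultdict

rules = [
    # (antecedents, consequent, rule certainty factor)
    (["smart", "ambitious"], "successful", 0.7),
    (["successful", "lucky"], "rich", 0.9),
]

facts = {"smart", "ambitious", "lucky"}                             # observable facts -> input units
consequents = {c for _, c, _ in rules}
outputs = consequents - {a for ants, _, _ in rules for a in ants}   # final conclusions -> output units
hidden = consequents - outputs                                      # intermediate conclusions -> hidden units

weights = defaultdict(dict)                                         # initial weight matrix
for antecedents, consequent, cf in rules:
    for a in antecedents:
        weights[consequent][a] = cf                                 # rule strength seeds the initial weight

print("inputs:", facts, "hidden:", hidden, "outputs:", outputs)
print(dict(weights))
```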

  8. Fu’s system (1989) • Proposed by: Li-Min Fu, Wisconsin • Objective: let the NN deal with an incorrect KB • Construction: conceptual network with CFs; AND-nodes preserve the meaning of conjunctions • Training: backpropagation plus hill-climbing, because the AND function (min) is not differentiable (see the sketch below) • Error handling: identifies wrong rules • Semantics: rules remain ‘visible’ in the network
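Because the AND unit computes a minimum, its error surface has non-differentiable kinks, which is why hill-climbing is used alongside backpropagation. A toy sketch of such a hill-climbing update, not Fu's exact procedure; the step size, example data, and function names are mine:

```python
def and_node(weights, inputs):
    """AND unit: the minimum of the weighted inputs (not differentiable at the 'kinks')."""
    return min(w * x for w, x in zip(weights, inputs))

def sq_error(weights, examples):
    return sum((and_node(weights, x) - t) ** 2 for x, t in examples)

def hill_climb(weights, examples, step=0.05, iterations=200):
    weights = list(weights)
    for _ in range(iterations):
        for i in range(len(weights)):
            for delta in (step, -step):
                candidate = list(weights)
                candidate[i] += delta
                if sq_error(candidate, examples) < sq_error(weights, examples):
                    weights = candidate            # keep the move only if it reduces the error
    return weights

# Targets generated from the slide's rule (CF = 0.7 times the weakest antecedent).
examples = [((0.8, 0.5), 0.35), ((0.9, 0.9), 0.63)]
print(hill_climb([1.0, 1.0], examples))            # weights drift toward roughly [0.7, 0.7]
```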

  9. KBANN system (1994) • Proposed by: Towell & Shavlik, Wisconsin • Objective: use the KB to initialize a NN • Construction: one unit per concept; extra units and connections are added (see the sketch below) • Training: backpropagation • Error handling: weight adjustment • Semantics: too many connections to make sense of the trained network
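A hedged sketch of the KBANN-style rule translation: each positive antecedent gets a strong link of weight W (a negated one gets -W), the bias is set so the unit fires only when the whole conjunction holds, and low-weight links to the remaining units are added so that training can discover connections missing from the KB. W = 4.0 follows the published KBANN default; the rule and unit names are invented.

```python
import random

W = 4.0                                                   # strong link weight (KBANN's published default)
rule = {"consequent": "rich", "positive": ["smart", "ambitious"], "negated": []}
all_units = ["smart", "ambitious", "lucky", "lazy"]       # every unit in the layer below "rich"

weights = {}
for unit in rule["positive"]:
    weights[unit] = W                                     # positive antecedent -> weight +W
for unit in rule["negated"]:
    weights[unit] = -W                                    # negated antecedent -> weight -W
for unit in all_units:
    weights.setdefault(unit, random.uniform(-0.05, 0.05)) # extra low-weight link added by KBANN
bias = -(len(rule["positive"]) - 0.5) * W                 # unit fires only if all antecedents are active

print(weights, "bias:", bias)
```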

  10. Comparison (1): coping with erroneous rules • Fu considers a rule incorrect when its weight change exceeds a threshold (sketch below) • KBANN deals with errors implicitly: it simply adjusts the weights of an inconsistent rule • Fu can still identify faulty rules when up to 12% of the rules are corrupted • KBANN outperforms a standard NN under 10% large or 30% small changes to the rule base
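Fu's threshold test as described on this slide can be illustrated directly: compare each rule's weight before and after training and flag it when the change is too large. The 0.2 threshold and the rule names below are illustrative only.

```python
# Weights encoding each rule before and after training (hypothetical values).
initial = {"smart->rich": 0.70, "lucky->rich": 0.30, "lazy->poor": 0.60}
trained = {"smart->rich": 0.68, "lucky->rich": -0.10, "lazy->poor": 0.55}

THRESHOLD = 0.2   # illustrative choice
suspect = [r for r in initial if abs(trained[r] - initial[r]) > THRESHOLD]
print("possibly incorrect rules:", suspect)   # ['lucky->rich']
```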

  11. Comparison (2): maintainability of semantics • Fu: every unit keeps its meaning • KBANN: (randomly connected) units are added • Fu: conjunction units keep their original semantic basis • KBANN: all nodes are connected, so every node becomes one big ‘conjunction’ • Fu’s weights are CFs; what do KBANN’s weights mean?

  12. Conclusions • Coping with erroneous rules • Fu can be used to verify rules: it identifies inconsistent ones • KBANN handles errors convincingly • Maintainability of semantics • Fu succeeds in its comprehensibility goal • KBANN loses its semantics: the KB is merely a starting point • Mind you: the two systems have different goals, periods and domains
