
Induction of Qualitative Trees




  1. Induction of Qualitative Trees. Dorian Šuc and Ivan Bratko, AI Lab, Faculty of Computer and Information Science, University of Ljubljana, Slovenia

  2. Overview • Discovering qualitative relations in numerical data • Qualitative trees • The QUIN algorithm • Experiments with QUIN • Application of QUIN to skill reconstruction in systems control

  3. Qualitative vs. quantitative models • Less detailed than quantitative models • Often easier to understand • Abstractions of numerical models: numerical values → qualitative values, real functions → qualitative constraints • Our motivation: application in reconstruction of control skill (behavioral cloning) • Applied to control of crane, acrobot, bike, ...

  4. Example: behaviour of gas Quantitative law: Pressure * Volume / Temperature = const. Qualitative law expressed by QCF: Pressure = M+,-(Temperature, Volume)
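
The abstraction step can be checked with a short derivation (assuming const > 0 and positive Temperature and Volume; this derivation is added here for clarity, it is not on the slide):

    P = \frac{c\,T}{V}, \qquad
    \frac{\partial P}{\partial T} = \frac{c}{V} > 0, \qquad
    \frac{\partial P}{\partial V} = -\frac{c\,T}{V^{2}} < 0
    \quad\Longrightarrow\quad P = M^{+,-}(T, V)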

  5. Program QUIN (QUalitative INduction): numerical examples → QUIN → qualitative tree. A qualitative tree is similar to a decision tree, with qualitative constraints in the leaves.

  6. Example problem for QUIN. Noisy examples: z = x² - y² + noise (st. dev. 50)
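
A minimal sketch of how such a learning set might be generated (the sample size, attribute ranges and seed are assumptions, not given on the slide):

    import random

    def make_examples(n=400, seed=0):
        """Noisy examples of z = x^2 - y^2, Gaussian noise with st. dev. 50."""
        rng = random.Random(seed)
        examples = []
        for _ in range(n):
            x = rng.uniform(-20.0, 20.0)
            y = rng.uniform(-20.0, 20.0)
            z = x ** 2 - y ** 2 + rng.gauss(0.0, 50.0)
            examples.append((x, y, z))
        return examples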

  7. Qualitative patterns in data x > 0 & y > 0 => z = M+,-(x,y)

  8. Induced qualitative tree for function z = x² - y²
    x > 0?
      yes: y > 0?  yes: z = M+,-(x, y);  no (y ≤ 0): z = M+,+(x, y)
      no (x ≤ 0): y > 0?  yes: z = M-,-(x, y);  no (y ≤ 0): z = M-,+(x, y)
    The leaf z = M+,-(x, y) reads: z is monotonically increasing with x and monotonically decreasing with y.
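
Read as a piece of code, the induced tree maps each region of the attribute space to a qualitative constraint. A sketch of that reading (the tree is from the slide, the function form is mine):

    def qcf_for(x, y):
        """Leaf QCF of the induced tree for z = x^2 - y^2, by region of (x, y)."""
        if x > 0:
            return "z = M+,-(x, y)" if y > 0 else "z = M+,+(x, y)"
        return "z = M-,-(x, y)" if y > 0 else "z = M-,+(x, y)"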

  9. Qualitatively Constrained Functions (QCFs). M^{s1, ..., sm}: R^m → R, where each si is + or -. The signs si indicate directions of change: if si = +, the function monotonically increases in the i-th attribute (is “positively related” to it); if si = -, the function is “negatively related” to the i-th attribute.

  10. QCF consistency with examples • Each pair of examples (e, f) defines a qualitative change vector q with respect to a no-change threshold • A QCF is consistent with (e, f) if the QCF permits q
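
A rough sketch of consistency and ambiguity in code, under one simplified reading: the QCF permits every class-change sign that some changing attribute can produce, it is consistent with a pair when the observed class change is among the permitted signs (or when both signs are permitted), and it is ambiguous when both signs are permitted. The names and the threshold value are illustrative assumptions:

    def qualitative_change(e, f, threshold=1e-6):
        """Qualitative change vector between examples e and f: -1, 0 or +1 per value."""
        return tuple(0 if abs(b - a) <= threshold else (1 if b > a else -1)
                     for a, b in zip(e, f))

    def qcf_check(signs, attr_changes, class_change):
        """Return (consistent, ambiguous) for a QCF with monotonicity signs `signs`,
        given the qualitative changes of the attributes and of the class."""
        # Class-change signs the QCF permits: s_i * q_i for every attribute whose change is non-zero.
        permitted = {s * q for s, q in zip(signs, attr_changes) if q != 0}
        if not permitted:
            return True, False          # no attribute changed: the pair is uninformative here
        ambiguous = len(permitted) > 1  # the QCF also permits the opposite class change
        consistent = ambiguous or class_change in permitted
        return consistent, ambiguous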

  11. QCF ambiguity • A QCF may be consistent with a qualitative change vector q and yet ambiguous w.r.t. q • A QCF is ambiguous w.r.t. q if it also permits qualitative changes of the class other than those in q

  12. Error-cost of a QCF • The error-cost of a QCF w.r.t. an example set is defined as a weighted encoding length, where example pairs are weighted by the proximity of the examples concerned • The error-cost of a QCF considers: encoding of the QCF + encoding of inconsistent predictions by the QCF + encoding of ambiguous predictions by the QCF
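
The exact encoding-length formula is not given on the slide; the sketch below only illustrates the structure of the cost (a term for the QCF itself plus proximity-weighted penalties for inconsistent and ambiguous example pairs). The weighting function, penalty constants and argument layout are arbitrary choices; `check` can be a consistency check like the one sketched above:

    import math

    def error_cost(signs, pairs, check, beta=1.0):
        """Illustrative error-cost of a QCF.

        pairs: (attr_change_vector, class_change, distance) triples for example pairs
        check: function (signs, attr_changes, class_change) -> (consistent, ambiguous)
        """
        cost = 1.0 + len(signs)                 # crude stand-in for the encoding length of the QCF itself
        for attr_changes, class_change, dist in pairs:
            weight = math.exp(-beta * dist)     # nearby example pairs count more
            consistent, ambiguous = check(signs, attr_changes, class_change)
            if not consistent:
                cost += 2.0 * weight            # inconsistent prediction: largest penalty
            elif ambiguous:
                cost += 1.0 * weight            # ambiguous prediction: smaller penalty
        return cost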

  13. Outline of QUIN algorithm • Top-down greedy algorithm to induce qualitative tree • For every possible split, find the “most consistent” QCF (min. error-cost) for each subset of examples • Select the best split according to MDL
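
A schematic, much simplified rendering of that outline (the split scoring and stopping rule are placeholders rather than the actual MDL criterion; `best_qcf_for` is assumed to return the minimum-error-cost QCF and its cost for a subset of examples, e.g. via the greedy search sketched further below):

    def candidate_thresholds(examples, attr):
        """Midpoints between consecutive distinct values of one attribute."""
        values = sorted({e[0][attr] for e in examples})
        return [(a + b) / 2 for a, b in zip(values, values[1:])]

    def induce_tree(examples, attrs, best_qcf_for, min_examples=10, split_penalty=1.0):
        """Top-down greedy induction of a qualitative tree (schematic).

        examples:     list of (attribute_values, class_value) pairs
        best_qcf_for: function returning (qcf, error_cost) for a subset of examples
        """
        qcf, leaf_cost = best_qcf_for(examples)
        if len(examples) < min_examples:
            return ("leaf", qcf)
        best = None
        for attr in attrs:
            for t in candidate_thresholds(examples, attr):
                left = [e for e in examples if e[0][attr] <= t]
                right = [e for e in examples if e[0][attr] > t]
                if not left or not right:
                    continue
                # "most consistent" QCF (minimum error-cost) for each subset, then score the split
                cost = best_qcf_for(left)[1] + best_qcf_for(right)[1] + split_penalty
                if best is None or cost < best[0]:
                    best = (cost, attr, t, left, right)
        if best is None or best[0] >= leaf_cost:   # splitting does not pay off: make a leaf
            return ("leaf", qcf)
        _, attr, t, left, right = best
        return ("split", attr, t,
                induce_tree(left, attrs, best_qcf_for, min_examples, split_penalty),
                induce_tree(right, attrs, best_qcf_for, min_examples, split_penalty))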

  14. Heuristics in QUIN • Finding best QCF for a set of examples exponential in # attributes • Greedy heuristic to find a “good” QCF; complexity quadratic in # attributes • In error-cost computation: sum over k nearest neighbours only
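
One plausible reading of the quadratic greedy search, offered only as a guess at its structure (the published procedure may differ, and the k-nearest-neighbour restriction on the error-cost is assumed to live inside `cost_of`): start from the best single-attribute QCF and repeatedly add the attribute/sign pair that lowers the error-cost most, stopping when no extension improves it.

    def greedy_best_qcf(attrs, cost_of):
        """Greedy QCF search, roughly quadratic in the number of attributes (illustrative).

        attrs:   attribute identifiers
        cost_of: function mapping a QCF, represented as {attr: sign}, to its error-cost
        """
        qcf, best_cost = {}, float("inf")
        while len(qcf) < len(attrs):
            # every one-attribute extension of the current QCF, with either sign
            extensions = [{**qcf, a: s} for a in attrs if a not in qcf for s in (+1, -1)]
            best_ext = min(extensions, key=cost_of)
            if cost_of(best_ext) >= best_cost:
                break                    # no extension improves the error-cost
            qcf, best_cost = best_ext, cost_of(best_ext)
        return qcf, best_cost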

  15. Learning QCFs. Example data, generated from Pres = 2 · Temp / Vol:

      Temp     Vol      Pres
      315.00   56.00    11.25
      315.00   62.00    10.16
      330.00   50.00    13.20
      300.00   50.00    12.00
      300.00   55.00    10.90

    • For each pair of examples form a qualitative change vector

  16. Learning QCFs. An example qualitative change vector: qTemp = neg, qVol = neg, qPres = pos. For each candidate QCF, the slide lists which numbered qualitative change vectors it is inconsistent with and which it is ambiguous for:

      QCF                 Incons.   Amb.
      M+(Temp)            3         1
      M-(Temp)            2,4       1
      M+(Vol)             1,2,3     /
      M-(Vol)             4         /
      M+,+(Temp,Vol)      1,3       2
      M+,-(Temp,Vol)      /         3,4
      M-,+(Temp,Vol)      1,2       3,4
      M-,-(Temp,Vol)      4         2

    Select the QCF with the minimal error-cost.
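
A small, self-contained worked example over the five gas-law examples above. It enumerates all example pairs (so the counts need not match the slide, which numbers only selected change vectors), but the qualitative conclusion is the same: M+,-(Temp, Vol) is the only two-attribute QCF with no inconsistent pairs.

    # (Temp, Vol, Pres) examples from the slide; QCF signs apply to (Temp, Vol).
    examples = [(315.0, 56.0, 11.25), (315.0, 62.0, 10.16), (330.0, 50.0, 13.20),
                (300.0, 50.0, 12.00), (300.0, 55.0, 10.90)]

    def qchange(e, f, eps=1e-6):
        """Qualitative change vector between two examples: -1, 0 or +1 per value."""
        return tuple(0 if abs(b - a) <= eps else (1 if b > a else -1) for a, b in zip(e, f))

    def evaluate(signs):
        """Count (inconsistent, ambiguous) example pairs for one QCF over (Temp, Vol)."""
        incons = ambig = 0
        for i in range(len(examples)):
            for j in range(i + 1, len(examples)):
                q = qchange(examples[i], examples[j])
                permitted = {s * qa for s, qa in zip(signs, q[:2]) if qa != 0}
                if len(permitted) > 1:
                    ambig += 1                  # both class directions permitted
                elif permitted and q[2] not in permitted:
                    incons += 1                 # prediction contradicts the observed change
        return incons, ambig

    print("M+,-(Temp,Vol):", evaluate((+1, -1)))   # the true relation: expect 0 inconsistent pairs
    print("M+,+(Temp,Vol):", evaluate((+1, +1)))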

  17. ep-QUIN, a simplified QUIN • ep-QUIN uses every pair of examples to evaluate a QCF, not just near neighbours • does not weight examples by proximity • does not search for the best QCF heuristically • its performance is inferior to QUIN's

  18. Problem with ep-QUIN, example • 12 learning examples that correspond to 3 linear functions • ep-QUIN does not consider the locality of qualitative changes • The induced qualitative tree does not correspond to intuition

  19. QUIN does it better • The heuristic QUIN algorithm considers the locality and consistency of qualitative change vectors • QUIN considers the proximity of examples: qualitative change vectors of nearby points weigh more • QUIN finds the 3 groups of examples

  20. Experimental evaluation • On a set of artificial domains: • Results by QUIN better than ep-QUIN • QUIN can handle noisy data well • QUIN finds qualitative relations corresponding to our intuition • QUIN in skill reconstruction: • QUIN used to induce qual. control strategies from examples of the human control performance • Experiments in the crane domain

  21. Experimental evaluation in artificial domains • A set of artificial domains: real functions with up to 4 arguments • Examples uniformly distributed over the attributes • 2 irrelevant attributes added • Noise added, various levels of noise

  22. Artificial test domains • Sin: c = sin(πx/10) • SinLn: c = x/10 + sign(x)·sin(πx/10) • Poli: c = ln(10⁴ + |(x+16)(x+5)(x-5)(x-16)|) • Signs: c = sign(u+0.5)·(x-10)², if v ≥ 0; sign(u-0.5)·(y+10)², otherwise • QuadA: c = x² - y² • QuadB: c = (x-5)² - (y-10)² • SQuadB: c = sign(u)·((x-5)² - (y-10)²) • YSinX: c = y·sin(πx/10)
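
For reference, the test functions written out in code. The π factors and the comparison on v in Signs are reconstructions of symbols lost in the transcript, so treat them as assumptions:

    import math

    def sign(v):
        """Sign function: -1, 0 or +1."""
        return (v > 0) - (v < 0)

    def c_sin(x):          return math.sin(math.pi * x / 10)
    def c_sinln(x):        return x / 10 + sign(x) * math.sin(math.pi * x / 10)
    def c_poli(x):         return math.log(1e4 + abs((x + 16) * (x + 5) * (x - 5) * (x - 16)))
    def c_quada(x, y):     return x ** 2 - y ** 2
    def c_quadb(x, y):     return (x - 5) ** 2 - (y - 10) ** 2
    def c_squadb(x, y, u): return sign(u) * ((x - 5) ** 2 - (y - 10) ** 2)
    def c_ysinx(x, y):     return y * math.sin(math.pi * x / 10)

    def c_signs(x, y, u, v):
        if v >= 0:                            # comparison symbol reconstructed, see above
            return sign(u + 0.5) * (x - 10) ** 2
        return sign(u - 0.5) * (y + 10) ** 2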

  23. Noise in the class variable (Sin) Normally distributed noise with std.dev. 0.5 is added to y=sin(x). Qualitative trees induced by QUIN have qualitative consistency over 90%

  24. Experimental evaluation in artificial domains. y = sin(πx/10), x ∈ [-20, 20]. Minima of QUIN's error-cost (at x = -15, -5, 5, 15) divide the space into intervals with a monotonic target function.

  25. Noise curves (Sin): consistency and ambiguity

  26. Application in behavioral cloning • Domain: crane control • Goal: effective and comprehensible clones

  27. Container crane • Control forces: Fx, FL • State: X, dX, Φ, dΦ, L, dL • Based on previous work of Urbančič (1994) • Control task: transport the load from the start to the goal position

  28. QUIN in skill modeling, crane domain • Qualitative trees induced from execution traces for rope and trolley control • Traces of 2 operators with different control styles

  29. Trolley control, operator S. desired_velocity = f(X, Φ, dΦ)
    X < 20.7?
      yes: M+(X)       (first, the trolley velocity is increasing)
      no:  X < 60.1?
        yes: M-(X)     (from about middle distance from the goal, the trolley velocity is decreasing)
        no:  M+(Φ)     (at the goal, reduce the swing of the rope by accelerating the trolley when the rope angle increases)
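
Rendered as executable control logic, the induced strategy reads as a simple cascade of thresholds (the thresholds and constraints come from the tree above; the function form and names are illustrative only):

    def desired_velocity_constraint(X):
        """Operator S's induced qualitative strategy for trolley control."""
        if X < 20.7:
            return "M+(X)"    # far from the goal: desired velocity increases with X
        if X < 60.1:
            return "M-(X)"    # from about middle distance: desired velocity decreases
        return "M+(Φ)"        # at the goal: follow the rope angle to damp the swing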

  30. Crane control: comparing operators. Enables comparison of differences in control styles.
    Operator S:
      X < 20.7?  yes: M+(X);  no: X < 60.1?  yes: M-(X);  no: M+(Φ)
    Operator L:
      X < 29.3?  yes: M+,+,-(X, Φ, dΦ);  no: dΦ < -0.02?  yes: M-(X);  no: M-,+(X, Φ)

  31. QUIN in skill modeling • Induced control strategies: comprehensible and very successful; they enable insight into individual differences in control styles • QUIN was able to detect very subtle aspects of human tacit skill (aspects earlier believed absent)

  32. Related work in qualitative reasoning • In qualitative reasoning: our QCFs are inspired by qualitative proportionalities (Q+) in QPT (Forbus) and monotonicity relations (M+) in QSIM (Kuipers) • In learning qualitative models of dynamic systems: Mozetič; Coiera; Bratko et al.; Varšek; Richards et al.; Džeroski, Todorovski • Distinguishing features of QUIN: models of static systems, qualitative trees, takes numerical examples directly
