Multi-Hierarchical Decision System


1. Multi-Hierarchical Decision System
S = (X, A ∪ {d[1],d[2],..,d[k]}, V), where:
- X is a finite set of objects,
- A is a finite set of classification attributes,
- {d[1],d[2],..,d[k]} is a set of hierarchical decision attributes, and
- V = ∪{Va : a ∈ A ∪ {d[1],d[2],..,d[k]}} is the set of their values.
We assume that:
- Va, Vb are disjoint for any a, b ∈ A ∪ {d[1],d[2],..,d[k]} such that a ≠ b,
- a : X → Va is a partial function for every a ∈ A ∪ {d[1],d[2],..,d[k]}.
Decision queries (d-queries) for S: the least set TD such that:
- 0, 1 ∈ TD,
- if w ∈ ∪{Va : a ∈ {d[1],d[2],..,d[k]}}, then w, ~w ∈ TD,
- if t1, t2 ∈ TD, then (t1 + t2), (t1 t2) ∈ TD.
Incomplete Database, Atomic Level
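A minimal sketch of this structure in Python may help fix the notation; the class name, the attribute names, and the use of None for an undefined value are illustrative assumptions, not part of the source definition. Each attribute is a partial function encoded as a dictionary that may omit some objects, which is exactly the incomplete-database situation mentioned on the slide.

```python
from dataclasses import dataclass
from typing import Dict, Hashable, Optional

@dataclass
class DecisionSystem:
    """S = (X, A ∪ {d[1],..,d[k]}, V) -- an illustrative encoding, not from the source."""
    X: set                                  # finite set of objects
    A: Dict[str, Dict[Hashable, object]]    # classification attributes: name -> partial map x -> a(x)
    D: Dict[str, Dict[Hashable, object]]    # hierarchical decision attributes d[i], same encoding

    def value(self, attr: str, x) -> Optional[object]:
        """a(x) as a partial function; None stands for 'undefined' (incomplete database)."""
        table = self.A.get(attr, self.D.get(attr, {}))
        return table.get(x)

# Two objects, one classification attribute, one decision attribute;
# a(x2) is left undefined to show the incomplete-database case.
S = DecisionSystem(
    X={"x1", "x2"},
    A={"a": {"x1": "a1"}},
    D={"d[1]": {"x1": "d1_low", "x2": "d1_high"}},
)
print(S.value("a", "x2"))   # None -> a is only partially defined on X
```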

2. Example
[Figure: a two-level attribute hierarchy. Level I: classification attributes C[1], C[2] and decision attributes d[1], d[2], d[3]. Level II: C[2,1], C[2,2] refine C[2], and d[3,1], d[3,2] refine d[3].]
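The hierarchy in the figure can be written down as a parent-to-children map; this is a hypothetical encoding used only to make the later notions of ancestor and descendant concrete (the function name descendants is ours, not the source's).

```python
# A hypothetical encoding of the hierarchy from the Example slide as parent -> children maps.
classification_hierarchy = {
    "C[1]": [],
    "C[2]": ["C[2,1]", "C[2,2]"],
}
decision_hierarchy = {
    "d[1]": [],
    "d[2]": [],
    "d[3]": ["d[3,1]", "d[3,2]"],
}

def descendants(node, hierarchy):
    """All nodes strictly below 'node' in the hierarchy (used later for (in)dependence)."""
    out = []
    for child in hierarchy.get(node, []):
        out.append(child)
        out.extend(descendants(child, hierarchy))
    return out

print(descendants("d[3]", decision_hierarchy))   # ['d[3,1]', 'd[3,2]']
```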

3. Simple Term to Simple Query
• Classification terms (c-terms) for S are defined as the least set TC such that:
• 0, 1 ∈ TC,
• if w ∈ ∪{Va : a ∈ A}, then w, ~w ∈ TC,
• if t1, t2 ∈ TC, then (t1 + t2), (t1 t2) ∈ TC.
• A c-term t is called simple if t = t1 t2 … tn and
(∀j ∈ {1,2,…,n})[(tj ∈ ∪{Va : a ∈ A}) ∨ (tj = ~w ∧ w ∈ ∪{Va : a ∈ A})].
• A d-query t is called simple if t = t1 t2 … tn and
(∀j ∈ {1,2,…,n})[(tj ∈ ∪{Va : a ∈ {d[1],d[2],..,d[k]}}) ∨ (tj = ~w ∧ w ∈ ∪{Va : a ∈ {d[1],d[2],..,d[k]}})].
• By a classification rule we mean an expression [t1 → t2], where both t1 and t2 are simple.
Atomic Level
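As a quick illustration, the "simple" condition is just a per-factor membership test. The sketch below assumes a product term has already been flattened into a list of literals, with a negation ~w written as the tuple ("~", w); the function name and encoding are ours.

```python
# A sketch of the 'simple' test on a flattened product of literals.
def is_simple(factors, allowed_values):
    """True iff every factor is a value from allowed_values or the negation of one.

    allowed_values = union of Va over a in A               -> simple c-term test
    allowed_values = union of Va over the decision attrs   -> simple d-query test
    """
    for tj in factors:
        w = tj[1] if isinstance(tj, tuple) and tj[0] == "~" else tj
        if w not in allowed_values:
            return False
    return True

classification_values = {"a1", "a2", "b1"}
print(is_simple(["a1", ("~", "b1")], classification_values))   # True
print(is_simple(["a1", "d1_low"], classification_values))      # False: d1_low is not a value of A
```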

4. Classifier-based Semantics
• The semantics MS of c-terms in S is defined in the standard way as follows:
- MS(0) = 0, MS(1) = X,
- MS(w) = {x ∈ X : w = a(x)} for any w ∈ Va, a ∈ A,
- MS(~w) = {x ∈ X : (∃v ∈ Va)[v = a(x) ∧ v ≠ w]} for any w ∈ Va, a ∈ A,
- if t1, t2 are terms, then MS(t1 + t2) = MS(t1) ∪ MS(t2), MS(t1 t2) = MS(t1) ∩ MS(t2).
• Classifier-based semantics MS of d-queries in S = (X, A ∪ {d[1],d[2],..,d[k]}, V):
if t is a simple d-query in S and {rj = [tj → t] : j ∈ Jt} is the set of all rules defining t which are extracted from S by the classifier, then
MS(t) = {(x, px) : (∃j ∈ Jt)(x ∈ MS(tj)) ∧ px = Σ{conf(j)·sup(j) : x ∈ MS(tj) ∧ j ∈ Jt} / Σ{sup(j) : x ∈ MS(tj) ∧ j ∈ Jt}},
where conf(j) and sup(j) denote the confidence and the support of [tj → t], respectively.
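Concretely, px is a support-weighted average of the confidences of all rules covering x. A minimal sketch, assuming the classifier has already been run and each rule [tj → t] is handed over as a triple (objects matched by tj, confidence, support); the function name and tuple layout are illustrative, not from the source.

```python
# p_x as a support-weighted average of rule confidences.
def classifier_semantics(rules):
    """Return MS(t) as a dict x -> p_x over all objects covered by some rule."""
    weighted = {}   # x -> sum of conf_j * sup_j over rules whose left side covers x
    support = {}    # x -> sum of sup_j over the same rules
    for objects_matching_tj, conf_j, sup_j in rules:
        for x in objects_matching_tj:
            weighted[x] = weighted.get(x, 0.0) + conf_j * sup_j
            support[x] = support.get(x, 0.0) + sup_j
    return {x: weighted[x] / support[x] for x in weighted}

rules = [
    ({"x1", "x2"}, 0.9, 10),   # [t1 -> t], conf = 0.9, sup = 10
    ({"x2"}, 0.6, 5),          # [t2 -> t], conf = 0.6, sup = 5
]
print(classifier_semantics(rules))   # x1 -> 0.9, x2 -> (0.9*10 + 0.6*5)/15 = 0.8
```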

5. Standard Semantics of D-queries
• An attribute value d[j1, j2,…,jn] in S is dependent on an attribute value which is either its ancestor or its descendant in d[j1].
• d[j1, j2,…,jn] is independent from any other attribute value in S.
• Let S = (X, A ∪ {d[1],d[2],..,d[k]}, V), w ∈ Vd[i], and let IVd[i] be the set of all attribute values in Vd[i] which are independent from w.
• The standard semantics NS of d-queries in S is defined as follows:
- NS(0) = 0, NS(1) = X,
- if w ∈ Vd[i], then NS(w) = {x ∈ X : d[i](x) = w}, for any 1 ≤ i ≤ k,
- if w ∈ Vd[i], then NS(~w) = {x ∈ X : (∃v ∈ IVd[i])[d[i](x) = v]}, for any 1 ≤ i ≤ k,
- if t1, t2 are terms, then NS(t1 + t2) = NS(t1) ∪ NS(t2), NS(t1 t2) = NS(t1) ∩ NS(t2).
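A short sketch of the two atomic cases, assuming each decision attribute d[i] is given as a map from objects to their d[i]-values and that IVd[i] for the chosen w has already been computed (e.g. from the hierarchy encoding above); the function names and example values are illustrative.

```python
# NS on atomic d-queries; independent_values plays the role of IV_d[i] for a fixed w.
def ns_atom(d_i, w):
    """NS(w) = objects whose d[i]-value equals w."""
    return {x for x, v in d_i.items() if v == w}

def ns_neg_atom(d_i, independent_values):
    """NS(~w) = objects whose d[i]-value is some v independent from w."""
    return {x for x, v in d_i.items() if v in independent_values}

d3 = {"x1": "d[3,1]", "x2": "d[3,2]", "x3": "d[3,1,1]"}
# For w = d[3,1]: d[3,2] is a sibling, hence independent; d[3,1,1] is a descendant, hence dependent.
print(ns_atom(d3, "d[3,1]"))                            # {'x1'}
print(ns_neg_atom(d3, independent_values={"d[3,2]"}))   # {'x2'} -- x3 is excluded
```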

6. The Overlap of Semantics
Let S = (X, A ∪ {d[1],d[2],..,d[k]}, V), let t be a d-query in S, let NS(t) be its meaning under the standard semantics, and let MS(t) be its meaning under the classifier-based semantics.
Assume that NS(t) = X ∪ Y, where X = {xi : i ∈ I1} and Y = {yi : i ∈ I2}.
Assume also that MS(t) = {(xi, pi) : i ∈ I1} ∪ {(zi, qi) : i ∈ I3} and {yi : i ∈ I2} ∩ {zi : i ∈ I3} = ∅.
[Figure: Venn diagram of the two answers; NS covers I1 ∪ I2, MS covers I1 ∪ I3, and they overlap exactly on I1.]
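The picture amounts to splitting the two answers into three index sets: I1 for objects returned by both semantics, I2 for objects returned only by NS, and I3 for objects returned only by the classifier. A small sketch of that partition (names follow the slide; the code is illustrative):

```python
def overlap(ns_t, ms_t):
    """ns_t: set of objects; ms_t: dict object -> probability."""
    I1 = {x for x in ns_t if x in ms_t}       # the x_i, returned by both NS and MS
    I2 = {x for x in ns_t if x not in ms_t}   # the y_i, returned only by NS
    I3 = {x for x in ms_t if x not in ns_t}   # the z_i, returned only by MS
    return I1, I2, I3

ns_t = {"x1", "x2", "y1"}
ms_t = {"x1": 0.9, "x2": 0.8, "z1": 0.4}
print(overlap(ns_t, ms_t))   # ({'x1', 'x2'}, {'y1'}, {'z1'})
```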

7. Precision & Recall
• Precision Prec(MS, t) of the classifier-based semantics MS on a d-query t:
Prec(MS, t) = [Σ{pi : i ∈ I1} + Σ{(1 − qi) : i ∈ I3}] / [card(I1) + card(I3)].
• Recall Rec(MS, t) of the classifier-based semantics MS on a d-query t:
Rec(MS, t) = Σ{pi : i ∈ I1} / [card(I1) + card(I2)].
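A minimal sketch of both measures, assuming p holds the probabilities pi of the objects in I1, q holds the probabilities qi of the objects in I3, and n_I2 = card(I2); the variable and function names are ours.

```python
def precision(p, q):
    """Prec(MS, t) = [Σ p_i + Σ (1 - q_i)] / (card(I1) + card(I3))."""
    return (sum(p) + sum(1.0 - qi for qi in q)) / (len(p) + len(q))

def recall(p, n_I2):
    """Rec(MS, t) = Σ p_i / (card(I1) + card(I2))."""
    return sum(p) / (len(p) + n_I2)

p = [0.9, 0.8]   # p_i for I1 (objects returned by both semantics)
q = [0.4]        # q_i for I3 (objects returned only by the classifier)
n_I2 = 1         # number of objects returned only by the standard semantics
print(precision(p, q))   # (0.9 + 0.8 + 0.6) / 3 ≈ 0.767
print(recall(p, n_I2))   # (0.9 + 0.8) / 3 ≈ 0.567
```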
