Presentation Transcript


  1. Collective Classification A brief overview and possible connections to email-acts classification Vitor R. Carvalho Text Learning Group Meetings, Carnegie Mellon University November 10th 2004

  2. Data Representation
  • "Flat" Data
    • Object: email msgs
    • Attributes: words, sender, etc.
    • Class: spam / not spam
    • Usually assumed IID
  • Sequential Data
    • Object: words in text
    • Attributes: capitalized, number, dict
    • Class: POS (or name/not)
  • Relational Data
    • class + attributes + links (relations)
    • Example: webpages
  [Figure: example instances with labels such as "spam" / "not spam" and POS tags such as pron, verb, name, det]
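To make the three representations concrete, here is a minimal sketch in Python; the field names and toy values are invented for illustration, not taken from the slides.

```python
# "Flat" data: each email is an independent feature vector (IID assumption).
flat_example = {"words": ["meeting", "budget"], "sender": "alice", "label": "not_spam"}

# Sequential data: tokens in order, each with its own label (e.g. POS / name-or-not).
sequential_example = [("Vitor", "name"), ("met", "verb"), ("us", "pron")]

# Relational data: objects carry class + attributes, and typed links connect them,
# so a prediction on one object can inform predictions on linked objects.
relational_example = {
    "emails": {"m1": {"label": "spam"}, "m2": {"label": None}},
    "links": [("m1", "m2", "same_sender")],
}
```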

  3. J. Neville et al., 2003

  4. Relational Data and Collective Classification
  • Different objects interact
  • Different types of relations (links)
  • Attributes may be correlated
  • Examples:
    • actors, directors, movies, companies
    • papers, authors, conferences, citations
    • companies, employees, customers
  • Classify objects collectively: use predictions on some objects to improve predictions on related objects

  5. Collective Classification Methods
  • Relational Probability Trees (RPT)
  • Iterative methods (relaxation-based methods)
  • Relational Dependency Networks (RDN)
  • Relational Bayesian Networks (RBN/PRM)
  • Relational Markov Networks (RMN)
  • Other models (ILP-based, vector-space-based, etc.)
  • Overall:
    • Lack of direct comparison among methods
    • Results are usually compared to a "flat" model
    • Splitting data into train/test sets can be an issue

  6. Relational Probability Trees (RPT)
  • Decision trees applied to relational data
  • Predicts the target class label based on:
    • the object's own attributes
    • attributes + links in the "relational neighborhood" (one link away)
    • counts of attributes and links in the neighborhood
  • Enhanced feature selection (chi-square, pruning, randomization tests)
  • Results were not exciting
  • Neville et al., KDD-2003; related work from Blockeel et al. (Artificial Intelligence, 1998); Kramer, AAAI-96
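The sketch below shows the general flavor of this idea (not Neville et al.'s exact algorithm): each object's one-link neighborhood is turned into count/aggregate features and an ordinary decision tree is fit on the resulting flat table. The toy citation data is invented for illustration.

```python
from collections import Counter
from sklearn.feature_extraction import DictVectorizer
from sklearn.tree import DecisionTreeClassifier

attrs = {                      # object id -> own attributes (toy data)
    "p1": {"has_abstract": 1}, "p2": {"has_abstract": 0},
    "p3": {"has_abstract": 1}, "p4": {"has_abstract": 0},
}
links = [("p1", "p2"), ("p1", "p3"), ("p2", "p4"), ("p3", "p4")]   # e.g. citations
labels = {"p1": "ml", "p2": "ml", "p3": "db", "p4": "db"}

def features(obj):
    feats = dict(attrs[obj])                                        # own attributes
    neighbors = [b for a, b in links if a == obj] + [a for a, b in links if b == obj]
    feats["degree"] = len(neighbors)                                # link count
    # counts of attribute values in the one-link "relational neighborhood"
    for value, count in Counter(attrs[n]["has_abstract"] for n in neighbors).items():
        feats[f"n_neighbors_abstract_{value}"] = count
    return feats

ids = sorted(attrs)
X = DictVectorizer(sparse=False).fit_transform([features(i) for i in ids])
y = [labels[i] for i in ids]
tree = DecisionTreeClassifier(max_depth=3).fit(X, y)
print(tree.predict(X))
```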

  7. Iterative Methods
  • Predicts the target class label based on:
    • the object's own attributes
    • attributes and links of the relational neighborhood
    • the CLASS LABELS of the neighborhood
    • features derived from CLASS LABELS
  • Different update strategies:
    • by threshold on prediction confidence
    • by top-N most confident predictions
    • heuristic-based
  • Slattery & Mitchell, ICML-2000; Neville & Jensen, AAAI-2000; Chakrabarti et al., ACM-SIGMOD-98
  • Some results with email-acts
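A minimal sketch of this relaxation scheme, assuming a toy hand-written local scorer in place of a trained classifier: predictions are made from own attributes plus the labels already committed in the neighborhood, and only the top-N most confident predictions are committed per round.

```python
def local_score(attrs, neighbor_labels):
    """Toy stand-in for a trained local classifier: P(spam | attrs, neighbor labels)."""
    votes = sum(1 for lab in neighbor_labels if lab == "spam")
    score = 0.3 * attrs.get("spammy_words", 0) + 0.2 * votes
    return min(max(score, 0.05), 0.95)

def iterative_classify(attrs_by_id, links, top_n=1, rounds=5):
    labels = {i: None for i in attrs_by_id}            # no labels committed yet
    for _ in range(rounds):
        candidates = []
        for i in attrs_by_id:
            if labels[i] is not None:
                continue
            nbr_labels = [labels[j] for a, j in links if a == i and labels[j]]
            p = local_score(attrs_by_id[i], nbr_labels)
            conf = max(p, 1 - p)
            candidates.append((conf, i, "spam" if p >= 0.5 else "not_spam"))
        if not candidates:
            break
        # commit only the top-N most confident predictions this round
        for conf, i, lab in sorted(candidates, reverse=True)[:top_n]:
            labels[i] = lab
    return labels

attrs_by_id = {"m1": {"spammy_words": 3}, "m2": {"spammy_words": 0}, "m3": {"spammy_words": 1}}
links = [("m2", "m1"), ("m3", "m1"), ("m3", "m2")]     # directed "related to" links
print(iterative_classify(attrs_by_id, links))
```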

  8. Relational Bayesian Networks (RBN/PRM)
  • Bayes net extended to the relational domain
  • Given an "instantiation", it induces a Bayes net that specifies a joint probability distribution over all attributes of all entities
  • Directed graphical model, with an acyclicity constraint
  • Exact model: closed-form parameter estimation (products of conditional probabilities)
  • Has only been applied to simple domains, since the acyclicity constraint is very restrictive for most relational applications
  • Friedman et al., IJCAI-99; Getoor et al., ICML-2001; Taskar et al., IJCAI-2001
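To illustrate what "closed-form parameter estimation" means here, a sketch under invented field names: in a directed, acyclic model each conditional probability table is just a normalized count over the instantiation.

```python
from collections import Counter, defaultdict

# Toy instantiation: each paper has a topic; its venue is modeled as depending on its topic.
papers = [
    {"topic": "ml", "venue": "icml"}, {"topic": "ml", "venue": "icml"},
    {"topic": "ml", "venue": "kdd"},  {"topic": "db", "venue": "sigmod"},
]

counts = defaultdict(Counter)                 # estimate P(venue | topic) by counting
for p in papers:
    counts[p["topic"]][p["venue"]] += 1

cpd = {t: {v: c / sum(ctr.values()) for v, c in ctr.items()} for t, ctr in counts.items()}
print(cpd)   # e.g. {'ml': {'icml': 2/3, 'kdd': 1/3}, 'db': {'sigmod': 1.0}}
```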

  9. Relational Markov Networks (RMN)
  • Extension of the CRF idea to the relational domain
  • Given an instantiation, it induces a Markov network that specifies a probability distribution over labels, given links and attributes
  • Undirected, discriminative model
  • Parameter estimation is expensive; requires approximate probabilistic inference (belief propagation)
  • Taskar et al., UAI-2002
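A sketch of the undirected view, with hand-set potentials and brute-force MAP over a two-node toy instantiation: a joint labeling is scored by a product of node potentials (label given attributes) and edge potentials over linked pairs. A real RMN learns these potentials discriminatively and uses loopy belief propagation; none of that is shown here.

```python
import itertools
import math

nodes = {"m1": {"spammy_words": 3}, "m2": {"spammy_words": 0}}   # toy instantiation
links = [("m1", "m2")]
LABELS = ["spam", "not_spam"]

def node_potential(node_attrs, label):
    w = 0.8 * node_attrs["spammy_words"]
    return math.exp(w if label == "spam" else -w)

def edge_potential(l1, l2):
    return math.exp(0.5 if l1 == l2 else -0.5)    # linked objects prefer the same label

def score(assignment):
    s = 1.0
    for n, node_attrs in nodes.items():
        s *= node_potential(node_attrs, assignment[n])
    for a, b in links:
        s *= edge_potential(assignment[a], assignment[b])
    return s

# exhaustive MAP search (exact only because the instantiation is tiny)
best = max((dict(zip(nodes, combo)) for combo in itertools.product(LABELS, repeat=len(nodes))),
           key=score)
print(best)
```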

  10. Relational Dependency Networks (RDN)
  • Dependency networks extended to the relational domain
  • P(X) ≈ ∏i P(Xi | Neighbors(Xi))
  • Given an "instantiation", it induces a DN that specifies an "approximate" joint probability distribution over all attributes of all objects
  • Undirected graphical model, no acyclicity constraint
  • Approximate model: simple parameter estimation, approximate inference (Gibbs sampling)
  • Neville & Jensen, KDD-MRDM-2003
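A sketch of the Gibbs-sampling step, assuming a hand-written toy local conditional P(label | attributes, neighbor labels) in place of a learned one: each label is resampled in turn from its local conditional, and post-burn-in samples give approximate marginals.

```python
import random

random.seed(0)
attrs = {"m1": 1, "m2": 0, "m3": 1}                       # one "spamminess" attribute per node
links = {"m1": ["m2", "m3"], "m2": ["m1"], "m3": ["m1"]}  # adjacency lists

def p_spam(node, labels):
    """Toy local conditional P(spam | own attribute, current neighbor labels)."""
    votes = sum(1 for n in links[node] if labels[n] == "spam")
    score = 0.4 * attrs[node] + 0.3 * votes
    return min(max(score, 0.05), 0.95)

labels = {n: random.choice(["spam", "not_spam"]) for n in attrs}   # random initialization
counts = {n: 0 for n in attrs}
burn_in, samples = 100, 500
for t in range(burn_in + samples):
    for n in attrs:                                        # resample each label in turn
        labels[n] = "spam" if random.random() < p_spam(n, labels) else "not_spam"
    if t >= burn_in:
        for n in attrs:
            counts[n] += labels[n] == "spam"

print({n: counts[n] / samples for n in attrs})             # estimated marginal P(spam)
```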

  11. Other Models (from Neville et al., 2003)

  12. Comparing Some Results
  • Comparing PRM, RMN, SVM and M^3N
  • Diff: PRM and RMN
  • Diff: mSVM and RMN
  • RN* (Relational Neighbor) is a very simple relational classifier
  • RN* (Macskassy et al., 2003); M^3N (Taskar et al., 2003)
  [Figure: result charts for PRM and RMN]

  13. End of overview… now, the email-act problem
  • Strong correlation with the previous and next message
  • A "verb" has little or no correlation with other "verbs" of the same message
  • Flat data? Sequential data?
  [Figure: timeline of email acts in a thread (Request, Proposal, Delivery, Commit, Acknowledge) over time]
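One way to read the "sequential" option, sketched below with invented feature names: each message's feature vector includes the (predicted) acts of the previous and next messages in the thread, so predictions can be refined iteratively, much like the relaxation methods above.

```python
def thread_features(msgs, predicted_acts):
    """msgs: bag-of-words dicts in thread order; predicted_acts: current act predictions."""
    rows = []
    for i, words in enumerate(msgs):
        feats = dict(words)
        feats["prev_act"] = predicted_acts[i - 1] if i > 0 else "START"
        feats["next_act"] = predicted_acts[i + 1] if i + 1 < len(msgs) else "END"
        rows.append(feats)
    return rows

msgs = [{"please": 1, "review": 1}, {"will": 1, "do": 1}, {"attached": 1}]
acts = ["Request", "Commit", "Delivery"]      # current predictions for the thread
print(thread_features(msgs, acts))
```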
