
Unsupervised Ontology Acquisition from plain texts : The OntoGain method

Unsupervised Ontology Acquisition from plain texts: The OntoGain method. Efthymios Drymonas, Kalliopi Zervanou, Euripides G.M. Petrakis. Intelligent Systems Laboratory, http://www.intelligence.tuc.gr, Technical University of Crete (TUC), Chania, Greece.


Presentation Transcript


  1. Unsupervised Ontology Acquisition from plain texts: The OntoGain method Efthymios Drymonas, Kalliopi Zervanou, Euripides G.M. Petrakis Intelligent Systems Laboratory http://www.intelligence.tuc.gr Technical University of Crete (TUC), Chania, Greece

  2. OntoGain • A platform for unsupervised ontology acquisition from text • Application independent • Builds an ontology of multi-word term concepts • Adapts existing methods for taxonomy & relation acquisition to handle multi-word concepts • Outputs the ontology in OWL • Good results on medical and computer science corpora

  3. Why multi-word term concepts? • They form the majority of terminological expressions • They convey classificatory information, expressed as modifiers • e.g., “carotid artery disease” denotes a type of “artery disease”, which is a type of “disease” • Leads to a more expressive and compact ontology lexicon

  4. Ontology Learning Steps • Concept Extraction • C/NC-value • Taxonomy Induction • Clustering, Formal Concept Analysis • Non-taxonomic Relations • Association Rules, Probabilistic algorithm

  5. The C/NC-Value method [Frantzi et al., 2000] • Identifies multi-word term phrases denoting domain concepts • Noun phrases are extracted first • ((adj | noun)+ | ((adj | noun)* (noun prep)?) (adj | noun)*) noun • C-Value: a term validity criterion, relying on the hypothesis that multi-word terms tend to consist of other terms • NC-Value: uses context information (valid terms tend to appear in specific contexts and co-occur with other terms)
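As an illustrative sketch (not OntoGain's actual implementation), the noun-phrase filter above can be approximated by encoding POS tags as single characters and scanning the tag string with a regular expression; the tag names and the tagged sentence are assumptions:

```python
import re

def candidate_terms(tagged):
    """tagged: list of (word, tag) pairs with tags in {'ADJ', 'NOUN', 'PREP', ...}."""
    # Encode each tag as one character so a regex can scan the sequence:
    # A = adjective, N = noun, P = preposition, O = anything else.
    code = {'ADJ': 'A', 'NOUN': 'N', 'PREP': 'P'}
    s = ''.join(code.get(t, 'O') for _, t in tagged)
    # Mirrors the slide's pattern: ((adj|noun)+ | (adj|noun)*(noun prep)?(adj|noun)*) noun
    pattern = re.compile(r'(?:[AN]+|[AN]*(?:NP)?[AN]*)N')
    terms = []
    for m in pattern.finditer(s):
        words = [w for w, _ in tagged[m.start():m.end()]]
        terms.append(' '.join(words))
    return terms

tagged = [('carotid', 'ADJ'), ('artery', 'NOUN'), ('disease', 'NOUN')]
print(candidate_terms(tagged))
```

The greedy match returns the longest phrase, here the full term “carotid artery disease”.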

  6. C-Value: Statistical Part • For a candidate term a • f(a): total frequency of occurrence of a • f(b): frequency of each longer candidate term b that contains a • P(Ta): the number of these longer terms • |a|: the length (in words) of the candidate string • C-value(a) = log2|a| · f(a) if a is not nested; otherwise C-value(a) = log2|a| · (f(a) − (1/P(Ta)) Σb∈Ta f(b))
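The statistical part above can be sketched as follows; the candidate terms and frequencies are made-up illustrative numbers, not OntoGain corpus data:

```python
import math

# Illustrative sketch of the C-value measure [Frantzi et al., 2000].
def c_value(term, freq, candidates):
    """term: tuple of words; freq: {candidate: corpus frequency};
    candidates: iterable of all candidate terms (tuples of words)."""
    # Longer candidate terms b that contain `term` as a contiguous sub-sequence
    longer = [b for b in candidates
              if len(b) > len(term) and any(
                  b[i:i + len(term)] == term
                  for i in range(len(b) - len(term) + 1))]
    log_len = math.log2(len(term))
    if not longer:                 # `term` never appears nested in a longer term
        return log_len * freq[term]
    # Discount occurrences of `term` that are really part of longer terms
    return log_len * (freq[term] - sum(freq[b] for b in longer) / len(longer))

freq = {
    ('carotid', 'artery', 'disease'): 5,
    ('artery', 'disease'): 12,     # 5 of these occur inside the longer term
}
print(c_value(('artery', 'disease'), freq, list(freq)))  # log2(2) * (12 - 5/1)
```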

  7. C/NC-Value sample results

  8. Ontology Learning Steps • Preprocessing • Concept Extraction • Taxonomy Induction • Non-taxonomic Relations

  9. Taxonomy Induction • Aims at organizing concepts into a hierarchical structure where each concept is related to its respective broader and narrower terms • Two methods in OntoGain • Agglomerative clustering • Formal Concept Analysis (FCA)

  10. Agglomerative Clustering • Proceeds bottom-up: at each step, the most similar clusters are merged • Initially, each term is considered a cluster • Similarity between all pairs of clusters is computed • The most similar clusters are merged, as long as they contain terms with common heads • Group-average similarity for clusters; a Dice-like formula for terms
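A minimal sketch of this head-restricted merging step, assuming a Dice coefficient over the words of each term and group-average similarity between clusters (the terms below are illustrative, not from the OntoGain corpora):

```python
# One agglomerative step: merge the most similar pair of clusters,
# but only if the clusters contain terms sharing a head noun (last word).
def dice(a, b):
    a, b = set(a.split()), set(b.split())
    return 2 * len(a & b) / (len(a) + len(b))

def group_average(c1, c2):
    return sum(dice(a, b) for a in c1 for b in c2) / (len(c1) * len(c2))

def share_head(c1, c2):
    return {t.split()[-1] for t in c1} & {t.split()[-1] for t in c2}

def merge_step(clusters):
    best, pair = -1.0, None
    for i in range(len(clusters)):
        for j in range(i + 1, len(clusters)):
            if not share_head(clusters[i], clusters[j]):
                continue
            sim = group_average(clusters[i], clusters[j])
            if sim > best:
                best, pair = sim, (i, j)
    if pair is None:
        return clusters
    i, j = pair
    merged = clusters[i] + clusters[j]
    return [c for k, c in enumerate(clusters) if k not in (i, j)] + [merged]

clusters = [['hierarchical clustering method'],
            ['agglomerative clustering method'],
            ['web page']]
clusters = merge_step(clusters)
print(clusters)
```

The two “… clustering method” terms share the head “method” and are merged; “web page” stays in its own cluster.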

  11. Formal Concept Analysis (FCA) [Ganter et al., 1999] • FCA relies on the idea that the objects (terms) are associated with their attributes (verbs) • Finds common attributes (verbs) between objects and forms object clusters that share common attributes • Formal concepts are connected with the sub-concept relationship

  12. FCA Example • Takes as input a matrix showing associations between terms (concepts) and attributes (verbs)

  13. FCA Taxonomy • Formal concepts • ({hierarchical clustering, root node, single cluster}, {compute, search}) • ({html form, web page}, {print, search}) • Not all (concept, verb) dependencies are interesting
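A small sketch of how formal concepts can be enumerated from such a term/verb matrix. The context below is chosen to reproduce the two formal concepts on the slide; the verb associations and the enumerate-by-closure strategy are illustrative assumptions, not OntoGain's implementation:

```python
from itertools import combinations

# Incidence context: each object (term) maps to its attributes (verbs).
context = {
    'hierarchical clustering': {'compute', 'search'},
    'root node':               {'compute', 'search'},
    'single cluster':          {'compute', 'search'},
    'html form':               {'print', 'search'},
    'web page':                {'print', 'search'},
}
all_attrs = set().union(*context.values())

def extent(attrs):   # objects possessing every attribute in attrs
    return frozenset(o for o, a in context.items() if attrs <= a)

def intent(objs):    # attributes shared by every object in objs
    if not objs:
        return frozenset(all_attrs)
    return frozenset(set.intersection(*(context[o] for o in objs)))

# Enumerate closed (extent, intent) pairs by closing every attribute subset.
concepts = {(extent(set(a)), intent(extent(set(a))))
            for r in range(len(all_attrs) + 1)
            for a in combinations(sorted(all_attrs), r)}

for objs, attrs in sorted(concepts, key=lambda c: -len(c[0])):
    print(sorted(objs), '<->', sorted(attrs))
```

Brute-force closure is exponential in the number of attributes; it is only meant to make the sub-concept lattice of the slide concrete.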

  14. Non-Taxonomic Relations extraction phase • Concept Extraction • Taxonomy Induction • Non-Taxonomic Relations

  15. Non-Taxonomic Relations • Concepts are also characterized by attributes and relations to other concepts in the hierarchy • Typically expressed by a verb relating a pair of concepts • Two approaches • Association rules • Probabilistic

  16. Association Rules [Agrawal et al., 1993] • Introduced to predict the purchase behavior of customers • Extract terms connected by a subject-verb-object relation • Enhance with general terms from the taxonomy • Eliminate redundant relations: • predictive accuracy < t
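A toy sketch of the pruning criterion: from concept pairs that co-occur in subject-verb-object relations, keep a rule X → Y only when its confidence support(X, Y) / support(X) reaches a threshold t. The concept pairs and the threshold below are illustrative assumptions:

```python
from collections import Counter

# Illustrative co-occurring concept pairs extracted from SVO relations.
pairs = [
    ('disease', 'treatment'),
    ('disease', 'treatment'),
    ('disease', 'symptom'),
    ('artery', 'treatment'),
]
t = 0.6  # hypothetical minimum confidence (predictive accuracy)

pair_support = Counter(pairs)
left_support = Counter(x for x, _ in pairs)

# Keep rule X -> Y only when conf = support(X, Y) / support(X) >= t
rules = {(x, y): n / left_support[x]
         for (x, y), n in pair_support.items()
         if n / left_support[x] >= t}
print(rules)
```

Here “disease → symptom” (confidence 1/3) falls below the threshold and is discarded as redundant.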

  17. Association Rules: Example

  18. Probabilistic approach [Cimiano et al., 2006] • Collect verbal relations from the corpus • Find the most general relation with respect to the verb, using frequency of occurrence • Suffer_from(man, head_ache) • Suffer_from(woman, stomach_ache) • Suffer_from(patient, ache) • Select relationships satisfying a conditional probability measure • Associations with probability above a threshold t are accepted
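The selection step can be sketched as follows, using the slide's suffer_from examples; the is-a mapping and the threshold t are assumptions for illustration:

```python
from collections import Counter

# Assumed is-a links from the taxonomy (object -> more general concept).
is_a = {'head_ache': 'ache', 'stomach_ache': 'ache', 'ache': 'ache'}

# Verbal relations collected from the corpus (slide's examples).
triples = [
    ('man', 'suffer_from', 'head_ache'),
    ('woman', 'suffer_from', 'stomach_ache'),
    ('patient', 'suffer_from', 'ache'),
]
t = 0.8  # hypothetical conditional-probability threshold

verb = 'suffer_from'
# Generalize each object along its is-a link, then score the dominant class.
objs = [is_a.get(o, o) for s, v, o in triples if v == verb]
best, n = Counter(objs).most_common(1)[0]
prob = n / len(objs)                 # P(object class | verb)
accepted = prob > t
print(best, round(prob, 2), accepted)
```

All three objects generalize to “ache”, so the relation suffer_from(·, ache) is accepted with probability 1.0.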

  19. Evaluation • Relevance judgments are provided by humans • Precision - Recall • We examined the 200 top-ranked concepts and their respective relations in 500 lines • Results from the OhsuMed & Computer Science corpora
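Precision and recall over the extracted concepts can be computed as below; the extracted and relevant sets are illustrative placeholders, not OntoGain's actual output or judgments:

```python
# Illustrative evaluation: human judges mark which extracted concepts are
# relevant; precision and recall compare extraction against that gold set.
extracted = {'carotid artery disease', 'web page', 'html form', 'noise term'}
relevant  = {'carotid artery disease', 'web page', 'html form', 'artery disease'}

tp = extracted & relevant                 # correctly extracted concepts
precision = len(tp) / len(extracted)      # fraction of extracted that is correct
recall = len(tp) / len(relevant)          # fraction of the gold set recovered
print(precision, recall)
```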

  20. Results

  21. Comparison with Text2Onto [Cimiano & Volker, 2005] • Text2Onto produces huge lists of plain single-word terms, and relations lacking semantic meaning • Text2Onto cannot handle large texts • It cannot export results in OWL

  22. Conclusions • OntoGain • Multi-word term concepts • Exports the ontology in OWL • Domain independent • Results • C/NC-Value yields good results • Clustering outperforms FCA • Association Rules perform better than Verbal Expressions

  23. Future Work • Explore more methods / combinations • e.g., clustering, FCA • Hearst patterns for discovering additional relation types (e.g., part-of) • Discover attributes and cardinality constraints • Incorporate term similarity information from WordNet, MeSH • Resolve term ambiguities

  24. Thank you! Questions?

  25. Preprocessing • Tokenization, POS tagging, Shallow parsing (OpenNLP suite) • Lemmatization (WordNet Java Library) • Applied in all steps of OntoGain • Shallow parsing is used in relation acquisition for the detection of verbal dependencies

  26. Terms sharing a head tend to be similar • e.g., hierarchical method and agglomerative method are both methods • Nested terms are related to each other • e.g., agglomerative clustering method and clustering method should be associated
