Finding knowledge, data and answers on the Semantic Web - PowerPoint PPT Presentation

Presentation Transcript

  1. Finding knowledge, data and answers on the Semantic Web Tim Finin University of Maryland, Baltimore County Joint work with Li Ding, Anupam Joshi, Yun Peng, Cynthia Parr, Pranam Kolari, Pavan Reddivari, Sandor Dornbush, Rong Pan, Akshay Java, Joel Sachs, Scott Cost and Vishal Doshi  This work was partially supported by DARPA contract F30602-97-1-0215, NSF grants CCR007080 and IIS9875433 and grants from IBM, Fujitsu and HP.

  2. This talk • Motivation • Swoogle Semantic Web search engine • Use cases and applications • Conclusions

  3. Google has made us smarter

  4. But what about our agents? Agents still have a very minimal understanding of text and images.

  5. But what about our agents? A Google for knowledge on the Semantic Web is needed by software agents and programs.

  6. This talk • Motivation • Swoogle Semantic Web search engine • Use cases and applications • Conclusions

  7. • Running since summer 2004 • 1.6M RDF docs, 300M triples, 10K ontologies, 15K namespaces, 1.3M classes, 175K properties, 43M instances, 420 registered users

  8. Swoogle architecture [diagram]: discovery components (SwoogleBot, a bounded Web crawler, and the Google crawler) find candidate Semantic Web URLs on the Web and fill a document cache; analysis components (SWD classifier, ranking) and indexing components (IR indexer for html, SWD indexer for rdf/xml) build the index and Semantic Web metadata; search services are exposed through Swoogle's web interface for humans and a web service for machines. Legend: arrows show information flow.

  9. This talk • Motivation • Swoogle Semantic Web search engine • Use cases and applications • Conclusions

  10. Applications and use cases
  (1) Supporting Semantic Web developers • Ontology designers, vocabulary discovery, who's using my ontologies or data?, usage analysis, errors, statistics, etc.
  (2) Searching specialized collections • Spire: aggregating observations and data from biologists • InferenceWeb: searching over and enhancing proofs • SemNews: text meaning of news stories
  (3) Supporting SW tools • Triple Shop: finding data for SPARQL queries

  11. Use case 1: supporting Semantic Web developers

  12. 80 ontologies were found that had these three terms By default, ontologies are ordered by their ‘popularity’, but they can also be ordered by recency or size. Let’s look at this one

  13. Basic Metadata
  hasDateDiscovered: 2005-01-17
  hasDatePing: 2006-03-21
  hasPingState: PingModified
  type: SemanticWebDocument
  isEmbedded: false
  hasGrammar: RDFXML
  hasParseState: ParseSuccess
  hasDateLastmodified: 2005-04-29
  hasDateCache: 2006-03-21
  hasEncoding: ISO-8859-1
  hasLength: 18K
  hasCntTriple: 311.00
  hasOntoRatio: 0.98
  hasCntSwt: 94.00
  hasCntSwtDef: 72.00
  hasCntInstance: 8.00
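Swoogle's hasOntoRatio field estimates what fraction of a document defines terms rather than states instance data. A minimal sketch in Python of using that metadata, where the classify_swd helper and the 0.8 threshold are illustrative assumptions, not Swoogle's actual classification rule:

```python
# Toy classifier: treat a Semantic Web document (SWD) as an "ontology
# document" when most of its content defines terms. The 0.8 threshold
# is an assumption for illustration, not Swoogle's real cutoff.

def classify_swd(metadata, threshold=0.8):
    """Label an SWD using Swoogle-style metadata fields."""
    return "ontology" if metadata["hasOntoRatio"] >= threshold else "instance data"

# Values copied from the metadata on the slide above.
doc = {
    "hasCntTriple": 311,
    "hasOntoRatio": 0.98,
    "hasCntSwt": 94,     # Semantic Web terms mentioned
    "hasCntSwtDef": 72,  # terms actually defined here
    "hasCntInstance": 8,
}

print(classify_swd(doc))  # -> ontology
```

A document like the one above, with an onto-ratio of 0.98, would land firmly on the ontology side of any reasonable threshold.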

  14. rdfs:range was used 41 times to assert a value. owl:ObjectProperty was instantiated 28 times time:Cal… defined once and used 24 times (e.g., as range)

  15. These are the namespaces this ontology uses. Clicking on one shows all of the documents using the namespace. All of this is available in RDF form for the agents among us.

  16. Here’s what the agent sees. Note the swoogle and wob (web of belief) ontologies.

  17. We can also search for terms (classes, properties) like terms for “person”.

  18. 10K terms associated with “person”! Ordered by use. Let’s look at foaf:Person’s metadata

  19. 87K documents used foaf:gender with a foaf:Person instance as the subject

  20. 3K documents used dc:creator with a foaf:Person instance as the object

  21. Swoogle’s archive saves every version of a SWD it has seen.

  22. Use case 2: Spire • An NSF ITR collaborative project with • University of Maryland, Baltimore County • University of Maryland, College Park • University of California, Davis • Rocky Mountain Biological Laboratory

  23. An invasive species scenario • Nile Tilapia fish have been found in a California lake. • Can this invasive species thrive in this environment? • If so, what will be the likely consequences for the ecology? • So… we need to understand the effects of introducing this fish into the food web of a typical California lake

  24. Food Webs • A food web models the trophic (feeding) relationships between organisms in an ecology • Food web simulators are used to explore the consequences of changes in the ecology, such as the introduction or removal of a species • A location's food web is usually constructed from studies of the frequencies of the species found there and the known trophic relations among them • Goal: automatically construct a food web for a new location using existing data and knowledge • ELVIS: Ecosystem Location Visualization and Information System

  25. East River Valley Trophic Web

  26. Species List Constructor Click a county, get a species list

  27. The problem • We have data on what species are known to be in the location and can further restrict and fill in with other ecological models • But we don't know which of these the Nile Tilapia eats or who might eat it • We can reason from taxonomic data (similar species) and known natural history data (size, mass, habitat, etc.) to fill in the gaps

  28. Food Web Constructor Predict food web links using database and taxonomic reasoning. In a new estuary, Nile Tilapia could compete with ostracods (green) to eat algae. Predators (red) and prey (blue) of ostracods may be affected.
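The prediction step described above, borrowing trophic links from a taxonomically similar species, can be sketched in a few lines of Python. Everything here (the toy food web, the similarity table, the predict_links helper) is an illustrative assumption, not ELVIS's actual data or algorithm:

```python
# Sketch of ELVIS-style link prediction: when a new species has no
# observed trophic links, copy the links of the taxonomically most
# similar species already in the web.

# Known food web for a lake: predator -> set of prey (toy data).
food_web = {
    "ostracod": {"algae"},
    "sunfish": {"ostracod"},
}

# Assumed taxonomic analog for the newcomer (illustrative only).
similar_to = {"nile tilapia": "ostracod"}

def predict_links(web, newcomer, similar):
    """Add a newcomer by copying the prey and predators of its analog."""
    analog = similar[newcomer]
    web = {k: set(v) for k, v in web.items()}       # don't mutate input
    web[newcomer] = set(web.get(analog, set()))     # eats what analog eats
    for prey in web.values():
        if analog in prey:                          # eaten by analog's predators
            prey.add(newcomer)
    return web

new_web = predict_links(food_web, "nile tilapia", similar_to)
print(sorted(new_web["nile tilapia"]))        # -> ['algae']
print("nile tilapia" in new_web["sunfish"])   # -> True
```

This reproduces the slide's scenario: the Nile Tilapia is predicted to compete with ostracods for algae, and the ostracod's predators (here, a sunfish) gain it as potential prey.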

  29. Evidence Provider Examine evidence for predicted links.

  30. Status • Goal is ELVIS (Ecosystem Location Visualization and Information System) as an integrated set of web services for constructing food webs for a given location • Background ontologies • SpireEcoConcepts: concepts and properties to represent food webs and ELVIS-related tasks, inputs and outputs • ETHAN (Evolutionary Trees and Natural History): concepts and properties for 'natural history' information on species, derived from data in the Animal Diversity Web and other taxonomic sources • Under development • Connect to visualization software • Connect to Triple Shop to discover more data

  31. Use case 3: UMBC Triple Shop • Online SPARQL RDF query processing with several interesting features • Automatically finds SWDs for given queries using the Swoogle backend database • Datasets, queries and results can be saved, tagged, annotated, shared, searched for, etc. • RDF datasets as first-class objects • Can be stored on our server or downloaded • Can be materialized in a database or (soon) as a Jena model

  32. Web-scale Semantic Web data access [diagram]: an agent talks to a data access service, which maintains an index of RDF data on the Web:
  1. agent → service: ask("person") — the service searches the Semantic Web vocabulary for matching URIrefs
  2. service → agent: inform("foaf:Person")
  3. agent composes a query: ask("?x rdf:type foaf:Person") — the service searches URLs in its SWD index
  4. service → agent: inform(doc URLs)
  5. agent fetches the docs and populates a local RDF database
  6. agent queries the local RDF database
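The message flow on this slide can be mocked up as a short Python script. The service functions and all data below are stand-ins for the real Swoogle-backed service, shown only to make the sequence of steps concrete:

```python
# Schematic mock of the data-access conversation: the agent asks the
# service for vocabulary, composes a query, gets candidate SWD URLs,
# fetches them into a local triple store, and queries locally.
# All data, URLs and the "service" itself are made up, not live Swoogle.

VOCABULARY = {"person": ["foaf:Person"]}                     # keyword -> URIrefs
SWD_INDEX = {"foaf:Person": ["http://example.org/alice.rdf"]}  # term -> doc URLs
DOCUMENTS = {                                                # URL -> its triples
    "http://example.org/alice.rdf": [("ex:alice", "rdf:type", "foaf:Person")],
}

def search_vocabulary(keyword):      # step 1/2: ask("person") -> inform(terms)
    return VOCABULARY.get(keyword, [])

def search_swd_index(term):          # step 3/4: ask(pattern) -> inform(doc URLs)
    return SWD_INDEX.get(term, [])

def fetch(url):                      # step 5: fetch docs
    return DOCUMENTS[url]

# The agent's side of the protocol.
term = search_vocabulary("person")[0]
urls = search_swd_index(term)
local_store = [t for url in urls for t in fetch(url)]        # populate local DB
matches = [s for (s, p, o) in local_store                    # step 6: local query
           if p == "rdf:type" and o == term]
print(matches)  # -> ['ex:alice']
```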

  33. Who knows Anupam Joshi? Show me their names, email addresses and pictures.

  34. The UMBC ebiquity site publishes lots of RDF data, including FOAF profiles

  35. No FROM clause!
  PREFIX foaf: <http://xmlns.com/foaf/0.1/>
  SELECT DISTINCT ?p2name ?p2mbox ?p2pix
  FROM ???
  WHERE {
    ?p1 foaf:surname "Joshi" .
    ?p1 foaf:firstName "Anupam" .
    ?p1 foaf:mbox ?p1mbox .
    ?p2 foaf:knows ?p3 .
    ?p3 foaf:mbox ?p1mbox .
    ?p2 foaf:name ?p2name .
    ?p2 foaf:mbox ?p2mbox .
    OPTIONAL { ?p2 foaf:depiction ?p2pix } .
  }
  ORDER BY ?p2name

  36. log in • specify dataset • enter query w/o FROM clause!

  37. 302 RDF documents were found that might have useful data.

  38. We’ll select them all and add them to the current dataset.

  39. We’ll run the query against this dataset to see if the results are as expected.

  40. The results can be produced in any of several formats

  41. Looks like a useful dataset. Let’s save it and also materialize it in the TS triple store.