
Key challenges in expressing and utilising geospatial semantics at Ordnance Survey

Research Labs GeoSemantics. Key challenges in expressing and utilising geospatial semantics at Ordnance Survey. Presentation by Katalin Kovacs and Sheng Zhou. Our colleagues: Cathy Dolbear, John Goodwin, Glen Hart. European GeoInformatics Workshop, 9th March 2007. Ordnance Survey.



Presentation Transcript


  1. Research Labs GeoSemantics Key challenges in expressing and utilising geospatial semantics at Ordnance Survey Presentation by Katalin Kovacs and Sheng Zhou Our colleagues: Cathy Dolbear, John Goodwin, Glen Hart European GeoInformatics Workshop 9th March 2007

  2. Ordnance Survey • The national mapping agency of Great Britain • Information provider • Responsibility to collect and disseminate geographical information • Responsibility to maintain the accuracy, currency and delivery of geographical information To Benefit the Nation

  3. GeoSemantics at Ordnance Survey • Responsibility to develop methods to add semantic content to geographical information • To maximise use and value that our customers and partners get out of our data To Benefit Society Why?

  4. Challenges in Utilising Semantics What? – Getting the content right • Ontology Authoring How? – Making it work • Identifying Fields in OS MasterMap™ (Glen Hart) • Mapping the ontology to a database (Cathy Dolbear) • Automatic sentence conversion and the CONFLUENCE project (Sheng Zhou)

  5. Ontology Authoring • Domain expert develops conceptual ontology with a purpose in mind using structured sentences (human readable) • Knowledge modeller converts it to OWL (Protégé, Machine readable) • Iterate and Cooperate • Publish both the human and machine readable ontologies http://www.ordnancesurvey.co.uk/oswebsite/ontology/ Getting the CONTENT right

  6. Identifying Fields in OS MasterMap™ • DEFRA, Natural England and the Natural Heritage Fund need to know where fields are. • Currently, there is no classification for a field in OS data. Fields appear under attribute combinations such as: • Theme: Land; Descriptive Group: Natural Environment; Descriptive Term: Scrub; Make: Natural • Theme: Land; Descriptive Group: Natural Environment; Descriptive Term: Heath; Make: Natural • Theme: Land; Descriptive Group: General Surface; Descriptive Term: (none); Make: Natural

  7. So what defines a field? • Through experimentation, a field was determined to be a single polygon feature with: • Theme: Land and Make: Natural And • Descriptive Group: General Surface with Descriptive Term: Null Or • Descriptive Group: Natural Environment with Descriptive Term: Orchard, Coniferous trees (scattered), Non-coniferous trees (scattered), Boulders (scattered), Rough grassland or Heath. • The smallest a field can be is 0.1 ha and the largest 30 ha. • Fields must not be long and thin (e.g. a roadside verge): area/perimeter > 8.7
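The rules above can be sketched as a simple classifier over polygon attributes. This is an illustrative Python sketch, not OS code: the function signature, the attribute values as plain strings, and the unit conversion (0.1 ha = 1,000 m², 30 ha = 300,000 m²) are assumptions made for the example.

```python
# Toy sketch of the experimentally derived "field" rules applied to
# OS MasterMap-style polygon attributes. Attribute names and the record
# format are illustrative, not the real OS MasterMap schema.

FIELD_DESC_TERMS = {
    "Orchard",
    "Coniferous Trees (Scattered)",
    "Non-coniferous Trees (Scattered)",
    "Boulders (Scattered)",
    "Rough Grassland",
    "Heath",
}

def is_field(theme, make, desc_group, desc_term, area_m2, perimeter_m):
    """Apply the 'field' rules from the slide to one polygon feature."""
    if theme != "Land" or make != "Natural":
        return False
    # Either a General Surface with no descriptive term, or one of the
    # listed Natural Environment terms.
    if desc_group == "General Surface":
        classified = desc_term is None
    elif desc_group == "Natural Environment":
        classified = desc_term in FIELD_DESC_TERMS
    else:
        classified = False
    if not classified:
        return False
    # Size limits: 0.1 ha to 30 ha (1 ha = 10,000 square metres).
    if not (1_000 <= area_m2 <= 300_000):
        return False
    # Shape limit: exclude long, thin polygons such as roadside verges.
    return area_m2 / perimeter_m > 8.7
```

For instance, a 200 m × 100 m general-surface polygon (area 20,000 m², perimeter 600 m, ratio ≈ 33) passes, while a 5 m wide strip of the same area fails the area/perimeter test.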

  8. An Ontological Approach • Cumbersome to reclassify within the database • But fields could be defined in an ontology and then mapped to the database Domain Ontology Data Ontology Database

  9. Simplified Example [Diagram: in the domain ontology, Orchard is a Field, and Field is a Footprint that has Arable/Pasture; in the data ontology, DB Orchard is in Feature Table, which has columns Desc Term and Polygon; DB Orchard is defined as equivalent to Feature Table and Desc Term and (hasFieldValue has "Orchard").]

  10. Mapping the ontology to a database Current technology options: • D2R Server (SPARQL queries on a virtual RDF graph) [1] • Oracle RDF or other triple stores [2, 3] • Semantic Web Services • [1] http://sites.wiwiss.fu-berlin.de/suhl/bizer/D2RQ/ • [2] X. Lopez and M. Annamalai. Developing Semantic Web Applications using the Oracle Database 10g RDF Data Model. http://www.oracle.com/technology/tech/semantic_technologies/pdf/oow2006_semantics_061128.pdf • [3] S. Harris and N. Gibbins. 3store: Efficient Bulk RDF Storage. 1st International Workshop on Practical and Scalable Semantic Systems, 2003.
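The D2R/D2RQ idea — relational rows exposed as virtual RDF triples through a declarative mapping — can be illustrated with a toy sketch. The table rows, column names, and `os:` predicate names below are invented for illustration; a real D2RQ server evaluates full SPARQL against the live database rather than this simplified triple-pattern matcher.

```python
# Minimal illustration of the D2RQ idea: relational rows are exposed as
# RDF-style (subject, predicate, object) triples via a mapping, so a
# triple-pattern query runs over database content without converting it
# wholesale to RDF. All names here are invented for the example.

ROWS = [  # a toy "topographic area" table
    {"toid": "1000001", "desc_group": "Natural Environment", "desc_term": "Orchard"},
    {"toid": "1000002", "desc_group": "General Surface", "desc_term": None},
]

# Mapping: each column becomes a predicate; the key column builds the subject.
MAPPING = {"desc_group": "os:descriptiveGroup", "desc_term": "os:descriptiveTerm"}

def virtual_triples(rows, mapping):
    """Generate triples on the fly; nothing is materialised in advance."""
    for row in rows:
        subject = f"os:toid/{row['toid']}"
        for column, predicate in mapping.items():
            if row[column] is not None:  # NULL columns yield no triple
                yield (subject, predicate, row[column])

def match(triples, predicate, obj):
    """Answer one triple pattern (?s predicate obj) -- a tiny stand-in for
    the SPARQL evaluation a D2RQ server would perform."""
    return [s for s, p, o in triples if p == predicate and o == obj]
```

With these toy rows, `match(virtual_triples(ROWS, MAPPING), "os:descriptiveTerm", "Orchard")` finds the orchard feature's subject without any RDF store being involved.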

  11. Oracle spatial database + D2RQ • Custom SPARQL functions can be written for spatial queries – but the exact implementation (and meaning) is hidden in code. [Diagram: a SPARQL query is sent to a D2RQ Server (Joseki SPARQL service); the D2RQ engine exposes a Jena virtual RDF graph over the Oracle spatial database via a JDBC connection and a D2RQ map; an OWL ontology and a DIG reasoner (e.g. Pellet) sit alongside, and the query result is returned.]

  12. Oracle RDF Store • Limited reasoning (RDFS in 10g, a subset of OWL in 11g) • How much reasoning do we really need? • Will companies convert all their relational data to RDF? • RDF is likely to degrade performance – 11 GB of triples for the Southampton area. • The spatial component must be executed after RDF filtering. • Would it be more efficient to perform the spatial query first, to minimise the size of the RDF graph?
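The "spatial query first" question above can be illustrated as a toy two-step pipeline: a cheap bounding-box filter shrinks the candidate set before any semantic filtering runs, so the graph a reasoner would see stays small. The feature records, attribute names, and bounding-box representation are invented for the example.

```python
# Sketch of spatial-first query evaluation: step 1 is a cheap spatial
# filter (in practice backed by a spatial index), step 2 applies the
# semantic/RDF-level filter only to the survivors. All feature data and
# attribute names here are invented for illustration.

FEATURES = [
    {"id": "A", "bbox": (0, 0, 10, 10), "class": "Field"},
    {"id": "B", "bbox": (50, 50, 60, 60), "class": "Field"},
    {"id": "C", "bbox": (2, 2, 8, 8), "class": "Woodland"},
]

def bbox_intersects(a, b):
    """Axis-aligned bounding-box intersection test (x1, y1, x2, y2)."""
    ax1, ay1, ax2, ay2 = a
    bx1, by1, bx2, by2 = b
    return ax1 <= bx2 and bx1 <= ax2 and ay1 <= by2 and by1 <= ay2

def query(features, window, wanted_class):
    # Step 1: spatial filter reduces the candidate set first.
    candidates = [f for f in features if bbox_intersects(f["bbox"], window)]
    # Step 2: the semantic filter runs over the reduced set only.
    return [f["id"] for f in candidates if f["class"] == wanted_class]
```

Querying a small window touches only the features inside it, which is the efficiency argument the slide raises against filtering a full RDF graph first.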

  13. Our proposal • The meaning is stored in the ontology; • the doing is stored in the PL/SQL; • query optimisation is carried out in the OS API. [Diagram: a SPARQL+Geo query enters via Joseki and the Jena API; the proposed OS API, supported by the OWL ontology, a reasoner and an RDF graph with spatial indexing, translates it into SQL against a spatially enabled database and returns the query results.]

  14. Other things we are working on • Places and Buildings ontology • Administrative Geography ontology – with instances! • Ontology Merging

  15. So a knowledge modeller walks into a Starbucks®...

  16. ...and he says:

  17. “I'll have a Venti Physical Non-Agentive Non-Chemical Mental Object with a Stative Abstract Quality and Double Qualia inferred from the Achievement of a Temporal Region.”

  18. And the barista says:

  19. “Endurant or Perdurant?”

  20. Introduction to Rabbit • Purpose: to allow domain experts to create and work with ontologies • Domain experts hold all the domain information, but do not have description-logic experience

  21. Current ontology building at Ordnance Survey • A Two stage process • Domain experts + Knowledge Engineers • Pros: • Interactive and iterative interpretation by domain experts and knowledge engineers • Capable of complex modelling • Cons: • Tedious • Could be more efficient • Inconsistent conversion from conceptual ontology to logical ontology

  22. The CONFLUENCE Project • Objective: • To develop a tool for domain experts • To enable one-step ontology development in a controlled natural language • Output: • A public domain software tool with full GUI • Working plan: • Cooperation between Ordnance Survey and Leeds University (Prof. Cohn and Dr. Dimitrova)

  23. Elements of CONFLUENCE • CONFLUENCE • Glossary population: • tools to help domain experts to organise knowledge, discover concepts and relations. • Formal structuring: • interactive ontology authoring with Rabbit • Ontology documentation and maintenance • Ontology inspection and evaluation

  24. Rabbit: an interface language for CONFLUENCE • What is Rabbit? • A controlled natural language (English) for ontology authoring • (Potentially) a high-level syntax for OWL • Why yet another CNL? • Rabbit includes specific constructs and interpretations originating from real-world ontology-building practice Golden Egg!

  25. Rabbit: feature list • A feature list for Rabbit: • Formalisation of “structured sentences” • Support for most OWL elements • Support for references to external resources • Stand-alone (independent of the tool) • Experiments on the usability of Rabbit will be carried out soon.

  26. Rabbit by Example • Rabbit is a kind of mammal. => Rabbit subClassOf Mammal • A rabbit has exactly 2 big ears. => hasEar exactly 2 BigEar • A rabbit has exactly 2 big eyes. => hasEye exactly 2 (Eye AND BigThing) • A rabbit has whiskers. => hasWhiskers some Whiskers • A rabbit has exactly 1 short tail. => hasTail exactly 1 ShortTail • A rabbit has soft fur. => hasFur some SoftFur • A rabbit has fur colour only white, brown or black. => hasFurColour only {white, brown, black} • A rabbit eats fresh vegetables. => eats some FreshVegetable • Peter Rabbit is an instance of Rabbit. => Rabbit: PeterRabbit

  27. Rabbit-to-OWL Conversion • Conversion module • Responsibility of Ordnance Survey • Purpose: • to provide automated and consistent conversion from Rabbit sentences to OWL ontology • Development is in progress • A Rabbit Ontology model + Protégé API • Plan to release the first prototype in the near future
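A minimal sketch of what such a conversion step might look like, assuming a regex-based matcher over a handful of Rabbit-like sentence forms. The real module described above targets the Protégé API; the patterns and the OWL-flavoured output strings here are simplified illustrations, not the actual Rabbit grammar.

```python
# Toy Rabbit-to-OWL conversion: a few regex patterns turn controlled-
# English sentences into OWL-flavoured axiom strings. Pattern coverage
# and output syntax are deliberately simplified for illustration.
import re

PATTERNS = [
    # "A rabbit is a kind of mammal." -> subclass axiom
    (re.compile(r"^(?:A )?(\w+) is a kind of (\w+)\.?$", re.I),
     lambda m: f"{m.group(1).capitalize()} subClassOf {m.group(2).capitalize()}"),
    # "A rabbit has exactly 2 ears." -> exact cardinality restriction
    (re.compile(r"^A (\w+) has exactly (\d+) (\w+?)s?\.?$", re.I),
     lambda m: f"has{m.group(3).capitalize()} exactly {m.group(2)} {m.group(3).capitalize()}"),
    # "A rabbit has whiskers." -> existential restriction
    (re.compile(r"^A (\w+) has (\w+?)s?\.?$", re.I),
     lambda m: f"has{m.group(2).capitalize()} some {m.group(2).capitalize()}"),
]

def rabbit_to_owl(sentence):
    """Convert one Rabbit-like sentence, or raise if no pattern matches.

    Pattern order matters: more specific forms (cardinality) are tried
    before the generic existential form.
    """
    for pattern, build in PATTERNS:
        m = pattern.match(sentence.strip())
        if m:
            return build(m)
    raise ValueError(f"no pattern for: {sentence!r}")
```

A production converter needs a proper grammar and consistent naming (e.g. preserving plurals like `hasWhiskers some Whiskers` from the example slide), which is exactly the consistency problem the conversion module is meant to solve.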

  28. Plan for the future: breeding rabbits • To improve current Rabbit phrase set / syntax • Mechanism for expansion • Query in Rabbit?

  29. Conclusions • Getting the Content Right: • Allow domain experts to build ontologies – Rabbit • Making it Work: • Making semantic sense of existing spatial data • Linking ontologies to databases – OS API • Putting first things first: • Solve more real world problems for more real people • Benefit society before our personal research agenda Start asking: “What can geosemantics do for you?”

  30. Contact us Our Website: http://www.ordnancesurvey.co.uk/oswebsite/ontology/ Email: Catherine.Dolbear@ordnancesurvey.co.uk Katalin.Kovacs@ordnancesurvey.co.uk Sheng.Zhou@ordnancesurvey.co.uk

  31. Break-out session topic: Key Limitations • Getting the Content Right: • Make it easier for domain experts to build and work with ontologies • Making it Work: • Meaning of data in existing databases • Linking ontologies to databases • Putting first things first: • Demonstrate that semantics can solve real world problems for real people • Benefit society before our personal research interests
