Resource description, discovery, and metadata for Open Educational Resources

R. John Robertson, Phil Barker & Lorna Campbell

OER 10, Cambridge, 22nd-24th March 2010

This work is licensed under a Creative Commons Attribution 2.5 UK: Scotland License.


Overview

  • UKOER and JISC CETIS

  • Stakeholders

  • 6 tensions in description and metadata

  • Where next?


Purpose

  • To begin to provide an overview of how the UKOER projects have approached describing educational resources

  • To highlight issues relating to description that should be considered when sharing learning resources


UKOER and JISC CETIS


Key stakeholders in the programme

  • Academics

    • Creating OERs

    • Using OERs

  • Institutions/Consortia

    • Releasing OERs

    • Consuming OERs

  • HEA/ JISC / HEFCE


Other stakeholders in the programme

  • Aggregators

    • JORUM

    • Others

  • Independent learners

    • On related course elsewhere

    • Truly independent

  • Enrolled students

    • On original course

    • On other courses

  • Employers and the marketplace

    • Training benefits?


Description for your use vs. description for sharing (1/4)

  • Description costs, so prioritisation required.

  • Balance the needs of immediate users of the system with the requirements of taking part in wider networks.

  • For example, needing course codes for local use and JACS codes for sharing.
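
A minimal sketch of that dual requirement, assuming invented field names and an illustrative JACS code (this is not any UKOER project's actual schema):

```python
# Hedged sketch: one record carrying both a local course code (meaningful only
# inside the institution) and a JACS subject code (a shared vocabulary that
# aggregators can use). Field names and codes are illustrative assumptions.
record = {
    "title": "Introduction to Databases, Lecture 3",
    "local_course_code": "CS2DB",  # internal timetable/course identifier
    "jacs_code": "I300",           # illustrative JACS subject code
}

# A local catalogue would index local_course_code; a view exported for sharing
# might drop it and expose the JACS code instead.
shared_view = {k: v for k, v in record.items() if k != "local_course_code"}
```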


Description for your use vs. description for sharing (2/4)

  • Requirements:

    • programme tag

    • author

    • title

    • date

    • url

    • file format

    • file size

    • rights
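
As a concrete illustration, the field list above could be realised as a single record like the following sketch; the values are invented and no particular project's schema is implied:

```python
# Hedged sketch: one resource described with exactly the fields listed above.
# Values are invented; "ukoer" is the programme-level tag used for tracking.
oer_record = {
    "programme_tag": "ukoer",
    "author": "A. N. Author",
    "title": "Example lecture slides",
    "date": "2010-03-22",
    "url": "http://example.org/oer/123",           # placeholder URL
    "file_format": "application/vnd.ms-powerpoint",
    "file_size": 1048576,                          # bytes
    "rights": "Creative Commons Attribution 2.5 UK: Scotland",
}
```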


Description for your use vs. description for sharing (3/4)

  • Key influences on descriptive choices?

    • Project team (and support/programme)

    • Technology already in use

    • Jorum’s requirements (or perception of them)


Description for your use vs. description for sharing (4/4)

  • Do standards help or hinder this decision?

    • Mostly irrelevant

      • Exist in underlying systems

      • Export in a given standard can be mapped

      • Tools hide standards

    • However, perceptions about standards do play a role

      • Jorum uses ‘X’, so we’ll use it

      • ‘X’ has a space to describe this feature


Metadata standards vs other forms of description

  • Most projects are creating metadata

    • For some projects, license information appears only in the metadata

    • But others are not using any formal descriptive standard

  • Does full text indexing eliminate the need for keywords?

    • There are audio, video, image, and Flash materials as well, which full-text indexing cannot reach

    • keywords and tags are very useful for aggregators

  • Do we need metadata if we have a cover page (or vice versa)?

    • The extent of cover page use is not yet fully known, but it appears not to be a major feature.


SEO vs. description for specialized discovery tools (1/3)

  • Specialized discovery tools include:

    • format-based tools like Vimeo, YouTube, Slideshare and Scribd

    • aggregators like DiscoverEd and OERCommons

    • subject or domain repositories (such as Jorum)


SEO vs. description for specialized discovery tools (2/3)

  • Specialised tools often require domain-specific terminology, and their search indexing can reward comprehensive description, e.g. use of MeSH.

  • Specialised tools may restrict the fields of descriptive information that can be supplied or that will be used. There is therefore a temptation to put everything into the fields which are available.


SEO vs. description for specialized discovery tools (3/3)

  • SEO is more of an arcane art; the mmtv project found that too many high-value terms (teacher-training, online, education) in a description diluted the page’s ranking. It is better to be highly ranked for a few terms.

  • Perhaps this is not so much a tension as a balance: both comprehensiveness and selectivity are required, and OER producers need to be good at both.


Rich metadata vs. thin metadata (1/2)

  • How much metadata do you need to create?

  • How much of it is actually used?

    • No answer to this yet

    • The programme was deliberately not prescriptive

    • Jorum’s deposit tool expands on this


Rich metadata vs. thin metadata (2/2)

  • Different projects have taken different approaches to description.

    • OpenStaffs: LOM, XCRI

    • ADOME: DC

  • Most projects using metadata seem to have taken a light approach.

  • No clear answers yet

  • The Medev OOER project’s survey about the use of description for learning materials is due out soon

  • Longer term balance informed by:

    • efforts to track usage and discovery of UKOERs

    • the usability of this material when aggregated in Jorum


Specialist vs. generic standards: description

  • Dublin Core: 15 projects

  • LOM: 9 projects

  • QTI: 9 projects

  • In most cases the choice seems to relate to the metadata options which the chosen software provides

  • Longer term

    • comparative volume of use (number of OERs)

    • which elements used
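
To make the Dublin Core option concrete, here is a hedged sketch (not any project's actual export) of a minimal simple DC record serialised with Python's standard library; the element choice and values are illustrative:

```python
import xml.etree.ElementTree as ET

# Hedged sketch: a minimal simple Dublin Core record serialised as XML.
# The element set is simple DC (dc:title, dc:creator, ...); values are invented.
DC_NS = "http://purl.org/dc/elements/1.1/"
ET.register_namespace("dc", DC_NS)

record = ET.Element("record")
for name, value in [
    ("title", "Example lecture slides"),
    ("creator", "A. N. Author"),
    ("date", "2010-03-22"),
    ("subject", "open educational resources"),
    ("rights", "Creative Commons Attribution 2.5 UK: Scotland"),
]:
    element = ET.SubElement(record, f"{{{DC_NS}}}{name}")
    element.text = value

print(ET.tostring(record, encoding="unicode"))
```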


Specialist vs. generic standards: packaging

  • Content Packaging: 10 projects

    • Only 3 of these projects actively chose to use it.

  • Zip: 2 projects

    • But this figure doesn’t reflect actual use; zip is too obvious to record.

  • Default support by tools and project team background seem to be the key factors

  • Perceptions of the available content package creation tools also play a role.
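
For comparison, the "just zip it" end of the packaging spectrum needs only a few lines of standard-library Python; this is a hedged sketch with invented file names, and a full IMS Content Package would additionally carry an imsmanifest.xml:

```python
import zipfile

# Hedged sketch: packaging an OER as a plain zip file. File names are invented
# and assumed to exist in the working directory. An IMS Content Package would
# add an imsmanifest.xml describing the organisation and resources.
files_to_package = ["lecture03.ppt", "handout.pdf", "licence.txt"]

with zipfile.ZipFile("oer_package.zip", "w", zipfile.ZIP_DEFLATED) as zf:
    for path in files_to_package:
        zf.write(path)
```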


RSS/Atom based dissemination vs. OAI-PMH based dissemination

  • What tools, services, and communities can take advantage of each dissemination approach?

    • Most aggregators of learning resources are based exclusively on RSS/Atom or support both RSS/Atom and OAI-PMH.

    • Existing OAI-PMH harvesters are firmly focused on the scholarly communications community

  • Are there any inherent difficulties in either approach?

    • Both have problems

  • There is a steer towards RSS/Atom, and many projects are using technologies that don’t support OAI-PMH.
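
A hedged sketch of the two dissemination styles side by side, using only the standard library; both URLs are placeholders rather than real endpoints:

```python
from urllib.parse import urlencode
from urllib.request import urlopen

# 1) RSS/Atom dissemination: the consumer simply fetches a feed document.
feed_url = "http://example.org/oer/feed.atom"   # placeholder URL
feed_xml = urlopen(feed_url).read()

# 2) OAI-PMH dissemination: the harvester issues verb-based requests
#    against a repository base URL (here asking for simple DC records).
base_url = "http://example.org/oai"             # placeholder URL
params = {"verb": "ListRecords", "metadataPrefix": "oai_dc"}
oai_xml = urlopen(f"{base_url}?{urlencode(params)}").read()

# Both return XML; OAI-PMH adds protocol machinery on top (resumption tokens,
# datestamp ranges, sets) that a plain feed does not have.
```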


Summary thoughts

  • The UKOER programme so far:

    • Many diverse choices

    • Thus far no one clear right answer

  • Next steps

    • Ongoing synthesis

    • Tracking work

    • Jorum usage statistics


Further Information

  • http://wiki.cetis.ac.uk/Educational_Content_OER

  • http://jisc.cetis.ac.uk//topic/oer

  • Contact details

    • robert.robertson at strath.ac.uk

    • lmc at strath.ac.uk

    • philb at icbl.hw.ac.uk

