MINES for Libraries

ACRL: Outcome Assessment Tools for the Library of the Future: MINES at OCUL

Toni Olshen

York University

Association of College and Research Libraries

ACRL Conference 2005

Minneapolis

April 7, 2005

Ontario Council of University Libraries
  • OCUL is a consortium of twenty university libraries in the province of Ontario
  • The member libraries cooperate to enhance information services through resource sharing, collective purchasing, document delivery, and other similar activities

OCUL Members – Those in Green are ARL Libraries
  • Brock University
  • Carleton University
  • University of Guelph
  • Lakehead University
  • Laurentian University
  • McMaster University
  • Nipissing University
  • Ontario College of Art & Design
  • University of Ontario Institute of Technology
  • University of Ottawa
  • Queen's University
  • Royal Military College of Canada
  • Ryerson University
  • University of Toronto
  • Trent University
  • University of Waterloo
  • University of Western Ontario
  • Wilfrid Laurier University
  • University of Windsor
  • York University

Member Institution Enrolments (2003) – Total Undergraduate and Graduate Students
  • University of Ontario Institute of Technology 936
  • Royal Military College of Canada 1,941
  • Ontario College of Art and Design 3,062
  • Nipissing University 5,478
  • Lakehead University 7,304
  • Trent University 7,388
  • Laurentian University 8,751
  • Wilfrid Laurier University
  • Brock University 15,527
  • University of Windsor 16,266
  • University of Guelph 19,096
  • Queen's University 20,034
  • McMaster University 22,064
  • Carleton University 22,535
  • University of Waterloo 25,029
  • Ryerson University 27,221
  • University of Ottawa 30,948
  • University of Western Ontario
  • York University 46,794
  • University of Toronto 68,290
  • Total: 391,933 (90% undergraduate, 10% graduate; approximately 12,500 faculty)

Scholars Portal – What is it?
  • A unique set of shared information resources and services

  • Resources acquired and managed through OCUL with funding support from a five-year grant from the Ontario Innovation Trust, a provincial funding body
  • Resources are made available to researchers and students in Ontario through their own university libraries

Scholars Portal – Ontario Council of University Libraries (OCUL)
  • Ontario Information Infrastructure (OII) funded by the Ontario Innovation Trust in 2001 for five years
  • Consortia-purchased electronic resources offered through the Ontario Scholars Portal
  • In March 2004, we began the evaluation phase of the $7.6-million OII project

Scholarly Information Resources
  • As of the end of March 2005, the portal contains 7,547,904 full-text articles from 6,783 full-text journals published by 12 academic publishers
  • Coverage of most disciplines but concentration in sciences
  • Current and historic coverage
  • One of the largest collections of electronic journals available to researchers anywhere

Scholars Portal Resources
  • Academic Press,
  • American Psychological Association,
  • American Chemical Society,
  • Berkeley Electronic Press,
  • Cambridge University Press,
  • Emerald Publishing,
  • Elsevier Science (Elsevier Science, Harcourt Health Sciences),
  • Kluwer (Kluwer Academic Publishers, Kluwer Law International and Kluwer/Plenum),
  • Oxford University Press,
  • Project MUSE,
  • Springer-Verlag, and
  • John Wiley & Sons.

Scholars Portal – Project Goals
  • Centrally mount and deliver information resources acquired through OCUL consortia purchases to ensure rapid and reliable access
  • Provide for the long term, secure archiving of resources to ensure continued availability

Scholars Portal – Project Goals
  • Ensure that the resources and services provided meet the needs of faculty, students, and staff
  • Ensure that resources and services can be seamlessly integrated into local library and information systems

Measuring Success
  • OCUL provides a sophisticated statistical report mechanism (see next slide). Download statistics are a rough measure of value, but we need more to properly assess impact.
  • We also need to measure the significance of access to e-journals for research

  • Employing ARL MINES Survey methodology to capture information on how resources are being used (from where, by whom, and for what purposes)

SP Statistics and Report Generator

Why Evaluation?
  • Feedback to OII and University funders
  • Understand who, where, and why the digital resources are used
  • Supplement usage numbers to answer the key question:

What is the impact of Portal content on research at Ontario academic libraries?

Evaluating Success
  • Evaluating Scholars Portal from user and staff points of view
  • Use a mix of quantitative and qualitative tools for a richer assessment: MINES, focus groups, staff survey
  • Are OII projects improving research services?
  • Does Scholars Portal meet OCUL user and staff expectations?

MINES (Measuring the Impact of Networked Electronic Services)
  • The MINES survey is one of a new breed of assessment tools that could not exist before services became digital

MINES (Measuring the Impact of Networked Electronic Services)-Desired Outcomes
  • To capture in-library and remote web usage of the Scholars Portal in a sound representative sample using MINES methodology;
  • To identify the demographic differences between in-house library users as compared to remote users by status of user;

MINES (Measuring the Impact of Networked Electronic Services)-Desired Outcomes
  • To identify users’ purposes for accessing Scholars Portal electronic services (funded research, non-funded research, instruction/education use, student research papers and course work);
  • To assist with the evaluation of the project as well as to capture information for OCUL about indirect research costs; and
  • To develop an infrastructure to make studies of patron usage of networked electronic resources routine, robust and integrated into the decision-making process.

MINES Methodology
  • What user groups use SP?
  • What specific resources are used?
  • From where?
  • How do users learn about SP?
  • Are there differences in the use of digital resources based on the user's location?
  • Why use SP? (sponsored research? instruction? patient care?)
  • Does use differ by discipline? user group? location?

MINES Methodology
  • Web-based surveys conducted over the course of a year for each institution
  • Activated during randomly selected 2-hour survey periods each month as users access one of SP’s journals
  • Mandatory, short, and anonymous
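The randomly selected two-hour survey windows could be generated along these lines. This is a sketch only: the daytime hour range, even-hour starts, and one window per month are assumptions, not ARL's actual scheduling rules.

```python
import random
from datetime import datetime, timedelta

def pick_survey_windows(year, month, days_in_month, n_windows=1, seed=None):
    """Randomly choose 2-hour survey windows within a month.

    Returns a sorted list of (start, end) datetimes. Candidate starts
    fall on even hours from 08:00 to 20:00 -- an assumed constraint,
    since the slides do not specify the real scheduling rules.
    """
    rng = random.Random(seed)
    candidates = [
        datetime(year, month, day, hour)
        for day in range(1, days_in_month + 1)
        for hour in range(8, 21, 2)       # 08:00, 10:00, ..., 20:00
    ]
    starts = sorted(rng.sample(candidates, n_windows))
    return [(s, s + timedelta(hours=2)) for s in starts]

# e.g. one randomly chosen window for May 2004:
windows = pick_survey_windows(2004, 5, 31, n_windows=1, seed=7)
```

Seeding the generator makes the schedule reproducible, so a coordinating body can publish identical window lists to all participating sites.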

ARL/MINES – Jan. ’04-Dec. ‘05
  • ARL developed random schedule of two-hour sessions per month
  • OCUL designed local questions, mounted the survey, and collects and sends data to ARL
  • ARL compiles survey results for all sites
  • ARL reports findings on a semi-annual basis
  • ARL presents findings and final report to project participants on an aggregated and individual institution basis

Development of survey form
  • Finding a balance between simplicity, ease of use, and richness of data elements
  • Bilingual – University of Ottawa, Laurentian University, Glendon College at York University
  • Ultimately a change in focus to the creation of a unique data set

Survey Form
  • Survey form determined:
    • user's status
    • discipline (affiliation)
    • location (where accessed from)
    • purpose of use (sponsored research, instruction, patient care, course work)
    • how the resource was identified (bibliography, colleague, librarian, important journal in field, etc.)

OCUL Definition of Usage for MINES
  • A successful search connecting the user to an article of interest for viewing, printing or downloading
  • Unique to Scholars Portal because of consortia server setup and archiving of all journals

MINES Methodology
  • Random sampling plan and the mandatory nature of the questions are both required to create a statistically sound study
  • If the survey is not mandatory, the group of non-respondents is likely to be different from the group of respondents, and we will not know what that difference is
  • One of the strengths and innovations of this survey technique is that it is based upon actual use, not on predicted, intended, or remembered use

OCUL Implementation of MINES
  • Once the survey is completed, the respondent's browser is forwarded to the desired networked electronic resource
  • If more than one search is carried out, the survey form is auto-populated with the user's previous responses as defaults, which need changing only if an answer differs
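The intercept-and-forward flow just described might look like the following sketch. The survey URL and parameter names are invented for illustration; the slides do not show the actual OCUL implementation.

```python
from urllib.parse import urlencode

def build_survey_redirect(target_url, saved_responses=None,
                          survey_base="https://example.org/mines/survey"):
    """Build the URL that intercepts a journal request with the survey.

    Earlier answers (if any) ride along as query parameters so the form
    can be pre-populated as defaults; the requested article URL is kept
    so the browser can be forwarded to it once the survey is submitted.
    `survey_base` and the parameter names are illustrative assumptions.
    """
    params = dict(saved_responses or {})
    params["return_to"] = target_url
    return survey_base + "?" + urlencode(params)

# A second search in the same session reuses the stored answers:
url = build_survey_redirect(
    "https://journals.example/article/123",
    saved_responses={"status": "faculty", "purpose": "sponsored_research"},
)
```

Carrying the original article URL through the survey is what lets the respondent land on the resource they asked for immediately after submitting.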

Informed Consent
  • Because this is a Web-based survey, the respondents consent to participate by electing to fill out the survey questionnaire
  • It is the participating library’s responsibility to provide an explanation of the survey and information pertaining to its confidentiality

Confidentiality of Data
  • Institutional data are confidential. Individual institutions and/or their specific data will not be identified.
  • Individual data are anonymous. The respondent’s privacy is protected because only very indirect information is captured, which would be difficult to trace back to an individual.

Ethics Review
  • A major step was contacting research ethics officers and/or Ethics Review Boards to get approval, where necessary, to run the survey
  • Purpose of ethics reviews for human subjects is to prevent putting subjects at risk
  • Officers/Boards on 16 OCUL campuses accepted that no physical or psychological harm would come to library users who are asked to fill out a brief mandatory anonymous survey before they are connected to the title of their choice.

Ethics Review
  • See the interesting opinion piece by J. Paul Grayson, "How Ethics Committees Are Killing Survey Research on Canadian Students," University Affairs, January 2004

Mandatory Survey
  • If individuals chose to avoid filling out the brief anonymous survey, they might be inconvenienced for at most a two-hour period, but they would not be harmed
  • We needed to balance good data for decision-making against the inconvenience caused to the user

Ethics Review – Issues and Problems
  • Mandatory nature of the survey required discussion on some campuses
  • Eight campuses did not require approval because the survey fell within quality-assurance guidelines and was seen as a library management tool
  • Eight schools received approval after an application process
  • One Library and Review Board did not support the mandatory nature of the methodology so that school dropped out of the project.

Pre-testing and False Start – January–March 2004
  • ARL prepared a schedule for the random two-hour monthly runs.
  • A test run was planned at York and Wilfrid Laurier in January with the real survey commencing at the end of February.
  • The pilot in January failed at York and highlighted the need for all institutions to be using a link resolver URL when connecting to SP journals from their catalogues or eResources databases.
  • Each site reviewed their configuration and necessary changes were made.
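The pilot's lesson that every site must route SP journal links through a link resolver could be audited with a check like the following. The resolver hostname here is a made-up placeholder, not a real OCUL address.

```python
from urllib.parse import urlparse

def uses_link_resolver(url, resolver_host="resolver.library.example.ca"):
    """Return True if a journal link routes through the link resolver.

    Direct publisher links -- which the January pilot showed bypass the
    survey trigger -- return False. The hostname is a placeholder; each
    OCUL site would substitute its own resolver address.
    """
    return urlparse(url).hostname == resolver_host

# Audit a catalogue's outbound links (hypothetical list):
links = [
    "http://resolver.library.example.ca/openurl?issn=0000-0000",
    "http://www.sciencedirect.com/science/article/1",
]
direct_links = [u for u in links if not uses_link_resolver(u)]
```

Any link surfaced in `direct_links` would connect the user straight to the publisher and so escape the survey, which is exactly the configuration gap the pilot exposed.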

Pre-testing and False Start – January–March 2004
  • Survey form and the explanatory material were translated into French for bilingual Ottawa, Laurentian, and Glendon College at York.
  • February run highlighted concerns about the data collection.
  • The technical infrastructure was capturing only accesses made through library catalogues or e-resource databases, not uses of SP directly
  • There were some technical problems with the February and March runs and the validity of the data was under question. The data-collection programming was revisited.

Lessons Learned
  • Early runs taught us a great deal about the different ways OCUL libraries access the SP
  • We needed to reflect that in the data gathering

Lessons Learned
  • As originally planned, we now capture as much usage as possible that comes from:
    • local e-resource databases
    • library catalogues
    • Scholars Portal browse and search functions

New Definition of Usage for MINES
  • A successful search is now defined as connecting the user to an article of interest for viewing, downloading or printing
  • Definition is unique to Scholars Portal because of consortial server setup and archiving of content
  • We cancelled the April 20 run and reset the dates of the survey from May 2004 through April 2005, considering the February and March runs as tests.

New Definition of Usage for MINES - Innovation
  • We continue to build on a unique opportunity to gather useful data that is not open to other types of library groups. By the end of March, about 22,500 surveys had been completed. One more month to go!
  • By implementing the MINES survey, OCUL is ahead of other projects in that we are not held "hostage" to the limitations and inconsistencies of vendor statistics
  • We have opportunities to disseminate research on measurement of networked resources through conferences and publications

MINES Very Preliminary Output: May–August 2004 (5,223 respondents)

Very Preliminary Findings – 4 Months of Data – Subject Affiliation

[Table: counts and percentages of responses by subject affiliation. Label-to-value alignment was lost in extraction; visible categories include Applied Sciences, Environmental Studies, Fine Arts, Medical/Health, and Social Sciences. The eleven reported counts (804, 146, 176, 160, 22, 93, 21, 1,341, 1,031, 673, 129) sum to 4,596.]

Very Preliminary Findings – 4 Months of Data – User Status

[Table: counts and percentages of responses by user status. All category labels except "Library Staff" were lost in extraction. Reported counts: 764 (16.6%), 2,068 (45.0%), 1,039 (22.6%), 47 (1.0%), 427 (9.3%), 251 (5.5%); total 4,596.]

Very Preliminary Findings – 4 Months of Data – Location

[Table: counts and percentages of responses by location. The only label surviving extraction is "On-campus (but not in the library)". Reported counts: 578 (12.6%), 1,978 (43.6%), 2,040 (44.4%); total 4,596.]

Very Preliminary Findings – 4 Months of Data – Purpose of Use
  • Sponsored research: 2,189 (47.6%)
  • Other non-sponsored research: 919 (20.0%)
  • Instruction: 278 (6.0%)
  • Course work: 686 (14.9%)
  • Patient care: 143 (3.1%)
  • Other activities: 381 (8.3%)

("Instruction" replaces a label lost in extraction, inferred from the purposes listed earlier; the pairing of labels to values is supported by the percentages, which match each count against the 4,596 responses.)
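The purpose-of-use percentages can be re-derived from the counts as a consistency check. Two caveats: the six counts sum to 4,596 rather than the 5,223 respondents (presumably not every survey captured this item), and the "Instruction" label is an inference for a category name lost in the transcript.

```python
# Purpose-of-use counts from the May-August 2004 preliminary output.
# "Instruction" is an inferred label (lost in the transcript); the
# other five labels and all six counts appear on the slide.
counts = {
    "Sponsored research": 2189,
    "Other non-sponsored research": 919,
    "Instruction": 278,
    "Course work": 686,
    "Patient care": 143,
    "Other activities": 381,
}
total = sum(counts.values())  # 4596 responses to this question
shares = {k: round(100 * v / total, 1) for k, v in counts.items()}
```

Each derived share matches the percentage reported on the slide, which supports the label-to-value pairing.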

Cross Tabulations
  • Purpose of use by affiliation, user status, location, why
  • Location by affiliation, user status, purpose of use, why
  • Why by affiliation, user status, location, purpose of use
  • Which titles used by which users for which purposes
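Cross-tabulations of the kind listed above are straightforward once each survey response is stored as a record of answers. A minimal stdlib sketch, with invented records rather than real survey data:

```python
from collections import Counter

def cross_tab(records, row_key, col_key):
    """Count survey records for each (row, column) pair of answers.

    `records` is an iterable of dicts of survey answers; `row_key` and
    `col_key` name the two dimensions to cross, e.g. location by
    purpose of use.
    """
    return Counter((r[row_key], r[col_key]) for r in records)

# Illustrative records only:
records = [
    {"location": "in library", "purpose": "course work"},
    {"location": "off campus", "purpose": "sponsored research"},
    {"location": "off campus", "purpose": "sponsored research"},
]
table = cross_tab(records, "location", "purpose")
```

Because MINES captures every dimension on the same form at the moment of use, any pair of dimensions (purpose by location, title by user status, and so on) can be crossed from the same record set.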

Location and Purpose of Use

Additional Qualitative Data
  • MINES Survey respondent comments
  • Staff Survey: What does the range of institutional experiences reveal?
  • Focus Groups: What anecdotal data can faculty and students add to the development of the Scholars Portal?

Thank you for your attention!
  • Questions?
  • Toni Olshen: