
Assessing the Scholars Portal



  1. Assessing the Scholars Portal Melody Burton, Queen’s University Toni Olshen, York University Ontario Library Association Superconference February 3, 2005

  2. Today’s Presentation • Overview of Ontario Scholars Portal (OSP) – scholarly resources • Measuring the Impact of Networked Electronic Services – MINES survey methodology • Overview of Qualitative Research – survey comments and focus groups Questions?

  3. Scholars Portal – Ontario Council of University Libraries (OCUL) • Ontario Information Infrastructure (OII) funded by the Ontario Innovation Trust in 2001 • Consortia-purchased electronic resources offered through the Ontario Scholars Portal • In March 2004, we began the evaluation phase of the $7.6 million OII project

  4. Scholars Portal – Project Goals • Centrally mount and deliver information resources acquired through OCUL consortia purchases (e.g. CNSLP, OCUL IR) to ensure rapid and reliable access • Provide for the long-term, secure archiving of resources to ensure continued availability

  5. Scholars Portal – Project Goals • Ensure that the resources and services provided meet the needs of faculty, students and staff • Ensure that resources and services can be seamlessly integrated into local library and information systems

  6. Why Evaluation? • Feedback to OII and University funders • Understand who, where, and why the digital resources are used • Supplement usage numbers to answer the key question: What is the impact of Portal content on research at Ontario academic libraries?

  7. Evaluating Success • Evaluating Scholars Portal from user and staff points of view • Use a mix of quantitative and qualitative tools for a richer assessment • Are OII projects improving research services? • Does Scholars Portal meet OCUL user and staff expectations?

  8. What do we need to learn? • Do researchers use the service? • Who are they? • Why do they use the service? • What are their views on the quality of the service provided? • How does the Scholars Portal impact research, teaching and learning?

  9. MINES (Measuring the Impact of Networked Electronic Services) • OCUL developed a customized version of an instrument and methodology previously tested in academic settings • The MINES project is part of the Association of Research Libraries (ARL) New Measures Initiatives • The studies were designed by Brinley Franklin (UConn) and Terry Plum (Simmons) • The MINES survey is one of a new breed of assessment tools that could not exist before library services went digital

  10. MINES (Measuring the Impact of Networked Electronic Services) – Desired Outcomes • To capture in-library and remote web usage of the Scholars Portal in a sound, representative sample using the MINES methodology; • To identify demographic differences between in-library and remote users, broken down by user status;

  11. MINES (Measuring the Impact of Networked Electronic Services) – Desired Outcomes • To identify users’ purposes for accessing Scholars Portal electronic services (funded research, non-funded research, instruction/education use, student research papers and course work); • To assist with the evaluation of the project as well as to capture information for OCUL about indirect research costs; and • To develop an infrastructure to make studies of patron usage of networked electronic resources routine, robust and integrated into the decision-making process.

  12. ARL/MINES – Jan. ’04 – Dec. ’05 • ARL developed a random schedule of two-hour survey sessions each month • OCUL designed the local questions, mounted the survey, and collects and sends the data to ARL • ARL compiles survey results for all sites • ARL reports findings on a semi-annual basis • ARL presents findings and a final report to project participants on an aggregated and individual-institution basis

  13. Evaluation Report – June 2005 • The report of the assessments will provide evidence to support continued funding from individual institutions, as well as indicate where funds should be directed to meet ongoing objectives.

  14. MINES Methodology • What user groups use the Scholars Portal? • What specific resources are used? • From where? • How do users learn about the Scholars Portal? • Are there differences in the use of digital resources based on the user's location? • Why use the Scholars Portal? (sponsored research? instruction? patient care?) • Does use differ by discipline? user group? location?

  15. MINES Methodology • Web-based surveys conducted over the course of a year for each institution • Activated during randomly selected 2-hour survey periods each month as users access one of the Scholars Portal journals • Mandatory, short, and anonymous
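The random-window sampling described above is straightforward to script. A minimal sketch in Python, assuming one window per month and that a window may start on any hour of any day; the seed and date range are illustrative, not ARL's actual procedure:

```python
import calendar
import random
from datetime import datetime, timedelta

def pick_survey_window(year: int, month: int, rng: random.Random) -> datetime:
    """Pick a random start time for one 2-hour survey window in a month."""
    days_in_month = calendar.monthrange(year, month)[1]
    day = rng.randint(1, days_in_month)
    hour = rng.randint(0, 22)  # latest start so the full 2 hours fits in the day
    return datetime(year, month, day, hour)

rng = random.Random(2004)   # fixed seed makes the schedule reproducible
for month in range(5, 13):  # May-December 2004, per the revised survey calendar
    start = pick_survey_window(2004, month, rng)
    print(f"{start:%Y-%m-%d %H:%M} - {start + timedelta(hours=2):%H:%M}")
```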

  16. MINES Methodology • Methodology is based on an attempt to capture every user of the service • Without a random sampling plan, in which each user has an equal chance of being included in the sample, we cannot really say anything about the population from which the sample is drawn.  • The sample based on random moments (2 hours every month for a year) permits OCUL to make reliable inferences about the population, and to test hypotheses. 

  17. MINES Methodology • Random sampling plan and the mandatory nature of the questions are both required to create a statistically sound study • If the survey is not mandatory, the group of non-respondents is likely to be different from the group of respondents, and we will not know what that difference is • One of the strengths and innovations of this survey technique is that it is based upon actual use, not on predicted, intended, or remembered use

  18. MINES Methodology • Once the survey is completed, the respondent's browser is forwarded to the desired networked electronic resource • If more than one search is carried out, the survey form is auto-populated with the user's previous responses as defaults, which only have to be changed if a response differs
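A rough sketch of that intercept-and-forward flow, using an in-memory store and a stand-in for a session cookie that remembers prior answers; the field names and URLs are illustrative, not the actual OCUL implementation:

```python
from urllib.parse import urlencode

SURVEY_FIELDS = ("status", "discipline", "location", "purpose", "how_identified")
responses = []          # in-memory stand-in for the survey data store
session_defaults = {}   # stand-in for a session cookie holding prior answers

def handle(target_url: str, submitted: dict | None) -> str:
    """Show the survey (pre-filled from prior answers) or, once it has been
    submitted, record the response and forward the browser to the article."""
    if submitted is None:
        # First hit: render the form with the user's earlier answers as
        # defaults, carrying the requested article URL along as a parameter.
        return f"SURVEY PAGE ?{urlencode({'target': target_url})} defaults={session_defaults}"
    record = {f: submitted.get(f, "") for f in SURVEY_FIELDS}
    responses.append(record)
    session_defaults.update(record)  # next search only changes what differs
    return f"302 REDIRECT -> {target_url}"

print(handle("https://journals.example.org/article/123", None))
print(handle("https://journals.example.org/article/123",
             {"status": "graduate", "discipline": "humanities",
              "location": "off-campus", "purpose": "course work",
              "how_identified": "bibliography"}))
print(handle("https://journals.example.org/article/456", None))  # defaults now filled
```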

  19. Development of survey form • Finding a balance between simplicity, ease of use, and richness of data elements • Bilingual – University of Ottawa, Laurentian University, Glendon College at York University • Ultimately a change in focus to the creation of a unique data set

  20. MINES Survey Form – Five Questions and a Comment Box

  21. Survey Form • Survey form determined: • users’ status • discipline • location (where accessed from) • purpose of use (sponsored research, instruction, patient care, course work) • how the resource was identified (bibliography, colleague, librarian, important journal in field, etc.)
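The five questions map naturally onto a small record type. A sketch with option sets paraphrased from this slide; the exact production wording may have differed:

```python
from dataclasses import dataclass

STATUSES = {"faculty", "graduate/professional", "undergraduate",
            "library staff", "staff", "other"}
PURPOSES = {"sponsored research", "non-sponsored research", "instruction",
            "patient care", "course work", "other"}
LOCATIONS = {"library", "on-campus", "off-campus"}

@dataclass
class SurveyResponse:
    status: str          # user's status
    discipline: str      # subject affiliation
    location: str        # where the user connected from
    purpose: str         # purpose of this use
    how_identified: str  # how the resource was found (bibliography, colleague, ...)

    def validate(self) -> None:
        assert self.status in STATUSES, f"unknown status: {self.status}"
        assert self.purpose in PURPOSES, f"unknown purpose: {self.purpose}"
        assert self.location in LOCATIONS, f"unknown location: {self.location}"

SurveyResponse("faculty", "humanities", "off-campus",
               "sponsored research", "important journal in field").validate()
```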

  22. Informed Consent • Because this is a Web-based survey, the respondents consent to participate by electing to fill out the survey questionnaire • It is the participating library’s responsibility to provide an explanation of the survey and information pertaining to its confidentiality

  23. Confidentiality of Data • Institutional data are confidential. Individual institutions and/or their specific data will not be identified. • Individual data are anonymous. The respondent’s privacy is protected because only very indirect information is captured, which would be difficult to trace back to an individual.

  24. Ethics review • A major step was contacting research ethics officers and/or Ethics Review Boards to get approval, where necessary, to run the survey • Purpose of ethics reviews for human subjects is to prevent putting subjects at risk • Officers/Boards on 16 OCUL campuses accepted that no physical or psychological harm would come to library users who are asked to fill out a brief mandatory anonymous survey before they are connected to the title of their choice.

  25. Mandatory • If individuals chose to avoid filling out the brief anonymous survey, they might be inconvenienced for a maximum of a two-hour period, but they would not be harmed • We needed to balance the value of good decision-making data against the inconvenience caused to the user.

  26. Ethics Review – Issues and Problems • The mandatory nature of the survey required discussion on some campuses • OII contacts and/or the Directors were involved in this process and interacted with the necessary people on campus to facilitate approval • Eight campuses did not require approval because the survey fell under quality-assurance guidelines and was seen as a library management tool • Eight schools received approval after an application process • One Library and Review Board did not support the mandatory nature of the methodology.

  27. Pre-testing and False Start – January–March 2004 • ARL prepared a schedule for the random two-hour monthly runs • A test run was planned at York and Wilfrid Laurier in January, with the real survey commencing at the end of February • The pilot in January failed at York and highlighted the need for all institutions to be using a link resolver URL when connecting to the Scholars Portal from their catalogues or e-resource databases • Each site reviewed its configuration and necessary changes were made.
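For context, a link resolver URL of the kind each site needed to standardize on typically carries the citation as OpenURL-style query parameters. An illustrative example with a made-up resolver hostname and placeholder citation values, not OCUL's actual configuration:

```python
from urllib.parse import urlencode

# Hypothetical resolver base URL; each OCUL site would have its own.
RESOLVER = "https://resolver.example-university.ca/openurl"

citation = {
    "genre": "article",
    "issn": "0000-0000",   # placeholder values
    "volume": "12",
    "issue": "3",
    "spage": "145",
    "date": "2004",
}
print(f"{RESOLVER}?{urlencode(citation)}")
# Routing every catalogue and e-resource link through one resolver URL like
# this gives the survey a single, reliable interception point.
```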

  28. Pre-testing and False Start – January–March 2004 • The survey form and the explanatory material were translated into French for bilingual Ottawa, Laurentian, and Glendon College at York • The February run highlighted concerns about the data collection: the technical infrastructure was capturing only access through library catalogues or e-resource databases, not use of the Scholars Portal directly • There were some technical problems with the February and March runs and the validity of the data was in question. The data-collection programming was revisited.

  29. Lessons learned • Early runs taught us a great deal about the different ways OCUL libraries access the Scholars Portal • We needed to reflect that in the data gathering • Meetings were held to discuss the changes that needed to be made.

  30. Lessons learned • As originally planned, we now capture as much usage as possible that comes from: • local e-resource databases • library catalogues • Scholars Portal browse and search functions.

  31. New Definition of Usage for MINES • A successful search is now defined as connecting the user to an article of interest for viewing, downloading or printing • Definition is unique to Scholars Portal because of consortial server setup and archiving of content • We cancelled the April 20 run and reset the dates of the survey from May 2004 through April 2005, considering the February and March runs as tests.
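The revised definition is easy to express as a predicate over access events. A sketch in which the event fields are assumptions for illustration, not the actual log schema:

```python
# An event counts as a successful search only if it connects the user to an
# article for viewing, downloading, or printing, per the definition above.
ARTICLE_ACTIONS = {"view", "download", "print"}

def is_successful_search(event: dict) -> bool:
    return (event.get("object_type") == "article"
            and event.get("action") in ARTICLE_ACTIONS)

events = [
    {"object_type": "article", "action": "download"},  # counts
    {"object_type": "journal", "action": "browse"},    # navigation only
    {"object_type": "article", "action": "view"},      # counts
]
print(sum(is_successful_search(e) for e in events))  # -> 2
```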

  32. New Definition of Usage for MINES – Innovation • We continue to build on the unique opportunity we have to gather useful data that is not available to other types of library groups • By implementing the MINES survey, OCUL is ahead of other projects in that we are not held "hostage" to the limitations and inconsistencies of vendor statistics • We have opportunities to disseminate research on the measurement of networked resources through conferences and publications

  33.  Scholars Portal Statistical Reporting

  34. MINES Preliminary Output: May–August 2004 – 5,223 respondents

  35. Very Preliminary Findings – 4 months of data – Subject Affiliation
      Applied Sciences          804   17.5%
      Business                  146    3.2%
      Education                 176    3.8%
      Environmental Studies     160    3.5%
      Fine Arts                  22    0.5%
      Humanities                 93    2.0%
      Law                        21    0.5%
      Medical                  1341   29.2%
      Health Sciences          1031   22.4%
      Social Sciences           673   14.6%
      Other                     129    2.8%

  36. Very Preliminary Findings – 4 months of data – User Status
      Faculty                   764   16.6%
      Graduate/Professional    2068   45.0%
      Undergraduate            1039   22.6%
      Library Staff              47    1.0%
      Staff                     427    9.3%
      Other                     251    5.5%

  37. Very Preliminary Findings – 4 months of data – Location
      Library                               578   12.6%
      Off-Campus                           1978   43.6%
      On-Campus (but not in the library)   2040   44.4%

  38. Very Preliminary Findings – 4 months of data – Purpose of Use
      Sponsored research             2189   47.6%
      Other non-sponsored research    919   20.0%
      Teaching                        278    6.0%
      Course work                     686   14.9%
      Patient care                    143    3.1%
      Other activities                381    8.3%
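Because the sample comes from random survey windows, sampling error for these proportions can be estimated in the usual way. A quick illustration for the sponsored-research figure, using the counts in the table above (they sum to 4,596 answers) and treating responses as a simple random sample, which glosses over any clustering within survey windows:

```python
from math import sqrt

# Counts from the purpose-of-use table above.
n, sponsored = 4596, 2189
p = sponsored / n                    # ~0.476, i.e. 47.6%
se = sqrt(p * (1 - p) / n)           # standard error of a proportion
print(f"{p:.1%} +/- {1.96 * se:.1%} (approx. 95% CI)")  # 47.6% +/- 1.4%
```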

  39. Cross Tabulations • Purpose of use by affiliation, user status, location, why • Location by affiliation, user status, purpose of use, why • Why by affiliation, user status, location, purpose of use • Which titles used by which users for which purposes
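With the responses in a table, the cross tabulations listed above are one-liners in pandas. A sketch on toy records; real data would come from the survey store:

```python
import pandas as pd

# Toy records standing in for MINES responses. Columns mirror the
# cross-tab dimensions listed above.
df = pd.DataFrame([
    {"status": "faculty",       "location": "off-campus", "purpose": "sponsored research"},
    {"status": "graduate",      "location": "library",    "purpose": "course work"},
    {"status": "undergraduate", "location": "on-campus",  "purpose": "course work"},
    {"status": "faculty",       "location": "on-campus",  "purpose": "teaching"},
])

# Location by purpose of use, as raw counts:
print(pd.crosstab(df["location"], df["purpose"]))

# User status by purpose of use, as row percentages:
print(pd.crosstab(df["status"], df["purpose"], normalize="index"))
```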

  40. Location and Purpose of Use

  41. Additional Qualitative Data • MINES Survey respondent comments • What does the range of institutional experiences reveal? • What anecdotal data can faculty and students add to the development of the Scholars Portal?

  42. What is the relationship of usage data to MINES findings?

  43. Qualitative Data What is the story behind the numbers? Where can we look for answers?

  44. Qualitative Evaluation • Evaluation components: • Quantitative – usage data (data), MINES survey (data) • Qualitative – MINES survey (comments), focus group (comments)

  45. Why collect qualitative data? • Tells the story behind the numbers • Ensures the evaluation asks more than: How many? How often? What? • Identifies impact and future growth • Dialogue with users about the product

  46. Qualitative Data • Includes: • Comments from MINES survey • Available to all – few elect to submit • Focus group comments • Available to few – lengthy responses to few key questions

  47. Focus Group Composition • Target audiences: faculty, graduate students, undergraduates • Offer focus groups by user category, not discipline • 6–8 participants • Voluntary among OCUL libraries

  48. Focus Group Questions • What impact has the Scholars Portal had on your research, teaching and learning? • What specific features do you use or want to know more about? • The portal is evolving – how would you like to see it grow and change?

  49. Rewards • For them: • Modest incentives ($10 copy card, coffee and snacks) • For us: • Clarification of portal use, value to researchers, students • Deeper understanding of use by asking “tell us more” or “give me an example” • Opportunity to observe reactions of other participants

  50. Using Anecdotal Information • Focus group comments = experiential or anecdotal data • Plain-language descriptions by users of their experiences • Poignant, concise oral evidence of how the portal is received by users • Priceless
