
Service Quality Assessment in Research Libraries The Guelph/Waterloo ARL LibQUAL Projects








  1. Service Quality Assessment in Research Libraries: The Guelph/Waterloo ARL LibQUAL Projects Mark Haslett, Ron MacKinnon, Susan Routliffe February 1, 2002

  2. Structure of today’s session • Part 1: History & context of LibQUAL • Part 2: Local administration • Part 3: Local results • Questions

  3. Part 1 History & context of LibQUAL

  4. ARL New Measures Initiative • At the October 1999 Membership Meeting, the ARL New Measures Initiative was established in response to two needs: • Increasing demand for libraries to demonstrate outcomes/impacts in areas important to the institution. • Increasing pressure to maximize use of resources - benchmarking best practices to save or reallocate resources.

  5. ARL New Measures Initiative • Assessing outcomes important to students and faculty • Maximizing access to information resources • Benchmarking best practices • Improving services • Reallocating resources

  6. ARL New Measures Initiative • Higher education outcomes research review • LibQUAL (measures for library service quality) • Investigation of cost drivers (e.g. technical services cost study) • Development of a self-assessment guide for measuring performance of ILL/DD • E-Metrics (measures for electronic resources)

  7. Some more context • Traditional focus on inputs • “Research libraries have always placed value in describing their… resources & services.” • Strong history of statistical data collection

  8. Research libraries searching for improved measures • Past practice equated use with value and quantity with quality • Resulted in a focus on tonnage • But what about the outcome value for faculty & students?

  9. Research libraries searching for improved measures New strategic objective for ARL: “the need for alternatives to expenditure metrics as measures of library performance…”

  10. We need to listen to our users • In order to help “describe and measure the performance of research libraries …” • Such listening should provide opportunities to: • Develop & revise our services • Use our information resources effectively • Provide for continuous assessment & accountability

  11. Multiple methods of listening • Active listening • Complaints and suggestions • Focus groups • Students & faculty on committees • Web usability studies • Surveys

  12. ARL’s LibQUAL proposal • A web survey instrument • Identify user expectations -- and user perceptions of how they're met • Not a forecasting or predictive tool • Not a ranking tool

  13. LibQUAL goals • Establish a library service quality assessment program at ARL • Develop web-based tools for assessing library service quality • Develop mechanisms and protocols for evaluating libraries • Identify best practices in providing library service

  14. LibQUAL • A research and development project • Aim is to have a mature web survey instrument within 4 to 5 years • Focus on client expectations and perceptions

  15. LibQUAL’s origins • Based on SERVQUAL survey instrument • Developed in the 1980s for use in the for-profit sector • Utilizes gap theory to measure service quality • “Only the perceptions of the customers matter.”

  16. What does SERVQUAL measure? FIVE “Dimensions” of service as perceived by customers: • Tangibles • Reliability • Responsiveness • Assurance • Empathy

  17. LibQUAL (2000/2001): Nine dimensions of service • Tangibles • Reliability • Responsiveness • Assurance • Empathy • Access to Collections • Library as Place • Self-Reliance • Instruction

  18. LibQUAL start-up • Spring 1999 ARL meeting – ARL decision to engage in “LibQUAL” pilot project with Texas A&M • October 2000 – ARL Symposium on “Measuring Service Quality.”

  19. LibQUAL phases • Phase 1: 1999/2000 • 12 participating libraries • One Canadian – York • Phase 2: 2000/2001 • 43 participating libraries • Three Canadian - Guelph, McGill, Waterloo • Phase 3: 2001/2002 • 171 participating libraries • Four Canadian – Alberta, Calgary, McGill, York

  20. Purpose of LibQUAL phase 2 • Test what was learned in Phase 1 • Increase sample size and diversity • More Canadian universities • Additional questions on, for example, user self-reliance

  21. Benefits of participation in phase 2 • ARL’s collective work • Information about user expectations • Opportunity to identify service areas for further review • Sharing best practices • Experience with this type of survey • Experience analyzing the data

  22. Part 2 Local administration of the survey

  23. Research ethics approval • UW • UG

  24. Survey population • How Many • How Selected

  25. Email addresses • How obtained • Substitutions • Accuracy

  26. Incentive to participate • Project-wide incentive • Palm Pilot • Local incentives • Gift certificates

  27. Demographic Detail • Population total • Students by discipline and year • Faculty by status

  28. Start and finish dates • March 15 – March 30 • PRAGMATIC FACTORS: • March Break • End of classes • Beginning of Exams

  29. Testing the questionnaire • Why • What we found

  30. Messages to Survey Sample • Four messages • Invitation • URL • Reminders

  31. Responding to questions/comments/complaints • Who • How much time

  32. Nature of questions/comments/complaints • Technical problems • Can’t/won’t respond • Respond later • Already responded • Spam • Survey….

  33. Comments about the survey • Too long • Redundant questions • Rating scale is too broad and not meaningful • Questions are confusing

  34. Comments about the Survey • More questions about collections • No opportunity to provide comments • Endless questions… • Minimal/desired/perceived format is not desirable; prefer strongly disagree/agree format

  35. Comments about the Survey • Poor visual layout, small dots on beige page • Uninviting layout, too dense • Questions didn’t all fit on a screen, needed to constantly scroll back and forth

  36. Comments about the Survey • No way to save a partially completed survey • Total number of questions should have been indicated at the beginning

  37. Comments about the Survey • Should have let respondents indicate which library they were commenting on • Age and sex are never relevant on a survey • Irrelevant questions

  38. Survey administration wrap-up • Summary report to Texas A&M / ARL project team

  39. Part 3 Local results: What we learned

  40. Three areas • Demographic data • Satisfaction data: • Expectations & perceptions • Data models

  41. Area 1: Demographic Data • Good match with known discipline populations • More in common than we thought • May foster collaboration rather than competition

  42. Respondents by Discipline

      Discipline                UW     UG
      Agr/Envl Studies          72    137
      Architecture               5      3
      Business                  47     38
      Education                  1      2
      Engineering              222     43
      General Studies            1      5
      Health Sciences           43     84
      Humanities                78     50
      Other                     69     49
      Performing & Fine Arts    11     10
      Science                  289    228
      Social Sciences          124    145
      Undecided                  5      0
      Total                    967    794

  43. Library Use on Premises

  44. Electronic Resource Use

  45. Respondents by Sex

  46. Area 2: Satisfaction Data • Caution • Do not over-interpret the data • Mature methodology • Not a mature instrument (R&D) • Even when “mature”, gaps only indicate probable concern → further investigation

  47. Example: When it comes to complete runs of journal titles…
      My desired service level is 1..2..3..4..5..6..7..8..9
      My minimum service level is 1..2..3..4..5..6..7..8..9
      My perception of the library’s service performance is 1..2..3..4..5..6..7..8..9

  48. Zone of tolerance

      Desired level ---------------------------------
                       Zone of Tolerance
      Minimum level ---------------------------------
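The scoring behind the zone of tolerance can be sketched in a few lines of Python. This is an illustrative sketch only, not ARL's actual scoring code; the function name and output keys are my own. It follows the standard gap-theory arithmetic on the survey's 1-9 scale: the adequacy gap is perceived minus minimum, the superiority gap is perceived minus desired, and a perception between the minimum and desired levels falls within the zone of tolerance.

```python
def gap_scores(minimum: int, desired: int, perceived: int) -> dict:
    """Gap scores for one survey item on the 1-9 LibQUAL-style scale.

    adequacy_gap    < 0 means service fell below the minimum level.
    superiority_gap > 0 means service exceeded the desired level.
    within_zone means the perception lies inside the zone of tolerance.
    """
    for score in (minimum, desired, perceived):
        if not 1 <= score <= 9:
            raise ValueError("scores must be on the 1-9 scale")
    return {
        "adequacy_gap": perceived - minimum,
        "superiority_gap": perceived - desired,
        "within_zone": minimum <= perceived <= desired,
    }

# Example item: complete runs of journal titles
print(gap_scores(minimum=5, desired=8, perceived=6))
# {'adequacy_gap': 1, 'superiority_gap': -2, 'within_zone': True}
```

As the "Caution" slide stresses, a negative adequacy gap computed this way indicates probable concern to investigate further, not a definitive verdict on the service.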

  49. Zone of Tolerance / sample

  50. Above and within the zone of tolerance • We do not exceed the zone of tolerance in any dimension • We are within the zone for most areas: • Assurance • Empathy • Responsiveness • Tangibles • Self-Reliance • Instruction
