
Making Library Assessment Work


Presentation Transcript


  1. Making Library Assessment Work. Steve Hiller and Jim Self, University of Washington and University of Virginia. Association of Research Libraries. ARL 4th Human Resources Management Symposium, Washington, D.C., November 9, 2004

  2. Why Assess? • Accountability and justification • Improvement of services • Comparison with others • Identification of changing patterns • Identification of questionable services • Marketing and promotion

  3. Good assessment practices • Focus on the user • Diverse samples of users • Fair and unbiased queries • Measurable results • Criteria for success • Qualitative and quantitative techniques • Corroboration

  4. Assessment is not… • Quick and easy • Free and easy • A one-time effort • A complete diagnosis • A roadmap to the future

  5. “…but to suppose that the facts, once established in all their fullness, will ‘speak for themselves’ is an illusion.” Carl Becker, Annual Address of the President of the American Historical Association, 1931

  6. What Does It Mean? Understanding Your Data • Scan results for basic overview • Frequencies, means, patterns, variation • Use statistical analyses that make sense • Qualitative information and comparisons provide context and understanding • Seek internal or external validation • Within same data sets or others • Identify what is important and why
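
To make the first two bullets concrete, here is a minimal Python sketch of scanning one survey question for frequencies, mean, and variation; the ratings are invented sample data, not UW survey results:

```python
# Minimal sketch: scan one survey question for frequencies, mean,
# and variation. The 1-5 ratings below are hypothetical sample data.
from collections import Counter
from statistics import mean, stdev

ratings = [5, 4, 4, 3, 5, None, 2, 4, 5, 3, None, 4]  # None = skipped

answered = [r for r in ratings if r is not None]

print("responses:", len(ratings), " answered:", len(answered))
print("frequencies:", sorted(Counter(answered).items()))
print(f"mean: {mean(answered):.2f}  stdev: {stdev(answered):.2f}")
```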

  7. Communicating and Using Results • Identify key findings, not all results • Mix text, data, and graphics • Avoid jargon • Add context • Know your audiences; make it understandable • Prioritize potential action items and follow-up • Identify “handoffs” to those responsible for action • Look for some easy “wins” • Quick, inexpensive, and noticeable • Report results

  8. Effective Assessment: Easier Said Than Done • Libraries in many cases are collecting data without really having the will, organizational capacity, or interest to interpret and use the data effectively in library planning. • The profession could benefit from case studies of those libraries that have conducted research efficiently and applied the results effectively. (Denise Troll Covey, Usage and Usability Assessment: Practices and Concerns, 2002)

  9. Two Approaches to Assessment • University of Washington • User needs assessment • Large-scale cyclical surveys and ongoing qualitative input • Assessment distributed throughout organization • University of Virginia • Performance and financial standards • Compilation of data from varied sources • Centralized Management Information Services unit

  10. UW Libraries Assessment Organization • Library Assessment Coordinator (50%) • Chairs Library Assessment Group (9 members) • Coordinates and develops broad-based user needs assessment efforts (surveys, focus groups, observation) • Encourages and supports other assessment work • Shared and Distributed Assessment Activities • Usability (Web Services) • E-Metrics (Assessment, Collection Management Services) • Management information (Assessment, Budget, CMS) • Instruction (Information Literacy, Assessment) • Digital Library (Digital Initiatives, Public Services, Assessment)

  11. UW Assessment Methods • Large-scale user surveys every 3 years (“triennial survey”): 1992, 1995, 1998, 2001, 2004 • In-library use surveys every 3 years beginning 1993 • LibQUAL+™ in 2000, 2001, 2002, 2003 • Focus groups on varied topics (annually since 1998) • Observation (guided and non-obtrusive) • Usability • E-Metrics

  12. Growing Assessment at UW: From Project-Based to Ongoing and Sustainable • Libraries’ first strategic plan in 1991 called for survey as part of user-centered services philosophy • Initial large-scale library survey done in 1992 as “one-time project” • Library Services Committee formed in 1993 • Conducted in-library use surveys in 1993, 1996, triennial survey in 1995 • Library Assessment Group appointed in 1997 • Focus groups, observation studies, in-library and triennial surveys • Collection Management Services, 1997 • E-Metrics and collections use • Library systems, 1997 • Usability, Web usage logs • Library Assessment Coordinator (50%) appointed 1999

  13. UW Assessment Priorities • Information-seeking behavior and use • Library use patterns • Library importance and impact • User priorities for the library • User satisfaction with services, collections, overall • Using data to make informed decisions that lead to library improvement

  14. How UW Has Used Assessment Information • Understand that different academic groups have different needs • Make our physical libraries “student” places • Identify student information technology needs • Move to desktop delivery of resources • Enhance resource discovery tools • Provide standardized service training for all staff • Stop activities that do not add value to users • Consolidate and merge branch libraries

  15. Branch Library Consolidation: A UW Case Study • 3 Social Science libraries consolidated in 1994 • Described in ARL SPEC Kit • Review Committee formed in 2002 • Changing use patterns a bigger driver than budget • Review to be objective and data-based • Identify one library to be consolidated into main library

  16. Performance Measures to Assess Branch Library Viability • Use • Print items, photocopies, reference questions, gate counts • Primary user population • Number of faculty and students, change over time • Facility quality • For users, collections, and staff • Physical library dependency of primary users
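
As a hedged illustration of the first measure, the sketch below tabulates change over time in one use statistic per branch; the branch names and counts are hypothetical placeholders, not the figures the UW review actually used:

```python
# Minimal sketch: change over time in one use measure per branch.
# Branch names and gate counts are hypothetical placeholders.
gate_counts = {
    "Branch A": {1998: 41_000, 2003: 24_000},
    "Branch B": {1998: 30_000, 2003: 27_000},
}

for branch, counts in gate_counts.items():
    start, end = counts[1998], counts[2003]
    pct = (end - start) / start * 100
    print(f"{branch}: {start:,} -> {end:,} gate count ({pct:+.0f}%)")
```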

  17. Data Sources Used • Library-generated use data (including trend data) • Electronic resources use data supplied by vendors • University enrollment data (including trend data) • Interviews, focus groups, survey comments • Facility data • Survey data • Triennial survey • In-library use • Cost data

  18. Primary User Groups by Branch Library (2003 University data)

  19. Facility Space Quality: Methodology • Discussed facility issues with unit staff • Reviewed user survey comments from 2001 and 2002 • Used previous focus group data for fine arts libraries • Developed list of criteria • A team of 3 walked through each unit • A second walk-through was conducted 2 months later • Each member of the team assigned a score of 1 to 5 for quality of staff, collections, and user spaces • Scores were compared and made consistent
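
The scoring and reconciliation steps reduce to a simple comparison. In the sketch below, the criteria scores and the one-point tolerance for flagging disagreement are assumptions for illustration, not the committee's documented rule:

```python
# Minimal sketch: compare three reviewers' 1-5 scores per criterion
# and flag large spreads for discussion. Scores are hypothetical;
# the >1 point tolerance is an assumed rule, not from the slides.
scores = {
    "staff spaces":      [3, 3, 4],
    "collection spaces": [2, 4, 2],
    "user spaces":       [4, 4, 4],
}

for criterion, s in scores.items():
    spread = max(s) - min(s)
    note = "reconcile" if spread > 1 else "consistent"
    print(f"{criterion}: {s} spread={spread} -> {note}")
```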

  20. Facility Quality

  21. Science Faculty: Libraries Used Regularly (2001 Survey)

  22. Forestry Faculty and Grad Students: Frequency of Library Use

  23. [no transcribable slide text]

  24. Merger Timeline • Review Group formed Spring 2002 • Recommendations submitted February 2003 • Merger of Forest Resources Library • Identification of two other libraries for later merger • Recommendation accepted June 2003 • Joint implementation team appointed September 2003 • Forestry faculty and students surveyed and presentation made February 2004 • Forest Resources Library merged into Natural Sciences Library August 2004

  25. Triennial Survey Spring 2004: Satisfaction, All Faculty and Forestry Faculty (1998, 2001, 2004)

  26. The University of Virginia Library: Organizational Culture • Customer Service • Collecting and using data • Innovation • Flexibility • Learning and development • Participation and discussion • Pride

  27. In the words of our leader… • Use data to IMPROVE • Services • Collections • Processes • Performance • Etc., etc. • Don’t use data to preserve the status quo – Karin Wittenborg, University Librarian, University of Virginia, June 24, 2004

  28. University of Virginia Library: Organizing for Assessment • Management Information Services unit • Established in 1996 • Currently 3 staff • Resource for library management and staff • Advocates for sustainable assessment • Centralized data collection, analysis, and compilation • Multifaceted approaches

  29. Collecting the Data at U.Va. • Customer Surveys • Staff Surveys • Mining Existing Records • Comparisons with peers • Qualitative techniques

  30. Customer Surveys • Faculty • 1993, 1996, 2000, 2004 • Students • 1994, 1998, 2001, 2005 • Separate analyses for grads and undergrads

  31. [no transcribable slide text]

  32. Faculty Priorities, 1993 to 2004

  33. Using Customer Survey Results – UVa • Additional resources for the science libraries (1994+) • Major renovation (2001) • Revision of library instruction for first-year students (1995) • Redefinition of collection development (1996) • Initiative to improve shelving (1999) • Undergraduate library open 24 hours (2000) • Additional resources for the Fine Arts Library (2000) • Support transition from print to electronic journals (2004)

  34. Staff Surveys • Internal Customer Service • 2002, 2003, 2004 • 1 to 5 satisfaction scale • Worklife Survey • 2004 • Agree or disagree with positive statements

  35. Internal Customer Service Surveys • Ratings (1 to 5) of units providing service to other library staff • Reports to managers and administrators • Anonymous structured interviews to follow up • Survey expanded in 2004 to include all library departments

  36. Worklife Survey • Areas of inquiry • Job Satisfaction • Interpersonal Relations • Communications & Collaborations • Diversity • Resource Availability • Staff Development • Health & Safety • Report at Library ‘Town Meeting’ • Focus groups following up

  37. Data Mining • Acquisitions • Circulation • Finance • University Records

  38. Acquisitions Expenditures by Format, University of Virginia Library

  39. University of Virginia Library: Serving the Customer

  40. University of Virginia Library: Serving the Customer

  41. Comparisons with Peers • Within the University • Within ARL

  42. Expenditures of UVA Academic Division, 1989–2003 • Research (+219%) • Other Academic Support (+200%) • Total Academic Division (+140%) • Libraries (+81%) • Instruction (+80%)

  43. Median Faculty Salaries: University of Virginia Library Compared to ARL Median

  44. Qualitative Techniques • Focus Groups • Preparation for worklife survey • Follow-up to worklife survey • Structured Interviews • Anonymous follow-up to customer service survey • Open Discussions

  45. Corroboration • Data are more credible if they are supported by other information • John le Carré’s two proofs

  46. Analyzing Survey Results • Two Scores for Resources, Services, Facilities • Satisfaction = Mean Rating (1 to 5) • Visibility = Percentage Answering the Question • Permits comparison over time and among groups • Identifies areas that need more attention
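
Both scores are simple arithmetic over one question's responses, as in this sketch (the response data is invented; a skipped question counts against visibility but not satisfaction):

```python
# Minimal sketch of the two UVa survey scores: satisfaction is the
# mean rating (1-5) among those who answered; visibility is the
# percentage of respondents who answered at all. Data is invented.
from statistics import mean

responses = [4, 5, None, 3, 4, None, None, 5, 4, 2]  # None = skipped

answered = [r for r in responses if r is not None]
satisfaction = mean(answered)
visibility = len(answered) / len(responses) * 100

print(f"Satisfaction: {satisfaction:.2f} (1-5 scale)")
print(f"Visibility:   {visibility:.0f}% answered")
```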

  47. UVa Reference Activity and Reference Visibility in Student Surveys

  48. The Balanced Scorecard: Managing and Assessing Data • The Balanced Scorecard is a layered and categorized instrument that • Identifies the important statistics • Ensures a proper balance • Organizes multiple statistics into an intelligible framework

  49. The scorecard measures are “balanced” into four areas • The user perspective • The finance perspective • The internal process perspective • The future (learning and growth) perspective

  50. Metrics • Specific targets indicating full success, partial success, and failure • At the end of the year we know if we have met our target for each metric • The metric may be a complex measure encompassing several elements
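
A minimal sketch of how year-end values might be graded against such targets; the perspective names come from the previous slide, while the metrics, thresholds, and grading rule are hypothetical illustrations rather than UVa's actual scorecard:

```python
# Minimal sketch: grade a year-end value against explicit targets for
# full success, partial success, and failure. All metric names and
# threshold values below are hypothetical examples.
def grade(value, full_target, partial_target, higher_is_better=True):
    """Return 'full', 'partial', or 'fail' for a year-end value."""
    if not higher_is_better:
        # Flip the sign so "bigger is better" logic applies uniformly.
        value, full_target, partial_target = -value, -full_target, -partial_target
    if value >= full_target:
        return "full"
    if value >= partial_target:
        return "partial"
    return "fail"

scorecard = {
    # perspective: (metric, measured value, full target, partial target)
    "user":             ("customer satisfaction", 4.3, 4.2, 4.0),
    "finance":          ("cost per circulation", 2.9, 2.5, 3.0),
    "internal process": ("shelving accuracy %", 98.5, 99.0, 97.0),
    "future":           ("staff training hours", 21, 24, 18),
}

for perspective, (metric, value, full, partial) in scorecard.items():
    better_high = metric != "cost per circulation"  # lower cost is better
    print(f"{perspective}: {metric} = {value} -> "
          f"{grade(value, full, partial, better_high)}")
```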
