
New Tricks for Old Statistics



Presentation Transcript


  1. New Tricks for Old Statistics Margaret Fain Head of Public Services Jennifer Hughes Head of Access Services Coastal Carolina University, Conway, SC ACRL 14th National Conference Pushing the Edge: Explore, Engage, Extend March 12-15, 2009 -Seattle, WA

  2. Introduction • Why this workshop? • Audience: librarians new to using statistics for meaningful assessment. • Goal: to improve use of statistics for strategic planning and assessment.

  3. Learning Outcomes • Identify new ways of using library statistics in order to achieve short and long term goals. • Learn to incorporate statistics as benchmarks into strategic planning and assessment to show administrators how the library is achieving its mission on campus. • Identify and learn to use NCES statistics and other public data for peer comparisons.

  4. Assumptions • Interest in or need to use statistics for assessment /strategic planning /budgeting. • Need for inexpensive but effective statistical gathering methods. • Lack of time/money/personnel for statistics. • No results = No budget.

  5. Assessment Systematic collection, review, and use of information on programs undertaken for the purpose of improving student learning and development.  • Ongoing • Sustainable

  6. Direct and Indirect Measures: • Information Literacy tests • LibQual+ • Surveys • Users • Staff • Focus Groups • Statistics

  7. Accreditation and Assessment Assessment may be characterized as the third element of a four-step planning-assessment cycle: 1. Defining clearly articulated institutional and unit-level goals; 2. Implementing strategies to achieve those goals; 3. Assessing achievement of those goals; and 4. Using the results of those assessments to improve programs and services and inform planning and resource allocation decisions. (Middle States)

  8. Accrediting Bodies MSCHE http://www.msche.org/publications/Assessment_Expectations051222081842.pdf NEASC http://cihe.neasc.org/standards_policies/standards/standards_html_version NCACS http://www.ncahlc.org/index.php?option=com_content&task=view&id=37&Itemid=116 NWCCU http://www.nwccu.org/Standards%20and%20Policies/Accreditation%20Standards/Accreditation%20Standards.htm SACS http://www.sacscoc.org/pdf/2008PrinciplesofAccreditation.pdf WASC http://www.wascsenior.org/findit/files/forms/Handbook_of_Accreditation___July_2008.pdf

  9. Assessment and Accountability • Focus on student learning outcomes. • Focus on improvements in services and programs (SACS). • Focus on demonstrating service or resource is effective, adequate or appropriate.

  10. Sample from SACS Core statement: The institution, through ownership or formal arrangements or agreements, provides and supports student and faculty access and user privileges to adequate library collections as well as to other learning and information resources consistent with the degrees offered. These collections and resources are sufficient to support all educational, research, and public service programs. Comprehensive statement: The institution provides facilities, services, and other learning/informational resources that are appropriate to support its teaching, research, and service mission.

  11. SACS Supporting Documents Data concerning physical facilities for learning resources [devoted to learning and instructional resources]. Data concerning collections and electronic access at the institution and arrangements with other institutions or organizations. [Lists of instructional resources and services]. Data concerning other information resources available to students at their learning locations.

  12. SACS Evidence Core: Description of the adequacy of learning resources for all credit coursework and programs that the institution offers. Comprehensive: Evidence that resources are appropriate and adequate.

  13. Traditional Statistics Collected on Use of: • Services • Collections • Resources • Buildings

  14. Exercise What are the statistics currently collected in your library or area of responsibility? • Audience responses

  15. Responses

  16. Statistics in a New Light Traditional • Quantitative data • No context • Demonstrates use but not impact • No measures of quality

  17. Statistics in a New Light Assessment-oriented • Quantitative AND Qualitative data • Contextual • Shows impact of library services on users • Demonstrates the library’s contribution to its own and to institutional goals and objectives

  18. Courtesy Notices • Quantitative data • number of items overdue per semester • number of items renewed per semester • fine money owed per semester • fine money collected per semester

  19. Courtesy Notices • Qualitative data: • Student survey responses and student and faculty comments indicated general unhappiness with current system • Staff requests to reduce amount of time spent renewing overdue materials (very labor intensive) • Objective: to reduce number of items overdue and reduce fines paid by students and improve overall satisfaction with overdue process.

  20. Courtesy Notices • Changes Made • Implemented email courtesy notices for all patrons • Traditional approach: we are done • New approach: we have to follow up and show that the change made a difference

  21. Courtesy Notices : Analysis of Results • Quantitative data showed • decrease in overdue items • increase in items renewed • decrease in fines owed and fines collected • decrease in number of fines forgiven/reduced. • Qualitative data showed decrease in number of complaints about overdues and fines by all users.
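
The before/after comparison in these slides reduces to a percent change per metric. A minimal sketch in Python (the presentation itself does not include code, and the figures below are invented placeholders, not Coastal Carolina’s actual data):

```python
# Sketch of the semester-over-semester comparison described above.
# All numbers are hypothetical placeholders for illustration.

def pct_change(before: float, after: float) -> float:
    """Percent change from the pre-notice to the post-notice semester."""
    return (after - before) / before * 100

# Hypothetical totals: (before email courtesy notices, after)
metrics = {
    "items overdue":   (1200, 840),
    "items renewed":   (300, 510),
    "fines collected": (2500.00, 1600.00),
}

for name, (before, after) in metrics.items():
    print(f"{name}: {pct_change(before, after):+.1f}%")
```

Reporting each metric as a signed percent change (rather than raw counts) is what turns the traditional tally into an assessment result an administrator can act on.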

  22. What Are “New” Statistics? • Statistics with value added • Statistics combined with other data sources to “prove” results. • Statistics that show how patrons are using or not using resources/services.

  23. Creative Applications of “New” Statistics • Example objective: expand hours with new staff, supported by combining • Building use stats • Circulation stats • Student surveys (3) re: hours of operation • Current hours of operation • Current staff working hours

  24. Group Exercise Brainstorm “new” ways of using or combining traditional statistics to demonstrate that resources and services are appropriate and adequate.

  25. Responses

  26. Responses

  27. Break 20 minutes

  28. New Ways to Analyze and Collect Data • Excel • NCES

  29. Excel Basics

  30. Excel Basics

  31. Excel Basics

  32. Excel Basics

  33. Excel Advanced

  34. Excel Advanced Instructions @ http://www.coastal.edu/library/presentations/index.html
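
The kind of summarization the Excel slides walk through (grouping raw counts and subtotaling them) can also be scripted. A minimal sketch, assuming a hypothetical CSV export with "month" and "count" columns — the column names and figures are illustrative, not from the presentation:

```python
# Subtotal gate counts by month, mimicking an Excel subtotal/pivot step.
# The embedded CSV stands in for a real exported file.
import csv
import io
from collections import defaultdict

sample = io.StringIO(
    "month,count\n"
    "2008-09,4100\n"
    "2008-09,3900\n"
    "2008-10,4550\n"
)

totals: dict[str, int] = defaultdict(int)
for row in csv.DictReader(sample):
    totals[row["month"]] += int(row["count"])

for month, total in sorted(totals.items()):
    print(month, total)
```

For a real file, replace the `io.StringIO` stand-in with `open("gatecounts.csv")`; the grouping logic is unchanged.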

  35. NCES for Peer Comparisons • NCES Library Statistics Program http://nces.ed.gov/surveys/libraries/
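
Once peer figures have been pulled from the NCES comparison tool, the comparison itself is simple arithmetic. A sketch with invented peer values (volumes per FTE is just one example ratio, not a measure prescribed by the presentation):

```python
# Compare one library against a hypothetical peer group on a single ratio.
# All values are invented for illustration.
from statistics import median

peer_volumes_per_fte = [38.2, 41.5, 52.0, 47.3, 60.1]  # hypothetical peers
our_volumes_per_fte = 44.8                              # hypothetical

peer_median = median(peer_volumes_per_fte)
below = sum(v < our_volumes_per_fte for v in peer_volumes_per_fte)
print(f"peer median: {peer_median}")
print(f"we rank above {below} of {len(peer_volumes_per_fte)} peers")
```

The median is usually a safer benchmark than the mean here, since one unusually large peer library can skew an average badly.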

  36. Quick and Dirty Data Collection • Quick surveys (SNAP, Survey Monkey, Zoomerang) • Automated reports • Samples (data snapshots)
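
The “data snapshots” idea above — counting on a sample of days instead of every day — can be sketched as follows. The day counts are randomly generated stand-ins for real tallies:

```python
# Estimate a semester total from a random sample of service days.
# Counts are invented; in practice they would be tallied on sampled days.
import random

random.seed(42)                              # reproducible sample
semester_days = list(range(1, 106))          # e.g. 105 service days
sampled_days = random.sample(semester_days, 15)

# Hypothetical transaction counts collected only on the sampled days
counts = {day: random.randint(20, 60) for day in sampled_days}
daily_mean = sum(counts.values()) / len(counts)
estimate = daily_mean * len(semester_days)
print(f"estimated semester transactions: {estimate:.0f}")
```

Sampling 15 of 105 days cuts the tallying workload by roughly 85% while still supporting a defensible semester estimate, which fits the “lack of time/money/personnel” constraint named in the assumptions slide.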

  37. Closing the Loop • Strategic Planning • Short term / Long Term Objectives • Determine data needs up front • What you currently collect / have. • What will be needed to demonstrate objective is accomplished. • Timeframe for collection

  38. Closing the Loop • How will results be used? • How will you document changes made to show improvement? • How will you revise data collection for the next year?

  39. Action Plans Using traditional statistics in new ways, draft a sample action plan which uses the “new” statistics as benchmarks to provide outcomes and indicators of changes made.

  40. Template to Download http://www.coastal.edu/library/presentations/index.htm Action Plan • Objective to be accomplished: • Data to be collected: • Sources of additional data: • Benchmark/Metric: • Expected outcomes: • Expected indicators of changes made:

  41. Responses

  42. Conclusion

  43. Literature Blake, J. and S. Schleper. “From Data to Decisions: Using Surveys and Statistics to Make Collection Management Decisions.” Library Collections, Acquisitions, & Technical Services 28 (2004): 460-464. Cheng, R., S. Bischof, and A. Nathanson. “Data Collection for User-oriented Library Services: Wesleyan University Library’s Experience.” OCLC Systems & Services 18.4 (2002): 195-204. Dilevko, J. “Inferential Statistics and Librarianship.” Library and Information Science Research 29 (2007): 209-229. Intner, S. “Making Your Collections Work for You: Collection Evaluation Myths & Realities.” Library Collections, Acquisitions, & Technical Services 27 (2003): 339-350. Luzius, J. “A Look at Circulation Statistics.” Journal of Access Services 2.4 (2004): 15-22. Welch, J. “Who Says We’re Not Busy? Library Web Page Usage as a Measure of Public Service Activity.” Reference Services Review 33.4 (2005): 371-379.

  44. Additional Resources • ARL: New Measures and Assessment Initiatives • http://www.arl.org/stats/initiatives/ • Library Research Service • http://www.lrs.org/index.php • IPEDS • http://nces.ed.gov/IPEDS/
