
The Role of Assessment in Research Libraries



  1. The Role of Assessment in Research Libraries Initiative to Recruit a Diverse Workforce Leadership Symposium January 21, 2006 • San Antonio, TX Julia C. Blixrud, ARL Assistant Executive Director, External Relations

  2. Familiar Measures • Inputs • Collection size • Expenditures • Staffing • Outputs • Services • People served • Ratios (inputs → outputs) • e.g., expenditures per FTE
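The ratio measures above are simple arithmetic. As a minimal sketch (in Python, with purely illustrative figures, since the slide supplies no data), an input/output ratio such as expenditures per FTE might be computed like this:

```python
# Minimal sketch of an input/output ratio measure.
# The figures below are hypothetical, not taken from the presentation.

def ratio(input_amount: float, output_count: float) -> float:
    """Return a simple input/output ratio, e.g. expenditures per FTE."""
    if output_count == 0:
        raise ValueError("output_count must be non-zero")
    return input_amount / output_count

total_expenditures = 12_500_000   # annual library expenditures (USD), hypothetical
fte_served = 25_000               # full-time equivalents served, hypothetical

print(f"Expenditures per FTE: ${ratio(total_expenditures, fte_served):,.2f}")
# Expected output: Expenditures per FTE: $500.00
```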

  3. Higher Education Challenges "Educational institutions today face new and significant challenges stemming from disruptions of financial markets, introduction of new technologies, demands for greater efficiency, and unprecedented requirements for investment in faculty, research, and infrastructure." (Some Early Reflections on TIAA-CREF by Herbert M. Allison, February 2003)

  4. Research Library Environment • Increased customer and stakeholder expectations for services, including quality and responsiveness • Greater demands for accountability • Exploding growth in use and applications of technology • Increasing competition for resources • Need for use of reliable and valid data

  5. Opportunities and Pressures • Increasing demand for libraries to demonstrate outcomes/impacts in areas of importance to institution • Increasing pressure to maximize use of resources through benchmarking, resulting in: • Cost savings • Reallocation

  6. Measures that Matter • Input → Output → Outcome → Impact • Consistent with organizational mission, goals, and objectives • Integration with program review • Balance customer, stakeholder, and employee interests and needs • Establish accountability • Collection and use of reliable and valid data • Benchmarking and best practice • Over time

  7. The Challenge “The difficulty lies in trying to find a single model or set of simple indicators that can be used by different institutions, and that will compare something across large groups that is by definition only locally applicable—i.e., how well a library meets the needs of its institution. Librarians have either made do with oversimplified national data or have undertaken customized local evaluations of effectiveness, but there has not been devised an effective way to link the two” Sarah Pritchard

  8. ARL New Measures Begins Tucson, AZ, January 1999 • Ease and Breadth of Access • User Satisfaction • Library Impact on Teaching and Learning • Library Impact on Research • Cost Effectiveness of Library Operations and Services • Space and Facilities • Market Penetration • Organizational Capacity Source: <http://www.arl.org/stats/newmeas/nmbackground.html>

  9. E-Metrics Brief History • ARL Supplementary Statistics tracking expenditures for electronic resources since 1993 • Facilitated retreat at Scottsdale in February 2000 • Contract with the Information Use and Management Policy Institute at Florida State University • Phase One: Environmental Scan • Phase Two: Proposed Measures and Testing • Phase Three: Training Modules • Measures for Electronic Resources (E-Metrics) by Wonsik ‘Jeff’ Shim, Charles McClure, and John Bertot (Washington, DC: Association of Research Libraries, 2002) • 2002-2003 extended pilot with 39 libraries • Revised supplementary statistics data collection

  10. Learning Outcomes • Development of strategy for involving library in campus assessment activities to demonstrate the value of the library to the learning community • Move from content view (books, subject knowledge) to competency view (what students are able to do) • Understand learning outcomes of academic degree programs • Develop curriculum segments or “offerings” through which the library achieves outcomes • Information Literacy Competency Standards for Higher Education approved by the Association of College and Research Libraries in January 2000

  11. Project SAILS • Developed by Kent State University • Based on ACRL Standards • IMLS Grant as well as Ohio Board of Regents collaborative grant with Bowling Green State University • 3-year research project involving 80 institutions and more than 42,000 students • Measures cohorts of students • Benchmarking and comparative reports on skill sets
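To make the cohort benchmarking and comparative reporting described above concrete, here is a small sketch; the skill-set names and all scores are hypothetical, and it does not reproduce SAILS' actual scoring model. It simply compares a cohort's mean score on each skill set with a peer benchmark:

```python
# Hypothetical cohort-versus-benchmark comparison by skill set.
# Skill-set names and all numbers are illustrative, not actual SAILS data.
from statistics import mean

cohort_scores = {
    "Developing a research strategy": [62, 71, 58, 80, 67],
    "Evaluating sources":             [55, 49, 60, 72, 64],
    "Documenting sources":            [70, 75, 68, 81, 77],
}
peer_benchmarks = {
    "Developing a research strategy": 65,
    "Evaluating sources":             61,
    "Documenting sources":            72,
}

for skill, scores in cohort_scores.items():
    cohort_mean = mean(scores)
    gap = cohort_mean - peer_benchmarks[skill]
    print(f"{skill}: cohort {cohort_mean:.1f} vs. benchmark {peer_benchmarks[skill]} ({gap:+.1f})")
```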

  12. Assessment’s Purpose How can a library answer the question, “Do We Make a Difference?”

  13. Rise of the User-Centered Library and the Culture of Assessment in the 1990s • User-Centered Library • All services and activities are viewed through the eyes of the customers • Customers determine quality • Library services and resources add value to the customer • Culture of Assessment • Organizational environment in which decisions are based on facts, research, and analysis • Services are planned and delivered to maximize positive customer outcomes

  14. Culture of Assessment Key Elements • Basic value - customer & learning focus • A Culture of Assessment is an organizational environment in which decisions are based on facts, research and analysis, and where services are planned and delivered in ways that maximize positive outcomes and impacts for library clients • A Culture of Assessment exists in organizations where staff care to know what results they produce and how those results relate to customer expectations • Organizational mission, values, structures, and systems support behavior that is performance and learning focused

  15. Why Do Libraries Need a Culture of Assessment? • Role within the parent organization • Relationship to central mission • Accountability for operations, resources, added-value • Need for efficiency and effectiveness of operations • Management of resources • Decision-making based on data • Institutionalization of planning process • Response to customers • High quality service • Focus on added value

  16. Important Characteristics • Leadership has a sense of purpose, urgency, resolve, and flexibility • Organizational focus is on customers • Feedback is welcomed and used (atmosphere of integrity and trust) • Staff care about outcomes and impact • Environment is one in which facts are analyzed and research is conducted • Staff are learning how to measure accurately from the customers’ point of view • Organization can anticipate future needs • Organization is building relationships with customers

  17. In Building a Culture of Assessment -- We Often Have a GAP

  18. The Importance of Appropriate Measures Measure what is important, not just what is measurable, because what you measure is what you will pay attention to and work toward.

  19. Performance Management Maxim If you can’t measure it, you can’t manage it. What gets measured matters.

  20. Issues in Using Data Effectively • Library leadership • Organizational culture • Priorities of the library • Sufficiency of resources • Data infrastructure • Assessment skills and expertise • Sustainability • Presenting results • Using results to improve libraries

  21. Choosing the Right Method • Appropriate for the information needed • Timely • Cost effective • Level of user involvement • Representativeness of population • Support for staff/training available • Possibility/probability for results to lead to positive change

  22. Quantitative Measurement Tools • Surveys • Employee survey • Total market survey • Transaction-based questionnaires • User survey • Internal record-keeping • Service data capture • Transaction logs • Survey methods • Email • Paper • Telephone • Web-based

  23. Qualitative Measurement Tools • Advisory teams • Complaint system • Customer visit teams • Employee field reporting • Employee visit teams • Focus groups • Mystery shopping service • Observation • Portfolios • Service reviews • Spot comment cards • Structured interviews • Toll-free hotlines • Usability studies • User groups

  24. Methods of Assessing Students • Standardized tests (pre and post) • Assignments • Papers and essays • Oral presentations • Demonstrations • Exhibitions • Portfolios • Capstone experiences • Surrogates • Grades/GPA • Self-reports • Interviews

  25. Multiple Methods Provide More Effective Measurement • Complementary • Appropriateness • Large projects can be divided up • Quantitative and qualitative information • Multi-dimensional views of issues or users • “Two Proofs” (cross validation) • Use of existing data
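The "Two Proofs" idea above can be illustrated with a short sketch: a hypothetical survey finding is cross-checked against an independent measure drawn from existing data (gate counts) to see whether the two sources point the same way. All names and figures here are assumptions for illustration, not results from the presentation.

```python
# Sketch of cross-validating two independent measures ("two proofs").
# All figures are hypothetical.

survey_satisfaction_change = +0.4   # year-over-year change in mean satisfaction (1-5 scale)
gate_count_change_pct = +6.2        # year-over-year change in visits from existing gate counts (%)

agree = (survey_satisfaction_change > 0) == (gate_count_change_pct > 0)
if agree:
    print("Survey and usage data point the same way; the finding is cross-validated.")
else:
    print("The two sources disagree; investigate further before acting on either one.")
```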

  26. Barriers to Using Data Effectively in Libraries • Organizational culture/leadership support • Time/Staff/Resources • Data issues – too much, compatibility, validity • Establishing priorities • Knowing what to measure and methods to use • Inexperience, perceived lack of skills and expertise • Understanding, presenting and knowing what to do with the results Hiller, S. and Self, J. (2004). From Measurement to Management: Using Data Wisely for Planning and Decision-Making. Library Trends.

  27. Statistics are no substitute for judgment -- Henry Clay

  28. Assessment Challenges • Resources (i.e., time and money) • Buy-in • Access to individuals to evaluate • Expertise to conduct evaluation • Project management experience • Appropriate benchmarks • Conceptual clarity • Measurement & design requirements • Instrument validity and reliability

  29. Julia C. Blixrud Director of Information Services Association of Research Libraries 21 Dupont Circle, Ste 800 Washington, DC 20036 jblix@arl.org 202-296-2296 ext. 133 202-872-0884 (fax) 202-251-4678 (cell)
