
Library Performance Indicators: Does it really make sense to measure them?




  1. Library Performance Indicators: Does it really make sense to measure them? Błażej Feret, Library, Technical University of Lodz, Poland, Blazej.Feret@bg.p.lodz.pl; Marzena Marcinek, Main Library, Cracow University of Technology, Poland, marcinek@biblos.pk.edu.pl

  2. Agenda • Introduction • LPIs for traditional libraries (ISO) • LPIs for digital/electronic libraries (ISO) • Measuring service quality: LibQUAL+/Rodski • Performance Analysis of Polish Research Libraries Project (Marzena) • New context of the library/Googlization • Place of the future library • LPIs for libraries of the 21st Century? • Conclusions

  3. Introduction How good are we (the libraries)? [Diagram: service, quality, performance, perception, needs, satisfaction, INDICATORS]

  4. Introduction. Quality – is it important? High quality of library performance is crucial for every research library's survival. Wide online access to information makes researchers and students demand the highest-quality library services. It is the quality of library services that determines how the library is perceived within its parent institution and in society. Derfert-Wolf, Górski, Marcinek @ IFLA 2005

  5. Introduction. What is quality? Glossary: quality = fitness for purpose, fitness for use, conformity to requirements and absence of defects. ISO Standard 11620 (Performance Indicators for Libraries) defines "quality" as: "Totality of features and characteristics of a product or service that bear on the library's ability to satisfy stated or implied needs." In the TQM context: "The quality of service is defined by the customer's perception of both the quality of the product and the service providing it" (Barnard, 1994)

  6. Introduction. Quality. Quality assessment depends not only on the product or service as it is, but also on a person or institution involved in the assessment process.

  7. Introduction. Quality. Who decides what quality is, and who evaluates and assesses the quality ("fitness for purpose") of the library? "Many librarians maintain that only they, the professionals, have the expertise to assess the quality of library service. They assert that users cannot judge quality, users do not know what they want or need, and professional hegemony will be undermined if they kowtow to users. Such opinions about services, in fact, are irrelevant. The only thing that matters is the customer opinions, because without users there is no need for libraries except to serve as warehouses… After all, customers (present, potential, and former ones) believe that the library's reason for being open is to meet their needs. Each customer evaluates the quality of service received and decides when (or if) there will be further interaction with that organization" (Hernon, Altman 1998)

  8. Introduction. Quality. Quality is a very relative term • The quality of a library is defined and assessed from the perspectives of different groups of people • A basic element of quality is user satisfaction • Users in different countries, or even different user groups, may have different needs and expectations, and therefore different levels of satisfaction with the same service • User satisfaction is NOT an objective value (though it is measurable) • User satisfaction "is the emotional reaction to a specific transaction or service encounter" (Hernon, Altman) • User satisfaction with a single transaction is determined by many different factors, including service quality, the user's past experience with the service provider, the user's current emotional state, etc. (Hernon, Altman)

  9. Introduction. Service quality. The better the service quality, the higher the user satisfaction, but: "perceived quality" is different from "objective quality". Service quality "is dependent on the customer's perception of what they can expect from a service and what they believe they have received, rather than on any 'objective' standard as determined by a professional group or in conventional performance measurement." R. Cullen, Library Trends, vol. 49, no. 4, 2001

  10. Introduction. User satisfaction. A user is satisfied when the service provided meets his/her expectations. If there is a gap between the service delivered and the user's expectations, the user is not satisfied with the service.

  11. Introduction. Gap analysis. The gaps between users' expectations and perceptions (SERVQUAL model): • The discrepancy between users' expectations and management's perception of these expectations • The discrepancy between management's perception of users' expectations and service quality specifications • The discrepancy between service quality specifications and actual service delivery • The discrepancy between actual service delivery and what is communicated to users about it • The discrepancy between users' expected service and their perception of the service delivered.
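The last of these gaps (expectation vs. perception of the delivered service) is the one SERVQUAL-style surveys actually score: each statement is rated once for expectation and once for perception, and the gap is the difference. A minimal sketch of that calculation; the statement names and ratings are hypothetical examples, not data from any real survey:

```python
# SERVQUAL-style gap score: gap = perceived rating - expected rating.
# A negative gap means the service fell short of expectations.
# Statement names and ratings below are hypothetical.

def gap_scores(expected, perceived):
    """Return the per-statement gap (perceived minus expected)."""
    return {s: perceived[s] - expected[s] for s in expected}

expected = {"staff courtesy": 6.5, "opening hours": 6.0, "catalogue accuracy": 6.8}
perceived = {"staff courtesy": 6.9, "opening hours": 5.2, "catalogue accuracy": 6.8}

gaps = gap_scores(expected, perceived)
# List the statements from the largest shortfall to the largest surplus.
for statement, gap in sorted(gaps.items(), key=lambda kv: kv[1]):
    print(f"{statement}: {gap:+.1f}")
```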

  12. Introduction. • Quality • User satisfaction • Other parameters • measuring user satisfaction • measuring other parameters • quality surveys (LibQUAL+, Rodski) • Library performance indicators • ISO norms

  13. Introduction • Why measure library performance? • Library management and the decision-making process • Monitoring implementation of strategic plans • Optimization of library activities; service enhancements • Acquiring and rational allocation of financial resources • Library marketing • Accreditation • Benchmarking, rankings, to compare… • … • academic vs. public libraries context

  14. to compare… "How's your wife?" "Compared to what?" M. Lynch: Compared to What? Or, Where to Find the Stats. American Libraries, September 1999

  15. Introduction. How to measure library performance? • Standard ISO 11620:1998, ISO 11620:1998/AD1:2003 Information and Documentation. Library performance indicators • Technical Report ISO/TR 20983:2003 Information and documentation. Performance indicators for electronic library services • Poll R., te Boekhorst P. "Measuring Quality: International Guidelines for Performance Measurement in Academic Libraries". IFLA 1996 • ICOLC Project: Guidelines for Statistical Measures of Usage of Web-Based Information Resources. ICOLC, 2001 (http://www.library.yale.edu/consortia/2001webstats.htm) • Project COUNTER (www.projectcounter.org)

  16. ISO 11620 (1998) Information and documentation – Library performance indicators • This International Standard is applicable to all types of libraries in all countries. • Indicators may be used for comparison over time within the same library. Comparisons between libraries may also be made, but only with extreme caution. • This International Standard does not include indicators for the evaluation of the impact of libraries either on individuals or on society.

  17. ISO 11620 (1998) • User perception • General • USER SATISFACTION • Public Services • General • Percentage of Target Population Reached • Cost per User • Library Visits per Capita • Cost per Library Visit • Providing Documents • Titles Availability • Required Titles Availability • Percentage of Required Titles in the Collection • Required Titles Extended Availability • In-library Use per Capita • Document Use Rate

  18. ISO 11620 (1998) • Public Services (cont.) • Retrieving Documents • Median Time of Document Retrieval from Closed Stacks • Median Time of Document Retrieval from Open Access Areas • Lending Documents • Collection Turnover • Loans per Capita • Documents on Loan per Capita • Cost per Loan • Loans per Employee • Document delivery from external sources • Speed of Interlibrary Lending • Enquiry and reference services • Correct Answer Fill Rate
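Several of the indicators above are medians of sampled times (e.g. retrieving a document from closed stacks), so once the sample has been collected the calculation itself is trivial. A minimal sketch; the measurements are hypothetical:

```python
# Some ISO 11620 indicators, e.g. "Median Time of Document Retrieval
# from Closed Stacks", are sample medians of measured times.
# The retrieval times below (in minutes) are hypothetical.
from statistics import median

retrieval_times_min = [12, 7, 15, 9, 30, 11, 8]

# The indicator is simply the median of the measured sample;
# the median is robust against the occasional very slow retrieval (30).
print(median(retrieval_times_min))  # 11
```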

  19. ISO 11620 (1998) • Public Services (cont.) • Information searching • Title Catalogue Search Success Rate • Subject Catalogue Search Success Rate • User education • NO INDICATOR • Facilities • Facilities Availability • Facilities Use Rate • Seat Occupancy Rate • Automated Systems Availability • Technical Services • Acquiring documents • Median Time of Document Acquisition • Processing documents • Median Time of Document Processing

  20. ISO 11620 (1998) • Technical Services (cont.) • Cataloguing • Cost per Title Catalogued • Promotion of services • NO INDICATOR • Availability and use of human resources • NO INDICATOR

  21. ISO 11620 Amendment 1 (2003) Additional indicators: • Public services • Providing documents • Proportion of Stock Not Used • Shelving Accuracy • Lending Documents • Proportion of Stock on Loan • User services • User Services Staff per Capita • User Services Staff as Percentage of Total Staff

  22. ISO/TR 20983 (2003) Information and documentation – Performance indicators for electronic libraries. The performance indicators described in this Technical Report are used as tools to compare the effectiveness, efficiency and quality of the library's services and products against the library's mission and goals. They can be used for evaluation in the following areas: • Comparing a single library's performance over the years • Support for management decisions • Demonstrating the library's performance and its cost to funders, the population and the public • Comparing performance between libraries of similar structure • Whether the library's performance or the use of its services has changed over the years • How far the performance or use in one library differs from that in other libraries

  23. ISO/TR 20983 (2003) • Public Services • General • Percentage of Population Reached by Electronic Services • Providing electronic library services • Percentage of Expenditure on Information Provision Spent on the Electronic Collection • Retrieving documents • Number of Documents Downloaded Per Session • Cost Per Database Session • Cost Per Document Downloaded • Percentage of Rejected Sessions • Percentage of Remote OPAC Sessions • Virtual Visits as Percentage of Total Visits • Enquiry and reference services • Percentage of Information Requests Submitted Electronically

  24. ISO/TR 20983 (2003) • Public Services (cont.) • User education • Number of User Attendances at Electronic Service Training Lessons per Capita • Facilities • Workstation Hours Available per Capita • Population per Public Access Workstation • Workstation Use Rate • Availability and use of human resources • Staff training • Number of Attendances at Formal IT and Related Training Lessons per Staff Member • Deployment of Staff • Percentage of Library Staff Providing and Developing Electronic Services

  25. Measuring library service quality • LibQUAL+ • RODSKI • Performance Analysis for Polish Research Libraries

  26. LibQUAL+ (ARL) • LibQUAL+ has 22 standard statements plus the option to select five local service quality assessment statements. Clients are asked to rate each statement three times: for the minimum, desired and perceived levels of service quality. All ratings are on a 1–9 scale, with 9 being the most favourable. There is an open-ended comments box about library services in general.
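From the three ratings, LibQUAL+ derives two gap scores per statement: service adequacy (perceived minus minimum) and service superiority (perceived minus desired). A minimal sketch; the ratings are hypothetical:

```python
# LibQUAL+ collects three 1-9 ratings per statement and derives two gaps:
#   adequacy    = perceived - minimum  (negative: below the minimum acceptable level)
#   superiority = perceived - desired  (negative: short of the desired level)
# The ratings below are hypothetical.

def libqual_gaps(minimum, desired, perceived):
    """Return (adequacy, superiority) for one statement's three ratings."""
    adequacy = perceived - minimum
    superiority = perceived - desired
    return adequacy, superiority

adequacy, superiority = libqual_gaps(minimum=5, desired=8, perceived=7)
print(adequacy, superiority)  # 2 -1  (acceptable, but short of the desired level)
```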

  27. RODSKI (rodski.com.au) • Rodski is an Australian behavioural research company which develops its own surveys. • The Rodski survey had 38 statements and the option to include up to 15 local service quality assessment statements, which clients are asked to rate twice: first to measure the importance of each statement to them, and second to measure their impression of the library's performance on each statement. Ratings are on a 1–7 scale, with 7 being the most favourable. There are two comments boxes at the end of the survey – one for general comments and one for "the one area we could improve on to assist you".

  28. Performance Analysis for Polish Research Libraries. Marzena Marcinek

  29. Research Libraries in Poland • Total: 1225 • National Library: 1 • Academic libraries: 989 • Libraries of the Polish Academy of Sciences (PAS): 94 • Libraries of branch R&D units: 99 • Public libraries: 11 • Other: 31

  30. Collection of Polish research libraries (excl. e-collection)

  31. Readers, loans and staff of research libraries

  32. Academic libraries Regulations • the Library Act of 1997 • the Higher Education Act of 2005 Funds • budgets of parent institutions from the resources of the appropriate ministries, e.g. the Ministry of Science and Higher Education (usually cover only current expenditure) • various grants and projects

  33. Official library statistics in Poland • Central Statistical Office (CSO) – data collected every second year • The Higher Education, published by the Ministry of Science and Higher Education – data collected every year

  34. Assessment of higher education institutions (and libraries) • State Accreditation Commission • Journals • University / parent institution bodies • Libraries

  35. Characteristics of library statistics and performance measurement in Poland • lack of a national library statistics system • data on libraries gathered every second year by the Central Statistical Office – insufficient for comparative analyses and not consistent with ISO 2789 • lack of unified criteria to evaluate and compare library performance • lack of tools for systematic data gathering • lack of a body/institution responsible for developing methods and tools for library evaluation • the State Accreditation Commission deals with library issues only in a very general manner

  36. Quality initiatives and user surveys in Polish academic libraries • Development of Library Management as Part of the University Total Quality Management (EU Tempus grant, 1998-2000) • "Analysis of current state of libraries with selected performance indicators" • user survey (LIBRA package) • Comparative studies of Polish research libraries (national conference, Krakow 2001) • many separate research studies and surveys • the need for common patterns and results

  37. The Group for Standardisation for Polish Research Libraries • formed in 2001, initially as an informal team • activities incorporated into the overall plan of tasks of the Standing Conference of the Directors of Higher Education Libraries • "Performance Analysis for Polish Research Libraries” – a project based on the agreement on cooperation signed by 8 institutions employing members of the Group (2004) • Project co-financed by the Ministry of National Education and Sport, 2004

  38. A Common Project of Polish Research Libraries on Comparable Measures Objectives • to define methods for the assessment of Polish research libraries • to select a set of performance indicators and standards for library performance (quantity, quality and effectiveness)

  39. Goals • to collect libraries' statistical data for a computer database • to conduct comparative research • to prepare and publish yearly reports

  40. Tasks • identification of publications on library performance and national solutions in different countries • preparation and further modification of a questionnaire for the survey of library performance • preparation and further modification of dedicated software for the acquisition and analysis of data collected in the surveys • data collection • promotion • detailed analysis of data

  41. Questionnaire • Staff • Collection • Budget • Infrastructure • Circulation • Information services • Didactics • Publications and data bases created by the library • Library cooperation, organisation of library events, professional activity of library staff

  42. Patterns for the Polish Questionnaire • EU TEMPUS PHARE JEP 13242-98 “Development of Library Management as part of the University TQM” • ISO 11620:1998, AD1:2003 Information and Documentation. Library performance indicators • ISO 2789:2003 Information and Documentation. International Library Statistics • R. Poll, P. te Boekhorst “Measuring Quality : International Guidelines for Performance Measurement in Academic Libraries”. IFLA 1996

  43. Changes and modifications to the questionnaire (2004) • more indicators and formulas based mainly on the ISO 11620 and ISO 2789 standards (information services, electronic sources and usage) • problems reported by librarians or observed by the administrator of the database • more notes and comments (financial and staff issues)

  44. Questionnaire • 48 questions of various types • refer to easily accessible or computable data (e.g. size of collection, number of users etc.) • closed questions about the services offered (e.g. on-line reservation: Yes/No) • 88 performance indicators • 19 calculated by librarians • 69 calculated automatically

  45. Why so many indicators? • the need for a comprehensive analysis of the current state of Polish research libraries • the need to cover all aspects of library activities included in the questionnaires • the need to develop standards for library evaluation in the future, on the basis of current performance indicators • usefulness for different purposes, both for libraries and for other institutions and authorities • selected indicators calculated "three times" (lack of an FTE student equivalent)

  46. Examples of performance indicators required to complete the questionnaire • library expenditure per student/user, • expenditures for library materials/books per student/user • ratio of library budget to the budget of its parent university • time required for the technical processing of a document • collection on the computer system as a % of the whole collection of the library • percent of catalogue descriptions acquired from outside resources

  47. Examples of performance indicators calculated automatically • Registered users as % of potential users • Total books per student/user • Books added per student/user • Number of students/users per one library staff member • Total library area per student/user • Number of students/users per one study place in reading rooms • Loans per registered user • Loans per library staff member • User services staff as % of total staff • Staff with higher LIS education as % of total staff • Open access printed books as % of total printed books
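Indicators of this kind are simple ratios of raw questionnaire fields, which is what makes automatic calculation possible. A minimal sketch of such a calculation; the field names, figures and indicator table are illustrative assumptions, not the project's actual schema:

```python
# Sketch: deriving ratio indicators automatically from raw questionnaire
# fields. All field names, figures and indicator definitions below are
# hypothetical examples, not the project's actual schema or data.

RAW = {
    "registered_users": 18000,
    "potential_users": 25000,
    "loans": 210000,
    "library_staff": 70,
    "user_services_staff": 42,
}

# Each indicator: (numerator field, denominator field, express as percent?)
INDICATORS = {
    "registered users as % of potential users": ("registered_users", "potential_users", True),
    "loans per registered user": ("loans", "registered_users", False),
    "loans per library staff member": ("loans", "library_staff", False),
    "user services staff as % of total staff": ("user_services_staff", "library_staff", True),
}

def calculate(raw, indicators):
    """Compute each indicator as a ratio of two raw fields."""
    results = {}
    for name, (num, den, as_percent) in indicators.items():
        value = raw[num] / raw[den]
        results[name] = round(value * 100, 1) if as_percent else round(value, 1)
    return results

for name, value in calculate(RAW, INDICATORS).items():
    print(f"{name}: {value}")
```

A table-driven definition like this keeps the indicator formulas in data rather than code, so new ratio indicators can be added without changing the calculation logic.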

  48. Software for the acquisition and analysis of data – requirements • on-line access to the questionnaire (submission, modification) • selected performance indicators automatically calculated and presented • automatic control and verification of the accuracy of data in the fields • multi-aspect comparative analysis of selected data and performance indicators • access to analysing functions for individual libraries • Internet website – information about the Project, a set of instructions, questionnaires, useful links, results of research • module for librarians – an on-line questionnaire, multi-aspect analysis of data concerning one's own library • administrator's module – registration of libraries and direct contacts • database – incorporates and registers data from the questionnaires (dynamic form) • module for the Group – statistical analyses of data and performance indicators
