Library Services Assessment

Isla Jordan, Carleton University

Julie McKenna, University of Regina

February 2, 2007

OLA Super Conference 2007: Session 1408

Outline

  • Definition and Purpose
  • Survey of Assessment Practices
  • Types of Assessment
  • Benchmarks, Standards and EBL
  • Drivers of Assessment
  • Tools and Techniques
  • Assessment strategy
  • Questions

… a critical tool for understanding library customers and offering services, spaces, collections, and tools that best meet their needs. Without good assessment, libraries could lose touch with users’ desires and needs and even become irrelevant.

Nardini (2001)


…any activities that seek to measure the library’s impact on teaching, learning and research as well as initiatives that seek to identify user needs or gauge user satisfaction or perceptions with the overall goal being the data-based and user-centered continuous improvement of our collections and services.

Pam Ryan

The purpose of assessment in libraries
  • To understand user interaction with library resources and services; and
  • To capture data that inform the planning, management and implementation of library resources and services.

Bertot, 2004

Survey of Assessment Practices - Purpose
  • Benchmark services assessment practice
  • Capture some measures about the culture of assessment
Survey Sections
  • Demographic Information
  • Assessment Planning
  • Involvement in Assessment in Organization
  • Collection and Use of Data to Inform Decision-Making
  • Final Comments
Survey Participants
  • Invitation to complete a web-based survey to all University Librarians of:
    • Council of Prairie and Pacific University Libraries (COPPUL)
    • Ontario Council of University Libraries (OCUL)
    • Council of Atlantic University Libraries (CAUL/AUBO)
  • Invitation (February 12, 2007) to complete a French edition of the web-based survey:
    • members of Conférence des recteurs et des principaux des universités du Québec (CREPUQ)
Survey of Assessment Practices
  • English Survey
    • 60 invitations; 39 respondents
    • 65% response rate
  • French Survey
    • To launch February 12, 2007
Thank you to …

  • University of Toronto
  • Queen’s University
  • University of Windsor
  • York University
  • University of Guelph
  • Nipissing University
  • University of Waterloo
  • Carleton University
  • Brock University
  • Memorial University
  • University of Saskatchewan
  • University of Alberta
  • And many more….
Types of Assessment
  • Input & Output
  • Service Quality
  • Performance Measures
  • Outcomes or Impact
1. Input & Output
  • Input measures: expenditures & resources
    • Funding allocations, # of registered students, print holdings, etc.
  • Output measures: activities & service traffic
    • Reference transactions, lending and borrowing transactions, # of instruction sessions, program attendance, etc.
  • Ratios
    • Students/librarians, print volume holdings/student, reference transactions/student, etc.
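These ratio measures are simple arithmetic over the raw input and output counts; a minimal Python sketch, with all figures invented for illustration:

```python
# Deriving the ratio measures above from raw input/output counts.
# All figures are hypothetical examples, not survey data.
inputs = {"registered_students": 24000, "librarians": 40,
          "print_volumes": 1_800_000}
outputs = {"reference_transactions": 52000}

students_per_librarian = inputs["registered_students"] / inputs["librarians"]
volumes_per_student = inputs["print_volumes"] / inputs["registered_students"]
ref_per_student = outputs["reference_transactions"] / inputs["registered_students"]

print(f"Students per librarian: {students_per_librarian:.0f}")
print(f"Print volumes per student: {volumes_per_student:.1f}")
print(f"Reference transactions per student: {ref_per_student:.2f}")
```

The same counts feed different ratios, which is why consistent definitions of the raw measures matter more than the arithmetic.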
Survey Results: How output data is used

Type of data:
  • Gate count
  • Body counts
  • Reference transactions
  • Circulation statistics

Used to inform:
  • Staffing & scheduling
  • Service points
  • Collection decisions
2. Service Quality
  • Services defined as all programs, activities, facilities, events, …
  • Measures capture results from interactions with services
  • Subjective evaluation of “customer service”
  • Measure of the affective relationship
(Zeithaml, Parasuraman and Berry 1990)

“The only criteria that count in evaluating service quality are defined by customers. Only customers judge quality; all other judgments are essentially irrelevant.”

  • Association of Research Libraries
    • Standard for service quality assessment (2003)
  • Total market survey
    • Based on Gap Analysis Theory
      • User perceptions and expectations of services
    • Measures outcomes and impacts
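Gap analysis here means comparing three ratings that respondents give each survey item (in LibQUAL+, on a 1-9 scale): the minimum acceptable, the desired, and the perceived level of service. A minimal sketch with invented ratings:

```python
# Sketch of the gap scores used in LibQUAL+-style analysis.
# Each tuple is one respondent's (minimum, desired, perceived)
# rating for a single item; the ratings below are invented.
responses = [
    (5, 8, 7),
    (6, 9, 6),
    (4, 7, 5),
]

n = len(responses)
mean_min = sum(r[0] for r in responses) / n
mean_des = sum(r[1] for r in responses) / n
mean_per = sum(r[2] for r in responses) / n

adequacy_gap = mean_per - mean_min      # positive: service exceeds the minimum
superiority_gap = mean_per - mean_des   # usually negative: below the desired level

print(f"Service adequacy gap:    {adequacy_gap:+.2f}")
print(f"Service superiority gap: {superiority_gap:+.2f}")
```

A negative adequacy gap (perceived below minimum) is the usual signal that a service needs attention.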
3. Performance Measures

Involves the use of efficiency and effectiveness measures

  • Availability of resources
  • Usability of programs, resources and services
  • Web page analysis
  • Content analysis
  • Functionality analysis
  • Cost analysis
4. Outcomes or Impacts

“the ways in which library users are changed as a result of their interaction with the Library's resources and programs”

Association of College & Research Libraries Task Force on Academic Library Outcomes Assessment Report, 1998

  • The electronic journals were used by 65 scholars in the successful pursuit of a total of $1.7 million in research grants in 2004.
  • In a 2003 study, eighty-five percent of new faculty reported that library collections were a key factor in their recruitment.
LibQUAL+ Measures Outcomes
  • The library helps me stay abreast of developments in my field(s) of interest.
  • The library aids my advancement in my academic discipline.
  • The library enables me to be more efficient in my academic pursuits.
  • The library helps me distinguish between trustworthy and untrustworthy information.
Benchmarks, Standards and EBL
  • Standards: “Measures that tie the value of libraries more closely to the benefits they create for their users”

NISO 2001 (National Information Standards Organization)

  • Benchmarking: improving ourselves by learning from others (UK Public Sector Benchmarking Service)
  • EBL (Evidence Based Librarianship): “attempts to integrate user reported, practitioner-observed and research-derived evidence as an explicit basis for decision-making.”

(Booth, “Counting What Counts” 2006)

Example of a Standard

Example: Information Literacy Standards for Science and Engineering Technology (ACRL 2006)

Standard #1: The information literate student determines the nature and extent of the information needed.

Performance Indicator #3: The information literate student has a working knowledge of the literature of the field and how it is produced.

Outcome #a: ... student knows how scientific, technical, and related information is formally and informally produced, organized, and disseminated.

CACUL Standards Committee
  • Goals:
    • Add Canadian context to existing standards in college and university libraries, e.g. ACRL
    • prepare report for CACUL AGM at CLA 2007
    • form new team in summer 2007

Contact: Jennifer Soutter

Survey Results: Drivers of Assessment

  • University Library Administration: 92%
  • Need for evidence to inform planning: 87%
  • University Administration: 62%
  • CARL, ARL or regional library consortium: 54%

Multiple Methods of Listening to Customers
  • Transactional surveys
  • Mystery shopping
  • New, declining, and lost-customer surveys
  • Focus group interviews
  • Customer advisory panels
  • Service reviews
  • Customer complaint, comment, and inquiry capture
  • Total market surveys
  • Employee field reporting
  • Employee surveys
  • Service operating data capture

Note. A. Parasuraman. The SERVQUAL Model: Its Evolution And Current Status. (2000). Paper presented at ARL Symposium on Measuring Service Quality, Washington, D.C.

Canadian Adoption of LibQUAL+: Benefits
  • Quick, inexpensive
  • Standardized and tested instrument and practice
  • Data set of comparables for Canada
    • Insight into best practices at peer institutions
  • Build staff expertise and encourage evidence based practice and practitioners
  • Opportunity to introduce Canadian changes to instrument
User Surveys: LibSAT, LibPAS
  • continuous customer feedback
  • LibSAT measures satisfaction
  • LibPAS (beta) measures performance

Usability testing
  • gives user perspective
  • often for website design:
      • e.g. “user driven web portal design” (U Toronto 2006)
  • also for physical space:
      • e.g. “wayfinding” in the library
Instruction Program Example: Assessment Methods
  • Learning outcomes
    • Student performance on examinations, assignments
    • Pre- and post-test results
    • Level of "information literacy"
  • Program service measures (outputs)
    • # of instruction sessions offered, requests for course specific support, # of session attendees, by discipline, by faculty member, by course, logins to library-created online tutorials, # of course pages created within university’s learning portal, etc.
  • Student course evaluations & peer evaluations
    • Qualitative and quantitative
  • Service quality assessment
    • LibQUAL+ (gap between expectations and perceptions)
  • Use patterns
    • laptop loans, GIS over paper maps, eBooks…
  • Space usage studies
    • e.g. Learning Commons study (University of Massachusetts Amherst)
  • Instruction and Information Literacy
    • e.g. use of online learning modules
Electronic resources assessment
  • statistics not being systematically captured for digital collections or services
  • need for standard measures for use of digital collections is increasingly important:
    • to justify huge expenses of electronic collections
    • decline in use of traditional services (reference, ILL)

COUNTER: Real-time acquisition of usage statistics:

  • imports usage statistics from content vendors in a uniform format (COUNTER - Counting Online Usage of Networked Electronic Resources)
  • reduces need to retrieve statistical data on a resource-by-resource basis
  • can compare usage statistics with cost information to evaluate service benefits of e-resources
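As a sketch of that last point, cost information can be joined to uniform usage counts to get a cost-per-use figure. The CSV layout below is a simplified stand-in for a real COUNTER JR1 report, and all titles, counts, and costs are hypothetical:

```python
import csv
import io

# Joining COUNTER-style full-text request counts to subscription costs
# to get cost-per-use. The CSV is a simplified stand-in for a JR1 report;
# all titles, counts, and prices are invented.
jr1 = io.StringIO(
    "Journal,Jan,Feb,Mar\n"
    "Journal of Examples,120,95,140\n"
    "Annals of Placeholders,30,22,18\n"
)
annual_cost = {"Journal of Examples": 2400.0,
               "Annals of Placeholders": 1800.0}

cost_per_use = {}
for row in csv.DictReader(jr1):
    title = row["Journal"]
    uses = sum(int(row[month]) for month in ("Jan", "Feb", "Mar"))
    cost_per_use[title] = round(annual_cost[title] / uses, 2)

for title, cpu in cost_per_use.items():
    print(f"{title}: ${cpu:.2f} per full-text request")
```

Because COUNTER reports arrive in one format across vendors, the same small script works for every supplier instead of one workflow per resource.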
  • Output statistics for ScholarsPortal databases and e-journals, e.g.
      • the number of requests for articles
      • holdings of different aggregators, to see overlap
      • Web logs, to see patterns of use
Survey Results: Electronic resources assessment
  • "we are gathering e-resources stats as part of an overall journal review "
  • “The Library is currently reviewing Scholarly Statistics, a product designed to gather and present for analysis e-resource statistics. Also under consideration is an ERM which, along with its other capabilities, will provide statistic analysis.”

“I have been busy this week with the compilation of electronic journal usage statistics for ARL. To complete Section 15 (Number of successful full-text article requests) in the Supplementary Statistics section, I am limiting myself to COUNTER-compliant JR1 statistics provided by the publisher. Still, I am encountering unexpected complexities. … The JR1 format is based on the calendar year, but the ARL statistics are reported on the budget year. This means for every publisher I have to compile two years’ worth of data and manipulate it.”
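The calendar-year / budget-year mismatch described in that comment can be illustrated in a few lines: two calendar years of monthly JR1-style counts must be compiled and spliced to cover one budget year. The May-April fiscal year and all counts below are hypothetical:

```python
# Two calendar years of monthly full-text request counts (Jan..Dec),
# as JR1-style reports would deliver them. All figures are invented.
counts_2005 = dict(zip(range(1, 13),
                       [80, 75, 90, 85, 100, 60, 40, 45, 110, 120, 105, 70]))
counts_2006 = dict(zip(range(1, 13),
                       [85, 80, 95, 90, 105, 65, 45, 50, 115, 125, 110, 75]))

# Hypothetical budget year: May 2005 through April 2006, so data from
# both calendar-year reports must be combined for a single total.
budget_year_total = (sum(counts_2005[m] for m in range(5, 13)) +
                     sum(counts_2006[m] for m in range(1, 5)))
print("Full-text requests, budget year 2005/06:", budget_year_total)
```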

Surveys, Interviews, Focus Groups
  • Surveys
    • quick to implement, difficult to design
    • identify issues, pick up anomalies
    • wording is critical
    • test, test, test ….
    • users are often over-surveyed
  • Interviews and focus groups
    • more scope for follow-up, explanation
    • subjective, time-consuming
Survey Results: Top 5 planned assessment studies
  • User satisfaction survey / LibQUAL
  • Gate traffic study
  • Electronic database use
  • Electronic journal use
  • Usability of the website
Survey Results: Staff Abilities

  • Formal presentations
  • Formal reports
  • Draw conclusions
  • Make recommendations
  • Project management
  • Facilitate focus groups
  • Research design
  • Focus group research
  • Survey design
  • Qualitative analysis
Challenges of assessment
  • Gathering meaningful data
  • Acquiring methodological skills
  • Managing assessment data
  • Organizing assessment as a core activity
  • Interpreting data within the context of user behaviours and constraints.

(Troll Covey, 2002)

Survey Results: Where is assessment placed?
  • Assessment Librarian (2 institutions)
  • Assessment Coordinator
  • Libraries Assessment and Statistics Coordinator
  • Library Assessment and Information Technology Projects Coordinator
  • Librarian, Evaluation & Analysis
  • Manager, Evaluation & Analysis
Survey Results: Who else is assigned assessment responsibility?
  • distributed to all unit heads or team leaders (4)
  • AULs have responsibility (6)
  • UL or Director (3)
  • administrative or executive officer (4)
  • access services or circulation (3)
  • other positions (12)
Survey Results: Committees
  • Assessment Committee
  • Priorities and Resources Committee
  • Statistics Committee
  • LibQual Committee
  • LibQUAL+ Working Group
  • Library Services Assessment Committee
  • Community Needs Assessment Committee
  • PR/Communications Committee
  • Accreditation Self-Study Steering Committee
  • Senior Management Group
  • Cooperative Planning Team
Services Assessment Strategy

“The evaluation environment is increasingly complex, and requires knowledge of multiple evaluation frameworks, methodologies, data analysis techniques, and communication skills”

Note. J.T. Snead et al. Developing Best-Fit Evaluation Strategies. (2006). Paper presented at Library Assessment Conference, Virginia.

Assessment – Continuing Commitment

[Diagram: continuous assessment cycle beginning with a research question]
Services Assessment Strategy
  • Decide what you need to know and why
    • Assign priorities
    • Confirm timelines
  • Commit to and carry out methodologies for discovery
  • Analysis and reporting
  • Continuous assessment and reporting commitment
Culture of Assessment
  • is an organizational environment in which decisions are based on facts, research and analysis
  • where services are planned and delivered in ways that maximize positive outcomes and impacts for customers and stakeholders
  • exists in organizations where staff care to know what results they produce and how those results relate to customers’ expectations
  • organizational mission, values, structures, and systems support behavior that is performance and learning focused.

(Lakos, Phipps and Wilson, 1998-2002)



ARL:

  • ARL New Measures website (background info)
  • Canadian LibQUAL consortium
    • summer 2007 workshop
    • Sam Kalb
  • Service Quality Evaluation Academy (“boot camp”)

ARL (cont’d):

  • ARL visit: “Making Library Assessment Work”
    • 1½-day visit from Steve Hiller and Jim Self
    • pre-visit survey, presentation to staff, interviews, meetings, written report
    • UWO participated - for more information, contact Margaret Martin Gardiner
  • 2006 Library Assessment Conference

Assessment blog:

Journals, conferences:

  • Performance Measurement and Metrics
  • Evidence Based Library and Information Practice
  • Northumbria International Conference on Performance Measures

Books & Papers:

  • Blecic, D.D., Fiscella, J.B. and Wiberley, S.E., Jr. (2007) Measurement of Use of Electronic Resources: Advances in Use Statistics and Innovations in Resource Functionality. College & Research Libraries, 68(1), 26-44.
  • Booth, A. (2006) Counting what counts: performance measurement and evidence-based practice. Performance Measurement and Metrics, 7(2), 63-74.
  • Brophy, P. (2006) Measuring Library Performance: principles and techniques. London, Facet Publishing.
  • Bertot, J.C. et al. (2004) Functionality, usability, and accessibility: Iterative user-centered evaluation strategies for digital libraries. Performance Measurement and Metrics, 7(1), 17-28.
  • Brekke, E. (1994) User surveys in ARL libraries. SPEC Kit 205. Chicago, American Library Association.
  • Covey, D.T. (2002) Academic library assessment: new duties and dilemmas. New Library World, 103(1175/1176), 156-164.
  • Lakos, A., Phipps, S. and Wilson, B. (1998-2000) Defining a “Culture of Assessment”.
  • Nardini, H.G. (2001) Building a Culture of Assessment. ARL Bimonthly Report, 218 (Oct 2001).
  • Snead, J.T. et al. (2006) Developing Best-Fit Evaluation Strategies. Library Assessment Conference, Virginia.
  • Zeithaml, V.A., Parasuraman, A. and Berry, L.L. (1990) Delivering Quality Service: balancing customer perceptions and expectations. London, Collier Macmillan.

Thank you!

Questions or comments are welcome

Contact us:

Isla Jordan, Carleton University

Julie McKenna, University of Regina