
LibQUAL+® & Beyond: Applying Your Survey Results & Other Performance Measures in Library Practice

LibQUAL+® Canada Workshop

October 24-25, 2007

Ottawa, Ontario, Canada

Martha Kyrillidou, Director, Statistics and Service Quality Programs, ARL

Steve Hiller, Director, Assessment and Planning, UW

Jim Self, Director, Management and Information Services, UVA

Martha Kyrillidou, Director

Statistics and Service Quality Programs

Association of Research Libraries

What’s in a “Library”?

A word is not crystal, transparent and unchanged; it is the skin of a living thought, and may vary greatly in color and content according to the circumstances and time in which it is used.

--Justice Oliver Wendell Holmes

What’s in a word?

What makes a quality library?

“Quality much like beauty is in the eye of the beholder”

Library Assessment and its Global Dimensions

  • Markets and people exposed to economic and social frameworks unheard of before

  • Competing internationally

  • Library users exposed to global forces

  • Libraries facing similar challenges

  • Libraries as the Internet

  • Libraries as Google

  • Libraries as Collaborative Spaces

Library Assessment

Library assessment provides a structured process to learn about our communities, their work, and the library's connection to what they do

The information acquired through library assessment is used in an iterative manner to improve library programs and services and make our libraries responsive to the needs of our communities.

Academic libraries do not exist in a vacuum but are part of a larger institution. Assessment within the institution may take place in individual areas as well as at the broad institutional level.

Thinking Strategically About Library Futures: Some Assessment-Related Questions

  • What is the central work of the library and how can we do more, differently, and at less cost?

  • What important services does the library provide that others can’t?

  • What advantages does the research library possess?

  • How is customer behavior changing?

  • How do we add value to our customers' work?

  • What are the essential factors responsible for library success now and in the future?

Free speech wall, Charlottesville, Sept 2006

IFLA: Measuring Quality

  • Resources, infrastructure: What services does the library offer?

  • Use: How are the services accepted?

  • Efficiency: Are the services offered cost-effectively?

  • Potentials and Development: Are there sufficient potentials for future development?

Assessment at ARL

  • A gateway to assessment tools: StatsQUAL®:

    • ARL Statistics -- E-Metrics

    • LibQUAL+®

    • DigiQUAL®

    • MINES for Libraries®

  • Library Assessment Conferences

  • Service Quality Evaluation Academy

  • Library Assessment Blog

  • Making Library Assessment Work

  • ESP Assessment

    • Effective, Sustainable, Practical

Assessment at CARL

Assessment at SCONUL

Assessment at CAUL

Assessing the Value of Networked Electronic Services

The MINES survey

Measuring the Impact of Networked Electronic Services (MINES) - MINES for Libraries®

What Are We Measuring? Reviewing the ARL Statistics

In October 2005, the ARL Board approved a study to:

  • Determine if there are new ways of describing research library collections.

    • What is it we are currently measuring?

    • Are they the right data?

    • Develop alternative models

  • Develop a profile of the characteristics of a contemporary research library

  • Determine/develop new meaningful measures to augment current ones to support this profile

Quantitative Stats (per Bruce Thompson)

  • Expenditure Focused Index (EFI)

  • Current ARL stats that could be used for benchmarking

    • Collections

    • User interactions

      • # Participants in group presentations

      • # Presentations to library groups

      • # Reference transactions

    • Collaborative Activities - Interlibrary loan activities

      • Borrowed total items

      • Loaned total items

  • Set of statistics related to the digital library (from ARL supplementary statistics)

Qualitative Profile: Developing New Metrics (per Yvonna Lincoln)

  • Uniqueness of collections

  • Defining the value of consortia

  • Administrative and budgetary efficiencies

  • Student outcomes/student learning/graduate success

  • Contributions to faculty productivity

  • Social frameworks/intellectual networks

  • Generating new knowledge

  • Creating the collective good with reusable assets

What Makes a Research Library?

  • Breadth and quality of collections and services

  • Sustained institutional commitment to the library

  • Distinctive resources in a variety of media

  • Services to the scholarly community

  • Preservation of research resources

  • Contributions of staff to the profession

  • Effective and innovative use of technology

  • Engagement of the library in academic planning

    Association of Research Libraries ‘Principles of Membership’

Group discussion

  • How do you go about developing a profile that is succinct and rich?

  • Other important areas that should be part of a qualitative profile?

  • Can LibQUAL+® be used in the profiles?

Library of the Future Will Also Need . . .

. . . To have its own data collection and management personnel: individuals who constantly collect, analyze, and prepare reports on data regarding what services are being used, which portions of the collection are getting the highest usage, what materials are being lent through interlibrary loan, and who patrons are.

Documenting the library's contributions to quality teaching, student outcomes, and research productivity will become critical.

Making Library Assessment Work

  • ARL project approved in 2004

  • Funded by participating libraries

  • Site visits by Steve and Jim

    • Presentation

    • Interviews and meetings

    • Report to the Library

  • 24 libraries in U.S. and Canada visited in 2005-06

  • Succeeded by Effective, Sustainable and Practical Library Assessment in 2007

    • Open to all libraries

    • 6 libraries participating in 2007

What We Found

  • Strong interest in using assessment to improve customer service and demonstrate value of library

  • Many libraries uncertain on how to establish, maintain, and sustain effective assessment

  • Effectiveness of assessment program not dependent on library size or budget

  • Each library has a unique culture and mission. No “one size fits all” approach works.

  • Strong customer focus and leadership support were keys to developing an effective and sustainable assessment program

What are the lessons learned?

  • Understanding changes in users' approach to information resources.

  • Service quality improvement is a key factor.

  • Understanding the impact of e-resources on library services - TRL.

  • Learning how to compete with Google.

  • Upfront investment in design and development.

  • Making the assessment service affordable, practical, and effective.

  • Assessment needs to be satisfying and fun.

User Needs Assessment and Academic Library Performance

Steve Hiller


Assessment and Planning

University of Washington Libraries

An “Aha” Moment

“[Access to online resources] has changed the way I do library research. It used to be a stage process: Initial trip, follow-up trip, fine-tuning trip. Now it’s a continuous interactive thing. I can follow-up anything at any time. While I’m writing I can keep going back and looking up items or verifying information.”

Graduate Student, Psychology (2002 UW Libraries focus group)

What Do We Need to Know About Our Customers?

  • Who are our customers (and potential customers)?

  • What are their teaching, learning, and research interests?

  • How do they work? What’s important to them?

  • How do they find information needed for their work?

  • How do they use library services? What would they change?

  • How do they differ from each other in library use/needs?

    How does the library add value to their work?

    How does the library contribute to their success?

How Do We Get Customer Information?

  • Surveys

  • Usage statistics

  • Focus groups

  • Observation

  • Usability

  • Interviews

  • Embedding

  • Data mining (local, institutional)

  • Logged activities

University of Washington (Site of the 2008 Library Assessment Conference!)

  • Located in beautiful Seattle (metro population 3.2 million)

  • Comprehensive public research university

    • 27,000 undergraduate students

    • 12,000 graduate and professional students (80 doctoral programs)

    • 4,000 research and teaching faculty

  • $800 million annually in federal research funds (2nd in U.S.)

  • Large research library system

    • $40 million annual budget

    • 150 librarians on 3 campuses

UW Libraries Assessment Priorities: Customer Needs, Use, and Success

  • Information seeking behavior and use

  • Patterns of library use

  • Value of library

  • User needs

  • Library contribution to customer success

  • User satisfaction with services, collections, overall

  • Data to make informed and wise decisions that lead to resources and services that contribute to user success

UW Libraries: Assessment Methods Used

  • Large scale user surveys every 3 years (“triennial survey”): 1992, 1995, 1998, 2001, 2004, 2007

    • All faculty

    • Samples of undergraduate and graduate students

    • Research scientists, Health Sciences fellow/residents 2004-

  • In-library use surveys every 3 years beginning 1993

  • LibQUAL+™ from 2000-2003

  • Focus groups/Interviews (annually since 1998)

  • Observation (guided and non-obtrusive)

  • Usability

  • Use statistics/data mining

    Information about assessment program available at:

Our Latest Assessment Method

The Qualitative Provides the Key

  • Increasing use of such qualitative methods as comments, interviews, focus groups, usability, and observation

  • Statistics/quantitative data often can’t tell us

    • Who, how, why

    • Value, impact, outcomes

  • Qualitative provides information directly from users

    • Their language

    • Their issues

    • Their work

  • Qualitative provides understanding

Researchers and Libraries: 3 Recent Studies with Qualitative Focus

  • University of Minnesota

    • Extremely comfortable with electronic sources

    • Interdisciplinary work critical in sciences

    • Inadequate methods for organizing research materials

  • New York University

    • Researchers (all disciplines) no longer tied to physical library

    • Physical library can play a “community” role

    • Expectations for info shaped by Web and commercial sector

  • University of Washington (Biosciences)

    • Start info search outside library space (virtual and physical)

    • All digital all the time

    • Could not come up with “new library services” unprompted

Reasons for UW Libraries Biosciences Review

  • Better understand how bioscientists work

  • Growing inter/multi/trans disciplinary work

  • Significant change in use patterns

  • Libraries' responsiveness to these changes

  • Value of research enterprise to the University

  • Strengthening library connection to research

    Ensuring our services and resources support the work of the biosciences community

Biosciences Review Process (2006)

  • Define scope (e.g. what is “bioscience”?)

  • Identify and mine existing data sources

    • Extensive library assessment data

    • Institutional and external data

  • Acquire new information through a customer-centered qualitative approach

    • Environmental scan

    • Interviews

    • Focus groups

    • Peer library surveys


Biosciences Faculty Interview Themes

  • Library seen primarily as E-Journal provider

  • Physical library used only for items not available online

  • Start information search with Google and PubMed

  • Too busy for training, instruction, workshops

  • Faculty who teach undergrads use libraries differently

  • Could not come up with “new library services” unprompted

Biosciences Focus Group Themes

  • Content is primary link to the library

    • Identify library with ejournals; want more titles & backfiles

  • Provide library-related services and resources in our space, not yours

    • Discovery begins primarily outside of library space with Google and PubMed; Web of Science also important

    • Library services/tools seen as overly complex and fragmented

  • Print is dead, really dead

    • If not online, want digital delivery / too many libraries

    • Go to physical library only as last resort

  • Data and reference management important to some

    • Bioresearcher toolkit, EndNote, JabRef, Stata

Biosciences Task Force Recommendations

  • Integrate search/discovery tools into users' workflow

  • Expand/improve information/service delivery options

  • Make physical libraries more inviting/easier to use

    • Consolidate libraries, collections and service points

    • Reduce print holdings; focus on services

  • Use an integrated approach to collection allocations

  • Get librarians to work outside library space

  • Lead/partner in scholarly communications & E-science

  • Provide more targeted communication and marketing

Biosciences Review Follow-up : 2007 Actions

  • Appointed a Director, Cyberinfrastructure Initiatives & Special Asst to the Univ Libr for Biosciences & E-Science

  • Libraries Strategic Plan priorities for 2007 include:

    • Improve discovery to delivery (WorldCat Local etc.)

    • Reshape our physical facilities as discovery and learning centers

    • Strengthen existing delivery services, both physical and digital, while developing new, more rapid delivery services

    • Enhance and strengthen the Libraries' support for UW's scientific research infrastructure

    • Do market research before developing & promoting services

  • Informed development of Libraries 2007 Triennial Survey

In God We Trust: All Others Must Bring Data

UW Triennial Survey 2007 – Selected Questions

Mode of access/physical library uses and users

Resource type importance

Sources consulted for research

Primary reasons for using Libraries Web sites

Information literacy

Libraries' contribution to work and academic success

Useful library services (new and/or expanded)


UW Triennial Library Survey Number of Respondents and Response Rate 1992-2007

I only wish I could reproduce the graduate reading room in my home because I do so much of my reading/research online now. Oh well, at least I can be in my slippers.

Associate Professor, Psychology

Physical Library Users by Group for Selected Libraries (2005 In-Library Use Survey)

Undergrads 70%, Grads 25%, Faculty/Staff 5%

Physical Library Use by Academic Area(2005 In-Library Use Survey)

Off-Campus Remote Use 1998-2007 (Percentage using library services/collections at least 2x/week)

Importance of Books, Journals, Databases by Academic Area (2007, Faculty, Scale of 1 "not important" to 5 "very important")

Overall Collections Satisfaction in Selected Hum/Soc Sci Colleges (2007, Faculty and Grads)

Sources Consulted for Information on Research Topics (2007, Scale of 1 "Not at All" to 5 "Usually")

“If it’s not on the Internet, it doesn’t exist.” My students at all levels behave this way.

They also all rely on Wikipedia almost exclusively for basic information.

Associate Professor, English

Primary Reasons for Using Libraries Web Sites, 2007 Faculty (at least 2x per week)

The ability to access full-text or PDF research articles online through the library subscriptions is my primary use of the library and is central to my research. --Neurobiology Grad Student

Information Literacy: Importance to Undergrad Success & Rating Student Performance (% of Faculty marking 4 or 5 on scale of 1 "Low" to 5 "High" in 2007 Triennial Survey)

It is difficult to help students understand how to use sources, what the libraries can provide them, and help them appreciate the resources available to them beyond Google. Do you have any suggestions? --Assistant Professor, Art

Undergrad Rating of Usefulness (Mean score on scale of 1 “Not Useful” to 5 “Very Useful”)

Usefulness of New/Expanded Services Faculty and Grad (% responding yes for each service)

You’re considering a free scanning service for journal articles? That would change my life!

Wow! I didn’t even know I could want that. Now I want that! --Post-Doc, Oceanography

Usefulness of New/Expanded Services, Undergrads (Physical Library Services in Red)

Odegaard needs a facelift. The lighting is terrible and the workspaces are old--not somewhere that you want to spend hours studying. I live in Suzzallo, however, and I love it. --Undergrad

Libraries Contribution to: (Scale of 1 “Minor” to 5 “Major”)

The UW libraries and librarians are the BEST. Our ability to access the system from the road (or home) and to review/download current articles is absolutely super. The resources on HealthLinks have helped train many young doctors and saved COUNTLESS lives. --Associate Professor, Medicine

2007 Triennial Survey Key Findings

  • Library satisfaction exceptionally high

  • Long-term changes in mode of use continue

    • Sharp increase in off-campus remote use by faculty/grad

    • Library as place still important to undergraduates

  • Open Internet gains as primary discovery medium

    • Library provided bibliographic databases decline in importance

  • Users want content delivered to them in their space & desired format

  • Faculty see information literacy as important to student success

    • Student performance in this area is rated low

    • Student evaluation of effectiveness is mixed

  • Libraries are a major contributor to faculty research productivity and grad student academic success

What We’ve Learned about the UW Community

  • Libraries are still an important source of information; however, the library is less integrated into work/learn "flows"

  • Library needs/use patterns vary by and within academic areas and groups

  • Remote access is preferred method for faculty and grad students and has changed the way they use libraries

  • Faculty and students find information and use libraries differently than librarians prefer them to

  • Library/information environment is perceived as too complex; users find simpler ways (Google) to get info

  • Customers cannot predict the Libraries' future

How UW Libraries Has Used Assessment

  • Extend hours in Undergraduate Library

  • Create more diversified student learning spaces

  • Eliminate print copies of journals

  • Enhance usability of discovery tools and website

  • Provide standardized service training for all staff

  • Stop activities that do not add value

  • Consolidate and merge branch libraries

  • Change/reallocate collections allocations

  • Change/reallocate staffing

  • Support budget requests to University

Closing the Loop: Using Data Effectively in Management

  • Use multiple assessment methods

  • Focus on user work and how they find & use information

  • Increase reliance on qualitative info to identify issues from user perspective

  • Learn from our users

  • Partner with other campus programs/institutions

  • Mine/repurpose existing data

    Decisions based on data, not assumptions -- avoiding "assumicide"

Our Challenge: Maintain High Value and Satisfaction (UW Overall Satisfaction 1995-2007)

You guys and gals rock!!!!!! We need to invest in our library system to keep it the best system in America. The tops! My reputation is in large part due to you. --Professor, Forest Resources

Measures that Matter: Designing a Balanced Score Card

Jim Self, Director,

Management Information Services, UVA

The University of Virginia

  • 14,000 undergraduates

    • 66% in-state, 34% out of state

    • Most notable for liberal arts

    • Highly ranked by U.S. News

  • 6,000 graduate students

    • Prominent for humanities, law, business

    • Recent expansion in sciences

  • Located in Charlottesville

    • Metro population of 160,000

The University Libraries

  • 5 million volumes

  • 15 libraries

  • 350 FTE staff

  • $35 million budget

  • Top 20 in ARL

  • 2005 ACRL Academic Library of the Year

U.Va. Library Innovations

  • Electronic Text Center -- 1992

  • Customer Surveys – 1993, 1994

  • LEO Faculty Delivery -- 1994

  • MIS unit – 1996

  • Library café -- 1998

  • Balanced Scorecard – 2002

  • Scholars’ Lab -- 2006

Management Information Services

  • MIS committee formed in 1992

  • Evolved into a department 1996-2000

  • Currently three staff

  • Coordinates collection of statistics

  • Publishes annual statistical report

  • Coordinates assessment

  • Resource for management and staff

Collecting the Data at U.Va.

  • Customer Surveys

  • Staff Surveys

  • Mining Existing Records

  • Comparisons with peers

  • Qualitative techniques


  • Data are more credible if they are supported by other information

  • John le Carré’s two proofs

UVa Customer Surveys

  • Faculty

    • 1993, 1996, 2000, 2004

    • Separate analysis for each academic unit

    • Response rates 59% to 70%

  • Students

    • 1994, 1998, 2001, 2005

    • Separate analysis for grads and undergrads

    • Undergrad response rates 43% to 50%

    • Grad response rates 54% to 63%

  • LibQUAL+® in 2006

    • Response rates 14% to 24%

Analyzing U.Va. Survey Results

  • Two Scores for Resources, Services, Facilities

    • Satisfaction = Mean Rating (1 to 5)

    • Visibility = Percentage Answering the Question

  • Permits comparison over time and among groups

  • Identifies areas that need more attention
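The two survey scores above lend themselves to a simple computation. A minimal sketch in Python, assuming responses are stored per question with None marking respondents who skipped it (the ratings below are hypothetical):

```python
def satisfaction(responses):
    """Satisfaction = mean rating (1-5) among those who answered."""
    answered = [r for r in responses if r is not None]
    return sum(answered) / len(answered)

def visibility(responses):
    """Visibility = percentage of all respondents who answered the question."""
    answered = [r for r in responses if r is not None]
    return 100.0 * len(answered) / len(responses)

# Hypothetical ratings for one question; None = question left blank
ratings = [5, 4, None, 3, 5, None, 4, 4]
print(round(satisfaction(ratings), 2))  # 4.17
print(round(visibility(ratings), 1))    # 75.0
```

Tracking both numbers together distinguishes a service that is liked by the few who use it from one that is both liked and widely used.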

Reference Activity and Visibility in Student Surveys

Constructing a Balanced Scorecard

  • Select a limited number of meaningful and measurable indicators for each dimension

  • Select targets for each indicator

  • Four dimensions:

    • User perspective

    • Internal processes perspective

    • Finance perspective

    • Future/growth perspective

Importance of Targets

  • Measure quantitatively

  • Set challenging, but achievable targets

  • Consider two sets of targets:

    • Complete success

    • Partial success

  • Aggregate regularly to provide feedback

  • Address problems that are revealed
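The two-tier target scheme above reduces to a small scoring rule. A sketch in Python: the function name and the higher_is_better flag are illustrative, not part of the U.Va. implementation; the example values come from the FY07 results reported later in this presentation:

```python
def score_metric(value, target1, target2, higher_is_better=True):
    """Return 'Target1' (complete success), 'Target2' (partial), or a miss."""
    if not higher_is_better:
        # For metrics like unit cost, lower values are better; flip the sense.
        value, target1, target2 = -value, -target1, -target2
    if value >= target1:
        return "Target1"
    if value >= target2:
        return "Target2"
    return "Did not meet target"

# Metric U.1.A: overall survey rating (targets 4.25 / 4.00); grads scored 4.08
print(score_metric(4.08, 4.25, 4.00))  # Target2
# Metric I.1.A: % of books processed within a month (targets 90 / 80); 94%
print(score_metric(94, 90, 80))        # Target1
```

Aggregating such results regularly, as the slide suggests, turns the scorecard into a feedback loop rather than an annual report.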

The BSC at the U.Va. Library

  • Implemented in 2001

  • Results tallied FY02 through FY07

  • Completing metrics for FY08

  • Reporting results for FY07

  • A work in progress

Choosing the Metrics -- Reflecting Values

  • What is important?

  • What are we trying to accomplish?

Choosing the Metrics -- Diversity and Balance

  • Innovations and operations

  • Variety of measurements

Choosing the Metrics -- Ensuring Validity

  • Does the measurement accurately reflect the reality?

Choosing the Metrics -- Being Practical

  • Use existing measures when possible

  • Use sampling

  • Collect data centrally

  • Minimize work by front line

What Do We Measure at U.Va.?

  • Customer survey ratings

  • Staff survey ratings

  • Timeliness and cost of service

  • Usability testing of web resources

  • Success in fund raising

  • Comparisons with peers

Reviewing the Perspectives

  • User

  • Internal Processes

  • Finance

  • Learning and Growth

Balanced Scorecard, UVA Fiscal Year 2007

Metric U.1.A: Overall rating in student and faculty surveys

  • Target1: An average score of at least 4.25 (out of 5.00) from each of the major constituencies.

  • Target2: A score of at least 4.00.

    FY07 Result: Target2

    • Graduate students 4.08

    • Undergraduates 4.11

Metric I.1.A: Processing time for routine acquisitions

  • Target1: Process 90% of in-print books from North America within one month.

  • Target2: Process 80% of in-print books from North America within one month.

  • Result FY07: Target1.

    • 94% processed within one month.

Metric I.2.A: Staff Rating of Internal Communications

  • Target1: Positive scores (4 or 5) on 80% of responses to internal communications statement in biennial work life survey.

  • Target2: Positive scores on 60% of responses.

  • Result FY07: Did not meet target.

    • 48% of responses were positive.

Metric F.1.B. Library spending compared to University expenditures

  • Target1: The University Library will account for at least 2.50% of the University’s academic division expenditures.

  • Target2: The Library will account for at least 2.25% of expenditures.

  • Result FY07: Target1.

    • 2.71% ($26.2M of $963M)

Metric F.1.C: Amount of Unrestricted Development Receipts

  • Target1: Increase unrestricted (or minimally restricted) giving by 10% each year.

  • Target2: Increase of 5% per year.

  • Result FY07: Target1.

    • FY07 unrestricted receipts were $871,000; target was $411,000.

Metric F.2.A: Unit Cost of Electronic Serial Use

  • Target1: There should be no increase in unit cost each year.

  • Target2: Less than 5% annual increase in unit cost.

  • Result FY07: Target1.

    • Cost per journal article downloaded in FY07 was $1.98, compared to $2.10 in FY06.
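The unit-cost metric is total e-serial spending divided by article downloads, compared year over year. A worked sketch in Python: only the per-article costs ($2.10 in FY06, $1.98 in FY07) come from the slide; the spending and download totals below are hypothetical inputs chosen to reproduce them:

```python
def unit_cost(spending, downloads):
    # Cost per journal article downloaded
    return spending / downloads

fy06 = unit_cost(4_200_000, 2_000_000)  # hypothetical inputs -> $2.10
fy07 = unit_cost(4_158_000, 2_100_000)  # hypothetical inputs -> $1.98
change = (fy07 - fy06) / fy06 * 100
print(f"FY06 ${fy06:.2f}, FY07 ${fy07:.2f}, change {change:+.1f}%")
# FY06 $2.10, FY07 $1.98, change -5.7%
```

Since the cost fell rather than rose, the result meets Target1 (no increase).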

Metric L.2.C: Comparing Librarian Salaries to Peer Groups

  • Target1: Average librarian salaries should rank in the top 40% of average salaries at ARL libraries.

  • Target2: Rank in top 50%.

  • Result FY07: Target1.

    • Ranked 33 of 113. (Top 28%)

Trying your hand at a Scorecard

  • Devise one or two metrics per dimension

    • Should be something that matters

    • How would you measure it?

    • How do you define success?

Two more metrics from U.Va.

  • Representing values of the Library

Metric U.3.A: Circulation of new monographs

  • Target1: 60% of all newly cataloged print monographs should circulate within two years.

  • Target2: 50% should circulate within two years.

  • Result FY07: Target1.

    • 63% of monographs purchased in FY05 circulated within two years.

Metric U.4.B: Turnaround time for user requests

  • Target1: 90% of user requests for new books should be filled within 7 days.

  • Target2: 80% of user requests for new books should be filled within 7 days.

  • Result FY07: Did not meet target.

    • 77% filled within 7 days.

To summarize… The Balanced Scorecard

  • Reflects the organization’s vision

  • Clarifies and communicates the vision

  • Provides a quick, but comprehensive, picture of the organization’s health
