LibQUAL+® & Beyond: Applying Your Survey Results & Other Performance Measures in Library Practice

LibQUAL+® Canada Workshop

October 24-25, 2007

Ottawa, Ontario, Canada

Martha Kyrillidou, Director, Statistics and Service Quality Programs, ARL

Steve Hiller, Director, Assessment and Planning, UW

Jim Self, Director, Management Information Services, UVA



Martha Kyrillidou, Director

Statistics and Service Quality Programs

Association of Research Libraries


What’s in a “Library”?

A word is not crystal, transparent and unchanged; it is the skin of a living thought, and may vary greatly in color and content according to the circumstances and time in which it is used.

--Justice Oliver Wendell Holmes


What’s in a word?

What makes a quality library?

“Quality, much like beauty, is in the eye of the beholder”


Library Assessment and its Global Dimensions

  • Markets and people exposed to economic and social frameworks unheard of before

  • Competing internationally

  • Library users exposed to global forces

  • Libraries facing similar challenges

  • Libraries as the Internet

  • Libraries as Google

  • Libraries as Collaborative Spaces


Library Assessment

Library assessment provides a structured process to learn about our communities, their work, and the libraries’ connection to what they do.

The information acquired through library assessment is used in an iterative manner to improve library programs and services and make our libraries responsive to the needs of our communities.

Academic libraries do not exist in a vacuum but are part of a larger institution. Assessment within the institution may take place in individual areas as well as at the broad institutional level.


Thinking Strategically About Library Futures: Some Assessment-Related Questions

  • What is the central work of the library and how can we do more, differently, and at less cost?

  • What important services does the library provide that others can’t?

  • What advantages does the research library possess?

  • How is customer behavior changing?

  • How do we add value to our customers’ work?

  • What are the essential factors responsible for library success now and in the future?


Free Speech Wall, Charlottesville, Sept 2006


IFLA: Measuring Quality

  • Resources, infrastructure: What services does the library offer?

  • Use: How are the services accepted?

  • Efficiency: Are the services offered cost-effectively?

  • Potentials and Development: Are there sufficient potentials for future development?


Assessment at ARL

  • A gateway to assessment tools: StatsQUAL®:

    • ARL Statistics -- E-Metrics

    • LibQUAL+®

    • DigiQUAL®

    • MINES for Libraries®

  • Library Assessment Conferences

  • Service Quality Evaluation Academy

  • Library Assessment Blog

  • Making Library Assessment Work

  • ESP Assessment

    • Effective, Sustainable, Practical


Assessment at CARL


Assessment at SCONUL


Assessment at CAUL


Assessing the Value of Networked Electronic Services

The MINES survey

Measuring the Impact of Networked Electronic Services (MINES) - MINES for Libraries®


What Are We Measuring? Reviewing the ARL Statistics

In October 2005, the ARL Board approved a study to:

  • Determine if there are new ways of describing research library collections.

    • What is it we are currently measuring?

    • Are they the right data?

    • Develop alternative models

  • Develop a profile of the characteristics of a contemporary research library

  • Determine/develop new meaningful measures to augment current ones to support this profile


Quantitative Stats (Per Bruce Thompson)

  • Expenditure Focused Index (EFI)

  • Current ARL stats that could be used for benchmarking

    • Collections

    • User interactions

      • # Participants in group presentations

      • # Presentations to library groups

      • # Reference transactions

    • Collaborative Activities - Interlibrary loan activities

      • Borrowed total items

      • Loaned total items

  • Set of statistics related to the digital library (from ARL supplementary statistics)


Qualitative Profile: Developing New Metrics (per Yvonna Lincoln)

  • Uniqueness of collections

  • Defining the value of consortia

  • Administrative and budgetary efficiencies

  • Student outcomes/student learning/graduate success

  • Contributions to faculty productivity

  • Social frameworks/intellectual networks

  • Generating new knowledge

  • Creating the collective good with reusable assets


What Makes a Research Library?

  • Breadth and quality of collections and services

  • Sustained institutional commitment to the library

  • Distinctive resources in a variety of media

  • Services to the scholarly community

  • Preservation of research resources

  • Contributions of staff to the profession

  • Effective and innovative use of technology

  • Engagement of the library in academic planning

    Association of Research Libraries ‘Principles of Membership’


Group Discussion

  • How do you go about developing a profile that is succinct and rich?

  • Other important areas that should be part of a qualitative profile?

  • Can LibQUAL+® be used in the profiles?


Library of the Future Will Also Need . . .

. . . to have its own data collection and management personnel: individuals who constantly collect, analyze, and prepare reports on which services are being used, which portions of the collection get the highest usage, what materials are lent through interlibrary loan, and who the patrons are.

Documenting the libraries’ contributions to quality teaching, student outcomes, and research productivity will become critical.


Making Library Assessment Work

  • ARL project approved in 2004

  • Funded by participating libraries

  • Site visits by Steve and Jim

    • Presentation

    • Interviews and meetings

    • Report to the Library

  • 24 libraries in U.S. and Canada visited in 2005-06

  • Succeeded by Effective, Sustainable and Practical Library Assessment in 2007

    • Open to all libraries

    • 6 libraries participating in 2007


What We Found

  • Strong interest in using assessment to improve customer service and demonstrate value of library

  • Many libraries uncertain on how to establish, maintain, and sustain effective assessment

  • Effectiveness of assessment program not dependent on library size or budget

  • Each library has a unique culture and mission. No “one size fits all” approach works.

  • Strong customer focus and leadership support were the keys to developing effective and sustainable assessment


What Are the Lessons Learned?

  • Understanding changes in users’ approach to information resources.

  • Service quality improvement is a key factor.

  • Understanding the impact of e-resources on library services - TRL.

  • Learning how to compete with Google.

  • Upfront investment in design and development.

  • Making the assessment service affordable, practical, and effective.

  • Assessment needs to be satisfying and fun.


User Needs Assessment and Academic Library Performance

Steve Hiller

Director

Assessment and Planning

University of Washington Libraries


An “Aha” Moment

“[Access to online resources] has changed the way I do library research. It used to be a stage process: Initial trip, follow-up trip, fine-tuning trip. Now it’s a continuous interactive thing. I can follow-up anything at any time. While I’m writing I can keep going back and looking up items or verifying information.”

Graduate Student, Psychology (2002 UW Libraries focus group)


What Do We Need to Know About Our Customers?

  • Who are our customers (and potential customers)?

  • What are their teaching, learning, and research interests?

  • How do they work? What’s important to them?

  • How do they find information needed for their work?

  • How do they use library services? What would they change?

  • How do they differ from each other in library use/needs?

    How does the library add value to their work?

    How does the library contribute to their success?


How Do We Get Customer Information?

  • Surveys

  • Usage statistics

  • Focus groups

  • Observation

  • Usability

  • Interviews

  • Embedding

  • Data mining (local, institutional)

  • Logged activities


University of Washington (Site of the 2008 Library Assessment Conference!)

  • Located in beautiful Seattle (metro population 3.2 million)

  • Comprehensive public research university

    • 27,000 undergraduate students

    • 12,000 graduate and professional students (80 doctoral programs)

    • 4,000 research and teaching faculty

  • $800 million annually in federal research funds (2nd in U.S.)

  • Large research library system

    • $40 million annual budget

    • 150 librarians on 3 campuses


UW Libraries Assessment Priorities: Customer Needs, Use, and Success

  • Information seeking behavior and use

  • Patterns of library use

  • Value of library

  • User needs

  • Library contribution to customer success

  • User satisfaction with services, collections, overall

  • Data to make informed and wise decisions that lead to resources and services that contribute to user success


UW Libraries: Assessment Methods Used

  • Large scale user surveys every 3 years (“triennial survey”): 1992, 1995, 1998, 2001, 2004, 2007

    • All faculty

    • Samples of undergraduate and graduate students

    • Research scientists, Health Sciences fellow/residents 2004-

  • In-library use surveys every 3 years beginning 1993

  • LibQUAL+™ from 2000-2003

  • Focus groups/Interviews (annually since 1998)

  • Observation (guided and non-obtrusive)

  • Usability

  • Use statistics/data mining

    Information about assessment program available at:

    http://www.lib.washington.edu/assessment/


Our Latest Assessment Method


The Qualitative Provides the Key

  • Increasing use of qualitative methods such as comments, interviews, focus groups, usability, and observation

  • Statistics/quantitative data often can’t tell us

    • Who, how, why

    • Value, impact, outcomes

  • Qualitative provides information directly from users

    • Their language

    • Their issues

    • Their work

  • Qualitative provides understanding


Researchers and Libraries: 3 Recent Studies with Qualitative Focus

  • University of Minnesota

    • Extremely comfortable with electronic sources

    • Interdisciplinary work critical in sciences

    • Inadequate methods for organizing research materials

  • New York University

    • Researchers (all disciplines) no longer tied to physical library

    • Physical library can play a “community” role

    • Expectations for info shaped by Web and commercial sector

  • University of Washington (Biosciences)

    • Start info search outside library space (virtual and physical)

    • All digital all the time

    • Could not come up with “new library services” unprompted


Reasons for UW Libraries Biosciences Review

  • Better understand how bioscientists work

  • Growing inter/multi/trans disciplinary work

  • Significant change in use patterns

  • Libraries’ responsiveness to these changes

  • Value of research enterprise to the University

  • Strengthening library connection to research

    Ensuring our services and resources support the work of the biosciences community


Biosciences Review Process (2006)

  • Define scope (e.g. what is “bioscience”?)

  • Identify and mine existing data sources

    • Extensive library assessment data

    • Institutional and external data

  • Acquire new information through a customer-centered qualitative approach

    • Environmental scan

    • Interviews

    • Focus groups

    • Peer library surveys

      NO NEW USER SURVEYS


Biosciences Faculty Interview Themes

  • Library seen primarily as E-Journal provider

  • Physical library used only for items not available online

  • Start information search with Google and PubMed

  • Too busy for training, instruction, workshops

  • Faculty who teach undergrads use libraries differently

  • Could not come up with “new library services” unprompted


Biosciences Focus Group Themes

  • Content is primary link to the library

    • Identify library with ejournals; want more titles & backfiles

  • Provide library-related services and resources in our space, not yours

    • Discovery begins primarily outside of library space with Google and PubMed; Web of Science also important

    • Library services/tools seen as overly complex and fragmented

  • Print is dead, really dead

    • If not online, want digital delivery; too many libraries

    • Go to physical library only as last resort

  • Data and reference management important to some

    • Bioresearcher toolkit, EndNote, JabRef, Stata


Biosciences Task Force Recommendations

  • Integrate search/discovery tools into users’ workflow

  • Expand/improve information/service delivery options

  • Make physical libraries more inviting/easier to use

    • Consolidate libraries, collections and service points

    • Reduce print holdings; focus on services

  • Use an integrated approach to collection allocations

  • Get librarians to work outside library space

  • Lead/partner in scholarly communications & E-science

  • Provide more targeted communication and marketing


Biosciences Review Follow-up: 2007 Actions

  • Appointed a Director, Cyberinfrastructure Initiatives & Special Asst to the Univ Libr for Biosciences & E-Science

  • Libraries Strategic Plan priorities for 2007 include:

    • Improve discovery to delivery (WorldCat Local etc.)

    • Reshape our physical facilities as discovery and learning centers

    • Strengthen existing delivery services, both physical and digital, while developing new, more rapid delivery services

    • Enhance and strengthen the Libraries’ support for UW’s scientific research infrastructure

    • Do market research before developing & promoting services

  • Informed development of Libraries 2007 Triennial Survey


In God We Trust: All Others Must Bring Data

UW Triennial Survey 2007 – Selected Questions

  • Mode of access/physical library uses and users

  • Resource type importance

  • Sources consulted for research

  • Primary reasons for using Libraries Web sites

  • Information literacy

  • Libraries’ contribution to work and academic success

  • Useful library services (new and/or expanded)

  • Satisfaction


UW Triennial Library Survey: Number of Respondents and Response Rate 1992-2007



I only wish I could reproduce the graduate reading room in my home because I do so much of my reading/research online now. Oh well, at least I can be in my slippers.

Associate Professor, Psychology


Physical Library Users by Group for Selected Libraries (2005 In-Library Use Survey)

Undergrads 70%, Grads 25%, Faculty/Staff 5%


Physical Library Use by Academic Area (2005 In-Library Use Survey)


Off-Campus Remote Use 1998-2007 (Percentage using library services/collections at least 2x/week)


Importance of Books, Journals, Databases by Academic Area (2007, Faculty, Scale of 1 “Not Important” to 5 “Very Important”)


Overall Collections Satisfaction in Selected Hum/Soc Sci Colleges (2007, Faculty and Grads)


Sources Consulted for Information on Research Topics (2007, Scale of 1 “Not at All” to 5 “Usually”)

“If it’s not on the Internet, it doesn’t exist.” My students at all levels behave this way.

They also all rely on Wikipedia almost exclusively for basic information.

Associate Professor, English


Primary Reasons for Using Libraries Web Sites, 2007 Faculty (at least 2x per week)

The ability to access full-text or PDF research articles online through the library subscriptions is my primary use of the library and is central to my research. --Neurobiology Grad Student


Information Literacy: Importance to Undergrad Success & Rating Student Performance (% of Faculty marking 4 or 5 on scale of 1 “Low” to 5 “High” in 2007 Triennial Survey)

It is difficult to help students understand how to use sources, what the libraries can provide them, and help them appreciate the resources available to them beyond Google. Do you have any suggestions? --Assistant Professor, Art


Undergrad Rating of Usefulness (Mean score on scale of 1 “Not Useful” to 5 “Very Useful”)


Usefulness of New/Expanded Services: Faculty and Grad (% responding yes for each service)

You’re considering a free scanning service for journal articles? That would change my life!

Wow! I didn’t even know I could want that. Now I want that! --Post-Doc, Oceanography


Usefulness of New/Expanded Services: Undergrads (Physical Library Services in Red)

Odegaard needs a facelift. The lighting is terrible and the workspaces are old--not somewhere that you want to spend hours studying. I live in Suzzallo, however, and I love it. --Undergrad


Libraries’ Contribution to: (Scale of 1 “Minor” to 5 “Major”)

The UW libraries and librarians are the BEST. Our ability to access the system from the road (or home) and to review/download current articles is absolutely super. The resources on HealthLinks have helped train many young doctors and saved COUNTLESS lives. --Associate Professor, Medicine


2007 Triennial Survey Key Findings

  • Library satisfaction exceptionally high

  • Long-term changes in mode of use continue

    • Sharp increase in off-campus remote use by faculty/grad

    • Library as place still important to undergraduates

  • Open Internet gains as primary discovery medium

    • Library provided bibliographic databases decline in importance

  • Users want content delivered to them in their space & desired format

  • Faculty see information literacy as important to student success

    • Student performance in this area is rated low

    • Student evaluation of effectiveness is mixed

  • The Libraries is a major contributor to faculty research productivity and grad student academic success


What We’ve Learned About the UW Community

  • Libraries are still an important source of information; however, the library is less integrated into work/learn “flows”

  • Library needs/use patterns vary by and within academic areas and groups

  • Remote access is preferred method for faculty and grad students and has changed the way they use libraries

  • Faculty and students find information and use libraries differently than librarians would prefer

  • Library/information environment is perceived as too complex; users find simpler ways (Google) to get info

  • Customers cannot predict the Libraries future


How UW Libraries Has Used Assessment

  • Extend hours in Undergraduate Library

  • Create more diversified student learning spaces

  • Eliminate print copies of journals

  • Enhance usability of discovery tools and website

  • Provide standardized service training for all staff

  • Stop activities that do not add value

  • Consolidate and merge branch libraries

  • Change/reallocate collections allocations

  • Change/reallocate staffing

  • Support budget requests to University


Closing the Loop: Using Data Effectively in Management

  • Use multiple assessment methods

  • Focus on user work and how they find & use information

  • Increase reliance on qualitative info to identify issues from user perspective

  • Learn from our users

  • Partner with other campus programs/institutions

  • Mine/repurpose existing data

    Decisions based on data, not assumptions (“assumicide”)


Our Challenge: Maintain High Value and Satisfaction (UW Overall Satisfaction 1995-2007)

You guys and gals rock!!!!!! We need to invest in our library system to keep it the best system in America. The tops! My reputation is in large part due to you. --Professor, Forest Resources


Measures that Matter: Designing a Balanced Scorecard

Jim Self, Director,

Management Information Services, UVA


The University of Virginia

  • 14,000 undergraduates

    • 66% in-state, 34% out of state

    • Most notable for liberal arts

    • Highly ranked by U.S. News

  • 6,000 graduate students

    • Prominent for humanities, law, business

    • Recent expansion in sciences

  • Located in Charlottesville

    • Metro population of 160,000


The University Libraries

  • 5 million volumes

  • 15 libraries

  • 350 FTE staff

  • $35 million budget

  • Top 20 in ARL

  • 2005 ACRL Academic Library of the Year


U.Va. Library Innovations

  • Electronic Text Center -- 1992

  • Customer Surveys – 1993, 1994

  • LEO Faculty Delivery -- 1994

  • MIS unit – 1996

  • Library café -- 1998

  • Balanced Scorecard – 2002

  • Scholars’ Lab -- 2006


Management Information Services

  • MIS committee formed in 1992

  • Evolved into a department 1996-2000

  • Currently three staff

  • Coordinates collection of statistics

  • Publishes annual statistical report

  • Coordinates assessment

  • Resource for management and staff


Collecting the Data at U.Va.

  • Customer Surveys

  • Staff Surveys

  • Mining Existing Records

  • Comparisons with peers

  • Qualitative techniques


Corroboration

  • Data are more credible if they are supported by other information

  • John le Carré’s two proofs


U.Va. Customer Surveys

  • Faculty

    • 1993, 1996, 2000, 2004

    • Separate analysis for each academic unit

    • Response rates 59% to 70%

  • Students

    • 1994, 1998, 2001, 2005

    • Separate analysis for grads and undergrads

    • Undergrad response rates 43% to 50%

    • Grad response rates 54% to 63%

  • LibQUAL+® in 2006

    • Response rates 14% to 24%


Analyzing U.Va. Survey Results

  • Two Scores for Resources, Services, Facilities

    • Satisfaction = Mean Rating (1 to 5)

    • Visibility = Percentage Answering the Question

  • Permits comparison over time and among groups

  • Identifies areas that need more attention
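
As a sketch of how these two scores could be computed (the response data and function name below are hypothetical illustrations, not the actual U.Va. analysis; `None` marks a respondent who skipped the question):

```python
# Satisfaction = mean of the answered ratings (1-to-5 scale).
# Visibility  = share of respondents who answered the question at all.
# The response list is made-up illustration data; None = question skipped.

def item_scores(responses):
    answered = [r for r in responses if r is not None]
    satisfaction = sum(answered) / len(answered)   # mean rating, 1-5
    visibility = len(answered) / len(responses)    # fraction answering
    return satisfaction, visibility

responses = [5, 4, None, 4, 3, None, 5, 4]
sat, vis = item_scores(responses)
print(f"Satisfaction = {sat:.2f}, Visibility = {vis:.0%}")
```

Tracking both numbers over time and across groups is what permits the comparisons noted above: a service can be highly rated yet invisible, or widely used yet poorly rated.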



Constructing a Balanced Scorecard

  • Select a limited number of meaningful and measurable indicators for each dimension

  • Select targets for each indicator

  • Four dimensions:

    • User perspective

    • Internal processes perspective

    • Finance perspective

    • Future/growth perspective


Importance of Targets

  • Measure quantitatively

  • Set challenging, but achievable targets

  • Consider two sets of targets:

    • Complete success

    • Partial success

  • Aggregate results regularly to provide feedback

  • Address problems that are revealed
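
The two-tier target scheme can be sketched as a small rating helper (the function is my own illustration; the thresholds mirror two of the U.Va. metrics reported later, and the helper assumes higher values are better, so lower-is-better metrics such as unit cost would flip the comparison):

```python
# Rate a measured value against two tiers of targets:
# Target1 = complete success, Target2 = partial success.
# Thresholds below are taken from U.Va. metrics U.1.A and F.1.B.

def rate(value, target1, target2):
    if value >= target1:
        return "Target1"
    if value >= target2:
        return "Target2"
    return "Did not meet target"

# Metric U.1.A: graduate students' overall survey rating was 4.08
print(rate(4.08, target1=4.25, target2=4.00))

# Metric F.1.B: library share of academic-division spending was 2.71%
print(rate(2.71, target1=2.50, target2=2.25))
```

Run against those two figures, the helper reproduces the FY07 outcomes reported below: Target2 for the survey rating and Target1 for the spending share.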


The BSC at the U.Va. Library

  • Implemented in 2001

  • Results tallied FY02 through FY07

  • Completing metrics for FY08

  • Reporting results for FY07

  • A work in progress


Choosing the Metrics: Reflecting Values

  • What is important?

  • What are we trying to accomplish?


Choosing the Metrics: Diversity and Balance

  • Innovations and operations

  • Variety of measurements


Choosing the Metrics: Ensuring Validity

  • Does the measurement accurately reflect the reality?


Choosing the Metrics: Being Practical

  • Use existing measures when possible

  • Use sampling

  • Collect data centrally

  • Minimize work by front-line staff


What Do We Measure at U.Va.?

  • Customer survey ratings

  • Staff survey ratings

  • Timeliness and cost of service

  • Usability testing of web resources

  • Success in fund raising

  • Comparisons with peers


Reviewing the Perspectives

  • User

  • Internal Processes

  • Finance

  • Learning and Growth


Balanced Scorecard: U.Va. Fiscal Year 2007


Metric U.1.A: Overall rating in student and faculty surveys

  • Target1: An average score of at least 4.25 (out of 5.00) from each of the major constituencies.

  • Target2: A score of at least 4.00.

    FY07 Result: Target2

    • Graduate students 4.08

    • Undergraduates 4.11


Metric I.1.A: Processing time for routine acquisitions

  • Target1: Process 90% of in-print books from North America within one month.

  • Target2: Process 80% of in-print books from North America within one month.

  • Result FY07: Target1.

    • 94% processed within one month.


Metric I.2.A: Staff Rating of Internal Communications

  • Target1: Positive scores (4 or 5) on 80% of responses to internal communications statement in biennial work life survey.

  • Target2: Positive scores on 60% of responses.

  • Result FY07: Did not meet target.

    • 48% of responses were positive.


Metric F.1.B: Library spending compared to University expenditures

  • Target1: The University Library will account for at least 2.50% of the University’s academic division expenditures.

  • Target2: The Library will account for at least 2.25% of expenditures.

  • Result FY07: Target1.

    • 2.71% ($26.2M of $963M)


Metric F.1.C: Amount of unrestricted development receipts

  • Target1: Increase unrestricted (or minimally restricted) giving by 10% each year.

  • Target2: Increase of 5% per year.

  • Result FY07: Target1.

    • FY07 unrestricted receipts were $871,000; target was $411,000.


Metric F.2.A: Unit Cost of Electronic Serial Use

  • Target1: There should be no increase in unit cost each year.

  • Target2: Less than 5% annual increase in unit cost.

  • Result FY07: Target1.

    • Cost per journal article downloaded in FY07 was $1.98, compared to $2.10 in FY06.
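
As a worked check on this metric (only the two unit costs come from the slide; the year-over-year change is what decides which target tier applies):

```python
# Unit cost = e-serials spend per article downloaded.
# FY06 and FY07 unit costs are the figures reported on the slide;
# Target1 requires no increase, Target2 allows under 5% increase.

cost_fy06 = 2.10
cost_fy07 = 1.98
change = (cost_fy07 - cost_fy06) / cost_fy06
print(f"Unit-cost change: {change:+.1%}")  # negative: cost fell, so Target1 is met
```

A roughly 6% decrease in cost per download clears the no-increase threshold of Target1, consistent with the result reported above.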


Metric L.2.C: Comparing librarian salaries to peer groups

  • Target1: Average librarian salaries should rank in the top 40% of average salaries at ARL libraries.

  • Target2: Rank in top 50%.

  • Result FY07: Target1.

    • Ranked 33 of 113. (Top 28%)


Trying Your Hand at a Scorecard

  • Devise one or two metrics per dimension

    • Should be something that matters

    • How would you measure it?

    • How do you define success?


Two More Metrics from U.Va.

  • Representing values of the Library


Metric U.3.A: Circulation of new monographs

  • Target1: 60% of all newly cataloged print monographs should circulate within two years.

  • Target2: 50% should circulate within two years.

  • Result FY07: Target1.

    • 63% of monographs purchased in FY05 circulated within two years.


Metric U.4.B: Turnaround time for user requests

  • Target1: 90% of user requests for new books should be filled within 7 days.

  • Target2: 80% of user requests for new books should be filled within 7 days.

  • Result FY07: Did not meet target.

    • 77% filled within 7 days.


To summarize… The Balanced Scorecard

  • Reflects the organization’s vision

  • Clarifies and communicates the vision

  • Provides a quick, but comprehensive, picture of the organization’s health