
Assessment: Guiding Efforts & Documenting Success


Presentation Transcript


  1. Assessment: Guiding Efforts & Documenting Success Paul Beavers, Assessment Officer, Wayne State University

  2. Assessment and Evaluation • “Assessment” and “evaluation” are commonly used interchangeably • “Assessment” in academia is often restricted to the measurement of student learning outcomes • Libraries affect student learning outcomes, but our missions and goals entail a wider range of concerns

  3. The Task of Assessment • It is focused on the needs of those we serve • It measures the benefits for those we serve • It is based on our specific missions and goals • Its object is guiding improvement • It is distinct from evaluating staff or unit performance

  4. The Benefits of Assessment • It demonstrates our worth and commitment to continuous improvement to • Our community • Our “governors” • Our supporters (or potential supporters) • It allows us to see our own worth • It allows us to adapt to a changing environment • We can focus on current needs rather than traditional services • We can more clearly understand when and why we are succeeding

  5. Establishing Assessment Criteria • No project or service is all things to all people • The library’s strategic plan and/or the project plan supply the specifics • Who is being served? • What are the benefits to them? • How are the benefits to be measured? • What amount of use is expected? • What level of use is expected?

  6. Ask the Customer • Comment/Complaint Boxes • Conversations • Focus Groups • Surveys • Evaluation Forms

  7. Observe the Customer • An award-winning marketing campaign, web site, or building design is unimportant if it doesn't draw customers • What we ought to have vs. what our customers want • What our customers say they want vs. what they show they want • Use of our collections, print and electronic • Use of our facilities • When our services and facilities are actually used

  8. Statistics • Keep statistics that allow the quality of service to be inferred • The number of searches for items missing from the stacks • The time taken to retrieve an item from storage or move it from one branch to another • The number of circulations per month/year and the percentage of the collection that circulates • Place the statistics in context • Are the numbers rising or falling? • Are there ratios that make the numbers meaningful to the library staff, library management, our governors, our public? • Distinguish between ongoing statistics and special projects
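
A minimal sketch (in Python) of the kind of context this slide describes, turning raw counts into ratios and trends; the collection size and circulation figures below are invented for illustration and are not from the presentation.

    # Hypothetical figures; not actual library data
    collection_size = 120_000                              # titles held
    annual_circulations = {2006: 48_500, 2007: 44_200}     # total checkouts per year
    titles_circulated = 31_000                             # distinct titles checked out at least once in 2007

    # Ratios that make the raw counts meaningful
    turnover_rate = annual_circulations[2007] / collection_size   # circulations per title held
    pct_circulating = titles_circulated / collection_size         # share of the collection that circulates

    # Context: are the numbers rising or falling?
    change = (annual_circulations[2007] - annual_circulations[2006]) / annual_circulations[2006]

    print(f"Turnover: {turnover_rate:.2f} circulations per title held")
    print(f"Collection circulating: {pct_circulating:.0%}")
    print(f"Year-over-year change in circulation: {change:+.1%}")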

  9. Benchmarking • Keeping “standard” statistics and participating in standard surveys allows libraries to identify peer institutions • Management, “governors”, and potential supporters want to know how we measure up against peers • Benchmarking can help us formulate reasonable goals • Benchmarking can help us decide which programs and services to emulate • Statistics have made the library community aware of changes in our shared environment
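
As a rough sketch of how a peer comparison might look once standard statistics are in hand (the metric, the peer names, and all figures below are hypothetical, purely for illustration):

    from statistics import median

    # Hypothetical values of one benchmarked metric across peer institutions
    peer_values = {"Peer A": 310, "Peer B": 255, "Peer C": 420, "Peer D": 290, "Peer E": 365}
    our_value = 305

    peers_at_or_below = sum(v <= our_value for v in peer_values.values())
    print(f"Our value: {our_value}; peer median: {median(peer_values.values())}")
    print(f"We are at or above {peers_at_or_below} of {len(peer_values)} peers")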

  10. Share Data with the Public • Web pages tell our customers what we do and how well we do it • A Statistical Profile of the Wayne State University Libraries http://www.lib.wayne.edu/geninfo/about/stats/ • The Ann Arbor District Library Statistics http://www.aadl.org/aboutus/annualreport/statistics • PennLibrary Facts http://metrics.library.upenn.edu/FACTS07.pdf

  11. Share Data with the Staff • Staff at all levels want to know the results of our assessments and statistics gathering • University of Washington Triennial Surveys http://www.lib.washington.edu/assessment/surveys/survey2007/ • Clear assessment strategies will increase staff buy-in to the planning and assessment process • IUPUI Planning and Assessment http://www.ulib.iupui.edu/prod/portfolio/plan/ • Measurable outcomes should be a part of project planning and team charges

  12. Decision Making Should Be Data-Driven • University of Virginia Management Information Systems is a strong example http://www.lib.virginia.edu/mis/ • It has created a “management dashboard” through the Balanced Scorecard http://www.lib.virginia.edu/bsc/overview.html • It has well-defined targets for its operations http://www.lib.virginia.edu/bsc/metrics/all0607.html
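
A minimal sketch, not the University of Virginia's actual system, of what "well-defined targets" can mean in practice: each measured result is checked against its target and flagged. All metric names and numbers below are hypothetical.

    # Hypothetical metrics, targets, and results (invented for illustration)
    targets = {
        "avg_storage_retrieval_hours": 24.0,    # at most 24 hours
        "instruction_sessions_per_term": 60,    # at least 60 sessions
        "survey_satisfaction_score": 4.0,       # at least 4.0 on a 5-point scale
    }
    actuals = {
        "avg_storage_retrieval_hours": 30.5,
        "instruction_sessions_per_term": 72,
        "survey_satisfaction_score": 3.8,
    }
    lower_is_better = {"avg_storage_retrieval_hours"}   # only retrieval time is "lower is better"

    for metric, target in targets.items():
        actual = actuals[metric]
        met = actual <= target if metric in lower_is_better else actual >= target
        print(f"{metric}: actual {actual} vs. target {target} -> {'on target' if met else 'needs attention'}")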

  13. Thanks Paul Beavers, paul.beavers@wayne.edu
