Assessment-based strategies for building connections with academic departments

This presentation discusses assessment-based strategies for building connections with academic departments. It covers the evaluation capacity-building benefits of process evaluation, ways to link assessment and outreach activities, and approaches libraries can apply to inform and engage stakeholders.

Presentation Transcript


  1. Assessment-based strategies for building connections with academic departments. 2008 Library Assessment Conference, August 5, 2008, Seattle, WA

  2. Goals (and evaluation criteria?) • Recognize the evaluation capacity building benefits of process evaluation • Gain insight into ways that assessment and outreach activities can be linked • Identify approaches that might be applied in your library to inform and engage stakeholders

  3. Presentation outline • Background, approach, context • Case study illustration: academic department reports • Findings • Implications

  4. Context and background • Perkins Library & Duke • University re-accreditation • Library strategic plan emphasis on assessment • Center for Instructional Technology • High visibility and reputation for leadership in assessment • Library public services and CIT shared a focus on, and common challenges with, outreach to and engagement with academic departments

  5. Utilization-focused evaluation • Engaging stakeholders in the entire evaluation process, from design to implementation of recommendations • Prioritize issues of greatest importance to those in a position to directly make use of findings • Reduce organizational culture barriers that inhibit use of results by increasing transparency and empowering stakeholders • Process use benefits

  6. Process use ‘Ways in which being engaged in the processes of evaluation can be useful quite apart from the findings that may emerge from these processes’ (Patton, 1997). Includes: • Organizational development, specifically evaluation capacity building • Increased capacity to make use of evaluation findings and knowledge of how to use evaluation information (Patton, 2004, “On Evaluation Use: Evaluative Thinking and Process Use”)

  7. Evaluation capacity building [Diagram: assessment & evaluation activities produce direct consequences (knowledge, use of findings) and process use benefits that build evaluation capacity (skills, evaluation knowledge & logic) and organizational learning capacity (a culture of experimentation). Adapted from Cousins, Goh, Clark & Lee (2004)]

  8. Case Study Academic department reporting • CIT department report experience • Internal evangelism for stakeholder-focused assessment • General heightened interest in assessment and effective use of data among library leadership • Institutional context

  9. Overview of the project • Data audit • Buy-in from leadership and key internal constituencies • Prototypes • Internal stakeholder review (multiple iterations) • Distribution and outreach • Assessment. Sample reports, cover letter: http://www.duke.edu/~ybelang/lac08

  10. Types of data included • Service descriptions & contact info • Courses receiving custom support (instruction sessions, web guides, e-reserves, digitization, etc.) • Funded/supported faculty projects • Faculty inquiries and consultations (anonymous, CIT only) • CMS use (Blackboard). Sample reports, cover letter: http://www.duke.edu/~ybelang/lac08
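
  As a rough sketch of how these data types might hang together for one department's report (not the presenters' actual schema; every field name below is a hypothetical assumption):

  from dataclasses import dataclass, field

  @dataclass
  class DepartmentReport:
      """One department's report payload, mirroring the data types above."""
      dept_code: str                                                   # e.g. "HIST" (hypothetical key)
      service_contacts: dict[str, str] = field(default_factory=dict)  # service -> contact info
      supported_courses: list[str] = field(default_factory=list)      # instruction, guides, e-reserves, ...
      funded_projects: list[str] = field(default_factory=list)        # funded/supported faculty projects
      consultation_count: int = 0                                     # anonymized faculty inquiries (CIT only)
      blackboard_course_count: int = 0                                # CMS (Blackboard) use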

  11. Offline data sources • Excel spreadsheets • XML files extracted from Aleph. Gory technical details: Business Intelligence software (Crystal Reports) to group, filter, summarize, and funnel static and dynamic content into a formatted template. Live server data connections • ODBC to MySQL database. Output: PDF or RTF reports, cover letters
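
  A minimal sketch of that group/filter/summarize/funnel step in Python rather than Crystal Reports, assuming a hypothetical ODBC DSN (library_dw), table (course_support), and column names:

  from pathlib import Path
  import pandas as pd
  import pyodbc

  Path("staging").mkdir(exist_ok=True)

  # Live server data: ODBC connection to the MySQL database (DSN is hypothetical).
  conn = pyodbc.connect("DSN=library_dw")

  # Dynamic content: one row per custom-support event (table/columns are assumptions).
  activity = pd.read_sql("SELECT dept_code, course, service FROM course_support", conn)

  # Offline data: a cleaned-up Excel spreadsheet, e.g. funded faculty projects.
  projects = pd.read_excel("funded_projects.xlsx")  # hypothetical file

  # Group and summarize: distinct courses supported, per department and service.
  summary = (
      activity.groupby(["dept_code", "service"])
      .agg(courses_supported=("course", "nunique"))
      .reset_index()
  )

  # Funnel: stage each department's slice for the report template.
  for dept, rows in summary.groupby("dept_code"):
      rows.to_csv(f"staging/{dept}_summary.csv", index=False)
      projects.loc[projects["dept_code"] == dept].to_csv(
          f"staging/{dept}_projects.csv", index=False
      )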

  12. Static content [Screenshot: back-end view of Crystal Reports] Subreports pull from multiple dynamic live databases and clean warehoused sources
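
  The static-template-plus-dynamic-subreport split, sketched with Python's standard library instead of Crystal Reports; the staging files come from the previous sketch, and the field names and wording are assumptions:

  import csv
  from string import Template

  # Static content shared by every department's report (wording is illustrative).
  STATIC_BLURB = "The library offers instruction sessions, web guides, and e-reserves."

  PAGE = Template(
      "Report for $dept\n"
      "$blurb\n\n"
      "Courses receiving custom support:\n$lines\n"
  )

  def render_report(dept: str) -> str:
      # "Subreport": pull this department's slice of the warehoused data.
      with open(f"staging/{dept}_summary.csv", newline="") as fh:
          rows = list(csv.DictReader(fh))
      lines = "\n".join(
          f"  {r['service']}: {r['courses_supported']} course(s)" for r in rows
      )
      return PAGE.substitute(dept=dept, blurb=STATIC_BLURB, lines=lines)

  print(render_report("HIST"))  # e.g. the History department (hypothetical code)

  From here the rendered text could be poured into the PDF or RTF reports and cover letters the previous slide mentions.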

  13. Key considerations • Who’s your audience? • A department chair’s point of view • What kind of reaction do you hope for? Fear? • What action do you want the reader to take? • Best and worst case scenarios, political considerations

  14. Major hurdles / milestones • Pervasive unit of reporting • Finding and implementing the right software, licensing costs • Figuring out what data of value exist, and who has it • Data clean-up and reformatting • Getting buy-in from multiple stakeholder groups with different perspectives • Managing those you don’t manage

  15. Lessons learned • Get input from different kinds of stakeholders early in the project • Patience and persistence • Look for opportunities to demonstrate value “Good, quick, cheap – choose 2”

  16. Works Cited • Cousins, J. B., Goh, S., Clark, S., & Lee, L. (2004). Integrating evaluative inquiry into the organizational culture: A review and synthesis of the knowledge. Canadian Journal of Program Evaluation, 19(2), 99-141. • Patton, M. Q. (2004). On evaluation use: Evaluative thinking and process use. The Evaluation Exchange, IX(4). • Patton, M. Q. (1997). Utilization-focused evaluation: The new century text (3rd ed.). Thousand Oaks, CA: Sage.

  17. Contact information Yvonne Belanger Head, Program Evaluation Academic Technology & Instructional Services Perkins Library, Duke University yvonne.belanger@duke.edu
