
Evaluating and Assessing Library Instruction



  1. Evaluating and Assessing Library Instruction Examples & Best Practices

  2. Presenters • Martin Garnar – Regis University • Rhonda Gonzales & Eleni Adrian – CSU-Pueblo • Lorrie Evans – Auraria Library • Patrick McCarthy - CSU

  3. Evaluating and Assessing Library Instruction Regis University Martin Garnar

  4. ACRL Guidelines for Evaluation and Assessment • program evaluation plan • criteria for evaluation • learning outcomes and assessment methods • coordination of assessment with teaching faculty • data gathering and analysis

  5. Evaluating and Assessing Library Instruction CSU-Pueblo Eleni Adrian & Rhonda Gonzales

  6. CSU-Pueblo Library Instruction Evaluation Overview • Librarians at CSU-Pueblo administered a brief evaluation form after each instruction session. • Collected these evaluations from 1999 to 2004. • Entered results into an Access database. • Data Clean-up • Statistical Analysis • Conclusions

  7. CSU-Pueblo Library Instruction Evaluation Evaluation Form Background questions Satisfaction questions Ranking question
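The three question types on the evaluation form can each be tallied separately. A minimal sketch of that tabulation, with hypothetical field names and records (the actual form fields are not given in the slides):

```python
from collections import Counter

# Hypothetical evaluation records: each form has a background answer
# ("status" = year in school), a Likert-style satisfaction answer,
# and a ranking answer (most important thing learned).
forms = [
    {"status": "Freshman", "start_research": 4, "ranked_top": "locate article"},
    {"status": "Freshman", "start_research": 3, "ranked_top": "use catalog"},
    {"status": "Senior",   "start_research": 4, "ranked_top": "locate article"},
]

# Tally each question type separately, as the results slides do.
status_counts = Counter(f["status"] for f in forms)
satisfaction_counts = Counter(f["start_research"] for f in forms)
ranking_counts = Counter(f["ranked_top"] for f in forms)

print(status_counts)        # distribution of year in school
print(satisfaction_counts)  # distribution of Likert responses
print(ranking_counts)       # most important thing learned
```

Per-question tallies like these feed directly into the background, satisfaction, and ranking results reported later.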

  8. CSU-Pueblo Library Instruction Evaluation Evaluation Forms Collected Fall 1999 through Summer 2004. 181 classes 2441 evaluation forms

  9. CSU-Pueblo Library Instruction Evaluation Access Database Entered several semesters of data Revealed errors in our database design New database structure New input forms

  10. CSU-Pueblo Library Instruction Evaluation Access Database

  11. CSU-Pueblo Library Instruction Evaluation Access Database

  12. CSU-Pueblo Library Instruction Evaluation Data Fixes: New Database Design Created a relationship between the Classes and Evaluations tables. Handled records with no answers for the background questions.
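The redesign described here is a standard one-to-many relationship: one Classes record per instruction session, many Evaluations records pointing back to it, with background-question columns allowed to be empty. A minimal sketch in SQL via sqlite3 (table and column names are assumptions; the original was an Access database):

```python
import sqlite3

# In-memory database standing in for the Access file.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE classes (
        class_id INTEGER PRIMARY KEY,
        semester TEXT NOT NULL,
        course   TEXT NOT NULL
    );
    CREATE TABLE evaluations (
        eval_id        INTEGER PRIMARY KEY,
        class_id       INTEGER NOT NULL REFERENCES classes(class_id),
        status         TEXT,     -- background questions may be left blank,
        start_research INTEGER   -- so these columns allow NULL
    );
""")
conn.execute("INSERT INTO classes VALUES (1, 'Fall 1999', 'ENG 101')")
conn.execute("INSERT INTO evaluations VALUES (1, 1, 'Freshman', 4)")
conn.execute("INSERT INTO evaluations VALUES (2, 1, NULL, 3)")  # blank background answer

# Join evaluations back to their class record.
rows = conn.execute("""
    SELECT c.course, COUNT(e.eval_id)
    FROM classes c JOIN evaluations e ON e.class_id = c.class_id
    GROUP BY c.class_id
""").fetchall()
print(rows)  # [('ENG 101', 2)]
```

Keeping class-level facts (semester, course) in their own table means they are entered once per session instead of once per form, which removes one common source of the data-entry errors discussed on the next slides.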

  13. CSU-Pueblo Library Instruction Evaluation Data Fixes: Form Changes Two questions added, two questions removed. Re-entered all old evaluation form data to match the final version. Double-checked that the form changes were accurately re-entered.

  14. CSU-Pueblo Library Instruction Evaluation Data Fixes: Data Integrity Duplicate Class records Duplicate and blank Evaluations records Typos in Status (year in school) General data entry errors
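The clean-up steps on this slide — dropping duplicate and blank evaluation records and correcting typos in the Status field — can be sketched in a few lines. The records and the typo map below are illustrative assumptions, not the actual CSU-Pueblo data:

```python
# Known misspellings of the Status (year in school) values.
STATUS_FIXES = {"freshmen": "Freshman", "senoir": "Senior"}

raw = [
    (1, "freshmen", 4),
    (1, "freshmen", 4),   # duplicate record
    (2, None, None),      # blank evaluation
    (3, "senoir", 3),
]

seen = set()
clean = []
for eval_id, status, answer in raw:
    if status is None and answer is None:      # drop blank records
        continue
    key = (eval_id, status, answer)
    if key in seen:                            # drop exact duplicates
        continue
    seen.add(key)
    status = STATUS_FIXES.get(status, status)  # normalize known typos
    clean.append((eval_id, status, answer))

print(clean)  # [(1, 'Freshman', 4), (3, 'Senior', 3)]
```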

  15. CSU-Pueblo Library Instruction Evaluation Results – Background Questions Year in School

  16. CSU-Pueblo Library Instruction Evaluation Results – Background Questions Attended Previous Session

  17. CSU-Pueblo Library Instruction Evaluation Results – Background Questions Specific Assignment

  18. CSU-Pueblo Library Instruction Evaluation Results – Background Questions Professor Participated

  19. CSU-Pueblo Library Instruction Evaluation Results – Satisfaction Questions Start Research

  20. CSU-Pueblo Library Instruction Evaluation Results – Satisfaction Questions Clear Presentation

  21. CSU-Pueblo Library Instruction Evaluation Results – Satisfaction Questions Hands-on Experience

  22. CSU-Pueblo Library Instruction Evaluation Results – Satisfaction Questions Respect

  23. CSU-Pueblo Library Instruction Evaluation Results – Ranking Question Most Important Thing Learned

  24. CSU-Pueblo Library Instruction Evaluation Comparisons

  25. CSU-Pueblo Library Instruction Evaluation Comparisons

  26. CSU-Pueblo Library Instruction Evaluation Conclusions • Distribution of responses was skewed towards the high end. • Student evaluations of faculty teaching across campus are not normally skewed this far towards the positive. • These results show that a large majority of students felt that our instruction sessions were helpful!

  27. CSU-Pueblo Library Instruction Evaluation Conclusions - Meeting Our Goals? Since most of our sessions are introductory in nature, one of our primary goals is to show students how to start their research using our library. • 88.17% of students who took the survey indicated that they moderately or strongly agreed that the sessions helped them learn how to start their research. • 36.12% of students who took the survey responded that learning how to locate a journal article or book was the most important thing they learned.

  28. CSU-Pueblo Library Instruction Evaluation Conclusions - Factors Affecting Results • What factors may have affected student responses? • Chi-square tests indicate that the following factors proved to be statistically significant: • Having a specific assignment in conjunction with the presentation was a significant factor in the number of students who strongly agreed with A & C • Attending a prior session was a slightly significant factor in determining how many students strongly disagreed with D • Survey Design: • Even number of response choices forced students to choose positive or negative • Unclear wording – “My professor participated in this session.” • Lack of N/A forced students to choose a response even if it did not apply • Survey Administration: • Administered at the end of each session, when students were in a hurry to leave • Librarians may tend to skip the survey if a session hasn’t gone well
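The chi-square tests mentioned here check whether a background factor (e.g. having a specific assignment) is independent of a response outcome (e.g. strongly agreeing). A minimal sketch with made-up counts — the actual CSU-Pueblo contingency tables are not given in the slides. For a 2×2 table (one degree of freedom), the p-value can be computed from the complementary error function:

```python
import math

def chi_square_2x2(table):
    """Chi-square test of independence for a 2x2 contingency table."""
    (a, b), (c, d) = table
    n = a + b + c + d
    # Expected counts under independence: row total * column total / n.
    expected = [[(a + b) * (a + c) / n, (a + b) * (b + d) / n],
                [(c + d) * (a + c) / n, (c + d) * (b + d) / n]]
    stat = sum((obs - exp) ** 2 / exp
               for row, erow in zip(table, expected)
               for obs, exp in zip(row, erow))
    # For df = 1, the chi-square survival function is erfc(sqrt(x/2)).
    p_value = math.erfc(math.sqrt(stat / 2))
    return stat, p_value

# Hypothetical counts: strongly agreed vs. did not, with and without
# a specific assignment.
stat, p = chi_square_2x2([[120, 80], [90, 110]])
print(round(stat, 2), round(p, 4))
```

With these illustrative counts the p-value falls below 0.05, i.e. the assignment factor would be judged statistically significant, matching the kind of conclusion drawn on the slide.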

  29. CSU-Pueblo Library Instruction Evaluation Indications for Best Practices Role of Evaluation: combine learning assessment with program evaluation Survey Design: concise, clearly worded, avoid mild statements, allow neutral and N/A responses, define hypotheses prior to creating the form

  30. Evaluating and Assessing Library Instruction Auraria Library Lorrie Evans

  31. Evaluation of instruction at Auraria Library Survey student learning in single-session library classes (random selection). Survey students in full-semester credit-bearing classes (assignments, final project). Survey the faculty toward the end of the semester – after papers and assignments are turned in.

  32. Challenges in the single-session library class • Do the students understand the most important concepts? • No other chance for a follow-up or review. • When students leave the room, we will not see them again.

  33. Why? Look in the mirror – are we teaching what we think we are teaching? Collaborate with campus faculty proactively. Make library instruction a more integrated part of the academic culture. Accreditation – assessment mandates driven externally.

  34. The tool – short online survey In class – three to five questions mapped to learning goals. Customized for each class Broadcast results, review answers & main concepts. Students share concerns. • Students see their answer along with classmate responses. • Librarian can respond to student input. • Review main points.

  35. Sample results • English 1020 • Student comments view • Psychology freshman seminar data • Graduate Political Science - single session post test. • Graduate English Literature - semester course Jump past results

  36. Comments return

  37. Psych Freshman Seminar sample items return

  38. Graduate English Literature survey: sample items return

  39. Political Science post test return

  40. Right tool for the job question

  41. Faculty survey -- end of the semester • Program effectiveness as reflected in quality of student projects and assignments. • Satisfaction with content, scheduling and teaching.

  42. Faculty Evaluation survey results Zoomerang link

  43. Faculty evaluation sample items

  44. How can we improve?

  45. Feedback loop • In class survey – instant feedback for both students and librarian. Librarian can go over confusing concepts. • Results discussed at departmental meetings – used as source of information to make adjustments in teaching (more depth with less quantity). • Results made available to instructors and students. • End of the year report submitted to the campus Outcomes Assessment Advisory Committee. The committee provides a review of the report, recommends improvements.

  46. Evaluating and Assessing Library Instruction CSU – Fort Collins Patrick McCarthy

  47. Evaluating Learning Outcomes at the Reference Desk - CSU • Why? • Determine if learning takes place as a result of interactions at the reference desk • Identify those skills most suitable to reference desk instruction • Establish reasonable reference desk learning outcome benchmarks • Recommend behaviors and methods that lead to student learning

  48. Categories of Reference Assessment • Accuracy response rate • User satisfaction • Course/instructor opinion survey • Learning outcomes

  49. Methodology? • Quantitative approach • Powerful, not practical • Qualitative approach • Powerful, versatile

  50. Procedures • Patron completes a preliminary survey at the end of the reference interaction • Researcher analyzes preliminary surveys and selects relevant patrons for follow-up assessment • Selected patrons are interviewed about the specifics of the reference transaction • Interview recordings are transcribed and coded for assessment
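The coding step in the procedure above maps transcribed interview segments to assessment codes, then tallies evidence across patrons. A minimal sketch of that data structure — the codes and segments here are illustrative assumptions, not the actual CSU coding scheme:

```python
from collections import defaultdict

# Each transcribed segment is tagged with one or more assessment codes.
coded_segments = [
    ("patron_1", "I found the article on my own afterward", ["skill-transfer"]),
    ("patron_1", "The librarian showed me the database",    ["instruction"]),
    ("patron_2", "I still don't know how to limit results", ["gap"]),
    ("patron_2", "I used the same search at home",          ["skill-transfer"]),
]

tallies = defaultdict(set)
for patron, _segment, codes in coded_segments:
    for code in codes:
        tallies[code].add(patron)  # count patrons, not mentions

# How many patrons showed evidence of each learning outcome?
summary = {code: len(patrons) for code, patrons in sorted(tallies.items())}
print(summary)  # {'gap': 1, 'instruction': 1, 'skill-transfer': 2}
```

Counting distinct patrons per code (rather than raw mentions) keeps a talkative interviewee from inflating the evidence for any one outcome.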
