
Assessing Information Literacy How University Libraries Can Contribute to the Measurement of Student Skills






Presentation Transcript


  1. Assessing Information Literacy: How University Libraries Can Contribute to the Measurement of Student Skills. 2004 Assessment Institute, November 2, 2004

  2. Overview
  • Past Approaches to Information Literacy (IL) Assessment
    • Self-Report Measures
    • Standardized Tests
  • A Different Approach to IL Assessment
    • Transforming Professional Standards
    • Using Rubrics
    • Applying Rubric Assessment to an IL Tutorial
  • Results & Future Plans

  3. Past Approaches to Assessment of Information Literacy
  • Self-Report Measures
  • Standardized Assessment of Information Literacy Skills (SAILS)

  4. Self-Report Measures
  "The pace of the instruction was just right for me."
  __ Strongly agree  __ Agree  __ Neutral  __ Disagree  __ Strongly disagree
  http://www.library.ubc.ca/home/forms/studentevalform.html
  What can you discover about student learning from the answer to this question?

  5. Self-Report Measures
  "The library instructor was knowledgeable and helpful."
  __ Strongly agree  __ Agree  __ Neutral  __ Disagree  __ Strongly disagree
  http://www.library.ubc.ca/home/forms/studentevalform.html
  What can you discover about student learning from the answer to this question?

  6. SAILS: Standardized Assessment of Information Literacy Skills
  • Multiple-choice test.
  • 30 questions, delivered on the Web.
  • Purpose:
    • Program evaluation
    • Cross-institutional comparison
  http://sails.lms.kent.edu/publications/aahe_files/frame.htm

  7. Why this doesn’t work for us…
  • Does not adapt unwieldy IL Standards to a manageable instructional context.
  • Validity & reliability not yet demonstrated.
  • Multiple-choice, not performance focused.
  • Seems to focus on lower-end thinking skills.
  • Not testing what is taught.
  • Too far removed from instruction to be useful in “closing the loop”.

  8. Why Focus on Evaluating Direct Forms of Student Learning?
  • Indirect measures of student learning don’t always help you understand where you can make improvements in your program.
  • They don’t always tell you how your program contributes to student development and learning.
  • Multiple methods give you more evidence on which to base a more informed decision.
  • Direct methods help you improve programs while you are still delivering them (i.e., formative assessment).

  9. Why Focus on Evaluating Student Learning?
  • “The concepts of learning, personal development, and student development are inextricably intertwined and inseparable.” (The Student Learning Imperative)
  • “Good assessment is based fundamentally on collaboration among colleagues. And since student learning takes place both inside and outside the classroom, some of the most interesting and intellectually exciting work in assessment involves collaboration among faculty and student affairs professionals.” (Banta et al.)

  10. Why Focus on Evaluating Student Learning?, Cont.
  • “As resources decline and the competition for resources within institutions increases, every program and service must demonstrate its importance and worth.” (Upcraft and Schuh)
  • “…advances in the study of thinking and learning (cognitive science) and in the field of measurement have stimulated people to think in new ways about how students learn and what they know, what is therefore worth assessing, and how to obtain useful information about student competencies.” (National Research Council)

  11. Why Focus on Evaluating Student Learning?, Cont.
  • “To assure that students have sufficient and various kinds of educational opportunities to learn or develop desired outcomes, faculty and staff often engage in curricular and co-curricular mapping.” (Peggy L. Maki)
  • Regional and some professional accreditation agencies
  • AAC&U Greater Expectations
  • NASPA Learning Reconsidered

  12. A Different Approach to Information Literacy Assessment
  • Needs:
    • To assess IL outcomes in a meaningful & contextualized way.
    • To integrate assessment into course curriculum.
    • To fold the assessment process into regular workflow.
    • To end up with data we could use to “close the loop”…and impress our friends!

  13. Our Vehicle
  • Embed assessment in the curriculum via an online tutorial, LOBO.
  • What is LOBO?
  • Benefits to this approach:
    • Student motivation
    • Curricular fidelity
    • Potential relationship to general education assessment
    • Cross-campus collaboration & commitment
    • Campus visibility
  • Needed an assessment plan!

  14. The Iterative Systematic Assessment Cycle (adapted from Peggy Maki, Ph.D. by Marilee J. Bresciani, Ph.D.)
  Mission/Purposes → Goals → Outcomes → Implement Methods to Deliver Outcomes and Methods to Gather Evidence → Gather Evidence → Interpret Evidence → Make decisions to improve programs; enhance student learning and development; inform institutional decision-making, planning, budgeting, policy, public accountability → (cycle repeats)

  15. Transforming Professional Standards
  • Work from “official” standards language.
  • Eliminate redundancy; reword for clarity.
  • Rewrite as contextualized outcomes…but keep codes that allow you to trace origins.
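As a minimal sketch of the last point (hypothetical data structure; the outcome text is paraphrased from the LOBO Outcome 3.2.2 language quoted later in this deck), each contextualized outcome can carry the code of the standard it came from:

```python
# Hypothetical sketch: contextualized outcomes keyed by the code of the
# ACRL standard/performance indicator they were derived from, so every
# rewritten outcome can be traced back to its "official" origin.
ACRL_ORIGINS = {
    "3.2.2": "ACRL Standard 3, Performance Indicator 2, Outcome 2",
}

LOBO_OUTCOMES = {
    "3.2.2": (
        "Determine whether or not a web site is appropriate for the "
        "student's purpose and provide a rationale for that decision."
    ),
}

for code, text in LOBO_OUTCOMES.items():
    print(f"{code}: {text}")
    print(f"  traces to: {ACRL_ORIGINS[code]}")
```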

  16. The Iterative Systematic Assessment Cycle (adapted from Peggy Maki, Ph.D. by Marilee J. Bresciani, Ph.D.)
  Mission/Purposes → Goals → Outcomes → Implement Methods to Deliver Outcomes and Methods to Gather Evidence → Gather Evidence → Interpret Evidence → Make decisions to improve programs; enhance student learning and development; inform institutional decision-making, planning, budgeting, policy, public accountability → (cycle repeats)

  17. Questions prompt students to demonstrate their achievement of outcomes.

  18. Answers to questions are transferred to a printable worksheet and saved to a database.
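A minimal sketch of that storage pattern (hypothetical table and field names; LOBO's actual platform and schema are not documented here), using Python's built-in sqlite3:

```python
import sqlite3

# Hypothetical store for open-ended tutorial responses; each answer is
# tagged with the outcome code it demonstrates, so the same rows can
# later feed both the printable worksheet and the rubric-scoring sample.
conn = sqlite3.connect("tutorial_responses.db")
conn.execute(
    """CREATE TABLE IF NOT EXISTS responses (
           student_id   TEXT,
           outcome_code TEXT,   -- e.g. "3.2.2", traceable to ACRL standards
           question     TEXT,
           answer       TEXT
       )"""
)

def save_answer(student_id, outcome_code, question, answer):
    """Persist one student answer as it is submitted in the tutorial."""
    conn.execute(
        "INSERT INTO responses VALUES (?, ?, ?, ?)",
        (student_id, outcome_code, question, answer),
    )
    conn.commit()

save_answer("s001", "3.2.2",
            "Would you use this site for your paper? Why or why not?",
            "Yes: the sponsoring organization is a university department.")
```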

  19. The Iterative Systematic Assessment Cycle (adapted from Peggy Maki, Ph.D. by Marilee J. Bresciani, Ph.D.)
  Mission/Purposes → Goals → Outcomes → Implement Methods to Deliver Outcomes and Methods to Gather Evidence → Gather Evidence → Interpret Evidence → Make decisions to improve programs; enhance student learning and development; inform institutional decision-making, planning, budgeting, policy, public accountability → (cycle repeats)

  20. What is a Rubric?
  A rubric is "a set of criteria and a scoring scale that is used to assess and evaluate students' work. Often rubrics identify levels or ranks with criteria indicated for each level." (Campbell, Melenyzer, Nettles, and Wyman, 2000)

  21. Why Use a Rubric?
  • Provides evaluators, and those whose work is being evaluated, with rich and detailed descriptions of what is being learned and what is not.
  • Counters accusations that the evaluator does not know what he/she is looking for in learning and development.
  • Can be used as a teaching tool: students and staff begin to understand what they are (or are not) learning and what they can (or cannot) demonstrate.

  22. For example, you can use a rubric to:
  • Make meaning out of national standards and indicators
  • Norm faculty and staff expectations
  • Inform students of what you are looking for
  • Give students an opportunity to see how they have improved
  • Make rankings, ratings, and grades more meaningful
  • Help students identify their own learning and development, or the absence thereof
  • Assess a student, a course, or a program
  • Quantify student learning

  23. Using Rubrics to Evaluate Large Groups of Students
  • You don't have to evaluate every artifact of learning:
    • Random sample
    • Random stratified sample
    • Purposeful sample
    • "Best case and worst case" sample
  • Remember: the point is to get an idea of how well students learned in order to know what to improve in the delivery of that learning.
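A sketch of the first two sampling options above (hypothetical artifact fields; assumes each artifact carries a label, such as its course section, to stratify on):

```python
import random
from collections import defaultdict

def simple_random_sample(artifacts, n):
    """Score a random subset rather than every artifact."""
    return random.sample(artifacts, min(n, len(artifacts)))

def stratified_sample(artifacts, key, n_per_stratum):
    """Random stratified sample: draw n from each stratum
    (e.g., each course section) so every group is represented."""
    strata = defaultdict(list)
    for a in artifacts:
        strata[key(a)].append(a)
    sample = []
    for group in strata.values():
        sample.extend(random.sample(group, min(n_per_stratum, len(group))))
    return sample

# Hypothetical usage: artifacts tagged with the ENG 101 section they came from.
artifacts = [{"id": i, "section": f"ENG101-{i % 4}"} for i in range(100)]
print(len(simple_random_sample(artifacts, 20)))
print(len(stratified_sample(artifacts, key=lambda a: a["section"], n_per_stratum=5)))
```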

  24. A Pilot Test: Web Site Evaluation

  25. Rubric Criteria
  • Based on LOBO Outcomes (derived from ACRL/AAHE IL Standards).
  • Focused on 4 areas for web site evaluation:
    • Using Criteria Terminology
    • Citing Criteria Indicators
    • Citing Examples of Indicators from the Site
    • Judging Whether or Not to Use the Site

  26. Levels of Performance
  • Exemplary: Meets outcome completely. What a "good" answer looks like.
  • Developing: Shows progress toward meeting outcome, but does not meet it completely. What a "medium" answer looks like.
  • Beginning: Does not meet outcome. What a "poor" answer looks like.
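Taken together, slides 25 and 26 describe a rubric that is straightforward to encode; a minimal sketch (hypothetical structure and scoring function; per-criterion level descriptors omitted for brevity):

```python
# Minimal sketch of the pilot rubric as a data structure: the four
# web-site-evaluation criteria crossed with three performance levels.
LEVELS = ["Beginning", "Developing", "Exemplary"]

CRITERIA = [
    "Using Criteria Terminology",
    "Citing Criteria Indicators",
    "Citing Examples of Indicators from the Site",
    "Judging Whether or Not to Use the Site",
]

def score_artifact(ratings):
    """ratings: {criterion: level} assigned by a trained rater.
    Returns a numeric profile (0=Beginning, 1=Developing, 2=Exemplary)."""
    return {c: LEVELS.index(ratings[c]) for c in CRITERIA}

example = score_artifact({
    "Using Criteria Terminology": "Developing",
    "Citing Criteria Indicators": "Exemplary",
    "Citing Examples of Indicators from the Site": "Beginning",
    "Judging Whether or Not to Use the Site": "Developing",
})
print(example)
```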

  27. Authority
  • 68% seemed to be aware of authority issues but did NOT use terms like "authority", "sponsorship", or "authorship" to describe it.
  • 70% could cite specific clues of web site authority, but only 32% gave examples from the site they were evaluating.
  • 44% indicated whether or not they'd use the site based on authority issues and said why.

  28. Currency
  • 60% seemed to be aware of currency issues and used terms like "currency" or "timeliness" to describe it.
  • 60% could cite specific clues of web site currency, and 60% gave examples from the site they were evaluating.
  • 44% indicated whether or not they'd use the site based on currency issues and said why.

  29. Bias
  • 68% seemed to be aware of bias issues and used terms like "bias", "perspective", or "point of view" to describe it.
  • 46% could cite specific clues of web site bias, but only 32% gave examples from the site they were evaluating.
  • 18% indicated whether or not they'd use the site based on bias issues and said why.

  30. Examples of Reporting for Stakeholders
  While 44% of students can determine whether or not a web site is appropriate for their purpose and provide a rationale for that decision based on authority or currency, only 18% demonstrate this ability based on bias. (Language from LOBO Outcome 3.2.2)
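A sketch of how such a stakeholder statement can be computed from scored artifacts (hypothetical score format and sample data; the percentages in the slide above come from the actual pilot, not from this code):

```python
# Sketch: share of students who met the "judging whether to use the
# site" outcome, broken out by evaluation area.
def percent_meeting(scores, area):
    """scores: list of {area: met_outcome_bool}, one dict per student."""
    met = sum(1 for s in scores if s[area])
    return round(100 * met / len(scores))

# Hypothetical scored sample.
scores = [
    {"authority": True,  "currency": True,  "bias": False},
    {"authority": False, "currency": True,  "bias": False},
    {"authority": True,  "currency": False, "bias": True},
    {"authority": False, "currency": False, "bias": False},
]

for area in ("authority", "currency", "bias"):
    print(f"{area}: {percent_meeting(scores, area)}% met the outcome")
```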

  31. The Iterative Systematic Assessment Cycle (adapted from Peggy Maki, Ph.D. by Marilee J. Bresciani, Ph.D.)
  Mission/Purposes → Objectives/Goals → Outcomes → Implement Methods to Deliver Outcomes and Methods to Gather Evidence → Gather Evidence → Interpret Evidence → Make decisions to improve programs; enhance student learning and development; inform institutional decision-making, planning, budgeting, policy, public accountability → (cycle repeats)

  32. Before

  33. After: More content!

  34. After: More direction! Better responses?

  35. The Next Step: Applying for a grant…
  • To test the consistency of the rubric approach to IL assessment.
  • Rater groups:
    • Students
    • ENG 101 Instructors
    • Librarians
    • External Experts
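One way to quantify the consistency the grant would test is inter-rater agreement across those groups; a minimal sketch (hypothetical ratings; simple pairwise percent agreement rather than a chance-corrected statistic such as Cohen's kappa):

```python
from itertools import combinations

def percent_agreement(ratings_by_rater):
    """ratings_by_rater: {rater: [level per artifact, in a fixed order]}.
    Returns the fraction of per-artifact rating pairs that match."""
    matches = total = 0
    for r1, r2 in combinations(ratings_by_rater, 2):
        for a, b in zip(ratings_by_rater[r1], ratings_by_rater[r2]):
            matches += (a == b)
            total += 1
    return matches / total

# Hypothetical ratings of five artifacts by three of the rater groups.
ratings = {
    "student":    ["Exemplary", "Developing", "Beginning",  "Developing", "Exemplary"],
    "instructor": ["Exemplary", "Developing", "Developing", "Developing", "Exemplary"],
    "librarian":  ["Developing", "Developing", "Beginning", "Developing", "Exemplary"],
}
print(f"Pairwise agreement: {percent_agreement(ratings):.0%}")
```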

  36. Things to Remember when Working with Multiple Reviewers
  • Agree on an outcome.
  • Agree on a method of data collection.
  • Agree on the meaning and definition of the outcome; in other words, agree on how you will know the outcome is met and what it will look like when you see it met.
  • Agree on the systematic implementation of the assignments and the rubric.

  37. Things to Remember when Working with Multiple Reviewers, Cont.
  • Norm the reviewers:
    • Select 3-5 artifacts of learning representing various levels on the rubric.
    • Review them together and discuss differences of opinion.
    • Tweak the rubric or adjust expectations.
  • Or ask the reviewers to create artifacts that would be evaluated at the various levels of the rubric (then repeat the process above).

  38. Things to Remember when Working with Multiple Reviewers, Cont.
  • If the reviewers help create the rubric, quite a bit of norming occurs during the discussion that shapes it.

  39. Questions?
  Megan Oakleaf, megan_oakleaf@ncsu.edu, www.lib.ncsu.edu/lobo2
  Marilee Bresciani, mbresciani@tamu.edu
