
Evaluating the Effectiveness of Service Learning
Susan Agre-Kippenhan, Professor, Art Department, Portland State University


Presentation Transcript


  1. Evaluating the Effectiveness of Service Learning. Susan Agre-Kippenhan, Professor, Art Department, Portland State University

  2. Why evaluate? • Assessment as a continual Service Learning improvement tool • Clarification of critical learning objectives and the ability to focus on Service Learning specifics • Yields useful information for programs, faculty, and students • Provides evidence to support claims

  3. SL Outcomes • Service Learning Claims: • Understanding of course content • Theory to practice • Understanding of Community • Community-related skills • Civic Engagement • Diversity • MORE?

  4. Student Learning Focus • Value of authentic work samples • Moving beyond self-report • Moving beyond less integrated forms of assessment (e.g., observation, focus groups)

  5. Continual Improvement • Feedback to actual courses, programs • Applicable/actionable assessment • Can be course/assignment based • Can serve other programmatic goals

  6. How do we know what works? A working model for assessment based on the premise that there are: • Key questions to ask • Logical steps to follow • Usable data that transforms student learning and community experiences

  7. Assessment Cycle Model: 1. Develop Learning Objectives → 2. Identify Work Samples → 3. Locate Critical Junctures → 4. Determine Assessment Criteria/Quality → 5. Test Validity/Reliability → 6. Disseminate, Revise/Revisit → back to 1

  8. Assessment Cycle Model (cycle diagram repeated, steps 1-6 as above)

  9. Assessment Cycle Model, Step 1: What am I trying to assess? Action: Define Student Learning Outcome • Student Learning Outcomes in the context of: • Mission • Priorities • Program Goals

  10. Assessment Cycle Model, Step 2: What are appropriate work samples? Action: Identify Student Work Samples • Qualitative vs. Quantitative • Examples: Portfolios, Papers (writing samples, essays, reflective papers, editorials, journals…), Performances, Artwork, Lab reports

  11. Assessment Cycle Model, Step 3: When should I assess? Action: Identify critical junctures • Critical Junctures: • Entering students • Passages • Culminating

  12. Assessment Cycle Model, Step 4: What am I looking for in student work? Action: Define scoring criteria • Rubric Development: Scoring Criteria • What indicators will I use to demonstrate desired student learning? • Examples: Bloom's taxonomy

  13. Assessment Cycle Model, Step 4: How do I evaluate the range of student work? Action: Define performance levels • Rubric Development: Performance Standards • How many levels will my rubric have, and how will I define performance at each level? • Specific, clear descriptors at each level • Qualitative differences • Common errors

  14. Sample Rubric (scoring criteria by performance level):
      Work Sample        | Performance Level 1 (Low) | Performance Level 2 (Medium) | Performance Level 3 (High)
      Scoring Criteria 1 |                           |                              |
      Scoring Criteria 2 |                           |                              |
      Scoring Criteria 3 |                           |                              |
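
The rubric above is only an empty grid. As a concrete illustration (not part of the presentation), the minimal Python sketch below shows one way to capture the same structure as data, with hypothetical criterion names and placeholder level descriptors, and to total one evaluator's 1-3 ratings for a single work sample.

    # Hypothetical representation of the sample rubric: three scoring criteria,
    # each rated at one of three performance levels (1 = Low, 2 = Medium, 3 = High).
    RUBRIC = {
        "Scoring Criteria 1": {1: "Low descriptor", 2: "Medium descriptor", 3: "High descriptor"},
        "Scoring Criteria 2": {1: "Low descriptor", 2: "Medium descriptor", 3: "High descriptor"},
        "Scoring Criteria 3": {1: "Low descriptor", 2: "Medium descriptor", 3: "High descriptor"},
    }

    def score_work_sample(ratings: dict) -> int:
        """Validate one evaluator's ratings against the rubric and return the total score."""
        for criterion, level in ratings.items():
            if criterion not in RUBRIC or level not in RUBRIC[criterion]:
                raise ValueError(f"Unknown criterion or level: {criterion!r}, {level!r}")
        return sum(ratings.values())

    # Example: one evaluator's ratings for one student paper (total = 6 of a possible 9).
    print(score_work_sample({"Scoring Criteria 1": 2,
                             "Scoring Criteria 2": 3,
                             "Scoring Criteria 3": 1}))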

  15. Assessment Cycle Model, Step 5: Once I have a rubric, how do I know it's valid? How do I know if the rubric is reliable? • Evaluate the rubric against the learning objectives, learning opportunities, etc. Is there clear alignment? • Pilot test with multiple evaluators • Discuss how well the rubric works • Compare scores for consistency • Determine procedures for inconsistent scores
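
"Compare scores for consistency" can be made concrete with a small calculation. The sketch below is an illustration under assumed data, not the presenters' procedure: it computes the exact-agreement rate between two pilot evaluators who scored the same set of papers on a 1-3 rubric, and lists the papers whose scores disagree so the group can discuss them and settle a procedure for inconsistent scores.

    # Hypothetical pilot-test scores: each list holds one evaluator's 1-3 rubric
    # levels for the same six work samples, in the same order.
    rater_a = [3, 2, 2, 1, 3, 2]
    rater_b = [3, 2, 1, 1, 3, 3]

    def percent_agreement(a, b):
        """Share of work samples on which both evaluators gave the same level."""
        return sum(1 for x, y in zip(a, b) if x == y) / len(a)

    def disagreements(a, b):
        """Indices of work samples scored differently, flagged for discussion."""
        return [i for i, (x, y) in enumerate(zip(a, b)) if x != y]

    print(f"Exact agreement: {percent_agreement(rater_a, rater_b):.0%}")   # 67%
    print("Samples to re-discuss:", disagreements(rater_a, rater_b))       # [2, 5]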

  16. Assessment Cycle Model, Step 6: What's next? Action: Back to the cycle • Revise and revisit: learning outcomes, learning opportunities, assignments, critical junctures, rubrics, program, course offerings/order, curriculum, pedagogy…

  17. SL Case Study • University Studies - General Education • 4-year integrated program • Interdisciplinary program • Critical Features: • Clearly defined goals: Communication, Critical Thinking, Diversity, Civic Responsibility • Common Pedagogy • Required Courses

  18. SL Case Study, Step 1: Student Learning Outcomes. Action: Define Student Learning Outcome: Appreciation and Understanding of Civic Responsibility • PSU Mission - Let Knowledge Serve the City • University Studies Goal - Civic Responsibility

  19. SL Case Study, Step 2: Work Samples. Action: Identify Student Work Samples: Common assignment given in each class • Paper

  20. SL Case Study, Step 3: Critical Junctures. Action: Identify critical junctures: Pilot in Senior Capstone classes • Senior Capstone required course • Culminating group experience for major and general education • Application of expertise to a real project in the community

  21. SL Case Study Common Assignment “Please write a 3-5-page reflective analysis summarizing the ways you have gained experience with the University Studies goal of Civic Responsibility. Use specific examples from your discussions and course readings, or in completing your project.”

  22. SL Case Study, Step 4: Scoring Criteria and Performance Levels. Action: Define scoring criteria/levels: Civic Responsibility Stages • Based on Bloom's taxonomy

  23. SL Case Study, Step 5: Valid and Reliable Rubric. How do I know if the rubric is valid and reliable? • Validity: alignment with the PSU Mission (Let Knowledge Serve the City) and the University Studies Goal (Civic Responsibility) • Reliability: pilot test with multiple evaluators • Compare scores for consistency • Discuss how well the rubric works

  24. SL Case Study, Step 6: Revisiting the Process. Suggestions for improvement: • Revise the assignment for clarity • Revise rubrics for specificity • Clarify pedagogy. Confirmed: • Program goals are addressed in class • The basis of the assignment is valid • Keep the assessment in the Capstone

  25. SL Assessment in practice • Focus on student learning objectives • What are appropriate work samples? • Are there existing critical junctures? • How do we evaluate quality? • How can this improve our practice?

  26. Lessons Learned • Connect to goals and mission • Transferable knowledge of the process • Assessment as a tool for partnerships • Applicable/actionable assessment

  27. For More Information Susan Agre-Kippenhan Agrekis@pdx.edu 503-725-8506
