
Program Assessment: A Process From Start to Finish

Learn how assessment design informs program design, understand the learning assessment cycle, and identify effective frameworks for designing learning outcomes.


Presentation Transcript


  1. Program Assessment: A Process From Start to Finish RJ Ohgren – Office of Judicial Affairs Mandalyn Swanson, M.S. – Center for Assessment and Research Studies James Madison University

  2. Session Outcomes
  By the end of this session, attendees will be able to:
  • Explain how assessment design informs program design
  • Describe the “Learning Assessment Cycle”
  • Express the difference between a goal, a learning objective, and a program objective
  • Identify effective frameworks for designing learning outcomes
  • Define fidelity assessment and recognize its role in the Learning Assessment Cycle

  3. By The Numbers Where We Were v. Where We Wanted to Be

  4. Why Assess?
  It’s simple:
  • The assessment cycle keeps us accountable and intentional
  • We want to determine whether the benefits we anticipated actually occur
  • Are changes in student performance due to our program?
  If we don’t assess:
  • Programming could be ineffective – we won’t know
  • Our effective program could be terminated – we have no proof it’s working

  5. Typical Assessment
  “Well, we’ve got to do a program. Let’s put some activities together. Let’s ask them questions afterwards about what we hope they get out of it. Um…let’s ask if they liked the program too. And let’s track attendance. Survey says…well, they didn’t really learn what we’d hoped. But they liked it? And a good number of people came? Success!”

  6. Proper Assessment
  • What do we want students to know, think, or do as a result of this program?
  • Let’s define goals and objectives that get at what we want students to know, think, or do.
  • What specific, measurable things could show that we’re making progress towards these goals and objectives?
  • What activities can we incorporate to get at those goals and objectives?
  • We have a program!

  7. How are these approaches different?

  8. Learning Assessment Cycle

  9. Program Goals vs. Learning Goals

  10. Goals, Objectives, & Items (diagram: a Goal branches into several Objectives, and each Objective branches into the Items that measure it)

  11. Goals v. Objectives
  • Goals can be seen as the broad, general expectations for the program.
  • Objectives can be seen as the means by which those goals are met.
  • Items measure our progress towards those objectives and goals.

  12. Goals vs. Objectives
  Goal:
  • General expectation of student (or program) outcome
  • Can be broad and vague
  • Example: Students will understand and/or recognize JMU alcohol and drug policies.
  Objective:
  • Statement of what students should be able to do or how they should change developmentally as a result of the program
  • More specific; measurable
  • Example: Upon completion of the BTN program, 80% of students will be able to identify 2 JMU Policies relating to alcohol.

  13. Putting it All Together (diagram: the MISSION branches into several GOALs, each GOAL into Objectives, and each Objective into its Assessments)
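The mission-to-assessment hierarchy can be sketched as a small tree structure. This is a minimal illustration, not part of the original presentation: the class names and the mission text are placeholders, while the goal, objective, and assessment strings are drawn from later slides.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical sketch of the MISSION -> GOAL -> Objective -> Assessment tree.
# The mission text is a placeholder; the rest comes from the BTN examples.

@dataclass
class Objective:
    text: str
    assessments: List[str] = field(default_factory=list)

@dataclass
class Goal:
    text: str
    objectives: List[Objective] = field(default_factory=list)

@dataclass
class Mission:
    text: str
    goals: List[Goal] = field(default_factory=list)

mission = Mission(
    "Departmental mission statement (placeholder)",
    goals=[Goal(
        "Ensure student understanding and/or recognition of concepts "
        "surrounding alcohol consumption",
        objectives=[Objective(
            "After completing BTN, 80% of students will correctly identify "
            "the definition of the point of diminishing returns",
            assessments=["Assessment Question #12", "Assessment Question #29"],
        )],
    )],
)

# Every assessment item rolls up through an objective and a goal to the mission.
for goal in mission.goals:
    for obj in goal.objectives:
        for item in obj.assessments:
            print(f"{item} -> objective -> goal: {goal.text[:40]}...")
```

The point of the structure is traceability: no item exists that doesn’t serve an objective, and no objective exists that doesn’t serve a goal.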

  14. By The Numbers Program Goal 1 of 3
  • Goal: To provide a positive classroom experience for students sanctioned to By the Numbers
  • Objective: 80% of students will report that the class met or exceeded their expectations.
    • Item: Class Evaluation #15 – Overall, I feel like this class…
  • Objective: 80% of students will agree (or better) with the statement “the facilitators presented the material in a non-judgemental way.”
    • Item: Class Evaluation #5.5 – The facilitators presented the material in a non-judgemental way.
  • Objective: 60% of students will report an engaging classroom experience.
    • Item: Class Evaluation #5.1 – The facilitators encouraged participation.
    • Item: Class Evaluation #5.4 – The facilitators encouraged discussion between participants.
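Percentage thresholds like these are easy to check mechanically once responses are collected. A minimal sketch, assuming 1–5 Likert responses where a rating of 4 or 5 counts as “agree” — the cutoff and the sample ratings below are invented for illustration:

```python
# Hedged sketch: does an objective's success criterion hold for a set of
# Likert responses? The cutoff (4 = "agree") and sample data are assumptions.

def objective_met(responses, agree_cutoff=4, threshold=0.80):
    """True if at least `threshold` of responses are at or above the cutoff."""
    agreeing = sum(1 for r in responses if r >= agree_cutoff)
    return agreeing / len(responses) >= threshold

# Ten made-up ratings for an item like Class Evaluation #5.5:
ratings = [5, 4, 4, 3, 5, 4, 5, 4, 2, 5]   # 8 of 10 agree -> exactly 80%
print(objective_met(ratings))               # True: the 80% objective is met
```

Defining “met” this explicitly is what makes an objective measurable rather than aspirational.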

  15. By The Numbers Learning Goal 1 of 5
  • Goal: To ensure student understanding and/or recognition of JMU alcohol and drug policies.
  • Objective: After completing BTN, 80% of students will be able to identify 2 JMU Policies relating to alcohol.
  • Objective: …identify the circumstances for parental notification.
  • Objective: …identify the parties able to apply for amnesty in a given situation.
  • Objective: …identify the geographic locations in which JMU will address an alcohol/drug violation.
  • Objective: …articulate the three-strike policy.

  16. By The Numbers Learning Goal 2 of 5
  • Goal: To ensure student understanding and/or recognition of concepts surrounding alcohol.
  • Objective: After completing BTN, 60% of students will be able to provide the definition of a standard drink for beer, wine, and liquor.
  • Objective: …identify the definition of BAC.
  • Objective: …describe the relationship between tolerance and BAC.
  • Objective: …identify at least 2 factors that influence BAC.
  • Objective: …identify the definition of the point of diminishing returns.
  • Objective: …identify how the body processes alcohol and its effects on the body.

  17. By The Numbers Learning Goal
  • Goal: To ensure student understanding and/or recognition of concepts surrounding alcohol consumption.
  • Objective: After completing BTN, 80% of students will be able to correctly identify the definition of the point of diminishing returns.
    • Item: Assessment Question #12, #29
    • Activity: Tolerance Activity, Point of Diminishing Returns discussion
  • Objective: After completing BTN, 80% of students will be able to identify how the body processes alcohol and its effects on the body.
    • Item: Assessment Question #8, #9, #10
    • Activity: Alcohol in the Body Activity

  18. Developing Learning Outcomes
  • Should be Student-Focused – worded to express what the student will learn, know, or do (Knowledge, Attitude, or Behavior)
  • Should be Reasonable – should reflect what is possible to accomplish with the program
  • Should be Measurable – “know” and “understand” are not measurable; the action one can take from knowing or understanding is.
  • Should have Success Defined – what is going to be considered passing?

  19. Bloom’s Taxonomy (diagram: taxonomy levels arranged from less complex to more complex)

  20. Bloom’s Taxonomy

  21. The ABCD Method
  • A = Audience – What population are you assessing?
  • B = Behavior – What is expected of the participant?
  • C = Conditions – Under what circumstances is the behavior to be performed?
  • D = Degree – How well must the behavior be performed? To what level?
  From “How to Write Clear Objectives”

  22. The ABCD Method: Example
  • Objective: After completing BTN, 80% of students will be able to describe the relationship between tolerance and BAC.
  • Mapped to the method: A (Audience) = students in BTN; B (Behavior) = describe the relationship between tolerance and BAC; C (Conditions) = after completing BTN; D (Degree) = 80% of students.

  23. Common Mistakes
  • Vague behavior – Example: Have a thorough understanding of the university honor code.
  • Gibberish – Example: Have a deep awareness and thorough humanizing grasp on…
  • Not Student-Focused – Example: Train students on how and where to find information.

  24. Program Implementation Give the program you say you will. How?

  25. (diagram) Pre-Test (low item score) → Program → Post-Test (high item score)

  26. (diagram) Pre-Test (low item score) → Program → Post-Test (low item score)
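The two diagrams above contrast the hoped-for result (scores rise after the program) with the troubling one (scores stay flat). A minimal pre/post comparison sketch — all scores below are invented proportions-correct on one assessment item:

```python
# Illustrative pre/post comparison for a single assessment item.
# Scores are fractions correct per administration; values are made up.

def score_gain(pre_scores, post_scores):
    """Mean post-test score minus mean pre-test score."""
    mean = lambda xs: sum(xs) / len(xs)
    return mean(post_scores) - mean(pre_scores)

pre  = [0.40, 0.35, 0.50, 0.45]   # low item scores before the program
post = [0.85, 0.80, 0.90, 0.75]   # high item scores afterwards

print(f"gain: {score_gain(pre, post):.2f}")  # a clearly positive gain
```

A flat or negative gain is exactly the situation where fidelity assessment (next slides) helps answer whether the program itself, or its delivery, is the problem.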

  27. Fidelity Assessment
  Are you doing what you say you’re doing?
  • Helps to ensure your program is implemented as you intended
  • Links learning outcomes to programming
  • Helps to answer why we aren’t observing the outcomes we think we should be observing

  28. Fidelity Components
  • Program Differentiation – How are the many components of your program different from one another?
  • Adherence – Was your program delivered as intended?
  • Quality – How well were the components administered?
  • Exposure – How long did each component last? How many students attended?
  • Responsiveness – Were participants engaged during the program?
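The five components above lend themselves to a simple rating record. A hypothetical sketch — the 1–5 scale, equal weighting, and sample ratings are assumptions, not part of the presented checklist:

```python
# Hypothetical fidelity record covering the five components from the slide.
# The 1-5 scale and the equal-weight average are illustrative assumptions.

FIDELITY_COMPONENTS = (
    "program_differentiation",  # components distinct from one another?
    "adherence",                # delivered as intended?
    "quality",                  # how well administered?
    "exposure",                 # duration and attendance
    "responsiveness",           # participant engagement
)

def fidelity_score(ratings):
    """Average the 1-5 ratings; refuse to score if any component is unrated."""
    missing = [c for c in FIDELITY_COMPONENTS if c not in ratings]
    if missing:
        raise ValueError(f"unrated components: {missing}")
    return sum(ratings[c] for c in FIDELITY_COMPONENTS) / len(FIDELITY_COMPONENTS)

ratings = {"program_differentiation": 4, "adherence": 5, "quality": 4,
           "exposure": 3, "responsiveness": 4}
print(fidelity_score(ratings))  # 4.0
```

Requiring every component to be rated mirrors the point of fidelity assessment: a high score on adherence alone says nothing if exposure or responsiveness went unmeasured.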

  29. Fidelity Checklist – Generic
  • What is rated? The live/videotaped program
  • Who does the rating? Independent auditors, facilitators, participants

  30. Please walk away with this: You Must Assess.
