
CAPP Evaluation August 2011 Update



Presentation Transcript


  1. CAPP Evaluation: August 2011 Update Jane Powers, Amanda Purington, Jenny Parise ACT for Youth Center of Excellence

  2. Overview • Review Evaluation Plan: Documenting EBP Implementation • Ways to improve documentation • Revisions to the Tools • Next Steps

  3. CAPP Initiative Goal 1: Promote healthy sexual behaviors and reduce the practice of risky sexual behaviors among adolescents Core Strategy 1: Provide comprehensive, age-appropriate, evidence-based, and medically accurate sexuality education to promote healthy sexual behaviors

  4. We know a lot about what works to prevent teen pregnancy • Increase age of first intercourse • Increase use of condoms • Decrease number of sexual partners • Decrease frequency of sex [Diagram: EBP → Decrease Teen Pregnancy → Promote Adolescent Sexual Health]

  5. What does it take to implement EBPs in real-world settings?

  6. Research to Practice While many EBPs have yielded positive outcomes in research settings, the record at the local level of “practice” is mixed (Wandersman, 2009; Lesesne et al., 2008; Fixsen, 2005).

  7. What do we know about Implementation? • Implementation influences program outcomes • If EBPs are not implemented with fidelity and quality, they are not likely to produce the outcomes observed in research • Achieving good implementation increases the chances of program success and stronger benefits for participants (Durlak and DuPre, 2008)

  8. Factors Affecting Implementation • Community Level • Organizational Capacity • Program Characteristics • Training and TA • Facilitator Characteristics

  9. CAPP: An Opportunity to Learn!

  10. Opportunity to learn: • How do EBPs work in different settings? • How do EBPs work with diverse youth populations? • What adaptations are needed to make EBPs work more effectively? • Are EBPs successful in getting to outcomes?

  11. Documenting Implementation is critical! • Helps us learn how the intervention was delivered • Helps us understand the challenges faced • Helps us learn about what does and does not work when implementing EBPs

  12. Tools to evaluate Implementation • Fidelity Checklist: documents what you did, adaptations made, and successes/challenges • Attendance Record: documents who, where, when, and how much

  13. Role of the Fidelity Checklists

  14. Role of the Fidelity Checklists • Document: • What you did • Timing • Adaptations made and why • Challenges and successes

  15. Role of the Fidelity Checklists Help you Prepare!!

  16. Using Fidelity Checklist to Plan • Review activities • What equipment is needed? • Timing • Anticipating issues • Questions that might come up • Prepare for next cycle

  17. What have we learned?

  18. Ways to Improve Documentation 1. Provide More Details! • WHAT was changed? • WHY was it changed?

  19. GREAT example “We took the entire class period to implement and process this activity. Although it is timed at 15 minutes, it took ten minutes alone to break the class of 32 students into groups, hand out cards and explain the activity. We also needed more time to allow groups to respond as we physically had more groups than if we were working in a small group setting.”

  20. Good example “Due to the age of the participants and the material in the Robert Townsend video clip, the teacher did not find it to be appropriate for the classroom. Instead we discussed the information in the clip and used a short role-playing exercise to make the information clear.” • Why wasn’t it appropriate? • What was the role-playing exercise?

  21. Need more details “ran out of time” • Tells us WHY but not WHAT was changed • Was the activity shortened (if so, how?) or skipped altogether? “switched the order of these activities” • Tells us WHAT the change was but not WHY it was made

  22. 2. Document ALL adaptations Was the Activity Carried Out According to the Directions in the Facilitator’s Curriculum? • Leave something out • Change order • Add new content • Shorten activity • Substitute activity • Modify activity

  23. 2. Document ALL adaptations, cont. • Include information on pre-approved adaptations • Were NO adaptations made at all? • Example • NO adaptations made • General Comment: “It would have been nice to have more time for each activity/module” • Were activities completed according to the curriculum? • Were some activities omitted or shortened?

  24. 3. Complete the Fidelity Checklist right away! • Capture this information while the experience of the module and its activities is fresh in your mind

  25. 4. Complete the right number of tools One set of tools per cycle per set of participants

  26. [Diagram: each class (Class 1 through Class 5) has its own Fidelity Checklist and Attendance Record.]

  27. [Diagram: each teacher (Teacher 1, Teacher 2, Teacher 3) has his or her own Fidelity Checklist and Attendance Record.]

  28. Common Adaptations • Common adaptations include: • Updating videos, or using something more relevant in place of videos • Changes to accommodate timing issues • Changing language/names to be more inclusive • How can the COE support these common adaptations? • Providing or linking to updated medical information • Identifying alternative (updated) videos • Developing alternatives to the condom demonstration

  29. Documentation Details • Block out the names of individual participants • Do not use acronyms; spell them out • Write legibly • Review your documentation for typos beforehand • Bundle all evaluation materials for each cycle (fidelity checklists and attendance records) and mail to: Amy Breese Cornell University ACT for Youth Center of Excellence Beebe Hall Ithaca, NY 14853

  30. Submit tools electronically • Save the fidelity checklist and attendance record under a different name and input the information • Preserve the template for future use • Send the documents to Amanda Purington (ald17@cornell.edu)

  31. Revisions to the Fidelity Checklists • Clarified the adaptations description field: “If Changed, WHAT was changed and WHY? Please be specific: describe things you left out, added, or changed and WHY.” • Added “Were changes (if any) pre-approved?” • Added a comments section after each module: “Please use this space if you have comments on this module or any of its activities:”

  32. Revisions to the Attendance Record Added “Age” column for participants

  33. Next Steps • Additional tools: • Participant Feedback Form • Facilitator Feedback Tools • Pre/Post Tests • Other needs? • Tools and presentations on www.actforyouth.net - Search for “CAPP” in the site search box at the top of the home page. • Revise Tools for Year 2

  34. Contact Information Amanda Purington: ald17@cornell.edu 607-255-1861 or Jane Powers: jlp5@cornell.edu 607-255-3993
