
Partnership For Success (PFS) Orientation to Evaluation


Presentation Transcript


  1. Partnership For Success (PFS) Orientation to Evaluation September 26, 2013

  2. Partnership For Success • Welcome! • Strategic Prevention Framework - Connecting the Steps • Why Planning for Evaluation is Critical

  3. Why Evaluate? • Helps us understand how efforts work • Provides ongoing feedback that can improve community work • Gives community members a voice and an opportunity to improve efforts • Holds accountable the groups doing, supporting, and funding the work

  4. Partnership For Success Requirements • Process Measures • Number of Evidence-Based Strategies Implemented • Number of Active Partners • Number of Individuals Reached by Strategy Type • Number of Prevention Activities Supported by Leveraging Resources

  5. Partnership For Success Requirements • Outcome Measures • Youth Past 30-Day Alcohol Use • Youth Past Two-Week Binge Drinking

  6. Evaluation: Documenting Activity • Documentation in the Online Documentation and Support System (ODSS) • WHAT: Documentation of activities and accomplishments • HOW OFTEN: Weekly documentation • WHEN: End of week • Required • Training Provided by KU Work Group (KUWG) • Feedback monthly from KUWG

  7. Navigating to the ODSS

  8. Evaluation: Documenting Outcomes • Participation in Kansas Communities That Care (KCTC) Student Survey • 80% at grades 6, 8, 10 & 12 • Annually in December & January
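
The 80% requirement is straightforward arithmetic: respondents divided by enrollment at each surveyed grade. A minimal sketch in Python, assuming hypothetical enrollment and respondent counts (the real figures come from district enrollment records and the KCTC survey administration):

    # Hypothetical counts; actual figures come from district enrollment
    # records and the KCTC survey administration.
    enrollment = {6: 110, 8: 105, 10: 98, 12: 92}
    respondents = {6: 95, 8: 80, 10: 85, 12: 76}

    TARGET = 0.80  # PFS requires 80% participation at grades 6, 8, 10 & 12

    for grade in sorted(enrollment):
        rate = respondents[grade] / enrollment[grade]
        status = "meets" if rate >= TARGET else "below"
        print(f"Grade {grade}: {rate:.0%} ({status} the 80% target)")

A grade that falls short (grade 8 in this made-up example) signals where recruitment or consent follow-up is needed before the December-January administration.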

  9. New Website • www.kctcdata.org

  10. Evaluation: Action Planning • Action Planning • WHAT: Targeted objectives and list of action steps • HOW OFTEN: Before implementation, then annually • WHEN: Prior to strategy implementation, then annually • Upload to Workstation • Writing SMART+C objectives • Setting reasonable goals • Evidence-based strategies

  11. Using Standards to Identify Indicators to Support Evaluation • Standards for Good Evaluation • Utility (e.g., does the evaluation contribute to improvement) • Feasibility (e.g., is it practical) • Propriety (e.g., protection of participants) • Accuracy (e.g., is the information credible)

  12. Setting Targets: From Here to Where • Avoid a shotgun approach to changing outcomes ("Ready, Shoot, Aim") • Importance of identifying appropriate indicators • Identify how much a community can move an indicator, balancing: • Accuracy • Completeness • Credibility • Resources • Time

  13. Writing Objectives • SMART+C Objectives: Specific, Measurable, Achievable, Relevant, Timed, and Challenging
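
A hypothetical objective in this format (illustrative only, not drawn from an actual PFS action plan) might read: "By June 30, 2016, reduce past-30-day alcohol use among 10th-grade students in the county from 30% to 26%, as measured by the KCTC student survey." It names how much (4 percentage points), of what (past-30-day use), by when (June 2016).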

  14. Key Concepts for Developing Strategies and Action Plans • Objectives--how much of what by when • Strategies--how • Action Plans--what changes will be sought; who will do what by when • Identified strategies and action plans also serve as tools for evaluation

  15. Developing Strategies • Strategy • Definition—Greek, strategia; generalship; giving overall direction • How things will be accomplished • Evidence-based Strategies • Programs • Policies • Practices

  16. Reviewing and Selecting Evidence-Based Strategies • Evidence-based strategy/approach • An approach (program, policy or practice) that has been shown, by research, to “work” or be effective in improving a behavior or outcome

  17. Selecting Evidence-Based Strategies • Search for evidence-based practices • Consider the strength of evidence that the practice caused the effect • Consider whether the practice would achieve results in your situation • Consider the presence of conditions for success • Consider any adaptations needed for your context

  18. Criteria for Strategies and Changes • Effectiveness • Consistency with Objectives • Fit with resources • Anticipated resistance • Reach

  19. Gathering Evidence to Address the Evaluation Questions • Evidence—information that could be used to assess the merit or worth of a program • Gathering credible evidence—Overview: • Indicators of Success • Sources of Evidence • Quality of Evidence • Quantity of Evidence • Logistics for Gathering Information

  20. Identifying Indicator Sources • Required sources: • KCTC student survey • Pre/post program data • Supplemental local data sources, if available: • Enforcement (# MIP citations; # safety and sobriety checkpoints) • Surveys • Considerations: • Data quality • Data collection & consistency

  21. Developing Action Plans • Action Plans vs. Logic Models • Elements of an action plan • Changes (interventions)—to be sought or implemented • Action Steps— who will do what by when to bring them about

  22. Developing Action Plans • Create action steps • When creating a plan with action steps for each activity sought, minimally describe: 1. What specific change or aspect of the intervention will occur 2. Who will carry it out 3. When it will be completed or its duration

  23. Implementing and Evaluating Action Plans Implement and evaluate the action plan using the criteria of: • Completeness • Clarity • Sufficiency • Currency • Flexibility with Fidelity

  24. Using Objectives, Strategies, and Action Plans to Guide and Enhance Your Work • Use them to communicate your initiative’s purpose to others • Use them in ongoing evaluation and in organizational and program development • Review them to identify additional partners • Review them at regular intervals • Integrate them into your routine activities

  25. Implementation Fidelity • Strategies have essential components necessary to reproduce the outcomes that led to the program being certified as evidence-based. • Fidelity consists of implementing these essential components of a program - the degree to which an intervention or program is delivered as intended.

  26. Implementation Fidelity • Evaluating implementation fidelity is important because fidelity may moderate the relationship between an intervention and its outcomes • Its assessment may also prevent potentially false conclusions from being drawn about an intervention's effectiveness • It may even help in achieving improved outcomes

  27. Evaluation: Effectiveness of Programs • Pre/Post surveys • Unique ID
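
The unique ID is what makes a true pre/post comparison possible: each participant's post-survey is matched to their own pre-survey, and unmatched records are set aside. A minimal sketch, with hypothetical IDs and scores (the actual instruments and scoring are program-specific):

    # Hypothetical records keyed by each participant's unique ID.
    pre = {"A01": 3, "A02": 5, "A03": 2}   # pre-test scores
    post = {"A01": 5, "A02": 6, "A04": 4}  # post-test scores

    # Only IDs present in both waves can be matched for comparison.
    matched = sorted(pre.keys() & post.keys())
    changes = [post[pid] - pre[pid] for pid in matched]

    print(f"Matched {len(matched)} of {len(pre)} pre-test participants")
    print(f"Average change: {sum(changes) / len(changes):+.1f} points")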

  28. Coalition Roster • Excel spreadsheet to track member activity • Demonstration
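
The actual layout is covered in the demonstration; as a rough sketch of how such a roster feeds the "Number of Active Partners" process measure, assume hypothetical members and the (assumed, not source-specified) rule that attending at least one meeting in the quarter counts as active:

    # Hypothetical roster rows: (member, sector, meetings attended this quarter).
    roster = [
        ("J. Smith", "Schools", 3),
        ("M. Lopez", "Law Enforcement", 2),
        ("K. Brown", "Faith Community", 0),
    ]

    # Assumed definition: a partner is "active" if they attended at least
    # one coalition meeting, training, or event during the quarter.
    active = [name for name, sector, meetings in roster if meetings > 0]
    print(f"Active partners this quarter: {len(active)} of {len(roster)}")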

  29. Quarterly Report Questions • Did your coalition/workgroups hold regular meetings, trainings, or events this quarter? • If yes, please attach your coalition spreadsheet • Were all strategies implemented with moderate to high fidelity as defined by the fidelity checklist? • What barriers to implementation fidelity did you experience this quarter? • How can these barriers be removed? • How is cultural diversity being addressed in the implementation of your strategies and in your community?

  30. Quarterly Report Questions • List all strategies (e.g., school or faith-based initiatives, YMCA, BB/BS, etc.), other than those funded through K-SPF, being implemented in your community. • How did you communicate results and share successes with your community and stakeholders this quarter (e.g., town hall meeting, school newsletter, parent brochure, social networking, newspaper article)? • What strategies do you intend to sustain in your community after K-SPF funding ends? • What resources (in-kind, cash, personnel) have been obtained this quarter to sustain these strategies? • List the accomplishments you made this quarter.

  31. Quarterly Report Due Dates • Upload reports to the Workstation • March 30 • June 30 • September 30 • December 30

  32. Putting It All Together Ain’t nothin’ to it but to do it. ~ Maya Angelou

  33. Evaluation: Sensemaking • Engage in semiannual (January and June) reflection on quarterly community efforts across strategies • Identify mediating contextual factors in the implementation of action steps • Addresses the question: “What does it all mean?”

  34. Community Change Defined as a new or modified program, policy, or practice • Program • Example: The addition of the SPF program "Middle Littles" (a mentoring program through Big Brothers Big Sisters targeting middle school age students) begins in the community. • Policy • Example: Coalition member Joe Patton introduced H 2165 to establish recklessness as a standard in unlawfully hosting minors in a person's residence (aka the Social Hosting Law). • Practice • Example: Katie Allen developed a coalition page on Facebook to begin announcing activities related to the SPF grant programs.

  35. Process Questions

  36. Sample Report • Objectives

  37. Impact Questions

  38. Evaluation Team Lisa Chaney, Southeast Kansas Education Service Center, Greenbush Jerry Schultz and Momina Sims, Work Group for Community Health and Development, Univ. of Kansas
