
Formative Evaluation of the Defending Childhood Initiative



Presentation Transcript


  1. Formative Evaluation of the Defending Childhood Initiative Michael Rempel, Melissa Labriola, Rachel Swaner, Kathryn Ford, and Julia Kohn Center for Court Innovation Peter Jaffe and Marcie Campbell Centre for Research on Violence Against Women & Children David Wolfe Centre for Addiction and Mental Health Presented at the Defending Childhood Initiative Grantee Meeting, Washington, D.C., January 25, 2011

  2. Organization of This Presentation • Evaluation 101 • The Defending Childhood Formative Evaluation • Evaluability Assessment

  3. Evaluation 101 • Process Evaluation • Impact Evaluation • Cost Analysis • Action Research

  4. Process Evaluation • Definition: Describes planning process, operations, and outcomes for project participants • Components: • Qualitative • Quantitative • Fidelity Analysis

  5. Process Evaluation (continued) • Qualitative: Description of: • Project goals and objectives • Planning – team members, needs, decisions, challenges • Operations – all elements of the final project model • Quantitative: Data on: • Participant baseline characteristics and service needs • Treatment dosage (e.g., days/sessions attended of each service) • Community outreach (schools, workshops, public events, etc.) • Fidelity Analysis: Did practices mirror intended model?

  6. Process Evaluation (continued) • Formative Evaluation: • Focus on planning process or early operations • More qualitative than quantitative • Participatory Evaluation: engages project planners or staff in defining evaluation scope and content

  7. Impact Evaluation • Definition: Tests project impact in achieving its goals; virtually always requires a comparison condition • Experiment: random assignment to conditions • Quasi-Experiment: naturally occurring comparison: • Pre-Post: before vs. after a project started • Contemporaneous: e.g., not enrolled for logistic reasons • Comparison Site: nearby neighborhood/jurisdiction • Non-Experiment: not valid: • Completers versus Dropouts • Participants Only: Before vs. After Participation
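The experimental design above rests on one mechanical step: random assignment to conditions. The sketch below illustrates that step only, under stated assumptions (the function name, an even two-group split, and a fixed seed are all invented for illustration; real trials often use blocked or stratified randomization):

```python
import random

def randomly_assign(participant_ids, seed=None):
    """Shuffle participants and split them evenly into treatment
    and control groups (illustrative sketch only)."""
    rng = random.Random(seed)  # seeded for a reproducible assignment log
    ids = list(participant_ids)
    rng.shuffle(ids)
    midpoint = len(ids) // 2
    return {"treatment": ids[:midpoint], "control": ids[midpoint:]}

# Hypothetical roster of 100 participant IDs
groups = randomly_assign(range(1, 101), seed=42)
print(len(groups["treatment"]), len(groups["control"]))  # 50 50
```

Because assignment is random rather than self-selected, any later difference between groups can be attributed to the intervention, which is exactly what the "Completers versus Dropouts" comparison cannot claim.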

  8. Other Types of Evaluation • Cost Analysis: often of great interest to policymakers • Action Research: • Provides immediate and useful feedback about everyday program operations and performance • Minimal or no technical research expertise required • Typically involves tracking simple performance indicators with forms, spreadsheets, or simple databases (e.g., Access)
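As a concrete sketch of the indicator tracking described above, the snippet below tallies treatment dosage per service from a small CSV log using only the Python standard library. The column names and sample rows are invented for illustration; any spreadsheet export with similar columns would work the same way:

```python
import csv
import io
from collections import Counter

# Hypothetical intake log: one row per participant contact.
raw = io.StringIO(
    "date,service,sessions_attended\n"
    "2011-01-03,counseling,2\n"
    "2011-01-05,mentoring,1\n"
    "2011-01-10,counseling,3\n"
)

# Simple performance indicator: total sessions delivered, by service.
totals = Counter()
for row in csv.DictReader(raw):
    totals[row["service"]] += int(row["sessions_attended"])

print(dict(totals))  # {'counseling': 5, 'mentoring': 1}
```

This is the level of effort action research implies: a form or spreadsheet feeding a running tally, with no specialized research infrastructure required.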

  9. This Evaluation • Phase One: Formative (process) evaluation only • Phase Two: Process, impact, and cost evaluation, focusing on four (4) sites

  10. Looking Ahead: Challenges for Phase Two • How to conduct an impact evaluation: • On a “package of strategies”? • On public awareness strategies not targeted at a specific program participant group (vs. a specific comparison group)? • On strategies whose effects may not be seen for years? • Solution: combine rigorous quantitative analysis, comparison condition(s), and alternative methods (rich observation, case studies, focus groups, etc.)

  11. Goals of the Evaluation • Implement participatory research process • Conduct formative evaluation • Identify outcomes and perform data assessment • Produce evaluability assessments and Phase II evaluation design

  12. Ecological Framework • Level of the Ecological Framework: • Societal Level • Community Level • School Level • Interpersonal/Intrapersonal Level • Type of Strategy: • Prevention • Intervention • Public Awareness

  13. Goal One: Participatory Research Process • Literature Overview: Engage sites with relevant findings concerning the prevalence and effects of children's exposure to violence (CEV) and existing strategies to address it • Mapping Goals and Strategies: Understand each site’s process of identifying goals, strategies, and outcomes • Logic Model: Develop comprehensive logic model for each site, linking goals to strategies to desired outcomes, through an iterative, consultative process

  14. Goal Two: Formative Evaluation • Multi-Agency Collaboration: Document persons, agencies, roles and management of each site’s initiative • Problems and Needs: Detail each site’s assessment of the local CEV problem and current unmet needs • Policies and Strategies: Provide rigorous account of each site’s prevention, intervention, and/or public awareness strategies (across the Ecological Framework) • Barriers: Describe each site’s barriers and resulting problem-solving methods or policy modifications

  15. Goal Three: Outcomes and Data Assessment • Outcome Identification: Finalize site-specific and cross-site outcomes and performance indicators relating to chosen prevention, intervention, and public awareness strategies • Data Assessment: Assess existing information systems and future data collection needs in each site

  16. Goal Four: Deliverables to NIJ • Eight Evaluability Assessments • Four-Site Phase II Evaluation Design

  17. Evaluability Assessments Outline • Project Summary • Data • Evaluability

  18. I. Project Summary • Local CEV Problem • CEV rates/problems • Status quo resources and assets • Status quo gaps and service needs • Site Defending Childhood Initiative • Structure of the initiative/collaborative • Logic Model • Description of strategies and scale

  19. II. Data • For each intermediate goal: • Data needs • Existing solutions • Planned solutions

  20. III. Evaluability • Strengths, weaknesses, opportunities, and threats: • Collaboration • Policy formalization • Volume (by year) • Local research capacity • Evidence-based practices • Sustainability and additional resources • Data capacity and gaps • Comparison conditions
