Implementing EBHV Programs with Fidelity: Translating Science into Practice
2010 Strengthening Families Training Institute, March 16, 2010
Presented by: Joan Sharp, MA, Executive Director, and Nancy Gagliano, LICSW, Programs & Evaluation Director
Council for Children & Families
$185K of the EBP funding for underserved rural and/or tribal applicants
$400K based on input from the NFP consortium
$150K per Thrive by Five demonstration site
Which EBHV Models to Fund?
Funds went out to 13 different organizations
Implementing five different EBHV models
October 2007: Logic model for funding EBHV programs
Everything in Perfect Order!
We had great intentions, perfect assumptions, and then…
Home visiting is among the best-tested prevention and early intervention models
The relative lack of effective prevention strategies
The potential for feasible large scale community-based services where people live.
The promise of the research is not consistently supported in routine practice
We can’t launch and forget and expect the same results.
Capacity Challenges –
Some organizations had limited to no understanding
Long Term Outcome:
Implement EBHV Programs with Fidelity = Outcomes
Were programs going to achieve the outcomes that the models promised?
Long Term Outcome: Funder
Demonstrate child/parent benefits of significant degree to justify the investment of state dollars.
Document benefit of implementing multiple home visiting models under this state program.
Not enough money to do an RCT, and would it work with multiple models?
Most states have some level of state home visiting initiative, falling into two strategies.
Single strategy efforts dominated by Healthy Families America practice
Portfolio strategies involving local choice and control
Single strategy approaches have the strongest evaluations and as a result dominate the current policy discussion
Portfolio based initiatives have weak evaluations or are not collecting and reporting data
Addressing the portfolio model evaluation is a significant area of needed work
Does the routine use of home visiting programs using various evidence informed protocols collectively result in better child and caregiver outcomes?
Can we demonstrate significant benefit to justify investment of state dollars?
Can we document benefit across a balanced portfolio approach to support the continuation of this approach?
Look at the research – comprehensive literature review around home visiting
Start with the programs before looking at a multi-method approach and child/parent outcomes
Translation of evidence based home visiting models from randomized controlled trials into local program practice is very challenging.
Improving program quality and implementation of the model with fidelity is a major issue for the field.
Organizational conditions and capacity are the key to a successful implementation of an EBHV model.
How does a program’s organizational capacity affect implementing with fidelity?
Not only do we need to ask core component/fidelity questions but we also have to find a way to assess organizational capacity.
We get a little help from our “FRIENDS” at the National Resource Center for Community Based Child Abuse Prevention
Integrating Evidence-Based Practices into CBCAP Programs: A Tool for Critical Discussions - utilized Appendix C, the Capacity Checklist for Implementing with Fidelity
CQI Self Assessment Document
WSU incorporated questions on data management capacity and programs’ ability to use data to inform program practice
Staff Training and Monitoring
Outcome Measurement/Quality Assurance
Support Available from the Program Developer or Other Technical Assistance Provider
Create state standards for program delivery and assurance of quality EBHV dissemination in Washington State
Support a learning community regarding home visiting and early intervention to inform practice
Test the practicality and relative benefit of this multi-method approach
Provide meaningful information and staff development that informs clinical decision making and program development through a continuous quality improvement process
Implementing with fidelity is critical to producing intended model results
Emphasis on fidelity varies across models
Measuring fidelity varies widely
Support is inconsistent across models in addressing fidelity
CCF evaluation plan actions
Work with developers for fidelity measurement
Define common minimum standards across models
Develop data collection strategy with programs
Develop an evaluation TA plan with program sites when needed
Recruitment aligns with the model’s intended service population
Recruitment process standards are met.
When there are exceptions (e.g., extension of a model to a new population), the outcomes are confirmed as consistent with the original model.
What occurs when the local population is different?
Program caseload structure for the model is maintained as required by the model developers
Minimum standard for the model’s service location, focus, and frequency criteria are met
Supervision meets minimum standards
Do programs have adequate internal capacity to describe their services, measure outcomes, and use information to improve practice?
Programs are using data to inform practice
CCF evaluation plan actions
Adopt agreed-upon baseline-to-outcome assessment
Develop data sharing and data warehouse
Provide TA to address training in data collection and use in CQI
Individualized agency plans
Align our expectations to resources and capacity
Data elements to share based on model
Participant service summary (e.g., staff assigned, frequency and duration of contacts, start and end dates)
Baseline-to-outcome measures that are valid and aligned to the model’s principal evidence-based claims
Baseline and minimum six-month assessment
Protective Factors Survey
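The data elements above can be pictured as one shared record per participant. The sketch below is purely illustrative, not the state's actual data specification: field names (`participant_id`, `pfs_baseline`, etc.) and the score arithmetic are assumptions, standing in for whatever elements each program and model developer actually negotiate.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class ServiceSummary:
    """One participant's service summary, shared per model.

    Hypothetical field names; the real element list would be
    negotiated with each program and model developer.
    """
    participant_id: str          # de-identified code, never name or SSN
    model: str                   # e.g., "NFP" or "HFA"
    staff_assigned: str
    contacts: int                # number of home visits to date
    avg_contact_minutes: float   # average visit duration
    start_date: date
    end_date: Optional[date] = None
    # Baseline and minimum six-month follow-up on the same instrument
    pfs_baseline: Optional[float] = None   # Protective Factors Survey score
    pfs_six_month: Optional[float] = None  # six-month reassessment

    def change_score(self) -> Optional[float]:
        """Baseline-to-outcome change, if both assessments exist."""
        if self.pfs_baseline is None or self.pfs_six_month is None:
            return None
        return self.pfs_six_month - self.pfs_baseline
```

A record like this keeps the service summary and the baseline-to-outcome measures together, so a program-level report can compute change scores only for participants who completed both assessments.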
The essential role of CQI
Training and technical assistance as integrated actions with evaluation
Work with existing management information systems or help develop the systems
With functioning MIS, develop data sharing agreements and transfer de-identified data
Without MIS, develop and support a data entry system aligned with your model reporting and extract de-identified data
Create a state HV data warehouse that can produce on-demand reports and support CQI in service delivery
Program level analysis of services and outcomes
Minimum quarterly data reporting
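One way to picture the de-identification step before data leaves a program for the state warehouse is a salted one-way hash of the direct identifier. This is a minimal sketch under stated assumptions, not the project's actual procedure: the `client_name` key and the local per-program salt are hypothetical, and a real agreement would also address dates of birth, addresses, and other indirect identifiers.

```python
import hashlib

def de_identify(record: dict, salt: str) -> dict:
    """Replace a direct identifier with a salted one-way hash.

    Assumes records are dicts with a hypothetical 'client_name' key.
    The salt stays with the local program, so the warehouse cannot
    reverse the codes, yet the same client always maps to the same
    code, which lets quarterly reports track baseline-to-outcome
    change without ever holding names.
    """
    out = dict(record)                      # leave the local record intact
    name = out.pop("client_name")           # strip the direct identifier
    digest = hashlib.sha256((salt + name).encode()).hexdigest()
    out["client_code"] = digest[:12]        # stable de-identified code
    return out
```

Because the code is stable within a program, successive quarterly extracts can be linked at the warehouse for program-level analysis while names never leave the agency.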
Opportunity to examine what a portfolio EBHV approach requires
Recognize that programs need to be active partners with resources and support
Phased development, TA and Training
Central role of open-ended CQI and information driven decision-making
Pace and scope of effort is rate-limited by available resources