
Implementing EBHV Programs with Fidelity - translating science into practice
2010 Strengthening Families Training Institute
March 16, 2010

Presented by:

Joan Sharp, MA

Executive Director

&

Nancy Gagliano, LICSW

Programs & Evaluation Director

Council For Children & Families



Topics we will cover

  • CCF Historical Funding Approach

  • New EBHV Dollars- New Game

  • Initial Logic Model for Funding EBHV

  • Assumptions, Expectations & Early Realizations

  • Design of an Evaluation Plan

  • What Does it Really Take to Implement EBHV Programs in Diverse Communities?



Council for Children & Families

  • Created by the legislature in 1982

  • Supported by state general fund, CBCAP, Children’s Trust Fund, private donations

  • Three Activities: Funding Community Based Programs, Public Awareness/Education & Partnerships



CCF Historical Funding Approach – organizational capacity building in the development and use of information to guide services

  • 12-15 programs each year; 3 year funding cycle

  • Parent Education/Training, Parent Support and Mentoring, and Home Visiting

  • Local communities choose their program type/focus based on local needs, capacity, and interest.

  • Capacity-Building Framework –

  • Community needs assessment

  • Research

  • Support programs in developing evaluation processes for quality assurance, program development and sustainability!



New EBHV Dollars….New Game!

  • In 2007, the Washington State legislature dramatically increased its investment in child abuse and neglect prevention and early school readiness by providing new dollars for implementation of EBHV programs.

  • $3.2 million over a 2-year period for CCF to fund Evidence Based Home Visitation programs across the state.

  • Earmarked:

    $185 K of the EBP funding for underserved rural and/or tribal applicants.

    $400 K based on input from NFP consortium

    $150 K per Thrive by Five demonstration site



New Dollars, New Game cont…

Which EBHV Models to Fund?

  • CCF Research Advisory Committee - approved EBHV models

  • Matrix- recommended for those submitting a proposal.

  • Three levels of evidence approved:

  • Best Support

  • Good Support

  • Promising Practice



A Portfolio is Created…

fast turn-around….

legislative approval

out to 13 different organizations

Implementing five different EBHV models:

  • NFP – Nurse Family Partnership

  • PAT – Parents as Teachers

  • STEEP – Steps Towards Effective Enjoyable Parenting

  • PCHP – Parent Child Home Program

  • Project SafeCare



October 2007: Logic model for funding EBHV programs



Everything in Perfect Order!

We had great intentions, perfect assumptions and then….



Home Visiting Works but… Assumptions and Expectations (don’t always meet reality!)

Home visiting is among the best-tested prevention and early intervention models

The relative lack of effective prevention strategies

The potential for feasible large scale community-based services where people live.

The promise of the research is not consistently supported in routine practice

We can’t launch, forget and get the same results.




Assumptions & Expectations

  • Implementing with fidelity yields effective practice - programs are willing to engage around fidelity implementation:

  • Good understanding of the model

  • Clear definition of model elements

  • Guidelines for model elements actually exist

  • Able to implement with some degree of fidelity

  • Model developers offer adequate technical assistance, monitoring, and support for implementation and development

  • Programs have internal capacity for outcomes and process evaluation - are using data to inform practice

  • Programs have organizational capacity – infrastructure and support internally



Early Realizations

Capacity Challenges –

Some organizations had limited to no understanding of:

  • logic models

  • collecting basic demographic data

  • contract compliance

  • confusion on reporting on outputs

  • Process Measures Related to Fidelity –

  • implementation demonstrating fidelity to the model.

  • “We are implementing the program with 100% fidelity.”



Early Realization… programs don’t know what they don’t know

Long Term Outcome:

Implement EBHV Programs w/ Fidelity = outcomes.

  • Asking the question was not enough!

  • programs said they were implementing with fidelity

  • did they really understand fidelity?

  • how did the different program models actually measure fidelity?

  • how consistently?

    Were programs going to achieve the outcomes that the models promised?



Early Realization… funders don’t know what they don’t know either!

Long Term Outcome (Funder):

Demonstrate child/parent benefits of significant degree to justify the investment of state dollars.

Document benefit of implementing multiple home visiting models under this state program.



Design of an Evaluation Plan: Call in WSU

  • Policy goals - system level outcomes to be addressed in the evaluation:

  • Create state standards for program delivery and improvement of quality in Washington State home visiting

  • Develop a learning community regarding home visiting and early intervention that can support progressive improvements in quality

  • Test the practicality and relative benefit of this multi-method approach to delivering home visiting as a sustainable part of the state continuum of care.

    Not enough money to do an RCT, and would it work with multiple models?



Where does Washington fit in the national landscape?

Most states have some level of state home visiting initiative, falling into two strategies:

Single strategy efforts dominated by Healthy Families America practice

Portfolio strategies involving local choice and control

Single strategy approaches have the strongest evaluations and as a result dominate the current policy discussion

Portfolio based initiatives have weak evaluations or are not collecting and reporting data

Addressing the portfolio model evaluation is a significant area of needed work




The original evaluation questions

Does the routine use of home visiting programs using various evidence informed protocols collectively result in better child and caregiver outcomes?

Can we demonstrate significant benefit to justify investment of state dollars?

Can we document benefit across a balanced portfolio approach to support the continuation of this approach?



Design of an Evaluation Plan (WSU)

Look at the research – comprehensive literature review around home visiting

Start with the programs before looking at the multi-method approach and child/parent outcomes



WSU Reviewed the Research: Evaluating EBHV & Implementing with Fidelity

Translation of evidence based home visiting models from randomized controlled trials into local program practice is very challenging.

Improving program quality and implementation of the model with fidelity is a major issue for the field.

Organizational conditions and capacity are the key to a successful implementation of an EBHV model.



Organizational conditions for adopting Evidence Based Programs (Fixsen et al., 2006)

  • Support for adoption across leadership & treatment staff

    • Organizational leadership skills to support adoption of new practices

    • Staff skill level – training in specific home visiting model skills

    • Information management system and use of data for quality improvement

  • Capacity to Implement

    • Staff retention

    • Supervisory capacity and skills

    • Family engagement capacity and skills

  • Capacity to develop & sustain information- driven problem solving

    • Quality improvement practices, staff development, continuing family engagement

    • Use of information and outcomes in program development



Site visit and the “Discussion Tool”

How does a program’s organizational capacity affect implementing with fidelity?

Not only do we need to ask core component/fidelity questions but we also have to find a way to assess organizational capacity.

We get a little help from our “FRIENDS” at the National Resource Center for Community Based Child Abuse Prevention



FRIENDS and the Tailored Discussion Tool

Integrating Evidence-Based Practices into CBCAP Programs: A Tool for Critical Discussions - utilized Appendix C, the Capacity Checklist for Implementing with Fidelity

CQI Self Assessment Document

WSU incorporated questions on data management capacity and programs’ ability to use data to inform program practice



Framework for site visits

Model Components/Fidelity

Staff Experience

Staff Training and Monitoring

Outcome Measurement/Quality Assurance

Community Capacity

Support Available from the Program Developer or Other Technical Assistance Provider

Funding Availability

Overall Assessment



The Reality Sets In – findings support the research

  • Programs vary in terms of organizational capacity to deliver their programs

  • Data collection and information use is a common area that needs further development and support

  • Existing outcome assessment of the model is either limited or involves measurement strategies that do not meet reliability and validity standards

  • Bottom line: programs need significant support in outcomes assessment and in using the information for program improvement and clinical decision making



CCF EBHV evaluation goals update

Policy goals

Create state standards for program delivery and assurance of quality EBHV dissemination in Washington State

Support a learning community regarding home visiting and early intervention to inform practice

Test the practicality and relative benefit of this multi-method approach

Program goals

Provide meaningful information and staff development that informs clinical decision making and program development through a continuous quality improvement process



Analysis and proposed actions to guide the evaluation plan

Implementing with fidelity is critical to producing intended model results

Emphasis on fidelity varies across models

Measuring fidelity varies widely

Support is inconsistent across models in addressing fidelity

CCF evaluation plan actions

Work with developers for fidelity measurement

Define common minimum standards across models

Develop data collection strategy with programs

Develop an evaluation TA plan with program sites when needed



Beginning to create a common fidelity framework for CCF programs

Recruitment aligns with the model’s intended service population

Recruitment process standards are met.

When there are exceptions (e.g., extension of a model to a new population), the outcomes are confirmed as consistent with the original model.

What occurs when the local population is different?

Program caseload structure for the model is maintained as required by the model developers

Minimum standards for the model’s service location, focus, and frequency criteria are met

Supervision meets minimum standards
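
The criteria above are easier to operationalize if each funded site reports them in a common structured form. The sketch below (Python, with hypothetical field and class names; not CCF's actual instrument) shows one way a per-site fidelity record could be captured across the five EBHV models.

```python
from dataclasses import dataclass

@dataclass
class FidelityChecklist:
    """Hypothetical per-site record mirroring the common fidelity criteria above."""
    program_site: str
    model_name: str                       # e.g., "NFP", "PAT", "STEEP", "PCHP", "SafeCare"
    recruitment_matches_population: bool  # recruitment aligns with the model's intended population
    recruitment_standards_met: bool       # recruitment process standards are met
    serves_new_population: bool           # exception: model extended to a population it was not tested on
    caseload_structure_maintained: bool   # caseload structure as required by the model developers
    service_standards_met: bool           # service location, focus, and frequency minimums
    supervision_standards_met: bool       # supervision meets minimum standards
    notes: str = ""

    def meets_minimum_fidelity(self) -> bool:
        """Illustrative roll-up: every core criterion must be met."""
        return all([
            self.recruitment_matches_population,
            self.recruitment_standards_met,
            self.caseload_structure_maintained,
            self.service_standards_met,
            self.supervision_standards_met,
        ])
```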



Analysis and proposed actions to guide the evaluation plan

Do programs have adequate internal capacity to describe their services, measure outcomes, and use information to improve practice?

Agency support

Information systems

Programs are using data to inform practice

CCF evaluation plan actions

Adopt agreed-upon baseline-to-outcome assessment

Develop data sharing and data warehouse

Provide TA to address training in data collection and use in CQI

Individualized agency plans



Creating a common data system

Align our expectations to resources and capacity

Data elements to share based on model

Participant demographics

Participant needs

Participant service summary (e.g., staff assigned, frequency and duration of contacts, start and end dates)

Baseline to outcome measures that are valid and aligned to the model’s principal evidence-based claims

Baseline and minimum six month assessment

Protective Factors Survey

CQI and its essential role

Training and technical assistance as integrated actions with evaluation
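
To illustrate the data elements listed above, here is a minimal sketch of a shared, de-identified participant record with a baseline and six-month reassessment. All field names are assumptions for illustration, not the actual CCF/WSU schema.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import Optional

@dataclass
class ParticipantRecord:
    """Illustrative shared record; field names are hypothetical, not the CCF schema."""
    participant_id: str                  # de-identified ID, never a name
    model_name: str                      # NFP, PAT, STEEP, PCHP, or SafeCare
    demographics: dict                   # participant demographics
    identified_needs: list               # participant needs at intake
    staff_assigned: str                  # service summary: staff, contacts, dates
    contacts_per_month: float
    service_start: date
    service_end: Optional[date] = None
    baseline_scores: dict = field(default_factory=dict)   # e.g., Protective Factors Survey
    six_month_scores: dict = field(default_factory=dict)  # minimum six-month reassessment

def score_change(record: ParticipantRecord, measure: str) -> Optional[float]:
    """Change from baseline to the six-month assessment for one measure, if both exist."""
    if measure in record.baseline_scores and measure in record.six_month_scores:
        return record.six_month_scores[measure] - record.baseline_scores[measure]
    return None
```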



A state home visiting management information system

Work with existing management information systems or help develop the systems

With a functioning MIS, develop data sharing agreements and transfer de-identified data

Without an MIS, develop and support a data entry system aligned with model reporting and extract de-identified data

Create a state HV data warehouse that can produce on-demand reports and support CQI in service delivery

CCF reports

Program level analysis of services and outcomes

Minimum quarterly data reporting



Conclusion

Opportunity to examine what a portfolio EBHV approach requires

Recognize that programs need to be active partners with resources and support

Phased development, TA and Training

Central role of open-ended CQI and information driven decision-making

Pace and scope of effort are rate-limited by available resources



Questions?

Joan Sharp

206-464-5493

[email protected]

Nancy Gagliano, LICSW

206-389-3297

[email protected]

