
Understanding Evidence-based Programs


Presentation Transcript


  1. Understanding Evidence-based Programs Mark Small, J.D., Ph.D. Clemson University

  2. How would you describe your level of knowledge and experience in evidence-based programming? a) Novice (Heard the term, not sure what it means) b) Apprentice (Knowledgeable with some experience) c) Journeyman (Extensive experience using evidence-based programs) d) Master (Experience creating evidence-based programming)

  3. What is your current relationship with evidence-based programming? a) Just looking (Flirting with possibilities) b) Engaged (Committed to doing evidence-based programming) c) Married (Implementing evidence-based programming) d) Divorced (Separated because of irreconcilable differences)

  4. Ivory Tower of Babel

  5. Historical Use of the Term

  6. Evidence-based * • Evidence-based approaches • Evidence-based interventions • Evidence-based policies • Evidence-based practices • Evidence-based programs • Evidence-based strategies • Evidence-based treatments

  7. Poll Question Which of the following is not a true citation? a) Evidence-based assessment (Achenbach, 2005) b) Evidence-based culture (Ganju, 2006) c) Evidence-based kernels (Embry, 2010) d) Evidence-based practice models (Bridge et al., 2008) e) Evidence-based quality improvement (Grimshaw et al., 2006) f) Evidence-based decision-making (Chorpita, Bernstein, & Daleiden, 2008) g) Evidence-based love-making (Masters et al., 2010)

  8. Definitions of “evidence-based” • “The integration of best research evidence with clinical expertise and patient values.” (Institute of Medicine of the National Academies) • “The integration of the best available research with clinical expertise in the context of patient characteristics, culture, and preferences.” (American Psychological Association)

  9. Why the Interest? • Body of scientific evidence has reached a critical mass • Public accountability • Efficiency—don’t need to reinvent the wheel • Increases the likelihood that programs will have the impact that they were designed to produce • Evidence helps sell the program to funders, stakeholders, and potential audiences • Data may be available to estimate cost effectiveness
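
The last bullet lends itself to simple arithmetic. Below is a minimal cost-effectiveness sketch in Python; every figure in it is hypothetical and purely illustrative.

  # Cost-effectiveness back-of-the-envelope; all figures are hypothetical.
  total_cost = 50_000.0       # annual program cost in dollars
  participants = 200          # youth served per year
  successful_outcomes = 120   # participants meeting the outcome criterion

  print(f"Cost per participant: ${total_cost / participants:,.2f}")                # $250.00
  print(f"Cost per successful outcome: ${total_cost / successful_outcomes:,.2f}")  # $416.67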

  10. Evidence-based Program (EBP) Registries: What are they? • Registries assist the public in identifying interventions and programs that • Have been scientifically tested, i.e., have an “evidence base” • Can be readily disseminated to the field • May fit specific needs and resources

  11. What do Registries Contain? • Although each registry structures and presents its program information differently, most contain: • Descriptive information for each program listing • Quality of Research (QOR) ratings, at the outcome level • Readiness for Dissemination (RFD) ratings • A list of studies and materials reviewed • Contact information to obtain more information on studies and implementation of the program
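
To make the typical listing concrete, the sketch below models those common elements as a Python dataclass. This is an illustration under stated assumptions, not any registry's actual schema: the class and field names are invented here, and the 0-4 rating scale mirrors the convention NREPP used for QOR and RFD scores.

  from dataclasses import dataclass, field

  @dataclass
  class RegistryEntry:
      """One program listing, mirroring the common registry elements above (illustrative)."""
      name: str
      description: str
      # QOR ratings are assigned per outcome, so map outcome name -> rating (0-4 assumed)
      quality_of_research: dict[str, float] = field(default_factory=dict)
      readiness_for_dissemination: float = 0.0   # RFD rating (0-4 assumed)
      studies_reviewed: list[str] = field(default_factory=list)
      contact: str = ""

  # Hypothetical listing, purely for illustration
  entry = RegistryEntry(
      name="Example Prevention Program",
      description="Hypothetical school-based prevention curriculum.",
      quality_of_research={"substance use": 3.2, "school attendance": 2.8},
      readiness_for_dissemination=3.5,
  )
  print(entry.quality_of_research["substance use"])  # 3.2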

  12. Registries of Evidence-based Programs • Federal Government • State Government (HI, NY) • Foundations • Universities • International • Government • Non-government

  13. Example Registries • National Registry of Evidence-based Programs and Practices (nrepp.samhsa.gov) • What Works Clearinghouse (ies.ed.gov/ncee/wwc) • Australian Research Alliance for Children and Youth Safe and Sound (aracy.org.au) • Promising Practices Network (PPN), RAND Corporation (promisingpractices.net) • Safe and Sound (Collaborative for Academic, Social, and Emotional Learning) (casel.org) • FindYouthInfo.gov: Evidence-Based Program Directory (findyouthinfo.gov)

  14. Interagency Working Group on Youth Programs: Membership • Corporation for National and Community Service • Office of National Drug Control Policy • U.S. Department of Agriculture • U.S. Department of Commerce • U.S. Department of Defense • U.S. Department of Education • U.S. Department of Health and Human Services (Chair) • U.S. Department of Housing and Urban Development • U.S. Department of Justice (Vice-Chair) • U.S. Department of Labor • U.S. Department of the Interior • U.S. Department of Transportation

  15. Questions to consider when selecting and implementing Evidence-based Programs • Does your organization have the capacity to address the problem? • Are you alone? • Do you know, really know, what the problem is? • Have you considered EB programs to solve the problem? • Do you know if your approach is working? • Are you faithful (showing fidelity to the program)? • Can you sustain your efforts? • Are you doing process and outcome evaluations?

  16. Fidelity: How faithful is good enough? • Which of the following is the most faithful replication of the program, LOVE? a) LOVE b) LOE c) love d) LOVE

  17. LOVE = LOVE?

  18. LOVE = LOE

  19. LOVE = love?

  20. LOVE = LOVE
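
The LOVE exercise above can be restated in code. The sketch below scores replications with a naive character-level similarity (surface_fidelity is an invented helper, not a real fidelity instrument): dropping a component (LOE) still scores high, while a substantively trivial case change (love) scores zero, which is why fidelity must be judged against a program's core components rather than surface resemblance.

  from difflib import SequenceMatcher

  def surface_fidelity(original: str, replication: str) -> float:
      """Naive character-level similarity in [0, 1]; illustrative only."""
      return SequenceMatcher(None, original, replication).ratio()

  for replication in ("LOVE", "LOE", "love"):
      print(f"{replication!r}: {surface_fidelity('LOVE', replication):.2f}")
  # 'LOVE': 1.00 -> identical text (though slide formatting may still differ)
  # 'LOE':  0.86 -> a dropped component still scores high
  # 'love': 0.00 -> a trivial case change scores zero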

  21. Poll Question • What is the best description of your experience in collecting data for youth and family programming? a) Joyous (equivalent to spa day) b) Happy ($25 scratch-off lottery ticket) c) Neutral (Zen-like detachment) d) Mild Discomfort (Slight headache) e) Painful (Dentistry level: root canal)

  22. Evaluation: Pre-packaged vs. Homemade Outcomes

  23. Evaluating EBP – Broadening the evaluation • Formative evaluation – documentation, fidelity, and modifications. • Summative evaluation – outcomes and their replication.

  24. Evaluating evidence-based programs Evaluation of an EBP should provide proof that the program works for you, but it should also help scale the program so that it can be generalized and adopted in other settings.

  25. Evaluation methods • Same as indicated in manuals • Same as indicated for non-evidence-based programs • Depend on the program’s identified goals, the advantages and disadvantages of the measures, and available resources.
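
As one concrete instance of the last point, here is a minimal pre/post outcome-evaluation sketch using a paired t-test. The scores are invented for illustration; in practice the measure and test would follow the program's identified goals and available resources.

  from scipy.stats import ttest_rel  # paired-samples t-test

  # Hypothetical pre/post scores for eight participants on one outcome measure
  pre  = [12, 15, 11, 14, 13, 10, 16, 12]
  post = [15, 18, 13, 17, 14, 13, 19, 15]

  t_stat, p_value = ttest_rel(post, pre)
  print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
  # A small p-value suggests pre-to-post change, but with no comparison group
  # it cannot separate program effects from other influences (hence the call
  # for both process and outcome evaluation earlier in the deck).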

  26. Summary Points • Well-defined problems and goals for EBPs are key components of program selection, implementation, evaluation, and sustainability • Modifications made to a program should be well documented and taken into account when the evaluation is planned • Evaluating EBPs requires more than following the manual’s guidelines • Feasibility should be balanced against best fit when selecting appropriate EBPs • It is premature to require all programs to be evidence-based • But we would be foolish to ignore the growing research evidence and EBP principles • Evidence-based should be a standard to aspire to • Locally developed programs still need to be evaluated and improved

  27. Thank You!
