
Building Evidence in Practice



Presentation Transcript


  1. Building Evidence in Practice Module 3 Elaine A. Borawski, PhD

  2. If We Want More Evidence-Based Practice… …We Need More Practice-Based Evidence. ~ Lawrence W. Green, DrPH University of California at San Francisco

  3. Overview Understand what “evidence-based” means Become familiar with evidence-based programs and how to locate them Understand why evidence-based practices are important in public and community health Become familiar with strategies for selecting an appropriate evidence-based practice for your own project.

  4. Question What do you think of when you hear the term “evidence-based”?

  5. Answer An evidence-based program has been: Implemented with a group Evaluated Found to be effective.

  6. What is Evidence? A continuum from objective to subjective: Surveillance Data • Systematic Reviews of Multiple Intervention Studies • An Intervention Research Study • Program Evaluation • Word of Mouth • Personal Experience

  7. How do we define evidence in science? Scientific evidence is evidence that serves to either support or counter a scientific theory or hypothesis. Such evidence is expected to be empirical and gathered in accordance with the scientific method.

  8. What does it mean to have empirical evidence? Empirical evidence is a source of knowledge acquired by means of observation or experimentation.

  9. Why the Fuss? More Federal funders are requiring program planners to use evidence-based programs. Some consider evidence that is proven through research (explicit). Some consider evidence that is derived from experience or practice (tacit). The best evidence may be a combination of research and practice.

  10. TYPES OF EVIDENCE • TYPE I • Established through observational research. • Provides evidence for a link between a preventable risk factor (e.g., sedentary lifestyle) and a specific health outcome (e.g., obesity). • EXAMPLE: Individuals who exercise regularly have less cardiovascular disease, diabetes, and obesity. Many studies have confirmed this relationship. • However, it does not tell us HOW best to help people establish regular exercise routines.

  11. TYPES OF EVIDENCE • TYPE II: Evidence-Based Practices • Established through clinical or behavioral trials. • A specific intervention or approach has been found to be effective in changing a behavior that has been linked to a specific health outcome. • EXAMPLE: A number of studies have shown that walking programs that include incentives, motivational tools such as pedometers, and social support are more likely to result in participants adopting a regular walking schedule.

  12. Research tells us that individuals with controlled diabetes are better at identifying carbohydrates in foods than those whose diabetes is not in control (Type I). How do we take this information and turn it into an intervention? Deliver a carb-counting educational intervention, then compare carb knowledge AND A1c levels two months later against a group that did not receive the intervention. If it works, that's Type II evidence. If it doesn't, there's still Type I evidence, just not an evidence-based practice.
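The carb-counting example above can be sketched quantitatively. The snippet below is a minimal, hypothetical illustration of how Type II evidence might be assessed: comparing the change in A1c between an intervention group and a control group using Welch's two-sample t statistic. All data values are invented for illustration and are not from the study described.

```python
# Hypothetical sketch: did the carb-counting intervention lower A1c more
# than no intervention? All numbers below are invented for illustration.
from statistics import mean, stdev

def welch_t(a, b):
    """Welch's two-sample t statistic (allows unequal variances)."""
    var_a, var_b = stdev(a) ** 2, stdev(b) ** 2
    return (mean(a) - mean(b)) / ((var_a / len(a) + var_b / len(b)) ** 0.5)

# Change in A1c (%) two months after the intervention (negative = improvement)
intervention = [-0.8, -0.5, -1.1, -0.7, -0.9, -0.4]
control      = [-0.1,  0.2, -0.3,  0.0, -0.2,  0.1]

t = welch_t(intervention, control)
print(f"mean change: intervention {mean(intervention):.2f}, "
      f"control {mean(control):.2f}, Welch t = {t:.2f}")
```

A real evaluation would also report a p-value or confidence interval and account for sample size and confounders; this sketch only shows the shape of the comparison that turns Type I knowledge into Type II evidence.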

  13. Sources for Evidence Read, read, read Academic journals Gov't websites (CDC, NIH) Advocacy websites Lay journals/magazines that summarize research Guide to Community Preventive Services – http://www.thecommunityguide.org/index.html

  14. Covers a variety of public health topics http://www.thecommunityguide.org/index.html

  15. Recommendation Summaries Detailed Summaries of Studies Conducted

  16. Systematic reviews at your fingertips In seven of eight studies reviewed, increases in the price of tobacco products result in decreases in both the number of people who use tobacco and the quantity they consume. See original publications for additional information

  17. Let’s check out some evidence http://www.thecommunityguide.org/index.html

  18. Your Experience What has your experience been with evidence-based programs? Where have you heard of them before? Have any of you used these programs in the past? Are any of YOUR programs considered evidence-based?

  19. Advantages to Using Evidence-Based Programs • Effective in the study populations • Cost effective • Shorten the time it takes to develop a program • Reduce the time it takes to research a community • Help narrow the evaluation.

  20. Barriers to Evidence-Based Programs • May limit my/our creativity. • Take too much time and/or money. • Often difficult to replicate in community settings (translation). • Too scientific. • My community is unique. An evidence-based program will not be as appropriate as if I developed the program myself. • I do not know what evidence-based programs are or where to find them.

  21. Finding an Evidence-Based Program

  22. Objectives • Be able to find evidence-based program resources. • Know how to use search options to narrow your program choices and find out what programs will and will not work with your community. • Alternative Sources for Evidence-Based Programs • Talking With the Principal Investigator • Finding an Evidence-Based Program: Case Study.

  23. Selecting Evidence-Based Community Programs • Peer reviewed literature and research • National Registry of Evidence-Based Programs and Practices (NREPP) • http://www.nrepp.samhsa.gov • Research-Tested Intervention Programs (RTIPs) • http://rtips.cancer.gov/rtips/index.do • Guide to Community Preventive Services • http://www.thecommunityguide.org/index.html

  24. Criteria for Selecting a Program • Thinking about your organization and the target population for your project: • Was the program conducted with people who had similar • Socioeconomic status • Resources • Ethnicity • Traditions • Priorities • Community structure and values. • Is the program appropriate for the age of your audience? • Choose a program that is well-matched with: • Your health topic (e.g., breast or cervical cancer, nutrition, physical activity) • What your audience is already doing about the health issue.

  25. Criteria for Selecting a Program • Context for intervention • Coverage across the range of populations or settings involved in a health concern • Knowledge of which populations and settings interventions will be effective for, and under what conditions • Role of race, ethnicity, and culture • Staff creativity, experience • Balancing fidelity and adaptation (Allensworth & Fertman, 2010)

  26. Criteria for Selecting a Program • These strategies can include: • Giving information • Enhancing skills • Improving the services and/or support systems that exist • Changing incentives or barriers that maintain the problem • Promoting access • Making suggestions for policy changes.

  27. Resources • Remember to avoid a program that takes more resources than you have. • Different evidence-based programs will take different amounts of money, labor, and/or time. • Whenever you can, speak with the team that developed the program or product in which you are interested. They can share information about the program that may be helpful.

  28. Evidence-Based Practice in Action: LifeSkills Training

  29. What is LifeSkills Training? Substance Abuse Prevention and Personal Development Curriculum Developed by Dr. Gilbert Botvin in the late 1970s Identified as a “Program that Works” by the Centers for Disease Control and Prevention Effective substance abuse prevention focuses on changing behavior

  30. The 6 Fundamentals of LST • LST is grounded in the three domains of Cognitive-Behavioral Theory • LST is an evidence-based program • LST changes thinking and behavior (cognitive-behavioral skills) • Booster sessions increase effectiveness • Interactive teaching methods enhance learning • Less is more

  31. Summary of Evaluation Results Middle/High School Curriculum • Reduces substance abuse by up to 87% • Effective in reducing tobacco, alcohol, marijuana, inhalants, narcotics & hallucinogens • Effects last for at least 6 years • Effective in reducing aggressive / violent behavior • Researched and proven effective with African-American, Hispanic, White, Urban, Suburban and Rural Youth

  32. Summary of Evaluation Results Elementary School Results • Reduces annual smoking rates by up to 63% • Increases self-esteem • Findings show that the evidence-based LST model is equally effective at the Elementary level

  33. Summary of Evaluation Results • Evidence of Effectiveness • Effectiveness established through rigorous scientific inquiry • Durability of effects • Learn, retain, transfer • Ease of Implementation • Fits with existing infrastructures

  34. Building Evidence Types of Evaluation

  35. Formative Evaluation • Asks: is an element of a program or policy (e.g., materials, messages) feasible, appropriate, and meaningful for the target population? • Often done in the planning stages of a new program • Can be used to examine contextual factors • “What are the attitudes among school officials toward the proposed healthy eating program?” • “Are there certain schools that have healthier food environments than others?”

  36. Process Evaluation • Assesses the way a program is being delivered, rather than the effectiveness of a program. Asks: • Are all activities being implemented as planned? • Is the intervention feasible and acceptable? • Are you reaching intended recipients? • Provides shorter-term feedback on program implementation, content, methods, participant response, practitioner response.

  37. [Chart: California Local Health Department funding by core indicator (cessation, bars, licensing, outdoor, SSD), 2001/04 vs. 2004/07.] In 1998, the smoke-free bar law went into effect. In 2002, California launched the STORE campaign; one goal was to increase retail licensing ordinances.

  38. Types of Evaluation Impact and outcome evaluations are conducted only after a program has had adequate time to yield measurable changes. The amount of time needed depends on the nature of the program and the changes expected. Each type of evaluation requires different designs and data collection methods.

  39. Impact Evaluation Asks: Has the initiative been successful in achieving intended outcomes? • long-term or short-term feedback on knowledge, attitudes, beliefs, behaviors • uses quantitative or qualitative data • also called summative evaluation • probably more realistic endpoints for most public health programs and policies

  40. [Chart: California and U.S.-minus-California adult per capita cigarette pack consumption, 1984/85–2004/05, annotated with the $0.02, $0.25, and $0.50 tax increases. Source: California State Board of Equalization (packs sold), California Department of Finance (population), U.S. Census, Tax Burden on Tobacco, and USDA. Data are by fiscal year (July 1–June 30). Prepared by the California Department of Health Services, Tobacco Control Section, February 2006.]

  41. Outcome Evaluation • Provides long-term feedback on health status, morbidity, mortality, and QOL. • Often difficult to attribute to a particular program because it takes so long for the effects to be seen and because outcomes are influenced by many factors. • Usually requires experimental or quasi-experimental (rather than observational) approaches to link program influences to outcomes. • Usually relies on quantitative methods • Often used in strategic plans

  42. [Chart: Acute myocardial infarction death rate per 100,000 (2008–2010).]

  43. Program Evaluation Designs [Chart: rate over time, with the point of intervention initiation marked.] Can we conclude that the intervention is effective?

  44. Program Evaluation Designs [Chart: rate over time, with the point of intervention initiation marked.] Can we conclude that the intervention is effective?
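Quasi-experimental designs like the California tobacco example are often summarized with a difference-in-differences calculation: the pre/post change in the intervention site minus the pre/post change in a comparison site, which nets out trends that affect both sites. The sketch below is hypothetical; all numbers are invented and do not reflect the actual California data.

```python
# Hypothetical difference-in-differences sketch for a quasi-experimental
# evaluation: compare the pre/post change in an intervention site against
# the pre/post change in a comparison site. Numbers are invented.
def diff_in_diff(pre_treated, post_treated, pre_comparison, post_comparison):
    """Change in the treated site minus change in the comparison site."""
    return (post_treated - pre_treated) - (post_comparison - pre_comparison)

# Per-capita packs sold per year (illustrative only)
ca_pre, ca_post = 120.0, 60.0   # intervention state
us_pre, us_post = 140.0, 110.0  # comparison (rest of U.S.)

effect = diff_in_diff(ca_pre, ca_post, us_pre, us_post)
print(f"estimated program effect: {effect:.1f} packs/person")  # -30.0
```

Both sites declined, but the intervention site declined by 30 more packs per person than the comparison site; attributing even that difference to the program still requires the parallel-trends assumption to hold.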

  45. Indicators for Impact and Outcome Evaluation Health indicators measure the extent to which targets in health programs are being reached. For evaluation purposes, indicators are not goals in themselves and should not be confused with program objectives and targets. Indicators provide valuable benchmarks. Program evaluation aimed at moving these benchmarks requires shorter-term (intermediate) markers, e.g., the number of smoke-free campuses.

  46. Questions?

  47. References • PowerPoint presentation adapted from “Using What Works: Adapting Evidence-Based Programs to Fit Your Needs.” U.S. Department of Health and Human Services, National Cancer Institute, 2006. http://cancercontrol.cancer.gov/use_what_works/start.htm • Fertman, C. I., & Allensworth, D. D. (Eds.). Health Promotion Programs: From Theory to Practice. Society for Public Health Education. ISBN: 978-0-470-24155-4
