
Presentation Transcript


  1. How Much Evidence is Needed? 2018 Dissemination & Implementation Science Workshop. Edward Ellerbeck, MD, MPH, Kansas University Medical Center

  2. PAR-18-007: Participating Organizations

  3. Objectives • Provide a framework for addressing evidence in implementation science research • Identify key features of Hybrid implementation-effectiveness designs

  4. Traditional Translation Framework: Efficacy → Effectiveness → Implementation → Dissemination & Diffusion (a pipeline spanning 20+ years)

  5. Conceptual Model of Implementation Research. Proctor et al.; Adm Policy Ment Health. 2009 Jan;36(1)

  6. Translating Evidence into Practice: Implementation Strategy(ies) → Intervention Tool → Processes of Care → Health Outcomes

  7. Translating Evidence into Practice: Smoking Cessation. Practice Facilitation → 5 As for Smoking Cessation → Cessation Counseling & PTX → Smoking Cessation

  8. Scientific Premise

  9. Evidence for the Intervention: Practice Facilitation → 5 As for Smoking Cessation → Cessation Counseling & PTX → Smoking Cessation

  10. Pure Implementation study • Rationale: • Effectiveness of intervention clearly established • Gaps in uptake of the intervention • Justify the costly efforts to support implementation (proposed implementation is feasible) Curran et al. Med Care. 2012; 50(3):217-26

  11. Pure Implementation study: Practice Facilitation versus Audit & Feedback as alternative strategies for implementing the 5 As for Smoking Cessation (Ask, Assess, Advise, Assist, Arrange)

  12. Evidence for the Implementation Strategy: Practice Facilitation → 5 As for Smoking Cessation → Cessation Counseling & PTX → Smoking Cessation

  13. 73 Implementation Strategies. Powell et al. Implementation Science (2015) 10:21

  14. Implementation research:Evidence & Scientific Premise • Evidence (for intervention) • RCTs linking intervention to outcome • Practice guidelines • Pilot studies • Models and theories • Evidence (for implementation strategy) • Impact for other interventions • Pilot studies • Theoretical framework

  15. Limitations of Implementation-only studies • Interventions are generally not implemented within the same context or by the same actors as in the original efficacy studies • Interventions may not have the same outcomes in new contexts or with new actors • Critical features of interventions may be lost in translation (fidelity) • Cost-effectiveness can’t be assessed in implementation-only studies

  16. Hybrid Designs https://cyberseminar.cancercontrolplanet.org/implementationscience/

  17. Hybrid trial designs (a continuum from Effectiveness Research to Implementation Research) • Hybrid Type 1: test the intervention; collect data on implementation • Hybrid Type 2: test the intervention and the implementation strategy simultaneously • Hybrid Type 3: test the implementation strategy; observe intervention outcomes
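
The typology above lends itself to a compact summary. The following minimal Python sketch (added here for illustration; it is not from the talk, and the aim phrasings simply mirror the slide) restates the three hybrid types as a small lookup structure:

# Minimal sketch: the hybrid typology from the slide as a plain dictionary.
HYBRID_TYPES = {
    "Hybrid Type 1": ["test the intervention", "collect data on implementation"],
    "Hybrid Type 2": ["test the intervention and implementation simultaneously"],
    "Hybrid Type 3": ["test the implementation strategy", "observe intervention outcomes"],
}

for design, aims in HYBRID_TYPES.items():
    print(f"{design}: {'; '.join(aims)}")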

  18. Hybrid designs: rationale • Speed/improve translation of evidence into practice • How to implement • Barriers to implementation • Understand effectiveness in real clinical settings • Bring fidelity & ‘critical components’ to the forefront • Identify the ‘costs’ of translation into practice

  19. Dissemination and Implementation Research in Health (R01s) • Initial issue in 2002: PA-02-131 • Latest issue: PAR-18-007 • PAR-13-055: 48 funded R01s; 6 (12%) ‘hybrid’ • PAR-16-238: 25 funded R01s; 8 (32%) ‘hybrid’ • Type I: 4 • Type II: 1 • Type III: 3

  20. Hybrid Type 1 Design • Test the effectiveness of the intervention while gathering information on implementation • Rationale: • Good preliminary data, but further effectiveness studies still needed • Applying the intervention to a new population, setting, or delivery method • Adapting the intervention to a new setting/context • Intervention is of minimal risk

  21. Hybrid Type 1 implementation questions • What were the facilitators/barriers to delivering the intervention? • What were the facilitators/barriers to sustaining the intervention after the study was completed? • How might the intervention be changed to improve adoption and sustainability?

  22. Hybrid Type 1 Design: clinics or patients randomized to Intervention vs. Control; the Clinical Outcome is the primary endpoint, with Implementation Barriers & Facilitators assessed alongside
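
To make this layout concrete, here is a minimal, hypothetical Python sketch (all names and rates are illustrative placeholders, not data from any study): patients are randomized to intervention vs. control, the clinical outcome drives the primary comparison, and implementation observations are recorded alongside rather than hypothesis-tested.

import random

random.seed(1)

# Randomize patients to intervention vs. control (the effectiveness question).
patients = [{"id": i, "arm": random.choice(["intervention", "control"])}
            for i in range(200)]

# Simulate a binary clinical outcome; the assumed rates are placeholders.
for p in patients:
    p["outcome"] = random.random() < (0.25 if p["arm"] == "intervention" else 0.10)

# Primary analysis: outcome rate by arm.
for arm in ("intervention", "control"):
    in_arm = [p for p in patients if p["arm"] == arm]
    print(arm, round(sum(p["outcome"] for p in in_arm) / len(in_arm), 3))

# Secondary (Type 1) data: barriers and facilitators are logged, not tested.
implementation_log = [
    {"site": "clinic_A", "note": "barrier: staff turnover"},
    {"site": "clinic_A", "note": "facilitator: EHR prompt"},
]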

  23. Hybrid Type I: PAR-16-238 examples • M-health intervention to increase adherence to triage of HPV+ women who have performed self-collection • SURVIVORLINK: Scalability of an electronic personal health record for cancer survivors and caregivers at pediatric cancer centers • Implementing a virtual tobacco treatment intervention in community oncology practices • Translating an efficacious illness management intervention for African American youth with poorly controlled asthma to real world settings

  24. Hybrid Type I: Translating an efficacious intervention for AA youth with poorly controlled asthma to a ‘real world’ setting* • Clinical Intervention: • ‘Reach for Control’: Home-based, multi-component program delivered by CHWs • Control: Agency’s ‘standard program’ • Design: • Patient-level RCT in one hospital emergency department/community agency • Implementation strategy: Training • Implementation evaluation • Reach (identification, screening, and referral in the ED) • Barriers (from perspective of ED staff and CHWs) • Fidelity of the intervention *Deborah Ellis – Wayne State

  25. Children with asthma in the ED randomized to CHW-delivered ‘Reach for Control’ vs. standard care; clinical outcome: Asthma Exacerbations; implementation measures: Reach, Barriers, and Fidelity

  26. Hybrid Type I: Translating an efficacious intervention for AA youth with poorly controlled asthma to a ‘real world’ setting* • Evidence base: Demonstrated effectiveness of a similar intervention delivered in a different setting by trained professionals* • Primary Aim: Evaluation of the Reach for Control intervention (RCT design) • Secondary Aims: • Identify barriers and facilitators to the intervention • Examine intervention fidelity • Conduct a cost analysis

  27. Hybrid Type II Design • Simultaneously test the intervention and the implementation strategy • Intervention effectiveness study within: • a controlled implementation study (factorial design) • a non-randomized study of the implementation strategy (e.g., stepped-wedge) • Rationale: • Enough evidence to support studying both the intervention and the implementation • Evidence not sufficient for the specific context or population • System/policy demands for evidence on both the intervention and the implementation • Reasonable evidence that the implementation strategy could be supported/sustained within the given context

  28. Hybrid Type II schematic: clinics randomized to Implementation Strategy 1 vs. Implementation Strategy 2 vs. Control; the Intervention is evaluated against the Clinical Outcome while Implementation Barriers & Facilitators are assessed
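
As an illustration of the factorial layout sketched above, here is a small hypothetical Python example (clinic counts and arm names are made up for the sketch): clinics are randomized to an implementation strategy or control, and patients within each clinic are randomized to the clinical intervention vs. control, so both questions are tested in one trial.

import random
from collections import Counter

random.seed(2)

# Cluster-level factor: randomize clinics to an implementation arm.
strategies = ["impl_strategy_1", "impl_strategy_2", "impl_control"]
clinic_ids = [f"clinic_{i:02d}" for i in range(12)]
random.shuffle(clinic_ids)
implementation_arm = {c: strategies[i % 3] for i, c in enumerate(clinic_ids)}

# Patient-level factor: randomize patients within each clinic to the
# clinical intervention vs. control.
assignments = []
for clinic in clinic_ids:
    for _ in range(20):
        assignments.append((implementation_arm[clinic],
                            random.choice(["intervention", "control"])))

# Every cell of the factorial is populated; in practice a mixed-effects
# model with a random clinic effect would be used for the analysis.
print(Counter(assignments))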

  29. Hybrid Type II: PAR-16-238 example • A multicenter trial of a shared decision support intervention for patients offered implantable cardioverter-defibrillators: DECIDE-ICD Trial

  30. Breakout #4: Paul Estabrooks • Operationalizing Type II Hybrid Effectiveness Implementation Designs

  31. Hybrid Type III Design • Tests the implementation strategy while simultaneously gathering data on clinical outcomes • Conditions: • Strong ‘demand’ for implementation despite limited data on how the intervention influences outcomes • Strong face-validity for the clinical intervention and/or implementation strategy • At least indirect evidence to support the clinical intervention and/or implementation strategy • Effectiveness of the intervention may be vulnerable to implementation fidelity or may differ in the new context

  32. Hybrid Type III schematic: clinics randomized to Implementation Strategy 1 vs. Implementation Strategy 2; Implementation Barriers & Facilitators are the primary focus, while the Clinical Intervention and Clinical Outcome are observed

  33. Hybrid Type III: PAR-16-238 examples • IAMSBIRT: Implementing alcohol misuse SBIRT in a national cohort of pediatric trauma centers • Increasing implementation of evidence-based interventions at low-wage worksites • Effectiveness and implementation of MPATH-CRC: A mobile health system for colorectal cancer screening

  34. Hybrid III – MPATH-CRC: A mobile health system for CRC screening • Clinical Intervention: • MPATH-CRC: iPad-delivered CRC decision support and self-ordering, with text-message follow-up • Design: • Primary care clinics (n=28) cluster-randomized to implementation strategy • Implementation strategies: • ‘High-touch’: clinic facilitation, clinic champions, data feedback, follow-up training and adaptation • ‘Low-touch’ • Implementation framework: • Technology Acceptance Model & Dynamic Sustainability Framework

  35. Hybrid study designs and Implementation Frameworks. Nilsen, Implementation Science (2015) 10:53

  36. Clinics randomized (N=28) to ‘High-touch’ vs. ‘Low-touch’ clinic facilitation; Implementation Barriers & Facilitators assessed; the M-Path iPad CRC decision support* serves as the clinical intervention leading to the Clinical Outcome. *Evidence-based intervention
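
A rough Python sketch of this cluster-randomized Type III layout follows (every rate below is a simulated placeholder, not an MPATH-CRC result): 28 clinics are split between ‘high-touch’ and ‘low-touch’ facilitation, the primary implementation comparison is clinic-level screening rates by arm, and the clinical intervention’s outcome is simply observed within clinics.

import random
import statistics

random.seed(3)

# Cluster randomization: 28 clinics, half to each implementation arm.
clinics = [f"clinic_{i:02d}" for i in range(28)]
random.shuffle(clinics)
arm = {c: ("high_touch" if i < 14 else "low_touch") for i, c in enumerate(clinics)}

# Primary (implementation) comparison: clinic-level screening rates by arm.
# The simulated uplift for 'high_touch' is purely an assumption for illustration.
screening_rate = {c: min(1.0, random.uniform(0.30, 0.55)
                         + (0.10 if arm[c] == "high_touch" else 0.0))
                  for c in clinics}

for a in ("high_touch", "low_touch"):
    rates = [screening_rate[c] for c in clinics if arm[c] == a]
    print(a, round(statistics.mean(rates), 3))

# Secondary (intervention) data: screening completion observed within clinics,
# e.g. before vs. after the tool is introduced (values are placeholders).
pre_post = {"pre": 0.38, "post": 0.52}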

  37. Hybrid III – MPATH-CRC: A mobile health system for CRC screening • Evidence base: In an R01-funded RCT of 450 patients, MPATH doubled the rate of CRC screening completion* • Aim 1: Evaluation of implementation - screening rates in ‘high-touch’ vs. ‘low-touch’ clinics (RCT design) • Aim 2: Evaluation of the intervention - CRC screening completion at 16 weeks post-intervention (nested, pre-post design) • Aim 3: Maintenance of the intervention; barriers and facilitators to implementation (based on provider/staff interviews) *Ann Intern Med. 2018 Apr 17;168(8):550-557

  38. Hybrid III - issues • Is there ever a time when we wouldn’t want to collect ‘effectiveness’ outcomes as part of an implementation study? • How much ‘rigor’ is required for the ‘effectiveness’ component of a type III study? • Can you use pragmatic or secondary sources for effectiveness data?

  39. Hybrid designs & RE-AIM • RE-AIM: a very common evaluation framework for hybrid trials • Must be accompanied by clear, measurable elements for each component • RE-AIM does not provide a: • Theoretical implementation model • Framework for determinants of implementation • Model for change

  40. Hybrid designs and scientific rigor • Greater demand for rigor with the primary aim • Consider an RCT • Greater demand for pragmatism with implementation strategies

  41. How much ‘evidence’ of effectiveness is required? • Evidence needed for both: • Clinical intervention • Implementation strategies • ‘Evidence’ supported by: • Prior studies in different context/population • Pilot work • Theoretical framework • Need for evidence may justify a hybrid design

  42. How much ‘evidence’ of effectiveness is required? • At the conclusion of the trial, there must be strong empirical evidence that the clinical intervention (as implemented) works • Based on prior studies (or) • Based on effectiveness data from your own study (Hybrid Type I or II design)

  43. How much ‘evidence’ of effectiveness is required? • At the outset of the trial, there must be strong empirical evidence and/or theoretical support that the implementation strategy will work • Effectiveness studies: tightly control the implementation (e.g., employ and train the actors) • Hybrid I or II: effectiveness of the clinical intervention can’t be studied if the intervention is not implemented • Hybrid III or implementation-only: a theoretical framework is needed to justify testing the implementation strategy

  44. Questions?
