Presentation Transcript


  1. Evidence-based Application of Evidence-based Treatments Peter S. Jensen, M.D. President & CEO The REACH Institute REsource for Advancing Children’s Health New York, NY

  2. Effect Sizes of Psychotherapies (Weisz et al., 1995) [chart]: mean effect sizes for adults and for children & adolescents, comparing university-based studies with “real world” settings.
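     For reference, the effect sizes compared on this slide are presumably standardized mean differences, the usual metric in the Weisz et al. child-psychotherapy meta-analyses; the slide itself reports only the comparison, not the formula. A minimal statement of that definition, with the mean effect size taken over the k studies in each setting:

     $$ d = \frac{\bar{X}_{\text{treated}} - \bar{X}_{\text{control}}}{s_{\text{pooled}}}, \qquad s_{\text{pooled}} = \sqrt{\frac{(n_t - 1)s_t^2 + (n_c - 1)s_c^2}{n_t + n_c - 2}}, \qquad \bar{d} = \frac{1}{k}\sum_{i=1}^{k} d_i $$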

  3. Barriers vs. “Promoters” to Delivery of Effective Services (Jensen, 2000). Barriers and promoters operate at three levels between efficacious treatments and “effective” services: Child & Family Factors (e.g., access & acceptance); Provider/Organization Factors (e.g., skills, use of the evidence base); Systemic and Societal Factors (e.g., organizational and funding policies).

  4. Teacher-Rated Inattention (CC Children Separated by Med Use) [chart]. Key differences, MedMgt vs. CC: initial titration, dose, dose frequency, # visits/year, length of visits, contact with schools.

  5. Would You Recommend Treatment? (parent)
     Rating | MedMgt | Comb | Beh
     Not recommend | 9% | 3% | 5%
     Neutral | 9% | 1% | 2%
     Slightly recommend | 4% | 2% | 2%
     Recommend | 35% | 15% | 24%
     Strongly recommend | 43% | 79% | 67%

  6. Key Challenges • Policy makers and practitioners hesitant to implement change • Vested interests in the status quo • Researchers often not interested in promoting findings beyond academic settings • Manualized interventions perceived as difficult to implement or too costly • Obstacles and disincentives actively interfere with implementation

  7. Key Challenges • Interventions are implemented, but providers “titrate the dose,” reducing effectiveness • “Clients too difficult” and “resources inadequate” are used to justify bad outcomes • The research population is “not the same” as the youth being cared for at their clinical site • Having data and “being right” is neither necessary nor sufficient to influence policy makers

  8. The Good and the Bad: Effectiveness of Interventions by Intervention Type (Davis, 2000) [chart]: number of interventions demonstrating positive vs. negative/inconclusive change.

  9. Little or No Effect (Provider & Organization-focused): • Educational materials (e.g., distribution of recommendations for clinical care, including practice guidelines, AV materials, and electronic publications) • Didactic educational meetings (Bero et al., 1998)

  10. Effective Provider & Organizational Interventions: • Educational outreach visits • Reminders (manual or computerized) • Multifaceted interventions • Sustained, interactive educational meetings (participation of providers in workshops that include discussion and practice) (Bero et al., 1998)

  11. Implications re: Changing Provider Behaviors • Changing professional performance is complex: internal, external, and enabling factors • No “magic bullets” change practice in all circumstances and settings (Oxman, 1995) • Multifaceted interventions targeting different barriers are more effective than single interventions (Davis, 1999) • Few theory-based studies • A consensus-guidelines approach is necessary, but not sufficient • Lack of fit with health care providers’ (HCPs’) mental models

  12. Additional Perspectives • The messenger is as important as the message • Trusted • Available • Perceived as expert/competent • Adult Learning Models • Tailored to learner’s needs • Learner-defined objectives • Hands-on, with ample opportunities for practice • Sustained over time • Skill-oriented • Feedback • Attention to maintenance and sustaining change

  13. Dissemination and Adoption of New Interventions • Sustained interpersonal contact • Organizational support • Persistent championship of the intervention • Adaptability of the intervention to local situations • Availability of credible evidence of success • Ongoing technical assistance, consultation. Sources: Backer, Liberman, & Kuehnel (1986). Dissemination and Adoption of Innovative Psychosocial Interventions. Journal of Consulting and Clinical Psychology, 54:111-118; Jensen, Hoagwood, & Trickett (1997). From Ivory Towers to Earthen Trenches. Journal of Applied Developmental Psychology.

  14. Science-based Plus Necessary “-abilities” • Palatable • Affordable • Transportable • Trainable • Adaptable, Flexible • Evaluable • Feasible • Sustainable

  15. Models for Behavior Change (Jaccard et al., 2002): The Theory of Reasoned Action (Fishbein & Ajzen, 1975); Self-efficacy Theory (Bandura, 1977); The Theory of Planned Behavior (Ajzen, 1981); Diffusion of Innovations (Rogers, 1995)

  16. Influences on Provider Behavior (prescribing practices) • Provider Factors: knowledge, training; self-efficacy; time pressures; fear of litigation; attitudes & beliefs; social conformity; lack of information • Patient & Family Factors: stigma; adherence; negative attitudes; rapport, engagement • Systemic & Societal Factors: organizational standards; staff support/resistance; staff training; funding policy • Economic Influences: compensation; reimbursement; incentives

  17. First, Use an Atypical vs. Typical: Descriptives (n=19)
     Dimension | Min/Max | Mean (SD)
     Favor/Unfavor | 0/5 | 3.73 (1.61)
     Easy/Hard | -1/5 | 4.16 (1.64)
     Improve/No Improve | 0/5 | 2.84 (1.57)
     Agree/Disagree | 0/5 | 4.05 (1.27)

  18. First Use Atypical -- Advantages
     Advantage | Count | Percent of Responses
     Avoids typicals' side effects | 13 | 59.1%
     Better patient approval/compliance | 5 | 22.7%
     Atypicals effective in treating aggression | 2 | 9.1%
     Other (i.e., looks better politically) | 2 | 9.1%
     Total responses | 22 | 100.0%
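     A minimal sketch (not from the slides) of how the “Percent of Responses” column above is derived: each row is a share of the 22 total responses rather than of the 19 respondents, since a respondent could name more than one advantage. The counts are taken from the table; everything else is illustrative.

     ```python
     # Illustrative only: recompute the "Percent of Responses" column from the
     # counts reported on the preceding slide (n = 19 respondents, 22 responses).
     advantages = {
         "Avoids typicals' side effects": 13,
         "Better patient approval/compliance": 5,
         "Atypicals effective in treating aggression": 2,
         "Other (i.e., looks better politically)": 2,
     }

     total = sum(advantages.values())  # 22 responses in all
     for label, count in advantages.items():
         # count / total gives the share of responses, e.g. 13/22 = 59.1%
         print(f"{label}: {count} ({count / total:.1%})")
     print(f"Total responses: {total}")
     ```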

  19. First Use Atypical -- Disadvantages
     Disadvantage | Count | Percent of Responses
     Typicals may work better for some patients | 6 | 23.1%
     Avoids atypicals' side effects | 6 | 23.1%
     If need to sedate patient, typicals may be better | 6 | 23.1%
     More is known about typicals in kids | 4 | 15.4%
     Cannot be administered as IMs | 3 | 11.5%
     Other | 1 | 3.8%
     Total responses | 26 | 100.0%

  20. First Use Atypical -- Obstacles
     Obstacle | Count | Percent of Responses
     Cost | 5 | 23.8%
     More data supporting typicals | 5 | 23.8%
     Patient history of non-response to atypicals | 4 | 19.1%
     Patient resistance | 3 | 14.3%
     Less available | 2 | 9.5%
     Other | 2 | 9.5%
     Total responses | 21 | 100.0%

  21. Limit the Use of Stat’s & P.R.N.’s: Descriptive Statistics (n=19)
     Dimension | Min/Max | Mean (SD)
     Favor/Unfavor | -5/5 | 2.63 (2.89)
     Easy/Hard | -5/5 | -0.38 (3.22)
     Improve/No Improve | -2/5 | 2.44 (1.92)
     Agree/Disagree | -2/5 | 3.86 (1.81)

  22. Limit Stat’s & P.R.N.’s -- Advantages
     Advantage | Count | Percent of Responses
     Other (i.e., avoids traumatizing patient, …) | 6 | 27.3%
     Avoids unnecessary medication | 5 | 22.7%
     Avoids unnecessary side effects | 4 | 18.2%
     Allows doctor to better understand patient’s condition | 4 | 18.2%
     Patient learns techniques they can apply in ‘real life’ | 3 | 13.6%
     Total responses | 22 | 100.0%

  23. Limiting Stat’s & P.R.N.’s -- Disadvantages
     Disadvantage | Count | Percent of Responses
     Possible safety risk to patient and others | 9 | 42.9%
     Other (i.e., does not address biological factors) | 6 | 28.6%
     Difficult for staff, who may feel less in control | 4 | 19.0%
     May need to rapidly sedate patient | 2 | 9.5%
     Total responses | 21 | 100.0%

  24. Limiting Stat’s & P.R.N.’s -- Obstacles
     Obstacle | Count | Percent of Responses
     Safety | 8 | 33.3%
     Other (i.e., patient belief that p.r.n.’s condone behavior; …) | 5 | 20.8%
     Staff resistance | 4 | 16.7%
     Patient too aggressive | 4 | 16.7%
     Staff availability and training | 3 | 12.5%
     Total responses | 24 | 100.0%

  25. Monitor Side Effects: Descriptives (n=19)
     Dimension | Min/Max | Mean (SD)
     Favor/Unfavor | 3/5 | 4.57 (0.69)
     Easy/Hard | -2/5 | 2.94 (2.4)
     Improve/No Improve | 1/5 | 4.0 (1.15)
     Agree/Disagree | 3/5 | 4.68 (0.58)

  26. Use Standardized Scales for Side Effects -- Advantages
     Advantage | Count | Percent of Responses
     Helps capture side effects you might otherwise miss | 8 | 27.6%
     Other (i.e., increases patient compliance; improves communication between doctors; helps assess severity of side effects) | 6 | 20.7%
     Provides objective measure | 4 | 13.8%
     Keeps doctors’ focus on side effects | 4 | 13.8%
     Determines drug effectiveness for specific symptoms | 4 | 13.8%
     Enables doctor to track side effects over time | 3 | 10.3%
     Total responses | 29 | 100.0%

  27. Use Standardized Scales for Side Effects -- Disadvantages
     Disadvantage | Count | Percent of Responses
     Doctor may ignore side effects not on scale | 3 | 27.3%
     May minimize importance of clinical evaluations | 3 | 27.3%
     Other (i.e., may make patient more aware of side effects) | 3 | 27.3%
     Methodological problems (i.e., inter-rater reliability) | 2 | 18.2%
     Total responses | 11 | 100.0%

  28. Scales for Side Effects -- Obstacles
     Obstacle | Count | Percent of Responses
     Time | 8 | 25.0%
     Scales are complicated/require training | 6 | 18.7%
     Instrument availability | 5 | 15.6%
     Other (i.e., staff resistance; instrument availability; cost) | 5 | 15.6%
     Administrative barriers | 3 | 9.4%
     Laziness | 3 | 9.4%
     Clinician resistance | 2 | 6.3%
     Total responses | 32 | 100.0%

  29. New Models for Behavior Change: TMC, TII (Gollwitzer, Oettingen, Jaccard, Jensen et al., 2002; Perkins et al., 2007)

  30. Mental Contrasting/Implementation Intentions • Use mental contrasting to strengthen behavioral intentions: “What are the advantages or positive consequences associated with the use of Guideline X?” • Identify obstacles: “What gets in the way of implementing Guideline X?” • Form implementation intentions to overcome obstacles: “If I encounter obstacle Y, then I will do X.”

  31. Track Target Symptoms: Pre- vs. Post-Intervention Descriptive Statistics (n = 4)
     Dimension | Pre Min/Max | Pre Mean (SD) | Post Min/Max | Post Mean (SD)
     Favor/Unfavor | 1/5 | 3.0 (1.6) | 1/5 | 3.5 (1.9)
     Easy/Hard | -3/1 | -0.5 (1.9) | 0/4 | 1.8 (1.7)
     Improve/No Improve | 2/3 | 2.8 (0.5) | 1/4 | 2.5 (1.3)
     Agree/Disagree | 3/4 | 3.3 (0.5) | 3/5 | 4.3 (1.0)

  32. Use a Conservative Dosing Strategy: Pre- vs. Post-Intervention Descriptive Statistics (n = 4)
     Dimension | Pre Min/Max | Pre Mean (SD) | Post Min/Max | Post Mean (SD)
     Favor/Unfavor | 4/5 | 4.8 (0.5) | 5/5 | 5.0 (0.0)
     Easy/Hard | -5/5 | 3.3 (2.9) | 1/5 | 3.5 (1.9)
     Improve/No Improve | 4/5 | 4.8 (0.5) | 5/5 | 5.0 (0.0)
     Agree/Disagree | 5/5 | 5.0 (0.0) | 5/5 | 5.0 (0.0)

  33. Limit the Use of P.R.N.s: Pre- vs. Post-Intervention Descriptive Statistics (n = 4)
     Dimension | Pre Min/Max | Pre Mean (SD) | Post Min/Max | Post Mean (SD)
     Favor/Unfavor | -3/5 | 2.5 (3.8) | 3/5 | 4.5 (1.0)
     Easy/Hard | -5/5 | -0.8 (4.2) | 0/4 | 2.0 (1.8)
     Improve/No Improve | 2/5 | 3.8 (1.5) | 1/5 | 3.3 (1.7)
     Agree/Disagree | 3/5 | 4.5 (1.0) | 4/5 | 4.8 (0.5)
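     The pre/post tables on slides 31–33 summarize four clinicians’ ratings on -5 to +5 scales as Min/Max and Mean (SD). A minimal sketch of how such summaries are produced; the individual ratings below are hypothetical values chosen only to be consistent with the reported Favor/Unfavor row for “Limit the Use of P.R.N.s,” not the actual study data.

     ```python
     # Illustrative sketch: compute the Min/Max and Mean (SD) summaries used in
     # slides 31-33 from ratings on the -5..+5 scales. The ratings here are
     # hypothetical (consistent with the reported Favor/Unfavor row, n = 4),
     # not the actual study data.
     from statistics import mean, stdev

     pre_favor = [-3, 3, 5, 5]    # hypothetical pre-intervention ratings
     post_favor = [3, 5, 5, 5]    # hypothetical post-intervention ratings

     def summarize(ratings):
         """Format a set of ratings as 'min/max mean(sd)', as on the slides."""
         return f"{min(ratings)}/{max(ratings)} {mean(ratings):.1f}({stdev(ratings):.1f})"

     print("Favor/Unfavor, pre: ", summarize(pre_favor))   # -3/5 2.5(3.8)
     print("Favor/Unfavor, post:", summarize(post_favor))  # 3/5 4.5(1.0)
     ```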

  34. Intention to Use Guidelines in the Next Month (n=4) [chart]

  35. Barriers vs. “Promoters” to Delivery of Effective Services (Jensen, 2000). Barriers and promoters operate at three levels between efficacious treatments and “effective” services: Child & Family Factors (e.g., access & acceptance); Provider/Organization Factors (e.g., skills, use of the evidence base); Systemic and Societal Factors (e.g., organizational and funding policies).

  36. Clinic/Community Intervention Development and Deployment Model (CID) (Hoagwood, Burns & Weisz, 2000). Step 1: Theoretically and clinically informed construction, refinement, and manualization of the protocol within the context of the practice setting where it is ultimately to be delivered. Step 2: Initial efficacy trial under controlled conditions to establish potential for benefit. Step 3: Single-case applications in the practice setting, with progressive adaptations to the protocol. Step 4: Initial effectiveness test, modest in scope and cost. Step 5: Full test of effectiveness under everyday practice conditions, including cost-effectiveness. Step 6: Effectiveness of treatment variations, effective ingredients, core potencies, moderators, mediators, and costs. Step 7: Assessment of goodness-of-fit within the host organization, practice setting, or community. Step 8: Dissemination, quality, and long-term sustainability within new organizations, practice settings, or communities.

  37. Partnerships & Collaborations in Community-Based Research • Why partnerships? • Partnerships not with other scientists per se, but with experts of a different type: experts from families, neighborhoods, schools, and communities • Only from these experts can we learn what is palatable, feasible, durable, affordable, and sustainable for children and adolescents at risk or in need of mental health services • “Partnership” means changes in the typical university investigator/research subject relationship • Practice-based Research Networks • Bi-directional learning

  38. Partnerships & Collaborations in Community-Based Research • Traditional approach • research question posed, building on theory and a body of previous research • logical next step in an elegant chain of hypotheses, tests, proofs, and/or refutations • isolation of variables from the larger context to limit potential confounds and alternative explanations of findings • study designed first; the investigator then looks for “subjects” who will be “recipients of the bounty” • cannot answer questions about sustainability • unidirectional • blind to issues of ecological validity

  39. Partnerships & Collaborations in Community-Based Research • Alternative (collaborative) approach • expert-lay distinction dissolved • both partners bring critical expertise to research agenda • research methods and technical expertise from the university investigator • systems access and local-ecological expertise from the community collaborator • so-called “confounds” can provide useful “tests” of the feasibility, durability, and generalizability of the intervention • hence, importance of replication • improved validity of knowledge obtained?

  40. The REACH Institute… Putting Science to Work
     Steps I & II: Problem area identification; bring key “change agents” and gatekeepers to the table (federal or state partners, consumer and professional organizations); identify “actionable” knowledge among experts and “consumers”; identify E-B QI procedures that are feasible, sustainable, palatable, affordable, transportable; consumer and stakeholder “buy-in” & commitment to E-B practices; dissemination via partners across all 3 system levels, “with an edge” (policy/legislative strategy with relevant federal/state partners).
     Step III: Site recruitment and preparation within “natural replicate” settings; tool preparation, fidelity/monitoring; “skimming the cream,” first taking those sites most ready.
     Step IV: Training and TA/QI intervention (all sites eventually get the intervention); monitoring/fidelity; report preparation; results fed back into Step II.

  41. Design Considerations • “Begin with the end in mind” – CID model • Enemy of the good is the perfect: raise the floor, not the ceiling • “Randomized encouragement trials” vs. randomized controlled trials • Quality Improvement group vs. TAU • How does one know the necessary ingredients of change? • Attention – Expectations – Hawthorne effects? Measure them • Attention dose, time in treatment? Measure them • Measure change processes • Assuring fidelity to model? Measure it • Ensure therapeutic relationship…and measure it • Ensure family buy-in and therapist buy-in. Measure it • Need for two controls? TAU, attention control group

  42. Overcoming Challenges: A Motivational Approach. Change implementation strategies based on motivational approaches (William Miller) • Practice what you preach • Express empathy • for the challenges of policy makers and practitioners in implementing change with this population • Develop discrepancy between the ideal and the current state • The success of evidence-based treatment must be explainable, straightforward, simply stated, and meaningful

  43. Overcoming Challenges: A Motivational Approach • Avoid argumentation • Clinician-scientists must be credible to policy makers and community-based practitioners • Avoid overstating the case and “poisoning the well” • Roll with resistance • Develop strategies for engagement and prepare for possible resistance

  44. Foundation of Collaborative Efforts
     Goals: researcher-driven vs. shared, with equal investment
     Power: retained by researchers vs. fairly distributed
     Skills: research skills designated as primary vs. recognition of contributions by community members & researchers
     Communication: one-way, unbalanced vs. open, with opportunities to discuss & resolve conflict
     Trust: continual suspicion vs. belief in the good faith of partners, with room for mistakes

  45. Degrees of collaboration: focus groups; community advisors or advisory board; community partners as paid staff; collaboration. Benefits: (+) identification of pressing community/family needs; (+) definition of acceptable research projects or service innovations; (+) ongoing input regarding various stages of the research process; (+) collaboration regarding implementation of the project; (+) access to researchers to provide guidance as obstacles are encountered; (+) co-creation, co-implementation, co-evaluation, co-dissemination.

  46. Points of Collaboration in the Research Process (each ranging from full collaboration to researcher-only):
     Study Aims: defined collaboratively OR advice sought OR researcher-defined
     Research design & sampling: decisions made jointly OR researcher educates on methods & advice is sought OR methods pre-determined
     Measurement & Outcomes: defined within the partnership OR advice sought OR researcher-defined
     Procedures (recruitment, retention, data collection): shared responsibility (e.g., community recruits, research staff collect data) OR designed with input OR designed by researchers
     Implementation: projects are co-directed OR researchers train community members as co-facilitators OR research staff hired for the project
     Evaluation: plans for analysis co-created to ensure questions of both community & researchers are answered OR community members assist in interpretation of results OR researchers analyze data
     Dissemination: members of the partnership define dissemination outlets OR members of the community fulfill co-author & co-presenter roles OR researchers present at conferences & publish

  47. The REACH Institute… Putting Science to Work
     Steps I & II: Problem area identification; bring key “change agents” and gatekeepers to the table (federal or state partners, consumer and professional organizations); identify “actionable” knowledge among experts and “consumers”; identify E-B QI procedures that are feasible, sustainable, palatable, affordable, transportable; consumer and stakeholder “buy-in” & commitment to E-B practices; dissemination via partners across all 3 system levels, “with an edge” (policy/legislative strategy with relevant federal/state partners).
     Step III: Site recruitment and preparation within “natural replicate” settings; tool preparation, fidelity/monitoring; “skimming the cream,” first taking those sites most ready.
     Step IV: Training and TA/QI intervention (all sites eventually get the intervention); monitoring/fidelity; report preparation; results fed back into Step II.
