
Using Data to Inform Practice

Michael L. Dennis, Ph.D., Chestnut Health Systems, Normal IL


Presentation Transcript


  1. Using Data to Inform Practice. Michael L. Dennis, Ph.D., Chestnut Health Systems, Normal IL. Presentation at the SAMHSA/CSAT satellite session, "Implementing Evidence-Based Treatment for Adolescents," College on Problems of Drug Dependence, Reno, NV, June 20, 2009. The opinions in this presentation are those of the author and do not reflect positions of the government. This presentation was supported by SAMHSA/CSAT contract no. 270-07-0191. Available from www.chestnut.org/li/posters or by contacting the author at mdennis@chestnut.org or 309-451-7806.

  2. Common Questions in Local Program Evaluation and Development
  • Who is being served?
  • What services are they receiving?
  • To what extent are services being targeted at those in need?
  • To what extent are services being delivered as expected?
  • Which is the most effective of several services delivered?
  • What does it cost, and how cost-effective is it?
  Source: Dennis, Fetterman & Sechrest (1994)

  3. Exploring Need, Targeting & Unmet Need
  [Figure: size of the problem (146/1,222 = 40% in need); extent to which services are currently being targeted (7/28 = 25% targeted); extent to which services are not reaching those in most need (139/146 = 95% unmet need)]
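
As a minimal sketch of the arithmetic behind this kind of need/targeting/unmet-need breakdown (the counts below are hypothetical placeholders, not the program's actual data), the three proportions can be computed directly:

```python
# Need, targeting, and unmet-need proportions from screening counts.
# All counts below are hypothetical placeholders.

n_screened = 1000   # everyone screened at intake (hypothetical)
n_in_need = 400     # screened positive for the problem (hypothetical)
n_targeted = 100    # of those in need, how many were actually served (hypothetical)

pct_in_need = n_in_need / n_screened              # size of the problem
pct_targeted = n_targeted / n_in_need             # extent services are targeted at need
pct_unmet = (n_in_need - n_targeted) / n_in_need  # extent need is not being reached

print(f"In need:    {pct_in_need:.0%}")
print(f"Targeted:   {pct_targeted:.0%}")
print(f"Unmet need: {pct_unmet:.0%}")
```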

  4. Mental Health Problem (at intake) vs. Any MH Treatment by 3 months *3+ on ASAM dimension B3 criteria Source: Ives & Moritz 2009 CSAT Juvenile and Family Drug Court Presentation
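
Slides 4-7 are cross-tabulations of intake need against service receipt. A hedged sketch of how such a table can be produced from client-level records, assuming a pandas DataFrame with hypothetical columns `mh_problem_intake` and `any_mh_tx_3mo`:

```python
import pandas as pd

# Hypothetical client-level records; column names and values are illustrative only.
df = pd.DataFrame({
    "mh_problem_intake": [1, 1, 1, 0, 0, 1, 0, 1],  # 3+ on ASAM dimension B3 at intake
    "any_mh_tx_3mo":     [1, 0, 0, 0, 1, 1, 0, 0],  # any MH treatment by month 3
})

# Row-percentage cross-tab: of those with/without an intake MH problem,
# what share received any MH treatment by 3 months?
table = pd.crosstab(df["mh_problem_intake"], df["any_mh_tx_3mo"], normalize="index")
print(table.round(2))
```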

  5. MH Issues at Intake vs. MH Treatment at 3 months Source: Ives & Moritz 2009 CSAT Juvenile and Family Drug Court Presentation

  6. Other MH Issues at Intake vs. MH Treatment at 3 months Source: Ives & Moritz 2009 CSAT Juvenile and Family Drug Court Presentation

  7. GRRS Treatment Planning Needs: Mental Health Source: Ives & Moritz 2009 CSAT Juvenile and Family Drug Court Presentation

  8. Why Do We Care About Unmet Need?
  • If we subset to those in need, getting mental health services predicts reduced mental health problems
  • Both psychosocial and medication interventions are associated with reduced problems
  • If we subset to those NOT in need, getting mental health services does NOT predict change in mental health problems
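
A minimal sketch of this subset check, assuming hypothetical columns `in_need`, `got_mh_tx`, and `mh_change` (change in mental health symptom count, negative = improvement); this is only a descriptive first pass, not the authors' analysis:

```python
import pandas as pd

# Hypothetical follow-up data; columns and values are illustrative only.
df = pd.DataFrame({
    "in_need":   [1, 1, 1, 1, 0, 0, 0, 0],
    "got_mh_tx": [1, 1, 0, 0, 1, 1, 0, 0],
    "mh_change": [-4, -3, -1, 0, -1, 0, -1, 0],  # negative = improvement
})

# Within each need subgroup, does receiving MH services go with more improvement?
summary = df.groupby(["in_need", "got_mh_tx"])["mh_change"].mean().unstack()
print(summary)
# In practice this check would use a regression with covariates;
# the groupby comparison is only the descriptive first pass.
```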

  9. Residential Treatment Need (at intake) vs. 7+ Residential Days at 3 Months. Opportunity to redirect existing funds through better targeting. Source: Ives & Moritz 2009 CSAT Juvenile and Family Drug Court Presentation

  10. GRRS Treatment Planning Needs: Substance Use and Treatment Source: Ives & Moritz 2009 CSAT Juvenile and Family Drug Court Presentation

  11. Impact of Intake Severity on Outcome
  [Figure: Substance Problem Scale (0-16 past-month symptoms) at intake (wave 0) and 6 months (wave 6); dots/lines show means for the overall sample and four intake-severity groupings: no problems (0-25%ile), 1-3 problems (25-50%ile), 4-8 problems (50-75%ile), 9+ problems (75-100%ile). Intake severity correlated -.66 with amount of change.]
  • Programs with low severity look better with absolute outcomes (e.g., abstinence)
  • Programs with high severity look better with amount of change
  Source: ATM Main Findings data set
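
A small illustrative sketch (entirely hypothetical data, not the ATM data set) of why an absolute-outcome metric and a change-score metric can rank programs differently when intake severity differs:

```python
import pandas as pd

# Hypothetical clients from two programs; columns and values are illustrative only.
df = pd.DataFrame({
    "program":    ["A"] * 4 + ["B"] * 4,
    "sps_intake": [1, 2, 3, 2, 9, 11, 12, 10],  # Substance Problem Scale, 0-16 past-month symptoms
    "sps_6mo":    [0, 1, 1, 0, 4, 6, 5, 4],
})
df["change"] = df["sps_6mo"] - df["sps_intake"]
df["abstinent_6mo"] = (df["sps_6mo"] == 0).astype(int)

# Higher intake severity tends to go with larger (more negative) change scores.
print("Corr(intake severity, change):", round(df["sps_intake"].corr(df["change"]), 2))

# Program A (low severity) looks better on the absolute outcome,
# program B (high severity) looks better on amount of change.
print(df.groupby("program")[["abstinent_6mo", "change"]].mean())
```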

  12. Example of Multi-dimensional HIV Subgroups
  [Figure: Cohen's effect size d for Unprotected Sex Acts (f=.14), Days of Victimization (f=.22), and Days of Needle Use (f=1.19), plotted for the Total sample and subgroups A. Low Risk, B. Mod. Risk W/O Trauma, C. Mod. Risk With Trauma, and D. High Risk]
  Source: Lloyd et al. 2007
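
For reference, Cohen's d is the standardized mean difference between two groups (Cohen's f is the analogous multi-group index reported on the slide). A minimal sketch with hypothetical values:

```python
import numpy as np

def cohens_d(group, comparison):
    """Cohen's d: standardized mean difference using the pooled SD."""
    g, c = np.asarray(group, float), np.asarray(comparison, float)
    n1, n2 = len(g), len(c)
    pooled_sd = np.sqrt(((n1 - 1) * g.var(ddof=1) + (n2 - 1) * c.var(ddof=1)) / (n1 + n2 - 2))
    return (g.mean() - c.mean()) / pooled_sd

# Hypothetical subgroup vs. comparison-group outcomes (e.g., days of needle use).
subgroup   = [5, 7, 9, 6, 8]
comparison = [1, 0, 2, 1, 3, 0, 2, 1]
print(round(cohens_d(subgroup, comparison), 2))
```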

  13. Possible Comparison Groups
  • Published data
  • Site over time
  • Subsites, staff, or clinics
  • Compare site to larger program (all sites)
  • Compare site to similar level of care, geography, demographic subgroup, or clinical subgroup
  • Match clinical subgroups from GAIN-related presentations or papers
  • Formal matching or propensity scoring to make groups more statistically comparable (a sketch follows this list)
  • Formal randomized experiments
  • Path or mediation models to test whether it is actually the dosage or key ingredient driving the change
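
A hedged sketch of the propensity-scoring idea mentioned above: model the probability of receiving the service from baseline covariates, then match treated and comparison cases with similar scores. The covariates, simulated sample, and greedy 1:1 matching below are illustrative assumptions, not a description of any specific GAIN analysis.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical data: whether the client got the enhanced service, plus baseline covariates.
rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "age":      rng.integers(13, 19, n),
    "severity": rng.integers(0, 17, n),  # e.g., Substance Problem Scale at intake
})
# Simulate treatment assignment that depends on severity (a confounded comparison).
df["treated"] = (rng.random(n) < 1 / (1 + np.exp(-(0.15 * df["severity"] - 1)))).astype(int)

# Step 1: model the probability of treatment from baseline covariates.
ps_model = LogisticRegression().fit(df[["age", "severity"]], df["treated"])
df["pscore"] = ps_model.predict_proba(df[["age", "severity"]])[:, 1]

# Step 2: greedy 1:1 nearest-neighbor match of each treated case to a comparison case.
treated = df[df["treated"] == 1]
controls = df[df["treated"] == 0].copy()
matches = []
for _, row in treated.iterrows():
    if controls.empty:
        break
    j = (controls["pscore"] - row["pscore"]).abs().idxmin()
    matches.append((row.name, j))
    controls = controls.drop(index=j)

print(f"{len(matches)} matched pairs")
```

Outcomes would then be compared within the matched pairs; checking covariate balance after matching is the usual next step.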

  14. Evaluation of an OTI Waiting List Reduction Grant from Appointment and Admission Log
  [Figure annotations: the program used up its slots in 2 months and (unexpectedly) had a 200-person waiting list; the grant allowed the program to add 100 slots and reduced time to readmission]
  Source: Dennis, Ingram, Burks & Rachal, 1994
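
A minimal sketch of how waiting time can be summarized from an appointment-and-admission log; the column names and dates below are hypothetical, not the OTI program's data:

```python
import pandas as pd

# Hypothetical appointment/admission log; columns and dates are illustrative only.
log = pd.DataFrame({
    "client_id":   [101, 102, 103],
    "appointment": pd.to_datetime(["2024-01-03", "2024-01-05", "2024-01-10"]),
    "admission":   pd.to_datetime(["2024-01-20", "2024-02-15", None]),  # None = still waiting
})

log["days_waiting"] = (log["admission"] - log["appointment"]).dt.days
print("Median days from appointment to admission:", log["days_waiting"].median())
print("Clients still waiting:", log["admission"].isna().sum())
```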

  15. Implementation is Essential (Reduction in Recidivism from .50 Control Group Rate)
  • The best case is a strong program implemented well
  • The effect of a well-implemented weak program is about as big as that of a strong program implemented poorly
  • Thus one should optimally pick the strongest intervention that one can implement well
  Source: Adapted from Lipsey, 1997, 2005

  16. Major Predictors of Effective Programs That We Have to Be Cognizant Of…
  • Triage to focus on the higher-severity subgroups of individuals
  • An explicit intervention protocol (typically manualized) with a priori evidence that it works when followed with the targeted population
  • Use of monitoring, feedback, supervision, and quality assurance to ensure protocol adherence and project implementation
  • Use of proactive case supervision at the individual level to ensure quality of care
  Source: Adapted from Lipsey, 1997, 2005
