CEBP Research Institute: Past and current studies: Overview and findings CEBP Learning Institute May 27, 2010 Corinne Datchi-Phillips, Ph.D. Jeremy Kinser, B.A. Chris Hanes, M.A.
Research Institute • Mission and goals • To support the implementation of evidence-based practices in the Indiana Department of Correction • To inform program and policy decisions • To provide scientific information about the implementation and effects of community corrections programming in the State of Indiana • To provide assistance with program evaluation
Research Institute • Activities • Survey of Current Community Corrections Practices in IDOC (2009) • Survey of referral criteria (2010) • Add names of Chris and Jeremy’s studies
Research Institute • Main findings of 2009 survey inform current research activities with the goal of: • Producing specific descriptions of community corrections services (target population; problems to be addressed; goals and objectives; activities; mechanisms of change) • Examining the effects of community corrections programming
Research Institute • Producing specific descriptions of community corrections services: • Study of referral criteria provides information about: • The target population • Procedures utilized to make decisions about clients’ case planning • Level of agreement on systematic procedures across the State of Indiana
Research Institute • Examining the effects of community corrections programming • Chris and Jeremy: your studies provide information on … Do you want to add to the slide that comes before this one?
Community Corrections Referral Criteria Preliminary Findings Corinne Datchi-Phillips, Ph.D.
Referral Criteria • Procedures used to determine which components/services offenders will receive • Adult • Risk and needs assessment (92.3%) • LSI-R (80%) • COMPAS (20%) • Intake interview (84.6%) • Judge’s decisions not based on recommendations by correctional staff (69.2%) • Juvenile • Judge’s decisions • Not based on recommendations by correctional staff (50%) • Based on recommendations by correctional staff (40%) • Intake interview (50%) • Risk and needs assessment (40%) • YLSI, COMPAS
Referral Criteria • Adult community corrections program components are part of a system of rewards and sanctions used in response to violations and non-compliance with program rules. • Adult offenders’ placement on CC program components (home detention, work release, road/work crew, community service, and day reporting) is • Court-ordered • Used as a sanction or a means of earning privileges • Adult offenders’ placement on CC components is less likely to be determined by risk/needs assessment. When risk/needs assessment is taken into consideration: • Home detention – moderate risk level • Work release – high risk level • Day reporting – low and high risk levels • Forensic diversion – low, moderate, and high risk levels
Referral Criteria • Criteria for placing juvenile offenders on CC program components appear to vary and to lack specificity. • A few respondents indicated using risk/needs assessment measures to select which program the youth would participate in. • Home detention – moderate to high risk levels • Substance abuse, family-focused, and psycho-educational interventions – moderate to high risk levels • Community Service was the only component listed as part of a system of graduated sanctions for juvenile offenders.
Referral Criteria • Cautionary notes • 21 community corrections agencies participated in the study. • 4 to 6 agencies answered questions about juvenile programming in their county. • Given this low response rate (33.8%), it is not possible to draw strong conclusions about community corrections referral criteria.
Title of Jeremy and Chris’s survey Jeremy Kinser, B.A. Chris Hanes, M.A.
How do we know if what we are doing is effective? • Why is it important? • Increasingly important question • A way to substantiate your work • What does it require? • A systematic approach • A commitment to using knowledge to guide practice
What they want to know – we could answer some, but not others • Who do we serve? • How is our programming used? Is it used effectively/efficiently? • Are we collecting meaningful information to help our offenders? • Is our programming effective for our population? • Is our programming effective overall and relative to other CC sites? • What are we doing, and what are we not doing?
What We Did • Data from four deidentified counties selected by DOC – what we wanted for an ideal standard versus what we got: • Offense Severity/Type • Offense • Repeat offender? • Risk Level (as indicated by an assessment) • Risk Score (as indicated by an assessment) • Community Corrections Components received by offender • Community Corrections Services/Programs received by offender • Dates or some way to determine start and end dates of components/services • Completion or non-completion of program • Program Outcomes • Recidivism (after program completion) • Re-offense or Violations during probation (and any additional sanctions) • Demographics • Two Surveys: T4C and Program elaboration
A Systematic Approach • Not sure what Chris wanted here – to be added
The Ideal Standard • Here we articulate the ideal standard for operating in line with the principles of EBP • We also articulate why it is important – i.e., what we could answer, accountability, etc. • Segue into the role of these studies to establish a baseline for what is being done and to answer important questions based on current data
Who is Served? • Note findings • Note what’s missing or other things that might be helpful for interested parties to know
What’s Being Done? • Note findings • Note what’s missing or other things that might be helpful for interested parties to know
What are the Outcomes? • Note findings • Note what’s missing or other things that might be helpful for interested parties to know
Other Areas of Interest • Here we can articulate the questions of interest that we are presently unable to answer but that could warrant revisiting once CC data collection has been recalibrated.
Recommendations For Data Collection • Summary of changes to data collection • To be determined following initial data review
Future Directions for TA • Present Data as Baseline • Data-driven feedback process • Webinars • Provide specific guidance in EBP • Implementation • Definitions and forms • Volunteer Counties • Provide county-specific guidance in implementing EBP • Provide ongoing evaluation through effectiveness studies of programming • Establish model programs for statewide dissemination • Based on volunteer counties’ experience implementing EBP and on effectiveness studies