
S-START Evaluation


Presentation Transcript


  1. ® S-START Evaluation • Evaluation Team • Evaluator: Nancy Amodei, Ph.D. – Dept. of Pediatrics • Evaluation Coordinator: Danielle Dunlap, M.S. – Dept. of Pediatrics • Data Manager: Kyle Kozlovsky, M.S. – Dept. of Pediatrics • Qualitative Expert: Suyen Schneegans, M.A. – Dept. of Pediatrics • Special thanks to: Rasheem Battle, Alejandro Bocanegra, Meghan Crabtree, Merced Doria, Destiny Ramos, Drew Russell

  2. Process Evaluation

  3. Process Evaluation: How is the program being implemented? • S-START Process Goals • Train UTHSCSA medical residents and residents from other participating South Texas programs to use evidence-based SBIRT procedures for patients who have or are at risk of substance abuse disorders. • Promote systems change in targeted residency programs by integrating the SBIRT model into the curriculum on a long-term basis.

  4. Process Goal #1: Train UTHSCSA medical residents and residents from other participating South Texas programs • Key Activities and Measures 1) Develop/implement a comprehensive curriculum 2) Train UTHSCSA and other faculty • Demographics, type of training, satisfaction (GPRA & qualitative findings) 3) Train UTHSCSA residents & residents from other programs • Demographics, type of training, satisfaction (GPRA)

  5. SBIRT Curricular Strategies by Specialty

  6. What is the core SBIRT event?

  7. Faculty Training

  8. Demographics of Faculty Completing GPRAs

  9. Baseline vs. 30-day Faculty GPRA ratings (N=17) * Wilcoxon Signed Ranks Test
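
For readers who want to reproduce this kind of paired comparison, a minimal sketch of a Wilcoxon signed-rank test on baseline vs. 30-day ratings is shown below (Python with SciPy). The ratings are illustrative placeholders, not the actual N=17 faculty GPRA data.

```python
# Minimal sketch: comparing paired baseline vs. 30-day GPRA ratings with a
# Wilcoxon signed-rank test. The values below are made-up placeholders, not
# the actual faculty data (N=17).
from scipy.stats import wilcoxon

baseline = [3, 2, 4, 3, 2, 3, 4, 2, 3, 3, 2, 4, 3, 2, 3, 4, 3]  # hypothetical pre ratings
day30    = [4, 3, 5, 4, 3, 4, 5, 3, 4, 4, 3, 5, 4, 3, 4, 5, 4]  # hypothetical 30-day ratings

# wilcoxon() tests whether the median of the paired differences is zero,
# dropping any zero-difference pairs by default.
stat, p_value = wilcoxon(baseline, day30)
print(f"Wilcoxon W = {stat:.1f}, p = {p_value:.3f}")
```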

  10. Qualitative Study of Faculty Perceptions of S-START • Purpose: Gain an in-depth understanding of the experience and perceptions of S-START faculty • Methods • 16 training faculty from 5 specialties invited • 15 accepted (12 from UTHSCSA; 1 from the FM program in McAllen, 1 from FM CHRISTUS Santa Rosa, 1 from FM Fort Hood) • Mean age: 43.93 years; 72% female; 73% MDs, 1 Ph.D., 1 Psy.D., 1 M.A. • Average years of experience = 15

  11. Qualitative Study of Faculty Perceptions of S-START • Data Collection • Conducted ≈ 22 months after S-START began • 45- to 60-minute interviews using scripted but open-ended questions • 14 of the interviews were taped to facilitate transcription • Topics: how S-START was implemented in the program, barriers and challenges, the impact of potential clinical service reimbursement in facilitating the program, and suggestions for improvement

  12. Qualitative Study of Faculty Perceptions of S-START • Data Analysis • Evaluation team hand-coded transcribed interviews • Thematically coded them to correspond to each question • Collapsed materials thematically into 10 emergent or preset categories

  13. Qualitative Study of Faculty Perceptions of S-START • Results: 3 thematic categories accounted for > 50% of interview responses • Critical Components: faculty training • Barriers: lack of leadership • Motivation: buy-in from faculty and residents

  14. Resident Training

  15. Demographics of Residents Completing GPRAs

  16. Baseline vs. 30-day Resident GPRA ratings (N=409) * Wilcoxon Signed Ranks Test

  17. Process Goal #2: Promote Systems Change in Targeted Residency Programs by integrating the SBIRT model into the curriculum on a long-term basis • Key Activities • Council of Residency SBIRT Trainers Meetings • Elicit support of key personnel • Changes to Electronic Medical Record • Pocket Cards • SBIRT resources (including key modules) on the S-START website • iPad Project

  18. Progress towards Goal 2: Council of Residency SBIRT Trainers Meetings

  19. Process Goal #2: Promote Systems Change in Targeted Residency Programs by integrating the SBIRT model into the curriculum on a long-term basis • Progress re: other activities • Support of change leaders (e.g., UTHSCSA President, Residency Program Directors); PD and Co-PD have high-profile positions • Changes to Electronic Medical Record: UTHSCSA – DFCM, Peds (Psychiatry and Surgery planned) • Pocket cards: McAllen FM – part of every patient visit paperwork • SBIRT resources (including core modules, resource directory) on the S-START website • iPad Project – proposed for UTHSCSA Pediatrics

  20. Outcome Evaluation

  21. What is the program’s impact? S-START Outcome Goals: • Enhance residents’ knowledge of evidence-based SBIRT practices. • Enhance residents’ readiness and perceived confidence to implement SBIRT with their patients • Increase residents’ implementation of SBIRT practices with their patients • Enhance Faculty Participants’ knowledge and confidence in ability to teach SBIRT practices to future physicians

  22. Outcome Design • 3 x 2 Repeated Measures • Three data collection methods • Measurement Occasions for Surveys: • Pre-Test • 30-Day Follow-Up • Annually up to 36-month follow-up • Measurement Occasions for Pocket Cards • Varies by department • Measurement Occasions for Chart Reviews • 12-month period prior to first core SBIRT module implementation • 12-month period following the first year of SBIRT module implementation • 12-month period following the third year of SBIRT module implementation

  23. Evaluation Measures

  24. Timeline of Self-Administered Instruments & Incentives

  25. Methods of Survey Data Collection • Web-based surveys (i.e., SurveyMonkey) • Emails to UTHSCSA and private email addresses • Unique web links provided to residency coordinators • Hard copy surveys • Pass out at grand rounds and conference periods • Intra-office mail for fellows, faculty • Mail to home and clinic physical address

  26. Strategies for survey follow-up • Collected contact information using a comprehensive tracking form • Text reminders to cell phone numbers • Phone calls to (1) cell, (2) home, (3) significant others, (4) clinic • Contact residency coordinators for updated contact information • Enlist authoritative support of faculty • Look up information using White Pages, AMA DoctorFinder, respective state medical board websites (usually Texas) • Peer-to-peer contact updates

  27. Future strategies for survey follow-up • Reminder postcards sent twice before each annual survey • Bring surveys to end-of-year gatherings for graduating residents • Include surveys in residents’ exit processing before graduation I pity the fool who doesn’t take the survey.

  28. Resident survey response rates • Response rates for similar populations (e.g., students, medical professionals) tend to be 60% or lower on follow-up surveys (Asch et al., 1997; Kaplowitz et al., 2004; Kaspryzyk et al., 2001; McMahon et al., 2003; Porter & Whitcomb, 2007)

  29. Analyses of resident survey data • Demographic data (pre-test) • Measured changes from pre-test to 12-month follow-up in: • Confidence to use SBIRT (residents only) • Readiness to use SBIRT (residents only) • Current SBIRT practice (residents only) • SBIRT knowledge (residents & faculty) • Confidence to teach SBIRT (faculty only) • Selected departments for analysis: Pediatrics, Family and Community Medicine, Internal Medicine
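
The pre-test to 12-month comparisons across the three programs imply a 2 (time) × 3 (program) mixed-design ANOVA. Below is a sketch of how such an analysis could be set up with the third-party pingouin package; the data frame, column names, and scores are illustrative assumptions, not S-START data.

```python
# Sketch of a 2 (time: pre vs. 12-month) x 3 (program) mixed ANOVA of the kind
# implied by the F statistics reported on the following slides. All scores here
# are randomly generated placeholders, not S-START results.
import numpy as np
import pandas as pd
import pingouin as pg

rng = np.random.default_rng(0)
programs = ["Pediatrics", "FCM", "Internal Medicine"]
rows = []
for prog in programs:
    for i in range(55):                      # hypothetical residents per program
        pre = rng.normal(60, 10)             # hypothetical 0-100 subscale score
        post = pre + rng.normal(5, 8)        # hypothetical change at 12 months
        rows.append({"subject": f"{prog}-{i}", "program": prog, "time": "pre",  "score": pre})
        rows.append({"subject": f"{prog}-{i}", "program": prog, "time": "12mo", "score": post})
df = pd.DataFrame(rows)

# Within-subject factor: time; between-subject factor: program.
aov = pg.mixed_anova(data=df, dv="score", within="time", subject="subject",
                     between="program")
print(aov.round(3))
```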

  30. Resident Demographics: Pre-test results Note. UT=University of Texas Health Science Center at San Antonio; Ped.=Pediatrics; FM=Family Medicine; IM=Internal Medicine; OB=Obstetrics/Gynecology; Psy.=Psychiatry (adult & child); Sur.=Surgery. a Surgery residents began the SBIRT curriculum on August 15, 2011.

  31. Resident Demographics (cont.): Pre-test results Note. UT=University of Texas Health Science Center at San Antonio; FCM=Family and Community Medicine; McA.=McAllen Family Medicine; SR=CHRISTUS Santa Rosa Family Medicine; FH=Fort Hood Family Medicine; IM=Internal Medicine; ERAHC=Edinburg Regional Academic Health Center Internal Medicine.

  32. Outcome goal 1: Enhance residents’ knowledge of evidence-based SBIRT practices. • Core SBIRT knowledge • 12 items developed locally by the SBIRT project directors • Knowledge that residents across all departments should know after training • Residency-specific SBIRT knowledge • 7-17 items developed locally by the SBIRT faculty in the respective programs • Items designed for specific residency program SBIRT knowledge and patient populations

  33. Outcome goal 1: Enhance residents’ knowledge of evidence-based SBIRT practices. • Sample core knowledge item: • “How many ‘standard drinks’ are considered at-risk alcohol use by a healthy 40-year-old man?” • Sample Pediatrics knowledge item: • “________ exposure is the leading known preventable cause of mental retardation.” • Sample Family and Community Medicine knowledge item: • “Hepatitis B, hepatitis C, HIV and AIDS are strongly associated with abuse of…” • Sample Internal Medicine item: • “Alcohol withdrawal treatment on the inpatient medical service is best accomplished by…”

  34. Outcome goal 1 (cont.): Enhance residents’ knowledge of evidence-based SBIRT practices. • Core SBIRT knowledge • All residents increased SBIRT knowledge, F(1, 167) = 32.1, p < .001. • No differences found between residency programs

  35. Outcome goal 1 (cont.): Enhance residents’ knowledge of evidence-based SBIRT practices. • Residency-specific SBIRT knowledge • Pediatrics increased SBIRT knowledge, F(1, 59) = 4.53, p = .038. • FCM maintained SBIRT knowledge, F(1, 46) = 2.2, p = .149. • IM maintained SBIRT knowledge, F(1, 61) = .20, p = .659.

  36. Outcome goal 2: Enhance residents’ readiness & perceived confidence to implement SBIRT with their patients. • Readiness to use SBIRT (D’Onofrio et al., 2002) • AES subscale comprising seven 10-point Likert-scale items • Range: 0-100 • Sample item: “How ready are you to change your practice behavior to ask patients about quantity and frequency of their alcohol use?” • Confidence to use SBIRT (D’Onofrio et al., 2002) • AES subscale comprising seven 5-point Likert-scale items • Range: 0-100 • Sample item: “I am confident in my ability to discuss/advise patients to change their drinking behavior.”
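
The slide lists 7-item subscales reported on a 0-100 range, which suggests raw item totals are rescaled to a percent-of-maximum score. A minimal sketch of one plausible rescaling is below; the item anchors (1-10 and 1-5) and the rescaling rule are assumptions, since the exact AES scoring procedure is not spelled out here.

```python
# Sketch of a percent-of-maximum rescaling that maps a 7-item subscale onto a
# 0-100 range, as the "Range: 0-100" on the slide implies. The anchors and the
# rescaling rule are assumptions, not the documented AES scoring procedure.
def rescale_to_100(item_scores, item_min, item_max):
    """Convert summed item scores to a 0-100 scale."""
    total = sum(item_scores)
    lo = item_min * len(item_scores)
    hi = item_max * len(item_scores)
    return 100 * (total - lo) / (hi - lo)

# Readiness subscale: seven 10-point items (assumed anchors 1-10).
readiness_items = [8, 9, 7, 10, 8, 9, 7]        # hypothetical responses
print(rescale_to_100(readiness_items, 1, 10))   # score on the 0-100 scale

# Confidence subscale: seven 5-point items (assumed anchors 1-5).
confidence_items = [4, 3, 5, 4, 4, 3, 5]        # hypothetical responses
print(rescale_to_100(confidence_items, 1, 5))
```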

  37. Outcome goal 2 (cont.): Enhance residents’ readiness & perceived confidence to implement SBIRT with their patients. • Readiness to use SBIRT: • No significant change in readiness from pre-training to 12 months post-training, F(1, 161) = .87, p = .353. • FCM reported higher readiness than IM overall, F(2, 161) = 4.7, p = .010. • Pediatrics was not significantly different from the other two programs

  38. Outcome goal 2 (cont.): Enhance residents’ readiness & perceived confidence to implement SBIRT with their patients. • Confidence to use SBIRT: • Residents overall reported higher confidence at the 12-month follow-up, F(1, 161) = 27.3, p < .001. • FCM reported higher confidence overall than IM and Pediatrics, F(2, 161) = 8.1, p < .001. • Pediatrics was not significantly different from the other programs

  39. Outcome goal 3: Increase residents’ implementation of SBIRT practices with their patients. • Self-report of current SBIRT practice (D’Onofrio et al., 2002) • AES subscale comprising seven 5-point Likert-scale items • Range: 0-100 • Sample item: “How often do you formally screen patients for alcohol problems using brief screening tools (e.g., T-ACE, AUDIT, CAGE)?” • Pocket cards • Chart reviews

  40. Outcome goal 3 (cont.): Increase residents’ implementation of SBIRT practices with their patients. • Current practice of SBIRT skills: • Residents overall reported higher current SBIRT practice at the 12-month follow-up, F(1, 161) = 35.2, p < .001. • Significant time × program interaction, F(2, 161) = 19.7, p < .001. • Both Pediatrics and FCM improved self-reported current practice. • Internal Medicine declined in self-reported current practice.

  41. Summary of Resident Survey Data Findings • SBIRT core knowledge improved from pre-test to 12-month follow-up • Readiness to implement SBIRT did not change, but was high at pre-test • Confidence to use SBIRT improved from pre-test to 12-month follow-up • For self-report of SBIRT practice, residents overall improved from pre-test to follow-up • However, when departments were analyzed separately, Internal Medicine decreased from pre-test to the 12-month follow-up

  42. Outcome goal 3 (cont.): UTHSCSA Family Medicine Pocket Cards • Settings: • Family Medicine inpatient service at University Hospital in San Antonio, Texas • Subjects: • 285 adult patients, from July 2009 to May 2011 • Average Age: 47 • Gender Distribution: 71.3% Male

  43. UTHSCSA Family Medicine Procedures • Patients were interviewed with a 4-step pocket card • Step 1: Pre-screening questions for substance use • Step 2: WHO ASSIST (Alcohol, Smoking, and Substance Involvement Screening Test) • Step 3: ASSIST score to assess the level of risk and determine need for intervention • Step 4: Checklist describing the intervention, patient response, and future plan • Residents were asked to complete 12 pocket cards per year • 26 of 26 trained residents participated • Residents completed an average of 11 cards each
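
Step 3 turns the ASSIST score into a risk level that drives the brief-intervention or referral decision. The sketch below maps a score to a risk level using the commonly published WHO ASSIST cut-offs; these thresholds are assumptions taken from the WHO instrument, not values confirmed in this presentation.

```python
# Sketch of Step 3: mapping a WHO ASSIST substance-involvement score to a risk
# level and suggested action. The cut-offs are the commonly published WHO ASSIST
# thresholds (alcohol uses a higher moderate-risk cut-off); they are assumptions,
# not values taken from the S-START pocket card.
def assist_risk_level(score, substance="other"):
    """Return (risk level, suggested action) for an ASSIST score."""
    moderate_cutoff = 11 if substance == "alcohol" else 4
    if score >= 27:
        return "high", "brief intervention + referral to specialty treatment"
    if score >= moderate_cutoff:
        return "moderate", "brief intervention"
    return "lower", "no intervention / general health advice"

# The average ASSIST score of 19 reported on the next slides falls in the
# moderate-risk range for any substance.
print(assist_risk_level(19))              # ('moderate', 'brief intervention')
print(assist_risk_level(19, "alcohol"))   # ('moderate', 'brief intervention')
```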

  44. Step 1: Pre-screening Results

  45. Step 2-3: ASSIST Results • 95.8% of patients screened positive for at least 1 substance • Avg. ASSIST score was 19, indicating a moderate risk of substance abuse

  46. Brief Interventions • When the ASSIST score recommended a brief intervention, residents reported some form of brief intervention 69.4% (over two-thirds) of the time • Residents were most likely to discuss consequences of use when the ASSIST score recommended a brief intervention (79% of the time) • 8% of patients declined to discuss their response to screening

  47. Brief Intervention Actions Taken

  48. Referrals to Treatment • When the ASSIST score suggested a referral to treatment, residents referred the patient to treatment 71.8% of the time • Residents were most likely to contact an LCDC (Licensed Chemical Dependency Counselor) when the ASSIST score recommended a referral to treatment (46.5% of the time)

  49. Referrals to Treatment Actions Taken
