
Evaluating the Outcomes of HIV Clinical Training Across Four Domains: Lessons for Existing Outcomes Measurement Projects


Presentation Transcript


  1. Evaluating the Outcomes of HIV Clinical Training Across Four Domains: Lessons for Existing Outcomes Measurement Projects among AETCs August 24, 2004 Facilitators: Janet Myers, PhD, MPH & Edwin Charlebois, PhD, MPH

  2. Framework for Excellence • Measuring Results • Which helps in: • Refining Site Analysis • Marketing • Curriculum Design • Needs Assessment • Course Delivery and Development • Further Measurement and Evaluation!

  3. Presenters • Cheryl Hamill, RN, MS, ACRN & Nancy Showers, DSW, Delta Region AETC: HIVQUAL Results 2002-2003, Sample RW Title III Community Health Center in Mississippi • Mari Millery, PhD, NY/NJ AETC: Lessons from Assessing Knowledge & Practice Outcomes of Level III Trainings • Richard Vezina, MPH & Jennifer Gray, RN, PhD: TX/OK AETC Women & HIV Symposium (JG); Pacific AETC Asilomar Faculty Development Conference (RV) • Brad Boekeloo, PhD, ScM, NMAETC, Delta AETC: Analysis of HIV Patient-Provider Communication • Debbie Isenberg, MPH, CHES, Southeast AETC: Intensive On-Site Training Evaluation: A Mixed Methods Approach

  4. Measurement and Evaluation • Why evaluate? To determine whether the training met its aims (for participants and faculty), to decide how to change training content, and to improve the quality of training • Why measure clinical practice behavior change? To determine whether training has the desired effect on participants and, ultimately, on quality of care

  5. Kirkpatrick's Model: four levels of training evaluation (Reaction, Learning, Behavior, Results) (from Kirkpatrick, Donald L., Evaluating Training Programs, 2nd edition, 1998)

  6. The HIVQUAL Project Nancy Showers, DSW Delta Region AETC

  7. The HIVQUAL Project • Capacity-building and organizational support for QI • Individualized on-site consultation services • Strengthen HIV-specific QI structure • Foster leadership support for quality • Guide performance measurement • Facilitate implementation of QI projects • Train HIV staff in QI methods • Performance measurement data with comparative reports • Partnership with HRSA to support quality management in Ryan White CARE Act community-based programs

  8. AETC / HIVQUAL Quality Improvement Functions • To provide quality HIV care, programs must have both: • Expertise regarding HIV and HIV treatment. • Effective organizational systems for delivering HIV care • AETC provides education and training regarding HIV and its treatment • HIVQUAL provides onsite consultation for improving organizational care delivery systems (quality management programs) • Both provide education about quality management principles and tools

  9. HIVQUAL Participants - 2003

  10. Annual PAP Test

  11. Annual Syphilis Screen

  12. Hepatitis C Status Known

  13. Adherence Discussed

  14. Viral Load Every 4 Months

  15. MAC Prophylaxis (CD4<50)

  16. Annual Dental Exam

  17. Annual Mental Health Assessment

  18. Strengths of AETC / HIVQUAL Collaboration • Recognition of distinct and complementary areas of expertise • Timely and effective bilateral referrals for consultation, education, and training. • AETC facilitation of HIVQUAL access into new areas • Ongoing identification of regional needs for performance improvement and for quality management training • Planned AETC delivery of HIVQUAL formulated workshops on quality management

  19. Mississippi LPS - Training Summary Report • Reporting period: July 1, 2002 - June 30, 2003, for Targeted Ryan White Title-Funded Community Health Centers • Cheryl Hamill, MS, RN, ACRN, Delta Region AETC

  20. MS LPS Training Programs Totals by Level & Discipline For Targeted RW Title III Funded Clinic July 2002-03

  21. Lessons Learned • AETC LPSs and Regional HIVQUAL Consultants have complementary processes to support initial cross-assessments of targeted RW Title III/IV programs. • AETC LPS support of the annual HIVQUAL on-site clinical chart abstraction activity gives the AETC program access to provider practice and can serve as a catalyst for development of targeted Level I, II, III, and IV activity. • AETC LPSs can begin to use clinical benchmarking reports, obtained through HIVQUAL chart review, to build outcome measures correlating AETC levels of training provided at a local site with concurrent support of HIVQUAL on-site consultation over a training year.

  22. Lessons from Assessing Knowledge and Practice Outcomes of Level III Trainings Mari Millery, PhD NY/NJ AETC

  23. Decided to focus more outcome evaluation effort on Level III because it is the most intensive and a high-priority modality, and because participants can be asked to devote time to extra paperwork • Pre-test, post-test, and 3-month follow-up surveys • Measures: • Self-rating of comfort in performing clinical tasks • Case-based knowledge questions

  24. 1. Please rate your current level of comfort in performing the following (circle only one answer for each question; 1 = Very low, 2 = Low, 3 = Medium, 4 = High, 5 = Very high): • Choosing an appropriate HAART regimen • Evaluating ongoing adherence in HIV patients • Deciding to change HIV medications 2. Mrs. Z is a 34-year-old female with HIV CDC A2 disease, CD4 300 cells/cmm and viral load 50,000 copies/ml, who presents for treatment. Which of the following is the most appropriate initial regimen? a) Zidovudine (AZT)/stavudine (D4T)/indinavir b) Didanosine (DDI)/zalcitabine (DDC)/nevirapine c) Zidovudine (AZT)/lamivudine (3TC)/efavirenz d) Stavudine (D4T)/lamivudine (3TC)/nelfinavir/ritonavir
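
As an illustration only (not part of the original instrument or analysis), responses to items like these could be scored in a few lines of Python; the item names, answer key, and data below are hypothetical.

```python
# Hypothetical scoring sketch for the pre/post/follow-up surveys.
# Comfort items are coded 1 (Very low) through 5 (Very high); case-based
# knowledge items score 1 if the keyed answer is chosen, 0 otherwise.
# The actual answer key is not given in the slides.

COMFORT_ITEMS = ["choose_haart_regimen", "evaluate_adherence", "change_hiv_meds"]
ANSWER_KEY = {"q2_initial_regimen": "c"}  # assumed key, for illustration only

def comfort_change(pre: dict, post: dict) -> dict:
    """Per-item change in self-rated comfort (post minus pre)."""
    return {item: post[item] - pre[item] for item in COMFORT_ITEMS}

def knowledge_score(responses: dict) -> float:
    """Proportion of case-based questions answered with the keyed response."""
    correct = sum(responses.get(q) == key for q, key in ANSWER_KEY.items())
    return correct / len(ANSWER_KEY)

# Made-up responses from one participant:
pre = {"choose_haart_regimen": 2, "evaluate_adherence": 3, "change_hiv_meds": 2}
post = {"choose_haart_regimen": 4, "evaluate_adherence": 4, "change_hiv_meds": 3}
print(comfort_change(pre, post))                     # per-item gains
print(knowledge_score({"q2_initial_regimen": "c"}))  # 1.0
```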

  25. Lessons Learned • Can be done, but getting follow-up surveys back is a challenge • Preliminary results are encouraging – self-reported practice comfort and case-based knowledge questions appear to work as measures • Survey needs to be kept to a minimum length • Dropped knowledge questions from the post-test because they came too soon after baseline – the post-test focuses on feedback on the training • Nature of Level III varies: intensity/length, profession trained, topics covered, etc. • Developed special versions for nurses and HepC • 40 surveys collected with revised instruments this year – still working on getting all follow-up surveys back

  26. Measuring Training Outcomes Through Qualitative Interviewing: Asilomar Faculty Development Conference (RV) and TX/OK AETC Women & HIV Symposium (JG) • Richard Vezina, MPH (RV), Pacific AETC • Jennifer Gray, RN, PhD (JG), TX/OK AETC

  27. TX/OK AETC Women & HIV Symposium (JG): First-time region-wide symposium • Multidisciplinary planning committee • Lack of knowledge about gender-specific care • Increased number of HIV infections among women in the region • Symposium goal: Improved care of HIV+ women. Asilomar Faculty Development Conference (RV): Annual region-wide training conference • 125 participants, all PAETC faculty and program staff • Conference goals: Improved skills and knowledge among faculty/trainers; improved training outcomes throughout the region as a result of staff development

  28. Evaluation Plans JG • Email one month post to all registrants • Simple open-ended questions, for all disciplines • Identify how content was used with patients and shared with peers. RV • Post-Post: • Form A: Self-assessment at end of Conference • Identify skills and content learned, areas in which to integrate new skills and content • Form B: 6 month Follow-Up • Individualized telephone interviews, reviewing Form A • Focus on how skills/content were applied; barriers

  29. Why these evaluation methods? • Able to assess at multiple levels (Kirkpatrick model): • Level 2 (Learning: improved knowledge) (RV) • Level 3 (Behavior: change in practices) (JG, RV) • Seeking specific content regarding conference (RV) • Limited resources and time (JG) • No existing tool found that met needs (JG)

  30. Findings • Major Themes (RV): • Identified high need for continued skills training • Transferred new skills/information to coworkers and employees • Barrier to continued integration: time constraints • Major Themes (JG): • Impact on patients: 13 had taught patients information learned at the symposium; 3 had used info for referrals; 3 system changes, i.e., assessment forms, clinical strategies • Shared information with others: 8 informally, 1 structured, 4 created materials • Most common topics: medication/adherence, HIV in general

  31. Strengths & Challenges of Methods • What went well: Announced at end of symposium/conference (JG, RV); brief instrument encouraged higher response (JG); longer instrument yielded rich responses (RV) • What's next: Provide incentives (JG, RV); change instrument (shorter, easier instrument for higher response rate (RV), longer instrument for greater depth (JG)); more effective confirmation of contact information (JG, RV)

  32. Analysis of HIV Patient-Provider Communication Bradley O. Boekeloo, Ph.D., Sc.M. NMAETC, Delta AETC Grant #6 H4A HA 00066-02-01 from the National Minority AIDS Education and Training Center, Health Resources and Services Administration

  33. Methods Providers Randomized (n=8) • Brief cultural competency training vs. none Audiotapes of HIV Visits (n=24) • 3 patient visits tape recorded per physician. • Tapes transcribed. Patient Exit Questionnaire (n=24) • Interviewer read patient questions and patient answered on an answer form.
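
The slides do not describe how the eight providers were allocated to study arms, so the following is only a generic sketch of a 1:1 randomization, with hypothetical provider identifiers.

```python
import random

def randomize_providers(providers, seed=None):
    """Illustrative 1:1 allocation of providers to intervention vs. control."""
    rng = random.Random(seed)
    shuffled = list(providers)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {"intervention": shuffled[:half], "control": shuffled[half:]}

# Eight hypothetical providers, four per arm as in the study design.
providers = [f"provider_{i}" for i in range(1, 9)]
print(randomize_providers(providers, seed=2004))
```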

  34. RESULTS: Randomized Trial Audiotape Observations (Control n=4 vs. Intervention n=4; Mean ± S.D.) • Patient word count: Control 991 ± 490; Intervention 1050 ± 629 • Length of visit (minutes): Control 20 ± 8.3; Intervention 20 ± 7.2

  35. RESULTS: Randomized Trial Exit Interview Observations (1 = Very uncomfortable, 4 = Very comfortable; Control n=4 vs. Intervention n=4; Mean ± S.D.) • Comfort talking to Dr. about sex: Control 3.3 ± 0.7; Intervention 3.6 ± 0.7 • Comfort talking about substance use: Control 3.5 ± 0.5; Intervention 3.3 ± 1.0 • Comfort talking about medication: Control 3.6 ± 0.9; Intervention 3.7 ± 0.9
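
The presentation reports only descriptive statistics. Purely as an illustration, group means like these could be compared directly from summary statistics, as sketched below; with n = 4 per arm any such test is badly underpowered, and this is not an analysis from the study.

```python
from scipy.stats import ttest_ind_from_stats

# Illustrative Welch's t-test computed from the reported summary statistics
# for "comfort talking to Dr. about sex" (control vs. intervention, n = 4 each).
# The presentation itself does not report significance tests.
result = ttest_ind_from_stats(
    mean1=3.3, std1=0.7, nobs1=4,   # control
    mean2=3.6, std2=0.7, nobs2=4,   # intervention
    equal_var=False,
)
print(result.statistic, result.pvalue)
```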

  36. Hypothesis Based on Exploratory Data and Next Steps • Brief Intervention not enough for change • Patients may be more comfortable discussing medical therapy than personal risk behaviors • Try to determine whether different types of communication on audiotapes account for differences in patient comfort communicating with physician.

  37. Intensive On-Site Training Evaluation: A Mixed Methods Approach Debbie Isenberg, MPH, CHES Southeast AETC

  38. Background • Intensive On-Site Training (IOST) • Involves training, consultation, technical assistance and information dissemination (Levels I-V) • Targeted towards new Ryan White Title III and other rural health sites • Central office-based clinical instructor spends a half day to a full day at the site

  39. Study Overview • Main research questions • Process and Impact (Reaction and Learning) • What was the quality of the training? • How well were learning objectives met? • What are the trainees’ intentions to change their clinical practice? • Outcome (Learning and Behavior) • How has the provider’s experience in the clinical training program impacted his/her ability (if at all) to provide HIV quality care to PLWH?

  40. Study Protocol • Phase One • Post training CQI form completed by participants • Phase Two • Recruitment packets mailed 3 months after last IOST • Research staff contact potential participants 1 week later for interview • Phase Three • Reminder letter for 2nd interview sent 9 months after initial interview (total 12 months post IOST) • Research staff contact participants 1 week later for interview
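
For illustration, the contact schedule implied by these phases can be sketched as below; calendar months are approximated as 30-day blocks, and the function name and example date are hypothetical.

```python
from datetime import date, timedelta

def contact_schedule(last_iost: date) -> dict:
    """Rough timeline implied by Phases Two and Three (months ~ 30 days)."""
    packet_mailed = last_iost + timedelta(days=30 * 3)           # 3 months post IOST
    first_interview = packet_mailed + timedelta(weeks=1)         # contacted 1 week later
    reminder_letter = first_interview + timedelta(days=30 * 9)   # 9 months after interview
    second_interview = reminder_letter + timedelta(weeks=1)      # ~12 months post IOST
    return {
        "recruitment packet mailed": packet_mailed,
        "first interview": first_interview,
        "reminder letter": reminder_letter,
        "second interview": second_interview,
    }

print(contact_schedule(date(2004, 8, 24)))  # hypothetical date of last IOST
```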

  41. Content: Phase Two and Three • Written Demographic Assessment (PIF+) • Semi-Structured Phone Interview (Tape recorded) • Quantitative: participant asked to rate the effect of training in each specific training area • Qualitative: participant asked to give concrete examples of how training has affected their skills in the clinical area • If no effect reported, participants are asked for more explanation
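
One possible way to capture the paired quantitative ratings and qualitative examples from each interview is sketched below; the field names are assumptions, not the project's actual codebook.

```python
from dataclasses import dataclass, field

@dataclass
class TrainingAreaResponse:
    """Hypothetical record for one training area in the semi-structured interview."""
    area: str                        # e.g. "client communication/education"
    effect_rating: int               # participant's quantitative rating of the training effect
    examples: list = field(default_factory=list)   # concrete examples of changed practice
    no_effect_explanation: str = ""  # probed only when no effect is reported

response = TrainingAreaResponse(
    area="labs",
    effect_rating=4,
    examples=["(participant's concrete example of a practice change)"],
)
print(response)
```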

  42. IOST Results • 17/25 clinicians have participated (68% participation rate) • 2 MDs, 9 NPs, 5 RNs, 1 CNM • From AL, NC, SC & GA • Areas most frequently mentioned where change occurred: • Client communication/education • Labs • Medication • Identification of early signs and symptoms of infection

  43. Strengths and Challenges

  44. Lessons Learned • Think about what motivates the training audience to participate in the study when deciding on the study design • Develop the protocol to lower respondents' form and time burden • Don't be afraid to change the protocol midway through the study if it is not working • Consider the resources you have to collect and analyze the data when choosing a study design

  45. Presenter Contact Information • NMAETC, Delta AETC: Brad Boekeloo, PhD, ScM 301-405-8546 bb153@umail.umd.edu ASSESS materials available at www.socio.com • AETC National Evaluation Center: Janet Myers, PhD, MPH 415-597-8168 jmyers@psg.ucsf.edu Edwin Charlebois, PhD, MPH 415-597-9301 echarlebois@psg.ucsf.edu • NY/NJ AETC: Mari Millery, PhD 212-305-0409 mm994@columbia.edu • Delta Region AETC: - Cheryl Hamill, RN, MS, ACRN 601-984-5552 chamill@medicine.umsmed.edu - Nancy Showers, DSW 732-603-9681 njshowers@aol.com • Southeast AETC: - Debbie Isenberg, MPH, CHES 404-727-2931 disenbe@emory.edu • Pacific AETC: Richard Vezina, MPH 415-597-9186 rvezina@psg.ucsf.edu
