
Evaluating a Ryan White Quality Management Program using quantitative and qualitative methods


Presentation Transcript


  1. Evaluating a Ryan White Quality Management Program using quantitative and qualitative methods. Ryan White All Grantee Meeting, November 29, 2012. Presenters: Mary Kay Diakite, LMSW and Graham Harriman, MA, HIV Care, Treatment and Housing Program, NYC Department of Health & Mental Hygiene

  2. Disclosures • This continuing education activity is managed and accredited by Professional Education Service Group. The information presented in this activity represents the opinion of the author(s) or faculty. Neither PESG, nor any accrediting organization endorses any commercial products displayed or mentioned in conjunction with this activity. • Commercial Support was NOT received for this activity.

  3. Disclosures • Mary Kay Diakite, LMSW • Has no financial interest or relationships to disclose • Graham Harriman, MA • Has no financial interest or relationships to disclose

  4. Learning Objectives At the end of this activity, the participant will be able to: • Apply quantitative and qualitative methods in quality management program evaluations • Determine the most appropriate methods to apply in their particular EMA

  5. Agenda • Background • Objectives • Methodology • Results • Quantitative • Qualitative • Discussion

  6. Background • Quality management (QM) program goals • Comprehensive performance measurement systems are developed & utilized to assess the quality of Part A services • Enable Part A providers to continue to be skilled in using QI tools & methodologies to enhance the quality of their services • Part A services improve health outcomes and quality of life for people living with HIV/AIDS • The Part A QM Program remains responsive to the changing science & epidemiologic needs of the NY EMA community

  7. Quality Management Activities

  8. Evaluation Objectives

  9. Survey Overview

  10. Methodology & Timeline

  11. Quantitative Results

  12. Respondent Characteristics

  13. Survey Respondents’ Characteristics

  14. Overall Satisfaction with QM Program* Overall mean: 3.8 *No difference in means by service category, F = 0.98, p = 0.45
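The slide above reports the across-category comparison as a one-way ANOVA result (F = 0.98, p = 0.45) without showing the analysis itself. The snippet below is only an illustrative sketch of how such a comparison could be computed in Python; the deck does not specify its tooling, and the file name qm_survey_responses.csv and the service_category / satisfaction column names are assumptions.

    # Illustrative sketch only; file and column names are assumptions,
    # not taken from the presentation.
    import pandas as pd
    from scipy import stats

    responses = pd.read_csv("qm_survey_responses.csv")

    # Overall mean satisfaction (the slide reports 3.8)
    print("Overall mean:", round(responses["satisfaction"].mean(), 2))

    # One-way ANOVA of satisfaction across service categories
    groups = [g["satisfaction"].dropna().to_numpy()
              for _, g in responses.groupby("service_category")]
    f_stat, p_value = stats.f_oneway(*groups)
    print(f"F = {f_stat:.2f}, p = {p_value:.2f}")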

  15. Satisfaction with Technical Assistance (TA)1,2 Overall mean: 3.75 1. No difference in means by service category, p > 0.05 2. TCC & OHY excluded because n<5
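The footnotes on this and the following slides indicate that service categories with fewer than five respondents (e.g., TCC, OHY, FNS, MH, SCF) were dropped before comparing means. A minimal sketch of that filtering step, under the same assumed file and column names as the sketch above:

    # Keep only service categories with at least 5 respondents,
    # mirroring the "excluded because n<5" footnotes; names are assumptions.
    import pandas as pd

    responses = pd.read_csv("qm_survey_responses.csv")
    counts = responses["service_category"].value_counts()
    eligible = counts[counts >= 5].index
    filtered = responses[responses["service_category"].isin(eligible)]
    print("Categories retained:", sorted(eligible))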

  16. Usefulness/Frequency of QM Activities

  17. Organizational Assessments (OA) 1. No difference in means by service category, p > 0.05 2. TCC, FNS, MH, SCF removed because n<5

  18. OA led to new strategies (N = 57)1,2 Overall mean: 3.42 1. No difference in means by service category, p > 0.05 2. EIS, FNS, MH, SCF & TCC removed because n<5

  19. Usefulness of Performance Reports1,2 1. No difference in means by service category, p > 0.05 2. TCC & OHY excluded because n<5

  20. Performance Reports, continued1,2 1. No difference in means by service category, p > 0.05 2. SCF, TCC & OHY excluded because n<5

  21. Performance Reports, continued1,2 1. No difference in means by service category, p > 0.05 2. SCF, TCC & OHY excluded because n<5

  22. Training Topic Preferences 4th choice: • Gaining leadership support • Consumer involvement

  23. Qualitative Results

  24. Most Useful Webinars 4th choice (6.3%): • Quality Improvement Topics

  25. Examples of New Approaches 4th choice (5.1%): • Retention Tool • QI as priority • New program forms

  26. Quotes • We have made QM a priority and have a dedicated steering committee that meets monthly. • The program now has an eye to improving patient care and looking at specific goals to work on that we feel will improve patient care and health. • I would say the main thing now is having the knowledge and skill to look at something, focus on it, and work towards improving it, and even if it does not generate the desired outcome, it is still a learning experience.

  27. How LNL helped accomplish QI goals 4th choice (7.2%): • Learn from other agencies

  28. Quotes • The LNL provided guidance and resources to develop and implement quality improvement strategies that were not available to members or that we did not have knowledge of. These tools were extremely helpful in improving our skills in quality improvement and shift our paradigm about determining quality measures and what to evaluate in our programs.

  29. Quotes • The QLN inspired our QI team to utilize fishbone diagrams when working on our QI project. • The LNL came to our site and gave us feedback on how we were doing overall.

  30. Quotes • It helped us to think outside the box and to separate what PHS expects from the program in terms of numbers with an overall and more general view on how services are provided and what is not working. We were then able to make appropriate changes to different areas of the program to make it more successful.

  31. Additional Comments

  32. Quotes • The learning network is a very helpful networking program. I look forward to going there and know that I can get assistance with challenges that I may be experiencing in the program. • It is a great opportunity to learn about what other providers are doing and also for us to share what has worked in our program.

  33. How program used performance reports

  34. Quotes • Upon receiving the quality reports we were able to focus on areas which needed improvement. One example was intakes/assessments. We were able to communicate our needs to other providers in which referrals were being funneled by several agencies and helped us to meet our goal.

  35. Quotes • We were able to look specifically at our screenings for STI and substance use to review for any possible edits and look at best practices when aligned with other agencies. • Used reports to identify any program shortcomings and then developed a QM project to address the findings.

  36. Ideas for Improvement - Themes • Meeting logistics • Repetitive information • People should attend regularly • Performance Reports • Overlap of meetings and groups • Setting annual goals for Learning Network

  37. Meeting Logistics • Time • Frequency • Snacks • Location

  38. Meeting Logistics: Time • Meetings smack in the middle of the day are very inconvenient • It would be helpful if meetings could be scheduled to either start at the beginning of the day (9:00am or 9:30am) or end at the end of the day (4pm or 5pm) • Meetings should be for a half-day only

  39. Meeting Logistics, Cont.

  40. Setting annual goals for Learning Network • Also it would be helpful if there were annual goals for learning networks so that expectations for what will be accomplished for the year are clear to everyone participating (if sub-grantees are expected to have plans, so should the learning network)

  41. Overlap of meetings and groups • When referring to the various quality improvement projects, it is extremely difficult figuring out what group of people you are referring to. For example, the HRR LN would be one group of people. Do they fall under the AI? Another group of people (NYSCHRO) usually comes in to collect data, are we referring to them? The UMRG sometimes facilitated by Susan Weigl, are you referring to the QI projects at this meeting? Too much overlap which makes it confusing to respond appropriately.

  42. Performance Indicators and Reports • Select far fewer indicators and make sure they are clearly defined. Right now, there are too many indicators, and it’s not at all clear that all programs define them in the same way. And whatever indicators are selected should be linked to program outcomes. • NYCSHRO reviews require way too much time and attention and I don’t think the resources required to conduct them pay off in any way that’s beneficial to the portfolio. • Performance reports given to us in a more timely fashion

  43. Limitations • Low response rate (24.6%) • Response bias • Generalizability • Respondents answering questions despite having no experience with QLNs (OHY & TCC) • Only Part A providers were included, although several statewide providers are part of the LN

  44. Lessons Learned & Next Steps • Appropriateness of QM activities • Review IPRO/NYCHSRO scheduling procedures • Review reporting deadlines • Implement simple solutions • Meeting logistics

  45. Questions? Thoughts?

  46. Acknowledgements • Respondents • Interagency group

  47. Obtaining CME/CE Credit • If you would like to receive continuing education credit for this activity, please visit: • http://www.pesgce.com/RyanWhite2012

  48. THANK YOU! Heather Mavronicolas, PhD, MPH Director, Quality Management & Special Projects Bureau of HIV/AIDS Prevention and Control hmavroni@health.nyc.gov Mary Kay Diakite, LMSW Technical Assistance Project Manager Bureau of HIV/AIDS Prevention and Control mdiakite@health.nyc.gov
