
Updates from the Residency Review Committees for Internal Medicine & Pediatrics





Presentation Transcript


  1. Updates from the Residency Review Committees for Internal Medicine & Pediatrics MPPDA Meeting April 10, 2014 Caroline Fischer, MBA, Executive Director, RC-Peds Jerry Vasilias, PhD, Executive Director, RC-IM

  2. What’s new… • NAS is here… • Brief summary of NAS • Highlights from last RC meeting • Workflow Changes • Highlights from Policy and Procedures Manual • Citations and non-citations • Milestones • Recent news

  3. One step back…

  4. Why NAS? • To achieve the promise of outcomes-based accreditation • Annual review of programs to identify “problem programs” and help them improve • Reduce the burden of accreditation Some key elements of NAS: • Most data in NAS already in place • Annual ADS data entry replaces PIFs • Self-studies every 10 years • Site visits only when “issues” arise • “Internal Reviews” no longer required • Programs in good standing can innovate with “detail” PRs • A significant % of common and specialty PRs are “detail” • Citations and Areas for Improvement

  5. The Data Elements of NAS… • The following are the primary annual data elements: • Program Attrition • Program Changes • Scholarly Activity • Board Pass Rate • Clinical Experience Data • Resident Survey • Faculty Survey • Milestones

  6. Spotlight: Clinical Experience Data • The following are the primary annual data elements: • Program Attrition • Program Changes • Scholarly Activity • Board Pass Rate • Clinical Experience Data • Resident Survey • Faculty Survey • Milestones

  7. Annual Data Element #5: Med-Peds Clinical Experience Data • Composite variable on perceptions of clinical preparedness. • How measured: 4th-year residents’ responses to the Resident Survey • IM: • Adequacy of clinical and didactic experience in IM, subspecialties, EM, & Neuro • Variety of clinical problems/stages of disease? • Experience with patients of both genders and a broad age range? • Continuity experience sufficient to allow development of a continuous therapeutic relationship with a panel of patients • Ability to manage patients in the prevention, counseling, detection, diagnosis, and treatment of diseases appropriate for a general internist?

  8. Performance Indicator #5: Med-Peds Clinical Experience Data • Peds: • How well prepared are you to perform procedures without supervision? (List from PRs) • How well prepared are you to perform patient care activities without supervision? • How satisfied are you with the patient volume, range of patient ages, variety of medical conditions, and extent of progressive responsibility in the care of patients? • How satisfied are you with the educational experiences to help you achieve competency in patient care skills? • How satisfied are you with aspects of your longitudinal outpatient experience? • Are you well prepared to competently practice general pediatrics?

  9. Where did most of the NAS data elements come from? • In 2009, a data modeling project began to identify factors that predicted high and low program performance • The model was replicated • Data elements were assessed to determine “relative risk” for predicting low performance • Selected elements needed to be: • Obtainable • Meaningful • Correlated with prior decisions • Passed statistical “muster” • Used in combination • Understand that this is a work in progress • New data elements are likely in the future

  10. Role of Review Committees in NAS • Utilize data and judgment to: • concentrate efforts on problem programs • determine whether accreditation standards are violated and provide useful feedback for programmatic improvement • determine whether these violations (citations) rise to a level requiring alteration in accreditation status • motivate programs to rapidly improve, rather than playing the “accelerating accreditation action game” • over time, understand and refine the nuances of the process • How? • Annually reviewing programs using a set of data elements • Using site visits as needed, and conducting a complete review of the program every 10 years • Using a “PIF-less”, team-based, department-wide evaluation of programs

  11. Two steps forward….

  12. NAS Year 1: Ground Rules • Basic operational principle of NAS: • RCs will take an accreditation action on every program annually. • All med-peds programs will receive notice regarding accreditation status between January and July. • At the January and March 2014 meetings, RCs reviewed NAS data submitted in AY 2012-2013 • ADS annual update information submitted in fall of 2012 • Faculty and Resident survey data from early spring of 2013 • Responses to “previous citations” were current

  13. NAS Year 1: Ground Rules • All programs on warning or probation seen by reviewers • All programs identified by NAS data as “troubled” underwent review by RC staff and then members • What data elements were triggered? • Not all data elements have same importance/weight • Board scores and resident survey have more weight • Cognizant of changes to PR for Board score pass rates for Peds • Are programs still getting used to data elements (e.g., scholarly activity table)? • Are there patterns/trends in data?

  14. What did we expect would happen in NAS?

  15. NOT this…

  16. In OAS… 82% of med-peds programs had a review cycle between 3 and 5 years* *ACGME Data Resource Book 2012-2013, based on 78 med-peds programs; available at www.acgme.org.

  17. NAS Conceptual Model: Expected Outcomes [Diagram: accreditation pathways — Initial Accreditation (new programs), Continued Accreditation (accredited programs without major concerns), Continued Accreditation with Commendation, Accreditation with Warning / Probationary Accreditation (new programs and accredited programs with major concerns), and Withhold/Withdrawal of Accreditation — each reviewed against the standards (structure, resources, core process, detailed process, outcomes), with expected proportions of 2-4%, 15%, 75%, and 6-8% as summarized on the next slide.]

  18. NAS Conceptual Model: Expected Outcomes • New (Initial): 2-4% • Good Standing: 75% • Warning/Probation: 15% • Withheld/Withdrawn: 6-8%

  19. NAS Year 1: Expected vs. Actual Outcomes (80 med-peds residency programs) • Actual: Continued Accreditation (Good Standing) 92%, Warning/Probation 1%, New Programs (Initial) 1%, Site visit scheduled 6% • NAS projections: Good Standing 75%, Warning/Probation 15%, New (Initial) 2-4%, Withheld/Withdrawn 6-8%

  20. How will NAS change the flow of information and the work of the RC and the program?

  21. Reporting & RC Review Timelines, AY 2012-2013

  22. Reporting & RC Review Timelines, AY 2013-2014

  23. What new Policies and Procedures do I need to know about?

  24. Highlights from New P&P Manual • Review cycles are gone! Why? Annual accreditation. • Except for new programs, which get a 2-year cycle plus a site visit • Adverse actions are no longer “proposed” • Probation only after a site visit • Core and Subs have the same status options • Probation now an option for Subs • Citations as well as non-citations • Core + Subs re-coupled…again • Self-study = RC will see the entire department • Probation for core = probation for Subs

  25. Accreditation Status Options: Continued Accreditation Significant changes: “Continued Accreditation with Warning” appears as such on the website; an adverse action is no longer “proposed” and can be granted only after a site visit; Subs can now also be placed on “Probationary Accreditation.” * Probation cannot exceed 2 years ** Does not require Probation first

  26. Citations in NAS Citations are not new • Identify areas of non-compliance • Linked to specific requirements • Responses required in ADS • Citations are given and removed by the RC (not by staff) Phase I specialties: • Citations received in NAS (after July 1, 2013) will be reviewed annually by an RC member. • Citations received in OAS (prior to July 1, 2013) will go away after two cycles of continued accreditation in NAS with no new citations.

  27. Areas for Improvement (AFI) AFIs are new in NAS • “General concerns” • May be given by staff (RC rules) or by RC members • May not be specifically linked to a requirement • Do not require written response in ADS • Expectation that AFIs will be monitored locally • PD and GMEC will work to resolve • AFIs will be tracked by RC

  28. Citations vs AFIs • In OAS, the main mechanism to provide feedback was through citations • In NAS, we have 2 methods: citations and AFIs • Citations require annual review by a member of the RC In all likelihood… citations will be used more sparingly, in hopes that AFIs trigger appropriate local program improvement.

  29. Milestones

  30. Reporting Milestones: Med-Peds • ANNUAL reporting of milestones • Report BOTH IM (22) and Peds (21) milestones • Semiannual evaluation process as usual/required • Plan will be revisited • AY 2013-14: Reporting period for both IM & Peds milestones = May 1 – June 15, 2014 • AY 2014-15: Reporting period for both IM & Peds milestones = May 1 – June 15, 2015

  31. Reporting Milestones: Use by RCs • De-identified, aggregate (program) data will be used as ONE data element RC can look at • Cannot be fully used for several years • Initially, ascertain that programs are reporting • Next, check for completeness of data, etc • Review of patterns or trends will be important

  32. Clinical Competency Committee • The PD must appoint a CCC. (Core) • Must be composed of at least 3 faculty (Core) • Others eligible for appointment to the committee include faculty from other programs and non-physician members of the health care team. (Detail) • Written descriptions of responsibilities (Core) • Review all resident evaluations semi-annually (Core) • Prepare/assure reporting of milestones evaluations of each resident to ACGME semi-annually (Core) • Make recommendations to the PD for resident progress, including promotion, remediation, and dismissal (Detail)

  33. Clinical Competency Committee cont. • PRs do not specify composition; each program may decide best structure • PRs do not limit PD’s role • PRs do not define specialty, degree, role for members of CCC • “Best practices” may be defined by community • Review FAQ

  34. Recent News: Single GME Accreditation System

  35. Other Recent News • Faculty Development slide decks available on ACGME’s site • Resident survey results will not be available until mid-June • ACGME leadership is aware this will be problematic for many and has agreed to reassess for next year.

  36. Program Resources: ACGME Contacts • Questions related to ADS: • Samantha Alvarado (salvarado@acgme.org) 312.755.7118; WebADS@acgme.org • Questions related to site visits: • Ingrid Philibert (iphilibert@acgme.org) 312.755.5003 • Jane Shapiro (jshapiro@acgme.org) 312.755.5015 • Penny Lawrence (pil@acgme.org) 312.755.5014

  37. Questions? • Questions related to requirements or notification letter: • IM Staff: • Jerry Vasilias (jvasilias@acgme.org) 312.755.7477 • Karen Lambert (kll@acgme.org) 312.755.5785 • Billy Hart (bhart@acgme.org) 312.755.5002 • Jessalynn Watanabe (jwatanabe@acgme.org) 312.755.5784 • Peds Staff: • Caroline Fischer (cfischer@acgme.org) 312.755.5046 • Denise Braun-Hart (dbraun@acgme.org) 312.755.7478 • Kim Rucker (krucker@acgme.org) 312.755.7054 • Luz Barrera (lbarrera@acgme.org) 312.755.5077

  38. Thank you. “You can’t teach an old dogma new tricks.” - Dorothy Parker
