Using an Institutional Report Card to Support Evidence-Based GME Decision Making


Presentation Transcript


  1. Using an Institutional Report Card to Support Evidence-Based GME Decision Making Conference Session: SES46 2010 ACGME Annual Education Conference Ann Dohn, MA, DIO; Alice Edler, MD, MPH, MA (Educ); Nancy Piro, PhD; and Bardia Behravesh, EdD, Program Managers/Ed Specialists Department of Graduate Medical Education Stanford Hospital & Clinics

  2. AGENDA • This workshop session will: • Discuss the need for comparative programmatic evaluation • Describe existing “report card/scorecard” models from industry that could be used in our institutions • Present an example of the “Stanford Report Card” for comparative program evaluation • Facilitate exercises that will support participants in developing “report cards” and their uses based on individual needs.

  3. Session Objectives • At the end of this session, participants will be able to: • Understand the basis of organizational performance assessment • Describe some different models for organizational report cards • Identify key areas to include in GME “Report Cards” • Understand key considerations for using and distributing programmatic evaluation data.

  4. Stanford Background Stanford University Medical Center currently sponsors 82 ACGME-accredited training programs with over 1000 enrolled residents and fellows.

  5. Stanford University Medical Center Mission • Dedication to pursuing the highest quality of patient care and graduate medical education, recognizing that one of its major responsibilities is the provision of organized educational programs. • Support of quality graduate medical education programs and excellence in residency training and research. • Guidance and supervision of the resident while facilitating the resident’s professional and personal development and ensuring safe and appropriate care for patients. • Provision of adequate funding of graduate medical education to ensure support of its faculty, residents, ancillary staff, facilities, and educational resources to achieve this important mission. • Ensuring that all of its graduate medical education programs meet or exceed the Institutional and Program Requirements promulgated by the Accreditation Council for Graduate Medical Education.

  6. Why Do This? • We know we're great… our residents love us! • Every Program Director will tell you so…

  7. Can We Wait? • Can we afford to be slow moving? • Can we wait for ACGME site visits? • Can we wait for Internal Reviews?

  8. But… • Our goal is a five-year ACGME cycle • Internal reviews come at the 2½-year mark… • A lot can happen in 2½ years

  9. We Think We Need This • ACGME and institutions are increasingly holding DIOs and GME Committees accountable for their utilization of institutional resources. • Actions and decisions must be based on documented, real-time analyses of needs.

  10. DIOs need to be able to make evidence-based programmatic decisions based on comparative data

  11. Few Models Exist Today For GME • Prior to the era of outcome competencies, educational quality was perceived solely as test-score measurement and credentialing accomplishment. • With the introduction of core competency education, medical educators, learners, and patients are demanding a more holistic approach to quality medical training.

  12. Few Models Exist Today For GME • The concept of institutional accountability is relatively new. • Until the ACGME Outcome Project, there was no centralized curriculum oversight in GME, unlike in undergraduate medical education (UME).

  13. The Report Card Vision • In 2005, Stanford hired its first PhD in GME • The vision was to develop tools to support evidence-based decision making for Graduate Medical Education, consistent with our mission • "We needed a Report Card"…

  14. SIGH…. • It wasn’t as easy as first thought!

  15. Our First Attempt …

  16. Background on Institutional Report Cards • Government and Industry Models • Multiple models exist and can be used according to the specific purpose: • GPRA (Government Performance and Results Act) • Organizational Report Cards • Balanced Scorecard • Benchmarking • Program Evaluations • Social Indicators

  17. Report Card vs. Balanced Scorecard

  18. Which Model to Choose? • We needed a model that: • Was organizationally focused and managed • Had a track record of effective use • Fit our existing structure of multiple programs and organizations • Was flexible enough to be adapted for annual use rather than an accreditation cycle (regular data collection) • Offered "easily digestible" internal and external measurement dimensions

  19. Our Choice • The Balanced Scorecard framework in an Organizational Report Card (Scorecard) tool • The best of both worlds

  20. Stanford Hospital & Clinics Report Card • The SHC Report Card is built on the Balanced Scorecard conceptual framework for translating an organization’s vision into a set of performance indicators distributed among four perspectives adapted for GME: • Resident Perception Measurements • Program Processes • Learning Outcomes • Financial/Growth

  21. Stanford Hospital & Clinics Report Card 1. Resident Perception Measurements: "Guidance and supervision of the resident while facilitating the resident's professional and personal development and ensuring safe and appropriate care for patients." 2. Program Processes: "Ensuring that all of its graduate medical education programs meet or exceed the Institutional and Program Requirements promulgated by the Accreditation Council for Graduate Medical Education."

  22. Stanford Hospital & Clinics Report Card 3. Learning Outcomes: "Support of quality graduate medical education programs and excellence in residency training and research." 4. Financial/Growth: "Provision of adequate funding of graduate medical education to ensure support of its faculty, residents, ancillary staff, facilities, and educational resources to achieve this important mission."
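[Aside, not part of the original deck: the four perspectives above can be sketched as a simple data structure. In the minimal Python sketch below, the perspective names come from the slides and the indicators are drawn from the measures slide later in the deck (slide 27), but their grouping under each perspective is an illustrative assumption, not Stanford's actual card.]

    # Minimal sketch: perspective names are from the slides; the grouping of
    # indicators under each perspective is an illustrative assumption.
    GME_REPORT_CARD = {
        "Resident Perception Measurements": [
            "overall_satisfaction", "recommendation_of_program", "wellness_index",
        ],
        "Program Processes": [
            "duty_hour_violations", "acgme_cycle_length", "acgme_citations",
        ],
        "Learning Outcomes": [
            "specialty_board_scores", "resident_publications", "itse_scores",
        ],
        "Financial/Growth": [
            "grants_awarded", "residents_in_program", "physical_space_facilities",
        ],
    }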

  23. The Balanced Scorecard Approach • The Balanced Scorecard is a performance measurement and performance management system developed by Robert Kaplan and David Norton (1992, 1996) • It has been adopted by a wide range of leading-edge organizations, both public and private. ("The Balanced Scorecard: Measures That Drive Performance," Harvard Business Review, Jan-Feb 1992; The Balanced Scorecard: Translating Strategy into Action, Harvard Business School Press, 1996)

  24. Stanford Hospital & Clinics Report Card • Some indicators are designed to measure SHC's progress toward achieving its vision; other indicators are designed to measure the long-term drivers of success. • Through the balanced scorecard, SHC: • Monitors its current performance (finances, resident satisfaction, learning outcomes, and program process results) • Monitors its efforts to improve processes and educate residents • Enhances its ability to grow, learn, and improve the quality of its fellowship and residency educational programs.

  25. Balanced Scorecard Strategic Perspectives [Diagram: Mission, Vision, and Strategy at the center, linked to four perspectives] • Resident: How do our residents see us? • Institutional/Financial Growth: Are our programs excelling? • Program Processes: Are we putting our resources in the right places? • Learning: Do we continue to improve (outcomes)?

  26. Measurement Across The Continuum • PRE: Measuring events that occur before the trainee arrives • NRMP Results • PERI: During Residency • ACGME Survey • POST: After they have left training to start their career. • Alumni Survey

  27. Selection of Report Card Measures • # Applicants/Open Positions • Match Depth • % Top Medical Schools • GME Internal HS Survey • Overall Satisfaction • Recommendation of Program • Teaching Quality • Curriculum Quality • Educational Leadership • Wellness Index • ACGME Survey • Compliant Responses • Alumni Survey • Faculty Eval of program • Resident Eval of program • Faculty Publications • Duty Hr Violations • ACGME Cycle Length • # ACGME Citations • Core Competency Self-Assessment • ITSE Scores • Annual Resident Publications • Annual Resident Presentations • # Safety Incident Reports • Specialty Board Scores • Core Competency Post Assessment • # Res in Program • Grants Awarded • Subspecialties/Program • Physical Space/Facilities
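[Aside, not part of the original deck: the slides list candidate measures but do not specify a scoring method. As one hedged illustration only, with min-max scaling, invented values, and hypothetical names (this is not Stanford's method or data), heterogeneous measures could be normalized to a common 0-100 scale across programs, inverting measures where a lower raw value is better.]

    from typing import Dict

    # Invented example values; program and measure names are hypothetical.
    raw: Dict[str, Dict[str, float]] = {
        "Program A": {"board_scores": 220, "duty_hour_violations": 4, "publications": 12},
        "Program B": {"board_scores": 205, "duty_hour_violations": 1, "publications": 30},
        "Program C": {"board_scores": 235, "duty_hour_violations": 9, "publications": 7},
    }

    # Measures where a lower raw value is better are inverted after scaling.
    LOWER_IS_BETTER = {"duty_hour_violations"}

    def normalize(raw: Dict[str, Dict[str, float]]) -> Dict[str, Dict[str, float]]:
        """Min-max scale each measure to 0-100 across all programs."""
        measures = {m for vals in raw.values() for m in vals}
        scaled: Dict[str, Dict[str, float]] = {p: {} for p in raw}
        for m in measures:
            col = [raw[p][m] for p in raw]
            lo, hi = min(col), max(col)
            for p in raw:
                score = 100.0 if hi == lo else 100.0 * (raw[p][m] - lo) / (hi - lo)
                scaled[p][m] = 100.0 - score if m in LOWER_IS_BETTER else score
        return scaled

    for program, scores in normalize(raw).items():
        print(program, {m: round(v) for m, v in scores.items()})

[Putting every measure on the same 0-100 scale is what lets very different quantities, such as board scores, duty-hour violations, and publication counts, sit side by side on one card.]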

  28. “Voice of the Residents”

  29. Program Processes

  30. Program Matrix

  31. Learning Outcomes

  32. Financial / Growth

  33. Case Study - Stanford • How the DIO uses the Report Card

  34. How Do We Use this Data? • Look at indicators that are resident-driven – the "Voice of the Resident" • Is there a discrepancy between the voice of the resident and the other indicators? • For example, would the majority of residents not choose the program again even though their board scores are high?

  35. How Do We Use this Data? • How Do the Programs Compare Against Each Other? • How do they compare against their ACGME Cycles?
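[Aside, not part of the original deck: as a hedged sketch of the comparison step (the 30-point threshold, field names, and values below are assumptions, not the deck's method), a simple check can flag programs where the voice of the resident diverges sharply from outcome indicators.]

    # Flag programs where resident perception and learning outcomes diverge,
    # e.g., low "would choose again" sentiment despite high board scores.
    # The threshold and all names/values are illustrative assumptions.
    DIVERGENCE_THRESHOLD = 30  # on a 0-100 normalized scale

    def flag_discrepancies(scores):
        """scores: program -> {'resident_voice': float, 'learning_outcomes': float}"""
        flags = []
        for program, s in scores.items():
            gap = s["learning_outcomes"] - s["resident_voice"]
            if abs(gap) >= DIVERGENCE_THRESHOLD:
                flags.append((program, round(gap)))
        return flags

    example = {
        "Program A": {"resident_voice": 35, "learning_outcomes": 90},
        "Program B": {"resident_voice": 80, "learning_outcomes": 78},
    }
    print(flag_discrepancies(example))  # [('Program A', 55)]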

  36. What’s Next? • GME Staff Brainstorming Session • The Why’s – Why are programs where they are? • Where do we need to focus our resources?

  37. Presenting the Data • Program Directors Monthly Forum • Protect the Name of the Program • Growth and Change not Blame

  38. Presenting the Data • Individual Meetings with Program Directors • Share Complete Data

  39. Action Planning • GME Staff working with Program Directors • Sharing Findings with GMEC and Administration

  40. Political Fallout • No Program Director wants to be at the bottom… • Defensiveness • Bragging Rights

  41. And by the way… this will help you answer: COMMON INSTITUTIONAL REVIEW DOCUMENT Question 30b: "Describe how the sponsoring institution monitors that each program provides effective educational experiences for residents that lead to measurable achievement of educational outcomes of the ACGME competencies."

  42. Questions
