
Role of the CanERA Reviewer Workshop






Presentation Transcript


  1. Role of the CanERA Reviewer Workshop July 15th, 2019

  2. Agenda for the Day

  3. Learning Outcomes At the end of this workshop, surveyors will be able to: • Use the CanAMS to enter and collate data for their program’s accreditation review • Describe the requirements and procedure for the CanERA accreditation review • Assess programs on their compliance with the CanERA standards • Recognize how programs can demonstrate achievement of CanERA requirements and indicators

  4. The New CanERA System

  5. Acknowledgements: Royal College (Drs. Karen Finlay and Joanne Todesco)

  6. CanRAC… CanERA… CanAMS… CanWHAT? • Canadian Residency Accreditation Consortium (CanRAC): The conjoint group representing the Royal College, CFPC, and CMQ tasked with the development and ongoing improvement of CanERA • Canadian Excellence in Residency Accreditation (CanERA): The name given to the new system of accreditation • Canadian Accreditation Management System (CanAMS): The digital accreditation management system, a fundamental component of CanERA

  7. Highlights of CanERA • New accreditation standards • Digitized accreditation management system (CanAMS) • Accreditation site visits • Institution accreditation decisions • 8-year review cycle • Focus on continuous quality improvement and the learning environment

  8. When will this affect me? www.CanERA.ca

  9. Previous Survey Cycle (OLD) Cycle diagram (steps 1–6): RCPSC Site Survey; continuous program self-review (CQA), including faculty evaluations, rotation evaluations, curriculum evaluations, trouble-shooting, and overall program review (e.g. retreat); PG Office Internal Review

  10. New Survey Cycle • 8 years between regular on-site accreditation visits • New electronic tools for surveyors / less paper & repetition / more flexibility • 2-year follow-ups (some onsite, some not) • Introduction of common software/database (AMS) • Data collected by institutions throughout the cycle, including new sources of information • Selected data provided to RCPSC throughout the cycle

  11. The Survey Process (stays the same) Flow diagram: PSQs ➡ Program Profiles, with comments added by the University, Specialty Committee, Royal College, and Program Director before reaching the Surveyor

  12. Standards of Accreditation Institution (PGME) Accreditation • General Standards of Accreditation for Institutions with Residency Programs (Replacing the “A” Standards) Residency Program Accreditation • General Standards of Accreditation for Residency Programs (Replacing the “B” Standards) • Specific Standards of Accreditation for each discipline (template aligned with general standards)

  13. Review of the Standards New standards, not so new. Review the new standards with your Residency Program Committee: • What needs to be in place? • Fully compliant; partially compliant? New standards will be used for the Internal Reviews. Talk with other PDs/PAs in your discipline and outside it for tips on the ‘newer standards’

  14. How are the new accreditation standards different? • Accommodation of time-based and competency-based education models • Written in alignment with the new standards organization framework (however, A PROGRAM IS STILL A PROGRAM!) • Increased focus on outcomes (“Show me that it works”) • Increased clarity of expectations, including increased clarity around required evidence within the AMS • Renewed emphasis on the learning environment and continuous improvement

  15. Program Standards 5 domains: • Program Organization • Education Program • Resources • Learners, Teachers, & Administrative Personnel • Continuous Improvement. Key changes: • Blueprinted from B1-6 • Updated, clarified, & reorganized • Increased focus on outcomes, the learning environment, & CQI • CanMEDS framework remains

  16. Residency Program Accreditation Standards

  17. Additional Discipline-Specific Documents Traditional cohorts: • Objectives of Training (OTR) • Specialty Training Requirements (STR). Competence by Design (CBD) cohorts: • Competencies • Training Experiences. Note 1: Programs are held to the standards in place 12 months in advance of review. Note 2: Guidance is provided on how to address education design and delivery expectations for those with mixed cohorts.

  18. Standards Organization Framework • Domain: Defined by the Future of Medical Education in Canada-Postgraduate (FMEC-PG) Accreditation Implementation Committee to introduce common organizational terminology and to increase alignment of accreditation standards across the medical education continuum • Standard: The overarching outcome to be achieved through the fulfillment of the associated requirements • Element: A category of the requirements associated with the overarching standard • Requirement: A measurable component of a standard • Indicator: A specific expectation used to evaluate compliance with a requirement (i.e. to demonstrate that the requirement is in place). Mandatory indicators must be met to achieve full compliance with a requirement; exemplary indicators provide objectives beyond the mandatory expectations and may be used to introduce indicators that will become mandatory over time.

  19. Requirement Rating Scale – NEW! • Meets: all mandatory indicators met • Partially meets: at least one, but not all, mandatory indicators met • Does not meet: none of the mandatory indicators met

  20. Example - Domain: Program Organization

  21. New Terminology • Area for Improvement (AFI): A not met or partially met requirement. • Some AFIs may require College review in two years • Some AFIs may not require College review until the next regular accreditation review

  22. New Terminology: LPI • Leading Practice and/or Innovation (LPI): a practice (method, procedure, etc.) that is noteworthy for the discipline or residency education writ large, and/or is unique and innovative in nature

  23. New Accreditation Categories Your Institution will now receive an accreditation category too New programs will now have an External Review Mandated Internal Reviews & Progress Reports are replaced by the APOR APOR = Action Plan Outcomes Report

  24. What’s new? Principles for decision-making • Increased emphasis on CQI • Entrusting programs/institutions to drive their own continuous improvement • Demonstrated CQI efforts (e.g. AFIs identified within the APO instrument) • Iterative expectations for newer requirements, while understanding that imperatives from the current system still apply • Ensuring consideration of: persistence; impact on the education environment; and strengths of the institution’s IR process. Note: These are applied to the overall recommendation, not at the requirement rating/indicator level

  25. POSSIBLE CATEGORIES OF ACCREDITATION - NEW • A – RS • A – APOR • A – ER • NOTICE OF INTENT • WITHDRAWAL

  26. Accredited program with follow-up at next regular onsite survey (i.e. in 8 yrs) • Acceptable compliance with standards (could have AFIs) • Expectation of good, ongoing CQI throughout the cycle

  27. Accredited program with follow-up by APOR One (or more) significant area(s) for improvement impacting the overall quality of the program, requiring follow-up prior to the next regular onsite review, and which can be evaluated via submission of evidence from the program. Predictable 2-year follow-up

  28. APOR = Action Plan Outcomes Report • Replaces A-IR and PR • Living register tracking how weaknesses (AFIs) are being addressed • Discussion with PGME and Program as how best to address AFIs

  29. APOR: Expectations • How AFIs for follow-up by APOR are addressed is at the discretion of the institution (in consultation with the residency program, as appropriate) • Information submitted must include sufficient detail regarding: • How the AFIs have been addressed • Documented outcomes and evidence • Indication that a problem has been addressed without supporting documentation of outcomes and evidence is not sufficient

  30. APOR Expectations: Outcomes/Evidence • A narrative overview of actions taken + documented evidence/outcomes • Examples include: • Updated documentation, including details of approach to implementing/communicating changes • Broad or focused Internal review report • Report from a specific institution committee/structure (e.g., wellness centre) regarding actions taken, including data/information to demonstrate effectiveness

  31. APOR: Actions to address an AFI-2Y • Examples include: • Focused follow-up (e.g., updates to the curriculum plan or policies/procedures, including communication and implementation of changes) • Implementation of new structures or mechanisms to address certain areas, e.g., Wellness Centre focused initiative(s) • Addition and implementation of specific resource(s)

  32. Accredited program with follow-up by External Review One (or more) significant area(s) for improvement impacting the overall quality of the program, requiring follow-up prior to the next regular onsite review, and which is best evaluated by external peer reviewers. Factors: persistent area(s) requiring improvement; the nature of the area(s) for improvement may require a reviewer from outside the university and/or from the same discipline; concerns with the program’s or institution’s oversight or CQI of the program. Predictable 2-year follow-up

  33. Accredited program on Notice of Intent to Withdraw There are major and/or continuing concerns which call into question the educational environment and/or integrity of the residency program and its ability to deliver high-quality residency education. OR: Despite notifications and reminders, the program has failed to complete and submit the required accreditation follow-up by the deadline. Current residents & CaRMS applicants must be made aware. Predictable 2-year follow-up; the onus is on the program to show why accreditation should not be withdrawn

  34. INTERNAL REVIEWS • Important evidence of institutional CQI • An accreditation standard for institutions. McMaster process: • Similar process to the Royal College/CFPC on-site surveys • Same documentation • Similar accreditation decisions (exception: notice of intent) • Accreditation Committee to review all reports • Follow-up to be determined by the Accreditation Committee: APOR or Full Review

  35. Internal Review • Does not need to be perfect • Quality assurance and quality improvement process • Standardised process for follow-up is important • Attention to program CQI • Attention to the learning environment. What we can learn collectively: • New standards: what are frequent compliance issues? • Guide education and resources • Best practices • Areas of learning environments that need improvement

  36. Survey Team • Chair: usually a PD or past PD • Faculty member • Resident • Some programs will have an external faculty member from the same discipline (resource issues; large feeder programs; program identification of need)

  37. Information Provided to Surveyors in Advance • Access to the Program Profile (incl. response to previous weaknesses) • Specialty-specific documents • Survey Report & Transmittal Letter from the last RCPSC survey

  38. Information Provided to Surveyors on Site • Resident assessments • Faculty/ rotation evaluation • RPC & Competence Committee Minutes (past 6 years)

  39. Program Profile (previously known as the PSQ) IT’S A BIG DEAL!

  40. First impressions count! • Describes how your program is meeting each standard – “evidence” • Guides the surveyor’s questions • Reviewed by many: the PG Dean, your Surveyor(s), and Accreditation Committee members

  41. Be clear & thorough – • If you are doing something a bit different or are dealing with a challenge, tell us all about it & defend your choices • Attend to spelling, grammar, & formatting • Get help from others. Give yourself lots of time. Hunt down all the numbers, institutional policies, & governance information. You should be the most informed person about your program!

  42. Use abbreviations where necessary, but always include a legend • Final draft should be reviewed by your RPC, including resident reps & department head • Tell what is happening now rather than what you wish to happen

  43. And finally … AVOID: • We will be … • We hope to … • Only using “role modeling” &“observation” for the intrinsic CanMEDS Roles (OK in 2000, not OK in 2019)

  44. Additional Resources CanERA has developed online training modules. The modules will allow you to: Familiarize yourself with the standards Understand and navigate the CanAMS Access the training modules here: http://www.royalcollege.ca/mssites/canera-uprh/index.html#/

  45. The Role of the Surveyor

  46. Program Accreditation Review Schedule • Must include (in sequence): • Document review • Program director • Program Administrator • Department/division chairs • Residents (groups of 20) • Teaching faculty • Competence Committee (or equivalent) • Residency Program Committee • Program director (15 mins if needed) • Exit meeting (15 min) – the next morning at the hotel (7:30am) • Include as appropriate: • Lunch (30 min) • Breaks (15 min) – mid-morning and mid-afternoon

  47. Surveyors meeting with PD • Discussion may focus on (but not be limited to): • Overall view of program (with respect to alignment with standards, strengths, areas for improvement) • How program addressed previous AFIs • Specialty committee questions • PA (e.g. role, support, professional development) • Resources • Collaboration with other programs • Resident performance/progress • Learning environment (safe? positive?, FRM, process to address concerns) • Teacher assessment/recognition • Leadership

  48. Surveyors’ Meeting with the PA • Discussion may focus on (but not be limited to): • Overall impression of the program • PA specific professional development opportunities, protected time, roles/responsibilities.

  49. Surveyors’ Meeting with Division/Dept. Head • Discussion may focus on (but not be limited to): • Overall impression of the program (strengths, areas for improvement) • Support/resources available to the program • Relationship, communication, and collaboration with the program/PD/RPC • Teacher assessment • Program collaboration with other programs within the division/department

  50. Surveyor’s Meeting with Residents • Discussion may include (but not be limited to): • Overall impressions of the program (strengths and areas for improvement) • Interaction with PD (accessibility, support, etc.) • Environment (supportive, positive, safe?, FRM) • Opportunities to provide feedback and communication throughout the program • Policies/processes (are they effective?) • Resources • Resident assessment • Supervision and educational experiences • Competence by Design (as appropriate) • Clinical responsibilities • Scholarship and research support/opportunities
