
R³P Project


Presentation Transcript


  1. R³P Project Faculty Development

  2. Question Themes: Day 1 • Engagement of faculty (5) • Recognizing faculty proficiency in assessment (3) • Create value for clinician-educator (1) • Evaluation process and small numbers (1) • Training and giving feedback (1) • Next “best” thing to face-to-face FD (1) • Mass production of assessment tools (1)

  3. Basic Principles • The assessment tool is only as good as the evaluator • There is no “holy grail” of assessment tools • Embed assessment in daily teaching activities • Lots of focused observation possible • Assessment should almost always be linked to feedback

  4. Basic Principles • Need strong learning climate in program • Negative climate will undermine assessment • Evaluation should be something you do with trainees, NOT to them. • Trainees should be active participants • Residents can perform a lot of their own assessment • Self audit of medical records • Videotape review with structured tools • Actively seek feedback from others

  5. Engaging Faculty Science and Professionalism • Use the evidence to demonstrate the known deficiencies in competence • 25 years of data demonstrating problems • Better supervision = better patient care • The “aha” moment • The “you will not believe what I saw today” experience

  6. Push/Pull Forces on Faculty Evaluation
  • Science (push): evidence-based practice; errors and competency gaps; evidence-based medical education
  • Professionalism (push): culture of quality in education; intolerance of incompetence; professional commitment
  • Science of assessment: build design support capability; program analysis; experience and competency in assessment; patient and trainee focus; teamwork in assessment; program resources
  • External pull: payers and government (rewarding performance, regulation); patients, families, and advocacy groups; accreditation (ACGME, LCME, NCQA, JCAHO); certification (ABMS, ABP)
  • Goals: increase the number of training programs effectively measuring competence → increase the number of trainees involved in effective self-assessment → improved population health
  Based on Orleans CT, Anderson N, and Gruman J. Roadmaps for the next frontier: Getting evidence-based behavioral medicine into practice. Society of Behavioral Medicine Annual Meeting, San Diego, March 1999.

  7. Engaging Faculty Feasibility • Provide usable assessment tool • Don’t spend a lot of time developing local tools • Embed FD into routine activities where possible • Any standing meeting • “Longitudinal” faculty development • Attending rounds • Residents also benefit from same training • Focused “mini” workshops • Brief FD can have some, albeit modest, impact

  8. Engaging Faculty Feasibility • Embed assessment into daily activities • Focused direct observation • 3-10 minute observations valuable! • Structured clinical observation (SCO) • First patient visit of outpatient clinic session • Pre-rounds • Part of an admission work-up • Work rounds • Chart stimulated recall • MRA to assess clinical judgment

  9. Engaging Faculty Faculty Development as CPD • Combine CPD with assessment training • Performance dimension training for counseling skills • Skill workshops • Physical exam, QI science, etc. • Collaborative projects • Faculty and residents work together on QI • ABIM practice improvement modules (PIM)

  10. Recognizing Proficiency • Reduction in rater errors • Less halo effect, etc. • Increase quality of written, descriptive evaluation • More behavioral, specific • Multiple observations • Videotape practice and feedback • Correlation with other assessments • Concurrent or construct validity • Resident assessment • Don’t underestimate value of evaluation of faculty

  11. Training Methods That Work • Performance Dimension Training (PDT) • Frame of Reference Training (FoRT) • Behavioral Observation Training (BOT) • Direct Observation of Competence (DOC) Training

  12. Faculty • 3-5 core faculty (PDs, APDs): deep knowledge, skills, and attitudes about multiple modes of assessment; need formal training • Faculty whose main focus is education and clinical care: must possess strong observation and work-based assessment skills; can be trained through workshops, etc. • The 1-month-per-year attending: basic observation and descriptive skills plus feedback; use focused FD methods

  13. Other Questions • Create value for clinician-educator (1) • Evaluation process and small numbers (1) • Training and giving feedback (1) • Next “best” thing to face-to-face FD (1) • Mass production of assessment tools (1)
