
Online Education in the ER


Presentation Transcript


  1. Online Education in the ER • Nadim Lalani MD

  2. Vanilla Sky • Tom Cruise, 2001 • Existential “mind warp” • Deals with cryogenics and the possibility of living a virtual life after death • Blending of the technologic and biologic worlds → “plugged in” • ?Virtual [technologic] world to supplement [real-world] EM medical education

  3. Objectives • Definition • Background • Literature Review • Med Ed • Resident Ed • Professional Development • What it might look like • Future Directions

  4. What is “online learning”? • Online learning [e-learning] = digital • Evolved from CD/computer labs • Everyone does it! • Performance support [e.g. for software] • Web page [e.g. UpToDate] • Self-paced web-based [CME] • Leader-led [distance learning] • Blended [or hybrid] learning • combines conventional with digital learning

  5. Advantages of e-learning • Rich environment: • Media-filled [esp. in EM] • Transfer of difficult concepts • Links to sources • Convenient, efficient & flexible • Asynchronous • Can be accessed from a distance • Adult learning principles: • Self-paced and self-directed • Flexible/home access • Efficiency

  6. Background: Why bother? • U of C Medical School current enrollment = 130 students → goal 150 • Mandatory EM rotation / increasing competencies • Resident numbers also increasing • Result → more learners in the ED • Relative shortage of preceptors, relevant clinical encounters and curricular time • Will be worse when our program expands → will usurp learning opportunities

  7. Why bother? • Deficiency in learning encounters = a performance gap • Future physicians do not have adequate exposure to emergent problems • Imperative that we equip students, clerks & residents with the necessary skills and training

  8. Why bother? • Increased digitalisation is a key strategic goal of the U of C • Learners are unique, with multidimensional learning styles • Adult learning principles • Attract the best candidates • Provide a method of training students and clerks at two different campuses • Provide consistency in teaching

  9. Why bother? • Provide an efficient means of knowledge transfer to residents • Increasing number of competencies [CanMEDS] • Better use of academic half-day • Provide more effective professional development: • Asynchronous → don’t have to be there • Interactive discussion board • Consistent, evidence-based standard of practice • Increased self-efficacy

  10. E-learning is not a panacea • There is more to training and education than e-learning • Certain skills do not lend themselves to e-learning • The key will be selecting the best delivery method • Cannot simply upload old material • Learner-focused • No one solution [blended may work for residents]

  11. Process: Can it be done? “Fail to prepare … prepare to fail” • Need to address several key questions: • Purpose? Added value? • What support and expertise exist? • Ongoing upkeep? • Stakeholders? • Team? • Instructional design/Pedagogy

  12. Literature Review • Search terms run in PubMed • Bibliographies of relevant articles scanned • Missing 1 Med Ed & 1 CME article [both foreign-language]

  13. Literature: General Comments • More literature exists for Med Ed • Pre-1990 work limited by lack of internal validity • Few randomised controlled studies • Emerging literature on the resident experience • Despite lots of experience with online CME, little literature … mostly descriptive

  14. Literature: General Comments • Terminology inconsistent • Interventions vary • ?“Prototypes” of today’s technology? • Don’t address some of the unique features of the internet • Comparing apples to oranges

  15. E-Learning & Med Ed • Can e-learning be used to replace/augment traditional methods?

  16. Study: Dartmouth Med School • 328 students randomised to: • Interactive case-based study guide on computer* • Case-based printed study guide • Anemia and cardiology courses • Outcomes: • Performance on higher-order MCQ tests, exams • Self-reported efficiency • *Media-rich → images, blood smears and EKGs

  17. Results • No difference in test scores • No difference on board exams • The vehicle is an acceptable means of delivery

  18. Limitations • Self-reporting of efficiency! • Confounders [other textbooks/practice exams/time spent cramming] • Doesn’t really tell us about dynamic problem-solving/clinical judgment

  19. 179 Paeds Clerks [2 sites, Chicago] • Randomised to instruction via: • Multimedia textbook* • Lecture • Printed text • No intervention • Paeds airway diseases • Outcomes: • MCQ test score [at end of rotation & 1 year later] • *Only differed in audio/video

  20. Results

  21. Limitations • 51% attrition rate! • Clerks at one site had a mail-in repeat exam • Confounders • One hour of instruction embedded in a 6-week clerkship

  22. 75 Med Students [Brisbane, Australia] • Randomised after pretest to: • Computer tutorial → focus on knowledge • Computer tutorial → create ideal patient for dx + feedback [every 10 cases] • Computer tutorial → both knowledge & decision + three different types of feedback [after every 10 cases] • Looking at diagnosing abdo pain [30 cases]

  23. Outcomes and Results • Outcomes: • Attained knowledge • Diagnostic accuracy • Decision-making confidence [self-reported] • Results: • Students focusing on facts did not improve decision-making • All feedback groups improved diagnostic accuracy • Type of feedback not important • Self-reported confidence improved

  24. Limitations • Small study • Very convoluted method → ?reliability

  25. E-Learning & Med Ed • Can e-learning be used to teach procedural skills?

  26. 82 Medical Students [Toronto & Augusta] • Randomised to: • Computer tutorial + knot board • Lecture with feedback + knot board • Two-handed knot tying • Tested right after [filmed] • Outcomes: • Proportion of square knots / time to tie • Knot performance score [blinded surgeons] • Student questionnaire

  27. Results • No difference in “cognitive” portion • Lower performance score in CAL group • 89% of students would have preferred the lecture session • Lack of feedback cited as a negative

  28. Limitations • Apples and oranges! • ?Not controlled for hands-on feedback • Maybe CAL better if it described pitfalls / showed video of good and bad knots? • Reliability of performance score [not included]

  29. 42 Clerks, U of T • Randomised to: • Computer tutorial [rich text, animations, interactive Q&A, no audio/video] • Small-group seminar [also interactive] • Epistaxis management • Outcomes: • Short-answer written test • Practical test [16-point performance scale]

  30. Results • Poor prior knowledge • No difference in written scores • Slightly better practical skills with CBL

  31. Limitations • Small numbers • Examiners NOT blinded • ?Reliability of performance score [not included] • Practical was on a dummy • ?Transferability

  32. 69 Medical Students [Wisconsin] • After pre-test, randomised to: • Didactic session/Q&A¥ • Videotape session* • Computer tutorial* • Post-intervention: • MCQ test, practical skills test [2 blinded observers] • Repeat testing at one month • ¥No feedback. *Instructor present

  33. Outcomes: • MCQ Test Scores • Timed observation of skills • Critical Skills evaluated via checklist • Performance Quotient calculated

  34. Results • Higher initial mean % correct / % complete in CBT group [p < 0.01] • Significantly better PQ in CBT group at 1 month [p < 0.01]

  35. Results • Didactic group better on immediate MCQ [63% vs 49% for video/CBT, p < 0.01] • Difference in MCQ still present at 1 month

  36. Results • Bigger change in PQ with CBT at 1 month [p < 0.01]

  37. Limitations • Small study • Video vs CBT essentially the same intervention • ?Why CBT would do better than video • ?Reliability of checklist and PQ

  38. What About the ED Experience? • Can e-learning be used for emergency medicine rotations?

  39. 100 Clerks [Mt Sinai] • Randomised by blocks to: • EM rotation with access to EM website • Modules [ACLS, Tox], X-rays, pix, paeds cases • EM rotation without access • Outcomes: • Exam scores • Student satisfaction

  40. Results • Only 28% of the intervention group used it • 72% cited lack of time • Non-significant difference in exam score [72.8 vs 68.2, p = 0.058] • Non-significant difference in satisfaction [77.5% vs 66%, p = 0.23] • At baseline only 26% spent > 1 h/wk online [cf 96% in the next class] • At baseline 65% wanted an online component

  41. Limitations • < 30% uptake in the intervention group → didn’t reach power • Was ITT → results might have been positive with more participants • Problems with randomising by block: rotations given away by lottery [CaRMS] • Unmotivated learners? • ?Generalisable to clerks in 2008
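
A rough sample-size sketch makes the power problem concrete. The slides report a 72.8 vs 68.2 exam-score difference but no standard deviation, so the SD below is a hypothetical value chosen purely for illustration; this is a minimal sketch assuming equal arms and a two-sided, two-sample z-test, not the study's actual power calculation.

```python
# Minimal power sketch for the Mt Sinai exam-score comparison (72.8 vs 68.2).
# ASSUMPTION: the score SD (sd=10.0) is hypothetical -- the slides do not
# report it -- so the output is illustrative only.
from scipy.stats import norm

def n_per_group(delta: float, sd: float, alpha: float = 0.05, power: float = 0.80) -> float:
    """Per-group n for a two-sided, two-sample z-test on means."""
    z_alpha = norm.ppf(1 - alpha / 2)  # ~1.96 for alpha = 0.05
    z_beta = norm.ppf(power)           # ~0.84 for 80% power
    return 2 * ((z_alpha + z_beta) * sd / delta) ** 2

n = n_per_group(delta=72.8 - 68.2, sd=10.0)
print(f"~{n:.0f} learners needed per group")  # ~74 per group under these assumptions
# With ~50 clerks per arm and only 28% of the intervention arm actually using
# the website (~14 exposed learners), the study falls far short of this,
# consistent with the near-miss p = 0.058.
```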

  42. 23 Clerks [U of T Sick Kids] • Volunteered for study, randomised to: • Access to web-based modules • No access to web-based modules • ED procedures [lac, LP, splint] • Outcome: • Performance on MCQ test

  43. Results • Statistically higher competence [p = 0.0001] • Cohen’s d effect size, r = 0.79
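
A note on the effect-size bullet above: Cohen's d and the correlation-style effect size r are different statistics, and the slide appears to mix the two labels. The standard definitions, with the usual equal-groups conversion, are:

$$d = \frac{\bar{x}_1 - \bar{x}_2}{s_{\text{pooled}}}, \qquad r = \frac{d}{\sqrt{d^{2} + 4}}$$

Taking r = 0.79 at face value and inverting the conversion gives $d = 2r/\sqrt{1 - r^{2}} \approx 2.6$, a very large effect for a 23-subject study.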

  44. Limitations • Small sample size • Volunteers [EM/techno gung-ho] • Methodology: • Unclear when test was administered in relation to rotation • ?Randomised to learning vs no learning? • Validity of MCQ vs observed skills • Transfer of knowledge?

  45. 350 Urology Clerks [4 US med schools] • Randomised [two-group crossover] to: • Web-based tutorials [BPH, ED, PC, PSA] • No access to web tutorials • Served as controls for the modules they didn’t have access to online • Outcomes: • Performance on test [pre/post] [Cr = .79] • Durability of learning / learning efficiency in subgroup

  46. Results

  47. Results

  48. Results • Learning efficiency 0.10 vs 0.03 [p < 0.001] • Test scores still improved without WBT [12% BPH, 6% ED, 24% PC, 20% PSA] • Web-based alone had Cohen’s r = 24.9!
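
The slides do not define “learning efficiency”; one common construction, assumed here only for illustration, is knowledge gain per unit of study time:

$$\text{learning efficiency} = \frac{\text{post-test score} - \text{pre-test score}}{\text{time spent on the material}}$$

On that reading, 0.10 vs 0.03 would mean roughly three times the gain per unit time with the web-based tutorials. Note too that a correlation-type effect size r is bounded by 1 in magnitude, so “Cohen’s r = 24.9” cannot be a correlation as printed.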

  49. Limitations • Volunteers with unequal participation b/w sites [93% HMS vs 52% BUSM] • High dropout rate: 210/350 completed • ?Generalisability of repeated measures • ?Generalisability to EM
