
Evaluating Clinical Simulations


Presentation Transcript


  1. Evaluating Clinical Simulations Pamela R. Jeffries, DNS, RN, FAAN Johns Hopkins University School of Nursing Mississippi – February 25, 2011

  2. Objectives of the Session: The participant will be able to: • Discuss the evaluation process using simulations • Describe strategies using simulations as an evaluation tool in courses or programs • Describe different components that need to be assessed when using simulations

  3. WHAT IS EVALUATION? • Feedback • Coaching • Assigning Grades • Judgmental: objective or subjective • Form of quality improvement • Assessment

  4. ASSESSMENT • What it is: “Systematic collection, review and use of information about educational programs undertaken for the purpose of improving student learning and development.” (Palomba and Banta, 1999) • Focus: development and improvement rather than judgment and grading

  5. WHY EVALUATE? • Determine learning outcomes and program goals achieved • Give feedback while learning • Improve effectiveness of teaching and learning • Attain performance standards • Assure patient safety

  6. WHEN TO EVALUATE? • Frequently, when learning is complex, the risk of failure is high, or the consequences of error would be serious • When outcomes are critical, to ensure that learners are prepared for clinical practice • When findings can be used to alter the direction of the project • At the end of the module, activity, or project, to be certain of learning outcomes

  7. EVALUATION PROCESS • Is… a systematic effort involving identifying what, who, when, and why, and then gathering, analyzing, and interpreting the data • Concludes when… the findings of the evaluation are reported and used

  8. EVALUATION PROCESS • Judgment: relative worth or value of something • Focus: specifically on the student learner in e-learning • Throughout educational process: interwoven throughout the learning process, usually in the form of feedback

  9. EVALUATION INSTRUMENT • Norm-referenced: • Focus: how learners rank in comparison to each other • Outcome: the evaluation is interpreted to determine who has the most knowledge, best skill performance, etc., and who has the least

  10. Evaluation Instruments • Criterion-referenced • Focus: the learner’s ability to attain the objectives or competencies that have been specified • Purpose: to determine who has achieved the outcomes and who has not • Outcome: The outcome standard must be specified by the educator, and then the learner is evaluated to determine if the standard is met
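
To make the two interpretations concrete, here is a minimal sketch; the learner names, scores, and the 80-point cutoff are illustrative assumptions, not values from the presentation. The same set of scores can be read norm-referenced (ranking learners against each other) or criterion-referenced (comparing each learner to the educator-specified standard).

```python
# Hypothetical exam scores; the 80-point cutoff is an illustrative standard,
# not one taken from the presentation.
scores = {"Avery": 92, "Blake": 78, "Casey": 85, "Drew": 71}
CUTOFF = 80

# Norm-referenced reading: rank learners against each other
ranking = sorted(scores, key=scores.get, reverse=True)
print("rank order:", ranking)

# Criterion-referenced reading: compare each learner to the fixed standard
for learner, score in scores.items():
    status = "met standard" if score >= CUTOFF else "did not meet standard"
    print(learner, status)
```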

  11. PRINCIPLES OF EVALUATING ADULT LEARNERS • Involve in planning • Capable of self-evaluation • Motivated to achieve • Deal with real world problems and need real world evaluation • Like variety in teaching, learning, and evaluation

  12. PRINCIPLES OF EVALUATING ADULT LEARNERS • Respond to feedback (evaluation) • Need frequent and informative feedback • Are self-directed • Learn from each other and can evaluate each other • Respond to choices…provide options for evaluation

  13. GUIDELINES FOR SELECTING EVALUATION STRATEGIES • Appropriate for the domain of learning • The learner should have the opportunity to practice in the way he/she will be evaluated • Used to provide multiple ways of assessing learning • Ease of development/use on the part of the educator • Evaluation instrument should be valid and reliable

  14. Steps to Take When Evaluating • Identify the purpose of the evaluation • Determine a time frame • Identify when to evaluate • Develop the evaluation plan • Select the instrument(s) • Collect the data • Interpret the data

  15. Evaluations and Simulations Four areas of evaluation: • Evaluating the simulation itself • Evaluating the implementation phase • Evaluating student learning outcomes • Using simulation as an evaluation tool

  16. Evaluating the simulation (design) • Good simulation design and development are needed to achieve the intended outcomes • The Simulation Design Scale (SDS) provides a measure of the importance of each design feature

  17. Data Results • Simulation Design Scale (SDS) responses analyzed using factor analysis with varimax rotation on the items for each scale • Cronbach's alphas calculated on each subscale and the overall scale to assess instrument reliability
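
The slides report the reliability results but not the computation. As a minimal sketch, assuming a hypothetical matrix of item ratings (not actual SDS data), Cronbach's alpha for a subscale can be computed from the item variances and the variance of the summed scores:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                         # number of items in the subscale
    item_vars = items.var(axis=0, ddof=1)      # sample variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 5-point ratings from four respondents on three items
ratings = [
    [4, 5, 4],
    [3, 4, 4],
    [5, 5, 5],
    [2, 3, 3],
]
print(f"alpha = {cronbach_alpha(ratings):.2f}")
```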

  18. Simulation Design Scale

  19. Incorporating the Educational Practices into the Simulation • Assemble and implement the simulation with the educational practices in mind: • Active learning • Collaboration • Diverse ways of learning • High expectations

  20. Data Results • Educational Practices within a Simulation Scale (EPSS) responses analyzed using factor analysis with varimax rotation on the items for each scale • Cronbach's alphas calculated on each subscale and the overall scale to assess instrument reliability

  21. Educational Practices Scale • Factor analysis with varimax rotation on the scale items revealed four factors
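
For readers who want to reproduce this kind of analysis, here is a minimal sketch using scikit-learn's FactorAnalysis with varimax rotation (available in scikit-learn 0.24 and later). The response matrix below is random placeholder data, not the actual EPSS responses, so it will not reproduce the reported four-factor structure:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Hypothetical responses: 200 students x 16 Likert-type items.
# Random placeholder data -- it will NOT reproduce the EPSS factor structure.
rng = np.random.default_rng(0)
responses = rng.integers(1, 6, size=(200, 16)).astype(float)

# Extract four factors with varimax rotation, as reported for the EPSS
fa = FactorAnalysis(n_components=4, rotation="varimax")
fa.fit(responses)

# fa.components_ is (n_factors, n_items); transpose to read loadings per item
loadings = fa.components_.T
for i, row in enumerate(loadings, start=1):
    print(f"item {i:2d}: " + "  ".join(f"{v:+.2f}" for v in row))
```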

  22. Instrument Testing • Testing of the two instruments, the Simulation Design Scale and the Educational Practices Scale, continued in this phase • Reliabilities for both scales were found to be good • It was important to have measures to assess the quality of the designed simulations being used

  23. Evaluating Student Learning Outcomes • Simulation technologies used to measure both process and outcomes range from case studies to standardized patients (OSCE), to task trainers and high-fidelity mannequins

  24. Example of a Clinical Competency Measure for a Curricular Thread (J. Cleary – Ridgewater, MN)

  25. Evaluating outcomes • Formative evaluation measures: simulation is used by the learner/faculty to mark progress toward a goal • Summative evaluation measures: include determining course competencies, licensing and certification examinations, and employment decisions

  26. Exemplars of student outcome measures used today • Knowledge • Skill performance • Learner satisfaction • Critical Thinking • Self-confidence • Skill proficiency • Teamwork/collaboration • Problem-solving Skills

  27. Using simulations as the Evaluation Tool • When skill sets, clinical reasoning, and selected clinical competencies need to be measured, simulations can be used as the mechanism to do this.

  28. Simulations to Evaluate • Set up a simulation as an evaluation activity • Issues to address: • Make sure the student is aware it is an evaluation • Describe the evaluation metrics • Is it objective?

  29. Ways to Use Simulations as an Evaluation Metric • As an Objective Structured Clinical Exam (OSCE) • Simulated patients are portrayed by hired actors • Students are immersed in specific scenarios

  30. Ways to Use Simulations as an Evaluation Metric • Set up a simulation to measure concepts such as teamwork, patient safety competencies, ACLS protocol, communication skills, or any selected intervention

  31. Ways to Use Simulations as an Evaluation Metric • Simulations used for evaluation can come in the form of computer-based learning, with scores and metrics to demonstrate knowledge, skill competency, and proficiency

  32. Examples of products that can be used for evaluation purposes: • MicroSim DVD – scoring mechanism • ACLS computer-based software scenario packages – scoring • Cath simulators – scoring devices, standardized scales, benchmarks • CPR models/mannequins – programming and scoring

  33. Summary • Simulations require evaluation of many variables, including the simulation design, the implementation process, and learning outcomes • In addition, simulations can serve as the mechanism to evaluate students

  34. Nursing Implications • When developing courses and planning curricula in nursing, decide what purpose the simulation encounters serve, and evaluate to make sure that purpose and its outcomes are being achieved • More evidence/documentation is needed to show that simulations improve clinical performance, critical thinking, and diagnostic reasoning

  35. Conclusion “How to tell students what to look for without telling them what to see is the dilemma of teaching.” Lascelles Abercrombie

  36. Any Questions? pjeffri2@son.jhmi.edu
