
From Objectives to Outcomes


Presentation Transcript


  1. From Objectives to Outcomes Beth Fentress Hallmark, PhD, RN Belmont University Nashville, TN

  2. Belmont University, Nashville, Tennessee • 5,000+ students • College of Health Sciences • Inter-professional education • Nursing: Accelerated, Fast-track, and Traditional BSN; FNP • Social Work (BSW) • Physical Therapy (DPT) • Occupational Therapy (OTD/MSOT) • PharmD

  3. Belmont University, Nashville, Tennessee • Two eight-bed Adult Health laboratories • 8-bed “Acute Care” lab • 4-bed Peds lab • 8-bed Health Assessment/OB lab • 4 inter-professional private patient areas

  4. From Objectives to Outcomes: Learning Objectives • Identify the components of healthcare simulation • Discuss the importance of outcomes evaluation and the challenges to traditional assessments • Discuss the importance of validity, reliability, and feasibility as they relate to assessment • Discuss types of assessments and their application in healthcare education

  5. Components of Healthcare Simulation Jeffrey A. Groom, PhD, CRNA (2009)

  6. Study Finds No Progress in Safety at Hospitals • November 24, 2010, NY Times: 10 North Carolina hospitals • 25.1 injuries per 100 admissions • 42.7% resulted in extra time in the hospital • 2.9% of patients suffered a permanent injury • >8% were life-threatening • 2.4% caused or contributed to a patient’s death • Medication errors caused problems in 162 cases.

  7. How do we measure our improvement? Safe, competent practitioners, whatever the discipline or setting: • Initial and continued competence • Acquisition of relevant knowledge • Development of psychomotor skills • Application of that knowledge and those skills

  8. Current Assessments • Measuring performance in the clinical area with current methods is difficult: • Confidentiality • Faculty-to-student ratio • Patient safety • Preceptors: valid? reliable? • Adjunct faculty • Tools

  9. Model of Competence

  10. Problem with “Knowing” • Knowing is measured using examinations such as the NCLEX, the NREMT cognitive exam, the FNP certification exam, calculation tests, etc. • Recalling basic facts, principles, and theories • Multiple-choice and true/false questions • Test question design: valid, reliable • Bloom’s taxonomy • Critical thinking questions

  11. Problem with “Knowing” • Cognitive domain • Belmont’s NCLEX pass rate in May 2005 was 98.6%; the NP exam pass rate was 100% • Educational institutions employ strategies aimed at passing these exams. Does this mean that each of these students will be prepared to care for you or your loved ones?

  12. Model of Competence

  13. Problem with “Knows How” • “Knows How”: • “Application of knowledge to problem solving and decision making” (Wass, 2001) • “A thought process stimulated by a problem” (Wass, 2001) • “Ability to solve problems, make decisions and describe procedures” (Scalese, 2008) • Case studies and essays • Multiple/multiples • Again… are these students prepared to provide safe, proficient care?

  14. Model of Competence

  15. “Shows How vs. Does” • “Shows How” • “Demonstration of skills in a controlled setting” (Scalese, 2008) • Educating to this level includes simulation-based education (SBE) • OSCEs, standardized patients (SPs), simulations, log books, portfolios • Technical skills • Includes higher-level thinking • “Does” • Moves from the simulated environment to the real-life setting

  16. Assessment vs. Evaluation • Assessment and evaluation are often used interchangeably • However, for our purposes: • Assessment describes the measurement of learner outcomes • Evaluation describes the measurement of course/program outcomes

  17. Why do we assess learner outcomes? • Provides baseline data • Provides summative and formative feedback • “Drives learning” • Allows measurement of individual progress • Encourages student reflection • Assures the public that providers are competent • Licensure/credentialing requirements

  18. Why do we evaluate our programs? • Demonstrates change and growth in programs/courses • Identifies gaps in programs/courses • Fundamental to outcomes- or competency-based education • Required for accreditation/credentialing of facilities and programs • Allows administration to make informed allocation decisions

  19. Objectives/Outcomes of the Program • Define outcomes based on accrediting bodies, professional organizations, etc. • Objectives/outcomes lead to competency and mastery • Identify the knowledge, skills, and attitudes/affective behaviors (KSAs) • Curricular/program-specific and simulation event-specific • Measurable, clearly defined standards

  20. ± change/refine

  21. Simulation Education • Knowledge • Skills • Attitudes • Advance these throughout the curriculum via assessment • For example: from a single injection check-off to team training

  22. Preparing assessments • What should be assessed? • Every aspect of the curriculum should be considered • Essential content • Content with significant designated teaching time • Assessments should be consistent with the learning outcomes established as the competencies students should master/perform at a given phase of study

  23. Use of Assessment in Simulation: Formative or Summative • Rosen, M.A., et al. Measuring Team Performance in Simulation-Based Training: Adopting Best Practices for Healthcare. Simulation in Healthcare. 2008;3(1):33–41.

  24. Assessment • Formative Assessment • Lower-stakes assessment • One of several assessments made over the duration of a course or program • May be evaluative, diagnostic, or prescriptive • Often results in remediation or progression to the next level • Summative Assessment • Higher-stakes assessment • Generally the final course or program assessment • Primary purpose is performance evaluation • Often results in a go/no-go outcome

  25. Assessments - peer • Enables learners to hone their ability to work with others and their professional insight • Enables faculty to obtain a view of students they do not otherwise see • An important part of peer assessment is having students justify the marks they award to others • Justification can also be used as a component when faculty evaluate attitudes and professionalism

  26. Assessments - standard setting • Standards should be set to determine competence • Enables certification to be documented, accountable, and defensible • Appropriately set standards will pass those students who are truly competent • Standards should not be set too low, passing students who are not competent (false positives), nor too high, failing students who are competent (false negatives).

  27. Assessments - standard setting • Standards should be set around a core curriculum that includes the knowledge, skills, and attitudes required of all students • When setting a standard, consider the following: • It must reflect the core curriculum • It should set a high standard for the core components of the curriculum • It should require demonstrated mastery at each phase
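
The slides do not prescribe a specific standard-setting procedure, so the sketch below illustrates one widely used option, a modified Angoff approach, with entirely hypothetical judges and numbers: each judge estimates the probability that a minimally competent (borderline) student would answer each item correctly, and the cut score is the average of the judges' summed estimates.

```python
# A minimal sketch of a modified Angoff standard-setting calculation.
# The method is not named on the slides; the judges and ratings below
# are hypothetical, for illustration only.

# Per-judge, per-item probability that a borderline (minimally
# competent) student answers the item correctly.
judge_ratings = {
    "judge_1": [0.8, 0.6, 0.9, 0.7, 0.5],
    "judge_2": [0.7, 0.7, 0.8, 0.6, 0.6],
    "judge_3": [0.9, 0.5, 0.9, 0.7, 0.4],
}

# Each judge's expected total score for a borderline student.
expected_scores = [sum(ratings) for ratings in judge_ratings.values()]

# Cut score: the mean of the judges' expected scores.
cut_score = sum(expected_scores) / len(expected_scores)
print(f"Recommended cut score: {cut_score:.2f} out of 5 items")
```

With these hypothetical ratings the recommended cut score works out to about 3.43 of 5 items; the point is the mechanics of turning judge estimates into a defensible, documented standard, not the particular numbers.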

  28. Performance Assessment • Basic to performance – Do they know it and know how? • Competence – Can they do it? • Performance – Do they do it?

  29. Assessing Simulation • Documenting data: live observation, video recording, software logging systems • Logistics of documenting data: AV annotation via logging, pencil and paper (wipe-off cards), Scantron, PDA/handheld/tablet PC • Assessors: instructors, observers, simulated/standardized patients, peers, participants

  30. Choosing appropriate assessment methods/tools • When choosing the assessment instrument, the following should be answered: • Is it valid? • Is it reliable? • Is it feasible?

  31. Assessments - validity • Are we measuring what we are supposed to be measuring? • Use the appropriate instrument for the knowledge, skill, or attitude you are testing • The major types of validity should be considered (content, predictive, and face)

  32. Assessments - reliability • Does the test consistently measure what it is supposed to measure? • Types of reliability: • Inter-rater (consistency across raters) • Test-retest (consistency over time) • Internal consistency (consistency across different items/forms)
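
Inter-rater reliability can be checked quantitatively before a tool is used for higher-stakes decisions. The sketch below, using hypothetical ratings, computes Cohen's kappa, a common chance-corrected agreement statistic for two raters scoring the same performances; the slides do not mandate this particular statistic, it is simply one option.

```python
# A minimal sketch of an inter-rater reliability check (Cohen's kappa)
# for two faculty raters scoring the same simulation check-offs.
# The rating data below are hypothetical, for illustration only.

from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: agreement between two raters, corrected for chance."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: proportion of performances where the raters match.
    p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement, from each rater's marginal frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    categories = set(rater_a) | set(rater_b)
    p_expected = sum(freq_a[c] * freq_b[c] for c in categories) / (n * n)
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical pass/fail ratings from two raters observing the same 10 students.
rater_1 = ["pass", "pass", "fail", "pass", "pass", "fail", "pass", "pass", "fail", "pass"]
rater_2 = ["pass", "pass", "fail", "pass", "fail", "fail", "pass", "pass", "pass", "pass"]

print(f"Cohen's kappa: {cohens_kappa(rater_1, rater_2):.2f}")
```

Here the raters agree on 8 of 10 students, but after correcting for chance agreement the kappa is only about 0.47, which is why raw percent agreement alone can overstate a tool's reliability.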

  33. Assessment Tools • Tools should measure KSAs within the domains that you are measuring • Cognitive • Psychomotor • Affective • Do these domains occur alone, or simultaneously? • Simulation offers the ability to assess each of these domains: an application of the cognitive domain while performing psychomotor skills, as the student demonstrates how they have internalized values, attitudes, and beliefs.

  34. Where did I start? • “Low-hanging fruit” • TASTED GREAT!! • Self-reported: • Confidence • Increased critical thinking • Satisfaction • Situational awareness

  35. Where should you start? • Tools developed for your OBJECTIVES! • To measure clinical judgment, use a tool developed for that purpose, e.g., the Lasater Clinical Judgment Rubric (Lasater, 2007) • Adds to reliability and validity • You may combine instruments • What about the tool you use for clinical evaluation? • Is it reliable? Valid? Who developed it? Have you had consistency issues with the tool or with students in clinical? • Does it measure what you really want it to?

  36. Assessments - feasibility • Is the administration of the assessment instrument feasible in terms of time and resources? • Time to construct? • Time to score? • Ease of interpreting the score/producing results? • Practical given staffing/organization? • Quality of feedback? • Learner takeaway? • Does it motivate the learner?

  37. Practicality • Number of students to be assessed • Time available for the assessment • Number of staff available • Resources/equipment available • Special accommodations

  38. Examples of Tools • Kardong-Edgren, S., Adamson, K. A., & Fitzgerald, C. (2010, January). A review of currently published evaluation instruments for human patient simulation. Clinical Simulation in Nursing, 6(1), e25–e35. doi:10.1016/j.ecns.2009.08.004

  39. Exercise • Let’s try it: OUT LOUD • Groups: • Hospital • Emergency • Nursing education • Safe Medication Administration • How does this link to programmatic outcomes and then to your course?

  40. Safe Medication Administration • How are you measuring this now? • Summative • Formative • Knowledge (cognitive exams) • Skills/psychomotor (lab check-off) • Attitudes/affective (what would you examine, or what are you examining, here? A Likert satisfaction scale? Self-confidence?)
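
For the skills/psychomotor piece, a lab check-off is often scored against an itemized rubric. The sketch below is a hypothetical example of such a check-off for medication administration, built around the classic "rights"; the item list, the critical items, and the cut score are assumptions made for illustration, not an established standard.

```python
# A minimal sketch (hypothetical items, critical items, and cut score)
# of a summative check-off rubric for safe medication administration.

CHECKLIST = [
    "Verifies right patient (two identifiers)",
    "Verifies right medication against the order",
    "Verifies right dose and performs the calculation",
    "Verifies right route",
    "Verifies right time",
    "Documents administration",
    "Provides patient teaching (purpose, side effects)",
]

CRITICAL_ITEMS = {0, 1, 2}   # missing any of these is an automatic fail (assumption)
CUT_SCORE = 6                # hypothetical minimum number of items out of 7

def score_checkoff(observed):
    """Return (items met, pass/fail) for one observed performance."""
    met = sum(observed)
    missed_critical = any(not observed[i] for i in CRITICAL_ITEMS)
    return met, met >= CUT_SCORE and not missed_critical

# Example: the student completes everything except patient teaching.
performance = [True, True, True, True, True, True, False]
items_met, passed = score_checkoff(performance)
print(f"{items_met}/{len(CHECKLIST)} items met; pass = {passed}")
```

A formative use of the same rubric would report which items were missed and prompt remediation rather than issue a go/no-go decision; the summative use shown here simply applies the cut score and the critical-item rule.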

  41. Model of Competence

  42. Knowing: Safe Med Administration • Each group: write a higher-level multiple-choice question on safely administering a specific medication (choose one). • Is the student who answers this question safe and competent?

  43. Model of Competence

  44. Knows How: Safe Med Administration • Write a short case related to giving the same medication. • What components must the student tell the grader? • How to administer the med? • Side effects? • Teaching? • What else will we measure? • For the student who meets all of these assessment criteria… are they competent and safe to give the medication?

  45. Model of Competence

  46. Shows How: Safe Med Administration • Take the case above and the objectives and apply them to a simulation. • This can be simple, or advanced, incorporating teamwork, communication, and high acuity. • What KSAs are required? • Which “student” do you want taking care of you?
