1. Evidence Based Practice for Information Fluency: A three year project Judith Ruland, PhD, RN, CNE
Pamela Ark PhD, RN
2. UCF Definition of Information Fluency
The ability to perform effectively in
an information-rich and technology-
intensive environment.
3. Fluency is the ability to… gather,
evaluate, and
use information in specific ways for particular purposes, contexts, and disciplines in ethical and legal ways.
4. The information fluent student Has learned how to learn
Knows how information is organized
Knows how to find, evaluate, and use information in a meaningful way
Can reflect upon and critique their own processes of inquiry
5. Information fluency In order to be considered information fluent, a student must know how to:
Understand and articulate a complex idea or tell an engaging story
Make a “thing”, a world, a space with the language of one’s discipline.
Use technology to find information
Construct significant things with technology and information tools and reflect on that construction.
6. UCF “IF” outcomes Informationally fluent UCF grads will…
articulate the problem in a selected context
recognize the need for information to address the problem
identify the available information sources (domain)
iteratively collect, analyze, and assess (evaluate critically) the relevant information
7. UCF IF outcomes cont’d integrate new information with pre-existing knowledge and context
draw conclusions
effectively communicate results and decisions
follow up on actions
8. UCF Integrated IF Model
9. Literacy is the ability to… Adequately communicate in order to function in a workplace and community.
Perform with or use information tools, to evaluate information for validity and usefulness, and to articulate the information found.
10. Florida Atlantic University model Information Literacy
11. The National Academy of Sciences, 2007 Technology Literacy Technological literacy is a broad understanding of the human-designed world and our place in it.
It is an essential quality in the increasingly technology-driven 21st century.
12. Paul 2001 Critical Thinking
“Thinking about your thinking while thinking to make your thinking better.”
13. Critical Thinking Operates on itself
To improve itself
Is self-correcting
Continually assesses itself
Paul 2001
14. Critical Thinking….
Reasonable, reflective thinking that is focused on deciding what to believe or to do. (Ennis, 1995)
15. How does “IF” fit with Nursing? Healthcare information is EXPLODING
Students at all levels need to learn to access (gather), evaluate, and use this information in legal and ethical ways.
16. Institute of Medicine, 2003 IOM’s message… “The stark reality is that we invest billions in research to find appropriate treatments, we spend more than $1 trillion on health care annually, we have extraordinary capacity to deliver the best care in the world, but we repeatedly fail to translate that knowledge and capacity into clinical practice.”
17. Heater, Becker, & Olson, 1988 Nursing and EBP Although it is empirically supported that patient outcomes are at least 28% better when clinical care is based on rigorously designed research studies than when care is steeped in tradition, most nurses are not implementing EBP.
18. Pravikoff et al., 2005 Nursing and EBP Recent survey with a nationwide sample of 1,097 randomly selected registered nurses:
Almost half were not familiar with term EBP
More than half reported that they did not believe that their colleagues use research findings in practice
Only 27% had been taught how to use electronic databases
Most do not search information bases to gather practice information.
19. Translation of research into practice It takes an average of 17 years to translate research findings into clinical practice (Balas & Boren, 2000)
Five core competencies of IOM’s health professional educational summit include EBP decision making
Federal agencies and policy making bodies have placed a major emphasis on accelerating EBP
20.
Where do the majority of nurses get the information they use for decision making in their practice?
21. Stevens, 2006; Melnyk & Fineout-Overholt, 2004
From other nurses!
22. Sackett, Straus, Richardson, Rosenberg, & Haynes (2000) Key nursing challenge today… Translating research results into evidence based practice (EBP).
EBP is the conscious and judicious integration of best research evidence with clinical expertise and patient values to facilitate clinical decision making.
23. DiCenso, Guyatt, Ciliska (2007) Nursing EBP refers to… Methodologically sound, clinically relevant research about the effectiveness and safety of nursing interventions,
Accuracy and strength of causal relationships,
Cost effectiveness of nursing interventions,
Meaning of illness or patient experiences.
24. Melnyk & Fineout-Overholt, 2005; Sackett, Straus, Richardson, Rosenberg, & Haynes, 2000 UCF Nursing Definition of EBP EBP is a problem-solving approach to clinical care that incorporates the conscientious use of current “best” evidence from well-designed studies, a clinician’s expertise, and patient values and preferences.
25. EBP clinical decisions
26. Strout, 2005 The EBP difference…. EBP principles enable nurses to own their practices and to have the tools with which to improve practice and determine best practices within a complex health system.
27. Clinical Barriers to EBP
Misconceptions (e.g. too time consuming)
Negative attitudes about research
Lack of administrative support
Insufficient EBP mentors and champions in health care field
Inadequate knowledge, beliefs and skills by advanced practice and staff nurses.
28. Melnyk, Fineout-Overholt, Stetler & Allan, 2005 Academic Barriers to EBP Nursing education programs that continue to teach BSN and MSN students how to conduct research instead of how to efficiently access, critically appraise, and use studies to improve clinical practice.
The end result of this approach is often a negative attitude about research.
29. Academic Barriers to EBP
Teaching students in-depth critique of research articles instead of efficient steps in searching for and rapidly appraising evidence promulgates the belief that EBP is an insurmountable feat in the real world.
30. Literature Barriers to EBP Complexity of the literature—”No unaided human being can read, recall, and act effectively on the volume of clinically relevant scientific literature” (IOM, 2001).
Form of the knowledge. Not only is volume at issue, literature contains a variety of knowledge forms which are NOT suitable for direct application.
31. Academic Solutions
Academia Paradigm shift-different approach to teaching research and clinical courses in BSN and MSN programs that emphasize EBP.
EBP must be consistently threaded through the curriculum in both didactic and clinical courses.
Faculty and preceptors must be role models for EBP.
32. Academic Solutions… Teaching students to critically examine the clinical problems for the purpose of quickly determining the important issues.
Standardized question formats are helpful.
Students need to be taught to reflect on the clinical issue and extract the important components to form the question.
Reflection requires thoughtful consideration of the aspects of the clinical situation—self, patient, and others (journaling).
33. Literature Solutions… Evidence summaries, including systematic reviews and other forms designed to reduce the complexity and volume by integrating all research on a given topic to a single, meaningful whole.
Transforming knowledge from discovery through a series of stages to increase meaning to the clinician. (STAR model)
34. ACE STAR Model of knowledge transformation Model for understanding the cycles, nature, and characteristics of knowledge that are utilized in various aspects of evidence-based practice (EBP).
The Star Model organizes both old and new concepts of improving care into a whole and provides a framework with which to organize EBP processes and approaches.
35. Stevens, K., University of Texas San Antonio, used by permission. ACE Star EBP Model
36. ACE Star EBP Model Discovery: traditional research methodologies and scientific inquiry. Research results are generated through the conduct of a single study.
Summary: the task is to synthesize the corpus of research knowledge into a single, meaningful statement of the state of the science (evidence synthesis, systematic review (Cochrane Collaboration), meta-analysis (a statistical procedure), integrative review, review of literature).
Translation: The aim of translation is to provide a useful and relevant package of summarized evidence to clinicians and clients in a form that suits the time, cost, and care standard. Recommendations are generically termed clinical practice guidelines (CPGs).
Integration: This step involves changing both individual and organizational practices through formal and informal channels. Major aspects addressed in this stage are factors that affect individual and organizational rate of adoption of innovation and factors that affect integration of the change into sustainable systems.
Evaluation: evaluation of the impact of EBP on patient health outcomes, provider and patient satisfaction, efficacy, efficiency, economic analysis, and health status impact.
37. Steps in the EBP Process 1. Asking the clinical question (Discovery)
2. Searching for the best evidence (Summary)
3. Critically appraising the evidence (Translation)
4. Addressing the sufficiency of the evidence—to implement or not to implement (Integration)
5. Evaluating the outcome of evidence implementation (Evaluation)
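The five steps above and the ACE Star stage each corresponds to can be sketched as plain data; this minimal Python illustration simply restates the slide's mapping and is not part of the project materials:

```python
# The slide's five EBP steps paired with the ACE Star stage each maps to.
EBP_STEPS = [
    ("Asking the clinical question", "Discovery"),
    ("Searching for the best evidence", "Summary"),
    ("Critically appraising the evidence", "Translation"),
    ("Addressing the sufficiency of the evidence", "Integration"),
    ("Evaluating the outcome of evidence implementation", "Evaluation"),
]

for number, (step, stage) in enumerate(EBP_STEPS, start=1):
    print(f"{number}. {step} ({stage})")
```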
38. #1. Asking the question Most important step and most challenging.
If the question is not searchable and answerable, EBP is off to a faulty start.
The motivation for the question is what a clinician should do or how to conduct patient care (as compared to a research question, whose motivation is to generate generalizable knowledge).
PICO question drives the entire EBP process
39. PICO Questions P= Patient population
I= Intervention
C= Comparison intervention
O= Outcome
“In adults [P], is cognitive-behavioral therapy [I] or yoga [C] more effective in reducing depressive symptoms [O]?”
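Because PICO breaks a clinical question into four named components, it can be treated as structured data. The following hypothetical Python sketch (class and method names are illustrative, not part of the project) assembles the components into the slide's example question:

```python
from dataclasses import dataclass

@dataclass
class PICOQuestion:
    """A clinical question broken into its four PICO components."""
    population: str    # P: patient population
    intervention: str  # I: intervention of interest
    comparison: str    # C: comparison intervention
    outcome: str       # O: outcome of interest

    def as_question(self) -> str:
        # Mirror the slide's example phrasing.
        return (f"In {self.population}, is {self.intervention} or "
                f"{self.comparison} more effective in {self.outcome}?")

q = PICOQuestion(
    population="adults",
    intervention="cognitive-behavioral therapy",
    comparison="yoga",
    outcome="reducing depressive symptoms",
)
print(q.as_question())
```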
40. #2 Search for the Best Evidence Question informs as to which database to search, which keywords to start with, etc.
Levels of evidence need to be considered and differ with the type of question.
Cause-and-effect questions need systematic reviews early.
If question relates to meaning of a construct or phenomenon, then qualitative evidence is important.
41. #3. Critically appraising evidence Focus on three questions:
Are the results of the study or systematic review valid?
What are the results, and are they reliable?
Are the findings clinically relevant to my patients?
Purpose is to determine value of research to practice.
42. #4 Sufficiency of the evidence The step varies dependent on the availability of valid, reliable evidence.
If evidence exists, it will be integrated with clinical expertise and patient preference to make decision.
Clinical judgment influences how patient preferences and values are assessed, integrated into decision making.
43. #5 Evaluating the outcome Evaluating outcome in health care providers’ own setting.
Important to consider bias introduction and confounding influences.
Patient evaluations of experiences as well as nurses’ evaluation must be considered.
Interdisciplinary collaboration is essential.
44. Levels of Evidence for answering a clinical cause and effect question Level 1 Systematic review or meta-analysis of all relevant randomized controlled trials (RCTs)
Level 2 Evidence-based clinical practice guidelines based on systematic reviews of RCTs
Level 3 Evidence obtained from well-designed controlled trials without randomization and from well-designed case-control and cohort studies.
Level 4 Evidence from systematic reviews of descriptive and qualitative studies
Level 5 Evidence from a single descriptive or qualitative study
Level 6 Evidence from the opinion of authorities and/or reports of expert committees
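The six-level hierarchy above can be encoded as a lookup table so that retrieved sources sort strongest-first. A hedged Python sketch of that idea (the sample sources are invented for illustration):

```python
# The slide's six-level evidence hierarchy for cause-and-effect questions.
EVIDENCE_LEVELS = {
    1: "Systematic review or meta-analysis of all relevant RCTs",
    2: "Evidence-based clinical practice guidelines based on systematic reviews of RCTs",
    3: "Well-designed controlled trials without randomization; case-control and cohort studies",
    4: "Systematic reviews of descriptive and qualitative studies",
    5: "A single descriptive or qualitative study",
    6: "Opinion of authorities and/or reports of expert committees",
}

def strongest_first(sources):
    """Sort (title, level) pairs so lower level numbers (stronger evidence) come first."""
    return sorted(sources, key=lambda pair: pair[1])

found = [("Expert committee report", 6), ("Cochrane review of RCTs", 1), ("Cohort study", 3)]
for title, level in strongest_first(found):
    print(f"Level {level}: {title}")
```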
45. Nursing’s IF Model
46. UCF QEP-IF Pilot projects College of Nursing is one of four pilot projects selected
Only project that goes across three different levels of education—BSN, MSN, and PhD
47. IF in Nursing: A three year project Task force of six faculty manage this project.
Judy Ruland, Jean Leuner, Pamela Ark, Lorrie Powel, Jean Kijek, Diane Wink
48. EBP for IF: A three year project Year One: Gather….Assessment phase
Year Two: Use…Curriculum Development phase
Year Three: Evaluate—Analysis of assessment data, implement changes, evaluate faculty and student progress, disseminate lessons learned
49. EBP for IF: Tools for Year One Phase I: Assessment Phase--Students
Conduct literacy assessment of new students using the ICT Literacy Assessment Test (iSkills).
Validate ICT findings using the Kent State SAILS instrument.
Assess critical thinking with the ATI Critical Thinking exam.
Assess EBP Knowledge of new students using the EBP Readiness Inventory Scale.
50. See the ETS handout ICT Literacy Assessment Test Seven Constructs:
Define: Use ICT tools to identify information
Access: Collect and retrieve information from digital environments
Manage: Apply an organization or classification scheme
Integrate: Interpret and represent digital information
Evaluate: Determine if digital information meets the need of the task
Create: Generate information
Communicate: Communicate analysis and use information
51. The box plots provide the following information about the spread of student achievement for each of the ICT Literacy skill areas:
1. The gray box indicates the range of student achievement for the middle half of your test takers, from seventy-fifth percentile (top) to twenty-fifth percentile (bottom).
2. The median is indicated by the horizontal line within the box.
3. The vertical lines extend to the highest and lowest student achievement levels.
4. The horizontal line at 0% indicates the reference group average.
NOTE: Student achievement was calculated using the students’ raw skill area scores as a percentage of the possible points for that skill area. The reference group’s average skill area performance was subtracted to remove the effect of differential difficulties in the tasks across the skill areas. The reference group for this plot consists of the 982 test takers from the Early 2006 administration – January through May 2006.
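The score adjustment the NOTE describes (raw skill-area score as a percentage of possible points, minus the reference group's average so that 0% marks the reference average) is a one-line calculation. The numbers below are invented for illustration:

```python
def adjusted_achievement(raw_score, possible_points, reference_avg_pct):
    """Raw skill-area score as a percentage of possible points, minus the
    reference group's average percentage, so 0 marks the reference average."""
    pct = 100.0 * raw_score / possible_points
    return pct - reference_avg_pct

# A hypothetical student scoring 18 of 25 points in a skill area where the
# reference group averaged 60%:
print(adjusted_achievement(18, 25, 60.0))  # 12.0 points above the reference average
```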
52. Kent State’s SAILS Standardized Assessment of Information Literacy Skills (SAILS)
SAILS is a knowledge test with multiple-choice questions targeting a variety of information literacy skills. The test items are based on the ACRL Information Literacy Competency Standards for Higher Education.
53. Kent State’s SAILS Eight Skill sets are measured…
Developing a Research Strategy
Selecting Finding Tools
Searching
Using Finding Tool Features
Retrieving Sources
Evaluating Sources
Documenting Sources
Understanding Economic, Legal, and Social Issues
54. SAILS VERSUS ICT Spring 2007: 102 BSN Orlando second-semester, basic-level nursing students took the SAILS exam
UCF and Nursing are working with ETS to study the correlation of the ICT and the SAILS test
Continue the study this summer with 60 incoming new students in the accelerated, second-degree program
55. ATI Critical Thinking Exam This exam is a multiple choice exam that is routinely given to BSN students as they enter and exit the program
It is not suited for administration to the MSN and PhD students
A tool has not yet been determined for use with the graduate students for critical thinking
56. ATI Critical Thinking Exam Given to the entering BSN Orlando students Fall 2006
118 students took the exam; the mean score of 75.1% is at the 93rd percentile nationally for all nursing programs and at the 87th percentile for comparable BSN programs
57. ACE EBP Readiness Inventory This is a confidence scale/self efficacy tool.
EBP competencies developed and leveled across BSN, MSN, and PhD levels based on a three-year national consensus model (ACE Star model).
Students are asked to rate their confidence with each competency
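Summarizing such a confidence scale amounts to item-wise averaging of student ratings. A sketch with invented ratings (the actual ACE-EBPRI data and scale appear on the following slides):

```python
# Invented confidence ratings (one list per competency item) showing how the
# per-question means reported on the next slides would be computed.
responses = {
    "Q1": [4, 5, 3, 4],
    "Q16": [6, 5, 5, 6],
}

item_means = {item: round(sum(r) / len(r), 2) for item, r in responses.items()}
print(item_means)  # {'Q1': 4.0, 'Q16': 5.5}
```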
58. ACE EBP Readiness Inventory EBPRI administered in Fall Semester & January to a total of 210 students:
BSN Orlando, first semester juniors;
MSN entering & graduating;
PhD first semester
59. Preliminary Results BSN Orlando First Semester n=93 Mean age 20.9 (range 19-29)
ACE-EBPRI Questions 1-20 – mean 4.7
Lowest = 4 (Q1, 2, 7)
Highest = 5.3 (Q19)
60. Preliminary Results MSN - First Semester n= 71 Mean age = 35 (range 23 – 55)
Years Nursing Experience ~ 9.81
ACE-EBPRI (Questions 1-52) Mean = 4.78
Lowest Score = 3 (Q2, Q3); 3.2 (Q30, Q31)
Highest Score = 5.5 (Q16); 5.3 (Q17); 4.8 (Q19)
61. Preliminary Results MSN - Graduating n= 8 Mean age = 35 (range 24 – 50)
Years Nursing Experience:
4 with 1 to 5 years; 4 with > 10 years
ACE-EBPRI (Questions 1-52) Mean = 6.14
Lowest Score = 4.29 (Q30)
Highest Score = 8.5 (Q16); 7.57 (Q17, Q18, Q19)
62. ACE – EBPRI Question Review Q 1 Define EBP in terms of evidence, expertise, and patient values.
BSN end 1st semester = 4
MSN entering = 4
MSN graduating = 7
63. ACE – EBPRI Question Review Q2: With assistance & existing standards, critically appraise original research reports for practice implications in context of EBP.
BSN end 1st semester = 4
MSN entering = 4
MSN graduating = 6
64. ACE – EBPRI Question Review Q3 Use pre-constructed expert search strategies (hedges) to locate primary research in major bibliographic databases.
BSN end first semester = 5
MSN entering = 3
MSN Graduating = 4.71
65. ACE – EBPRI Question Review Q16 Deliver care using evidence-based clinical practice guidelines.
BSN end first semester = 5.3
MSN entering = 5.5
MSN Graduating = 8.5
66. ACE – EBPRI Question Review Q19 Choose evidence-based approaches over routine as the basis for one’s own clinical decision making.
BSN end of first semester = 5.3
MSN entering = 4.8
MSN graduating = 7.57
67. Terminology – MSN entering pre-constructed expert search strategies
hedges
primary research
evidence summary
Cochrane Database of Systematic Reviews
explicate
CPG
practice variation
Taxonomies
stakeholders & resource managers
68. EBP for IF Year One 2006-2007 Phase I: Assessment of Faculty
Assess Faculty related to IF and EBP awareness and use in the teaching process
Examine mission, philosophy, program outcomes for IF and EBP
69. EBP for IF Year One 2006-2007 Phase II Faculty Development-
Send task force members to conferences related to EBP and IF (“train the trainer”)
WAC
EBP Conferences
UCF FCTL summer and winter workshops
Year’s end present project at national/international conferences
70. EBP for IF: Year One Phase II Faculty Development
TWO CON Faculty development workshops
January: focused on UCF IF project and EBP readiness
May: focused on EBP with national experts
71. EBP for IF: Year Two Create an IF/EBP framework for the College of Nursing
Framework will integrate EBP and Information Fluency outcomes for all nursing programs.
Plan for integration of the IF/EBP framework with the curricula of all programs in the CON
72. EBP for IF: Year Two Involve faculty at all levels in the IF-EBP curriculum development process
Develop Student Learning Outcomes (SLO) for courses and programs.
Level the SLOs by program level and within the programs.
For each SLO identify direct measures to assess student achievement.
73. EBP for IF: Year Two Continue to review evaluation tools for IF/EBP to ensure match with SLOs
Analyze and review assessment data collected to date with incoming students to determine whether the tools are the right ones.
Collect incoming and outgoing student data with selected tools.
74. EBP for IF: Year Two Select the leveling methods we will use to Introduce, Emphasize and Reinforce IF/EBP across the three curricula.
Consider courses needing significant revision to adapt to the new model, e.g.:
Undergraduate Nursing Research???
Graduate research method or EBP courses
UG and Graduate level outcome projects
75. EBP for IF: Year Two Dissemination of what we have learned so far:
Task force continue to present at national/international meetings
Task force members develop and publish at least two journal articles related to IF/EBP in nursing education.
Offer a conference for faculty and outside participants related to IF/EBP??
76. EBP for IF: Year Three Construct matrix of SLOs and program courses & identify in which courses SLOs are taught and assessed.
Continue with plan for curricula and pedagogical revisions.
Carry curricular revisions through appropriate approval processes in CON and at University level.
77. EBP for IF: Year Three Evaluation Plan (began in years 1 & 2)
Continue to conduct incoming student assessment of IF EBP competencies.
Conduct end of program assessment in information literacy and EBP before implementation of changes.
Involve stakeholders: students, alumni, clinical partners, and faculty
78. EBP for IF: Year Three Dissemination of what we have learned so far:
Task force continue to present at national/international meetings
Task force members develop and publish at least two journal articles related to IF/EBP in nursing education.
Offer a second annual conference for faculty and outside participants related to IF/EBP??
79. Resources for IF-EBP project Task force faculty (train the trainer) supported in travel and registration for national IF related and EBP conferences and FCTL conferences.
External experts supported to come to campus to work with faculty.
Support of project by IF team and IF fellows.
80. QEP Faculty Fellows
Karla Kitalong: kitalong@mail.ucf.edu
Communicating
English Department
Jay Brophy: drjbrophy@gmail.com
Technical literacy
Psychology Department
Andrew Todd: atodd@mail.ucf.edu
Information Literacy
Library
Nancy Stanlick: stanlick@mail.ucf.edu
Critical Thinking
Philosophy Department
81. Resources for IF-EBP project UCF Librarians continue to work with Nursing to develop both IF standards and IF-EBP resource access.
82. Questions or Comments? Any thoughts about the project as it has been explained?