
Data Gathering Basics of Safe Patient Movement and Handling Research Design


Presentation Transcript


  1. Data Gathering Basics of Safe Patient Movement and Handling Research Design William F. Wieczorek, Ph.D. Buffalo State Center for Health and Social Research Chaitali Ghosh, Ph.D. Mathematics Department, Buffalo State College 2nd Annual Safe Patient Handling in Health Care Conference, Niagara Falls, NY October 1-2, 2008 This project is supported through a New York State Legislative Initiative (#C070232) sponsored by Senator George Maziarz.

  2. Session Overview (1) • Basics of Research and Evaluation Design • Defining Meaningful Objectives • Identifying data relevant to patient handling/movement program implementation

  3. Session Overview (2) • Methods of collecting data • Getting to Outcomes: analyzing and presenting results from your data • Examples from the New York State Safe Patient Movement and Handling Demonstration Project

  4. Defining Terms • Research means a systematic investigation, including development, testing, and evaluation, designed to develop or contribute to generalizable knowledge (45 CFR 46.102(d)). • This is the definition of research used by the US government (e.g., by Institutional Review Boards for human subjects review)

  5. Defining Evaluation • Program evaluation is defined as follows: A systematic assessment of the results or outcomes of program efforts to measure actual outcomes against the intended outcomes of the program; to discover achievement and results; to discover deviations from planned achievements; to judge the worth of the program; to identify unintended consequences; and to recommend expansion, contraction, elimination, or modification of the program. (DOJ) • Evaluation is a specialized form of research

  6. Rationale for Evaluation • Broad options • Summative: outcomes • Formative: processes, implementation issues • Long-term sustainability • Make specific to your project and needs • Be inclusive in planning the evaluation • Get feedback from array of participants/constituencies

  7. Defining Meaningful Objectives • Scope of the project • How large (1 facility, 100 facilities, number of residents/clients) • Type of facility (acute, long-term, home care) • Type of clients/residents/patients • Number of staff

  8. Purpose of Data Collection • Write down your program objectives • Share them with others (get feedback) • Link the objectives to specific data • Project milestones and specific dates • Worker injury data • Staff retention • Lost time

  9. Access to Relevant Data • Simple but not easy! • Can you truly get the information you need? • Reasonableness as an initial assessment of access (Does it exist? Can it be accessed? In what format?)

  10. Quantitative/Qualitative Data • Quantitative data: something that is measurable as a number • such as a count of events, rate of injury, costs, or scale scores • Instrument-based questions • Usually analyzed using statistical techniques • Focus is on outcomes/summative evaluation

  11. Quantitative/Qualitative Data • Qualitative data: non-numerical measures • Verbal feedback, narratives, open-ended feedback • Analyzed from a context viewpoint • Usually informs process evaluation • Mixed methods evaluations utilize quantitative and qualitative data

  12. Data for Safe Patient Handling Projects • Workplace data • Occupational injuries (back, strains, shoulder, etc.) • Time off/lost time • Worker retention/turnover • Patient data • Injuries, satisfaction • Systems/environment data • Ergonomic/safety committees • Training/updates

  13. Data Collection Methods • Record data • Injury/OSHA/DOL logs, sheets, etc. • HR data • Business office data • Patient injury data • May be more challenging than anticipated! • Need a specific protocol for each type of record data • Issues with validity/data quality
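Record data such as OSHA logs typically arrive as raw injury counts plus hours worked, which must be converted to a common rate before facilities or time periods can be compared. Below is a minimal sketch of the standard OSHA incidence-rate calculation; the facility figures are hypothetical, not from the demonstration project:

```python
# OSHA total recordable incidence rate: injuries per 100 full-time
# workers per year. The 200,000 multiplier is 100 employees
# x 40 hours/week x 50 weeks/year.
def incidence_rate(recordable_injuries, hours_worked):
    return recordable_injuries * 200_000 / hours_worked

# Hypothetical facility: 14 recordable injuries over 310,000 hours worked.
print(round(incidence_rate(14, 310_000), 1))  # -> 9.0 injuries per 100 FTE-years
```

Expressing injuries as a rate per 100 full-time equivalents makes a small long-term care unit directly comparable with state and national benchmark figures.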

  14. Data Collection Methods • Surveys/Questionnaires • Self-report of injuries to compare with record data • Staff assessment of program • Implementation issues • Key knowledge assessment • Staff satisfaction/Patient satisfaction • Open-ended feedback

  15. Data Collection Methods • Key informants • Specific questions regarding the program • Interview with persons knowledgeable about program • Can be done with group of informants • Focus groups • Group process • Questions with a group of similar people

  16. Data Quality Issues • Validity and reliability • Validity is whether a measure accurately captures what it is intended to measure • Reliability is whether the measure is stable (test-retest reliability; internal consistency for scales) • Data quality is variable • Biases • Need for convergent information from multiple sources
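For multi-item survey scales, internal consistency is commonly summarized with Cronbach's alpha. A small illustration using only the Python standard library; the three-item scale and respondent scores are invented for the example:

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha for a scale.
    `items` is a list of columns: one list of scores per scale item,
    with respondents in the same order in every column."""
    k = len(items)
    totals = [sum(resp) for resp in zip(*items)]          # per-respondent totals
    item_var = sum(pvariance(col) for col in items)       # sum of item variances
    return k / (k - 1) * (1 - item_var / pvariance(totals))

# Hypothetical 3-item staff-attitude scale, five respondents.
scores = [
    [4, 5, 3, 4, 2],
    [4, 4, 3, 5, 2],
    [5, 5, 4, 4, 3],
]
print(round(cronbach_alpha(scores), 2))
```

Values above roughly 0.7 are conventionally read as acceptable internal consistency, though the threshold depends on how the scale will be used.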

  17. Getting to Outcomes • Methods of analyzing your data • Simple non-statistical techniques • Comparisons, interpretations of qualitative data, assessment of milestones • Statistical methods • Allows one to assess the probability that changes are real versus being due to chance • Usually requires training in methods

  18. Getting to Outcomes • Statistical considerations • Need for an adequate sample size and valid measures • Need to use statistical methods appropriate for the data sources • Pre-/post-tests, trends over time, comparison of local data with state and national data • Descriptive statistics (frequencies, means) often used for survey data
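As one illustration of a pre-/post comparison, a paired t statistic can be computed from matched before/after measurements. This is a sketch only; the units and injury counts are hypothetical, and a real analysis should involve someone trained in statistical methods:

```python
from math import sqrt
from statistics import mean, stdev

def paired_t(pre, post):
    """t statistic for a paired pre/post comparison (df = n - 1).
    A |t| above ~2.57 (two-tailed critical value for df = 5 at
    alpha = 0.05) suggests the change is unlikely to be chance."""
    diffs = [after - before for before, after in zip(pre, post)]
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / sqrt(n)), n - 1

# Hypothetical monthly lift-related injury counts for six units,
# before and after equipment installation.
pre  = [7, 5, 9, 6, 8, 7]
post = [4, 4, 6, 5, 5, 4]
t, df = paired_t(pre, post)
print(round(t, 2), df)
```

A negative t here means injuries fell after the intervention; with small samples like this, one unusual month can move the statistic substantially, which is why the later slides stress baseline variability.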

  19. Getting to Outcomes • Match your data analysis and presentation to the needs of your audience!! • Scientific publication • Internal reports • Funding agencies

  20. Example • Safe Patient Handling and Movement Project • NYS demonstration project • Three long-term care facilities • Two hospitals • Evaluate impact (summative eval) • Assess issues that impede/assist implementation (formative eval)

  21. Data sources • Record data • Worker injuries • Days lost • Type of injury • Calculated costs • For period of three years prior to project and three years during project • Adding patient injury data

  22. Injury Data 1

  23. Injury Data 2

  24. Injury Data 3

  25. Injury data 4

  26. Injury Data 5

  27. Injury Data 6

  28. Staff Survey Data

  29. Survey Results 1

  30. Survey Results 2

  31. Survey Results 3

  32. Survey Results 4

  33. Lessons and Recommendations • Getting data is harder than it sounds! • Key staff turnover • Data quality is a major issue for record data • Potential biases, changes in record methods over time • Even “required” data may not be readily available

  34. Lessons and Recommendations • Need to build rapport/relationship • With your team • With key administrators • With union reps • With key nursing and training staff • Be persistent • Develop procedures for contacts and follow ups

  35. Lessons and Recommendations • Baseline data provides useful insights • Tremendous variability in some measures (e.g., days lost) • In small populations, a single major injury can skew the data • Adds a further degree of difficulty for single sites and demonstration projects • Formative: processes, implementation issues

  36. Lessons and Recommendations • Surveys and staff feedback are critically important • Showed that staff do not recognize the importance of ergonomics teams • Differences in perceptions of the program between floor staff and administrators • Injury reports from the survey indicate that many injuries and near misses go unreported
