
Current Examples of Joint HSR/SR Evaluation



  1. Current Examples of Joint HSR/SR Evaluation Edward J. Miech, Ed.D. Research Scientist Health Services Research & Development VA-CASE (VISN11 VERC) Roudebush VA Medical Center Indianapolis, IN July 14, 2010

  2. VA Cancer Care Collaborative, Phase II • 21 VA Teams Working on 5 Different Cancer Types • Lung, Prostate, HCC, Head & Neck, Colon • SR-based Collaborative • Each team has a coach who is experienced with SR, team dynamics, VA context

  3. VA Cancer Care Collaborative, Phase II • Each site receives monthly visits by an Industrial Engineer • Learning Sessions include sessions on SR principles, concepts & tools • Successor to VA Cancer Care Collaborative-Phase I • Co-Chairs: Ken Klotz and Kim Voss

  4. Team Development Measure • CCC II planning committee interested in learning more about team dynamics • In May 2010 (during Action Period 1), individual CCC II team members were invited to complete an online version of the Team Development Measure • n = 97; 18 teams had ≥ 3 members complete the TDM

  5. Team Development Measure • 31-item instrument where respondents rate level of agreement with statements about team cohesion, communication, roles and goals • Respondents retain anonymity • Takes 5-10 Minutes To Complete

  6. Team Development Measure • Examples of Statements • Team members say what they really mean. • The goals of the team are clearly understood by all team members. • The work I do on this team is valued by the other team members. • 4-point Likert response scale: • Disagree Strongly, Disagree, Agree, Agree Strongly
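To make the 4-point response format concrete, here is a minimal Python sketch of scoring a 31-item Likert instrument of this kind. The 1-4 numeric coding, the linear rescaling to a 0-100 scale, and the function name are illustrative assumptions, not the TDM's published scoring procedure.

    # Illustrative only: the TDM's actual scoring may use its own conversion
    # tables; this linear 0-100 rescaling is a hypothetical stand-in.
    LIKERT = {"Disagree Strongly": 1, "Disagree": 2, "Agree": 3, "Agree Strongly": 4}

    def tdm_style_score(responses):
        """Convert one respondent's 31 Likert answers to a 0-100 scale score."""
        values = [LIKERT[r] for r in responses]
        mean = sum(values) / len(values)      # mean item response, 1-4
        return round((mean - 1) / 3 * 100)    # rescale linearly to 0-100

    # Example: a respondent who mostly agrees
    answers = ["Agree"] * 25 + ["Agree Strongly"] * 4 + ["Disagree"] * 2
    print(tdm_style_score(answers))  # 69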

  7. Team Development Measure • Permission Granted To Use Freely In VA • VA Guide To Using TDM Developed Under Leadership of DeDe Ordin and Brian Mittman • 12-page detailed handbook, includes questions • published February 2010 • Freely Available on QUERI website at http://www.queri.research.va.gov/ciprs/TDM_Module.pdf

  8. Team Development Measure: Preliminary Results

  9. Team Development Measure • Coaches Receive Customized Reports For Each of 18 CCC II Teams • Team Reports will include distribution of individual responses for each TDM item • Visual depiction of members’ different perceptions of current state of team

  10. Team Development Measure: Example of TDM Customized Team Report

  11. Baseline, June 2010 • Number of respondents = 6 • Mean score = 65 • Lowest score = 50 • Highest score = 88 • The graph below shows how many team members see the team at what stage.
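The per-team summary above can be reproduced in a few lines of Python; the six scores below are hypothetical, chosen only so the output matches the slide's figures.

    # Hypothetical individual scores for one 6-member team
    scores = [50, 55, 60, 66, 71, 88]

    report = {
        "Number of respondents": len(scores),
        "Mean score": round(sum(scores) / len(scores)),   # 65
        "Lowest score": min(scores),                      # 50
        "Highest score": max(scores),                     # 88
    }
    for label, value in report.items():
        print(f"{label} = {value}")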

  12.-14. [Screenshot placeholders: SnagIt captures of the TDM customized team report]

  15. Team Development Measure • Coaches have the option to lead a 1-hour facilitated discussion covering: • Where the team is now • What the next stage looks like • How the team might get to the next stage

  16. Team Development Measure • TDM data are quickly analyzed and reported back to the CCC planning committee • TDM data cross-referenced with other measures • At the team level, diverse perceptions of how members see the team lead to an evidence-based discussion that promotes further team development and growth

  17. Team Development Measure • Tool = Online Instrument (Kwik Surveys) • All-Digital Data Capture and Organization • Downloads automatically into Excel • Respondents Retain Individual Anonymity • Allows for candid responses to questions • Multi-Level Data • Tracked at both individual and team level • Summary Reports Useful at both Collaborative and Team levels
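As a sketch of this multi-level workflow, the pandas snippet below aggregates an individual-level Excel export to team-level summaries. The file name and the "team" and "score" columns are hypothetical; the actual Kwik Surveys export layout may differ.

    import pandas as pd

    # Individual-level rows, one per respondent (hypothetical export layout)
    responses = pd.read_excel("tdm_export.xlsx")

    # Team-level summary; keep only teams with >= 3 respondents,
    # mirroring the reporting threshold used for the CCC II teams
    team_summary = (
        responses.groupby("team")["score"]
        .agg(n="count", mean="mean", low="min", high="max")
        .query("n >= 3")
    )
    print(team_summary)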

  18. PCMH Collaborative LS1 - West • Learning Session with approximately 300 participants taking place right now in San Diego • 300-clicker ARS system in use to capture baseline and endpoint data (i.e., pre- and post-)
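One way to use baseline and endpoint clicker data is a paired pre/post comparison on a single item. The sketch below assumes responses can be matched per participant (e.g., via clicker IDs); the response values are hypothetical placeholders.

    from scipy import stats

    pre  = [3, 2, 3, 4, 2, 3, 3, 2, 4, 3]   # baseline Likert responses (1-5)
    post = [4, 3, 3, 4, 3, 4, 4, 3, 5, 4]   # endpoint responses, same order

    t, p = stats.ttest_rel(pre, post)        # paired t-test on matched responses
    mean_change = sum(post) / len(post) - sum(pre) / len(pre)
    print(f"mean change = {mean_change:+.2f}, p = {p:.3f}")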

  19. Baseline ARS at PCMH LS1 - West: Selected Items

  20. How many collaboratives have you participated in, including this one? • 1 • 2 • 3 • 4 • 5 • 6 or more

  21. How many years have you worked for the VA? • < 1 • 1 - 2 • 3 - 5 • 6 - 10 • 11 - 15 • 16 - 25 • 26 +

  22. What is your job title? • Administrative Assoc • HPDP/BHC • LVN/LPN • Nurse • PCMH Faculty & Staff • Pharmacist • Physician • Social Worker • Technician/MA • VA Faculty Leader

  23. How many VA project-based teams (QI, Systems Redesign, Lean, etc.) have you personally been a member of, including this one? • 1 • 2 • 3 • 4 • 5 • 6 +

  24. Did you complete the prework before you came to LS #1? • Yes, all • Yes, most • Yes, some • No • Did not know about pre-work • N/A

  25. I had enough knowledge about the tools available to complete the prework. • Strongly Agree • Agree • Neutral • Disagree • Strongly Disagree

  26. I had enough resources, including time, to complete the prework. • Strongly Agree • Agree • Neutral • Disagree • Strongly Disagree

  27. Exploratory Data Analysis: Correlations • "How many collaboratives have you participated in?" correlated with: • "How many project-based teams have you been on?" (r = .55, p < .0001) • "Did you complete the LS1 pre-work?" (r = .21, p < .001)
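The values above are plain Pearson correlations, which can be computed as in this sketch; the two vectors are hypothetical stand-ins for the paired ARS responses.

    from scipy.stats import pearsonr

    collaboratives = [1, 2, 1, 3, 2, 4, 1, 5, 2, 3]   # prior collaboratives
    teams          = [1, 3, 2, 4, 2, 5, 1, 6, 3, 4]   # project-based teams

    r, p = pearsonr(collaboratives, teams)
    print(f"r = {r:.2f}, p = {p:.4f}")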

  28. ARS data from CCC II LS1-New Orleans • Statistically significant correlations between participation in any prior VA Collaborative and • Higher levels of self-reported familiarity with Systems Redesign tools (r = .59, p < .0001) • Higher levels of perceived support from site leadership for Collaborative work (r = .42, p < .02) • Higher letter grades given by participants to LS1 (r = .41, p < .02)

  29. Common Features: ARS & Online TDM • All-Digital Capture (with built-in reporting features) • Respondents retain anonymity • Responses can be tracked at individual level • Data can be segmented, aggregated, analyzed for correlations, etc.

  30. VA Yellow Belt Training Program • In the VA Yellow Belt training program, participants actively learn about Systems Redesign and Lean principles, concepts, and tools through simulations, group application exercises, didactic presentations, and discussion • Enrollment capped at 50 participants per session • Courses usually taught by faculty from the Purdue PharmaTAP program • Sessions generally run for 3.5 consecutive days at an off-site location such as a hotel

  31. VA Yellow Belt Training Program • Over 400 participants in VA Systems Redesign Yellow Belt training across 12 different sessions held between August 2009 (pilot in Indianapolis) and June 2010 (Boston, MA) • Pre- and post-data collected at 11 of these 12 sessions

  32. Example: Interim YB Findings

  33. Example: Interim YB Findings • Analyses found strong positive correlations between: • the items "I believe this Yellow Belt workshop will be relevant to my job" and "I am positive about Systems Redesign" (r = .68, p < .0001) • the items "I found this workshop to be practical" and the overall letter grade given to the workshop (r = .67, p < .0001)

  34. Example: Yellow Belt in Connecticut • I am familiar with many of the tools used in Systems Redesign. (n=34)

  35. ARS Advanced Features • Multi-Voting • Priority Ranking • 1st choice = 10 points • 2nd choice = 5 points • 3rd choice = 1 point • Impact-Effort Matrix • Graph two dimensions along x-axis and y-axis • Conditional Branching
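The priority-ranking scheme (1st = 10, 2nd = 5, 3rd = 1) amounts to a weighted tally of each participant's top-three choices, as in this sketch; the ballots are hypothetical.

    from collections import Counter

    WEIGHTS = [10, 5, 1]   # points for 1st, 2nd, 3rd choice

    ballots = [
        ["access", "staffing", "scheduling"],
        ["staffing", "access", "supplies"],
        ["access", "scheduling", "staffing"],
    ]

    totals = Counter()
    for ballot in ballots:
        for rank, option in enumerate(ballot):
            totals[option] += WEIGHTS[rank]

    for option, points in totals.most_common():
        print(option, points)   # access 25, staffing 16, ...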

  36. Next Frontier • Capturing Multi-Level Data From Complex, Multi-Site Initiatives in a Single Project File • Flexibly Interrogating/Analyzing Evidence • Keywords • Codes • Indexing

  37. Next Frontier • Adding value to source data by: • Applying codes and tags • Linking to analytic memos • Performing context-matrix analyses • Capturing, organizing, analyzing, and reporting rich-context data to initiative organizers/planners on an ongoing, rolling basis • Digital tool = NVivo8
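NVivo8 itself is a GUI application, but the coding-and-indexing idea behind it can be illustrated in plain Python: tag each free-text note with codes from a keyword codebook so notes can later be retrieved by code. The codebook and example note are entirely hypothetical.

    # Not NVivo: a minimal keyword-based coding sketch
    CODEBOOK = {
        "leadership": ["director", "chief", "leadership"],
        "barriers":   ["barrier", "delay", "shortage"],
    }

    def apply_codes(note):
        """Return the set of codes whose keywords appear in the note."""
        text = note.lower()
        return {code for code, words in CODEBOOK.items()
                if any(w in text for w in words)}

    print(apply_codes("Staffing shortage noted; chief supports redesign."))
    # {'barriers', 'leadership'}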

  38. Acknowledgements • Heather Woodward-Hagg: Chief of Systems Redesign, Roudebush VA; Co-Director, VA-CASE (VISN11 VERC) • Mary Sherrill: Co-Director, VA-CASE (VISN11 VERC) • Dawn Bravata: Co-Coordinator, Stroke QUERI
