
Lessons Learned for Creating a Process Improvement Performance Management System



Presentation Transcript


  1. Lessons Learned for Creating a Process Improvement Performance Management System Jay Ford, Ph.D. Director of Research, NIATx March 2010 Reduce Waiting & No-Shows → Increase Admissions & Continuation

  2. Some is not a number, soon is not a time. -- Don Berwick, MD

  3. Institute of Medicine Report • Promote patient-centered care • Foster the adoption of evidence-based practices • Develop and use process and outcome measures to enhance quality of care, and • Mandate the use of quality improvement measures

  4. Improving the quality of care requires • Identifying problem areas accurately • Implementing creative interventions • Understanding customer data needs

  5. Design Data Systems that • Capture and track essential measures of performance and quality • Teach staff how to collect, analyze, and learn from data, and • Develop systems to support organizational data needs
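To make slide 5 concrete, here is a minimal Python sketch of capturing a few essential NIATx-style measures (waiting time, no-shows, admission, continuation). The record fields and names are illustrative assumptions, not the schema of any actual state system.

```python
from dataclasses import dataclass
from datetime import date
from typing import List, Optional


@dataclass
class ClientRecord:
    first_contact: date                        # date of first contact, often missing in legacy systems
    first_appointment: date                    # first scheduled treatment appointment
    attended: bool                             # False = no-show
    admitted: bool                             # admitted to treatment
    in_care_at_30_days: Optional[bool] = None  # simple continuation proxy


def days_to_treatment(record: ClientRecord) -> int:
    """Waiting time: days from first contact to first appointment."""
    return (record.first_appointment - record.first_contact).days


def no_show_rate(records: List[ClientRecord]) -> float:
    """Share of scheduled first appointments that were missed."""
    return sum(not r.attended for r in records) / len(records)
```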

  6. Is This Your Data System?

  7. Inquiring minds want to know • What factors were related to successful adoption of process-focused data? • What barriers were encountered in developing data expertise and focus? • What attributes do we need to think about when developing a data system?

  8. Process Improvement Performance Management (PIPM) Hierarchy of Needs

  9. How Much Data to Collect?

  10. Key Lessons Learned: Data Collection • Key process improvement variables are often not available (e.g., date of first contact) or may not be adequately captured (e.g., no-shows) within existing systems. • Conduct a data walk-through of your system to assess capabilities. • Identify currently available PI data elements. • Flowchart the provider submission process. • Evaluate the data submission instructions. • Pilot test the process with a small sample of records.
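A hedged sketch of the walk-through and pilot-test steps above, assuming a CSV extract with hypothetical column names: it reports which process-improvement elements the extract captures and how often they are populated in a small sample of records.

```python
import csv

PI_ELEMENTS = ["date_of_first_contact", "admission_date", "no_show", "discharge_date"]


def walk_through(path: str, sample_size: int = 25) -> None:
    """Print, for each PI element, whether the extract captures it and how often it is populated."""
    with open(path, newline="") as f:
        rows = [row for _, row in zip(range(sample_size), csv.DictReader(f))]
    for element in PI_ELEMENTS:
        if not rows or element not in rows[0]:
            print(f"{element}: not captured by the extract")
            continue
        filled = sum(1 for row in rows if (row.get(element) or "").strip())
        print(f"{element}: {filled}/{len(rows)} sampled records populated")
```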

  11. Data Walk-through Questions • Could the data easily be pulled from the state system? • What barriers were encountered? • How complete and accurate was the data? • Were there significant gaps in the data? • Did you notice any errors in the data? • Write up and share the lessons learned with key stakeholders.
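The completeness and error questions above can also be answered programmatically. This is a small illustrative sketch, again with hypothetical field names and date formats.

```python
from datetime import datetime
from typing import Dict, List


def completeness(rows: List[Dict[str, str]], fields: List[str]) -> Dict[str, float]:
    """Fraction of records with a non-blank value for each field."""
    n = len(rows) or 1
    return {f: sum(1 for r in rows if (r.get(f) or "").strip()) / n for f in fields}


def date_order_errors(rows: List[Dict[str, str]]) -> int:
    """Count records where the recorded admission date falls after the discharge date."""
    errors = 0
    for r in rows:
        try:
            admitted = datetime.strptime(r["admission_date"], "%Y-%m-%d")
            discharged = datetime.strptime(r["discharge_date"], "%Y-%m-%d")
        except (KeyError, ValueError, TypeError):
            continue  # missing or malformed dates are a completeness issue, not counted here
        if admitted > discharged:
            errors += 1
    return errors
```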

  12. Is this Data Quality?

  13. Key Lessons Learned: Data Quality • Establish a process for verifying and checking data accuracy. • Failure to verify data entry for accuracy will limit the validity of performance management feedback reports related to process improvement. • Approaches toward ensuring data integrity include • Automatic linkages (e.g., Washington) • Built-in quality checks (e.g., Ohio and Maine) • Feedback mechanisms (e.g., New York, South Carolina, and Oklahoma), and • Ongoing training and technical assistance
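One way to picture a built-in quality check with a feedback mechanism: a small validation function whose messages go back to the provider data coordinator before a record is accepted. The rules and field names are illustrative assumptions, not any state's actual edits.

```python
from typing import Dict, List


def validate(record: Dict[str, str]) -> List[str]:
    """Return a list of problems to send back to the provider data coordinator."""
    problems = []
    if not (record.get("client_id") or "").strip():
        problems.append("client_id is blank")
    if record.get("no_show") not in ("0", "1"):
        problems.append("no_show must be coded 0 or 1")
    first_contact = record.get("date_of_first_contact") or ""
    admission = record.get("admission_date") or ""
    if first_contact and admission and first_contact > admission:  # ISO dates compare correctly as text
        problems.append("date_of_first_contact is later than admission_date")
    return problems
```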

  14. Examples of Ongoing Training and Technical Assistance • Oklahoma created a Data Integrity Review Team (DIRT) to provide on-site review and technical assistance on all data issues for any provider. • Maine created a change team to monitor data and performance of the contracted agencies and developed FAQs. • New York developed a series of data entry and report analysis training modules for the STAR-QI system. • Ohio offers technical assistance and follow-up through site visits, telephone calls, or conferences.

  15. What type of Feedback?

  16. Comparative Feedback • Organizational Performance versus • a target (internal) or • a benchmark (external) • Types of comparisons • Internal comparisons over time • External performance comparisons to other similar organizations • External performance comparisons to other agencies within a state

  17. Comparative Feedback • Measurement Comparisons • Performance vs. Outcomes • Business Process vs. Treatment Performance/Outcomes • Importance of Comparisons • Types of Feedback Reports • Data Quality • Performance Reports • Pay for Performance

  18. Comparative Feedback • Understand the whole picture • Select a few key outcome measures • Use of reports to guide questions • Benchmarks vs. Targets • Focus on the comparison (internal vs. external)
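A minimal sketch of the comparative feedback idea in slides 16–18: one lower-is-better measure compared against an internal target and an external (statewide) benchmark. The measure name and numbers are invented examples.

```python
def feedback_line(agency: str, value: float, target: float, benchmark: float) -> str:
    """One comparative feedback line for a lower-is-better measure such as days to treatment."""
    vs_target = "meets" if value <= target else "misses"
    vs_peers = "below" if value < benchmark else "at or above"
    return (f"{agency}: {value:.1f} days to treatment "
            f"({vs_target} the internal target of {target:.1f}; "
            f"{vs_peers} the statewide benchmark of {benchmark:.1f})")


print(feedback_line("Agency A", 12.0, 14.0, 18.5))
```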

  19. Key Lessons Learned: Performance Management • Do not skimp on data quality efforts. • Ensure access to all persons who need the reports. • Create performance feedback loops that include, not isolate, the provider data coordinators. • Provide only reports that help providers effectively use data to make decisions. • Use pictures or graphs, but remember: one graph, one message. • Update reports over time as data is corrected.
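To illustrate "one graph, one message," here is a hedged plotting sketch that shows a single measure over time against its target. The monthly values and the target are invented.

```python
import matplotlib.pyplot as plt

months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun"]
days_to_treatment = [21, 19, 17, 15, 14, 12]   # invented monthly averages
target = 14                                    # invented internal target

plt.plot(months, days_to_treatment, marker="o", label="Days to first treatment")
plt.axhline(target, linestyle="--", label=f"Target ({target} days)")
plt.title("Average days from first contact to first treatment")
plt.ylabel("Days")
plt.legend()
plt.savefig("days_to_treatment.png")
```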

  20. State Examples • New York generates data warehouse reports by provider or in the aggregate. • Ohio links STAR-SI performance measures to departmental Performance Target Outline (PTO). • South Carolina facilitates provider comparisons by preparing & disseminating monthly comparative reports. • Maine provides public access to the TDS reports and allows agencies to access the secure system and to request specialized reports. • Oklahoma provides feedback through the Integrated Client Information System (ICIS), allowing monthly access to feedback reports.

  21. Key Lessons Learned: Pay for Performance • Building the system • Pilot testing • Offering the right type of incentive • Overcoming potential obstacles • Implementing strategies for long-term success and sustainability
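Purely as an illustrative sketch of a pay-for-performance rule (the thresholds and award amounts are assumptions, not any state's contract terms):

```python
def incentive_payment(avg_wait_days: float, target_days: float = 14.0,
                      base_award: float = 1000.0) -> float:
    """Full award at or below target, half award within 10% of it, otherwise none."""
    if avg_wait_days <= target_days:
        return base_award
    if avg_wait_days <= target_days * 1.1:
        return base_award / 2
    return 0.0
```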

  22. For further information, please visit www.niatx.net or www.odmhsas.org/data. Contact Information: Jay Ford, PhD, Jay.ford@chess.wisc.edu, 608-262-4748; Mark Reynolds, MAReynolds@odmhsas.org
