
Data Guided Decision-Making: How asking questions can improve fidelity

SPDG Grantees Meeting, US Department of Education, Office of Special Education Programs, November 6, 2013, Washington, DC. Allison Metz, Ph.D., National Implementation Research Network, FPG Child Development Institute

Presentation Transcript


  1. Data Guided Decision-Making: How asking questions can improve fidelity. SPDG Grantees Meeting, US Department of Education, Office of Special Education Programs, November 6, 2013, Washington, DC. Allison Metz, Ph.D., National Implementation Research Network, FPG Child Development Institute, University of North Carolina at Chapel Hill

  2. Decision Support Data Systems Goals for Today’s Session • Why should we start with the questions? • How can we foster the curiosity of key stakeholders? • How can data guided decision-making improve fidelity of a well-defined “what”? • How can data guided decision-making further operationalize your “what” and “how”?

  3. Starting with Questions

  4. Implementation Science: Achieving Good Outcomes. Effective Interventions (the "WHAT") combined with Effective Implementation (the "HOW") lead to Positive Outcomes for Children, with data informing each element.

  5. Data-Based Decision Making Where to start • Determine what questions you want to answer (implementation team) • Determine what data will help to answer questions • Determine the simplest way to get the data • Put systems in place to collect data • Analyze data to answer questions (Lewis, 2009)
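
As a rough illustration of the last step above (analyzing data to answer a question), the sketch below computes a fidelity rate from observation records. The field names and values are assumptions for discussion, not data from the presentation.

```python
# Illustrative only: answering one question ("what share of observed sessions
# met fidelity criteria?") from hypothetical observation records.
observations = [
    {"practitioner": "A", "met_fidelity": True},
    {"practitioner": "A", "met_fidelity": False},
    {"practitioner": "B", "met_fidelity": True},
]

met = sum(1 for obs in observations if obs["met_fidelity"])
print(f"Fidelity rate: {met / len(observations):.0%}")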

  6. Data-Based Decision Making Questions to Answer Consider questions related to: Outcomes Implementation Intervention

  7. Decision Support Data: How is this different? Practitioner fidelity is an important component of a DSDS, but not the full picture. What else do we need to understand?

  8. Data-Based Decision Making Questions to Answer • Are practitioners implementing intervention with fidelity? • Are practitioners “bought in” to the new way of work? • Are teachers, families, partners satisfied? • Are stakeholders collaborating? • Are coaching and supervision being delivered as intended? • Are outcomes being achieved? • Are appropriate referrals being made?

  9. Fostering Curiosity Rather than simply providing grantees with data (e.g., fidelity data provided by program developers), we need to foster the curiosity of grantees regarding their own implementation efforts so that grantees become learning organizations. Metz and Albers, In Press

  10. Discussion Points Starting with Questions • What do you need to know? Why do you need to know it? • What type of data could help you answer this question? • What would be the easiest way to get this information? • Do you have systems in place to collect this information? To conduct the analysis to answer your questions? • What are your next right steps? Supporting New Ways of Work

  11. Asking questions to improve fidelity of the “what”…

  12. Promote High Fidelity: Implementation Supports. Fidelity is an implementation outcome. How can we create an implementation infrastructure that supports high-fidelity implementation?

  13. [Implementation Drivers diagram] Competency Drivers (Selection, Training, Coaching, Performance Assessment/fidelity) and Organization Drivers (Decision Support Data System, Facilitative Administration, Systems Intervention) are integrated and compensatory, rest on Leadership Drivers (Technical and Adaptive), and together produce improved OUTCOMES for children and families.

  14. Fidelity Results Common Questions FIDELITY IS AN IMPLEMENTATION OUTCOME Therefore, common questions include: • Are staff able to implement the intervention as intended (with fidelity)? • If yes, what supports are in place to ensure ongoing performance? (e.g., ongoing training, coaching, data systems, facilitative administration, teams) • If no, what barriers or challenges are impeding their ability to implement with fidelity? • Competency challenge – e.g., need for more training, ongoing coaching, additional support from developer • Organizational challenge – e.g., a policy or procedure in the agency or system inhibiting fidelity • Leadership challenge – e.g., need for leadership to attend to organizational or system barriers

  15. Fidelity Results Common Questions FIDELITY IS AN IMPLEMENTATION OUTCOME Common questions include: • If no, who is primarily responsible for this driver? • What stage of implementation is this program in? • What data do we have telling us this is a challenge? • How can potential solutions be identified? • How can potential solutions be tracked to monitor improvement? • Is this challenge being experienced with other programs? • Who needs to know about this challenge?
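
As a rough illustration of the routing implied by the challenge types on slide 14, the sketch below maps a reported barrier to the driver area that would typically own it. The mapping, dictionary, and function name are assumptions for discussion, not part of the Implementation Drivers framework itself.

```python
# Illustrative only: route a fidelity challenge to the driver area that would
# typically be responsible for addressing it (per slide 14's challenge types).
CHALLENGE_TO_DRIVER = {
    "competency": "Competency Drivers: more training, ongoing coaching, developer support",
    "organizational": "Organization Drivers: revisit policies, procedures, data systems",
    "leadership": "Leadership Drivers: attend to organization or system barriers",
}

def route_challenge(challenge_type: str) -> str:
    """Return the driver area that would typically own this fidelity barrier."""
    return CHALLENGE_TO_DRIVER.get(
        challenge_type, "Bring to the Implementation Team for discussion"
    )

print(route_challenge("organizational"))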

  16. Case Example Results from Child Wellbeing Project Case management model involved intense program development of core intervention components and accompanying implementation drivers. Clinical case management and home visiting model for families post-care.

  17. Fidelity Data Program Improvement Program Review Process • Process and Outcome Data • Detection Systems for Barriers • Communication Protocols Questions to Ask • What formal and informal data have we reviewed? • What are the data telling us? • What barriers have we encountered? • Would improving the functioning of any Implementation Driver help address the barrier?
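
As a rough sketch of a "detection system for barriers" in the spirit of the Program Review Process above, the example below flags a program for review when fidelity stays below target for two consecutive review periods. The 80% target and the monthly numbers are assumptions, not project data.

```python
# Illustrative only: flag a program when fidelity is below target for two
# consecutive review periods, prompting a look at the Implementation Drivers.
FIDELITY_TARGET = 0.80
monthly_fidelity = [0.85, 0.78, 0.74, 0.82]  # hypothetical monthly rates

flags = [
    f"Periods {i} and {i + 1}: fidelity below target; review the Implementation Drivers"
    for i in range(1, len(monthly_fidelity))
    if monthly_fidelity[i - 1] < FIDELITY_TARGET and monthly_fidelity[i] < FIDELITY_TARGET
]
print(flags if flags else "No barriers flagged this quarter")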

  18. Case Example Results from Child Wellbeing Project Success Coach model involved intense program development of core intervention components and accompanying implementation drivers

  19. Discussion Points Data Guided Decision-Making • How curious is your agency about your program fidelity? How can you foster greater curiosity? • Are you currently using a structured process to assess how implementation efforts are contributing to program fidelity? How could you improve this process? • How might developing a core set of questions to address each month regarding fidelity and implementation be useful? How could you build this process into regular routines? Supporting New Ways of Work

  20. Asking questions to define the “what” and “how”…

  21. Data-Based Decision Making Analyze Data 7 Basic Evaluation Questions: • What does “it” look like now? • Are we satisfied with how “it” looks? • What would we like “it” to look like? • What would we need to do to make “it” look like that? • How would we know if we’ve been successful with “it”? • What can we do to keep “it” like that? • What can we do to make “it” more efficient & durable? (Sugai, 2004)

  22. “Key Aspects of Improvement” “Many initiatives fail for lack of study and reflection on what is actually being done and what the results are from having done it. Observing, describing, and documenting are key aspects to a program improvement cycle, and particularly critical during the pilot phase when key functions of interventions are emerging.” The Child Wellbeing Project, Improvement Cycle Tool

  23. DSDS Improvement Cycles The crux of the DSDS is Improvement Cycles: • Decisions should be based on data • Change should occur on purpose • Improvements must be ongoing – always striving to be better in order to succeed and have impact

  24. DSDS Improvement Cycles Three critical Improvement Cycles: • Usability Testing • Rapid Cycle Improvement Teams • Practice-Policy Communication Loops

  25. DSDS Usability Testing • Usability Testing is the strategic use of Plan, Do, Study, Act cycles to "test" and improve processes and procedures that are being used for the "first time" • Occurs during initial implementation of the process or procedure being tested

  26. DSDS Usability Testing Why is it helpful? • Designed to improve and "stabilize" early occurring components, implementation supports, and data collection processes • So that major "bugs" are worked out • And therefore: processes are more likely to be "effective" and Implementation Drivers can support the "right" processes

  27. Usability Testing Which processes to test? • Intervention Processes • Are literacy coaches able to engage children & families? • Are literacy coaches able to do the intervention as intended? • Implementation Processes • Does training occur as intended? • Can fidelity measures be collected as intended? • Data Collection Processes • Are assessments done on schedule? • Is the data entry system functional?

  28. Usability Testing Testing Dimensions • Limited number of "cases" within a given test • Enough to sample with variability in order to detect systematic problems rather than individual challenges • Staged to quickly get a sense of challenges • Small enough number to give you quick early returns of data • Metrics are both functional and easy to collect • Quantitative and Qualitative Information

  29. Usability Testing Testing Processes For EACH Test… • What’s the relevant measure/key outputs that can be quickly revealed? (e.g., opinions of teachers or parents, percentage of assessments done on schedule) • Who will be responsible for reporting the data and on what schedule? (e.g., Teachers will send an e-mail at the end of each week to their supervisor indicating the percentage of home visits that were done as scheduled) • Who will be responsible for summarizing the data? (e.g., Supervisors will summarize the data across all practitioners and forward to the Implementation Task Group by 4 p.m. on Tuesday for the prior week’s data).
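
As a rough illustration of the supervisor's summarizing step described above, the sketch below rolls up each practitioner's weekly "home visits done as scheduled" count into one figure for the Implementation Task Group. The names and numbers are assumptions, not project data.

```python
# Illustrative only: summarize weekly practitioner reports into one figure.
weekly_reports = {
    "practitioner_1": {"scheduled": 10, "completed": 9},
    "practitioner_2": {"scheduled": 8, "completed": 6},
}

scheduled = sum(r["scheduled"] for r in weekly_reports.values())
completed = sum(r["completed"] for r in weekly_reports.values())
print(f"Home visits done as scheduled this week: {completed}/{scheduled} "
      f"({completed / scheduled:.0%})")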

  30. Usability Testing Testing Results What if it’s not good enough? Plan, Do, Study, Act • You Planned • You Did It As Intended • You Studied the Unacceptable Results • Data • Conversation • You Act – Plan, Do, Study Again
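
The Plan, Do, Study, Act cycle above can be sketched as a small loop that repeats until results are acceptable. The callables, the 80% fidelity threshold, and the data source below are assumptions for illustration only.

```python
# Illustrative only: a Plan-Do-Study-Act loop that cycles until results pass.
def run_pdsa(plan, do, study, act, acceptable, max_cycles=5):
    """Repeat Plan-Do-Study-Act until results are acceptable or cycles run out."""
    for _ in range(max_cycles):
        plan()                  # Plan the test and how it will be measured
        data = do()             # Do it as intended
        results = study(data)   # Study the results (data and conversation)
        if acceptable(results):
            return results      # Good enough: stop cycling
        act(results)            # Act on what was learned, then cycle again
    return None

run_pdsa(
    plan=lambda: None,
    do=lambda: {"fidelity_rate": 0.72},          # hypothetical data collection
    study=lambda data: data["fidelity_rate"],
    act=lambda r: print(f"Adjusting supports; fidelity was {r:.0%}"),
    acceptable=lambda r: r >= 0.80,              # hypothetical target
)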

  31. “Get Started, Get Better”

  32. DSDS Improvement Cycles Three critical Improvement Cycles: • Usability Testing • Rapid Cycle Improvement Teams • Practice-Policy Communication Loops

  33. DSDS Rapid Cycle Problem Solving • Problem-solving during early efforts • Team Lead identified • Right people on the team • Time-limited to address the problem • Team disbands • Practice Improvement • Ongoing efforts to improve practices and competencies • Use data to achieve better outcomes for children and "embed" solutions

  34. Rapid Cycle Problem Solving Use of Data Quickly Identify: • Data needs • Potential indicators • Methods of assessment • Efficient analysis • Targeted strategies based on analysis, and how reassessment will occur quickly

  35. DSDS Improvement Cycles Three critical Improvement Cycles: • Usability Testing • Rapid Cycle Improvement Teams • Practice-Policy Communication Loops

  36. External Implementation Support: Practice-Policy Communication Cycle [diagram]. Practice informs policy and policy enables practice: policy sets the plan and structure, practice carries out procedures (Do), and feedback flows back up for Study and Act. Form supports function.

  37. Practice-Policy Communication Cycle [diagram]. Policy (Plan) enables practice (Do); data and feedback from practice flow back to inform policy (Study, Act).

  38. DSDS Practice to Policy Practice to Policy Communication Loops Use Data: • To understand what's happening during service delivery • To create hospitable organizations and supports for practice • To achieve results

  39. DSDS Practice to Policy Typically, Implementation Teams are the vehicle for this information: • Gather information from practitioners • Share results with leadership and practitioners • Communicate changes and responses bi-directionally

  40. Discussion Points Data Guided Decision-Making • How might you employ usability testing to stabilize your “what”? • What questions would you ask? • What would your targets be? • How would you collect and analyze the data? • Who would be responsible? • How are you making use of improvement cycles on a regular basis to ensure your infrastructure (the “how”) is supporting your innovation or program model (the “what”)? Supporting New Ways of Work

  41. Summary Data-based Decision Making Requires: • Thoughtful consideration of what's most important to measure • An efficient system for measuring data • Simple processes for analyzing and targeting strategies for improvement • Transparent and inclusive processes for communicating results of data and improvement strategies • Plans to celebrate strengths and successes • Teams that use stage-appropriate data-based decisions to make improvements through various types of cycles

  42. Stay Connected! Allison.metz@unc.edu @allisonjmetz nirn.fpg.unc.edu www.globalimplementation.org
