
A Tale in 4 Acts: Sustaining Evidence-Based & Promising Practices


Presentation Transcript


  1. A Tale in 4 Acts: Sustaining Evidence-Based & Promising Practices. Phyllis C. Panzano, University of South Florida / Decision Support Services. Global Implementation Conference, August 15-17, 2011, Washington, DC.

  2. Primary Goal • To explore the question of what accounts for the long-term ‘success’ of healthcare provider organizations’ efforts to implement evidence-based and promising practices (EBPPs)

  3. Acknowledgments. IDARP: Innovation Diffusion & Adoption Research Project, supported by ODMH & the MacArthur Network on MH Policy. • ODMH: Mike Hogan, Director; Dee Roth & staff; Kraig Knudsen & staff • MacArthur Network: Howard Goldman, MD; Pamela Hyde, JD; & colleagues • Ohio State University: Robert Billings; Paul Nutt; Robert Backoff • CCOEs: Patrick Boyle; Patrick Kanary; William Rubin

  4. Four Acts: Take-home Messages? Timeline (2001-2009): IDARP start and EBPP Initiative Kickoff, followed by four research contacts. Act 1: Adoption; Act 2: Climate; Act 3: De-Adoption; Act 4: Sustained Use.

  5. Outline • Research Context • Cross-cutting Assumptions and Paradigms • IDARP: Metaphors, Methods, and Findings • Act 1: The adoption decision: A risk-based model • Act 2: Implementation success and project climate • Act 3: The de-adoption decision: Fit • Act 4: Sustainability and Implementation Success • Reflections & Wrap-up

  6. Research Context

  7. ODMH’s Quality Agenda: Best Practices, QI, Outcomes

  8. Key Factors Considered by Policymakers: Evidence; Political Salience

  9. Initial 8 Coordinating Centers of Excellence (CCOEs) • Evidence: SAMI-IDDT; MST; Family Psycho-education; Cluster-Based Planning • Political Salience: OMAP; MH/Criminal Justice; MH/Schools; Advance Directives

  10. Structure of Coordinating Centers of Excellence (CCOE) • University or local partnership • One Best Practice per CCOE • Statewide service area

  11. Role of CCOEs • Promotion of Best Practices (voluntary adoption) • Training, TA, Problem Solving • Capacity Development • Fidelity measurement • Cross-system sharing

  12. Innovation Diffusion & Adoption Research Project (IDARP)

  13. Major Research Questions: • What factors explain the adoption of Best Practices (EB & PPs) by behavioral healthcare organizations (BHOs) in this system context? • What factors explain implementation success among adopter BHOs in this system context?

  14. Cross-cutting Assumptions & Paradigms

  15. Context as Caveat • Context impacts: • Particular questions posed • Theory & paradigms seen as most relevant • Generalizability of findings • ODMH context: • Statewide mental health system initiative • Voluntary adoption • Promoting a specific set of "best" practices • "TA" resource: CCOEs • Key factors potentially impacting relevance

  16. Innovations & Evidence • Focal EB & PPs = innovations • Internal standards: seen as ‘new’ by the relevant unit of adoption regardless of newness in the marketplace (e.g., Zaltman, 1973; Nord & Tucker, 1987)… and • External standards: recency (the state of the art) or “differentness” (significant departures from status quo, prior forms) (e.g., Rye & Kimberly, 2007) • Scientific evidence and adoption? It may be neither necessary nor sufficient (e.g., Abrahamson, 1998; Denis et al, 2002; Duffy & Farley, 1992)

  17. Strategic Decisions • Innovation adoption & implementation decisions are organizationally important, consequential, and merit attention by top management: "At the level of individual [healthcare] provider organizations, the direct and indirect costs of acquiring and implementing [healthcare] innovations are often substantial, and the adoption and use of innovations typically present highly consequential financial and managerial challenges" (Rye & Kimberly, 2007, p. 236) • The strategic value of an innovation to an organization: • May be a function of economic concerns (e.g., effectiveness), sociological concerns (e.g., legitimacy, power), or both (e.g., Abrahamson, 1991; Denis, 2002; Kennedy, 2010) • Is likely to vary across organizations… and even within an organization over time (e.g., Helfrich, 2007; Klein & Sorra, 2001; Rye & Kimberly, 2007)

  18. Organizations Think • Organizations do not have mechanisms separate from individuals to set goals, perceive the environment etc. • Top management team (TMT) members interpret developments and form organizational-level interpretations which set the stage for organizational decisions & actions (e.g., innovation adoption decisions) • The organizational interpretation process is something more than individual processes because organizations have missions, stakeholders, histories, memories and intelligence that are preserved although individual managers may come and go. (e.g., Daft & Weick, 1984; Hambrick & Mason, 1984; Kennedy & Fiss, 2010)


  20. Interpretations Depend on Vantage Point. Diagram: System Developments feed the Strategic Issue Agenda as interpreted from three vantage points: the Upper Echelon, the CCOE, and the Focal Subunit (e.g., Mathieu & Chen, 2011)

  21. How do you spell S-U-C-C-E-S-S? • Multiple ways to define and measure implementation success • Decision-to-adopt (e.g., Damanpour, 1991; Rye & Kimberly, 2007) • Fidelity/strategic accuracy of use (e.g., Dusenbury) • Outcomes (objective & perceived) • Duration of use/continued use (e.g., Hickson, 2003; Nutt, 2004; Kennedy & Fiss, 2010) • Decision or implementation stage achieved (e.g., Chamberlain, 2010; Meyer & Goes, 1988; Yin, 1979) • Meaningful measures of implementation success (with implications for sustained use) may be conceptually different for EB & PPs compared to other innovations. (e.g., Fichman, 2005; Real and Poole, 2005)

  22. The 4 Acts:Methods, Metaphors, & Findings

  23. The Four Acts. Timeline (2001-2009): four research contacts. Act 1: Adoption (Research Question 1); Acts 2-4: Climate, De-Adoption, Sustained Use (Research Question 2)

  24. Design, Methods, Measures • Observational field study; longitudinal; up to 4 contacts • Focal practices: 4 of the 8 practices/CCOEs • Primary = IDDT & MST • Secondary = OMAP & CBP • Chosen to maximize variability on key innovation attributes • Recruitment: each CCOE provided key organizational contacts to the research team; voluntary participation by CCOEs and organizational participants (participation ranged from >90% to 75% across the time period) • Methods: interviews (structured, process reconstruction protocol); surveys (organization & CCOE; forms varied by role, project status, and time period); archival data

  25. Design, Methods, Measures • Interview questions & survey measures: theory-based; adapted from existing protocols and scales; needed to be generic to accommodate the study of multiple innovations • Key informants: top decision-makers (e.g., CEO, CCO, CFO), implementation managers, front-line implementers, CCOE project liaisons/consultants • Sample tracked: 84 adoption decisions (& decision processes) and 50+ implementation efforts followed for up to 4 contact points • Analyses: measurement reliability and agreement were prerequisites to aggregating for analyses; multiple sources of measures used when possible; measures from multiple time frames used when feasible
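The aggregation prerequisite above is commonly checked with a within-group agreement index such as rwg (James, Demaree & Wolf, 1984) before individual survey responses are rolled up to the organization level. A minimal Python sketch; the function, ratings, and cutoff are illustrative assumptions, not details reported by IDARP:

```python
import numpy as np

def rwg_single_item(ratings, n_options):
    """Within-group agreement for one survey item (James, Demaree & Wolf, 1984).

    Compares the observed variance of ratings to the variance expected
    if respondents answered uniformly at random across n_options.
    """
    expected_var = (n_options ** 2 - 1) / 12.0  # variance of a uniform null
    observed_var = np.var(ratings, ddof=1)
    return 1.0 - observed_var / expected_var

# Hypothetical example: five staff rate one climate item on a 1-5 scale.
ratings = np.array([4, 5, 4, 4, 5])
rwg = rwg_single_item(ratings, n_options=5)
print(f"rwg = {rwg:.2f}")

# A common (assumed) rule of thumb: aggregate only when rwg >= .70.
if rwg >= 0.70:
    org_level_score = ratings.mean()
```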

  26. Participation Across Contact Points (table not reproduced in this transcript) * Archival measures not included; most survey and interview questions adapted from existing measures and interview protocols

  27. Act 1: The Adoption Decision The Card Shark Model The decision to adopt depends on calculated risk; the size of your chip stack does matter! Some relevant topics and readings • Decision-making under risk (e.g., Sitkin & Pablo) • Prospect theory (e.g., Tversky & Kahneman) • Threat rigidity hypothesis (e.g., Staw, Sandelands, Dutton) • Unified Theory (e.g., Venkatesh et al) • Innovation adoption by healthcare provider orgs (e.g., Rye, 2007)

  28. A Decision Under Risk. Antecedents of success (decision stage): Perceived Risk of Adopting (−); Capacity to Manage or Absorb Risk (+); Risk-taking Propensity (+). Decision stages, in poker terms: RAISE = implementation underway; CALL = just decided to adopt; CHECK = still considering; FOLD = never will adopt. For more detail: Panzano and Roth, 2006

  29. TMT Perspective. Standardized paths to decision stage (implementation underway; just decided to adopt; still considering; never will adopt): Perceived Risk of Adopting = −.50; Capacity to Manage or Absorb Risk = .40; Risk-taking Propensity = .28. Panzano & Roth (2006), Psychiatric Services
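Standardized path coefficients like these are what a regression on z-scored variables yields. A minimal sketch with simulated stand-in data, treating the ordinal decision stage as continuous for simplicity; the variable names, effect sizes, and noise level are illustrative assumptions, not the IDARP dataset:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 84  # slide 25: 84 adoption decisions tracked

# Simulated antecedents and outcome (illustrative only).
perceived_risk = rng.normal(size=n)
capacity = rng.normal(size=n)
propensity = rng.normal(size=n)
decision_stage = (-0.50 * perceived_risk + 0.40 * capacity
                  + 0.28 * propensity + rng.normal(scale=0.7, size=n))

def z(x):
    return (x - x.mean()) / x.std(ddof=1)

# Regressing the z-scored outcome on z-scored predictors makes the
# OLS slopes beta weights, comparable to the paths on the slide.
X = sm.add_constant(np.column_stack(
    [z(perceived_risk), z(capacity), z(propensity)]))
fit = sm.OLS(z(decision_stage), X).fit()
print(fit.params[1:])  # betas for risk, capacity, propensity
```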

  30. A bit more…. Diagram elements: Capacity to Manage Risk; Perceived Risk; Decision Stage; Past Propensity to Take Risks

  31. ACT 1 take-home messages: (1) Success = the decision to adopt; (2) the decision is made in consideration of risk to the organization

  32. Act 2: Climate for Implementation The Dilbert Model Projects can rise and fall depending on how soundly they’re managed Some relevant topics: • Levels issues in climate research (e.g., Klein & Kozlowski, 2004) • Climate for implementation (e.g., Klein et al, 2001; Holahan, 2004; Helfrich, 2007) • Implementation drivers (e.g., Fixsen et al, 2009)

  33. Climate for Implementation • Definition: targeted employees' shared perceptions of the extent to which their use of a specific innovation is rewarded, supported and expected within their organization (Klein & Sorra, 1996, p. 1059) • Climate dimensions in BHO & CCOE surveys: • Top management support • Goal clarity • Dedicated resources • Access to training & TA • Rewards/recognition for implementing • Removal of obstacles • Performance monitoring • Freedom to express doubts

  34. Focal Entity for Climate for Implementation. Focal entity: the team. The definition of targeted employees for the measurement of climate for implementation is likely to vary by innovation

  35. Definitions of Implementation Success • Implementation effectiveness: the accurate, committed & consistent use of a practice by targeted employees (e.g., fidelity) • Innovation effectiveness: benefits that accrue to an organization and its stakeholders as a result of implementation (Klein & Sorra, 1996; 2001)

  36. CLIMATE AND SUCCESS. Path model: Climate for Implementation (Time 1) → Implementation Effectiveness (Time 2), path = .75; Implementation Effectiveness → Innovation Effectiveness, path = .45. For more details: Vaidyanathan, 2004; Panzano et al, 2005

  37. CLIMATE AND SUCCESS. The same model summarized: the Climate → Implementation Effectiveness and Implementation Effectiveness → Innovation Effectiveness paths are strongly positive (+++); the direct Climate → Innovation Effectiveness path is NS (non-significant)
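The pattern on slides 36-37 is a classic mediation structure: climate predicts implementation effectiveness, which in turn predicts innovation effectiveness, with no significant direct path. A minimal sketch of estimating such paths with two standardized regressions; the data are simulated to mimic the reported coefficients, not drawn from the study:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 50  # slide 25: 50+ implementation efforts followed

def z(x):
    return (x - x.mean()) / x.std(ddof=1)

# Simulated data roughly matching the slide's .75 and .45 paths.
climate = rng.normal(size=n)                                  # Time 1
impl_eff = 0.75 * climate + rng.normal(scale=0.66, size=n)    # Time 2, e.g. fidelity
innov_eff = 0.45 * impl_eff + rng.normal(scale=0.90, size=n)  # Time 2 benefits

# Path a: climate -> implementation effectiveness.
path_a = sm.OLS(z(impl_eff), sm.add_constant(z(climate))).fit()

# Path b plus the direct path: slide 37 reports the direct
# climate -> innovation effectiveness path as NS once
# implementation effectiveness is controlled.
path_b = sm.OLS(z(innov_eff), sm.add_constant(
    np.column_stack([z(impl_eff), z(climate)]))).fit()

print(f"a = {path_a.params[1]:.2f}")
print(f"b = {path_b.params[1]:.2f}, direct-path p = {path_b.pvalues[2]:.3f}")
```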

  38. ACT 2: • Practice-specific implementation climate is important • Implementation effectiveness (e.g., fidelity) → innovation effectiveness

  39. Act 3: De-Adoption Decision. The If-the-Glove-Still-Fits-Keep-Wearing-It Model: external and internal developments influence goodness-of-fit. Some relevant topics and readings: • Disengagement or abandonment of innovation (Rye & Kimberly, 2007) • Managerial fads and fashions: the diffusion and rejection of innovations (Abrahamson, 1991) • De-innovation (Kimberly & Evanisko, 1981)

  40. The De-Adoption Decision1 • Unanticipated • Exploratory: data from BHO key informant interviews & surveys • 12 matched pairs of practices/orgs • Compared sustainers with "de-adopters" 1For more detail: Massatti, Sweeney, Panzano & Roth, 2008
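With only 12 matched pairs, sustainer-versus-de-adopter differences on survey scales can be tested pairwise; a nonparametric signed-rank test is a cautious choice at that sample size. An illustrative sketch (all scores are invented, and the study itself may have used different tests):

```python
import numpy as np
from scipy.stats import wilcoxon

# Hypothetical "fit" scale scores for 12 matched pairs (1-5 scale).
sustainers = np.array([4.2, 3.9, 4.5, 4.0, 4.4, 3.8, 4.1, 4.6, 3.7, 4.3, 4.0, 4.2])
de_adopters = np.array([3.1, 3.5, 3.6, 3.0, 3.8, 3.2, 3.4, 3.9, 3.3, 3.5, 2.9, 3.6])

# Wilcoxon signed-rank test on the paired differences.
stat, p = wilcoxon(sustainers, de_adopters)
print(f"W = {stat}, p = {p:.4f}")
```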

  41. Key differences: Sustainers vs De-adopters • Better fit: • Compatibility with org mission & values • Support from external organizations to continue • Ongoing support from top management • Positive attitudes about practice among staff • Stronger project management & resources: • Access to TA during implementation • Availability of resources (current & projected) • Know-how and skill at implementing • More convincing effectiveness evidence

  42. Recap: Acts 1, 2 & 3 • ACT 1 (Adoption & org. risk): adopt when gains > losses; de-adopt when losses > gains • ACT 2 (Climate & effectiveness): stronger project climate → better execution & results • ACT 3 (De-adoption, fit & effectiveness): + fit, + results → sustain; − fit, − results → de-adopt

  43. Act 4: Sustained Use & Assimilation (the "glove fits" and "Dilbert" models combined)

  44. Contact 4: Sample & Methods • 43/44 sites still implementing (roughly 6 years, on average) • Success: assimilation • Informants: • BHOs: TMT members & EBPP managers • CCOEs: primary consultant/liaison • Methods: surveys (different forms); interviews (BHOs) • Projects represented across BHO + CCOE data: • 35 with organization survey and fidelity interview • 34 with CCOE surveys & fidelity data • 25 common/overlapping

  45. Sustainability Model: Core Elements

  46. Operational Measures: In CCOE & Organizational Surveys

  47. Operational Measures/Scales

  48. Major Bivariate Linkages in the Sustainability Framework • CCOE and Org r values shown have 1-tailed p < .01 • Absolute rs with Assimilation range from .43 to .77
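One-tailed correlation tests like those reported above can be computed directly. A minimal sketch with simulated data; the variable names and values are stand-ins, and scipy's `alternative` argument requires scipy >= 1.9:

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(2)
n = 35  # slide 44: 35 projects with organization survey + fidelity data

# Simulated predictor scale and assimilation scale (illustrative only).
predictor = rng.normal(size=n)
assimilation = 0.6 * predictor + rng.normal(scale=0.8, size=n)

# One-tailed (directional) test of a positive correlation.
r, p = pearsonr(predictor, assimilation, alternative="greater")
print(f"r = {r:.2f}, one-tailed p = {p:.4f}")
```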

  49. Assimilation • The extent to which an innovation is seen as or has become a regular part of organizational procedures. (e.g., Yin, 1979; Zmud & Apple, 1992)

  50. Organization & CCOE Scale Scores* for 25 Common Projects (table not reproduced in this transcript) * Color contrast indicates a significant difference: paired t-test, 2-tailed p < .05
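The footnoted comparison (organization versus CCOE ratings of the same 25 projects) is a paired design. A minimal sketch of the two-tailed paired t-test named in the footnote, on invented scores:

```python
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(3)

# Hypothetical scale scores from the two informant groups for the
# same 25 projects (values are invented for illustration).
org_scores = rng.normal(loc=4.0, scale=0.5, size=25)
ccoe_scores = org_scores - 0.3 + rng.normal(scale=0.4, size=25)

# Two-tailed paired t-test, as in the slide's footnote.
t, p = ttest_rel(org_scores, ccoe_scores)
print(f"t(24) = {t:.2f}, p = {p:.4f}")
```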
