
Evaluating organisational context

Explore the use of realistic evaluation to study the effects of organisational context on knowledge utilisation (KU) and its uptake at the grassroots level. Discover the key concepts of CMOs (Context, Mechanism, Outcomes) and their application in evaluating KU and organisational context.


Presentation Transcript


  1. Evaluating organisational context. Measurement approaches: evaluation. David Dunt, Program Evaluation Unit, University of Melbourne

  2. The contribution of realistic evaluation to studying the effects of organisational context on knowledge utilisation (KU) and its take-up at the grassroots level

  3. For me, realistic evaluation is not a new set of ideas but a name for an approach I had progressively discovered during my career as an evaluator (‘that’s what it is’) • My outline is not so much an exposition of a realist theory of science • but a description of this path of discovery • as it encapsulates the path to enlightenment for many other health practitioners interested in KU, particularly after their discovery of evidence-based medicine and healthcare

  4. Outline the main ideas of realistic evaluation, particularly the key concepts of CMOs • Context • Mechanism • Outcomes • Outline applications to KU and organisational context

  5. Outcomes movement: point of departure. The randomised controlled trial: Treatment / Program (the ‘thing’) → Outcome

  6. Outcomes movement: the first problem. In the randomised controlled trial, Treatment / Program (the ‘thing’) → Outcome, the treatment or program content is a black box

  7. With a black box • we don’t know which element of the program was responsible for success or failure • and we cannot tell whether program failure was due to model failure or to implementation failure (of a good model)

  8. This distinction is fundamental to evidence-based decision making, where effective and ineffective programs must be correctly identified

  9. This points to the importance of studying process • and of using program logic

  10. Program logic • is a concept much loved by evaluators • is similar to management flowsheets • specifies successive stages (chronological, causal) in program rollout • uses key performance indicators (KPIs) to assess whether each stage has been successfully reached • gathers relevant information at each stage on why the stage has been reached (or not) • but we need to move beyond common-sense models of causation and chronology (see the sketch below)
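To make the idea of staged program logic with KPIs more concrete, here is a minimal sketch, not from the presentation, of how rollout stages and their KPI thresholds might be represented; the stage names, indicators, and targets are invented purely for illustration.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Stage:
    """One chronological/causal stage in the program rollout."""
    name: str
    kpi: str                          # key performance indicator for this stage
    target: float                     # threshold counted as "stage reached"
    observed: Optional[float] = None  # value gathered during monitoring

    def reached(self) -> bool:
        return self.observed is not None and self.observed >= self.target

@dataclass
class ProgramLogic:
    stages: List[Stage] = field(default_factory=list)

    def report(self) -> None:
        for stage in self.stages:
            status = "reached" if stage.reached() else "not yet reached"
            print(f"{stage.name}: {stage.kpi} = {stage.observed} "
                  f"(target {stage.target}) -> {status}")

# Hypothetical rollout stages for a guideline-implementation program
logic = ProgramLogic([
    Stage("Recruit practices", "practices enrolled", target=20, observed=22),
    Stage("Train clinicians", "% of clinicians trained", target=80, observed=65),
    Stage("Change practice", "% of consultations following guideline", target=50),
])
logic.report()
```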

  11. Program theory: Mechanisms • Instead, program logic should be based upon a program theory in order to identify optimal program content, including all its components and their relationships (stages) • that is, the application of a Mechanism or causal model based on the best available information (literature, experts, stakeholders)

  12. Program theory: Context • Program logic should also be based upon knowledge of the local Context (analysis of barriers and risks, e.g. negotiating key gatekeepers, usual work patterns, and usual organisational practice)

  13. In summary • in program development, there should be an analysis of how the proposed Mechanism will operate in the local Context, to ensure a program with maximum functionality in that context and the best achievable Outcomes • in program implementation, there should be monitoring of the important stages of the program’s rollout to ensure they are occurring (a sketch follows)
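As a rough illustration of the Context-Mechanism-Outcome reasoning summarised above, the sketch below records hypothetical CMO configurations for a knowledge-utilisation program; the contexts, mechanisms, and expected outcomes are invented, not taken from the presentation.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class CMOConfiguration:
    """One realist hypothesis: in Context C, Mechanism M produces Outcome O."""
    context: str
    mechanism: str
    expected_outcome: str
    observed_outcome: Optional[str] = None  # filled in during evaluation

# Invented configurations for a hypothetical guideline-uptake program
configurations = [
    CMOConfiguration(
        context="Teaching hospital with an established audit culture",
        mechanism="Academic detailing delivered by a respected local opinion leader",
        expected_outcome="Increased uptake of the guideline by junior clinicians",
    ),
    CMOConfiguration(
        context="Small rural practice with no dedicated quality-improvement staff",
        mechanism="The same academic detailing visits",
        expected_outcome="Little change unless practical support is also provided",
    ),
]

for cmo in configurations:
    print(f"Context: {cmo.context}\n  Mechanism: {cmo.mechanism}\n"
          f"  Expected outcome: {cmo.expected_outcome}")
```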

  14. A second problem for the Outcomes movement is embedded in this analysis • Context is assumed, and • the context of EBM studies is typically quite atypical • teaching hospitals • atypical patients chosen for ease of study (no co-morbidities, English-speaking, no mental impairment, and with unrepresentative social and demographic characteristics)

  15. This, too, is fundamental to evidence-based decision making, where effective and ineffective programs must be correctly identified

  16. Realistic evaluation in KU • A focus on Context and Mechanism is already occurring to some extent • e.g. the Cochrane EPOC (Effective Practice and Organisation of Care) group • examples: academic detailing, use of opinion leaders

  17. Realistic evaluation in KU • There is still little program theory (psychological, sociological [systems], or economic) • as confirmed in a recent UK review by Grimshaw et al. • there is instead a reliance on atheoretical, tick-box, questionnaire-based studies of behaviour change • program logic approaches are notably absent

  18. Realistic evaluation: measuring organisational context and change (1) • The challenge is first to build program theory • applying Mechanisms, using existing theory and evidence, to • local organisational Contexts, using best local knowledge

  19. Realistic evaluation: measuring organisational context and change (2) • Generally, the aim is to build heuristic models to assist in thinking about program development and implementation • not to construct a causal model by, e.g., path analysis, with each path’s contribution measured by a correlation coefficient (illustrated below)
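For contrast, here is a minimal sketch, again not from the presentation, of what measuring each path’s contribution looks like in the simplest case: a standardised regression coefficient estimated on simulated data. The variable names and effect sizes are invented purely to illustrate the technique.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Simulated data for a toy two-step path model:
# local knowledge -> GP integration -> care outcome
knowledge = rng.normal(size=n)
integration = 0.6 * knowledge + rng.normal(scale=0.8, size=n)
outcome = 0.4 * integration + rng.normal(scale=0.9, size=n)

def standardised_path(x: np.ndarray, y: np.ndarray) -> float:
    """Standardised regression (path) coefficient for a single predictor;
    with one predictor this equals the correlation coefficient."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    return float(np.linalg.lstsq(x[:, None], y, rcond=None)[0][0])

print("knowledge -> integration:", round(standardised_path(knowledge, integration), 2))
print("integration -> outcome:  ", round(standardised_path(integration, outcome), 2))
```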

  20. There are a few exceptions where it may be important to build causal models • for very important and central dimensions of the health care system

  21. A few exceptions where it may be important to build causal models • I finish briefly with an example from my own work in the Australian health care system • the development of a psychometric tool to measure the level of integration of the GP with the remainder of the health care system

  22. GP integration with the health care system (1) • A 70-question inventory completed by GPs about their current activities relevant to integration • 9 integration factors organised around 2 higher-order factors • patient care management and • community health • plus 5 enabling factors (a scoring sketch follows)
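As a toy illustration of how such an inventory might be scored into factor and higher-order factor scores, here is a sketch with a made-up item-to-factor mapping; the real instrument’s 70 items and 9 factor names are not reproduced here, and the names below are invented.

```python
import numpy as np

# Made-up item-to-factor mapping: the real inventory has 70 items loading on
# 9 integration factors; only a toy subset with invented names is shown.
factor_items = {
    "care_planning":           [0, 1, 2],   # indices into the response vector
    "discharge_liaison":       [3, 4],
    "local_service_knowledge": [5, 6, 7],
}
higher_order_factors = {
    "patient_care_management": ["care_planning", "discharge_liaison"],
    "community_health":        ["local_service_knowledge"],
}

def factor_scores(responses: np.ndarray) -> dict:
    """Unit-weighted scoring: mean of the items belonging to each factor."""
    return {f: float(responses[idx].mean()) for f, idx in factor_items.items()}

def higher_order_scores(scores: dict) -> dict:
    return {h: float(np.mean([scores[f] for f in fs]))
            for h, fs in higher_order_factors.items()}

# One GP's invented responses on a 1-5 scale for the eight toy items
gp_responses = np.array([4, 5, 3, 2, 4, 5, 4, 3])
scores = factor_scores(gp_responses)
print(scores)
print(higher_order_scores(scores))
```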

  23. GP integration with the health care system (2) • Two key findings • it is possible to map levels of GP integration across Australia, and considerable variation exists • levels relate strongly to GPs’ knowledge of local services and resources

  24. GP integration with the health care system (3) • Further details are a story for another day
