
Thinking systemically: Seeing from simple to complex in impact evaluation


Presentation Transcript


  1. Thinking systemically: Seeing from simple to complex in impact evaluation Expert lecture for AfREA Conference March 30 – April 2, 2009 Cairo, Egypt Professor Patricia Rogers Royal Melbourne Institute of Technology, Australia Dr. Irene Guijt Learning by Design, the Netherlands Bob Williams Independent consultant, New Zealand (with thanks to Dr. Jim Woodhill, Wageningen International, the Netherlands)

  2. TODAY’S SESSION • Explore what thinking systemically is and how it relates to evaluation • Introduce a systems approach that we think has the potential to move IDE forward, plus give you something you can use in your own practice • Give you an opportunity to reflect and play with systems ideas and this method.

  3. SYSTEMS CONCEPTS IN EVALUATION AN EXPERT ANTHOLOGY Eds. Iraj Imam & Bob Williams

  4. http://www.iigss.net/gPICT.pdf

  5. THREE ELEMENTS OF THINKING SYSTEMICALLY • INTER-RELATIONSHIPS • PERSPECTIVES • BOUNDARIES

  6. INTER-RELATIONSHIPS Being deeply aware of their significance • Some inter-relationships matter more than others • Some only matter over time • Some are slower in their impact than others • Some are linear (A affects B), some are non-linear and recursive (A affects B, which in turn affects A) • Most critically, systems thinking focuses on the inter-relationships between ideas, assumptions and beliefs, as well as on the actions in traditional cause-and-effect chains.
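
To make the linear versus recursive distinction concrete, here is a minimal sketch in Python; the variables and coefficients are purely illustrative, not taken from the lecture:

```python
# A linear relation: A affects B, and that is the end of the story.
# A recursive relation: A affects B, and B feeds back into A over time.
a, b = 1.0, 0.0
for step in range(5):
    b = 0.5 * a           # A affects B
    a = a + 0.2 * b       # B feeds back into A (the recursive part)
    print(f"step {step}: A = {a:.2f}, B = {b:.2f}")
```

In the purely linear case the second assignment would be absent and A would never respond to B; the feedback is what makes the relationship unfold over time rather than settle immediately.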

  7. PERSPECTIVES Thinking systemically about perspectives is not the same as stakeholder analysis

  8. PERSPECTIVES

  9. PERSPECTIVES We all bring different perspectives to bear on anything we do. In this workshop I am handling four different perspectives: • A session where people learn something • Something that allows me to communicate my knowledge • A means of expressing friendship and support to colleagues • A way of enjoying myself. You cannot understand how I behave at this session unless you understand how I juggle these perspectives.

  10. BOUNDARIES

  11. BOUNDARIES Who or what is “in” and who or what is “out” • Purpose of the evaluation: how will you judge “success”? • Resources for the evaluation: what is not in your control? • What evidence is considered credible? Whose expertise is acknowledged, or ignored? • Whose or what interests are not being served by an evaluation?

  12. BOUNDARIES Thinking systemically requires you to do two things with those boundary decisions: • identify the consequences of boundary decisions • consider how to mitigate any negative consequences of those decisions.

  13. Tool for thought – the Cynefin Framework • Facilitates seeing situational diversity • Based on recognizing different types of cause-and-effect relations – a given situation will contain aspects of all • Draws on theories of: complexity, cognitive systems, narrative, networks • Developed by Dave Snowden – ex-IBM knowledge management researcher

  14. The Cynefin Framework – knowing what you are dealing with. [Framework diagram: two ordered domains (Simple, Complicated) and two unordered domains (Complex, Chaotic), with Disorder at the centre.] Source: Cognitive Edge (www.cognitive-edge.com)

  15. Ordered Domain – Simple (known) • Cause and effect: repeatable, perceivable, and predictable • Approach: Sense – Categorise – Respond • Methods: standard operating procedures, best practices, process reengineering. Source: Cognitive Edge (www.cognitive-edge.com)

  16. Evaluating the ‘simple’ • Simple aspects of a situation: causal links are tight, clearly observed and understood; key variables to assess can be determined • For evaluation: need to know the activities and some context; if the activity takes place, the outcomes are known (e.g. once a person receives the polio vaccination, the outcome is effectively guaranteed); monitoring is important – “sense, categorise, respond” • But need to guard against slipping into chaos

  17. Ordered Domain – Complicated (knowable) • Cause and effect: knowable with ‘expert’ input • Approach: Sense – Analyse – Respond • Methods: analytical/reductionist, results-based thinking, scenario planning, good practices. Source: Cognitive Edge (www.cognitive-edge.com)

  18. Evaluating the ‘complicated’ • Complicated aspects: less predictable, less self-evident, subject to some debate and discussion; there is usually an evidence base: ‘if A in relation to X under Y conditions, then Z is likely’ • For evaluation: an outcome hierarchy/results chain identifies the information needed to understand impact – the activities, the chain(s) of results and the assumptions that link them – plus contextual information to explain results; consulting ‘experts’ is key – “sense, analyse, respond”
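
As a minimal sketch of what such a results chain might look like when written down, here is an illustrative structure in Python; the programme, results and assumptions below are invented for the example, not taken from the lecture:

```python
# Illustrative results chain for a 'complicated' intervention.
# Every name here is hypothetical; the point is the structure:
# activities -> chain of results -> impact, with an assumption linking each step.
results_chain = [
    {"step": "activity",
     "description": "train community health workers",
     "assumption": "trainees are released from other duties to attend"},
    {"step": "intermediate result",
     "description": "households receive accurate health advice",
     "assumption": "the advice is trusted and acted upon"},
    {"step": "impact",
     "description": "reduced incidence of preventable illness",
     "assumption": "no major contextual shock (e.g. drought or conflict)"},
]

# Evaluating the 'complicated' means checking each link and its assumption,
# drawing on expert input and contextual information to explain the results.
for link in results_chain:
    print(f"{link['step']}: {link['description']} (assumes: {link['assumption']})")
```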

  19. Unordered Domain – Complex • Cause and effect: coherent in retrospect and do not repeat • Approach: Probe – Sense – Respond • Methods: pattern management, perspective filters, circular dialogue, emergent practice. Source: Cognitive Edge (www.cognitive-edge.com)

  20. Evaluating the ‘complex’ • Complex aspects of a situation: unpredictable in advance; no clear understanding of the chain of results; highly dependent on context and starting conditions • View the work as (a set of) experiments and figure out what ‘sticks’: “probe, sense, respond” • For evaluation: observe the activities and the possible results of those activities; context is fundamental, as is the pathway of decisions that leads to change; construct sense-making by drawing people into a dialectic

  21. Unordered Domain – Chaotic • Cause and effect: not perceivable • Approach: Act – Sense – Respond • Methods: stability-focused intervention, crisis management, novel practice. Source: Cognitive Edge (www.cognitive-edge.com)

  22. Evaluating the ‘chaotic’ • Chaos: totally unpredictable; no clear understanding of the chain of results • For evaluation: observe the context, prioritise needs, act, observe again; afterwards (if/when the situation stabilises), evaluate whether the best possible action under the circumstances was taken (real-time evaluation) • “act, sense, respond” – assess the worth of the “act” and its subsequent effects and follow-up actions
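
Pulling slides 14–22 together, the domain-to-approach mapping can be summarised as a simple lookup table. The sketch below is our own summary of the deck’s content; the data structure and function are illustrative, not part of the Cynefin material itself:

```python
# Summary of the Cynefin domains as presented in slides 15-22.
# The dictionary is ours; the content paraphrases the slides.
CYNEFIN = {
    "simple":      {"cause_effect": "repeatable, perceivable, predictable",
                    "approach": "sense - categorise - respond",
                    "evaluation": "monitor known outcomes of known activities"},
    "complicated": {"cause_effect": "knowable with expert input",
                    "approach": "sense - analyse - respond",
                    "evaluation": "results chain plus assumptions, consult experts"},
    "complex":     {"cause_effect": "coherent only in retrospect, does not repeat",
                    "approach": "probe - sense - respond",
                    "evaluation": "observe, treat as experiments, collective sense-making"},
    "chaotic":     {"cause_effect": "not perceivable",
                    "approach": "act - sense - respond",
                    "evaluation": "act first, then real-time evaluation of the action taken"},
}

def evaluation_stance(domain: str) -> str:
    """Return the suggested approach and evaluation emphasis for a domain."""
    d = CYNEFIN[domain.lower()]
    return f"{d['approach']} | {d['evaluation']}"

print(evaluation_stance("complex"))
```

The same situation will usually contain aspects of several domains, so the table is a heuristic for choosing an emphasis, not a classifier for whole programmes.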

  23. Implications for IE Practice • How do we understand what is happening? • How do we change what is happening?

  24. ‘Silver bullet’ simple impacts – the intervention is both necessary and sufficient to produce the impact. [Diagram: with the intervention there is impact; with no intervention, no impact.]

  25. ‘Jigsaw’ complicated impacts – the intervention is necessary but not sufficient to produce the impact. [Diagram: the intervention produces impact in a favourable context, but no impact in an unfavourable context.]

  26. ‘Parallel’ complicated impacts – the intervention is sufficient but not necessary to produce the impact. [Diagram: the intervention produces impact, but an alternative activity produces impact even without the intervention.]
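
To make the necessary/sufficient distinctions in slides 24–26 concrete, here is a minimal illustrative sketch in Python; the scenario names are ours and the logic simply restates the three diagrams:

```python
# Illustrative logic for slides 24-26: when does the intervention produce impact?

def silver_bullet(intervention: bool) -> bool:
    # Necessary and sufficient: impact if and only if the intervention happens.
    return intervention

def jigsaw(intervention: bool, favourable_context: bool) -> bool:
    # Necessary but not sufficient: impact needs the intervention AND a favourable context.
    return intervention and favourable_context

def parallel(intervention: bool, alternative_activity: bool) -> bool:
    # Sufficient but not necessary: the intervention works, but so does an alternative activity.
    return intervention or alternative_activity

print(silver_bullet(True), silver_bullet(False))       # True False
print(jigsaw(True, False), jigsaw(True, True))         # False True
print(parallel(False, True), parallel(False, False))   # True False
```

In evaluation terms, a simple with/without comparison is only straightforwardly informative in the ‘silver bullet’ case; the ‘jigsaw’ and ‘parallel’ patterns require attention to context and to alternative activities.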

  27. ‘Life is a path you beat by walking’ complex impacts. [Diagram: a branching web of successive plans (Plan A, B, B2, B3, C, D, D2, E, E2, F) whose intermediate results (at t, t+1, t+2) lead toward the impact/vision.]

  28. Conceptualising the intervention – a differentiated study of causality

  29. So what is ‘impact evaluation’?

  30. Take home messages • Seeing ‘ontological diversity’ in situations (and interventions) enables a more conscious, appropriate, methodologically mixed approach – it’s about being a good professional. • Cynefin is just a heuristic – a tool for thinking systemically. • Thinking systemically is about a deep understanding of inter-relationships, perspectives and boundaries. Boundary critique is the area where evaluation can learn most by drawing on the experience of the systems field.

  31. References
  • Eoyang, G. (2008) ‘So, what about accountability?’ http://www.cognitive-edge.com/blogs/guest/2008/12/so_what_about_accountability_1.php
  • Glouberman, S. and Zimmerman, B. (2002) Complicated and Complex Systems: What Would Successful Reform of Medicare Look Like? Commission on the Future of Health Care in Canada, Discussion Paper 8. Available at http://www.healthandeverything.org/pubs/Glouberman_E.pdf
  • Guijt, I. (2008) Navigating Complexity. Report of an Innovation Dialogue, May 2008. http://portals.wi.wur.nl/files/docs/Innovation%20Dialogue%20on%20Navigating%20Complexity%20-%20Full%20Report.pdf
  • Guijt, I. and Engel, P. (2009) ‘Nine Hot Potatoes: Current Debates and Issues in Results-Oriented Practice.’ Presentation for Hivos In-house Training on Reflection-oriented Practice.
  • Mackie, J. (1974) The Cement of the Universe. Oxford: Oxford University Press.
  • Mark, M.R. (2001) ‘What works and how can we tell?’ Evaluation Seminar 2, Victoria Department of Natural Resources and Environment.
  • Rogers, P.J. (2008) ‘Using programme theory for complicated and complex programmes.’ Evaluation: The International Journal of Theory, Research and Practice 14(1): 29–48.
  • Rogers, P.J. (2008) ‘Impact Evaluation Guidance. Subgroup 2.’ Meeting of NONIE (Network of Networks on Impact Evaluation), Washington, DC.
  • Rogers, P.J. (2001) Impact Evaluation Research Report. Department of Natural Resources and Environment, Victoria.
  • Ross, H.L., Campbell, D.T. and Glass, G.V. (1970) ‘Determining the social effects of a legal reform.’ In S.S. Nagel (ed.), Law and Social Change, pp. 15–32. Beverly Hills, CA: SAGE.
  • Williams, B. and Imam, I. (2007) Systems Concepts in Evaluation: An Expert Anthology. American Evaluation Association.
  • Woodhill, J. (2008) ‘The Cynefin Framework: What to do about complexity? Implications for Learning, Participation, Strategy and Leadership.’ Presentation for the ‘Navigating Complexity Workshop’, Wageningen International. http://portals.wi.wur.nl/files/docs/File/navigatingcomplexity/CynefinFramework%20final%20%5BRead-Only%5D.pdf
