
Impact Measurement and You





  1. Impact Measurement and You Understanding Impact Measurement, Practical Tools, Key Questions

  2. PI’s Paper on IM • Locates IM within the Programme Approach – therefore impact is measured at programme level. • Focuses on three key elements of impact measurement: • Are we achieving the high-level goals we set out to achieve [tracking change]? • Is our theory of change holding true [testing the ToC]? • What’s changing in the context that may influence the above [on-going contextual analysis]? • The paper seeks to provide a mix of the conceptual and the practical – but errs on the side of the conceptual.

  3. Some Key Principles • IM is reliant on (and can help build) a strong culture and processes for learning and knowledge management. • Prior to work on IM, COs need to have a clear vision and understanding of the Theory of Change for their long-term programmes. • IM focuses on contribution analysis at higher levels, drawn from strong M & E at lower levels. • IM requires strong internal knowledge, but also partnership with researchers, think tanks, collegial organisations, etc. • Emphasis has been placed on quantitative data and analysis, but we need to build more support for qualitative work as well.

  4. Elements of IM: a bit more detail on key focus areas

  5. Tracking Change • Understanding – and then quantifying – the changes expressed in the Domains of Change and Impact Goal of each programme. • High-level changes, which should show contribution to national and international standard indicators – e.g. the MDI plus list. • This is a process of contribution analysis – seeking to understand our (CARE + Partners) contribution to population-level changes.

  6. Testing the Theory of Change • The programme ToC is a set of hypotheses, which, if they hold true, should give rise to the kinds of social change articulated in the Domains of Change and Impact Goal. • Our hypotheses are based on evidence, but also on intuition. Some are foundational and, if they haven’t already been “proven”, will need to be – otherwise the ToC is a house of cards. • Hypotheses for testing are based on critical assumptions which, if they do not hold true, threaten the overall ToC. They inform programming strategy, rather than being a part of the strategy. • There are many ways to test key hypotheses: through new or existing initiatives; through targeted research; through literature reviews; etc.

  7. On-Going Context Analysis • The Theory of Change is located within evolving contexts, which must, on an on-going basis, be reviewed and integrated into the ToC and discussions about change. • This is an opportunity to integrate the Programme Approach, IM and EPP processes. • Timing will vary across COs – e.g. Egypt’s rapidly changing context will require closer monitoring and meaning-making than some other countries; the same is true for “emergency-prone” COs.

  8. Knowledge Systems and Sub-Systems (with thanks to Tom Barton and ECARMU) The importance of “holding systems”

  9. Sub-systems from Tom Barton

  10. DM&E System • Design: aims at the formulation (and on-going modification) of useful, significant efforts toward the alleviation of poverty and social injustice and the enhancement of human dignity. • M & E: the coordinated set of interlinked activities for gathering and analysing information, reporting, and supporting decision-making and the implementation of improvements. • Monitoring: the regular collection (plus analysis and use) of information about progress within the programme and its projects. • Evaluation: periodic reviews and reflective practice applied to information from within, as well as about, programmes and projects and their performance. Reflective practice refers to the process of challenging ourselves based on the information we get by asking key questions, e.g., why, so what, and now what?

  11. Knowledge Management System • Purpose: turning information into knowledge and evidence; making knowledge accessible. • Many COs have very weak and dispersed knowledge management systems – some innovative ideas: SharePoint sites, M & E and Learning Units, etc.

  12. Learning System • Purpose: to support on-going improvements in personal & organisational practice; track and reassess what’s “right.” • Learning Systems are based on: • Reflective Practice; • Communication; • Application. • Learning happens by doing, and by reading and engaging others.

  13. Synthesis • COs generate data and knowledge through monitoring systems; • This knowledge (and data) needs to be aligned to the relevant CO programme – and may form part of a learning agenda, or process toward exploring hypotheses; • Learning systems need to support the interpretation of data and the use of knowledge – at initiative and programme levels.

  14. Thoughts to Ponder

  15. General Questions • How best to create dialogue about IM and supporting systems? • What support is there for IM and supporting systems? • How is the link between IM and PPLA’s current work on knowledge management articulated? • In times of tight budgets, which are the most important processes/elements? • Where can a repository of good practice be housed?

  16. WE-IM Specific Emerging Questions • WE: Means to an end? End in itself? And is there space/appetite for this discussion? • Common hypotheses to test in multiple contexts - e.g. contribution of WE to other outcomes? Role of VSLA in building self-efficacy? Others? • Experimental pilots for measuring WE at impact level (Pathways, perhaps….?)

  17. Other thoughts, questions, ideas, missing points???
