How Evaluation Can Make You Brilliant
Caroline Fiennes, www.giving-evidence.com, Twitter: @carolinefiennes


Presentation Transcript


  1. How Evaluation Can Make You Brilliant Caroline Fiennes www.giving-evidence.com Twitter: @carolinefiennes

  2. How Evaluation Can Make You Brilliant Or What I learnt about impact and evaluation from Roger Federer

  3. From J-PAL (www.povertyactionlab.org)

  4. Opportunity cost! From J-PAL (www.povertyactionlab.org)

  5. “I have been struck again and again by how important measurement is to improving the human condition. You can achieve amazing progress if you set a clear goal and find a measure that will drive progress toward that goal – a feedback loop. Without feedback from precise measurement, invention is “doomed to be rare and erratic.” With it, invention becomes “commonplace.” We need better measurement tools to determine which approaches work and which do not” - Bill Gates’ Annual Letter, Jan 2013

  6. ‘This report was triggered by a simple question: “Has our performance to date in achieving scale been good, average or poor when compared with our peers?” Given the lack of other published information around performance – including both success and failure – from peer organisations, this proved to be very difficult to answer. That is surprising given the billions of dollars managed by foundations’

  7. 7%! (Making an Impact: Impact Measurement Across Charities and Social Enterprises in the UK, NPC, October 2012)

  8. Doh! 7%! (Making an Impact: Impact Measurement Across Charities and Social Enterprises in the UK, NPC, October 2012)

  9. “There is a deep-rooted fear of finding out (or ‘being found out’) that one has not had the impact that was intended. Organisations are incredibly reluctant to admit that programmes have not gone to plan. Some simply do not tell funders the truth; others are very opaque when reporting back to funders; yet others cherry-pick clients to ensure low success rates are minimised. Lessons of ‘failure’ are rarely shared. When funders become aware that the desired results have not been achieved for whatever reason, they are seemingly equally reluctant to take constructive action, for fear of damaging the organisations’ (and possibly their own) reputations.” - Demonstrating Impact: Current Practice Amongst Social Purpose Organisations in the Republic of Ireland, The Wheel, 2011

  10. How evaluation can make you brilliant

  11. Gates Foundation’s approach: Are we helping her? Could we be helping her more?

  12. These are the data which indicate whether we’re on the right track & how we can improve

  13. What track do you want to be on? Alice went on: ‘Would you tell me, please, which way I ought to go from here?’ ‘That depends a good deal on where you want to get to,’ said the Cat. ‘I don’t much care where--’ said Alice. ‘Then it doesn’t matter which way you go,’ said the Cat.

  14. Impact = idea x implementation

  15. Impact = idea x implementation
      • Idea: How much does rail infrastructure contribute to GDP?
      • Implementation: How many trains did you run? Are your trains on time? How full are they? What do your passengers think of them?

  16. Impact = idea x implementation
      • Evaluation (impact eval): How much does rail infrastructure contribute to GDP? Independent; is social science research.
      • Monitoring (process eval): How many trains did you run? Are your trains on time? How full are they? What do your passengers think of them? By the operating entity; on-going.

  17. Do we know whether this idea works?
      • Yes: crack on (do / fund the thing if it works; don’t if it doesn’t).
      • No: find out, via a rigorous evaluation* (*and publish the answer).

  18. From The Lancet, a central tenet of good clinical research: Ask an important question and answer it reliably

  19. Impact = idea x implementation R&D, not M&E

  20. • What if n=1 / can’t have a control, e.g., a national campaign, legislative change, climate?
      • What if it’s an innovative idea?
      • Who should gather the data? Which data? Who should gather what & why? What problems with charity-generated data?
      • Proportionality: should M and/or E be proportional to the size of the org and/or the grant?

  21. In reports from charities, we found “some, though relatively few, instances of outcomes being reported with little or no evidence to back this up”. [Chart: designation of the quality of charities’ data in their reports, rated good / OK / poor]

  22. Predictable problems with data from charities:
      • Not independent
      • Marketing
      • Incentivised to make it look great
      • Not skilled at analysis design / data collection
      • No incentive: data often not useful to them
      • Poor data quality

  23. Do we know whether this idea works?
      • Yes: crack on (do / fund the thing if it works; don’t if it doesn’t).
      • No: find out, via a rigorous evaluation* (*and publish the answer).

  24. “We need better measurement tools to determine which approaches work and which do not” - Bill Gates’ Annual Letter, Jan 2013
      • See what exists already: don’t duplicate
      • Collect what you need
      • Collect it properly to properly answer the question
      • Don’t collect what you don’t need

  25. Caroline Fiennes www.giving-evidence.com Twitter: @carolinefiennes

  26. Example RCT (back-to-work programme) Source: Test, Learn, Adapt: Cabinet Office, July 2012

  27. RCT of two educational interventions Source: Test, Learn, Adapt: Cabinet Office, July 2012
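
To make the mechanics concrete, here is a minimal sketch in Python of what trials like these estimate. It is a hypothetical illustration: the function, parameters, and numbers are made up, not figures from Test, Learn, Adapt. Participants are randomly assigned to arms, outcome rates are compared, and the difference between arms is the estimated impact.

```python
# Hypothetical illustration of RCT logic; all names and numbers are made up.
import random

random.seed(42)  # reproducible fake data

def run_trial(n=1000, base_rate=0.30, treatment_lift=0.08):
    """Simulate a back-to-work trial; return (control_rate, treatment_rate)."""
    arms = {"control": [], "treatment": []}
    for _ in range(n):
        arm = random.choice(["control", "treatment"])  # the randomisation step
        p = base_rate + (treatment_lift if arm == "treatment" else 0.0)
        arms[arm].append(1 if random.random() < p else 0)  # 1 = e.g. found work
    return (sum(arms["control"]) / len(arms["control"]),
            sum(arms["treatment"]) / len(arms["treatment"]))

control_rate, treatment_rate = run_trial()
print(f"control: {control_rate:.1%}, treatment: {treatment_rate:.1%}, "
      f"estimated impact: {treatment_rate - control_rate:.1%}")
```

Randomisation is what makes the comparison fair: with large enough arms, the only systematic difference between them is the intervention itself.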

  28. Using external evidence for selection [flowchart]:
      1. Decide areas of focus & type of partner. A suitable prospective partner in that area appears.
      2. Has it done a proper needs assessment? (If no, get it to do one.)
      3. Has that assessment found a need for its proposed ‘idea’? If no, don’t fund it. If yes:
      4. Question to answer: is reliable 3rd-party evidence of the idea available? Process: go look in the databases / cupboard of knowledge.
      5. Five possible outcomes:
      • Evidence clear that the idea doesn’t work, ever: don’t fund it.
      • Evidence clear that the idea does work in passably similar circumstances: assess implementation ability. Good: fund it, then monitor it / process eval. Poor: don’t fund it.
      • Evidence clear that the idea does work in some circumstances, but not yet tested here: consider funding a ‘replication study’ (tests it here)*.
      • Evidence is inconclusive: consider funding a primary (impact eval) study and/or a meta-analysis of existing research*.
      • No evidence available: consider funding a primary (impact eval) study*.
      6. Publish what you find!
      *Spend proportionate to size / usefulness of the knowledge gap. Get a useful sample size, etc.
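
Purely as an illustration, the core branches of this flowchart can be written as a small decision function. The function name, the Evidence labels, and the return strings below are my paraphrase of the slide, not part of the deck.

```python
# Hypothetical paraphrase of the slide-28 flow; names are mine, not the deck's.
from enum import Enum

class Evidence(Enum):
    DOESNT_WORK = "clear that the idea doesn't work, ever"
    WORKS_SIMILARLY = "clear that it works in passably similar circumstances"
    WORKS_ELSEWHERE = "works in some circumstances; not yet tested here"
    INCONCLUSIVE = "inconclusive"
    NONE = "no evidence available"

def funding_decision(found_need: bool, evidence: Evidence,
                     implementation_good: bool) -> str:
    if not found_need:                      # needs assessment found no need
        return "don't fund it"
    if evidence is Evidence.DOESNT_WORK:
        return "don't fund it"
    if evidence is Evidence.WORKS_SIMILARLY:
        # only on this branch does implementation ability decide the outcome
        return ("fund it, then monitor it / process eval"
                if implementation_good else "don't fund it")
    if evidence is Evidence.WORKS_ELSEWHERE:
        return "consider funding a replication study; publish what you find"
    if evidence is Evidence.INCONCLUSIVE:
        return ("consider funding an impact evaluation and/or a meta-analysis "
                "of existing research; publish what you find")
    return "consider funding a primary impact evaluation study; publish what you find"

# Example: strong evidence from similar circumstances, good implementer
print(funding_decision(True, Evidence.WORKS_SIMILARLY, True))
```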

  29. What even is a foundation’s impact? Part 1
      An org / programme’s impact = the change attributable to the org / programme = the difference between what happened with the org / programme and what would have happened anyway
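
In symbols, this is the standard counterfactual definition; a minimal sketch (the notation is mine, not from the slide):

```latex
% Y(1): outcome with the org / programme; Y(0): outcome without it.
% Notation is mine, not from the slide.
\[
  \text{Impact} \;=\; \mathbb{E}[Y(1)] \;-\; \mathbb{E}[Y(0)]
\]
% Y(0) is never observed for those who got the programme, which is why
% the RCTs on slides 26-27 estimate it from a randomised control group.
```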

  30. [Slide content from It Ain’t What You Give, It’s The Way That You Give It; image not preserved in this transcript]

  31. [Slide content from It Ain’t What You Give, It’s The Way That You Give It; image not preserved in this transcript]

  32. [Slide content from It Ain’t What You Give, It’s The Way That You Give It; image not preserved in this transcript]

  33. ‘From 1987 to 1999, Paul Brest was the dean of Stanford Law School. A day hardly went by when students, faculty or alumni didn’t tell him what he was doing wrong – and at least once in a while they were right. Then in 2000, he became president of the Hewlett Foundation [a large US grant-making foundation]. By all external signals, within a matter of months, he underwent a personal transformation and achieved perfection.’ - Paul Brest and Hal Harvey

  34. The six questions which any good charity can answer: Impact = idea x implementation
      Idea:
      • What’s the problem you’re trying to solve?
      • What activities does the organisation do?
      • What evidence is there that those activities help solve the problem?
      Implementation:
      • How do you find out whether you are achieving anything? (i.e., what is the research process?)
      • What are you achieving? (i.e., what results does that process produce?)
      • How are you learning and improving? What examples do you have of learning and improving?

  35. Key research question: What do you want to find out?

  36. The goals are not:
      • To get tons of data
      • To create tons of work
      • To ask for every conceivable thing
      • To count your total beneficiaries
      • To count any other random thing

  37. “Non-profit CEOs spend huge amounts of time – sometimes as much as half their time – dealing with funders. These leaders have incredible ideas about solving important issues such as child literacy… but they can’t focus on their work because of the constant demands of donors”
