It’s the learning, not the result which counts most in evaluation


Presentation Transcript


  1. It’s the learning, not the result which counts most in evaluation
     Randall Pearce, THINK: Insight & Advice
     The 7th Australasian Better Boards Conference, 5 July 2013

  2. Evaluation – What is it?
     • ‘A systematic way of answering questions about projects, policies and programs’
     • Is it needed or worthwhile?
     • Is it having an effect?
     • At what cost?
     • How could it be improved?
     • Are there better alternatives?
     • Are there any unintended consequences?

  3. NFP Evaluation – What it isn’t

  4. Who evaluates?

  5. Why do (or don’t) they evaluate? Source: New Philanthropy Capital

  6. What do they gain? Source: New Philanthropy Capital

  7. Who should evaluate?

  8. When to conduct evaluation? Source: K Roberts (adapted from Owen and Rogers, 2006)

  9. Dispelling myths
     • A theory of change?
     • Not needed, because the evaluator will reconstruct the logic of the actual program, not the theoretical model:
        • Foundational activities
        • Activities
        • Outputs
        • Immediate outcomes
        • Intermediate outcomes
        • Long-term outcomes
        • Organisational goals

  10. Dispelling myths
     • A mountain of data?
     • Most data is just information… we are looking for insight into what it means
     • Historical data is more valuable than a mountain of current data
     • Your evaluator should identify the few ‘dashboard’ measures you will need for evaluation
     • Once an evaluation has been conducted, you can use the dashboard forever

  11. Dispelling myths
     • A wad of cash?
     • Think of what is at stake versus the internal budget allocation – any activity with a value in excess of $200K should be evaluated
     • Governments and foundations often allow for 10% to be spent on evaluation ($20K on a $200K program)
     • There are many ways that NFPs can reduce the cost of evaluations

  12. Using the results of evaluation
     • Share them… as widely as you can
     • Some evaluators will agree to write a summary that protects the egos of those involved
     • Action learning/research is a participative approach based on a four-part cycle: taking action, measuring the results of the action, interpreting the results, and planning what change needs to take place for the next cycle of action (see the sketch after this list)
     • The best projects conclude with a Summit workshop
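As a rough illustration of that four-part cycle, here is a minimal sketch in Python. Everything in it is hypothetical: the function names, the measures and the figures are illustrative stand-ins, not anything from the presentation.

```python
# A purely illustrative action-learning loop. The four numbered steps
# follow the slide; the stub functions and figures are invented.

def take_action(plan):
    return f"executed: {plan}"

def measure(action):
    # Stand-in for real measurement (surveys, analytics, financials).
    return {"reach": 1000, "conversions": 50}

def interpret(results):
    rate = results["conversions"] / results["reach"]
    return "conversion rate is low" if rate < 0.10 else "on target"

def plan_change(insight):
    return f"revised plan addressing: {insight}"

plan = "initial campaign plan"
for cycle in range(1, 4):
    action = take_action(plan)    # 1. take action
    results = measure(action)     # 2. measure the results of the action
    insight = interpret(results)  # 3. interpret the results
    plan = plan_change(insight)   # 4. plan the change for the next cycle
    print(f"cycle {cycle}: {insight}")
```

The point is the shape, not the stubs: each cycle ends by feeding what was learned back into the next round of action.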

  13. Beyond program impact evaluation

  14. Learning along the way
     • Documentation
        • Documents successes and failures
        • Summarises key documents in one place
        • Provides a timeline/sequence of events
        • Isolates key measures for the future
        • Supports performance appraisal for staff and board
        • Helps orient staff, volunteers and contractors

  15. Learning along the way
     • Full cost accounting
        • Full costs and expenses need to be calculated to arrive at the true financial picture (a minimal sketch of the sum follows this list)
        • Need to include:
           • Budget allocation
           • Cash donations
           • In-kind services
           • Pro-bono services
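The arithmetic itself is simple; the discipline is in counting the non-cash lines. A minimal sketch of the calculation, using purely illustrative figures (none of these numbers come from the presentation):

```python
# Full cost accounting: the true cost of a program is the sum of all
# inputs, cash and non-cash alike. All figures below are illustrative.
costs = {
    "budget_allocation": 150_000,  # internal budget line
    "cash_donations":     40_000,  # donated cash applied to the program
    "in_kind_services":   25_000,  # donated goods/services at fair value
    "pro_bono_services":  15_000,  # professional time at market rates
}

true_cost = sum(costs.values())
print(f"True program cost: ${true_cost:,}")  # True program cost: $230,000
```

Leaving out the in-kind and pro-bono lines would understate this program's cost by $40,000, which is exactly the distortion full cost accounting is meant to remove.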

  16. Learning along the way
     • Full value assessment
        • Captures all non-financial outputs in addition to financial information
        • For example, social media produces a host of measures, but there are no financial equivalents as there are in traditional media (e.g. TARPs)
        • Need to identify data sources for year-on-year comparison in future

  17. Learning along the way
     • Organisational behaviour and governance
        • Qualitative research reveals issues around organisational behaviour and governance which can affect outcomes
        • Project governance can be examined independently of personalities to pinpoint areas for change/improvement

  18. Learning along the way
     • Relationship building
        • The evaluation process has been described as ‘cathartic’ by key players
        • Helps defuse tensions that build up during a campaign
        • Gives stakeholders a voice and builds goodwill for the future
        • Aids communication ‘across the political/media divide’

  19. Over to you… Questions

  20. For more information, contact:
     Randall Pearce
     +61 2 9358 6664
     randall.pearce@thinkinsightadvice.com.au
     NOTE: For a copy of this presentation, please provide your business card at the end of the session or email the address above.
