

  1. Annual Evaluation Review: World Vision Australia’s journey. Lucia Boxelaar, Head, Program Research and Advisory. September 2012.

  2. • How the program design described it
  • What the assessment suggested was needed
  • How much sense the design made to program partners
  • What capacity we had to deliver
  • How sales & marketing described it to the donor
  • How the program was resourced and supported
  • What the children wanted
  • What the community really needed
  • How the program was documented
  So how do we know what’s really going on? What will the evidence tell us?

  3. Assumption 1 challenged: all projects and programs are evaluated at completion or at end of phase, and evaluation documents are readily available on Docstore (the WVA database).
  The reality:
  • After much searching of our systems, we worked out that a total of 222 WVA-funded projects and programs finished in 2009
  • Our information systems suggested that 82 of these had been evaluated
  • After six months of intense pursuit of National Offices (NOs), we could locate 46 reports
  And so our assessment could begin... or so we thought...
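  To make the scale of the reporting gap concrete, here is a minimal sketch (ours, not from the presentation) that computes the coverage rates implied by those counts; the stage labels are our own shorthand.

```python
# Sketch of the FY09 evaluation-coverage funnel described on this slide.
# The counts come from the slide; the stage labels are illustrative.

funnel = [
    ("Projects and programs finished in 2009", 222),
    ("Flagged as evaluated in our information systems", 82),
    ("Evaluation reports actually located", 46),
]

total = funnel[0][1]
for stage, count in funnel:
    print(f"{stage}: {count} ({count / total:.0%})")
```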

  4. Assumption 2 challenged: that most reports would meet scientific quality standards, i.e.:
  • That most reports adequately describe the methodology (sampling, data collection and analysis), so that we can assess the validity of the findings. In fact:
  - Rationale for the methodology was described in only 21 reports
  - Sampling strategy was defined in only 22 reports
  - Analysis methods were described in only 20 reports
  - Methodological limitations were adequately described in only 17 reports
  • That reports would base their findings on a representative sample and therefore have something valid to say about the whole community. In fact:
  - Even when the sampling strategy was well defined, there was often a lot of missing data, so even with a large sample, actual data often covered only a significantly reduced sample
  • That reports would describe the data on which their conclusions were based. In fact:
  - The data was often not adequately described, especially qualitative data
  • Only one report fully met the standards of scientific rigour
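  As an illustration of how such a checklist could be tallied (a sketch under assumed field names, not WVA’s actual system):

```python
# Hypothetical sketch: tallying the located reports against the four
# methodology criteria above. Field names and sample records are invented.
from collections import Counter

CRITERIA = ("rationale", "sampling", "analysis", "limitations")

def tally(reports):
    """Count how many reports satisfy each methodological criterion."""
    counts = Counter()
    for report in reports:
        counts.update(c for c in CRITERIA if report.get(c))
    return counts

# Two invented example reports:
reports = [
    {"rationale": True, "sampling": True, "analysis": False, "limitations": False},
    {"rationale": False, "sampling": True, "analysis": True, "limitations": True},
]
print(tally(reports))  # Counter({'sampling': 2, 'rationale': 1, 'analysis': 1, 'limitations': 1})
```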

  5. Therefore... we abandoned scientific rigour as an inclusion criterion for the evaluation review. Instead, we included all reports that offered a plausible description of change: one that would convince a development professional without a DME (design, monitoring and evaluation) or research background.

  6. Assumption 3 challenged: that we would be able to aggregate findings from the different reports.
  The reality: totally inconsistent use of indicators, poor sampling and the lack of proper statistical analysis all precluded statistical aggregation.
  Therefore we could not conduct a meta-analysis (which statistically aggregates data across studies); we could only do a review of evaluations.
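  For context, a minimal sketch of what statistical aggregation would have required: a standard fixed-effect, inverse-variance pooling only makes sense when every study reports the same indicator on the same scale, which is exactly the condition the reports did not meet. The effect sizes and variances below are invented.

```python
# Illustrative fixed-effect meta-analysis: the pooled effect is the
# inverse-variance weighted mean of the per-study effects. All numbers
# below are invented; pooling like this presupposes a consistent indicator.

studies = [
    (0.30, 0.04),  # (effect estimate, variance)
    (0.45, 0.09),
    (0.25, 0.02),
]

weights = [1 / variance for _, variance in studies]
pooled = sum(w * effect for (effect, _), w in zip(studies, weights)) / sum(weights)
print(f"Pooled effect: {pooled:.2f}")  # ~0.29
```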

  7. Assumption 4 challenged: that reports would describe the objectives of the project or program and assess its achievements against those objectives, so that we could count the projects and programs that achieved their objectives.
  The reality: many evaluation reports did not assess projects against the objectives set in the project design.
  Therefore we had to find a different framework against which to assess programs. And so we did...

  8. WVA’s analytical framework [diagram not reproduced; the framework is arranged by increasing complexity]

  9. Our findings over the last three years... The number of projects that observed the highest level of change has increased each year: 18 in FY09, 25 in FY10 and 34 in FY11.
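  For illustration, the year-on-year change implied by those counts (a trivial computation of ours, not part of the slides):

```python
# Year-on-year change in the number of projects observing the highest
# level of change, computed from the counts quoted on this slide.
counts = {"FY09": 18, "FY10": 25, "FY11": 34}

years = list(counts)
for prev, curr in zip(years, years[1:]):
    delta = counts[curr] - counts[prev]
    print(f"{prev} -> {curr}: {delta:+d} projects ({delta / counts[prev]:+.0%})")
```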

  10. And now for the good news... We are heading in the right direction... fast!

  11. Significant improvement in evaluation methods used

  12. Evaluation approach

  13. Some new additions this year... an assessment of the prevalence of good practice in three sectors: Maternal, Newborn and Child Health (MNCH), Child Protection and Food Security.

  14. Maternal, Newborn and Child Health
  • 65% (80 of 130) of MNCH and related health projects incorporate the 7-11 strategies

  15. Child Protection and Advocacy
  • 45% (52 of 122) of projects incorporated part or all of the Child Protection and Advocacy (CPA) approach

  16. Food security approved approaches
  • This assessment was against six approaches only:
  - Farmer Managed Natural Regeneration (FMNR)
  - Local Value Chain Development (LVCD)
  - Business Facilitation (BF)
  - Community Management of Acute Malnutrition (CMAM)
  - Permaculture household gardens
  - Energy-saving stoves
  • 17% (31 of 180) of projects are using one of these approaches
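  Putting the three sector figures side by side (a sketch for illustration only; the counts are taken from slides 14–16, and note that the recomputed MNCH and CPA percentages differ slightly from the rounded figures quoted above):

```python
# Prevalence of good practice by sector, recomputed from the counts on
# slides 14-16. The recomputed MNCH and CPA percentages (62% and 43%)
# differ slightly from the 65% and 45% quoted on the slides.
sectors = {
    "MNCH (7-11 strategies)": (80, 130),
    "Child Protection (CPA approach)": (52, 122),
    "Food security (6 approved approaches)": (31, 180),
}

for sector, (adopting, total) in sectors.items():
    print(f"{sector}: {adopting}/{total} = {adopting / total:.0%}")
```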

  17. Comments or questions?
