
Designing Influential Evaluations, Session 6: Analysis & presentation


Presentation Transcript


  1. Designing Influential Evaluations, Session 6: Analysis & presentation. Uganda Evaluation Week - Pre-Conference Workshop, 19th and 20th May 2014

  2. What does the client want? • Why is the work being commissioned? • Lesson-learning • Accountability • Scaling up • Transferring a model to other situations/contexts • To support a spending decision • How might this affect your approach to the report? Discussion

  3. Understanding the intervention & context • Known knowns • Known unknowns • Unknown unknowns. Source: Kurtz C & Snowden D (2003) The new dynamics of strategy: sense-making in a complex and complicated world. IBM Systems Journal 42: 462-483

  4. How to pitch the analysis • Guidance on quality of a study (DFID)

  5. Discussion • What tools or presentations can you use to convey how strong or robust your findings are? Discuss in your group and prepare to exchange ideas in plenary.

  6. Internal and external validity • Internal validity: the extent to which a causal conclusion based on a study is warranted; reflects the extent to which bias is minimised; essential for a satisfactory evaluation • External validity: the validity of generalized (causal) inferences in evaluation studies; the extent to which the results of a study can be generalized to other situations and to other people; depends on the scope to collect necessary data and the effects of context

  7. How to demonstrate strength of findings • Quantitative: limitations in data collection; data quality; statistical analysis; significance tests; confidence intervals; probability; data available for re-analysis • Qualitative: limitations in data collection; accuracy of records & note-taking; consistency in documentation; formal textual analysis; comparisons and ranking; scaling and rating
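As a minimal sketch of the quantitative case, assuming Python with only the standard library: the figures, the sample size and the helper name proportion_ci below are hypothetical, not drawn from the slides. The point is simply that a survey proportion reported with a confidence interval and sample size lets readers judge how robust the finding is.

    # Sketch: normal-approximation 95% confidence interval for a survey proportion.
    # All numbers below are hypothetical.
    import math

    def proportion_ci(successes, n, z=1.96):
        # Point estimate and standard error of the proportion.
        p = successes / n
        se = math.sqrt(p * (1 - p) / n)
        # Clamp to the valid [0, 1] range.
        return max(0.0, p - z * se), min(1.0, p + z * se)

    # Hypothetical example: 83 of 100 respondents agree with a statement.
    low, high = proportion_ci(83, 100)
    print(f"Share agreeing: 83% (95% CI {low:.0%} to {high:.0%}, n=100)")

Quoting the interval and the sample size, rather than the percentage alone, shows at a glance how much weight the finding can bear.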

  8. Developing the argument • Start with the question to be answered • What were the findings? • Strength of the findings • Statistical analysis – level of confidence • Extent of interview response – consistency and high percentage responses; or lack of any trend or pattern • How well are they corroborated? • Triangulation of method • Triangulation of data sources • Need to present data in a table or graph? • Comparisons across locations, stakeholder groups or time • Tables should show percentages in the body and totals for rows and columns (see the sketch below) • Remember to explain the limitations • Consider other plausible explanations (contribution analysis) • Draw conclusions/implications
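A sketch of the table layout suggested above (percentages in the body, totals for rows and columns), assuming Python with pandas; the stakeholder groups, responses and counts are invented for illustration.

    # Sketch: cross-tabulation with row percentages in the body and totals in the margins.
    # The data are hypothetical.
    import pandas as pd

    df = pd.DataFrame({
        "group":    ["Staff", "Staff", "Staff", "Partners", "Partners", "Partners"],
        "response": ["Agree", "Agree", "Disagree", "Agree", "Disagree", "Disagree"],
    })

    # Counts with a total row and a total column.
    counts = pd.crosstab(df["group"], df["response"], margins=True, margins_name="Total")

    # Convert the body to row percentages, but keep the row totals as counts (n).
    table = counts.div(counts["Total"], axis=0).mul(100).round(1)
    table["Total"] = counts["Total"]
    print(table)

The same layout supports comparisons across locations or time periods: one row per group, percentages in the body, and the sample size kept visible in the final column.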

  9. Tips for effective reading • Signpost well • Break up text with boxes, tables and figures • Lots of white space • Structure paragraphs logically • Mix sentence length and complexity. Short is good. • Cross reference and remind the reader • Summarise frequently • Carry findings through logically to conclusions • Draw lessons • Direct clear recommendations to specific people or organisations

  10. Introduce and signpost • Example “The following chapter presents findings on whether staff have access to appropriate training and technical advice to effectively ensure evaluability and results documentation as part of the grant management process (Hypothesis 2). Under technical advice, we consider the systems that are in place for quality assurance and expert review. The evidence for this chapter draws on a number of sources: 1) A review of training material and post course evaluations; 2) Attendance records of training; 3) the results of the online staff survey; 4) findings from the focus group discussions; and 5) the comparative analysis of grant management processes of other development agencies. The chapter is divided in two parts: first the findings from our review of training are discussed; this is then followed by the findings related to technical support.”

  11. Paragraph structure • Good guidance from Norad “Chapters presenting findings A body paragraph shall be allocated for each finding. Findings presented shall be evidence-based, triangulated and clearly referenced. The finding shall be presented as a clear topic sentence. This shall be followed by presentation of the relevant data, quotations, references, and analysis that shows how and why the evidence presented supports the position taken in the topic sentence. Included herein is also the presentation of the comparisons with other studies, significant trends if any, uncertainties, and limitations relevant for the analysis presented.”

  12. Paragraph example • Example “Minimum requirements on results measurement are not consistently understood by staff. While the Grant Management Manual outlines a number of requirements on results measurements (See Table 3 above) the understanding of these among staff is mixed. On one hand, the staff survey indicated that 83 percent felt they had a clear understanding of the minimum requirements that a partner’s results framework should have. On the other, the interviews and focus groups revealed that staff felt very unsure of what the Norad/MFA approach to results measurement was and felt that individual staff members were given too much individual discretion to decide what is good enough when appraising results frameworks. Box 5 provides an illustration of some of the views we heard.”

  13. Timelines are effective

  14. Table layout

  15. Summarise frequently

  16. Carry through to conclusions • Implementation of a results focus fails to ensure evaluability, partly because there is little clarity about minimum standards, but also because of pressures of time on staff and a lack of incentives to prioritise results.

  17. Conclusions and recommendations • What makes a good conclusion? Responds to the questions; concludes against evaluation criteria • What makes a good recommendation? Concise; addressed to a specific person or organisation; indication of urgency or priority • Some evaluators argue that evaluations should not make recommendations

  18. Clear simple lessons • Independence is critical to evaluation credibility: In addition to being directed by IFAD’s own Independent Office of Evaluation, substantial efforts were made to demonstrate full independence by the use of a steering committee and of high-level independent advisors. • Evaluation quality matters: The analysis in the IEE Final report was widely judged to have been sound and was accepted by the Executive Board, President and Senior Management Team. The imprimatur of the independent advisors helped achieve that. • Useability of recommendations is facilitated by reliance on reform initiatives already underway: The recommendations were far-reaching, but a core of them built on reforms that had already been piloted. In this way the IEE endorsed the ideas of reformers in IFAD and gained their support.

  19. Recommendations for Norad’s Evaluation Department • Designing an evaluation: • Tighten the design specifications for evaluations. Draft terms of reference with tighter specifications for the purpose, objective and scope of evaluations so it is clear when outcome or impact is to be evaluated in addition to outputs. • Keep evaluation questions focused. Reduce the number of evaluation questions that are to be covered by an evaluation so that resources are clearly prioritised to key results. • Require evaluators to clearly describe the programme logic of the intervention being evaluated. All evaluations should be required to specify the programme logic, or reconstruct it if necessary, as a basis for the design. • Be more specific in the terms of reference about the required consultants’ skills. More consideration should be given to the specific skills and expertise required for either the team leader or core team members. This would require EVAL to do more preparation up front around which evaluation designs and methods are best suited to answer the evaluation questions.

  20. Elements of communication • Commissioning process • Stakeholder consultation • The evaluation • Analysis – core source of information • Full report • Short summaries • Statistical synopsis • Topic briefs • Press release • Meetings & workshops • Video • On-line access • Social media

  21. Communication options • Working in small groups brainstorm channels of communication and the audiences they are most likely to reach. • Create a table like the one below to illustrate your answer and for discussion in plenary.

  22. A communication strategy • Purpose: to ensure the results of any evaluation are communicated clearly, accurately and in a way that the audience can use the information • Consider before you start…. • Use and useability

  23. Use and useability • Use is dependent on the useability of the design • Useability – an evaluation design shapes how its outputs can be used: potential users know why an evaluation is taking place; time studies to match decision-making; render evidence and data for the ‘non-technical’; tailor material according to audience • Use – evaluative culture and organisational context: findings attached to further funding; value and priority given to evaluation; organisational ‘insertion’ or position; external influence (pressure/independence) • User characteristics: forward looking to improvement; involvement in design; capacity to respond; internalized purposes for evaluation

  24. Communicating effectively • An evaluation is not complete until its communication is complete. • Depends on where the evaluation lies on the continuum from modifying an intervention to modifying policy, re-shaping thinking about a problem and lesson-learning • Availability of tracking and follow-up systems • Whether and when to put it in the public domain? • if there are serious criticisms • if confidentiality might be broken • if it tackles a sensitive issue

  25. End
