
The Tsunami Evaluation Coalition:

What Worked and What Did Not?

European Evaluation Society 2006


Background of Tsunami Evaluation Coalition

  • The TEC is a new sector-wide learning and accountability initiative, constituted in February 2005

  • It is made up of about 40 UN agencies, donors, NGOs, a not-for-profit, and the Red Cross/Red Crescent Movement.

  • Participating agencies have worked within a framework that encourages sector-wide information sharing, lesson learning, accountability and transparency.

  • Focus on cross-cutting themes (coordination, needs assessment, local capacities, donor response, and linking relief, rehabilitation and development – LRRD) and on sector-wide rather than individual agency performance


TEC Timeline

  • February 2005 – Geneva meeting

  • April 2005 – First TEC teleconference

  • June 2005 – ALNAP meeting in The Hague

  • July–August 2005 – Planning phase for all evaluations

  • September–November 2005 – Field visits

  • October 2005 – Copenhagen meeting: communications/dissemination strategy

  • 8 December 2005 – ALNAP meeting/TEC meeting in Brussels: presentation of early findings and the early findings report

  • 25 December 2005 – Publication of the early findings report

  • February 2006 – Team leader validation meeting in London

  • February–June 2006 – Production of the Synthesis Report

  • July 2006 – Launch of the Synthesis Report during ECOSOC

  • July 2006–April 2007 – TEC follow-up


Getting started … (1)

  • This was a voluntary initiative started by a few actors who felt the time was right for a major inter-agency effort

  • The first meeting in February 2005 did not immediately provide clarity about roles and responsibilities, nor about the actual nature of the various studies

  • Many actors stayed on the fence …

  • Much time was initially spent on gaining mutual confidence and building relationships

  • Key initial actors were busy with other things, and the TEC workload was significant for all of them

  • There needed to be dedicated time and resources at the beginning of the process: the effort was jump-started by ALNAP, and the full-time researcher played a pivotal role in keeping the TEC going during the early days


Getting started … (2)

  • Key tipping points were the appointment of the researcher, the appointment of the coordinator, and the ALNAP Biannual Meeting in The Hague in June 2005

  • The ALNAP meeting in particular brought the necessary buy-in and funding

  • Funding, however, came in slowly and had an adverse impact on the timeliness of the TEC

  • Some agencies had to wait for full funding before the evaluation process took off – this delayed the start-up of TEC missions, since they were to be undertaken simultaneously

  • Preparation of terms of reference (TORs) was not coordinated – duplication

  • “Fishing in the same pond”


Fundraising

  • Getting commitments from some major donors brought in others and gave wide buy-in

  • Down-side: multiple donors with short time-frames led to short contracts for consultants, shortened field visits, and increased admin costs

  • Fundraising for the core of the TEC and across the studies should have been better coordinated

  • Fundraising for all five studies and the TEC Secretariat was extremely time-consuming and cumbersome – it should have been part of the appeal, or a special trust fund should have been established

  • Yet the results were excellent – but can this be replicated?


Implementation Modalities

  • The set-up, with a core management group and a broader working group, worked well

  • Strong commitment by the CMG and sub-groups, with a very harmonious way of working together

  • Three of the five studies had similar set-ups

  • Good mix between face-to-face meetings and teleconferences

  • Good use of technology – teleconferences, shared documents, mapping, the resource CD

  • The backing of ALNAP, a network with a natural fit to the TEC and an interest in joint evaluations, was critical

  • Complex arrangement (see the structure diagram below)


TEC Structure

[Diagram: the Core Management Group for the Tsunami Evaluation Coalition and the six thematic evaluations, showing flows of management, coordination and evaluation reports between the bodies listed below]

  • Core Management Group – manages the TEC and the six thematic evaluations

  • Thematic evaluations:

    • Coordination – led by OCHA

    • The international community's funding response – led by Danida

    • Needs assessment – led by WHO, SDC and FAO

    • Impact assessment – led by IFRC with the Global Consortium

    • Impact on local and national capacities – led by UNDP with DMI

    • LRRD – led by Sida

  • ALNAP Secretariat – hosts the TEC and manages the writing of the Synthesis Report; TEC staff include the Evaluation Advisor & Coordinator (EAC), the Researcher & Deputy Coordinator (RDC), and the TEC Administrator

  • Key Messages Report – written by the EAC

  • Synthesis Report – written by the Synthesis Primary Author, with contributions from the EAC and the RDC

  • Longer-term studies (from 2006)

  • Individual agency evaluations (TEC members)

  • TEC Online Forum (includes the Evaluation Map)


Working through the mandate

  • The mandate was assumed by the TEC itself and had no broader clientele, including those not working in the evaluation units of the respective TEC members

  • No real involvement of regional and local actors

  • The five cross-cutting themes were in principle a good idea, but they resulted in overlap, uncertainties between the teams, and a confused and overloaded recipient community

  • Did not consider alternative and possibly more cost-effective approaches, e.g. one team per country

  • Missed out on “impact” although an attempt was made to cover this through an IFRC-planned initiative that took almost a year to materialize


Some TEC shortcomings

  • Not all teams worked well together

  • Some critical expertise was missing

  • Not enough time spent in the field

  • Weak on hard data

  • Little information on Impact

  • Lack of local ownership/buy-in

  • Reports of varying quality – much work needed to bring some of them to acceptable levels

  • Country reports in some cases not very strong – underestimated time needed to do them well

  • Many cooks … team leaders not fully on board

  • Did not reduce evaluation overload


Some TEC Achievements

  • First major system-wide humanitarian evaluation since the 1996 Joint Evaluation of Emergency Assistance to Rwanda

  • The TEC approach can work, and lessons from setting up the TEC will make it easier next time

  • Timing of TEC products was well planned and critical (the initial findings report for 25 December and the Synthesis Report for ECOSOC)

  • TEC is beginning to influence humanitarian reform debate

  • The Clinton Initiative is moving on critical TEC issues in relation to NGOs

  • Much more follow-up lies ahead, but it will need dedicated attention and a sustained effort at various levels


What should we do differently next time?

  • Include a system-wide evaluation mechanism as part of the appeal

  • Get early in-country stakeholder buy-in

  • Establish local support/reference groups

  • Organize regular in-country discussion/follow-up meetings (through a focal point organization)

  • Promote the early establishment of performance indicators and M&E systems

  • Develop an evaluation framework with agreed-to performance benchmarks

  • Reduce complexities (funding, multi-team set-up, etc.)

  • Identify good practice, not just what did not work

