Meteo Mali Agrometeorological Program Evaluation: Preliminary Report

Edward R. Carr

Department of Geography

University of South Carolina

Assessment Background

  • June 2011 meeting in Dakar

  • Demand-driven assessment

    • Lessons learned/good practices

    • Scaling up

Assessment Design

Three components

  • Science assessment

    • IRI

  • Institutional assessment

    • CCAFS

  • Field assessment

    • CCAFS/IER/University of South Carolina

Science Assessment


  • What climate information is provided to farmers?

  • What is the scientific basis for this information?

  • What is the translation and dissemination process?

  • What opportunities are there for improving the quality and relevance of products?

  • What challenges have been encountered in satisfying specific user needs?

Science Assessment

February 2012

  • Meetings with Mali Meteo

  • Consultations with AGRHYMET and ACMAD

  • Review of methods and documentation

  • Integration with field assessment findings

Science Assessment

Draft Assessment prepared at the end of July 2012

  • Example challenges

    • Difficulty of providing reliable local-scale forecasts

      • Onset of the rainy season

      • Timing of possible dry spells

    • Need for monthly forecasts

    • Lack of verification information

Figure: Mali’s network of meteorological stations (raingauge and synoptic stations)

Science Assessment

Draft Assessment

  • Example opportunities

    • Prospects for improved downscaling

      • Merging satellite and station data (see the sketch after this list)

    • Using Global Producing Centre (GPC) model outputs to strengthen seasonal and monthly forecasts
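
The "merging satellite and station data" bullet above names a downscaling technique without showing it. As a rough illustration only, the sketch below applies a single gauge-derived mean-field bias factor to a satellite rainfall grid; the gauge values, grid values, and the choice of a mean-field bias method are assumptions for illustration, not Mali Meteo's or IRI's actual merging procedure.

```python
import numpy as np

# Illustrative gauge-based bias adjustment of a satellite rainfall field.
# All numbers are made up; real merging schemes (e.g., geostatistical
# blending) are considerably more sophisticated.

# Satellite rainfall estimate on a coarse grid (mm per dekad)
satellite = np.array([
    [12.0, 15.0, 9.0],
    [20.0, 18.0, 14.0],
])

# Rain-gauge observations and the (row, col) grid cell each gauge falls in
gauge_values = np.array([14.0, 22.0, 11.0])
gauge_cells = [(0, 0), (1, 0), (0, 2)]

# Mean-field bias: average ratio of gauge value to co-located satellite value
ratios = [g / satellite[r, c] for g, (r, c) in zip(gauge_values, gauge_cells)]
bias = float(np.mean(ratios))

# Scale the whole satellite field by the single bias factor
merged = satellite * bias
print(f"bias factor: {bias:.2f}")
print(merged)
```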

Figure: Distribution of known GLAM villages

Institutional Assessment

Learn what institutional factors contributed to program success

  • Narrative program history

  • Identification of product development process

  • Mapping of the changing flow of information, products, and resources in the program

Institutional Assessment

June – November 2012

  • Responses from 12 informants

    • Follow-ups ongoing

  • Draft document prepared

Institutional Assessment

Example draft findings

  • Coordinating group was highly interdisciplinary but informal

  • Continuous project funding allowed time to learn

  • Opportunities

    • Broader focus for information (including livestock and fisheries)

    • Formalized frameworks that entrench and support the interdisciplinarity of the program

Field Assessment


  • Identify current impacts of the program on participants

  • Explain the causes of these impacts

  • Both extraordinarily difficult to do post-hoc

Field Assessment

January 2012 – March 2012

  • 36 villages

    • 18 GLAM, 18 control

  • 144 focus groups

  • 720 interviews (per-village breakdown sketched after this list)

    • Men and women

    • Young and old
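
Read together, the totals above are consistent with a simple even-split design: 36 villages with 4 focus groups each (men/women crossed with young/old) gives 144 focus groups, and 720 interviews works out to 20 per village (5 per focus group) if spread evenly. The sketch below reconstructs that implied frame; the even split and the per-group interview count are inferences from the totals, not the documented sampling plan.

```python
from itertools import product

# Hypothetical reconstruction of the sampling frame implied by the slide totals.
villages = [f"village_{i:02d}" for i in range(1, 37)]  # 36 villages: 18 GLAM, 18 control
genders = ["men", "women"]
ages = ["young", "old"]

focus_groups = [
    {"village": v, "gender": g, "age": a}
    for v, g, a in product(villages, genders, ages)
]
assert len(focus_groups) == 144  # matches the reported 144 focus groups

print(720 // len(villages))      # 20 interviews per village, if spread evenly
print(720 / len(focus_groups))   # 5.0 interviews per focus group, same assumption
```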

Field Assessment

Broad assessment

  • Livelihoods practices

  • Agricultural activities

  • Engagement with NGOs

  • Engagement with the Agromet program

Field Assessment

Analysis ongoing

  • Over 430,000 data points

  • Validation of controls

    • No baselines

    • Too long a duration

  • Identification of groupings for analysis

Field Assessment

Initial findings

  • Opportunities to build on end-user delivery

  • Opportunities to better target end-user needs

    • Both current needs and preparation for future needs

  • Opportunities to expand the user base

    • Heavily focused on younger men

Field Assessment

Initial findings

  • Suggestions of impact

    • Differences in the number of crops grown and varieties used between those who use agromet data and those who do not (illustrative comparison sketched below)

    • Varies by grouping/agroecological zone

      • Complex impact by crop
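
A minimal sketch of the kind of grouped comparison this finding implies: mean number of crops grown by agromet users versus non-users, within each agroecological zone. The column names, zone labels, and values are hypothetical placeholders, not the evaluation's actual dataset or variable names.

```python
import pandas as pd

# Hypothetical survey extract; every column name and value is a placeholder.
survey = pd.DataFrame({
    "household":    ["a", "b", "c", "d", "e", "f", "g", "h"],
    "zone":         ["sahelian"] * 4 + ["sudanian"] * 4,
    "uses_agromet": [True, True, False, False, True, True, False, False],
    "n_crops":      [4, 5, 3, 3, 6, 7, 5, 6],
})

# Mean crop count by zone for users vs. non-users of agromet advisories
comparison = (
    survey.groupby(["zone", "uses_agromet"])["n_crops"]
          .mean()
          .unstack("uses_agromet")
)
print(comparison)
```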

Field Assessment


  • Will have difficulty talking about yield impact

  • Will have difficulty establishing causality for impact

Addressing the Limitations

  • Re-running the survey in February-March 2013

  • Qualitative work in selected villages in May-July 2013

Coming soon…

Integration of science and field assessments

  • Will look at science constraints and opportunities in the context of end-user demands

Rigorous assessment of impact at the end-user level

Integrated lessons learned and good practices

  • Connecting science, institutional context, and end-user impact


