
Presentation Transcript


  1. Evaluation for Institutional Learning and Change (ILAC): A CGIAR initiative & its implications for IFAD. Doug Horton & Jamie Watts, February 22, 2005

  2. “To be serious about poverty, the agricultural research and development community has to be serious about institutional learning and change.” (Robert Chambers)
“Changes in the CGIAR should be home-grown and evolutionary.” (Ian Johnson)
“The significant problems we face cannot be solved at the same level of thinking we were at when we created them.” (Albert Einstein)

  3. Topics
• Why be concerned with ILAC?
• What is ILAC?
• ILAC and innovation systems
• Entry points for ILAC: the key role of evaluation
• Some practical examples
• ILAC Briefs
• Q & A
• What are the implications for IFAD?

  4. ILAC Team
• Jamie Watts (IPGRI, project leader)
• Doug Horton (project coordinator)
• Anne Acosta (ILAC research associate)
• Robert Chambers (IDS-Sussex)
• Boru Douthwaite (CIAT)
• Andy Hall (UNU-INTECH)
• Charles Staver (INIBAP)
• Peter Matlon (RF), Theo van de Sande (DGIS), Stephan Krall (GTZ) & Shantanu Mathur (IFAD)

  5. 1. Why Are We Concerned with ILAC?
• Accelerating change on many fronts
• Limited progress in poverty reduction & sustainable resource management
• Limitations of “lone ranger,” “pipeline” & transfer-of-technology (“TOT”) approaches
To remain relevant and have impact, our organizations need to change:
• Link more effectively with partners & users
• Evolve more rapidly
• Learn (from errors & failures)
• Translate lessons into effective action

  6. 2. What Is ILAC?
ILAC emerged from concerns that CGIAR centers were not:
• Sufficiently engaged with the “real world”
• Learning enough from their evaluations
• Using lessons to improve their work
ILAC has academic roots in:
• Utilization-focused evaluation
• Science & policy studies
• Management science
• Organizational development
• Action research

  7. How Can We Define ILAC?
A process of reflecting on, reframing, and using lessons learned during R&D that changes:
• The professional behavior of those involved in agricultural innovation
• The institutions (habits & norms) that guide behavior
• The performance of R&D organizations

  8. Another Perspective on ILAC
An emerging menu of interventions that promote new behaviors and relationships through:
• Critical reflection & self-awareness
• Analysis of both successes & failures
• Using lessons to work more effectively
• Changing the rules, norms & conventions that guide behavior
• Developing an environment that supports learning and change

  9. Our Vision: ILAC will be a catalyst for changing the way we conduct agricultural research to improve its contribution to development

  10. 3. ILAC and Innovation Systems
In the innovation systems framework:
• Agricultural research organizations operate within complex, adaptive systems
• Innovation is a socio-technical process
• Innovations emerge at the interfaces of knowledge production, dissemination & economic activity

  11. Elements of an Innovation System
• The organizations and individuals involved in generating, diffusing, adapting and putting new knowledge into economic use
• The institutions and policies that govern behavior and interactions
• Interactive learning

  12. Implications of the Innovation Systems Perspective
• Communication, negotiation & relationships are crucial (hence the need for facilitation)
• R&D organizations need to concern themselves not only with technical innovation but also with institutional innovation
• Participatory, evolutionary, systems approaches to planning, management & evaluation should be favored over linear, reductionist, expert-based approaches

  13. 4. Entry Points for ILAC
• Developing individuals’ knowledge, attitudes and skills
• Reorienting management systems
• Fostering a culture of innovation, learning and change
• Fostering a supportive external operating environment

  14. Key Role of Evaluation
• Evaluation, broadly defined, can serve as a tool for learning from past successes & failures in order to improve future actions
• To serve this purpose, evaluation must be “utilization-focused” & involve the key potential users of the evaluation results

  15. Evaluation & Organizational Learning: Principles & Pitfalls
• We learn most from our “errors,” but seldom admit them
• We learn most “in the field,” but seldom go there
• Most organizations have serious “learning disabilities”
• The higher you go, the less you can “afford” to learn
• The “new boss syndrome”
• Staff turnover and “knowledge loss”
• Evaluations are seldom utilization-focused & seldom support organizational learning & change
• Organizational learning is a complex & delicate social process that needs to be managed

  16. 6. Examples of ILAC in Action
• An integrated action research / action learning program
• Learning from innovation histories
• Building learning into external reviews
• Building knowledge sharing (KS) into internal reviews / meetings
• Combining participant and expert reviews
• “Horizontal evaluation”

  17. The Andean “Papa Andina” Network (Andre Devaux, CIP)
• Applied R&D with many partners
• Emphasis on “institutional platforms”
• Participatory planning & review
• Structured “sistematización”
• Publication of strategies & results
• Center-based, with long-term funding (SDC+)
• A stable core team that recognized the key role of institutional innovation, participation & learning
• Consultants in social science, evaluation & communications

  18. Innovation / Institutional Histories (Andy Hall, UNU-INTECH; Shambu Prasad, ICRISAT; & Boru Douthwaite, CIAT)
• Published accounts & conventional wisdom often differ sharply from the way things actually happen
• An innovation history (IH) is an “investigative journalistic account” that seeks to capture both the technical & institutional factors behind success / failure
• Experience at IRRI, CIP, ICRISAT & CIAT
• Challenges:
  • Engaging key actors, to avoid “dropping the bomb”
  • Questions of legitimacy within traditional research organizations

  19. Building Learning into CCERs (Center-Commissioned External Reviews)
• Analysis of CCERs by project managers
• EPMR (External Program and Management Review) process informed by CCERs
• Revised CCER process to include self-assessment
• SWOT analysis
• Recommendations for improvement
• Survey of stakeholder perspectives
• Use of existing data from other sources
• Reporting of results to project managers and top management
• Survey results inform priority setting within IPGRI
• The learning dimension has been reviewed (presentation at the American Evaluation Association, AEA)

  20. Building KS and Learning into a Center’s Annual Meeting (Simone Staiger, Nathan Russell & Doug Pachico, CIAT)
Issues:
• How best to structure large-scale annual review & planning meetings?
• How to address “hot issues” and keep things cool?
Approach:
• Define clear objectives!
• Introduce new ways to organize sessions:
  • “Open Space”: a structured free-for-all for staff to work through issues of importance to them
  • “Peer Assists”: for individuals to present problems & get suggestions from colleagues
  • A “knowledge fair” rather than PPT presentations
  • Self-organized small-group meetings
• “Barometer team” & “after-action reviews”
• Participant evaluation

  21. Combining Participant & Expert Review (WASNAR)
• Provide guidelines for the preparation of program presentations
• Hold structured reviews by staff groups
• Have an external reviewer participate in the meetings & assess both:
  • The programs
  • The review procedures

  22. “Horizontal Evaluations” within a Network (Andre Devaux, Papa Andina)
• Conducted within a network
• Organize project review meetings
• Provide guidelines for project presentations & reviews
• Invite managers of other projects to review the project in question
• Involve an external reviewer as well

  23. Eight ILAC Briefs
• Institutional Learning & Change: A CGIAR initiative
• Innovation systems
• Learning-oriented evaluation
• Collaborative agreements: A “how-to” guide
• Preparation and use of innovation histories
• Participatory strategic planning: An example from CIMMYT
• Towards integrated monitoring and evaluation systems
• Learning alliances

  24. Summary
• Why ILAC?
• What is ILAC?
• Where did it come from?
• Innovation systems & ILAC
• Entry points & ways forward
• Examples of ILAC in action
• ILAC Briefs
• Q & A
• What are the implications for IFAD?
