Developing Evaluation Capacity

Jean Langlois, MSc, MBA

Outline
  • Intro
  • Nature of the work
  • NGO funding model
  • Motivating people
  • Leading questions
Introduction
  • The world according to Jean Langlois…
  • Challenges in developing evaluation capacity at an ENGO (environmental NGO)
    • Nature of the work
    • NGO funding model
    • Motivating people
Nature of the work - Challenges

Advocacy:

  • Diffuse causality
  • Causality attribution

Direct Conservation:

  • Bias toward the measurable over the important
  • Durability of outcomes
  • Defining outcomes and impacts
Nature of the work

What works:

  • Invest time in the logframe at the outset
  • Articulate and track tangible results
  • Articulate, but don’t track, intangibles (see the sketch after this slide)

What’s needed:

  • Shared comfort with unmeasured intangible benefits
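To make the tangible/intangible split above concrete, here is a minimal sketch (not from the presentation) of how a logframe outcome might record both kinds of results while tracking indicators only for the tangible ones. The class names and example values are illustrative assumptions, not Nature Canada's actual framework.

```python
# Illustrative sketch only: hypothetical names, not an actual evaluation framework.
from dataclasses import dataclass, field

@dataclass
class TangibleResult:
    description: str      # e.g. "Hectares of habitat under formal protection"
    indicator: str        # how the result will be measured
    target: float         # the value committed to in the logframe
    actual: float = 0.0   # updated as monitoring data come in

@dataclass
class IntangibleBenefit:
    description: str      # articulated so it is not forgotten, but deliberately unmeasured

@dataclass
class LogframeOutcome:
    statement: str
    tangibles: list = field(default_factory=list)     # tracked
    intangibles: list = field(default_factory=list)   # articulated only

# Usage: both kinds of results are named up front; only tangibles carry metrics.
outcome = LogframeOutcome(
    statement="Priority wetland is protected and locally valued",
    tangibles=[TangibleResult(
        description="Hectares under formal protection",
        indicator="Signed conservation agreements",
        target=500.0,
    )],
    intangibles=[IntangibleBenefit(
        description="Greater public attachment to the wetland",
    )],
)
```

The point of the sketch is simply that intangible benefits appear in the structure, so they are articulated, but they carry no indicator or target, so no one is asked to measure them.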
Nature of the work - Questions
  • How do we combat the bias toward the measurable over the important?
  • Is it useful to articulate intangible benefits that will not be measured?
  • For advocacy campaigns, is there a better model than the dotted-line-logframe?
NGO funding model - Challenges
  • Not enough money, ever
    • www.naturecanada.ca
  • Reporting to multiple funders
    • Multiple evaluation frameworks
    • Varying degrees of buy-in to rigorous evaluation
  • Propensity to provide “positive” reports
NGO funding model

What works:

  • Invest time in the logframe at the outset
  • Consistent deliverables across grant proposals

What’s needed:

  • Shared expectations
  • Comfort reporting and learning from failure
NGO funding model - Questions
  • How do we create an environment where people are comfortable sharing and learning from “failure”?
Motivating people - Challenges

“The effectiveness of our program will be measured by the degree to which the outputs and outcomes identified in the logical framework are achieved.”

- NOT Martin Luther King Jr

Motivating people - Challenges
  • “Evaluation is a time drain”
  • “It’s not my job”
  • “I’d get more work done if I didn’t have to report on everything I do”

- Anonymous Colleagues

Motivating people

What works:

  • Adapt the language to the audience
  • Always relate to mission
  • Tough love: invest in the logframe at the outset

What’s needed:

  • Positive experiences with evaluation
  • Project-to-project application of lessons learned
Motivating people - Questions
  • How can we create short-term wins (i.e. positive experiences with evaluation)?
  • How can we design evaluation approaches that produce useful results that are easily applicable to multiple future projects?
Leading questions

How do we combat the bias toward the measurable over the important?

Is it useful to articulate intangible benefits that will not be measured?

For advocacy campaigns, is there a better model than the dotted-line-logframe?

How do we create an environment where people are comfortable sharing and learning from “failure”?

How can we create short-term wins (i.e. positive experiences with evaluation)?

How can we design evaluation approaches that produce useful results that are easily applicable to multiple future projects?
