
MONITORING AND EVALUATION METHODOLOGY

Presentation Transcript


  1. MONITORING AND EVALUATION METHODOLOGY KEVIN P O’KELLY

  2. Draft Report • Introduction • Definitions • Social Inclusion as a European Issue • OMC • NAPs • Scope of MSI Project • Why Mainstreaming?

  3. Draft Report • Poverty, Social Inclusion and Public Policy • Participative Methodology • Monitoring and Evaluation

  4. Monitoring and Evaluation Structure of the chapter: • Define ‘Monitoring’ and ‘Evaluation’ • Theory of Evaluation • Designing an evaluation scheme (research design) • Indicators • Evaluating MSI

  5. MONITORING Monitoring or Process Evaluation: • Carried out during implementation • How, Why and under what conditions? • What happens during implementation? • Is implementation in line with original design?

  6. Evaluation Types of Evaluation • Impact or summative • Outcome • Variation • Counterfactual
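
To make the ‘counterfactual’ type concrete, the sketch below shows a minimal difference-in-differences calculation, one common way of approximating the counterfactual (what would have happened without the policy). The figures are hypothetical and are not drawn from the MSI project.

```python
# Illustrative sketch only: a minimal difference-in-differences estimate.
# All figures are invented for the example, not MSI project data.

# Average poverty rate (%) before and after a policy intervention
treated_before, treated_after = 24.0, 19.0   # areas covered by the policy
control_before, control_after = 23.0, 21.5   # comparable areas not covered

# Change observed in each group over the same period
treated_change = treated_after - treated_before   # -5.0 percentage points
control_change = control_after - control_before   # -1.5 percentage points

# The control group's change stands in for the counterfactual trend,
# so the impact estimate is the difference between the two changes.
impact_estimate = treated_change - control_change  # -3.5 percentage points

print(f"Estimated policy impact: {impact_estimate:+.1f} percentage points")
```

The estimate is only as credible as the comparability of the two groups; without a plausible comparison group there is no counterfactual to measure against.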

  7. What is Evaluation? A systematic assessment of the operation and/or the outcomes of a programme or policy, compared to a set of explicit or implicit standards, as a means of contributing to the improvement of the programme or policy (C H Weiss, 1998)

  8. What is Evaluation? UK Treasury Green Book

  9. What is Evaluation? FEEDBACK (diagram)

  10. ACTORS • Political / Policy level • Administration / Management • Service providers • Target Groups / Recipients

  11. Design of Evaluation Evaluation Questions • Does the policy work? • Why does it work? • Why (how) should policies work? (Robert Walker – June 2004)

  12. The policy cycle (diagram): POLICY at the centre, with an infrastructure for evaluation, and seven evaluation questions around the cycle: • Is there a problem? • What policy would work? • Will this policy work? • Can we make this work? • Is this policy working? • Has the policy worked? • What worked?

  13. Evaluation Questions • Factual • Behavioural • Attitudinal • Knowledge

  14. Design of Evaluation Twelve Steps • Advisory Committee • Resources • Selection of evaluators • Key questions • Methodology • How to collect data • Questionnaire / interview guidelines

  15. Design of Evaluation • Target sample • Field work • Analysis • Meta-analysis • Write up findings • Publication / dissemination

  16. Policy Indicators • Social Policy Committee (Laeken) indicators • Low-income households • Long-term unemployed • Low education levels • Health status • ‘In-work’ poor

  17. Policy Indicators • EAPN Indicators • Employment • Income levels • Housing • Health • Education • Participation and identity • Definition

  18. Policy Indicators Participation and Identity - A Definition: The percentage share of the population with an income below 60% of the median (national poverty level) who are members of or connected with: (a range of social, community and cultural activities)
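
As a minimal sketch of how the indicator defined above could be computed (Python, with a made-up micro-dataset; the field names income and participates are assumptions, not the project’s actual data model): derive the 60%-of-median threshold, then take the share of those below it who are connected with such activities.

```python
# Illustrative sketch only: the 'participation and identity' indicator,
# i.e. the share of people below 60% of median income who take part in
# social, community or cultural activities. Data are invented.
from statistics import median

survey = [
    {"income": 6_000,  "participates": True},
    {"income": 9_000,  "participates": False},
    {"income": 14_000, "participates": True},
    {"income": 20_000, "participates": False},
    {"income": 26_000, "participates": True},
    {"income": 40_000, "participates": False},
]

# National poverty line: 60% of the median income
poverty_line = 0.6 * median(p["income"] for p in survey)

below_line = [p for p in survey if p["income"] < poverty_line]
connected = [p for p in below_line if p["participates"]]

indicator = 100 * len(connected) / len(below_line) if below_line else 0.0
print(f"Poverty line: {poverty_line:.0f}; indicator: {indicator:.1f}%")
```

In practice the Laeken-style threshold is taken from equivalised disposable household income, so an equivalence-scale adjustment would be applied before computing the median.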

  19. Evaluating MSI • Mainstreaming is a process • Political commitment • Involve ALL key stakeholders • Realistic expectations

  20. Evaluating MSI Design issues: • What target groups? • What outcomes? • Quality of data • Comparison of small and large units • Collection of data at point of delivery (local level) • Challenge of ‘Silo’ policies

  21. Evaluating MSI • Access to data • At what level of governance is mainstreaming implemented? • Political environment / decision-making • Structures • Link between governance roles • Culture • Different criteria for success

  22. MSI Question Does Mainstreaming of Social Inclusion have an impact on the policy process and outcomes? If so, can it be measured? • European level: NAPs/incl. & OMC • Implementation level of NAPs/incl.

  23. Stephen Donnelly Paper • Mainstreaming issues • Any measurement of mainstreaming will effectively be a measurement of qualitative processes … the ultimate purpose of mainstreaming is to produce measurable poverty reduction outcomes. • A key challenge in attempting to determine how far poverty reduction activities are mainstreamed centres on the subjectivity of any measurement tools that are put in place

  24. Stephen Donnelly Paper Measuring ‘Mainstreaming’ is subjective! How to define: • Political will / leadership • Partnership • Ownership • Cross-departmental working?

  25. Stephen Donnelly Paper • ‘Positive action’ initiatives are NOT mainstreaming • However, mainstreaming doesn’t preclude ‘positive action’ • ‘Poverty Proofing’

  26. Stephen Donnelly Paper A number of elements which are fundamental to mainstreaming: • Leadership • Structures • Capacity and skills • Community participation and engagement • Research and evaluation. Why Mainstreaming?

  27. Stephen Donnelly Paper Draft Measurement Framework: • Political Leadership and sponsorship • Executive leadership and strategies • Capacity • Structures • Data, research and evaluation • Community engagement and participation
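
Purely as an illustration of how a measurement framework of this kind might be turned into a summary score (the six dimensions are taken from the list above, but the 0-5 scale, the weights and the scores are invented for the example and are not part of the Donnelly paper), a simple weighted aggregation could look like this:

```python
# Hypothetical scoring sketch: aggregate panel-assessed scores (0-5) across
# the six framework dimensions into one weighted summary score.
# Weights and scores are invented; they are not from the Donnelly paper.

weights = {
    "political_leadership_and_sponsorship":   0.20,
    "executive_leadership_and_strategies":    0.20,
    "capacity":                               0.15,
    "structures":                             0.15,
    "data_research_and_evaluation":           0.15,
    "community_engagement_and_participation": 0.15,
}

scores = {  # 0 = absent, 5 = fully embedded
    "political_leadership_and_sponsorship":   4,
    "executive_leadership_and_strategies":    3,
    "capacity":                               2,
    "structures":                             3,
    "data_research_and_evaluation":           2,
    "community_engagement_and_participation": 4,
}

overall = sum(weights[d] * scores[d] for d in weights)  # weighted mean, 0-5 scale
print(f"Overall mainstreaming score: {overall:.2f} / 5")
```

A score of this kind makes qualitative judgements comparable over time or across units, but it does not remove the subjectivity the paper highlights; it only makes it explicit.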

  28. Stephen Donnelly Paper Poses the question: is social auditing / theory of change an alternative to Mainstreaming?

  29. Measuring Mainstreaming • Qualitative? • Quantitative? • Combination?

  30. EVALUATION TYPOLOGY II FOR MAINSTREAMING SOCIAL INCLUSION (diagram)
  CONTEXT: EU; National; Regional; Local // Economic; Demographic; Social; Cultural; etc.
  Mainstreaming Social Inclusion (definition):
  • Features: Cross-cutting • Policy development • Participation • Monitoring and evaluation • Political commitment
  • Process: Inputs (resources) • Organisation of resources
  • Outcomes: Outputs • Indicators
  EVALUATION QUESTIONS (Robert Walker): • What worked? How did it work? • Has the policy worked? How did it (not) work? • Is this policy working? How is it working? • Is there a problem? What is the problem? • What policy would work? How would it work? • Will this policy work? How will it (not) work? • Can we make this policy work? How can we make it work?
  REQUIREMENTS FOR EVALUATION: • Clear policy objectives • Clear ‘theory of change’ • Clear evaluation objectives
  KEY QUESTIONS FOR EVALUATION OF MSI (revised research question): • What would be good evaluation questions on the process of mainstreaming (Mst.)? • Can we build a scenario for an evaluation framework to a) identify and b) measure the impact of Mst.? • Do we have the tools to analyse the process and measure the impact? If ‘no’, how do we get the tools? • Can we identify evaluation processes of Mst. in the different Member States? • Why Mst.? Is it better?
  EVIDENCE: • Case studies by theme, country and governance level • Meta-analysis (JIMs, JIRs) • Theory of change • Scale of features • Interviews with key actors
  SCENARIOS OF EVALUATION FRAMEWORK: e.g. Is Mst. a process / tool or a policy? No counterfactual
