
The Role of Evaluation in 2014 – 2020 period: Czech ESF experience


Presentation Transcript


  1. The Role of Evaluation in 2014 – 2020 period: Czech ESF experience. Vladimír Kváča, Ministry of Labour and Social Affairs

  2. Presentation overview. Current experience: • Evaluative culture • Evaluation designs. Proposals for a better future

  3. Current experience: Evaluative culture

  4. Capacity building in the evaluation process: (1) Evaluation Mandate, (2) Evaluation Design / ToR, (3) Evaluation Execution (management, internal evaluation staff, consultants), (4) Evaluation Report, (5) Recommendation, (6) Decision

  5. Current experience: Evaluation design

  6. Typical Design: Post-test only of project participants. Design: X P (project participants, measured only at the end-of-project evaluation). • Need to fill in missing data through other means: • What change occurred during the life of the project? • What would have happened without the project (counterfactual)? • How sustainable is that change likely to be? Based on: Jim Rugh's presentation, Stockholm, September 2011

  7. Longitudinal quasi-experimental. Design: project participants P1 X P2 X P3 P4; comparison group C1 C2 C3 C4 (measured at baseline, midterm, end-of-project evaluation, and post-project evaluation). Source: Jim Rugh's presentation, Stockholm, September 2011

  8. Pre+post of project; no comparison. Design: project participants P1 X P2 (measured at baseline and at the end-of-project evaluation). Source: Jim Rugh's presentation, Stockholm, September 2011

  9. Post-test only of project and comparison. Design: project participants X P; comparison group C (both measured only at the end-of-project evaluation; a simple illustration of the resulting estimate follows below). Source: Jim Rugh's presentation, Stockholm, September 2011
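For illustration, a minimal Python sketch of the estimate this design allows (the numbers are hypothetical, not from the presentation): the only available estimate is the difference between the two groups' post-test means, and it is credible only under the strong, untestable assumption that the groups were comparable before the intervention.

# Hypothetical end-of-project scores; X = intervention, P / C = post-test measurements.
project_post = [62, 70, 66, 75, 68]      # P: project participants
comparison_post = [58, 61, 65, 60, 63]   # C: comparison group, same point in time

def mean(values):
    return sum(values) / len(values)

# Post-test-only estimate: difference in group means. Valid only if the groups
# were equivalent at baseline, which this design cannot verify.
effect_estimate = mean(project_post) - mean(comparison_post)
print(f"Estimated effect (post-test difference): {effect_estimate:.1f}")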

  10. Pre+post of project; post-only comparison. Design: project participants P1 X P2; comparison group C (participants measured at baseline and at the end-of-project evaluation, the comparison group only at the end). Source: Jim Rugh's presentation, Stockholm, September 2011

  11. Quasi-experimental (pre+post, with comparison). Design: project participants P1 X P2; comparison group C1 C2 (both groups measured at baseline and at the end-of-project evaluation; see the sketch below). Source: Jim Rugh's presentation, Stockholm, September 2011
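A minimal difference-in-differences sketch in Python (hypothetical numbers; assumes P1, P2, C1, C2 are group means of the same outcome): the comparison group's change approximates what would have happened without the project, so subtracting it from the participants' change removes stable pre-existing differences between the groups.

# Hypothetical group means of the outcome at baseline and at the end-of-project evaluation.
P1, P2 = 55.0, 70.0   # project participants: baseline, end of project
C1, C2 = 54.0, 60.0   # comparison group: baseline, end of project

# Difference-in-differences: participants' change minus the comparison group's change.
# The comparison group's change stands in for the counterfactual trend.
did_estimate = (P2 - P1) - (C2 - C1)
print(f"Difference-in-differences estimate: {did_estimate:.1f}")  # (15.0) - (6.0) = 9.0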

  12. There is no need to always go for a longitudinal design with a control group. But there are many stronger designs than one-group, post-test only…

  13. Proposals for a better future

  14. Steps for a better future 1 • Clear and explicit definition of the programme's/operation's goals that are attributable to it, preferably in terms of people's well-being. • Be aware of the difference between contribution and attribution.

  15. Illustrative results chain (inspired by Jim Rugh). Area of control / attribution: Kindergartens built → Kindergarten is affordable (costs, transport) for parents → Parents want to send children in → Kindergartens used → Parents want to work more → Companies employ parents. Area of decreasing influence / contribution: Parents' work intensity increased → Companies want to share kindergarten costs with parents → Prosperity of companies → Higher work intensity sustained → Increased employment rate.

  16. Steps for a better future 2 • Go for a better design than „one-group post-test only“. • The evaluation design should be known before the operation starts, including a draft of the evaluation questions. This requires the monitoring system to be adjusted to evaluation needs and to be able to collect baseline data (a minimal sketch of such a baseline record follows below). • Current practice makes it obvious that large parts of evaluation budgets are spent on re-doing insufficient or poorly planned monitoring, which costs both money and time.
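As noted above, a minimal sketch of what a monitoring record adjusted to evaluation needs could capture at enrolment; all field names here are hypothetical illustrations, not an existing ESF monitoring schema.

from dataclasses import dataclass, field
from datetime import date

@dataclass
class BaselineRecord:
    """Hypothetical participant record captured before the intervention, so that
    pre+post and comparison-group designs remain possible later."""
    participant_id: str
    enrolment_date: date
    group: str                      # "participant" or "comparison"
    baseline_outcome: float         # e.g. employment status or income before support
    background: dict = field(default_factory=dict)  # characteristics for matching

record = BaselineRecord(
    participant_id="anonymised-0001",
    enrolment_date=date(2014, 9, 1),
    group="participant",
    baseline_outcome=0.0,
    background={"age_band": "25-34", "region": "example", "education": "secondary"},
)
print(record.group, record.baseline_outcome)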

  17. Steps for a better future 3 • Evaluation should be regarded as one of the main instruments for allowing necessary, responsible flexibility in OP execution / planning. • Monitoring and a system of indicator-based targets answer the question „are we doing things right?“. • Evaluation should periodically check whether „we are doing the right things“: do the originally expressed assumptions still hold? Are the programme goals still relevant? Are the targets/indicators free of perverse incentives? What change is caused by us?

  18. Steps for a better future 4 • Actively build the evaluative culture: at the level of external consultants, at the level of programmes' internal evaluation specialists, and at the level of programme management as well.

  19. Thank you for your attention
