
MINISTERUL FINANŢELOR PUBLICE
Autoritatea de Management pentru Cadrul de Sprijin Comunitar
Unitatea Centrală de Evaluare
Evaluation Working Group – second training seminar for evaluation staff of the 2007-2013 Romanian NSRF and Operational Programmes


Presentation Transcript


  1. MINISTERUL FINANŢELOR PUBLICE
  Autoritatea de Management pentru Cadrul de Sprijin Comunitar
  Unitatea Centrală de Evaluare
  Evaluation Working Group – second training seminar for evaluation staff of 2007-2013 Romanian NSRF and Operational Programmes
  Evaluation from the External Evaluator’s Perspective
  Dr Jim Fitzpatrick, Fitzpatrick Associates Economic Consultants, Ireland
  May 18, 2006

  2. Content • Evaluation as a “process” • Tasks in a typical evaluation process • Methodologies in practice • Common problems in practice • Procurement of evaluation - typical stages… • Some “tips” for evaluation commissioners • Pitfalls the evaluator faces • Some tips for evaluators • Wider issues for the future

  3. Some overall considerations from the external consultant’s perspective… • A wide variety of different contexts (e.g. doing v supervising, policy v service delivery, ex ante v ex post, technical v non-technical) • “Planning” and “doing” closely related • Experience across a wide range of organisations, topics, etc • Overlaps with planning other types of assignments • An external consultancy perspective

  4. Evaluation is a process, not just a technique! • Evaluation is a balancing act between… • client, user relations • research, analysis • managing the team • stakeholder involvement • time, resources, budget

  5. Tasks in a typical evaluation process…
  1. Establish/Understand Context
     • who is the “client”?
     • why is the evaluation being done?
     • any specific use intended?
     • what kind of evaluation is needed?
  2. Obtain/Prepare/Agree Brief (ToR)
     • is there one?
     • is it clear?
     • write one?
     • is it agreed?

  6. Evaluation tasks (continued)…
  3. Prepare Work Plan (Proposal)
     • overall approach (i.e. how the brief is interpreted, how the work will be tackled)
     • analytical framework (i.e. overall logic)
     • methodology/techniques (e.g. CBA, CEA, MCA, benchmarking)
     • work programme (i.e. the data and data collection, e.g. surveys, interviews*)
  4. Evaluation Team/Resources (budget)
     • no. of people/person days
     • types of people
     • necessary expertise (e.g. on technical aspects)
  *data not just statistics; includes other information
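To make the acronyms above concrete, here is a minimal, purely illustrative sketch of the arithmetic at the heart of a cost-benefit analysis (CBA): discounting a stream of costs and benefits to present values and comparing them. The figures, the 5% discount rate and the ten-year horizon are invented for the example and are not taken from the seminar material.

```python
# Hypothetical cost-benefit appraisal: all numbers below are illustrative assumptions.

def npv(flows, rate):
    """Net present value of yearly flows, with year 0 first."""
    return sum(f / (1 + rate) ** t for t, f in enumerate(flows))

costs    = [1000] + [0] * 10    # assumed capital cost in year 0 only
benefits = [0] + [150] * 10     # assumed benefits of 150 per year for 10 years

rate = 0.05                     # assumed discount rate

pv_costs = npv(costs, rate)
pv_benefits = npv(benefits, rate)

print(f"PV of costs        : {pv_costs:8.0f}")
print(f"PV of benefits     : {pv_benefits:8.0f}")
print(f"Net present value  : {pv_benefits - pv_costs:8.0f}")
print(f"Benefit-cost ratio : {pv_benefits / pv_costs:8.2f}")
```

In practice the work programme would also record where each cost and benefit estimate comes from (surveys, monitoring data, interviews), which is why the data-collection bullet sits alongside the choice of technique.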

  7. Evaluation tasks continued…
  5. Doing the Evaluation
     • implement method/work programme
     • client relations
     • manage team
     • deal with unexpected issues
  6. The Output/Report/Schedule
     • meet how often, how many, when?
     • nature of report, e.g. length? style? nature?
     • presentations?

  8. Evaluation methodology in practice…
  • trying to establish if the intervention did (or will) make a difference, so a “with-without” comparison (the scientific method at its core)
  • formal quantitative techniques very desirable, but very different
  • MCA/scoring, weighting and ranking most used
  • other useful approaches:
     • before v after (time-series)
     • places that do v don’t have the intervention (“control group”)
     • “expert” opinion
     • views of stakeholders
  • always need some framework for answering the evaluation questions (samples available)
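Because scoring, weighting and ranking (MCA) is flagged above as the approach most used in practice, the sketch below shows that step in its simplest form. The criteria, weights and 1-5 scores are hypothetical placeholders; in a real evaluation they would come from the analytical framework and from stakeholder or expert judgement.

```python
# Hypothetical multi-criteria analysis (MCA): score, weight and rank three options.
# Criteria, weights and scores are invented for illustration only.

criteria_weights = {"effectiveness": 0.4, "efficiency": 0.3, "sustainability": 0.3}

# Scores on a 1-5 scale per option and criterion (assumed values).
option_scores = {
    "Measure A": {"effectiveness": 4, "efficiency": 3, "sustainability": 2},
    "Measure B": {"effectiveness": 3, "efficiency": 4, "sustainability": 4},
    "Measure C": {"effectiveness": 2, "efficiency": 5, "sustainability": 3},
}

def weighted_score(scores, weights):
    """Weighted sum of an option's criterion scores."""
    return sum(weights[c] * scores[c] for c in weights)

ranking = sorted(option_scores,
                 key=lambda name: weighted_score(option_scores[name], criteria_weights),
                 reverse=True)

for rank, name in enumerate(ranking, start=1):
    total = weighted_score(option_scores[name], criteria_weights)
    print(f"{rank}. {name}: {total:.2f}")
```

The framework point in the last bullet applies here as much as to the “with-without” comparison: the resulting ranking only answers the evaluation questions if the criteria and weights were agreed with the client and stakeholders beforehand.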

  9. Common problems in practice… • poor initial project/programme design • inability to control for external influences • poor/unavailable indicators (too few, too many, not really capturing essence of intervention) • lack of consensus about purpose of evaluation • “scope creep”

  10. Procurement of evaluation - typical stages… • Policy issue or topic, regulatory requirement • Terms of Reference, brief • Invitations, tendering • Selection, contracting • Inception • Managing, undertaking, analysing • Reporting

  11. Common Challenges, Good Practice

  12. Common Needs, Challenges and Good Practice (Cont’d)

  13. Some comments on the procurement process… • You need to balance competition with the need for dialogue with evaluators • Can you invite too many bidders? • Need for guidance on scale • Circulation of all replies to all bidders! • Ability, availability of the client’s representatives

  14. Some practical tips for evaluation commissioners…
  • ensure programme/project planning is good (monitoring and evaluation considered at the outset)
  • make sure the Terms of Reference have:
     • clarity
     • focus
     • an indication of scale
  • relationships:
     • be open post-selection
     • avoid surprises
  • take time to get a shared understanding of what’s happening
  • ensure there is some kind of method/framework being used
  • performance indicators – use them “sensibly”, and note they are the fuel of monitoring/evaluation, not the evaluation itself

  15. Pitfalls the evaluator faces… • Misunderstand context • Objectives unclear, not agreed • Client unclear, not agreed • Lack of balance, being one-dimensional • Thinking you know the answer • Work that’s not used in the end • Having no analytical framework • Being over-ambitious • Not having the right expertise • Failing to consult stakeholders • Not allowing time for project/process/management • No “intellectual leadership” • Report not doing work justice

  16. Some practical tips for the evaluator • watch for “scope-creep” • keep re-reading the brief • estimate the time needed and double it! • avoid surprising the client • don’t over-promise • structure the report early on • set internal deadlines
  SATISFACTION = PERCEPTIONS MINUS EXPECTATIONS (S = P - E)

  17. Wider Issues for the Future • extent of evidence-informed, evaluation culture • the need for research, evaluation to “speak” to policy makers • need for more basic, neutral data collection • balance between “independence” and “relevance” • emphasis on costs of research/evaluation v costs of poor policy decision • more inter-disciplinary research, evaluation (e.g. “economic” v “social”) • over-evaluation of some areas, under-evaluation of others
