
Project Evaluation under the 2007-2013 Convergence and Competitiveness Programmes






Presentation Transcript


  1. Project Evaluation under the 2007-2013 Convergence and Competitiveness Programmes Richard Gaunt Mark Beynon CRG Research Ltd.

  2. Overview • About CRG • WEFO Requirements • Experiences and Perceptions of Evaluation • Case Studies • Evaluation – Principles and Practice • Do’s and Don’ts • Q & A

  3. About CRG • Leading provider of research, evaluation, consultancy and policy development services • Completed over 400 projects for a wide range of clients in both the public and private sectors • 15 dedicated researchers from a range of disciplines, capable of utilising a full range of qualitative and quantitative research techniques • High level of expertise in different areas of Welsh policy and funding, including EU policy and Structural Fund Programmes • Excellent data collection, entry and analysis services • Full bilingual (English/Welsh) service provision • ISO 9001 accredited procedures and quality assurance.

  4. WEFO Requirements • 2007-2013 Programmes – all sponsors required to undertake or commission evaluation of projects • All projects awarded £2 million or more in grant (ERDF/ESF) must be evaluated by an external independent contractor (plus projects implementing ERDF innovative or experimental actions, and innovative projects under Article 7 ESF) • Projects below this threshold may still wish to appoint an external contractor because of the independent perspective this can provide • The size of the evaluation should be proportionate to the size and complexity of the project • Costs associated with undertaking evaluation are eligible for Structural Fund assistance

  5. Rationale Why Evaluate? • Assess whether projects achieve their objectives • How efficiently were outputs and results achieved? • Wider consideration of outcomes and impacts • What would have happened without the intervention? Evaluation is wider in scope than monitoring • Whilst monitoring data can help assess performance against objectives, it won’t produce a rounded judgement of success • Evaluation considers the quality of achievements and the contextual factors which have helped/hindered success • Example: monitoring data will tell you 100 jobs were created, but evaluation will tell you the quality of those jobs and the impact on beneficiaries, and considers whether a proportion of the jobs would have been created in any case • Share good practice, make recommendations for future action.
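The deadweight point in the jobs example above can be made concrete with a short, purely illustrative calculation. The function name and all figures here are invented for this sketch; they are not from the presentation or from WEFO guidance.

```python
# Illustrative sketch only: estimating the net (additional) impact of an
# intervention after allowing for deadweight, i.e. outcomes that would
# have occurred without the intervention. All figures are hypothetical.

def net_additional_jobs(gross_jobs, deadweight_rate):
    """Jobs attributable to the project once deadweight is removed."""
    return gross_jobs * (1 - deadweight_rate)

# Monitoring data reports 100 gross jobs; suppose evaluation evidence
# (e.g. a beneficiary survey) suggests 30% would have been created anyway.
print(net_additional_jobs(100, 0.30))  # 70.0 net additional jobs
```

This is the kind of rounded judgement monitoring data alone cannot supply: the headline output stays at 100 jobs, but the evaluated impact is smaller.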

  6. Evaluation Planning Setting Aims and Objectives • Evaluation Plan to be submitted when developing your project • Outline evaluation activity during the life of your project • State when evaluation will be undertaken, by whom, and which evaluation methods will be used • Plan will identify key evaluation questions, and the scope and size of the evaluation • Size of evaluation proportionate to size and risk of project – agreed with the WEFO Project Development Officer at development stage • Guidance on evaluation questions and suggested methodologies available on the WEFO website • This is effectively an overview document – it won’t necessarily contain all the detail you need to design and implement the evaluation and ensure it is integrated into your project

  7. Who should evaluate? • Size of project: > £2 million = external, < £2 million = option to run it internally • If your project falls below the threshold you may still wish to go external. Issues to consider include: • Do you have the skills internally? • Do you have enough staff resource to carry out the work? • Independent perspective • External evaluation cost • Contract management time • The other factor to consider is risk. High-risk/high-profile projects will require a thorough evaluation which may well be resource intensive. Projects which are innovative or pilot in nature, or where there are learning or participatory elements, will need more resource-intensive evaluations. • Key Factors: Cost, Complexity, Resources, External Perspective

  8. Objectives of Project Evaluation • Demonstrating the need for your project - feasibility • Evaluating Project Processes • Evaluating Project Outcomes • Reporting against Impact Indicators

  9. Issues to Consider • Formative vs. Summative – an important decision that will shape the research methodology and impact upon the way the project makes use of evaluation data. The big advantage of formative evaluation is that it provides an ongoing assessment of progress against objectives, and allows opportunities/corrective actions to be identified at every stage. • Methods for collecting data – covers primary and secondary data; methods should be appropriate to answer the key research questions. Selection of method has resource implications. Weigh up the pros and cons of different approaches • Methods for analysing data – should be considered when deciding the methods, size and scope of your evaluation. The method of data collection will have a clear impact on the method of data analysis. The two main methods are: • Quantitative – statistics used to describe the characteristics of a sample and, if there is sufficient data, to derive conclusions about the performance of the project • Qualitative – patterns and themes emerging from interview transcripts, data or recordings can be categorised to provide an analytical description of a sample and the key themes/issues emerging from a project • Reporting & Dissemination – how do you intend to use evaluation data? Is it just to satisfy WEFO requirements, or will it actively be used as part of the project management process?

  10. Research Specification The Specification will set the parameters for the evaluation. It will need to include: • Introduction • Background/ Requirements • Method/Budget – 2 options: detail method or detail budget • Reporting/ Deliverables • Dissemination • Timetable • Data Protection Issues • Contract Award Criteria • Contract Management Arrangements • Cost • Internal/External?

  11. Commissioning Evaluation If external evaluators are to be used, you need to commission effectively. Issues to consider: • Detailed Research Specification and Invitation to Tender • Evaluation should meet project evaluation needs and WEFO requirements • Need to comply with your own procurement guidelines • WEFO will advise on approach, method and budget • PDOs don’t appear to be paying much attention to evaluation pre-approval; there is uncertainty over the degree to which WEFO will involve itself in the commissioning process at later stages • Evaluation costs are eligible for Structural Fund support, so the expenditure will be subject to approval, audit, monitoring, etc. • Possibility of Supplier Frameworks being used to provide details of approved ‘reputable’ companies • Decide on your approach early and work with WEFO to ensure they are aware of, and happy with, the way the project is being evaluated.

  12. The Evaluation Experience • Evaluation can be a powerful project management and review tool. It can measure progress against targets, identify what works well/less well, and provides the opportunity for corrective action to minimise risks and take advantage of unforeseen opportunities. • It also identifies best practice, and explores outputs, outcomes and impacts. Recommendations can inform both existing and future delivery. • If badly designed, poorly resourced, or ineffectively delivered, evaluations can be a burden to project managers, partners and beneficiaries. • Keen to hear your experiences – any good news/horror stories? Any concerns about evaluation?

  13. Case Study 1 Research Skills Training • Objective 1 ESF funded collaborative project between Bangor, Aberystwyth and Swansea Universities • Funded 67 PhD Studentships and ran collaborative research projects with regional SMEs • “to provide research training to enable individuals to develop the skills to contribute to research as professionals. The project will also increase the research capacity of SMEs, encourage them to undertake research and recruit researchers. It will also support the development of key technology clusters in the region”.

  14. Case Study 1 Research Skills Training • Summative Evaluation • Research Methodology: • Desk research • Face to Face Interviews with academic supervisors • Case Studies and Interviews with company supervisors • Interviews with key management and administrative staff • Focus Groups and telephone interviews with beneficiary PhD Students • Regular updates, draft final and final reports to client

  15. Case Study 1 Research Skills Training Outcomes for the client: • Thorough evaluation of the project model, internal processes, outputs and outcomes for all parties – University, companies and most importantly the students • Highlighted successes, and explored why some collaborations failed to achieve anticipated outcomes – unintended outcomes • Identified a number of learning points for future collaborative projects – client able to refine the model for new funding bid • Final Report completed in time for client to use as part of discussions with funder for future phases – a powerful tool to evidence success and demonstrate commitment to continual improvement • Proactive approach to evaluation

  16. Case Study 2 DCELLS Skills in the Workplace • Skills in the Workplace, launched in June 2005, was a Welsh Assembly Government initiative designed to “raise skill levels of employees and create an ethos of training within SMEs in North Wales”. • The project attracted in excess of £4 million of European Social Fund (ESF) support, and ran until July 2008. • FE Colleges and Private Training providers offering ‘bite size’ chunks of training to SMEs

  17. Case Study 2 DCELLS Skills in the Workplace • Comprehensive Formative Evaluation, run over 3 stages – Baseline, Mid-point, Final • 4 Thematic Reports – confusion in the marketplace, size of company, sectoral analysis, modes of delivery • Research Methodology: • KI Interviews • Employer Survey • Company Case Studies • Desk Research/MI Analysis • Regular attendance at steering group meetings • Dissemination/Stakeholder events • Comprehensive reporting

  18. Case Study 2 DCELLS Skills in the Workplace Outcomes for the client: • Formative evaluation model, measuring distance travelled and allowing partners to identify issues/opportunities at every stage of delivery – integrated approach • Thematic reports offered in-depth analysis of key themes which impacted upon or resulted from the project • Conclusions and Recommendations used to inform and evidence decisions for future business support models • Evaluation was at the heart of dissemination efforts which celebrated successes and reflected upon lessons learnt

  19. Evaluation Principles

  20. In practice evaluation is seldom “pure” • It reflects real life situations • It’s done for a mix of purposes • Combines a number of types and design options • Measures at a number of levels • Utilises a mixture of techniques (triangulation) • It allows judgements to be made Evaluation is not an exact science

  21. Why Evaluate? • TO LEARN • Demonstrate/plan/improve utilisation of resources • Provide evidence to funders • Develop provision • Inform policy development • Accountability • Because we have to?

  22. When do you evaluate? • What do we really want to know? • Long-term impact or short-term outcomes • Areas for improvement • ‘Next time’ • Policy review Taken from: The Green Book: Appraisal and Evaluation in Central Government. Treasury Guidance

  23. Evaluation Options • Ex-Ante (feasibility, what if) • Formative (action research/ could it be better/different?) • Summative (what we did/achieved) • Process (how was it implemented) • Output (what was delivered) • Outcome or Impact (what difference it made) • Goal free (unintended consequences) • Economic Appraisal • (Builds on, covers more than monitoring)

  24. Design Options • Desk Research (what’s already known) • Data Reanalysis (what’s available already?) • Cross-sectional (cost-effective sampling) • Comparative (between groups/contexts) • Longitudinal (over time) • Some/ all of the above

  25. Unit of Analysis • Small/ larger areas • Stakeholder group • Individual/cohort (farmer or group of farmers, offenders) • Sector (farming, all farmers, prisons or prisoners) • The economy (all economic outputs)

  26. Qualitative Techniques Semi-structured data collection • Focus groups, telephone and face-to-face interviews • Observations • Workshops • Diaries • Data collection – topic guides vs. questionnaires • Analysis • Thematic • Content Analysis • Ethnographic

  27. Quantitative Techniques Statistical analysis • Surveys – CAPI, telephone, face-to-face, postal, internet/email • A set of clear questions to pose • Hard data • Census – fact based • Statistical returns/ MIS • Trends/statistical reanalysis • Enhance primary data
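As a purely illustrative sketch of the descriptive-statistics step mentioned above, Python's standard `statistics` module is enough to characterise a small sample. The survey question and all figures below are invented for this example, not taken from the presentation.

```python
# Illustrative only: summarising hypothetical employer-survey responses
# (e.g. "How many staff received training?") to describe the sample.
import statistics

staff_trained = [4, 7, 2, 9, 5, 5, 8, 3, 6, 5]  # invented survey sample

mean = statistics.mean(staff_trained)      # central tendency
median = statistics.median(staff_trained)  # robust to outliers
stdev = statistics.stdev(staff_trained)    # spread across respondents

print(f"n={len(staff_trained)}, mean={mean}, median={median}, stdev={stdev:.2f}")
```

With a sufficient sample, the same summary measures can be broken down by company size or sector, as in the thematic analyses described in Case Study 2.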

  28. Data Collection (1) Face-to-face interviews: • Advantages: good for open-ended issues; good (80%) response rates • Disadvantages: difficult to analyse; resource heavy (for researcher & interviewee) Group interviews: • Advantages: excellent for open-ended issues; may be developmental in themselves (staff) • Disadvantages: difficult to organise and analyse; need trained interviewers; useless for “competitors” Focus groups/community panels: • Advantages: good for exploring issues; possible to standardise; possible to preserve anonymity • Disadvantages: needs trained facilitators; difficult to analyse accurately; may be difficult to balance the group

  29. Data Collection (2) Telephone interviews: • Advantages: good if the topics can be defined; quick (if you can get through) • Disadvantages: poor for qualitative data; resistance to SUGOM Postal questionnaires: • Advantages: easy to administer; ensures privacy; good for standardised data • Disadvantages: need to define questions very tightly; low response rate; unable to control who fills in the form

  30. Reporting • To whom? • For what purpose? • Brief report to a meeting? • Feedback to participants/clients/more widely? • Part of an on-going process? • Need for standardisation? • Multiple audiences • Summary • Main report • Extended analysis • Feedback to respondents

  31. Evaluation Do’s & Don’ts

  32. Do… • Establish key research questions from the outset – what do you really want to know? • Collect good baseline data • A clear specification will ensure better quality tenders • Consider the pros and cons of internal/external evaluation • How do you want the evaluation to work – is it to satisfy the funder, or do you want it to work as a project management tool? • Make evaluation work for you • Make sure you set aside adequate time and resources to implement and manage the evaluation • Involve stakeholders in the evaluation • Disseminate and act on findings

  33. Don’t… • Think about evaluation too late – it is unlikely to be effective • Send out an ITT with ill-defined ideas and requirements – difficult for tenderers to respond to effectively; difficult to compare tenders • Leave evaluators working in isolation – manage and learn from the process • Cast the net too wide when commissioning evaluators – wastes time and resources for all • Lose sight of why you are evaluating • Treat evaluation as a bolt-on • Ignore findings – act on them

  34. Thanks for your time Happy to receive comments or queries Mark Beynon CRG Research 25 Cathedral Road Cardiff CF11 9TZ 029 2022 3218 mark@crgresearch.co.uk
