
Monitoring and Evaluating in the PHEA Educational Technology Initiative


Presentation Transcript


  1. Monitoring and Evaluating in the PHEA Educational Technology Initiative Patrick Spaven

  2. The Project Cycle (cycle diagram) • Plan projects – and plan to monitor and evaluate them • Carry out the projects • Monitor • Formative evaluation • Summative evaluation • Also shown: the ET Strategy and outside stakeholders

  3. Levels • Your projects • Your programme • The whole Initiative

  4. Monitoring and evaluating your projects

  5. Monitoring • The capture of data about the project regularly or continuously, usually in a consistent way. • Monitoring data can be expressed in numbers or concise narrative.

  6. Evaluation The all-round assessment of the performance of a project or programme Uses data from monitoring plus those captured during the evaluation itself, e.g. qualitative interviews with key stakeholders

  7. Uses of monitoring Provides up-to-date feedback on performance of the project. Are we on track with: Inputs (use of resources)? Activities? Outputs? Short-term outcomes? Contributes data to evaluations.

  8. Uses of evaluation Improvement – Are we doing things right? Wider learning – What difference did we make? Did we do the right things? Accountability – Did we do what stakeholders expected? Advocacy – Look what we can do!

  9. M&E • In human development, where results are often unpredictable, monitoring and evaluation are tending to converge. • Monitoring should look beyond planned results. • Evaluation should be a regular, timely process.

  10. Planning your M&E – the important questions • Why? For whom? • What? • How? When? Where?

  11. Planning your M&E – why, for whom and what • Be clear why you are doing it – what use you and others will make of it. • Who are the main stakeholders? • What do they need to know? • In this light, and bearing in mind the resources available, decide what you should M&E and on what scale. • Don’t plan to M&E anything that isn’t important to know about!

  12. WHAT to M&E Outcomes Outputs Activities Inputs

  13. WHAT to M&E • Outcomes = the changes the intervention helped to bring about (better educated students, more empowered academic staff) – both planned and unplanned • Outputs = the immediate, planned results of the intervention (technicians and academic staff trained, courses re-engineered and launched on the LMS)

  14. WHAT to M&E • Activities = the things you did to ensure you delivered the outputs (identify and contract trainers; research the software options and procure) • Inputs = the resources you used in your activities

  15. Kirkpatrick’s 4 levels for evaluating training • Reactive: how they felt about the training • Cognitive/affective: new knowledge, skills, ideas, attitudes • Behavioural: doing new things, doing things differently • Organisational/multiplier: effect on the organisation; diffusing benefits to others

  16. Planning M&E - indicators • Decide whether it would be helpful to develop indicators for these elements (inputs, activities, outputs, outcomes). • [The ETS LogFrame requires indicators for outputs] • Indicators are pre-defined, precise pointers that help you assess performance against the background of the planned inputs, activities, outputs and outcomes.

  17. Planning M&E - indicators Examples: • Proportion of activities in the project plan completed on time • Proportion of trainees who report that the training either met or exceeded their expectations • Numbers of students using the new LMS applications • Date when ETS was formally adopted by the University XYZ Board

  18. Planning M&E – indicators/targets • Indicators can be neutral as in the previous slide. • Or they can be expressed as targets if you are confident the targets are appropriate. • 2000 students using the new LMS applications within 3 months of their launch • ETS formally adopted by the XYZ Board by October 2010

  19. Planning M&E - indicators • Be careful not to let indicators narrow your perspective. • Outcomes in particular are often difficult to predict in advance. Look for unplanned effects of your interventions.

  20. A few words about baselines and attribution • You have a qualitative picture of where your institution stands in ET4TL on the verge of ETI Part B. • But if you want to assess with precision the change that your intervention has promoted in a specific area – e.g. a change in attitudes or usage of ICT among a particular group - you may need to capture data on the baseline. • This research must obviously be built into your project plan and implemented before the project gets moving.

  21. A few more words about baselines and attribution Trying to measure some sorts of change and attribute them to your intervention can be difficult. The before and after groups may not be comparable – e.g. different student cohorts – and change may be influenced by extraneous factors.

  22. Planning M&E - data • Work out HOW you are going to capture the data on inputs, activities, outputs and outcomes – including baseline data. • Bear in mind the cost, time and access issues in capturing the data. Don’t try to do too much. Use samples if appropriate. • Work out how you are going to process and analyse the data.

  23. Project Logical Framework – optional except for outputs

  24. Questions to answer in a summative evaluation Effectiveness (have we fulfilled the project objectives?) Efficiency (have the resources – including time – been used optimally?)

  25. Questions to answer in a summative evaluation Impact (what difference have we made – intentionally or unintentionally?) Sustainability (are the positive changes likely to last?) Relevance of project (were we doing the right things?)

  26. The Project Cycle (cycle diagram) • Plan projects – and plan to monitor and evaluate them • Carry out the projects • Monitor • Formative evaluation • Summative evaluation • Also shown: the ET Strategy and outside stakeholders

  27. Reporting on your projects • The MOA requires six-monthly reporting on projects: • Progress towards results and indicators • A summary of problems and challenges experienced • Revised activity schedules for each project • A budget variance report.

  28. Reviewing projects • You will want to meet more frequently than that to review your projects. You will want to keep a note of what you conclude to feed into the six-monthly MOA reporting. • Your regular review of projects will inform a six-monthly programme-wide self-assessment.

  29. Reviewing your projects We suggest you ask these standard questions, among others, at your project review: • Have the activities and products been completed according to plan - in terms of both timing and quality? If not, why not? What have you done to address completion and quality challenges? • What benefits is the project producing – for people and the institution as a whole? • Are there any negative effects, if so what are they? What is being done to mitigate any negative effects?

  30. Monitoring and evaluating your programme

  31. The self-assessment process • Meeting of ETI core team plus project leaders every 6 months • 4-6 hours • Report in full, with 1-2 page summary for MOA requirement (MOA also requires six-monthly reporting on projects, as we have seen).

  32. Self-assessment at programme level How is the overall ETI progressing? • What are the main changes/outcomes that have taken place for people and the institution as a whole – both positive and negative - as a result of the ETI? • Are there any outcomes/changes that you were expecting by now, but which have not taken place? If so, why do you think they haven’t taken place? • How useful has your ET Strategy been in this period? What specifically has it helped with?

  33. Self-assessment at programme level How is the overall ETI progressing? • What aspects of your team’s work have been most constructive and productive? • Are there aspects of your team’s work that have not worked well? If so, what are the probable reasons? • In what ways has the wider institution supported progress in the ETI? • Are there aspects of the wider institution that have hindered progress?

  34. Self-assessment at programme level How is the overall ETI progressing? • What has been helpful to you in the work of the SAIDE-CET team and the external evaluator? • What has not been helpful? • What other support could they have given you that would have been helpful?

  35. ETS Logical Framework