
MONITORING AND EVALUATION – OPPORTUNITIES AND CHALLENGES


Presentation Transcript


  1. MONITORING AND EVALUATION – OPPORTUNITIES AND CHALLENGES Presentation 25 July 2005 TMIG - Treasury

  2. OUTLINE • Evaluation – means to enhance democracy and effectiveness • M&E in South Africa - PSC M&E System, other systems • Assumptions in performing M&E • Challenges in M&E • Questions

  3. (1) Evaluation – a means to enhance democracy and effectiveness • The recent global emphasis on evaluation – in particular for government – is linked to the notion that government also needs to demonstrate accountability and efficiency. • Government is required to measure and report on its performance – much like bottom-line sectors such as the private sector. • Demystification implies that both the political (elected) and administrative (appointed) spheres show results and are open to being measured externally – similar to boards and management in the private sector.

  4. ….impetus for evaluation demand • The changing political context has created evaluation “demand” – M&E depends on a conducive context (political, civic and administrative) that is accountable and receptive to measurement • Growth in M&E accelerated from 1999, evident in the 500-delegate 3rd African Evaluation Association Conference held in Cape Town in 2004, on the theme “Evaluation, democracy and development” • African M&E is linked to other continental initiatives such as the APRM, NEPAD and the continent’s more vigorous engagement with international forums.

  5. …primacy of M&E (political-economic engagement) • M&E is viewed as a means through which engagement between the West and Africa takes place, as it provides critical information • It is viewed as a critical profession and management tool for translating vision into practice, and has become mandatory in most governments. • The AfrEA Conference looked at how M&E could promote and measure democracy – even in existing democracies.

  6. ….some M&E questions • Without M&E and concrete data, philosophy and ideals remain untested, taken for granted, unmeasured and unrealised • Critical M&E questions being asked at international forums are: How can M&E promote democracy and, through this, demystify the workings of government so that its impact is felt? • What are the enabling conditions for M&E? Are the avowed intentions of transparency, accountability and efficiency tested in real contexts of debate over results?

  7. (2) M&E in South Africa • Recently adopted; a flurry of M&E activity, some of it not well informed. The key questions relate to: • What should be the outcome of M&E? • Where should the function be located, and how should it be managed? • What processes and methodologies should assist in the formulation of indicators? • How do we know that we are selecting the right indicators – that we are not causing goal displacement by measuring the wrong things correctly, and through these correct answers arriving at wrong conclusions? • How do we move M&E from policing (accountability) to empowerment (learning), and thus shift organisations from defensiveness to receptiveness towards M&E findings?

  8. ….the GWM&E system…. • M&E is a government priority and a key project of the Governance and Administration (G&A) Cluster. • The PSC is mandated to monitor and evaluate the quality of governance • A Government-wide M&E system is being designed, which will draw on existing systems to answer strategic questions at various levels • Operational issues funnel upwards from specialised units

  9. the GWM&E system… • The system aims to provide accurate and reliable information to a range of users • A modular approach is being taken, with specialised elements added on an incremental basis • National Treasury will have data that allows it to assess “value for money”, the DPSA “human resources utilisation”, the PSC “adherence to the 9 constitutional principles”, and the DPLG how provinces and local authorities are meeting their mandates, etc. • System objectives: • Ensure transparency and accountability • Promote service delivery improvement • Ensure compliance with statutory and other requirements • Promote the emergence of a learning culture in the public sector

  10. the GWM&E system • In essence the system will work towards a dashboard that can provide information around Key Performance Indicators • A compendium of indicators is being developed, together with a methodology and protocol for gathering, verifying and reporting on such information • The system will draw in “specialised” inputs from those entities/departments that have responsibility for certain functions, and generate high-level reports – both quantitative and qualitative • It is a high priority of government – noted in the President’s State of the Nation Address in 2004 and 2005. In 2004 targets were given. • The 2005 address was explicitly an M&E report, reporting on the 2004 targets. • The result has been improvements in planning and implementation, with a focus on service delivery and impact

  Methodological issues in designing indicators… • Given that indicators direct practice, it is important that the right indicators are selected. • A process is required that consults and debates widely on the indicators to be selected • A methodology is required to ensure that data is collected accurately, tested, analysed and reported on • The questions that need to be posed are: • How can an abstract concept like Human Rights be quantified? • What within this concept should be identified for testing? • How does one collect and analyse data to answer the indicator questions? • In terms of Human Rights Indicators (HRI), the selected principles need to be unpacked. • It may require that proxy information is used to assess whether these are realised or not.
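The idea of unpacking an abstract principle into weighted proxy measures can be sketched in a few lines of code. This is an illustration only: the presentation does not specify the compendium's structure, and the indicator names, values and weights below are invented for the example, not drawn from the PSC system.

```python
from dataclasses import dataclass, field

@dataclass
class ProxyMeasure:
    """One observable stand-in for part of an abstract principle."""
    name: str
    value: float   # observed result, normalised to the range 0..1
    weight: float  # relative importance in the composite score

@dataclass
class Indicator:
    """An abstract principle unpacked into proxy measures."""
    principle: str
    proxies: list[ProxyMeasure] = field(default_factory=list)

    def score(self) -> float:
        """Weighted average of proxy values, 0..1; 0.0 if no proxies."""
        total_weight = sum(p.weight for p in self.proxies)
        if total_weight == 0:
            return 0.0
        return sum(p.value * p.weight for p in self.proxies) / total_weight

# Hypothetical example: two invented proxies for a Human Rights indicator.
hr = Indicator("Human Rights", [
    ProxyMeasure("complaints resolved within 90 days", 0.8, 2.0),
    ProxyMeasure("access-to-information requests granted", 0.6, 1.0),
])
print(round(hr.score(), 2))  # (0.8*2.0 + 0.6*1.0) / 3.0 -> 0.73
```

The composite score is deliberately simple; a real compendium would also carry the verification protocol and data source for each proxy, as the slide above notes.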

  11. PSC M&E System… • The Public Service M&E System uses the 9 constitutional principles for testing the quality of governance. These are: • A high standard of professional ethics • Efficient, economic and effective use of resources • Development-oriented public administration • Provision of services in an impartial, fair, equitable and unbiased manner

  12. Participation in policy-making and responsiveness to people’s needs • Accountability • Transparency • Good human resource management and career development practices to maximise human potential • Representativity

  13. A set of questionnaires is administered • Draft reports are provided to departments to comment upon • Reports and scores are finalised • The reports are drawn into the annual State of the Public Service Reports and Consolidated Reports.

  14. Through this system and other evaluations – e.g. Batho Pele, the HOD evaluation process, etc. – comprehensive evaluations are made of departments. • Overall, the question of service delivery is answered.

  15. (3) Assumptions… • That a culture of M&E exists, that people are receptive to being evaluated, and that evaluation findings are used • That the independence of evaluators is respected and there is no pressure to alter findings • That there is genuine engagement with results

  16. (4) Challenges… • A common discourse needs to be developed, which involves a move from a policing to a learning mode • That the different levels of evaluation users engage with each other • That findings are not used punitively

  17. Questions?
