
The Presidency Department of Planning, Monitoring and Evaluation

Results of the 2013 moderated assessments on the quality of management practices in all 155 national and provincial departments.


Presentation Transcript


  1. The Presidency: Department of Planning, Monitoring and Evaluation
  Results of the 2013 moderated assessments on the quality of management practices in all 155 national and provincial departments
  Presentation to the Portfolio Committee on Public Service and Administration and Planning, Monitoring and Evaluation
  17 September 2014

  2. Focus of DPME to date
  M&E of national priorities
  • Plans for the 14 priority outcomes (delivery agreements, MTSF)
  • Monitoring (i.e. tracking) progress against the plans
  • Evaluating to see how to improve programmes, policies and plans (8 evaluations in 2012/13, then 10-15 per year)
  Management performance M&E
  • Assessing the quality of management practices in individual departments and municipalities (MPAT and LGMIM)
  • Moderated self-assessment and continuous improvement
  M&E of front-line service delivery
  • Monitoring the experience of citizens when obtaining services (jointly with provinces), including citizen-based monitoring
  • Presidential Hotline: analysing responses and follow-up
  Government-Wide M&E System
  • Guidelines for M&E across government
  • Data quality
  • Capacity development
  • Programme planning guidelines
  • National Evaluation System
  • Custodian of strategic and annual performance planning

  3. Why assess management practices?
  • A capable and developmental state is a prerequisite for achieving the NDP objectives
  • Weak administration is a recurring theme and is leading to poor service delivery, e.g.:
  • Shortages of ARVs in some provinces
  • Non-payment of suppliers within 30 days
  • MPAT measures whether things are being done right or better
  • Departments must also be assessed against the outcomes and their strategic and annual performance plans to determine whether they are doing the right things

  4. Background
  • DPME, together with the Offices of the Premier and transversal policy departments, has been assessing the quality of management practices since 2011
  • The MPAT tool was developed collaboratively with other transversal policy departments and with the OAG and OPSC
  • MPAT provides a holistic view of management performance and draws on data from other agencies, such as the OAG, as secondary sources
  • MPAT is not simply an external audit of management practices; it includes a strong element of self-assessment, learning and improvement
  • It is based on international good practice, drawing on the experience of Canada, Russia, India, Kenya and New Zealand

  5. Levels of assessment

  6. MPAT measures 31 standards in 4 KPAs, e.g.:

  7. Management Performance Areas
  Governance & Accountability
  • Service Delivery Improvement
  • Management Structures
  • Accountability
  • Ethics
  • Internal Audit
  • Risk Management
  • Delegations
  • Governance of ICT
  • Promotion of Access to Information
  Strategic Management
  • Strategic Planning
  • Annual Performance Planning
  • Monitoring & Evaluation
  Human Resource Management
  • HR Strategy and Planning
  • HR Practices and Administration
  • Management of Performance
  • Employee Relations
  Financial Management
  • Supply Chain Management
  • Expenditure Management

  8. The assessment process
  Self-assessment and validation → External moderation and feedback → Improve and monitor
  • Senior management agree the score
  • Internal Audit certifies the process and checks the evidence
  • HOD sign-off
  • External moderation
  • DPME/OTP feedback to the department
  • Department improvement plan
  • Department monitors
  • Department prepares for the next round
  • Have we improved from the baseline?

  9. 2013 assessment results
  • Improvements are evident when comparing the 2013 results to the 2012 results across most departments; in some areas of management, however, there has not been significant improvement
  • DPME has documented good practices since 2011 to assist departments to improve their management practices

  10. Analysis – Strategic Management KPA
  • The standards in this management area (Annual Performance Plans, and Monitoring and Evaluation) have declined from 2012
  • But the bar was raised in 2013: to reach level 4, departments should achieve at least 80% of their targets and have no findings by AGSA on the relevance, reliability and quality of reports against pre-determined objectives in the APP
  • In the 2013 assessments:
  • 43% of departments scored at level 1 or 2 for the APP standard, meaning that their APPs do not comply with the TR and guidelines and that management does not regularly engage with the quarterly progress reports
  • 43% of departments scored at level 1 or 2 for the M&E standard, meaning that they do not have standardised processes to collect, manage and store data
  • The implication is that departments are struggling to set realistic performance targets and to report achievements against them accurately

  11. Lessons from Case Studies – M&E and APP standards
  Examples of departments which scored at level 4 for both the APP and M&E standards: dti and EC Economic Development, Environmental Affairs and Tourism
  • Documented M&E processes and policies
  • Systems put in place to collect, store and verify data
  • Programme managers understand the importance of M&E for evidence-based decision-making, and not just for compliance

  12. Analysis – Governance and Accountability
  • 70% of departments (79% in 2012) are at level 1 or 2 for the service delivery improvement standard
  • Poor performance in this standard raises questions about the appropriateness of the Service Delivery Planning Framework issued by DPSA in terms of the Public Service Regulations
  • 73% of departments scored at level 1 or 2 for the standard related to the Promotion of Access to Information Act

  13. Lessons from Case Studies – Service Delivery Improvement Standard
  DHA; EC Rural Development
  • National departments still do not comply with DPSA requirements for SDIPs: DHA scored at only level 1 despite intense interventions to improve service delivery for the issuing of IDs and passports
  • EC Rural Development scored a 4 and has good practice with regard to consulting farmers and front-line extension workers on setting service standards

  14. Lessons from Case Studies – Risk and Fraud Management Standard
  DMR; NW Agriculture scored at level 4
  • DMR secured buy-in from top management by allocating a CD as risk champion in each branch; DDGs are assessed on how they manage risk
  • NW Agriculture has located the function in the HoD's office to manage and respond to emerging risks. It also has good practice in communicating issues related to risk and fraud via a staff newsletter

  15. Analysis – Human Resource Management
  • 65% of departments (74% in 2012) scored at level 1 or 2 for the organisational design standard
  • 82% of departments (88% in 2012) scored at level 1 or 2 for the human resource planning standard
  • Only two departments achieved level 3 or above for the standard on diversity management
  • 90% of departments (88% in 2012) were assessed at level 1 or 2 for the standard related to management of disciplinary cases

  16. Lessons from Case Studies – Organisational Design Standard
  DOE; NC Social Development scored at level 4
  • The lesson for DOE is that it takes time to implement the necessary organisational change and that consultation is crucial
  • NC Social Development made effective use of the DPSA Guide and Toolkit on Organisational Design to ensure that the department was positioned to implement its War on Poverty Programme; foetal alcohol syndrome was reduced by 30% in De Aar

  17. Lessons from Case Studies – Recruitment and Retention Standard
  GCIS; NC Roads and Public Works scored at level 4
  • GCIS is the only department to meet its equity targets, and all vacancies are filled within 2 months
  • GCIS also has good practice in conducting exit interviews and analysing why staff leave

  18. Lessons from Case Studies – Management of disciplinary cases
  DMR; KZN Economic Development, Tourism and Environmental Affairs scored at level 4
  • DMR has good practice in creating awareness amongst staff of what constitutes misconduct. Managers are empowered to manage their own DC processes, and cases are resolved within 90 days
  • KZN EDTEA completes all cases within 90 days by collaborating with other departments to assist with the chairing of DC hearings, and it has clear, documented processes in place

  19. Analysis – Financial Management
  • 87% of departments were assessed at level 1 or 2 on the standard related to payment of suppliers (the bar was raised for this standard, making payment of suppliers within 30 days a level 3 requirement)
  • This negatively affects the cash flow and sustainability of small businesses
  • 50% of departments (60% in 2012) were assessed at level 1 or 2 for the standard related to the management of unauthorised, irregular, fruitless and wasteful expenditure

  20. Lessons from Case Studies – Payment of Suppliers Standard
  DOE; NC Social Development scored at level 4
  • DOE secured buy-in from the DG in driving the improvement process, and the issue is regularly monitored through all management structures
  • NC Social Development, through leadership commitment, not only implemented effective decentralised delegations but managed to pay suppliers within 5 days
  • In both cases, failure by staff to comply is managed through disciplinary procedures

  21. Analysis against external criteria
  Statistical analysis of the results by P&DM at Wits, together with data on certain external criteria, indicated that:
  • HR-related standards are particularly important for achieving results in terms of the Auditor-General's indicator of meeting more than 80% of performance targets in the APP
  • Senior Management Service (SMS) stability (the proportion of DGs and DDGs in office for more than three years) correlated frequently with a range of MPAT standards

  22. Conclusions (1)
  • There has been some improvement from 2012 to 2013
  • In 2013, 69/155 departments were assessed as compliant or working smartly in at least half of the standards measured, as opposed to 59 in 2012
  • For national departments as a group, and in 7 of the provinces, the average scores have increased since the 2012 assessment; Free State and Mpumalanga have declined

  23. Conclusions (2)
  • Although there has been improvement in many standards, the following are areas where more than 50% of departments do not meet legal requirements:
  • SDIP; Fraud Prevention
  • HR Planning; Organisational Design; Management of Diversity; SMS PMDS; HoD PMDS; Disciplinary cases
  • Payment of Suppliers; Unauthorised, Wasteful and Fruitless Expenditure
  • National Treasury, DPSA and DoJ need to review regulatory frameworks or provide additional support in areas where the majority of departments do not comply

  24. Conclusions (3)
  • For all standards, there are at least some departments operating at level 4
  • This implies that it is possible for all departments to operate at this level for all the standards
  • DPME, in collaboration with the Wits University School of Governance, has documented and is disseminating case studies of departments operating at level 4, to assist departments to improve
  • Executive Authorities and Accounting Officers should ensure that their departments implement improvement plans to reach level 4 for all standards

  25. Additional Slides

  26. Key Lessons from analysis of data and good practice (diagram: GOOD MANAGEMENT)
