
Dr Mark Orkin and Mr Oliver Seale SAMEA Conference Friday, 30 March 2007




Presentation Transcript


  1. Building Capacity for Effective Government-wide Monitoring and Evaluation. Dr Mark Orkin and Mr Oliver Seale, SAMEA Conference, Friday, 30 March 2007

  2. Presentation Structure
  • Strategic Objectives and Outputs (Mark Orkin)
  • Transformation to the Academy (Mark Orkin)
  • Capacity building for Monitoring and Evaluation (Oliver Seale)
  Objectives
  • To provide an overview of SAMDI's current focus areas and plot its future course.
  • To explore and engage with SAMDI's capacity-building mandate for government-wide monitoring and evaluation.

  3. Strategic Objectives and Outputs: Governance & Administration (G&A) Cluster Priorities
  • Good Governance: Anti-corruption, Gender and Disability, Batho Pele and Public Participation programmes.
  • Capacity of the State: Local Government Strategic Agenda, Skills Assessment and Capacity Building programmes.
  • Macro-organisation of the State: Single Public Service, Integrated Service Delivery and E-Government Service Delivery projects.
  • Transversal Systems: Integration of Planning and the Government-wide Monitoring and Evaluation System.

  4. Strategic Objectives and Outputs: SAMDI 2007/08 to 2009/10
  • Develop and administer a training framework for curricula and materials.
  • Co-ordinate the provision of executive development programmes for the Senior Management Service.
  • Develop and implement a quality management and monitoring system.
  • Capacitate departments to identify their human resource development needs.
  • Establish and maintain partnerships and links with national and international institutes and training providers.
  • Arrange customised training programmes to support foreign policy on the African Union (AU) and the New Partnership for Africa's Development (NEPAD).

  5. Strategic Objectives and Outputs: Training Statistics, 1 Apr. 2006 – 28 Feb. 2007

  6. Strategic Objectives and Outputs: PTDs* delivered in provinces (to end-Jan. 2007)
  [Charts: distribution of government employees per province alongside distribution of PTDs per province; the per-province percentages are not recoverable from this transcript.]
  * PTDs = Person Training Days

  7. Strategic Objectives and Outputs: Focus areas for 2007/08 and beyond
  • Support JIPSA (Joint Initiative on Priority Skills Acquisition) policy formulation and training.
  • Incubate AMDIN (African Management Development Institutes' Network) and the DRC (Democratic Republic of Congo).
  • Contribute to ASGI-SA (Accelerated and Shared Growth Initiative for South Africa) through concentrated public sector human resource development activities and operations.
  • Transformation to the Academy (see detail slides).
  • Capacity building for Monitoring and Evaluation (see detail slides).

  8. Transformation to the Academy: SAMDI's need for a paradigm shift
  • R1 billion p.a. is spent on training in departments, yet 43% of staff in provincial departments reported no training in 2006.
  • International benchmarks suggest at least 5 days of training per annum:
  • For approx. 250,000 middle and junior managers, this requires 1.25 million PTDs p.a.;
  • Allowing for the 60% of training already occurring in departments still leaves 0.5 million PTDs p.a.
  • For induction, staff turnover of 120,000 people p.a. requires another 0.2 million PTDs.
  • Thus, the total demand-driven requirement is 0.7 million PTDs p.a.: nearly 10 times SAMDI's present output!
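The demand arithmetic above can be checked with a short sketch. All inputs are figures quoted on the slide; the 0.2 million induction PTDs imply roughly 1.7 training days per new entrant, which is an inferred figure rather than one stated on the slide.

```python
# Back-of-envelope check of the PTD (Person Training Days) demand figures.
managers = 250_000        # middle and junior managers (slide figure)
days_per_year = 5         # international benchmark: at least 5 training days p.a.
in_dept_share = 0.60      # share of training already occurring in departments

benchmark_ptds = managers * days_per_year              # 1.25 million PTDs p.a.
residual_ptds = benchmark_ptds * (1 - in_dept_share)   # 0.5 million PTDs p.a.

induction_ptds = 200_000  # ~120,000 new entrants p.a. needing ~0.2 million PTDs

total = residual_ptds + induction_ptds                 # 0.7 million PTDs p.a.
print(f"Total demand-driven requirement: {total / 1e6:.1f} million PTDs p.a.")
```

Against SAMDI's then-current output, the slide's "nearly 10 times" claim implies an existing delivery of somewhat under 0.1 million PTDs p.a.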

  9. Transformation to the Academy: Vision and activities
  Three "mantras"
  • From provision to facilitation.
  • From competition to collaboration.
  • From selective coverage to massification.
  First main stream of activity: executive development programmes for the SMS.
  • Entrant, lower and upper SMS: programmes, courses and events.
  • In collaboration with universities and counterparts.
  Second main stream of activity: "massified" management training for junior and middle managers.
  • Training frameworks of curricula and materials, in conjunction with provincial academies and the DPLG;
  • Monitoring and evaluation to regulate providers;
  • The induction programme for new entrants at all levels.

  10. Transformation to the Academy: 2006/07 SAMDI outputs as the basis of the new approach
  Abbreviations: MDT = Management Development Training; ELD = Executive Leadership Development; HRDT = Human Resource Development & Training; SCM = Supply Chain Management; HRMT = Human Resource Management Training; FPMT = Finance & Project Management Training; SDT = Service Delivery Training; ID = Institutional Development.

  11. Transformation to the Academy: ENE (Estimates of National Expenditure) training spend in national departments

  12. Transformation to the Academy: Learning framework of tentative harmonised modules
  [Diagram: a grid mapping management levels (Senior, Middle, Junior, Supervisor) against generic, functional and sectoral competencies, with candidate modules including Induction, Finance, Projects, Supply Chain, Human Resources, Information, Immigration, Pensions, Performance, Culture, People and others.]

  13. Transformation to the Academy: Projects for Internal Task Teams

  14. Transformation to the Academy: Recap and way forward
  • Executive development programmes.
  • Learning framework for massified middle and junior management learning.
  • Curricula and materials development, quality assurance and accreditation.
  • Provider and user relations; M&E of large-scale provision.
  • Provincial infrastructure.
  • Research capacity and networking.
  • Continental support for Management Development Institutes; international relations.
  • Impending restructuring process.

  15. Capacity building for Monitoring & Evaluation: Background
  Aims and objectives
  • The aim of the system is to contribute to improved governance and enhanced effectiveness of public sector institutions.
  • The system aims to collect, collate, analyse and disseminate information on the progress and impact of programmes.
  Result areas
  • Accurate information on progress in the implementation of public sector programmes is updated on an ongoing basis;
  • Information on the outcomes and impact achieved by government is periodically collected and presented;
  • The quality of monitoring and evaluation (M&E) practices in government and public bodies is continuously improved.

  16. Capacity building for Monitoring & Evaluation: Provincial needs analysis, example feedback (A)
  1. Little coherent or articulated strategy in provinces, despite expenditure on expensive systems to collate M&E data.
  • What would a coherent strategy need to contain?
  • What is an articulated strategy? What type of links are we looking for?
  • What systems are there? What do we mean by a system?
  • What data are there? How were they obtained? What is the quality of these data?
  • What collation is taking place, and how? Can a system collate data?
  What are the implications for training?

  17. Capacity building for Monitoring & Evaluation: Provincial needs analysis, example feedback (B)
  2. Monitoring programmes are just about collecting data; very little analysis and feedback is given.
  • What data are being collected, and why and how?
  3. Alignment of plans doesn't exist.
  • What planning does take place, and how?
  • How is M&E incorporated into planning?
  • What do we mean by alignment, and why do we need it?
  • Is alignment always possible and necessary?
  What are the implications for training?

  18. Capacity building for Monitoring & Evaluation: Provincial needs analysis, example feedback (C)
  4. Planning without indicators.
  • What type of indicators do we mean, and how should these be developed?
  • What indicators do exist, and how are they measured?
  • How are they decided on?
  5. Lack of, or poor, baseline data.
  • What baseline data do exist, and do we evaluate them?
  • What type of baseline data are required?
  • How are these presently obtained?
  What are the implications for training?

  19. Capacity building for Monitoring & Evaluation: Conceptual framework for training
  • Existing situation → Description: existing databases; data collection methods; baseline data.
  • Planning: what will be done (strategy); why it will be done (policy); how it will be done (operations); indicators and criteria (how to measure); when (timeframes).
  • New project or programme → Monitoring (system to be used: MIS): indicators; methods; baseline data; inputs; tracking; processes; activities; interventions and modifications; outputs; outcomes.
  • Evaluation (system to be used: EIS): indicators; methods; baseline data; criteria; assessment; process; impact; lessons learned; feedback.

  20. Capacity building for Monitoring & Evaluation: Training principles for the various levels
  • Basic, for general users of information: understanding of the basic principles of M&E.
  • Basic, for project managers: applying the principles to a specific project.
  • Intermediate, for programme managers: applying the principles to a programme.
  • Advanced, for executive managers: applying the principles to overall management in departments.
  • Advanced, for CFOs and DDGs: applying the principles across departments/provinces.
  • Specialist technical training, for M&E staff: actually performing evaluations.

  21. Capacity building for Monitoring & Evaluation: Target audiences
  1. Users
  • Political heads and parliamentarians (incorporated into report-backs to portfolio committees)
  • Accounting officers (DGs)
  • Executive managers and managers in government departments
  • Users of the service, or of the information, outside government
  2. Producers
  • Programme managers
  • Project managers
  • Operations staff
  • Participants
  3. M&E staff in national and provincial departments

  22. Capacity building for Monitoring & Evaluation: Examples of current provision

  23. Capacity building for Monitoring & Evaluation: Strategy and Plan of Action
  Progress report
  • Terms of reference developed for the Task Team.
  • 15 workshops on M&E for programme and project management (340 officials).
  • Initial needs analysis on provincial M&E capacity.
  • Consultation with key internal and external stakeholders.
  Plan of action
  • Research on M&E training needs for SMS, MMS, JMS and practitioners (March '07).
  • Determine current providers of M&E: HEIs, private providers, NGOs etc. (March '07).
  • Undertake a training needs analysis for M&E (May '07).
  • Develop the M&E training programme (Sept. '07); roll out (Nov. '07).

  24. Siyabonga Thank you Rolivhuwa Dankie Nakhensa Re a leboga
