TECHNICAL OVERSIGHT OF ANNUAL REPORTING Portfolio Committee of Public Works



  1. TECHNICAL OVERSIGHT OF ANNUAL REPORTING Portfolio Committee of Public Works 11 March 2008

  2. Session Outcomes • Explaining PEM cycle • Evaluating Measurable Objectives • Critiquing Performance Measures & Targets • Legislative Oversight of Annual Reporting • Alignment between SPP and AR

  3. Public Expenditure Management Cycle

  4. Assessing Performance Information

  5. Overview: Components of the Five-year SPP [diagram]. Elements shown: whole-of-department strategic goals, situation analysis, overall strategic objectives and measurable objectives; programme and subprogramme situation analyses (if required); resource information; background information, analysis of the service delivery environment, and the organisational and institutional environment; Parts A, B and C.

  6. Why Evaluate Performance? • Achieving objectives? • Outputs delivered? • Institutional comparisons • Productivity • Output – outcome?

  7. Defining objectives “Measurable objectives” need to: • Reflect organisational priorities. • Be related to activities and resources. • Adhere to the S.M.A.R.T. principle: Specific, Measurable, Appropriate, Realistic, Time-bound. E, and R?

  8. Three components of a good objective • Primary output that the programme will achieve. • Intended impact that the programme’s output will have on the public or client. • Level of performance: the desired level of service delivery.
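
As a purely illustrative aside, these three components can be thought of as three required fields of an objective. The sketch below (in Python; the class, field names and the decomposition of the FET example are assumptions for illustration, not taken from the department's documents) flags which components a stated objective leaves out:

```python
# Illustrative sketch only: represent an objective as three components and
# report which ones are missing. Field names are hypothetical.
from dataclasses import dataclass, fields

@dataclass
class MeasurableObjective:
    primary_output: str      # what the programme will deliver
    intended_impact: str     # effect on the public or client
    performance_level: str   # desired level of service delivery

def missing_components(obj: MeasurableObjective) -> list:
    """Return the names of components left blank."""
    return [f.name for f in fields(obj) if not getattr(obj, f.name).strip()]

# The FET objective criticised on the following slides, decomposed: it names
# an activity but no impact and no measurable level of performance.
fet = MeasurableObjective(
    primary_output="provide services in terms of the FET Act",
    intended_impact="",
    performance_level="",
)
print(missing_components(fet))  # ['intended_impact', 'performance_level']
```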

  9. Objectives checklist

  10. What’s wrong with this MO? The objective of Further Education and Training (FET) is to provide services in terms of the FET Act.

  11. It only refers to activities. It does not refer to any level of performance – and is therefore not measurable! And, even more importantly, it does not state the impact of this objective on society.

  12. Revised Objective “To provide FET colleges with the required resources necessary to roll out and implement the requirements of the FET Act in KZN.” Further improvements?

  13. Change in focus • Measurable objectives change the focus from activities to outputs and outcomes. • For example: From: Administer polio vaccines to children under 6 years old in certain hospitals. To: Eradicate polio among children under 6 years in certain areas.

  14. MO from Public Works • Programme 3: National Public Works Programme. MO/output: Formalized mentorship programme as a regulated profession • Is there an output? • Is there an outcome?

  15. Exercise • Choose a programme from DoPW and examine the MOs for that programme. 1) Assess whether each contains all the necessary components of an MO. 2) Are all MOs within the control of the department? 3) Are they aligned to the strategic objectives?

  16. Performance Measures & Indicators • Performance measures and indicators are statements that describe the dimension of performance that is to be monitored. • The dimension of performance to be monitored must be the most appropriate and under the control of the component.

  17. Criteria for Performance Measures and Indicators: NT’s “Framework for Managing Programme Performance Information” (2007) • Reliable: accurate for its intended use; responds to changes in the level of performance • Well-defined: a clear, unambiguous definition, applied consistently • Verifiable: the processes and systems that produce the indicator can be validated • Cost-effective: the usefulness of the indicator justifies the cost of collecting the data • Appropriate: the indicator must avoid unintended consequences • Relevant: relates logically and directly to an aspect of the mandate

  18. Dimensions of Performance Measures • Quantity • Cost • Quality • Timeliness

  19. Quantity • Describe outputs in terms of how much or how many. • Require a unit of measurement (e.g. kg, litres, km). • Examples: • number of students passing per year per grade; • number of schools built; • number of earmarked FET colleges invested in; • number of finance management personnel on SCoA, BAS training.

  20. Cost • Should reflect full cost of producing an output • Should include unit cost for each deliverable described under quantity targets • Examples • Cost per unit of materials used; • Average annual operating cost per learner per year; • Cost per ABET targeted individual served; • Total operating expenditure.
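
As a hedged illustration of the unit-cost arithmetic behind measures like “average annual operating cost per learner” (all figures below are invented for the example):

```python
# Hypothetical figures illustrating a unit-cost measure:
# unit cost = total cost of producing the output / quantity delivered.
total_operating_cost = 120_000_000   # rand per year (assumed)
learners_enrolled = 95_000           # assumed enrolment

cost_per_learner = total_operating_cost / learners_enrolled
print(f"Average annual operating cost per learner: R{cost_per_learner:,.2f}")
# -> Average annual operating cost per learner: R1,263.16
```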

  21. Quality • Reflect service standards based on customer needs and contribute to government outcomes. • Product or service should fit intended purpose. • Balance efficiency with effectiveness so that price is not predominant factor. • May address: • Parent relations, quality of schooling. • Examples: • Number of parent complaints filed; • Minimum standards of electronic connectivity at schools; • Minimum set of qualifications attained by teachers.

  22. Timeliness • Provide parameters for how often, or within what time-frame, outputs will be delivered. • Measured by turnaround times, waiting or response times (deliver service yearly/quarterly). • Examples: • Whether the brief and instructions to the Minister have been completed within the deadline; • Proportion of case reviews conducted by due date; • Percentage of responses answered within a given timeline; • Children enrolled yearly.
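
A timeliness measure of this kind reduces to simple proportion arithmetic; the sketch below uses invented figures to show the calculation for “proportion of case reviews conducted by due date”:

```python
# Hypothetical figures for a timeliness measure.
reviews_due = 240        # case reviews due in the reporting period (assumed)
reviews_on_time = 204    # completed by the due date (assumed)

proportion_on_time = reviews_on_time / reviews_due
print(f"{proportion_on_time:.0%} of case reviews conducted by due date")
# -> 85% of case reviews conducted by due date
```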

  23. PMI from DoPW • SP 3.1: Constr. Industry Development Policy & Monitoring • PMI – Implement and Monitor HIV/AIDS Policy • Target – Ongoing • Is the PMI measurable? • Does it reflect one of the four performance dimensions?

  24. Constraints & MOs/PMIs • SP constraints to be addressed by MOs and PMIs • SP 3.1: Constr. Industry Development Programme [P49 of SPP] • Lack of co-ordination of infrastructure depts • Unco-ordinated BEE programmes • Obstacles such as access to credit for emerging enterprises • Lack of appropriate skills in the industries

  25. Exercise – Measures & Constraints • Exercise: Choose an SP. Evaluate whether the SP addresses its constraints by specifying MOs and PMIs for them.

  26. Developing Performance Measures & Indicators Be aware of perverse incentives! E.g. if a performance measure is the number of schools built, property developers might use cheap labour and cheap building materials to push the number up, so that effectiveness and efficiency are in fact compromised. Another example: “number of policies, guidelines, and legislation formulated…”

  27. Perverse Incentives Activity: For a measurable objective in the Department of Public Works, give an example of how focussing on one dimension of performance (e.g. cost, quantity) can undermine other performance dimensions (e.g. quality, equity, etc.). Type: Group. Time: 20 min. Requires: Flipchart sheet and markers.

  28. Developing Performance Targets Key Characteristics of Targets • Defined in precise terms relating to delivery of outputs • Relate to a single performance measure of a particular output • Specify a time frame or milestone • Are measurable i.e. actual numbers and percentages (not terms like increase, decrease or optimal unless quantified) • Linked to baseline achievements

  29. Performance Targets • Set the quantity, quality, cost and timeliness levels for output delivery • Governments will use targets to: • set delivery levels; and • assess departmental performance. • To ensure targets are achieved, need to involve all stakeholders in the process • SMART applies

  30. Examples: Performance targets • Performance target (1) • Less than 10% permanent staff turnover rate • Performance target (2) • 1:34 teacher/learner ratio for KZN.
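
Because both targets are expressed as actual numbers, reported performance can be checked against them mechanically. The sketch below uses assumed actuals (the headcounts and enrolment figures are invented) to show that check:

```python
# Check assumed actuals against the two targets on the slide.
staff_at_start = 400     # permanent staff at start of year (assumed)
leavers = 34             # permanent staff who left during the year (assumed)
turnover = leavers / staff_at_start
print(f"Turnover {turnover:.1%}: target (<10%) "
      f"{'met' if turnover < 0.10 else 'not met'}")    # 8.5%: met

learners = 680_000       # KZN learner enrolment (assumed)
teachers = 20_000        # KZN teacher headcount (assumed)
ratio = learners / teachers
print(f"Learner/teacher ratio 1:{ratio:.0f}: target (1:34) "
      f"{'met' if ratio <= 34 else 'not met'}")        # 1:34: met
```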

  31. Legislative Oversight of AR (1) • S55 (2) of Constitution outlines oversight powers of NA • AR allows parliament to evaluate performance of dept after financial yr • PFMA requires AO to table performance targets for their dept – ENE & SP • Challenge for PC is to get depts to provide good quality perf. info. with tight perf. targets & then get depts to report against these in AR

  32. Oversight Process

  33. Oversight Process (2) • Key question: How did the executive perform in using its budget effectively to deliver services? • Biggest weakness is the poor quality of non-financial performance information

  34. Consideration of AR 1. What is the technical quality of the AR? 2. Does the dept report on each and every performance target specified in the ENE & budget? 3. What is the quality of perf. info. as highlighted by the A-G’s performance audit? 4. Is the dept achieving economy, efficiency & effectiveness? 5. Equity in service delivery? 6. Evaluating mgt’s explanations of why the dept’s performance did not attain the targets set in the SP and budget 7. Investigating circumstances of under- & over-expenditure & their impact on service delivery

  35. Alignment of AR and SPP • PMIs in the SPP should be reported on in the AR • PMIs and planned targets should be consistent • Poor specification of PMIs in the SPP leads to non-reporting in the AR • Requires internal processes to capture non-financial information

  36. Example - DoPW • P3 (NPWP); SP1: Construction Industry Development Programme • Output 1 (ENE) – Construction Industry T. Charter • PMI 1 – Charter gazetted • Target 1 – Dec 2006 • Output 2 – Contractors exit from incubator progr. • PMI 2 – No. of contractors graduated • Target 2 – At least 50 contractors by Dec 2008 • SPP – Page 53 • AR – Page 35

  37. Exercise - DoPW • Examine the PMIs for a Programme in the DoPW budget statement. Cross-check the SPP to see if the PMIs are recorded there. • Turn to the AR and assess whether it reports on these PMIs.
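
The cross-check in this exercise amounts to comparing two lists of PMIs. A minimal sketch, assuming the PMIs have been captured as plain text (the entries below are placeholders, not the department's actual indicators):

```python
# Placeholder PMI lists; in practice these would be captured from the
# SPP/budget statement and the AR respectively.
pmis_in_spp = {
    "Charter gazetted",
    "No. of contractors graduated from incubator programme",
    "HIV/AIDS policy implemented and monitored",
}
pmis_reported_in_ar = {
    "Charter gazetted",
    "No. of contractors graduated from incubator programme",
}

# PMIs planned in the SPP but not reported on in the AR.
not_reported = pmis_in_spp - pmis_reported_in_ar
print("PMIs planned in the SPP but not reported on in the AR:")
for pmi in sorted(not_reported):
    print(" -", pmi)
```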

  38. END Questions?
