
2008 Estimates of National Expenditure (ENE) 29 November 2007



Presentation Transcript


  1. 2008 Estimates of National Expenditure (ENE) 29 November 2007

  2. Content • 2008 Budget Process and MTEF • Provisional allocation letters • Efficiency savings • Critical dates • Important new features (departments and entities) • Performance Information • Science and Technology • ODA • Database

  3. Critical dates

  4. Important Features • Accountability information • Mirrors Appropriation bill for 2008/09

  5. Important features cont. • Programme purposes and objectives • Strategic overview and key policy developments • 1 000 words • Recent achievements 2006/07 and 2007/08 • Key performance indicators table (also entities)

  6. Important features cont. • Service delivery objectives and indicators • replaced by performance indicator table and strategic overview section • Efficiency savings • Transfers to public entities and agencies • Do not duplicate under programme information • Applicable to trading entities • Number formatting style • Full stop instead of comma • Space to separate thousands
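The number formatting style above (full stop as decimal separator, space between thousands) can be sketched in Python. The helper name is illustrative and not part of any Treasury tooling; this simply shows the convention applied mechanically.

```python
def format_ene_number(value: float, decimals: int = 1) -> str:
    """Format a number in ENE style: space to separate thousands,
    full stop (not comma) as the decimal separator."""
    # Format with comma grouping first, then swap commas for spaces.
    formatted = f"{value:,.{decimals}f}"
    return formatted.replace(",", " ")

print(format_ene_number(1234567.8))    # → "1 234 567.8"
print(format_ene_number(50000000, 0))  # → "50 000 000"
```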

  7. Infrastructure • Five categories of infrastructure projects/programmes: • “Mega” projects (cost more than R300 million per year for a minimum of 3 years, or R900 million total project cost), which should be reported as single line items; • “Large” projects (cost between R50 million and R300 million per year within a given MTEF); and • “Small” projects (less than R50 million per year), which should be grouped together and reported as a single item.

  8. Infrastructure categories cont. • Infrastructure transfers to other spheres and entities/agencies – the line item should reflect transfers by the department to entities/agencies and other spheres of government. • Infrastructure/fixed installations transferred to households – the line item should reflect the transfer of funds to households.
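The cost thresholds on the two slides above amount to a simple classification rule. This Python sketch is illustrative only: the function name is an assumption, and the handling of the mega criteria is a simplified reading of the slide (either the per-year/duration test or the total-cost test qualifies a project).

```python
def classify_project(annual_cost_rm: float, duration_years: int,
                     total_cost_rm: float) -> str:
    """Classify an infrastructure project per the ENE cost thresholds
    (all costs in R million). Illustrative sketch, not Treasury code."""
    # "Mega": more than R300m per year for a minimum of 3 years,
    # or more than R900m total project cost; single line item.
    if (annual_cost_rm > 300 and duration_years >= 3) or total_cost_rm > 900:
        return "mega"
    # "Large": between R50m and R300m per year within a given MTEF.
    if 50 <= annual_cost_rm <= 300:
        return "large"
    # "Small": less than R50m per year; grouped and reported as one item.
    return "small"

print(classify_project(350, 4, 1400))  # → "mega"
print(classify_project(120, 2, 240))   # → "large"
```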

  9. Contacts • Provide details of dedicated ENE contact for department and entity • Liaise with NT budget examiners www.treasury.gov.za/publications/guidelines

  10. Performance information in the 2008 ENE

  11. ENE Performance Information • In the 2008 ENE an effort is underway to improve the substance of non-financial performance information, including the meaningfulness of measurable objectives and the usefulness of performance indicators for monitoring • Measurable objectives (MOs): • In some cases, MOs are vague and do not translate the programme purposes into quantifiable goals, reducing clarity concerning “what the money is being used for” • Performance indicators • In general, there are too many indicators, with the relevant indicators being crowded out • Resulted in part from the requirement that each subprogramme must have output indicators • In general, there are not enough numbers and too many words, making it difficult to monitor performance • Resulted from overly flexible formats in which milestones, dates and other qualitative performance information were not reported separately • The indicators are not broad enough • Resulted from the exclusive emphasis on output indicators at the expense of monitoring other important areas such as inputs, activities and outcomes

  12. What’s in the instructions? • Strategies to address these improvements include: • Measurable objectives • New bulleted format for presenting MOs (1-5 per programme) • MOs have been moved to the front of the chapter, following the programme purpose • Requirement that MOs identify the most important programme areas by “providing a snapshot of the main objectives (outputs and outcomes) of a programme based on its major operational elements” • MOs are defined as quantifiable results that can be achieved within a foreseeable period • We are not reinventing MOs, we are just trying to improve them so that they say more about the “how” of achieving programme purposes

  13. What’s in the instructions? • Strategies to address these improvements include (continued): • Key performance indicators • Seven-year timeframe format • Indicators have been moved to the front of the vote so that both cross-cutting and programme-specific indicators may be highlighted • Indicators may relate to inputs, activities, outputs and outcomes, and may also provide contextual or explanatory information • An average of 10 performance indicators is recommended per vote or entity • Two areas to discuss performance: • Quantitative performance indicators in the key performance table (also see guidelines on using percentages) • Qualitative performance information in the strategic overview section

  14. Helpful tips • In general, indicators should relate to ongoing functions but may also be project-specific • The average of 10 indicators per vote/entity means that the number of indicators may range roughly from 7 to 17, depending on need • If an entity does not have a separate section at the back of the chapter, specific indicators may be included in the department’s indicator table with a reference in the indicator title (e.g., number of scientific researchers on staff at xxx entity) • Ensure that only numerical values are used to reflect past, current and projected performance in the seven-year table • Ensure that the indicator is clear and obviously relates to the work of the department (e.g., “number of claims processed” could be clarified as “number of workers’ compensation claims processed per month”) • The key indicators do not have to cover all aspects of the department’s work, just the most important ones from an oversight monitoring standpoint • Key performance indicators in the ENE represent only a small subset of departmental performance information for auditing purposes

  15. Helpful tips • Don’t forget to include the unit of measure in the indicator title • Don’t forget to provide a brief definition of each indicator (i.e., enough detail to give a general understanding of the indicator) • The process of piloting trendable and quantitative output and outcome performance indicators at the subprogramme and programme level with 5 departments continues unaffected by the ENE changes • Review the strategic plan • Review the subprogrammes and the outputs in the 2007 ENE; there are many meaningful and useful indicators that may be carried forward • Review the Presidency’s “Development Indicators Mid-term Review” of 72 national indicators • Review the current MOs in the 2007 ENE; in some cases they can be subdivided into separate objectives, and in other cases they are already written in the form of quantifiable results and can be carried forward • If an indicator is cumulative, use the term “total” in the title (e.g., Total number of foreign missions in Africa – 06/07=45, 07/08=47 and 08/09=50) • If an indicator is non-cumulative, use the term “new”, “additional” or “per year” in the title (e.g., Number of new foreign missions in Africa per year – 06/07=0, 07/08=2 and 08/09=3)
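The cumulative versus non-cumulative distinction in the last two tips can be checked mechanically: per-year “new” values are just the year-on-year differences of a cumulative “total” series. The function name below is illustrative; note that the first year’s additions cannot be recovered from totals alone, so the output starts from the second year.

```python
def new_per_year(totals: list[int]) -> list[int]:
    """Given cumulative year-end totals, return the number of new
    items added in each subsequent year (year-on-year differences)."""
    return [curr - prev for prev, curr in zip(totals, totals[1:])]

# Total number of foreign missions in Africa: 06/07=45, 07/08=47, 08/09=50
print(new_per_year([45, 47, 50]))  # → [2, 3], i.e. 2 new in 07/08, 3 in 08/09
```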

  16. Science and Technology expenditure

  17. Official development assistance expenditure

  18. Completion of ENE database

  19. Questions and Answers
