
Public Expenditure Tracking and Service Delivery Surveys


Presentation Transcript


  1. Public Expenditure Tracking and Service Delivery Surveys. Flagship Course on Governance and Anti-corruption, April 21, 2003. Magnus Lindelow, Development Research Group, The World Bank

  2. The presentation
  • Why new tools for public expenditure analysis?
  • Characteristics of PETS and related approaches
  • The Uganda experience
  • Some conceptual and practical challenges in designing and implementing PETSs
  • Examples of more recent surveys
  • What have we learnt about designing and implementing a PETS?

  3. Why new tools to analyze public spending and service delivery?
  • Evidence shows limited impact of public spending on growth and human development indicators
  • Demand for evidence on efficiency of spending and quality in service delivery
  • Lack of reliable data in many developing countries
  • New approaches to aid delivery
    - Move towards budget support (e.g., PRSC)
    - Focus on poverty-focused strategic framework (PRSP)
    - Related fiduciary and accountability concerns

  4. New challenges…
  • Are budget allocations pro-poor?
  • Are budget outturns consistent with established allocations?
  • Quantitative measurement of corruption
  • Do expenditures result in intended outputs and outcomes?

  5. The ideal situation… [Flow diagram: Policy framework (Government program, PRSP, Sector strategies) → Budget allocation → Outturn (timely disbursements in accordance with established policies and priorities) → Outputs → Outcomes → Impact]

  6. The “typical” situation… [Annotated version of the same flow diagram: Policy framework (Govt. program, PRSP, Sector strategies) → Budget allocation → Timely disbursements in accordance with established policies and priorities → Outputs → Outcomes → Impact, with the stages covered by PUBLIC EXPENDITURE TRACKING and SERVICE DELIVERY SURVEYS marked]
  • Unclear policy framework: lack of clarity about how resource allocation relates to policies and priorities (budget not comprehensive, classification system, political economy)
  • Nontransparent process: poor reporting on execution, high level of aggregation, discretion in allocation
  • Weak management information systems: limited coverage, poor data quality, late and scattered reporting
  • Weak service delivery: accountability, efficiency, quality
  • Outcomes and impact inherently difficult to assess: household surveys, participatory approaches, Social Impact Assessment

  7. Characteristics of PETS
  • Diagnostic or monitoring tool to understand problems in budget execution
    - delays / predictability
    - leakage
    - discretion in allocation of resources
  • Data collected from different levels of government, including service delivery units
  • Reliance on record reviews, but also interviews
  • Variation in design depending on perceived problems, country, and sector

  8. Characteristics of service delivery surveys
  • Perception-based surveys
    - Interviews with households, providers, firms, key informants, focus groups (e.g., score-card approaches)
  • Quantitative surveys (QSDS)
    - Focus on the frontline service-providing unit (e.g., health facilities or schools)
    - Inspired by micro-level household and firm surveys
    - Data on resource flows (financial and in-kind)
    - Availability / adequacy of inputs
    - Service outputs and efficiency
    - Quality
  • Focus on cost analysis, dimensions of performance in service delivery, and comparisons across ownership

  9. Hybrid approaches
  • Link facility surveys with surveys of administrative levels “upstream” (public officials)
    - Why different performance in the same system?
  • Link facility surveys with household surveys
    - Effect of school/facility characteristics on household behavior and outcomes?
  • Mix quantitative and perception-based approaches (e.g., exit polls, staff interviews, focus group discussions)
    - Relationship between perceptions and observable characteristics of schools or facilities?

  10. The Ugandan experience
  • Many improvements since 1992
    - macroeconomic stability and growth
    - shift of public resources from defense to roads and social sectors
    - decentralization
    - Poverty Eradication Action Plan (PEAP)
  • Poverty reduced from 56% in 1992/93 to 35% in 2000
  • Strong budget management
    - MTEF, Poverty Action Fund (PAF)
  • Sector-level performance did not keep up

  11. The PETS 1996
  • Health and education sectors
  • Data collected from different levels of administration, including 250 schools and 100 health facilities
  • Only 13 percent of the intended capitation grant actually reached schools (1991-95)
  • Large schools with wealthier parents and qualified teachers were able to obtain more of their budget allocation
  • Other findings
    - Enrollment trends differed from published data
    - Importance of parental contributions
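  [Note added for clarity, not part of the original slide: using the definition from slide 14, where leakage is assessed by comparing the amount received at the service delivery unit with the amount allocated, the headline finding can be read as

    leakage rate = 1 − (amount received by schools / amount allocated) ≈ 1 − 0.13 = 0.87,

  i.e. roughly 87 percent of the capitation grant did not reach schools over 1991-95.]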

  12. Impact and follow-up
  • Mass information campaign by Ministry of Finance (the press, posters)
    - A signal to local governments
    - Lower the cost of information to parents
  • Follow-up surveys in the education sector
    - Ministry of Education initiative and local implementation
    - Shows a major improvement
  • Follow-up surveys in the health sector
    - Broadening agenda: service delivery

  13. Impact and follow-up (2)

  14. Some conceptual and practical challenges [Diagram contrasting two resource-flow chains, each running MoF → Sector Ministry → Sub-national Level 1 → SDU]
  • UGANDA (Education): the budget allocation flows as a capitation grant down to the SDU. Leakage is assessed by comparing the amount received at SDU level with the amount allocated, but… does leakage equal corruption (embezzlement)?
  • LESS IDEAL CONDITIONS FOR TRACKING: donor contributions alongside MoF funding, and discretionary allocation along the chain (financial resources, material, staff, etc.). Assessment of leakage is more difficult – compare the amount received at the SDU with…?
  • But… it is still possible to assess leakage in a narrow sense, and there are other issues: equity, delays and other problems in budget execution, and service delivery issues
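  To make the comparison behind slide 14 concrete, the sketch below (not part of the presentation) shows how a narrow leakage rate might be computed from tracking data; the per-facility figures are hypothetical placeholders for allocations recorded upstream and receipts recorded at the SDU.

```python
# Minimal sketch of a leakage calculation from PETS-style tracking data.
# The facility records below are hypothetical; a real survey would take
# allocations from central/district records and receipts from SDU records.

records = [
    # (facility, amount allocated, amount received at the SDU)
    ("School A", 10_000, 2_500),
    ("School B", 8_000, 1_000),
    ("School C", 12_000, 9_000),
]

total_allocated = sum(allocated for _, allocated, _ in records)
total_received = sum(received for _, _, received in records)

# Leakage in the narrow sense: the share of the allocation that never
# reached the service delivery unit. By itself this says nothing about
# whether the gap reflects embezzlement, reallocation, or delays.
leakage_rate = 1 - total_received / total_allocated
print(f"Aggregate leakage rate: {leakage_rate:.0%}")

for name, allocated, received in records:
    print(f"{name}: received {received / allocated:.0%} of its allocation")
```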

  15. “Leakage” and data discrepancies in Mozambique
  • Discrepancies in financial records (75% of districts) but no systematic pattern
  • Facility reporting of user fee revenues approx. 70% of expected amounts
  • Discrepancies in drug records
  • Lack of information about HR across levels (provincial admin., district admin., facilities)
  • Difficult to make confident statements about leakage, but clear evidence of lack of control

  16. Equity in district financing for health care (Mozambique)

  17. Delays in budget transfers (Mozambique)

  18. PETS in other countries
  • Tanzania (1999 and 2001)
    - Tracking of pro-poor expenditures in priority sectors at all levels
  • Ghana (2000)
    - Expenditure tracking based on data collected at facility, district, and central level
  • Honduras (2000)
    - Survey looking at ghost workers, absenteeism, and “job-migration”
  • Other past, ongoing, or future surveys
    - Bolivia, Chad, Georgia, Laos, Madagascar, Mozambique, Nigeria, PNG, Peru, Rwanda, Zambia

  19. Absenteeism studies

  20. Emerging issues
  • Many good reasons for doing surveys
    - Diagnosis of problems, such as corruption – shaping the agenda
    - Analysis: guiding reform
    - Monitoring over time / benchmarking
    - Tool for understanding and creating dialogue about PEM and service delivery systems – useful for donors and governments
    - Research
  • But questions remain
    - Surveys only give part of the answer (what about allocations? the link with outcomes?)
    - Surveys provide information, but is it used?

  21. Survey Design: Surveying what? Why?
  • What are the problems? Are there important gaps in our understanding of the nature, extent, and source of problems? (Potential usefulness of qualitative work)
  • Is a survey the appropriate tool? Stand-alone or as a complement? Worth the cost?
  • Is it feasible? How is the budget structured and implemented?
  • Who is the audience, and is there a likely impact? Is there a political demand?
  • Will the information be used? By whom?

  22. Implementation issues: Who? How?
  • Requires skills like any other micro survey
  • Steps in implementation
    - Concept – what are the issues?
    - Buy-in across the board
    - Questionnaire design
    - Identify (and contract) the implementing agency
    - Pilot
    - Enumerator training
    - Field work (including quality control and data entry)
    - Analysis and dissemination

  23. Implementation issues
  • Who can do it?
    - Local or international?
    - Capacity building objective?
    - Who does the analysis?
  • Getting quality data
    - Field test important
    - Quality control in the field and in data entry
  • Promoting impact
    - Strategic partnerships (between ministries, using university or local research institutes, civil society involvement)
    - Linking into existing instruments and systems
