Measuring Service Delivery

Presentation Transcript


  1. Measuring Service Delivery Markus Goldstein DECRG/AFTPM

  2. Spending ≠ outcomes

  3. And the same for health…

  4. Control for leakage and things look better… Gauthier and Wane 2006

  5. Outline • An introduction to how we measure it • Levels of analysis • Rich set of tools for measurement, but measure different things • Why measure service delivery • Accountability • Measuring poverty & designing a response • Evaluation • Policy relevant research

  6. How we measure service delivery

  7. General organization of public service provision [Diagram: Central ministry → District/State gov’t → facility → service providers → current and potential clients]

  8. Administrative data [Diagram: administrative data capture information on resource flows and on clients served at the central ministry and district/state gov’t levels, and information on clients served at the facility level]

  9. Tools: Administrative Data • The basic tool to measure the quality (and quantity) of service delivery • Data collected from different levels • Can provide extensive coverage (all clients) and pictures at different levels • The other tools we will talk about are not substitutes for improving administrative data – they should come in addition to it

  10. Some ideas on decent quality administrative data • Quality = credibility • Timeliness – key for use and relevance • Focus attention on a small set of relevant core indicators • Make the analysis accessible and relevant to policymakers, service providers, managers, and other users • These things → increased demand → better and more data

  11. Public Expenditure Tracking Surveys [Diagram: PETS measure and verify resource flows from the central ministry through district/state gov’t to facilities]

  12. Tools: PETS • Diagnostic or monitoring tool to understand problems in budget execution: delays/predictability of public funding, leakage/shortfalls in public funding, discretion in the allocation of resources • Data collected from different levels of government, including service delivery units • Reliance on record reviews, but also interviews with school principals/health facility managers • Variation in design depending on perceived problems, country, and sector
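
A minimal sketch (not from the slides) of how leakage between two levels of government might be computed from PETS-style records; the column names and figures are hypothetical:

```python
import pandas as pd

# Hypothetical PETS records: amount disbursed by the central ministry to each
# district, and the amount each district's records show as received.
pets = pd.DataFrame({
    "district":  ["A", "B", "C"],
    "disbursed": [100_000, 80_000, 120_000],   # recorded at the central ministry
    "received":  [ 92_000, 60_000, 120_000],   # verified from district records
})

# Leakage/shortfall = share of disbursed funds that never reach the next level.
pets["leakage_rate"] = (pets["disbursed"] - pets["received"]) / pets["disbursed"]
print(pets[["district", "leakage_rate"]])
```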

  13. Quantitative Service Delivery Surveys [Diagram: QSDS collect information on facility functioning at the facility level]

  14. Tools: QSDS • Generally used for evaluating the efficiency of public spending and incentives • Data can be collected on inputs, throughputs, outputs, quality, costs, pricing, and oversight
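
A minimal sketch of the kind of simple efficiency indicators a QSDS might support; the facility records, variable names, and values are hypothetical:

```python
import pandas as pd

# Hypothetical QSDS facility records (illustrative only).
qsds = pd.DataFrame({
    "facility_id":   [1, 2, 3],
    "staff":         [4, 10, 6],                # input: number of staff
    "drug_spending": [5_000, 12_000, 7_000],    # input: annual drug spending
    "visits":        [8_000, 9_000, 12_000],    # output: patient visits per year
})

# Simple efficiency indicators: output per staff member and cost per visit.
qsds["visits_per_staff"] = qsds["visits"] / qsds["staff"]
qsds["cost_per_visit"] = qsds["drug_spending"] / qsds["visits"]
print(qsds[["facility_id", "visits_per_staff", "cost_per_visit"]])
```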

  15. Household surveys [Diagram: household surveys measure usage and outcomes for current clients, and use of alternatives and outcomes for potential clients]

  16. Tools: Household surveys • Examples: Living Standards Measurement Study (LSMS) surveys, Demographic and Health Surveys, MICS • Will generally have detailed individual and/or household data on a wide range of characteristics • e.g. not just health-seeking behavior, but also wealth levels • e.g. not just water source, but also education levels • Can be combined with facility surveys (e.g. 17 LSMS surveys) • Collect data not only on current clients, but also on potential clients
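
A minimal sketch of linking a household survey to a facility survey, assuming a shared facility identifier; all names and values are hypothetical:

```python
import pandas as pd

# Hypothetical household records with the facility each household reports using.
households = pd.DataFrame({
    "hh_id":        [1, 2, 3, 4],
    "wealth_index": [0.2, 0.8, 0.5, 0.9],
    "facility_id":  [10, 10, 11, 12],
})

# Hypothetical facility survey with a simple quality score per facility.
facilities = pd.DataFrame({
    "facility_id":   [10, 11, 12],
    "quality_score": [0.6, 0.3, 0.9],
})

# Link each household to the characteristics of the facility it uses,
# e.g. to see which wealth groups face which quality of service.
linked = households.merge(facilities, on="facility_id", how="left")
print(linked.groupby("facility_id")["wealth_index"].mean())
```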

  17. Absenteeism surveys [Diagram: absenteeism surveys record characteristics of providers and facilities and check whether providers are present and working]

  18. Tools: absenteeism surveys • Enumerators make unannounced random visits during working hours to check whether doctors/teachers are present; visits can be randomized over a few months • Some studies did two checks over a period of a few months; in others, facilities were visited around the official opening and closing times, with the team collecting facility-specific and provider-specific information in between • No notification of the visit is given before the survey team arrives at the facility
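
A minimal sketch of turning unannounced-visit records into an absence rate; the data and field names are hypothetical:

```python
import pandas as pd

# Hypothetical roster from two unannounced visits: one row per provider per visit,
# with a flag for whether the provider was found present at the facility.
visits = pd.DataFrame({
    "facility_id": [1, 1, 2, 2, 1, 1, 2, 2],
    "provider_id": [11, 12, 21, 22, 11, 12, 21, 22],
    "visit":       [1, 1, 1, 1, 2, 2, 2, 2],
    "present":     [True, False, True, True, False, False, True, False],
})

# Absence rate = share of provider-visit observations where the provider was absent.
absence_rate = 1 - visits["present"].mean()
by_facility = 1 - visits.groupby("facility_id")["present"].mean()
print(f"Overall absence rate: {absence_rate:.0%}")
print(by_facility)
```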

  19. Vignettes [Diagram: vignettes measure the quality of the provider (e.g. skill set)]

  20. Tools: vignettes • Goal: to test the ability of medical personnel to diagnose and treat common conditions in a setting similar to their normal practice • Structure: an enumerator is trained to present as a sick person with predetermined illness characteristics; the practitioner must ask questions and perform a physical examination, then makes a diagnosis as under normal circumstances • A competence index is then constructed from the specific questions asked about the history of the case, the examination of the patient, the tests prescribed, and the treatment given
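
A minimal sketch of a simple competence index built from a vignette checklist; the checklist items and scoring rule are illustrative, not the actual instrument:

```python
# Hypothetical checklist for one vignette case: which of the expected history
# questions, exam steps, tests, and treatments the provider performed (1 = yes).
checklist = {
    "asked_duration_of_fever":  1,
    "asked_about_cough":        1,
    "checked_temperature":      0,
    "checked_respiratory_rate": 1,
    "ordered_correct_test":     0,
    "prescribed_correct_drug":  1,
}

# A simple competence index: the share of expected items completed.
competence_index = sum(checklist.values()) / len(checklist)
print(f"Competence index: {competence_index:.2f}")  # 0.67
```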

  21. Tools: vignettes • Variations: other types of vignettes use hypothetical scenarios, where the practitioner is asked either to list the specific procedures he or she would use to diagnose a particular type of patient, or whether he or she would perform a particular procedure for a patient with specific symptoms • Direct observation is another option, in which the behavior of clinicians with their own patients is studied; however, because the case mix varies between clinicians, it is difficult to compare across practitioners and not always relevant

  22. Exit surveys [Diagram: exit surveys capture client satisfaction, perceptions, informal payments, waiting time, etc. from current clients]

  23. Tools: exit surveys • Exit polls for user satisfaction (can be done for patients alone, or for a sample of households if non-users are to be included) • Data can also be collected through focus group discussions and report cards • Limitations of exit polls: problems in interpreting subjective perceptions of health care quality; “courtesy bias”, where individuals may give responses that are socially acceptable; and difficulty of interpretation because of important systematic differences across demographic and socio-economic groups, possibly making client perceptions poor proxies for objective assessments of different dimensions of quality
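
A minimal sketch of checking for systematic differences in reported satisfaction across groups, one of the interpretation problems noted above; the data and group labels are hypothetical:

```python
import pandas as pd

# Hypothetical exit-poll responses with a wealth-group marker (values are made up).
exit_poll = pd.DataFrame({
    "respondent_id": range(1, 9),
    "wealth_group":  ["poor", "poor", "poor", "poor",
                      "non-poor", "non-poor", "non-poor", "non-poor"],
    "satisfied":     [1, 1, 1, 0, 1, 0, 0, 1],
})

# Compare reported satisfaction across groups: large systematic gaps suggest
# perceptions may be poor proxies for objective quality.
print(exit_poll.groupby("wealth_group")["satisfied"].mean())
```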

  24. Report cards [Diagram: report cards collect clients’ assessments of services and their opinions]

  25. Tools: report cards • Citizen/community-wide report cards use a range of different tools to get information and opinions on prices, quality, waiting times, courtesy, etc. • Can also be used to complement and support facility surveys • For example, the Bangalore report cards by the Public Affairs Center (PAC) summarize citizens’ assessments of services provided by public agency officials and solicit opinions on specific aspects of service provision, including staff behavior, quality of service and communication of information, bribes paid in connection with service provision, etc.

  26. Tools: report cards • Citizen report cards: use a randomized survey questionnaire • Community report cards: use focus groups • Citizen report cards are easy to aggregate, but community report cards require reaching consensus → hard to aggregate • Both will be colored by expectations (more on this later)

  27. Why we measure service delivery

  28. Reason 1: Accountability [Diagram: accountability relationships between citizens, government, and providers, with legs A (provider–citizen) and B (government–provider)] Source: WDR 2004

  29. Provider-citizen leg (A) • Realized demand • How much is used, how much is paid, etc. • QSDS, exit surveys, administrative data, hh surveys • Satisfaction • e.g. length of wait for Dr, teacher’s performance • Report cards, questions in hh surveys, exit surveys • Is it correlated with objective measures of quality? Not always • Lundberg: vitals, examinations not correlated with satisfaction → think about why you are doing this…

  30. Reason 1: Accountability [Diagram repeated: accountability relationships with legs A (provider–citizen) and B (government–provider)] Source: WDR 2004

  31. Government – provider leg (B) • Monitoring (administrative data) • Most effective when: • Routine collection, timely availability • Sufficient quality • Adequate breadth, but without overburdening providers • The data have to be used • Can be used to draw inferences about program performance • Combine for impact evaluation, dose response (Galasso, Behrman and King) • Set service standards and measure relative performance

  32. Government – provider leg (B) • Absenteeism surveys • Admin systems may get these data wrong • Facility surveys • Not a replacement for monitoring • Can get at broader, deeper data that would overwhelm monitoring system • Can get at more nuanced issues such as incentives, motivations and behavior

  33. Government – provider leg (B) • Tracking the flow of resources: PETS • In-depth information on flows and losses • What is fraud, what is inefficiency, what are legitimate reallocations? • If there is a fairly open dialogue, this can feed into thinking about allocation rules in government

  34. And what happens in A & B may impact C [Diagram: accountability relationships with legs A, B, and C] Source: WDR 2004

  35. And report cards may provide a way to get “C” moving [Diagram: accountability relationships with legs A, B, and C] Source: WDR 2004

  36. Reason 2: Understanding poverty & inequality and targeting the response • Whether we see poverty as income-based or multidimensional, measuring health & education is important • Understanding poverty & the service environment of the poor • LSMS surveys did not originally contain a facility component; 17+ now do • Link households to the facilities they use (e.g. IFLS) • HH as starting point

  37. Targeting the policy response • Separate out measures of quality that reflect underlying poverty (development response) from those due to deficiencies in service delivery • Vignettes, e.g. – why we can’t simply use whether a Dr follows a protocol in practice • Educated patients might encourage the doctor, etc. • Need to put the Dr through a vignette • Das & Leonard: the poor are served by worse-quality physicians

  38. Targeting: natural disaster response • Frankenberg et al. – response to the tsunami • hh surveys + facility surveys + GIS information (in combination) • What facilities were destroyed • But also: where the population has moved, so you can build back more appropriately (considering both disaster-hit and surrounding areas) – get at the dynamics

  39. Reason 3: Evaluation, especially impact evaluation • IE defined: counterfactual construction • We can see this as part of both the citizen/gov’t links (demonstrating validity) and gov’t/provider links (what works) • Use service delivery data to look at marginal impacts of program exposure • Galasso uses phase-in and time of exposure to look at outcomes (such as malnutrition)
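
A minimal sketch of a dose-response regression of an outcome on months of program exposure, in the spirit of using phase-in variation; the simulated data, variable names, and effect sizes are all illustrative (assumes numpy, pandas, and statsmodels are available):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated data: program phase-in means children differ in months of exposure;
# the outcome is a made-up nutrition z-score.
rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "exposure_months": rng.integers(0, 24, n),
    "age_months":      rng.integers(6, 60, n),
})
df["zscore"] = (-1.0 + 0.02 * df["exposure_months"]
                - 0.01 * df["age_months"] + rng.normal(0, 1, n))

# Dose-response regression: marginal effect of one more month of exposure.
model = smf.ols("zscore ~ exposure_months + age_months", data=df).fit()
print(model.params["exposure_months"])
```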

  40. Evaluation and impact evaluation • Evaluating a change in management • Look at how changes in service delivery (e.g. performance-based pay for health care providers) change welfare outcomes (e.g. child mortality) • Look at changes in service provision in their own right • Look at how increases in client voice/information change service delivery and outcomes • Bjorkman & Svensson: information on health provider performance and gov’t standards → better health outcomes and perceptions of service • Look at impacts with heterogeneous treatment • Answer the question: what type of facilities provide this intervention with the greatest impact?

  41. Reason 4: policy relevant research • Understanding the link between quality (e.g. skill of provider, infrastructure, etc.) and client outcomes • Understanding the demand for services • Understanding who the clients are, who the non-clients are, and why • Sampling is tricky… all facilities available, or all those the hh uses?

  42. Reason 4: policy relevant research • Understanding facility production processes • Going beyond, and deeper than, monitoring • e.g. whether facilities are at the optimal size (efficiency), whether human and physical capital are being used in the right proportions, and what inputs are being wasted

  43. Thank you • Are You Being Served? on the web: http://go.worldbank.org/F6KIIC0700

  44. Framing the exercise using the results chain

  45. [Diagram repeated: Central ministry → District/State gov’t → facility → service providers → current and potential clients]

  46. Perceptions unpacked (Lundberg) • Compare facility survey data with exit polls in Uganda • Significant correlations: • Waiting time (-) • Consultation time (-) • Treated politely (+) • Asked questions (+) • Not significant • Given physical exam • Touched during examination • Pulse taken
