
Measuring Service Delivery

Markus Goldstein

DECRG/AFTPM


Spending ≠ outcomes

[Chart slides: spending plotted against outcomes; “And the same for health…”; “Control for leakage and things look better…” (Gauthier and Wane 2006)]
Outline

  • An introduction to how we measure it

    • Levels of analysis

    • Rich set of tools for measurement, but they measure different things

  • Why measure service delivery

    • Accountability

    • Measuring poverty & designing a response

    • Evaluation

    • Policy relevant research



General organization of public service provision

[Diagram: Central ministry → District/State gov’t → Facility → Service providers → Potential clients / Current clients]


Administrative data

[Diagram: administrative data capture information on resource flows and on clients served at the central ministry and district/state levels, and information on clients served at the facility level.]


Tools: Administrative Data

The basic tool to measure quality (and quantity) of service delivery

Data collected from different levels

Can provide extensive coverage (all clients) and pictures at different levels

The other tools we will talk about are not substitutes for improving administrative data – they should come in addition


Some ideas on decent quality administrative data

Quality = credibility

Timeliness – key for use and relevance

Focus attention on a small set of relevant core indicators

Make the analysis accessible and relevant to policymakers, service providers, managers, and other users

These things → increased demand → better and more data


Public Expenditure Tracking Surveys

[Diagram: a PETS measures and verifies resource flows at the central ministry, district/state government, and facility levels.]


Tools: PETS

Diagnostic or monitoring tool to understand problems in budget execution:

  • delays/predictability of public funding

  • leakage/shortfalls in public funding

  • discretion in the allocation of resources

Data collected from different levels of government, including service delivery units

Reliance on record reviews, but also on interviews with school principals/health facility managers

Variation in design depending on perceived problems, country, and sector
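
A minimal sketch of the core leakage calculation (made-up amounts and column names; a real PETS reconciles many more flows, including in-kind transfers) compares what the ministry disbursed with what facilities report receiving:

```python
import pandas as pd

# Hypothetical PETS records: what the ministry disbursed to each district,
# and what each facility in that district reports having received.
disbursed = pd.DataFrame({
    "district": ["A", "B"],
    "amount_disbursed": [100_000, 80_000],
})
received = pd.DataFrame({
    "district": ["A", "A", "B"],
    "facility": ["A1", "A2", "B1"],
    "amount_received": [40_000, 35_000, 60_000],
})

# Sum facility receipts by district and compare with what was disbursed.
totals = received.groupby("district")["amount_received"].sum()
leakage = disbursed.set_index("district").join(totals)
leakage["leakage_share"] = 1 - leakage["amount_received"] / leakage["amount_disbursed"]
print(leakage)
# A positive leakage_share only flags a gap: whether it reflects fraud,
# inefficiency, or legitimate reallocation takes the record reviews and
# interviews described above.
```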


Quantitative Service Delivery Surveys

[Diagram: a QSDS collects information on facility functioning at the facility level.]


Tools: QSDS

Generally used for evaluating the efficiency of public spending and provider incentives

Data can be collected on inputs, outputs, throughputs, quality, costs, pricing, and oversight
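
For illustration, a minimal efficiency sketch over made-up QSDS-style facility records (all column names hypothetical):

```python
import pandas as pd

# Hypothetical facility records: inputs (budget, staff) and outputs (visits).
facilities = pd.DataFrame({
    "facility": ["F1", "F2", "F3"],
    "annual_budget": [50_000, 75_000, 40_000],
    "staff": [5, 9, 3],
    "patient_visits": [8_000, 9_500, 7_000],
})

# Two crude efficiency indicators: cost per visit and visits per staff member.
facilities["cost_per_visit"] = facilities["annual_budget"] / facilities["patient_visits"]
facilities["visits_per_staff"] = facilities["patient_visits"] / facilities["staff"]
print(facilities.sort_values("cost_per_visit"))
```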


Household surveys

[Diagram: household surveys measure usage and outcomes for current clients, and use of alternatives and outcomes for potential clients.]


Tools: Household surveys

Examples: Living Standards Measurement Study (LSMS) surveys, Demographic and Health Surveys (DHS), Multiple Indicator Cluster Surveys (MICS)

Will generally have detailed individual and/or household data on a wide range of characteristics

  • e.g. not just health seeking behavior, but also wealth levels

  • e.g. not just water source, but education levels

  • Can be combined with facility surveys (e.g. 17 LSMS surveys)

  • Collect data not only on clients, but also on potential clients
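
A minimal sketch of the household–facility link (hypothetical IDs and variables, in the spirit of an LSMS survey with a facility module):

```python
import pandas as pd

# Hypothetical household records and facility records, linked through the
# facility each household reports using.
households = pd.DataFrame({
    "hh_id": [1, 2, 3, 4],
    "wealth_quintile": [1, 1, 3, 5],
    "facility_used": ["F1", "F1", "F2", "F2"],
})
facilities = pd.DataFrame({
    "facility_id": ["F1", "F2"],
    "has_electricity": [False, True],
    "staff_present_share": [0.6, 0.9],
})

linked = households.merge(
    facilities, left_on="facility_used", right_on="facility_id", how="left"
)
# Describe the service environment by client characteristics, e.g. the
# facility quality faced by each wealth quintile.
print(linked.groupby("wealth_quintile")["staff_present_share"].mean())
```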


Absenteeism surveys

[Diagram: absenteeism surveys record characteristics of providers and facilities, and check whether providers are present and working at the facility.]


Tools: absenteeism surveys

Enumerators make unannounced random visits during working hours to check whether doctors/teachers are present. Visits can be spread over a few months.

Some studies made two checks over the period of a few months; others visited around the official opening and closing times of the facilities.

During the visit, the team collects facility-specific and provider-specific information. No notification is given before the survey team arrives at the facility.
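
A minimal sketch of the resulting measure, using made-up spot-check records: absenteeism is simply the share of unannounced visits at which the provider was not present.

```python
import pandas as pd

# Made-up spot-check records: one row per provider per unannounced visit.
checks = pd.DataFrame({
    "provider_id": [101, 101, 102, 102, 103, 103],
    "visit": [1, 2, 1, 2, 1, 2],
    "present": [True, False, True, True, False, False],
})

# Share of visits at which the provider was absent, overall and by provider.
overall_rate = 1 - checks["present"].mean()
by_provider = 1 - checks.groupby("provider_id")["present"].mean()
print(f"Overall absenteeism rate: {overall_rate:.0%}")
print(by_provider)
```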


Vignettes

[Diagram: vignettes measure the quality of the provider (e.g. skill set) at the service provider level.]


Tools: vignettes

Goal: to test the ability of medical personnel to diagnose and treat common conditions in a setting similar to their normal practice

Structure:

An enumerator is trained to present as a patient whose illness has predetermined characteristics. The practitioner must ask questions and perform a physical examination to reach a diagnosis.

The provider then makes a diagnosis as under normal circumstances. A competence index is constructed from the specific questions asked about the history of the case, the examination of the patient, the tests prescribed, and the treatment given.
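
A minimal sketch of one simple way to score such an index (hypothetical checklist items; real instruments may weight history, examination, tests, and treatment differently):

```python
import pandas as pd

# Hypothetical vignette checklist: did the provider perform each recommended
# step for the scripted case? (1 = yes, 0 = no)
checklist = pd.DataFrame({
    "provider_id": [1, 2, 3],
    "asked_symptom_duration": [1, 1, 0],
    "asked_fever_history": [1, 0, 0],
    "examined_chest": [1, 1, 0],
    "ordered_correct_test": [1, 0, 1],
    "gave_correct_treatment": [1, 1, 0],
})

# A simple competence index: the share of recommended steps completed.
items = checklist.columns.drop("provider_id")
checklist["competence_index"] = checklist[items].mean(axis=1)
print(checklist[["provider_id", "competence_index"]])
```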


Tools: vignettes

Variations:

Other types of vignettes use hypothetical scenarios: the practitioner is asked either to list the specific procedures he would use to diagnose a particular type of patient, or whether or not he would use a particular procedure for a patient with specific symptoms.

Direct observation is another option, in which clinicians’ behavior with their own patients is studied. However, because the case mix varies between clinicians, it is difficult to compare across practitioners and not always relevant.


Exit surveys

[Diagram: exit surveys capture client satisfaction, perceptions, informal payments, waiting time, etc. from current clients at the facility.]


Tools: exit surveys

Exit polls for user satisfaction (can be done for patients alone, or for a sample of households if non-users are to be included). Data can also be collected through focus group discussions and report cards.

Limitations of exit polls:

  • Problems in interpreting the subjective perceptions of health care quality

  • “Courtesy bias”, where individuals may provide responses that they believe are socially acceptable

  • Difficult to interpret because of important systematic differences across demographic and socio-economic groups, possibly making client perceptions poor proxies for objective assessments of different dimensions of quality
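
One partial remedy for the last limitation, not covered in the slides, is to adjust reported satisfaction for respondent characteristics before comparing providers. A minimal sketch with made-up data, using years of education to stand in for the socio-economic differences above:

```python
import numpy as np

# Made-up exit-poll data: reported satisfaction (1-5) and respondent education.
satisfaction = np.array([4.0, 5.0, 3.0, 2.0, 4.0, 3.0])
years_education = np.array([2.0, 4.0, 8.0, 12.0, 6.0, 10.0])

# Regress satisfaction on education; residuals are "education-adjusted"
# satisfaction, a crude way to strip out expectation differences.
X = np.column_stack([np.ones_like(years_education), years_education])
coef, *_ = np.linalg.lstsq(X, satisfaction, rcond=None)
adjusted = satisfaction - X @ coef
print(f"Slope on education: {coef[1]:.2f}")
print("Adjusted satisfaction:", np.round(adjusted, 2))
```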


Report cards

[Diagram: report cards capture clients’ assessments of services and their opinions.]


Tools: report cards

Citizen/community-wide report cards:

use a range of different tools to get information and opinions on prices, quality, waiting times, courtesy, etc. They can also complement and support facility surveys.

For example, the Bangalore report cards by the Public Affairs Centre (PAC) summarize citizens’ assessments of services provided by public agencies and solicit opinions on specific aspects of service provision, including staff behavior, quality of service, communication of information, and bribes paid in connection with service provision.


Tools: report cards

Citizen report cards use a randomized survey questionnaire.

Community report cards use focus groups.

Citizen report cards are easy to aggregate; community report cards, which require reaching a consensus, are hard to aggregate.

Both will be colored by expectations (more on this later).



Reason 1: Accountability

[Diagram: WDR 2004 accountability framework, highlighting the provider-citizen leg (A) and the government-provider leg (B). Source: WDR 2004]

Provider-citizen leg (A)

  • Realized demand

    • How much is used, how much is paid, etc.

    • QSDS, exit surveys, administrative data, hh surveys

  • Satisfaction

    • e.g. length of wait for Dr, teacher’s performance

    • Report cards, questions in hh surveys, exit surveys

  • Is it correlated with objective measures of quality? Not always

    • Lundberg: taking vitals and physical examinations not correlated with satisfaction

       → Think about why you are doing this…




Government – provider leg (B)

  • Monitoring (administrative data)

    • Most effective when:

      • Routine collection, timely availability

      • Need sufficient quality

      • Adequate breadth, but without overburdening providers

      • They have to be used

    • Can be used to draw inferences about program performance

      • Combine for impact evaluation, dose response (Galasso, Behrman and King)

    • Set service standards and measure relative performance


Government – provider leg (B)

  • Absenteeism surveys

    • Admin systems may get these data wrong

  • Facility surveys

    • Not a replacement for monitoring

    • Can get at broader, deeper data that would overwhelm monitoring system

    • Can get at more nuanced issues such as incentives, motivations and behavior


Government – provider leg (B)

  • Tracking the flow of resources: PETS

    • In-depth information on flows and losses

    • What is fraud, what is inefficiency, what are legitimate reallocations?

    • If there is a fairly open dialogue, this can feed into thinking about allocation rules in government


And what happens in A&B may impact C

[Diagram: WDR 2004 accountability framework again, adding the citizen-government leg (C) alongside legs A and B. Source: WDR 2004]



Reason 2: Understanding poverty & inequality and targeting the response

  • Whether we see poverty as income-based or multidimensional, measuring health & education is important

  • Understanding poverty & the service environment of the poor

    • LSMS surveys didn’t originally contain a facility component; 17+ now do

    • Link households to facilities they use (e.g. IFLS)

    • HH as starting point


Targeting the policy response

  • Separate out measures of quality that reflect underlying poverty (calling for a development response) from those due to deficiencies in service delivery

    • Vignettes, e.g.: why we can’t just use whether a doctor follows a protocol in actual practice

      • Educated patients might encourage the doctor to do more, etc.

      • Need to put the doctor through a vignette to isolate competence

      • Das & Leonard: the poor are served by worse-quality physicians


Targeting: natural disaster response

  • Frankenberg et al. – response to the tsunami

    • hh surveys + facility surveys + GIS information (in combination)

    • Which facilities were destroyed

    • But also: where the population has moved, so you can build back more appropriately (considering both disaster-hit and surrounding areas) – get at the dynamics


Reason 3: Evaluation, especially impact evaluation

  • IE defined: counterfactual construction

  • We can see this as part of both the citizen/gov’t links (demonstrating validity) and gov’t/provider links (what works)

  • Use service delivery data to look at marginal impacts of program exposure

    • Galasso uses program phase-in and time of exposure to look at outcomes (such as malnutrition) – see the sketch below
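
A minimal sketch of that dose-response idea with made-up community-level numbers (a Galasso-style estimation also controls for community characteristics and the non-random order of phase-in):

```python
import numpy as np

# Made-up community-level data: months of program exposure under a phased
# rollout, and the malnutrition rate observed in each community.
months_exposed = np.array([0.0, 6.0, 12.0, 18.0, 24.0, 30.0])
malnutrition_rate = np.array([0.30, 0.28, 0.26, 0.22, 0.21, 0.18])

# Slope of outcome on exposure = crude marginal impact per month of exposure
# (credible only if phase-in timing is as good as random).
X = np.column_stack([np.ones_like(months_exposed), months_exposed])
coef, *_ = np.linalg.lstsq(X, malnutrition_rate, rcond=None)
print(f"Estimated change in malnutrition per month of exposure: {coef[1]:.4f}")
```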


Evaluation and impact evaluation

  • Evaluating a change in management

    • Look at how changes in service delivery (e.g. performance-based pay for health care providers) change welfare outcomes (e.g. child mortality)

    • Look at changes in service provision in its own right

  • Look at how increases in client voice/information change service delivery and outcomes

    • Bjorkman & Svensson: information on health provider performance and gov’t standards → better health outcomes and perceptions of service

  • Look at impacts with heterogeneous treatment

    • Answer the question: what type of facilities provide this intervention with the greatest impact?


Reason 4: policy relevant research

  • Understanding the link between quality (e.g. skill of provider, infrastructure, etc) and client outcomes

  • Understanding the demand for services

    • Understand who the clients are, who is not a client, and why

    • Sampling is tricky… all available facilities, or all those the household uses?


Reason 4: policy relevant research

  • Understanding facility production processes

    • Going beyond, deeper than monitoring

    • e.g. whether facilities are at the optimal size (efficiency), whether human and physical capital are being used in the right proportions, and what inputs are being wasted


Thank you

Are You Being Served? New Tools for Measuring Service Delivery

on the web:

http://go.worldbank.org/F6KIIC0700





Perceptions unpacked (Lundberg)

  • Compare facility survey data with exit polls in Uganda

  • Significant correlations:

    • Waiting time (-)

    • Consultation time (-)

    • Treated politely (+)

    • Asked questions (+)

  • Not significant:

    • Given physical exam

    • Touched during examination

    • Pulse taken
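
A minimal sketch of this kind of check with made-up data (Lundberg’s actual analysis matches Ugandan exit polls to facility survey data): correlate reported satisfaction with each objective measure.

```python
import numpy as np

# Made-up patient-level data: satisfaction plus two measures from the visit.
satisfied = np.array([1.0, 0.0, 1.0, 1.0, 0.0, 1.0, 0.0, 1.0])
waiting_minutes = np.array([10.0, 90.0, 20.0, 15.0, 60.0, 25.0, 75.0, 30.0])
pulse_taken = np.array([1.0, 0.0, 1.0, 0.0, 1.0, 1.0, 0.0, 1.0])

# Simple pairwise correlations, as in the slide's (+)/(-) summary.
print("corr(satisfied, waiting time):", np.corrcoef(satisfied, waiting_minutes)[0, 1])
print("corr(satisfied, pulse taken): ", np.corrcoef(satisfied, pulse_taken)[0, 1])
```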

