
Overview and Emerging Evidence of Benchmarking as a Regulatory and Policy Tool



  1. Overview and Emerging Evidence of Benchmarking as a Regulatory and Policy Tool Rui Cunha Marques rui.marques@tecnico.ulisboa.pt University of Lisbon

  2. Agenda • Introduction • Regulatory Benchmarking • Case studies – Price Regulation • Case studies – Quality of Service Regulation • Key Ideas

  3. Introduction

  4. Introduction Merits of competition vs. natural monopoly and other market failures: • Quiet life; • X-inefficiencies; • Excess profits; • Reduced incentives towards efficiency and innovation. Solution: a regulatory framework based on benchmarking… (1) Yardstick competition: price or revenue cap regulation; (2) Sunshine regulation: 'embarrass' the utilities. Determine in an objective way the optimal incentive scheme by considering efficiency.
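As an illustration only (the slide names the mechanism but gives no formula), the price cap regulation mentioned above can be sketched as a CPI − X adjustment; the function name and all figures below are hypothetical:

```python
def price_cap(current_price: float, cpi: float, x_factor: float) -> float:
    """Next-period price under a CPI - X cap: P(t+1) = P(t) * (1 + CPI - X).

    cpi and x_factor are decimals (0.024 = 2.4 %).
    """
    return current_price * (1 + cpi - x_factor)

# Hypothetical example: 2.4 % inflation, 1.5 % required efficiency gain
new_price = price_cap(100.0, 0.024, 0.015)  # ~ 100.9
```

The X factor is where benchmarking enters: the more inefficient the operator is found to be relative to its peers, the larger the X it is set.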

  5. Regulatory Benchmarking

  6. Regulatory Benchmarking? What For? Benchmarking is widely used by regulators to: • Introduce incentives for operators to be efficient and innovative, mitigating operating costs and capital expenses; • Put ongoing pressure on the utilities to improve the quality of service; • Assure a fairer recovery of costs and capital investments; • Increase transparency and the sharing of information, minimising the information asymmetry between different stakeholders (especially between regulator and operators).

  7. How to Apply?

  8. Compulsory Benchmarking • Price Regulation • Benchmarking is used to estimate the productivity gains anticipated from each WWS during the regulatory period. • Sunshine Regulation • Consists of comparing and publicly discussing operators' performance. Operators become aware of their performance through the pressure put upon them by their stakeholders (customers, media, politicians, NGOs and so on). The operator that performs poorly gets embarrassed and, as a result, tends to correct its failures.

  9. Applying Benchmarking • Metrics • Measurement criteria • Computation of the metrics • Explanatory factor analysis • Comparison with reference values
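The computation and comparison steps above can be sketched in miniature; the metric chosen (non-revenue water), the reference bands and every number are illustrative assumptions, not values from the slides:

```python
def non_revenue_water(system_input: float, billed: float) -> float:
    """Performance metric: share of water supplied that produces no revenue."""
    return (system_input - billed) / system_input

def compare_to_reference(value: float, good: float, fair: float) -> str:
    """Final step: compare the computed metric with regulator reference values
    (lower is better for this metric)."""
    if value <= good:
        return "good"
    if value <= fair:
        return "fair"
    return "poor"

# Hypothetical utility: 1000 units supplied, 720 billed
nrw = non_revenue_water(system_input=1000.0, billed=720.0)   # 0.28
band = compare_to_reference(nrw, good=0.20, fair=0.30)       # "fair"
```

The explanatory factor analysis step would sit between computation and comparison, adjusting the raw metric before it is judged against the reference values.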

  10. Explanatory Factors Explanatory factors are factors or indicators able to justify the level of performance (better or worse) attained. • Examples of explanatory factors: • Market structure factors (scale, scope and density economies); • Historical factors (past investments interfere with CAPEX/OPEX…); • Social factors (% of industrial customers, bigger customers, consumption habits, peak factor, population density, GDP, …); • Environmental factors (weather, …); • Regulatory factors (regulation, price policies, taxes, demand policies, …); • Local factors (topography, availability of resources, …).
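One common way to bring an explanatory factor into a comparison is to regress the metric on the factor and compare residuals rather than raw values. This is a minimal sketch with a single hypothetical factor (customer density) and made-up data; it is one possible approach, not a method prescribed by the slides:

```python
def ols_fit(x: list, y: list) -> tuple:
    """Least-squares fit y = a + b*x for one explanatory factor."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

def adjusted_scores(x: list, y: list) -> list:
    """Residuals: the part of performance NOT explained by the factor.
    For a cost metric, a negative residual means better than expected."""
    a, b = ols_fit(x, y)
    return [yi - (a + b * xi) for xi, yi in zip(x, y)]

# Hypothetical data: customer density vs. unit cost for five utilities
density = [100, 200, 300, 400, 500]
unit_cost = [1.9, 1.7, 1.6, 1.4, 1.2]
residuals = adjusted_scores(density, unit_cost)
```

Ranking utilities on the residuals compares "like with like": a sparse rural operator is no longer penalised for costs that density alone explains.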

  11. Reference Values

  12. Data and its Quality Reliability is defined as the degree of confidence in how the data was gathered; Accuracy is a number indicating the likely range of error; The standardised confidence indicator is built upon the reliability and accuracy factors by joining the letter with the number. It is very important to audit the information collected… by experts, since they are aware of (typical) errors, thus increasing the quality of benchmarking. (Source: OFWAT, 1997 and IWA, 2000)
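A sketch of joining the reliability letter with the accuracy band into a single confidence grade; the band ranges enforced here are simplified assumptions loosely modelled on the OFWAT/IWA scheme cited above, not its exact definitions:

```python
def confidence_grade(reliability: str, accuracy_band: int) -> str:
    """Combine a reliability letter (A best .. D worst) with an accuracy
    band (1 = tightest likely error range) into a grade such as 'A1'.

    Band ranges are illustrative assumptions, not the official scheme.
    """
    if len(reliability) != 1 or reliability not in "ABCD":
        raise ValueError("reliability must be a single letter A-D")
    if not 1 <= accuracy_band <= 6:
        raise ValueError("accuracy band must be 1-6")
    return f"{reliability}{accuracy_band}"
```

An auditor would assign, say, "A1" to metered data from a calibrated system and "D5" to an unverified estimate, and downstream benchmarking can then weight or exclude low-grade data.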

  13. It’s all about the Data… Need for adequate data of sufficient quality: ‘garbage in’ => ‘garbage out’. Consultants and academics recognise that: “If you torture the data enough, they will confess.” Regulators should note that not all elements that can be counted really “count”: “Make what’s ‘important’ measurable, not what’s measurable important.” Or maybe not…

  14. Case Studies – Price Regulation

  15. Case studies WWS regulation in England and Wales uses CPI-X to set prices. The Water Services Regulation Authority (OFWAT) uses regression and DEA to determine the efficient costs that are the basis for the calculation of the X factors (in the price cap formula). Efficiency bands: A – most efficient; B – above average; C – average; D – below average; E – least efficient. The efficient costs are transformed into the targets to be reached by each WWS in the following regulatory period.
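To make the mechanics concrete, here is a hypothetical sketch (not OFWAT’s actual methodology) of turning a benchmarked efficient cost into an annual efficiency requirement that closes the gap over the regulatory period:

```python
def x_factor(actual_cost: float, efficient_cost: float, years: int) -> float:
    """Annual rate of cost reduction that moves actual cost to the
    benchmarked efficient cost by the end of the regulatory period:
    actual * (1 - x)^years = efficient  =>  x = 1 - (efficient/actual)^(1/years)
    """
    return 1 - (efficient_cost / actual_cost) ** (1 / years)

# Hypothetical WWS: costs 20 % above the benchmark, 5-year regulatory period
x = x_factor(actual_cost=120.0, efficient_cost=100.0, years=5)  # ~ 3.6 % per year
```

The resulting x would then be plugged into the price cap formula, so an operator found further from the efficiency frontier faces a steeper glide path.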

  16. Case studies (cont.) Chile: an efficient (model) operator is imposed to enable the regulator to determine the base costs for the setting of tariffs; it can further include the expected productivity gains (X factor) in the price cap formula. Colombia: economic regulation of the water sector by La Comisión de Regulación de Agua Potable y Saneamiento Básico (CRA), which defines methods and tariff formulas. It uses benchmarking (the DEA technique) to compute the efficient administrative costs and the efficient OPEX. The regulatory process is based on a price cap system defined for a period of five years, which also includes a price floor (minimum limit of 50 %).

  17. Case Studies – Quality of Service Regulation

  18. Case studies Victoria, Australia The regulator (now Essential Services Commission – ESC) has applied sunshine regulation here since 1994. It is responsible for: • quality of service supervision: the quality of supply (e.g. water quality and compliance with the norms); service reliability (e.g. interruptions, non-revenue water and blockages); service availability (e.g. prices, special customers and non-payment); customer service (e.g. call centres, complaints and customer satisfaction); • economic regulation. Sunshine regulation increases the transparency and accountability of the WWS. The WWS performance improvement can be seen, e.g., in the evolution of the water interruptions indicator.

  19. Case studies (cont.) Portugal

  20. Case studies (cont.) State of Ceará, Brazil ARCE (regulator) uses sunshine regulation to monitor the WWS quality of service. It adopts an evaluation system which encompasses four levels of performance according to the score obtained and the benchmark: an Annual Report of Performance Evaluation and a “letter” of performance.

  21. Case studies (cont.) Zambia The National Water Supply and Sanitation Council (NWASCO) supervises the quality of service provided, including drinking water quality. This is assured by comparing a set of PIs applied to each operator, followed by its public display (sunshine regulation). NWASCO defines PIs for water supply and their weights, and reference values are made available by the regulator for each PI. NWASCO develops a ranking based on the PI results, assigning weights to the PIs.
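A weighted PI ranking of the kind NWASCO publishes can be sketched as follows; the PI names, weights and scores are invented for illustration and are not NWASCO’s actual indicators:

```python
def weighted_score(pi_values: dict, weights: dict) -> float:
    """Aggregate normalised PI values (0-1, higher = better) using the
    regulator's weights for each PI."""
    total = sum(weights.values())
    return sum(pi_values[k] * w for k, w in weights.items()) / total

def rank_operators(scores: dict) -> list:
    """Sunshine-style ranking for public display: best score first."""
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical weights and operator scores
weights = {"coverage": 0.4, "water_quality": 0.4, "collection": 0.2}
ops = {
    "Utility A": weighted_score(
        {"coverage": 0.9, "water_quality": 0.8, "collection": 0.7}, weights),
    "Utility B": weighted_score(
        {"coverage": 0.6, "water_quality": 0.9, "collection": 0.8}, weights),
}
ranking = rank_operators(ops)
```

Publishing the ranking, rather than setting prices with it, is what makes this sunshine regulation: the pressure comes from the comparison being public.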

  22. Case studies (cont.) South East Europe There are several interesting cases of benchmarking applications, e.g. Albania – Water Regulatory Authority. Commonly, the regulatory model is based on sunshine regulation (publicising the performance results of a set of metrics…).

  23. Case studies (cont.) Kosovo – WWRO uses sunshine regulation for WWS performance monitoring. The overall performance assessment and ranking of the companies is done based on key performance indicators (KPIs). A ranking methodology is adopted: • several KPIs; • the KPIs are given weights depending on their importance; • the reliability of the data is scored based on the findings from the audit process.

  24. Key ideas • Comparing apples with apples (like with like): computing a performance metric without including the explanatory factors is of little interest. • We should always bear in mind reference values, not only to measure performance but also to set targets for the future. • Garbage in – garbage out. • Simple models with robust analysis and consistency checks are better than complex models with superficial analysis.

  25. Thank you! Rui Cunha Marques rui.marques@tecnico.ulisboa.pt www.ruicunhamarques.com
