Presentation Transcript


  1. Standard & Poor’s Risk Solutions Data Consortia June, 2010

  2. Agenda • Standard & Poor’s Risk Solutions – Introduction • Data Consortium – What is it? • Why are Consortia Needed? • Benefits of a Credit Data Consortium • What does Standard & Poor’s Provide? • Step 1: Initial diagnosis • Step 2: Implementation of the consortium • Step 3: PD data pooling, cleaning, aggregating, testing and analysis of the data • Step 4: Reporting & Deliverables • Step 5: Building models on the aggregated data • Standard & Poor’s Consortia Experience

  3. Standard & Poor’s Risk Solutions - Introduction

  4. Standard & Poor’s Risk Solutions - Introduction • Standard & Poor's Risk Solutions provides financial analysis and risk management solutions to help credit-sensitive institutions make informed decisions about originating, measuring and managing the credit risk arising from their day-to-day business activities • We address all major components of financial analysis, including data, methodologies and processes for the analysis of probability of default, loss given default and exposure at default • These integrated credit risk management solutions leverage Standard & Poor's experience in credit assessment to help institutions manage credit risk, calculate economic and regulatory capital, and manage their balance sheets more effectively

  5. Standard & Poor’s Risk Solutions - Introduction • Core Competencies • Internal Rating Systems • Internal rating systems design, assessment and improvement • Obligor and facility ratings • Validation • Models. Off-the-shelf and custom models to measure PD, LGD or to estimate credit ratings • Data. Globally we facilitate or run a significant number of data collection exercises • PD & LGD. PD & LGD data collection, analysis and modeling. S&P Risk Solutions is a leader in this field

  6. S&P Risk Solutions – corporate structure • Confidential information is “firewalled” between Risk Solutions and the Rating Services of Standard & Poor’s. Risk Solutions is a “non-ratings” business of Standard & Poor’s • [Organization chart: Standard & Poor's comprises Rating Services (Structured Finance Ratings; Corporate & Government Ratings) and Fixed Income & Risk Management Services (Risk Solutions; Leveraged Commentary & Data)]

  7. Data Consortium – What is it?

  8. Data Consortium – What is it? • A Data Consortium is a group of banks that agree to pool data, usually on a confidential basis, into a central repository, where data cleansing, aggregation and analysis take place • The data will typically relate to one or more homogeneous asset classes and may cover default only, recovery only, or both default and recovery • Standard & Poor’s preserves the confidentiality of both each bank’s clients and the performance of each individual bank’s portfolio • Reporting outputs by Standard & Poor’s are agreed collectively with the participating banks • Standard & Poor’s could develop PD & LGD models from the aggregated data

  9. Why are Consortia needed?

  10. Why are Consortia needed? • Individual banks’ default and loss experience is relatively sparse within specific asset, industry and collateral sub-groups • often relatively few defaults a year • resolution of final losses can take considerable time • scarcity drives compromise; one must balance statistical significance against granularity of estimates produced • Need bigger, deeper data set to provide more statistically robust information quicker • to achieve objective of estimating PD and LGD as accurately as possible • difficult for banks to address individually • it may be that the whole market does not have statistically robust data for certain asset classes, but this should be demonstrated

  11. Why are Consortia needed? • Importance of robust Probability of Default (PD) and Loss Given Default (LGD) benchmarks • Pressure for change in approach to credit risk measurement • Risk based pricing and economic capital allocation require the separate consideration of PD & LGD • Basel II Internal Ratings Based Approaches (Foundation and Advanced) • Both are important in determining expected loss and unexpected loss • For level of capital – capital is a buffer against uncertain outcome • For capital allocation – risk-based pricing & performance management • For credit risk management processes • Multi-dimensional ratings

  12. PD and LGD meeting banking needs • [Table mapping PD and LGD uses to banking functions: Credit Approval, Portfolio Management, Treasury/CFO/CEO, Business Development and Loan Origination. Items include: economic capital, regulatory capital, securitisation management, RAROC, expected and unexpected loss, stress testing, portfolio analysis, risk mitigation, loan/credit MIS (management information systems), formal assessment of pricing, financial statement spreading, a database of clients and prospects, benchmark comparison, pro forma pricing, pricing assessment (company and industry) and the question “Is credit rated properly?”]

  13. Benefits of a Credit Data Consortium

  14. Benefits of a Credit Data Consortium • With Basel II, banks have to move away from the traditional assessment of lending on an “Expected Loss” basis and separate it into the probability of default (PD) and the loss given default (LGD). The data collected in pooling exercises greatly facilitates this transition, both by providing more robust statistics and, in certain instances, by enabling the construction of quantitative models • All banks will benefit from the more rapid aggregation of data and the building of a robust set of normalized statistics. In fairly short order the banks will receive their own conformed default experience compared with the industry as a whole, together with some key financial statement benchmarks • Stakeholders • Banks (large & small) • Regulator • Data Agent & Supplier of Services

  15. Benefits of a Credit Data Consortium • For the larger banks: • Those aspiring to Advanced IRB status can build up more observations on recovery more quickly. LGD has to be captured over a period of time, often considerable, whereas default is a binary, instantaneous event • The consortium can decide to exchange data with a consortium in another country, which would prove useful should a bank be in that market or considering entry • Although a bank may be large, smaller banks often have interesting regional or industry-specific data, so that their data, whilst less numerous, may still add value to the larger bank • Large banks, when using the benchmark data to present comparative analysis to external parties such as regulators or rating agencies, can refute suggestions of “cherry picking” because the benchmark includes all the banks • The banks receive expert advice on how to compile an appropriate database of their own credit experience

  16. Benefits of a Credit Data Consortium • For the smaller banks: • Access to countrywide experience • A benchmarking portfolio that replicates the market • Insight into the experience of particular industrial sectors in which the bank is not presently active, thus informing expansion decisions • Some of the “large” bank benefits apply – for instance, a “small” bank in the corporate market may be a large retail lender that would benefit from attaining the Advanced IRB standard

  17. Benefits of a Credit Data Consortium • For the larger and smaller banks: • Top management has benchmarks against which to assess the performance of their own bank • The business development area has benchmark comparisons on lending decisions and pricing • Credit risk departments can benchmark their internal credit ratings • Guidance for stress-testing and scenario analysis • An informed strategy and risk appetite, drawn from industry and regional analysis • More accurate pricing and analytical assumptions for CDOs • Factual underpinning for the assumptions in RAROC models

  18. Benefits of a Credit Data Consortium • For the Regulator: • A reliable historical benchmark against which the performance of each bank can be measured using conformed data. Interpretation of the results is still essential – a higher default rate may be indicative of a greater risk appetite in that bank and supported by higher margins • The bigger, deeper data set should lead to an improvement in the quality of risk management throughout the industry • Successful implementation of the consortium would cement a reputation as a forward-looking regulator. For instance, Saudi Arabia has led the way and other regulators are contemplating consortia

  19. Benefits of a Credit Data Consortium • Benefits of quantitative (i.e. data-driven) models: • A robust benchmark for a bank’s own IRB internal rating system • Or, an input to a bank’s own IRB system with the bank’s expert judgment overlaid • Leverage of S&P’s expertise, with the overhead effectively spread over the members of the consortium • An effective tool for the analysis of structured transactions • A quick and effective input to pricing and economic capital allocation models • A tool for rapid assessment of potential new business, marketing approaches, etc.

  20. Data Ownership • Ownership of the data remains with the banks throughout • We are highly experienced in maintaining the confidentiality of information – it is core to many facets of our business • Conformed statistics distributed back to the banks do not reference individual customers and are sufficiently aggregated to disguise the portfolio of any individual bank • We could build models trained on the aggregated data, but Standard & Poor’s does not distribute the underlying data in any manner • Numerical identifiers can be substituted for borrower names
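
The substitution of numerical identifiers mentioned above can be illustrated with a short sketch. This is a minimal example only, assuming each bank replaces borrower names with consortium-issued numeric IDs before submission and keeps the mapping table in-house; the function and field names (anonymize_record, borrower_name, borrower_id) are hypothetical, not S&P's actual tooling.

```python
# Minimal sketch of identifier substitution before data submission.
# Assumes each bank holds a private mapping table; only the numeric
# borrower_id ever leaves the bank. Names and fields are hypothetical.
import itertools

_id_counter = itertools.count(1)
_name_to_id = {}  # kept inside the bank, never submitted

def anonymize_record(record: dict) -> dict:
    """Return a copy of a borrower record with the name replaced by a numeric ID."""
    name = record["borrower_name"]
    if name not in _name_to_id:
        _name_to_id[name] = next(_id_counter)
    safe = dict(record)
    safe.pop("borrower_name")
    safe["borrower_id"] = _name_to_id[name]
    return safe

# Example: the submitted record carries only the numeric identifier.
print(anonymize_record({"borrower_name": "ABC Trading LLC",
                        "industry": "Wholesale", "default_flag": 0}))
```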

  21. What does Standard & Poor’s Provide?

  22. What does Standard & Poor’s provide? • Step 1: Initial Diagnosis of potential data availability • Detailed Structured Questionnaire • Management Interviews • Security Requirements • Questions & Answers for consortium members • Step 2: Implementation of the Consortium • Agreement on the consortium structure and terms of reference • Agreement on the deliverables • Step 3: Pooling, cleaning, aggregating, testing and validation of the data • Step 4: Delivering the data reports • Step 5: Building models on the aggregated data

  23. What does Standard & Poor’s provide? Step 1: Initial diagnosis of potential data availability

  24. What does Standard & Poor’s provide? • Step 1: Initial diagnosis of potential data availability • For each member bank, review the existing data and workflows to determine: • Definitions and standards of default, emergence, and recovery • Volume and historical timeframe of existing datasets • Format and structure of non-electronic documentation • Data storage format – in databases, desktop PCs, paper files • Geographic location of data storage • Early view of the portfolio (to help develop segmentation) • Workflows for existing loans, distressed and defaulting credits • Structure of datasets versus an “ideal” dataset • The IT environment of the bank • Leading to an efficient and effective implementation of the consortium

  25. What does Standard & Poor’s provide? Step 2: Implementation of the consortium

  26. What does Standard & Poor’s provide? - Governance • Step 2: Implementation of the consortium • It is important to establish the “rules of the game” at the outset • There are a number of feasible structures • We favour an appropriately resourced two-committee structure • A Management Committee to take policy decisions, since inevitably not all events can be predicted at the outset • A Methodological Committee dealing with technical issues in more detail • Standard & Poor’s can assist in drawing up Terms of Reference for the Committees

  27. What does Standard & Poor’s provide? - Consortium Organization • [Diagram: a Management Committee and a Methodology Committee, each supported by S&P]

  28. What does Standard & Poor’s provide? - Consortium Organization • Management Committee decisions – acceptance of new members – communicating with banks not in compliance – sharing some statistics with other consortia • Methodology Committee – minimum standards (“must have” data fields & quantity) – model drivers discussion with Standard & Poor’s experts – Standard & Poor’s contributes knowledge and experience

  29. Probability of Default (PD) Data Consortium Basics • [Diagram: Borrower Credit Performance Histories and Other Borrower Information (industry, geography, company type, asset class, instrument, payment delinquencies, write-offs) linked to Borrower Financial Performance Histories (financial statement accounts)] • For each bank in the consortium, S&P links the history of each borrower’s credit performance and other borrower data (qualitative) to the history of that borrower’s financial performance • The aggregate set allows predictive modeling of credit performance based on time series of financial accounts • The approach is effective for middle-market and corporate borrowers, where financial performance determines credit performance and a statistically large number of cases can be collected
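
The linking step described above can be pictured as a join between the two extracts. The sketch below is illustrative only, assuming pandas and hypothetical column names (borrower_id, fiscal_year, observation_year); each credit observation is paired with the financial statement for the preceding fiscal year-end.

```python
# Illustrative sketch: link each borrower's credit-performance history to the
# financial statement for the preceding fiscal year, so statement data can be
# used to predict default one year ahead. Column names are hypothetical.
import pandas as pd

statements = pd.DataFrame({
    "borrower_id": [1, 1, 2],
    "fiscal_year": [2007, 2008, 2008],
    "total_assets": [120.0, 135.0, 40.0],
    "ebitda": [15.0, 12.0, 3.5],
})
performance = pd.DataFrame({
    "borrower_id": [1, 1, 2],
    "observation_year": [2008, 2009, 2009],
    "defaulted": [0, 0, 1],
})

# Align each observation year with the prior fiscal year-end statement.
performance["fiscal_year"] = performance["observation_year"] - 1
linked = performance.merge(statements, on=["borrower_id", "fiscal_year"], how="left")
print(linked)
```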

  30. What does Standard & Poor’s provide? Step 3: PD data pooling, cleaning, aggregating, testing and analysis of the data

  31. What does Standard & Poor’s provide? • Step 3: PD data pooling, cleaning, aggregating, testing and analysis of the data • Objective - aggregate a robust PD dataset for quantitative modeling and statistical benchmarking • Collect a sufficient number of observations (both defaulters and performing companies) • A best-practice PD data set is a combination of borrowers’ credit histories and their financial histories • Rely on objective data elements (financials, balances, days past due, etc.) • Aggregate a chronologically “deep” data set - covering one economic cycle • Quality of data: ensure that all aspects of consortium data are a close representation of the credit reality in the marketplace

  32. Middle Market PD Data for Model Development – Data Quantity • Corporate/SME modelling • To develop a powerful model, a data set of 400 to 500 defaulted entities (entire consortium) is needed • The most effective way to achieve consortium goals is a historical PD data submission (3-4 years) plus data going forward, and LGD collection (a “go-forward” approach)

  33. PD Data Process Flow • [Flow diagram: a Loan Accounting System extract (borrowers & loans) and a Financial Statements extract are matched and linked, with duplicates treated (i.e. a “system” is developed); the data then passes through data validation routines, data standardization (mapping), data consolidation and reporting]

  34. PD Data Structure • [Diagram: financial statements from the spreading system give each borrower a series of statements by fiscal year-end (balance sheet items, income statement items, statement period (year), audit quality); the Loan Accounting System carries the same borrowers with industry, geography, company type, asset class, instrument and payment delinquencies; together they feed a Portfolio Default Report with counts of defaulters vs. all companies in the portfolio]
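
The "counts of defaulters vs. all companies" at the bottom of the diagram amounts to a simple cohort aggregation. A minimal sketch, assuming pandas and hypothetical column names; the real portfolio default report would carry more dimensions (industry, geography, asset class).

```python
# Sketch of the "defaulters vs. all companies" count behind a portfolio
# default report: observed one-year default rates by cohort year.
# Assumes pandas and hypothetical column names.
import pandas as pd

portfolio = pd.DataFrame({
    "borrower_id": [1, 2, 3, 1, 2, 3],
    "cohort_year": [2007, 2007, 2007, 2008, 2008, 2008],
    "defaulted":   [0, 0, 1, 0, 1, 0],  # defaulted within the following year
})

report = (portfolio.groupby("cohort_year")["defaulted"]
          .agg(defaulters="sum", companies="count"))
report["default_rate"] = report["defaulters"] / report["companies"]
print(report)
```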

  35. Bank’s Historical Financial Statements - Scenario 1 • [Diagram: borrower financial statements that bank analysts have input over the years already sit in a database (many thousands of statements); these are name-matched against the Loan Accounting System into a statements table of “unrefined” data] • Project action: data is extracted for matching and clean-up

  36. Bank’s Historical Financial Statements - Scenario 2 • [Diagram: borrower financial statements that bank analysts have input over the years sit in extracts containing multiple electronic borrower files; these are aggregated, name-matched against the Loan Accounting System and loaded into a statements table of “unrefined” data] • Project action: data is extracted from many hard drives and aggregated

  37. Data Clean-up Tools Example – Name-matching
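
The slides do not show the internals of the name-matching tool, but a basic fuzzy-matching step can be sketched with Python's standard difflib: normalise legal suffixes and punctuation, then propose the closest loan-system name for each spreading-system name, leaving low-confidence cases for manual review. The suffix list and cutoff are assumptions.

```python
# Illustrative name-matching sketch (not the S&P tool on the slide):
# normalise common legal suffixes and punctuation, then use difflib from the
# standard library to propose the closest loan-system name for each
# spreading-system name. Thresholds and suffix list are assumptions.
import difflib
import re

SUFFIXES = {"LLC", "LTD", "LIMITED", "CO", "COMPANY", "INC", "PLC"}

def normalise(name: str) -> str:
    tokens = re.sub(r"[^A-Z0-9 ]", " ", name.upper()).split()
    return " ".join(t for t in tokens if t not in SUFFIXES)

def match_names(spreading_names, loan_system_names, cutoff=0.85):
    targets = {normalise(n): n for n in loan_system_names}
    matches = {}
    for name in spreading_names:
        hits = difflib.get_close_matches(normalise(name), targets, n=1, cutoff=cutoff)
        matches[name] = targets[hits[0]] if hits else None  # None -> manual review
    return matches

print(match_names(["Al Rajhi Trading Co."], ["AL RAJHI TRADING LLC", "DELTA CEMENT"]))
```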

  38. Data Standardization – Chart of Accounts Mapping
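
Chart-of-accounts mapping takes each bank's spreading-system line items onto a standardized consortium chart so that ratios are comparable across banks. A minimal sketch, assuming a hand-maintained mapping table; the account names are illustrative only.

```python
# Illustrative chart-of-accounts mapping: each bank's spreading-system line
# items are mapped onto a standardized consortium chart so ratios are
# comparable across banks. The mapping table and names are hypothetical.
from collections import defaultdict

BANK_A_MAPPING = {
    "Cash at bank":           "CASH_AND_EQUIVALENTS",
    "Trade debtors":          "ACCOUNTS_RECEIVABLE",
    "Stock":                  "INVENTORY",
    "Creditors due < 1 year": "CURRENT_LIABILITIES",
}

def standardize(statement_lines: dict) -> dict:
    """Aggregate bank-specific line items into standardized accounts."""
    standardized = defaultdict(float)
    for item, amount in statement_lines.items():
        account = BANK_A_MAPPING.get(item, "UNMAPPED")  # UNMAPPED -> review queue
        standardized[account] += amount
    return dict(standardized)

print(standardize({"Cash at bank": 5.0, "Trade debtors": 12.5, "Stock": 8.0}))
```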

  39. Proposed Data Validation Process • [Diagram: Data Quality Assessment Stage 1 – automated data integrity checks; Stage 2 – borrower matching and removal of duplicates; Stage 3 – portfolio-level data analysis; the results feed a Management Committee data quality report and review. The Methodology Committee provides guidance, the Management Committee provides feedback and directs action, and data quality workshops are held at the beginning of every new collection cycle]

  40. Data Validation Process – Automated Data Checks • Mandatory elements checks • Relational rules verification • Logical tests

  41. Data Validation Process – Automated Data Checks • Financial statement validity rules • Qualitative data validity rules • Prioritization rules
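
A sketch of the three families of automated checks listed on the two slides above: mandatory-element checks, a relational rule (the balance sheet identity) and a logical test on dates. The field names and the 1% tolerance are assumptions, not consortium rules.

```python
# Sketch of automated integrity checks in the spirit of the slides above:
# mandatory-element checks, a relational rule (balance sheet identity) and a
# logical/date test. Field names and the 1% tolerance are assumptions.
from datetime import date

MANDATORY = ["borrower_id", "industry", "total_assets", "total_liabilities", "equity"]

def check_record(rec: dict) -> list:
    issues = []
    # 1. Mandatory elements check
    for field in MANDATORY:
        if rec.get(field) in (None, ""):
            issues.append(f"missing mandatory field: {field}")
    # 2. Relational rule: assets should equal liabilities + equity (1% tolerance)
    a, l, e = rec.get("total_assets"), rec.get("total_liabilities"), rec.get("equity")
    if None not in (a, l, e) and abs(a - (l + e)) > 0.01 * abs(a):
        issues.append("balance sheet does not balance")
    # 3. Logical test: default cannot precede origination
    d, o = rec.get("default_date"), rec.get("origination_date")
    if d and o and d < o:
        issues.append("default date precedes origination date")
    return issues

print(check_record({"borrower_id": 7, "industry": "Retail", "total_assets": 100.0,
                    "total_liabilities": 60.0, "equity": 30.0,
                    "origination_date": date(2006, 3, 1),
                    "default_date": date(2005, 1, 1)}))
```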

  42. LGD/Recovery Data – Credit Events and Time-points of Interest • Data captured: borrower characteristics, instrument information, security details, guarantor description • [Timeline diagram: O (origination) → D – 1 (one year prior to default) → D (default) → 1st, 2nd, …, Nth CF (cash flows) → R (resolution); the default-to-resolution period typically spans approximately 1 to 5 years]
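
One common way to turn the cash flows on this timeline into an LGD estimate is to discount recoveries (net of workout costs) back to the default date and compare them with the exposure at default. The slides do not prescribe a discounting convention, so the flat discount rate and the figures below are assumptions.

```python
# Minimal workout-LGD sketch: discount recovery cash flows (net of workout
# costs) back to the default date D and set LGD = 1 - PV(recoveries) / EAD.
# The flat 10% discount rate, EAD and cash-flow timings are assumptions.

def workout_lgd(ead: float, cash_flows: list, discount_rate: float = 0.10) -> float:
    """cash_flows: list of (years_after_default, net_amount) pairs."""
    pv = sum(cf / (1.0 + discount_rate) ** t for t, cf in cash_flows)
    return max(0.0, 1.0 - pv / ead)

# Example: EAD of 100, recoveries of 40 after 1 year and 30 after 3 years
# give a present value of about 58.9 and an LGD of roughly 41%.
print(round(workout_lgd(100.0, [(1.0, 40.0), (3.0, 30.0)]), 3))
```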

  43. LGD Data Structure • Basel II requires LGD estimates at the facility level, so LGD data has to be collected at the borrower, loan and credit mitigation/cash-flow level • [Diagram: Borrower ABC → Loan 1 and Loan 2, each with collateral, cash recovered and guarantor records]
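
The borrower/loan/mitigant hierarchy on this slide maps naturally onto a nested record structure. A sketch using Python dataclasses; the field names are illustrative and not the consortium's actual template.

```python
# Sketch of the borrower -> loan -> credit-mitigation/cash-flow hierarchy on
# this slide, using dataclasses. Field names are illustrative only.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Recovery:
    years_after_default: float
    amount: float
    source: str                      # e.g. "collateral sale", "guarantor payment"

@dataclass
class Loan:
    loan_id: str
    exposure_at_default: float
    collateral_type: Optional[str] = None
    guarantor: Optional[str] = None
    recoveries: List[Recovery] = field(default_factory=list)

@dataclass
class Borrower:
    borrower_id: int
    loans: List[Loan] = field(default_factory=list)

abc = Borrower(1, loans=[
    Loan("L1", 100.0, collateral_type="Real estate",
         recoveries=[Recovery(1.0, 40.0, "collateral sale")]),
    Loan("L2", 50.0, guarantor="Parent company"),
])
print(len(abc.loans), abc.loans[0].recoveries[0].amount)
```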

  44. LGD Data Process Flow • [Flow diagram: post-default recovery records, collateral records, financial records and an accounting system extract (borrowers & loans) feed the input of 30 resolved defaulters per year into the recovery system – the key activity, representing roughly 95% of the value added; automated processes then run data validation routines, data standardization (mapping), data aggregation and reporting, handled by S&P consortium analysts] • Resources – Data Team: S&P Loss Data System + bank’s analyst + S&P credit data expert

  45. What does Standard & Poor’s provide? Step 4: Reporting & Deliverables

  46. PD Data Quality Benchmarks and Bank Ranking Reports • Absolute score: develop a confidence interval for model accuracy based on data quality • Relative (bank-specific) score: quantify bank-specific data quality and, at the same time, compare it to the consortium benchmark
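
The slide does not spell out the scoring formula; as an illustration, a relative (bank-specific) score could be as simple as the share of mandatory fields each bank populates, compared with the consortium average.

```python
# Illustration only: a relative data-quality score as the share of mandatory
# fields each bank populates, compared with the consortium average. The actual
# S&P scoring methodology is not described on the slide.
MANDATORY = ["borrower_id", "industry", "total_assets", "ebitda", "default_flag"]

def completeness(records: list) -> float:
    filled = sum(1 for r in records for f in MANDATORY if r.get(f) not in (None, ""))
    return filled / (len(records) * len(MANDATORY))

submissions = {
    "Bank A": [{"borrower_id": 1, "industry": "Retail", "total_assets": 10.0,
                "ebitda": 1.0, "default_flag": 0}],
    "Bank B": [{"borrower_id": 2, "industry": None, "total_assets": 5.0,
                "ebitda": None, "default_flag": 1}],
}
scores = {bank: completeness(recs) for bank, recs in submissions.items()}
benchmark = sum(scores.values()) / len(scores)
for bank, score in scores.items():
    print(f"{bank}: {score:.0%} populated (consortium average {benchmark:.0%})")
```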

  47. PD Data Quality Benchmarks and Bank Ranking Reports Data submission comparison on all aspects of quality – PD data

  48. PD Data Quality Benchmarks and Bank Ranking Reports • Example: • Number of historical financial statements per borrower as submitted by the banks

  49. PD Benchmark Reporting Deliverables • Database containing aggregate, anonymized consortium data • Electronic reports • Reports will contain: • ratio analyses – averages, medians and quartiles for different regions, industry sectors and sizes • probability of default – averages, medians and quartiles by industry sector, region and size • statistics comparing the financial performance of defaulters vs. non-defaulters • correlation analyses – mostly industry-based
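
The ratio and default-rate statistics listed above can be produced with standard group-by aggregations. An illustrative sketch, assuming pandas and hypothetical column names; it is not the actual report layout.

```python
# Illustrative sketch of the benchmark reports: quartiles of a financial ratio
# and observed default rates by industry sector. Assumes pandas and
# hypothetical column names; not the actual report layout.
import pandas as pd

data = pd.DataFrame({
    "industry":       ["Retail", "Retail", "Retail", "Construction", "Construction"],
    "debt_to_ebitda": [2.1, 3.4, 5.0, 4.2, 6.8],
    "defaulted":      [0, 0, 1, 0, 1],
})

ratio_quartiles = (data.groupby("industry")["debt_to_ebitda"]
                   .quantile([0.25, 0.50, 0.75])
                   .unstack())
default_rates = data.groupby("industry")["defaulted"].mean().rename("default_rate")

print(ratio_quartiles)
print(default_rates)
```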

  50. PD Reporting Examples • [Example reports: Financial Statement Ratio Analysis; Industry Default Correlations] • * The correlation coefficient varies between +1 (perfect positive correlation) and -1 (perfect negative correlation); a correlation of 0 indicates no relationship between the time series being correlated.
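
The industry default correlations in the example report are pairwise correlations of industry default-rate time series, bounded between -1 and +1 as the footnote describes. A minimal sketch with made-up numbers:

```python
# Minimal sketch of the "Industry Default Correlations" example: pairwise
# correlation of annual industry default-rate time series. The numbers are
# made up for illustration.
import pandas as pd

annual_default_rates = pd.DataFrame({
    "Retail":        [0.010, 0.015, 0.022, 0.018],
    "Construction":  [0.020, 0.028, 0.035, 0.030],
    "Manufacturing": [0.012, 0.011, 0.016, 0.014],
}, index=[2006, 2007, 2008, 2009])

# Pearson correlation coefficients, bounded between -1 and +1.
print(annual_default_rates.corr().round(2))
```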
