
1. THE ACADEMY OF ECONOMIC STUDIES, BUCHAREST
DOFIN - DOCTORAL SCHOOL OF FINANCE AND BANKING
Dissertation paper: Operational Risk in the Banking System - modeling and advanced quantification methods
Student: Cosma Anita Georgiana. Supervisor: Professor Moisă Altăr. Bucharest, July 2010

2. Abstract
Operational risk has become an area of growing concern in banking; the increase in the complexity of banking practices has raised both regulatory and industry awareness of the need for an effective operational risk management and measurement system. The paper presents an integrated procedure for constructing the distribution of aggregated operational losses using external data. Extreme Value Theory plays an important role: modeling is based on the Peaks Over Threshold method, in its Generalised Pareto Distribution and Point Process representations. When the relationship between severity and frequency is taken into account, a bottom-up capital figure is computed. Operational risk assessment should rely on advanced approaches and on solid management frameworks.

3. Introduction
For banks, the urgency of operational risk management has been underscored each time they confronted system failures, accounting improprieties, fraud, improper sales practices, terrorist acts, system attacks, sabotage or environmental disasters, to name only a few. During the last 10 years, the magnitude of these events served as a wake-up call for the regulatory community. In June 2004, the Basel Committee on Banking Supervision published the “Basel II” framework, introducing a capital charge for operational risk as part of the new capital adequacy framework. The Risk Management Group (RMG) of the Basel Committee and industry representatives have agreed on a standardised definition of operational risk: “the risk of loss resulting from inadequate or failed internal processes, people and systems or from external events”.
The paper is organised as follows:
• Literature Review
• Background and Theory
• Applying the model
• Conclusions

4. Literature Review
• A.A. Balkema and L. de Haan (1974) first described the asymptotic behaviour of the residual life time, exploiting the close similarity with Extreme Value Theory
• Embrechts (1997, 1999, 2003), McNeil (1997) and Bali (2003) applied EVT in finance and insurance by modeling the tail of the distribution of losses
• Artzner (1999): coherent risk measures and aggregation of risks
• Bertsimas, Lauprete and Samarov (2004) introduced a risk measure called shortfall
• Carol Alexander (2000, 2003): Bayesian methods and dimensions of operational risk management

5. Background and Theory
The Basel Committee on Banking Supervision (BCBS) proposed several approaches of increasing sophistication, leaving the door open for the development of a range of advanced approaches:
• The Basic Indicator Approach (BIA), based on a fixed percentage (‘alpha’) of gross income
• The Standardised Approach (STA), which extends the basic method by decomposing banks’ activity into eight business lines and setting a different percentage (‘beta’) for each
• The Advanced Measurement Approaches (AMA), based on banks’ internal models
The paper focuses on modeling operational risk losses and developing an advanced quantification model.
Extreme Value Theory
- has its foundations in the mathematical theory of the behaviour of extremes
- operational risk data have two “souls”: high-frequency low-impact events (the body: expected losses) and low-frequency high-impact events (the tail: extreme events); in practice, the body and the tail of the data do not necessarily belong to the same underlying distribution
- requires no particular assumptions on the nature of the original underlying distribution of all the observations, which is generally unknown
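As a toy illustration (not from the paper) of how the first two regulatory formulas work, here is a minimal Python sketch, assuming the Basel II calibration of alpha = 15% and business-line betas between 12% and 18%; the function names are illustrative:

```python
import numpy as np

ALPHA = 0.15  # Basel II 'alpha' for the Basic Indicator Approach
BETAS = {     # Basel II 'betas' for the eight Standardised Approach business lines
    "Corporate finance": 0.18, "Trading and sales": 0.18, "Retail banking": 0.12,
    "Commercial banking": 0.15, "Payment and settlement": 0.18,
    "Agency services": 0.15, "Asset management": 0.12, "Retail brokerage": 0.12,
}

def bia_charge(gross_income_3y):
    """BIA: alpha times the average of the positive annual gross income figures."""
    gi = np.asarray(gross_income_3y, dtype=float)
    positive = gi[gi > 0]
    return ALPHA * positive.mean() if positive.size else 0.0

def sta_charge(gi_by_line_3y):
    """STA: yearly sum of beta-weighted business-line gross income
    (floored at zero), averaged over three years."""
    yearly = [max(sum(BETAS[bl] * gi for bl, gi in year.items()), 0.0)
              for year in gi_by_line_3y]
    return sum(yearly) / len(yearly)
```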

6. EVT - two approaches
• “Block Maxima”
- deals with the maximum/minimum values the variable takes in successive periods
- at the heart of this approach is the “three types theorem” (Fisher and Tippett (1928)), which states that only three types of distributions can arise as limiting distributions of extreme values in random samples: the Weibull, Gumbel and Fréchet types
- these are unified in a single three-parameter model, known as the Generalised Extreme Value distribution (GEV)
• “Peaks Over Threshold”
- focuses on the realisations exceeding a given (high) threshold
- the theory (Balkema and de Haan (1974), Pickands (1975)) maintains that for a large class of underlying distribution functions F, the conditional excess distribution function $F_u(y)$, for u large, is well approximated by the Generalised Pareto Distribution (GPD):

$G_{\xi,\sigma}(x) = 1 - \left(1 + \xi \frac{x}{\sigma}\right)^{-1/\xi}$ if $\xi \neq 0$, $\quad G_{\xi,\sigma}(x) = 1 - e^{-x/\sigma}$ if $\xi = 0$   (1)

where x ≥ 0 if ξ ≥ 0, 0 ≤ x ≤ −σ/ξ if ξ < 0, and ξ and σ represent the shape and the scale parameter
- adding a location parameter μ:

$G_{\xi,\mu,\sigma}(x) = 1 - \left(1 + \xi \frac{x - \mu}{\sigma}\right)^{-1/\xi}$   (2)
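A minimal numeric sketch of (1)-(2), assuming scipy, whose genpareto uses the same (shape, location, scale) parameterisation; the names here are illustrative:

```python
import numpy as np
from scipy.stats import genpareto

def gpd_cdf(x, xi, sigma, mu=0.0):
    """GPD CDF of eq. (1)-(2): shape xi, scale sigma, optional location mu."""
    y = (np.asarray(x, dtype=float) - mu) / sigma
    if xi == 0.0:
        return 1.0 - np.exp(-y)                        # the xi = 0 (Gumbel-type) limit
    return 1.0 - np.maximum(1.0 + xi * y, 0.0) ** (-1.0 / xi)

x = np.linspace(0.0, 10.0, 50)
# cross-check against scipy's implementation
assert np.allclose(gpd_cdf(x, 0.5, 2.0), genpareto.cdf(x, c=0.5, scale=2.0))
```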

7.
- given the unknown distribution function F(x) of a random variable X, which describes the behaviour of the operational risk data in a certain BL, the excess distribution at the threshold u, $F_u(y)$, can be introduced as a conditional distribution function:

$F_u(y) = P(X - u \le y \mid X > u) = \dfrac{F(y + u) - F(u)}{1 - F(u)}, \quad 0 \le y < x_F - u$   (3)

- adopting the approximation stated by the theory:

$\lim_{u \to x_F} \ \sup_{0 \le y < x_F - u} \left| F_u(y) - G_{\xi,\beta(u)}(y) \right| = 0$   (4)

where $x_F \le \infty$ is the right endpoint of the distribution and the “excess GPD” is

$G_{\xi,\beta}(y) = 1 - \left(1 + \xi \frac{y}{\beta}\right)^{-1/\xi}$   (5)

with: y = x − u = excess, ξ = shape, β = scale
- isolating F(x) from (3):

$F(x) = [1 - F(u)]\, F_u(y) + F(u)$   (6)

- using the exceedances x in place of the excesses y, $F_u(y)$ transforms into $F_u(x - u)$
- looking at the limit condition (4), $F_u(x - u)$ can be approximated by a suitable GPD:

$F_u(x - u) \approx G_{\xi,\beta}(x - u), \quad x > u$   (7)

8.
- substituting $G_{\xi,\beta}(x - u)$ in (6):

$F(x) = [1 - F(u)]\, G_{\xi,\beta}(x - u) + F(u)$   (8)

- the element still required is the value of the distribution function at the threshold u, estimated empirically:

$F(u) = \dfrac{n - N_u}{n}$   (9)

where n is the total number of observations and $N_u$ the number of observations above the threshold u
- F(x) can now be completely expressed:

$F(x) = \dfrac{n - N_u}{n} + \dfrac{N_u}{n}\, G_{\xi,\beta}(x - u)$, which simplifies to

$F(x) = 1 - \dfrac{N_u}{n}\left(1 + \xi\, \dfrac{x - u}{\beta}\right)^{-1/\xi}$   (10)

- the tail estimator is also GPD distributed: it is the semiparametric representation of the GPD referred to all the original data, with the same shape ξ and with location and scale equal to μ and σ
- semiparametric estimates for the “full GPD” parameters can be derived from those of the “exceedance GPD”:

$\sigma = \beta \left(\dfrac{N_u}{n}\right)^{\xi}, \qquad \mu = u - \dfrac{\beta}{\xi}\left[1 - \left(\dfrac{N_u}{n}\right)^{\xi}\right]$   (11)
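The tail estimator (10) and the implied full-GPD parameters (11) could be computed as in this sketch, assuming scipy's maximum-likelihood fitting; the names are illustrative:

```python
import numpy as np
from scipy.stats import genpareto

def fit_tail(losses, u):
    """Fit the exceedance GPD above u and return the tail estimator of eq. (10)."""
    losses = np.asarray(losses, dtype=float)
    excesses = losses[losses > u] - u
    n, n_u = losses.size, excesses.size
    xi, _, beta = genpareto.fit(excesses, floc=0.0)    # ML estimates of shape and scale

    def F_hat(x):
        # eq. (10): 1 - (N_u/n) * (1 + xi*(x-u)/beta)^(-1/xi), valid for x >= u
        return 1.0 - (n_u / n) * (1.0 + xi * (x - u) / beta) ** (-1.0 / xi)

    # eq. (11): semiparametric 'full GPD' scale and location
    sigma = beta * (n_u / n) ** xi
    mu = u - (beta / xi) * (1.0 - (n_u / n) ** xi)
    return F_hat, xi, beta, mu, sigma
```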

9.
- expressing the scale parameter using the one-to-one relationship between the full GPD and the exceedance GPD:

$\beta = \sigma + \xi (u - \mu)$   (12)

- this makes it easy to move from the excess data (y = x − u) to the tail of the original data (x > u) and from the excess distribution to the underlying distribution
- consequence: if the exceedances of a threshold u follow a $GPD(\xi, \beta)$, the exceedances over higher thresholds v > u are $GPD(\xi, \beta + \xi(v - u))$
Measures of severity
Value at Risk, obtained by inverting the tail estimator (10) at probability p:

$VaR_p = u + \dfrac{\beta}{\xi}\left[\left(\dfrac{n}{N_u}(1 - p)\right)^{-\xi} - 1\right]$   (13)

Median Shortfall, starting from the Median Excess Function (that is, the median of the excesses over a threshold), obtained by inverting expression (5), the excess GPD at u, at probability p and imposing p = 1/2:

$e_{med}(u) = \dfrac{\beta}{\xi}\left(2^{\xi} - 1\right)$   (14)

and obtaining the median shortfall at u and at higher thresholds v:

$MS(v) = v + \dfrac{\beta + \xi(v - u)}{\xi}\left(2^{\xi} - 1\right)$   (15)
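A direct transcription of (13)-(15), reusing the quantities defined above (a sketch, not the paper's code):

```python
def gpd_var(p, u, xi, beta, n, n_u):
    """GPD-VaR of eq. (13): the tail estimator (10) inverted at probability p."""
    return u + (beta / xi) * (((n / n_u) * (1.0 - p)) ** (-xi) - 1.0)

def median_shortfall(v, u, xi, beta):
    """Median shortfall of eq. (15) at a threshold v >= u."""
    beta_v = beta + xi * (v - u)            # eq. (12): GPD scale at the higher threshold v
    return v + (beta_v / xi) * (2.0 ** xi - 1.0)
```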

10. Tail frequency estimate
- time-adjusted intensity:

$\lambda = \dfrac{\bar{N}}{T}$   (16)

where $\bar{N}$ is the per-bank mean number of exceedances (at v or at u) in a period of length T
- empirical average intensity pertaining to the i-th BL: the ratio between the total number of exceedances that occurred in that BL and the number of banks providing data to the i-th BL
Capital Charge

$CaR_i = \lambda_i \cdot MS_i$   (17)

where CaR is the capital at risk and i ranges from the starting threshold u to the one close to the 99.9th percentile
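A sketch of (16)-(17); note that reading (17) as the product of intensity and median shortfall at each threshold is an assumption made here, based on the bottom-up frequency-severity pairing described in the abstract:

```python
def intensity(total_exceedances, n_banks, years):
    """Eq. (16): time-adjusted per-bank intensity, lambda = N_bar / T."""
    return total_exceedances / n_banks / years

def capital_at_risk(lam, ms):
    """Eq. (17), as read here (an assumption): CaR_i = lambda_i * MS_i
    at each threshold i between u and the ~99.9th percentile."""
    return lam * ms
```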

11. Applying the model
Data characteristics and assumptions
- group database: 28 subsidiaries, located in different countries, of an international bank
- the database gathers information on operational losses exceeding 1,000 EUR
- loss database for 2004-2009: 2,406 observations
- losses grouped by business lines (5 BLs)
- independence assumption: losses are caused mainly by internal drivers; the banks do not have similar characteristics, being located in different countries, and are exposed to losses of any kind or size
- the dataset appears to capture the large-impact events, but the distribution of loss amounts contains few very large-impact losses

12. Empirical distributions are very skewed to the right and heavy in the tail

13. Extreme losses are spread out across the BLs

14. Fitting distributions to the original data
- several distributions are fitted to the data in each BL, ordered by increasing kurtosis: starting from light-tail distributions (Weibull), passing through medium-tail ones (Gamma, Exponential, Gumbel, LogNormal), to heavy-tail models (Pareto)
- the aim is to detect the curve that best explains the behaviour of the severity of losses in the tail
- parametric distributions are fitted to the five overall data sets, obtaining parameter estimates that optimize the maximum likelihood criterion
- the null hypothesis that the data originate from the chosen distribution with the estimated parameters is formulated, and the Kolmogorov-Smirnov and Anderson-Darling goodness-of-fit tests are adopted
- the test values are much higher than the critical ones at every significance level
- the lowest values of the K-S and A-D tests are bolded, showing that the Pareto distribution, followed by the LogNormal, fits the data best
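The fitting-and-testing loop could look like the sketch below (scipy's kstest covers the K-S statistic; the A-D test would need distribution-specific critical values, so it is omitted here; the names are illustrative):

```python
import numpy as np
from scipy import stats

CANDIDATES = {
    "Weibull": stats.weibull_min, "Gamma": stats.gamma, "Exponential": stats.expon,
    "Gumbel": stats.gumbel_r, "LogNormal": stats.lognorm, "Pareto": stats.pareto,
}

def goodness_of_fit(losses):
    """ML-fit each candidate severity distribution and report the K-S statistic
    (lower = better fit) under the null that the data come from that distribution."""
    results = {}
    for name, dist in CANDIDATES.items():
        params = dist.fit(losses)                        # maximum-likelihood estimates
        ks = stats.kstest(losses, dist.cdf, args=params)
        results[name] = {"params": params, "KS": ks.statistic}
    return results
```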

15. Comparing the Empirical Cumulative Distribution Function (ECDF) with the tested distributions’ CDFs

16. Comparing the Empirical Cumulative Distribution Function with the LogNormal and Pareto CDFs

17.
- even though some selected distributions fit the body of the data well, they would underestimate the severity of the data in the tail area
- the considerable right skewness of the empirical distribution causes each curve’s parameter estimates to be influenced mainly by the observations located in the left area of the empirical distribution, reducing the weight of the data located in the tail, which is the most important
Threshold selection and GPD fitting
- plot the Mean Excess Function for each BL
- detect a change in the slope of the plot
- leave a sufficient number of observations above the threshold
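The threshold diagnostic can be sketched as follows (the empirical Mean Excess Function; plotting code omitted, names illustrative):

```python
import numpy as np

def mean_excess(losses, thresholds):
    """Empirical Mean Excess Function: the average excess over each candidate
    threshold. A change in the slope of its plot suggests where GPD behaviour
    sets in, provided enough exceedances remain above the chosen threshold."""
    losses = np.asarray(losses, dtype=float)
    return np.array([(losses[losses > u] - u).mean() for u in thresholds])
```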

18.
- several GPD models are fitted to the excesses, with thresholds ranging from the chosen one to higher values
- final results of the threshold selection: parameter estimates, K-S and A-D test values and critical values
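One way to reproduce this step is to refit the GPD over a grid of thresholds and inspect the stability of the estimates, as in this sketch:

```python
import numpy as np
from scipy import stats

def gpd_fits_over_thresholds(losses, thresholds):
    """Refit the exceedance GPD at each candidate threshold and report
    (xi, beta, K-S statistic) to judge the stability and quality of the fit."""
    losses = np.asarray(losses, dtype=float)
    out = {}
    for u in thresholds:
        excesses = losses[losses > u] - u
        xi, _, beta = stats.genpareto.fit(excesses, floc=0.0)
        ks = stats.kstest(excesses, stats.genpareto.cdf, args=(xi, 0.0, beta))
        out[u] = (xi, beta, ks.statistic)
    return out
```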

19. The GPD curve is plotted together with the empirical distribution function; the closeness of the GPD and the empirical distribution in the tails is now apparent

20. Measures of severity
Value at Risk
- computed on the basis of the estimates of ξ and β obtained before, at different confidence levels and for each BL
- not a coherent measure of risk: it does not satisfy the property of subadditivity
Median Shortfall
- each BL’s severity riskiness increases remarkably at the highest percentiles
- the ranking of the three riskiest BLs (Commercial Banking, Trading and Sales, Retail Banking) does not change as the threshold is raised to the 99.5th percentile
[Tables: GPD-VaR (‘000 EUR) and GPD-MS (‘000 EUR) by BL and confidence level]

21. Frequency modeling
- per-bank analysis of the actual number of exceedances of the GPD starting threshold occurring in each BL: this number is rather widespread across the panel of banks; some banks may have no losses above the selected threshold; possible gaps in the data collection
- the variability of the frequency of large losses may lie in the fact that banks have different sizes and are not in the same position to produce losses of the same magnitude
- the number of per-bank exceedances is fitted by Poisson, Negative Binomial and Geometric distributions
- the values of the K-S and A-D tests closest to the critical ones are bolded: the Negative Binomial model is the best-fitting one
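A sketch of the count-model comparison (moment/maximum-likelihood estimators; the K-S statistic on discrete data is only indicative, matching its use here as a ranking device):

```python
import numpy as np
from scipy import stats

def fit_counts(counts):
    """Fit Poisson, Negative Binomial and Geometric models to the per-bank
    exceedance counts and rank them by the K-S statistic (lower = better)."""
    k = np.asarray(counts, dtype=float)
    m, v = k.mean(), k.var(ddof=1)
    fits = {"Poisson": stats.poisson(mu=m),
            "Geometric": stats.geom(p=1.0 / (1.0 + m), loc=-1)}  # support {0, 1, ...}
    if v > m:  # over-dispersed data: the Negative Binomial is well defined
        r = m * m / (v - m)                 # moment matching: var = m + m^2 / r
        fits["Negative Binomial"] = stats.nbinom(n=r, p=r / (r + m))
    return {name: stats.kstest(k, d.cdf).statistic for name, d in fits.items()}
```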

22. Graphical representations of the ECDF of the number of exceedances for each BL, versus the Negative Binomial and Geometric CDFs

23.
- possible incompleteness in the number of extreme losses
- assume that extreme losses exceeding the 99.9th percentile (which have a frequency of 1 in our database) will not occur with a frequency higher than the one corresponding to extreme losses exceeding the 99th percentile (given that the frequency decreases as we move to higher percentiles)
- the floor was chosen taking into consideration the highest percentile at which there is more than one exceedance for all the business lines
- compute, for each BL and at any percentile, an estimate of the operational risk capital charge required to cover expected plus unexpected losses
[Tables: frequencies and intensities corresponding to the 90th-99.99th percentiles]
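Putting the pieces together, the per-percentile capital figures could be assembled as below (a self-contained sketch; the pairing CaR = intensity × median shortfall is the assumed reading of eq. (17)):

```python
import numpy as np

def capital_schedule(losses, u, xi, beta, n_banks, years, percentiles):
    """For each percentile p: threshold v = GPD-VaR_p (eq. 13), empirical per-bank
    yearly intensity of exceedances of v (eq. 16), median shortfall at v (eq. 15),
    and their product as the capital figure (assumed eq. 17)."""
    losses = np.asarray(losses, dtype=float)
    n, n_u = losses.size, (losses > u).sum()
    rows = []
    for p in percentiles:
        v = u + (beta / xi) * (((n / n_u) * (1.0 - p)) ** (-xi) - 1.0)
        lam = (losses > v).sum() / (n_banks * years)
        ms = v + (beta + xi * (v - u)) / xi * (2.0 ** xi - 1.0)
        rows.append((p, v, lam, lam * ms))
    return rows  # (percentile, threshold, intensity, CaR)
```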

  24. Capital Charge

25. Conclusions
• Low performance of conventional severity models in describing the overall data characteristics: any traditional distribution applied to all the data in each business line tends to fit the central observations well, but not the tail
• The Peaks Over Threshold approach appears to be a suitable tool, since it takes into account the relationship between the frequency and the severity of large losses up to the end of the distribution
• The model remains highly sensitive to the largest observed losses: the constructions rely on mathematical assumptions that reality does not always fit
• A solid reporting system for operational losses would help banks expand their approaches to operational risk and improve risk assessment

26. Thank you!

27. Selective References
• Alexander, C. (2000a), “Bayesian Methods for Measuring Operational Risk”, Discussion Paper in Finance, ISMA Centre
• Alexander, C. (2003b), “Operational Risk: Regulation, Analysis and Management”, Financial Times, Great Britain
• Artzner, P., F. Delbaen, J.M. Eber and D. Heath (1999), “Coherent measures of risk”, Mathematical Finance, 9(3), 203-228
• Allen, L. and T.G. Bali (2004), “Cyclicality in Catastrophic and Operational Risk Measurements”, NYU Working Paper No. FIN-04-019
• Balkema, A.A. and L. de Haan (1974), “Residual life time at great age”, Annals of Probability, 2, 792-804
• Bali, T.G. (2003), “An extreme value approach to estimating volatility and Value at Risk”, Journal of Business, 76(1), 83-108
• Bertsimas, D., G.J. Lauprete and A. Samarov (2004), “Shortfall as a risk measure: properties, optimization and applications”, Journal of Economic Dynamics & Control, 28, 1353-1381
• Basel Committee on Banking Supervision (2005), “International Convergence of Capital Measurement and Capital Standards”, Revised Framework, 140-152, Basel, BIS
• Basel Committee on Banking Supervision (2001), “Working Paper on the Regulatory Treatment of Operational Risk”, Basel, BIS
• Basel Committee on Banking Supervision (2003), “The 2002 Loss Data Collection Exercise for Operational Risk: Summary of the Data Collected”, Basel, BIS
• Basel Committee on Banking Supervision (2004), “Principles for the home-host recognition of AMA operational risk capital”, Basel, BIS
• Basel Committee on Banking Supervision (2009), “Basel Committee initiatives in response to the financial crisis”, Basel, BIS

28.
• Chapelle, A., Y. Crama, G. Hubner and J.P. Peters (2005a), “Measuring and Managing Operational Risk in the Financial Sector: An Integrated Framework”, Working Paper, Social Science Research Network (SSRN)
• Chapelle, A., Y. Crama, G. Hubner and J.P. Peters (2006b), “Practical Methods for Measuring and Managing Operational Risk in the Financial Sector: A Clinical Study”, HEC de l’Universite de Liege, No. 200611/13
• Chavez-Demoulin, V., P. Embrechts and J. Neslehova (2005), “Quantitative Models for Operational Risk: Extremes, Dependence and Aggregation”, paper presented as an invited contribution at the meeting “Implementing an AMA for Operational Risk”, Federal Reserve Bank of Boston
• Chorafas, D. (2004), “Operational Risk Control with Basel II: Basic Principles and Capital Requirements”, Elsevier, Oxford
• De Fontnouvelle, P. and J. Jordan (2004), “Implications of Alternative Operational Risk Modeling Techniques”, paper prepared for the NBER Project on the Risks of Financial Institutions
• De Fontnouvelle, P., V. DeJesus-Rueff, J. Jordan and E. Rosengren (2003), “Using Loss Data to Quantify Operational Risk”, Federal Reserve Bank of Boston, Working Paper
• Davison, A.C. and R.L. Smith (1990), “Models for exceedances over high thresholds (with discussion)”, Journal of the Royal Statistical Society, 52, 393-442
• Embrechts, P., R. Kaufmann and G. Samorodnitsky (2004), “Ruin theory revisited: stochastic models for operational risk”, ECB, Volume on Foreign Reserves Risk Management
• Embrechts, P., S.I. Resnick and G. Samorodnitsky (1999), “Extreme Value Theory as a risk management tool”, North American Actuarial Journal, 3, 30-41
• Embrechts, P., C. Kluppelberg and T. Mikosch (1997), “Modelling Extremal Events for Insurance and Finance”, New York, Springer

29.
• Embrechts, P. (1999), “Extreme value theory in finance and insurance”, manuscript, Department of Mathematics, ETH, Swiss Federal Technical University
• Frachot, A., T. Roncalli and E. Salomon (2004), “The Correlation Problem in Operational Risk”, Social Science Research Network (SSRN), OperationalRisk, Risk’s Newsletter
• Hill, B.M. (1975), “A simple general approach to inference about the tail of a distribution”, Annals of Statistics, 3(5), 1163-1174
• Hussain, A. (2000), “Managing Operational Risk in Financial Markets”, Butterworth-Heinemann, Oxford
• Hoffman, D. (2002), “Managing Operational Risk: 20 Firmwide Best Practice Strategies”, John Wiley & Sons, Inc., New York
• Jarrow, R. (2008), “Operational risk”, Journal of Banking and Finance, Elsevier, 32(5), 870-879
• Moscadelli, M. (2004), “The modelling of operational risk: experience with the analysis of the data collected by the Basel Committee”, Working Paper, Bank of Italy publications, No. 517
• McNeil, A.J. and T. Saladin (1997), “The peaks over threshold method for estimating high quantiles of loss distributions”, Proceedings of the XXVIIth International ASTIN Colloquium, Cairns, Australia, 23-43
• McNeil, A.J. (1997), “Estimating the tails of loss severity distributions using extreme value theory”, ASTIN Bulletin, 27, 117-137
• Neslehova, J., P. Embrechts and V. Chavez-Demoulin (2006), “Infinite mean models and the LDA for operational risk”, Journal of Operational Risk, 1(1), 3-25
• Rootzen, H. and N. Tajvidi (2006), “Multivariate generalized Pareto distributions”, Bernoulli, 12(5), 917-930
• Panjer, H. (2006), “Operational Risk: Modeling Analytics”, John Wiley & Sons, Inc., New Jersey
