Putting the Power of Modern Applied Stochastics into DFA

Presentation Transcript


  1. Putting the Power of Modern Applied Stochastics into DFA Peter Blum 1) 2), Michel Dacorogna 2), Paul Embrechts 1) 1) ETH Zurich, Department of Mathematics, CH-8092 Zurich (Switzerland), www.math.ethz.ch/finance 2) Zurich Insurance Company, Reinsurance, CH-8022 Zurich (Switzerland), www.zurichre.com

  2. Situation and intention • Applied stochastics provides many models that lend themselves to use in DFA scenario generation => an opportunity to profit from advanced research. • However, DFA poses some very specific requirements that are not necessarily met by a given model => a risk when using models uncritically. • Goal: provide some guidance on how to (re)use stochastic models in DFA.

  3. Topics • Observations on the use of models from mathematical finance (one discipline of applied stochastics) in DFA • Updates on the modelling of rare and extreme events (multivariate data and time series) • Annotated bibliography

  4. DFA & Mathematical Finance: Situation • DFA scenario generation requires models for economy and assets: interest rates, stock markets, inflation, etc. • Mathematical finance provides many such models that can be used in DFA. • However, care must be taken because of some particularities related to DFA. • Hereafter: some reflections...

  5. Mathematical Finance: Background • Most models in mathematical finance were developed for derivatives valuation. Fundamental paradigms here: • no-arbitrage • risk-neutral valuation • Most models apply to one single risk factor; truly multivariate asset models are rare. • Most models are based on the Gaussian distribution or Brownian motion for the sake of tractability. (However: an upcoming trend towards more advanced concepts.)

  6. Excursion: the principle of no-arbitrage • „In an efficient, liquid financial market, it is not possible to make a profit without risk.“ • No-arbitrage can be given a rigorous mathematical formulation (assuming efficient markets). • Asset models for derivative valuation are constructed so as to be formally arbitrage-free. • However, real markets have imperfections; i.e. formally arbitrage-free models are often hard to fit to real-world data.

  7. Excursion: risk-neutral valuation • In a no-arbitrage environment, the price of a derivative security is the conditional expectation of its terminal value under the risk-neutral probability measure. • Risk neutral measure: probability measure under which the asset price process is a martingale. • Risk-neutral measure is different from the real-world probability measure: different probabilities for events. • Many models designed such that they yield explicit option prices under risk neutral measure.
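
In formula form (a standard textbook statement, not reproduced on the slide): writing H_T for the payoff at maturity T, r for the short rate and Q for the risk-neutral measure, the price at time t is

```latex
\pi_t = \mathbb{E}^{Q}\!\left[\, e^{-\int_t^T r_s\,ds}\, H_T \,\middle|\, \mathcal{F}_t \right].
```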

  8. Implications on models • Many models in mathematical finance are designed such that • they are formally arbitrage-free, • they allow for explicit solutions for option prices; • i.e. the model structure is often driven by mathematical convenience. • Examples: Black-Scholes, but also Cox-Ingersoll-Ross, HJM. • These technical restrictions can often not be reconciled with the observed statistical properties of real-world data. • Classical example: the volatility smile in the Black-Scholes model.

  9. Consequences for DFA • Most important for DFA: Models must faithfully reproduce the observable real-world behaviour of the modelled assets. • Therefore: fundamental differences in paradigms underlying the selection or construction of models. • Hence: take care when using models in DFA that were mainly constructed for derivative pricing. • A little case study for illustration...

  10. A little case study: CIR • Cox-Ingersoll-Ross model for short-term interest rate r(t) and zero-coupon yields R(t,T).
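
The original slide presumably displayed the model equations; for reference, the standard CIR specification with mean-reversion speed κ, long-term mean θ and volatility σ is

```latex
dr(t) = \kappa\bigl(\theta - r(t)\bigr)\,dt + \sigma\sqrt{r(t)}\,dW(t),
\qquad
R(t,T) = -\frac{\ln P(t,T)}{T-t} = \frac{B(T-t)\,r(t) - \ln A(T-t)}{T-t},
```

where P(t,T) = A(T−t) e^{−B(T−t) r(t)} is the zero-coupon bond price and A, B are the known CIR functions of κ, θ and σ.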

  11. CIR: Properties • One-factor model: only one source of randomness. • Nice analytical properties: explicit formulae for • Zero-coupon yields, • Bond prices, • Interest rate option prices. • (Fairly) easy to calibrate (Generalized Method of Moments). • But: How well does CIR reproduce the behaviour of the real-world interest rate data?
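
As a minimal sketch of how such a short-rate model would feed a DFA scenario generator (Euler discretisation with full truncation; all parameter values are illustrative assumptions, not a calibration from the talk):

```python
import numpy as np

def simulate_cir(r0, kappa, theta, sigma, dt, n_steps, n_paths, seed=0):
    """Euler-Maruyama simulation of the CIR short rate with full truncation."""
    rng = np.random.default_rng(seed)
    r = np.full(n_paths, r0, dtype=float)
    paths = np.empty((n_steps + 1, n_paths))
    paths[0] = r
    for t in range(1, n_steps + 1):
        dw = rng.normal(0.0, np.sqrt(dt), n_paths)
        r_pos = np.maximum(r, 0.0)  # keep the square root well defined
        r = r + kappa * (theta - r_pos) * dt + sigma * np.sqrt(r_pos) * dw
        paths[t] = r
    return paths

# Example: 10 years of monthly steps, hypothetical parameters
paths = simulate_cir(r0=0.03, kappa=0.5, theta=0.04, sigma=0.1,
                     dt=1/12, n_steps=120, n_paths=1000)
```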

  12. CIR: Yield Curves

  13. CIR Yield Curves: Remarks • CIR: the yield curve is fully determined by the short-term rate! • Simulated curves always tend from the short-term rate towards the long-term mean. • Hence: insufficient reproduction of the empirical characteristics of yield curves, e.g. humped and inverted shapes. • From this point of view: CIR is not suitable for DFA! • But: what about the short-term rate?

  14. CIR: Short-term Rate (I) • Classical source: the paper by Chan, Karolyi, Longstaff, and Sanders („CKLS“). • Evaluation based on T-Bill data from 1964 to 1989: • involving the high-rate period 1979-1982 • involving possible regime switches in 1971 (Bretton Woods) and 1979 (change of Fed policy). • Parameter estimation by classical GMM. • CKLS's conclusion: CIR performs poorly for the short rate!
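
For context, the general one-factor specification that CKLS use to nest and compare short-rate models is (standard form, not shown on the slide)

```latex
dr_t = (\alpha + \beta r_t)\,dt + \sigma\, r_t^{\gamma}\, dW_t,
\qquad \text{CIR corresponds to } \gamma = \tfrac{1}{2},\ \alpha = \kappa\theta,\ \beta = -\kappa.
```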

  15. CIR: Short-term Rate (II) • More recent study: Dell‘Aquila, Ronchetti, and Trojani • Evaluation on different data sets: • same as CKLS • euro-mark and euro-dollar series 1975-2000 • Parameter estimation by robust GMM. • Conclusions: classical GMM leads to unreliable estimates; CIR with parameters estimated by robust GMM describes the post-1982 data fairly well. • Hence: CIR can be a good model for the short-term rate!

  16. Methodological conclusions • Thorough statistical analysis of historical data is crucial! Alternative estimation methods (e.g. robust statistics) may bring better results than classical methods. • Models may need modification to fit needs of DFA. • Careful model validation must be done in each case. • Models that are good for other tasks are not necessarily good for DFA (due to different requirements). • Residual uncertainty must be taken into account when evaluating final DFA results.

  17. Excursion: Robust Statistics • Methods for data analysis and inference on data of poor quality (satisfying only weak assumptions). • Relaxed assumptions on normality. • Tolerance against outliers. • Theoretically well founded; well established in practice in the natural and life sciences. • Not yet very popular in finance; however: emerging use. • Especially interesting for DFA: small sample asymptotics. • Relevance of estimates based on little data...
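
To make the idea concrete, here is a minimal sketch of a Huber M-estimator of location (an illustrative example of a robust estimator, not the method used in the cited studies; the tuning constant c = 1.345 is the usual textbook choice):

```python
import numpy as np

def huber_location(x, c=1.345, tol=1e-8, max_iter=100):
    """Huber M-estimate of location via iteratively reweighted averaging.

    Scale is fixed at the (normal-consistent) MAD; c = 1.345 gives roughly 95%
    efficiency at the Gaussian model while bounding the influence of outliers.
    """
    x = np.asarray(x, dtype=float)
    mu = np.median(x)
    scale = 1.4826 * np.median(np.abs(x - mu))  # MAD
    for _ in range(max_iter):
        u = (x - mu) / scale
        w = np.where(np.abs(u) <= c, 1.0, c / np.abs(u))  # Huber weights
        mu_new = np.sum(w * x) / np.sum(w)
        if abs(mu_new - mu) < tol:
            break
        mu = mu_new
    return mu

rng = np.random.default_rng(0)
data = np.r_[rng.normal(0.02, 0.01, 50), 0.25]   # one gross outlier
print(np.mean(data), huber_location(data))        # the mean is dragged by the outlier
```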

  18. An alternative model for interest rates (I) • Due to Cont; based on a careful statistical study of yield curves by Bouchaud et al. (a nice methodological reference) • Designed from the outset to reproduce the real-world statistical behaviour of yield curves. • Can be linked to inflation and stock index models. • Theoretically not arbitrage-free. However – if well fitted: „as arbitrage-free as the real world...“

  19. An alternative model for interest rates (II)

  20. Multivariate Models: Problem Statement • Models for single risk factors (underwriting and financial) are available from actuarial and financial science. • However: „The whole is more than the sum of its parts.“ Dependences must be duly modelled. • Not modelling dependences suggests diversification possibilities where none are present. • Significant dependences are present on the financial and on the underwriting side.

  21. Dependences: Example

  22. Particular problem: integrated asset model • An economic and investment scenario generator for DFA (involving inflation, interest rates, stock prices, etc.) must reflect various aspects: • marginal behaviour of the variables over time • in particular: long-term aspects (many years ahead) • dependences between the different variables • „unusual“ and „extreme“ outcomes • economic stylized facts • Hence: need for an integrated model, not just a collection of univariate models for single risk factors.
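
As a minimal illustration of the integration point (not the authors' model): even the simplest multivariate random-walk generator couples inflation, interest rates and equities through a joint covariance structure instead of simulating each variable in isolation. All names and numbers below are illustrative assumptions.

```python
import numpy as np

means = np.array([0.02, 0.03, 0.06])        # annual drift: inflation, short rate, equity log-return
vols  = np.array([0.01, 0.015, 0.18])       # annual volatilities
corr  = np.array([[ 1.0,  0.6, -0.2],
                  [ 0.6,  1.0, -0.1],
                  [-0.2, -0.1,  1.0]])
cov  = np.outer(vols, vols) * corr
chol = np.linalg.cholesky(cov)              # induces the cross-variable dependence

rng = np.random.default_rng(42)
n_years, n_scenarios = 10, 5000
z = rng.standard_normal((n_scenarios, n_years, 3))
increments = means + z @ chol.T             # correlated annual increments
scenarios = np.cumsum(increments, axis=1)   # cumulative paths, one per scenario
```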

  23. General modelling approaches • Statistical: by using multivariate time series models • established standard methods, nice quantitative properties • practical interpretation of model elements often difficult • Fundamental: by using formulae from economic theory • explains well the „usual“ behaviour of the variables • often suboptimal quantitative properties • Phenomenological: compromise between the two • models designed for reflecting statistical behaviour of data • allowing nevertheless for practical interpretation • Phenomenological approach most promising for DFA.

  24. Economic and investment models • „CIR + CAPM“ as in Dynamo • Wilkie model in different variants (widespread in the UK) • Continuous-time models by Cairns, Chan, Smith • Random walk models with Gaussian or α-stable innovations • Etc.: see bibliography. • None of the models outperforms the others.

  25. Investment models: open issues • Exploration of alternative model structures • Model selection and calibration • Long-term behaviour: stability, convergence, regime switches, drifts in parameters, etc. • Choice of initial conditions • Inclusion of rare and extreme events • Inclusion of exogenous forecasts • Time scaling and aggregation properties • Framework for model risk management

  26. Excursion: Model Risk Management • Qualify and (as far as possible) quantify the uncertainty as to the appropriateness of the model in use. • Which relevant dangers are (not) reflected by the model? • Interpretation of simulation results given model uncertainty • Particularly important in DFA: long-term issues. • Little done on MRM in quantitative finance up to now (exception: pure parameter risk). • Sources of inspiration: statistics (frequentist and Bayesian), economics, information theory (Akaike...), etc.

  27. Rare & extreme events: problem statement • Rare but extreme events are one particular danger for an insurance company. • Hence, DFA scenarios must reflect such events. • Extreme Value Theory (EVT) is a useful tool. • Cf. Paul Embrechts‘ presentation last year. • Some additional topics of interest for DFA: • Time series with heavy-tailed residuals • Multivariate extensions

  28. The classical case • X1, ..., Xn ~ iid (or stationary with additional assumptions) • Xi: univariate observations • Investigation of max{X1, ..., Xn} => Generalized Extreme Value distribution (GEV) • Investigation of P(Xi − u ≤ x | Xi > u) (excess distribution of Xi over some threshold u) => Generalized Pareto Distribution (GPD)
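
For reference, the standard parametric forms of the two limit families (textbook normalisation, not reproduced on the slide) are

```latex
H_{\xi}(x) =
\begin{cases}
\exp\!\bigl(-(1+\xi x)^{-1/\xi}\bigr), & \xi \neq 0,\ 1+\xi x > 0,\\
\exp\!\bigl(-e^{-x}\bigr), & \xi = 0,
\end{cases}
\qquad
G_{\xi,\beta}(x) =
\begin{cases}
1-\bigl(1+\xi x/\beta\bigr)^{-1/\xi}, & \xi \neq 0,\\
1-e^{-x/\beta}, & \xi = 0,
\end{cases}
```

with H the GEV (up to location/scale normalisation) and G the GPD with shape ξ and scale β.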

  29. The classical case: applications • Well established in the actuarial and financial field: • Description of high quantiles and tails • Computation of risk measures such as VaR or Conditional VaR (= Expected Shortfall; cf. Expected Policyholder Deficit) • Scenario generation for simulation studies • Etc. • In general: a consistent language for describing extreme risks across various risk factors.
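
A minimal peaks-over-threshold sketch of the quantile/VaR application (illustrative threshold and simulated data; real studies require threshold diagnostics such as mean excess plots):

```python
import numpy as np
from scipy import stats

def gpd_tail_quantile(losses, u, p):
    """POT estimate of the p-quantile (VaR_p): fit a GPD to the excesses over u
    and plug the fit into the standard POT quantile formula."""
    losses = np.asarray(losses, dtype=float)
    excesses = losses[losses > u] - u
    xi, _, beta = stats.genpareto.fit(excesses, floc=0)   # shape xi, scale beta
    n, n_u = len(losses), len(excesses)
    if abs(xi) > 1e-8:
        return u + (beta / xi) * (((n / n_u) * (1 - p)) ** (-xi) - 1)
    return u - beta * np.log((n / n_u) * (1 - p))

losses = stats.pareto.rvs(b=2.5, size=10_000, random_state=1)  # simulated heavy-tailed losses
print(gpd_tail_quantile(losses, u=np.quantile(losses, 0.95), p=0.999))
```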

  30. Multivariate extremes: setup and context • As before: X1, ..., Xn ~ iid, but now Xi ∈ ℝn (multivariate) • Relevant for insurance and DFA? Yes, in some cases, e.g. • Correlated natural perils (in the absence of suitable CAT modelling tool coverage). • Presence of multi-trigger products in R/I • Area of active research; however, still in its infancy: • Some publications on workable theoretical foundations • Few (pre-industrial) applied studies (FX data, flood, etc.) • Considerable progress expected over the next few years.

  31. Multivariate extremes: problems (I) • No natural order in multidimensional space: • => no „natural“ notion of extremes • Different conceptual approaches exist: • Spectral measure + tail index (think of a transformation into polar coordinates) • Tail dependence function (= copula transform of the joint distribution) • Both approaches are practically workable. • A generally established, workable theory is not yet available.
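
A back-of-the-envelope illustration of the second approach: estimating upper tail dependence on the copula scale from pseudo-observations (a deliberately naive estimator for illustration only, not one of the cited studies):

```python
import numpy as np
from scipy.stats import rankdata

def upper_tail_dependence(x, y, q=0.95):
    """Empirical estimate of P(Y in its top (1-q) tail | X in its top (1-q) tail),
    evaluated at a finite quantile level q."""
    u = rankdata(x) / (len(x) + 1)   # pseudo-observations (copula scale)
    v = rankdata(y) / (len(y) + 1)
    exceed = u > q
    return float(np.mean(v[exceed] > q)) if exceed.any() else float("nan")
```

In practice one would examine how the estimate behaves as q approaches 1, since the tail dependence coefficient is defined as a limit.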

  32. Multivariate extremes: problems (II) • In the multivariate setup: „The Curse of Dimensionality“ • The number of data points required for obtaining „well determined“ parameter estimates increases dramatically with the dimension. • However, extreme events are rare by definition... • Problem perceived as tractable in „low“ dimension (2, 3, 4) • Most published studies are in two dimensions • Higher-dimensional problems are beyond the scope of current methods

  33. Time series with heavy-tailed residuals • Given some time series model (e.g. AR(p)): Xt = f(Xt−1, Xt−2, ...) + εt, where ε1, ε2, ... ~ iid, E(εi) = 0 • Usually: εt ~ N(0, σ2) (Gaussian) • However: there are time series that cannot be reconciled with the assumption of Gaussian residuals (even on such high levels of time aggregation as in DFA). • Therefore: think of heavier-tailed – also skewed – distributions for the residuals! (Various approaches exist.)

  34. Heavy-tailed residuals: example • QQ normal plots of yearly inflation (Switzerland and USA) • Straight line indicates theoretical quantiles of Gaussian distribution.
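
A sketch of how such a diagnostic would be produced (the data below are simulated placeholders, not the Swiss or US inflation series from the slide):

```python
import matplotlib.pyplot as plt
from scipy import stats

# Placeholder for yearly inflation data; a Student-t sample mimics the heavy tails.
inflation = stats.t.rvs(df=3, loc=0.02, scale=0.015, size=60, random_state=0)

fig, ax = plt.subplots()
stats.probplot(inflation, dist="norm", plot=ax)   # points vs. Gaussian theoretical quantiles
ax.set_title("Normal QQ plot of yearly inflation (placeholder data)")
plt.show()
```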

  35. Heavy-tailed residuals: direct approach • Linear time series model (e.g. AR(p)), with residuals having a symmetric α-stable (sαs) distribution. • sαs: a general class of more or less heavy-tailed distributions; • α = characteristic exponent; can be estimated from data. • α = 2 => Gaussian; α = 1 => Cauchy. • Disadvantage: sαs random variables are in general difficult to simulate. • Take care with other heavy-tailed distributions (e.g. Student's t): multiperiod simulations may become uncontrollable.
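
A minimal sketch of an AR(1) with symmetric α-stable innovations, relying on scipy's levy_stable distribution (β = 0 gives the symmetric case; parameter values are illustrative assumptions):

```python
import numpy as np
from scipy.stats import levy_stable

def simulate_ar1_stable(phi, alpha, scale, n, x0=0.0, seed=0):
    """AR(1) with symmetric alpha-stable innovations: X_t = phi * X_{t-1} + eps_t."""
    eps = levy_stable.rvs(alpha, 0.0, loc=0.0, scale=scale, size=n, random_state=seed)
    x = np.empty(n)
    prev = x0
    for t in range(n):
        prev = phi * prev + eps[t]
        x[t] = prev
    return x

path = simulate_ar1_stable(phi=0.6, alpha=1.7, scale=0.01, n=100)
```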

  36. Superposition of shocks • Normal model with superimposed rare, but extreme shocks: Xt = f(Xt−1, Xt−2, ...) + εt + δt ηt • δ1, δ2, ... ~ iid Bernoulli variables (occurrence of a shock) • η1, η2, ... the actual shock events • Problem: recovery of the model from the shock! • The shock itself is realistic as compared to the data. • But the model recovers much faster/slower than the actual data. • Hence: care must be taken.
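
A minimal sketch of the shock-superposition idea with an AR(1) baseline (all parameters illustrative; note that after a shock the process reverts geometrically at rate phi, which is exactly the recovery issue mentioned above):

```python
import numpy as np

def simulate_ar1_with_shocks(phi, sigma, shock_prob, shock_scale, n, seed=0):
    """AR(1) with Gaussian noise plus rare superimposed shocks:
    X_t = phi * X_{t-1} + eps_t + delta_t * eta_t,
    delta_t ~ Bernoulli(shock_prob), eta_t ~ N(0, shock_scale^2)."""
    rng = np.random.default_rng(seed)
    eps = rng.normal(0.0, sigma, n)
    delta = rng.random(n) < shock_prob
    eta = rng.normal(0.0, shock_scale, n)
    x = np.empty(n)
    prev = 0.0
    for t in range(n):
        prev = phi * prev + eps[t] + delta[t] * eta[t]
        x[t] = prev
    return x

path = simulate_ar1_with_shocks(phi=0.7, sigma=0.01, shock_prob=0.02,
                                shock_scale=0.08, n=200)
```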

  37. Continuous-time approaches • „Alternatives to Brownian Motion“ (i.e. Gaussian processes) • General Lévy processes • Continuous-time α-stable processes • Jump-diffusion processes (e.g. Brownian motion with a superimposed Poisson shock process) • Theory well understood in the univariate case. • Emerging use in finance (e.g. Morgan Stanley) • Multivariate case more difficult: difficulties with correlation because the second moment is infinite.
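
As one concrete instance of the jump-diffusion idea (a standard Merton-type sketch, not a model from the talk; all parameters are illustrative):

```python
import numpy as np

def simulate_jump_diffusion(mu, sigma, jump_rate, jump_mu, jump_sigma, dt, n_steps, seed=0):
    """Log-price with drift, Brownian diffusion and a compound Poisson jump part:
    dX_t = mu dt + sigma dW_t + dJ_t, jump sizes ~ N(jump_mu, jump_sigma^2)."""
    rng = np.random.default_rng(seed)
    diffusion = mu * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_steps)
    n_jumps = rng.poisson(jump_rate * dt, n_steps)                  # jumps per step
    jumps = np.array([rng.normal(jump_mu, jump_sigma, k).sum() for k in n_jumps])
    return np.cumsum(diffusion + jumps)

log_price = simulate_jump_diffusion(mu=0.05, sigma=0.15, jump_rate=0.5,
                                    jump_mu=-0.1, jump_sigma=0.05, dt=1/12, n_steps=120)
```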

  38. Further approaches • Heavy-tailed random walks (sαs innovations); possibly corrected by expected forward premiums (where available). • Regime-switching time series models, e.g. Threshold Autoregressive (TAR or SETAR = Self-Exciting TAR). • Non-linear time series models: ARCH or GARCH (however: more suitable for higher-frequency data).
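
A minimal sketch of the regime-switching idea, here a self-exciting threshold AR(1) where the autoregressive coefficient depends on whether the previous value lies below or above a threshold (illustrative parameters, not fitted to any data set):

```python
import numpy as np

def simulate_setar(n, threshold=0.0, phi_low=0.9, phi_high=0.3, sigma=0.01, seed=0):
    """Two-regime SETAR(1) simulation: phi_low applies below the threshold,
    phi_high above it."""
    rng = np.random.default_rng(seed)
    x = np.empty(n)
    prev = 0.0
    for t in range(n):
        phi = phi_low if prev <= threshold else phi_high
        prev = phi * prev + rng.normal(0.0, sigma)
        x[t] = prev
    return x

path = simulate_setar(500)
```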

  39. Conclusions (I) • Applied stochastics and, in particular, mathematical finance offer many models that are useful for DFA. • However, before using a model, careful analysis must be made in order to assess the appropriateness of the model under the specific conditions of DFA. Modifications may be necessary. • The quality of a calibrated model crucially depends on sensible choices of historical data and methods for parameter estimation.

  40. Conclusions (II) • Time dependence of and correlation between risk factors are crucial in the multivariate and multiperiod setup of DFA. Particularly when confronted with rare and extreme events: • Time series models with heavy tails are well understood and lend themselves to use in DFA. • Multivariate extreme value theory is still in its infancy, but workable approaches can be expected to emerge within the next few years.
