
How is VaR applied/validated?


Presentation Transcript


  1. How is VaR applied/validated?
  • Used in limits; an industry-standard component of the regulatory capital calculation
  • Multipliers - stress testing or just multiply VaR
  • Stress testing implies heavyweight re-pricing vs. an estimated "greeks approach"
  • Validated by comparing to "clean" PnL (PnL due to market moves vs. fee-based PnL)
  • We "backtest" by comparing to actual "clean" PnL
  • With the greeks approach we need to store the relevant pre-calculated sensitivities for that day
  • Storing greeks allows us to avoid full re-pricing (we won't have to instantiate calculators)
  • It is also a quick way to do "attribution"
  • PnL can be attributed to 2 changing variables: Spread (credit spread) and Yield (treasury rate) - see the sketch after this slide
  • However, without full revaluation we won't capture optionality in path-dependent products
  Good VaR quote:
  • "One to three times VaR are normal occurrences. You expect periodic VaR breaks. The loss distribution typically has fat tails, and you might get more than one break in a short period of time. … So an institution that can't deal with three times VaR losses as routine events probably won't survive long enough to put a VaR system in place. Three to ten times VaR is the range for stress testing. Institutions should be confident they have examined all the foreseeable events that will cause losses in this range, and are prepared to survive them. … Foreseeable events should not cause losses beyond ten times VaR. If they do, they should be hedged or insured, or the business plan should be changed to avoid them, or VaR should be increased. It's hard to run a business if foreseeable losses are orders of magnitude larger than very large everyday losses. It's hard to plan for these events, because they are out of scale with daily experience. Of course there will be unforeseeable losses more than ten times VaR, but it's pointless to anticipate them; you can't know much about them and it results in needless worrying. Better to hope that the discipline of preparing for all foreseeable three-to-ten times VaR losses will improve chances for surviving the unforeseen and larger losses that inevitably occur. A risk manager has two jobs: make people take more risk the 99% of the time it is safe to do so, and survive the other 1% of the time. VaR is the border."
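  The greeks-based attribution mentioned above can be sketched roughly as follows. This is a minimal illustration only: the names (PositionGreeks, attribute_pnl) are hypothetical, and it assumes a stored yield DV01 and a spread DV01 per position, with amounts in thousands and the Risk = Amount * DV01/100 scaling used later in the deck.

    // A rough sketch of greeks-based PnL attribution (not course code).
    struct PositionGreeks {
        double amount_000s;   // position size in thousands (long > 0, short < 0)
        double yield_dv01;    // price change per 1bp move in the treasury yield
        double spread_dv01;   // price change per 1bp move in the credit spread
    };

    struct PnlAttribution {
        double yield_pnl;     // PnL explained by the yield move
        double spread_pnl;    // PnL explained by the spread move
        double total() const { return yield_pnl + spread_pnl; }
    };

    // yield_move_bp / spread_move_bp are the day's observed moves in basis points.
    PnlAttribution attribute_pnl(const PositionGreeks& p,
                                 double yield_move_bp,
                                 double spread_move_bp) {
        PnlAttribution a;
        // Price falls when yield or spread rises, hence the minus signs.
        a.yield_pnl  = -p.yield_dv01  * yield_move_bp  * p.amount_000s / 100.0;
        a.spread_pnl = -p.spread_dv01 * spread_move_bp * p.amount_000s / 100.0;
        return a;
    }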

  2. How is VaR applied? Cont…
  In April 1995, the Basel Committee came forth with another set of proposals, which was nothing short of a regulatory innovation: the Internal Model approach. For the first time, banks would be allowed to use their own risk management models to determine their VaR, and with it their capital requirement. This capital requirement follows simply by multiplying the VaR by an add-on factor. This add-on factor, sometimes called the "hysteria factor", may vary between three and four, depending on the past accuracy of the bank's model. The hysteria factor is intended to provide additional protection against environments that are much less stable than historical data would lead one to believe.
  As a result of the above, investment banks set about justifying their risk models to regulators…

  3. Problems with VaR
  More recently, David Einhorn and Aaron Brown debated VaR in the Global Association of Risk Professionals Review. Einhorn compared VaR to an airbag that works all the time, except when you have a car accident. He further charged that VaR:
  • Led to excessive risk-taking and leverage at financial institutions
  • Focused on the manageable risks near the center of the distribution and ignored the tails
  • Created an incentive to take excessive but remote risks
  • Was potentially catastrophic when its use creates a false sense of security among senior executives and watchdogs

  4. VaR Implementation Notes
  • Load the historical data files - one per security. You can use SBB_io.h/.cc
  • Build a separate "PnL vector" for each bond
  • You will have to instantiate 1 io object per file, then 1 PnL vector per file
  • Assume all historical files will live in a sub-directory "./var"
  • For each SecID in our book(s), look up a file named "SecID.txt" in that directory - for example, SBB_001.txt
  • Once you have a vector for each unique bond:
  • Add the yield change to the current yield to calculate new_price
  • The base_price is the closing price in the end-of-day file
  • price_change = new_price - base_price (the daily price change)
  • Total_pnl_change = price_change * Amount;
  • Add up the individual bond vectors to get a total book vector
  • An ascending sort puts the biggest negative daily PnL change first
  • The largest negative value is the worst-case loss
  • Depending on our confidence interval this may or may not be our VaR
  • There will be 100 daily changes
  • Sign is important - the Amount to multiply by is either long or short (+ or -)
  • Refer to: var_example.xls (a rough code sketch of this flow also follows below)
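  A minimal sketch of the flow above, under stated assumptions: the reprice callback stands in for the course's own pricing calculator, PnL is kept in thousands following the Amount * price/100 convention from the units slide, and the function names are illustrative rather than the assignment's required interface.

    #include <algorithm>
    #include <cmath>
    #include <cstddef>
    #include <functional>
    #include <vector>

    // Build one bond's PnL vector from its historical daily yield changes.
    // 'reprice' stands in for the course's pricing calculator: yield -> price.
    std::vector<double> build_pnl_vector(const std::function<double(double)>& reprice,
                                         const std::vector<double>& yield_changes,
                                         double current_yield,
                                         double base_price,    // closing price from the end-of-day file
                                         double amount_000s)   // signed: + long, - short
    {
        std::vector<double> pnl;
        pnl.reserve(yield_changes.size());
        for (double dy : yield_changes) {
            double new_price = reprice(current_yield + dy);
            double price_change = new_price - base_price;
            // Price is per 100 face and amount is in thousands, so divide by 100
            // to keep PnL in thousands (mirrors Risk = Amount * DV01/100 later on).
            pnl.push_back(price_change * amount_000s / 100.0);
        }
        return pnl;
    }

    // Sum the per-bond vectors into a book vector, sort ascending (worst loss first),
    // and read off the scenario at the requested confidence level.
    double historical_var(const std::vector<std::vector<double>>& bond_vectors,
                          double confidence)
    {
        std::vector<double> book(bond_vectors.front().size(), 0.0);
        for (const auto& v : bond_vectors)
            for (std::size_t i = 0; i < book.size(); ++i)
                book[i] += v[i];

        std::sort(book.begin(), book.end());
        // With 100 scenarios and 99% confidence this picks near the worst loss;
        // the exact index convention is a course decision.
        std::size_t idx = static_cast<std::size_t>(
            std::floor((1.0 - confidence) * static_cast<double>(book.size())));
        return book[idx];
    }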

  5. Credit Risk Component
  • Our credit risk measure is LGD, "Loss Given Default"
  • For detailed coverage of LGD refer to the online paper:
  • Basel Committee on Banking Supervision, "An Explanatory Note on the Basel II IRB Risk Weight Functions", July 2005
  • Expected Loss = Probability of Default * Loss Given Default
  • Probability of Default - per rating grade; gives the average percentage of obligors that default in that rating grade in the course of one year
  • Loss Given Default - per rating grade; gives the percentage of exposure the bank might lose in case the borrower defaults
  • EL = PD * LGD
  • LGD = EL / PD
  • We will measure LGDs before and after ratings downgrades/upgrades! (see the small sketch after this slide)
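  One possible way to measure the portfolio LGD change caused by ratings moves, sketched under loud assumptions: the per-grade LGD values below are made-up placeholders, the struct and function names are hypothetical, and whether to scale by exposure is a course decision - this illustrates the idea, not the assignment specification.

    #include <map>
    #include <string>
    #include <vector>

    // Illustrative placeholder values only - the real per-grade LGD table
    // would come from the course materials, not from this sketch.
    const std::map<std::string, double> kLgdByGrade = {
        {"AA", 0.30}, {"A", 0.35}, {"BBB", 0.45}, {"BB", 0.55},
    };

    struct Position {
        std::string opening_rating;  // rating in opening_tradingbook.txt
        std::string closing_rating;  // rating in closing_tradingbook.txt
        double exposure_000s;        // exposure in thousands, per the units slide
    };

    // Portfolio LGD change due to ratings upgrades/downgrades: for each
    // position, take the difference of the per-grade LGD weights, scaled
    // here by exposure (one possible convention).
    double portfolio_lgd_change(const std::vector<Position>& book) {
        double total = 0.0;
        for (const Position& p : book) {
            double before = kLgdByGrade.at(p.opening_rating);
            double after  = kLgdByGrade.at(p.closing_rating);
            total += (after - before) * p.exposure_000s;
        }
        return total;
    }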

  6. Regulatory Capital
  • Capital is needed to cover the risks of peak losses, and therefore it has a loss-absorbing function (Blackjack analogy)
  • Good quote: "The worst case one could imagine would be that banks lose their entire credit portfolio in a given year. This event, though, is highly unlikely, and holding capital against it would be economically inefficient. Banks have an incentive to minimize the capital they hold, because reducing capital frees up economic resources that can be directed to profitable investments. On the other hand, the less capital a bank holds, the greater is the likelihood that it will not be able to meet its own debt obligations, i.e., that losses in a given year will not be covered by profit plus available capital, and that the bank will become insolvent. Thus banks and their supervisors must carefully balance the risks and rewards of holding capital."
  • How to manage it?
  • Pricing of credit exposures (add a "vig")
  • "Provisioning" (putting reserves from revenues aside)
  • Hedging with "protection" (insurance) - e.g., Credit Default Swaps

  7. Upcoming Deliverables
  • Clarifications/questions on requirements?
  • Third and final quiz on MMM
  • New posted files (Nov 19 weekend):
  • opening_tradingbook.txt & closing_tradingbook.txt
  • A mix of YIELD- and SPREAD-priced bonds
  • Historical data files for each bond in the trading book
  • Example of historical data files
  • Measures of trading activity during the day:
  • Positions
  • Ratings
  • One historical data file for each security in the closing_tradingbook.txt file

  8. Units Consistency
  • Market value can be in dollars ("invoice price") or in thousands (we will standardize on 000's)
  • Example:
  • 10,000,000 of a bond - "I own ten million bonds"
  • 9% coupon, 20 years to maturity
  • Current price: 134.6722 and yield: 6%
  • Market value is: $10,000,000 * 134.6722/100 = $13,467,220 (in dollars), 13467.220 in thousands as output
  • Bump yield up by 100 basis points (1%) and the price is now: 121.3551
  • Our data file has amounts in 000's, so $10,000,000 would be entered as 10000
  • Dollar Value of an "01" (DV01) - the price difference between the starting yield and a move of 1/100th of a percent (1 basis point) in yield, i.e., 1/100th of the move in the example above. We take it as the average of two shifts: the absolute values of the price differences resulting from an up move and a down move.
  • "Risk" = Amount * DV01/100, which stays in thousands since Amount is in thousands
  • double yield_delta_abs = fabs(bump_amount);
  • // yield goes up, price goes down
  • double down_price = PV(base_yield + yield_delta_abs);
  • double price_delta_down = base_price - down_price;
  • // yield goes down, price goes up
  • double up_price = PV(base_yield - yield_delta_abs);
  • double price_delta_up = up_price - base_price;
  • dv01 = (price_delta_up + price_delta_down) / 2.0;
  • A numeric check of this example follows below
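  A minimal numeric check of the example above. It assumes semi-annual compounding and uses a simple price_from_yield() helper standing in for the PV() routine in the fragment; it is a sketch for sanity-checking the numbers, not the assignment code.

    #include <cmath>
    #include <cstdio>

    // Price per 100 face of a level-coupon bond, semi-annual compounding.
    // Stands in for the PV() routine referenced on the slide.
    double price_from_yield(double coupon_rate, double years, double yield) {
        int n = static_cast<int>(years * 2);     // number of semi-annual periods
        double c = 100.0 * coupon_rate / 2.0;    // coupon per period
        double y = yield / 2.0;                  // periodic yield
        double price = 0.0;
        for (int t = 1; t <= n; ++t)
            price += c / std::pow(1.0 + y, t);
        price += 100.0 / std::pow(1.0 + y, n);
        return price;
    }

    int main() {
        double amount_000s = 10000.0;                             // $10,000,000 entered as 10000
        double base_price  = price_from_yield(0.09, 20.0, 0.06);  // ~134.6722
        double bumped      = price_from_yield(0.09, 20.0, 0.07);  // ~121.3551 after +100bp

        // Market value in thousands, as on the slide: Amount * price / 100.
        double mv_000s = amount_000s * base_price / 100.0;        // ~13467.220

        // Two-sided DV01 for a 1bp bump, mirroring the slide's code fragment.
        double bump = 0.0001;
        double down_price = price_from_yield(0.09, 20.0, 0.06 + bump);
        double up_price   = price_from_yield(0.09, 20.0, 0.06 - bump);
        double dv01 = ((up_price - base_price) + (base_price - down_price)) / 2.0;

        double risk_000s = amount_000s * dv01 / 100.0;            // "Risk" in thousands

        std::printf("base price %.4f, bumped price %.4f\n", base_price, bumped);
        std::printf("market value (000s) %.3f, dv01 %.6f, risk (000s) %.3f\n",
                    mv_000s, dv01, risk_000s);
        return 0;
    }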

  9. Code Submission Feedback
  • I shouldn't have to re-compile to run against another file in a different location - include the file path in run.sh
  • Pass the full file path as a parameter to the executable - still some hard-coding
  • Units should be in thousands throughout (I've accepted dollars or thousands thus far)
  • Provide a README file
  • Use meaningful, descriptive variable names - e.g., not "a", "b", "c"
  • Free memory
  • Identify class member data consistently, e.g., "_price"
  • Use the member function TCM() ("Ticker Coupon Maturity") for identifying a bond, not the ID - the ID will be used internally
  • Always use curly braces
  • Don't copy objects - pass by const reference or use pointers
  • Use the run.sh shell script
  • Use indents and comments in source code
  • Format output using tabs and newlines
  • Be mindful of short and long positions (negative and positive)
  • Debug IO kills performance - #ifdef it out before submitting, or gate it behind a global flag set from an environment variable (see the sketch below)
  • Interface classes should not contain state - add another level in the hierarchy
  • Refer to interface classes through pointers, not the derived types
  • Text titles cut/pasted into comments - archaic comments are evil
  • Only build what you have to - don't "program to impress your friends" or add "gratuitous functionality"
  • Don't use makefiles for anything other than building code - use shell scripts instead
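  One possible way to handle the debug-IO point above, sketched with illustrative names (SBB_DEBUG is an assumed macro/environment-variable name, not one prescribed by the course):

    #include <cstdlib>
    #include <iostream>

    // Option 1: compile debug output away entirely.
    #ifdef SBB_DEBUG
    #define DEBUG_LOG(msg) do { std::cerr << msg << std::endl; } while (0)
    #else
    #define DEBUG_LOG(msg) do { } while (0)
    #endif

    // Option 2: keep the code compiled in, but gate it on an environment
    // variable checked once at startup (e.g. SBB_DEBUG=1 ./run.sh ...).
    static const bool g_debug = (std::getenv("SBB_DEBUG") != nullptr);

    int main() {
        DEBUG_LOG("compiled-in debug message");
        if (g_debug) {
            std::cerr << "runtime-enabled debug message" << std::endl;
        }
        return 0;
    }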

  10. Server-side Submission 2 (Nov 29)
  • VaR of the closing portfolio given a confidence interval of 99% (1 day)
  • Total portfolio LGD change due to ratings downgrades or upgrades
  • Total portfolio Amount change from start to end of day (not "Market Value")
  • Python client (write to stdout)
  • All done as one call, with timing info returned as before:
  • VaR RatingsChangeAmount PositionChangeAmount, followed by the timing values
  • 1.345 10000.123 1000.123 2.0 1.0 1.0
  • (a formatting sketch follows below)
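  A minimal sketch of formatting that single response line, assuming the three measures are followed by three timing values as in the example row; the function name and field layout are assumptions for illustration only.

    #include <cstdio>
    #include <string>

    // Format the one-call response: the three requested measures followed by
    // timing values, space-separated on a single line (per the example row above).
    std::string format_response(double var, double ratings_change_amount,
                                double position_change_amount,
                                double t1, double t2, double t3) {
        char buf[128];
        std::snprintf(buf, sizeof(buf), "%.3f %.3f %.3f %.1f %.1f %.1f",
                      var, ratings_change_amount, position_change_amount,
                      t1, t2, t3);
        return std::string(buf);
    }

    // Example: format_response(1.345, 10000.123, 1000.123, 2.0, 1.0, 1.0)
    // yields "1.345 10000.123 1000.123 2.0 1.0 1.0".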
