
Credible Risk Classification

CAS Ratemaking Journal 2004, written by Ben Turner of Farmers Insurance. The presentation contrasts skimming the cream with adverse selection: as a book of business is sliced into finer segments, indications become erratic, unreliable, and disruptive to policyholders, so actuaries opt to credibility-weight.


Presentation Transcript


  1. Credible Risk Classification. CAS Ratemaking Journal 2004, written by Ben Turner of Farmers Insurance

  2. Skim the Cream vs. Adverse Selection

  3. Segmentation As the book is sliced, indications become erratic, unreliable, and disruptive to policyholders. Hence actuaries opt to credibility-weight.

  4. Roadmap

  5. A Simple Example

  6. A Simple Example—Alternate Class Plan

  7. Potential Groupings of Four Levels

  8. Segmentation vs. Credibility

  9. Segmentation vs. Credibility

  10. Simulation

  11. Simulation Results

  12. Simulation Results

  13. Roadmap

  14. A Simple Example

  15. “Losses Squared” • For EACH POLICY the losses are squared and then divided by that policy's exposures. • The results can then be summed, and the underlying detail does not need to be maintained. • This allows the variance to be computed without keeping policy-level detail.
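
A minimal Python sketch of this aggregation follows; the function name, variable names, and example numbers are illustrative and not taken from the paper.

    def losses_squared_sum(losses, exposures):
        """Sum of (policy losses)^2 / (policy exposures) over all policies."""
        return sum(l * l / e for l, e in zip(losses, exposures))

    # Keeping only this sum, the total losses, and the total exposures is
    # enough to recover the exposure-weighted squared deviations within a
    # class, since with X_j = loss_j / m_j:
    #   sum_j m_j * (X_j - Xbar)^2
    #       = losses_squared_sum - (total losses)^2 / (total exposures)
    # so policy-level detail can be discarded after aggregation.

    losses = [1200.0, 0.0, 4500.0]    # illustrative policy losses
    exposures = [1.0, 2.0, 1.5]       # illustrative policy exposures
    print(losses_squared_sum(losses, exposures))   # -> 14940000.0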

  16. A Simple Example

  17. Calculation of Credibility Bühlmann Empirical Bayes • It assumes no underlying distribution. • It is relatively uncontroversial. • It supplies its own complement of credibility. • It does not require arbitrary selection of parameters. See Loss Models, Klugman et al.

  18. Calculation of Credibility Required Calculations • V = Process Variance • A = Variance of Hypothetical Means • K = V/A • Credibility = Exposures / (Exposures + K)

  19. Calculation of Credibility Calculation of V, the Process Variance

  20. Calculation of Credibility Calculation of A, the Variance of the Hypothetical Means

  21. Calculation of Credibility Calculation of K and Credibility • K = V/A • Credibility = Exposures / (Exposures + K)
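
The calculations outlined in slides 18 through 21, together with the credibility-weighted class mean of slide 23, can be sketched in Python as below, assuming the nonparametric Bühlmann-Straub empirical Bayes estimators from Loss Models; the data layout and names are illustrative rather than taken from the paper.

    def buhlmann_straub(classes):
        """classes: list of (losses, exposures) pairs, one pair of per-policy
        lists for each class.  Returns V, A, K, the credibility of each
        class, and the credibility-weighted class means."""
        r = len(classes)
        m_i, xbar_i, n_i, ss_i = [], [], [], []
        for losses, exposures in classes:
            m = sum(exposures)                       # class exposures
            total = sum(losses)
            m_i.append(m)
            xbar_i.append(total / m)                 # class pure premium
            n_i.append(len(losses))
            # within-class weighted squared deviations, via the
            # "losses squared" shortcut from slide 15
            ss_i.append(sum(l * l / e for l, e in zip(losses, exposures))
                        - total * total / m)

        m_tot = sum(m_i)
        xbar = sum(m * x for m, x in zip(m_i, xbar_i)) / m_tot   # overall mean

        V = sum(ss_i) / sum(n - 1 for n in n_i)      # process variance
        A = (sum(m * (x - xbar) ** 2 for m, x in zip(m_i, xbar_i))
             - V * (r - 1)) / (m_tot - sum(m * m for m in m_i) / m_tot)
        K = V / A
        Z = [m / (m + K) for m in m_i]               # credibility by class
        cred_means = [z * x + (1 - z) * xbar for z, x in zip(Z, xbar_i)]
        return V, A, K, Z, cred_means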

  22. Calculation of Credibility

  23. Credibility-Weighted Class Mean

  24. Calculation of Score

  25. Score: Calculation of Numerator

  26. Score: Calculation of Denominator Denominator = 66,977,631,413 - 5,430,959,280 = 61,546,672,133

  27. Calculation of Score

  28. Segmentation vs. Credibility

  29. A Simple Example—Alternate Class Plan

  30. Segmentation vs. Credibility

  31. Roadmap

  32. Score’s Factors An increase in any of the following will raise Score, ceteris paribus: • The difference between the class means • The credibility of each class • The number of classes

  33. Calculation of Score Factors: 1) Difference between means, 2) Credibility, 3) Number of Classes

  34. Segmentation vs. Credibility

  35. Score’s Theory Score is theoretically correct because it: • Will tend to occur inadvertently via the free markets • Is designed explicitly for this actuarial issue • Uses the correct standard of proof

  36. Roadmap

  37. Complex Hypothetical Example A company introduced a specialty line and tracked: • Location • Radius of operation • Whether the business is owner-operated It now seeks to create a class plan and is willing to have the plan be nonlinear.

  38. A Sample from the Database

  39. Summarized Data

  40. 2,048 Potential Class Plans

  41. Selected Class Plan

  42. Underwriting Guidelines

  43. Roadmap

  44. Complex Example—Linear Class Plan

  45. Complex Example—Linear Class Plan—Alternate A

  46. Complex Example—Linear Class Plan—Alternate B

  47. Roadmap

  48. Conclusion We’ve seen: • Score is a theoretically correct method • Score can be done in a spreadsheet • Score can be iterated over all possible plans via a computer program • Score can be used on just the class plans that are of interest • Score can help you design superior class plans
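
As a rough illustration of iterating Score over every candidate plan, the sketch below enumerates one grouping choice per rating variable and keeps the highest-scoring combination; score_of_plan is a hypothetical placeholder for the paper's Score calculation, and grouping_options is an illustrative data structure.

    from itertools import product

    def best_plan(grouping_options, score_of_plan):
        """grouping_options: dict mapping each rating variable to a list of
        candidate level-groupings.  Enumerates every combination (one
        grouping per variable), scores it, and returns the best."""
        variables = list(grouping_options)
        best_score, best = None, None
        for combo in product(*(grouping_options[v] for v in variables)):
            plan = dict(zip(variables, combo))    # one grouping per variable
            s = score_of_plan(plan)
            if best_score is None or s > best_score:
                best_score, best = s, plan
        return best_score, best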
