
MEA 593 Climate Risk Analysis for Adaptation Instructor – Fredrick Semazzi Lecture-5



  1. MEA 593 Climate Risk Analysis for Adaptation Instructor – Fredrick Semazzi Lecture-5 Evaluation of Climate Models Prediction Skill for Applications

  2. Outline Objective and Motivation Principles of Ensemble Forecasting Traditional Relative Operating Characteristic (ROC) and Economic Value (EV) analysis Additional Notes Extended Relative Operating Characteristic (EROC)

  3. Motivation Climate prediction is becoming increasingly important for different sectors of the economy worldwide. A case of a false alarm (the forecast event did not occur). Courtesy: El Universo

  4. The Application • Meningitis is a serious infectious disease affecting 21 countries • 300 million people at risk • 700,000 cases in the past 10 years • 10-50 % case fatality rates

  5. Ensemble Dynamical Downscaling • We use the Weather Research and Forecasting (WRF) Model for the experimental prediction of meningitis • The WRF model correctly predicts the 40% RH threshold and the sharp decline in meningitis occurrences (figure: Kano)

  6. Criteria for Issuing Forecast The decision to issue a forecast of event (E) is made probabilistically using: (N): size of the ensemble; (n): number of runs in the ensemble for which (E) actually occurs; (p): forecast probability given by the ratio n/N; (p_t): the threshold fraction of p above which the event (E) is predicted to occur based on the model forecast
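As a sketch, this decision rule is a few lines of Python (the function name and example numbers are illustrative, not from the lecture):

```python
def issue_forecast(member_predicts_E, p_threshold):
    """Issue a forecast of event E when the ensemble fraction
    p = n/N meets or exceeds the chosen threshold p_t."""
    N = len(member_predicts_E)   # ensemble size
    n = sum(member_predicts_E)   # members in which E occurs
    p = n / N                    # forecast probability
    return p >= p_threshold

# 6 of 10 members predict E, so p = 0.6
print(issue_forecast([True] * 6 + [False] * 4, 0.5))   # True
print(issue_forecast([True] * 6 + [False] * 4, 0.7))   # False
```

Scanning p_t over the N possible values 1/N, 2/N, …, 1 is what generates the family of forecasts evaluated later in the ROC analysis.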

  7. Intuitive Interpretation of Illustration • Ensemble forecasts diverge with increasing lead time • The fact that a few forecasts of a prediction system agree with the observed weather/climate conditions does not make it a skillful prediction system • The more ensemble members agree with the observed conditions, the more skillful the prediction system (ROC method – quality of the forecast system) • A given level of agreement with the observed conditions may have very different value for different end-users: a few wrong forecasts could be acceptable to one user but totally unacceptable to another, depending on the cost of a wrong forecast

  8. Decision Criteria The user of an ensemble of (N) forecasts has (N) options for a decision criterion with respect to his/her climate-related action: (i) he or she can choose to take action only if all (N) forecasts predict the adverse climate; (ii) he or she can choose to take action if at least (N-1), (N-2), …, forecasts predict the adverse climate; OR (iii) he or she can choose to take action even if only one member predicts the adverse climate

  9. Methods for Evaluation of Skill of Ensemble Climate Prediction Systems • Root Mean Square Error (RMSE) • Brier Skill Score (BSS) • Heidke Skill Score (HSS) • Kuipers Skill Score (KSS) • Relative Operating Characteristics (ROC) – adopted by WMO

  10. Forecast-Model Contingency Matrix • A decision maker becomes a user of weather forecasts if he/she alters his/her actions based on forecast information • A cost-loss analysis can be assessed based on a 2x2 matrix of EPS forecast vs. observations, from which we evaluate the skill of a probabilistic forecast. With δ = hits (forecast yes, observed yes), γ = misses (forecast no, observed yes), α = false alarms (forecast yes, observed no), and β = correct rejections (forecast no, observed no): Hit Rate: H = δ/(γ + δ); False Alarm Rate: F = α/(α + β)
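A minimal sketch of the two rates in Python, using hypothetical counts (the symbol-to-count assignment follows the standard 2x2 contingency layout):

```python
def rates(hits, misses, false_alarms, correct_rejections):
    """Hit rate H and false alarm rate F from 2x2 contingency counts."""
    H = hits / (hits + misses)                              # delta / (gamma + delta)
    F = false_alarms / (false_alarms + correct_rejections)  # alpha / (alpha + beta)
    return H, F

H, F = rates(hits=30, misses=10, false_alarms=20, correct_rejections=40)
print(H)   # 0.75
```

Note that H is conditioned on the event occurring and F on it not occurring, so the two rates vary independently as the threshold p_t changes.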

  11. Quality of Ensemble Climate Prediction Systems

  12. Relative Operating Characteristics (ROC) ROC MATRIX For each threshold p_t there is a corresponding pair of H(p_t) & F(p_t) which may be plotted on a ROC plot. Note: with no relative skill, i.e. climatology, H = F: the chances for a “Hit” are as good as the chances for a “False Alarm”. The total area under the curve is usually used to provide a measure of the skill of the model in predicting event “E”. For a skillful forecast the hit rate is > the false alarm rate. [Figure: ROC plot of Hit rate (0 to +1) vs. False alarm rate (0 to +1), with the forecast points curving above the diagonal]

  13. Relative Operating Characteristics (ROC) • The relative operating characteristic area (ROC area) is one of the summary measures of ensemble forecast performance • The ROC area (ROCA) is defined by the points (0,0), (1,1), and the points representing the forecast system

  14. Relative Operating Characteristics (ROC) • The closer a curve is to the upper-left-hand corner, the more skillful is the forecast system • A perfect forecast system would have a ROCA of 1 • A system with no capability of distinguishing in advance between different climate events has a score of 0.5, i.e. lying on the diagonal defined by (0,0) and (1,1)
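The ROCA defined by these points can be estimated with the trapezoidal rule; a small sketch (the point values are made up for illustration):

```python
def roc_area(points):
    """Area under the ROC curve through the given (F, H) points,
    closed with the corners (0, 0) and (1, 1), via the trapezoidal rule."""
    pts = sorted(set(points) | {(0.0, 0.0), (1.0, 1.0)})
    return sum((f1 - f0) * (h0 + h1) / 2.0
               for (f0, h0), (f1, h1) in zip(pts, pts[1:]))

print(roc_area([(0.5, 0.5)]))              # 0.5  (no skill: point on the diagonal)
print(roc_area([(0.2, 0.6), (0.4, 0.8)]))  # 0.74 (skillful: points above the diagonal)
```

A perfect system, whose curve passes through (0, 1), gives an area of 1.0.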

  15. Value of Ensemble Climate Prediction Systems

  16. Utility of Climate Predictions • The ultimate utility of climate forecasts is the economic and other benefits associated with their actual use in the daily decision-making processes of individuals and different organizations • Simplistically, users of climate forecasts either DO or DO NOT take action

  17. Development of Methodology • Based on the trade-off between the cost of taking action by the user and the loss incurred when no action is taken and the event occurs • Let $C = cost of protection (see examples) • Let $L = loss for not protecting when E occurs • User expense contingency matrix (rows: take action No/Yes; columns: E occurs No/Yes):

                  E occurs
                  No    Yes
  Take     No      0     L
  Action   Yes     C     C

OBJECTIVE: the user is to take the action that will minimize expense over a large number of cases

  18. Observations We can now represent the expected user expense using the contingency-matrix fractions (α false alarms, δ hits, γ misses) with the following equation: M = (α + δ)C + γL

  19. User Expense We can re-write M in a more convenient form by using the definitions of F and H. Let s denote the observed climatological frequency of E over many years; for this to hold, the user must be prepared to rely on the forecasts over many cases, and this is the period over which the benefits are realized. From the definitions: s = γ + δ, H = δ/s, F = α/(1 − s). We can write: δ = sH, γ = s(1 − H), α = (1 − s)F. Substituting for α, δ, and γ in M = (α + δ)C + γL we get M = C(1 − s)F + CsH + Ls(1 − H). Factorizing to simplify yields M = CF(1 − s) + s[CH + L(1 − H)]
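The user-expense bookkeeping on this slide can be cross-checked numerically. In the standard cost-loss setup (Richardson 2000), with s the climatological frequency, the expense from the contingency fractions, M = (α + δ)C + γL, equals the factorized form CF(1 − s) + s[CH + L(1 − H)]; a quick sketch (all numbers illustrative):

```python
def expense_from_counts(alpha, delta, gamma, C, L):
    """M from contingency fractions: protect on every 'yes' forecast
    (cost C on false alarms and hits), lose L on misses."""
    return (alpha + delta) * C + gamma * L

def expense_from_rates(H, F, s, C, L):
    """Same M written in terms of H, F and climatological frequency s."""
    return C * F * (1 - s) + s * (C * H + L * (1 - H))

s, H, F, C, L = 0.3, 0.8, 0.2, 1.0, 5.0
alpha, delta, gamma = (1 - s) * F, s * H, s * (1 - H)
print(expense_from_counts(alpha, delta, gamma, C, L))
print(expense_from_rates(H, F, s, C, L))   # same value both ways
```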

  20. Special Case: Perfect Prediction Model A perfect forecast has H = 1 & F = 0, so M_per = sC: the user pays the protection cost only when E actually occurs

  21. Special Case Climatology: Poor Man’s Prediction Model Only climatology is known (this has important economic benefits). Option 1: always take action (H = F = 1), giving M = C. Option 2: never take action (H = 0, F = 0), giving M = sL

  22. Option 1: Always Take Action [H = F = 1] Contingency fractions: forecast No row: 0, 0; forecast Yes row: (1 − s), s. The expense is M = C. The system always forecasts E to occur, regardless of whether E actually occurs

  23. Option 2: Never Take Action (H = 0, F = 0) Contingency fractions: forecast No row: (1 − s), s; forecast Yes row: 0, 0. The expense is M = sL. If L is very large or C is very small, the cheaper option is to always take action (expense C); if L is very small, the cheaper option is to never take action (expense sL)

  24. NOTE The climatological (baseline) expense is the cheaper of the two options: M_clim = min(C, sL)

  25. Formula for Computing Value Define V as the reduction of M over M_clim, normalized by the maximum possible reduction: V = (M_clim − M) / (M_clim − M_per), so that 0 ≤ V ≤ 1. V = 0 for climatology; V = 1 for the perfect prediction model
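A sketch of the value computation under the standard cost-loss assumptions (Richardson 2000); the helper name is mine, and expenses are measured in units of the loss L with μ = C/L:

```python
def economic_value(H, F, s, mu):
    """V = (M_clim - M) / (M_clim - M_per), expenses in units of the loss L."""
    M = mu * F * (1 - s) + s * (mu * H + (1 - H))  # forecast-based expense
    M_clim = min(mu, s)    # best of always-act (mu) vs never-act (s)
    M_per = s * mu         # perfect forecast: protect only when E occurs
    return (M_clim - M) / (M_clim - M_per)

print(economic_value(H=1.0, F=0.0, s=0.3, mu=0.2))  # 1.0 (perfect forecast)
print(economic_value(H=1.0, F=1.0, s=0.3, mu=0.2))  # 0.0 (always act = climatology here)
```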

  26. Applying the Following Relations Already Derived or Defined Substituting M = CF(1 − s) + s[CH + L(1 − H)], M_clim = min(C, sL), and M_per = sC into the definition of V, and writing μ = C/L, gives: V = [min(μ, s) − Fμ(1 − s) + Hs(1 − μ) − s] / [min(μ, s) − μs] (16) Ideally we want V as close to 1 as possible

  27. Assessment of Economic Benefits of Climate Prediction Workflow (see example figures): • USER SECTOR: define (E); identify C & L • MODEL HINDCAST: forecast (E); specify (p_t) • MET. OBS: observe (E) • For each region (Region 1, Region 2, …, Region 9): compute the occurrence contingency table (Forecast No/Yes vs. Observed No/Yes) and the ROC of H vs. F (from climatology to perfect) • DECISION IF $ IS IMPORTANT TO SECTOR

  28. EXAMPLE BASED ON ECMWF

  29. Required Input from User Sectors: • Identification of the important climatic event E, e.g. the temperature below which a crop fails • Identification of the loss L ($) which occurs when no action is taken to prevent it • Identification of the cost C ($) required to prevent the loss • User Benefits: Level I: criterion to either always take action or never take action when only climatology is known. Level II: estimate of the $ saved over using only climatology information • Model Development Benefits: to assess/identify model improvements which result in added value; model intercomparison

  30. Summary • Flexible definition of events E (not limited to a single type) • Diagrams of V yield estimates of economic benefit • The methodology provides a means of determining when a model has achieved a usable range; a skillful ROC area alone is not adequate if C/L is out of the useful range • The ensemble approach provides an objective way of combining probabilistic forecasts from different models • The method is ideal for investigating added value due to downscaling based on a regional climate model • Application of the knowledge of climatology alone has economic benefits • The user needs to be prepared to rely on the forecasts over many cases • The optimal C/L is approximately given by s, so the qualitative potential benefit of climate prediction can be assessed by comparing C/L and s even before using model output; a specific sector could explore adjusting C/L to increase V • [Figure: Above Normal / Normal / Below Normal categories, based on V and on C/L]

  31. Conclusion The evaluation may be customized for a specific end-user application and the results are in a form that is understandable by the user & the producer of the forecasts

  32. ADDITIONAL NOTES if You Intend to use this method in your Course Project

  33. More details on the Extended ROC (EROC) Method The traditional ROC method is widely used and indeed highly recommended by WMO. Therefore, it is more desirable to express Relative Economic Value in terms of ROC metrics if at all possible

  34. Extended ROC (EROC) Method • There is a one-to-one relationship between EROC and EV • The baseline H = F, widely used in ROC plots, is a special case corresponding to the optimal user; for other users H is not equal to F

  35. The EROC Method EROC merges the information for a range of hypothetical users within the ROC plot itself. Using (16), we recognize that V will change for each μ and that a non-optimal user's baseline shifts away from H = F (climatology). It is desirable for a user to set a threshold Vmin, the minimum added value that makes it worth investing in an EPS. Thus we require V ≥ Vmin and rewrite (16) as: [min(μ, s) − Fμ(1 − s) + Hs(1 − μ) − s] / [min(μ, s) − μs] ≥ Vmin (17)

  36. In order to construct baselines where H is a function of F, we multiply (17) through by its denominator, move all terms except the H term to the right-hand side, and divide both sides by s(1 − μ), obtaining: H ≥ [μ(1 − s) / (s(1 − μ))] F + [Vmin(min(μ, s) − μs) − min(μ, s) + s] / (s(1 − μ)) (18) We thus have mathematical forms where H is a linear function of F, and are able to apply an interpretation similar to that of ROC

  37. If μ ≤ s, so that min(μ, s) = μ, (18) with further simplification yields: H ≥ [μ(1 − s) / (s(1 − μ))] F + [Vmin μ(1 − s) + s − μ] / (s(1 − μ)) (19)

  38. If μ > s, so that min(μ, s) = s, the constant term in (18) reduces to Vmin(s − μs) / (s(1 − μ)) = Vmin, and the function becomes: H ≥ [μ(1 − s) / (s(1 − μ))] F + Vmin (20)

  39. Note that both (19) and (20) have the slope term [μ(1 − s) / (s(1 − μ))] F (i.e. a function of F) in common, plus a constant term (not a function of F). The two equations are reproduced below for convenience: H ≥ [μ(1 − s)/(s(1 − μ))] F + [Vmin μ(1 − s) + s − μ]/(s(1 − μ)) (19); H ≥ [μ(1 − s)/(s(1 − μ))] F + Vmin (20). Thus, we can combine (19) and (20) into one equation that defines H as an inequality function of F: H ≥ [μ(1 − s)/(s(1 − μ))] F + k (21)

  40. For μ ≤ s the constant term is k = [Vmin μ(1 − s) + s − μ] / (s(1 − μ)) (22), while for μ > s it is k = Vmin (23); the traditional baseline takes Vmin = 0. We note that H = F is a special case of (21) when μ = s and Vmin = 0, for which the slope equals 1 and k = 0. Therefore the traditional ROC condition H = F, used as the baseline metric for measuring the skill of an EPS, is a very special case of a user for whom μ = C/L = s
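As a numerical consistency check (my own sketch, not from the slides): along the Vmin = 0 baseline of (21), the economic value of (16) should come out exactly zero for user μ, since the baseline was derived by setting V = Vmin.

```python
def baseline_H(F, s, mu):
    """H on the zero-value baseline for user mu (Vmin = 0)."""
    slope = mu * (1 - s) / (s * (1 - mu))
    k = max(0.0, (s - mu) / (s * (1 - mu)))   # constant term: mu <= s case, else 0
    return slope * F + k

def economic_value(H, F, s, mu):
    """V from (16), expenses in units of the loss L."""
    M = mu * F * (1 - s) + s * (mu * H + (1 - H))
    return (min(mu, s) - M) / (min(mu, s) - s * mu)

s = 0.3
worst = max(abs(economic_value(baseline_H(F, s, mu), F, s, mu))
            for mu in (0.1, 0.3, 0.5) for F in (0.0, 0.2, 0.4))
print(worst < 1e-9)   # True: V vanishes along every baseline
```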

  41. Bounds for μ Our μ range is bounded by the two inequalities obtained by setting V = 0 in (16), so that: s(1 − H) / [s(1 − H) + (1 − s)(1 − F)] < μ < sH / [sH + (1 − s)F] (24) Outside this range the EPS has no value for the user μ
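The bounds can also be found numerically by scanning μ for positive value, without assuming any closed form; a sketch (the endpoint expressions sH/(sH + (1 − s)F) and s(1 − H)/(s(1 − H) + (1 − s)(1 − F)), standard in the cost-loss literature, serve only as a cross-check):

```python
def economic_value(H, F, s, mu):
    """V from (16), expenses in units of the loss L."""
    M = mu * F * (1 - s) + s * (mu * H + (1 - H))
    return (min(mu, s) - M) / (min(mu, s) - s * mu)

def mu_value_range(H, F, s, n=100_000):
    """Scan mu = C/L over (0, 1) and return the sub-range where V > 0."""
    valued = [i / n for i in range(1, n) if economic_value(H, F, s, i / n) > 0]
    return (valued[0], valued[-1]) if valued else None

lo, hi = mu_value_range(H=0.8, F=0.2, s=0.3)
print(lo, hi)   # roughly 0.0968 to 0.6316 for this (H, F, s)
```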

  42. EROC & EV Relationship We now know that ROC is a special case of EROC for an optimal user. We found that when μ = s and Vmin = 0, (21) reduces to H = F, the diagonal in ROC plots. We generalize this by the following baseline equation (taking Vmin = 0): Hμ = [μ(1 − s)/(s(1 − μ))] F + max(0, (s − μ)/(s(1 − μ))) (25) so that all baselines can be derived from (25) as a function of a particular μ. In the case of μ = s, (25) reduces to Hμ = F, or simply climatology

  43. Let us now calculate a baseline given a specific μ, where Hμ is the baseline value corresponding to a given F. To obtain the difference between the ROC point and the new baseline we set: ΔH = H − Hμ (26) where Hμ is the baseline. Combining (25) with (26) gives: ΔH = H − [μ(1 − s)/(s(1 − μ))] F − max(0, (s − μ)/(s(1 − μ))) (27) Again, for the ideal user (μ = s): ΔH = H − F (28)

  44. Combining the equation for V in (16) with ΔH in (27) yields: V = [s(1 − μ)/(μ(1 − s))] ΔH for μ ≤ s; V = ΔH for μ > s (29)

  45. The Semazzi-Mera Skill Score We have shown that there is a one-to-one relationship between EROC and EV. We also showed that the baseline H = F, common to the widely-used ROC plots, is a special case of (21), and thus a special case of EROC. Another important use of ROC in EPS skill measurement is the Area Skill Score (ASS) or ROC Skill Score (RSS), as shown in Richardson (2000a) and references therein: RSS = (A − Aclim) / (Aper − Aclim) (30)

  46. The area A under the ROC is used as an index of the accuracy of the forecast system (Mason 1982; Buizza et al. 1998, 1999). A perfect system would have A = 1.0, while no-skill systems (H = F) would have A = 0.5. Substituting Aper = 1 and Aclim = 0.5 in (30), we have RSS = (A − 0.5)/(1 − 0.5) (31), which reduces to RSS = 2A − 1. The skill score as determined by Richardson (2000a), Stanski, and Wilks (2006) corresponds to the optimal user, for whom Mclim in the equation for V is given by the H = F baseline. Adopting a more general baseline, the Semazzi-Mera Skill Score (SMSS) is given by: SMSS = (A − Aμ) / (Aper − Aμ) (32)
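The substitution in (31) is a one-liner; a sketch (with the default climatology and perfect-system areas, this is exactly 2A − 1):

```python
def roc_skill_score(A, A_clim=0.5, A_per=1.0):
    """RSS = (A - A_clim) / (A_per - A_clim); with the defaults this is 2A - 1."""
    return (A - A_clim) / (A_per - A_clim)

print(roc_skill_score(1.0))                # 1.0  (perfect system)
print(roc_skill_score(0.5))                # 0.0  (no skill, H = F diagonal)
print(round(roc_skill_score(0.74), 10))    # 0.48, i.e. 2*0.74 - 1
```

Passing a different baseline area for A_clim gives the generalized form used by the SMSS in (32).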

  47. In SMSS, A is the area under the ROC curve, Aper = 1, and Aμ is the appropriate area associated with the baseline for user μ, playing the role that Aclim plays in (30); Aμ is determined by μ and represents the “climatological” baseline for a hypothetical user. Substituting Aper = 1: SMSS = (A − Aμ) / (1 − Aμ) (33)
