
Model Output Statistics (MOS) - Objective Interpretation of NWP Model Output

Model Output Statistics (MOS) - Objective Interpretation of NWP Model Output. University of Maryland - March 10, 2005. Mark S. Antolik Meteorological Development Laboratory Statistical Modeling Branch NOAA/National Weather Service Silver Spring, MD. (301) 713-0023 ext. 110


Presentation Transcript


  1. Model Output Statistics (MOS) - Objective Interpretation of NWP Model Output University of Maryland - March 10, 2005 Mark S. Antolik Meteorological Development Laboratory Statistical Modeling Branch NOAA/National Weather Service Silver Spring, MD (301) 713-0023 ext. 110 email: mark.antolik@noaa.gov

  2. OUTLINE 1. Why objective statistical guidance? 2. What is MOS? Definition and characteristics The “traditional” MOS product suite (NGM, eta, GFS) Recent additions to the lineup 3. Simple regression examples / REEP 4. Development strategy - MOS in the “real world” 5. Verification 6. Where we’re going - The future of MOS

  3. WHY STATISTICAL GUIDANCE?
• Add value to direct NWP model output
• Objectively interpret model
  - remove systematic biases
  - quantify uncertainty
• Predict what the model does not
• Produce site-specific forecasts (i.e. a “downscaling” technique)
• Assist forecasters
  - “First Guess” for expected local conditions
  - “Built-in” model/climo memory for new staff

  4. A SIMPLE STATISTICAL MODEL: Relative Frequency of Precipitation as a Function of 12-24 Hour NGM Model-Forecast Mean RH. [Figure: observed relative frequency (0.0 - 1.0) vs. NGM mean relative humidity (%); 3-yr sample, 200 stations, 1987-1990 cool season; a point at 47% observed relative frequency is highlighted.]

  5. MOS Max Temp vs. Direct Model Output

  6. What is MOS?

  7. MODEL OUTPUT STATISTICS (MOS) Relates observed weather elements (PREDICTANDS) to appropriate variables (PREDICTORS) via a statistical approach. Predictors are obtained from: 1. Numerical Weather Prediction (NWP) Model Forecasts 2. Prior Surface Weather Observations 3. Geoclimatic Information Current Statistical Method: MULTIPLE LINEAR REGRESSION (Forward Selection)
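
The operational system builds each equation by forward selection from a pool of candidate predictors. A minimal Python sketch of that idea follows; it is not the MDL code, and the predictor pool, stopping threshold, and variable names are illustrative assumptions.

```python
import numpy as np

def forward_selection(X, y, max_terms=10, min_rv_gain=0.005):
    """Greedy forward selection for multiple linear regression.

    X : (n_cases, n_predictors) array of candidate predictors
    y : (n_cases,) predictand
    At each step, add the predictor that most increases the reduction
    of variance (RV); stop when the gain becomes negligible.
    """
    n, p = X.shape
    selected = []
    base_var = np.var(y)
    best_rv, coeffs = 0.0, None
    for _ in range(max_terms):
        best_candidate, best_candidate_rv, best_fit = None, best_rv, None
        for j in range(p):
            if j in selected:
                continue
            cols = selected + [j]
            A = np.column_stack([np.ones(n), X[:, cols]])
            beta, *_ = np.linalg.lstsq(A, y, rcond=None)
            resid = y - A @ beta
            rv = 1.0 - np.var(resid) / base_var
            if rv > best_candidate_rv:
                best_candidate, best_candidate_rv, best_fit = j, rv, beta
        if best_candidate is None or best_candidate_rv - best_rv < min_rv_gain:
            break  # no remaining predictor adds enough explained variance
        selected.append(best_candidate)
        best_rv, coeffs = best_candidate_rv, best_fit
    return selected, coeffs, best_rv
```

The RV gain plays the role of the screening criterion here; the operational development also forces or stratifies predictors by season, cycle, and region, which this sketch omits.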

  8. MODEL OUTPUT STATISTICS (MOS) Properties
• Mathematically simple, yet powerful
• Need historical record of observations at forecast points (hopefully a long, stable one!)
• Equations are applied to future run of similar forecast model

  9. MODEL OUTPUT STATISTICS (MOS) Properties (cont.)
• Non-linearity can be modeled by using NWP variables and transformations
• Probability forecasts possible from a single run of NWP model
• Other statistical methods can be used, e.g. polynomial or logistic regression; neural networks

  10. MODEL OUTPUT STATISTICS (MOS)
ADVANTAGES
• Recognition of model predictability
• Removal of some systematic model bias
• Optimal predictor selection
• Reliable probabilities
• Specific element and site forecasts
DISADVANTAGES
• Short samples
• Changing NWP models
• Availability & quality of obs

  11. TODAY’S CHALLENGE TO MOS DEVELOPMENT: RAPIDLY EVOLVING NWP MODELS AND OBSERVATION PLATFORMS make for:
1. SHORT, UNREPRESENTATIVE DATA SAMPLES
2. DIFFICULT COLLECTION OF APPROPRIATE PREDICTAND DATA
[Diagram: MOS, OBS (Mesonets), MODELS (NCEP).]
New observing systems: (ASOS, WSR-88D) (Co-Op, Mesonets). “Old” predictands: The elements don’t change!

  12. “Traditional” MOS text products

  13. GFS MOS GUIDANCE MESSAGE FOUS21-26 (MAV)
KLNS   GFS MOS GUIDANCE    11/29/2004  1200 UTC
DT /NOV 29/NOV 30 /DEC 1 /DEC 2
HR 18 21 00 03 06 09 12 15 18 21 00 03 06 09 12 15 18 21 00 06 12
N/X 28 48 35 49 33
TMP 43 44 39 36 33 32 31 39 46 45 41 38 37 39 41 44 45 44 40 40 35
DPT 27 27 28 29 29 29 29 33 35 35 36 35 36 39 41 42 37 34 30 30 28
CLD CL BK BK BK OV OV OV OV OV OV OV OV OV OV OV OV OV BK CL CL CL
WDR 34 36 00 00 00 00 00 00 00 14 12 12 10 11 12 19 28 29 29 29 28
WSP 06 02 00 00 00 00 00 00 00 01 02 04 04 06 07 08 15 17 18 09 05
P06 0 0 4 3 11 65 94 96 7 0 0
P12 6 19 94 96 0
Q06 0 0 0 0 0 3 4 4 0 0 0
Q12 0 0 4 2 0
T06 0/ 0 0/18 0/ 3 0/ 0 0/ 0 0/18 2/ 1 10/ 4 0/ 3 1/ 0
T12 0/26 0/17 0/27 10/25 1/38
POZ 2 0 0 1 2 4 4 0 1 1 2 3 3 1 1 0 2 1 2 3 1
POS 13 2 1 2 1 0 0 0 0 0 0 0 0 2 0 0 0 3 0 9 28
TYP R R R R R R R R R R R R R R R R R R R R R
SNW 0 0 0
CIG 8 8 8 8 7 7 7 8 8 7 7 7 4 2 3 3 6 7 8 8 8
VIS 7 7 7 7 7 7 7 7 7 7 7 7 5 5 4 2 6 7 7 7 7
OBV N N N N N N N N N N N N BR BR BR BR N N N N N

  14. Eta MOS GUIDANCE MESSAGE FOUS44-49 (MET)
KTHV   ETA MOS GUIDANCE    11/29/2004  1200 UTC
DT /NOV 29/NOV 30 /DEC 1 /DEC 2
HR 18 21 00 03 06 09 12 15 18 21 00 03 06 09 12 15 18 21 00 06 12
N/X 27 52 41 53 27
TMP 47 46 36 34 32 30 31 42 50 51 44 44 44 46 45 50 51 48 41 35 33
DPT 27 27 29 28 27 27 28 31 32 33 35 36 37 40 40 39 35 31 28 24 25
CLD BK BK OV OV OV BK BK OV OV OV OV OV OV OV OV BK BK CL CL CL SC
WDR 31 32 00 00 00 00 00 08 16 14 00 14 15 17 21 25 27 28 27 27 25
WSP 05 02 00 00 00 00 00 02 03 04 00 04 08 10 10 17 22 18 08 06 03
P06 1 2 5 4 3 46 91 51 0 0 0
P12 5 4 91 51 0
Q06 0 0 0 0 0 1 3 1 0 0 0
Q12 0 0 4 1 0
T06 0/ 0 0/ 0 0/ 1 0/ 0 0/ 0 0/ 0 11/ 2 11/ 5 8/ 9999/99
T12 1/ 0 0/ 1 0/ 0 18/ 5 999/99
SNW 0 0 0
CIG 8 8 7 8 7 8 8 7 8 8 7 7 4 4 3 7 6 8 8 8 8

  15. GFS / Eta MOS vs. NGM MOS
• MORE STATIONS: Now at approx. 1550 forecast sites (CONUS, HI, PR)
• MORE FORECASTS: Available at projections of 12-84 hours; GFS available for 0600 and 1800 UTC cycles
• BETTER RESOLUTION: GFS predictors on 95.25 km grid; Eta on 32 km; predictor fields available at 3-h timesteps; predictors available beyond 48-h projection (* no extrapolative forecasts)
• DEPENDENT SAMPLE NOT “IDEAL”: Fewer seasons; non-static underlying NWP model

  17. GFSX MOS GUIDANCE MESSAGE FEUS21-26 (MEX)
KCXY   GFSX MOS GUIDANCE   11/26/2004  0000 UTC
FHR 24| 36 48| 60 72| 84 96|108 120|132 144|156 168|180 192
    FRI 26| SAT 27| SUN 28| MON 29| TUE 30| WED 01| THU 02| FRI 03  CLIMO
X/N 43| 29 47| 40 55| 35 51| 29 45| 32 40| 36 42| 30 45  31 46
TMP 37| 32 43| 43 46| 37 41| 32 39| 35 36| 38 37| 33 37
DPT 24| 27 37| 40 32| 28 28| 26 31| 32 30| 32 27| 24 25
CLD PC| OV OV| OV PC| CL PC| PC OV| OV OV| PC CL| CL CL
WND 10| 5 11| 11 16| 10 10| 5 9| 6 10| 12 14| 12 12
P12 0| 5 13| 91 13| 3 9| 14 24| 52 54| 48 21| 12 25  20 18
P24 | 16| 100| 9| 26| 62| 72| 25  29
Q12 0| 0 0| 3 0| 0 0| 0 0| 2 2| 2 |
Q24 | 0| 3| 0| 0| 4| |
T12 0| 0 0| 3 0| 0 0| 0 4| 6 4| 3 1| 1 1
T24 | 0 | 3 | 0 | 0 | 6 | 4 | 1
PZP 12| 9 12| 4 3| 5 6| 10 8| 8 3| 16 10| 12 8
PSN 62| 15 3| 0 0| 10 9| 15 24| 1 0| 9 32| 27 18
PRS 26| 24 7| 0 17| 18 20| 13 15| 1 2| 18 9| 11 11
TYP S| RS R| R R| R R| R RS| R R| R RS| RS R
SNW | 0| 0| 0| 0| | |

  18. Recent additions to the MOS product lineup

  19. Marine MOS
44013  GFS MOS GUIDANCE   6/01/2004  1200 UTC
DT /JUN 1/JUN 2 /JUN 3 /JUN 4
HR 18 21 00 03 06 09 12 15 18 21 00 03 06 09 12 15 18 21 00 03 06
WD 05 07 09 12 12 15 16 18 18 19 23 27 29 32 33 35 36 28 25 28 29
WS 14 12 10 07 06 06 07 08 09 09 07 07 06 06 07 07 07 07 07 10 11
WS10 15 13 11 08 07 06 07 09 10 10 07 08 07 06 07 08 08 07 07 10 12
DT /JUN 4 /
HR 09 12 15 18 21 00
WD 29 29 30 14 14 17
WS 12 09 03 03 05 07
WS10 13 10 03 03 06 08
[Map: Marine MOS sites and standard MOS sites]

  20. MOS Snowfall Guidance. Uses observations from the Cooperative Observer Network. [Maps: 36-hr forecast and verification for 12Z 12/05/03 – 12Z 12/06/03; snowfall categories >Trace - 2, 2 - <4, 4 - <6, 6 - <8.]

  21. Max/Min Guidance for Co-op Sites
GFS-BASED MOS COOP MAX/MIN GUIDANCE   3/01/05  1800 UTC
       WED 02| THU 03| FRI 04
ANNM2  26 46| 24 45| 25 46
BERM2  28 41| 25 39| 25 43
BTVM2  23 39| 21 38| 20 43
CBLM2  20 40| 18 39| 20 46
CHEM2  25 42| 21 39| 21 44
CNWM2  21 42| 21 40| 20 45
DMAM2  20 37| 18 37| 20 42
ELCM2  25 41| 21 41| 18 45
EMMM2  23 42| 20 41| 20 43
FREM2  23 46| 21 42| 23 44
FRSM2  17 27| 13 27| 13 36
GLDM2  21 37| 18 39| 18 43
HAGM2  23 43| 18 43| 19 45
KAPG   27 41| 23 37| 22 43
LRLM2  23 44| 21 42| 22 46
MECM2  24 47| 20 42| 20 45
MILM2  25 48| 22 41| 20 39
MLLM2  22 39| 18 37| 18 41
OLDM2  18 31| 13 28| 12 35
OXNM2  23 42| 22 40| 23 48
PRAM2  22 49| 22 45| 18 45
[Map callouts: Beltsville, MD; Glenn Dale, MD; Laurel 3 W]

  22. Application of Linear Regression to MOS Development

  23. MOS LINEAR REGRESSION, JANUARY 1 - JANUARY 30, 1994, 0000 UTC, KCMH. [Figure: scatter of today's max temperature (°F) vs. 18-h NGM 850-1000 mb thickness (m).]

  24. MOS LINEAR REGRESSION, JANUARY 1 - JANUARY 30, 1994, 0000 UTC, KCMH. MAX T = -352 + (0.3 x 850-1000 mb THK); RV = 93.1%. [Figure: the same scatter with the fitted regression line.]

  25. REDUCTION OF VARIANCE. A measure of the “goodness” of fit and Predictor / Predictand correlation. RV = (Variance - Standard Error) / Variance. [Schematic: predictand plotted against predictor, showing the mean predictand, the portion of the spread explained by the fit (labeled RV), and the remaining unexplained variance.]
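
To make the fit and the RV statistic concrete, here is a small Python sketch that mimics the KCMH thickness-to-max-temperature example with synthetic data (the noise level and sample are assumptions, not the actual 1994 sample) and computes RV as the fraction of predictand variance explained.

```python
import numpy as np

# Synthetic stand-in for the KCMH example: 18-h NGM 850-1000 mb
# thickness (m) vs. observed daytime max temperature (deg F).
rng = np.random.default_rng(0)
thickness = rng.uniform(1150, 1350, 30)
max_t = -352 + 0.3 * thickness + rng.normal(0, 3, 30)   # assumed scatter

# Least-squares fit: MAX T = a + b * THK
b, a = np.polyfit(thickness, max_t, 1)

# Reduction of variance: fraction of predictand variance explained by the fit
resid = max_t - (a + b * thickness)
rv = 1.0 - np.var(resid) / np.var(max_t)
print(f"MAX T = {a:.0f} + ({b:.2f} x THK),  RV = {100 * rv:.1f}%")
```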

  26. MOS LINEAR REGRESSION, JANUARY 1 - JANUARY 30, 1994, 0000 UTC, KUIL. RV = 26.8%. [Figure: scatter of today's max temperature (°F) vs. 18-h NGM 850-1000 mb thickness (m).]

  27. MOS LINEAR REGRESSION, DECEMBER 1, 1993 - MARCH 5, 1994, 0000 UTC, KCMH. [Figure: 12-24 h precipitation ≥ .01" (0 or 1) plotted against average 12-24 h NGM ~1000-500 mb RH (%).]

  28. MOS LINEAR REGRESSION, DECEMBER 1, 1993 - MARCH 5, 1994, 0000 UTC, KCMH. [Figure: the same binary-predictand scatter with a fitted line; RV = 36.5%.]

  29. MOS LINEAR REGRESSION, DECEMBER 1, 1993 - MARCH 5, 1994, 0000 UTC, KCMH. [Figure: the same scatter with two fits annotated: RV = 36.5% and RV = 42.4%.]

  30. MOS LINEAR REGRESSION, DECEMBER 1, 1993 - MARCH 5, 1994, 0000 UTC, KCMH. POP = -0.234 + (0.007 X MRH) + (0.478 X BINARY MRH (70%)). [Figure: the same scatter, now annotated with RV = 36.5%, RV = 42.4%, and RV = 44.9% for the combined equation.]

  31. EXAMPLE REGRESSION EQUATIONS: Y = a + bX
CMH MAX TEMPERATURE EQUATION: MAX T = -352 + (0.3 x 850-1000 mb THICKNESS)
CMH PROBABILITY OF PRECIPITATION EQUATION: POP = -0.234 + (0.007 x MEAN RH) + (0.478 x BINARY MEAN RH CUTOFF AT 70%)*
*(IF MRH ≥ 70%, BINARY MRH = 1; else BINARY MRH = 0)
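
A short sketch of how these two published equations could be evaluated for a single case; the clipping of probabilities to [0, 1] is an added assumption for illustration, not part of the equations as shown.

```python
def max_t_cmh(thickness_m):
    """CMH max temperature equation from the slide above."""
    return -352.0 + 0.3 * thickness_m

def pop_cmh(mean_rh_pct):
    """CMH PoP equation: continuous mean RH plus a 70% point binary term."""
    binary_mrh = 1.0 if mean_rh_pct >= 70.0 else 0.0
    pop = -0.234 + 0.007 * mean_rh_pct + 0.478 * binary_mrh
    return min(max(pop, 0.0), 1.0)   # clip to a valid probability (assumption)

print(max_t_cmh(1300.0))   # 38.0 deg F for a 1300 m thickness
print(pop_cmh(85.0))       # ~0.84 when mean RH is 85%
```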

  32. If the predictand is BINARY, MOS regression equations produce estimates of event PROBABILITIES... [Figure: KCMH, 12-24 h precipitation ≥ .01" (0 or 1) vs. average 12-24 h NGM ~1000-500 mb RH; in the highlighted group of cases, 3 events and 7 non-events give P = 30%, matching the 30% relative frequency.]
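
A tiny numerical illustration of the point above: with a binary (0/1) predictand, the regression estimate within a group of similar cases is simply the conditional relative frequency. The 3-of-10 split follows the slide; the case ordering is made up.

```python
import numpy as np

# Ten hypothetical developmental cases in one mean-RH bin: 3 saw >= 0.01".
precip_occurred = np.array([1, 0, 0, 1, 0, 0, 0, 1, 0, 0])

# The fitted "probability" for this bin is the observed relative frequency.
print(precip_occurred.mean())   # 0.30, i.e. P = 30%
```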

  33. Making a PROBABILISTIC statement... Quantifies the uncertainty !

  34. DEFINITION of PROBABILITY (Wilks, 1994) • LONG TERM RELATIVE FREQUENCY OF AN EVENT • DEGREE OF BELIEF OR QUANTIFIED JUDGMENT ABOUT THE OCCURRENCE OF AN UNCERTAIN EVENT KEEP IN MIND: Assessment of probability is EXTREMELY dependent upon how predictand “event” is defined: - Time period of consideration - Area of occurrence - Dependent upon another event? • POINT PROBABILITY • AREAL PROBABILITY • CONDITIONAL PROBABILITY

  35. AREAL PROBABILITIES. 3-h Eta MOS thunderstorm probability forecasts valid 0000 UTC 8/27/2002 (21-24h proj). [Maps: one on a 40-km gridbox and one on a 20-km gridbox, each with a 10% contour interval.] What if these were 6-h forecasts?

  36. PROPERTIES OF MOS PROBABILITY FORECASTS
• Unbiased: Average forecast probability equals long-term relative frequency of event
• Reliable: Conditionally or “piecewise” unbiased over entire range of forecast probabilities
• Reflect predictability of event: Range narrows and approaches event RF as NWP model skill declines
  - extreme forecast projection
  - rare events
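
One way to check the "reliable / conditionally unbiased" property is to bin forecast probabilities and compare each bin's mean forecast with the observed relative frequency. A minimal sketch is below; the bin width and input format are assumptions.

```python
import numpy as np

def reliability_table(prob_fcsts, observed, n_bins=10):
    """Bin probability forecasts and compare each bin's mean forecast
    probability with the observed relative frequency (the data behind
    a reliability diagram)."""
    prob_fcsts = np.asarray(prob_fcsts, dtype=float)
    observed = np.asarray(observed, dtype=float)      # 1 = event occurred
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    rows = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (prob_fcsts >= lo) & (prob_fcsts < hi)
        if hi == 1.0:
            mask |= prob_fcsts == 1.0                 # include p = 1 in last bin
        if mask.any():
            rows.append((prob_fcsts[mask].mean(), observed[mask].mean(), int(mask.sum())))
    return rows   # (mean forecast prob, observed rel. freq., N) per bin
```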

  37. Designing an Operational MOS System: Putting theory into practice…

  38. DEVELOPMENTAL CONSIDERATIONS: MOS in the real world
• Selection (and QC!) of Suitable Observational Datasets: ASOS? Remote sensor? Which mesonet?

  39. Suitable observations? Appropriate Sensor? Real or Memorex? Good siting? Photo Courtesy W. Shaffer

  40. DEVELOPMENTAL CONSIDERATIONS: MOS in the real world
• Selection (and QC!) of Suitable Observational Datasets: ASOS? Remote sensor? Which mesonet?
• Predictand Definition: Must be precise!!

  41. PREDICTAND DEFINITION: Max/Min and PoP
• Daytime Maximum Temperature: “Daytime” is 0700 AM - 0700 PM LST
• Nighttime Minimum Temperature: “Nighttime” is 0700 PM - 0800 AM LST
• Probability of Precipitation: Precipitation occurrence is accumulation of ≥ 0.01 inches of liquid-equivalent at a gauge location within a specified period
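
A minimal sketch of applying these predictand definitions to hourly observations; the data structures (a dict of temperatures keyed by LST hour, a period precipitation total in inches) are assumptions for illustration.

```python
def daytime_max(hourly_temps_by_lst_hour):
    """Daytime max temperature: warmest hourly value between 07 AM and 07 PM LST."""
    return max(t for hour, t in hourly_temps_by_lst_hour.items() if 7 <= hour <= 19)

def pop_occurred(period_precip_inches):
    """PoP predictand: 1 if >= 0.01 inch liquid equivalent fell in the period."""
    return 1 if period_precip_inches >= 0.01 else 0

print(daytime_max({6: 31, 9: 38, 13: 46, 16: 44, 20: 40}))   # -> 46
print(pop_occurred(0.03))                                     # -> 1
```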

  42. PREDICTAND DEFINITION: GFSX 12-h Average Cloud Amount
• Determined from 13 consecutive hourly ASOS observations, satellite augmented
• Assign value to each METAR report: CLR = 0; FEW = 0.15; SCT = 0.38; BKN = 0.69; OVC = 1
• Take weighted average of above
• Categorize: CL < .3125 ≤ PC ≤ .6875 < OV
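
A sketch of this cloud-amount computation; the slide specifies a weighted average of the 13 hourly values without giving the weights, so equal weighting is assumed here.

```python
CLOUD_VALUE = {"CLR": 0.0, "FEW": 0.15, "SCT": 0.38, "BKN": 0.69, "OVC": 1.0}

def mean_cloud_category(hourly_reports, weights=None):
    """13 consecutive hourly sky-cover reports -> CL / PC / OV category.

    hourly_reports : list of METAR sky-cover strings, e.g. ["SCT", "BKN", ...]
    weights        : optional per-hour weights; equal weighting assumed here.
    """
    values = [CLOUD_VALUE[r] for r in hourly_reports]
    if weights is None:
        weights = [1.0] * len(values)
    avg = sum(w * v for w, v in zip(weights, values)) / sum(weights)
    if avg < 0.3125:
        return "CL"
    elif avg <= 0.6875:
        return "PC"
    else:
        return "OV"

print(mean_cloud_category(["CLR"] * 4 + ["SCT"] * 5 + ["BKN"] * 4))   # -> "PC"
```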

  43. Creating a Gridded Predictand. Lightning strikes are summed over the “appropriate” time period and assigned to the center of “appropriate” grid boxes. A thunderstorm is deemed to have occurred when one or more lightning strikes are observed within a given gridbox. [Map legend: symbols marking “no thunderstorm” and “thunderstorm” grid boxes.]
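
A sketch of constructing this gridded binary thunderstorm predictand from lightning-strike locations; the grid spacing and strike coordinates are illustrative assumptions.

```python
import numpy as np

def thunderstorm_grid(strike_lats, strike_lons, lat_edges, lon_edges):
    """Count lightning strikes per grid box and flag a 'thunderstorm' (1)
    wherever one or more strikes fell during the valid period."""
    counts, _, _ = np.histogram2d(strike_lats, strike_lons,
                                  bins=[lat_edges, lon_edges])
    return (counts >= 1).astype(int)   # binary thunderstorm predictand

# Example: roughly 40-km boxes over a small domain (spacing is illustrative).
lat_edges = np.arange(38.0, 40.01, 0.36)
lon_edges = np.arange(-78.0, -75.99, 0.45)
occurred = thunderstorm_grid([38.5, 38.52, 39.7], [-77.2, -77.18, -76.3],
                             lat_edges, lon_edges)
print(occurred)
```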

  44. DEVELOPMENTAL CONSIDERATIONS: MOS in the real world
• Selection (and QC!) of Suitable Observational Datasets: ASOS? Remote sensor? Which mesonet?
• Predictand Definition: Must be precise!!
• Choice of Predictors: “Appropriate” formulation; binary or other transform?

  45. “APPROPRIATE” PREDICTORS
• PRECIPITABLE WATER
• VERTICAL VELOCITY
• MOISTURE DIVERGENCE
• MODEL PRECIPITATION
(Also shown, marked with an “X”: 1000-500 MB THK, TROPOPAUSE HGT)
• DESCRIBE PHYSICAL PROCESSES ASSOCIATED WITH OCCURRENCE OF PREDICTAND
• “MIMIC” FORECASTER THOUGHT PROCESS, i.e. for POP: (VERTICAL VELOCITY) X (MEAN RH)
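
A sketch of forming such a "forecaster-style" interaction predictor from two model fields so that it can be offered to the regression alongside the raw fields; the field values are made up, and restricting the product to upward motion is an added assumption.

```python
import numpy as np

# Hypothetical model fields interpolated to a station, one value per case
# in the developmental sample.
vvel = np.array([-2.0, 0.5, 3.1, 4.0])       # vertical-velocity proxy
mean_rh = np.array([45.0, 60.0, 85.0, 92.0])  # layer-mean RH (%)

# Interaction predictor: "lift AND moisture", mimicking forecaster reasoning.
# Clipping to upward motion only is an assumption for this illustration.
vvel_x_rh = np.clip(vvel, 0.0, None) * mean_rh

# Candidate predictor matrix handed to the forward-selection regression.
candidates = np.column_stack([vvel, mean_rh, vvel_x_rh])
print(candidates)
```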

  46. POINT BINARY PREDICTOR: 24-h mean RH, cutoff = 70%. Interpolate first; if station RH ≥ 70%, binary = 1; binary = 0 otherwise. [Figure: grid of model RH values around KCMH (96 86 89 94 / 87 73 76 90 / 76 60 69 92 / 64 54 68 93); the interpolated value at KCMH is 71%, so RH ≥ 70% and the binary at KCMH = 1.]

  47. GRID BINARY PREDICTOR: 24-h mean RH, cutoff = 70%. Where RH ≥ 70%, set gridpoint = 1; then interpolate. [Figure: the same grid converted to 0/1 values (1 1 1 1 / 1 1 1 1 / 1 0 0 1 / 0 0 0 1); the interpolated value at KCMH is 0.21, so 0 ≤ value at KCMH ≤ 1.]
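
A sketch contrasting the two transforms on a small set of surrounding gridpoints; the RH values and the equal interpolation weights are illustrative assumptions, not the KCMH case shown above.

```python
import numpy as np

def point_binary(rh_grid, station_weights, cutoff=70.0):
    """Interpolate RH to the station first, then apply the cutoff:
    the result is exactly 0 or 1."""
    station_rh = np.sum(rh_grid * station_weights)   # e.g. bilinear weights
    return 1.0 if station_rh >= cutoff else 0.0

def grid_binary(rh_grid, station_weights, cutoff=70.0):
    """Apply the cutoff at each gridpoint first, then interpolate:
    the result varies smoothly between 0 and 1."""
    binary_grid = (rh_grid >= cutoff).astype(float)
    return float(np.sum(binary_grid * station_weights))

# Four surrounding gridpoints with equal interpolation weights (illustrative).
rh = np.array([73.0, 76.0, 60.0, 69.0])
w = np.array([0.25, 0.25, 0.25, 0.25])
print(point_binary(rh, w))   # interpolated RH = 69.5 < 70  -> 0.0
print(grid_binary(rh, w))    # 2 of 4 gridpoints >= 70      -> 0.5
```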

  48. Logit Transform Example KPIA (Peoria, IL) 0000 UTC ; 18-h projection
