Validation of the AMSU Snowfall Algorithm (over the US)



1. Validation of the AMSU Snowfall Algorithm (over the US)
Ralph Ferraro, Matt Sapiano, Meredith Nichols, Huan Meng
National Oceanic and Atmospheric Administration (NOAA)
National Environmental Satellite, Data & Information Service (NESDIS)
and The Cooperative Institute for Climate and Satellites (CICS), College Park, Maryland

2. Objectives and Science Questions
• NOAA has an operational precipitation product from AMSU
  • Includes a falling snow identification flag over land (Kongoli et al., 2003, GRL)
  • Snowfall rates being developed by H. Meng
• We know that it works in some cases and not in others
  • How do we quantify the accuracy of the detection algorithm? Some work was done during algorithm development…
  • Under what meteorological conditions does it work and not work? This is what we are really after!
• The answer is crucial as we enter the GPM era
  • Snowfall is an important component of the hydrological cycle
  • In some places, snowfall is the primary form of precipitation
• What I plan on showing
  • Several attempts to validate
  • What we are hoping to accomplish (work in progress)

3. East Coast Snow/Ice Storm – 14 February 2007
[Figure panels: NOAA-16 Precipitation Type/Rainfall Rate · NEXRAD Reflectivity · NOAA-16 Snowfall Rate]
• Detected snowfall corresponds with > 20 dBZ reflectivity
• Underestimates in heavy snow

4. February 5-6, 2010 Snow Event (courtesy H. Meng)
[Figure: snowfall rate, SFR (mm/hr)]

5. February 5-6, 2010 Snow Event, continued (courtesy H. Meng)

6. Verification Issues
• Hourly surface reports of snowfall vary widely
  • “S-” can mean just about anything
  • Visibility and the temperature-dewpoint spread (RH) are better indicators of intensity (see the sketch below)
• Hourly water-equivalent measurements are scarce and unreliable (ASOS, wind effects, etc.)
• Radar does not make the rain/snow distinction without human interpretation
  • Virga and surface temperature are issues
• A wide variety of conditions can exist within a microwave satellite FOV
• Our previous work shows a “lag” between the surface and satellite signals
  • Snow falls slower than rain
• Others
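Since the slide points to visibility and the temperature-dewpoint spread as better intensity indicators, here is a minimal sketch of that kind of screening, assuming the standard Magnus approximation for saturation vapor pressure; the 90% RH and 0.8 km visibility thresholds are illustrative assumptions, not values from the study.

```python
import math

def relative_humidity(temp_c: float, dewpoint_c: float) -> float:
    """Estimate RH (%) from temperature and dewpoint (deg C) using the
    Magnus approximation for saturation vapor pressure."""
    def e_sat(t_c):
        return 6.112 * math.exp(17.67 * t_c / (t_c + 243.5))  # hPa
    return 100.0 * e_sat(dewpoint_c) / e_sat(temp_c)

def likely_significant_snow(temp_c, dewpoint_c, visibility_km):
    """Hypothetical screening rule: near-saturated air plus low visibility
    suggests moderate-or-heavier snow. Thresholds are illustrative only."""
    return relative_humidity(temp_c, dewpoint_c) > 90.0 and visibility_km < 0.8

print(likely_significant_snow(-2.0, -2.5, 0.4))  # True
```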

7. First Attempt – Climatology
• We generated an AMSU snowfall “climatology” (a gridding sketch follows below)
  • 7 years, 5 deg grids
  • NOAA-15 and -16
• Some assessments
  • Heaviest occurrences in “transition zones”, but values seem low
  • Large areas where retrievals don’t occur (too cold and dry)
  • Other features: E. Canada, Pacific NW/AK, Rocky Mountains, Himalayas
[Figure panels: SON, DJF, MAM]
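As a rough illustration of how such a climatology can be built, here is a minimal sketch that bins snowfall detections into 5-degree seasonal grids; the `detections` input of (lat, lon, month) tuples is a hypothetical stand-in for the AMSU snowfall flags.

```python
import numpy as np

SEASONS = {"DJF": (12, 1, 2), "MAM": (3, 4, 5), "SON": (9, 10, 11)}

def seasonal_counts(detections, cell_deg=5.0):
    """Count snowfall detections per grid cell, split by season."""
    nlat, nlon = int(180 / cell_deg), int(360 / cell_deg)
    grids = {s: np.zeros((nlat, nlon)) for s in SEASONS}
    for lat, lon, month in detections:
        i = min(int((lat + 90.0) / cell_deg), nlat - 1)   # latitude bin
        j = min(int((lon + 180.0) / cell_deg), nlon - 1)  # longitude bin
        for season, months in SEASONS.items():
            if month in months:
                grids[season][i, j] += 1
    return grids

grids = seasonal_counts([(45.2, -75.7, 1), (47.6, -122.3, 11)])
print(grids["DJF"].sum(), grids["SON"].sum())  # 1.0 1.0
```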

8. Comparison with Snow Cover
[Figure panels: Rutgers Snow Lab – Jan 2006 · AMSU – Jan 2006]

9. Verification
• A. Dai (NCAR): J. Climate, 2001; COADS climatology; 15,000 surface station reports
  • Can stratify by WMO weather codes
• Grouping by all snow reports: huge discrepancies. Why?
  • SW-, non-accumulating snow
• Filtered by S/SW and SW+/S+, plus temperature/visibility info from Rasmussen & Cole (2002); see the filtering sketch below
  • Better qualitative agreement
  • Still not an apples-to-apples comparison: AMSU samples 4 times/day; COADS, 24 times/day
  • Does imply that AMSU has skill in these types of events
• Some recent work by Liu with CloudSat
  • Frequency of snow values comparable to these filtered data
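For concreteness, here is a minimal sketch of stratifying synoptic reports by WMO present-weather (ww, code table 4677) codes in the spirit of the filtering described above: ww 70-75 cover snowfall from slight to heavy, and 85/86 cover snow showers. The report structure and station IDs are hypothetical.

```python
# Standard WMO 4677 groupings; the light categories correspond to the
# often non-accumulating "S-"/"SW-" reports discussed above.
SNOW_ALL = set(range(70, 76)) | {85, 86}   # 70-75 snowfall, 85/86 snow showers
SNOW_LIGHT = {70, 71, 85}                  # slight snow / slight snow showers
SNOW_SIGNIFICANT = SNOW_ALL - SNOW_LIGHT   # moderate or heavy categories

def filter_reports(reports, keep_light=False):
    """Keep reports whose ww code indicates snow; optionally drop the
    light, often non-accumulating categories."""
    allowed = SNOW_ALL if keep_light else SNOW_SIGNIFICANT
    return [r for r in reports if r["ww"] in allowed]

reports = [{"station": "725090", "ww": 71}, {"station": "727930", "ww": 73}]
print(len(filter_reports(reports)))                   # 1 (moderate snow only)
print(len(filter_reports(reports, keep_light=True)))  # 2
```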

10. AMSU (left) vs. COADS (right)
[Figure panels: SON, DJF]

11. Second Attempt – Daily Snow Reports
• Are there denser surface networks that can be used?
• Is there a better way to validate the ‘spatial’ patterns of the AMSU?
  • Storm cases indicate ‘skill’; how best to quantify it?
• WMO/NCDC – “Global Surface Summary of the Day Data V7” (a parsing sketch follows below)
  • 9,000 stations
  • Gives weather occurrences, max/min temperature, precipitation totals
• Effort led by Matt Sapiano (now at CIRA)
  • 2000-08; N15, N16, N18 data
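As an illustration of working with the daily summaries, here is a minimal sketch of flagging snow days from GSOD-style records via the six-character FRSHTT indicator (Fog, Rain, Snow, Hail, Thunder, Tornado), whose third character marks snow/ice pellets; the simplified record layout here is an assumption, not the full GSOD format.

```python
from dataclasses import dataclass

@dataclass
class DailyRecord:
    station: str
    date: str       # YYYY-MM-DD
    frshtt: str     # e.g. "001000" -> snow/ice pellets reported

def snow_days(records):
    """Select records whose FRSHTT snow flag (third character) is set."""
    return [r for r in records if len(r.frshtt) == 6 and r.frshtt[2] == "1"]

records = [
    DailyRecord("725090", "2007-02-14", "001000"),
    DailyRecord("722950", "2007-02-14", "010000"),
]
print([r.station for r in snow_days(records)])  # ['725090']
```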

12. Nine Years of Comparisons
• High-resolution data are still fairly sparse
  • 1 or 2.5 deg grids are better for comparison
• GSOD > AMSU just about everywhere
  • Probably due to very light snow
• Let’s look closer…

13. Example – GSOD vs. AMSU
• Extrapolate GSOD onto a 1 deg grid
• Color coding (scored quantitatively in the sketch below)
  • Green (hit)
  • Blue (false alarm on AMSU)
  • Red (miss; could be rain)
  • Gray (hit – no snow)
• Qualitative assessment
  • Overrunning snow – good
  • Upper low/backside snow – missed
• Attempted quantitative assessment at different locations
  • M. Nichols, HS student/intern
  • Different locations/regimes
  • Results inconclusive…time coincidence is a limitation
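The green/blue/red/gray coding maps onto the standard 2x2 contingency table, so skill can be quantified as sketched below; the boolean input grids and the choice of POD/FAR as summary scores are illustrative assumptions, not the study's actual scoring.

```python
import numpy as np

def contingency_scores(amsu_snow, surface_snow):
    """Score AMSU detections against gridded surface reports using the
    2x2 contingency categories that match the slide's color coding."""
    hits = np.sum(amsu_snow & surface_snow)            # green
    false_alarms = np.sum(amsu_snow & ~surface_snow)   # blue
    misses = np.sum(~amsu_snow & surface_snow)         # red
    correct_neg = np.sum(~amsu_snow & ~surface_snow)   # gray
    pod = hits / (hits + misses) if hits + misses else float("nan")
    far = false_alarms / (hits + false_alarms) if hits + false_alarms else float("nan")
    return {"hits": int(hits), "false_alarms": int(false_alarms),
            "misses": int(misses), "correct_negatives": int(correct_neg),
            "POD": pod, "FAR": far}

amsu = np.array([[True, False], [True, True]])
sfc = np.array([[True, True], [False, True]])
print(contingency_scores(amsu, sfc))  # POD = 0.667, FAR = 0.333
```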

14. Current Attempt – Hourly Snow Reports
• Although we tried to avoid this, it is really the only way to go…
• H. Meng’s effort: a HUGE database of hourly synoptic reports collocated with AMSU for several years and all satellites (a collocation sketch follows below)
• Data stratified by location, weather conditions, precipitation totals, etc.
• Still evaluating the data…
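Here is a minimal sketch of the kind of satellite-surface collocation this involves, pairing reports and pixels within a time window and a great-circle radius; the ±30 min and 25 km thresholds are illustrative assumptions, not the values used in H. Meng's matchup database.

```python
import math
from datetime import datetime, timedelta

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in kilometers."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = p2 - p1
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * 6371.0 * math.asin(math.sqrt(a))

def collocate(reports, pixels, max_km=25.0, max_dt=timedelta(minutes=30)):
    """Pair each surface report with every satellite pixel that is
    close enough in both time and space."""
    return [(r, p) for r in reports for p in pixels
            if abs(r["time"] - p["time"]) <= max_dt
            and haversine_km(r["lat"], r["lon"], p["lat"], p["lon"]) <= max_km]

reports = [{"lat": 40.0, "lon": -75.0, "time": datetime(2007, 2, 14, 12, 0)}]
pixels = [{"lat": 40.1, "lon": -75.1, "time": datetime(2007, 2, 14, 12, 20)}]
print(len(collocate(reports, pixels)))  # 1
```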

  15. Some Results – Jan 2007

16. Summary
• Validation of the AMSU snowfall algorithm is a difficult task
• The algorithm has known limitations; we are trying to couple physical phenomena to them
  • Temperature/moisture profiles, surface conditions, precipitation intensity, etc.
• We know that the algorithm has ‘skill’; illustrating this has been a challenge. Why?
  • Incompatibility between satellite and ground data, more severe than for rainfall
  • Ground data are fairly scarce, and their quality is in question
• The current method, which should help answer this, is direct matching between satellite and surface reports
• Emerging work with CloudSat (and GV) should also be pursued
