
Parameterization: Cloud Scheme Validation



  1. Parameterization: Cloud Scheme Validation. Adrian Tompkins, ICTP lecturer (tompkins@ictp.it).

  2. Cloud Validation: The issues • AIM: To perfectly simulate one aspect of nature: CLOUDS • APPROACH: Validate the model-generated clouds against observations, and use the information concerning apparent errors to improve the model physics, and subsequently the cloud simulation. [Diagram: cloud simulation compared with cloud observations → error → parameterisation improvements, feeding back into the simulation] Sounds easy?

  3. Cloud Validation: The problems • How much of the ‘error’ derives from the observations? [Diagram: the cloud observations carry error e1, the cloud simulation carries error e2; the apparent mismatch mixes the two; see the relation below]
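
A minimal illustration of why this matters: if the observation error e1 and the simulation error e2 are assumed independent with zero mean, the apparent model-observation mismatch e mixes the two.

```latex
% Assumption: e_1 (observation) and e_2 (simulation) are independent,
% zero-mean errors contributing to the apparent mismatch e.
\[
  \langle e^{2}\rangle \;=\; \langle e_{1}^{2}\rangle + \langle e_{2}^{2}\rangle
  \qquad\Longrightarrow\qquad
  \langle e_{2}^{2}\rangle \;=\; \langle e^{2}\rangle - \langle e_{1}^{2}\rangle
\]
% Without an independent estimate of e_1, the model error e_2
% cannot be isolated from the comparison alone.
```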

  4. Cloud Validation: The problems • Which physics is responsible for the error? [Diagram: turbulence, radiation, cloud physics, dynamics and convection all feed the cloud simulation, so the error cannot be attributed to the cloud scheme alone]

  5. The path to improved cloud parameterisation… [Diagram: cloud validation via case studies, composite studies, NWP validation and comparison to satellite products; the step from these to parameterisation improvement is marked with a question mark]

  6. Model climate - Broadband radiative fluxes. Can compare Top of Atmosphere (TOA) radiative fluxes with satellite observations, e.g. TOA shortwave radiation (TSR). [Map: JJA 1987 TSR, CY18R6 minus ERBE] Stratocumulus regions are poor, and so is North Africa (an old model cycle!). (A minimal comparison sketch follows.)
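
A minimal sketch of such a comparison, assuming model and satellite TOA fluxes have already been regridded to a common latitude-longitude grid (all variable names here are illustrative):

```python
import numpy as np

def area_weighted_stats(model, obs, lat):
    """Area-weighted bias and RMSE of a lat-lon field (e.g. W m-2).

    model, obs : 2-D arrays (nlat, nlon) on the same grid
    lat        : 1-D array of latitudes in degrees
    """
    # cos(lat) weights: equal-angle grid cells shrink towards the poles
    w = np.cos(np.deg2rad(lat))[:, None] * np.ones_like(model)
    diff = model - obs
    bias = np.average(diff, weights=w)
    rmse = np.sqrt(np.average(diff**2, weights=w))
    return bias, rmse
```

Without the latitude weighting, an equal-angle grid would overweight errors near the poles relative to the tropics.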

  7. Model climate - Cloud radiative “forcing” • Problem: can we associate these “errors” with clouds? • Another approach is to examine “cloud radiative forcing” (CRF). [Map: JJA 1987 SWCRF, CY18R6 minus ERBE] Cloud problems: stratocumulus YES, North Africa NO! Note that CRF is sometimes defined as Fclr - F (written out below), and model calculations of it also differ.
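
Written out, the convention noted on the slide (the signs flip under the alternative definition CRF = F - Fclr):

```latex
% Cloud radiative forcing as clear-sky minus all-sky TOA flux,
% here for the shortwave component shown on the slide:
\[
  \mathrm{SWCRF} \;=\; F^{\,\mathrm{SW}}_{\mathrm{clr}} - F^{\,\mathrm{SW}}
\]
% Subtracting the clear-sky flux removes surface and aerosol effects
% common to model and observations, which is why CRF errors can be
% attributed to clouds more safely than raw TOA flux errors.
```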

  8. Model climate - “Cloud fraction” or “total cloud cover”. Can also compare other variables to derived products. [Map: JJA 1987 total cloud cover (TCC), CY18R6 minus ISCCP] References: ISCCP: Rossow and Schiffer, Bull. Am. Met. Soc., 1991; ERBE: Ramanathan et al., Science, 1989.

  9. Climatology: The problems. Another approach is to simulate irradiances using model fields: each column’s per-layer vapour, liquid cloud and ice cloud is fed to a radiative transfer model to produce simulated satellite channels (Channel 1, Channel 2, …). If more complicated cloud parameters are desired (e.g. vertical structure), the retrieval can be ambiguous, since it rests on assumptions about vertical structure. (A schematic sketch follows.)
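
A schematic of this “model-to-satellite” approach; rt_model below is a hypothetical stand-in for whatever radiative transfer code is actually used, not a real API:

```python
import numpy as np

def simulate_channels(profiles, rt_model, channels):
    """Drive a radiative transfer model with model columns.

    profiles : sequence of dicts holding per-layer 'vapour', 'liquid'
               and 'ice' (plus pressure/temperature) for one column
    rt_model : callable(profile, channel) -> simulated radiance or
               brightness temperature (hypothetical interface)
    channels : list of satellite channel identifiers
    """
    out = np.empty((len(profiles), len(channels)))
    for i, prof in enumerate(profiles):
        for j, chan in enumerate(channels):
            out[i, j] = rt_model(prof, chan)
    return out  # compared directly against the observed channels
```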

  10. Simulating satellite channels. More certainty in the diagnosis of the existence of a problem, but it doesn’t necessarily help identify the origin of the problem. Examples: Morcrette, MWR, 1991; Chevallier et al., J. Clim., 2001.

  11. VARIABILITY: DIURNAL CYCLE OVER TROPICAL LAND. A more complicated analysis is possible. Observations show a late-afternoon peak in convection; the model has a morning peak (a common problem). (A compositing sketch follows.)
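
One way to quantify the diurnal phase, sketched here under simple assumptions: composite a field (e.g. precipitation) by local solar time, approximated as UTC + longitude/15, and locate the peak hour at each point. All names are illustrative.

```python
import numpy as np

def diurnal_peak_hour(values, utc_hours, lons):
    """Hour (local solar time) of the composite diurnal maximum.

    values    : array (ntime, npoints) of the field to composite
    utc_hours : array (ntime,) hour-of-day of each sample in UTC
    lons      : array (npoints,) longitudes in degrees east
    """
    # local solar time ~ UTC + lon/15 (15 degrees per hour)
    lst = (utc_hours[:, None] + lons[None, :] / 15.0) % 24.0
    hour_bin = lst.astype(int)
    mean_by_hour = np.zeros((24, values.shape[1]))
    for h in range(24):
        mask = hour_bin == h
        n = np.maximum(mask.sum(axis=0), 1)
        mean_by_hour[h] = np.where(mask, values, 0.0).sum(axis=0) / n
    return mean_by_hour.argmax(axis=0)  # late-afternoon vs morning peak
```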

  12. Has this Improved? 29r1, 06 UTC

  13. 29r1, 12 UTC

  14. 29r1, 18UTC

  15. NWP forecast evaluation • Differences in longer simulations may not be the direct result of the cloud scheme • Interaction with radiation, dynamics etc., e.g. the poor stratocumulus regions • Using short-term NWP forecasts or analyses restricts these feedbacks and allows one to concentrate on the cloud scheme. [Time series: cloud cover bias against time, with the introduction of the Tiedtke scheme marked]

  16. Example over Europe. [Map legend: error classes -8:-3, -3:-1, -1:1, 1:3, 3:8] • What are your conclusions concerning the cloud scheme?

  17. Example over Europe. [Same map and legend as above] • Who wants to be a Meteorologist? Which of the following is a drawback of SYNOP observations? (a) They are only available over land (b) They are only available during daytime (c) They do not provide information concerning vertical structure (d) They are made by people

  18. Case studies • These were examples of general statistics, globally or for specific regions • One can instead concentrate on a particular location in more detail, for which more data are collected: a CASE STUDY • Examples: GATE, CEPEX, TOGA-COARE, ARM…

  19. Evaluation of vertical cloud structure. Mace et al., 1998, GRL: examined the frequency of occurrence of ice cloud at the ARM site on the US Southern Great Plains; a reasonable match to the data was found.

  20. Evaluation of vertical cloud structure. Hogan et al., 2000, JAM: analysis using the Chilbolton radar and lidar; a reasonable match.

  21. Hogan et al.: more detail is possible.

  22. Hogan et al.: found that the comparison improved when snow was taken into account.

  23. Issues raised: • WHAT ARE WE COMPARING? Is the model statistic really equivalent to what the instrument measures? e.g. the radar sees snow, but the model may not include this in the definition of cloud fraction; small ice amounts may be invisible to the instrument but included in the model statistic (see the sketch below) • HOW STRINGENT IS OUR TEST? Perhaps the variable is easy to reproduce; e.g. mid-latitude frontal clouds are strongly dynamically forced and cloud fraction is often zero or one, so cloud fraction statistics may be easy to reproduce in short-term forecasts
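
The first issue can be reduced by post-processing the model into an “instrument view” before comparing, as in this sketch: add snow to the condensate, assume an area fraction for it, and discard amounts below a detection threshold. Both the threshold and the snow-fraction choice are illustrative assumptions, not instrument constants.

```python
import numpy as np

def radar_view_cloud_fraction(cloud_frac, cwc, swc, snow_frac,
                              min_water=1e-7):
    """Cloud fraction as a cloud radar might plausibly report it.

    cloud_frac : model cloud fraction per grid box (0-1)
    cwc        : model cloud (liquid + ice) water content, kg m-3
    swc        : model snow water content, kg m-3; snow is often
                 excluded from the model's own cloud fraction
    snow_frac  : assumed area fraction of falling snow, e.g. the
                 cloud fraction of the layer above (a crude choice)
    min_water  : illustrative detection threshold, kg m-3; real
                 radar sensitivity varies with range and frequency
    """
    detectable = (cwc + swc) >= min_water
    # union of model cloud and assumed snow area, zeroed where the
    # condensate is too tenuous for the radar to see at all
    return np.where(detectable, np.maximum(cloud_frac, snow_frac), 0.0)
```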

  24. Case studies can also be used to validate “components” of the cloud scheme. EXAMPLE: cloud overlap assumptions (Hogan and Illingworth, 2000, QJRMS; see the sketch below). Issues raised: HOW REPRESENTATIVE IS OUR CASE STUDY LOCATION? e.g. wind shear and dynamics are very different between Southern England and the tropics!!!
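
A sketch of the kind of “component” at stake here: total cloud cover from a profile of layer fractions under the widely used maximum-random overlap assumption (vertically adjacent cloudy layers overlap maximally; layers separated by clear air overlap randomly), written in the standard product form.

```python
import numpy as np

def total_cloud_cover_maxran(cf, eps=1e-6):
    """Total cloud cover under maximum-random overlap.

    cf : layer cloud fractions, ordered top-down, with
         C = 1 - prod_k [1 - max(cf_k, cf_{k-1})] / [1 - cf_{k-1}]
         and cf_{-1} = 0 (Geleyn-Hollingsworth form).
    """
    cf = np.clip(np.asarray(cf, dtype=float), 0.0, 1.0 - eps)
    clear, prev = 1.0, 0.0
    for c in cf:
        clear *= (1.0 - max(c, prev)) / (1.0 - prev)
        prev = c
    return 1.0 - clear

print(total_cloud_cover_maxran([0.5, 0.5]))       # adjacent layers: 0.5
print(total_cloud_cover_maxran([0.5, 0.0, 0.5]))  # separated layers: 0.75
```

Radar observations of real overlap (as in Hogan and Illingworth) can then be used to test how well such an assumption holds as layer separation grows.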

  25. Composites • We want to look at a certain kind of model system: stratocumulus regions, extratropical cyclones • An individual case may not be conclusive: is it typical? • On the other hand, general statistics may swamp this kind of system • A compositing technique can be used

  26. Composites - a cloud survey. From satellite, attempt to derive cloud-top pressure and cloud optical thickness for each pixel; the data are then divided into regimes according to the sea-level pressure anomaly, and the ISCCP simulator is applied to the model (Tselioudis et al., 2000, JCL). [Histograms: cloud-top pressure (levels at 120, 250, 350, 500, 620, 750, 900 hPa) against optical depth, for data, model and model minus data, in -ve and +ve SLP regimes] Findings: high clouds too thin, low clouds too thick. (A binning sketch follows.)
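
A sketch of the compositing step: pixel-level cloud-top pressure and optical depth are binned into a joint histogram, separately for negative and positive SLP anomalies. The bin edges below are illustrative, loosely following the pressure levels visible on the slide and ISCCP-like optical-depth boundaries.

```python
import numpy as np

# illustrative bin edges: pressure (hPa) loosely following the slide,
# optical depth loosely following ISCCP-style boundaries
ctp_edges = [50, 120, 250, 350, 500, 620, 750, 900, 1000]
tau_edges = [0.02, 1.3, 3.6, 9.4, 23.0, 60.0, 380.0]

def ctp_tau_histograms(ctp, tau, slp_anom):
    """Joint (cloud-top pressure, optical depth) frequency histograms,
    split by the sign of the sea-level-pressure anomaly per pixel."""
    hists = {}
    for name, mask in (("neg_slp", slp_anom < 0), ("pos_slp", slp_anom >= 0)):
        h, _, _ = np.histogram2d(ctp[mask], tau[mask],
                                 bins=[ctp_edges, tau_edges])
        hists[name] = h / max(mask.sum(), 1)  # frequency of occurrence
    return hists
```

Applying the same binning to the ISCCP-simulated model output gives the model and model-minus-data panels on the slide.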

  27. Composites - extratropical cyclones. Overlay about 1000 cyclones, each centred on the location of maximum optical thickness; plot the predominant cloud types by looking at anomalies from a 5-day average (high tops = red, mid tops = yellow, low tops = blue). Again: high clouds too thin, low clouds too thick. Klein and Jakob, 1999, MWR.

  28. A strategy for cloud parametrization evaluation (Jakob, thesis). Where are the difficulties?

  29. Recap: the problems • All observations: are we comparing ‘like with like’? What assumptions are contained in retrievals/variational approaches? • Long-term climatologies: which physics is responsible for the errors? Dynamical regimes can diverge • NWP, reanalysis, column models: don’t allow the interaction between the physics processes to be represented • Case studies: are they representative? Do changes translate into global skill? • Composites: as for case studies • And one more problem, specific to NWP…

  30. NWP cloud scheme development - timescale of the validation exercise • Many of the above validation exercises are complex and involved • Often the results become available O(years) after the project starts, for a single version of the model • NWP operational models are updated roughly 2 to 4 times a year, so validation results are often no longer relevant by the time they become available. Requirement: a quick and easy test bench

  31. Example: LWP, ERA-40 and recent cycles. [Maps: model 23r4 (June 2001), model 26r1 (April 2003), SSM/I observations, and the difference]

  32. Example: LWP, ERA-40 and recent cycles. [Maps: model 23r4 (June 2001), model 26r3 (Oct 2003), SSM/I observations, and the difference]

  33. Example: LWP, ERA-40 and recent cycles. [Maps: model 23r4 (June 2001), model 28r1 (Mar 2004), SSM/I observations, and the difference]

  34. Example: LWP, ERA-40 and recent cycles. [Maps: model 23r4 (June 2001), model 28r3 (Sept 2004), SSM/I observations, and the difference]

  35. Example: LWP, ERA-40 and recent cycles. [Maps: model 23r4 (June 2001), model 29r1 (Apr 2005), SSM/I observations, and the difference] Do ERA-40 cloud studies still have relevance for the operational model?

  36. So what is used at ECMWF? • T799-L91: standard “scores” (RMS, anomaly correlation of U, T, Z); “operational” validation of clouds against SYNOP observations; simulated radiances against Meteosat 7 • T159-L91 “climate” runs: 3 ensemble members of 13 months, automatically producing comparisons to: ERBE, NOAA-x, CERES TOA fluxes; QuikSCAT and SSM/I 10 m winds; ISCCP and MODIS cloud cover; SSM/I, TRMM liquid water path (soon MLS ice water content); GPCP, TRMM, SSM/I, Xie-Arkin precipitation; da Silva climatology of surface fluxes; ERA-40 analysis winds. All datasets are treated as “truth”.

  37. OLR: model (T95 L91) vs CERES. [Maps: colour scale -300 to -150 W m-2; difference map annotated “too high” / “too low”] Conclusions?

  38. SW: model (T95 L91) vs CERES. [Maps: colour scale 100 to 350; difference map annotated “albedo high” / “albedo low”] Conclusions?

  39. TCC: model (T95 L91) vs ISCCP. [Maps: colour scale 5 to 80; difference map annotated “TCC high” / “TCC low”] Conclusions?

  40. First ice validation: microwave limb sounder (MLS). [Maps at 316 hPa and 215 hPa; colour: MLS, white bars: EC IWC sampled along the MLS tracks] (A sampling sketch follows.)
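
The key methodological point is that the model is sampled only where and when the instrument looks. A nearest-neighbour version of that sampling might look like this (names are illustrative; longitude wrap-around near 0/360 is ignored for brevity):

```python
import numpy as np

def sample_along_track(field, lats, lons, track_lat, track_lon):
    """Nearest-neighbour sampling of a model field along a satellite
    track, so that model and retrieval share the same sampling.

    field     : 2-D model field (nlat, nlon), e.g. IWC at 215 hPa
    lats      : 1-D model latitudes (degrees)
    lons      : 1-D model longitudes (degrees, 0..360)
    track_lat, track_lon : 1-D footprint positions along the orbit
    """
    i = np.abs(lats[:, None] - track_lat[None, :]).argmin(axis=0)
    j = np.abs(lons[:, None] - (track_lon[None, :] % 360.0)).argmin(axis=0)
    return field[i, j]  # one model value per instrument footprint
```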

  41. TCLW (liquid water path): model (T95 L91) vs SSM/I. [Maps: colour scale 5 to 80; difference map annotated “high” / “low”] Conclusions?

  42. Map of MLS ice error

  43. Daily report, 11th April 2005. Let’s see what you think: “Going more into the details of the cyclone, it can be seen that the model was able to reproduce the very peculiar spiral structure in the cloud bands. However, large differences can be noticed further east, in the warm sector of the frontal system attached to the cyclone, where the model largely underpredicts the typical high-cloud shield. Look for example at the two maps below, where a clear deficiency of cloud cover is evident in the model-generated satellite images north of the Black Sea. In this case this was systematic over different forecasts.” (Quote from the ECMWF daily report, 11th April 2005.)

  44. Same case, water vapour channels (blue: moist, red: dry). The 30-hour forecast is too dry in the frontal region. This is not a forecast drift; does this mean the cloud scheme is at fault?

  45. Future: long-term ground-based validation, CLOUDNET • A network of stations processed for a multi-year period using identical algorithms; first in Europe, now also the ARM sites • Several European centres provide operational forecasts so that direct comparisons are made quasi-realtime • Direct involvement of the met services provides up-to-date information on model cycles

  46. Cloudnet Example • In addition to standard quicklooks, longer-term statistics are available • This example is for ECMWF cloud cover during June 2005 • Includes preprocessing to account for radar attenuation and snow • See www.cloud-net.org for more details and examples!

  47. Future: improved remote-sensing capabilities, CLOUDSAT • CloudSat is an experimental satellite that will use radar to study clouds and precipitation from space, flying in orbital formation as part of the A-Train constellation of satellites (Aqua, CloudSat, CALIPSO, PARASOL and Aura) • Launched 28th April 2006

  48. [Figures: CloudSat transect; zonal-mean cloud cover]

  49. In summary: many methods exist for examining clouds, but all too often a chasm still exists between cloud validation and parameterisation improvement!!!
