  1. SAGES Scottish Alliance for Geoscience, Environment & Society. Constraining Climate Sensitivity using Top Of Atmosphere Radiation Measurements. Simon Tett (1), Mike Mineter (1), Coralia Cartis (2), Dan Rowlands (3) & Ping Liu (2). 1: School of Geosciences, University of Edinburgh; 2: School of Mathematics, University of Edinburgh; 3: AOPP, Department of Physics, University of Oxford

  2. Aims • Report on attempts to optimise atmospheric model parameters against observed global-mean radiation measurements. • Use the results from these simulations to give a probabilistic estimate of climate sensitivity.

  3. Key results • Successfully optimised HadAM3 to the outgoing longwave (OLR) and reflected shortwave radiation (RSR) observations. • Many large-scale simulated variables correlate strongly with OLR and RSR. • There is a relationship between equilibrium climate sensitivity and simulated outgoing radiation. • Uncertainty analysis on "plausible" HadAM3 models rules out ECS > 5.6K (and < 2.7K).

  4. Estimates of Climate Sensitivity using models and observations From IPCC CH9 (fig 9.2), after Hegerl et al., Nature 2006. Bars are 5-95% range

  5. Philosophy • A model is a tool which encapsulates our (best) knowledge of the relevant physics of the system. • Future predictions are based on models. • A model is useful for a specific purpose if it is consistent with the relevant observations. • Uncertainty in future predictions arises because many models are consistent with the observations but make different predictions.

  6. Observational Data • Use the CERES (Clouds and the Earth's Radiant Energy System) record of Loeb et al, 2009. • CERES flies on the Terra & Aqua satellites • Two instruments: • 0.3 to 5 μm (SW) • 0.3 to 200 μm (Total) • Estimate LW from the difference between Total & SW • Adjusted so the net imbalance matches that estimated from ocean data. • For March 2000-Feb 2005 • Also explore sensitivity to the use of ERBE data

  7. Experimental Design • HadAM3 (atmospheric model – Pope et al, 2000) simulations forced by observed Sea Surface Temperatures & Sea Ice. • Modify four parameters in the model which are known to affect climate sensitivity: • ENTCOEF – rate at which convective plumes and their environment mix • VF1 – speed of falling precipitation • CT – rate at which water vapour is converted to precipitation • RHCRIT – critical humidity at which clouds form. • Simulations started in Dec 1998 and ran through to April 2005, starting from the same initial condition.

  8. Experimental Design (Cont.) • Modified package of forcings as in Tett et al, 2007 • Well-mixed GHG, Ozone, Land-Surface, Aerosols, Volcanoes and Total Solar Irradiance • Modifications: • Fix to a long-standing bug in SW Rayleigh scattering • Use recent values of TSI (1361 W/m2 from Kopp & Lean, 2011) • Slight changes to GHG – using observed values rather than A2. • Compare simulated global-average RSR and OLR with the CERES observations, averaged over March 2000 to Feb 2005. The first 14 months are ignored, allowing the land surface to adjust to the parameter change (see the averaging sketch below).
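The global-mean comparison above can be illustrated with a short sketch. It is illustrative only: `olr` and `lats` are hypothetical arrays (monthly-mean model output on a regular lat-lon grid and its latitudes), and the time slicing simply mirrors the averaging period described on the slide.

```python
import numpy as np

def global_mean(field, lats):
    """Area-weighted global mean of a (time, lat, lon) field, using cos(latitude) weights."""
    w = np.cos(np.deg2rad(lats))          # weight each latitude band by its area
    zonal = field.mean(axis=-1)           # average over longitude first
    return (zonal * w).sum(axis=-1) / w.sum()

# `olr` and `lats` are hypothetical monthly-mean model output and grid latitudes,
# with month 0 = Dec 1998, so months 15..74 cover Mar 2000 - Feb 2005 after the spin-up.
olr_mean = global_mean(olr, lats)[15:75].mean()
```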

  9. Optimisation Algorithm applied to HadAM3 simulations • Error = root-mean-square difference of the 5-year global-average RSR and OLR from the simulation vs the target • "Line search" • Compute finite-difference derivatives dError/dparam for each of the four parameters. The parameters need to be perturbed enough to make a reasonable estimate of the derivative; we perturbed by 5-10% of the parameter range. • Use these to compute the direction in which the Error is most reduced (see the sketch below).
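A minimal sketch of the error metric and the finite-difference derivatives, assuming a placeholder `run_model(params)` that runs HadAM3 with the given (scaled) parameters and returns its 5-year global-mean (RSR, OLR). The 7.5% step is just a value inside the 5-10% range quoted above.

```python
import numpy as np

def error(sim, target):
    """RMS difference of the simulated (RSR, OLR) pair from the target pair, in W m-2."""
    return np.sqrt(((np.asarray(sim) - np.asarray(target)) ** 2).mean())

def jacobian(run_model, params, target, step=0.075):
    """One-sided finite-difference derivatives of (RSR, OLR) w.r.t. each scaled parameter.

    `run_model(params)` is a placeholder for a HadAM3 run returning (RSR, OLR);
    `step` is the perturbation as a fraction of the unit-scaled parameter range.
    """
    base = np.asarray(run_model(params), dtype=float)
    J = np.empty((2, len(params)))
    for i in range(len(params)):
        p = np.array(params, dtype=float)
        p[i] += step                        # perturb one parameter at a time
        J[:, i] = (np.asarray(run_model(p)) - base) / step
    return J, base
```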

  10. Optimisation Algorithm (Cont.) • Parameters are scaled so they have similar magnitudes. • The Jacobian is under-determined, so regularise by adding a scaled identity matrix until it is invertible. This makes us prefer solutions where all parameters have similar magnitudes. • Do trials at 30%, 90% and the full step vector. • If these values are outside the range of permissible parameter values, we move along the boundary (sketched below).
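The regularised update and the boundary handling might look roughly like this. It is a sketch, not the study's code: `lam` is an illustrative Tikhonov factor standing in for "add a scaled identity matrix until it is invertible".

```python
import numpy as np

def regularised_step(J, residual, lam=0.1):
    """Parameter update from an under-determined Jacobian (2 targets, 4 parameters).

    Adding lam * I to J^T J (Tikhonov style) makes the normal equations invertible
    and prefers updates in which the scaled parameters change by similar amounts.
    `residual` is target minus simulated (RSR, OLR); `lam` is illustrative.
    """
    A = J.T @ J + lam * np.eye(J.shape[1])
    return np.linalg.solve(A, J.T @ residual)

def clip_to_bounds(trial, lower, upper):
    """If a trial point leaves the permissible parameter range, move it along the boundary."""
    return np.clip(trial, lower, upper)
```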

  11. Optimisation Algorithm (Cont.) • Use the smallest of the three "line search" values to update the parameters and start again. • Terminate when the error is less than a specified value (normally 1 W m-2) or there is no improvement in the line search. [Schematic: initial point, derivative step and line-search trials]
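Putting the pieces together, the outer loop might be sketched as below, reusing the helper functions from the previous sketches. The termination tolerance and the 30%/90%/100% trial fractions follow the slides; everything else is illustrative.

```python
import numpy as np

def optimise(run_model, params, target, lower, upper, tol=1.0, max_iter=10):
    """Outer loop tying together the sketches above (all names are illustrative).

    Each iteration: finite-difference Jacobian, regularised step, then trials at 30%,
    90% and 100% of the step; the best trial becomes the new parameter vector.  Stop
    when the RMS error falls below `tol` (W m-2) or the line search gives no improvement.
    """
    params = np.array(params, dtype=float)
    for _ in range(max_iter):
        J, base = jacobian(run_model, params, target)
        err = error(base, target)
        if err < tol:
            break
        step = regularised_step(J, np.asarray(target) - base)
        trials = [clip_to_bounds(params + f * step, lower, upper) for f in (0.3, 0.9, 1.0)]
        errs = [error(run_model(p), target) for p in trials]
        best = int(np.argmin(errs))
        if errs[best] >= err:               # no improvement in the line search
            break
        params = trials[best]
    return params
```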

  12. Optimisation Trials • Convergence to "zero" error for all but the CERES high-sensitivity case • The CERES case took 3 iterations • The ERBE cases took 8 & 9 iterations [Convergence plots for the low- and high-sensitivity starts with CERES and ERBE targets]

  13. Optimisation • Carried out 16 optimisation cases with the CERES target, starting from the extreme parameter choices: each of the four parameters set to its maximum or minimum value. • Many simulations failed… ("blew up") • These were redone at 75% of the distance between the reference case and the max/min values. • Then carried out a set of trials with various target values

  14. Optimisation Trials (Summary)

  15. Optimisation Summary • Failure to converge is about as (un)common as model failure. • Can adjust model parameters to produce models that are within 1 W m-2 of the observations. • Takes about 3-4 iterations. Each iteration requires 7 simulations, each of 6 years of the atmospheric model. This is very efficient…

  16. What is responsible for changes in RSR/OLR? • Little change in clear-sky OLR, due to compensation between relative humidity and temperature. • The changes in radiation arise from cloud changes.

  17. Other variables • Why did you not include more variables in the RMS error and minimise the total? • Answers: • The model needs to have a good simulation of the energy balance, as energy is key to climate. • We are interested in radiative processes, as those are key to climate sensitivity. • Other variables are strongly related to the radiation, so including them would add no new information.

  18. Predicted from RSR & OLR vs Simulated [Scatter plots of variables predicted from RSR & OLR against their simulated values, with percentages (38%-93%) for each panel] • Technical issue: need to weight to cope with uneven sampling ("Voronoi" – see later).

  19. Estimating Climate Sensitivity • Our experimental design doesn't give us an estimate of climate sensitivity. • But climateprediction.net has run 14,000 doubled-CO2 slab-model experiments (HadSM3), each 20 years long. • The equilibrium climate sensitivity (ECS) can be estimated for many parameter choices from these simulations. • We constructed an emulator of ECS for HadSM3: give the emulator the model parameters and it provides an estimate of climate sensitivity. • It is essentially sophisticated regression/interpolation and so generates some additional uncertainty (see the sketch below).
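As a rough illustration of what such an emulator does (the real one is more sophisticated), here is a simple quadratic-regression stand-in. `cpdn_params` and `cpdn_ecs` are hypothetical arrays holding the climateprediction.net parameter settings and their diagnosed sensitivities.

```python
import numpy as np
from itertools import combinations_with_replacement

def design_matrix(params):
    """Constant, linear and quadratic terms in the four (scaled) model parameters."""
    params = np.atleast_2d(params)
    cols = [np.ones(len(params))]
    cols += [params[:, i] for i in range(params.shape[1])]
    cols += [params[:, i] * params[:, j]
             for i, j in combinations_with_replacement(range(params.shape[1]), 2)]
    return np.column_stack(cols)

# `cpdn_params` (n, 4) and `cpdn_ecs` (n,) are hypothetical arrays holding the
# climateprediction.net parameter settings and their diagnosed sensitivities.
coeffs, *_ = np.linalg.lstsq(design_matrix(cpdn_params), cpdn_ecs, rcond=None)

def emulate_ecs(params):
    """Emulated ECS for new parameter choices (no emulator-error estimate in this toy version)."""
    return design_matrix(params) @ coeffs
```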

  20. Climate Sensitivity vs radiation (optimisation simulations) [Panels: ECS against outgoing LW and against reflected SW]

  21. Sampling equally in CS and from the 14000 CPDN configurations

  22. Feedback vs total outgoing radiation • Summary: there is a relationship between climate feedback/sensitivity and outgoing radiation

  23. Probabilistic ECS estimate: Schematic [Flow diagram linking model parameters, simulated and observed radiation, the uncertainty estimate, the likelihood, and the prior and posterior for S]

  24. Uncertainty estimate for OLR and RSR • Need an estimate of uncertainty to decide what is a "plausible" simulation of OLR and RSR. • Make estimates for the sources of uncertainty and sum them, assuming everything is Gaussian. • For the audience: • Anything major missing? • Anything too small? • From the "plausible" configurations we can generate a "plausible" range. • Using (simple) Bayesian reasoning, make a probabilistic estimate of ECS.

  25. Sources of Uncertainty • Consider uncertainties that don't affect climate sensitivity but could affect the outgoing radiation. Assume all uncertainties are independent multi-variate Gaussians and combine them (sketched below). • Observational uncertainty • Forcing uncertainty • Modelling uncertainty • SST uncertainty • NOT including uncertainty in model structure: results are conditional on the HadAM3 structure.
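Because every source is assumed independent and Gaussian, combining them is just a sum of covariance matrices. A tiny sketch, showing only two of the terms with the 1-sigma values quoted on the following slides:

```python
import numpy as np

# Every source of error in (RSR, OLR) is treated as an independent bivariate Gaussian,
# so the combined covariance is simply the sum of the individual covariance matrices.
# Only two terms are shown, using 1-sigma values quoted on later slides (W m-2);
# the modelling and SST pieces would be added in exactly the same way.
obs_cov     = np.diag([1.0, 1.4]) ** 2      # observational uncertainty (RSR, OLR)
forcing_cov = np.diag([1.0, 0.25]) ** 2     # forcing uncertainty (aerosol; O3 & GHG)

total_cov = obs_cov + forcing_cov           # + modelling_cov + sst_cov, ...
```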

  26. Observational uncertainty • Reflected SW Radiation (RSR): 1 Wm-2 • Outgoing LW Radiation (OLR): 1.4 Wm-2 • These are then combined with the uncertainty on the total energy leaving the Earth (0.5 Wm-2), which mainly arises from uncertainty in the incoming solar radiation.

  27. Forcing and Aerosol Uncertainty • Forcing Uncertainty (from IPCC AR4) • RSR: 1 Wm-2 (mainly aerosol uncertainty) • OLR: 0.25 Wm-2 (O3 & GHG) • Converted to TOA radiation using simulations with aerosols, human forcings, and GHG removed, then scaling the results based on a comparison between forcing and TOA fluxes. • Natural aerosol – RSR 1 Wm-2 (Carslaw et al, 2010 give about 0.5, while the Penner et al, 2006 models give about 1 Wm-2)

  28. Modelling Error • Internal climate variability: 0.1 Wm-2 in both RSR and OLR, which is negligible. • Parameter uncertainty – what about parameters that don't affect climate sensitivity but do affect the radiation? • From the climateprediction.net database, find those model configurations that have a climate sensitivity of 3.2-3.4K (the standard configuration is 3.3K). This gives 13 cases. Run the cases and compute the covariance of RSR and OLR, then treat this as another source of uncertainty.
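The parameter-uncertainty term can be estimated directly from the spread of those 13 runs; a minimal sketch, assuming a hypothetical `rsr_olr` array of their global-mean values:

```python
import numpy as np

# `rsr_olr` is a hypothetical (13, 2) array of global-mean (RSR, OLR) from the 13
# configurations whose ECS lies in 3.2-3.4 K; the spread between them gives the
# parameter-uncertainty covariance, added to the combined covariance like any other term.
param_cov = np.cov(rsr_olr, rowvar=False)   # 2x2 covariance of (RSR, OLR)
```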

  29. SST Uncertainty

  30. SST Uncertainty • Computed: • The difference between two successive Hadley climatologies is less than 0.2K over most of the globe. • Being conservative, estimate the SST 1-sigma error as 0.2K. • Increase HadISST by 0.5K over non-sea-ice points, run the standard configuration and compute the difference in RSR and OLR from the reference case (-0.4 & 1.2 W/m2), then scale for the 0.2K case. • Replacing the HadISST dataset with the Reynolds dataset (AVHRR + buoys) has negligible impact on OLR and RSR

  31. Sources of Uncertainty

  32. Which Simulations are consistent with Observations?

  33. Which HadAM3 climate sensitivities are consistent with observations? • Find all configurations that have OLR and RSR consistent with the observations at the 95% level (see the sketch below). • What is the range of climate sensitivities? • CERES: 3.0-4.1 K • ERBE: 3.0-5.1 K
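One plausible reading of the 95% consistency test is a chi-squared cut on the Mahalanobis distance under the combined covariance; the exact test used in the study may differ.

```python
import numpy as np
from scipy.stats import chi2

def consistent(sim, obs, cov, level=0.95):
    """Is a simulated (RSR, OLR) pair consistent with the observations at `level`?

    Uses the squared Mahalanobis distance under the combined error covariance,
    compared with the chi-squared threshold for 2 degrees of freedom.
    """
    d = np.asarray(sim, dtype=float) - np.asarray(obs, dtype=float)
    return d @ np.linalg.solve(cov, d) <= chi2.ppf(level, df=2)
```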

  34. Generating A PDF for Climate Sensitivity

  35. Generating a PDF for Climate Sensitivity • Want to use the uncertainty estimate and model configurations to generate a PDF • Issue: model configurations were not generated randomly and their density varies enormously • Assume properties vary smoothly and explore the impact of different prior assumptions • Use Bayes theorem to compute P(O|Si) P(Si) / P(O), which is the same as P(Si|O) (sketched below)
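In code form, the posterior weight of each configuration is just likelihood times prior, normalised. This sketch assumes a Gaussian likelihood built from the combined covariance above; names and structure are illustrative.

```python
import numpy as np
from scipy.stats import multivariate_normal

def posterior_weights(sims, obs, cov, prior):
    """Un-normalised Bayes update P(O|Si) P(Si) for each configuration, then normalised.

    `sims` is an (n, 2) array of simulated (RSR, OLR), `obs` the observed pair, `cov`
    the combined error covariance and `prior` an (n,) array of prior weights (one of
    the choices on the later slide).
    """
    likelihood = multivariate_normal(mean=obs, cov=cov).pdf(sims)
    w = likelihood * prior
    return w / w.sum()                      # proportional to P(Si|O)
```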

  36. Calculating Likelihoods for each configuration [Figure: likelihood f(r) for each configuration, weighted by the areas of the Voronoi polygons (grey), capped at π]
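The Voronoi weighting corrects for uneven sampling in (RSR, OLR) space. A simple Monte-Carlo stand-in (not the study's method) estimates each capped cell area by nearest-neighbour counting inside a disc:

```python
import numpy as np

def capped_voronoi_areas(points, centre, radius=1.0, nsample=50_000, rng=None):
    """Monte-Carlo estimate of the (capped) Voronoi-cell area around each configuration.

    Samples uniformly inside a disc around `centre` in (RSR, OLR) space and assigns
    each sample to its nearest configuration; a cell's area is the fraction of samples
    it captures times the disc area, capped at pi as on the slide.
    """
    rng = np.random.default_rng(rng)
    r = radius * np.sqrt(rng.random(nsample))            # uniform over the disc
    theta = 2 * np.pi * rng.random(nsample)
    samples = np.asarray(centre) + np.column_stack([r * np.cos(theta), r * np.sin(theta)])
    d2 = ((samples[:, None, :] - np.asarray(points)[None, :, :]) ** 2).sum(axis=-1)
    counts = np.bincount(d2.argmin(axis=1), minlength=len(points))
    return np.minimum(counts / nsample * np.pi * radius ** 2, np.pi)
```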

  37. Computing a probability distribution • Individual model realisations were not randomly generated, though there are 16 different initial conditions and 5 different targets • Explore several different prior assumptions on the individual realisations: • Uniform – all configurations equally likely • Radiation – OLR and RSR equally likely • Parameter – parameter values equally likely • S – ECS values equally likely in range • 1/S – all feedbacks equally likely in range
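Two of the five priors can be sketched as configuration weights; "uniform" is trivial, and "S" is approximated here by down-weighting densely sampled ECS values. This is an illustration, not the paper's construction.

```python
import numpy as np

def prior_weights(kind, ecs, nbins=20):
    """Prior weight for each configuration: only the "uniform" and "S" choices are sketched.

    "uniform" gives every configuration the same weight; "S" approximates equal
    probability per unit of climate sensitivity by down-weighting configurations
    that sit in densely sampled ECS bins.
    """
    ecs = np.asarray(ecs, dtype=float)
    if kind == "uniform":
        return np.full(len(ecs), 1.0 / len(ecs))
    if kind == "S":
        hist, edges = np.histogram(ecs, bins=nbins)
        idx = np.clip(np.digitize(ecs, edges) - 1, 0, nbins - 1)
        w = 1.0 / hist[idx]                 # inverse of the local sampling density in S
        return w / w.sum()
    raise ValueError(f"prior not sketched here: {kind}")
```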

  38. Cumulative Distribution Functions

  39. Taking Account of Emulator Uncertainty • Emulator error varies with climate sensitivity. • Compute its impact by assuming the error is coherent over all samples and Monte-Carlo sampling (modifying the estimated sensitivity of each configuration, as sketched below).
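A hedged sketch of the coherent Monte-Carlo propagation: each draw shifts all emulated sensitivities together by one shared standard-normal number scaled by a sensitivity-dependent error, and the weighted range is averaged over draws. Function and argument names, and the pooling choice, are illustrative.

```python
import numpy as np

def range_with_emulator_error(ecs, weights, sigma_of_ecs, ndraw=1000, q=(0.05, 0.95), rng=None):
    """Monte-Carlo propagation of emulator error, treated as coherent across configurations.

    Each draw perturbs every configuration's emulated ECS by the *same* standard-normal
    number scaled by a sensitivity-dependent 1-sigma error `sigma_of_ecs(ecs)`; the
    weighted 5-95% range is then averaged over draws.
    """
    rng = np.random.default_rng(rng)
    ranges = []
    for _ in range(ndraw):
        z = rng.standard_normal()                        # one shared error => coherent
        ecs_pert = ecs + z * sigma_of_ecs(ecs)
        order = np.argsort(ecs_pert)
        cdf = np.cumsum(weights[order])
        ranges.append(np.interp(q, cdf, ecs_pert[order]))
    return np.mean(ranges, axis=0)
```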

  40. Taking account of Emulator Error [CDF including emulator error; resulting range 2.7-4.2 K]

  41. Sensitivity Studies • Use the ERBE observations • Amplify the observational covariance by a factor of 20 – makes CERES and ERBE consistent… • Amplify the total covariance by two. • And some more, including using a single year rather than 5. • Only the first three make any difference.

  42. Cumulative Distribution Functions [CDFs for the sensitivity studies; ranges: 2 x Cov: 2.7-4.9 K, 20 x Sat: 2.7-5.2 K, ERBE: 2.9-5.5 K]

  43. Summary • Found we could automatically tune HadAM3 to fit the TOA radiation observations. • The variation is largely due to changes in clouds, with clear-sky changes cancelling. • Get to within 1 W/m2 RMS with a median of 3-4 iterations. • Other climatologies are strongly related to RSR/OLR. • Made an estimate of the uncertainty in the model-data difference, which is dominated by modelling uncertainty rather than observational uncertainty. • Used this to make a probabilistic estimate of climate sensitivity for HadAM3. • This is 2.7-4.2 K for CERES, with little sensitivity to prior assumptions. • Amplifying the observational error to make CERES and ERBE consistent gives a range of 2.6-5.4K, with considerable sensitivity to the prior.

  44. Thoughts • Straightforward to tune models to the global SW & LW radiation • Could probably do it with a series of short (18-month) simulations • It matters for three reasons: • Coupled models with a poor radiation balance will drift (or need flux correction) • It is related to feedbacks in HadAM3 (and probably other models) • It modifies the climate in SST-forced simulations – the energy balance is probably the physical reason.

  45. Next stages • Couple the tuned atmosphere models to an ocean. • Do they behave as well as we expect? • Explore methods to allow more parameters and observations • Would allow better constraints, and exploring which parameters and observations really matter. • Bringing observations and modelling together in a coherent way. • Which observations are sufficiently independent? • Parameter importance from dError/dparam

  46. Next Stages (cont.) • Use optimisation to efficiently generate atmospheric model configurations appropriately sampled from the model/obs uncertainty estimate • Allows generation of uncertainty in future model properties, not just ECS. Would be helped by reasonable first guesses. • Use optimisation to efficiently generate models of differing resolution that are consistent (on large scales) with observations and each other • Allows exploration of the impact of resolution change without errors in the energy balance/basic climatology

  47. Thanks and Questions

  48. Failures

  49. Equi-finality • Can different parameter choices result in models with similar climatologies? • Investigate this by looking at the final configurations of the CERES and ERBE optimisation cases, plus cases within 1 W/m2 of the target.

  50. Precip vs Temp
