
Entering the cauldron of climate change: Science & Policy






Presentation Transcript


  1. Entering the cauldron of climate change: Science & Policy Robert Rosner Astrophysics/Physics/Harris Policy School The University of Chicago Fermi National Accelerator Laboratory February 17, 2016

  2. It appears that this topic is controversial … (Wall Street Journal Online) "Among abstracts that expressed a position on AGW [Anthropogenic, or human-caused, Global Warming], 97.1% endorsed the scientific consensus. Among scientists who expressed a position on AGW in their abstract, 98.4% endorsed the consensus.” J. Cook et al., Env. Research Letters 8, No. 2 (June 2013)

  3. So before we begin, let me be clear about where I stand … • I believe that existing data is convincing that anthropogenically-driven climate change is currently underway, and poses serious public policy issues • However, prediction of when and where climate impacts will arise on decadal scales is currently not possible, and may be impossible

  4. What I’ll talk about … • The uncomfortable intersection between physics (especially computational physics), climate science and public policy … • The science … • The policy/science intersection: how we as scientists struggle to convey the science honestly and transparently, with the goal of positively affecting the creation and execution of policy …

  5. The context … • The re-visit of the 2007/2010/2012 ‘climate statement’ by the American Physical Society (APS) • The APS ‘statement process’: • The APS Panel on Public Affairs (POPA) is responsible for originating and formulating APS statements … • APS Bylaws require revisits of current statements by POPA every 5 years • I chaired POPA in CY 2013 … served on the subcommittee(s) delegated to deal with this issue, and was on the ‘writing committee’ charged with its ‘final’ revision in 2014 • The APS finished the story roughly one year later …

  6. The Timeline … APS Council has approved the proposed statement!

  7. What APS/POPA did … • Steering committee: • M. Beasley (2014 APS President), K. Kirby (APS EO), S. Koonin (then, POPA Chair-elect), F. Slakey (APS Assoc. Director, Public Affairs), R. Jaffe (2014 POPA Chair) • POPA Subcommittee (“Review Committee”, reporting to the POPA Steering Committee) • Members: P. Coyle, R.S. Kemp, S. Koonin, T. Meyer, R. Rosner, S. Seestrom • Actions • Process: How should we go about deciding what to do? • Initial perspective • Actions can include: (i) doing nothing; (ii) endorsing the existing statement; (iii) modifying the existing statement; (iv) writing a new (replacement) statement • Workshop at NYU – January 2014 • Invitees: John Christy (U. Alabama/Huntsville), Bill Collins (UC Berkeley/LBNL), Judy Curry (Georgia Tech), Isaac Held (Princeton), Dick Lindzen (MIT), Ben Santer (LLNL) • Initial presentation of 0th-order draft statement to POPA – February 2014 • Revision process, with new ‘writing group’ (R.S. Kemp, W. McCurdy, S. Pratt, R. Rosner, J. Wells) … S. Koonin withdrew from POPA (before publishing the WSJ op-ed …), and the POPA vote took place on Oct. 3, 2014 … then on to the APS • Then – on to the rest of the APS process …

  8. The final agreed-upon text … 15.3 STATEMENT ON EARTH'S CHANGING CLIMATE (Adopted by Council on November 14, 2015) On Climate Change: Earth's changing climate is a critical issue and poses the risk of significant environmental, social and economic disruptions around the globe. While natural sources of climate variability are significant, multiple lines of evidence indicate that human influences have had an increasingly dominant effect on global climate warming observed since the mid-twentieth century. Although the magnitudes of future effects are uncertain, human influences on the climate are growing. The potential consequences of climate change are great and the actions taken over the next few decades will determine human influences on the climate for centuries. On Climate Science: As summarized in the 2013 report of the Intergovernmental Panel on Climate Change (IPCC), there continues to be significant progress in climate science. In particular, the connection between rising concentrations of atmospheric greenhouse gases and the increased warming of the global climate system is more compelling than ever. Nevertheless, as recognized by Working Group 1 of the IPCC, scientific challenges remain in our abilities to observe, interpret, and project climate changes. To better inform societal choices, the APS urges sustained research in climate science. On Climate Action: The APS reiterates its 2007 call to support actions that will reduce the emissions, and ultimately the concentration, of greenhouse gases as well as increase the resilience of society to a changing climate, and to support research on technologies that could reduce the climate impact of human activities. Because physics and its techniques are fundamental elements of climate science, the APS further urges physicists to collaborate with colleagues across disciplines in climate research and to contribute to the public dialogue. http://www.aps.org/policy/statements/15_3.cfm

  9. First, some background … • The APS “climate statement” … • What was the original “current” statement? • What was the problem?

  10. The original APS statement on climate change … The evidence is incontrovertible: Global warming is occurring. http://www.aps.org/policy/statements/07_1.cfm

  11. The Cambridge English Dictionary and Karl Popper incontrovertible • adjective [not gradable] US /ɪnˌkɑn·trəˈvɜr·tə̬·bəl/ FML › impossible to doubt because obviously true: • incontrovertible proof Karl Popper & Falsifiability: • In its basic form, falsifiability is the belief that for any hypothesis to have credence, it must be inherently disprovable before it can become accepted as a scientific hypothesis or theory.

  12. The original APS statement on climate change … v2 Further edits: February 2012 http://www.aps.org/policy/statements/climate.cfm The Talmudic approach to solving the problem: If you cannot change the “Ur-text”, envelop it in commentary that’s longer than the original statement …

  13. What was wrong? • Three things: • First, the word ‘incontrovertible’ … • Second, while the observational evidence is very strong that the Earth is absorbing more energy than it is emitting, and that human activity is now the dominant driver (that is, the Earth must be asymptotically warming), certain aspects of “climatology” turn out to be controversial within the physics community, raising questions about our ability to predict when and where climate change and its impacts will be felt … • Third, the physics community is hugely conflicted on the question of how (or even whether) to air discussions about climate change in the public realm …

  14. What exactly were – and still are – the issues? • The physics issues … • The politics of the Intergovernmental Panel on Climate Change (IPCC) 5th Assessment … • The political environment … • So, let’s start with the physics …

  15. Building “validated” simulations for natural phenomena … • Simulations of complex multi-scale multi-physics systems of necessity involve “models” for physics that currently cannot be captured from first principles … • There are two distinct issues: • Spatial or temporal scales too extensive to capture all relevant physics, given our current computational capabilities … • or • We do not fully understand the physics • In either case, we end up resorting to phenomenological models that aim to capture observed physical behavior

  16. And, in either case … • The models must be validated – and calibrated … • Such modeling is very familiar in the engineering domain ... • For example, bridge design involves • Models for buckling of steel beams ... • Models for breaking strength of cables ... • ... • and bridge designers rely heavily on experimental data that underpin such models – and nevertheless build in considerable safety margins because the applicability limits on such models under conditions that lie outside experimentally-validated domains are highly uncertain – engineers hate to extrapolate ...

  17. In other words: Models have knobs, and the knobs must be tuned to work appropriately …
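
To make the “knob tuning” idea concrete, here is a minimal sketch (in Python, with an entirely made-up one-parameter model and synthetic “observations”) of what calibration means in practice: choose the knob value that best reproduces the data within the observed domain.

```python
# Minimal sketch of "tuning a knob": calibrating a single phenomenological
# parameter against observations by least squares. The model form and the
# synthetic "observations" are illustrative assumptions, not any particular
# climate or engineering model.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic observations over the calibration domain (assumed for illustration)
x_obs = np.linspace(0.0, 1.0, 20)
true_k = 2.5
y_obs = true_k * x_obs + rng.normal(scale=0.1, size=x_obs.size)

def model(x, k):
    """Phenomenological model with one knob, k."""
    return k * x

# Least-squares estimate of the knob (closed form for this linear model)
k_hat = np.sum(x_obs * y_obs) / np.sum(x_obs * x_obs)
resid = y_obs - model(x_obs, k_hat)
print(f"calibrated knob k ≈ {k_hat:.2f}, RMS misfit ≈ {np.sqrt(np.mean(resid**2)):.3f}")
```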

  18. Computational climate modeling: modeling physics on “missing” spatial scales … [Schematic: the range of spatial scales, from the physically smallest relevant (viscous) scale, through the smallest scale resolvable by direct numerical simulations (‘DNS’), up to the physically largest relevant scale (the Earth); subgrid models cover the scales below the DNS cutoff.]
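
A back-of-the-envelope illustration of why subgrid models are unavoidable; the specific scale values below are illustrative assumptions, not precise figures.

```python
# Rough, assumption-laden estimate of how many grid cells a true DNS would
# need in order to resolve everything from a viscous scale up to Earth scale.
viscous_scale_m = 1e-3      # assumed smallest physically relevant scale (~1 mm)
earth_scale_m = 1e7         # assumed largest relevant scale (~Earth radius)
practical_cells = 1e12      # assumed rough ceiling for today's largest grids

cells_per_dimension = earth_scale_m / viscous_scale_m
dns_cells_3d = cells_per_dimension ** 3
print(f"DNS would need ~{dns_cells_3d:.0e} cells; "
      f"feasible grids are ~{practical_cells:.0e} -> subgrid models are unavoidable")
```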

  19. Modeling physics we don’t fully understand … • Astrophysics and climate science regularly deal with physics issues that are not yet fully understood from first principles … • In astrophysics: • The origin of magnetic fields – in stars, in galaxies, ... • ... • In climate science: • Cloud dynamics … • Deep ocean vertical mixing … • Evolution of glaciers and ice sheets … • ... • In both domains, one uses phenomenological models, based on current observations … • In both fields, we are forced to extrapolate into temporal and spatial domains for which observational constraints are not possible ...

  20. It’s the model building and the knob tuning that is the real challenge … Examples of sources of uncertainty: Problems with data – 1. Missing components or errors in the data; 2. “Noise” in the data associated with biased or incomplete observations; 3. Random sampling error and biases (non-representativeness) in a sample. Problems with models – 4. Known processes but unknown functional relationships or errors in the structure of the model; 5. Known structure but unknown or erroneous values of some important parameters; 6. Known historical data and model structure, but reasons to believe parameters or model structure will change over time; 7. Uncertainty regarding the predictability (e.g., chaotic or stochastic behavior) of the system or effect; 8. Uncertainties introduced by approximation techniques used to solve a set of equations that characterize the model. Other sources of uncertainty – 9. Ambiguously defined concepts and terminology; 10. Inappropriate spatial/temporal units; 11. Inappropriateness of/lack of confidence in underlying assumptions; 12. Uncertainty due to projections of human behavior (e.g., future consumption patterns, or technological change), which is distinct from uncertainty due to “natural” sources (e.g., climate sensitivity, chaos). Moss & Schneider (2000)

  21. Why am I focusing on these three issues? Because • We cannot validate/calibrate these models under conditions that potentially differ substantially from current conditions … • And, therefore, we cannot construct formal model errors … We know that prediction – meaning, saying something about the future, with computed uncertainty bounds included – is not possible.

  22. Parameter estimation – in pictures: the problem of anticipating as yet unobserved conditions [Schematic: a calibrated parameter domain in the (Parameter #1, Parameter #2) plane, with the best-fit parameter values marking the “model point design”; the climate system may evolve toward conditions lying outside this calibrated domain.]
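
The following sketch (an invented two-parameter toy model, not a climate model) illustrates the point of this slide: parameter choices that fit the calibrated domain essentially equally well can give very different answers once the model is pushed into as-yet-unobserved conditions.

```python
# Illustrative toy model and numbers: two parameter sets that fit the
# calibration domain comparably well, yet diverge badly under extrapolation.
import numpy as np

rng = np.random.default_rng(1)

# Calibration domain: small forcing, where the two terms are hard to separate
x_cal = np.linspace(0.0, 1.0, 30)
y_cal = 1.0 * x_cal + 0.05 * x_cal**2 + rng.normal(scale=0.05, size=x_cal.size)

def model(x, a, b):
    return a * x + b * x**2

# Two hand-picked parameter pairs that both fit the calibration data well
candidates = [(1.05, 0.0), (0.95, 0.15)]
for a, b in candidates:
    rms_cal = np.sqrt(np.mean((model(x_cal, a, b) - y_cal) ** 2))
    y_far = model(5.0, a, b)   # "unobserved conditions": extrapolate to x = 5
    print(f"a={a:.2f}, b={b:.2f}: calibration RMS={rms_cal:.3f}, "
          f"prediction at x=5 -> {y_far:.1f}")
```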

  23. Can inter-simulation comparisons overcome these difficulties? In the absence of data capable of validating the internal physics models, it is common to use inter-simulation comparisons, together with comparisons to historical data … [Figure: observed and simulated global mean surface air temperature vs. time]

  24. We tried the experiment: Can code comparisons reveal whether the codes are “correct”? • The ‘alpha group’ code-comparison exercise focused on the nonlinear evolution of the Rayleigh-Taylor (R-T) instability [Schematic: a heavy fluid resting on top of a light fluid under gravity g; courtesy Guy Dimonte]

  25. The idea … • Guy Dimonte designed the experiment, and carried it out. He measured the width of the mixing zone as a function of time. • Six different code groups simulated this experiment, and computed the mixing-layer width as a function of time • We then compared (by overlaying transparent foils/viewgraphs) • The 6 code results • The 6 code results with the experimental data • The mixing-layer width grows as h ≈ α A g t², where α is the dimensionless growth coefficient and A is the Atwood number, a measure of the density contrast between the two fluids
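
To see why a spread in the calibrated growth coefficient matters, here is a small numerical sketch based on h ≈ α A g t²; the α values, Atwood number, and time below are illustrative assumptions, not the published Alpha-Group results.

```python
# Because h grows as alpha * A * g * t**2, any spread in the calibrated growth
# coefficient alpha between codes (or between codes and experiment) compounds
# quadratically in time. All numbers below are assumed for illustration.
A = 0.5          # Atwood number (density contrast), assumed
g = 9.81         # gravitational acceleration, m/s^2
alphas = {"code A": 0.025, "code B": 0.030, "experiment": 0.055}  # assumed

t = 2.0  # seconds
for label, alpha in alphas.items():
    h = alpha * A * g * t**2
    print(f"{label}: h(t={t} s) ≈ {h:.2f} m")
```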

  26. The comparison … The results: Unfortunately, successful comparison of codes against one another cannot, by itself, increase confidence in the results … Reference: Dimonte, G., Youngs, D.L., Dimits, A., Weber, S., Marinak, M., Wunsch, S., Garasi, C., Robinson, A., Andrews, M.J., Ramaprabhu, P., Calder, A.C., Fryxell, B., Biello, J., Dursi, L.J., MacNeice, P., Olson, K., Ricker, P., Rosner, R., Timmes, F.X., Tufo, H., Young, Y.-N., & Zingale, M. 2004, “A Comparative Study of the Turbulent Rayleigh-Taylor (RT) Instability Using High-Resolution 3D Numerical Simulations: The Alpha-Group Collaboration,” Phys. Fluids, 16(5), 1668–1693.

  27. What does all this mean for computing uncertainties for climate models? • In order to be predictive, models must be validated and calibrated within the domain of application … • In the absence of validation, one cannot compute model uncertainties • The aforementioned models cannot be validated across the full range of expected climate variability – because the validation data simply do not exist … • Hence, in the absence of uncertainty quantification for the aforementioned models internal to climate simulations, one cannot quantify the uncertainties for climate simulations … • This is an important reason why the global climate model calculations shown in the IPCC 5th Assessment Working Group I report are not treated as predictions but rather as ‘scenarios’ …

  28. … which gets us to the politics of the IPCC 5th Assessment … • The 5th Assessment WG 1 acknowledged significant differences in the capabilities of the large variety of extant climate simulation codes, but ignored these differences when providing best estimates (viz., the CMIP5 mean surface air temperature …) … • The WG 1 report used verbal qualifiers in order to indicate their confidence in results: these verbal qualifiers were converted to numerical ‘confidence levels’ in the WG 1 Summary for Policy Makers (SPM) document … [Figure: code performance chart from IPCC 5th Assessment, WG 1, p. 766, showing modeled data set vs. simulation code used]

  29. IPCC 5th Assessment: Background Box SPM.3 – Communication of the Degree of Certainty in Assessment Findings … • The degree of certainty in each key finding of the assessment is based on the type, amount, quality, and consistency of evidence (e.g., data, mechanistic understanding, theory, models, expert judgment) and the degree of agreement. The summary terms to describe evidence are: limited, medium, or robust; and agreement: low, medium, or high. • Confidence in the validity of a finding synthesizes the evaluation of evidence and agreement. Levels of confidence include five qualifiers: very low, low, medium, high, and very high. • The likelihood, or probability, of some well-defined outcome having occurred or occurring in the future can be described quantitatively through the following terms: virtually certain, 99–100% probability; extremely likely, 95–100%; very likely, 90–100%; likely, 66–100%; more likely than not, >50–100%; about as likely as not, 33–66%; unlikely, 0–33%; very unlikely, 0–10%; extremely unlikely, 0–5%; and exceptionally unlikely, 0–1%. Unless otherwise indicated, findings assigned a likelihood term are associated with high or very high confidence. Where appropriate, findings are also formulated as statements of fact without using uncertainty qualifiers. • Within paragraphs of this summary, the confidence, evidence, and agreement terms given for a bold key finding apply to subsequent statements in the paragraph, unless additional terms are provided. Moss, R.H. and Schneider, S.H. (2000)
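
Since the slide quotes the IPCC calibrated-likelihood vocabulary explicitly, it can be written down as a small lookup table; the helper that maps a numerical probability back to the strongest applicable term is an illustrative convenience, not part of the IPCC guidance.

```python
# The IPCC likelihood terms and probability ranges quoted on this slide.
# Boundaries are treated inclusively here for simplicity (the IPCC text uses
# ">50%" for "more likely than not").
LIKELIHOOD_TERMS = [          # (term, lower bound %, upper bound %)
    ("virtually certain",      99, 100),
    ("extremely likely",       95, 100),
    ("very likely",            90, 100),
    ("likely",                 66, 100),
    ("more likely than not",   50, 100),
    ("about as likely as not", 33,  66),
    ("exceptionally unlikely",  0,   1),
    ("extremely unlikely",      0,   5),
    ("very unlikely",           0,  10),
    ("unlikely",                0,  33),
]

def likelihood_term(p_percent: float) -> str:
    """Return the most specific IPCC likelihood term whose range contains p."""
    for term, lo, hi in LIKELIHOOD_TERMS:
        if lo <= p_percent <= hi:
            return term
    return "about as likely as not"

print(likelihood_term(97))   # -> extremely likely
print(likelihood_term(40))   # -> about as likely as not
```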

  30. Here is what Moss & Schneider (2000) argued … It is certainly true that “science” itself strives for objective empirical information to test theory and models. But at the same time “science for policy” must be recognized as a different enterprise than “science” itself, since science for policy … involves being responsive to policymakers’ needs for expert judgment at a particular time, given the information currently available, even if those judgments involve a considerable degree of subjectivity. … The key point is that authors should explicitly state what sort of approach they are using in a particular case: if frequentist statistics are used the authors should explicitly note that, and likewise if the probabilities assigned are subjective, that too should be explicitly indicated. Transparency is the key in all cases.

  31. So where did all this leave us? • The data is convincing: climate change is underway … • The data show convincingly that anthropogenic influences are playing a major role, quite independent of any modeling … • The models show increasing climate impacts … • It has proven impossible to construct scenarios that avoid serious global climate change impacts … even drastic model changes in the most challenging physics domains (viz., deep ocean mixing) lead to solutions that have major impacts – for example, damaging impacts can occur locally even if global mean temperatures remain ‘flat’ • The physicists’ dilemma: We’re challenged in communicating our concerns to the public, both here in the U.S. and abroad … • In the present polarized political climate, any reservations regarding the IPCC 5th Assessment – including the comments I’ve made so far – can come across as support for the ‘climate deniers’ … which it absolutely is not …

  32. My concern … • Maintaining credibility for the science community is imperative … • Apocalyptic “predictions” not well-founded on science are not helpful … • Even the ‘climate believers’ remain torn between economic impacts today versus in the more distant (and uncertain) future … • Despite the positive-sounding rhetoric, the published Intended Nationally Determined Contributions (INDCs) prepared for COP21 (= Paris) are highly inadequate – though they do “get us off the dime” … • As scientists, we’re late in developing an explanatory framework that is true to the science, and also provides a convincing foundation for discussing the risks incurred by inadequate responses to climate change …

  33. Lest you think this is all a tempest in a teapot … • The Italian “National Commission for the Forecast and Prevention of Major Risk” was tasked in 2008 with looking at possible future major disasters, including earthquakes. • The Commission reported on 31 March 2009 in L’Aquila that the risk of a large earthquake within a year in the L’Aquila region was small. • On 6 April 2009, a magnitude 5.8–5.9 (Richter scale) earthquake shook L’Aquila, in Abruzzo, just east of Rome, killing 308 people. • In 2011, 6 scientists and one ex-government official of the Commission were charged with, and then on 22 October 2012 convicted of, multiple manslaughter, and sentenced to six years’ imprisonment. [BBC News, 22 October 2012: “L'Aquila quake: Italy scientists guilty of manslaughter” – Six Italian scientists and an ex-government official have been sentenced to six years in prison over the 2009 deadly earthquake in L'Aquila. A regional court found them guilty of multiple manslaughter. Prosecutors said the defendants gave a falsely reassuring statement before the quake, while the defence maintained there was no way to predict major quakes.]

  34. Ultimately, things got better … but ... [Nature News, 10 November 2014, Alison Abbott & Nicola Nosengo: “Italian seismologists cleared of manslaughter” – Appeals court says six scientists did not cause deaths in 2009 L'Aquila earthquake and cuts sentence of a government official.] • Are we, as scientists, to be held legally accountable for research work and professional advice supporting policy decisions? • How do we – and how will we – deal with fundamental disconnects on issues of risk and uncertainty assessment between scientists, government officials, and the public?

  35. Discussion/Questions/Defamations …

  36. The APS process for “Statements” • Most APS “statements” are developed by POPA • POPA is expected to review all current APS statements every 5 years • POPA recommendations are forwarded to the APS Council, which comments … and forwards to the APS Executive Board (e-Board) … • The e-Board reviews and comments; a final draft is then submitted to POPA for approval: • If approved, the final draft is sent to the APS Council, as well as to the full APS membership, for comments. • All received comments need to be addressed (by POPA); and the draft revised if necessary (also by POPA). Once this final revision is finished, the statement is ready for APS Council consideration: The Council makes the final determination on whether the statement is approved and posted on the APS website.
