Development of neural network emulations of model physics components for improving the computational performance of the NCEP seasonal climate forecasts. P.I.s: M. Fox-Rabinovitz, V. Krasnopolsky; Co-I.s: S. Lord, Y.-T. Hou; Collaborator: A. Belochitski; CTB contact: H.-L. Pan.


Presentation Transcript


  1. Development of neural network emulations of model physics components for improving the computational performance of the NCEP seasonal climate forecasts
  P.I.s: M. Fox-Rabinovitz, V. Krasnopolsky; Co-I.s: S. Lord, Y.-T. Hou; Collaborator: A. Belochitski; CTB contact: H.-L. Pan
  Acknowledgments: The authors would like to thank Drs. H.-L. Pan, S. Saha, S. Moorthi, and M. Iredell for their useful consultations and discussions. The research is supported by the NOAA CPO CDEP CTB grant NA06OAR4310047.
  NOAA CDEP CTB SAB Meeting, August 28-29, 2007, Silver Spring, MD

  2. OUTLINE
  • Neural Network applications to the CFS model: current methodological developments and development of NN emulations for the LWR of the CFS model
  • Background information on the NN approach
  • Development of NN emulations of LWR (Long-Wave Radiation) for the CFS model and evaluation of their accuracy vs. the original LWR
  • Initial validation of NN emulations of LWR through the CFS model run using the LWR NN emulations vs. the control run using the original LWR
  • Conclusions and plans

  3. Current Developments
  • Development of the NN methodology for LWR NN emulations for CFS
  • Development of an NN experimentation and validation framework
  • Creation of training and validation data sets from 2-year CFS model simulations
  • Development of NN emulation versions for LWR and validation of their accuracy vs. the original LWR
  • Initial validation of NN emulations through CFS model runs with the NN emulations vs. the control CFS model run with the original LWR
  • Analysis of initial results for seasonal predictions, short- to medium-range forecasts, and climate simulations

  4. Background
  • Any parameterization of model physics is a relationship or MAPPING (continuous or almost continuous) between two vectors: a vector of input parameters, X, and a vector of output parameters, Y: Y = F(X)
  • An NN is a generic approximation for any continuous or almost continuous mapping given by a set of its input/output records: SET = {Xi, Yi}, i = 1, …, N

  5. Neural Network: Y = FNN(X), a continuous input-to-output mapping built from neurons (see the schematic on slide 29 and the sketch below).
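As a hedged sketch of the explicit form: for a single-hidden-layer multilayer perceptron (the architecture shown on slide 29), with the hyperbolic-tangent activation and the weight symbols a, b assumed here for illustration,

\[
y_q = F_{NN,q}(X) = a_{q0} + \sum_{j=1}^{k} a_{qj}\, t_j, \qquad
t_j = \tanh\!\Bigl(b_{j0} + \sum_{i=1}^{n} b_{ji} x_i\Bigr), \qquad q = 1, \dots, m,
\]

where X = (x1, …, xn) are the NN inputs, Y = (y1, …, ym) the NN outputs, and k the number of hidden neurons.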

  6. Major Advantages of NNs:
  • NNs are generic, very accurate and convenient mathematical (statistical) models which are able to emulate numerical model components that are complicated nonlinear input/output relationships (continuous or almost continuous mappings).
  • NNs are robust with respect to random noise and fault-tolerant.
  • NNs are analytically differentiable (training, error and sensitivity analyses): an almost free Jacobian! (A sketch of this calculation follows the list.)
  • NN emulations are accurate and fast, but there is NO FREE LUNCH: training is a complicated and time-consuming nonlinear optimization task; however, training needs to be done only once for a particular application.
  • NNs are well-suited for parallel and vector processing.
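As a minimal, hedged sketch of the "almost free Jacobian" point (not the project's operational code; the weight shapes and names are assumptions), the analytic Jacobian of a one-hidden-layer tanh MLP emulation follows directly from the chain rule:

```python
import numpy as np

def mlp_forward(x, W1, b1, W2, b2):
    """One-hidden-layer tanh MLP: x (n_in,), W1 (k, n_in), b1 (k,), W2 (n_out, k), b2 (n_out,)."""
    t = np.tanh(W1 @ x + b1)           # hidden-neuron outputs t_j
    return W2 @ t + b2, t              # emulated outputs Y and hidden state

def mlp_jacobian(x, W1, b1, W2, b2):
    """Analytic Jacobian dY/dX, shape (n_out, n_in): W2 * diag(1 - t**2) @ W1."""
    _, t = mlp_forward(x, W1, b1, W2, b2)
    return (W2 * (1.0 - t ** 2)) @ W1  # chain rule through the tanh layer

# Tiny usage example with random weights (5 inputs, 4 hidden neurons, 3 outputs)
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 5)), rng.normal(size=4)
W2, b2 = rng.normal(size=(3, 4)), rng.normal(size=3)
J = mlp_jacobian(rng.normal(size=5), W1, b1, W2, b2)   # (3, 5) sensitivity matrix
```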

  7. NN Emulations of Model Physics Parameterizations: Learning from Data. (Schematic: within the GCM, the original parameterization F and its NN emulation FNN both map inputs X to outputs Y; the emulation is trained on the set …, {Xi, Yi}, … saved from the parameterization. A training sketch follows.)
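A minimal sketch of this "learning from data" step, assuming the saved records are available as NumPy arrays and using scikit-learn's MLPRegressor as a stand-in for the project's own training code (file names and hyperparameters are illustrative):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Assumed file names: inputs/outputs of the original LWR saved during CFS runs.
X = np.load("lwr_inputs.npy")    # shape (N, n_in)  - parameterization inputs Xi
Y = np.load("lwr_outputs.npy")   # shape (N, n_out) - parameterization outputs Yi

# Half of the records for training, the other half for independent validation.
N = X.shape[0]
X_train, Y_train = X[: N // 2], Y[: N // 2]
X_valid, Y_valid = X[N // 2 :], Y[N // 2 :]

emulator = MLPRegressor(hidden_layer_sizes=(100,), activation="tanh", max_iter=500)
emulator.fit(X_train, Y_train)          # the one-time, expensive training step
rmse = np.sqrt(np.mean((emulator.predict(X_valid) - Y_valid) ** 2))
print(f"validation RMSE: {rmse:.3g}")
```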

  8. CFS Model: LWR NN Emulation
  NN dimensionality and other parameters:
  • 591 inputs: 12 variables (pressure, T, moisture, cloudiness parameters, surface emissivity, gases (ozone, CO2))
  • 69 outputs: 6 variables (heating rates, fluxes)
  • Number of neurons for NN versions: 50 to 150
  • NN dimensionality for the complex system: 50,000 to 100,000 (see the parameter-count sketch below)
  • Training and testing data sets are produced by saving inputs and outputs of LWR during 2-year T126L64 CFS simulations; half of the data is used for training and the other half for validation, i.e., estimation of NN accuracy vs. the original LWR
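As a rough, hedged cross-check of these numbers (assuming a fully connected one-hidden-layer network; the slide's exact bookkeeping may differ):

```python
def n_parameters(n_in: int, k: int, n_out: int) -> int:
    """Weights plus biases of a fully connected one-hidden-layer network."""
    return (n_in + 1) * k + (k + 1) * n_out

for k in (50, 100, 150):
    print(k, n_parameters(591, k, 69))
# k = 150 gives ~99,000 parameters, the upper end of the 50,000-100,000 range above;
# smaller networks (k = 50) come in around 33,000.
```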

  9. NN Approximation Accuracy (on an independent data set) vs. the Original Parameterization (all in K/day; a sketch of the error statistics follows).
  NN computational performance: LWR NN emulations are two orders of magnitude faster than the original LWR.
  Overall CFS model computational performance: ~25-30% faster when using LWR NN emulations vs. the original LWR.
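A minimal sketch of the kind of accuracy statistics summarized here (bias and RMSE of NN-emulated heating rates vs. the original LWR, in K/day); the array names are illustrative, not the project's:

```python
import numpy as np

def emulation_errors(hr_original, hr_nn):
    """hr_*: heating-rate arrays, shape (n_profiles, n_levels), in K/day."""
    diff = hr_nn - hr_original
    bias = float(diff.mean())                  # systematic error
    rmse = float(np.sqrt((diff ** 2).mean()))  # root-mean-square error
    return bias, rmse
```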

  10. Individual HR Profiles

  11. Top of Atmosphere Upward LWR Flux Global Seasonal and Daily Differences

  12. T-850 Global Seasonal & Daily Temperatures and their Differences. Max Difference = 0.1 K; Max Difference = 0.06 K

  13. Surface Downward LWR Flux Differences Season 2: 0-5 W/m², max 10-20 W/m² Season 4: 0-5 W/m², max 10-20 W/m²

  14. Top of Atmosphere Upward LWR Flux Differences Season 2: 0-5 W/m², max 10-20 W/m² Season 4: 0-5 W/m², max 10-20 W/m²

  15. T Zonal Mean Differences. Season 1: 0 – 0.5 K, max 1.5-2 K; Season 2: 0 – 0.5 K, max 1.5-2.5 K; Season 3: 0 – 0.5 K, max 1.5-2 K; Season 4: 0 – 0.5 K, max 2-3 K

  16. U Zonal Mean Differences. Season 1: 0 – 1 m/s, max 3 - 4 m/s; Season 2: 0 – 1 m/s, max 2 m/s; Season 3: 0 – 1 m/s, max 2 m/s; Season 4: 0 – 1 m/s, max 2 m/s

  17. V Zonal Mean Differences. Season 1: 0 – 0.1 m/s, max 0.2 - 0.4 m/s; Season 2: 0 – 0.1 m/s, max 0.2 - 0.3 m/s; Season 3: 0 – 0.1 m/s, max 0.2 - 0.4 m/s; Season 4: 0 – 0.1 m/s, max 0.2 - 0.4 m/s

  18. T-500 Differences. Season 1: 0 - 1 K, max 2-3 K; Season 2: 0 - 1 K, max 2-4 K; Season 3: 0 - 1 K, max 2-3 K; Season 4: 0 - 1 K, max 2-3 K

  19. Day Two: Upward Top of Atmosphere LWR Flux (Orig. - NN). Differences near 0 - 2 W/m², a few minor maxima of 10 - 20 W/m²

  20. Day Two: T-850. Differences near 0 – 0.2 K, max 0.5 – 1.5 K

  21. Day Two: U-850. Differences 0 - 0.1 m/s, max 0.5 – 1 m/s

  22. Day Seven: T-850. Differences near 0 – 1.0 K, max 2 - 3 K

  23. 2-Year Mean OLR (Upward Top of Atmosphere LWR Flux), in W/m². Differences 0-5 W/m², max 10-20 W/m²

  24. Recent Journal and Conference Papers
  Journal Papers:
  • V.M. Krasnopolsky, M.S. Fox-Rabinovitz, and A. Belochitski, 2007: "Compound Parameterization for a Quality Control of Outliers and Larger Errors in NN Emulations of Model Physics", Neural Networks, submitted
  • V.M. Krasnopolsky, 2007: "Neural Network Emulations for Complex Multidimensional Geophysical Mappings: Applications of Neural Network Techniques to Atmospheric and Oceanic Satellite Retrievals and Numerical Modeling", Reviews of Geophysics, in press
  • V.M. Krasnopolsky, 2007: "Reducing Uncertainties in Neural Network Jacobians and Improving Accuracy of Neural Network Emulations with NN Ensemble Approaches", Neural Networks, 20, pp. 454-46
  • V.M. Krasnopolsky and M.S. Fox-Rabinovitz, 2006: "Complex Hybrid Models Combining Deterministic and Machine Learning Components for Numerical Climate Modeling and Weather Prediction", Neural Networks, 19, 122-134
  • V.M. Krasnopolsky and M.S. Fox-Rabinovitz, 2006: "A New Synergetic Paradigm in Environmental Numerical Modeling: Hybrid Models Combining Deterministic and Machine Learning Components", Ecological Modelling, 191, 5-18
  Conference Papers:
  • V.M. Krasnopolsky, M.S. Fox-Rabinovitz, Y.-T. Hou, S.J. Lord, and A.A. Belochitski, 2007: "Development of Fast and Accurate Neural Network Emulations of Long Wave Radiation for the NCEP Climate Forecast System Model", submitted to the NOAA 32nd Annual Climate Diagnostics and Prediction Workshop
  • V.M. Krasnopolsky, M.S. Fox-Rabinovitz, Y.-T. Hou, S.J. Lord, and A.A. Belochitski, 2007: "Accurate and Fast Neural Network Emulations of Long Wave Radiation for the NCEP Climate Forecast System Model", submitted to the 20th Conference on Climate Variability and Change, New Orleans, January 2008
  • M.S. Fox-Rabinovitz, V. Krasnopolsky, and A. Belochitski, 2006: "Ensemble of Neural Network Emulations for Climate Model Physics: The Impact on Climate Simulations", Proc. 2006 International Joint Conference on Neural Networks, Vancouver, BC, Canada, July 16-21, 2006, pp. 9321-9326, CD-ROM

  25. Conclusions
  • The developed NN emulations of LWR for the CFS model show high accuracy and computational efficiency.
  • Initial validation of the NN emulations of LWR, through CFS model runs using the NN emulations vs. the control CFS run with the original LWR, shows a close similarity of the runs: the differences are mostly within observational errors or the uncertainty of observational data or reanalyses.
  • For seasonal predictions: differences do not grow from season 1 to season 4 and are mostly within observational errors or the uncertainty of observational data or reanalyses.
  • For climate simulations: differences are mostly within observational errors or the uncertainty of observational data or reanalyses.
  • For short- to medium-range forecasts: differences are only a small fraction (Day 2) to a fraction (Day 7) of observational errors or the uncertainty of observational data or reanalyses.
  • Potential applications to GFS and/or DAS?

  26. Near-term (FY07 and Year-2 of the project) science plans
  • Completing work on the LWR NN emulation
  • Generating more representative data sets
  • Continuing training and validation of LWR NN emulations for the CFS model
  • Continuing experimentation and validation of seasonal climate predictions with LWR NN emulations
  • Refining the NN methodology for emulating model physics
  • Work on an NN ensemble approach aimed at improving the accuracy of NN emulations
  • Developing a compound parameterization for quality control (QC) and for dynamical adjustment of the NN emulations
  • Refining the experimentation and validation framework
  • Continuing development of NN emulations for the CFS model radiation block
  • Analysis of CFS SWR, generating initial training data sets, and developing initial SWR NN emulations
  • Initial experiments with the SWR NN emulation for the CFS model
  • Initial development of the project web site and its links to relevant NCEP and/or CTB web sites

  27. Future (FY08 and Year-3 of the project) science plans
  • Completing work on the SWR NN emulation for the CFS model
  - Training and validation of SWR NN emulations
  - Performing seasonal climate predictions with the SWR NN emulation
  • Performing extensive CFS seasonal climate predictions with the developed NN emulations for the CFS full radiation block (LWR & SWR), and validating their overall impact and computational efficiency
  • Completing the transition of the developed NN radiation products into the NCEP operational CFS
  • Completing development of the project web site and using it for interactions with potential NCEP users and other users in educational and research communities
  • Preparation for future developments: other CFS model physics components, and potential applications to other NCEP systems like GFS and climate predictions

  28. Additional Plots

  29. NN - Continuous Input to Output Mapping, Y = FNN(X). (Schematic of a multilayer perceptron: feed-forward and fully connected, with input, hidden, and output layers; hidden neurons tj; the analytic Jacobian is indicated. The corresponding Jacobian formula is sketched below. Credit: V. Krasnopolsky & M. Fox-Rabinovitz, Neural Networks for Model Physics.)
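For reference, a hedged sketch of how the "almost free" Jacobian follows from the single-hidden-layer form given after slide 5 (the symbols a, b and the tanh activation are assumptions used for illustration):

\[
\frac{\partial y_q}{\partial x_i} \;=\; \sum_{j=1}^{k} a_{qj}\,\bigl(1 - t_j^{2}\bigr)\, b_{ji},
\qquad t_j = \tanh\!\Bigl(b_{j0} + \sum_{i=1}^{n} b_{ji} x_i\Bigr).
\]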

  30. RH Zonal Mean Differences. Season 1: 0 - 1%, max 4 - 5%; Season 4: 0 - 2%, max 4 - 6%

  31. T-850 Differences. Season 1: 0 - 1 K, max 2-4 K; Season 2: 0 - 1 K, max 2-4 K; Season 3: 0 - 1 K, max 2-4 K; Season 4: 0 - 1 K, max 2-3 K

  32. Day Two: Surface Upward LWR Flux (Orig. - NN). Differences near 0 - 2 W/m², a few minor maxima of 10 - 30 W/m²
