
Gap filling using a Bayesian-regularized neural network



Presentation Transcript


  1. Gap filling using a Bayesian-regularized neural network. B.H. Braswell, University of New Hampshire

  2. Proper Credit. MacKay DJC (1992) A practical Bayesian framework for backpropagation networks. Neural Computation, 4, 448–472. Bishop C (1995) Neural Networks for Pattern Recognition. New York: Oxford University Press. Nabney I (2002) NETLAB: Algorithms for Pattern Recognition. Advances in Pattern Recognition Series. New York: Springer-Verlag.

  3. Two-layer ANN is a nonlinear regression

  4. Two-layer ANN is a nonlinear regression [figure annotations: output layer usually linear; hidden layer usually nonlinear, e.g., tanh()]
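The architecture on this slide can be sketched in a few lines of NumPy (the shapes and names here are illustrative, not from the talk): inputs pass through a tanh hidden layer, then a linear output layer.

```python
import numpy as np

def ann_forward(x, W1, b1, W2, b2):
    """Two-layer network: nonlinear hidden layer (tanh), linear output."""
    h = np.tanh(x @ W1 + b1)   # hidden activations -- usually nonlinear
    return h @ W2 + b2         # output -- usually linear

# Tiny example: 2 inputs, 3 hidden nodes, 1 output
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(2, 3)), np.zeros(3)
W2, b2 = rng.normal(size=(3, 1)), np.zeros(1)
y = ann_forward(np.array([[0.5, -1.0]]), W1, b1, W2, b2)
```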

  5. Neural networks are efficient with respect to the number of estimated parameters. Consider a problem with d input variables. Polynomial of order M: Np ~ d^M. Neural net with M hidden nodes: Np ~ d·M.
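The scaling claim is easy to check directly. The exact polynomial count is the number of monomials of degree at most M in d variables, C(d+M, M), which grows like d^M for fixed M; the two-layer net needs (d+1)·M weights into the hidden layer plus M+1 to the output:

```python
from math import comb

def poly_params(d, M):
    # Monomials of total degree <= M in d variables: C(d+M, M) ~ d**M for fixed M
    return comb(d + M, M)

def ann_params(d, M, outputs=1):
    # Weights and biases of a two-layer net with M hidden nodes
    return (d + 1) * M + (M + 1) * outputs

# e.g., d = 10 inputs, M = 3:
n_poly = poly_params(10, 3)   # cubic polynomial
n_ann = ann_params(10, 3)     # net with 3 hidden nodes
```

For ten inputs, the cubic polynomial already needs hundreds of coefficients while the small net needs a few dozen weights.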

  6. Avoiding the problem of overfitting: • Early stopping • Regularization • Bayesian methods

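In the MacKay framework cited on slide 2, regularization amounts to minimizing β·E_D + α·E_W: a weighted sum of the sum-of-squares data error and a penalty on weight magnitude. A minimal sketch of that objective (the full Bayesian treatment also re-estimates α and β from the data, which is not shown here):

```python
import numpy as np

def regularized_loss(y_pred, y_obs, weights, alpha, beta):
    """MacKay-style objective beta*E_D + alpha*E_W:
    E_D = sum-of-squares data error, E_W = weight-magnitude penalty."""
    E_D = 0.5 * np.sum((y_pred - y_obs) ** 2)
    E_W = 0.5 * sum(np.sum(w ** 2) for w in weights)
    return beta * E_D + alpha * E_W
```

A large α/β ratio favors small weights and smooth fits; a small ratio trusts the data more, which is exactly the overfitting trade-off the three bullets address.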

  9.–14. [Six figure-only slides, labeled 1 through 6; no recoverable text.]

  15. Previous Work. Hagen SC, Braswell BH, Frolking SE, Richardson A, Hollinger D, Linder E (2006) Statistical uncertainty of eddy flux based estimates of gross ecosystem carbon exchange at Howland Forest, Maine. Journal of Geophysical Research, 111. Braswell BH, Hagen SC, Frolking SE, Salas WE (2003) A multivariable approach for mapping subpixel land cover distributions using MISR and MODIS: An application in the Brazilian Amazon. Remote Sensing of Environment, 87:243–256.

  16. ANN Regression for Land Cover Estimation [diagram: inputs Band1–Band4; outputs Forest Fraction, Secondary Fraction, Cleared Fraction]. Training data supplied by classified ETM imagery.

  17. ANN Regression for Land Cover Estimation

  18. ANN Estimation of GEE and Resp, with Monte Carlo simulation of total prediction uncertainty [diagram labels: Clim, Flux]

  19. ANN Estimation of GEE and Resp, with Monte Carlo simulation of total prediction uncertainty [figure: weekly GEE from Howland Forest, ME, based on NEE]

  20. Some demonstrations of the MacKay/Bishop ANN regression with 1 input and 1 output
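A toy version of these 1-input, 1-output demonstrations can be reproduced with plain gradient descent on a weight-decayed two-layer net. This uses ordinary fixed-α regularization rather than the full MacKay/Bishop evidence machinery; the 0.10 noise level matches the slides, and everything else (target function, node count, learning rate) is illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(-1, 1, 50)[:, None]
y = np.sin(3 * x) + 0.10 * rng.normal(size=x.shape)   # Noise = 0.10

H, alpha, beta, lr = 8, 1e-3, 1.0, 0.05
W1, b1 = rng.normal(scale=0.5, size=(1, H)), np.zeros(H)
W2, b2 = rng.normal(scale=0.5, size=(H, 1)), np.zeros(1)

for _ in range(5000):
    h = np.tanh(x @ W1 + b1)                 # hidden layer
    pred = h @ W2 + b2                       # linear output
    err = (pred - y) / len(x)                # gradient of mean-squared data error
    gW2 = beta * h.T @ err + alpha * W2      # weight decay term = alpha * w
    gb2 = beta * err.sum(0)
    dh = beta * (err @ W2.T) * (1 - h ** 2)  # backprop through tanh
    gW1 = x.T @ dh + alpha * W1
    gb1 = dh.sum(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

rmse = float(np.sqrt(np.mean((pred - y) ** 2)))
```

Raising the noise scale or the weight-decay strength reproduces the qualitative behavior the demo slides show: noisier data and stronger regularization both pull the fit toward smoother curves.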

  21. Noise=0.10

  22. Linear Regression Noise=0.10

  23. ANN Regression Noise=0.10

  24. ANN Regression Noise=0.02

  25. ANN Regression Noise=0.20

  26. ANN Regression Noise=0.20

  27. ANN Regression Noise=0.10

  28. ANN Regression Noise=0.05

  29. ANN Regression Noise=0.05

  30. ANN Regression Noise=0.05

  31. Issues associated with multidimensional problems • Sufficient sampling of the input space • Data normalization (column mean zero and standard deviation one) • Processing time • Algorithm parameter choices
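The normalization bullet can be made concrete; returning the column means and standard deviations lets the transform be undone after prediction (function name and NaN handling are illustrative choices):

```python
import numpy as np

def standardize(X):
    """Column-standardize: zero mean, unit standard deviation.
    NaN-aware, so gappy flux columns can be scaled before filling."""
    mu = np.nanmean(X, axis=0)
    sd = np.nanstd(X, axis=0)
    return (X - mu) / sd, mu, sd

X = np.array([[1.0, 10.0], [2.0, 20.0], [3.0, 30.0]])
Z, mu, sd = standardize(X)
# Undo with Z * sd + mu after predicting in standardized units.
```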

  32. Our gap-filling algorithm • Assemble meteorological and flux data in an N×d table • Create five additional columns for sin() and cos() of time of day and day of year, and potential PAR • Standardize all columns • First iteration: identify columns with no gaps; use these to fill all the others, one at a time • Create an additional column, NEE(t−1), flux lagged by one time interval • Second iteration: remove filled points from the NEE time series, refill with all other columns
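The column-construction step above can be sketched as follows. The column names, the fixed 45° latitude, and the simple solar-geometry proxy for potential PAR are all illustrative assumptions, not the project's actual code:

```python
import numpy as np
import pandas as pd

def add_time_columns(df, hour_col="hour", doy_col="doy"):
    """Append the five periodic/astronomical predictor columns:
    sin/cos of time of day, sin/cos of day of year, and potential PAR."""
    out = df.copy()
    out["sin_tod"] = np.sin(2 * np.pi * df[hour_col] / 24)
    out["cos_tod"] = np.cos(2 * np.pi * df[hour_col] / 24)
    out["sin_doy"] = np.sin(2 * np.pi * df[doy_col] / 365)
    out["cos_doy"] = np.cos(2 * np.pi * df[doy_col] / 365)
    # Potential PAR proxy: cosine of the solar zenith angle, clipped at zero.
    # Fixed latitude is a placeholder; a real run would use site coordinates.
    lat = np.deg2rad(45.0)
    decl = np.deg2rad(23.45) * np.sin(2 * np.pi * (284 + df[doy_col]) / 365)
    hour_angle = np.pi * (df[hour_col] - 12) / 12
    cos_zenith = (np.sin(lat) * np.sin(decl)
                  + np.cos(lat) * np.cos(decl) * np.cos(hour_angle))
    out["ppar"] = np.maximum(cos_zenith, 0.0)
    return out
```

The periodic encodings keep midnight adjacent to 23:30 and day 365 adjacent to day 1, which a raw hour or day-of-year column would not.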

  33. Room for Improvement • Don’t extrapolate wildly, revert to time-based filling in areas with low sampling density, especially at the beginning and end of the record • Carefully evaluate the sensitivity to internal settings (e.g., alpha, beta, Nnodes) • Stepwise analysis for relative importance of driver variables • Migrate to C or other faster environment • Include uncertainty estimates in the output • At least, clean up the code and make it available to others in the project, and/or broader community
