Downscaling of European Land Use Projections for the ALARM Toolkit

This joint work between UCL and BioSS aims to generate CORINE-scale projections of European land use under the ALARM scenarios, by statistically downscaling the coarse-resolution projections onto a finer spatial grid.


Presentation Transcript


  1. Downscaling of European land use projections for the ALARM toolkit Joint work between UCL: Nicolas Dendoncker, Mark Rounsevell, Patrick Bogaert; and BioSS: Adam Butler, Glenn Marion. Athens ALARM meeting, January 2007

  2. Overview • Current day land use data are available at a relatively fine spatial resolution (e.g. CORINE), but land use projections within ALARM are generated at a much coarser resolution • There is a need to convert these projections onto finer spatial scales in a way that properly reflects the statistical properties of high-resolution land use maps • Using the downscaling method developed by Dendoncker et al. (2006) we aim to generate CORINE-scale projections of European land use under the ALARM scenarios

  3. Assumptions • Downscaling introduces additional error into land use projections - the unavoidable price of looking at a high spatial resolution • The method relies on the assumption that the overall frequencies associated with the different land use types will change in the future, but that the spatial structure of the landscape will not • Current day land use is assumed to be known without error • It is explicitly assumed that urban areas will remain urban

  4. Outputs • Land use maps at the CORINE scale (250 × 250 m) for the years 2000-2100 under each of the ALARM scenarios, in terms of four broad land use types: urban, arable, grassland & forestry • Maps could potentially be adapted to provide information on more detailed land classes (e.g. forest & grassland types, individual crops) - but this would require additional assumptions and/or additional data • Generic tools for applying land use downscaling to other data, & training materials to illustrate how the downscaling methods work

  5. Some potential uses • …as projected environmental data for local field studies, e.g. FSN • …as fine-scale inputs to mechanistic ecological models, e.g. LPJ • …as land use inputs for climate envelope analyses • …as a resource for future ecological research, via the toolkit

  6. Unresolved issues • How can we best deal with land use classes that are only present in future scenarios e.g. surplus land, biofuels? • How can we best deal with protected areas e.g. NATURA sites? • How should we visualise downscaled land use maps, and how can we best incorporate these into the toolkit? • Should we try to quantify & represent the uncertainties involved in land use projection? If so, how?

  7. Example: Luxembourg

  8. Statistical methodology Developed at UCL by Dendoncker et al. (2006): (1) Fit a multinomial autologistic regression model to the current CORINE land use data, which are at high spatial resolution; (2) Use the ALARM scenario data to calculate the marginal probabilities that will be associated with the different land use classes in future, ensuring that these vary smoothly over space; (3) Combine these two sources of information using Bayes' theorem, in order to estimate the conditional probabilities associated with each of the land use classes at high spatial resolution; (4) Take the projected land class for a CORINE cell to be the class that has the highest conditional probability for that cell

  9. 1. Current land use • Data: $x_{ik} = 1$ if CORINE cell $i$ has land use class $k$, and $x_{ik} = 0$ otherwise • Model: we assume that the probability that cell $i$ belongs to class $k$, conditionally upon the values of $x_{Ik}$ for all other cells $I$, is $P(x_{ik} = 1 \mid x_{Ik}, I \ne i) = \exp(\beta_k n_{ik}) / \sum_l \exp(\beta_l n_{il})$, where $n_{ik}$ denotes the number of cells in the neighbourhood of cell $i$ that belong to class $k$ and where the $\beta_k$ are unknown parameters • The marginal probability associated with class $k$ is equal to the overall frequency of that class, $p_k = \sum_i x_{ik} / N$
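As a concrete illustration of this step, here is a minimal Python/NumPy sketch (the project itself is implemented in SAS and Matlab; the function names, the 4-neighbour definition, the toroidal wrap-around of np.roll at the grid edges, and the toy parameter values are all illustrative assumptions, and the $\beta_k$ are taken as already fitted):

```python
import numpy as np

def autologistic_conditionals(land_use, beta):
    """P(cell has class k | its neighbours) under a multinomial
    autologistic model with per-class parameters beta_k.

    land_use : 2-D integer array of current class labels 0..K-1
    beta     : length-K array of (already fitted) parameters
    Returns an array of shape (rows, cols, K).
    """
    K = beta.size
    rows, cols = land_use.shape
    # n[.., k] = number of 4-neighbours with class k; np.roll wraps at
    # the grid edges, a toroidal simplification kept for brevity
    n = np.zeros((rows, cols, K))
    for shift in [(-1, 0), (1, 0), (0, -1), (0, 1)]:
        shifted = np.roll(land_use, shift=shift, axis=(0, 1))
        for k in range(K):
            n[:, :, k] += (shifted == k)
    logits = beta * n                              # beta_k * n_ik
    logits -= logits.max(axis=2, keepdims=True)    # numerical stability
    p = np.exp(logits)
    return p / p.sum(axis=2, keepdims=True)

# Toy usage on a hypothetical 3-class landscape (beta values made up):
rng = np.random.default_rng(1)
land_use = rng.integers(0, 3, size=(100, 100))
p_cond = autologistic_conditionals(land_use, beta=np.array([0.8, 0.5, 0.3]))
# Current-day marginal p_k = overall frequency of class k
p_marg = np.bincount(land_use.ravel(), minlength=3) / land_use.size
```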

  10. 2. Future land use • Data: $f_{jk}$ = fraction of ALARM cell $j$ that is projected to have land use class $k$ (for a particular future year under a particular scenario) • We assume that the marginal probability that CORINE cell $i$ will have land class $k$ in future is equal to $m_{ik} = \sum_j d_{ij}^{-q} f_{jk} / \sum_j d_{ij}^{-q}$, where $d_{ij}$ is the distance between the midpoints of cells $i$ and $j$ • Using a weighted sum ensures smoothness; the value of $q$ controls how smooth we would like the probability surface to be • If cell $i$ has the same midpoint as ALARM cell $j$ then $m_{ik} = f_{jk}$
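A sketch of this smoothing step, again in NumPy. The inverse-distance-weighting form above is a reading consistent with the stated properties (smooth, with $m_{ik} = f_{jk}$ at coincident midpoints); the function and argument names are illustrative:

```python
import numpy as np

def smooth_marginals(fine_xy, coarse_xy, f, q=2.0):
    """Future marginals m[i, k] for each CORINE cell, as an
    inverse-distance-weighted sum of the ALARM cell fractions f[j, k].

    fine_xy   : (N, 2) CORINE cell midpoints
    coarse_xy : (M, 2) ALARM cell midpoints
    f         : (M, K) projected class fractions f_jk
    q         : exponent controlling the smoothness of the surface
    """
    # d[i, j] = distance between midpoints of CORINE cell i, ALARM cell j
    d = np.linalg.norm(fine_xy[:, None, :] - coarse_xy[None, :, :], axis=2)
    # Tiny floor on d so a coincident midpoint (d = 0) receives a weight
    # that dwarfs all others, recovering m_ik = f_jk for that cell
    w = 1.0 / np.maximum(d, 1e-9) ** q
    w /= w.sum(axis=1, keepdims=True)
    return w @ f        # (N, K) array of smoothed marginals
```

Note that the dense N × M distance matrix built here would be infeasible at a pan-European scale; in practice each CORINE cell would be matched only against nearby ALARM cells, which is part of the computational problem discussed on the final slide.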

  11. 3. Bayes theorem • Using Bayes' theorem we can calculate the future conditional probability that CORINE cell $i$ belongs to land use class $k$ as $C_{ik} \propto P(x_{ik} = 1 \mid x_{Ik}, I \ne i) \, m_{ik} / p_k$ • For each cell $i$ we need to rescale the $C_{ik}$ so that they sum to one, and so are valid probabilities; however, this means that the marginal probabilities will no longer be equal to the values $m_{ik}$ that we computed from the ALARM scenario data • We can use an iterative procedure to ensure that the rescaled conditional probabilities also respect these marginal probabilities: we alternate between rescaling the $C_{ik}$ to sum to one within each cell and rescaling them to match the target marginals, until the marginal probabilities have approximately converged to $m_{ik}$ after $T$ steps
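A sketch of the combination step. The Bayes update $C_{ik} \propto P_{ik} m_{ik} / p_k$ follows from the slide's description, but the exact update equations of the iterative rescaling are not recoverable from the transcript, so the alternation below (normalise within cells, then scale each class toward its landscape-average target marginal) is one plausible iterative-proportional-fitting reading, not the published scheme:

```python
import numpy as np

def bayes_downscale(p_cond, m, p_marg, n_iter=100, tol=1e-6):
    """Combine current-day conditionals with future marginals.

    p_cond : (N, K) conditionals P_ik from the autologistic model
    m      : (N, K) target future marginals m_ik
    p_marg : (K,)  current-day marginal frequencies p_k
    """
    # Bayes' theorem: C_ik proportional to P_ik * m_ik / p_k
    C = p_cond * m / p_marg
    for _ in range(n_iter):
        C /= C.sum(axis=1, keepdims=True)   # valid probabilities per cell
        achieved = C.mean(axis=0)           # landscape-average marginals
        target = m.mean(axis=0)
        if np.abs(achieved - target).max() < tol:
            break
        C *= target / achieved              # pull marginals back toward m
    return C / C.sum(axis=1, keepdims=True)
```

Note that the division by $p_k$ breaks down for classes absent from the current map ($p_k = 0$), which is exactly the "land use classes only present in future scenarios" issue flagged on the unresolved-issues slide.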

  12. 4. Prediction • Finally, we predict that CORINE cell $i$ will belong to the class that has the highest associated conditional probability, so that $\hat{k}_i = \arg\max_k C_{ik}$
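Continuing the hypothetical sketches above, the final step is a per-cell argmax over the converged conditionals (p_cond, m, p_marg as produced by the earlier sketches):

```python
# Flatten the grid, combine via Bayes' theorem, then take the class with
# the highest conditional probability in each CORINE cell
C = bayes_downscale(p_cond.reshape(-1, 3), m, p_marg)
predicted = C.argmax(axis=1).reshape(land_use.shape)   # projected class map
```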

  13. Computation • The procedure is not inherently expensive to run, but the vast size of the baseline CORINE dataset means that computational issues will be the key technical problem in applying it at a pan-European scale • Currently implemented using a mixture of: • SAS, to fit the multinomial autologistic regression model • Matlab, for the remaining steps • We are looking at the feasibility of porting the code to R, with most of the heavy internal computation being done in Fortran 90, so that it could easily be integrated into the toolkit • Could calculations be done online, via the ALARM map portal?
