LANCE UWG Meeting, November 16, 2010. UWG Member: Robert Brakenridge. Affiliation: University of Colorado. Director, Dartmouth Flood Observatory, http://floodobservatory.colorado.edu/. Email: [email protected]. LANCE Data Products.
Automated mapping of surface water changes using the LANCE MODIS rapid response processor.
MODIS data (four scenes, Terra and Aqua, over two days) are used to classify water pixels using an algorithm developed at the Dartmouth Flood Observatory. Data are processed at NASA's Goddard Space Flight Center, and transmitted to the Flood Observatory for display here.
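The source does not spell out the DFO classification rules, but combining four scenes (Terra and Aqua over two days) suggests a per-pixel compositing step. Below is a minimal sketch, assuming a hypothetical rule that a pixel must be classified as water in a minimum number of scenes to suppress cloud-shadow false positives; the function name and threshold are illustrative, not the actual DFO algorithm.

```python
import numpy as np

def composite_water(scene_water_flags, min_detections=3):
    """Combine per-scene water classifications into one water map.

    scene_water_flags: list of boolean arrays (one per MODIS scene),
    True where that scene classified a pixel as water.

    Requiring water in at least min_detections scenes is an assumed
    rule for suppressing cloud-shadow false positives; the real DFO
    criteria are not given in the source.
    """
    stack = np.stack(scene_water_flags)   # shape: (scenes, rows, cols)
    counts = stack.sum(axis=0)            # water detections per pixel
    return counts >= min_detections       # composite water map
```

A pixel flagged in three of the four scenes would be kept, while a one-scene detection (typically a shadow artifact) would be dropped.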
Authorship: Sun, J., Slayback, D., Policelli, F., Brakenridge, G., and Kettner, A., 2010, "NASA Experimental Flood Maps, Current Floodwater", http://floodobservatory.colorado.edu/LanceModis.html.
Project supported by NASA grant NNX09AV26G to Dartmouth College (G.R. Brakenridge). The overall principal investigator is F. Policelli, NASA Goddard Space Flight Center.
DFO Map (updated during major flooding)
GSFC automated flood water mapping.
No current flooding: no water pixels beyond the reference water mask.
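The flood test described here is a comparison against the reference water mask: only water detected outside normal (reference) water counts as flooding. A minimal sketch of that comparison, assuming boolean rasters on a common grid (the function name is illustrative):

```python
import numpy as np

def flooded_land(current_water, reference_water):
    """Flag flooded land: pixels that are water now but are not
    water in the reference mask (e.g. the SRTM water mask).
    'No current flooding' means this array is all False."""
    return current_water & ~reference_water
```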
Flooding occurs annually in this region (large variability in surface water extent). Past flooding is illustrated in the DFO map.
Prior events can be illustrated by year, as here, or merged into the “maximum imaged inundation” (next slide).
Here, the SRTM water mask is used as the reference water.
Maximum imaged inundation grows each year as new floods are mapped.
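Merging annual flood layers into the "maximum imaged inundation" amounts to a per-pixel logical OR across years: a pixel belongs to the merged layer if it was flooded in any mapped year, which is why the layer can only grow. A sketch under that assumption:

```python
import numpy as np

def maximum_imaged_inundation(annual_flood_layers):
    """Merge per-year flood maps (boolean arrays) into the
    period-of-record maximum imaged inundation. Adding a new
    year's layer can only add pixels, never remove them."""
    return np.logical_or.reduce(annual_flood_layers)
```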
Current flooding is shown in dark blue; it will then be included in the 2010 map layer.
II. MODIS 250 m “mean water extent” (wet season and dry season) water masks are critical for determining flood magnitude.
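The source does not define how a seasonal "mean water extent" mask is derived; one plausible construction is to keep pixels that are water in at least some fraction of a season's observations. A sketch with an assumed 0.5 cutoff (both the cutoff and the function name are illustrative):

```python
import numpy as np

def seasonal_mean_water_mask(daily_water_stack, min_fraction=0.5):
    """Derive a seasonal water mask from a stack of daily boolean
    water maps (shape: days x rows x cols). A pixel is kept if it
    was water in at least min_fraction of the season's observations;
    the 0.5 threshold is an assumption, not a documented value."""
    fraction = np.mean(daily_water_stack, axis=0)
    return fraction >= min_fraction
```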
III. Annual and period-of-record “maximum (flood) and minimum (hydrologic drought)” water extents also need to be mapped; these data products will be facilitated by automated, daily updating.
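Automated daily updating lends itself to maintaining these extremes as running rasters: each day's water map can only expand the maximum and shrink the minimum. A minimal sketch of that update step, assuming boolean rasters on a common grid:

```python
import numpy as np

def update_extremes(max_extent, min_extent, daily_water):
    """Fold one day's water map into the running extremes.

    max_extent (flood maximum) gains any newly wet pixels;
    min_extent (hydrologic-drought minimum) keeps only pixels
    that have been water on every day so far."""
    return max_extent | daily_water, min_extent & daily_water
```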
Waning stages: dark blue is flooded land; red was recently flooded; lighter reds show flooding in previous years.
LANCE data will allow routine, global recording of these major events in near real time.