
Applying and Interpreting the SWAT Sensitivity Analysis and Auto-calibration Tools



Presentation Transcript


  1. Applying and Interpreting the SWAT Sensitivity Analysis and Auto-calibration Tools by Mike Van Liew Dept. of Biological Systems Engineering University of Nebraska Lincoln, NE Heartland Regional Water Coordination Initiative

  2. Available Auto-calibration Tools in SWAT --Auto-calibration tools created by Ann van Griensven (2005) --Tools include: Sensitivity Analysis; ParaSol (model calibration and parameter uncertainty); SUNGLASSES (parameter uncertainty for the calibration and validation periods)

  3. Limitations of the ArcSWAT Interface Auto-calibration Tool • The ArcSWAT Interface Sensitivity Analysis/Auto-Calibration and Uncertainty Tools only allow calibration at a single point within a watershed • In some cases, a multi-point, regional approach to calibration is highly desirable, especially for large watersheds

  4. Running the Sensitivity Analysis/Auto-calibration Tool in the Project Directory • The ArcSWAT Interface provides a framework for constructing files that are necessary for performing sensitivity analysis or a multi-gage, multi-parameter calibration • Some files employed in the Interface tools must be modified by hand to perform a multi-gage or multi-parameter calibration • This can be accomplished by working in the project directory instead of the ArcSWAT Interface

  5. Today’s Objectives: • Learn how to create and modify the necessary files for running the sensitivity analysis and auto-calibration tools in a project directory for multi-gage, multi-constituent configurations • Learn how to interpret the output files generated from the sensitivity analysis and auto-calibration tools

  6. Parameter Sensitivity • Challenge of determining which parameters to calibrate so that the model response mimics the actual field, subsurface, and channel conditions as closely as possible • Calibration process becomes complex and computationally extensive when the number of parameters in a model is substantial • Sensitivity analysis can be helpful to identify and rank parameters that have significant impact on specific model outputs of interest

  7. Sensitivity Analysis in SWAT • helpful to model users in identifying parameters that are most influential in governing streamflow or water quality response • allows model users to conduct two types of analyses: --the first type may help identify parameters that improve a particular process or characteristic of the model (it assesses the impact of adjusting a parameter value on some measure of simulated output, such as average streamflow) --the second type uses measured data to provide an overall “goodness of fit” estimate between the modeled and the measured time series (it identifies the parameters that are affected by the characteristics of the study watershed and to which the given project is most sensitive)

  8. Sensitivity Analysis • Sensitivity analysis demonstrates the impact that a change to an individual input parameter has on the model response • The method in SWAT combines Latin Hypercube (LH) and One-factor-At-a-Time (OAT) sampling • LH sampling generates a distribution of plausible collections of parameter values from a multidimensional distribution • During sensitivity analysis, SWAT runs (p+1)*m times, where p is the number of parameters being evaluated and m is the number of LH intervals or loops • For each loop, a set of parameter values is selected such that a unique area of the parameter space is sampled

  9. Sensitivity Analysis • That set of parameter values is used to run a baseline simulation for that unique area • Then, using one-at-a-time (OAT) sampling, a parameter is randomly selected, and its value is changed from the previous simulation by a user-defined percentage • SWAT is run on the new parameter set, and then a different parameter is randomly selected and varied • After all the parameters have been varied, the LH algorithm locates a new sampling area by changing all the parameters
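The LH-OAT scheme described on slides 8 and 9 can be sketched in a few dozen lines of Python. This is an illustration, not van Griensven's published implementation: `run_model` stands in for a full SWAT run, and the sensitivity measure (mean absolute change in output per parameter, unnormalized) is a simplification of the actual ranking statistic.

```python
import random

def lh_oat(run_model, bounds, m=10, oat_frac=0.05, seed=1):
    """Latin Hypercube / One-factor-At-a-Time sampling as sketched above.

    run_model: callable mapping a parameter list to a scalar output
    bounds:    list of (lower, upper) per parameter
    m:         number of LH intervals (loops); total model runs = (p + 1) * m
    """
    rng = random.Random(seed)
    p = len(bounds)
    # Latin Hypercube: one sample per interval for each parameter, shuffled
    strata = []
    for lo, hi in bounds:
        cells = [lo + (hi - lo) * (i + rng.random()) / m for i in range(m)]
        rng.shuffle(cells)
        strata.append(cells)

    effects = [0.0] * p
    for loop in range(m):
        # Baseline simulation for this unique area of the parameter space
        base = [strata[j][loop] for j in range(p)]
        f_base = run_model(base)
        # OAT: perturb one randomly chosen parameter at a time
        order = list(range(p))
        rng.shuffle(order)
        for j in order:
            lo, hi = bounds[j]
            delta = oat_frac * (hi - lo) * rng.choice([-1, 1])
            trial = list(base)
            trial[j] += delta
            f_trial = run_model(trial)
            effects[j] += abs(f_trial - f_base) / m
            base, f_base = trial, f_trial   # next perturbation starts here
    return effects  # mean absolute effect per parameter; larger = more sensitive

# Toy stand-in for a SWAT run: output dominated by the first parameter
toy = lambda x: 10 * x[0] + 0.1 * x[1]
s = lh_oat(toy, [(0.0, 1.0), (0.0, 1.0)], m=5)   # (2 + 1) * 5 = 15 model calls
print(s)  # parameter 0 ranks as far more sensitive than parameter 1
```

Note that, as on the slide, each loop costs one baseline run plus one run per parameter, giving the (p+1)*m total.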

  10. Getting Started: Building Files to Conduct Sensitivity Analysis • ArcSWAT Interface Sensitivity Analysis Tool Input and Output Windows • Manually modify files in project directory that are written from the Interface

  11. Sensitivity Input Window Analysis Location: Select a simulation from the SWAT simulation list for performing the sensitivity analysis Subbasin: Select a subbasin within the project where observed data will be compared against simulated output

  12. Sensitivity Input Window Hypercube intervals (Alpha_Bf): 10 intervals of 0-0.1, 0.1-0.2, ..., 0.9-1.0 OAT change (Alpha_Bf): changes by 5% x (1.0 - 0.0) = 0.05, so an initial value of 0.13 becomes 0.08 or 0.18
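The OAT perturbation arithmetic on slide 12 can be reproduced in a few lines; the bounds and the initial value are the slide's Alpha_Bf example.

```python
# OAT change for Alpha_Bf: the perturbation is a fixed fraction of the
# parameter's feasible range.
lower, upper = 0.0, 1.0     # Alpha_Bf bounds
oat_fraction = 0.05         # user-defined 5% OAT change

delta = oat_fraction * (upper - lower)
initial = 0.13
perturbed = sorted(round(initial + sign * delta, 2) for sign in (-1, 1))
print(delta)      # 0.05
print(perturbed)  # [0.08, 0.18]
```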

  13. Sensitivity Input Window Observed Data File Name Select parameters for conducting the sensitivity analysis Lower bound = 0.0 Upper bound = 10.0 (adjust if necessary) Variation Method: • Replace by value • Add to value • Multiply by value (%)

  14. Sensitivity Analysis Output Window Output Evaluation: Comparison variable(s): select Average Modeled Output (e.g., streamflow) or Percent of Time the output is < a threshold value Select Concentrations or Loads for Water Quality Objective Function: select optimization method Write Input Files to Project Directory
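The "Objective Function" choice on slide 14 maps to a goodness-of-fit statistic: ParaSol works with the sum of squared residuals (SSQ), and SWAT calibration studies commonly also report the Nash-Sutcliffe efficiency. A small self-contained sketch of both (the flow values are invented for illustration):

```python
def ssq(obs, sim):
    """Sum of squared residuals between observed and simulated series."""
    return sum((o - s) ** 2 for o, s in zip(obs, sim))

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit; 0 means the model
    is no better than the mean of the observations."""
    mean_obs = sum(obs) / len(obs)
    return 1.0 - ssq(obs, sim) / sum((o - mean_obs) ** 2 for o in obs)

# Illustrative observed and simulated monthly flows (cms)
obs = [3.2, 5.1, 4.0, 6.3, 2.8]
sim = [3.0, 5.5, 3.8, 6.0, 3.1]
print(round(ssq(obs, sim), 2))             # 0.42
print(round(nash_sutcliffe(obs, sim), 3))  # 0.949
```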

  15. Main Output: Sensout.out Input Data: Objective and Response Functions List of Parameters

  16. Sample of the Senspar.out file: OAT = 0.05, Loops = 5, + 20%

  17. Main Output: Sensout.out Parameter Ranking

  18. Ranking of 16 Parameters for Mahantango Creek Watershed, PA

  19. Ranking of 16 Parameters for Stevens Creek Watershed, NE

  20. Mean Value Percent Difference in Objective Function Value with a 5% Change in Parameter Value for Stevens Creek Watershed, NE

  21. Strengths of the Automated Approach to Calibration in SWAT • Manual calibration of a dozen or more parameters that govern streamflow can be a very time-consuming and frustrating process • The auto-calibration procedure in SWAT provides a powerful, labor-saving tool that can substantially reduce the frustration and uncertainty often associated with manual calibration • The ParaSol with Uncertainty Analysis tool in SWAT provides optimal parameter values determined through an optimization search. It also indicates how sensitive a parameter is to being precisely calibrated, based upon the user-supplied input range

  22. Shuffled Complex Evolution Algorithm (SCE-UA) • calibration procedure based on a Shuffled Complex Evolution Algorithm (SCE-UA) and a single objective function • In a first step, the SCE-UA selects an initial population of parameters by random sampling throughout the feasible parameter space for “p” parameters to be optimized, based on given parameter ranges • The population is partitioned into several communities (complexes), each consisting of “2p+1” points

  23. Shuffled Complex Evolution Algorithm (SCE-UA) • Each community is made to evolve based on a statistical “reproduction process” that uses the simplex method, an algorithm that evaluates the objective function in a systematic way with regard to the progress of the search in previous iterations • At periodic stages in the evolution, the entire population is shuffled and points are reassigned to communities to ensure information sharing • As the search progresses, the entire population tends to converge toward the neighborhood of global optimization, provided the initial population size is sufficiently large

  24. Shuffled Complex Evolution Algorithm (SCE-UA) flowchart: Initialize → Select Parents → Generate Offspring (repeated to produce the offspring set) → Evolve → Assess → Replace Parents by Offspring → Shuffle → if the convergence criteria are not met, repeat; otherwise End
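The SCE-UA loop described on slides 22-24 can be sketched as follows. This is a condensed toy version, not the published algorithm: the real competitive complex evolution step selects parents with a triangular probability distribution and evolves sub-simplexes, whereas this sketch simply reflects each complex's worst point through the complex centroid; all names are illustrative.

```python
import random

def sce_ua(objective, bounds, ngs=2, maxn=500, seed=0):
    """Simplified SCE-UA-style minimization following the steps above.

    ngs:  number of complexes (communities), each holding 2p+1 points
    maxn: maximum number of objective evaluations before termination (MAXN)
    """
    rng = random.Random(seed)
    p = len(bounds)
    npts = ngs * (2 * p + 1)
    sample = lambda: [lo + rng.random() * (hi - lo) for lo, hi in bounds]

    # Step 1: random initial population across the feasible parameter space
    pop = [sample() for _ in range(npts)]
    evals = [objective(x) for x in pop]
    n_eval = npts

    while n_eval < maxn:
        # Rank the population, then deal points into ngs complexes
        order = sorted(range(npts), key=evals.__getitem__)
        pop = [pop[i] for i in order]
        evals = [evals[i] for i in order]
        for c in range(ngs):
            idx = list(range(c, npts, ngs))
            worst = max(idx, key=evals.__getitem__)
            others = [i for i in idx if i != worst]
            centroid = [sum(pop[i][k] for i in others) / len(others)
                        for k in range(p)]
            # Simplex-style reflection of the worst point through the centroid,
            # clipped to the feasible bounds
            cand = [min(max(2 * centroid[k] - pop[worst][k], lo), hi)
                    for k, (lo, hi) in enumerate(bounds)]
            f = objective(cand); n_eval += 1
            if f >= evals[worst]:
                # Contraction fallback: move the worst point toward the centroid
                cand = [(centroid[k] + pop[worst][k]) / 2 for k in range(p)]
                f = objective(cand); n_eval += 1
            if f < evals[worst]:
                pop[worst], evals[worst] = cand, f
        # Re-ranking at the top of the loop "shuffles" points between complexes
    best = min(range(npts), key=evals.__getitem__)
    return pop[best], evals[best]

# Toy single-objective function with its optimum at (0.3, 0.7)
f = lambda x: (x[0] - 0.3) ** 2 + (x[1] - 0.7) ** 2
x_best, f_best = sce_ua(f, [(0.0, 1.0), (0.0, 1.0)], ngs=2, maxn=500)
print(round(f_best, 4), [round(v, 2) for v in x_best])
```

As on slide 23, periodic re-ranking shares information between complexes, and the population drifts toward the neighborhood of the global optimum.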

  25. Limitations of the ArcSWAT Interface Auto-calibration Tool • The ArcSWAT Interface Sensitivity Analysis/Auto-Calibration and Uncertainty Tools only allow calibration at a single point within a watershed • In some cases, a multi-point, regional approach to calibration is highly desirable, especially for large watersheds

  26. Building Files to Conduct Auto-calibration • ArcSWAT Interface Auto-calibration Tool Input and Output Windows • Manually modify files in project directory that are written from the Interface

  27. Auto-calibration Input Window Analysis Location: Select a simulation from the SWAT simulation list for performing the calibration Subbasin: Select a subbasin within the project where observed data will be compared against simulated output

  28. Auto-calibration Input Window Optimization Settings: MAXN = maximum number of trials before optimization is terminated NGS = number of complexes IPROB = threshold for ParaSol: 1 = 90% CI, 2 = 95% CI, 3 = 97.5% CI Observed Data File Name Calibration Method: ParaSol or ParaSol with Uncertainty Analysis

  29. Auto-calibration Input Window: Observed Daily Record for Streamflow Year of observed record Julian day of observed record Observed daily streamflow in cms
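As an illustration of the layout slide 29 describes (columns for year, Julian day, and daily streamflow in cms), a hypothetical observed-data record might look like the following; the exact column widths, ordering, and presence of a header line depend on the SWAT/ParaSol version, so this is a sketch rather than the definitive format:

```
year  julian_day  flow_cms
2001       1        2.45
2001       2        2.31
2001       3        5.87
2001       4        4.12
```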

  30. Input Window: Observed Monthly Record for Streamflow and Sediment Year of observed record Month of observed record Observed Monthly Streamflow (cms) Observed Monthly Sediment Load (tons/day)

  31. Auto-calibration Input Window Select Parameters for calibration Adjust initial lower and upper bounds, if necessary (note: minimum lower bound for SURLAG = 0.5)

  32. Auto-calibration input files: Changepar and Parasolin (MAXN, NGS, IPROB)

  33. Auto-calibration input: A Multigage Changepar file is created by combining two or more changepar files that are specific to certain subbasins or HRUs in the project Upper Gage Lower Gage

  34. Auto-calibration input: A Multigage Changepar file is created by combining two or more changepar files that are specific to certain subbasins or HRUs in the project For parameters that vary by HRU, select All Land Uses, Soils, and Slopes for Subbasins that are relevant to a particular gage For parameters that vary by Subbasin, select All Subbasins that are relevant to a particular gage

  35. Auto-calibration input: A Multigage Changepar file is created by combining two or more changepar files that are specific to certain subbasins or HRUs in the project Subbasins for gage 1 Subbasins for gage 2 HRUs for gage 1 HRUs for gage 2

  36. Auto-calibration Output Window Output Evaluation: Select the parameter to be calibrated Objective Function: select optimization method Select Concentrations or Loads for Water Quality Calibration Write Input Files to Project Directory

  37. Auto-calibration input file: .fig Autocal command code and observed data files for 2 gage locations

  38. Auto-calibration input file: File.cio Number of years simulated ICLB = Auto-calibration option: Default = 0 Sensitivity = 1 Optimization = 2 Optimization with uncertainty = 3 Bestpar = 4 NYSKIP = Warm-up (years skipped)

  39. Auto-calibration input file: Objmet Code number for calibration variable Concentration or load Given weight for objective function Code number for Autocalfile in .fig Objective function method

  40. Auto-calibration output file: Parasolout Calibration Parameter Uncertainty Ranges

  41. Auto-calibration output files: goodpar and bestpar (calibration parameter listings)

  42. Auto-calibration output file: Autocal Monthly Sediment Load Parameter Uncertainty Ranges Monthly Streamflow Calibration

  43. Measured versus Simulated Streamflow with Parasol Uncertainty CI
