Presentation Transcript


  1. Results and Lessons Learned on Regional/National Modeling Efforts: Conservation Effects Assessment Project (CEAP) Robert Kellogg NRCS, Beltsville

  2. Why do we do large-scale regional modeling and assessment?

  3. Why do we do large-scale regional modeling and assessment? To provide information in support of policy development or management of government programs.

  4. Why do we do large-scale regional modeling and assessment? To provide information in support of policy development or management of government programs.
  • How big is the problem?
  • What has already been accomplished?
  • What is left to do, and where?
  • What can be expected if specific actions are taken?
  • What is the most cost-effective approach?

  5. • Describe study and findings
  • Challenges in developing and presenting results
  • Lessons learned

  6. Goals of the CEAP Cropland National/Regional Assessment
  • Define and evaluate practices in use
  • Estimate the effects/benefits of conservation practices in use
  • Estimate the need for additional conservation practices
  • Simulate effects/benefits of additional treatment

  7. Cropland Regional Assessments

  8. Sampling and Modeling Approach
  • Onsite (field-level) effects: field-level modeling with APEX, using farm survey data at NRI-CEAP sample points
  • Off-site water quality effects: watershed modeling with HUMUS/SWAT
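To make the two-track structure concrete, here is a minimal Python sketch of how field-level results might be rolled up into watershed-scale inputs; the `FieldResult` fields and the `loads_by_watershed` function are illustrative assumptions, not part of the APEX or HUMUS/SWAT toolchain.

```python
from dataclasses import dataclass

@dataclass
class FieldResult:
    """Edge-of-field loss estimates from field-level modeling at one sample point."""
    point_id: str
    watershed: str            # e.g., a hydrologic unit code (assumed identifier)
    acres_represented: float  # acreage expansion weight for the sample point
    sediment_tons_per_acre: float
    nitrogen_lbs_per_acre: float

def loads_by_watershed(field_results):
    """Aggregate per-acre field losses into total cropland loads per watershed,
    the kind of source input a watershed-scale model could take."""
    loads = {}
    for r in field_results:
        w = loads.setdefault(r.watershed, {"sediment_tons": 0.0, "nitrogen_lbs": 0.0})
        w["sediment_tons"] += r.sediment_tons_per_acre * r.acres_represented
        w["nitrogen_lbs"] += r.nitrogen_lbs_per_acre * r.acres_represented
    return loads
```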

  9. Primary Sample Unit (PSU) Points Statistical Design

  10. Modeling Strategy
  • Estimate a CEAP Baseline using farmer survey information at NRI sample points
  • Construct an alternative scenario assuming “no practices”
  The difference between these two scenarios represents the benefits of the accumulation of conservation practices currently in place on the landscape.
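The comparison described on this slide amounts to a weighted difference between two model runs. A minimal sketch, assuming per-point model outputs and per-point acreage weights; the function name and data layout are hypothetical, not CEAP code:

```python
def practice_benefits(baseline, no_practices, acres):
    """Benefits of practices in place = losses under the no-practices scenario
    minus losses under the baseline, accumulated over the acres each sample
    point represents.

    baseline, no_practices -- dicts mapping sample point id -> modeled loss per acre
    acres                  -- dict mapping sample point id -> acres represented
    """
    total_reduction = 0.0
    total_acres = 0.0
    for point, base_loss in baseline.items():
        total_reduction += (no_practices[point] - base_loss) * acres[point]
        total_acres += acres[point]
    return total_reduction, total_reduction / total_acres

# Made-up numbers: one point representing 1,500 acres, losses in pounds N/acre/yr
total, per_acre = practice_benefits({"p1": 20.0}, {"p1": 35.0}, {"p1": 1500.0})
# total == 22500.0 pounds avoided; per_acre == 15.0 pounds/acre
```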

  11. 47-year minimum-maximum precipitation

  12. Evaluation of Conservation Practices

  13. The Baseline Conservation Condition

  14. The Baseline Conservation Condition

  15. The Baseline Conservation Condition

  16. Losses of Sediment and Nutrients from Fields

  17. Sediment Loss (tons/acre), Baseline

  18. Nitrogen Loss (pounds/acre), Baseline

  19. Nitrogen Loss in Subsurface Flows, Baseline Means: CB (Chesapeake Bay) = 32.7 pounds/acre, UM (Upper Mississippi) = 18.7 pounds/acre, GL (Great Lakes) = 25.8 pounds/acre

  20. Inherent Vulnerability

  21. Conservation Treatment Needs
  • Under-treated acres were identified as those with an imbalance between the level of potential loss (inherent vulnerability) and the level of conservation treatment.
  • Acres were assigned to three levels of need for additional treatment: High, Moderate, and Low.
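As a hypothetical illustration of this kind of vulnerability-versus-treatment classification (the 1 to 3 scoring and the gap thresholds are invented for the example, not the CEAP criteria):

```python
def treatment_need(vulnerability, treatment_level):
    """Classify need for additional conservation treatment on an acre.

    vulnerability   -- 1 (low) to 3 (high) inherent potential for loss
    treatment_level -- 1 (minimal) to 3 (comprehensive) practices in place
    """
    gap = vulnerability - treatment_level  # imbalance between vulnerability and treatment
    if gap >= 2:
        return "High"
    if gap == 1:
        return "Moderate"
    return "Low"

# e.g., treatment_need(3, 1) -> "High"; treatment_need(2, 2) -> "Low"
```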

  22. Average annual loss of nitrogen in subsurface flows, GL (pounds/acre/yr)

  23. Acres Needing Conservation Treatment

  24. Average annual loss of nitrogen in subsurface flows, GL (pounds/acre/yr)

  25. Average annual loss of nitrogen in subsurface flows, GL (pounds/acre/yr)

  26. High conservation treatment need for nitrogen and/or phosphorus loss

  27. Challenges in developing and presenting results

  28. Challenges in developing and presenting results Evolution of models

  29. Challenges in developing and presenting results Evolution of models Establishing believability

  30. Challenges in developing and presenting results Evolution of models Establishing believability Simplicity versus complexity

  31. Challenges in developing and presenting results
  • Evolution of models
  • Establishing believability
  • Simplicity versus complexity
  • Forecasting…and meeting…report publication deadlines

  32. Challenges in developing and presenting results
  • Evolution of models
  • Establishing believability
  • Simplicity versus complexity
  • Forecasting…and meeting…report publication deadlines
  • Presentations

  33. Challenges in developing and presenting results
  • Evolution of models
  • Establishing believability
  • Simplicity versus complexity
  • Forecasting…and meeting…report publication deadlines
  • Presentations
  • Peer review

  34. Lessons Learned…
  • Define clearly at the start the kinds of statements you will be including in your report, as well as what you will NOT address.
  • Write up preliminary results and present to users of the information early and often.
  • Don’t wait for the modeling to be completed before drafting.
  • Try to manage expectations of your audience.

  35. Lessons Learned… Involve a team of subject-area experts from different disciplines... But discuss the project as a group frequently to keep all on the same page.

  36. Lessons Learned… Modeling decisions are NOT independent from the presentation of results… Discuss assumptions and methods as a team to confirm that the “messages” in the report are consistent with modeling assumptions, and vice versa

  37. Lessons Learned… If “off-the-shelf” databases are fundamentally inappropriate for answering the questions, don’t try to “make do”… Collect the data you need.

  38. Lessons Learned… Models and databases will always be modified and refined… You will have to do everything over more than once—plan on it.

  39. Lessons Learned… If your results appear to be new scientific findings, you are probably doing something wrong… Regional modeling is primarily a synthesis of scientific knowledge and understanding.

  40. Lessons Learned…
  • Document…document…document
  • Documentation establishes believability and avoids misuse of the findings.
  • Explain why you chose a method/assumption, and why alternatives were not chosen.
  • Prepare documentation reports as you go…don’t wait until the end.

  41. Lessons Learned…
  • Consider keeping the technical report separate from other communication products designed specifically to focus on messages.
  • Be patient with your audience.

  42. Lessons Learned… Avoid an open public review of a draft report.

  43. Lessons Learned… Avoid an open public review of a draft report. Don’t get into a “model war” with either EPA or USGS.

  44. Information on CEAP can be found at: http://www.nrcs.usda.gov/Technical/nri/ceap/
