
Redesign and Implementation of Chesapeake Bay Program Indicators


Presentation Transcript


  1. Redesign and Implementation of Chesapeake Bay Program Indicators Presentation to the Monitoring & Analysis Subcommittee Carlton Haywood December 22, 2004

  2. Plan for the Day • Some thoughts about what’s wrong with our indicators. • About the Indicators Redesign Taskforce (IRT). • Results from first two brainstorming sessions of the IRT. • A strawman framework for organizing the indicators • An initial sort of the current indicators • A process for how the indicators redesign will occur, with roles assigned to CBP committees. • Ideas for a regular schedule of publications that report the state of the Bay and the Bay Restoration to the public • Collect your suggestions/comments to feed back to IRT and prepare for IC in January.

  3. What’s wrong with our current indicators/science communication strategy?

  4. Sometimes Incorrect/Confusing Messages are Received

  5. The CBP is often not the Primary Source of Information We may not approve of methods used by other organizations, but the public listens.

  6. The Public Wants to Know … If the CBP doesn’t answer the public’s questions, or takes too long, others are ready to fill the information void.

  7. So What’s Up with Our Indicators? • We have lots of them (~100) • In general, they are rigorously vetted before publication • Frequently, though not always, quantitative • Sometimes, though not always, the reader can figure out how the indicator was developed [Figure: current list of Chesapeake Bay Program indicators as they appear on the website]

  8. So What’s Up with Our Indicators? • Some storylines have been developed.

  9. However … • Few overarching indices • No hierarchy of importance • Mix of ‘state of the Bay’ and ‘state of the Bay restoration’ messages • Storylines tend to be detailed; hard to find ‘quick answers’ • Individual indicators sometimes give conflicting messages (e.g. trends in nutrients) • Presented stand-alone; they don’t tell a complete story • Sometimes a long lag time between monitoring and indicator availability • Lack spatial detail

  10. Goal • More effectively communicate the health status of the Bay and its tributaries in order to: • Provide effective evaluation of restoration efforts • Improve accountability for tributary strategies • Stimulate goal setting • Better inform the community and stakeholders

  11. Discussion • Comments on the characterization of the problems with our current indicators • Feedback on the characterization of the goals of our indicators • Other lessons learned that we should factor into our redesign of the Bay Program’s indicators

  12. Why the Taskforce was Assembled • Reorganize the existing suite of indicators to eliminate inconsistencies and to convey more clearly the answers to key questions about the Bay and the Bay restoration • Improve timeliness of reporting and reporting framework • Add some "overarching" composite measures of the state of the Bay, pressures on the Bay, and progress in its restoration • Stimulate change quickly to become more responsive to our public communication needs

  13. Indicators Redesign Taskforce • A temporary group formed to recommend solutions for the current deficiencies in the CBP indicators and the way they are communicated. • Members are experienced in environmental data analysis, science communication, and web design. • The group is a compromise between efficiency (small size) and links to every interest group (large size). • Members: Carlton Haywood (Chair), ICPRB; Rich Batiuk, EPA/CBPO; Mike Burke, EPA/CBPO; Bob Campbell, NPS/CBPO; Peter Claggett, USGS/CBPO; Chris Conner, ACB/CBPO; Bill Dennison, UMCES; Rick Hoffman, VA DEQ; Mike Land, NPS/CBPO; Ben Longstaff, NOAA/UMCES; Bruce Michael, MD DNR; Steve Preston, USGS/CBPO; Nita Sylvester, EPA/CBPO; Ken Moore, VIMS; Gary Shenk, EPA/CBPO

  14. Elements of a Proposed Structure & Process for Indicators Redesign • Indicators Redesign Task Force develops framework and pushes process. • MASC, with representation from all relevant Subcommittees, is forum to discuss ideas from IRT. • Subcommittee and workgroups remain responsible for implementation, interpretation, and maintenance of ‘their’ indicators. • CESC develops communications strategy, works with IRT, MASC, and other Subc., so that redesign is consistent with comm. strategy. • IMS responsible for web implementation, works with IRT so that framework is technically feasible. • STAC provides review of framework and individual indicators. • IC reviews & approves as necessary.

  15. Redesign Process: Redesign Phase [flattened process diagram; recoverable roles listed below] • Coordination & steering: Indicator Redesign Taskforce (overview and steering); MASC (coordination and integration) • Approval and review: IC (sign-off on the design phase and the implementation phase); STAC (indicator review and additional expertise; review of the integrated program) • Indicator redesign: NSC / NTW (pressure indicators); TMAW / LRSC / TS (state-of indicators); LGSC (restoration indicators) • Support: IMS (information management and data analysis); CESC (communication strategy)

  16. Discussion • Comments on the taskforce • Does the assignment of roles make sense? • Suggestions for amending/refining the process for the indicators redesign

  17. IRT Brainstorming • Refine our goals. • Propose a structure for organizing our indicators. • Assess the current suite of indicators in this structure. • Identify gaps. • Ideas for new information products.

  18. Refining Our Communication Goals • Answer the questions the public and managers ask • Communicate the most important information first • Provide a constant flow of products: newsletters, e-newsletters, website updates, seasonal forecasts with follow-through (“what did happen”), annual syntheses (Health Check), presentations, posters … • Link quantitative information (defensible, transparent) to storylines • In a timeframe appropriate to the messages being conveyed • Provide information in maps, conceptual diagrams, and other easy-to-interpret formats

  19. Refining Goals for Our Indicators A suite of indicators that: • Communicate the most important information first • Communicate important information in a timely fashion • Provide information at appropriate spatial and temporal scales • Where possible, use maps and graphic displays

  20. Structure the Indicators — Vision. 1) Separate indicators into functional groups: • Factors that affect the Bay (Pressure) • State of the Bay ecosystems (State) • State of the Bay restoration efforts (Response)

  21. Structure the Indicators. 2) Classify indicators into specific roles, within each functional group (Pressure, State, Response): • Integrated indices: summary statements • Top-level indices: critical measures for management and public interest • Detail indicators: indicators that facilitate the interpretation of higher-level indicators and system understanding

  22. Structure the Indicators. 2) Classify indicators into specific roles, across the three functional groups (Pressure, State of the Bay & Tribs., Response): • Integrated indices (summary statements): Ecosystem Health, Ecosystem Restoration, Ecosystem Footprint, Socio-economic Health • Top-level indices (critical measures for management and public interest): State — Living Resources, Habitat, Water Quality; Pressure — Loads, Land Use, Harvest; Response — Pollution Control, Habitat Protection and Restoration, Fisheries Management, Education-Outreach; plus Social and Economic indices • Detail indicators: indicators that facilitate the interpretation of higher-level indicators and system understanding. Indices are building blocks for storylines that connect pollution source to ecosystem resource to management action.

  23. Structure the Indicators — Example: Developing Detail Indicators (Water Quality, a State top-level index) • Top-level index: Water Quality • Detail indicators: DO attainment, clarity attainment, chlorophyll attainment, chemical contaminants • Geographically specific indicators: segment/tributary-specific clarity attainment (maps) • Diagnostic indicators: % water clarity non-attainment due to chlorophyll vs. TSS vs. epiphytes
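The two-axis classification in slides 20–23 (functional group × role) is essentially a small data model. A minimal sketch in Python, assuming names and example assignments that are illustrative only and not an official CBP sort of the indicators:

```python
from dataclasses import dataclass
from enum import Enum


class FunctionalGroup(Enum):
    """Slide 20: the three functional groups."""
    PRESSURE = "Factors that affect the Bay"
    STATE = "State of the Bay & tributaries"
    RESPONSE = "State of the Bay restoration efforts"


class Role(Enum):
    """Slide 21: roles within each functional group."""
    INTEGRATED_INDEX = "Summary statement"
    TOP_LEVEL_INDEX = "Critical measure for management and public interest"
    DETAIL = "Facilitates interpretation of higher-level indicators"


@dataclass
class Indicator:
    name: str
    group: FunctionalGroup
    role: Role


# Hypothetical sort of a few indicators into the framework (names illustrative).
indicators = [
    Indicator("Ecosystem Health", FunctionalGroup.STATE, Role.INTEGRATED_INDEX),
    Indicator("Water Quality", FunctionalGroup.STATE, Role.TOP_LEVEL_INDEX),
    Indicator("DO Attainment", FunctionalGroup.STATE, Role.DETAIL),
    Indicator("Nutrient Loads", FunctionalGroup.PRESSURE, Role.TOP_LEVEL_INDEX),
]

# Group by functional group, as a report layout would.
by_group: dict[FunctionalGroup, list[str]] = {}
for ind in indicators:
    by_group.setdefault(ind.group, []).append(ind.name)
```

The point of the sketch is that every indicator gets exactly one (group, role) pair, which is what removes the “no hierarchy of importance” problem noted in slide 9.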

  24. Discussion • Feedback on the overall hierarchical design • Alternative designs/ideas • Agreement on general principles for the indicators framework

  25. IRT Brainstorming • Propose a set of information products that communicates CBP information to the public on a regular schedule • Currently, these are just ideas

  26. Reporting Framework • 1- and 5-year communication product cycle: • Annual communication: • State of the Bay and Pressure only (not state of restoration) • Range of reporting products produced • Rely on Chesapeake Bay Program workgroups to provide the information required • 5-year reporting: • “State of the Bay” report • Align with congressional reporting requirements • Also includes state of the Bay restoration

  27. Reporting Framework — Annual communication products (annual reporting cycle) • Monthly e-newsletter: keeping a ‘pulse’ on the Bay • Quarterly newsletters: important events, storylines, etc. • Annual assessment: ‘Annual Health Check’ • Spring forecast: predict summer conditions • Fall “look back”: explain what did happen and why

  28. Suggested Timeline • Winter 2005: Have a framework; identify some key measures that should be revised or created in time for Spring/Summer 2005 publication. • Spring 2005: Have our first ‘look ahead’ publication that discusses likely DO or SAV outcomes based on 2004–Spring 2005 hydro-meteorology. • Summer 2005: Have our key measures ready when public interest in the Bay is highest. • Fall 2005 – 2006: Roll out revised indicators as we develop storylines.

  29. Discussion • Communication plans: too much? • Ambitious, but too important to let slide • Which elements are “must haves”? • Does the seasonal approach make sense? • Other communications ideas and products

  30. Discussion: Final thoughts • Right track / wrong track? • Aspects that need further refinement • Ready for the IC? • When can we talk to your Subcommittee? • Next steps from here
