
Lessons Learned from the 316b Choice Question Development



Presentation Transcript


  1. Lessons Learned from the 316b Choice Question Development. November 1, 2011. Presented by Erik Helm, with Elena Besedin, Robert Johnston, Julie Hewitt, and Ryan Stapler

  2. Outline
  • Multi-attribute choice question design under three constraints:
    • Policy issues to be measured
    • What can be measured environmentally
    • Measures people can comprehend in a choice question format
  • How focus groups informed choice question design
  • Preliminary results from the Northeast Region survey (still in the field)
  • Conclusions

  3. Background on 316b
  §316(b) of the Clean Water Act requires that the location, design, construction, and capacity of cooling water intake structures reflect the best technology available for minimizing adverse environmental impact. Water drawn in by 316b facilities passes through screens, where larger fish unable to swim away are impinged against the screen and usually die. Smaller organisms that pass through the screens are entrained in the cooling system, where they are heated, subjected to abrasion and turbulence, and also die.

  4. Background on Survey Development
  • Survey development began in 2003
  • Paperwork Reduction Act and Information Collection Requests:
    • OMB approved 1 streamlined and 3 full ICRs
  • Various stages of the survey have been available for public comment for a total of 180 days
  • EPA received about 700 pages of comments
  • The survey was externally peer reviewed in 2006
  • EPA conducted 16 focus groups and 28 cognitive interviews, producing over 1,600 pages of documentation

  5. Lessons Learned through the 316b Development Process
  • Start with an assessment of what is required to properly inform policy
  • Look for measurable environmental outcomes that can be directly linked to policy needs
  • Don't start with oversimplified environmental measures; use focus groups (FGs) to test for the correct level of complexity of environmental information
  • Use focus groups to qualitatively test participants' ability to separate and assess environmental attributes

  6. Policy and Environmental Outcomes Important in the 316b Survey Design
  • EPA has defined adverse environmental impact under successive 316b regulations as individual fish saved from impingement and entrainment (I&E) by cooling water systems
  • Some stakeholders have stressed that only fish that are commercially or recreationally caught are important
  • Other stakeholders have indicated that only population impacts are important
  • The use of recirculating cooling systems has the co-benefit of reducing thermal discharge at facilities
  • Additional groups have stressed that I&E and thermal issues can affect species assemblages and local ecosystem balance

  7. A Choice Question from 2005

  8. Attribute List in 2009

  9. Additional Material on the Aquatic Ecosystem Score that Was Removed
  The Aquatic Ecosystems Score is a 0 to 100 score showing the effects of policies on the ecological condition of affected areas. It measures how close affected Northeast waters are to the most natural, undisturbed condition possible. Higher scores mean the area is more natural.
  • The following information is combined to make the final score:
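A composite 0 to 100 score of this kind is typically a weighted average of component sub-scores. The sketch below illustrates that arithmetic only; the component names and weights are hypothetical, since the actual components were removed from the final survey instrument.

```python
def ecosystem_score(components, weights):
    """Combine component sub-scores (each on a 0-100 scale) into a single
    0-100 score using a weighted average. Weights must sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(components[name] * weights[name] for name in weights)

# Hypothetical sub-scores and weights for one affected waterbody
# (illustrative only; not the survey's actual components).
components = {"fish_populations": 70, "species_balance": 55, "habitat": 80}
weights = {"fish_populations": 0.4, "species_balance": 0.3, "habitat": 0.3}

print(round(ecosystem_score(components, weights), 1))  # -> 68.5
```

A higher weighted average moves the score toward 100, i.e., toward the "most natural, undisturbed condition" anchor described above.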

  10. 2011 Table Explaining Choice Question Attributes (in Northeast Survey)

  11. 2011 Choice Question

  12. Focus Group Findings with Regard to Choice Attributes
  • Participants generally understood the ecological scores and the differences between the attributes
  • Excluding early details that people found confusing, participants appeared to understand the ecological concepts being addressed
  • Participants expressed preferences and values for these different ecological outcomes
  • Some people expressed a preference for a single attribute or subset of attributes
  • Others said they considered all attributes, or keyed in on the attribute with the greatest change

  13. Information on Northeast Pilot Survey Results
  • EPA started fielding the survey on August 30, 2011
  • Total surveys mailed: 1,440
  • Completed surveys received: 399
  • Undeliverables: 117 (8%)
  • Response rate: 28% (relative to all 1,440 mailed), 30% (relative to total minus undeliverables)
  • Non-response work will begin shortly
  • Statistical information presented today is based on just the 330 surveys that have been coded
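The two response rates above follow directly from the mailing counts; a quick check of the arithmetic:

```python
# Survey fielding counts reported on the slide above.
mailed = 1_440
completed = 399
undeliverable = 117

# Raw rate: completions relative to every survey mailed.
raw_rate = completed / mailed
# Adjusted rate: completions relative to surveys that actually arrived.
adjusted_rate = completed / (mailed - undeliverable)

print(f"raw: {raw_rate:.0%}")            # raw: 28%
print(f"adjusted: {adjusted_rate:.0%}")  # adjusted: 30%
```

The adjusted rate is the more common reporting convention in mail surveys, since undeliverable addresses never had an opportunity to respond.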

  14. Question 3
  Question 3. When considering policies that affect how facilities use cooling water, how important to you are the effects on each of the following scores? Check one box for each. (For reminders of what the scores mean, please see page 7.)

  15. Results of t-test on Question 3
  The p-values indicate that people have different preferences across attributes: respondents can differentiate between the different outcomes of reducing I&E mortality.
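One common way to test this is a paired t-test comparing each respondent's importance ratings for two attributes. The sketch below uses synthetic 1-to-5 ratings generated only to illustrate the mechanics; the attribute labels and numbers are not the survey's actual data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 330  # number of coded surveys reported on the earlier slide

# Synthetic importance ratings (1-5 scale) for two hypothetical attributes,
# deliberately generated with different means so the test has a real gap.
attr_a = rng.integers(3, 6, size=n)  # e.g., a "fish saved" score
attr_b = rng.integers(1, 4, size=n)  # e.g., a "thermal discharge" score

# Paired t-test: each respondent rates both attributes, so the
# comparison is within-respondent.
t, p = stats.ttest_rel(attr_a, attr_b)
print(f"t = {t:.2f}, p = {p:.3g}")
```

A small p-value rejects the hypothesis that the two attributes have the same mean importance, which is the sense in which "respondents can differentiate between the different outcomes."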

  16. Percent of Options Selected Across Choice Questions

  17. Selected Follow-up Question Results

  18. Lessons Learned
  • Attributes must be designed to assess policy requirements
  • Construct attributes that have clearly measurable ecological endpoints directly linked to policy needs
  • Ecological outcomes of policies don't have to be "dumbed down" and oversimplified for survey respondents
  • People are able to differentiate between well-defined ecological measures and express values for them
    • Qualitatively in focus groups and quantitatively in surveys
