Attributing Benefits to Voluntary Programs: Practical and Defensible Approaches

Cynthia Manson, Principal

June 23, 2011

Project History

EPA ORCR (OSW) faced OMB concerns:

  • Economic benefits of partnership programs
  • Specific ICRs – WasteWise and NPEP
  • Economic efficiency of programs (PART)

Identified need to:

  • Respond to demand for robust analysis
    • Noting data limitations of partnership programs
    • Programs already exist, which limits analytic options
  • Harmonize discussions of economic analysis and program evaluation

Result:

  • Framework for analysis using available data
  • Discussion of limitations of experimental design
Economic reasoning for voluntary programs
  • To address market failures:
    • Imperfect information in the marketplace (SIGNALING FAILURE)
    • Lack of knowledge transfer on green approaches from firm to firm (“PUBLIC GOOD” NATURE OF R&D)

  • To address unregulated or under-regulated areas, e.g., water conservation, pollution prevention
Potential Impacts of EPA Partnership Programs

Example Programs: WasteWise, EnergyStar, Natural Gas Star, WaterSense, Green Suppliers Network

Provide incentives for participants to share and adopt greener behaviors that, in the absence of EPA assistance, would have occurred:

  • Later in time
  • On a temporary or tenuous basis
  • On a smaller scale
  • Not at all
Potential Impacts of EPA Partnership Programs
  • “Technical Assistance”: Goal of information sharing – transfer of R&D and innovation. EPA facilitates the transfer of innovations among participants, and to non-participants through websites and publications.
    • Addresses “public good nature of R&D”
    • Spillover effects DELIBERATE
  • Market signaling: EPA recognition informs consumers about environmental quality, through:
    • Awards and other public recognition;
    • Logos that signal participation and performance;
    • Certification assistance and verification; and
    • Marketing assistance.
    • Addresses “signaling failure”
Problem: “Proving” program outcomes
  • Optimal design: randomized controlled trial (RCT)
    • Strongest approach: addresses causality and attributes benefits to the program.
    • Requires random assignment into participant and non-participant groups (as in drug trials).
  • Random assignment is not possible in most EPA contexts, including voluntary programs.
  • Alternative to RCT: two-stage approach (a matching sketch follows this list):
    • Evaluate features of the participant group to ensure appropriate selection of control group(s).
    • Approach still requires identifying non-participants.
  • Spillover is deliberate, so no untouched control group exists.
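
The two-stage alternative above hinges on selecting non-participants that resemble the participant group. Below is a minimal, hypothetical sketch of that idea in Python: match each participant to its most similar non-participant on observable characteristics. The firm names, the two characteristics (employees and annual waste), and all figures are illustrative assumptions rather than program data; a real analysis would use richer covariates and a formal matching or propensity-score method.

```python
import numpy as np

# Hypothetical observable characteristics: [employees, annual waste (tons)]
participants = {
    "Firm A": np.array([120.0, 450.0]),
    "Firm B": np.array([300.0, 900.0]),
}
non_participants = {
    "Firm X": np.array([110.0, 500.0]),
    "Firm Y": np.array([2500.0, 8000.0]),
    "Firm Z": np.array([280.0, 870.0]),
}

# Standardize features so scale differences do not dominate the distance.
all_firms = np.vstack(list(participants.values()) + list(non_participants.values()))
mean, std = all_firms.mean(axis=0), all_firms.std(axis=0)

def nearest_match(p_vec, pool):
    """Return the non-participant closest to a participant in standardized space."""
    distances = {
        name: float(np.linalg.norm((p_vec - mean) / std - (vec - mean) / std))
        for name, vec in pool.items()
    }
    return min(distances, key=distances.get)

# Stage 1: characterize participants; Stage 2: pick comparable non-participants.
comparison_group = {p: nearest_match(vec, non_participants) for p, vec in participants.items()}
print(comparison_group)  # expected: {'Firm A': 'Firm X', 'Firm B': 'Firm Z'}
```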
Proposed Approach: Tiered Assessment with existing data
  • Level 1: Threshold Assessment: ensures and documents that the program design is appropriate for addressing market failure.
  • Level 2: Intervention-Outcome Assessment: verifies that program resources and activities are logically aligned with desired outcomes.
  • Level 3: Quasi-Experimental Design: Quantitative analyses that effectively attribute benefits to the program, while avoiding feasibility issues of experimental design.
Level 1: Threshold Assessment: Technical Assistance
  • Threshold evidence for potential technical assistance benefits of a partnership program - innovations are:
  • Non-patentable
  • Applicable broadly to other firms
  • Able to be duplicated by other firms at low cost
  • Able to be duplicated by other firms quickly
  • Applicable to small firms with numerous competitors
Level 1: Threshold Assessment: Market signaling
  • Threshold evidence for potential market signaling benefits of a partnership program:
  • Environmental quality characteristics are difficult for the public to observe
  • Environmental quality characteristics are not already addressed by a respected third-party certification or auditing scheme
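
As one way to make the Level 1 step concrete, here is a minimal, hypothetical sketch of documenting the threshold criteria from the two slides above as a simple checklist. The example program, the answers, and the helper function are illustrative assumptions, not part of the original framework.

```python
# Threshold criteria paraphrased from the two Level 1 slides above.
THRESHOLD_CRITERIA = {
    "technical assistance": [
        "innovations are non-patentable",
        "innovations apply broadly to other firms",
        "innovations can be duplicated by other firms at low cost",
        "innovations can be duplicated by other firms quickly",
        "innovations apply to small firms with numerous competitors",
    ],
    "market signaling": [
        "environmental quality is difficult for the public to observe",
        "quality is not already covered by a respected third-party certification or auditing scheme",
    ],
}

def unmet_criteria(answers):
    """List the criteria not documented as met, grouped by market failure."""
    return {
        failure: [c for c in criteria if not answers.get(c, False)]
        for failure, criteria in THRESHOLD_CRITERIA.items()
    }

# Hypothetical answers for a WasteWise-like technical assistance program:
answers = {c: True for c in THRESHOLD_CRITERIA["technical assistance"]}
print(unmet_criteria(answers)["technical assistance"])  # [] -> threshold evidence in hand
```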
Level 2: Intervention-Outcome Assessment

Thorough inventory of program interventions and outcomes (quantified logic model).

Step 1: Information on interventions should include:

Level 2: Intervention-Outcome Assessment

Step 2: Information on outcomes should include:
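
The detailed intervention and outcome fields from these two slides are not reproduced here. Purely as a hypothetical illustration of what a "quantified logic model" inventory might look like in practice, the sketch below records one intervention and one outcome and links them; every field name and figure is an assumption, not content from the slides.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Intervention:
    name: str                  # program activity, e.g., technical assistance webinars
    annual_cost_usd: float     # resources devoted to the activity
    participants_reached: int  # scale of the activity

@dataclass
class Outcome:
    name: str                  # environmental or economic result being tracked
    unit: str                  # measurement unit, e.g., "tons/year"
    observed_change: float     # change reported by participants
    linked_interventions: List[str] = field(default_factory=list)

# Hypothetical entries for a WasteWise-like program:
webinars = Intervention("technical assistance webinars", 150_000.0, 400)
diversion = Outcome("waste diverted from landfill", "tons/year", 12_000.0,
                    linked_interventions=[webinars.name])

# The full inventory is one record per activity and per outcome, with the links
# between them made explicit so the logic model can be checked for alignment.
logic_model = {"interventions": [webinars], "outcomes": [diversion]}
print(logic_model["outcomes"][0].linked_interventions)  # ['technical assistance webinars']
```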

Level 3: Quasi-Experimental Design
  • Examples of quasi-experimental designs:
  • Sub-optimal comparison group: Compare participants and non-participants without statistical correction.
  • Regression discontinuity: Assign participants to a treatment or comparison group on the sole basis of a cutoff score on a pre-program measure.
  • Time series: Measure indicators of study group performance over time, with or without a comparison group (see the sketch after this list).
  • Outcome analysis: Measure changes in outcome variable(s) without accounting for external factors.
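
To illustrate the time-series design mentioned above, here is a minimal sketch of a before/after comparison with a comparison group (a difference-in-differences style calculation). All figures are hypothetical; a real analysis would use actual program reporting data and account for trends and other external factors.

```python
# Hypothetical annual waste generation (tons) before and after program launch.
participants_pre, participants_post = 1000.0, 820.0
comparison_pre, comparison_post = 1000.0, 950.0

participant_change = participants_post - participants_pre  # -180 tons
comparison_change = comparison_post - comparison_pre       # -50 tons

# Change among participants beyond what the comparison group experienced:
# a rough proxy for the portion of the improvement attributable to the program.
attributable_change = participant_change - comparison_change
print(f"Change attributable to the program: {attributable_change:.0f} tons")  # -130 tons
```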

IEc

INDUSTRIAL ECONOMICS, INCORPORATED

617.354.0074