Crossing Methodological Borders to Develop and Implement an Approach for Determining the Value of Energy Efficiency R&D Programs. Presented at the American Evaluation Association/Canadian Evaluation Society Joint Conference Toronto, Canada October 28, 2005 Scott Albert, GDS Associates

Presentation Transcript



Crossing Methodological Borders to Develop and Implement an Approach for Determining the Value of Energy Efficiency R&D Programs

Presented at the

American Evaluation Association/Canadian Evaluation Society Joint Conference

Toronto, Canada October 28, 2005

Scott Albert, GDS Associates

Helen Kim, NYSERDA

Rick Ridge, Ridge & Associates

Gretchen B. Jordan, Sandia National Laboratory



The NYSERDA Portfolio



R&D Budget Through 12/31/04



Objective

  • Develop and pilot-test an evaluation model for NYSERDA’s R&D program area covering 1998 through 2004 that recognizes:

    • R&D programs and their societal impacts are by nature difficult to evaluate.

    • The outcomes are subject to multiple and uncontrollable influences that are difficult to foresee.

    • The product development cycle spans 5 to 15 or even 20 years, and many of the energy and economic impacts of R&D projects may not be fully realized and measured for many years.

    • Given the multiple and compounding effects that occur along the way, it is also very difficult to attribute impacts precisely to any one particular effort.

    • When evaluating an entire portfolio of R&D projects, objectives and outcomes vary by project.



R&D Portfolio Logic Model



Six Stages of the R&D Model

  • Information for policy makers and R&D community

  • Product development stage 1 – study and prove concepts

  • Product development stage 2 – develop new or improved products

  • Product development stage 3 – product testing

  • Demonstration

  • Pre-deployment



The Value/Cost Method Combines Two Approaches

  • Aggregate approach

    • Analyzed data collected for each of NYSERDA’s 638 R&D projects (since 1998) in the portfolio.

    • Basic statistics, such as the number of projects, expenditures by technology type, leveraged funds, and the stage of development were calculated to describe the entire R&D portfolio.

  • Peer Review

    • Analyzed using an adaptation of the Composite Performance Rating System (CPRS) used to evaluate the U.S. Department of Commerce’s Advanced Technology Program (ATP).

    • The peer review approach was applied to a small sample of successful R&D projects, covering each of the six R&D stages (project types).



ATP: Composite Performance Rating System Constructed Bottom-up; Used Top-down

[Diagram: individual project case studies (Project 1 … Project n) each receive a CPRS score; the scores roll up into a performance distribution for the portfolio, broken out by technology area, firm size, location, etc. Portfolio-level outputs: unique cases, aggregate statistics, composite scores, performance distributions, minimum net portfolio benefits. Source: ATP method, R. Ruegg, Nov. 2002.]



Aggregate Analysis

  • Expanded and updated the R&D database to carry out a comprehensive descriptive analysis of the entire R&D portfolio.

  • Variables Considered

    • Funding

    • Technology Area

    • Co-Funding Entity

    • Project Status

    • Expected Benefits from R&D Projects



Questions Addressed by Aggregate Analysis

  • How does NYSERDA funding per project vary by project type?

  • How does NYSERDA funding per project vary by program?

  • What is the frequency of the various project types?

  • What goals are being served by the various project types?

  • What are the primary goals served by the portfolio?

  • What are the sources of funding, by project type?

  • What is the funding share contributed by partners?

  • How do NYSERDA funding and co-funding vary by project type over time?

  • How does the mix of technologies and issues examined change over time?
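Questions like these reduce to grouped descriptive statistics over the project database. A minimal sketch of that aggregation in Python, using hypothetical records and field names (not NYSERDA’s actual schema):

```python
from collections import defaultdict

# Hypothetical project records; field names and amounts are illustrative,
# not NYSERDA's actual database schema.
projects = [
    {"type": "Demonstration",  "nyserda_funding": 250_000, "cofunding": 900_000},
    {"type": "Demonstration",  "nyserda_funding": 150_000, "cofunding": 600_000},
    {"type": "Pre-deployment", "nyserda_funding": 100_000, "cofunding": 500_000},
]

def funding_by_type(records):
    """Total and average NYSERDA funding, plus co-funding leverage, by project type."""
    groups = defaultdict(lambda: {"nyserda": 0, "cofunding": 0, "n": 0})
    for r in records:
        g = groups[r["type"]]
        g["nyserda"] += r["nyserda_funding"]
        g["cofunding"] += r["cofunding"]
        g["n"] += 1
    return {
        t: {
            "total_nyserda": g["nyserda"],
            "avg_per_project": g["nyserda"] / g["n"],
            # Ratio of partner funds to NYSERDA funds (4.3-to-1 portfolio-wide per the deck)
            "leverage_ratio": g["cofunding"] / g["nyserda"],
        }
        for t, g in groups.items()
    }

stats = funding_by_type(projects)
```

The same grouping pattern answers the frequency, goal, and funding-share questions by swapping the grouping key or the aggregated field.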



Results: Aggregate Analysis



NYSERDA Funding, by Project Type



Funding by Goals



Co-Funding Sources



Percent of Projects by Technology and Year



Peer Review Focused on Six Success Stories as a Pilot Test



Indicator Variables

  • The choice of indicator variables for the R&D portfolio was guided by the R&D portfolio logic model.

  • Six categories of outcomes identified in the logic model were selected:

    • Knowledge creation,

    • Knowledge dissemination,

    • Commercialization progress,

    • Energy benefits,

    • Economic benefits, and

    • Environmental benefits.



Accomplishment Packets

  • Project-specific accomplishment packets were then developed to document objective evidence regarding the six outcomes:

    • Knowledge creation

    • Knowledge dissemination

    • Commercialization progress

    • Realized and potential energy benefits

    • Realized and potential economic benefits

    • Realized and potential environmental benefits

    • Value versus cost (not a specific outcome, but this item was also included in the peer-reviewer response packet for a 0-to-4 rating)
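The 0-to-4 ratings across the outcome categories can then be rolled up into a single project-level score, as in the results slides. A minimal sketch of such a weighted rating; equal weights are assumed here, since the presentation does not specify the weighting used in the CPRS adaptation:

```python
# The six outcome categories from the logic model; the weighting scheme is
# hypothetical -- the presentation does not give the weights actually used.
OUTCOMES = [
    "knowledge_creation", "knowledge_dissemination", "commercialization_progress",
    "energy_benefits", "economic_benefits", "environmental_benefits",
]

def weighted_rating(ratings, weights=None):
    """Combine per-outcome 0-to-4 ratings into one project-level score."""
    if weights is None:
        weights = {o: 1.0 for o in ratings}  # equal weighting assumed
    total_weight = sum(weights[o] for o in ratings)
    return sum(ratings[o] * weights[o] for o in ratings) / total_weight

# Example: a project rated 3 on every outcome except commercialization progress.
project_ratings = {o: 3.0 for o in OUTCOMES}
project_ratings["commercialization_progress"] = 4.0
score = weighted_rating(project_ratings)  # (5*3 + 4) / 6
```

Swapping in unequal weights lets the portfolio manager emphasize, say, energy benefits over knowledge dissemination without changing the rest of the pipeline.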



Review Process

  • Reviewers willing to participate were sent:

    • Peer Review Instructions,

    • Conflict of Interest Form,

    • Peer Review Assessment Form, and

    • the Peer Review Information Packet for their specific project.

  • Over a period of five weeks, the reviewers completed their assessments and returned them for data entry.



Results: Peer Review



Weighted Rating By Project



Overall Ratings by Outcome



Overall Ratings, by Project, by Outcomes



Conclusions: Aggregate Analysis

  • The NYSERDA R&D portfolio assumes more risk than the commercial sector in the earlier stages of technology development, while in the later stages the reverse is true.

  • Covers a wide range of technologies that are aimed at achieving potentially significant energy, economic and environmental benefits.

  • Leverages funds at a 4.3-to-1 ratio.

  • Partners with a wide range of public and private organizations and institutions.

  • Evolves over time in response to societal needs and opportunities to address them (i.e., the technologies and issues addressed in the R&D portfolio are not static).



Conclusions: Peer Review

  • Peer review scores from the pilot test averaged 3.34 (on a 0-to-4 scale) across all assessment categories.

  • There are substantial benefits across all documented accomplishment areas for the five projects assessed.

  • Significant progress is being made toward the eventual achievement of measurable 3-E (energy, economic, and environmental) benefits.



Conclusions: Peer Review Process

  • The information provided in the review packets for the five selected projects was adequate.

  • The instructions provided were clear.

  • The criteria used in the assessments were clearly defined.

  • The criteria used in the assessments were the right ones.

  • It is very important for NYSERDA to assess the value of its R&D programs.

  • The results of the peer review process should be useful for NYSERDA decision-makers.

  • Reviewers can assess a fair amount of information if the information is presented in a clear and organized format.

  • Statistical analyses revealed that the ratings provided by the peer reviewers were reliable.
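The presentation does not say which statistics were used to check reliability; one common choice for inter-rater consistency is Cronbach’s alpha, treating each reviewer as an “item.” A minimal sketch with made-up ratings:

```python
from statistics import pvariance

def cronbach_alpha(ratings_by_reviewer):
    """Cronbach's alpha, treating each reviewer as an 'item' and each rated
    project (or criterion) as an observation. Values near 1 indicate that
    reviewers score the projects consistently."""
    k = len(ratings_by_reviewer)
    item_variances = sum(pvariance(r) for r in ratings_by_reviewer)
    totals = [sum(col) for col in zip(*ratings_by_reviewer)]  # per-project sums
    return (k / (k - 1)) * (1 - item_variances / pvariance(totals))

# Made-up 0-to-4 ratings: three reviewers scoring the same five projects.
reviewers = [
    [3, 4, 2, 4, 3],
    [3, 4, 3, 4, 3],
    [2, 4, 2, 3, 3],
]
alpha = cronbach_alpha(reviewers)  # close agreement pushes alpha toward 1
```

An intraclass correlation coefficient would be an equally plausible choice; the point is only that reviewer agreement can be quantified from the completed assessment forms.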



Next Steps

  • Routinize the collection of key indicator data for all R&D projects.

  • Perform aggregate analysis on all projects.

  • Focus significant effort on a more representative sample of projects.

