
Understanding the Elements of Success: Findings from the National Energy Efficiency Programmatic Best Practices Study


Presentation Transcript


  1. Understanding the Elements of Success: Findings from the National Energy Efficiency Programmatic Best Practices Study. CALMAC Meeting, December 15, 2004. Kenneth James, PG&E, Study Manager; Mike Rufo, Quantum Consulting, Prime Contractor

  2. Project Advisory Committee: Kenneth James – PG&E; Pierre Landry – SCE; Rob Rubin – SDG&E; Jay Luboff – CPUC; Eli Kollman – CPUC; Sylvia Bender – CEC. Project Team: Mike Rufo – Quantum; Marissa Meyers – Quantum; Phil Willems – Quantum; Jane Peters – Research Into Action; Bruce Mast – Frontier Associates; Megdal & Associates; Shel Feldman Management Consulting

  3. Presentation Overview • Project Background • Example Best Practices Study Findings • Best Practices Website and Products • Next Steps

  4. Benchmarking. “Benchmarking is the process of identifying, sharing, and using best practices to improve business processes.” (Source: American Productivity and Quality Center) “Benchmarking is simply about making comparisons with other organizations and then learning the lessons that those comparisons reveal.” (Source: The European Benchmarking Code of Conduct) Benchmarking almost always occurs as a collaborative process in which participants share information, typically about business processes, with the intention of identifying excellence and developing an understanding of how it is achieved. Benchmarking addresses the question, “How can one improve by learning from others?” It is inextricably tied to the concept of best practices, also called business excellence or exemplary practices.

  5. Best Practices. The term “best practice” refers to the business practice that, when compared to other practices addressing a similar business process, produces superior results. Best practices are documented strategies and tactics employed by successful organizations and programs. Rarely is an organization or program “best-in-class” in every area; our focus is therefore not on identifying the best programs or organizations but on the best practices that exist within and across programs. Best practices are identified through in-depth interviews with program managers, thorough review of program documents, analysis of secondary sources, and comparison of program features and outcomes. The focus of this Study is on best practices that can be generalized and have a high likelihood of transferability to other programs within or across program categories.

  6. Background

  7. A Few Key Questions Addressed by this Best Practices Project: • What EE program design, implementation, and management practices are entities currently using? • How effective are they? • Where is there room for performance improvement? • How will this knowledge assist in meeting the challenges and opportunities of California’s new EE environment?

  8. Program Components

  9. Program Area Reports (13). On Beta Website: • Residential: Lighting, HVAC, Single-Family Comprehensive, Multi-Family Comprehensive, Audits, New Construction • Nonresidential: Lighting/Turnkey, Large Comprehensive. In Review: • Nonresidential: HVAC, Trade Ally Training, New Construction • Other: Mass Market Advertising. Still in Preparation: • Residential Appliances

  10. Study Products: Program Area White Paper Reports. Report Content: • Summary of Findings • Overview of Programs (5–10 each, 80+ total) • Policy/Historic Context & Issues • Feature Benchmarking and Best Practices • Comparison of Outcomes

  11. Example of Range of Programs Covered – Large Nonresidential Comprehensive Incentives Program Area • CA’s Nonresidential Standard Performance Contract • NYSERDA’s Energy $mart™ C/I Performance • United Illuminating’s Energy Opportunities • BC Hydro’s Power Smart • Xcel Energy’s Custom Efficiency (Colorado) • Northeast Utilities’ Custom Efficiency • Massachusetts Electric’s Energy Initiative • Alliant Energy’s Energy Shared Savings • Efficiency Vermont’s Business Energy Services • SMUD’s Commercial & Industrial Custom Retrofit

  12. Study Products: Program Area White Paper Reports Program Profiles • Program Synopsis • Program Focus • Program Context • Program Components • Key Sources • Contact Information

  13. Sample Program Design BPs (Generally Cross-Cutting) • Articulate a program plan/theory that states expectations, timing, and approach • Link the strategic approach to policy objectives and constraints • Build feedback loops into program design and logic • Do not over-promise results • Understand local market conditions • Conduct sufficient market research • Maintain program design flexibility to respond to changes in the market & other factors • Put the process plan (including program management) in writing • Tailor the program to the unique needs of the targeted market sector • Define & locate hard-to-reach customers & target programs accordingly, as appropriate

  14. Sample Program Management BPs (Generally Cross-Cutting) • Clearly define program management responsibilities to avoid confusion over roles • Develop and maintain clear lines of communication • Use well-qualified engineering staff (for technical programs) • Motivate field staff and service providers • Maintain consistency in personnel over time • Delegate responsibility based on risk versus reward • Make sure at least some of the institutional memory resides in-house, not with subcontractors • Reward high-performing staff and link performance evaluations to tangible measures that are known in advance and developed jointly by the manager and employee

  15. Sample Reporting & Tracking BPs (Generally Both Program-Specific and Cross-Cutting) • Define & identify key information early in the program process • Clearly articulate data requirements for measuring program success • Balance the level of tracking planned against resource availability • Design the system to support the requirements of evaluators • Use the Internet to facilitate data entry & reporting; build in real-time data validation systems that perform routine data quality functions (see the sketch after this list) • Automate routine functions as much as is practical • Develop electronic application processes • Develop accurate algorithms & assumptions for savings estimates • Conduct regular checks of reports to assess program performance • Document the tracking system & provide manuals for all users
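
A minimal sketch of the kind of real-time validation routine the data-entry bullet above describes, run at the moment a record enters the tracking system. The field names, required-field list, and rebate cap are illustrative assumptions, not details from the study.

```python
def validate_record(record: dict) -> list:
    """Return a list of validation errors for one tracking-system entry."""
    errors = []
    # Required-field check catches incomplete applications at entry time.
    required = ("project_id", "measure_code", "est_kwh_savings", "rebate_amount")
    for field in required:
        if field not in record or record[field] in ("", None):
            errors.append(f"missing required field: {field}")
    # Simple range checks catch keying errors before they reach evaluators.
    if record.get("est_kwh_savings", 0) < 0:
        errors.append("estimated kWh savings cannot be negative")
    if record.get("rebate_amount", 0) > 100_000:  # illustrative program cap
        errors.append("rebate amount exceeds program cap; flag for review")
    return errors

# Example: a record missing its measure code fails validation immediately.
print(validate_record({"project_id": "P-001", "est_kwh_savings": 12000,
                       "rebate_amount": 1500}))
```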

  16. Sample QC & Verification BPs (Mostly Program-Specific) • Develop inspection & verification procedures during program design • Base quality control on the program’s relationship with vendors, the number of vendors involved, types of measures, volume, and variability of project size • Use measure product specifications in requirements & guidelines • Require pre-inspections for large or uncertain-impact projects • Require the builder or a representative to be on site during inspection • Verify the accuracy of rebates, coupons, and invoices to ensure the reporting system records actual product installations • Assure product quality through independent testing procedures • Build statistical features into the sampling protocol (see the sample-size sketch after this list) • Treat inspection visits as partnership-building & learning events • Ensure inspectors have ample hands-on construction experience • Assess customer satisfaction with the product through evaluation
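
One way to build statistical features into an inspection sampling protocol, as the list above suggests, is to size the inspection sample for a target confidence and precision. Below is a minimal sketch using the standard sample-size formula for a proportion with a finite-population correction; the 90/10 confidence/precision target and the population size in the example are illustrative assumptions.

```python
import math

def inspection_sample_size(population: int, z: float = 1.645,
                           p: float = 0.5, e: float = 0.10) -> int:
    """Sample size for estimating a verification rate (a proportion).

    z: z-score for the confidence level (1.645 ~ 90% confidence)
    p: assumed proportion (0.5 is the conservative worst case)
    e: target precision (half-width of the confidence interval)
    """
    n0 = (z ** 2) * p * (1 - p) / (e ** 2)       # infinite-population size
    n = n0 / (1 + (n0 - 1) / population)          # finite-population correction
    return math.ceil(n)

# Example: of 400 completed projects, inspect 58 to hit 90/10 precision.
print(inspection_sample_size(400))  # -> 58
```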

  17. Program Implementation BPs (Mostly Program-Specific) • Keep participation simple; offer a single point of contact for customers • Develop participation strategies that are multi-pronged & inclusive • Review & understand product availability before establishing eligibility • Make program participation part of an existing, routine transaction • Use the Internet/electronic means to facilitate participation • Avoid overcommitting to projects before design parameters are known • Set incentive levels to maximize net, not gross, program impacts (see the net-to-gross sketch after this list) • Use incremental costs to benchmark and limit payments • Tie rebates for popular measures to those less likely to be considered • Limit or exclude incentive payments to known free riders • Tie incentives to building performance • Offer low-interest loans or financing as leverage on incentives • Use disincentives for savings inflation in performance-based options
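
The net-versus-gross distinction above has standard arithmetic behind it: net savings discount gross savings for free-ridership and credit spillover. A minimal sketch follows; the free-ridership and spillover rates in the example are illustrative assumptions, not study figures.

```python
def net_savings(gross_kwh: float, free_ridership: float,
                spillover: float) -> float:
    """Apply a net-to-gross (NTG) ratio: NTG = 1 - free_ridership + spillover."""
    ntg = 1.0 - free_ridership + spillover
    return gross_kwh * ntg

# Example: 1,000,000 gross kWh with 20% free-ridership and 5% spillover
# yields 850,000 net kWh (NTG = 0.85).
print(net_savings(1_000_000, free_ridership=0.20, spillover=0.05))
```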

  18. Program Marketing BPs (Mostly Program-Specific) • Use the Energy Star logo to instill consumer confidence and utility credibility • Leverage cities, community-based organizations, & other programs • Include adequate retail outreach & support to ensure the product is stocked & advertised & that point-of-purchase materials are accurate & clear • Develop & disseminate case studies to showcase program projects • Use targeted marketing strategies to ensure that hard-to-reach populations are informed about available programs and options • Use face-to-face marketing, where possible, especially for small businesses • Give builders the opportunity to participate in developing the message • Market to multiple departments within volume-building organizations • Provide trade allies with training & resources to enhance marketing • Sell the customer benefits first, then energy efficiency • Keep benefits quantifiable in economic terms; promote life-cycle cost • Take advantage of external factors (e.g., heat waves)

  19. Program Evaluation BPs (Generally Both Program-Specific and Cross-Cutting) • Engage the implementation team in the evaluation process • Create a culture in which evaluation findings are valued and integrated into program management • Present actionable findings in real time as well as at the end of the study • Stagger the timing of process and ex post impact tasks so that process results are communicated on a relatively real-time basis • Conduct impact evaluations routinely, but not necessarily annually • Include periodic estimation of free-ridership and spillover • Use regular process evaluation to provide timely and fresh results • Periodically review & update market information on construction practices, EE market share, measure adoption, and trends • Perform detailed market assessments, particularly for market transformation (MT) programs • Support program review & assessment at the most comprehensive level possible

  20. The Best Practices Benchmarking Website

  21. Communication of Results Model (diagram): • Top-line BP list • BP rationales, whitepapers, & gap analysis • Comprehensive program profiles, all performance benchmarks • Complete documentation of methods, data collection processes, and results

  22. Home Page – Best Practice Website

  23. Find A Study

  24. Links to Program Areas

  25. Search Results

  26. Website Products • Program Area Chapter Reports • Description of Report • Full Chapter Report (PDF) • Executive Summary (PDF) • Best Practices Table (PDF) • Program Profiles • Description of Program Profile • Program Profile Report (PDF)

  27. Next Steps • Gather feedback from the Beta Review Group and Project Advisory Committee (PAC) • Make revisions in January, after review comments are considered • Load final chapters onto the website in January for use in 2005–08 portfolio planning • Conduct further analysis of outcomes ($/kWh saved; see the sketch below)
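
For readers unfamiliar with the $/kWh-saved outcome metric flagged above, a minimal sketch follows of one common formulation, the levelized cost of saved energy, which annualizes program cost over an assumed measure life. The measure life, discount rate, and dollar figures here are illustrative assumptions, not study results.

```python
def cost_per_kwh_saved(program_cost: float, annual_kwh: float,
                       measure_life_yrs: int = 12,
                       discount_rate: float = 0.05) -> float:
    """Levelize a one-time program cost over the measure life,
    then divide by annual kWh savings."""
    # Capital recovery factor converts a one-time cost to an annual payment.
    crf = (discount_rate * (1 + discount_rate) ** measure_life_yrs /
           ((1 + discount_rate) ** measure_life_yrs - 1))
    return program_cost * crf / annual_kwh

# Example: a $2M program saving 10 GWh/yr costs about $0.023 per kWh saved.
print(round(cost_per_kwh_saved(2_000_000, 10_000_000), 4))  # -> 0.0226
```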
