
2004 IAC Director’s Meeting




Presentation Transcript


  1. 2004 IAC Director’s Meeting
     Drew McMahan, Oregon State University IAC

  2. Implementation Analysis (from Ellie Najafi, Bradley University)
     • Understanding Implementation Rates
       • What ARs have the highest/lowest implementation rates?
       • What are the correlations between ARs and rejection codes? (see the sketch after this slide)
     • Using Knowledge to Improve Report Performance
       • Include the client in a brainstorming session
       • Provide additional training to IAC staff
       • Include decision makers in the final exit interview
       • Modify rejection codes
       • Modify the format of the final report
       • Provide client recognition for high implementation rates
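  One way to approach the two questions above is to tabulate implementation rates per AR code and then cross-tabulate unimplemented ARs against their rejection codes. The following is a minimal sketch only: the column names (ar_code, implemented, rejection_code) and the toy data are assumptions for illustration, not the actual IAC database schema.

     # Minimal sketch, assuming a flat table of assessment recommendations.
     # Column names (ar_code, implemented, rejection_code) are hypothetical.
     import pandas as pd

     def implementation_rates(ars: pd.DataFrame) -> pd.Series:
         """Fraction of recommendations implemented, per AR code."""
         return ars.groupby("ar_code")["implemented"].mean().sort_values()

     def rejection_breakdown(ars: pd.DataFrame) -> pd.DataFrame:
         """Cross-tabulate AR codes against rejection codes for
         recommendations that were not implemented."""
         rejected = ars[~ars["implemented"]]
         return pd.crosstab(rejected["ar_code"], rejected["rejection_code"])

     # Toy example (values are illustrative only):
     ars = pd.DataFrame({
         "ar_code":        ["2.4231", "2.4231", "2.7142", "2.7142", "2.7142"],
         "implemented":    [True, False, False, True, False],
         "rejection_code": [None, "capital", "payback", None, "payback"],
     })
     print(implementation_rates(ars))   # lowest vs. highest implementation rates
     print(rejection_breakdown(ars))    # which rejection codes dominate each AR

  The same two tables would support several of the follow-up actions on the slide, such as targeting additional IAC staff training at the AR types with the lowest rates.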

  3. The OSU IAC Response
     • Active Follow-Up
       • Facilitate report distribution and understanding
       • Track and respond to clients’ impressions of the report

  4. Best Practices Tools
     • http://www.oit.doe.gov/bestpractices/software_tools.shtml
     • Current Tools
       • Airmaster+
       • MotorMaster+
       • 3E+
       • PSAT, FSAT
       • SSAT, NxEAT
     • Future Tools
       • CHP Tool
       • Combined Tool Suite
     • IAC Utilization of BP Tools
     • Internal IAC Tool Exchange(?)

  5. CHP
     • CHP Tools
       • CogenPro: http://www-rohan.sdsu.edu/~eadc/cogenH.html
       • DOE CHP Tool, Resources: http://www.eere.energy.gov/de/information/info_tools.shtml
     • Other Tools
     • RACs – Regional Application Centers
     • RAC-IAC Synergy
     • UIC Case, Future Goals
     • AR Payback Penalty

  6. Database Metrics (from Rutgers)
     • Direct Metrics (see the sketch after this slide)
       • Average Implemented Energy $ Saved
       • Average Implemented Energy $ Saved / Usage
       • Average Implemented Energy / Usage in Btus
       • Average Implemented $ Saved
       • Average Implemented Energy MMBtus Saved
     • Indirect Metrics
       • Total Number of Different ARs
       • Recommended $ Saved per Assessment Day
       • Total Number of Energy ARs
       • Total Number of Non-Energy ARs
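  The direct metrics are averages of per-assessment quantities. The sketch below shows one way they could be computed from a table with one row per assessment; the field names (impl_energy_savings_usd, plant_energy_cost_usd, impl_energy_savings_mmbtu, plant_energy_usage_mmbtu, impl_total_savings_usd) are assumptions, not the actual Rutgers database schema.

     # Minimal sketch, assuming one row per assessment with hypothetical fields.
     import pandas as pd

     def direct_metrics(a: pd.DataFrame) -> pd.Series:
         """Direct metrics from the slide, computed per assessment then averaged."""
         return pd.Series({
             "Avg Implemented Energy $ Saved":
                 a["impl_energy_savings_usd"].mean(),
             "Avg Implemented Energy $ Saved / Usage":
                 (a["impl_energy_savings_usd"] / a["plant_energy_cost_usd"]).mean(),
             "Avg Implemented Energy / Usage (Btu)":
                 (a["impl_energy_savings_mmbtu"] / a["plant_energy_usage_mmbtu"]).mean(),
             "Avg Implemented $ Saved":
                 a["impl_total_savings_usd"].mean(),
             "Avg Implemented Energy MMBtus Saved":
                 a["impl_energy_savings_mmbtu"].mean(),
         })

  The indirect metrics (counts of energy and non-energy ARs, recommended $ saved per assessment day) would be straightforward counts and ratios over the same table.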

  7. Programmatic Issues (PI)
     • Industrial Coverage
     • Geographic Coverage
     • Number of IAC Students Employed
     • Average Number of IAC Students Attending Assessments
     • Teaching Faculty Participation
     • Timeliness
     • Performance Adjustment Factors (PAFs)

  8. Peer Review (from Director’s Roundtable)
     • Facilitate report improvement
     • Independent of DOE metrics
     • PDF Exchange?
