
Database Issues and a Few Other Things



  1. Database Issues and a Few Other Things

  2. Michael R. Muller • Professor of Mechanical Engineering at Rutgers • Director of IAC at Rutgers: 1986-1991 • Eastern Region Field Manager – IAC 1992 to present • Eastern region handles database for entire program • My particular interests are: • On-site power generation • Supplier development programs • Developing Protocols for Industrial “Triage”

  3. Industrial “Triage” • Strategy for ranking cost-saving proposals • Replaces a flat list of “good ideas,” which precludes ranking • Requires cost estimates and project impacts • Must develop a connection to productivity • The “soft number problem” • We do it already – there is not enough time to go after all of the good ideas at a plant • Need to focus quickly on high-yield targets

  4. Program Metrics • Each year the IAC program’s performance at DOE is evaluated (GPRA – the Government Performance and Results Act of 1993) • IACs are graded on their performance as well • Measurable statistics are called metrics (the current buzzword) • Directors know about this, but students may not • Students are important parts of a center’s overall performance • You need to track your own metrics

  5. Program Metrics • East and West regions differ, but details are available • Examples here are for the East!

  6. Overall Ranking Criteria • Report Quality • Assessment Performance • Day to Day Operation • Contract Compliance • Database Metrics

  7. Eastern Program Metrics

  8. Report Quality • Presentation • Clarity • Creativity • Knowledge • Errors • Students do much of the proofreading • We still see upside-down pages! • Lead students will set the tone in the office and will have a great impact on the rest of the team

  9. Assessment Performance • Based on site visits and client follow-ups • Some of the criteria: • Attitude • Technical Knowledge • Interview • Plant inspection • Student participation • Faculty leadership • Student safety • Instrumentation

  10. Day to Day Operation • Availability / responsiveness • Database Communications • Extra money handling • Faculty participation • Student Utilization

  11. Contract Compliance • Timeliness (# of late items submitted) • Big student impact here • Coverage (both measures based on the last 75 reports; see the sketch below) • Industry coverage (based on # of different 2-digit SIC codes) • Geographical coverage (based on # of different 3-digit ZIP codes) • Other violations • More than 3 exceptions • Non-compliant quarters
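
A minimal sketch of how the two coverage counts could be computed from the downloaded assessment file using the third-party dbfread package (pip install dbfread). The file name and the CENTER, REPORT, SIC, and ZIP field names are assumptions; check the online manuals for the real schema.

```python
from dbfread import DBF

def coverage(assess_dbf: str, center: str):
    """Return (# distinct 2-digit SICs, # distinct 3-digit ZIP prefixes)
    over a center's last 75 reports."""
    records = sorted((r for r in DBF(assess_dbf) if r["CENTER"] == center),
                     key=lambda r: r["REPORT"])[-75:]
    industries = {str(r["SIC"])[:2] for r in records}   # industry coverage
    regions = {str(r["ZIP"])[:3] for r in records}      # geographic coverage
    return len(industries), len(regions)

# print(coverage("assess.dbf", "RU"))
```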

  12. Database Metrics (last 25 reports; weighting sketched below) • Implemented Dollar Savings (35%) • Implemented Energy Savings (15%) • Total # of different ARs (20%) • Recommended Dollars per Assessment (15%) • Total # of Energy ARs (7.5%) • Total # of Non-energy ARs (7.5%)
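
A minimal sketch of how these weights combine into a single database score, assuming each component has first been normalized to a common 0–100 scale. The normalization step and the names below are illustrative, not the official formula.

```python
# Weights from the slide; keys are illustrative names, not database fields.
WEIGHTS = {
    "implemented_dollar_savings": 0.35,
    "implemented_energy_savings": 0.15,
    "distinct_ar_count": 0.20,
    "recommended_dollars_per_assessment": 0.15,
    "energy_ar_count": 0.075,
    "non_energy_ar_count": 0.075,
}

def database_score(normalized: dict) -> float:
    """Combine normalized (0-100) component scores using the slide's weights."""
    return sum(WEIGHTS[name] * normalized[name] for name in WEIGHTS)

# Example: a center at the midpoint on every component scores 50.
print(database_score({name: 50.0 for name in WEIGHTS}))
```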

  13. Database Metrics • Students need to track metrics at their own center • Should also track their performance against other centers • This is very doable, but it requires a student who becomes experienced with databases • Students already upload most of the data to RU • Database and manuals are online at Rutgers • oipea.rutgers.edu • Or call Uncle Fred (Fred Glaser) at 732-445-5540

  14. Training Manuals Online

  15. The IAC Database • Structure: Two Main Files (sketched below) • Assessment Database • IAC and Report Number, Visit Dates • Plant size, Principal products, SIC • Number of Recommendations • Energy Streams: Cost and Usage • Recommendation Database • A Narrative Description of the Recommendation • ARC representing the recommendation • Resource Use and Costs Avoided • Costs to Implement • Status of Implementation
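
A minimal sketch of the two record types as Python dataclasses. The field names are assumptions drawn from the slide, not the actual .dbf column names.

```python
from dataclasses import dataclass, field

@dataclass
class Assessment:
    center: str                  # IAC code
    report_number: str
    visit_date: str
    sic: str                     # principal-product SIC code
    employees: int               # plant size
    annual_sales: float
    recommendation_count: int
    energy_streams: dict = field(default_factory=dict)  # stream -> (cost, usage)

@dataclass
class Recommendation:
    report_number: str           # links back to its Assessment record
    arc: str                     # Assessment Recommendation Code
    description: str             # narrative description
    costs_avoided: float         # resource use and costs avoided
    implementation_cost: float
    status: str                  # implementation status: 'I', 'N*', or 'P'
```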

  16. The IAC Database (cont.) • The database contains data from all assessments since 1981 • The data shows changes regionally and over time – very useful to many groups • Accessing the Database • http://oipea.rutgers.edu • Users can: • 1. Download the data files and perform queries on their own computers • 2. Use the online queries on the website

  17. Interactive Database Access • Client/server structure • Available on the RU website • Eliminates the need to download the database for simple queries • The current version uses MySQL v3.21 • Easiest way to check if your data is online • DOE looks at this – their way of keeping track

  18. Database Metrics • For many of the metrics used to judge center performance, the database must be downloaded and analyzed (see the sketch below) • Any database software will let you work with the data • Data is in the standard dBASE format (.dbf) • Most metrics are easy to calculate • Normally we sum over the last 25 assessments • The window therefore crosses contract years
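
A minimal sketch of one such metric, implemented dollar savings over the last 25 assessments, again using the dbfread package. The file names and the CENTER, REPORT, SAVED, and IMPSTATUS field names are assumptions; check the online manuals for the real schema.

```python
from dbfread import DBF

def last_n_reports(assess_dbf: str, center: str, n: int = 25) -> set:
    """Return the report numbers of a center's newest n assessments."""
    records = [r for r in DBF(assess_dbf) if r["CENTER"] == center]
    records.sort(key=lambda r: r["REPORT"])   # report numbers rise over time
    return {r["REPORT"] for r in records[-n:]}

def implemented_dollar_savings(rec_dbf: str, reports: set) -> float:
    """Sum implemented dollar savings over a set of report numbers."""
    return sum(r["SAVED"] for r in DBF(rec_dbf)
               if r["REPORT"] in reports and r["IMPSTATUS"] == "I")

reports = last_n_reports("assess.dbf", "RU")
print(implemented_dollar_savings("recc.dbf", reports))
```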

  19. Database Metrics • The trickiest measure is creativity • We count the different ARC numbers used in the last 100 assessments (see the sketch below) • A good number is about 100!
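
A sketch of the creativity count under the same assumed schema as the previous sketch: distinct ARC numbers across the recommendations tied to a center's last 100 assessments.

```python
from dbfread import DBF

def creativity(rec_dbf: str, last_100_reports: set) -> int:
    """Count distinct ARCs used across a given set of reports."""
    arcs = {r["ARC"] for r in DBF(rec_dbf) if r["REPORT"] in last_100_reports}
    return len(arcs)   # roughly 100 distinct ARCs is considered good

# reports = last_n_reports("assess.dbf", "RU", n=100)
# print(creativity("recc.dbf", reports))
```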

  20. Using the database as an assessment tool • IACs develop their own ways of doing things • The IAC database is just one tool • We recommend that students check the database before an assessment • Look for assessments with the same SIC code (or NAICS) • Find recommendations which were made – get ideas • Broaden the types of recommendations your IAC makes • My happiest moments as an IAC director were when students came up with good ideas • You will probably need to use the downloaded database (see the sketch below)
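
A sketch of that pre-assessment check under the same assumed schema: pull every recommendation made at plants sharing the upcoming client's 2-digit SIC code and rank the ARCs by frequency.

```python
from collections import Counter
from dbfread import DBF

def ideas_for_sic(assess_dbf: str, rec_dbf: str, sic_prefix: str) -> Counter:
    """Count how often each ARC was recommended for a 2-digit SIC code."""
    reports = {r["REPORT"] for r in DBF(assess_dbf)
               if str(r["SIC"]).startswith(sic_prefix)}
    return Counter(r["ARC"] for r in DBF(rec_dbf) if r["REPORT"] in reports)

# Most common ideas for paper mills (SIC 26):
# print(ideas_for_sic("assess.dbf", "recc.dbf", "26").most_common(10))
```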

  21. The 2-digit manufacturing SIC codes:
20 Food and Kindred Products
21 Tobacco Products
22 Textile Mill Products
23 Apparel and Other Textile Products
24 Lumber and Wood Products
25 Furniture and Fixtures
26 Paper and Allied Products
27 Printing and Publishing
28 Chemicals and Allied Products
29 Petroleum and Coal Products
30 Rubber and Miscellaneous Plastics Products
31 Leather and Leather Products
32 Stone, Clay, and Glass Products
33 Primary Metal Industries
34 Fabricated Metal Products
35 Industrial Machinery and Equipment
36 Electronic & Other Electric Equipment
37 Transportation Equipment
38 Instruments and Related Products
39 Miscellaneous Manufacturing Industries

  22. First Step – Check the interactive database • It will give the top ten recommendations • The IAC program looks at systems which are common to many industries • There are 22 2-digit SIC codes served by the IACs • How many different recommendations appear in the 22 top-ten lists? (max = 220, min = 10; see the sketch below)
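
A toy sketch of that overlap count: gather each industry's top-ten ARC list and count the distinct ARCs across all of them. Complete overlap gives 10; no overlap at all gives 220. The example lists here are invented.

```python
def distinct_across_top_tens(top_tens: dict) -> int:
    """Count distinct ARCs across all per-industry top-ten lists."""
    return len(set().union(*top_tens.values()))

# Two industries sharing one recommendation:
example = {"20": ["A", "B", "C"], "22": ["A", "D", "E"]}
print(distinct_across_top_tens(example))   # -> 5
```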

  23. "Industrial Energy Audit/ Assessments: Implications for Saving Energy in the Manufacturing Sector"AEE Congress - 1999 Michael R. MullerProfessor of Mechanical EngineeringRutgers University

  24. Only 21 Recommendations appear in any Top Ten

  25. SIC Searches • Best to use the downloaded database and look for more than the top ten • Often this will give good ideas • Examples: • Sand recommendations in metal casting • Recovery boilers in paper mills • Oxyfuel combustion in glass plants • High-accuracy metering pumps in chemical plants

  26. Understanding the bigger picture • Another benefit of the database is that it shows what plants and industries respond best to IAC services • How big is the typical plant? How many employees? What industries do we visit most often? • With the IOF (Industries of the Future) industries being the priorities, how are they represented in your region?

  27. Characterization of IAC Clients (chart): Average = $25 million; Median = $16 million

  28. Characterization of IAC Clients (chart): Average = 166; Median = 125

  29. Characterization of IAC Clients (chart): Average = $354k; Median = $196k

  30. Some Reminders • Change from “site” to “source” for electricity • Electrical energy consumption at the meter is not the total energy consumed • Losses from power plants and transmission are considerable • Grid efficiency is about 30% • The site-based practice led to roughly 3x undercounting of energy savings on electrical ARs (see the sketch below) • Before – uploads used MMBtu • Now – upload electrical consumption in kWh!
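
A worked sketch of the site-to-source correction. One kWh is 3,412 Btu at the site; at roughly 30% grid efficiency (generation plus transmission losses, per the slide), each delivered kWh represents about 3,412 / 0.30 ≈ 11,370 Btu of source energy, i.e. about 3.3x the site figure.

```python
SITE_BTU_PER_KWH = 3412     # energy content of 1 kWh at the meter
GRID_EFFICIENCY = 0.30      # approximate, per the slide

def source_mmbtu(kwh_saved: float) -> float:
    """Convert electrical savings in kWh to source MMBtu."""
    return kwh_saved * SITE_BTU_PER_KWH / GRID_EFFICIENCY / 1e6

print(source_mmbtu(100_000))   # 100,000 kWh -> ~1,137 source MMBtu
```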

  31. Implementation – what is pending? • Only three implementation codes are acceptable (a sketch of the rules follows): • I – Implemented: must carry a date no more than 1 year after the implementation interview and no more than 2 years after the assessment • N – Not Implemented, with a reason code • N1 through N22, excluding N12 & N13 • For FY01 we will not accept • N12 – to be implemented after 2 years • N13 – considering • P – Pending: no date, just a ‘P’ • For recommendations over $10,000, use ‘P’; you must then call the client back within 3 years of the audit, or the recommendation becomes Not Implemented • For recommendations under $10,000 – make a decision!
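
A sketch of these status rules as a validation check. The date thresholds and reason-code exclusions come from the slide; the function and parameter names are mine, not program code.

```python
from datetime import date
from typing import Optional

# N1-N22 are acceptable reason codes, except N12 and N13 for FY01.
VALID_N_CODES = {"N%d" % i for i in range(1, 23)} - {"N12", "N13"}

def status_ok(status: str,
              assessed: Optional[date] = None,
              interviewed: Optional[date] = None,
              implemented: Optional[date] = None) -> bool:
    if status == "I":
        # 'I' needs an implementation date no more than 1 year after the
        # implementation interview and no more than 2 years after the audit.
        return (None not in (assessed, interviewed, implemented)
                and (implemented - interviewed).days <= 365
                and (implemented - assessed).days <= 730)
    if status in VALID_N_CODES:
        return True        # not implemented, with an acceptable reason code
    # 'P' carries no date; for ARs over $10,000 the client must be called
    # back within 3 years of the audit or the AR becomes Not Implemented.
    return status == "P"

print(status_ok("I", assessed=date(2000, 5, 1),
                interviewed=date(2001, 3, 1),
                implemented=date(2001, 9, 1)))   # True
```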
