
Good Practice Performance Indicators



  1. Good Practice Performance Indicators. Engineering Practices Working Group, Performance Indicator Sub-Group. Engineering Performance Metrics. D. C. Lowe, October 21, 2003

  2. Performance Indicator Sub-Group Participants • Dave Lowe, CH2M HILL Hanford - Chair • Herb Mumford, Bechtel, INEEL • Barb Quivey, LLNL • Tom Monahon, Westinghouse Savannah River • Harry Gullet, Sandia National Laboratories

  3. Objectives • Identify a set of “good practice” example PIs from commercial nuclear and DOE sites grouped into general categories applicable to engineering that could be adopted or modified, as appropriate, for use by engineering groups within the DOE complex. • Demonstrate that engineering groups are practicing “good engineering.” • Identify where engineering should focus attention to satisfy customer needs. • Identify trends in equipment or system performance to focus resources correctly. • Monitor engineering costs and efficiency.

  4. Approach • Gather input/examples from INPO, commercial nuclear, and DOE sites. • Identify general categories that engineering groups typically monitor. • Evaluate input from participating sites/plants corresponding to general categories. • Provide good practice examples in each category.

  5. PI Example Contributors DOE Sites • Hanford Site (Tank Farms) • Savannah River Site • INEEL Commercial Nuclear Plants • Columbia Generating Station • Davis Besse • McGuire Nuclear Station • Watts Bar Nuclear Plant • Wolf Creek Nuclear

  6. Engineering PI Categories A. Product Quality/Technical Rigor B. Safety System Performance C. Configuration Control D. Production/Productivity E. Continuous Improvement F. Training and Qualification G. Engineering Cost

  7. Product Quality/Technical Rigor Good Practice Examples 1. Document Quality • Change package quality • Engineering technical adequacy 2. Human Performance • Organizational quality clock • Plant engineering personnel error rate 3. Rework • Unplanned change package revisions due to design errors

  8. Engineering Technical Adequacy

  9. Project Management Section Clock

  10. Plant Engineering Personnel Error Rate The Number of Significant Errors Recorded per 10,000 Hours Worked, 6 Mo. Rolling Average. Notes: There were no Human Performance Errors in the department for July. Definition: Any event that resets our department clock. Responsible Mgr: PJ Inserra. Data Provided By: T Neidhold (C Leon)
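The calculation behind this indicator is not spelled out on the slide, so the following is only a sketch under stated assumptions: significant errors are summed over a trailing six-month window, divided by the hours worked in that window, and scaled to errors per 10,000 hours. The monthly figures and names in the code are invented for illustration.

```python
# Hypothetical sketch of the error-rate indicator: significant errors
# normalized per 10,000 hours worked, smoothed with a 6-month rolling window.
# All monthly figures below are invented for illustration.

monthly = [
    # (month, significant_errors, hours_worked)
    ("Feb", 1, 42_000),
    ("Mar", 0, 40_500),
    ("Apr", 2, 44_000),
    ("May", 1, 43_200),
    ("Jun", 0, 41_800),
    ("Jul", 0, 45_000),  # slide notes no human performance errors in July
]

WINDOW = 6  # months in the rolling window

def rolling_error_rate(data, window=WINDOW):
    """Errors per 10,000 hours worked over the trailing `window` months."""
    rates = []
    for i in range(len(data)):
        recent = data[max(0, i - window + 1): i + 1]
        errors = sum(e for _, e, _ in recent)
        hours = sum(h for _, _, h in recent)
        rates.append((data[i][0], 10_000 * errors / hours))
    return rates

for month, rate in rolling_error_rate(monthly):
    print(f"{month}: {rate:.2f} significant errors per 10,000 hours")
```

An alternative reading is to average the six individual monthly rates rather than pooling errors and hours across the window; the slide does not say which convention the plant uses.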

  11. Unplanned Change Package Revisions Due to Design Errors

  12. Path Forward • Finalize “Safety System Performance” category • Sub-group review of additional categories and selection of good practice examples • Compile file report of good practice examples • Target completion: March 1, 2004

  13. Backup Slides

  14. Document Quality Change Package Quality This PI tracks the quality of change packages as determined from a review performed by the responsible supervisor. Strengths: • All the reviews and scoring are performed to a set of predetermined criteria. • The PI includes descriptive information on what is measured, how it is measured and what is considered good performance. • An action level is indicated on the chart. • The change package customer rates the quality of the product.

  15. Document Quality Change Package Quality Suggested Improvements: • The PI would benefit from a clear visual indication that shows the direction of increasing or decreasing performance. • The total population of work packages reviewed vs. the percentage meeting quality expectations would provide information that could be used to benchmark among different sites. • Information should be provided on how the one-year weighted average plot data are calculated. Major rating criteria should be stated.

  16. Document Quality Engineering Technical Adequacy This PI tracks the quality of engineering documentation as determined by a quality review group comprised of senior level engineers. Strengths: • All the reviews and scoring are performed to a set of predetermined criteria. • The PI includes a sub-chart that shows what attributes are reviewed and an average score for each attribute. • A goal is indicated on the chart. • A three-month rolling average is included so that trends are not masked by an individual month’s data.

  17. Document Quality Engineering Technical Adequacy Suggested Improvements: • The PI would benefit from a clear visual indication that shows the direction of increasing or decreasing performance. • The total population of documents reviewed vs. the percentage meeting quality expectations would provide information that could be used to benchmark among different sites. • Information should be provided on how the average document rating is calculated. Major rating criteria should be stated.

  18. Human Performance Project Management Section Clock This PI tracks the number of resets of the event free clock for a specific engineering group on a monthly basis. Strengths: • The direction of increasing or decreasing quality is indicated on the chart. • The PI includes descriptive information on what is measured and a color rating scale. • The performance goal is stated. • Monthly score (color) is directly indicated on the charted data.

  19. Human Performance Project Management Section Clock Suggested Improvements: • More definitive criteria should be included so that objective performance can be compared to other sites. This should include a reporting threshold and examples for events that would be counted. • A rolling average could be included so that trends are not masked by an individual month’s data. • If the indicator included information on the number of hours worked in a given month, data could be compared to other sites using the error rate method of tracking human performance.

  20. Human Performance Plant Engineering Personnel Error Rate This chart plots the rolling average personnel error rate per 10,000 hours worked for the specific engineering department. Strengths: • The direction of increasing or decreasing quality is clearly indicated by the color scale on the chart. • The performance goal “green” is shown on the chart. • Data are normalized (i.e., errors per 10,000 hours worked) so that information can be benchmarked among different sites. • A rolling average is used so that trends are not masked by an individual month’s data.

  21. Human Performance Plant Engineering Personnel Error Rate Suggested Improvements: • The chart should include descriptive information on the threshold for determining whether an error is significant. This was verbally provided as any entry into the plant corrective action system that had human error as a cause code and was attributed to the Plant Engineering Department. • The chart should include descriptive information on what is measured, why it is measured and how it is measured, as well as recovery actions when performance goals are not met.

  22. Rework Unplanned Change Package Revisions Due to Design Errors This PI tracks the percentage of work packages that must be revised because design errors were detected during the review performed by the responsible supervisor. Strengths: • The data are presented as a percentage vs. raw numbers so that information can be benchmarked among different sites. • The minimum performance goal is shown on the chart. • A rolling average is used so that trends are not masked by an individual month’s data. • The PI includes descriptive information on what is measured, how it is measured and what is considered good performance.
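As a concrete illustration of why the percentage form of this metric supports benchmarking, the sketch below computes a hypothetical month-by-month rework percentage and a pooled figure over the period. The counts are invented, and since the slide does not specify the rolling-average window, a simple pooled-over-period figure stands in for it here.

```python
# Hypothetical sketch of the rework indicator: the percentage of change
# packages revised because the supervisor's review found a design error.
# All counts below are invented for illustration.

monthly = [
    # (month, packages_reviewed, packages_revised_for_design_error)
    ("Apr", 38, 2),
    ("May", 45, 1),
    ("Jun", 41, 3),
    ("Jul", 50, 2),
]

def percent_revised(reviewed, revised):
    """Rework as a percentage of packages reviewed; a percentage (rather than
    a raw count) is what makes the number comparable across sites of
    different sizes."""
    return 100.0 * revised / reviewed if reviewed else 0.0

for month, reviewed, revised in monthly:
    print(f"{month}: {percent_revised(reviewed, revised):.1f}% revised for design errors")

# Pooled figure over the whole period, standing in for the rolling average
# mentioned on the slide (window length not specified).
total_reviewed = sum(r for _, r, _ in monthly)
total_revised = sum(v for _, _, v in monthly)
print(f"Period average: {percent_revised(total_reviewed, total_revised):.1f}%")
```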

  23. Rework Unplanned Change Package Revisions Due to Design Errors Suggested Improvements: • The PI would benefit from a clear visual indication that shows the direction of increasing or decreasing performance. • The total population of work packages reviewed vs. the percentage meeting quality expectations would provide information that could be used to benchmark among different sites. • Criteria should be provided for work package grading and for determining when a design error is significant enough to require a package revision versus an inconsequential change.
