
Database Committee Report January 2012 (Data from 2011)






Presentation Transcript


  1. Database Committee Report January 2012 (Data from 2011) San Diego, CA, January 10-11, 2012 R. Darrell Taylor, Raytheon

  2. Executive Summary • Not all 2011 data is in the Database (approx. 65%) • Reviewed 12-month rolling numbers

  3. Agenda • CRB Activity • OASIS Entries – Audits • OASIS Nonconformities • AS9100 Elements • Feedback

  4. CRB Activity

  5. CB Scores as of 1/1/2012 using the Performance Metric

  6. 8D Approach • Threshold Violations: (This section will contain a list and description of the thresholds that were violated. Note: it is not about the individual thresholds; the Corrective Action is being issued because there are multiple threshold violations, which may indicate a systemic issue with your process.)

  7. 8D Approach • Identify Team: (Because of the nature of this corrective action, we recommend that it not be an individual’s responsibility; this block should identify who was involved in the process.) • Describe the Problem: (Given the information provided, we recommend formulating a problem description based on what you know about your process and the systemic issues identified above.)

  8. 8D Approach • Implement and Verify Short-Term Corrective Action: (This section should identify what is being done immediately to solve the problem, and what was done to verify that those actions were put in place.) • Define and Verify Root Cause: (This section should identify the Quality Science tools used to investigate the problem and how you verified that the root cause does in fact exist.)

  9. 8D Approach • Verify Corrective Actions: (This section should contain a look back at the already-implemented containment corrective actions and re-evaluate them in light of the root cause analysis.) • Implement Permanent Corrective Actions: (This section should contain the corrective action plan, with milestones, ownership, and expected outcomes, including a verification and validation plan.)

  10. 8D Approach • Prevent Recurrence: (This section should contain the actions taken to prevent recurrence, with milestones, ownership, and expected outcomes, including a verification and validation plan.) • Future Steps: (This section should contain the steps being taken for the future, monitoring the process.)
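As a reading aid, and not part of the original deck, the 8D sections outlined on slides 6-10 can be sketched as one structured record. A minimal sketch in Python; every field name below is an illustrative assumption, not an OASIS or Raytheon schema:

```python
from dataclasses import dataclass, field

@dataclass
class EightDReport:
    """One record per 8D corrective action, mirroring slides 6-10."""
    threshold_violations: list[str]   # thresholds violated; the CA targets the multi-violation pattern
    team: list[str]                   # Identify Team: who was involved, not a single individual
    problem_description: str          # Describe the Problem: the suspected systemic issue
    containment_actions: list[str]    # Implement and Verify Short-Term Corrective Action
    root_cause: str                   # Define and Verify Root Cause (tools used, verification)
    permanent_actions: list[str] = field(default_factory=list)   # Verify / Implement Permanent Corrective Actions
    prevention_actions: list[str] = field(default_factory=list)  # Prevent Recurrence
    future_steps: list[str] = field(default_factory=list)        # Future Steps: ongoing monitoring
```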

  11. Entries – Audits

  12. Comparison of Entries 2007-2011 • Entries: • 2007 – 5114 • 2008 – 5972 (↑16%) • 2009 – 6969 (↑17%) • 2010 – 7502 (↑7.6%) • 2011 – 4848 • As of 12/1; we suspect at this point approx. 65% of the data is in the Database
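The year-over-year percentages on slide 12 are plain ratio arithmetic; a minimal sketch in Python, using only the entry counts quoted above:

```python
# Entry counts from slide 12; the 2011 figure is partial (approx. 65% reported as of 12/1).
entries = {2007: 5114, 2008: 5972, 2009: 6969, 2010: 7502, 2011: 4848}

years = sorted(entries)
for prev, curr in zip(years, years[1:]):
    pct = (entries[curr] - entries[prev]) / entries[prev] * 100
    print(f"{prev} -> {curr}: {pct:+.1f}%")
# 2007 -> 2008: +16.8%   2008 -> 2009: +16.7%   2009 -> 2010: +7.6%
# 2010 -> 2011: -35.4% (reflects the incomplete 2011 data, not a real decline)
```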

  13. Comparison of Types 2007-2011 (65% of data) • SUR = Surveillance • REC = Recertification • INT = Initial

  14. Standards Entered Into Database: 1071 AS9100C as of 12/1/2011

  15. CRB Comments - 2011 • Ratio of Surveillance to Recertification to Initial remains consistent – potential decrease in Initials for the second year in a row • Slow start in AS9100C, perhaps due to timing • Going to a Rolling Format

  16. Nonconformities

  17. Pareto of Total Nonconformities • Totals: 2011 – 14,276; 2010 – 20,009; 2009 – 20,980; 2008 – 18,803; 2007 – 14,957 • Average per audit = 2.68 nonconformities; this may point to a reduction in nonconformities

  18. Total Not Bell-Shaped

  19. Ratio Majors/Entries • Totals: 2011 – 1,271; 2010 – 1,387; 2009 – 1,135; 2008 – 976; 2007 – 870 • On average, one Major is written every fourth audit • Range: 0 to 19

  20. Ratio Minors/Entries • Totals: 2011 – 12,893; 2010 – 18,627; 2009 – 19,414; 2008 – 17,393; 2007 – 14,087 • On average, 2.4 Minors are written per audit • Range: 0 to 52
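The per-audit ratios on slides 19-20 are the yearly totals divided by the number of audit entries; a quick cross-check in Python against the 2011 figures (remembering that only about 65% of 2011 data was in the Database):

```python
entries_2011 = 4848   # audit entries as of 12/1 (slide 12)
majors_2011 = 1271    # slide 19
minors_2011 = 12893   # slide 20

print(f"Majors per audit: {majors_2011 / entries_2011:.2f}")  # ~0.26, i.e. about one Major every fourth audit
print(f"Minors per audit: {minors_2011 / entries_2011:.2f}")  # ~2.66 here; the slide's 2.4 presumably uses the rolling window
```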

  21. Top 80% of Nonconformities • For the first time in three years, the order changed slightly

  22. The Remaining 20%
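Slides 21-22 split the nonconformity elements at the 80% cumulative mark of a Pareto chart. A minimal sketch of that cut in Python; the element labels and counts below are made up for illustration, since the actual AS9100-element counts live in the charts rather than the transcript:

```python
# Hypothetical counts per AS9100 element (illustrative only).
counts = {"7.4": 2100, "4.2": 1900, "8.2": 1500, "7.5": 1300, "6.2": 600, "5.6": 400}

total = sum(counts.values())
cumulative = 0
top_80, remaining_20 = [], []
for element, n in sorted(counts.items(), key=lambda kv: kv[1], reverse=True):
    # Elements reached while the running share is still below 80% form the "top 80%".
    (top_80 if cumulative < 0.8 * total else remaining_20).append(element)
    cumulative += n

print("Top 80%:", top_80)              # ['7.4', '4.2', '8.2', '7.5']
print("Remaining 20%:", remaining_20)  # ['6.2', '5.6']
```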

  23. Feedback Analysis 2011 • Average time to close “CLOSED” feedback – 18.54 days • Average age of feedback still “OPEN” – 46.7 days
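The two averages on slide 23 are ordinary means over feedback records: days from open to close for CLOSED items, and current age for OPEN ones. A minimal sketch with hypothetical records and field names:

```python
from datetime import date
from statistics import mean

# Hypothetical feedback records; "closed" is None while an item is still open.
feedback = [
    {"opened": date(2011, 3, 1),  "closed": date(2011, 3, 20)},
    {"opened": date(2011, 6, 10), "closed": date(2011, 6, 25)},
    {"opened": date(2011, 11, 5), "closed": None},
]

as_of = date(2012, 1, 1)
closed_days = [(f["closed"] - f["opened"]).days for f in feedback if f["closed"]]
open_days = [(as_of - f["opened"]).days for f in feedback if not f["closed"]]

print(f"Average to close (CLOSED): {mean(closed_days):.1f} days")  # 17.0 with these records
print(f"Average age (OPEN): {mean(open_days):.1f} days")           # 57.0 with these records
```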

  24. AB Report Card • Slight increase in Minors from previous years; Audits and Majors relatively flat • 0 Complaints, 0 Suspensions

  25. Questions

  26. Back Up
