
WebSAT: A Proactive System To Capture Aviation Maintenance Errors


Presentation Transcript


  1. WebSAT: A Proactive System To Capture Aviation Maintenance Errors Sponsor: FAA Industry Partner: FedEx, NWA and other airlines Clemson WebSAT Team: Anand K. Gramopadhye, Joel S. Greenstein, Kunal Kapoor, Nikhil Iyengar, Pallavi Dharwada Human Computer Systems Laboratory Department of Industrial Engineering Clemson University, Clemson, SC 29634 http://www.ces.clemson.edu/ie/research/hcsl/websat/ Web-based Surveillance & Auditing Tool

  2. Overview • Background • Motivation • Problem Statement • Research Overview • WebSAT Research Objective • Methodology • Discussion and Conclusions • Significant Milestones • Dissemination Slide 1

  3. Background • The mission of the FAA is to • Provide safe and reliable air transportation, and • Ensure the airworthiness of aircraft • This mission can be achieved by • Improving safety • Minimizing aircraft accidents Slide 2

  4. Motivation • Maintenance Error - a crucial factor in aircraft accidents • 23% involved incorrect removal or installation of components • 28% involved a manufacturer or vendor maintenance/inspection error • 49% involved error due to an airline’s maintenance policy • 49% involved poor design leading to maintenance errors (Source: Rankin et al., 2000) Slide 3

  5. Motivation Contd. NTSB Identification: ATL01IA001 || Air Carrier: Air Carrier operation of Continental Airlines || Accident occurred October 01, 2000 in Birmingham, AL || Injuries: 1 Minor, 146 Uninjured. Aircraft: McDonnell Douglas MD-80, Reg: N69826 • During cruise flight, at flight level 310, an MD-80, operated by Continental Airlines experienced an electrical fire. An emergency was declared and the flight was diverted into Birmingham, Alabama, and landed without further incident. The examination of the airplane disclosed a 2 by 1 1/2 inch fire-damaged hole in the left jump seat wall. • The failure of maintenance personnel to follow Fleet Campaign Directive (FCD) on how to install a certificate holder Slide 4

  6. Motivation Contd. NTSB Identification: CHI03IA027 || Air Carrier: United Airlines Accident occurred November 21, 2002 in Chicago, IL || Injuries: 82 Uninjured. Aircraft: Airbus A319-131, Reg: N804UA • United Airlines (UAL) flight 603 received minor damage when it landed on runway 04R (8,071 feet by 150 feet, asphalt) at the Chicago O'Hare International Airport. The airplane landed with the nose landing gear (NLG) wheels turned 90 degrees to the direction of travel. The 14 CFR Part 121 flight was being conducted in visual meteorological conditions and an instrument flight rules flight plan was filed. • The maintenance facility improperly assembled and installed the nose landing gear shock absorber assembly. Factors were the improper assembly which allowed the nose gear to turn 90 degrees to the direction of travel. Slide 5

  7. Motivation Contd. NTSB Identification: DCA00MA079. || Air Carrier: Air Carrier operation of AirTran Airlines Inc. || August 08, 2000 in Greensboro, NC || Injuries: 13 Minor, 50 Uninjured. Aircraft: Douglas DC-9-32, Reg: N838AT • AirTran Airways flight 913 executed an emergency landing at Greensboro Piedmont-Triad International Airport (GSO) shortly after declaring an emergency due to an in-flight fire and smoke in the cockpit. An emergency evacuation was performed. Of the 58 passengers and 5 crewmembers on board, 3 crewmembers and 5 passengers received minor injuries from smoke inhalation. The airplane sustained substantial fire, heat, and smoke damage. Visual meteorological conditions prevailed at the time of the accident. • A phase-to-phase arc in the left heat exchanger cooling fan relay, which ignited the surrounding wire insulation and other combustible materials within the electrical power center panel. • Contributing to the left heat exchanger fan relay malfunction was the unauthorized repair that was not to the manufacturer's standards and the circuit breakers' failure to recognize an arc-fault. Slide 6

  8. Maintenance System [Diagram: humans, equipment, machines, and the environment as components of the maintenance system] Slide 7

  9. Maintenance System • Airline companies ensure • Supervision over maintenance operations • Evaluation of internal and external factors • Adherence to quality assurance requirements and FAA regulations • Airline companies oversee their • Flight procedures • Operating methods • Airman qualifications • Auditors’ proficiency • Aircraft maintenance activities Slide 8

  10. Human Factors in Aviation • The human is the central part of the aviation system • Human factors research • places emphasis on humans • focuses on the development of error-tolerant systems • aims at improved safety by evaluating aircraft maintenance activities in a rigorous fashion Slide 9

  11. Methods to Track Maintenance Errors • Some of the existing methods to minimize maintenance errors: Slide 10

  12. Problem Statement • Existing methods are reactive in nature • Lack of standardization of data collection • Within an airline • Across the industry • Example: Northwest (Aug 2000); cause of the accident: the crew failed to secure the aircraft parking brake during maintenance Slide 11

  13. Research Overview • How can an effective maintenance system be achieved? • Seek input from diverse, reliable, and relevant sources • Perform data collection, data reduction, and data analysis in a standardized fashion • Generate trend analyses for causal factors within and across organizations • Proactively identify factors contributing to maintenance errors • Use historical data to generate trends • Generate an overall safety index Slide 12
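As a rough illustration of the trend analysis described on this slide, the sketch below tallies findings per causal factor per month; the causal-factor labels, months, and counts are invented for illustration and do not come from WebSAT data.

```python
from collections import Counter

# Hypothetical finding records (month, causal factor); illustrative data only.
findings = [
    ("2004-01", "improper installation"),
    ("2004-01", "documentation not followed"),
    ("2004-02", "improper installation"),
    ("2004-02", "improper installation"),
    ("2004-03", "documentation not followed"),
]

# Count findings per (month, causal factor) to expose a simple trend over time.
trend = Counter(findings)
for (month, factor), count in sorted(trend.items()):
    print(f"{month}  {factor}: {count}")
```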

  14. WebSAT Research Objective • Web-based Surveillance & Auditing Tool • Capture data from aviation maintenance processes • Analyze data obtained from the maintenance processes • Perform risk assessment and generate trends using the data obtained Slide 13

  15. WebSAT Mission Statement • Product Description • A distributed application, incorporating a recommended categorization and data collection scheme for surveillance and auditing of maintenance operations • A data reduction module that allows analysts to perform preliminary data analysis • A data analysis module that facilitates trend analysis Slide 14

  16. Mission Statement Contd. • Key Business Goals • Achieve standardized data collection, reduction and analysis of maintenance errors across geographically dispersed entities of the airline industry • Develop a proactive system that captures maintenance errors • Accomplish trend analysis in future versions of WebSAT Slide 15

  17. Mission Statement Contd. • Primary Market • Air Transportation Industry • Assumptions • Develop WebSAT such that it adheres to FAA standard research software design specifications (for example, SQL Server, ASP.NET, PHP) Slide 16

  18. Mission Statement Contd. • Stakeholders • FAA • Carrier Quality Assurance Department including QA representatives and auditors • Carrier Information Technology Department Slide 17

  19. Methodology • A task-analytic and user-centered software lifecycle methodology guides the following phases • Phase I: Identify process measures • Phase II: Design and develop the WebSAT prototype • Phase III: Develop the data analysis module Slide 18

  20. Phase I • Phase I: Identifying process measures What are Process Measures? • The process measures incorporate the response- and observation-based data collected during the surveillance, audit, and airworthiness directives control processes • Data obtained from the process measures will be analyzed to identify potential problem areas affecting the safety of an aircraft • The performance of processes, vendors, auditors, and Quality Assurance representatives will also be evaluated Slide 19

  21. Phase I Contd. • Phase I: Identifying process measures • Discover • Understand surveillance, auditing and AD work functions • Conduct interviews, focus groups and observations • Identify • Determine process measures • Validate • Ensure measures are representative of those used across maintenance facilities • Survey other stakeholders to obtain consensus on process measures Slide 20

  22. Phase I: Research Methodologies • Research methodologies adopted in identifying the key maintenance oversight processes and their functionality • Interviews • Focus groups • Observation sessions • Studying documentation • Questionnaires (Source: Iyengar et al., 2004) Slide 21

  23. Phase I: Research Methodologies • Research methodologies (contd.) • Interviews • Meeting airline QA managers • Understand the work environment • Collect useful documents • Stakeholders meet the research team • Observation sessions • Understand how maintenance is done • Observe current product users in work environment • Document studies • Highly regulated industry • Questionnaires • Web-based process measures evaluation survey (remotely) with partnering airlines Slide 22

  24. Phase I: WebSAT Modules Modules in WebSAT • These modules directly influence the airworthiness of the aircraft • The four key processes/modules in the QA department of FedEx identified for WebSAT are: • Surveillance: Oversight of day-to-day maintenance and inspection activities on the airplane • Technical Audits: System-level evaluation of standards and procedures of suppliers, fuel vendors, and ramp operations, done on a periodic basis • Internal Audits: Evaluation of internal processes in the departments of an organization • Airworthiness Directives: Evaluation of the applicability, loading, and tracking of airworthiness directives Slide 23
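A minimal sketch, assuming hypothetical type and field names, of how the four modules above might be represented in a data model; this is illustrative only and not the actual WebSAT schema.

```python
from dataclasses import dataclass
from datetime import date
from enum import Enum

class Module(Enum):
    # The four key QA processes identified for WebSAT (slide 23).
    SURVEILLANCE = "surveillance"
    TECHNICAL_AUDIT = "technical_audit"
    INTERNAL_AUDIT = "internal_audit"
    AIRWORTHINESS_DIRECTIVE = "airworthiness_directive"

@dataclass
class Finding:
    # A single oversight record tagged with its module (hypothetical fields).
    module: Module
    recorded_on: date
    description: str
    process_measure: str  # category assigned during data reduction
```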

  25. Phase I: WebSAT Modules • Surveillance • Primary Objective: • Achieve, through planned and random sampling, an accurate, real-time, and comprehensive evaluation of the vendor's compliance with the FedEx- and FAA-approved CAMP, GMM, and regulatory requirements Slide 24
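A minimal sketch of combining planned and random sampling of work cards, assuming a hypothetical select_work_cards helper; the sample size and selection rule are illustrative, not FedEx or FAA policy.

```python
import random

def select_work_cards(work_cards, planned, sample_size, seed=None):
    """Return the planned work cards plus a random sample of the rest (illustrative)."""
    rng = random.Random(seed)
    remaining = [card for card in work_cards if card not in planned]
    sampled = rng.sample(remaining, min(sample_size, len(remaining)))
    return list(planned) + sampled

# Example: always review the planned card, plus two randomly sampled cards.
cards = [f"WC-{i:03d}" for i in range(1, 11)]
print(select_work_cards(cards, planned=["WC-001"], sample_size=2, seed=7))
```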

  26. Phase I: WebSAT Modules • Technical Audits • Looks at the system and not the airplane (for example, compliance with flight procedures) • A system is audited annually (or over a longer period) • Types • Supplier audits • FMR audits Slide 25

  27. Phase I: WebSAT Modules • Technical Audits • Supplier Audits • Select audit standards and complete the audit • Report findings to the manager • Generate a “corrective actions” report • FMR Audits • Aircraft fueling, line maintenance, ramp operations • Oversight of these functions is accomplished for FedEx-owned or contracted facilities Slide 26

  28. Phase I: WebSAT Modules • Internal Audits • The evaluation of internal processes in the departments of an organization to verify their compliance with regulatory, company, and departmental policies and procedures • Five work areas in Internal Audits at FedEx • EMM, FOD, SAI, EPI, and ad hoc audits • EMM and FOD are termed internal audits and together comprise approximately 45 audits (within a department) • SAI and EPI are internal evaluations that are part of ATOS (across departments) and work towards risk mitigation as opposed to corrective action • Ad hoc audits are conducted in response to problems that arise unexpectedly Slide 27

  29. Phase I: WebSAT Modules • Airworthiness Directives Control Group: • Checks for the applicability of an AD • Is responsible for the implementation of any new, revised, or corrected ADs • Applies an AMOC or uses ADNT depending on the AD's applicability Slide 28

  30. Phase I: WebSAT Framework [WebSAT framework diagram] Slide 29

  31. Phase I: Process Measures • Existing number of process measures for Surveillance: 17 • Data collected using work cards • No process measures in place for Technical Audits • Data collected using checklists • Existing number of process measures for Internal Audits: 6 • Data collected using checklists • No process measures in place for the ADCG • Data collected using checklists and canned review statements Slide 30

  32. Identified List of Process Measures Slide 31

  33. Identification of Process Measures for Surveillance [From/To mapping diagram] Slide 32

  34. Phase I: Process Measures • Difficulties associated with categorizing data from the surveillance work function using the existing set of process measures • Ambiguities exist in classifying data into process measures using CASE standards, the GMM, and the IPM • It is difficult to choose the most appropriate measure for each work card • Memorizing process measures is not a primary task for the QA representatives Slide 33

  35. Phase I: Validation • Online survey for validation of the identified process measures • A two-phase online survey was conducted to validate the process measures • First phase with FedEx • Second phase with additional partnering airlines (data still awaited) • Surveillance has 6 process measures • Internal Audits have 6 process measures • Technical Audits have 7 process measures • The Airworthiness Directives group has 2 process measures Slide 34

  36. Phase I: Validation Contd. • Online survey for validation of the identified process measures Slide 35

  37. Phase II • Phase II: Develop the prototype of WebSAT • Task-analytic and user-centered software development methodology • Convert customer statements to need statements • Develop metrics to measure the performance of the tool with respect to the identified needs • Competitive benchmarking to set target specifications for the tool (Source: Ulrich et al., 2004) Slide 36

  38. Phase II: Sample Metrics • Sample mapping of customer statements to needs to metrics Slide 37

  39. Phase II: Interface Design • Interface design • Develop user profiles • Develop personas • Develop scenarios • Generate concepts • Combine and refine concepts • Develop low-fidelity prototypes • Test them with users Slide 38

  40. Phase II: Work in Progress • Work in Progress • Development of Technical Audit Module Prototype Slide 39

  41. Phase II: TA Persona • Typical persona for technical audit • Name: Eric Brandon • Age: 38 • Job: Auditor at FedEx for the past 4 years • Work hours: 8 am to 5 pm • Education: B.S. Aeronautical Engineering • Location: Memphis, TN • Income: $90,000/yr • Skills: Received the best auditor award from the FAA three years in a row (2000, 2001, and 2002). Worked in the maintenance hangar for 8 years and is in his current position based on that experience. • Hobbies: Golf, football, parasailing, jazz music, and travel with family Slide 40

  42. Phase II: TA Persona Contd. • Typical persona for technical audit contd. • Goals: Eric performs audits at various vendor locations inside and outside the U.S. • Tasks: • Eric uses a checklist to evaluate a vendor. • Eric enters the responses to the questions in the checklist. • Eric enters comments in the audit report based on the overall performance of the vendor. • Eric also reports the corrective actions that need to be addressed by the vendor. Slide 41

  43. Phase II: TA Persona Contd. • Expectations: • Eric thinks that the product is useful to him because it helps him analyze the data better. • He thinks that the product is easy to use and will help him to evaluate the vendors in a standardized fashion. Slide 42

  44. Phase II: TA Scenario • Typical scenario for technical audit • Positive Scenario: Eric now wants to enter audit data, which includes the recommended corrective actions and his overall comments on the audit. He logs into the system and finds the appropriate audit into which the data needs to be entered. After entering the data, he generates the audit report and sends it to his manager, William Cox. • Negative Scenario: Eric had entered preliminary audit data into the system. He continues his audit on the shop floor for a couple of days and then returns to the system. He logs in, unsure whether the system has stored the information for the audit he did not complete. He cannot find the audit into which the data was entered, so he has to re-enter the entire audit in a new file. Slide 43

  45. Phase II: Surveillance Persona • Typical persona for Surveillance • Name: Bob Lewis • Age: 42 • Job: Surveillance Rep. at FedEx for the past 2 years • Work hours: 7 am to 4 pm • Location: Mobile, AL • Income: $90,000/yr • Skills: Worked in the maintenance hangar for 3 years at another airline. Has a keen eye for detecting maintenance problems and is in his current QA position based on that experience. • Hobbies: Football and reading • Goals: Bob performs surveillance at Mobile Aerospace Engineering, AL, and believes that his primary goal is to ensure that the aircraft leaving the hangar are airworthy. Slide 44

  46. Phase II: Surveillance Persona Contd. • Typical persona for Surveillance contd. • Tasks: • Bob uses the surveillance system to identify incoming aircraft. • Bob performs surveillance on the work cards that are chosen by the system through random sampling. • Bob enters the results of the surveillance into the system. • Bob also enters non-routines that need to be attended to by the vendor. • Expectations: • Bob feels that the tool would make the categorization of data easier, but that it adds a step to his routine. • Bob thinks that the product is useful to him because it helps him analyze the data better. • He thinks that the product will help him evaluate the vendors in a standardized fashion. Slide 45

  47. Phase III • Phase III • Develop the advanced data analysis module of the prototype application • The module will enable the analyst to conduct advanced analysis of selected data sets • Identify problem areas, forming the first step toward conducting risk assessments Slide 46

  48. Phase III: Data Format • Format of data collected: • Surveillance: Accepts, rejects, and non-routines for a work card • Technical and Internal Audits: Yes/no, multiple-choice, and open-ended responses to the questions asked in an audit • Airworthiness Directives: Status and location of an AD or EO Slide 47
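A minimal sketch of how the three data formats listed above might be typed; the class and field names are assumptions made for illustration and do not reflect the actual WebSAT data model.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class SurveillanceResult:
    # Accepts, rejects, and non-routines recorded against a work card.
    work_card_id: str
    accepts: int = 0
    rejects: int = 0
    non_routines: list = field(default_factory=list)

@dataclass
class AuditResponse:
    # A yes/no, multiple-choice, or open-ended answer to one audit question.
    question_id: str
    yes_no: Optional[bool] = None
    choice: Optional[str] = None
    comment: Optional[str] = None

@dataclass
class AirworthinessDirectiveRecord:
    # Status and location of an AD or engineering order (EO).
    ad_number: str
    status: str    # e.g., "open" or "complied"
    location: str  # where the AD or EO stands in the control process
```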

  49. Phase III: Vision • Vision • The goal of WebSAT is to allow users to understand the implications of the data collected and presented, not only front-line users such as auditors and surveillance representatives but also managers • The WebSAT risk model aims at presenting risk in terms of process measures, impact variables, and a safety index Slide 48
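One way such a safety index could be rolled up is sketched below; the weights, the 0-100 scale, and the weighted-average rule are assumptions for illustration and are not the WebSAT risk model itself.

```python
def safety_index(measure_scores, weights):
    """Weighted average of per-process-measure scores on a 0-100 scale (illustrative)."""
    total_weight = sum(weights[m] for m in measure_scores)
    return sum(measure_scores[m] * weights[m] for m in measure_scores) / total_weight

# Hypothetical process-measure scores (higher = safer) and weights.
scores = {"documentation": 92.0, "tooling": 78.0, "training": 85.0}
weights = {"documentation": 0.5, "tooling": 0.3, "training": 0.2}
print(round(safety_index(scores, weights), 1))  # 86.4 for these made-up numbers
```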

  50. Phase III: Data Analysis Slide 49
