
Presentation Transcript


  1. Presentation at ‘In Use In Situ’ Workshop Empirical Studies Using a Full-Scope Simulator of NPPs Dong-Han Ham, JinKyun Park, & Wondea Jung October 28, 2005

  2. Contents • Introduction • 1st Study: New Diagnosis Procedure • 2nd Study: Human Performance Data • 3rd Study: Task Complexity Measure • Discussion • Summary

  3. Introduction (1) • Methods for measuring human performance, in increasing order of fidelity: paper prototype, laboratory-scale simulator, full-scope simulator, real-world observation • Full-scope simulator • Simulates work contexts as close as possible to real ones • A software and hardware complex built as a replica of the real systems • Main usages are training, qualification testing, V&V, and so on

  4. Introduction (2) • Use of a full-scope simulator • Makes it possible to observe human behaviour with a considerable degree of freedom and, consequently, uncontrolled variability • Needs much time and effort -> how to conduct experiments using this simulator in an economical way • Typical methodological issues should be handled carefully • Selecting participants and determining sample size • Designing experimental tasks and determining performance measures • Managing extraneous variables • Etc. • Despite these advantages, it is not easy to do empirical studies using a full-scope simulator. In order to give useful information to researchers planning to use a full-scope simulator, this paper introduces our experience with three empirical studies using a full-scope simulator of nuclear power plants (NPPs) and discusses several methodological issues

  5. 1st Study: New Diagnosis Procedure (1) • Purpose • To examine the effectiveness of a new, cognitively engineered diagnosis procedure • Background • A diagnosis procedure, whether written or electronic, helps operators identify the nature of ongoing events and select appropriate emergency operating procedures (EOPs) • There is no established framework for designing a cognitively engineered procedure in a systematic way • We developed such a framework and designed a new diagnosis procedure based on it • Validation of the effectiveness of the new procedure • 1st: interviews with plant experts -> positive responses to the new procedure • 2nd: this empirical study using a full-scope simulator -> compared the in-use procedure and the new procedure -> showed the superiority of the new procedure

  6. 1st Study: New Diagnosis Procedure (2) • Participants • Twelve main control room (MCR) operating crews actually working in the reference NPP • Each crew consists of four operators with distinct roles: SRO, RO, TO, EO • Experimental design • Latin-square, within-subject design with the type of procedure and the diagnosis task as treatments (see the counterbalancing sketch after this slide) • To avoid carry-over effects, the combinations of tasks and procedures were counterbalanced • Tasks • Diagnosis in six accident scenarios (e.g., loss of all feedwater) • Performance measures • Diagnosis time • Diagnosis accuracy • NASA-TLX
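A minimal sketch of the counterbalancing idea on this slide: a cyclic Latin square assigns each of the twelve crews a scenario order, and the procedure order alternates across crews. The scenario names other than "loss of all feedwater" are hypothetical placeholders, and the actual assignment used in the study may have differed.

```python
# Sketch of Latin-square counterbalancing for 12 crews, 6 diagnosis
# scenarios, and 2 procedure types (in-use vs. new).  Illustrative only.

def latin_square(n):
    """Cyclic n x n Latin square: row i is [i, i+1, ..., i+n-1] mod n."""
    return [[(row + col) % n for col in range(n)] for row in range(n)]

# Only "loss of all feedwater" is named on the slide; the rest are placeholders.
scenarios = ["loss of all feedwater"] + [f"scenario {i}" for i in range(2, 7)]
square = latin_square(len(scenarios))

for crew in range(1, 13):
    row = square[(crew - 1) % len(scenarios)]           # scenario order for this crew
    first_procedure = "new" if crew % 2 else "in-use"   # alternate which procedure comes first
    order = [scenarios[i] for i in row]
    print(f"Crew {crew:2d} | first procedure: {first_procedure:6s} | {order}")
```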

  7. 2nd Study: Human Performance Data (1) • Purpose • To collect human performance data using the simulator and to evaluate the validity of the data • Background • We developed a DB called OPERA (Operator PErformance and Reliability Analysis) and collected human performance data using it • The data was obtained under emergency situations emulated by the full-scope simulator • The data was collected over three years (Sep 1999 to April 2001) • The data was collected while 24 different MCR operating crews were being retrained • Two task analysis methods were used to analyze the performance data: protocol analysis and time-line analysis

  8. 2nd Study: Human Performance Data (2) • Performance data (a sketch of one possible record structure follows this slide) • Performance time of the SPTA (standard post-trip actions) procedure • Event diagnosis time • Performance times of procedural steps in the ORP (optimal recovery procedure) • Task completion time • Factors affecting the difficulty of performing the diagnosis procedure • Types of non-compliance behaviours when conducting procedural steps • Validation of the data • We compared the obtained data with those from three other studies under different simulation environments and from one real-situation study • The comparison showed that operators' performance could be reasonably predicted and understood from the obtained data
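A minimal sketch of how a single record of the kinds of performance data listed on this slide might be structured. The field names, units, and example values are hypothetical; the actual OPERA schema is not described in this presentation.

```python
# Hypothetical record structure for the performance data named on the slide.
from dataclasses import dataclass, field

@dataclass
class PerformanceRecord:
    crew_id: int
    scenario: str
    spta_time_s: float                      # performance time of the SPTA procedure
    diagnosis_time_s: float                 # event diagnosis time
    task_completion_time_s: float
    orp_step_times_s: list[float] = field(default_factory=list)    # procedural steps in the ORP
    non_compliance_types: list[str] = field(default_factory=list)  # observed non-compliance behaviours

# Example usage with made-up numbers.
record = PerformanceRecord(
    crew_id=3,
    scenario="loss of all feedwater",
    spta_time_s=95.0,
    diagnosis_time_s=140.0,
    task_completion_time_s=610.0,
    orp_step_times_s=[20.0, 35.0, 15.0],
    non_compliance_types=["step performed out of sequence"],
)
print(record)
```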

  9. 3rd Study: Task Complexity Measure (1) • Purpose • To validate a new measure, called TACOM (TAsk COMplexity), which quantifies the task complexity of emergency operating procedures • Background • TACOM is composed of five sub-measures, each of which is quantified using the graph entropy concept • SIC (Step Information Complexity) • SLC (Step Logic Complexity) • SSC (Step Size Complexity) • AHC (Abstraction Hierarchy Complexity) • EDC (Engineering Decision Complexity) • The overall task complexity is a weighted Euclidean norm of the five sub-measures (see the sketch after this slide)
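A minimal sketch of the weighted-Euclidean-norm combination described above, i.e. TACOM = sqrt(w1·SIC² + w2·SLC² + w3·SSC² + w4·AHC² + w5·EDC²). The weights and sub-measure values below are hypothetical placeholders; the actual weights used in TACOM are not given in this presentation.

```python
# Weighted Euclidean norm over the five TACOM sub-measures (SIC, SLC, SSC, AHC, EDC).
# Weights and example values are hypothetical.
from math import sqrt

def tacom_score(sub_measures, weights):
    """sqrt(sum_i w_i * x_i**2) for paired weights and sub-measure values."""
    return sqrt(sum(w * x ** 2 for w, x in zip(weights, sub_measures)))

sub_measures = [2.1, 1.4, 1.8, 0.9, 1.2]     # SIC, SLC, SSC, AHC, EDC (example values)
weights = [0.3, 0.2, 0.2, 0.15, 0.15]        # hypothetical weights
print(f"TACOM = {tacom_score(sub_measures, weights):.3f}")
```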

  10. 3rd Study: Task Complexity Measure (2) • Validation of TACOM • We compared the data obtained from OPERA with the associated TACOM scores (28 tasks were used) • The results showed a strong correlation between the estimated TACOM scores and the averaged task performance time data (a correlation sketch follows this slide)
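A minimal sketch of the kind of correlation check described on this slide, using Pearson's r. The data points are made-up illustrations, not the 28 tasks compared against OPERA.

```python
# Pearson correlation between TACOM scores and averaged task performance times.
# All values below are hypothetical.
from math import sqrt

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

tacom_scores = [1.2, 1.8, 2.4, 2.9, 3.5]            # hypothetical TACOM scores
mean_times_s = [45.0, 70.0, 110.0, 150.0, 210.0]    # hypothetical averaged performance times (s)
print(f"r = {pearson_r(tacom_scores, mean_times_s):.2f}")
```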

  11. Discussion (1) • Reasons for conducting empirical studies • To evaluate a new design concept or solution and make recommendations for a better design (<- 1st study) • To observe variables related to human performance and identify their relations (<- 2nd study) • To develop and validate theories or concepts (<- 3rd study) • To assess the conformance of artefacts in use to standards • To evaluate improvements or changes in a system • Etc. • The important methodological issues differ by purpose; however, the following four issues are common to all • Experimental context • Performance measure • Validity • Evaluation aspect

  12. Discussion (2) • Experimental context • A study using a full-scope simulator has high fidelity to actual work situations (e.g., the six diagnosis tasks in the 1st study can be regarded as realistic in terms of the human-system interaction context) • Generalization of experimental contexts is therefore a minor issue in full-scope simulation • A more critical issue is how to increase or manage task complexity over the course of the experiment, or how to compare performance data across different trials (e.g., the comparison of OPERA data with those of three other simulation studies) • Performance measure • To capture the anticipated effects, diverse kinds of measures should be employed • A detailed task analysis is helpful for identifying meaningful measures (e.g., the diagnosis accuracy and mental workload measures in the 1st study) • Sometimes subjective performance measures can play a critical role

  13. Discussion (3) • Validity • Three types of validity: construct, internal, and external • Typical factors degrading validity: accidental occurrences, maturation effects, changes of measurement tools, and so on (-> all such factors should be fully considered, as in any other study) • Construct validity: researchers should try to test as many levels of the independent variables representing the concept as possible (e.g., more than 20 tasks in the 3rd study) • Internal validity: strict and effective control of confounding variables (e.g., counterbalancing to avoid order effects) • External validity: a full-scope simulation study has high external validity, which supports the generalization of experimental results and contexts • Evaluation aspect • Three aspects: compatibility, understandability, and effectiveness • A full-scope simulator is only efficient for evaluating effectiveness (-> to avoid wasted time and effort, researchers should resolve compatibility and understandability issues before conducting a study using a full-scope simulator)

  14. Summary • Introduced three studies using a full-scope simulator of NPPs, which examined the interaction between humans and complex process control systems under cognitively demanding tasks • To enjoy the benefits of full-scope simulation studies, researchers should • Prepare the experimental design and data analysis thoroughly, with broad knowledge ranging from statistical methods to the work domain • Conduct an analytical evaluation of the work domain or tasks to secure high-quality simulation studies

  15. Q & A Thank You

  16. Appendix: In-Use and New Procedures (figures: in-use procedure; new procedure)

  17. Appendix: Example of Protocol Analysis
