
Development of Indicators for Integrated System Validation




  1. Development of Indicators for Integrated System Validation Leena Norros & Maaria Nuutinen & Paula Savioja VTT Industrial Systems: Work, Organisation and System Usability Research 20.1.2005

  2. Outline of the Presentation • NPP control room modernizations • Integrated System Validation (ISV) • Performance indicators in validation • Development of the evaluation framework for intelligent environments • Conclusions

  3. NPP Control Room Modernizations
  • Current control and automation systems are being modernized
  • No changes to the degree of automation
  • Technological rationale for the change
    • Maintenance costs
    • Lack of spare parts
    • Technological possibilities exist
  • Different strategies adopted by the utilities
  • Some human centered design principles implicitly adopted in the projects
  • Happening at the same time
    • OL3
    • Generation change within the personnel of existing NPPs

  4. NPP Control Room Modernizations: The Effective Changes
  • Loss of individual data points and controls in the information panels and desks
    • The decrease in peripheral information → tacit knowledge, process feel and awareness
    • Spatial memory → memorability, skill-based behavior, response times
    • Co-operation within the crew → communication, group awareness
  • Adoption of individual information displays
    • Sequential use of information instead of parallel, “key hole effect” → windows and dialogs might hide information
    • Active searching required → understanding of the available resources
    • Secondary tasks from manipulating the interface → possibility of confusion, response times
    • Higher abstraction level in the information → orientation, constraints and possibilities
  • Adoption of large screen displays
    • Basis for shared co-operation → group SA
    • Higher abstraction level in the information

  5. NPP Control Room Modernizations: Model of the Change
  [Diagram: the control and automation system UI, user practices, and process performance, and their interconnections]

  6. How do we know that a complex system can be safely operated?

  7. Integrated System Validation
  • Performance-based evaluation of the integrated design, in order to ensure that the human-system interface supports the safe operation of the plant
  • Use of a full-scope simulator
  • The effect of contextual and situational factors on the safety of operation must be evaluated
  • Towards the end of the design process
    • The total system is available
    • After the training period of the operators
  • Use of actual crews, a representative sample of the population
  • Use of normal conditions, specific failures, accidents, beyond design basis events
  • Compare the selected measures with the predefined acceptance criteria (see the sketch below)
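As a minimal illustration of the last bullet, the sketch below shows one way measured indicators could be compared with predefined acceptance criteria for a single simulator run. It is not from the presentation; the `Criterion` and `evaluate` helpers, the indicator names, and the limit values are hypothetical assumptions.

```python
# Hedged sketch: checking selected measures against predefined acceptance criteria.
# All indicator names and limits below are hypothetical examples, not values from the study.
from dataclasses import dataclass

@dataclass
class Criterion:
    name: str                   # indicator, e.g. a process parameter or a response time
    limit: float                # predefined acceptance limit
    higher_is_worse: bool = True

def evaluate(measures, criteria):
    """Return a pass/fail verdict per criterion for one validated scenario run."""
    results = {}
    for c in criteria:
        value = measures[c.name]
        results[c.name] = (value <= c.limit) if c.higher_is_worse else (value >= c.limit)
    return results

# Hypothetical example: one crew in one simulated disturbance scenario
criteria = [
    Criterion("diagnosis_time_s", limit=300.0),
    Criterion("procedure_compliance", limit=0.9, higher_is_worse=False),
]
run = {"diagnosis_time_s": 240.0, "procedure_compliance": 0.95}
print(evaluate(run, criteria))   # {'diagnosis_time_s': True, 'procedure_compliance': True}
```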

  8. Integrated System Validation: Current Problems
  • Which indicators to use
    • Which measures reflect safety
    • Which measures are relevant in the change situation
    • Which measures reflect performance in a way that can be generalized
  • How to set the criteria
    • What is the acceptable level of performance with the selected indicators
  • The effort needed, the amount of testing required
  • Generalization of the results
  Norros & Savioja 2004, Heimdal et al. 2004

  9. Validation: The Problem of the ε-case How can we predict what will happen in a very rarely occurring beyond-design-basis event, one beyond the possibilities of validation, that nobody ever predicted would happen? What are the predictive capabilities of validation procedures?

  10. Performance Indicators: Development Challenges
  • Process performance
    • Does not really differentiate enough
      • High degree of automation
      • Complex defenses within the system
      • Thorough training process
    • Difficult to anchor to the HF-related changes taking place in modernization
    • Not predictive of future performance in conditions not tested
  • Human performance
    • Does not describe how, and based on what underlying assumptions, the crew acts in the situation
    • Not predictive of future performance in conditions not tested

  11. The Development of the Evaluation Framework
  [Diagram: the design process (BASELINE / current control room → modernisation phase 1 → modernisation phase 2) runs in parallel with the evaluation framework development (preliminary version → version 1 → version 2); each design phase is linked to the next framework version through a simulation & evaluation step]

  12. Concept of System Usability
  [Diagram: control and automation system UI, user practices, process performance]
  • System Usability: the effect of the emerging technology on the whole activity system
  • In NPP modernizations: the effect on process performance, user practices, user acceptance
  • System Usability denotes how the system works as a material, cognitive, and communicative tool in an organization, promoting the fulfillment of the core task

  13. Assessment of System Usability: Evaluation Framework
  [Framework diagram, reconstructed as a list]
  • Modelling
    • Domain: motives & objectives, functions
    • Situation: constraints & possibilities, resources
    • Complexity: interactions, dynamics, uncertainty
  • PRACTICE
  • Indicators
    • Outcome: process measures, error, work load, procedure following
    • Way of acting: orientation, way of perception and action, way of collaboration, way of communication, way of using procedures
  • Assessment criteria
    • Effectiveness & efficiency (external good): process parameters, number of errors, TLX, number of deviations
    • Core-task oriented appropriateness (internal good): realistic-objectivistic, reactive-interpretative, transparency of actions, shared horizon and meaning, understanding the rationale as making sense
    • Situational criteria
  • Data
    • Empirical: orientation interview, simulator run, stimulated process tracing interviews, interface interviews
    • Course of action analysis: goals, perceptions & actions, communications, resource utilisation
  • Experienced appropriateness
    • Indicators: trust, utilisation of functional possibilities
    • Criteria: evidence of the possibility for creating new usage practices and culture
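To make the structure of this framework easier to follow in text form, below is a small data-structure sketch (my interpretation, not the authors' implementation) of how the outcome indicators ("external good") and way-of-acting indicators ("internal good") could be recorded for one crew and scenario. All class and field names are illustrative assumptions.

```python
# Hedged sketch of the two indicator groups in the evaluation framework.
# Field names are illustrative assumptions based on the slide, not the authors' data model.
from dataclasses import dataclass, field

@dataclass
class OutcomeIndicators:           # assessed against effectiveness & efficiency ("external good")
    process_parameters_ok: bool
    error_count: int
    tlx_workload: float            # NASA-TLX rating
    procedure_deviations: int

@dataclass
class WayOfActingIndicators:       # assessed against core-task oriented appropriateness ("internal good")
    orientation: str               # e.g. "realistic-objectivistic"
    way_of_collaboration: str
    way_of_using_procedures: str

@dataclass
class CrewScenarioResult:
    crew_id: str
    scenario: str
    outcome: OutcomeIndicators
    practice: WayOfActingIndicators
    interview_notes: list = field(default_factory=list)   # e.g. stimulated process-tracing findings
```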

  14. Conclusions
  • Traditional scientific performance measures do not differentiate between UIs in a highly automated environment → more profound assessment criteria are needed
  • A system with high system usability induces good working practices in its users
  • With such practices, individual users cope with system uncertainty, which is a critical demand in the NPP environment
  • In validation, practices within the new system are compared to the practices found in the baseline evaluation of the valid traditional system
  • Further work: connect the practice-driven performance indicators to the changes in the modernization

  15. Thank You!

  16. Classification of User Practices
  • Reactive
    • Repetition of the pre-learned
    • Not understanding the reasoning behind, e.g., procedures
  • Diffuse
    • Characteristics of both reactive and interpretative practice
  • Interpretative
    • Takes into account the situational variation in objectives
    • Attempts to interpret which contextual factors have an effect
    • Understands the trade-off between actions for acute and chronic cures, and the effect of one’s own actions on the overall performance goals of operation

  17. Practice-Related Criteria in Validation
  [Table: practice classifications (interpretative, diffuse, reactive) in the baseline and in the validation, with pass/fail outcomes]
  Acceptable: Rea_validation ≤ Rea_baseline ∧ Int_validation ≥ Int_baseline
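Below is a minimal sketch, under my own assumptions, of how this acceptance rule could be checked once each crew/scenario practice has been classified as reactive, diffuse, or interpretative: validation is acceptable if the share of reactive practices does not increase and the share of interpretative practices does not decrease relative to the baseline. The function and the example data are hypothetical.

```python
# Hedged sketch of the practice-related acceptance rule:
# Rea_validation <= Rea_baseline  AND  Int_validation >= Int_baseline
from collections import Counter

def practice_criterion_met(baseline, validation):
    """Each list holds one label per crew/scenario: 'reactive', 'diffuse' or 'interpretative'."""
    def share(labels, kind):
        return Counter(labels)[kind] / len(labels)
    return (share(validation, "reactive") <= share(baseline, "reactive")
            and share(validation, "interpretative") >= share(baseline, "interpretative"))

# Hypothetical example data for illustration only
baseline = ["interpretative", "diffuse", "reactive", "interpretative"]
validation = ["interpretative", "interpretative", "diffuse", "interpretative"]
print(practice_criterion_met(baseline, validation))   # True
```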
