Benchmark Exercise of Safety Evaluation of Computer Based Systems

Presentation Transcript

  1. Benchmark Exercise of Safety Evaluation of Computer Based Systems V. Kopustinskas1, C. Kirchsteiger1, B. Soubies2, F. Daumas2, J. Gassino2, J.C. Péron2, P. Régnier2, J. Märtz3, M. Baleanu3, H. Miedl3, M. Kersken3, U. Pulkkinen4, M. Koskela4, P. Haapanen4, M-L. Järvinen5, H-W. Bock6, W. Dreves6. 1) EC DG-JRC, IE (NL); 2) IRSN (FR); 3) ISTec (D); 4) VTT (FIN); 5) STUK (FIN); 6) Framatome ANP (D). To be presented at the post-FISA'2003 workshop. JRC – IE Petten

  2. Project partners
  Project duration: 01/2001 – 12/2003
  Project coordinator: EC-JRC-Inst. for Energy, Petten (Netherlands)
  Project started by: EC-JRC-IPSC, Ispra (Italy)
  Industrial partner: Framatome ANP (Germany, formerly Siemens)
  Assessment teams: IRSN (France), ISTec (Germany), STUK and VTT (Finland)

  3. Project objectives The project's primary objective is a comparative evaluation of the existing safety assessment methodologies for safety-critical computer-based systems used in the nuclear field by EU regulators and technical support organisations.

  4. Work packages WP1: High-level specification of the Benchmark Exercise WP2: Reference software study case definition and design WP3: Final specification of the assessment methodologies WP4: Application of the assessment methodologies WP5: Comparison of the assessment methodologies WP6: Project coordination and financial coordination

  5. Project implementation Framatome ANP provided a reference case study of a hypothetical reactor protection system, including the requirements and functional specification of a limited number of safety functions that were selected by the project partners. The proprietary documentation was made available to the assessor partners, namely IRSN, ISTec, STUK and VTT. Each assessor applied their assessment methodology to the reference case study. The comparison study was performed to highlight the current practices and methods used in the field by major research and regulatory support organizations.

  6. Reference study case The case study comprised a limited part of a complete safety I&C modernization project, using the tools of the TELEPERM XS system platform. Eight MADTEB group functions were selected from the safety functions of the KWU Konvoi plants; these functions were intended for application in the Finnish 1400 MW PWR plant (1993 design status). The MADTEB functions are part of the reactor limitation system and limit the allowed range of process variables (mainly coolant pressure and pressurizer level) of the primary coolant loop of the reactor.

  7. Reference study case MADTEB functions
  - A33 Reduction of leakage in case of steam generator tube rupture
  - A34 Ensure effectiveness of extra borating system spraying
  - C31 Prevent violation of maximum allowable working pressure
  - D01 Prevent inadvertent opening of 1st pressurizer safety valve
  - D02 Prevent response of 2nd and 3rd pressurizer safety valve
  - D32 Pressurizer overfeed protection
  - D33 Prevent loss of coolant via stuck open 1st pressurizer safety valve
  - J34 Prevent emptying of pressurizer
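A limitation function of this kind is, at its core, a monitored threshold acting on a process variable, usually with a hysteresis band so the actuation demand does not chatter around the setpoint. The sketch below is purely illustrative: the function name, setpoint and hysteresis values are invented, not taken from the MADTEB or TELEPERM XS specifications.

```python
# Illustrative sketch of a pressurizer-pressure limitation function.
# All signal names and setpoint values are hypothetical, NOT from the
# actual MADTEB function C31.

MAX_PRESSURE_BAR = 166.0   # hypothetical actuation setpoint
RESET_HYSTERESIS = 2.0     # hypothetical reset band

def limitation_c31(pressure_bar: float, actuated: bool) -> bool:
    """Return True while the limitation demand is active.

    The demand latches at the setpoint and resets only after the
    pressure has fallen back below the hysteresis band.
    """
    if pressure_bar >= MAX_PRESSURE_BAR:
        return True
    if actuated and pressure_bar > MAX_PRESSURE_BAR - RESET_HYSTERESIS:
        return True   # stay latched inside the hysteresis band
    return False

# One scan cycle per sample: the demand latches at 166.0 bar and
# releases only once the pressure drops below 164.0 bar.
state = False
trace = []
for p in [160.0, 166.5, 165.0, 163.5]:
    state = limitation_c31(p, state)
    trace.append(state)
print(trace)  # [False, True, True, False]
```

The hysteresis latch is the design point worth noting: without it, a pressure signal hovering at the setpoint would toggle the demand every scan cycle.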

  8. Reference study case design process • Provision of typical documents to be developed in a safety I&C modernization project; • Specification of the requirements to be met by the system; • Specification of the safety system on the basis of the TELEPERM XS system platform; • Detailed design of the functions; • Verification of the design using the SPACE-engineering tools of TELEPERM XS; • Production of code which is able to run on an existing test system; • Demonstration of operation of the code in the test system; • Validation tests of the software.

  9. Reference study case documentation • All the benchmark-study-related and generic TELEPERM XS system documentation was available to the assessors; • Software source code was available upon request; • A number of technical meetings were organised to provide clarifications and discuss technical questions; • FANP made simulation testing of the benchmarked software available.

  10. Reference study case limitations As a consequence of the limited scope, essential parts of a real project were not performed, e.g.:
  - Validation of the functional requirements
  - Design of interfaces to other systems
  - Hardware procurement and manufacture
  - Validation of the hardware

  11. Assessment studies and results To be presented by each partner: IRSN ISTec VTT/STUK

  12. Comparison procedure The comparison is based on the following main items:
  - technical basis of the different approaches;
  - depth of analysis that the methodologies allow;
  - availability of various methods and analysis tools used for the assessment;
  - assessment phases;
  - assessment results and findings.

  13. Comparison procedure The comparison procedure does not aim to:
  - identify any possible deficiencies in the methodologies;
  - decide which methodology is better or worse;
  - draw any conclusion about the safety of the study-case software.

  14. Comparison procedure The comparison procedure resulted in a descriptive study of the following main items:
  - Comparison of the methodological approaches;
  - Comparison of the assessment studies;
  - Comparison of the assessment results and findings.

  15. Comparison: Regulatory requirements All three assessment teams follow national regulatory requirements, which are mainly based on the international IEC 60880 guide. Although based on the same international standard, the requirements differ slightly at the national level. For example, the Finnish regulatory guide YVL-5.5 requires quantitative reliability analysis for safety-critical class 1 computer-based systems, while the French and German regulations do not. The Finnish regulation also explicitly requires a failure modes and effects analysis (FMEA) to be performed.
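An FMEA of the kind required by YVL-5.5 systematically tabulates failure modes against their effects; a common quantitative variant ranks each mode by a risk priority number (RPN = severity × occurrence × detection). The sketch below shows this ranking mechanism only; the components, failure modes and ratings are invented placeholders, not findings from the benchmark.

```python
# Minimal FMEA table with risk priority numbers (RPN).
# Severity, occurrence and detection are rated 1 (best) to 10 (worst);
# RPN = S * O * D. All rows are hypothetical examples.

fmea_rows = [
    # (component,         failure mode,           S, O, D)
    ("pressure sensor",   "stuck-at reading",     8, 3, 4),
    ("analog input card", "out-of-range output",  6, 2, 3),
    ("voter logic",       "single-channel fault", 9, 2, 2),
]

# Rank failure modes by descending RPN so review effort goes to the
# highest-risk modes first.
ranked = sorted(
    ((comp, mode, s * o * d) for comp, mode, s, o, d in fmea_rows),
    key=lambda row: row[2],
    reverse=True,
)
for comp, mode, rpn in ranked:
    print(f"{comp:18s} {mode:22s} RPN={rpn}")
```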

  16. Comparison: Life cycle All assessment teams follow basically the same assessment steps, corresponding to life-cycle phases. The typical assessment procedure starts with a general assessment of the quality assurance and V&V plans and of the engineering process itself. Then each life-cycle phase is evaluated. Although the phase titles differ, their content is nearly the same. The following life cycle was used for comparison purposes:
  - Requirements specification
  - System specification
  - Detailed design
  - Code generation
  - Testing

  17. Comparison: Quality assurance and eng. process A number of deficiencies were identified by the assessors in this phase. It is important to note that most of them are related to limitations of the study case. This especially concerns the FANP testing strategy, as all tests were performed with the simulation tool SIVAT rather than on a realistic system. In the frame of this benchmark study, testing on a real system would not have been justified given the limited resources of the project. However, some of the identified deficiencies, such as the lack of a unified life-cycle model description and the lack of rigorous V&V procedures, could usefully guide improvements to the software documentation.

  18. Comparison: Requirements specification The results of the assessment indicate the need for independent verification of each development step. Since validation of the requirements specification by a process engineer was outside the scope of BE-SECBS, a fault remained that the assessors were able to identify (incorrect operation of the AA011 valve).

  19. Comparison: System specification Like the previous steps, this assessment step was performed by the assessors solely through a critical review of the documentation. One assessor also used the SPACE tool for navigating and tracing signal paths. No critical faults were identified. The deficiencies mentioned by the assessors would be useful for improving the development process and documentation.

  20. Comparison: Detailed design Most of the reported deficiencies are related to limitations of the benchmark exercise. However, supported by the SPACE tool while checking the detailed design, one assessor detected an inconsistency in the function diagram JEB00CS811. According to FANP, it has no impact on the functional behavior of the integrated system, but this unintended inconsistency had passed the developer's verification process. This confirms that, in addition to any manufacturer's internal verification process, an external independent validation by an assessor is needed.

  21. Comparison: Source code No major errors were reported, but the source code analysis by static analysis tools (QAC, Polyspace Verifier, RETRANS) provided some interesting insights that would be hard to observe manually. In addition, a number of potential problem areas were reported that would require more detailed analysis, which was, however, not performed due to limited project resources.
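Static analysis tools of this kind typically flag constructs that are syntactically legal but suspicious, which is precisely what is hard to spot by eye in a large code base. As a toy illustration of the principle (not a reproduction of QAC, Polyspace or RETRANS), a few lines of Python's standard `ast` module suffice to build a checker that flags comparisons of an expression with itself, a classic copy-paste defect:

```python
import ast

# Toy static checker: flag comparisons whose operands are identical
# (e.g. "value <= value"), a typical copy-paste defect that a tool
# spots reliably and a human reviewer easily misses.
# Purely illustrative; unrelated to the tools used in the project.

SOURCE = """
def in_band(lo, hi, value):
    if lo <= value and value <= value:   # bug: should be "value <= hi"
        return True
    return False
"""

def find_self_comparisons(source: str) -> list[int]:
    findings = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Compare):
            operands = [node.left, *node.comparators]
            dumped = [ast.dump(op) for op in operands]
            # identical sub-trees on both sides of one comparison operator
            if any(a == b for a, b in zip(dumped, dumped[1:])):
                findings.append(node.lineno)
    return findings

print(find_self_comparisons(SOURCE))  # line numbers of suspicious comparisons
```

Industrial analyzers apply hundreds of such checks, plus dataflow and abstract-interpretation techniques, but the workflow is the same: mechanical, exhaustive inspection of patterns a manual review would sample at best.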

  22. Comparison: Testing The assessment results of the testing life-cycle phase revealed some differences in the approaches and in the depth of the analysis:
  • One assessment team concluded that, due to the standardised structure of the TXS C-code, the amount of testing required is defined solely by the functionality of the system and not by code-coverage measurement. Within BE-SECBS, the amount of functional testing could be accepted as sufficient;
  • Another assessment team identified missing tests and other non-compliances with the regulatory requirements and suggested additional tests.
  These differences likely stem from the different approaches, analysis tools and regulatory requirements applied.
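The distinction between the two positions, functional testing versus coverage-measured testing, can be made concrete with a toy example. Below, a hypothetical limit check is exercised by a small functional suite while a simple tracer records which statements actually ran; the function, test cases and setpoint are all invented for illustration.

```python
import sys

# Toy line-coverage measurement via sys.settrace.
# The function under test and its test cases are hypothetical.

def pressure_ok(p: float) -> bool:
    if p < 0:
        return False          # physically impossible reading
    if p > 166.0:
        return False          # above maximum allowable working pressure
    return True

executed: set[int] = set()

def tracer(frame, event, arg):
    # Record executed lines of pressure_ok, relative to its "def" line.
    if event == "line" and frame.f_code.co_name == "pressure_ok":
        executed.add(frame.f_lineno - pressure_ok.__code__.co_firstlineno)
    return tracer

sys.settrace(tracer)
# A "functional" suite that happens to forget the negative-pressure case:
assert pressure_ok(100.0) is True
assert pressure_ok(170.0) is False
sys.settrace(None)

TOTAL_BODY_LINES = 5
print(f"covered {len(executed)}/{TOTAL_BODY_LINES} lines")
```

The functional suite passes, yet the coverage measurement reveals that the `p < 0` branch was never executed, which is exactly the kind of gap the two assessment positions weigh differently.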

  23. Comparison: Quantitative reliability analysis A Bayesian network (by VTT/STUK) was developed, consisting of the following variables: requirement specification, concept design, detailed design, application C-code, code compilation and linking, platform software, and integrated system tests. The main information on which the Bayesian network is built comes from the limited qualitative assessment, which is based mostly on a critical review of the documentation. No tools or other assessment methods were applied that could enhance the credibility of the quality rating. The quantitative reliability study could be significantly improved with the information that is now available from the qualitative analyses performed by IRSN and ISTec using the QAC, Polyspace Verifier, Claire, Gatel and RETRANS tools.
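In its simplest form, such a model chains the quality of successive life-cycle phases into a probability that the final product is fault-free. The sketch below shows that chain structure for a subset of the variables listed above; the conditional probabilities are arbitrary placeholders, not the VTT/STUK model parameters.

```python
# Minimal chain-structured model of development quality.
# Each phase either preserves or degrades product correctness; the
# conditional probabilities are arbitrary illustrative values, NOT
# taken from the VTT/STUK Bayesian network.

phases = [
    "requirement specification",
    "concept design",
    "detailed design",
    "application C-code",
    "code compilation and linking",
]

# P(phase output fault-free | phase input fault-free), hypothetical:
p_good_given_good = {
    "requirement specification": 0.98,
    "concept design": 0.97,
    "detailed design": 0.96,
    "application C-code": 0.95,
    "code compilation and linking": 0.99,
}

def p_fault_free(phases, cond_probs):
    """Probability the end product is fault-free, assuming a fault,
    once introduced, propagates through all later phases."""
    p = 1.0
    for phase in phases:
        p *= cond_probs[phase]
    return p

p = p_fault_free(phases, p_good_given_good)
print(f"P(product fault-free) = {p:.3f}")
```

The point the slide makes is about the inputs, not the arithmetic: evidence from tool-based analyses (static analysis, coverage results) could replace the purely judgement-based conditional probabilities and make the resulting figure far more credible.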

  24. Comparative assessment conclusions • Actual findings were identified (in the requirements specification and the detailed design); this confirms the need for both internal and independent verification and validation processes; • Assessment tools (both in-house and standard) can considerably enhance the depth of the assessment and the credibility of the evaluation; • Quantitative software reliability analysis is a useful analysis item that could also be used in PSA studies. The credibility of the analysis could be enhanced by information from the qualitative analyses performed with the various analysis tools used in the exercise.
