Industrial practice on mixed-criticality engineering and certification in the aerospace industry

Presented at DATE 2013 / WICERT 2013, 2013-03-22. Covers: Development process in Aerospace; Development process – Definition; Development process – Integration and V&V; Verification vs. Validation.


Industrial practice on mixed-criticality engineering and certification in the aerospace industry


DATE 2013 / WICERT 2013

Who defines the requirements
  • Customer
    • Typically OEM – formal and … “less formal” definition
  • Civil Aviation Authority
    • FAA, EASA, etc., through Technical Standard Orders (TSO, ETSO)
  • Internal standards
    • Body performing the development

! Functional, non-functional, and process requirements

Who checks whether requirements are met
  • Customer
    • Typically OEM – Reviews, Validation, Integration testing
  • Civil Aviation Authority
    • FAA, EASA, etc. - Through Certification
  • Internal standards
    • Body performing the development – As defined by operating procedures. Typically aligned with aerospace development processes.
How is certification done
  • Civil Aviation Authority does not only check the requirements testing checklist.
  • CAA primarily controls the Development process and checks the evidence that it was followed
    • Typically, an agreement must be reached with the CAA on acceptable means of compliance/certification. Typically, this includes:
      • Meeting DO-254 / ED-80 for electronics
      • Meeting DO-178B / ED-12B for software (or newly DO-178C)
      • Meeting agreed CRI – Certification Review Item
    • The following items are agreed:
      • The means of compliance
      • The certification team involvement in the compliance determination process
      • The need for test witnessing by CAA
      • Significant decisions affecting the result of the certification process
Why not have one process?
  • CS-22 (Sailplanes and Powered Sailplanes)
  • CS-23 (Normal, Utility, Aerobatic and Commuter Aeroplanes) – <5.6 t, <9 pax, or twin-prop <9 t, <19 pax
  • CS-25 (Large Aeroplanes)
  • CS-27 (Small Rotorcraft)
  • CS-29 (Large Rotorcraft)
  • CS-31GB (Gas Balloons)
  • CS-31HB (Hot Air Balloons)
  • CS-34 (Aircraft Engine Emissions and Fuel Venting)
  • CS-36 (Aircraft Noise)
  • CS-APU (Auxiliary Power Units)
  • CS-AWO (All Weather Operations)
  • CS-E (Engines)
  • CS-FSTD(A) (Aeroplane Flight Simulation Training Devices)
  • CS-FSTD(H) (Helicopter Flight Simulation Training Devices)
  • CS-LSA (Light Sport Aeroplanes)
  • CS-P (Propellers)
  • CS-VLA (Very Light Aeroplanes)
  • CS-VLR (Very Light Rotorcraft)
  • AMC-20 (General Acceptable Means of Compliance for Airworthiness of Products, Parts and Appliances)
Small vs. large aircraft (CS-23 vs. CS-25)

Table modified from FAA AC 23.1309-1D

Figure from FAA AC 25.1309-1A

  • >10⁻⁵ per flight hour – can happen to any aircraft
  • 10⁻⁵ to 10⁻⁹ per flight hour – happens to some aircraft in the fleet
  • <10⁻⁹ per flight hour – never expected to happen
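The probability bands above can be sketched as a small classifier. The thresholds follow the qualitative categories of FAA AC 25.1309-1A; the function and category names are illustrative, not from the slide.

```python
def classify(prob_per_flight_hour):
    """Map a per-flight-hour failure probability to the qualitative
    categories sketched above (thresholds per FAA AC 25.1309-1A)."""
    if prob_per_flight_hour > 1e-5:
        return "probable"                # can happen to any aircraft
    elif prob_per_flight_hour >= 1e-9:
        return "improbable"              # happens to some aircraft in the fleet
    else:
        return "extremely improbable"    # never expected to happen
```

For example, a failure condition with an expected rate of 10⁻⁷ per flight hour falls in the middle band.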

Software-related standards – DO-178B(C)
  • Development Assurance Level for software (A-D)
    • Similar definition also exists for Item, Function …
  • Specifies
    • Minimum set of design assurance steps
    • Requirements for the development process
    • Documents required for certification
    • Specific Objectives to be proven
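The level-to-severity mapping behind the A–D scale can be written down directly. The mapping is standard DO-178B/C; the lookup helper and its name are just an illustration.

```python
# DO-178B/C software levels and the worst failure condition each guards
# against (Level E, "no safety effect", carries no certification objectives).
DAL = {
    "A": "Catastrophic",
    "B": "Hazardous / Severe-Major",
    "C": "Major",
    "D": "Minor",
}

def required_level(failure_condition):
    """Return the software level needed for a given worst-case
    failure condition (simple lookup, illustrative only)."""
    for level, condition in DAL.items():
        if condition.startswith(failure_condition):
            return level
    raise ValueError(f"unknown failure condition: {failure_condition}")
```

So software whose misbehaviour can contribute to a catastrophic failure condition must be developed to Level A, with the largest set of objectives to be proven.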
More than one function
  • Two situations:
    • Single-purpose device (e.g. NAV system, FADEC – Full Authority Digital Engine Control, etc.)
      • Typically developed and certified to a single assurance level
      • Exceptions exist
    • Multi-purpose device (e.g. IMA – Integrated Modular Avionics)
Mixed criticality
  • Same HW runs SW of mixed criticality
    • For SW, this more or less equals mixed DAL
  • Implies additional requirements
    • Aircraft class specific
    • CS-25: Very strict
      • Time and space partitioning (e.g. ARINC 653)
      • Hard real-time execution determinism (problematic on multicores)
      • Standards for inter-partition communication
    • CS-23: Less strict
      • Means to achieve safety are negotiable with certification body
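The time-partitioning idea mentioned for CS-25 can be sketched in a few lines: a fixed major frame is divided into windows, each statically owned by one partition, so a misbehaving low-DAL partition cannot steal time from a high-DAL one. The window lengths and partition names below are invented for illustration; a real ARINC 653 schedule lives in the module's configuration data.

```python
# Illustrative ARINC 653-style partition schedule: (partition, window ms).
# A critical partition may appear more than once per major frame.
SCHEDULE = [
    ("P1_flight_ctrl", 20),
    ("P2_display", 10),
    ("P1_flight_ctrl", 20),
    ("P3_maintenance", 10),
]
MAJOR_FRAME = sum(d for _, d in SCHEDULE)  # 60 ms, repeats forever

def active_partition(t_ms):
    """Which partition owns the CPU at time t_ms since boot?"""
    t = t_ms % MAJOR_FRAME
    for name, duration in SCHEDULE:
        if t < duration:
            return name
        t -= duration
```

Because the schedule is static, the partition active at any instant is decidable offline, which is exactly what makes the temporal behaviour composable on a single core.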
Design assurance, verification, certification
  • Design assurance + qualification → does it work?
    • Functional verification
    • Task schedulability
    • Worst-case execution time analysis
    • … and includes any of the verification below
  • Verification → does it work as specified?
    • Verifies requirements
      • Depending on the aircraft class and agreed certification baseline, requirements might include response time or other definition of meeting the temporal requirements
        • On singlecore platforms, this is composable – isolation is guaranteed
        • On multicore platforms – currently in negotiations (FAA, EASA, Industry, Academia, …)
      • Requirements are typically verified by testing
  • Certification → is it approved as safe?
    • Showing the evidence of all the above
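For the task-schedulability item above, one classic single-core check is the Liu & Layland utilization bound for rate-monotonic scheduling. This is a sufficient (not necessary) test, shown here as a minimal sketch; task sets and names are illustrative.

```python
def rm_utilization_test(tasks):
    """Sufficient schedulability test for fixed-priority rate-monotonic
    scheduling on a single core (Liu & Layland, 1973).

    `tasks` is a list of (wcet, period) pairs in a common time unit.
    Returns (schedulable, total_utilization, bound)."""
    n = len(tasks)
    utilization = sum(wcet / period for wcet, period in tasks)
    bound = n * (2 ** (1 / n) - 1)  # tends to ln(2) ~ 0.693 as n grows
    return utilization <= bound, utilization, bound
```

A task set that fails this test is not necessarily unschedulable; an exact response-time analysis would then be the next step. Note that all of this assumes single-core isolation, which is precisely what breaks down on multicores.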
Software verification means (1)
  • Dynamic verification
    • Testing – based on test cases prepared during the development cycle. Integral part of the development phase.
      • Unit testing – Is the implementation correct?
      • Integration testing – Do the units work together as anticipated?
      • System testing – Does the system perform according to (functional and non-functional) requirements?
      • Acceptance testing – As black-box - does the system meet user requirements?
    • Runtime analysis – Memory usage, race detection, profiling, assertions, contract checking … and sometimes (pseudo)WCET
    • Domain verification – not with respect to requirements, but e.g. checking for contradictions in requirements, syntax consistency, memory leaks
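The "assertions, contract checking" item under runtime analysis can be made concrete with a tiny design-by-contract helper: check a precondition on the inputs and a postcondition on the result at run time. The decorator and the toy altitude/pressure function below are invented for illustration (the pressure model is a crude ISA-like curve, not flight software).

```python
def contract(pre=None, post=None):
    """Minimal run-time contract checker: verify a precondition on the
    arguments and a postcondition on the result (illustrative sketch)."""
    def wrap(fn):
        def inner(*args, **kwargs):
            if pre is not None:
                assert pre(*args, **kwargs), f"precondition failed: {fn.__name__}"
            result = fn(*args, **kwargs)
            if post is not None:
                assert post(result), f"postcondition failed: {fn.__name__}"
            return result
        return inner
    return wrap

@contract(pre=lambda alt_ft: 0 <= alt_ft <= 60000,   # sane altitude range
          post=lambda p: p > 0)                      # pressure must be positive
def pressure_at_altitude(alt_ft):
    # Crude ISA-like barometric model, hPa; for illustration only.
    return 1013.25 * (1 - 6.875e-6 * alt_ft) ** 5.2559
```

Calling `pressure_at_altitude(-100)` trips the precondition immediately, turning a silent out-of-domain computation into a detectable failure during testing.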
Software verification means (2)
  • Static analysis – type checking, code coverage, range checking, etc.
  • Symbolic execution
  • Formal verification – formal definition of correct behavior, formal proof, computer-assisted theorem proving, model checking, equivalence checking
    • Not much used today … just yet
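The core idea behind model checking is small enough to sketch: exhaustively explore every reachable state of a finite model and check a safety invariant in each one. Everything below (the explorer, the two-signal toy model, its invariant) is an illustrative miniature, not an industrial tool.

```python
def model_check(initial, transitions, invariant):
    """Tiny explicit-state model checker: exhaustively explore all
    reachable states; return (True, None) if the invariant holds
    everywhere, else (False, counterexample_state)."""
    seen, frontier = {initial}, [initial]
    while frontier:
        state = frontier.pop()
        if not invariant(state):
            return False, state          # counterexample found
        for nxt in transitions(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return True, None

# Toy model: two signals that must never both be "green".
def step(state):
    if state == ("red", "red"):
        return [("green", "red"), ("red", "green")]
    return [("red", "red")]              # a green signal always returns to red
```

Here `model_check(("red", "red"), step, lambda s: s != ("green", "green"))` visits all three reachable states and confirms the invariant; real model checkers apply the same principle to astronomically larger (often symbolic) state spaces.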
Verification for certification
  • Typical approach is to heavily rely on testing, supported by static analysis and informal verification
  • DO-178C (recent update of DO-178B) defines requirements for using
    • Qualified tools for Verification and Development – DO-330
    • Model-based development and verification – DO-331
    • Object-oriented technologies – DO-332 (that’s right, 2012)
    • Formal methods – DO-333
      • To complement (not replace) testing
  • Tool qualification
    • Very expensive
      • Tools that can insert errors: basically follow the same process as SW development at the same DAL. Usable only for small tools for specific purposes.
      • Tools that replace testing: for DAL A and B, developed and tested using a similar development process; for C and D, a subset suffices. COTS tools can be used.
Near future
  • Replacement of (some) testing by qualified verification tools

  → Towards formal verification

  • Adoption of DO-178C instead of DO-178B
  • Definition of technical and process requirements for multicore platforms
  • Growth of model-based development and verification
  • Enablers for incremental certification (which assumes composability of safety)

... and that’s it!

Ondřej Kotaba