
HCMDSS Panel Software and Systems Engineering


Presentation Transcript


  1. HCMDSS Panel: Software and Systems Engineering. John Anton, Kestrel Institute. November 16-17, 2004

  2. State of commercial art
  • How it goes today (roughly): requirements --> spec (maybe UML) --> (partially automated) code production --> testing (unit, integration, model checking) [spiral]
  • Use ‘best practices’ (e.g., CMM-N)
  • UML-based tools
  • LabVIEW, MathWorks (Matlab, Stateflow, Simulink), Modelica
  • Documentation support (e.g., through UML tools, 3GL IDEs, etc.)
  • Quality assurance
    • In-house QA, COTS tools, outsourced services
  • Problems
    • air gaps
    • referential integrity
    • tool semantics, tool integration
    • code visibility/accessibility (e.g., LabVIEW, MathWorks)
    • code portability (e.g., MathWorks)
    • property assessment on code
    • MC/DC testing impracticality (see the sketch after this list)
    • high assurance can be at odds with code clarity
    • non-uniformity of product design policies and their application
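To make the MC/DC cost concrete, here is a minimal Java sketch (the decision, names, and scenario are hypothetical, not from the talk). MC/DC requires showing that each condition independently affects the decision outcome, so even a three-condition decision needs at least four test vectors, and the count grows with every condition added.

```java
// Hypothetical sketch of MC/DC cost; run with "java -ea McdcSketch" so assertions are enabled.
public class McdcSketch {

    // A compound decision with three conditions: (a && b) || c
    static boolean shouldAlarm(boolean lowBattery, boolean pumpRunning, boolean occlusion) {
        return (lowBattery && pumpRunning) || occlusion;
    }

    public static void main(String[] args) {
        // Four vectors give MC/DC for three conditions:
        // lowBattery toggled while (pumpRunning=T, occlusion=F) flips the outcome,
        // pumpRunning toggled while (lowBattery=T, occlusion=F) flips the outcome,
        // occlusion toggled while (lowBattery=F, pumpRunning=T) flips the outcome.
        assert  shouldAlarm(true,  true,  false);
        assert !shouldAlarm(false, true,  false);
        assert !shouldAlarm(true,  false, false);
        assert  shouldAlarm(false, true,  true);
        System.out.println("MC/DC vectors pass; the required set grows with each added condition.");
    }
}
```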

  3. Some current research for high assurance code
  • Best practice
    • SEI (CMM-N)
    • Praxis (best practice on steroids)
    • Others
  • Model checking
    • CMU (strong leadership)
    • NASA (with work from U Kansas)
    • U Cincinnati (BDDs)
    • Rockwell-Collins (with work from UT/Austin)
    • Others
  • Code QA suppliers
    • tool vendors
    • service providers
  • “N-GL” environments
    • Programmatica (OGI/Galois)
    • Eclipse (IBM, public domain)
    • Specware (Kestrel Institute, Kestrel Technology)
  • “Safe” code
    • Simple (MISRA) C (JPL with Kernighan & Ritchie support)
    • Safety-critical Java (The Open Group thrust with Bush, Bollella, Locke support)
  • Correct-by-construction technologies (see the sketch after this list)
    • Kestrel, NASA, Z, B, …
  • Automated certification support
    • AutoSmart (JavaCard, FIPS 140-2, Kestrel)
  • Reusable (certified) modules
    • Middleware (VU, Wash U, …)
    • Others
  • Aspect weaving
    • Code level (AspectJ, UBC, IBM)
    • Spec level (HandlErr, etc., Kestrel)
  • Others …
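As a loose, code-level illustration of the “safe code” and correct-by-construction themes above (a hypothetical Java sketch, not Kestrel's, JPL's, or The Open Group's actual technology): the safety constraint is encoded in a type so that an out-of-range value can never be constructed, rather than being re-checked ad hoc at every use site and relying on tests to catch omissions.

```java
// Hypothetical sketch: the invariant 0 <= rate <= MAX is established once, at construction.
public final class InfusionRate {
    private static final double MAX_ML_PER_HOUR = 1200.0;  // assumed device limit, illustrative only
    private final double mlPerHour;

    private InfusionRate(double mlPerHour) {
        this.mlPerHour = mlPerHour;
    }

    // The only way to obtain an InfusionRate; downstream code cannot hold an invalid value.
    public static InfusionRate of(double mlPerHour) {
        if (mlPerHour < 0.0 || mlPerHour > MAX_ML_PER_HOUR) {
            throw new IllegalArgumentException("rate out of range: " + mlPerHour);
        }
        return new InfusionRate(mlPerHour);
    }

    public double mlPerHour() {
        return mlPerHour;  // every instance satisfies the range constraint by construction
    }
}
```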

  4. Problems to address for HCMDSS
  • Language
    • Inconsistency, lack of precision
  • Multiple disciplines for regulatory evaluators to contend with
    • Software spectrum, domain details
  • Blank screen
    • For developers, testers, evaluators
  • Application code reuse has not met initial promise
    • Optimization, platforms, change impact, mismatched models, properties of composition (see the sketch after this list)
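The mismatched-models and composition problems can be made concrete with a small, hypothetical Java sketch: two components that are each reasonable in isolation but assume different models of the same quantity (mL/h versus mL/min), so naive reuse produces a wrong command until the mismatch is bridged explicitly.

```java
// Hypothetical sketch of a "mismatched models" composition failure.
public class CompositionMismatch {

    // Reused dosing library: computes a rate in mL per HOUR.
    static double computeRateMlPerHour(double doseMl, double durationHours) {
        return doseMl / durationHours;
    }

    // Reused pump driver: expects a rate in mL per MINUTE.
    static void commandPump(double rateMlPerMinute) {
        System.out.printf("pump commanded at %.2f mL/min%n", rateMlPerMinute);
    }

    public static void main(String[] args) {
        double rate = computeRateMlPerHour(120.0, 2.0);  // 60 mL/h

        // Naive composition passes the value straight through: a 60x overdose command.
        commandPump(rate);

        // The composition is only correct with an explicit conversion between the two models.
        commandPump(rate / 60.0);                         // 1 mL/min
    }
}
```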

  5. Considerations
  • Formal Jargon
  • Libraries of specifications

  6. Toward efficient (re)certification - Formal Jargon
  • What is it?
    • In each domain, a description in logic of basic terms, definitions, axioms, desirable properties, functionality, behavior, constraints (see the sketch after this list)
    • Organized in a semantically rich taxonomy (systematic evolution)
    • Developed, published and maintained as a standard
  • Why consider it?
    • Communication (developers, plug & play, FDA, …)
    • Improve economics in the certification process
    • Basis for (abstract) specification libraries
  • How to get there?
    • Consider development of a new “product line” of standards (NIST, The Open Group, OMG)
    • Domain participants collaborate with regulatory bodies (FAA, FDA, …)
    • Start with a single domain to serve as a style guide for others
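A toy illustration of what such a Formal Jargon fragment might look like, written in Lean (the infusion-pump micro-domain, its names, and its limit are hypothetical, not from the talk): basic terms, definitions, and one desirable property that is machine-checked rather than stated only in prose.

```lean
-- Hypothetical "Formal Jargon" fragment for an infusion-pump micro-domain.

-- Basic terms: a pump has a commanded rate and a hard safety limit (mL/h).
structure Pump where
  rate    : Nat
  maxRate : Nat

-- Definition: a pump state is safe when the commanded rate respects the limit.
def safe (p : Pump) : Prop := p.rate ≤ p.maxRate

-- Definition: requested rates are clamped to the limit.
def clamp (r limit : Nat) : Nat := if r ≤ limit then r else limit

def setRate (p : Pump) (r : Nat) : Pump := { p with rate := clamp r p.maxRate }

-- Small helper lemma about clamping.
theorem clamp_le (r limit : Nat) : clamp r limit ≤ limit := by
  unfold clamp
  split
  · assumption
  · exact Nat.le_refl limit

-- Desirable property, stated and proved once for the whole domain:
-- every rate change made through setRate leaves the pump in a safe state.
theorem setRate_safe (p : Pump) (r : Nat) : safe (setRate p r) :=
  clamp_le r p.maxRate
```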

  7. Toward efficient (re)certification - Specification and proof libraries
  • Use formal (standardized) language (Formal Jargon)
  • Libraries of specifications
    • Standardized, domain-specific language
    • Proven properties
    • Support ‘plug & play’
    • Address
      • functionality & behavior
      • interfaces (static and dynamic aspects)
      • “policies” (e.g., error handling)
    • Include reference implementations and compliance tests (see the sketch after this list)
  • Proof libraries
  • Mechanisms for field-time certification maintenance
    • Run-time monitoring archive review
    • Pharmaceutical experience -- but don’t wait for bad news
    • FAA framework for airplane maintenance
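A minimal, hypothetical Java sketch of one library entry in this spirit: an interface whose contract is stated up front, a compliance check that ships with the specification rather than with any one implementation, and a reference implementation that passes it.

```java
// Hypothetical sketch of a specification-library entry with its compliance test.
public class SpecLibrarySketch {

    // Contract: deliverBolus returns the volume actually delivered, which is never negative
    // and never exceeds the requested volume (the "policy" for partial delivery).
    interface BolusDelivery {
        double deliverBolus(double requestedMl);
    }

    // Compliance test bundled with the specification, applied to any implementation.
    static void checkCompliance(BolusDelivery impl) {
        double[] requests = {0.0, 0.5, 2.0, 10.0};
        for (double req : requests) {
            double delivered = impl.deliverBolus(req);
            if (delivered < 0.0 || delivered > req) {
                throw new AssertionError("contract violated for request " + req);
            }
        }
    }

    // Reference implementation included in the library entry.
    static class ReferenceDelivery implements BolusDelivery {
        public double deliverBolus(double requestedMl) {
            return requestedMl;  // delivers exactly the requested volume
        }
    }

    public static void main(String[] args) {
        checkCompliance(new ReferenceDelivery());
        System.out.println("reference implementation passes the compliance check");
    }
}
```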

  8. Summary
  • Promising directions
    • Formality
    • Abstraction
  • Challenges
    • Composition
    • “Policy” (design-level mandates)
    • Runtime uncertainties
    • COTS components and certification
    • Tech transfer

  9. Bio John Anton is the founder of Reasoning Systems and Kestrel Technology LLC, where he is now President/CEO. He is also President, CEO, and co-founder of the non-profit Lexia Institute, whose mission is to develop and deliver technology to help dyslexic people and their teachers. In addition, he is a Manager at the Kestrel Institute. Anton has expertise in control theory, signal processing, software technologies, and their application. As VP for Advanced R&D at Systems Control, Inc., he led a team that built the Reconfigurable Inflight Control System (RIFCS) for McDonnell Aircraft, using technology from CTRL-C (the predecessor to today's Matlab), which was also developed under his leadership. Anton was an Adjunct Professor at Santa Clara University, where for 10 years he taught courses in linear systems theory, optimal and stochastic control, and decision theory. He received a Ph.D. in Applied Mathematics from Brown and a B.S. from Notre Dame, and was a Fulbright Fellow at the Technische Hochschule, Germany.
