  1. Organically Assured and Survivable Information Systems (OASIS) Program
     Validation Framework Summarization: Survivability Coverage
     Dale Johnson (MITRE), Myong Kang (Mitretek), Doug Williams (MITRE)

  2. Objective
  • What is the space covered or protected by OASIS technologies?
  • From the information assurance and survivability validation framework summaries produced by the OASIS PIs, create a coverage matrix that shows the overall coverage the projects collectively provide against a standard list of vulnerabilities and attacks, yielding the five main information assurance and survivability attributes

  3. Abstract Coverage Matrix
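This slide's figure does not survive in the transcript, but the matrix it abstracts is described on the surrounding slides: rows for the five DoD attributes, columns for the standard vulnerabilities and attacks, and cells holding the projects and mechanisms that provide coverage. Below is a minimal sketch of that structure in Python; the attribute names come from the briefing, while the representation itself is an illustrative assumption:

```python
from collections import defaultdict

# The five DoD information assurance attributes (Joint Pub 3-13).
ATTRIBUTES = [
    "confidentiality",
    "integrity",
    "system availability",
    "authentication",
    "nonrepudiation",
]

# One row per attribute, one column per standard vulnerability/attack.
# Each cell collects the (project, mechanism) pairs whose rationale
# counters that vulnerability/attack to yield that attribute.
coverage = {attr: defaultdict(set) for attr in ATTRIBUTES}

def record(attribute, vulnerability, project, mechanism):
    """Add one (project, mechanism) entry to a matrix cell."""
    coverage[attribute][vulnerability].add((project, mechanism))
```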

  4. Basis (1 of 2)
  • PIs provided validation framework summaries for 24 projects using the standard format discussed at the last PI Meeting in July 2001
    • Thanks to the PIs for their excellent efforts
  • OASIS projects cover an extensive set of application domains
    • Domains emerged from an initial analysis of the projects
  • Validation framework summaries for the projects used various lists of vulnerabilities and attacks, which were not standardized across projects

  5. Basis (2 of 2)
  • Time (when) and place (where) of vulnerabilities and attacks were interpreted differently by various PIs: some were given at the point of origin, where a countermeasure/mechanism can be effective, and some at the point of effect of the vulnerability or attack
  • The standard format used the DoD attributes (Joint Pub 3-13, Joint Doctrine for Information Operations): confidentiality, integrity, system availability, authentication, and nonrepudiation

  6. Development of Coverage Matrix (1 of 3)
  • We grouped the 24 OASIS projects into the following application domain categories to provide some consistency for our analysis across the projects (one possible encoding is sketched after this list):
    • Modeling
    • Implementation/source code
    • Mobile code
    • In-line technologies
    • Distributed applications/middleware
    • Server (Web or mail) and client
    • Distributed file system
    • Database
    • Firmware
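The domain strings below mirror the list above; the project names are hypothetical placeholders, since the 24 real project names are not reproduced in this transcript:

```python
# Application domain categories from the slide above.
DOMAINS = {
    "modeling",
    "implementation/source code",
    "mobile code",
    "in-line technologies",
    "distributed applications/middleware",
    "server (web or mail) and client",
    "distributed file system",
    "database",
    "firmware",
}

# Hypothetical placeholders for the 24 OASIS projects.
project_domain = {
    "Project A": "mobile code",
    "Project B": "server (web or mail) and client",
}

assert all(domain in DOMAINS for domain in project_domain.values())
```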

  7. Development of Coverage Matrix (2 of 3)
  • We developed a "standard" list of 32 vulnerabilities and attacks, grouped by time (when): design, implementation, or operation, and by place (where): hardware/firmware, network, servers/clients, etc. (a sketch of these tags follows this list)
  • The list was derived from the sets of vulnerabilities and attacks provided by the PIs and from our discussions and analysis
  • A standard list is a "moving target" that is difficult to finalize, but the list we developed seemed to fit current needs
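The when/where grouping could be encoded as tags on each list entry, roughly as below; the two sample entries are invented for illustration, since the full list of 32 is not reproduced here:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class VulnOrAttack:
    name: str
    when: str   # time: "design", "implementation", or "operation"
    where: str  # place: "hardware/firmware", "network", "servers/clients", etc.

# Invented sample entries; the briefing's standard list has 32.
STANDARD_LIST = [
    VulnOrAttack("buffer overflow", "implementation", "servers/clients"),
    VulnOrAttack("flooding denial of service", "operation", "network"),
]
```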

  8. Development of Coverage Matrix (3 of 3)
  • We mapped the project vulnerabilities and attacks onto our standard list of vulnerabilities and attacks as closely as possible
    • Not a straightforward task, since interpretations can vary
  • We then used the rationale tables provided by the PIs to map from the standard list of vulnerabilities and attacks to the five standard attributes, with entries designating the projects and the corresponding mechanisms (countermeasures) that counter the vulnerabilities and attacks to yield the attributes (a sketch of this two-step mapping follows)
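Continuing the sketch, the two-step mapping might look roughly like this; the alias table and rationale rows are invented examples, and `record` is the helper from the matrix sketch above:

```python
# Step 1: map each project's own vulnerability/attack terms onto the
# standard list. Interpretations can vary, exactly as the slide notes,
# so this alias table is a judgment call.
ALIASES = {
    "stack smashing": "buffer overflow",
    "SYN flood": "flooding denial of service",
}

# Step 2: walk the PI-provided rationale rows and fill matrix cells.
# Assumed row format: (project, project's own term, mechanism,
# attribute the mechanism yields).
rationale_rows = [
    ("Project A", "stack smashing", "bounds checking", "integrity"),
    ("Project B", "SYN flood", "rate limiting", "system availability"),
]

for project, term, mechanism, attribute in rationale_rows:
    record(attribute, ALIASES.get(term, term), project, mechanism)
```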

  9. Result
  • A full coverage matrix (a small query sketch follows this list)
    • X-axis: our standard list of common vulnerabilities and attacks (reasons/motives for inserting mechanisms/countermeasures)
    • Y-axis: the five standard security attributes (positive results of inserting mechanisms/countermeasures)
    • Entries: projects and corresponding protection mechanisms from the rationales provided by the PIs
  • Coverage depends on the interpretation and effectiveness of the mechanisms that counter the vulnerabilities and attacks to yield forms of the standard attributes
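Given that structure, the matrix can be read in both directions: which projects and mechanisms sit in a given cell, and which cells are empty. A small query sketch over the hypothetical model above; empty cells are candidate coverage gaps, subject to the interpretation caveat on this slide:

```python
def who_covers(attribute, vulnerability):
    """The (project, mechanism) pairs in one matrix cell."""
    return coverage[attribute].get(vulnerability, set())

def gaps(standard_list):
    """Cells with no covering project: candidate coverage gaps."""
    return [
        (attribute, vuln.name)
        for attribute in ATTRIBUTES
        for vuln in standard_list
        if not coverage[attribute].get(vuln.name)
    ]
```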

  10. Possible Changes from PIs
  • Reclassify project placement in the application domain categories
  • Add or delete vulnerabilities and attacks that a project addresses
  • Add more mechanisms/countermeasures
  • Additions to the standard list of common vulnerabilities and attacks should be based on consensus

  11. Possible Next Steps
  • Analyze the application domains further to characterize projects more accurately within those domains
    • Provide simple abstract models of the domains
  • Determine the coverage of projects in greater detail relative to individual matrix entries, since entries correspond to shades of gray, not black or white (one possible refinement is sketched after this list)
  • Explore coverage with a view to selecting technologies for future integration efforts for building systems
    • Is it possible to produce a simple handbook to aid in selecting technologies for integration efforts?
  • Develop selection criteria for OASIS technologies
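The shades-of-gray point could eventually be made concrete by grading each matrix entry rather than treating a cell as simply covered or uncovered. This is purely a sketch of one possible refinement, with invented levels; the briefing does not specify any such scale:

```python
# Hypothetical effectiveness scale for a single matrix entry.
EFFECTIVENESS = {"none": 0, "partial": 1, "substantial": 2, "full": 3}

graded_entry = {
    "project": "Project A",
    "mechanism": "bounds checking",
    "effectiveness": "partial",  # a shade of gray, not black or white
}

assert graded_entry["effectiveness"] in EFFECTIVENESS
```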
