
Developing IV&V Information Assurance Analysis Techniques




Presentation Transcript


  1. Roger Harris Joelle Loretta Developing IV&V Information Assurance Analysis Techniques -or- Paranoia for Fun and Profit

  2. Agenda • IA definition • Integrating IA analysis into IV&V processes • Selecting projects • Understanding minimum success • Determining criticality (IV&V PBRA/RBA) • Creating and refining analysis techniques and methods

  3. Information Assurance • Managing risk related to the systems that use, process, store, and transmit data • NASA’s focus • Protect data and systems from threats • Ensure continuity of operations • What threats exist to NASA data?

  4. Selecting Projects • What kind of projects need Information Assurance? • Those that store, use, process, or rely upon data • What projects don't do that? • What risks must Information Assurance address? • Confidentiality • Integrity • Availability

  5. Minimum Success • Minimum Success Criteria for information systems must provide for the confidentiality, integrity, and availability of: • User Data • Telemetry, scientific data, data stores, medical records, financial records • System Configuration • Router firmware settings, modem software, database connections, administrative settings, radio frequencies, crypto keys • System Software • COTS, Modified COTS, GOTS, Open source software, cryptographic implementations, managing interfaces between systems

  6. Determining Criticality • To determine criticality, the IV&V Program uses: • Portfolio Based Risk Assessments (PBRA) • Risk Based Assessments (RBA) • Need to establish criteria • Many sources exist to help determine the impact of an IA event harming a system • Few sources help to determine likelihood – opportunity for future research • National Institute of Standards and Technology (NIST) recommends using qualitative or quantitative techniques • Each has drawbacks

  7. Qualitative Likelihood • Qualitative example: • Using the Facilitated Risk Assessment Process (FRAP) to guesstimate • Project Impact Assessment Checklist • FRAP bases likelihood on whether the analyst expects the threat to be exploited within the next year. • NIST focuses on threats from capable and motivated threat sources
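A FRAP-style qualitative rating can be sketched as a small lookup. The level names and matrix values below are illustrative assumptions for this sketch, not values taken from the FRAP documentation; only the yes/no "exploited within a year" driver comes from the slide.

```python
# Minimal sketch of a qualitative likelihood-by-impact rating.
# The 3x3 matrix values here are illustrative, not from FRAP itself.

LEVELS = ("low", "medium", "high")

# Illustrative rating matrix: RISK_MATRIX[likelihood][impact]
RISK_MATRIX = {
    "low":    {"low": "low",    "medium": "low",    "high": "medium"},
    "medium": {"low": "low",    "medium": "medium", "high": "high"},
    "high":   {"low": "medium", "medium": "high",   "high": "high"},
}

def qualitative_risk(expect_exploit_within_year, analyst_confidence, impact):
    """Rate risk from the analyst's yes/no exploitation expectation.

    FRAP-style likelihood is driven by whether the analyst expects the
    threat to be exploited within the next year; confidence refines it.
    """
    if not expect_exploit_within_year:
        likelihood = "low"
    else:
        likelihood = "high" if analyst_confidence == "high" else "medium"
    if impact not in LEVELS:
        raise ValueError(f"impact must be one of {LEVELS}")
    return RISK_MATRIX[likelihood][impact]
```

The drawback the slides note is visible here: every number in the matrix is a judgment call, so two analysts can rate the same system differently.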

  8. Quantitative Likelihood • Quantitative Example: • Using statistics to establish the rate of specific IA events occurring • Log analysis to determine probability • We know we prevented many kinds of events, but what about ones we don’t know we prevented, or the ones we don’t know we did not prevent? • NASA Security Operations Center and Code 700 have raw data on attacks
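The log-analysis approach can be sketched as a rate estimate from event counts. The event names are hypothetical, and the Poisson model is one common (assumed) way to turn an observed rate into a probability of occurrence over the next year; it inherits the blind spot the slide describes, since unobserved events contribute nothing to the counts.

```python
import math
from collections import Counter

# Minimal sketch of quantitative likelihood estimation from security
# logs. Event names are made up; a Poisson model (an assumption) turns
# the annualized rate into P(at least one occurrence next year).

def annual_event_probability(events, observation_days):
    counts = Counter(events)
    probs = {}
    for event, n in counts.items():
        rate_per_year = n * 365.0 / observation_days
        # P(at least one event in a year) under a Poisson assumption
        probs[event] = 1.0 - math.exp(-rate_per_year)
    return probs

# One year of (hypothetical) classified log events
logs = ["ssh_bruteforce", "ssh_bruteforce", "sql_injection"]
probs = annual_event_probability(logs, observation_days=365)
```

Raw attack data of the kind the NASA Security Operations Center holds would feed the `events` list; the model says nothing about attacks that were never detected.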

  9. Creating new methods • Methods drafted based on the milestones of the Risk Management Framework • Helps fill the gap between the project creating the controls and the evaluator reviewing the controls • Need more methods • How does this impact code analysis? • How do we perform configuration analysis? • How do we decide if a COTS product is safe? • How do we ensure that software is integrated securely? • You can’t test for everything, so how do we know we’ve tested enough?

  10. Drafted methods • Based on NIST Risk Management Framework (RMF) • Security Requirements Validation • Security Control Selection • Security Control Implementation • Security Remediation • Security Authorization Analysis

  11. Code Analysis • Secure Coding Standards • https://www.securecoding.cert.org • Code inspection tools • Klocwork, HP Fortify, MS SDL • Top 25 most dangerous software errors: • SQL Injection • OS Command Injection • Buffer Overflow • Cross-Site Scripting • Missing authentication for critical functions • See the rest at http://cwe.mitre.org/top25/
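The first entry on that list can be shown in a few lines. The table, data, and payload below are made up, and `sqlite3` stands in for whatever database driver a project actually uses; the pattern (string concatenation vs. parameter binding) is what code inspection looks for.

```python
import sqlite3

# Minimal sketch of SQL injection and its standard mitigation,
# parameterized queries. Table and data are hypothetical.

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

def find_user_unsafe(name):
    # VULNERABLE: attacker-controlled input concatenated into the SQL text
    return conn.execute(
        f"SELECT role FROM users WHERE name = '{name}'").fetchall()

def find_user_safe(name):
    # SAFE: the driver binds the value; input is never parsed as SQL
    return conn.execute(
        "SELECT role FROM users WHERE name = ?", (name,)).fetchall()

payload = "' OR '1'='1"
# Unsafe: the payload rewrites the WHERE clause and leaks every row.
# Safe: the payload is treated as a literal name and matches nothing.
```

Static analyzers such as those named above flag the concatenation pattern; the parameterized form is the fix they recommend.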

  12. Configuration Analysis • Configuration Analysis depends entirely on the system under discussion • Domain experts relied upon to use industry best practices • NIST RMF and manual inspection by domain experts are good mitigations, however the best defense is proper configuration management • Common types of problems with configurations • Default system accounts enabled, allowing unknown remote attacker to connect to device and view data • Default or incorrect ports opened on perimeter devices allowing remote attacker to infiltrate network • Disallowed protocols enabled on end user devices allowing a remote attacker to enumerate running services on those devices • Previous configurations lost when device reimaged, resulting in loss of service while technicians scrambled to create a new configuration • Each of these examples occurred at GSFC within the past 12 months
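The common problems listed above lend themselves to automated checks. The config representation, account names, and port policy below are hypothetical; a real audit would parse vendor-specific configurations against a site security policy.

```python
# Minimal sketch of automated configuration checks for the problem
# classes above. Policy values and config format are hypothetical.

DISALLOWED_PORTS = {23, 2049}          # e.g. telnet, NFS at the perimeter
DEFAULT_ACCOUNTS = {"admin", "cisco", "root"}

def audit_config(config):
    findings = []
    for account in config.get("enabled_accounts", []):
        if account in DEFAULT_ACCOUNTS:
            findings.append(f"default account enabled: {account}")
    for port in config.get("open_ports", []):
        if port in DISALLOWED_PORTS:
            findings.append(f"disallowed port open: {port}")
    if not config.get("config_backed_up", False):
        findings.append("no configuration backup (reimage would lose settings)")
    return findings

# Hypothetical perimeter device exhibiting three of the problems above
device = {"enabled_accounts": ["admin", "jsmith"],
          "open_ports": [443, 23],
          "config_backed_up": False}
findings = audit_config(device)
```

Checks like these complement, rather than replace, the configuration management discipline the slide calls the best defense.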

  13. COTS Analysis • Private industry is notorious for holding vulnerabilities close to the vest • Dynamic testing • System Inspection • Penetration Testing • Public NIST resources • National Vulnerability Database • Security Content Automation Protocol • Security Checklists • Integrating multiple systems
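Screening a COTS product against the National Vulnerability Database can be sketched as a filter over CVE records. The record shape, product name, and CVE IDs below are invented for the sketch; real data would come from an NVD data feed or API query, typically matched on CPE identifiers rather than a bare product string.

```python
# Minimal sketch of screening a COTS product against NVD-style CVE
# records. The records below are made up; real records come from NVD.

SAMPLE_NVD_RECORDS = [
    {"cve_id": "CVE-2099-0001", "product": "examplerouter", "cvss": 9.8},
    {"cve_id": "CVE-2099-0002", "product": "examplerouter", "cvss": 4.3},
    {"cve_id": "CVE-2099-0003", "product": "otherproduct",  "cvss": 7.5},
]

def high_severity_cves(records, product, threshold=7.0):
    """Return IDs of known CVEs for `product` at or above a CVSS threshold."""
    return [r["cve_id"] for r in records
            if r["product"] == product and r["cvss"] >= threshold]
```

Because vendors hold vulnerabilities close to the vest, an empty result here means "nothing published", not "nothing present", which is why the slide pairs database checks with dynamic testing and penetration testing.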

  14. Wrap-up • Going Forward • Work on likelihood criteria for PBRA • Will require research, coordination, and collaboration • Develop and refine new methods • Drafted methods for RMF are good start • Will create methods for integration, testability, code, configuration, and COTS analysis • Must analyze the tradeoffs between usability and security • Work with West Virginia University • NSA Center of Excellence for training and research in computer security

  15. Questions • ?

  16. Backup Slides

  17. Managing Risk • NIST Risk Management Framework • Process to determine inherent risk of the information system • Steps to generate appropriate controls at a system level to protect the system against known attacks • Encourages Defense in Depth • Creates bureaucracy to deal with tracking controls • Does not enact, implement, or enforce controls • Does not demonstrate the line between “good enough” and “too much” security • Covers protecting system infrastructure and data, but not code

  18. Security Life Cycle • Organizational inputs: policies, security requirements, architecture • Step 1: Categorize Information System (FIPS 199 / SP 800-60). Is the system correctly scoped and categorized? • Step 2: Select Security Controls (FIPS 200 / SP 800-53). Have appropriate controls been selected? • Step 3: Implement Security Controls (SP 800 series). Are common, hybrid, and specific controls allocated to system elements? Are controls correctly implemented in the design? • Step 4: Assess Security Controls (SP 800-53A). Verify controls implementation (AIDT); evaluate residual risks • Step 5: Authorize Information Systems (SP 800-37). Review the Security Authorization Package: Security Plan, Security Assessment Report, and Plan of Actions & Milestones (POAM); evaluate residual risk • Step 6: Monitor Security Controls (SP 800-53A). Review the Security Assessment Report; re-assess remediations; evaluate residual risk

  19. Outputs of RMF • Consolidates documentation such as: • Security Controls • Risk Analysis • System Description and Boundary • Disaster Recovery processes • Continuity of Operations documentation • Contingency plans • Configuration Management Plans

  20. Security Requirements Validation - Security Risk Management Step 1 (NIST SP 800-37) • Verify system categorization is appropriate for selection of Security Controls and validate security requirements meet system needs • 1.0 Review Security Plans and Enterprise Architecture • 1.1 Develop an understanding of how the system owner and developer documents (or plans to document) system and security plans, specifications, and procedures. • 1.2 Identify applicable federal government (FISMA, FIPS, NIST 800) and institutional security requirements and guidelines. • 1.3 Review the project's security plans to evaluate and understand the project's approach to complying with (1.2) above. • 1.4 Develop an understanding of the organization's enterprise architecture to include common security controls, e.g., organizational PKI solution. • 2.0 Review Security Categorization • 2.1 Identify and review the system decomposition • 2.2 Verify that the security categorization per CNSS 1253 (National Security System) or FIPS Publication 199 (other than national security system) as documented in the Security Plan is consistent with the purpose and use of the system. The security categorization is used to select Security Controls. Subsystems may be separately categorized, enabling different allocations of security controls (in RMF Step 2). • 3.0 Review System Description • 3.1 Descriptive information about the information system is documented in the system identification section of the Security Plan, included in attachments to the plan, or referenced in other standard sources for information generated as part of the system development life cycle. Review system descriptions to develop an understanding of the system in sufficient detail to support subsequent IV&V evaluation of security controls. A list of recommended system description information is provided in NIST SP 800-37 RMF Task 1-2.
• 4.0 Verify Security Requirements • 4.1 Using information from steps 1.0, 2.0, and 3.0 above, verify that security requirements satisfy applicable legislation, policies, directives, regulations, standards, and organizational mission/business/operational requirements and are appropriate for the Security categorization from Step 2.0 above.

  21. Verify Security Control Selection - Security Risk Management Framework (RMF) Step 2 (NIST SP 800-37) • Verify that selected security controls are appropriate for the system and its security categorization • 1.0 Review Common Controls • 1.1 Review and understand organizational common controls to be inherited by the system as documented in organizational security plans. • 2.0 Verify Security Controls • Using the guidance provided in NIST SP 800-53, evaluate whether the security controls for the system are appropriate for the system categorization and are capable of meeting system security requirements. Supporting tasks include: • 2.1 Verify that baseline security controls are identified for broad system application • 2.2 Verify that baseline security controls are tailored by applying scoping, parameterization, and compensating control guidance • 2.3 Verify the tailored baseline security controls are supplemented, if necessary, with additional controls and/or control enhancements to address unique organizational needs based on a risk assessment (either formal or informal) and local conditions including environment of operation, organization-specific security requirements, specific threat information, cost-benefit analyses, or special circumstances • 2.4 Verify minimum assurance requirements are identified, as appropriate. • 2.5 Verify that a sound rationale is provided for security control selection decisions. • 2.6 Verify that the intended application of each security control is sufficiently defined to enable a compliant implementation of the control. • 2.7 Verify that security controls for external interfaces are consistent with the security requirements and controls for the system.
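The baseline and tailoring checks above amount to set comparisons between the Security Plan and the expected baseline. The control IDs below follow the NIST SP 800-53 family-number style, but the specific baseline, plan contents, and rationale text are illustrative assumptions.

```python
# Minimal sketch of RMF Step 2 verification tasks 2.1-2.5: compare
# documented controls against the baseline and check that every
# deviation has a tailoring rationale. Baseline shown is illustrative.

BASELINE = {"AC-2", "AU-2", "IA-2", "SC-7"}   # assumed baseline for the sketch

def verify_control_selection(documented, tailoring_rationale):
    missing = BASELINE - documented
    # A missing baseline control needs a documented rationale (task 2.5)
    unjustified = {c for c in missing if c not in tailoring_rationale}
    # Controls beyond the baseline are supplements (task 2.3)
    supplemental = documented - BASELINE
    return {"missing": missing,
            "unjustified": unjustified,
            "supplemental": supplemental}

# Hypothetical Security Plan: drops IA-2 (inherited) and adds SI-4
plan = {"AC-2", "AU-2", "SC-7", "SI-4"}
result = verify_control_selection(
    plan, tailoring_rationale={"IA-2": "inherited as organizational common control"})
```

An empty `unjustified` set is what the IV&V analyst is looking for; any entry there is a finding against task 2.5.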

  22. Verify Security Control Implementation - Security Risk Management Framework (RMF) Step 3 (NIST SP 800-37) • Verify that selected security controls are correctly implemented within the system. • 1.0 Life Cycle Analysis • 1.1 When viewing Security Requirements as any class of requirements, standard IV&V life cycle analysis tasks within the catalog of methods shall apply, such as those methods addressing traceability analysis, design analysis, implementation analysis, and test analysis. • The remainder of the Analysis Steps in this procedure are specific to Security capabilities and are supplemental to and do not replace the Life Cycle Analyses as described in Analysis Step 1.0 above. • 2.0 Security Design Analysis • 2.1 Verify that common, system specific, and hybrid controls are correctly allocated to system design elements • 2.2 Verify that security designs are consistent with the enterprise architecture • 2.3 Verify that selected COTS products meet security requirements and are compatible with the security architecture and capable of supporting controls allocated to these COTS products. COTS products include off-the-shelf software, computing operating systems, network appliances, or other forms of network attached complex electronics. • 2.4 Verify that COTS products have been subjected to third party security assessments, such as vulnerability scans. • 3.0 Security Implementation Analysis • 3.1 Verify that security related programming standards and common security solutions are correctly implemented within the software. • 3.2 Verify COTS product implementation sufficiently protects against exploitation of any known security vulnerabilities • 3.3 Verify security configuration settings for software, hardware, firmware, appliances, or complex electronics (to include network systems) are consistent with the security design, security procedures, and external operating environment (e.g., third party WAN).

  23. Verify Security Remediations - Security Risk Management Framework (RMF) Step 4 (NIST SP 800-37) • Verify that security weaknesses and deficiencies are corrected • 1.0 Evaluate Security Assessment Report Findings and Recommendations • 1.1 Verify Security Assessment Report findings are consistent with other sources of information and are a correct basis for stated recommendations. This includes evaluating findings for false positives. Review IV&V security related findings to determine whether such findings should be included in the Security Assessment Report. • 1.2 Verify that recommended actions are appropriate and given the right priority. • 2.0 Remediation Analysis • 2.1 If weaknesses or deficiencies in security controls are corrected, reassess the remediated controls for effectiveness. Security control reassessments determine the extent to which the remediated controls are implemented correctly, operating as intended, and producing the desired outcome with respect to meeting the security requirements for the information system.

  24. System Authorization Analysis - Security Risk Management Framework (RMF) Step 5 (NIST SP 800-37) • Verify that any residual risk is adequately addressed in the Plan of Actions and Milestones (POAM) • 1.0 Review POAM • The POAM contains weaknesses or deficiencies at the time the authorization package is sent to the authorizing official. Verify that planned actions and schedules apply appropriate emphasis to remediation of weaknesses and deficiencies and that no weaknesses or deficiencies present sufficient risk to warrant a disapproval for system operation, considering: • 1.1 The security categorization of the information system; • 1.2 The specific weaknesses or deficiencies in the security controls; • 1.3 The importance of the identified security control weaknesses or deficiencies (i.e., the direct or indirect effect the weaknesses or deficiencies may have on the overall security state of the information system, and hence on the risk exposure of the organization, or ability of the organization to perform its mission or business functions); and • 1.4 The organization’s proposed risk mitigation approach to address the identified weaknesses or deficiencies in the security controls (e.g., prioritization of risk mitigation actions, allocation of risk mitigation resources). • A risk assessment per NIST SP 800-30 guides the prioritization process for items included in the plan of action and milestones • 2.0 Communicate Operational Issues • Communicate to the authorizing official any issues that warrant disapproval for system operation.
