
CSCE 548 Secure Software Development Security Operations




  1. CSCE 548 Secure Software Development: Security Operations

  2. Reading • This lecture: • Security Operations, McGraw: Chapter 9 • Bridging the Gap between Software Development and Information Security, Kenneth R. van Wyk and Gary McGraw, http://www.cigital.com/papers/download/bsi10-ops.pdf • SANS, Software Security Institute, http://www.sans-ssi.org/ • Next lecture: • Review for Midterm

  3. Application of Touchpoints • [Figure: McGraw's software security touchpoints applied across development artifacts] • Touchpoints, in order of effectiveness: 1. Code Review (Tools), 2. Risk Analysis, 3. Penetration Testing, 4. Risk-Based Security Tests, 5. Abuse Cases, 6. Security Requirements, 7. Security Operations; plus External Review • Artifacts they apply to: Requirements and Use Cases, Architecture and Design, Test Plans, Code, Tests and Test Results, Feedback from the Field

  4. Traditional Software Development • No information security consideration • Highly distributed among business units • Lack of understanding of technical security risks

  5. Don’t stand so close to me • Best practices • A manageable number of simple activities • Should be applied throughout the software development process • Problem: • Software developers lack security domain knowledge → limited to functional security • Information security professionals lack understanding of software → limited to reactive security techniques

  6. Software Security Best Practices • Abuse cases • Business risk analysis • Architectural risk analysis • Security functionality testing • Risk-driven testing • Code review • Penetration testing • Deployment and operations

  7. Deployment and Operations • Configuration and customization of software application’s deployment environment • Activities: • Network-component-level • Operating system-level • Application-level

  8. Abuse Cases • Drive non-functional requirements and test scenarios • Need information security professionals to understand the attacker’s mindset • Requires collaboration between software developers and infosec people
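An abuse case can be made concrete as an attacker-driven test. The sketch below is a hypothetical example (the `Account` class and its lockout threshold are invented for illustration): the abuse case "an attacker brute-forces a password by guessing repeatedly" yields the non-functional requirement "lock the account after repeated failures", which the test then checks.

```python
class Account:
    """Minimal sketch of an account with a lockout policy (hypothetical)."""
    MAX_FAILURES = 3  # assumed policy threshold, not from the lecture

    def __init__(self, password):
        self._password = password
        self._failures = 0
        self.locked = False

    def login(self, attempt):
        if self.locked:
            return False  # locked accounts reject even correct passwords
        if attempt == self._password:
            self._failures = 0
            return True
        self._failures += 1
        if self._failures >= self.MAX_FAILURES:
            self.locked = True
        return False

# Abuse-case-driven test: repeated wrong guesses must trigger the lockout.
acct = Account("s3cret")
for _ in range(3):
    acct.login("guess")
assert acct.locked
assert not acct.login("s3cret")  # even the right password fails once locked
```

The point is the direction of the test: it exercises what an attacker would do, not what a legitimate user would do.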

  9. Business Risk Analysis • “Who cares?” • Business stakeholders • Technology assessment → need software-level assessment • Answers security-related questions: how much downtime is tolerable, cost of recovery, effect on reputation, etc.

  10. Architectural Risk Analysis • Assess the technical security exposures at system design-level • Evaluates business impact of technical risks • Infosec people: understanding of technology, e.g., application platform, frameworks, languages, functions, etc. • Real world feedback

  11. Security Testing • In addition to testing functional specifications and requirements, need to test for risk-based attacks • Requires understanding the attacker’s way of thinking
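A minimal sketch of the difference between functional and risk-based tests, using a hypothetical input validator (`is_valid_username` is invented for illustration, not from the lecture): functional tests confirm that valid input is accepted, while risk-based tests feed the validator attacker-style inputs.

```python
import re

def is_valid_username(s: str) -> bool:
    """Hypothetical whitelist validator for a username field."""
    return bool(re.fullmatch(r"[A-Za-z0-9_]{1,32}", s))

# Functional test: normal input is accepted.
assert is_valid_username("alice_01")

# Risk-based tests: think like an attacker, not like a user.
assert not is_valid_username("alice' OR '1'='1")   # SQL-injection style
assert not is_valid_username("a" * 10_000)         # flooding / length abuse
assert not is_valid_username("../../etc/passwd")   # path-traversal style
```

Whitelisting allowed characters, as above, is generally safer than trying to blacklist known-bad patterns.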

  12. Code Review • Requires knowledge of code • Need information about attacker’s way of thinking

  13. Penetration Testing • System penetration testing: driven by previously identified risks • Outside→in activity • Application penetration testing • Inside→out activity
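The outside→in perspective starts with what is reachable over the network. A sketch of the most basic such probe, a TCP connect check (this is an illustrative fragment, not a substitute for a real penetration test or tools like nmap):

```python
import socket

def probe(host: str, port: int, timeout: float = 0.5) -> bool:
    """Return True if a TCP connection to host:port succeeds (port is open)."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example usage: check a few common service ports on a host you are
# authorized to test.
for port in (22, 80, 443):
    print(port, "open" if probe("127.0.0.1", port) else "closed")
```

As the Red Team slide below notes, such probing must be requested or permitted by the system's owner.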

  14. Deployment and Operations • Configuration and customization of software application’s deployment environment • Fine tuning security functionality • Evaluate entire system’s security properties • Apply additional security capabilities if needed

  15. Who are the attackers? • Amateurs: regular users who exploit vulnerabilities of the computer system • Motivation: easy access to vulnerable resources • Crackers: attempt to access computing facilities for which they have no authorization • Motivation: enjoyment of the challenge, curiosity • Career criminals: professionals who understand the computer system and its vulnerabilities • Motivation: personal gain (e.g., financial)

  16. Attacker’s Knowledge • Insider • Understand organizational data, architecture, procedures, etc. • May understand software application • Physical access • Outsider • May not understand organizational information • May have software specific expertise • Use of tools and other resources

  17. Types of Attack • Interruption – an asset is destroyed, unavailable, or unusable (availability) • Interception – an unauthorized party gains access to an asset (confidentiality) • Modification – an unauthorized party tampers with an asset (integrity) • Fabrication – an unauthorized party inserts a counterfeit object into the system (authenticity) • Denial – a person denies having taken an action (non-repudiation)

  18. Vulnerability Monitoring • Identify security weaknesses • Methods: • Automated tools • Human walk-through • Surveillance • Audit • Background checks

  19. System Security Vulnerability • Software installation • Default values • Configurations and settings • Monitoring usage • Changes and new resources • Regular updates • Tools • Look for known vulnerabilities
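The "default values" and "configurations and settings" bullets can be operationalized by diffing a deployment's settings against a hardening baseline. A minimal sketch under assumed setting names (`debug`, `admin_password_default`, `tls_enabled` are hypothetical, chosen for illustration):

```python
# Hardening baseline: the values every production deployment should have.
BASELINE = {
    "debug": False,                   # debug mode must be off in production
    "admin_password_default": False,  # default credentials must be changed
    "tls_enabled": True,              # transport encryption must be on
}

def audit(settings: dict) -> list[str]:
    """Return the names of settings that deviate from the baseline."""
    return [name for name, expected in BASELINE.items()
            if settings.get(name) != expected]

findings = audit({"debug": True,
                  "admin_password_default": False,
                  "tls_enabled": True})
# findings == ["debug"]
```

Running such a check on every update helps catch settings that silently revert to insecure defaults.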

  20. Red Team • Organized group of people attempting to penetrate the security safeguards of the system • Assess the security of the system → future improvement • Requested or permitted by the owner to perform the assessment • Wide coverage: computer systems, physical resources, programming languages, operational practices, etc.

  21. Building It Secure • 1960s: US Department of Defense (DoD) recognizes the risk of unsecured information systems • 1981: National Computer Security Center (NCSC) established at the NSA • DoD Trusted Computer System Evaluation Criteria (TCSEC), known as the Orange Book

  22. Orange Book • Orange Book objectives • Guidance on what security features to build into new products • Provide a measurement to evaluate the security of systems • Basis for specifying security requirements • Security features and assurances • Trusted Computing Base (TCB): the security components of the system

  23. Orange Book Levels (highest to lowest security) • A1 Verified Protection • B3 Security Domains • B2 Structured Protection • B1 Labeled Security Protection • C2 Controlled Access Protection • C1 Discretionary Security Protection • D Minimal Protection

  24. Security Awareness and Training • Major weakness: user unawareness • Organizational effort • Educational effort • Customer training • Federal Trade Commission: programs to educate consumers about web scams

  25. SANS: Software Security Institute • A set of six comprehensive examinations • Demonstrate the security knowledge and skills needed to deal with common programming errors • For programmers • Targets: • Implementation issues in individual programming languages • Secure programming principles directly relevant to programmers

  26. SANS: Secure Programming Skills Assessment • Aims to improve secure programming skills and knowledge • Allow employers to rate their programmers • Allow buyers of software and systems to measure the skills of vendors’ developers • Allow programmers to identify gaps in their secure programming knowledge • Allow employers to evaluate job candidates and potential consultants • Provide an incentive for universities to include secure coding in their curricula
