
Lecture 1: Overview



  1. Lecture 1: Overview modified from slides of Lawrie Brown

  2. Outline The focus of this chapter is on three fundamental questions: • What assets do we need to protect? • How are those assets threatened? • What can we do to counter those threats?

  3. Computer Security Overview • The NIST Computer Security Handbook defines the term Computer Security as: “The protection afforded to an automated information system in order to attain the applicable objectives of preserving the integrity, availability and confidentiality of information system resources (includes hardware, software, firmware, information/data, and telecommunications).”

  4. The CIA Triad

  5. Key Security Concepts • Confidentiality: preserving authorized restrictions on information access and disclosure, including means for protecting personal privacy and proprietary information • Integrity: guarding against improper information modification or destruction, including ensuring information nonrepudiation and authenticity • Availability: ensuring timely and reliable access to and use of information • Is this all?

  6. Levels of Impact • Low: the loss could be expected to have a limited adverse effect on organizational operations, organizational assets, or individuals • Moderate: the loss could be expected to have a serious adverse effect on organizational operations, organizational assets, or individuals • High: the loss could be expected to have a severe or catastrophic adverse effect on organizational operations, organizational assets, or individuals

  7. Computer Security Challenges • computer security is not as simple as it might first appear to the novice • potential attacks on the security features must be considered • procedures used to provide particular services are often counterintuitive • physical and logical placement needs to be determined • multiple algorithms or protocols may be involved

  8. Computer Security Challenges • attackers only need to find a single weakness, the developer needs to find all weaknesses • users and system managers tend to not see the benefits of security until a failure occurs • security requires regular and constant monitoring • is often an afterthought to be incorporated into a system after the design is complete • thought of as an impediment to efficient and user-friendly operation

  9. Computer Security Terminology • Adversary (threat agent) • An entity that attacks, or is a threat to, a system. • Attack • An assault on system security that derives from an intelligent threat; a deliberate attempt to evade security services and violate security policy of a system. • Countermeasure • An action, device, procedure, or technique that reduces a threat, a vulnerability, or an attack by eliminating or preventing it, by minimizing the harm it can cause, or by discovering and reporting it so that corrective action can be taken.

  10. Computer Security Terminology • Risk • An expectation of loss expressed as the probability that a particular threat will exploit a particular vulnerability with a particular harmful result. • Security Policy • A set of rules and practices that specify how a system or org provides security services to protect sensitive and critical system resources. • System Resource (Asset) • Data; a service provided by a system; a system capability; an item of system equipment; a facility that houses system operations and equipment.
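The risk definition above, an expectation of loss driven by probability and harm, is often made concrete with annualized-loss arithmetic. A minimal sketch using the standard SLE/ALE terminology; the function name and the dollar figures are illustrative, not from the slides:

```python
# Hypothetical sketch of quantitative risk: expected loss = likelihood x harm.
def annualized_loss_expectancy(asset_value, exposure_factor, annual_rate):
    """SLE (single loss expectancy = asset value x fraction lost per incident)
    scaled by how often the threat is expected to strike per year (ARO)."""
    single_loss_expectancy = asset_value * exposure_factor
    return single_loss_expectancy * annual_rate

# e.g. a $200,000 asset losing 25% of its value per incident, twice a year:
print(annualized_loss_expectancy(200_000, 0.25, 2))  # 100000.0
```

Comparing the ALE of an asset with the annual cost of a countermeasure is one common way to decide whether the countermeasure is worth deploying.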

  11. Computer Security Terminology • Threat • A potential for violation of security, which exists when there is a circumstance, capability, action, or event that could breach security and cause harm. • Vulnerability • Flaw or weakness in a system's design, implementation, or operation and management that could be exploited to violate the system's security policy.

  12. Security Concepts and Relationships

  13. Assets of a Computer System

  14. Vulnerabilities, Threats and Attacks • vulnerabilities • leaky (loss of confidentiality) • corrupted (loss of integrity) • unavailable or very slow (loss of availability) • threats • capable of exploiting vulnerabilities • represent potential security harm • attacks (threats carried out) • passive or active attempt to alter/affect system resources • insider or outsider

  15. Countermeasures • prevent • detect • recover • means used to deal with security attacks • may introduce new vulnerabilities • Residual vulnerabilities may remain • goal is to minimize residual level of risk to the assets

  16. Lecture 2: Overview (cont) modified from slides of Lawrie Brown

  17. Cartoon by Peter Steiner, The New Yorker, July 5, 1993 (“On the Internet, nobody knows you’re a dog”)

  18. Threat Consequences Unauthorized disclosure is a threat to confidentiality • Exposure: sensitive data are released directly to an unauthorized entity; this can be deliberate or the result of a human, hardware, or software error • Interception: an unauthorized entity directly accesses data in transit between authorized parties • Inference: an unauthorized entity indirectly accesses sensitive data, e.g., via traffic analysis or use of limited access to get detailed information • Intrusion: an unauthorized entity circumvents the system’s protections to gain access to sensitive data

  19. Threat Consequences Deception is a threat to either system or data integrity • Masquerade: an attempt by an unauthorized user to gain access to a system by posing as an authorized user • Falsification: altering or replacing valid data, or introducing false data • Repudiation: denial of sending, receiving, or possessing the data

  20. Threat Consequences Usurpation is a threat to system integrity. • Misappropriation: e.g., theft of service, distributed denial of service attack • Misuse: security functions can be disabled or thwarted

  21. Threat Consequences Disruption is a threat to availability or system integrity • Incapacitation: a result of physical destruction of or damage to system hardware • Corruption: system resources or services function in an unintended manner; unauthorized modification • Obstruction: e.g. overload the system or interfere with communications

  22. Scope of Computer Security

  23. Computer and Network Assets

  24. Passive and Active Attacks • Passive attacks attempt to learn or make use of information from the system but do not affect system resources • eavesdropping on / monitoring of transmissions • difficult to detect • emphasis is on prevention rather than detection • two types: • release of message contents • traffic analysis • Active attacks involve modification of the data stream • goal is to detect them and then recover • four categories: • masquerade • replay • modification of messages • denial of service
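Three of the active-attack categories above, masquerade, replay, and modification of messages, can be illustrated by a receiver that checks a message authentication tag and caches nonces. A toy sketch, assuming a pre-shared key; the key, nonce scheme, and function names are hypothetical, not a real protocol:

```python
import hashlib
import hmac

SECRET = b"shared-key"   # assumption: sender and receiver share this key
seen_nonces = set()      # receiver-side cache of nonces already accepted

def make_message(payload: bytes, nonce: str):
    """Sender: tag the nonce+payload with a keyed hash (HMAC-SHA256)."""
    tag = hmac.new(SECRET, nonce.encode() + payload, hashlib.sha256).hexdigest()
    return payload, nonce, tag

def accept(payload: bytes, nonce: str, tag: str) -> bool:
    """Receiver: reject modified/masqueraded messages and replayed nonces."""
    expected = hmac.new(SECRET, nonce.encode() + payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, tag):
        return False     # wrong tag: modification or masquerade
    if nonce in seen_nonces:
        return False     # nonce already used: replay
    seen_nonces.add(nonce)
    return True

msg = make_message(b"transfer $10", "nonce-001")
print(accept(*msg))   # True  - first, genuine delivery
print(accept(*msg))   # False - an attacker replaying the same capture is rejected
```

Note that this detects active attacks after the fact; it does nothing against passive eavesdropping, which, as the slide says, must be prevented (e.g., by encryption) rather than detected.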

  25. Security Functional Requirements • requirements can be met by computer security technical measures, by management controls and procedures, or by an overlap of the two • functional areas: • access control • identification & authentication • system & communication protection • system & information integrity • awareness & training • audit & accountability • certification, accreditation, & security assessments • contingency planning • maintenance • physical & environmental protection • planning • personnel security • risk assessment • systems & services acquisition • configuration management • incident response • media protection

  26. Authentication Service • assuring that a communication is from the source it claims to be from • counters interference by a third party masquerading as one of the two legitimate parties • Peer Entity Authentication • corroboration of the identity of a peer entity • confidence that an entity is not performing • a masquerade or • an unauthorized replay • Data Origin Authentication • corroboration of the source of a data unit • supports applications where there are no prior interactions

  27. Access Control Service • limit and control access to host systems and applications • each entity trying to gain access must first be identified, or authenticated Nonrepudiation Service • prevents either sender or receiver from denying a transmitted message

  28. Data Confidentiality Service • protection of transmitted data from passive attacks • protects user data transmitted over a period of time • variants: • connection confidentiality • connectionless confidentiality • selective-field confidentiality • traffic-flow confidentiality

  29. Data Integrity Service • connection-oriented integrity service • assures that messages are received as sent • no duplication, insertion, modification, reordering, or replays • can apply to a stream of messages, a single message, or selected fields within a message • with and without recovery • connectionless integrity service • provides protection against message modification only
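The modification-detection part of the integrity service above can be sketched with a cryptographic hash. This minimal example assumes the recorded digest itself is kept out of the attacker's reach; if the attacker can also rewrite the digest, a keyed MAC is needed instead:

```python
import hashlib

def digest(data: bytes) -> str:
    # SHA-256 fingerprint of the message contents
    return hashlib.sha256(data).hexdigest()

original = b"pay $100 to Alice"
stored_digest = digest(original)          # recorded when the message is sent

tampered = b"pay $900 to Alice"
print(digest(original) == stored_digest)  # True  - message received as sent
print(digest(tampered) == stored_digest)  # False - modification detected
```

A bare hash like this gives no protection against duplication, reordering, or replay; catching those (the connection-oriented service) additionally requires sequence numbers or nonces.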

  30. Availability Service • a variety of attacks can result in the loss of or reduction in availability • some of these attacks are amenable to authentication and encryption • some attacks require a physical action to prevent or recover from loss of availability • depends on proper management and control of system resources • a service that protects a system to ensure its availability • being accessible and usable upon demand by an authorized system entity

  31. Security Implementation • four complementary courses of action: • prevention • detection • response • recovery

  32. Security Mechanism • Feature designed to • Prevent attackers from violating security policy • Detect attackers’ violations of security policy • Respond to mitigate an attack • Recover: continue to function correctly even if an attack succeeds • No single mechanism will support all services • Authentication, authorization, availability, confidentiality, integrity, non-repudiation

  33. Fundamental Security Design Principles • Economy of mechanism • Fail-safe defaults • Complete mediation • Open design • Separation of privilege • Least privilege • Least common mechanism • Psychological acceptability • Isolation • Encapsulation • Modularity • Layering • Least astonishment
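Two of the principles listed above, fail-safe defaults and least privilege, can be sketched as a default-deny permission check. The policy table, users, and actions are hypothetical, chosen only to make the principle concrete:

```python
# Illustrative fail-safe-defaults check: access is denied unless a permission
# is explicitly granted, and each user holds only the rights they need.
PERMISSIONS = {                # hypothetical policy table
    "alice": {"read"},
    "bob":   {"read", "write"},
}

def is_allowed(user: str, action: str) -> bool:
    # Unknown users (and unlisted actions) fall back to an empty set,
    # so the default outcome of any lookup failure is "deny".
    return action in PERMISSIONS.get(user, set())

print(is_allowed("bob", "write"))     # True  - explicitly granted
print(is_allowed("alice", "write"))   # False - least privilege: read only
print(is_allowed("mallory", "read"))  # False - fail-safe default: deny
```

The design choice worth noticing is that errors and omissions fail closed: forgetting to list a user denies them access, rather than silently granting it.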

  34. Attack Surfaces • Consist of the reachable and exploitable vulnerabilities in a system • Examples: • Open ports on outward-facing Web and other servers, and code listening on those ports • Services available on the inside of a firewall • Code that processes incoming data, email, XML, office documents, and industry-specific custom data exchange formats • Interfaces, SQL, and Web forms • An employee with access to sensitive information who is vulnerable to a social engineering attack
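The first example above, open ports with listening code, is the part of the attack surface that is easiest to inventory mechanically. A toy TCP connect check, for hosts you are authorized to scan; the host address and port list are placeholders:

```python
import socket

def open_ports(host: str, ports) -> list:
    """Return the subset of `ports` on `host` that accept a TCP connection."""
    found = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(0.2)                    # don't hang on filtered ports
            if s.connect_ex((host, port)) == 0:  # 0 means the connect succeeded
                found.append(port)
    return found

# Every port reported here is reachable code that belongs on the
# attack-surface inventory for this host.
print(open_ports("127.0.0.1", [22, 80, 443, 8080]))
```

Real attack-surface analysis goes further (what software listens, with what privileges, parsing what inputs), but enumeration like this is the usual starting point.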

  35. Attack Surface Categories • Network Attack Surface: vulnerabilities over an enterprise network, wide-area network, or the Internet; includes network protocol vulnerabilities, such as those used for a denial-of-service attack, disruption of communications links, and various forms of intruder attacks • Software Attack Surface: vulnerabilities in application, utility, or operating system code; a particular focus is Web server software • Human Attack Surface: vulnerabilities created by personnel or outsiders, such as social engineering, human error, and trusted insiders

  36. Security Technologies Used

  37. Types of Attacks Experienced

  38. Defense in Depth and Attack Surface

  39. Computer Security Strategy • Specification & policy • Implementation & mechanisms • Correctness & assurance • what is the security scheme supposed to do? • how does it do it? • does it really work?

  40. Computer Security Strategy • Security Policy • Formal statement of rules and practices that specify or regulate how a system or organization provides security services to protect sensitive and critical system resources • Security Implementation • Involves four complementary courses of action: • Prevention • Detection • Response • Recovery • Assurance • The degree of confidence one has that the security measures, both technical and operational, work as intended to protect the system and the information it processes • Evaluation • Process of examining a computer product or system with respect to certain criteria

  41. Security Policy • formal statement of rules and practices that specify or regulate security services • factors to consider: • value of the protected assets • vulnerabilities of the system • potential threats and the likelihood of attacks • trade-offs to consider: • ease of use versus security • cost of security versus cost of failure and recovery

  42. Assurance and Evaluation • assurance • the degree of confidence one has that the security measures work as intended • both system design and implementation • evaluation • process of examining a system with respect to certain criteria • involves testing and formal analytic or mathematical techniques

  43. Security Trends www.cert.org (CERT Coordination Center, originally the Computer Emergency Response Team)

  44. Early Hacking – Phreaking • In 1957, a blind seven-year-old, Joe Engressia (later known as Joybubbles), discovered that whistling a particular tone reset phone trunk lines • Blow into the receiver – free phone calls • A giveaway whistle from Cap’n Crunch cereal produced the same 2600 Hz tone

  45. The Seventies • John Draper • a.k.a. Captain Crunch • “If I do what I do, it is only to explore a system” • In 1971, built the “blue box” • with Steve Jobs and Steve Wozniak

  46. The Eighties • Robert Morris worm – 1988 • Developed to measure the size of the Internet • However, a computer could be infected multiple times • Brought down a large fraction of the Internet (~6,000 computers) • Sparked academic interest in network security

  47. The Nineties • Kevin Mitnick • First hacker on the FBI’s Most Wanted list • Hacked into many networks • including the FBI’s • Stole intellectual property • including 20K credit card numbers • In 1995, caught for the second time • served five years in prison

  48. Code-Red Worm • On July 19, 2001, more than 359,000 computers connected to the Internet were infected in less than 14 hours

  49. Sapphire Worm (a.k.a. Slammer) • the fastest-spreading computer worm seen to that point • doubled in size every 8.5 seconds • infected more than 90 percent of vulnerable hosts within 10 minutes
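The 8.5-second doubling figure above implies purely exponential early growth; a back-of-the-envelope check shows why saturation within 10 minutes is unsurprising (the model ignores bandwidth limits and the finite vulnerable population, so it only describes the early phase):

```python
# How many doublings fit in the slide's 10-minute window at 8.5 s each?
doubling_time = 8.5          # seconds, from the slide
window = 10 * 60             # 10 minutes, in seconds
doublings = window / doubling_time
print(round(doublings, 1))   # 70.6

# Unconstrained, 2**70 infections from a single host would dwarf any
# real vulnerable population, so growth saturates long before 10 minutes.
print(2 ** 10)               # after ~85 s, one infection has become 1024
```

In other words, the limiting factor for a worm like this is running out of vulnerable hosts (and network capacity), not time.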
