Security Policy: US Gov / DoD Examples

Presentation Transcript


  1. Security Policy: US Gov / DoD Examples By Lance Spitzner

  2. About the Speaker • 8 years of experience in security consulting and research, focused on gathering information about threats. • Authored Honeypots: Tracking Hackers, co-authored Know Your Enemy, and published numerous security articles. • Served 7 years in the military, 4 as an officer in the RDF.

  3. Why Security Policy? • It’s the first, and one of the most critical, steps to securing your environment. • Based on my experience, policy is unfortunately often the last step that gets done (if at all).

  4. Why US Gov / DoD Policy? • Very difficult to get examples of real policy (often considered confidential). • US Gov / DoD has a structured approach that is public and published. • They take it very seriously (have a lot to lose). • They have the resources to invest in developing it properly and don’t have to prove ROI.

  5. Definition A security policy defines the rules that regulate how your organization manages and protects its information and computing resources to achieve security objectives (CERT).

  6. What is it? • A set of documentation • Documentation is a ‘living document’ that should be constantly updated to stay current. • Basically a plan, outlining what the company's critical assets are, and how they must (and can) be protected

  7. Challenges • Extremely difficult to develop, policy often unique to each organization. • No common format or process for developing one. • Making it simple so everyone can understand and use it. • Getting management consensus. • How do you enforce it?

  8. US Gov / DoD • No single organization has overall responsibility, each organization develops and implements its own. • OMB sets guidelines for US Gov, Pentagon sets guidelines for DoD. • Organizations such as NIST and CERT provide whitepapers and templates to assist.

  9. Office of Management and Budget OMB Circular A-130, Appendix III mandates: Establish a set of rules of behavior concerning use of, security in, and the acceptable level of risk for, the system. The rules shall be based on the needs of the various users of the system. The security required by the rules shall be only as stringent as necessary to provide adequate security for information in the system. Such rules shall clearly delineate responsibilities and expected behavior of all individuals with access to the system. They shall also include appropriate limits on interconnections to other systems and shall define service provision and restoration priorities. Finally, they shall be clear about the consequences of behavior not consistent with the rules.

  10. FISMA • Federal Information Security Management Act. • Provides a framework and minimum controls for the federal government to enhance information security. • Congress passed the legislation in 2002, and OMB has the charter to enforce/oversee its implementation.

  11. DoD Policy • DoD-wide policy is generated by a combination of the Joint Staff/J6 and the Office of the Assistant Secretary of Defense for Networks & Information Integration (ASD/NII). • Services can and do write their own policy for their specific applications and situations (e.g., DoD policy couldn't possibly address all the unique configurations on all the Navy ships). • DISA produces optional configuration policy called Security Technical Implementation Guides (STIGs) and makes sure network connectivity is available.

  12. General Examples • What systems can be added to the network, and the standards and procedures they have to adhere to. • What can and cannot be monitored and under what conditions (i.e., warning banners). • Classification of information and how it is handled. • What is an incident and how to report it. • How the policy is enforced. • Acceptable Use Policy (employees are often the biggest problem).

  13. Basic Approach • Identify what you are trying to protect • Look at whom you are trying to protect it from • Define what the potential risks are to any of your Information Assets • How are you going to enforce it (it has to be enforceable)
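
One way to capture the output of this basic approach is as a simple, reviewable inventory. The sketch below is not from the deck; the record fields and the example entry are illustrative only.

```python
# Hypothetical inventory structure for the "basic approach" above; all names
# and example values are illustrative, not prescribed by the deck.
from dataclasses import dataclass

@dataclass
class AssetRisk:
    asset: str                  # what you are trying to protect
    threat_sources: list[str]   # whom you are protecting it from
    risks: list[str]            # potential risks to this information asset
    enforcement: str            # how the rule will be enforced

inventory = [
    AssetRisk(
        asset="Personnel records database",
        threat_sources=["careless insiders", "external intruders"],
        risks=["disclosure of SSNs", "unauthorized modification"],
        enforcement="quarterly access reviews; audit logs reviewed weekly",
    ),
]

for item in inventory:
    print(f"{item.asset}: protect from {', '.join(item.threat_sources)}")
```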

  14. Training and Awareness • Once approved and disseminated throughout organization: • Train and educate ALL employees • Have everyone sign • Begin enforcement immediately • Accomplish non-personnel related actions • Security personnel must constantly assess viability

  15. OMB Requirements • Ensure that all individuals are appropriately trained in how to fulfill their security responsibilities before allowing them access to the system. • Such training shall assure that employees are versed in the rules of the system, be consistent with guidance issued by NIST and OPM, and apprise them about available assistance and technical security products and techniques. • Behavior consistent with the rules of the system and periodic refresher training shall be required for continued access to the system.

  16. Enforcement • You will not be able to catch all the violators all the time. • You most likely will want to catch enough to deter.

  17. Revision Keep in mind that policies are a living document; they do no good if they do not change as your organization does.

  18. Key Points • Keep policy as simple as possible. • Include staff in policy development. • Security awareness and training. • Make sure all employees have read, understood, and signed policy. • Set clear penalties and enforce them.

  19. Security Modules • Prevent • Detect • Respond • Improve

  20. Prevention • Keeping threats out (both internal and external).

  21. Policy for Prevention • Documented standards for minimized, secure builds. • Process for approval to add anything with IP stack or new application to the network. • Acceptable Use Policy (AUP) • Keeping systems and applications current.

  22. System Access The following notice and consent banner, approved by the DoD General Counsel, may be used on all DoD Web sites with security and access controls. This banner may be tailored by an organization but such modifications shall be approved by the Component’s General Counsel before use. “This is a Department of Defense Computer System. This computer system, including all related equipment, networks, and network devices (specifically including Internet access) are provided only for authorized U.S. Government use. DoD computer systems may be monitored for all lawful purposes, including to ensure that their use is authorized, for management of the system, to facilitate protection against unauthorized access, and to verify security procedures, survivability, and operational security. Monitoring includes active attacks by authorized DoD entities to test or verify the security of this system. During monitoring, information may be examined, recorded, copied and used for authorized purposes. All information, including personal information, placed or sent over this system may be monitored. Use of this DoD computer system, authorized or unauthorized, constitutes consent to monitoring of this system. Unauthorized use may subject you to criminal prosecution. Evidence of unauthorized use collected during monitoring may be used for criminal, administrative, or other adverse action. Use of this system constitutes consent to monitoring for these purposes.”
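
For host login banners, a compliance check can be as simple as confirming that the deployed banner still contains the key phrases of the approved notice. The sketch below is my own illustration, not DoD tooling; the banner path and phrase list are assumptions.

```python
# Sketch of a banner compliance check (an assumption, not DoD tooling): verify
# that a host's login banner file still contains key phrases of the approved notice.
from pathlib import Path

REQUIRED_PHRASES = [
    "Department of Defense Computer System",
    "may be monitored",
    "constitutes consent to monitoring",
]

def banner_compliant(banner_path: str = "/etc/issue") -> bool:
    """Return True if every required phrase appears in the banner file."""
    try:
        text = Path(banner_path).read_text(errors="ignore")
    except OSError:
        return False  # a missing or unreadable banner file is itself a finding
    return all(phrase in text for phrase in REQUIRED_PHRASES)

if __name__ == "__main__":
    print("banner OK" if banner_compliant() else "banner missing or incomplete")
```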

  23. Determining Threat Levels • T1: Inadvertent or accidental events, e.g., tripping over the power cord. • T2: Passive, casual adversary with minimal resources who is willing to take little risk, e.g., listening. • T3: Adversary with minimal resources who is willing to take significant risk, e.g., unsophisticated hackers. • T4: Sophisticated adversary with moderate resources who is willing to take little risk, e.g., organized crime, sophisticated hackers, international corporations. • T5: Sophisticated adversary with moderate resources who is willing to take significant risk, e.g., international terrorists. • T6: Extremely sophisticated adversary with abundant resources who is willing to take little risk, e.g., well-funded national laboratory, nation-state, and international corporation. • T7: Extremely sophisticated adversary with abundant resources who is willing to take extreme risk, e.g., nation-states in time of crisis.
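
Encoding the T1–T7 taxonomy as data keeps threat-level references consistent across risk assessments. This is a sketch of one way to do it, not something the deck prescribes.

```python
# The T1-T7 taxonomy above, encoded as a small enum (illustrative sketch).
from enum import IntEnum

class ThreatLevel(IntEnum):
    T1 = 1  # inadvertent or accidental events
    T2 = 2  # passive, casual adversary; minimal resources, little risk
    T3 = 3  # minimal resources, willing to take significant risk
    T4 = 4  # sophisticated, moderate resources, little risk
    T5 = 5  # sophisticated, moderate resources, significant risk
    T6 = 6  # extremely sophisticated, abundant resources, little risk
    T7 = 7  # extremely sophisticated, abundant resources, extreme risk

# Higher number = more capable adversary willing to accept more risk.
assert ThreatLevel.T5 > ThreatLevel.T3
```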

  24. DoD Password Policy DoD policy stipulates that all passwords shall: • Be at least eight characters in length, 12 – 16 if feasible • Be a combination of upper and lower case letters, numbers and special characters • Be changed every 90 days or upon direction • A history of individual password usage will be maintained for 1 year to preclude the use of old passwords. • Not be composed of any words found in a dictionary • Passwords must not be displayed at any terminal or printer. • The user will employ appropriate actions to prevent disclosure while logging on to the system.
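
The composition rules above (length, character mix, no dictionary words) are straightforward to check programmatically; the 90-day change and one-year history rules are process controls handled elsewhere. A minimal sketch, with a placeholder word list standing in for a real dictionary:

```python
# Minimal validator sketch for the composition rules above; DICTIONARY is a
# stand-in for a real word list, not part of the policy.
import re

DICTIONARY = {"password", "letmein", "welcome"}  # placeholder word list

def meets_dod_composition(password: str) -> bool:
    if len(password) < 8:                          # 12-16 preferred where feasible
        return False
    if not re.search(r"[a-z]", password):          # lower case letter
        return False
    if not re.search(r"[A-Z]", password):          # upper case letter
        return False
    if not re.search(r"\d", password):             # number
        return False
    if not re.search(r"[^A-Za-z0-9]", password):   # special character
        return False
    return password.lower() not in DICTIONARY      # not a dictionary word

print(meets_dod_composition("S3cure!Pass"))  # True
print(meets_dod_composition("password"))     # False
```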

  25. DoD Removable Media Policy • All media introduced to DoD systems shall be virus scanned prior to executing any application/file • As much as practicable, drives and CD burners shall be disabled • When files are transferred to removable media, they shall be properly annotated (using pre-printed identification or hand-written) with the appropriate classification markings

  26. Mobile Code Policy • Mobile Code in E-mail. Due to the significant risk of malicious mobile code (viruses and worms) downloaded into user workstations via E-mail, the DOD policy for mobile code in E-mail is more restrictive. To the extent possible, the automatic execution of all categories of mobile code in E-mail bodies and attachments will be disabled. Whenever possible, desktop software will be configured to prompt the user prior to opening E-mail attachments that may contain mobile code. • Disable automatic execution of all mobile code (e.g., ActiveX, Java, JavaScript, and VBScript) in E-mail bodies and attachments (when possible). • Disable automatic execution of HTML in E-mail bodies and attachments (when possible). • Enable user prompts prior to opening an E-mail attachment that may contain mobile code. • Prevent E-mail products from automatically forwarding mobile code in attachments to WSH for execution. • Some E-mail products cannot be configured to implement all of these countermeasures. System administrators will implement as many of these countermeasures as possible.
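
The policy itself is enforced through mail-client and gateway configuration, but the intent can be illustrated with a simple filter that flags attachment types commonly carrying mobile code so they are quarantined rather than auto-executed. The extension list below is an assumption, not an official one.

```python
# Illustrative filter only (the real controls live in mail client/gateway
# settings): flag attachment names whose extensions commonly carry mobile code.
MOBILE_CODE_EXTENSIONS = {".js", ".vbs", ".wsf", ".hta", ".jse", ".vbe"}

def flag_mobile_code(attachment_names: list[str]) -> list[str]:
    """Return the attachment names that should be quarantined for review."""
    flagged = []
    for name in attachment_names:
        lowered = name.lower()
        if any(lowered.endswith(ext) for ext in MOBILE_CODE_EXTENSIONS):
            flagged.append(name)
    return flagged

print(flag_mobile_code(["report.pdf", "update.vbs", "notes.txt"]))  # ['update.vbs']
```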

  27. DoD System Backup Policy • System back-ups shall be accomplished at least once per week on all DoD systems • As threat levels rise, the frequency of back-ups shall be more often based on the mission criticality of the system • An integrity check shall be conducted on each back-up
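
The policy does not mandate a particular integrity-check mechanism; one common approach is to record a cryptographic digest when the backup is written and recompute it before relying on the backup. A minimal sketch:

```python
# Sketch of an integrity check for the backup policy above: record a SHA-256
# digest at write time, then recompute and compare before restoring.
import hashlib
from pathlib import Path

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_backup(backup_path: str, recorded_digest_path: str) -> bool:
    """Compare the backup's current digest with the digest recorded at write time."""
    recorded = Path(recorded_digest_path).read_text().strip()
    return sha256_of(backup_path) == recorded
```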

  28. DoD E-Mail/Instant Messaging • Authorized, unclassified government business will use USG E-mail accounts. • Contractors and foreign nationals will be identified in their DoD user E-mail address (john.smith.ctr@army.mil or john.smith.uk@army.mil) and electronic signatures (e.g., John Smith, Contractor, J-6K, Joint Staff). The abbreviation used for a contractor in an E-mail address should be "ctr"; additional guidance applies to foreign nationals and the use of country codes. • Unapproved accounts, such as AOL (including Instant Messenger), HOTMAIL, ICQ, MSN Messenger or YAHOO, will not be used for official business unless specifically authorized to do so by the DAA. ISP or web-based E-mail/IM systems will be approved only when mission-essential and USG-owned E-mail systems are not available. • All mail connections to and from mail servers used for anonymous mail redirection are to be blocked. Mail should be traceable to an individual and to known servers.
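
The addressing convention in the examples (first.last.ctr for contractors, first.last followed by a country code for foreign nationals) can be spot-checked with a pattern match. The regular expression below is a rough illustration of that convention, not official guidance.

```python
# Rough illustration of the addressing convention above; the pattern is an
# assumption for demonstration, not an official specification.
import re

ADDRESS_PATTERN = re.compile(
    r"^[a-z]+\.[a-z]+(\.(ctr|[a-z]{2}))?@[a-z0-9.-]+\.mil$", re.IGNORECASE
)

for addr in ["john.smith.ctr@army.mil", "john.smith.uk@army.mil", "jsmith@aol.com"]:
    print(addr, "->", bool(ADDRESS_PATTERN.match(addr)))
# The AOL address fails, consistent with the ban on unapproved accounts for official business.
```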

  29. DoD Web Server Security DoD Web servers that are externally accessed shall be isolated from the internal network of the sponsoring organization. The isolation may be physical, or it may be implemented by technical means such as an approved firewall. The server software will be FIPS 140-2 compliant with all security patches properly installed. Approved DoD security protocols will be used for all Web servers. Additional security measures shall also be employed consistent with the risk management approach and security policy of the individual DoD Web site. Examples of additional measures to be considered include: • Disable IP forwarding, avoid dual-homed servers • Employ least privilege • Limit functionality of Web server implementation • Employ tools to check configuration of host • Enable and regularly examine event logs
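
Host configuration checks such as "disable IP forwarding" can be automated. The sketch below covers only that one bullet, assumes a Linux web server, and is not a DoD-provided tool; the other measures (least privilege, log review) need their own checks.

```python
# Sketch of a single host-configuration check (Linux-specific /proc path):
# confirm that IP forwarding is disabled on the web server.
from pathlib import Path

def ip_forwarding_disabled(proc_path: str = "/proc/sys/net/ipv4/ip_forward") -> bool:
    try:
        return Path(proc_path).read_text().strip() == "0"
    except OSError:
        return False  # treat an unreadable setting as a finding to investigate

if __name__ == "__main__":
    print("IP forwarding disabled:", ip_forwarding_disabled())
```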

  30. Managing Risk Levels To manage the varying risk levels of vulnerabilities, DISA created three different types of vulnerability alerts: • Information Assurance Vulnerability Alert • Severe vulnerability • Immediate threat to DoD Information Infrastructure • Compliance action required • Information Assurance Vulnerability Bulletin • Medium risk vulnerability • Potential threat escalation • Acknowledgement required • Technical Advisory • Low risk vulnerability • Potential escalation unlikely
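
Capturing the three alert categories as data lets handling rules (compliance versus acknowledgement) be looked up programmatically. This is a sketch; the wording paraphrases the slide, and the Technical Advisory entry reflects that the slide states no mandatory action.

```python
# The three DISA alert categories above, captured as lookup data (sketch only;
# descriptions paraphrase the slide).
from dataclasses import dataclass

@dataclass(frozen=True)
class AlertCategory:
    name: str
    severity: str
    required_action: str

ALERT_CATEGORIES = {
    "IAVA": AlertCategory("Information Assurance Vulnerability Alert",
                          "severe", "compliance action required"),
    "IAVB": AlertCategory("Information Assurance Vulnerability Bulletin",
                          "medium", "acknowledgement required"),
    "TA":   AlertCategory("Technical Advisory",
                          "low", "none stated (escalation unlikely)"),
}

print(ALERT_CATEGORIES["IAVA"].required_action)  # compliance action required
```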

  31. IAVA Process • The Information Assurance Vulnerability Alert process is an advisory system that features “positive control", to include the following features: • Each relevant network has a responsible security/systems administrator identified by name • Each advisory notice is sent to the named officials • Each notice (deemed of sufficient importance) contains a mandatory confirmation response acknowledging receipt of the notice • Each notice requiring corrective action has a stipulated implementation date • DISA will develop an automated method to remotely “check” whether the correction is in place • The IAVA process implementation consists of the following components: • Identifies security-related flaws in operating systems and applications • Evaluates vulnerabilities based on the severity of impact they pose to the DII • Disseminates vulnerability notices to CINCs, Services and Agencies, promoting command involvement • Provides an automated means for CINCs, Services and Agencies to report receipt of the notices and compliance with recommended fix actions • Compliance statistics are tracked for reporting to senior DoD leadership
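
The "positive control" features (a named responsible administrator, confirmed receipt, a stipulated fix date, compliance tracked for leadership reporting) map naturally onto a per-notice record. The field names below are my own, not DISA's schema.

```python
# Illustrative record-keeping for the positive-control features above; field
# names are hypothetical, not DISA's schema.
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class IavaNoticeStatus:
    notice_id: str
    network: str
    responsible_admin: str                 # named security/systems administrator
    receipt_acknowledged: bool = False     # mandatory confirmation response
    required_by: Optional[date] = None     # stipulated implementation date
    compliant: bool = False

def compliance_rate(records: list[IavaNoticeStatus]) -> float:
    """Fraction of notices reported compliant, for senior-leadership roll-up."""
    return sum(r.compliant for r in records) / len(records) if records else 1.0
```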

  32. Detection • Sooner or later, there will be a failure in prevention. • That failure must be detected as soon as possible.

  33. Policy issues • What information is collected and monitored? • How is that information collected? • Where is that information centralized? • Who has access to that information? • How is that information reviewed, and when? • What determines an incident? • What priorities are involved?

  34. Contractors • Many US military and government organizations contract out detection to commercial organizations.

  35. Air Force Policy The contractor will provide 24 x 7 intrusion detection monitoring using intrusion detection tools and system audit logs for the system servers, software, database, networks, and firewalls under its control. Daily intrusion detection reports will be submitted to the ISSM for assessment and possible corrective action. In turn, the contractor will take immediate corrective action requested by the ISSM to eliminate system vulnerability or to prevent future intrusion attempts.

  36. US Army Network Monitoring Policy • Only LE/CI personnel are authorized to intercept the content of an individual's communication, after obtaining appropriate legal authority. • Browsing or reading a user's e-mail is prohibited. The SA/NA may intercept, retrieve, or otherwise recover an e-mail message only upon the incident specific consent of the parties involved or as part of a properly authorized LE/CI investigation or as a necessary part of a non-investigatory management search. Neither a blanket consent nor the warning banner provides this consent. • The SA/NA may remove any e-mail message or file that is interfering with the operation of an IS without consent of the originator or recipient. The SA/NA will notify the originator and recipient of such actions.

  37. Response Once detected, the failure must be mitigated as soon as possible.

  38. Policy for Response • Confirming what is an incident, and its level of priority. • What do you want to achieve? • Backups

  39. US Navy Incident Handling • Requirement to collaborate and cooperate with other appropriate organizations in the sharing of incident, vulnerability, threat and countermeasures information concerning those systems. • All commands, units, and activities in the Navy and Marine Corps will report any computer intrusion incident, or suspicion of one, to the FLTINFOWARCEN. This reporting is in addition to the requirements levied upon specific Navy and Marine Corps commands by the Defense Intelligence Agency and National Security Agency/Central Security Service. • Reports by units experiencing computer network incidents can be transmitted via any available communications systems.

  40. US Army Incident and Intrusion Reporting A serious incident report (SIR) will be generated and reported under the following conditions: • The incident poses grave danger to the Army's ability to conduct established information operations. • Adverse effects on the Army's image such as Web page defacements. • Access or compromise of classified or sensitive information (for example, soldier identification information (SSN), medical condition or status, patient-client or attorney-client privilege). • Compromise originating from a foreign source. • Compromise of systems that may risk safety, life, limb, or has the potential for catastrophic effects, or contain information for which the Army is attributable.
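
The SIR conditions above amount to a simple triage rule: if any condition holds, a serious incident report is due. The sketch below illustrates that rule; the field names are mine.

```python
# Sketch of turning the SIR conditions above into a triage check; field names
# are hypothetical, not Army terminology.
from dataclasses import dataclass

@dataclass
class IncidentFacts:
    endangers_info_operations: bool = False
    damages_army_image: bool = False          # e.g., Web page defacement
    classified_or_sensitive_accessed: bool = False
    foreign_source: bool = False
    safety_or_catastrophic_risk: bool = False

def requires_sir(facts: IncidentFacts) -> bool:
    """A serious incident report is due if any condition holds."""
    return any(vars(facts).values())

print(requires_sir(IncidentFacts(damages_army_image=True)))  # True
```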

  41. Department of Energy Incident Reporting • To assist the Computer Incident Advisory Capability (CIAC) in analyzing the DOE corporate threat and providing DOE with guidance, the following information is needed: • How? • How was access gained? What vulnerability was exploited? • How was the incident detected? • Who? • Determine responsible party’s identification, usually IP address(es) or host name(s). • Does the compromise involve a country on the DOE Sensitive Country List?

  42. Department of Energy Incident Reporting • What? • What type of information was the compromised system processing (classified or unclassified such as OUO, UCNI, NNPI, Export Controlled)? • What service did the system provide (DNS, key asset servers, firewall, VPN gateways, IDS, etc.)? • What level of access did the intruder gain? • What hacking tools/techniques were used? • What did the intruder delete, modify, or steal? • What unauthorized data collection programs, such as sniffers, were installed? • What was the impact of the attack? • What preventative or mitigative measures have been (are being) implemented? • When? • When was the cyber security incident detected? • When did the cyber security incident actually occur?
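
Collecting the DOE "how / who / what / when" questions into a single record helps ensure nothing is omitted when reporting to CIAC. This is a sketch; the field names are mine, not a CIAC-defined format.

```python
# The DOE how/who/what/when questions above as one record (field names are
# hypothetical, not a CIAC format).
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

@dataclass
class CiacIncidentReport:
    # How?
    access_method: str = ""             # vulnerability exploited / how access was gained
    detection_method: str = ""          # how the incident was detected
    # Who?
    source_addresses: list[str] = field(default_factory=list)   # IPs or host names
    sensitive_country_involved: bool = False
    # What?
    data_classification: str = ""       # e.g., unclassified, OUO, UCNI
    system_role: str = ""               # e.g., DNS, firewall, VPN gateway, IDS
    access_level_gained: str = ""
    tools_used: list[str] = field(default_factory=list)
    impact: str = ""
    mitigations: list[str] = field(default_factory=list)
    # When?
    detected_at: Optional[datetime] = None
    occurred_at: Optional[datetime] = None
```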

  43. Improve • The last and final phase of the security cycle. • Your business, and its threats, are always changing. • As such, you have to be constantly adapting and improving your security.

  44. Policy for Improve • Learning from incidents. • Assessments • Security training and awareness • Staying updated • Courses • Books / Publications • Mailing lists

  45. NIST Assessment Doc • NIST Special Publication 800-26 • Security Self-Assessment Guide for Information Technology Systems • Self-assessments provide a method for agency officials to determine the current status of their information security programs and, where necessary, establish a target for improvement.
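
SP 800-26 scores each questionnaire item on a five-level maturity scale (roughly: policy documented, procedures documented, implemented, tested, fully integrated). The paraphrased level descriptions and the scoring function below are my own sketch and should be checked against the guide.

```python
# Sketch of scoring a self-assessment on a five-level maturity scale; level
# descriptions paraphrase SP 800-26 and should be verified against the guide.
LEVELS = {
    1: "control objective documented in policy",
    2: "procedures documented",
    3: "procedures and controls implemented",
    4: "procedures and controls tested and reviewed",
    5: "procedures and controls fully integrated",
}

def program_score(question_levels: list[int]) -> float:
    """Average maturity level across answered questions; used to set an improvement target."""
    return sum(question_levels) / len(question_levels) if question_levels else 0.0

print(program_score([3, 2, 4, 1]))  # 2.5 -- e.g., set a target of level 3+ on every question
```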

  46. US DoD/Gov Policy URLs http://www.radium.ncsc.mil/tpep/library/rainbow/ http://docs.usapa.belvoir.army.mil/jw2/xmldemo/r25_2/main.asp http://csrc.nist.gov/publications/fips/fips140-2/fips1402.pdf http://sc.afit.edu/support/policy_security/policy/app_mail.html http://www.cpms.osd.mil/vmo/documents/SOW/Files/A16%20Intrusion%20Detection.wdf.doc http://www.dodig.osd.mil/audit/reports/fy01/01184sum.htm http://www.hpcmo.hpc.mil/Htdocs/DREN/dren-up.html http://nileweb.gsfc.nasa.gov/security/access-policy1.html http://www.usna.edu/InfoTech/Policies/Policies_AcceptableUse.htm
