Enterprise Continuous Monitoring Program
Contents
  • Continuous Monitoring Refresher
  • Continuous Monitoring Process Approach for FY08
  • Task Overview and Activity Breakout
  • Implementation Process and Next Steps
  • Q&A
Continuous Monitoring Refresher
  • What is Continuous Monitoring (CM)?
    • The Federal Information Security Management Act of 2002 (FISMA) requires periodic testing (at least annually) of selected security controls for all federal certified and accredited (C&A) systems to evaluate their effectiveness
    • System documentation is updated to reflect changes and modifications to the system
    • While general guidance on continuous monitoring is provided by National Institute of Standards and Technology (NIST) SP 800-37 and SP 800-53A, the Agency follows this guidance to create its own process framework
  • The establishment of a robust Continuous Monitoring Framework is an integral piece of the information security program
  • A solid continuous monitoring approach will keep system stakeholders apprised of their security status and help integrate security into everyday roles and responsibilities
  • The FY08 framework is based on the need for an enterprise-wide continuous monitoring approach, on guidance for the selection of controls, and on discussions with auditors
Continuous Monitoring Refresher (continued)
  • Security controls are to be tested on an annual basis
    • Continuous Monitoring would occur between C&A/Security Test & Evaluation (ST&E) cycles for systems
    • A minimum number of security controls will be tested each year to monitor the state of security for all systems while also satisfying FISMA requirements
    • Testing throughout the year fosters a more active Plans of Action & Milestones (POA&M) update and reconciliation process, strengthening the accuracy and accountability of each system’s POA&M and high-volatility controls
  • Eventually, continuous monitoring will be integrated into the quarterly POA&M update process, allowing the system stakeholders to use these plans to guide future security certification and accreditation activities
Continuous Monitoring Approach for FY08
  • Starting the process earlier in FY08
  • May use a three-phase approach for system testing (similar to C&As)
  • Takes lessons learned from both audit reports and stakeholder feedback
    • Conduct internal meetings, surveys, or questionnaires with System Points of Contact (POCs) and stakeholders to help identify system changes to POCs or security controls (Security Control Assessment Guide provides a list of questions to assist with determining significant changes to the system)
    • Conduct training with all Security Program Management Offices (SPMOs) on the enterprise continuous monitoring approach
    • Provide each SPMO with a training deck to assist the System POC and testers with security control testing
    • Conduct pre-testing kick-off meeting to outline how testing will be conducted
    • Coordinate C&A documentation updates with CM activities
    • Review each system to determine application-specific control set for testing
Task 1: Initiation and Planning Overview
  • The Initiation and Planning task includes activities that will assist in the overall continuous monitoring process.
  • Systems undergoing CM during the FISMA year are identified and scheduled for testing.
  • C&A documentation from the previous cycle stored in Trusted Agent FISMA (TAF) is downloaded and reviewed to ensure that the appropriate controls are identified, selected, and fed into Task 2.
  • Distribute Security Control Selection Guide to assist business owners in identifying any changes necessary to select controls.
  • A preliminary update is made to the System Security Plans (SSPs) (via Appendix YY) based on the answers to the questions in the Security Control Selection Guide.
  • Testing of controls will not be the responsibility of any one individual or organization. Depending on the controls selected, the test team may involve Information Technology Business Unit representatives to conduct tests with a technical or operational focus.
  • Stakeholder training will be provided to help set expectations and educate the stakeholders on roles/responsibilities, tasks, activities and schedules
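The scheduling step above can be sketched as follows. This is an illustrative sketch only: the even-spread policy and the function name are assumptions, and the default Feb-Apr 2008 window simply mirrors the FY08 dates on the Implementation slide later in this deck.

```python
# Illustrative sketch of the Task 1 scheduling step: systems undergoing CM
# during the FISMA year are identified and assigned a test date inside the
# testing window. The round-robin spread is an assumption, not the
# agency's actual scheduling rule.
from datetime import date, timedelta

def schedule_systems(systems, start=date(2008, 2, 1), end=date(2008, 4, 30)):
    """Spread systems evenly across the testing window (naive round-robin)."""
    span = (end - start).days
    step = max(len(systems) - 1, 1)
    return {name: start + timedelta(days=(i * span) // step)
            for i, name in enumerate(systems)}

schedule = schedule_systems(["System A", "System B", "System C"])
```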
Task 2: Control Selection Overview
  • Test Control Selection is conducted by the business owner based on the Assessment Control Selection Guide
  • The list of the mandatory and high volatility NIST SP 800-53 controls to test for each system during the annual cycle is pre-determined and should be used as a starting point for control selection
  • Other controls to be selected include closed POA&M controls
  • Collaboration between the stakeholders will help determine whether any additional security controls should be tested throughout the year
  • Selected controls should reflect the agency’s priorities and the importance of the information system to the agency. For example, certain security controls may be considered more critical than other controls because of the potential impact on the information system if those controls were subverted or found to be ineffective.
  • Once selected, a control selection agreement is formalized via the Control Selection Memo process
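The selection logic above can be sketched as a simple set union: start from the pre-determined mandatory and high-volatility lists, add closed POA&M controls, then any stakeholder-agreed extras. The control IDs below are illustrative examples only, not the actual pre-determined lists.

```python
# Hedged sketch of the control-selection step: mandatory and high-volatility
# NIST SP 800-53 lists are the pre-determined starting point; closed POA&M
# controls and stakeholder additions are layered on top.

MANDATORY_CONTROLS = {"AC-2", "CM-2", "CP-9", "RA-5"}    # assumed examples
HIGH_VOLATILITY_CONTROLS = {"AC-2", "CM-6", "SI-2"}      # assumed examples

def select_controls(closed_poam_controls, additional_controls=()):
    """Build the starting control set for one system's annual test cycle."""
    selected = set(MANDATORY_CONTROLS) | HIGH_VOLATILITY_CONTROLS
    selected |= set(closed_poam_controls)    # verify recently closed POA&M items
    selected |= set(additional_controls)     # stakeholder-agreed additions
    return sorted(selected)

selection = select_controls({"AU-6"}, additional_controls={"PE-3"})
```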
Task 2: Control Selection Overview (continued)
  • The Assessment Control Selection Guide was created to assist the SPMO/business owner in selecting the security controls to be tested as part of continuous monitoring/annual security control testing.
  • The flow of the guide is as follows:
    • The Security Control Selection Guide Overview - will provide the document’s background and purpose
    • The Security Control Selection Process
      • Types of Controls – discusses the types of controls from which a system owner will select the controls to be evaluated during continuous monitoring.
      • Types of Security Control Testing – covers the two types of security control testing based on whether a system has performed a Certification and Accreditation (C&A) in the current FISMA cycle.
      • Selection of Additional Volatile Controls – details three approaches for further selecting an additional subset of controls
    • A quick reference guide summary
Task 3: Pre-Test Preparation Overview
  • Update testing workbooks with selected controls
  • Update any security controls with additional technical test cases from the previous C&A effort, referring to the ST&E plan for these test cases
  • Testing workbooks are MS Excel spreadsheets containing controls selection matrix, security assessment report form and test cases for each control
  • A testing schedule is created and finalized during the “pre-test” meeting
  • The “pre-test” meeting will be held for each system
    • Invitations will be sent out
    • Outlines specific testing guidelines
    • Answers questions about the evidence required
    • Obtains participation commitments
Task 3: Pre-Test Preparation Overview (continued)
  • Workbook Tab – Control Selection Matrix
Task 3: Pre-Test Preparation Overview (continued)
  • Workbook Tab – Control test cases and Sample
Task 3: Pre-Test Preparation Overview (continued)
  • Workbook Tab – Security Assessment Reporting Form
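The three workbook tabs above can be modeled as a simple data structure. This is a minimal sketch using plain Python dictionaries rather than the actual MS Excel format; all field names are assumptions.

```python
# Sketch of the three-tab testing workbook: Control Selection Matrix,
# per-control test cases, and the Security Assessment Reporting Form.

def new_testing_workbook(system_name, selected_controls):
    """Create an empty workbook structure for one system's CM test cycle."""
    return {
        "system": system_name,
        "control_selection_matrix": [
            {"control": c, "selected": True, "rationale": ""}
            for c in selected_controls
        ],
        # test cases per control, populated from the previous ST&E plan
        "test_cases": {c: [] for c in selected_controls},
        # one reporting-form row per control, filled in during Task 4
        "assessment_report": {c: {"result": None, "evidence": []}
                              for c in selected_controls},
    }

workbook = new_testing_workbook("Example System", ["AC-2", "CM-6"])
```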
Task 4: Perform Test Overview
  • Controls will be tested using NIST SP 800-53A test procedures and documented in the testing workbooks for each system
  • Note how the NIST guidance aligns with our process
    • Input = Phase 2 (Control Selection)
    • Processing = Phase 3 (Pre-Test Prep)
    • Output = Phase 4 (Perform Test)
  • Assessment methods are used to assess objects (in parentheses):
    • Examine (documents, including gathered evidence as necessary)
    • Interview (personnel)
    • Test (activities or HW/SW)
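The method/object pairing above can be captured in a small lookup. The procedure record and the check itself are illustrative assumptions, not part of the agency process.

```python
# Each NIST SP 800-53A assessment method paired with the object it assesses,
# per the slide above.
VALID_METHODS = {
    "Examine": "documents (including gathered evidence as necessary)",
    "Interview": "personnel",
    "Test": "activities or HW/SW",
}

def check_procedure(procedure):
    """Reject a test procedure that names an unknown assessment method."""
    method = procedure["method"]
    if method not in VALID_METHODS:
        raise ValueError(f"unknown assessment method: {method}")
    return method, VALID_METHODS[method]

result = check_procedure({"id": "AC-2.1", "method": "Interview"})
```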
Task 4: Perform Test Overview (continued)
  • CM tests can be conducted via teleconference
    • Invitations should be sent out
  • Documentation will be updated with test results
    • Workbooks/Reports
    • SSPs (via Appendix YY)
  • Appendix YY will be used to document changes from the test results
    • Contains control status updates
  • SSP Appendix YY will be validated with the system stakeholders and any last documentation updates will be made
Task 5: Analyze Results Overview
  • Upon completion of the testing workbooks, the System POC delivers the test results to the SPMO
  • SPMO will perform analysis of the documented results, providing scoring recommendations for the evaluated security controls
  • The scoring methodology ensures the test procedures are assessed appropriately and the required evidence is provided for audit purposes
Task 5: Analyze Results Overview (continued)
  • Scoring criteria are based on a “Satisfied” or an “Other than satisfied” determination for each determination statement under each control test procedure:
    • If the results are determined to be sufficient and the test procedure is determined to be satisfied, the determination statement and procedure will be marked as “S/Satisfied”
    • If the test procedure is not fully addressed or the results determine the system does not comply, the determination statement and procedure will be marked as “O/Other than satisfied”
    • If the determination statement and procedure is not applicable to the information system, the determination statement and procedure will be marked as “N/A.”
  • Based on the determination statement results, the test teams should mark the control as “In Place,” “Partially in Place,” “Planned,” “Risk-Based Decision (RBD),” or “Not Applicable”
    • Scoring criteria consistent with NIST guidance have been applied to ensure that each test procedure is assessed to determine whether the result sufficiently addresses the focus of the test procedure and that the required evidence is provided
      • Should all of the determination statements and test procedures for a control be marked as “S/Satisfied”, then the control will be scored as “In Place”.
      • Should one or more of the determination statements and test procedures for a control be marked as “O/Other than satisfied”, then the control will be scored as “Partially in Place”.
      • Should most or all of the determination statements and test procedures for a control be marked as “O/Other than satisfied” or “N/A.”, then the control will be scored as “Planned, RBD, or N/A”.
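The roll-up rules above can be sketched as a scoring function. The slide does not quantify “most or all,” so the majority cutoff below is an assumption; the mark and score labels follow the slides.

```python
# Sketch of the scoring roll-up: each control's determination statements are
# marked "S" (Satisfied), "O" (Other than satisfied), or "N/A".

def score_control(determinations):
    """Roll determination-statement marks up to a control-level score."""
    marks = [d.upper() for d in determinations]
    applicable = [m for m in marks if m != "N/A"]
    if not applicable:
        return "N/A"                       # nothing applies to this system
    if all(m == "S" for m in applicable):
        return "In Place"                  # all statements satisfied
    other = sum(1 for m in applicable if m == "O")
    if other > len(applicable) / 2:        # assumed reading of "most or all"
        return "Planned/RBD"
    return "Partially in Place"            # some, but not most, fell short
```

A usage check: `score_control(["S", "S", "O"])` yields “Partially in Place,” while a control where every applicable statement is satisfied scores “In Place.”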
Task 6: Confirm Results Overview
  • Following the completion of results analysis, SPMO representatives will deliver the “CM Package” to the system POCs
  • The CM package contains the analysis of each system’s respective test results, including an executive-style report describing the scoring recommendations and identifying all weaknesses for each system
    • CM Package also contains the Signed Control Selection Memo
    • The system owner will review and concur with the results
    • This will allow the system stakeholders and the SPMO the opportunity to discuss scoring recommendations made by the SPMO, and give the SPMO representatives the opportunity to explain scoring rationale and justifications
  • Upon agreement between the system stakeholders and SPMO representatives, the confirmed results will be used to update each system’s respective POA&M
    • If not in agreement, SPMO representatives will work with system stakeholders to ensure results are agreed upon before the system POC updates their system POA&M (if applicable)
  • Finally, the DAA is briefed on the results by the system POC, and the SPMO uploads the final CM Package to TAF
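The hand-off from confirmed results to the POA&M can be sketched as a filter. The score labels follow the Task 5 slides, but the rule itself (everything other than “In Place” or “N/A” becomes a POA&M weakness) is an assumption about how the confirmed results feed the POA&M.

```python
# Hedged sketch of the Task 6 hand-off: once stakeholders concur, controls
# scored below "In Place" generate (or update) POA&M weakness entries.

def poam_updates(confirmed_scores):
    """Return the control IDs that should appear as POA&M weaknesses."""
    return sorted(control for control, score in confirmed_scores.items()
                  if score not in ("In Place", "N/A"))

weaknesses = poam_updates({"AC-2": "In Place",
                           "CM-6": "Partially in Place",
                           "SI-2": "Planned/RBD"})
```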
Implementation Process and Next Steps
  • SPMOs Train Test Teams and start control selection process: Feb 2008
  • Hold Pretest Status Meetings: Feb 2008
    • Deliver Control Selection memos and test workbooks to begin FY08 testing
    • Objective – ensure test teams are prepared to start testing
      • Complete Control Selection Matrix
      • Review Control Selection Memo and request DAA signature
      • Provide Workbooks with expected test completion dates
      • Start testing
  • Provide test support during all test phases: Feb 2008 – Apr 2008
  • Hold Close-out Status Meeting: May 2008
    • Objective is to close out the previous test phase and begin the next test phase
      • Discuss/resolve issues associated with test results, process, etc.
      • Review executive summary and request signature
      • Update SSP (if necessary)
  • Start new cycle with FISMA 09 Status Meeting: June 2008