
Motorola GSD Self-Audits





Presentation Transcript


  1. Motorola GSD Self-Audits • How does it work / what do we do? • 1.) What is an audit? • 2.) What is the procedure? • 3.) What are the “judgement parameters”? • 4.) How are metrics (ColorChart) evaluated? • 5.) What are the deliverables? • 6.) How can you “prepare your site / people”?

  2. What is an Audit… • A "snapshot in time”, taken by qualified people in a field of endeavour, that assesses the competencies of people, policies & processes for compliance with a set group of guidelines (a model), as executed over time (in the past). Philosophically, that snapshot should predict the competencies of the people, policies & processes into the future.

  3. Audit Schedule - 1 week/site • Day 1: team arrives, intros, logistics; audit team reviews documents (nights?) • Days 2-3: interview admin staff / users; audit team reviews documents • Day 4: a.m.: first draft report to admin mgr & MD; p.m.: review action item list • Day 5: help / update / support admin staff.

  4. Document Review: • Read policy documents to determine applicability & relevance; under CM, dual author, etc. • Read process document for similar content • Review log-files (raw and/or filtered) for appropriate content. Log-file procedures for applicability. Log-file filtering scripts for completeness & relevance. • Select documents & subject areas at random.
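The log-file filtering scripts mentioned above can be pictured with a minimal sketch. The alert patterns here are illustrative assumptions for the example, not the site's actual filters; an auditor would review the real script's pattern set for completeness & relevance.

```python
import re

# Hypothetical alert patterns; a production filter would use
# site-specific rules reviewed as part of the audit.
ALERT_PATTERNS = [
    re.compile(r"authentication failure"),
    re.compile(r"connection refused"),
]

def filter_log(lines):
    """Keep only the log lines that match at least one alert pattern."""
    return [line for line in lines if any(p.search(line) for p in ALERT_PATTERNS)]

raw = [
    "Jan 12 03:14:07 host sshd[412]: authentication failure for root",
    "Jan 12 03:14:09 host sshd[412]: session opened for user alice",
]
print(filter_log(raw))
# → ['Jan 12 03:14:07 host sshd[412]: authentication failure for root']
```

The audit checks both that such a filter exists and that a human regularly reviews its output (see slide 9).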

  5. Interview people to evaluate: • Understanding of: • Policy: - what is it, why is it there, what are the consequences of not following the policy • Process: - same • Log-file processes - why & how & consequence • Tailor interview to how the interviewee: • Interacts with the issue (user / manager / admin) • Background, knowledge, culture, impact of issue

  6. Success vs Opportunity: • Judgement calls on Success vs Opportunity and [red / yellow / green] need guidelines: • Does Policy beget Process which is: • Universally understood by all individuals • Well documented in print / electronic form with CM • Being followed, rigorously, over time, with documented evidence thereof • IF all these are true, then Yellow or Success (if true for one year, then Green)

  7. Success vs Opportunity (2 of 2) • From the Audit Color Chart Legend (pg4) • Green: all (parts of) this subject area comply, are documented, & have been followed for 1 year • Yellow: a very large majority (more than 50%) of this subject area complies (policy, process, both documented, & periodic follow-up) but does not have 1 year of documented evidence • Red: some portion of this subject area has no documented policy or no documented process
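The ColorChart legend is a decision procedure, so it can be sketched as a small function. The record fields (`documented_policy`, `compliant_fraction`, `evidence_days`) are hypothetical names invented for this sketch, assuming compliance is tracked as a fraction and evidence as days of documented follow-up.

```python
GREEN, YELLOW, RED = "Green", "Yellow", "Red"

def color_for(area):
    """Apply the ColorChart legend to one subject area.

    `area` is a hypothetical record with:
      documented_policy / documented_process : bool
      compliant_fraction : 0.0..1.0, fraction of the area that complies
      evidence_days      : days of documented follow-up evidence
    """
    # Red: some portion lacks a documented policy or documented process.
    if not (area["documented_policy"] and area["documented_process"]):
        return RED
    # Green: everything complies, with a full year of documented evidence.
    if area["compliant_fraction"] == 1.0 and area["evidence_days"] >= 365:
        return GREEN
    # Yellow: a very large majority (more than 50%) complies, but
    # without one year of documented evidence.
    if area["compliant_fraction"] > 0.5:
        return YELLOW
    return RED

print(color_for({"documented_policy": True, "documented_process": True,
                 "compliant_fraction": 0.9, "evidence_days": 120}))  # → Yellow
```

This mirrors the Web Access Controls example on slide 9: policy and process exist, so the area is not Red, but the missing regular review keeps it at Yellow.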

  8. Final Report: what to expect (1/2) • Intro: • Who we (audit team) are, what site we audited, what our process was (document review, interviews, etc.) • What subject areas or issues we reviewed: OS security, web-access-controls, router ACLs, DRP/BIA/testing, backup, physical, POPI, others. • Body: • 1 short section for each issue/subject area - as the following example:

  9. Final Report: what to expect (2/2) • What we found: 1 short summary per issue: • In the area of “Web Access Controls” we find: • Excellently written policy, well documented, under CM. • Adopted process / procedure template which covers 2/3 of policy. (template appears to have been adopted from another organization with minimal tailoring for local conditions) • Log-file collection system in place, set of log-file filters run on a regular basis. Little documentation that a human reviews either the raw or filtered logs on a regular basis & reports. • Risk: high = potential for intrusions to go undetected • Color: - Yellow: - there is some support for this area, however, the key point of regular review does not exist

  10. Action Items:(appendix to report) • Action Item List: - one “section” in the report per subject area or issue; example: • Web-Access Controls: • 1.) web-access-controls process template review with admin, legal, hr by: dd_mon_1999 owner: JoeAdmin • 2.) put above document under configuration management by: dd_mon_1999 owner: JennySQA • 3.) write policy for log-file-review by: dd_mon_1999 owner: MaryAdmin

  11. Action Items:(appendix to report) • Select individual to process / document log-file reviews by: dd_mon_2000; owner: MissyLog • Etc. etc. etc. • IF there is excellent policy/process/follow-up then say so. • Orient all recommendations / actions as Positive…
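The action-item appendix above has a regular shape (subject area, description, due date, owner), so a tracking sketch is straightforward. The class and field names are assumptions for illustration; the placeholder dates ("dd_mon_1999") are kept exactly as they appear in the report template.

```python
from dataclasses import dataclass

@dataclass
class ActionItem:
    subject_area: str   # one report section per subject area
    description: str
    due: str            # placeholder dates ("dd_mon_1999") as in the report
    owner: str

items = [
    ActionItem("Web-Access Controls",
               "Review process template with admin, legal, HR",
               "dd_mon_1999", "JoeAdmin"),
    ActionItem("Web-Access Controls",
               "Put the template under configuration management",
               "dd_mon_1999", "JennySQA"),
    ActionItem("Web-Access Controls",
               "Write policy for log-file review",
               "dd_mon_1999", "MaryAdmin"),
]

# Render the appendix lines in the report's own format:
for item in items:
    print(f"{item.subject_area}: {item.description} "
          f"by: {item.due} owner: {item.owner}")
```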

  12. Quick-step to success: • POLICIES for the focus areas on the ColorChart • Are the Policies documented? • A PROCESS or PROCEDURE for the implementation of every POLICY area? • Are these Procedures documented? • Evidence (documentation) that the procedure has been followed over time (1 year minimum?)

  13. One-Sentence Summary: • Spend man-hours on process & documentation.
