
Security Metrics


Presentation Transcript


  1. Security Metrics Pete Lindstrom, CISSP Research Director Spire Security, LLC www.spiresecurity.com petelind@spiresecurity.com

  2. Risk and Control Metrics

  3. Agenda • System & Information Assets • Estimating Loss • Controls - Four Disciplines • Risk & Control Metrics • “Business” Metrics • Measure Security Spending • Activity-based Costing • ROI & ROSI

  4. “You Can’t Quantify Risk” Yes, you can quantify risk.

  5. The Precedent is Set Stollenwerk v. TriWest Health Care Alliance, No. 03-0185 PHX SRB (D. Ariz.) “Furthermore, the affidavit of Plaintiffs' expert conclusorily posits that Plaintiffs' risk of identity fraud is significantly increased without quantifying this risk. Defining "significant" for the purpose of awarding credit monitoring is a matter of law for the Court, however, and mere allegations that an increase is significant do not constitute evidence. Similarly, although Plaintiff's expert opines that credit monitoring will "substantially" reduce the risk of identity fraud, he fails to quantify the reduction of risk in objective terms. Because the Court finds that there is no evidence in the record before it that Plaintiffs' personal information itself endured significant exposure, that Plaintiffs' risk of identity fraud is significantly increased, or that credit monitoring will reduce the risk of identity fraud to the necessary degree, Defendant's motion for summary judgment must be granted as to this case even if credit monitoring were available in other circumstances.”

  6. Risk is NOT What You Think it is You may have some preconceived notions (let’s dispel them now): • Risk is NOT static; it is dynamic and fluctuates constantly with potentially high degrees of variation – like any financial index. • Risk is NOT about the possibility that something bad could happen; it is about the probability that it might happen. • Risk is NOT some pie-in-the-sky academic exercise; you have all of the necessary information available to you today. • Risk is NOT a vague, ambiguous concept; it is a continuum along which you can plot many levels of tolerance and aversion.

  7. What is Risk, then? • Risk is the likelihood that something unwanted will happen. • You define what is unwanted (I will give you some ideas). • The goal of any information security program is to minimize risk within the constraints set forth by the organization, technical environment, and available resources.

  8. Three Faces of Risk • Risk = likelihood of negative outcome, where the impact of the negative outcome is understood (and sometimes quantified). • Three faces of risk: • Manifest Risk – the risk of attack or compromise associated with end-user system events. (Activity) • Inherent Risk – the risk associated with the “possibility” of attack due to the availability or exposure of targets. (Asset) • Contributory Risk – the risk related to control process failure and/or incompleteness. (Admin)

  9. 1. Define Unwanted Outcomes Step 1 to Quantifying Risk Take C, I, A objectives to the next level: • Content is read by inappropriate people (confidentiality breach). • Content is inappropriately modified (integrity and authenticity breach). • Content access is interrupted (availability breach). • The Internet has made resources as important as data/content: • Program access is interrupted (productivity breach). • Programs are abused (liability breach). • UOs are compromises of these objectives, precipitated by attacks.

  10. The Ginsu Approach to UOs Mapping attacks to compromises by data state – Inbound (In-Transit) / Stored (At-Rest) / Outbound (In-Transit): • Data/Information • Confidentiality: Sniff / Copy (“steal”) / Leak • Integrity: Modify / Redirect / Spoof, Replay, Insert • Availability: Overload / Delete / Overload • Resources • Productivity: Overload / Distract / Consume • Liability: Relay/Bounce / Abuse (illegal) / Propagate

  11. 2. Define the Event Set “Universe” • Once we have defined unwanted outcomes, we can “back up” to identify the event set of all possible outcomes (that include both wanted and unwanted outcomes): • 1) similar events that can create the outcome (attack/exploit vectors); or • 2) all objects that could be affected by the outcome. (more later). • The most common events in data processing are: • Network Flows • [Host- or application-based] Sessions • Program Operations • [Content-oriented] Transactions / Messages

  12. Recall IT Activities (Events) • Network Layer: Flows • Source IP, Dest IP, Dest Port • Inbound and/or Outbound • Host Layer: Sessions • Sessions under management • Number of logins • Application Layer: Program Operations • System calls • Application calls • Data Layer: Transactions • Messages • Business Activities (financial trades, purchase orders, published articles, etc.) • Queries – Record Retrieval

  13. 3. Calculate Risk • Step 1: Define unwanted outcomes (bad events). • Step 2: Define the event set: Total Events = Good Events + Bad Events. • Step 3: Calculate risk: count and divide step 1 by step 2. Risk = Bad Events / Total (Good + Bad) Events. For email: Risk = Bad Emails / Total (Good + Bad) Emails.
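The three steps can be sketched in a few lines of Python; the email counts below are illustrative, not from the slides:

```python
def manifest_risk(bad_events, total_events):
    """Risk = bad events / total (good + bad) events."""
    if total_events == 0:
        return 0.0
    return bad_events / total_events

# Hypothetical example: 400 spam messages out of 1,000 total emails.
print(manifest_risk(400, 1000))  # 0.4
```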

  14. This is “Manifest” Risk • Manifest Risk is the risk associated with “real-time” or ongoing data processing activities in your computing environment. • (I repeat) The most common events in data processing are: • Network Flows • [Host- or application-based] Sessions • Program Operations • [Content-oriented] Transactions / Messages • The philosophy is “you can’t have a compromise without an attack”; if your systems and data are never touched, they can’t be compromised. • But it’s really not that simple, is it? Enter the INFORMATION SECURITY PROFESSIONAL!

  15. Manifest Risk • Events occurring within the computing environment. (Actual) • Philosophy: A compromise can’t occur without online activity. • Count discrete activities. • Actual Flows (network) • Actual Sessions (system) • Actual Program Commands (application) • Actual Transactions (data) • Count number of “bad” activities.

  16. 4. Calculate Control Coverage • Recall that risk is dynamic. • The key factor in minimizing risk is the set of controls we place over the risk. • We have existing controls and apply new ones throughout our computing environment constantly. • Control Coverage is the percent of events that any or all of our controls evaluate.

  17. Calculate Control Coverage Total events split into good events and bad events; each set splits further into controlled and uncontrolled. Note: some set of good events and some set of bad events are uncontrolled. Coverage = Controlled Events / Total Events.
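The coverage ratio is a straightforward calculation; this sketch uses hypothetical counts:

```python
def control_coverage(controlled_events, total_events):
    """Coverage = controlled events / total events."""
    if total_events == 0:
        return 0.0
    return controlled_events / total_events

# Hypothetical example: a control evaluates 930 of 1,000 events.
print(control_coverage(930, 1000))  # 0.93
```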

  18. 5. Control Success/Failure Total events split into good and bad, then controlled and uncontrolled, then allowed and denied: • Good, controlled, allowed → Success • Good, controlled, denied → Failure (false positive) • Good, uncontrolled, allowed → Lucky • Bad, uncontrolled, allowed → Failure (omission) • Bad, controlled, allowed → Failure (false negative) • Bad, controlled, denied → Success
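The outcome tree on this slide can be expressed as a small decision function (a sketch; the category names follow the slide):

```python
def classify(is_bad, is_controlled, is_allowed):
    """Map one event through the control success/failure tree.

    Returns one of: "success", "lucky", "failure (false positive)",
    "failure (omission)", "failure (false negative)".
    """
    if not is_controlled:
        # Uncontrolled events are always allowed, since no control sees them.
        return "failure (omission)" if is_bad else "lucky"
    if is_bad:
        return "failure (false negative)" if is_allowed else "success"
    return "success" if is_allowed else "failure (false positive)"

# A spam message that the filter catches and denies is a control success.
print(classify(is_bad=True, is_controlled=True, is_allowed=False))  # success
```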

  19. 6. Calculate “Residual” Manifest Risk Total events split into good events (“wanted”) and bad events (“unwanted”), controlled and uncontrolled, allowed and denied. “Residual” manifest risk = (False Negatives + Omissions) / Total Events.
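Residual manifest risk counts only the bad events that got through, whether a control missed them (false negative) or never saw them (omission). A minimal sketch with hypothetical counts:

```python
def residual_manifest_risk(false_negatives, omissions, total_events):
    """'Residual' manifest risk = (false negatives + omissions) / total events."""
    if total_events == 0:
        return 0.0
    return (false_negatives + omissions) / total_events

# Hypothetical example: 20 missed by controls, 30 never evaluated, 1,000 events.
print(residual_manifest_risk(20, 30, 1000))  # 0.05
```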

  20. Email Risk Apply the same tree to email: messages split into legitimate email and spam, then controlled and uncontrolled, then allowed and filtered (yielding success, lucky, false positive, omission, and false negative buckets). • Risk = Spam / Total Email Msgs • Coverage = Controlled Email Msgs / Total Email Msgs • Effectiveness = Successful Email Msgs / Total Email Msgs • “Residual” Risk = Incidents (false negatives + omissions) / Total Email Msgs
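Using illustrative counts for the outcome buckets (these numbers are hypothetical, not from the slides), the four email metrics work out as follows:

```python
# Hypothetical monthly email counts for the outcome buckets.
good_allowed_ok   = 700  # controlled legitimate mail, delivered (success)
good_filtered_fp  = 10   # controlled legitimate mail, filtered (false positive)
good_uncontrolled = 40   # uncontrolled legitimate mail, delivered (lucky)
bad_uncontrolled  = 30   # uncontrolled spam, delivered (omission)
bad_allowed_fn    = 20   # controlled spam, delivered (false negative)
bad_filtered_ok   = 200  # controlled spam, filtered (success)

total = (good_allowed_ok + good_filtered_fp + good_uncontrolled
         + bad_uncontrolled + bad_allowed_fn + bad_filtered_ok)        # 1000
spam = bad_uncontrolled + bad_allowed_fn + bad_filtered_ok             # 250
controlled = (good_allowed_ok + good_filtered_fp
              + bad_allowed_fn + bad_filtered_ok)                      # 930
successes = good_allowed_ok + bad_filtered_ok                          # 900
incidents = bad_uncontrolled + bad_allowed_fn                          # 50

print(f"Risk          = {spam / total:.3f}")       # 0.250
print(f"Coverage      = {controlled / total:.3f}") # 0.930
print(f"Effectiveness = {successes / total:.3f}")  # 0.900
print(f"Residual risk = {incidents / total:.3f}")  # 0.050
```

The same breakdown works for the buffer-overflow slide that follows: substitute system calls for messages and "blocked" for "filtered".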

  21. Buffer Overflow Risk The same breakdown applied to system calls: calls split into legitimate calls and overflows, then controlled and uncontrolled, then allowed and blocked (yielding success, lucky, false positive, omission, and false negative buckets). • Risk = Overflows / Total Sys Calls • Coverage = Controlled Sys Calls / Total Sys Calls • Effectiveness = Successful Sys Calls / Total Sys Calls • “Residual” Risk = Incidents (false negatives + omissions) / Total Sys Calls

  22. Agree? Disagree? Pete Lindstrom petelind@spiresecurity.com www.spiresecurity.com
