IT Risk Management, Planning and Mitigation
TCOM 5253 / MSIS 4253
Detailed Risk Analysis

18 October 2007

Charles G. Gray

(c) 2007 Charles G. Gray



Review of Summary Level Risk Analysis

  • Communicating a well-documented risk may trigger stakeholder (business owner) action

    • Must have enough detail to determine an appropriate mitigation solution

    • The Risk Management Team must remain involved


Summary Level Risk Rating

Definition of “moderate” or “high” depends on each organization’s needs


Summary Risk Level List

  • Develop a “summary” level list, including ALL identified assets (slide 13 – last week)

    • Based on slide 29 matrix (from last class)

  • Extra columns for supporting information can be added

  • Tailor the process to meet the organization’s individual needs

  • Every organization must define “high risk” for its own unique enterprise

    • “Medium” impact and “medium” probability may have “moderate” or “high” risk (slide 30)


Summary Level Risk Rating


Preparing for Detail Level Analysis

  • Become familiar with the entire detailed risk analysis process before beginning

  • Leverage the inputs used in the summary level analysis, but include considerably more detail

    • Well organized documentation is essential

    • Microsoft spreadsheets (from the MS Management Guide) are ideal


Sample Risk Statements

  • Summary level – “Within one year, high value servers may be moderately impacted by a worm due to unpatched configurations”

  • Detail level (1) – “Within one year, high value servers may be unavailable for three days due to worm propagation caused by unpatched configurations”

  • Detail level (2) – “Within one year, high value servers may be compromised, affecting the integrity of data due to worm propagation caused by unpatched configurations”


Tasks to Produce the Detailed Level List of Risks

  • Task one – Determine impact and exposure

  • Task two – Identify current controls

  • Task three – Determine probability of impact

  • Task four – Determine detailed risk level


Confidentiality or Integrity Exposure Ratings


Availability Exposure Rating


Composite Exposure Rating

  • Collect exposure ratings for each potential impact

  • Choose the highest value from slide 9 or 10 as the “exposure rating”

    • E.g., if the “confidentiality” rating is 3, and the “availability” rating is 4, then choose 4 as the “exposure rating”
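The rule above can be sketched in a few lines of Python (a minimal illustration; the function name and inputs are mine, not from the slides):

```python
# Composite exposure rating: take the highest of the per-impact-class ratings
# (confidentiality/integrity rating vs. availability rating).
def composite_exposure(conf_or_integrity: int, availability: int) -> int:
    """Choose the highest exposure rating across the impact classes."""
    return max(conf_or_integrity, availability)

# Slide example: confidentiality rating 3, availability rating 4 -> 4
print(composite_exposure(3, 4))  # 4
```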


Impact Values

  • Typical impact values for each impact class

  • May be adjusted to “fit” each organization


Exposure Factor

  • Microsoft recommends a linear scale

  • Must be tailored to each organization


Impact Rating

  • Impact = impact class value (V) (from slide 12) times the exposure factor (EF) (from slide 13)

  • Impact = V * EF
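As a quick sketch of the formula (the numeric values below are illustrative only, not from the slides):

```python
def impact_rating(class_value: float, exposure_factor: float) -> float:
    """Impact = impact class value (V) times exposure factor (EF)."""
    return class_value * exposure_factor

# E.g., an impact class value of 10 with an exposure factor of 0.8:
print(impact_rating(10, 0.8))  # 8.0
```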


Impact Rating (Example)


Review - Output from Task One

  • Choose the highest exposure rating between:

    • Confidentiality or integrity of an asset

    • Availability of an asset

  • Assign an exposure factor (EF) for each exposure rating (slide 13)

  • Determine the “impact rating” (slide 14)

  • Result is an asset list sorted by impact
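A minimal sketch of that sorted output, assuming a simple list-of-dicts layout (the asset names and impact values here are hypothetical):

```python
# Hypothetical task-one output: each asset with its computed impact rating.
assets = [
    {"asset": "File server", "impact": 6.4},
    {"asset": "Customer database", "impact": 8.0},
    {"asset": "Intranet site", "impact": 2.5},
]

# Sort by impact rating, highest first, to produce the prioritized asset list.
by_impact = sorted(assets, key=lambda a: a["impact"], reverse=True)
for entry in by_impact:
    print(entry["asset"], entry["impact"])
```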


Identify Current Controls

  • Business owners/stakeholders should identify the various controls

    • “Directed questioning” by the Risk Management Team may be needed

  • The controls themselves may be “objective”, that is, written down (de jure), or may be only “de facto” (word-of-mouth)

    • “Effectiveness”, however, will probably be subjective (see slides 18 and 19)


Evaluating Effectiveness of Current Controls

  • Effectiveness is subjective and will rely on the experience of the Security Risk Management Team to understand the control environment

  • Answer each question (next slide) and total the values

  • Lower value means the controls are effective and MAY reduce the probability of an exploit occurring
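The totalling step is simple enough to sketch directly (the per-question values below are made up for illustration; the real questions are on the following slide):

```python
# Each entry is the numeric value assigned to one control-effectiveness
# question. A lower total suggests the current controls are effective and
# may reduce the probability of an exploit occurring.
def control_effectiveness(answer_values) -> int:
    return sum(answer_values)

print(control_effectiveness([0, 1, 0, 2, 1]))  # 4
```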


How Effective are Current Controls?


Control Effectiveness - Example


Review – Output from Task Two

  • A list of controls and their effectiveness agreed between the stakeholders and the Risk Management Team


Determining Probability of Impact

  • Probability rating depends on:

    • Probability of the vulnerability existing in the environment based on attributes of the vulnerability and possible exploit (1-5)

    • Probability of the vulnerability existing based on the effectiveness of current controls (1-5)

  • Relative risk rating =

    probability rating * impact rating
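The formula above, as a one-line sketch (the example values are illustrative only):

```python
def relative_risk(probability_rating: float, impact_rating: float) -> float:
    """Relative risk rating = probability rating * impact rating."""
    return probability_rating * impact_rating

# Illustrative values: probability rating 6, impact rating 8 -> 48
print(relative_risk(6, 8))  # 48
```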


Vulnerability Attributes (H)

  • High (Assign value of 5 if ANY apply)

    • Large attacker population – script kiddie/hobbyist

    • Remotely executable

    • Anonymous privileges needed

    • Externally-published exploitation method

    • Automated attack possible


Vulnerability Attributes (M)

  • Medium (Assign value of 3 if ANY apply)

    • Medium-sized attacker population – expert/specialist

    • Not remotely executable

    • User level privileges required

    • Exploitation method not publicly published

    • Non-automated


Vulnerability Attributes (L)

  • Low (Assign value of 1 if ALL apply)

    • Small attacker population – insider knowledge

    • Not remotely executable

    • Administrator privileges required

    • Exploitation method not publicly published

    • Non-automated
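The three slides above amount to a simple assignment rule, sketched below (the boolean attribute flags are hypothetical stand-ins for the listed criteria):

```python
def vulnerability_value(high_attributes, medium_attributes) -> int:
    """Assign 5 if ANY high attribute applies, 3 if ANY medium attribute
    applies, and 1 otherwise (i.e., only the low attributes apply)."""
    if any(high_attributes):
        return 5
    if any(medium_attributes):
        return 3
    return 1

# E.g., remotely executable with a published exploit method -> high (5)
print(vulnerability_value([True, True], [False]))  # 5
```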


Vulnerability Sum


Review – Output of Task Three

  • Probability rating taking into account the current controls in place

  • Sum of vulnerability rating (slide 26) and control effectiveness (slide 19)

    • Column 9 on slide 28


Baseline Risk – Current Controls


Summary Qualitative Ranking


Review – Output of Task Four

  • Detailed prioritized risk list with a (mostly) objective “risk rating” ranging from 0 to 100

  • A risk analysis chart to assist stakeholders in visualizing the relative risk ratings

  • Risk levels should be used only as a guide for decision makers, and some adjustments are allowed by stakeholders

    • However, everybody must recognize that every asset cannot be “number one” on the priority list


Next Week

  • Quantifying Risk

  • The hard work starts

    • Putting numbers ($$$) to the assets and the loss expectancy
