Part Two: Risk Management Frameworks


Overview
Difficulty: Beginner
Duration: 50m
Students: 68
Ratings: 5/5
Description

This course explores risk analysis and prepares you for the CISM examination, which will cover the significant aspects of risk. We'll cover different risk levels and types of risk and how they can potentially affect an organization. We also look at the risk assessment cycle and the stages required when analyzing risk. You'll also learn about the various risk analysis methods available. Then we'll move on to how risk analysis can be used when planning and deploying risk controls and countermeasures.

If you have any feedback relating to this course, please contact us at support@cloudacademy.com.

Learning Objectives

  • Identify risk levels and the potential impact of given risks on assets
  • Learn about the risk assessment cycle
  • Learn about different risk analysis methods including qualitative, semiquantitative, quantitative, OCTAVE, and FAIR
  • Use risk analysis to control threats and risk
  • Define a strategy for deploying risk countermeasures

Intended Audience

This course is intended for those looking to take the CISM (Certified Information Security Manager) exam or anyone who wants to improve their understanding of information security.

Prerequisites

Any experience relating to information security would be advantageous, but not essential. All topics discussed are thoroughly explained and presented in a way that allows the information to be absorbed by everyone, regardless of experience in the security field.

Transcript

Here you see the latest issue from NIST, the Risk Management Framework, which is characterized in their Special Publication 800 series, in volumes 800-37 and 800-39. The first step is to prepare by framing the risk. Second, we are going to actually conduct the assessment, and there you see the five steps normally involved in that process. Third, we are going to communicate the results to management to inform their decision-making process. And fourth, we're going to maintain the assessment, which encompasses all of the risk management activities that we will carry out in an ongoing, continuous manner.

Historically, there are two families of general analysis techniques. The first is the qualitative analysis method. Qualitative analysis relies less on numbers, meaning orders of probability and dollar values, and more on scenarios and the cause and effect of the real interaction between the threat and the asset.

Here you see a matrix that allows us to evaluate the likelihood of occurrence of an adverse event and the level of impact of that event. For the likelihood of occurrence, we have five levels from improbable to frequent. For the consequence, or impact, we similarly have five levels from negligible to catastrophic.

In this cross-impact matrix, we're then able to rate risk from low to very high by going across to measure the likelihood of occurrence and locating the intersection between that and the level of damage we believe will result. While it is true that eventually our analysis will have to produce numbers, meaning dollars, this enables us to get a rough order of magnitude of the impact to the asset and extrapolate that to its subsequent impact on the operation to which the asset belongs.
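The intersection lookup described above can be sketched in code. This is a minimal illustration, not part of the course material: the intermediate level names and the banding thresholds below are assumptions chosen for the example, since only the endpoints (improbable/frequent, negligible/catastrophic) are given.

```python
# A sketch of a qualitative 5x5 cross-impact matrix lookup. Only the
# endpoint level names come from the lecture; the intermediate names and
# the banding thresholds are illustrative assumptions.
LIKELIHOOD = ["improbable", "remote", "occasional", "probable", "frequent"]
IMPACT = ["negligible", "minor", "moderate", "major", "catastrophic"]

def rate_risk(likelihood: str, impact: str) -> str:
    """Locate the intersection of likelihood and impact in the matrix."""
    li = LIKELIHOOD.index(likelihood)
    ii = IMPACT.index(impact)
    score = li + ii  # 0..8: further from the low/low corner = higher risk
    if score <= 2:
        return "low"
    elif score <= 4:
        return "medium"
    elif score <= 6:
        return "high"
    return "very high"

print(rate_risk("frequent", "catastrophic"))  # very high
print(rate_risk("improbable", "negligible"))  # low
```

In practice, the rating assigned to each cell is set by organizational policy rather than a formula; the point is that the matrix converts two subjective judgments into a single, comparable rating.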

A semiquantitative analysis rests on a technique that is largely qualitative, but it applies numeric values in the matrix instead of low, medium, and high types of indicators. It is true that in any form of non-purely quantitative analysis, the values we place in these matrices are going to be largely subjective, rather than largely objective as they would be if derived from past impacts of a similar type. This gives us an impact-and-likelihood cross-impact matrix similar to the one on the previous slide, but by applying numbers it is somewhat less subjective.
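A common semiquantitative convention is to score each axis 1 to 5 and multiply, yielding a 1 to 25 risk score that can then be banded. This sketch assumes that convention; the specific thresholds are illustrative, not from the lecture.

```python
# A semiquantitative variant of the matrix: numeric 1-5 scores on each
# axis in place of labels. The multiplication convention and the banding
# thresholds are common practice but assumed here for illustration.
def semiquantitative_score(likelihood: int, impact: int) -> int:
    """Multiply the two 1-5 axis values to get a 1-25 risk score."""
    assert 1 <= likelihood <= 5 and 1 <= impact <= 5
    return likelihood * impact

def band(score: int) -> str:
    """Convert a 1-25 score back to a rating band."""
    if score <= 4:
        return "low"
    elif score <= 9:
        return "medium"
    elif score <= 16:
        return "high"
    return "very high"

print(band(semiquantitative_score(5, 5)))  # very high
print(band(semiquantitative_score(1, 2)))  # low
```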

Then of course, we have quantitative analysis. This is a family of techniques that relies very heavily on numbers, meaning probability indicators and dollar values. The formulas used here take the probabilities of occurrence and the dollar values and calculate the terms that you see there on the slide: probability of event occurrence, monetary impact, technical impact, operational impact, and human impact.
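The classic quantitative formulas taught for the CISM exam combine probability and dollar value as single loss expectancy (SLE) and annualized loss expectancy (ALE). The slide's exact formulas aren't reproduced here, so this sketch uses those standard definitions with invented numbers.

```python
# A sketch of the standard quantitative risk formulas (SLE and ALE).
# The asset value, exposure factor, and occurrence rate are invented
# for illustration.
def single_loss_expectancy(asset_value: float, exposure_factor: float) -> float:
    """SLE = asset value x exposure factor (fraction of value lost per event)."""
    return asset_value * exposure_factor

def annualized_loss_expectancy(sle: float, aro: float) -> float:
    """ALE = SLE x annualized rate of occurrence (expected events per year)."""
    return sle * aro

sle = single_loss_expectancy(asset_value=200_000, exposure_factor=0.25)
ale = annualized_loss_expectancy(sle, aro=0.5)  # one event every two years
print(f"SLE = ${sle:,.0f}, ALE = ${ale:,.0f}")  # SLE = $50,000, ALE = $25,000
```

The ALE figure is what makes cost-benefit analysis possible: a countermeasure costing less per year than the ALE it eliminates is, on these numbers, worth deploying.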

Quantitative analysis is oftentimes preferred over qualitative analysis because the objective nature of the numbers tends to give confidence, owing to how they are derived and presented. This objectivity suggests the removal of interpretation and opinion. It is nevertheless true that every calculation will be a melding of qualitative and quantitative methods.

The qualitative describes the context in which these events will occur and in which these assets may be impacted by threat agents. The quantitative describes the numbers as before: probability and dollar values. No risk analysis can be done in the abstract; a context of interaction must be defined in order to give these numbers meaning. But a risk analysis that does not derive the numbers would provide no basis for doing cost-benefit analysis or loss evaluation, or for supporting insurance claims for such losses. And so, therefore, the combination must be the way that these risk analyses are done.

Various derivatives of these risk analysis techniques have been produced over the years. One form is called value at risk (VaR). This type is oftentimes required as one of the analysis techniques used in the financial sector. However, it requires a lot of historical data that has been very carefully documented and authenticated. It shows promise for information security management usage.
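One common way to compute value at risk, historical simulation, makes the dependence on documented historical data concrete: sort the observed losses and read off the loss level exceeded only a small fraction of the time. This is a minimal sketch with invented data, not the method any particular institution uses.

```python
# A minimal historical-simulation Value at Risk sketch. Given a series
# of past period losses (positive = loss), the 95% VaR is the loss level
# exceeded only 5% of the time. The data below is invented.
def value_at_risk(losses: list[float], confidence: float = 0.95) -> float:
    """Return the loss at the given confidence quantile of the history."""
    ordered = sorted(losses)
    index = min(int(confidence * len(ordered)), len(ordered) - 1)
    return ordered[index]

historical_losses = [-1.2, 0.4, 2.1, -0.3, 3.5, 0.9, 1.7, -0.8, 4.2, 0.1]
print(value_at_risk(historical_losses))  # 4.2 with this tiny sample
```

With only ten observations the quantile is crude, which is exactly why the technique demands a large, carefully authenticated history before its output is trustworthy.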

It's worth mentioning that every method of risk analysis requires accurate data, and quite likely lots of it. But any method that improves our results and leads to better mitigation strategies is worth investigating.

One method of risk analysis very much favored by the Air Force, and the military in general, is called OCTAVE. OCTAVE is an acronym that stands for Operationally Critical Threat, Asset, and Vulnerability Evaluation, and it tends to be a very tactical and detailed method. As with all methods of risk analysis, it is used to identify, prioritize, and manage risk.

It has three phases. The first is the location and identification of all assets within the scope of the analysis project. The second is to locate and quantify all networks and components required for each asset and, in detail, evaluate all vulnerabilities within this context. The third is where we assign risk to each asset, which informs our decision-making process.

As I mentioned, there are many different risk analysis methods. Some are extremely technical and are therefore rather unwieldy for everyday use. Others are inexact enough that their results are oftentimes questionable. It can therefore easily be said that without excellent data and proper analysis techniques, the results of any risk analysis method will be suspect. Here you see several other methods for performing risk analysis. Some of these methods have found application in very specific types of operational settings, while others are more generally applicable to a variety of contexts.

The FAIR method, for example, which stands for Factor Analysis of Information Risk, is defined by the FAIR Institute. This relatively young method seeks to evaluate each factor involved in the risk calculation by deconstructing the situation and assigning a level of risk, impact, or value to each of the factors identified. But whichever risk analysis method is selected for use, it must be recognized that proper analysis techniques and the quality and quantity of data will play a very large role in the integrity of the results produced.
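The deconstruction FAIR performs can be sketched at its top level: risk is estimated as loss event frequency times loss magnitude, with frequency itself decomposed further. This is a highly simplified illustration of that structure, not the full FAIR taxonomy, and all the numbers are invented.

```python
# A highly simplified sketch of the FAIR decomposition. In full FAIR,
# each factor is itself derived from sub-factors (contact frequency,
# probability of action, primary/secondary loss, etc.); the two-level
# breakdown and all values here are illustrative.
def loss_event_frequency(threat_event_frequency: float,
                         vulnerability: float) -> float:
    """Threat events per year times the fraction that succeed as losses."""
    return threat_event_frequency * vulnerability

def annualized_risk(lef: float, loss_magnitude: float) -> float:
    """Expected annual loss exposure: loss events/year x dollars per event."""
    return lef * loss_magnitude

lef = loss_event_frequency(threat_event_frequency=12, vulnerability=0.25)
print(annualized_risk(lef, loss_magnitude=40_000))  # 120000.0
```

Even in this toy form, the value of the deconstruction is visible: each factor can be estimated, debated, and improved independently, rather than guessing a single overall risk number.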

About the Author
Ross Leo
Instructor
Students: 3580
Courses: 47
Learning Paths: 6

Mr. Leo has been in Information Systems for 38 years, and an Information Security professional for over 36 years. He has worked internationally as a Systems Analyst/Engineer, and as a Security and Privacy Consultant. His past employers include IBM, St. Luke's Episcopal Hospital, Computer Sciences Corporation, and Rockwell International. A NASA contractor for 22 years, from 1998 to 2002 he was Director of Security Engineering and Chief Security Architect for Mission Control at the Johnson Space Center. From 2002 to 2006, Mr. Leo was the Director of Information Systems and Chief Information Security Officer for the Managed Care Division of the University of Texas Medical Branch in Galveston, Texas.


Upon attaining his CISSP certification in 1997, Mr. Leo joined ISC2 as Chairman of the Curriculum Development Committee, and served in this role until 2004. During this time, he formulated and directed the effort that produced what became, and remains, the standard curriculum used to train CISSP candidates worldwide. He has maintained his standards as a professional educator, training and certifying nearly 8500 CISSP candidates since 1998, and nearly 2500 in HIPAA compliance certification since 2004. Mr. Leo is an ISC2 Certified Instructor.