This course within the CISM Domains learning path looks at risk management and the resources that can be used in order to avoid and tackle risk in an organization. We'll start by looking at risk identification and risk analysis, which is the quantification and comparison of risks. Then we look at a variety of risk management frameworks in use by companies today.
Finally, we look at the constraints that can hamper your efforts to manage risk, focusing on working with third parties and the technical and human aspects to take into consideration when doing so.
Learning Objectives
- Understand how an organization can identify and analyze risk
- Learn the constraints to risk management
Intended Audience
This course is intended for anyone preparing for the Certified Information Security Manager (CISM) exam or anyone who is simply interested in improving their knowledge of information security governance.
Prerequisites
Before taking this course, we recommend taking the CISM Foundations learning path first.
We continue now with Cloud Academy's presentation of the CISM examination preparation review seminar. We're starting with Section 38, Information Risk Management, and the resources that help. Now, the business value of assets is often represented by the relative criticality or sensitivity of the given item.
Generally, a greater level of either means greater value to the business, and of course a greater negative impact if that asset is lost or compromised in some manner. It has always been considered ideal to have the information assets neatly classified and categorized to capture these qualities and values. However, classification often proves to be a costly and tedious process, and its product frequently starts diverging from the standard set by the exercise. Thus, while it might be performed, it is rarely kept current.
Alternatively, it might be more profitable to perform a business dependency assessment instead, usually as part of the business impact analysis, which looks at the cause-and-effect relationship between the operation the given asset is a part of and the consequences of its becoming unavailable. Either way, the analysis must ensure the asset inventory is reasonably complete, that the locations of the assets are identified, and that the data owners, users, and custodians are identified.
It is advisable to keep the number of classification levels to a respectable minimum to reduce complexity and improve maintainability. It is equally advisable to have the scheme reviewed by IT stakeholders before publishing and distribution. The exercise itself should lead to the identification or definition of security measures appropriate to all assets in a given classification.
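To make that last point a little more concrete, here is a minimal sketch, in Python, of how a small classification scheme might map each level to its baseline security measures. The level names and controls are illustrative assumptions of mine, not levels prescribed by the CISM material.

```python
# Illustrative classification scheme: a small, fixed set of levels, each
# mapped to the baseline security measures required for assets at that level.
# The level names and control lists are assumptions for this sketch.
CLASSIFICATION_CONTROLS = {
    "public":       ["integrity checks on published copies"],
    "internal":     ["access control", "backup"],
    "confidential": ["access control", "backup", "encryption at rest"],
    "restricted":   ["access control", "backup", "encryption at rest",
                     "encryption in transit", "enhanced monitoring"],
}

def required_controls(classification: str) -> list[str]:
    """Return the baseline measures defined for a classification level."""
    try:
        return CLASSIFICATION_CONTROLS[classification]
    except KeyError:
        raise ValueError(f"unknown classification level: {classification}")

print(required_controls("confidential"))
```

Keeping the number of levels small, as advised above, also keeps a mapping like this short enough to maintain.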
Now, in the course of determining the criticality or sensitivity of a given asset, it is advisable to focus on the impact resulting from the loss of the asset rather than the cause of the loss. Focusing on the cause would lose the focus of the analysis, which is to determine the criticality or sensitivity of the asset to some adversely affecting influence; otherwise, you would need to look at every event that could lead to a loss. It is more appropriate to employ the business impact analysis to determine that impact.
So, to continue, you divide the organization into business units, rating the relative importance of each against all the others included within the scope of the assessment. As an example, business unit B among these three is the most important. The analytic factors and valuation methods should, where possible, reflect those used elsewhere in the business to keep the measurements consistent.
The next step is to identify the critical functions within each of these business units, rank them accordingly, and note the business unit each belongs to. The critical functions should be assigned their respective priorities within their units.
In this example, each business unit has two critical functions. Of course, this is to keep things simple. In the real world, it's very likely this analysis would be much more complex. And in keeping with our approach, the more important of these units is ranked highest.
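To illustrate the two ranking steps just described, here is a minimal sketch in Python; the unit names, importance weights, and function priorities are hypothetical placeholders, not values from the course material.

```python
# Hypothetical business units with a relative importance weight
# (higher = more important) and critical functions ranked within each
# unit (1 = highest priority). All names and numbers are placeholders.
business_units = {
    "Unit A": {"importance": 2, "functions": {"order intake": 1, "invoicing": 2}},
    "Unit B": {"importance": 3, "functions": {"claims processing": 1, "payments": 2}},
    "Unit C": {"importance": 1, "functions": {"reporting": 1, "archiving": 2}},
}

# List the units from most to least important, then each unit's critical
# functions in priority order, mirroring the two ranking passes above.
for name, unit in sorted(business_units.items(),
                         key=lambda item: item[1]["importance"],
                         reverse=True):
    ranked = sorted(unit["functions"], key=unit["functions"].get)
    print(f"{name} (importance {unit['importance']}): {ranked}")
```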
Next, identify the assets or resources required for each critical function and rank them within each unit respectively. When doing this, caution must be exercised to examine the asset and resource dependency chains so that potential single points of failure are captured if and where they exist. It is not enough to look only at the first-order assets and other critical items.
Single points of failure may exist lower down the line, and if compromised they may render all the links above them inert, which can just as effectively bring about a service interruption or other kind of compromise. These second- and third-order items must likewise be examined to ensure that no other "weakest link" remains unknown and unmitigated.
When vulnerabilities exist in any asset or resource, they must be discovered and mitigated in order to prevent their effects from cascading up or down the supply chain and ultimately affecting the enterprise. The risks found up and down the chain are individually potential sources of failure, but the cumulative risk of vulnerabilities discovered at multiple levels can create a much greater level of technical fragility than any one alone. This also opens the way to a more systematic form of mitigation by examining the interrelationships and interdependencies between the assets and their vulnerabilities.
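One way to picture the dependency-chain check is as a simple graph walk: start at a critical function, follow everything it depends on, and flag any lower-order item that all of its first-order assets ultimately share. The dependency map below is a hypothetical sketch of that idea, not a prescribed method.

```python
# Hypothetical dependency map: each item lists the lower-order assets it
# relies on. A lower-order item shared by every first-order asset is a
# candidate single point of failure for everything above it.
dependencies = {
    "claims processing": ["app server A", "app server B"],
    "app server A": ["database cluster"],
    "app server B": ["database cluster"],
    "database cluster": ["storage array"],
    "storage array": [],
}

def chain_below(asset: str) -> set[str]:
    """Walk the chain below an asset and return everything it relies on."""
    seen: set[str] = set()
    stack = list(dependencies.get(asset, []))
    while stack:
        node = stack.pop()
        if node not in seen:
            seen.add(node)
            stack.extend(dependencies.get(node, []))
    return seen

# Second- and third-order items that every first-order asset depends on
# are the hidden "weakest link" candidates described above.
first_order = dependencies["claims processing"]
shared = set.intersection(*(chain_below(a) | {a} for a in first_order))
print("Potential single points of failure:", shared)
```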
Once again, we're going to use the same ranking logic as before: the more critical, the higher the rating. Now, in this map, we see the outcome of this process. Here we take the business operations and map the specific risks to them. What we're looking for is where the risk originates, whatever the risk currently under examination might be.
Once we find where it originates, it becomes easier to prioritize it relative to the other elements, and doing so makes the relationships between and among these assets much clearer. This factor-deconstruction process illuminates the character and risk contribution of each resource and asset, from process to business unit to the overall enterprise. Now, the impact assessment and analysis will invariably rest on both qualitative and quantitative factors.
Looking at the qualitative approach, we have both advantages and disadvantages. The advantage is that it prioritizes risks and identifies areas for improvement. The disadvantage, speaking purely of the qualitative, is that it doesn't provide any measurable magnitude for comparing one aspect to another.
Then we have the quantitative approach, which also has advantages and disadvantages. An advantage is that it supports a business-based cost-benefit analysis of one element over another. The disadvantage is that the meaning of the resulting quantities may be unclear or arbitrary. Bear in mind that the exclusive application of one or the other of these two types of analysis will have a skewing effect.
This, of course, leads to the conclusion that all risk analysis methods must take a balanced, hybrid approach between the qualitative, which provides context and cause-and-effect factors, and the quantitative, which provides the ways and means to measure and compare the various techniques and scenarios.
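As one hedged illustration of such a hybrid, qualitative likelihood and impact ratings can be placed on simple ordinal scales and multiplied into a comparable score. The scale values and scenarios below are assumptions made for the sketch, not figures prescribed by CISM.

```python
# Semi-quantitative scoring: qualitative ratings keep their context while an
# assumed 1-5 ordinal scale makes scenarios numerically comparable.
LIKELIHOOD = {"rare": 1, "unlikely": 2, "possible": 3, "likely": 4, "almost certain": 5}
IMPACT     = {"negligible": 1, "minor": 2, "moderate": 3, "major": 4, "severe": 5}

def risk_score(likelihood: str, impact: str) -> int:
    """Ordinal likelihood multiplied by ordinal impact."""
    return LIKELIHOOD[likelihood] * IMPACT[impact]

# Hypothetical scenarios for comparison.
scenarios = {
    "ransomware on file servers": ("possible", "major"),
    "loss of a single laptop":    ("likely", "minor"),
}

for name, (lik, imp) in scenarios.items():
    print(f"{name}: score {risk_score(lik, imp)} ({lik} / {imp})")
```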
So we choose the hybrid methodology, or whichever other methodology might be the best fit for the organization. What we find is that the methodology chosen, compared to another alternative, may indeed produce a similar outcome regardless, because the process is driven by the priorities of the enterprise, and the methodologies will tend to rate things in much the same manner. The overall risk assessment process identifies risks and produces controls and countermeasures as we work the process through to a conclusion.
In summary, the risk assessment phases include risk identification; risk analysis, which is the quantification and comparison of the findings; and then the overall risk evaluation, oftentimes drawn from the qualitative in combination with the quantitative. And here we see an illustration of the general process.
Let us start at the top with the threat assessment. Beginning there, we look at vulnerabilities and overall risk. Combining these two as parallel tasks, we evaluate the risk and then take a look at control evaluation. That is the point at which we will apply countermeasures, if the indications are that new or different countermeasures will be required.
As we look at the controls evaluation, what we're seeking to learn is whether or not the controls already in place are adequate to the demands being placed on them by the risk elements, or whether over time the risk profile has changed, requiring a change in the controls being used. Once these are in place, we have to evaluate what the residual risk is, because an action plan will be required to deal with it, even if that plan is only continuous monitoring.
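One common way to express that residual-risk step, offered here only as a sketch, is to discount the inherent risk by the judged effectiveness of the controls; the 0-to-1 effectiveness scale and the numbers used are assumptions, not a mandated formula.

```python
def residual_risk(inherent_risk: float, control_effectiveness: float) -> float:
    """Portion of the inherent risk not addressed by the controls in place.

    control_effectiveness is a judgment on an assumed 0.0 to 1.0 scale.
    """
    if not 0.0 <= control_effectiveness <= 1.0:
        raise ValueError("control_effectiveness must be between 0 and 1")
    return inherent_risk * (1.0 - control_effectiveness)

# Example: an inherent risk scored 20 on the organization's scale, with
# controls judged 70% effective, leaves a residual risk of 6 for the
# action plan to treat, or at least to monitor.
print(residual_risk(20, 0.7))
```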
Moving from residual risk through that action plan, we then go back to asset identification and valuation. Clearly this is an iterative process that can begin at asset identification and valuation, or at the concluding action plan from the prior exercise. So we look to determine the location of the assets and identify them, ranked on the basis of the criticality or sensitivity of the assets as defined by the business unit they belong to. We have to determine their value and relative importance when we're doing this risk assessment.
Now, the valuation itself may be a financial figure, which is easy to arrive at for hardware, or it may be an arbitrary number as designated by the business. The valuation of information itself as an asset can be more difficult. It is typically based on how the information contributes to revenue, the cost to create it, the cost to recreate or re-acquire it, the cost of its preservation, or all of these.
Individually identifiable information also has to have a value in order for us to make a cost-effective analysis of what control measures should be put in place. This, of course, is aside from any compliance requirement that we may see as arbitrary but that must be met regardless. It includes various forms of sensitive information such as Social Security numbers, the protected health information of HIPAA, or the personally identifiable information that may be contained in PCI transactions.
Now, it is not necessarily the value of the data itself that matters, but rather the value of a prospective loss of that data and all the costs associated with it, inclusive of legal fines, remediation activities, notification, and so forth.
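A minimal sketch of that loss-side valuation might simply total the assumed per-record and fixed costs of a breach. Every figure below is a placeholder assumption used to show the arithmetic, not an industry benchmark.

```python
# Placeholder cost components for the prospective loss of a sensitive data set.
records_exposed     = 50_000
cost_per_record     = 150       # assumed notification, credit monitoring, remediation
regulatory_fines    = 500_000   # assumed fine exposure
legal_and_forensics = 250_000   # assumed fixed response costs

prospective_loss = (records_exposed * cost_per_record
                    + regulatory_fines
                    + legal_and_forensics)
print(f"Prospective loss if this data set is breached: ${prospective_loss:,}")
```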
Now, marketing materials can also create risk if they present various forms of inaccuracy; the exposure would take the form of any litigation that might be pursued under a misrepresentation charge. Next we look at information asset categories, and we consider the following factors: proprietary information and processes, which of course would be things like the intellectual property of patents or trade secrets.
Our current financial records and projections have incredible value because they would reveal to competitors how we stand now and what we plan to do in the future. Anything to do with mergers and acquisitions is, of course, very sensitive, and the same is true of strategic marketing plans. Trade secrets can confer market advantage to a very great extent, obviously giving them value to competitors or others. The same applies to patents and other intellectual property, and then to any of the kinds of sensitive data I've mentioned already.
Now, information asset valuation strategies exist first to identify and capture the inventory, and then, by some method, to assign a value to the assets themselves, whatever those assets might be. We start with a matrix of loss scenarios and the potential impacts that can arise. The accuracy of the valuation, getting down to the last dollar so to speak, is much less important than being able to establish relative priority levels between the assets and the categories of assets; what applies to the individual assets will also apply to the categories they occupy. Media reports of high-profile breaches can help approximate loss potential, as can legal settlements that occur in various industry segments.
Now, the methodologies themselves can range from very complex to very simplistic. What matters is whether they are realistic: do they reflect the realities of our business and its operations? Quantitative analysis can be very precise, but it can also be very complex, and truth to tell, quantitative valuation can drive us toward attempting to get to the last dollar. This is not necessary; orders of magnitude are a much more realistic approach.
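To show what an order-of-magnitude valuation looks like in practice, the small helper below buckets an estimate by its power of ten rather than chasing exact dollars; it is only an illustrative sketch.

```python
import math

def magnitude_band(value: float) -> str:
    """Bucket a valuation by order of magnitude instead of exact dollars."""
    if value <= 0:
        return "negligible"
    return f"~10^{int(math.floor(math.log10(value)))} dollars"

# A $1.2M and a $3.7M estimate land in the same band; precision beyond
# that rarely changes how assets rank against one another.
for estimate in (1_200_000, 3_700_000, 48_000):
    print(f"${estimate:,} -> {magnitude_band(estimate)}")
```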
We can consider a qualitative approach based on business knowledge, goals, or arbitrary factors that may also apply. But as I've said before, it turns out that we need a hybrid of qualitative and quantitative to come up with a better estimate.
Quantitative values can be based on purchase or replacement price, for example, and we can also include value-added or intangible values as measures. For example, a $20,000 server, if down, could lead to millions of dollars in lost revenue. Intangible assets, such as intellectual property like trade secrets, and customer loyalty, also have a value, and we must seek to establish it if we're to have a total life-cycle cost-of-ownership valuation.
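The server example can be made concrete with a short downtime calculation; the hourly revenue figure and the outage length are assumptions chosen purely for illustration.

```python
# Hypothetical figures showing why replacement price alone understates value.
server_replacement_cost = 20_000   # the $20,000 server from the example
revenue_per_hour        = 250_000  # assumed revenue the server supports
expected_outage_hours   = 12       # assumed outage if it fails with no standby

lost_revenue = revenue_per_hour * expected_outage_hours
print(f"Replacement cost: ${server_replacement_cost:,}")
print(f"Potential lost revenue from one outage: ${lost_revenue:,}")
```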
Mr. Leo has been in Information Systems for 38 years, and an Information Security professional for over 36 years. He has worked internationally as a Systems Analyst/Engineer, and as a Security and Privacy Consultant. His past employers include IBM, St. Luke’s Episcopal Hospital, Computer Sciences Corporation, and Rockwell International. A NASA contractor for 22 years, from 1998 to 2002 he was Director of Security Engineering and Chief Security Architect for Mission Control at the Johnson Space Center. From 2002 to 2006 Mr. Leo was the Director of Information Systems, and Chief Information Security Officer for the Managed Care Division of the University of Texas Medical Branch in Galveston, Texas.
Upon attaining his CISSP certification in 1997, Mr. Leo joined ISC2 in a professional role as Chairman of the Curriculum Development Committee, and served in that role until 2004. During this time, he formulated and directed the effort that produced what became, and remains, the standard curriculum used to train CISSP candidates worldwide. He has maintained his standing as a professional educator, having trained and certified nearly 8,500 CISSP candidates since 1998 and nearly 2,500 candidates in HIPAA compliance certification since 2004. Mr. Leo is an ISC2 Certified Instructor.