CISM: Domain 1 - Module 2
The course is part of this learning path
In this course, we start off by looking at constraints that may prevent us from reaching our security objectives before moving on to how to form an action plan. This involves carrying out a gap analysis to see where you are and where you want to be (with regard to information security, of course) and then putting a plan in place to close the gap.
We then need to implement ways to measure progress towards closing the gap and we will look at that in the metrics and monitoring lecture. Finally, we look at the six strategic outcomes which help us to define what success looks like.
- Understand the potential constraints that may impede our security measures
- Learn how to create an action plan to reach our security goals
- Learn how to measure progress through metrics and monitoring
- Understand how we define success
This course is intended for anyone preparing for the Certified Information Security Management exam or anyone who is simply interested in improving their knowledge of information security governance.
Before taking this course, we recommend taking the CISM Foundations learning path first.
We continue with Section 32, Information Security Governance, and our discussion turns to metrics and monitoring. It's often been said, and attributed to many different sources, that if you can't measure it, you can't manage it. And as it turns out, this really is true.
When we look at what the action plan needs, we find that this holds true even here, because the plan requires monitoring and measurement of our progress so that we can report on, and know exactly, where we are along the road of our strategy. This should tell us whenever we have achieved a milestone or, for that matter, whenever we've missed one.
While monitoring is taking place, we should account for the cost of any mid-course corrections, which are often needed. We will need a process that distills this information into a usable form, consumable by operational and senior management.
Any metric we choose needs to describe whatever it is measuring accurately and in context, because it delivers information we need to manage our operations effectively. It will, of course, have to meet IT security management requirements derived from the strategy. In the larger picture, we'll need to meet business process owners' needs and provide senior management what it needs to know for informed decision-making.
Now, metrics themselves should be predictive, and they need to be actionable, so that if we encounter any need for a mid-course correction, we are able to make it based on the information we've gathered. But the question of whether metrics really help means we have to be selective in which metrics we choose.
Security metrics alone can't tell us how safe we really are or whether our programs are effective. We need to develop a set of metrics that represents management's requirements. One source is full audits and risk assessments, but these only provide a historical view and may not be able to guide our day-to-day decisions. So we will need to augment them with other metrics.
So in order to get security metrics that are appropriate and that actually assist our efforts, we need to look at some examples. These say little about security and yet are monitored quite frequently: downtime due to viruses, number of system penetrations, penetration impacts and resulting losses, recovery times, number of vulnerabilities that network scans reveal, and percentage of servers patched. These don't meet the predictive or actionable criteria; beyond identifying areas where action is needed, they don't go quite far enough in telling us all that we need to know.
Some more useful metrics might include probabilities of penetration, a list of exposures that need to be mitigated, the value at risk, the return on security investment, and our annualized loss expectancy numbers. These do meet the predictive and actionable criteria.
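Two of these predictive metrics reduce to simple formulas: annualized loss expectancy (ALE) is the single loss expectancy times the annual rate of occurrence, and return on security investment (ROSI) compares the risk reduction a control delivers against its cost. Here is a minimal sketch of both; the dollar figures are hypothetical examples, not recommendations.

```python
def ale(single_loss_expectancy: float, annual_rate_of_occurrence: float) -> float:
    """ALE = SLE x ARO: the expected yearly loss from a given threat."""
    return single_loss_expectancy * annual_rate_of_occurrence

def rosi(ale_before: float, ale_after: float, cost_of_controls: float) -> float:
    """ROSI = (risk reduction - control cost) / control cost."""
    risk_reduction = ale_before - ale_after
    return (risk_reduction - cost_of_controls) / cost_of_controls

# Hypothetical example: a breach costing $50,000 per occurrence, expected
# twice a year; controls cut that to once every two years for $30,000.
baseline = ale(50_000, 2.0)    # 100000.0 expected annual loss
mitigated = ale(50_000, 0.5)   # 25000.0 after controls
print(rosi(baseline, mitigated, 30_000))  # 1.5 -> 150% return on the spend
```

Because both outputs are in dollars per year, they can be compared directly with control costs, which is exactly what makes them actionable.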
In looking for guidance on how to develop or adopt metrics, we can use some of these as guides: ISO/IEC 27004:2016; COBIT 2019, which describes over 150 metrics with guidance on how to make your choices; and the Center for Internet Security, a great source of information, which published a guide with 28 security metrics.
Looking to the NIST Special Publication 800 series, SP 800-55 Revision 1 (2008) offers an approach to the selection and implementation of metrics. Now, our impact assessments can only be validated after a negative event has occurred. Simulated attacks can of course help, but you need to conduct many of them, covering a range of scenarios, before they provide real value.
Now, two key points. Some organizations are attacked more frequently than others. Also, there is a strong correlation between good security management and fewer incidents. So we need to be sure that we don't simply settle for what seem to be good metrics because these may not prove to be relevant or effective in helping us manage our programs.
As always, good metrics require clear goals that they help illuminate. The measurements have to align with achieving these goals, and metrics operate at one of three levels (strategic, tactical, or operational), typically functioning best at only one of them.
Now, the steps we can employ to capture usable security metrics build on one another. Reading from the bottom up, the foundation is strong upper-level management support, which encourages buy-in on security from the rest of the organization. So we begin with that as our foundation.
Moving up to the next level, we have practical security policies and procedures; these stress the need for realistic guidance documents and the authority to enforce them. On top of that, we define or select quantifiable performance metrics that represent our IT security performance goals. And at the very top sits results-oriented metrics analysis: the periodic analysis of metrics data. This is done for a few reasons, one being to ask whether they really tell us what we need to know. Are we measuring what we need to measure and producing actionable intelligence from it?
Any program of this type needs to be reviewed periodically so that we know that what we think we're doing, what we think we're getting, the actions that we think we're taking are actually having the positive effects that we want. So it's constantly an adjustment, constantly under examination to ensure that we remain effective.
As always with any program, measuring how well we align with strategic goals is a vital part of our program of measurement and accomplishment. We should always seek to measure security cost performance against business-goal cost performance standards.
We want to be sure that what we spend is appropriate and in line with the value of the assets that we're protecting. So we should never seek to spend more on security than the assets we're protecting are themselves worth. We should define our security goals in business terms so that what the business itself emphasizes as priorities, we also emphasize as priorities.
What we build has to be traced to a real need. Security for its own sake is oftentimes worse than worthless: when we pursue security for its own sake, we lose focus on what the business specifies as our priorities. So we always need to set goals that track closely to a real need. For example, if we justify acquiring a firewall, our goal might be to not let competitors steal a customer list; the requirement, build a three-layer defense-in-depth control; and the control, roll out three firewalls. That might be one solution.
On the other hand, we need to be sure that what we're doing is in line with the goals we're trying to meet and the value of the assets we're trying to protect, in this case a customer list. The indicators of good alignment between security and business include: security enabling specific business activities; business activities that can't be performed without risk being managed, ensuring they stay within our risk tolerance and risk appetite; security teams that listen to business owners; and business and security goals that are clear, well-defined, and agreed to by both parties.
Security activities therefore have to be mapped to business goals, and the steering committee may have to serve as arbitrator, but it typically must be made up of key executives not directly involved with the issues in play. Part of what we have to measure is, generally speaking, how we're doing at risk management. It can be difficult to measure the effectiveness of managing risk because it's not always a black-and-white number.
Quite a bit of it ends up in the gray. We may therefore have to settle for indicators that correlate to a risk level. Some examples:
- How well defined the risk appetite is
- How complete the overall security strategy is
- The number of defined mitigation goals
- Processes for reducing adverse impacts
- A continuous risk management process that covers all business-critical systems
- A healthy ratio of known versus unknown security incidents
One of the best indicators might be how much negative impact over a year exceeds acceptable levels. For example, only 5% of incidents exceeding acceptable levels might be a very good indicator of how you're doing.
Both the frequency and the impact of incidents should obviously go down, and we should express these in financial terms for the benefit of senior management, because boiling it down to the numbers is what they need in order to manage it and determine effectiveness.
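As a sketch of what that kind of reporting might look like, here is a small example that summarizes incident frequency and total financial impact per year, along with the percentage of incidents that exceeded an acceptable-impact threshold. The incident records and the $20,000 threshold are invented for illustration.

```python
ACCEPTABLE_IMPACT = 20_000  # assumed risk-tolerance threshold in dollars

# (year, impact in dollars) for each incident -- fabricated sample data
incidents = [
    (2023, 5_000), (2023, 45_000), (2023, 12_000), (2023, 8_000),
    (2024, 3_000), (2024, 15_000), (2024, 7_000),
]

def summarize(year: int) -> dict:
    """Frequency, total financial impact, and % of incidents over threshold."""
    impacts = [cost for y, cost in incidents if y == year]
    exceeded = [c for c in impacts if c > ACCEPTABLE_IMPACT]
    return {
        "frequency": len(impacts),
        "total_impact": sum(impacts),
        "pct_exceeding_acceptable": 100 * len(exceeded) / len(impacts),
    }

# Both frequency and total impact should trend downward year over year.
print(summarize(2023))  # frequency 4, total impact 70000, 25.0% over threshold
print(summarize(2024))  # frequency 3, total impact 25000, 0.0% over threshold
```

Reporting the totals in dollars, rather than raw counts, is what lets senior management weigh them against other business numbers.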
A good question is also, are you delivering value? Now, value delivery is when investment in security provides the best support for business goals. An acceptable level of risk is reached at the lowest reasonable cost. Our key risk indicators and our key goal indicators are how we measure and show our value delivery. These check our security activities to see how we're reaching the goals at that lowest reasonable cost.
One of our principles is that the cost of security should be proportional to the value of the asset itself. The security resources assigned are based on the degree of risk and impact, becoming more or less robust in proportion to what those are calculated to be.
Controls are designed based on clear goals, with the kind of metrics that establish whether or not those goals are being achieved. There should be enough controls in place to reach the desired level of risk; part of our job is calculating whether what we have needs augmentation or change, or whether it does not. And then we have to periodically test all of these things to make sure that what we think we are perceiving and measuring is in fact what is being delivered.
So the question also comes, how to know when you're managing your resources well? There are many different measures that tell us when things are going wrong. What would be the indications to show that things are going properly?
We look at indications of effective security resource management like the following. The same problem is not recurring, so we don't keep rediscovering the same problem and, in effect, reinventing the wheel. We know that we are effectively capturing knowledge, so that our lessons learned are not only captured but committed to enhancing our program elements in the future.
We have many standardized processes so that the same things are being done the same way over and over again to avoid problems. We have clearly defined roles that are carrying out all these tasks. Our project plans incorporate information security as a fundamental, one could say natural part of what they're doing.
Security activities address a high percentage of information assets and threats. This is of course the very heart of the program we're dealing with, and effectively addressing these demonstrates that we are in fact pursuing the right goals. Security has appropriate authority; this is more a qualitative measure than a quantitative one, but it means that we have the authority to act and to solve the problems that are within our domain to solve. And the per-seat cost of security is relatively low.
So are we performant yet? The effectiveness of the security machinery is measured in performance terms, as is every other aspect of the business it is a part of. And we have defined indicators to show effective performance, demonstrating that it is in fact being accomplished.
We have a respectably short period of time between detection and reporting of incidents. The frequency of unreported incidents trends downward. We are able to compare ourselves favorably to other organizational units, showing reduction in areas where reduction means improvement, or increase where increase means improvement.
We're able to determine the effectiveness of our controls, meaning that we have proper measurement and analysis mechanisms in place and that we are gathering our metrics from the proper places in the processes. There are no unexpected or undetected security events.
There will always be unexpected ones, but this particular metric demonstrates that we have the mechanisms in place to capture these events in a timely way. We're performing consistent log reviews, keeping tabs on things so that we can act upon them promptly and reduce them.
We have business continuity and disaster recovery test results. And these show that our plan does in fact work. Our key controls are regularly monitored and acted upon when the indications are such that they need to be. And we are achieving clearly defined criteria for our metrics. We are measuring the physical and information security fusion.
In other words, we're looking at how the convergence of this program is going on. So how do we measure or establish key goal indicators for our convergence process? We find that there are no discernible gaps in the asset protection program.
In looking closely, we find that there are no security overlaps which complicate and confuse things. We have assurance activities that are integrated, with the elements delivering their metrics as measurements. We have well-defined roles and responsibilities, so that everyone knows what everyone else is doing and what they themselves should be doing.
Our assurance providers understand their relationships to other functions, and we have effective communication between these assurance functions.
Mr. Leo has been in Information Systems for 38 years, and an Information Security professional for over 36 years. He has worked internationally as a Systems Analyst/Engineer, and as a Security and Privacy Consultant. His past employers include IBM, St. Luke’s Episcopal Hospital, Computer Sciences Corporation, and Rockwell International. A NASA contractor for 22 years, from 1998 to 2002 he was Director of Security Engineering and Chief Security Architect for Mission Control at the Johnson Space Center. From 2002 to 2006 Mr. Leo was the Director of Information Systems, and Chief Information Security Officer for the Managed Care Division of the University of Texas Medical Branch in Galveston, Texas.
Upon attaining his CISSP certification in 1997, Mr. Leo joined ISC2 in a professional role as Chairman of the Curriculum Development Committee, and served in this role until 2004. During this time, he formulated and directed the effort that produced what became, and remains, the standard curriculum used to train CISSP candidates worldwide. He has maintained his standards as a professional educator, training and certifying nearly 8,500 CISSP candidates since 1998 and nearly 2,500 in HIPAA compliance certification since 2004. Mr. Leo is an ISC2 Certified Instructor.