CISSP: Domain 2, Module 1
This course is the first of two modules of Domain 2 of the CISSP, covering asset security.
The objectives of this course are to provide you with an understanding of:
- How to classify information and supporting assets using policies and categorization systems
- How to determine and maintain ownership using data management practices
- Protecting privacy, through regulations, standards, and technology
This course is designed for those looking to take the most in-demand information security professional certification currently available, the CISSP.
Any experience relating to information security would be advantageous, but is not essential. All topics discussed are thoroughly explained and presented in a way that allows the information to be absorbed by everyone, regardless of experience within the security field.
If you have thoughts or suggestions for this course, please contact Cloud Academy at firstname.lastname@example.org.
So we're going to progress into Section 2, where we cover the determination and maintenance of ownership. Now, clearly, for all of our data assets we're going to have to have good practices. So here are some recommendations.
We have a data policy, which starts out by defining the strategic goals for the information that we manage. Being a policy, it is defined at the management level and describes overall objectives; from it, of course, we have to develop more detailed plans to execute against in order to meet the policy. We need to clearly define roles and responsibilities so that those who work with information in our organization, which as a practical matter will be pretty much everybody, understand clearly what the role they're in and the function they fulfill require of them. We have to have data quality procedures. Ultimately, the data must always be trusted to make correct decisions, which means we have to safeguard its integrity specifically for that purpose, though that doesn't eliminate the need to protect its confidentiality and its availability. So we have to have ways of measuring and protecting data quality, whatever that involves. And to the extent that we need documentation, we need to produce it.
Now, nobody gets into IT to do copious amounts of perfect documentation, but we all know the value of documentation and how we suffer when we don't have it. So we have to have good-quality, current, reliable documentation. We have to come to some form of understanding that everyone will adhere to these data management practices across the enterprise; one part of the organization can't be very diligent while another is not. It needs to spread throughout as the standard on which our organization operates. And the tools that we're going to use, such as a database, a CMDB, or another method, we're going to have to plan, configure, and manage well, because if a tool is to be the single repository that we rely on, it must be properly designed, used, and maintained. Now, all of this program definition of course has to translate into actual day-to-day activities. To make sure that the data we have gone to the trouble to classify and categorize is maintained in the state it needs to be in, so that it's properly used and has the proper quality, integrity, and other trustworthy characteristics, we have to have good data management practices. That means defining detailed steps for how updates are processed, both to prevent errors from being introduced in the first place and to catch them afterwards through an ongoing data audit that looks at the various data elements and makes sure they conform to the standards we have set. As the data evolves, the other aspects of our system, such as security, protection, and privacy, have to evolve with it. And in keeping with our general defense-in-depth program guidance, we want to be sure that we put these controls in appropriate layers.
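The ongoing data audit described above can be sketched in a few lines of code. This is a hypothetical illustration only: the field names (`owner`, `classification`, `last_reviewed`) and the conformance rules are invented for the example, not drawn from any specific standard or product.

```python
# Hypothetical data-quality audit: check each record against defined standards.
# Field names and conformance rules are illustrative assumptions.

RECORD_STANDARDS = {
    "owner":          lambda v: isinstance(v, str) and len(v) > 0,
    "classification": lambda v: v in {"public", "internal", "confidential", "restricted"},
    "last_reviewed":  lambda v: isinstance(v, str) and len(v) == 10,  # e.g. "2024-01-31"
}

def audit_record(record):
    """Return the list of fields that fail to conform to the standards."""
    failures = []
    for field, conforms in RECORD_STANDARDS.items():
        # A field fails if it is missing or its value violates the rule.
        if field not in record or not conforms(record[field]):
            failures.append(field)
    return failures

# One conforming record and one that violates every standard:
good = {"owner": "finance", "classification": "confidential", "last_reviewed": "2024-01-31"}
bad  = {"owner": "", "classification": "secret"}

print(audit_record(good))  # []
print(audit_record(bad))   # ['owner', 'classification', 'last_reviewed']
```

In practice the point of the lecture is exactly this loop run on a schedule: every element checked against the standards that were set, with the failures fed back into correction procedures.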
At every opportunity, we need to make clear statements that crisply define the criteria for making this program a success: what the data should be, how it should be maintained, and how access is going to be determined. From all of this, we need to put together a clear policy and documented procedures that we make available to the workforce, so that they understand the requirements, and we publish the data that will be needed to the appropriate parties so that they know what is available, how to get at it, and in what form it comes, so they understand both the delivery mechanism and the mechanism for requesting access. So what should our data policy say? This would be the overarching guiding light, so to speak, for the entire program. Well, as with all policy, it should establish high-level principles to be implemented throughout the organization, establishing the basis for the framework we're going to use to manage all of this. As policy normally does, it addresses issues at the strategic level. And it should be clearly understood that this is truly a strategic process, a strategic initiative that needs to exist at the highest levels of the organization, because from there it drives downward all of the practices and procedures that govern how data is created or acquired, how it's managed, and how it's kept in the high-integrity form that we need, so that confidentiality, integrity, and availability are always given the definition and priority they need by the given organization based on the given use. But like all policy, it needs to be flexible and dynamic, so that the principles at work in it will still apply even as it adapts and evolves over time.
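One way to picture how classification labels can drive the access determination mentioned above is a simple ordered-label check. This is a minimal sketch under assumed rules: the four labels and the "read at or below your clearance" rule are illustrative assumptions for this example, not a mechanism prescribed by the CISSP material.

```python
# Illustrative sketch: deciding read access from classification labels.
# The label set, its ordering, and the clearance rule are assumptions
# made for this example.

LEVELS = {"public": 0, "internal": 1, "confidential": 2, "restricted": 3}

def may_read(user_clearance, data_classification):
    """A user may read data classified at or below their clearance level."""
    return LEVELS[user_clearance] >= LEVELS[data_classification]

print(may_read("confidential", "internal"))  # True
print(may_read("internal", "restricted"))    # False
```

The design point is that once data carries a classification label, the "requesting of access" mechanism can be evaluated consistently everywhere, rather than case by case.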
So throughout this entire process, we need to be thoughtful and consider the various attributes that you see on the slide in front of you. Cost, of course, figures in as an important component, but it's not the only one, and possibly not even the most important. We have to define ownership and custodianship: who owns the data, that is, who will enforce a policy for it, and who will see to it day-to-day that that policy is followed. We have to understand the privacy implications, because of the legal effects that can follow when privacy violations occur; that means we have to define what liabilities might be present and how we can cope with and mitigate them. We have to understand the sensitivity and criticality of the data elements we're dealing with: sensitivity being a characteristic that can increase or decrease over time, and criticality being what happens if the data is present, what happens if it's not, and what the impacts of each will be, both positive and negative. Without question, we need to understand what the applicable law has to say about all of this, because that drives much of our understanding of liability. And we need to understand the policy itself and the process for implementation. Now, that seems like a pretty straightforward point, but the fact is we can write a policy that says anything; we have to understand what it takes to actually implement it, what the mechanisms will be, and what the measurements will be. So we have to understand all of these different aspects to get our program fully defined and fully in place.
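The attributes just listed are, in practice, metadata that travels with each data asset. As a hedged sketch, they might be captured in a record like the one below; every field name and value here is an invented example, not a required schema.

```python
# Sketch: the attributes discussed above captured as metadata on a data asset.
# All field names and sample values are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class DataAssetRecord:
    name: str
    owner: str                # who enforces the policy for this asset
    custodian: str            # who follows the policy day-to-day
    sensitivity: str          # may increase or decrease over time
    criticality: str          # impact if the data is absent or wrong
    privacy_regulated: bool   # subject to applicable privacy law?
    estimated_cost: float     # cost to protect and maintain

asset = DataAssetRecord(
    name="patient_billing",
    owner="CFO",
    custodian="billing_dba",
    sensitivity="confidential",
    criticality="high",
    privacy_regulated=True,
    estimated_cost=12000.0,
)
print(asset.owner, asset.privacy_regulated)  # CFO True
```

Keeping these attributes alongside the data is what lets the audit, access, and liability questions above be answered per asset rather than by guesswork.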
Mr. Leo has been in Information Systems for 38 years, and an Information Security professional for over 36 years. He has worked internationally as a Systems Analyst/Engineer, and as a Security and Privacy Consultant. His past employers include IBM, St. Luke's Episcopal Hospital, Computer Sciences Corporation, and Rockwell International. A NASA contractor for 22 years, from 1998 to 2002 he was Director of Security Engineering and Chief Security Architect for Mission Control at the Johnson Space Center. From 2002 to 2006 Mr. Leo was the Director of Information Systems, and Chief Information Security Officer for the Managed Care Division of the University of Texas Medical Branch in Galveston, Texas.
Upon attaining his CISSP certification in 1997, Mr. Leo joined ISC2 in a professional role as Chairman of the Curriculum Development Committee, and served in this role until 2004. During this time, he formulated and directed the effort that produced what became, and remains, the standard curriculum used to train CISSP candidates worldwide. He has maintained his standing as a professional educator, training and certifying nearly 8,500 CISSP candidates since 1998, and nearly 2,500 in HIPAA compliance certification since 2004. Mr. Leo is an ISC2 Certified Instructor.