CIA Triad

Overview

Difficulty: Intermediate
Duration: 43m
Students: 13
Ratings: 5/5
Description

This course is the first of three covering Domain 2 of the CSSLP, which addresses the topic of policy decomposition.

Learning Objectives

  • Understand the fundamental concepts of information security and operational security
  • Learn about the CIA Triad
  • Learn about Triple A (authentication, authorization, and accounting) services and how they help keep software available and safe
  • Understand the internal and external requirements for building secure software

Intended Audience

This course is designed for those looking to take the Certified Secure Software Lifecycle Professional (CSSLP) certification, or for anyone interested in the topics it covers.

Prerequisites

Any experience relating to information security would be advantageous, but is not essential. All topics discussed are thoroughly explained and presented in a way that allows the information to be absorbed by everyone, regardless of experience within the security field.

Feedback

If you have thoughts or suggestions for this course, please contact Cloud Academy at support@cloudacademy.com.

Transcript

And here, of course, we have the much-discussed CIA Triad. Now, these represent the three most essential qualities of data that must be protected throughout its life cycle to ensure its continued trustworthiness and utility.

Now, as you know, these are the most vitally important qualities of the data, not necessarily of the system; they are the things that the system handling the data should protect, at a level necessary for and reflective of the data's value. In like manner, the systems and applications processing such data must possess the mechanisms and characteristics that will both produce and protect these data qualities from exposure, corruption, or loss.

For this to be the case, those systems and applications must be designed and built with those notions firmly in mind. That being so, designers, developers, and programmers may well be faced with the challenging task of finding a balance between operational utility and effective security in the end product. Part of that guidance is in the CSSLP CBK, which contains the knowledge about security essentials, of course, but goes far beyond that by placing these essentials in the context of the development process of an application or a system.

By giving security its proper place in that process, and then describing how to integrate the two (that is, the functional utility to meet the business need and the security necessary to protect the business asset, the data that will be processed by the system or application), the CSSLP is better equipped to contribute information and guidance to the team about how to achieve that optimal balance. And so here, we're going to talk about the CIA Triad and the data life cycle phases, and how the confidentiality, integrity, and availability attributes must be applied and attained in each and every phase.

For example, when we talk about create or receive, there at the top right, we want to be sure that our data shows authenticity of origin, whether it was created by us or obtained from another source. As we move around the cycle to storage, we need to secure that storage while assigning the appropriate classification and categorization to the data that we've either created or acquired.

Moving to usage, we need to be sure that all cases of usage are authorized, that they're based on policy, and that there is a defined need to know for each user accessing that data. Accompanying usage is, of course, the sharing of this data. This sharing must be equally authorized and authenticated in every case, and there should be a need-to-know parameter associated with each instance of sharing.

At some point, the data, of course, moves into more of an archival phase. That is to say, it moves from being completely current and constantly in use to being accessed on a much less frequent basis, perhaps even a bit deprecated. But here, we still need to emphasize the need for secure and compliant archival and retrieval storage, and processes for accessing the data once it has moved into this phase.

Its final phase is disposition, and this usually means that, somehow or another, the data will be destroyed or rendered unusable, indecipherable, or otherwise meaningless in its final form to anyone who might access it. And yet, at this point we often hear about violations and breaches that take place because of inadequate measures being taken. So what we need are assured methods of disposal to obviate the residual risk that this data will fall into the wrong hands in a usable form and retain some residual value for them.
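
Before moving on, here is a compact recap of the cycle as a sketch. The phase names and the concern paired with each phase follow the walkthrough above; the code structure itself is purely illustrative, not anything defined by the CSSLP CBK.

```python
from enum import Enum, auto

# The data life cycle phases just discussed, each paired with the
# primary security concern named for it in the walkthrough above.
class Phase(Enum):
    CREATE_OR_RECEIVE = auto()
    STORE = auto()
    USE = auto()
    SHARE = auto()
    ARCHIVE = auto()
    DISPOSE = auto()

PRIMARY_CONCERN = {
    Phase.CREATE_OR_RECEIVE: "authenticity of origin",
    Phase.STORE: "secure storage; proper classification and categorization",
    Phase.USE: "authorized, policy-based access with a defined need to know",
    Phase.SHARE: "authorized, authenticated sharing, with need to know per case",
    Phase.ARCHIVE: "secure, compliant archival and retrieval",
    Phase.DISPOSE: "assured destruction that leaves no usable residue",
}

for phase in Phase:
    print(f"{phase.name}: {PRIMARY_CONCERN[phase]}")
```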

So a brief review is in order for what we need to discuss further on. Now, confidentiality, of course, means preserving the approved protection mechanisms and employing them in proper configuration so that they perform as intended throughout the entire life cycle. In the case of confidentiality, the compromise being primarily unauthorized disclosure, we want to be sure that we have proper protections in place that will see to this requirement being met.

Now privacy, as a subset of confidentiality, refers to data about persons who can be identified from those particular artifacts. This is included in confidentiality, but it has special requirements for protection in accordance with various regulations. The compromise that goes along with confidentiality is a breach, that is, the unauthorized release of this information in a human-readable form to an unauthorized party. And so that is the primary attack type or compromise that we're working against.

So, let's deconstruct this a bit and examine the basic criteria applied to grant access and what they reflect. First we have the what: the designated level and state of the information object in question. These data objects will be in one of three states: Data-in-Motion, or DIM for short; Data-at-Rest, or DAR; and Data-in-Use, or DIU. Then comes the who: this must be a clearly defined subject with a properly defined need to know in order for the subject to even attempt to get access to the data object in question, authorized by the proper authority. That is followed by the how: the explicit authorization for access to the information, given by a process separate from the one that defines what the need to know of that particular subject is. And then we have the where: defined limits regarding permissions on the subject, object, location, and context; in the planet-wide networked world of the internet, this where becomes a key element of the decision process.
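
To make those four criteria concrete, here is a minimal sketch, assuming a simple in-memory model. Every name in it (Subject, DataObject, access_permitted) is hypothetical and purely illustrative; nothing here is defined by the CSSLP CBK itself.

```python
from dataclasses import dataclass

@dataclass
class DataObject:
    name: str
    classification: str  # the "what": designated level of the object
    state: str           # also the "what": "DIM" (in motion), "DAR" (at rest), or "DIU" (in use)

@dataclass
class Subject:
    user_id: str
    need_to_know: set[str]    # the "who": objects this subject has a defined need to know
    authorized_for: set[str]  # the "how": explicit grants made by a separate authority
    location: str             # the "where": location/context of the access request

def access_permitted(subject: Subject, obj: DataObject,
                     allowed_locations: set[str]) -> bool:
    """Grant access only when every criterion is satisfied."""
    return (
        obj.name in subject.need_to_know           # who: defined need to know
        and obj.name in subject.authorized_for     # how: separate, explicit authorization
        and subject.location in allowed_locations  # where: permitted location and context
    )

# Hypothetical usage: an analyst in the office requesting a report at rest.
analyst = Subject("a.jones", {"q3-report"}, {"q3-report"}, "office")
report = DataObject("q3-report", "confidential", "DAR")
print(access_permitted(analyst, report, {"office", "vpn"}))  # True
```

Note that need to know and authorization are held separately, mirroring the point above that authorization is granted by a process separate from the one that defines the subject's need to know.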

From confidentiality, we move on to integrity. Now, as you would expect, integrity means information assets conform to assigned parameters and perform as expected. It of course includes things like quality, authenticity, known and authentic origin, and other characteristics that ultimately mean the data object in question is trustworthy and meets its utility requirements. In this case, a successful attack on integrity, or a compliance failure that would impact it, produces corruption or contamination of the data object. And this would include results such as disinformation or misinformation, as well as outright falsification and corruption.

Now, some integrity assurance measures that we have include sound design techniques, preferably ones that employ standards that are widely accepted and proven good; effective configuration management processes, to ensure that we guard against any sort of flaw or bug being introduced by a less-than-adequate process; and verification and testing processes, which of course must be properly defined, properly positioned to test the right thing in the right place, and correctly done.

There are also pre- and post-implementation reviews: first to understand what we're faced with, and second, to see what results we actually attained. And then, of course, we have to make sure that there are limitations, bounds, and confinement controls placed around the data to ensure that the data changes only in ways it is authorized to change, and that we have ways of detecting any change that has been made, whether authorized and appropriate or otherwise.
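
One of those detection measures can be as simple as comparing cryptographic digests. Here is a minimal sketch of that idea in Python; the data values are made up for illustration.

```python
import hashlib

def digest(data: bytes) -> str:
    """Return the SHA-256 digest of a data object as a hex string."""
    return hashlib.sha256(data).hexdigest()

# Record a baseline digest while the object is in a known, trustworthy state.
original = b"account balance: 1000.00"
baseline = digest(original)

# Later, any change to the object, authorized or otherwise, alters the digest.
current = b"account balance: 9000.00"
if digest(current) != baseline:
    print("Integrity check failed: the data object has changed.")
```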

Last, but certainly not least, we have the A of the CIA Triad: availability. In simple terms, this means that the resource or object is available to an authorized user when and where it's needed, and that it's in the proper format and state when it gets there. This of course includes the fact that integrity is intact and that the data, when it's being accessed and used by the authorized subject, is in a trustworthy, known state. Now, here are some examples of things that affect resource availability. Over the last couple of decades, we've had various events such as the Morris Worm, which over a period of several hours brought down several thousand Unix computers.

We have the IBM Christmas Tree Virus, which was, in fact, a mistake made internally to the company, but the mistake was so pervasive that it affected over 205,000 users in less than four hours, planet-wide. We have, of course, the ever-present trouble of a misconfigured backbone router and the other ancillary devices that depend on it. And we have self-healing networks and fault-tolerant systems; when they don't work, availability is damaged. A way of recovering from all of this is to place redundant capabilities in various critical areas and over various critical functions.

Now, in some organizations, a loss of resource availability may mean a potential loss of thousands to millions of dollars, as might happen in banks and brokerages. But in others, it could mean the loss of life, such as in a hospital or during flights with our space agency, NASA. There are, of course, cases where a flaw in the software produces little impact, but cases like that are getting rarer all the time.

Now, in this internet world, we're evolving into a state where full-time availability, 24 hours a day, seven days a week, is becoming the expected norm. And yet computers still require time to be maintained, as well as to recover from various flaws or attacks. Consider the time that a system is permitted to be unavailable, starting at 90%, or one nine, of availability: you see that we actually have a rather large amount of time available throughout the year for any of these types of events, including standard maintenance.

In fact, if we look at the downtime per year at 90%, we have more than a month: 36.5 days. When we bump that up to 99%, or two nines, 36.5 days drops to 3.65 days, which is a rather drastic decrease. And yet 3.65 days, taken by itself, still seems like a considerable amount of time. Now, as we add a nine each time, cutting the allowed downtime by a factor of 10, we go from 3.65 days at two nines to 8.76 hours at 99.9%. Taking another step, to four nines, it's 52.56 minutes, less than an hour per year. And it keeps dropping, down to seven nines of availability with just 3.15 seconds of downtime per year.
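
Those figures are simple arithmetic on the fraction of the year a system is allowed to be down, so they're easy to verify. Here is a quick sketch (assuming a 365-day year) that reproduces the whole table from one nine to seven nines.

```python
# Downtime allowed per year at each availability level ("number of nines").
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes

for nines in range(1, 8):
    availability = 1 - 10 ** -nines            # 0.9, 0.99, ..., 0.9999999
    downtime_minutes = MINUTES_PER_YEAR * 10 ** -nines
    print(f"{nines} nine(s), {availability:.7%} available: "
          f"{downtime_minutes:,.2f} minutes of downtime per year")
```

At one nine this prints 52,560 minutes (36.5 days); at seven nines, about 0.05 minutes, the 3.15 seconds quoted above.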

Now, as I said, the expectation is that the internet-based applications we are increasingly using will be there 24 hours a day, seven days a week, basically any time you sign on to any system anywhere or browse. And yet the operators of all the computers being used must come up with creative ways to ensure that, where there is a contractual relationship, the service actually can be there virtually any time you touch a key and sign into the application.

Now, speaking of contracts, one of the things we have to be careful of is what the word availability and its counterpart, uptime, may mean in contract language. This is something we have to bear in mind during the requirements development phase, because it is what the requirements will translate into once the software is in use. Availability and uptime are not necessarily the same in terms of how they're defined in contract language, so we must be certain we understand which one is being committed to, and design the software with that in mind.

About the Author
Ross Leo
Instructor
Students: 4042
Courses: 55
Learning Paths: 10

Mr. Leo has been in Information Systems for 38 years, and an Information Security professional for over 36 years. He has worked internationally as a Systems Analyst/Engineer, and as a Security and Privacy Consultant. His past employers include IBM, St. Luke's Episcopal Hospital, Computer Sciences Corporation, and Rockwell International. A NASA contractor for 22 years, from 1998 to 2002 he was Director of Security Engineering and Chief Security Architect for Mission Control at the Johnson Space Center. From 2002 to 2006, Mr. Leo was the Director of Information Systems and Chief Information Security Officer for the Managed Care Division of the University of Texas Medical Branch in Galveston, Texas.

Upon attaining his CISSP certification in 1997, Mr. Leo joined ISC2 in a professional role as Chairman of the Curriculum Development Committee, and served in this role until 2004. During this time, he formulated and directed the effort that produced what became, and remains, the standard curriculum used to train CISSP candidates worldwide. He has maintained his standing as a professional educator, training and certifying nearly 8500 CISSP candidates since 1998, and nearly 2500 in HIPAA compliance certification since 2004. Mr. Leo is an ISC2 Certified Instructor.