
Principles and life-cycles of cryptography


This course is the 4th of 6 modules within Domain 3 of the CISSP, covering security architecture and engineering.

Learning Objectives

The objectives of this course are to provide you with an understanding of:

  • The history of cryptography across the eras
  • The principles and life-cycles of cryptography
  • Public Key Infrastructure (PKI) and the components involved
  • Digital signatures and how they are used
  • Digital rights management (DRM) and associated solutions

Intended Audience

This course is designed for those looking to take the most in-demand information security professional certification currently available, the CISSP.


Any experience relating to information security would be advantageous, but not essential. All topics discussed are thoroughly explained and presented in a way that allows the information to be absorbed by everyone, regardless of experience within the security field.


If you have thoughts or suggestions for this course, please contact Cloud Academy at support@cloudacademy.com.


Now, regardless of the kind of cryptographic system we're using, there are core principles embodied in all of them. We have of course the primary one, confidentiality of the content. But there are mechanisms that we use, such as hashing, to ensure that we have integrity of the content that we're encrypting. 
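The integrity mechanism described above can be sketched in a few lines of Python using the standard library. This is a minimal illustration, not a complete protocol: the message and digest names are hypothetical.

```python
import hashlib

# Sketch: a cryptographic hash lets a receiver detect any change to the content.
message = b"Wire $100 to account 12345"
digest = hashlib.sha256(message).hexdigest()

# The receiver recomputes the hash over what arrived. A single changed byte
# in transit produces a completely different digest.
tampered = b"Wire $900 to account 12345"
assert hashlib.sha256(message).hexdigest() == digest      # content intact
assert hashlib.sha256(tampered).hexdigest() != digest     # tampering detected
```

Note that the hash alone proves integrity only if the digest itself is delivered over a trusted path; protecting the digest is where keyed mechanisms and signatures come in.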

Now, availability is of course part of this as the third element of the CIA triad, but availability is something to be allowed or prevented rather than a goal, like confidentiality, that is achieved directly through these mathematical technologies. Another element we need is non-repudiation. Put another way, non-repudiation means that neither the sender nor the receiver can deny having been in that role: the sender, having signed or encrypted something and sent it, and the receiver, having been able to decrypt it by whatever mechanism, each demonstrates their role and has no basis for repudiating it. Cryptography also provides us the ability to authenticate senders and receivers, and these same capabilities can be used in another form to provide access control functions.
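The authentication property mentioned above can be sketched with a keyed hash (HMAC) from the Python standard library. The key and message names are hypothetical. Note the caveat in the comments: HMAC authenticates, but because both parties hold the same key it cannot by itself provide non-repudiation; that requires an asymmetric digital signature, covered later with PKI.

```python
import hashlib
import hmac

# Sketch: a keyed hash (HMAC) authenticates a message to anyone holding the shared key.
shared_key = b"pre-shared secret between sender and receiver"
message = b"launch the deployment at 09:00"
tag = hmac.new(shared_key, message, hashlib.sha256).digest()

# The receiver recomputes the tag with the same key and compares in constant time.
expected = hmac.new(shared_key, message, hashlib.sha256).digest()
assert hmac.compare_digest(tag, expected)

# Caveat: because either party could have produced the tag, HMAC gives
# authentication and integrity but NOT non-repudiation. Only an asymmetric
# signature, made with a private key the sender alone holds, provides that.
```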

Now, like so many things in IT, cryptography and hashing both have life cycles. For a hashing function, the question is how easily collisions can be produced; that is, whether a matching hash can reliably be reproduced, without the original source, in an economically feasible fashion. As the hashing algorithm ages and technology advances, that attack tends to become easier, and so the algorithm is moved into a lower state: starting with acceptable, going into deprecated, moving to restricted, and then into legacy-use. When producing collisions becomes relatively easy, economically or in machine time, the algorithm moves down the path into, ultimately, the legacy mode. An encryption system occupies one of those same four states, but in that case the question is how easily the key is retrieved or the ciphered version of the input is broken. Again, this has to do with the mathematical functionality, the age of the algorithm, and the advance of technology, such as computer speeds, that can break it in a reasonable, economically feasible fashion.

So these are the four phases. We begin with acceptable, move to deprecated, then to restricted, and finally to legacy-use. Algorithms like the Advanced Encryption Standard are functioning today in the acceptable mode at all key lengths. At some point in the future, though, AES will move to deprecated mode: that is when technology may have advanced enough that a key can be recovered in an economically feasible way, while the protected information has not yet lost all of its value. There's no telling what that timeframe might be. The effort might drop from, say, millions of years, as it might be today, to a matter of days or months, and that might be sufficient to move it to deprecated mode. When it degrades even further, it will be moved to restricted mode.
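The four phases just described form a simple one-way progression, which can be modeled as a small state machine. This is purely an illustrative sketch; the class and function names are invented for this example and do not come from any standard.

```python
from enum import Enum

# The four lifecycle states from the lecture, in order of decline.
class AlgorithmStatus(Enum):
    ACCEPTABLE = 1   # approved for general use
    DEPRECATED = 2   # use discouraged; risk accepted for existing deployments
    RESTRICTED = 3   # permitted only in narrow use cases with extra controls
    LEGACY_USE = 4   # decrypt/verify old material only; no new protection

def downgrade(status: AlgorithmStatus) -> AlgorithmStatus:
    """Move an algorithm one step further down its life cycle."""
    if status is AlgorithmStatus.LEGACY_USE:
        return status  # terminal state: there is no further downgrade
    return AlgorithmStatus(status.value + 1)

assert downgrade(AlgorithmStatus.ACCEPTABLE) is AlgorithmStatus.DEPRECATED
assert downgrade(AlgorithmStatus.LEGACY_USE) is AlgorithmStatus.LEGACY_USE
```

The key design point the model captures is that movement is one-directional: once cryptanalysis or computing power weakens an algorithm, it never returns to a stronger state.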

Now, the difference between acceptable, deprecated, and restricted lies in the number of use cases in which you can apply the algorithm, plus the additional restrictions and controls you place on its use. Say you change the channels through which such traffic will flow, and apply a more restrictive, more robust set of controls around the channel and the functions through which you pass these encrypted messages. Ultimately the algorithm ends up in legacy-use, where it can only be applied in very isolated, very restricted cases. Eventually, with our current systems, all algorithms will end up in legacy mode at some point. DES, the Data Encryption Standard from the mid-'70s, lasted until the year 2000, when it was officially deprecated and taken out of certified use. That's rather a long time, 23 years, especially considering how widely the public at large came to use it. With the advance of technology, that length of time may not hold for all algorithms. But AES, officially sanctioned in 2002, has already lasted 17 years as of 2019, so we'll see. Future algorithms may last even shorter periods of time.

Now, our selection of cryptographic mechanisms should be based on a policy. That policy should incorporate the standards published by authorities like NIST, and we need strong, well-followed procedures to ensure that these mechanisms provide the strength we need and last as long as we need them to. One other thing the procedures must account for is a way to migrate smoothly and relatively painlessly from one cryptographic system to another, because we know that will have to be done at some point.

So we want to start with our cryptographic algorithm selection. What key options are we going to need? What are the transition plans for moving away from weakened or compromised algorithms and keys? What procedures are we going to use, to whom are they published, and how rigorously do we enforce them? We also need processes for key generation; for the escrowing of keys, which itself requires security methods and management; and for the ultimate destruction of keys, to ensure they are not reused except in archival types of activities. And of course we have to have an incident reporting process, because with humans involved, incidents will undoubtedly occur, whether due to attackers or to someone failing to follow a procedure properly.
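The key-management steps listed above (generation, escrow, destruction) can be sketched as a small Python class. This is a hypothetical illustration of the life-cycle stages, not a real key-management API; all names here are invented, and a production system would use an HSM or key-management service rather than in-memory storage.

```python
import secrets

class ManagedKey:
    """Illustrative sketch of one key moving through its life cycle."""

    def __init__(self, key_id: str, length_bytes: int = 32):
        self.key_id = key_id
        # Generation: keys must come from a cryptographically secure source.
        self.material = secrets.token_bytes(length_bytes)
        self.escrowed = False
        self.destroyed = False

    def escrow(self, store: dict) -> None:
        # Escrow: a recoverable copy held under separate security management,
        # so archived ciphertext can still be read after the key is retired.
        store[self.key_id] = self.material
        self.escrowed = True

    def destroy(self) -> None:
        # Destruction: the operational copy is erased so the key cannot be
        # reused for new encryption; only the escrowed copy remains.
        self.material = None
        self.destroyed = True

escrow_store: dict = {}
key = ManagedKey("db-backup-2019")
key.escrow(escrow_store)
key.destroy()
assert key.destroyed and key.material is None
assert escrow_store["db-backup-2019"] is not None  # archival copy survives
```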

Now, cryptography and cryptographic products and services undoubtedly travel through international channels, and countries have set up policies, procedures, and laws by which they govern the use of cryptography. For example, Russia will assign a VPN, which is of course an encrypted tunnel, to someone who requests one. The user must go through a justification process to obtain it, but the Russian government retains control and always has the ability to get into the tunnel and find out what that person is doing. The Chinese government has done the same. Now, that may seem a negative reflection, but many other governments, such as the French, do the same thing. Countries understand that cryptography can be used to export various types of information in a secure, impenetrable fashion, and they want visibility into this, so they put these rules in place, with restrictions on who can do what. Sometimes the restriction has to do with key length. For example, back in the early '90s the United States restricted the export of the DES algorithm to a 40-bit key length instead of the full 56 bits, which at the time was a sound algorithm judged adequate in its protective form.
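The difference between a 40-bit export key and the full 56-bit DES key is easy to quantify with back-of-the-envelope arithmetic. The search rate of one billion keys per second below is an assumption chosen purely for illustration, not a historical figure.

```python
# Exhaustive key search: every extra key bit doubles the work.
keys_per_second = 10**9  # assumed attacker rate, for illustration only

seconds_40 = 2**40 / keys_per_second  # ~18 minutes to try every 40-bit key
seconds_56 = 2**56 / keys_per_second  # ~2.3 years for the full 56-bit space

print(f"40-bit exhaustive search: {seconds_40 / 60:.0f} minutes")
print(f"56-bit exhaustive search: {seconds_56 / (3600 * 24 * 365):.1f} years")

# The 16-bit reduction makes the export keyspace 2**16 = 65,536 times smaller.
assert 2**56 // 2**40 == 65536
```

This is why the 40-bit limit was significant: the same hardware that would take years against full DES could exhaust the export-grade keyspace in minutes.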

About the Author

Mr. Leo has been in Information Systems for 38 years, and an Information Security professional for over 36 years. He has worked internationally as a Systems Analyst/Engineer, and as a Security and Privacy Consultant. His past employers include IBM, St. Luke's Episcopal Hospital, Computer Sciences Corporation, and Rockwell International. A NASA contractor for 22 years, from 1998 to 2002 he was Director of Security Engineering and Chief Security Architect for Mission Control at the Johnson Space Center. From 2002 to 2006 Mr. Leo was the Director of Information Systems, and Chief Information Security Officer for the Managed Care Division of the University of Texas Medical Branch in Galveston, Texas.


Upon attaining his CISSP license in 1997, Mr. Leo joined ISC2 in a professional role as Chairman of the Curriculum Development Committee, and served in that role until 2004. During this time, he formulated and directed the effort that produced what became, and remains, the standard curriculum used to train CISSP candidates worldwide. As a professional educator, he has trained and certified nearly 8,500 CISSP candidates since 1998, and nearly 2,500 in HIPAA compliance certification since 2004. Mr. Leo is an ISC2 Certified Instructor.