The course is part of this learning path
This course is the 3rd of 6 modules within Domain 3 of the CISSP, covering security architecture and engineering.
Learning Objectives
The objectives of this course are to provide you with an understanding of:
- Vulnerabilities of security architectures, including client-based and server-based systems, large-scale parallel data systems, and distributed systems
- Cloud Computing deployment models and service architecture models
- Methods of cryptography, including both symmetric and asymmetric
Intended Audience
This course is designed for those looking to take the most in-demand information security professional certification currently available, the CISSP.
Prerequisites
Any experience relating to information security would be advantageous, but not essential. All topics discussed are thoroughly explained and presented in a way allowing the information to be absorbed by everyone, regardless of experience within the security field.
Feedback
If you have thoughts or suggestions for this course, please contact Cloud Academy at support@cloudacademy.com.
Now, one example of a symmetric algorithm is the Caesar cipher. This is a manual method utilizing a mono-alphabetic substitution cipher, involving a three-character left shift. It was conceived in the first century BC by Julius Caesar.
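To make the idea concrete, here is a minimal Python sketch of a three-character left shift over the 26-letter alphabet. The function names and the sample message are purely illustrative, not anything from the course materials.

```python
def caesar_encrypt(text, shift=3):
    """Mono-alphabetic substitution: replace each letter with the one `shift` places to its left."""
    out = []
    for ch in text.upper():
        if ch.isalpha():
            out.append(chr((ord(ch) - ord('A') - shift) % 26 + ord('A')))
        else:
            out.append(ch)
    return ''.join(out)

def caesar_decrypt(text, shift=3):
    return caesar_encrypt(text, -shift)

print(caesar_encrypt("ATTACK AT DAWN"))                  # XQQXZH XQ AXTK
print(caesar_decrypt(caesar_encrypt("ATTACK AT DAWN")))  # ATTACK AT DAWN
```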
The Spartan Scytale made use of a more physical, or mechanical, method. It is a tool used to perform a transposition cipher, consisting of a cylinder, often the command staff of the field general, with a strip of parchment wound around it, down the long axis of which the message was written. The recipient must therefore have a rod of the same diameter as the one on which the parchment was originally wound and the original message written, so that they can rewrap the parchment containing the message around it, the letters will line up, and the message will then be readable. It was quick to employ, but for people who knew how the system worked, it was also relatively easy to break. Now, this one came about in the third century BC, predating the Caesar cipher.
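A scytale can be modeled as a simple columnar transposition keyed by the size of the rod. The sketch below is a rough illustration under that assumption; the function and parameter names are invented for clarity.

```python
def scytale_encrypt(message, letters_per_turn):
    """Write the message across the rod in rows, then read down the columns (unwinding the strip)."""
    width = letters_per_turn
    padded = message.ljust(-(-len(message) // width) * width)       # pad to a full grid
    rows = [padded[i:i + width] for i in range(0, len(padded), width)]
    return ''.join(row[c] for c in range(width) for row in rows)

def scytale_decrypt(ciphertext, letters_per_turn):
    """Only a rod of the same size lines the letters back up into the original rows."""
    height = len(ciphertext) // letters_per_turn
    cols = [ciphertext[i:i + height] for i in range(0, len(ciphertext), height)]
    return ''.join(col[r] for r in range(height) for col in cols).rstrip()

ct = scytale_encrypt("HELP ME I AM UNDER ATTACK", 5)
print(scytale_decrypt(ct, 5))   # HELP ME I AM UNDER ATTACK
```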
Now, let's fast forward about 2,000 years, to the German Enigma: a poly-alphabetic ciphering system making use of a typewriter keyboard, electrical circuitry, a plug board, and interchangeable rotating discs, made famous by its employment by the Germans during the Second World War. Then let's fast forward again, to the late 1960s and early 1970s and the work that led to the Data Encryption Standard. Horst Feistel at the IBM labs developed a family of algorithms whose core principle was taking a block of plaintext input and dividing it in half; it began with that, and then built its mathematics around it. This was intended for processing sensitive, but unclassified, data. The work was undertaken by Feistel at the IBM labs at the request of the National Bureau of Standards, as NIST was then known, and the NSA. It was published as an official standard in the Federal Information Processing Standards document FIPS 46 in 1977, after it was officially adopted as the Data Encryption Standard.
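That core Feistel idea, splitting the block in half and repeatedly mixing one half into the other, can be sketched in a few lines. This is only a toy illustration of the structure; the round function and keys below are invented and bear no resemblance to DES's real S-boxes or key schedule.

```python
def feistel_encrypt(left, right, round_keys, round_function):
    """Generic Feistel structure: in each round, mix the right half into the left half and swap."""
    for k in round_keys:
        left, right = right, left ^ round_function(right, k)
    return left, right

def feistel_decrypt(left, right, round_keys, round_function):
    """The same structure run with the round keys in reverse order undoes the encryption."""
    for k in reversed(round_keys):
        right, left = left, right ^ round_function(left, k)
    return left, right

# Toy 32-bit round function and keys, purely for illustration.
toy_f = lambda half, key: (half * 31 + key) & 0xFFFFFFFF
keys = [0x1A2B, 0x3C4D, 0x5E6F]

ct = feistel_encrypt(0xDEADBEEF, 0x01234567, keys, toy_f)
pt = feistel_decrypt(*ct, keys, toy_f)
print(pt == (0xDEADBEEF, 0x01234567))   # True
```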
Now, DES lasted as the official standard from 1977 until the latter half of the 1990s, during which time several experiments were carried out that succeeded in breaking the cipher. In 1994, the first experimental cryptanalysis of DES was performed using linear cryptanalysis, one method of trying to break the cipher. In 1997, the DESCHALL project broke a message encrypted with DES for the first time, but it took six months to do it. In 1998, the Electronic Frontier Foundation built a DES cracker called Deep Crack, and it broke a DES key (bear in mind that the DES key is 56 bits in length) in 56 hours. A year later, Deep Crack and distributed.net broke a DES key in 22 hours and 15 minutes. Then, in 2017, a chosen-plaintext attack using a rainbow table was found to recover a DES key for a single specific chosen plaintext in 25 seconds.
Now, the progress in methods shows just how susceptible small-key ciphers can be. As we go from 1994 to 2017, the time to break DES goes from six months, to 56 hours, to less than 23 hours, down to 25 seconds. What does that mean for future encryption systems? Well, probably nothing good. But it does mean that the technology of developing ciphers must continue to advance at a higher pace than the ability to crack them. So, following these events, in the year 2000 it was decided by NIST and the NSA that 56-bit DES had outlived its usefulness and would no longer be certified as an official algorithm; 1985 was the last year that the certification was formal, and in 2000 it was dropped. They announced a contest looking for a new cipher specifically to replace the now retired 56-bit DES, and in May of 2002 they settled on this one, Rijndael. It was accepted as the standard, and it comes with key and block sizes of 128, 192, or 256 bits, utilizing these four operations. To date, Rijndael has shown that these advancements have indeed kept encryption technology ahead of the technology for breaking it. But bear in mind that all symmetric ciphers are susceptible to brute-force attacks, which are simply an exhaustive search, trying every possible key from all zeros to all ones in the key space. All that requires is computing horsepower and time.
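As a point of reference for how AES, the adopted form of Rijndael, is typically used today, here is a minimal sketch assuming the third-party Python `cryptography` package is installed; the 128-bit key size, GCM mode, and sample message are illustrative choices, not anything mandated by the standard.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=128)   # AES keys can be 128, 192, or 256 bits
nonce = os.urandom(12)                      # must never be reused with the same key

ciphertext = AESGCM(key).encrypt(nonce, b"sensitive but unclassified", None)
plaintext = AESGCM(key).decrypt(nonce, ciphertext, None)
assert plaintext == b"sensitive but unclassified"

# A brute-force attacker must try up to 2**128 keys here, versus 2**56 for single DES.
```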
Now, the family of symmetric ciphers is very large. Here we have DES at 56 bits; 3DES at 168 bits; AES at three different lengths, 128, 192, and 256 bits; IDEA, which was the cipher used in the original PGP product; Blowfish and Twofish, both of which were designed by Bruce Schneier; RC4, a stream cipher with variable key lengths that was originally used in WEP, where a poor implementation made WEP very weak; RC5, which has keys up to 2048 bits; RC6, which was designed to meet the AES specification, also with keys up to 2048 bits; and CAST, a family of algorithms starting at 40 bits, very weak, then 64 bits, also very weak, 128 bits, considered strong, and 256 bits, considered very strong. And then we have Serpent, also built to the AES spec, which came up as a runner-up in that competition.
So, let us shift our discussion a bit, from symmetric, or secret key, algorithms to asymmetric, also known as public key, algorithms. Asymmetric algorithms are called that because they run on a pair of keys, generated together by a one-way function. It is much simpler to go in one direction, forward, than to go in the other direction, as one would during reverse engineering. So determining the private key through an analysis of the public key, which, as the very name implies, may be in the hands of almost anyone, is a practical impossibility. But practical impossibility should not be taken to mean absolute impossibility.
Now, there are different goals in public key encryption. We have authenticity, or the open message format. This is a public message that, as the name implies, can be read by anyone. Using public key encryption, this is encrypted with the sender's private key; confidentiality is not the concern here, authenticity of the sender is. Encryption with the sender's private key means that it can be decrypted by anyone who has the public key corresponding to the sender's private key, which means anyone can open and read it. But in doing so they authenticate the claimed sender as the authentic sender, thus establishing authentication and non-repudiation. Another mode, where confidentiality is the concern, is the secure message format, where the message is considered private for the recipient only. We encrypt this with the receiver's public key; please note the different key that is used. Only the one person with the matching private key can decrypt it, and that is the intended receiver. Then we have the signed and secure format, which provides services to establish both confidentiality and authentication. This is encrypted with the sender's private key, and then encrypted a second time with the receiver's public key.
In this particular example, please note which key is used at which point in the process. This provides mutual authentication and mutual non-repudiation. Now, in this graphic, you see us combining public key service modes. It illustrates a confidential message with proof of origin, which provides both confidentiality and non-repudiation, but it does not contain a digital signature. So, notice the process: we start at the sender's end with the plaintext, we encrypt the message using the private key of the sender, then take the intermediate ciphertext and encrypt it again with the public key of the receiver, and then we send the final ciphertext across the open network. Notice that what we do on the receiver's end is exactly the opposite, the same steps in precisely the reverse order of the sender's end. We first decrypt using the private key of the receiver, which ensures that everything contained can only be opened by the receiver. Then we decrypt the actual content with the public key of the sender, to give us the plaintext of the message.
So, as you see there, the last thing the sender does and the first thing the recipient does is establish and ensure the confidentiality of the message. The first step the sender does and the last step the receiver does establishes proof of origin, and that establishes authentication and non-repudiation for the sender, and in this case the receiver as well. Now, RSA is a public key algorithm, and quite possibly the most commonly used one amongst all IT systems. This one is based on the mathematical challenge of factoring the product of two large prime numbers, each one of which is 100 to 200 decimal digits in length. Now, this can be attacked like all algorithms, and like all algorithms that generate a finite number of keys, however large that number might be, it is susceptible to a brute-force style of attack.
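Here is a rough sketch of both public key service modes described above using RSA, again assuming the Python `cryptography` package; the padding choices (OAEP and PSS) reflect common modern practice and are illustrative rather than part of the original RSA definition.

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

# Secure message format: anyone can encrypt with the public key,
# but only the private-key holder (the intended receiver) can decrypt.
oaep = padding.OAEP(mgf=padding.MGF1(hashes.SHA256()), algorithm=hashes.SHA256(), label=None)
ciphertext = public_key.encrypt(b"for the key owner only", oaep)
assert private_key.decrypt(ciphertext, oaep) == b"for the key owner only"

# Open message format: only the private-key holder can sign, and anyone with the
# public key can verify, establishing authenticity and non-repudiation.
pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH)
signature = private_key.sign(b"readable by anyone", pss, hashes.SHA256())
public_key.verify(signature, b"readable by anyone", pss, hashes.SHA256())   # raises if invalid
```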
Using factorization, the attacker will attack the mathematics directly, or they will seek to intercept the keys as they pass from one person to another, in a form of interception or timing attack. Equally widespread in its use is the Diffie-Hellman algorithm. Diffie-Hellman was the original algorithm that created public key encryption. Whitfield Diffie and Martin Hellman were approached by the US military in 1972 with this idea in mind: could you possibly create an encryption algorithm, or some mechanism, that could be used to encrypt secret keys for distribution in the field, so that physical transfer, or any other form of unprotected transfer of those keys, could be eliminated? After three years of work, Diffie and Hellman published their paper in 1976, and by the early '80s the Diffie-Hellman algorithm was officially created. It is still very much in widespread use as a key-encrypting algorithm, used to encrypt secret keys for distribution across open networks. And here you have an example of Diffie-Hellman key agreement. We have Bob and we have Alice. Each one has a public and private key pair, and each exchanges their public key with the other. Now, bear in mind that this is a publishing operation, not a sharing operation, because neither will use their public key for anything other than publishing it to other users. Now, between Bob and Alice, one will create a shared secret key. The Diffie-Hellman algorithm will be used to accomplish this, to transport the secret key between them, and to ensure that they both have exactly the same key. Because of the way public key algorithms work, this shared secret key is securely distributed between the two: it travels as any message would, encrypted by the public key of the receiver at the sender's end, so that only the receiver can decrypt it, using their private key, at the receiving end.
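The agreement itself can be sketched with deliberately tiny numbers. Real deployments use primes thousands of bits long (or elliptic-curve variants), and the private values below are invented purely for illustration.

```python
# Public parameters agreed in the open: a prime modulus and a generator.
p, g = 23, 5

bob_private, alice_private = 6, 15            # kept secret by each party

bob_public = pow(g, bob_private, p)           # each publishes g^private mod p
alice_public = pow(g, alice_private, p)

# Each side combines its own private value with the other's published value.
bob_shared = pow(alice_public, bob_private, p)
alice_shared = pow(bob_public, alice_private, p)

assert bob_shared == alice_shared             # both now hold the same shared secret
```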
Now, El Gamal was an attempt to modify Diffie-Hellman to make it do things that it wasn't originally designed to do. Diffie-Hellman was originally designed, as I mentioned, for encryption of secret keys to be shared across an open network without exposing the key in plaintext form. El Gamal was an attempt to modify it to provide message confidentiality for larger messages, and to do digital signature services. It is based on the use of mathematical functions to calculate discrete logarithms over a finite field, a very common method for creating and using digital signature and public key algorithms. It was successful in doing that, but Diffie-Hellman itself has thus far not been widely used for anything other than key agreement and exchange. Another form of this is the Elliptic Curve cryptographic algorithm. This one calculates discrete logarithms over a curved field, which makes for more efficient utilization of smaller memory spaces, and calculates keys up to 4K in length. Given that it uses less computational power and less storage, it is more efficient, and yet it has thus far proven to be just as strong as longer-keyed and more resource-intensive algorithms. The Elliptic Curve algorithm provides the public and private key pair of all public key algorithms, and it can be used for all the same services: data confidentiality, key exchange, and digital signatures, and all the services that accompany those. Because it is usable in a smaller memory space with smaller resources, it has found widespread use in wireless situations.
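For comparison, an elliptic-curve key agreement takes only a few lines with the Python `cryptography` package; the X25519 curve used here is one common choice, picked purely for illustration (the same package's `ec` module offers ECDSA for the digital signature service mentioned above).

```python
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey

# Each party generates an elliptic-curve key pair.
alice_private = X25519PrivateKey.generate()
bob_private = X25519PrivateKey.generate()

# They exchange only the public halves, then each derives the same shared secret.
alice_secret = alice_private.exchange(bob_private.public_key())
bob_secret = bob_private.exchange(alice_private.public_key())
assert alice_secret == bob_secret
```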
Now, here we have a diagram of hybrid cryptography, and by hybrid we mean that it uses both symmetric and asymmetric methods. So, let's walk through this. We have the large plaintext message, and by large we mean the message itself plus any attachments that might come with it. The first thing we're going to do is decide to encrypt this, and the most efficient way is to generate a symmetric key and use that to encrypt the message. So, our first step is to create a symmetric key that will be used to encrypt the message. The encrypted message, inclusive of its attachments and the text of the message itself, together with the symmetric key, will now make up the contents of the package that we're going to send to the recipient. To protect the symmetric key, we encrypt it with the public key of the receiver. Now, we will send the encrypted message and the encrypted symmetric key in-band to the receiver. When he receives the package, he will take out his private key, decrypt and extract the symmetric key, decrypt the actual message and its content, and then have the plaintext of the message and be able to read it.
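The whole hybrid flow can be sketched end to end, again assuming the Python `cryptography` package; the AES-GCM session key and RSA-OAEP key wrapping below are common modern choices standing in for whatever specific algorithms the diagram depicts.

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

oaep = padding.OAEP(mgf=padding.MGF1(hashes.SHA256()), algorithm=hashes.SHA256(), label=None)

# Receiver's long-term key pair (the public half would normally arrive via a certificate).
receiver_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
receiver_public = receiver_private.public_key()

# Sender: create a one-time symmetric key and encrypt the large message with it...
message = b"a large plaintext message, attachments and all"
session_key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
encrypted_message = AESGCM(session_key).encrypt(nonce, message, None)
# ...then wrap the symmetric key with the receiver's public key; both travel in-band.
wrapped_key = receiver_public.encrypt(session_key, oaep)

# Receiver: unwrap the symmetric key with the private key, then decrypt the message.
recovered_key = receiver_private.decrypt(wrapped_key, oaep)
assert AESGCM(recovered_key).decrypt(nonce, encrypted_message, None) == message
```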
So, let's summarize, comparing symmetric and asymmetric algorithms and keys to see which advantages and disadvantages each one has. When it comes to keys, symmetric, of course, uses one key shared between two or more entities, and that key must be kept secret. Due to the nature of symmetric algorithms, the secret key, once exposed, can be used by anybody possessing it to decrypt anything ever encrypted with it. In the asymmetric world, one entity has the public key and the other entity has the private key. The owner is the only one who will ever have his own private key, and all those he communicates with will have his public key; but no key is shared. In the exchange, or sharing, mode, symmetric keys must be sent out of band. In asymmetric there is no sharing: the public key and digital certificate are published via the directory system, in this case LDAP, and they are used to encrypt the symmetric key, which can then be sent in-band. Now, execution speed highlights the advantage of one and the disadvantage of the other. For symmetric keys, the mechanical efficiency is very high. For asymmetric, because of the complexity of the math that is used, the mechanical efficiency is very, very low; by some calculations, the asymmetric math runs on the order of 10,000 times slower than the symmetric. Scalability, on the other hand, is a problem for symmetric. The formula to calculate the scaling is n times the quantity n minus one, divided by two, where n is the number of users in the community. The n minus one accounts for the fact that no user shares a key with himself, and dividing by two removes the double counting of each pair. That formula shows that the number of keys in a community of any size, as depicted by n, scales up very rapidly into very large numbers. For example, substituting 3 for n gives three keys; substituting 10 gives 45 keys; and by the time you get to a user community of 1,000, the formula produces nearly half a million keys.
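The scaling difference is easy to check with a few lines of Python; the function names here are invented for illustration.

```python
def symmetric_keys_needed(n):
    """Unique pairwise secret keys for a community of n users: n(n-1)/2."""
    return n * (n - 1) // 2

def asymmetric_keys_needed(n):
    """Public key systems need only one key pair per user: 2n keys in total."""
    return 2 * n

for n in (3, 10, 1000):
    print(n, symmetric_keys_needed(n), asymmetric_keys_needed(n))
# 3 users -> 3 vs 6 keys; 10 users -> 45 vs 20; 1,000 users -> 499,500 vs 2,000
```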
In the asymmetric case, the formula is simply the number of users times two, because there is no shared key. The intended uses also highlight the limitations of symmetric. It is very efficient and very effective, especially if the keys are properly protected, for doing data encryption and providing secure communications. The asymmetric, on the other hand, will do data encryption at a much slower rate, due to the mathematical complexity, but it will do secure symmetric key distribution, and it provides a feature that symmetric systems cannot: digital signatures. So, symmetric is the king when it comes to providing confidentiality, whereas asymmetric is not so good at confidentiality for anything large, but is the king when it comes to authentication and non-repudiation.
Now, in our next section, we'll talk about how we bring these things together to get the best of both. But for now, we're going to stop here, and we're going to end this session. Thank you very much for your attendance. Be with us as we continue on with domain three. Thank you.
Mr. Leo has been in Information Systems for 38 years, and an Information Security professional for over 36 years. He has worked internationally as a Systems Analyst/Engineer, and as a Security and Privacy Consultant. His past employers include IBM, St. Luke’s Episcopal Hospital, Computer Sciences Corporation, and Rockwell International. A NASA contractor for 22 years, from 1998 to 2002 he was Director of Security Engineering and Chief Security Architect for Mission Control at the Johnson Space Center. From 2002 to 2006 Mr. Leo was the Director of Information Systems, and Chief Information Security Officer for the Managed Care Division of the University of Texas Medical Branch in Galveston, Texas.
Upon attaining his CISSP license in 1997, Mr. Leo joined ISC2 as Chairman of the Curriculum Development Committee, and served in this role until 2004. During this time, he formulated and directed the effort that produced what became, and remains, the standard curriculum used to train CISSP candidates worldwide. He has maintained his standards as a professional educator, training and certifying nearly 8,500 CISSP candidates since 1998, and nearly 2,500 in HIPAA compliance certification since 2004. Mr. Leo is an ISC2 Certified Instructor.