CISSP: Domain 8, Module 2
The course is part of this learning path
This course is the second module of Domain 8 of the CISSP, covering Software Development Security.
The objectives of this course are to provide you with an understanding of:
- The database environment
- Software development and the world of the web
This course is designed for those looking to take the most in-demand information security professional certification currently available, the CISSP.
Any experience relating to information security would be advantageous, but not essential. All topics discussed are thoroughly explained and presented in a way that allows the information to be absorbed by everyone, regardless of experience within the security field.
If you have thoughts or suggestions for this course, please contact Cloud Academy at firstname.lastname@example.org.
We're going to continue our discussion of Domain 8: Software Development Security and move into section four, where we discuss software development and the world of the web. Now the world of the web, of course, is the way an ever-increasing number of businesses are doing business electronically. The web application environment therefore presents us with a number of blessings and a number of challenges. For one thing, when it comes to making the presence of your product, service, or company known, web applications are designed to be globally accessible. When something is published on the web, it can be seen literally worldwide, and it is consequently heavily advertised.
Now, there are features that administrators find cumbersome in this kind of environment. For one, logging is oftentimes turned off. People who surf the web go to a website, then leave and go elsewhere. A log is simply a record of their having been there, and there may not be any artifact left behind that would be of value to the people who analyze the traffic that patronizes the website. And so administrators turn off logging to make sure this functionally useless data is not recorded.
Now, as we know, the web application environment presents a web page, and the attacks that occur against web pages are directed at that outward-facing application level. As a consequence, web-based applications are really not well protected by firewalls or intrusion detection systems alone. So we need to identify the threats and the protective measures we need to take. This may not be possible in the traditional way, using firewalls, antivirus software, intrusion detection, and so forth, unless we reinforce them with stronger internal processes, so that what we publish on the web is stronger and more resilient against attacks to begin with.
This very likely means we need a particular sign-off process, so that when web servers are put out there and web pages are published, they're already hardened and made stronger to resist attacks. Hardening the operating system is a very common step: we harden the platform, harden the operating system, and then build the application with these kinds of attacks, and resistance to them, in mind. We should conduct web and network vulnerability scans prior to deployment, so that whatever vulnerabilities might exist, we know about them and can fix them before they get assaulted by attackers elsewhere in the world. It doesn't hurt to assess IDS or IPS technology as well, though it may be a mixed bag as to whether or not it adds value. Nonetheless, these tools should be assessed before they're discarded out of hand.
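The pre-deployment checks just described can be sketched as a simple hardening checklist evaluated against a server configuration. This is purely illustrative: the rule names, configuration keys, and function below are hypothetical, not taken from any real scanner or product.

```python
# Hypothetical pre-deployment hardening check. The rule names and
# configuration keys are illustrative only; a real scanner would
# inspect the live server, not a dictionary.

HARDENING_RULES = {
    # Each rule maps a name to a check that returns True when the
    # configuration passes.
    "logging_enabled": lambda cfg: cfg.get("logging_enabled", False),
    "admin_interface_restricted": lambda cfg: not cfg.get("admin_interface_public", True),
    "directory_listing_disabled": lambda cfg: not cfg.get("directory_listing", True),
    "default_accounts_removed": lambda cfg: not cfg.get("default_accounts", True),
}

def hardening_violations(cfg):
    """Return the sorted names of rules the configuration fails."""
    return sorted(name for name, check in HARDENING_RULES.items() if not check(cfg))
```

The defaults are deliberately pessimistic: an unspecified setting counts as a violation, mirroring the "if it isn't needed, remove it" posture described above.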
For certain applications, we can add application proxy firewalls for extra strength, so that login by authorized users is made available through a hardened process. As always, though, we should disable or remove any unnecessary documentation and libraries, so that should an attacker get deep enough into the application, they'll find nothing there to aid them in going further. Along with that, we should remove or appropriately secure any administrative interfaces.
A general rule should be that if it isn't needed for the website to operate, it should be removed. We should, of course, only allow access from authorized hosts or networks. But for things that must allow access from anywhere on the planet, we need to be sure we take these hardening steps. One thing that is still a common error, and commonly leads to a website being compromised, is that somewhere along the line credentials have been hardcoded into the web pages visible on the outside, and they have never been removed through the change control process.
This represents one of the very first things that any web attacker will look for when they first get in, and unfortunately, they're not often disappointed. For those websites where accounts are necessary, account lockout and extended logging and audit controls should be in place. And as the old saying goes about a chain, we should ensure that the interface itself is at least as secure as the rest of the application, since it often proves to be the weakest link in this particular chain.
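The hardcoded-credential problem described above is often caught with a simple pattern scan over the published pages. The sketch below is a minimal illustration; real secret scanners use far larger and more sophisticated rule sets, and the patterns and function name here are assumptions for demonstration.

```python
import re

# Illustrative credential patterns only; production secret scanners
# ship with hundreds of rules and entropy checks.
CREDENTIAL_PATTERNS = [
    re.compile(r'password\s*=\s*["\'][^"\']+["\']', re.IGNORECASE),
    re.compile(r'api[_-]?key\s*=\s*["\'][^"\']+["\']', re.IGNORECASE),
]

def find_hardcoded_credentials(text):
    """Return (line number, line) pairs that look like hardcoded credentials."""
    hits = []
    for lineno, line in enumerate(text.splitlines(), start=1):
        if any(p.search(line) for p in CREDENTIAL_PATTERNS):
            hits.append((lineno, line.strip()))
    return hits
```

Running such a scan as part of the change control process, before pages are published, is exactly the kind of internal reinforcement this module calls for.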
Now, to the points we were just making, a group known as the Open Web Application Security Project, or OWASP, has built and published documentation outlining a framework they have developed over the years. This framework is published in a development guide, a code review guide, and a testing guide; they periodically publish a top 10 list of web application security vulnerabilities; and they have a guide for OWASP Mobile. The OWASP guide framework has been widely used, and should be even more widely used, by the development teams that build and publish websites. And as we talk about development, of course, the subject of programming languages must always come up.
Now, a programming language, to put it simply, is a set of rules, language, and syntax that tells the computer what operations to perform, how to perform them, when to go looking for variables, when to bring them in, when to accept various kinds of input, and so on. Programming languages come in various generations, and the generations we know follow a roughly chronological order, stemming from the 1940s and '50s, when computers were first built, to the current day. But the generations are really more about functional sophistication and nearness to normal human language.
We begin with the first generation: binary, the zeros and ones, very simple instructions that are executed directly by the CPU. Following that came assembly language, which most computers have built for them. This uses mnemonics and meta-statements as instructions and commands. The language is translated down to the zeros and ones of binary to enable it to be executed, though some computers can execute it nearly directly. Then we have our third generation, known as high-level languages, which have meaningful words as their commands.
Examples of these languages are BASIC, COBOL, C++, Pascal, and about 300 others, and they are still the most commonly used. From there we went to the fourth generation, called very-high-level languages. These are more refined, more abstracted evolutions of third-generation languages, and include database-oriented languages such as SQL, FOCUS, and PowerBuilder. We find these most often paired with databases. Then we have the fifth generation, the natural languages, whose statements more closely resemble human speech. Examples include Lisp, Prolog, and OPS5.
Now, to restate, these generations are not about chronological order nearly as much as they are about sophistication of functionality and nearness to human speech. Third-generation languages are being developed even now, and they are still the most popular form; and the fifth-generation languages, as they're known, are in many cases contemporaries of the third-generation languages. One very popular language environment, used predominantly on the web but in many other places as well, is, of course, Java. Now, Java had a bad reputation for ignoring or circumventing security for quite some time. But in recent years, Java has been brought under control through the use of verifiers and interpreters that help ensure type safety, and it is this that is primarily responsible for memory and bounds checking.
It is, of course, an object-oriented programming language environment. It loads and unloads classes dynamically from its runtime environment, the Java Virtual Machine. And it has a security manager function that acts as the gatekeeper to protect against rogue functionality, which for a while was what Java was famous for. Or perhaps I should say, infamous.
Now, in the world of object-oriented technology, the thought is that programming environments are putting together production-level programs as though we are assembling Legos. As such, there are different techniques we use to simplify how we put these things together, how we build with these Legos of programming.
One technique is encapsulation. Encapsulation is where a class defines only the data it needs to be concerned with, so that when an instance of that class, in other words an object, is run, the code will not be able to accidentally access any other data. If we think of it as a black box, the object sits encapsulated, knowing what it must do when it receives a certain input. It receives a certain input, processes that input, and generates its required output. Anything else that happens, it ignores, and it is ignored by everything else.
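The black-box idea can be made concrete with a few lines of code. This is a minimal sketch, shown in Python for brevity (the CISSP material is language-neutral); the `Account` class and its methods are invented for illustration.

```python
# A minimal sketch of encapsulation: the balance is internal state,
# reachable only through the object's own methods.

class Account:
    def __init__(self, opening_balance):
        self._balance = opening_balance  # internal data, hidden behind methods

    def deposit(self, amount):
        # The object validates its own input; callers never touch
        # _balance directly.
        if amount <= 0:
            raise ValueError("deposit must be positive")
        self._balance += amount

    def balance(self):
        return self._balance
```

Callers interact only through `deposit` and `balance`; how the balance is stored internally could change without affecting any code outside the black box.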
We have inheritance, which is the concept of a class deriving certain attributes from another class rather than declaring them directly. It makes it possible for a subclass to inherit some or all of the main characteristics of a superclass above it. Another characteristic inherent in object orientation is polymorphism, which literally translates to many forms or many shapes. It means that different objects may respond to the same command in different ways. Using the example of the black box: one black box with a certain amount of data within it manipulates that data in one way, while a different black box handled by the same program produces a different result. The programmer selects which black box they want, depending upon which result they need.
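Inheritance and polymorphism together can be sketched in a few lines. Again this is an illustrative example in Python, with hypothetical class names: both subclasses inherit from `Shape`, and both answer the same `area` command in their own way.

```python
# Inheritance: Square and Circle derive from Shape.
# Polymorphism: the same command (area) produces different behavior
# depending on which object receives it.

class Shape:
    def area(self):
        raise NotImplementedError  # subclasses must supply their own answer

class Square(Shape):
    def __init__(self, side):
        self.side = side

    def area(self):
        return self.side * self.side

class Circle(Shape):
    def __init__(self, radius):
        self.radius = radius

    def area(self):
        return 3.14159 * self.radius ** 2

def total_area(shapes):
    # The caller issues one command; each black box responds its own way.
    return sum(s.area() for s in shapes)
```

The calling code never needs to know which concrete shape it holds; it simply asks for `area` and lets each object respond.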
Then we have polyinstantiation, which, again literally translated, means many instances. This creates a new version of an object by changing the attributes it contains, oftentimes in response to a change in privileges or in the user doing the accessing. Now, distributed object-oriented systems allow these components to be divided and spread across multiple locations, physical, logical, or both. In other words, we distribute the processing to share the load and perhaps increase processing efficiency, while hopefully not expanding latency in doing so.
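Polyinstantiation is easiest to see with a classic scenario: the same logical record is instantiated differently depending on the clearance of the requester, so lower-cleared users receive a cover-story instance rather than an access-denied error. The record contents, clearance labels, and function below are all invented for illustration.

```python
# Illustrative polyinstantiation sketch: one logical record, two instances,
# selected by the privileges of the user doing the accessing.

RECORD = {
    "name": "Cargo manifest",
    "destination": "Classified Site",
    "cargo": "Sensitive materiel",
}

# Attributes substituted into the instance seen by lower clearances.
COVER_STORY = {"destination": "Routine Port", "cargo": "General supplies"}

def view_of(record, clearance):
    """Return the instance of the record appropriate to the clearance."""
    if clearance == "secret":
        return dict(record)          # the true instance
    masked = dict(record)
    masked.update(COVER_STORY)       # a second instance with changed attributes
    return masked
```

Note that the low-clearance user gets a complete, plausible record, which is the point: the existence of the sensitive instance is itself concealed.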
Now, in these distributed systems, we must find the structure through which we bind these different components together. One form of this is called CORBA, the Common Object Request Broker Architecture. This is put out by a group called the OMG, the Object Management Group, which published it as a set of standards addressing the need for interoperability between hardware and software components. Most languages that are published have libraries and toolsets that accompany the language itself.
Typically, they have a library that consists of pre-written code, classes, procedures, scripts, and configuration data that can be accessed by developers as they design and build their applications. These standard libraries, usually contained within the programming environment for the given language, hold multiple forms of routines and make many kinds of execution possible.
What this brings to the development world is very important, because by designing and building with the stored routines within the standard libraries, a programmer can more rapidly bring things together to produce a finished product. It also increases dependability by relying on well-tested routines with consistent performance. Using these tested components reduces the process risk of building whatever they're building.
Making effective use of the specialists who wrote those libraries enables developers to draw on that experience in their own programs, again enhancing standardization, reducing process risk, and increasing expected dependability and performance. It allows them to come into closer compliance with public standards and, not to be understated, it helps accelerate the development process.
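A concrete case of the benefit just described: rather than hand-writing a cryptographic digest routine, a developer leans on the well-tested implementation in the language's standard library. The sketch uses Python's standard `hashlib` module; the `fingerprint` wrapper name is our own.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Return the SHA-256 hex digest of data, using the standard library
    rather than a hand-rolled (and almost certainly weaker) routine."""
    return hashlib.sha256(data).hexdigest()
```

One line of trusted library code replaces hundreds of lines of risky custom implementation, which is precisely the dependability and process-risk argument made above.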
Now, here are some examples of common programming language standard libraries: the standard libraries that come with C and C++, the .NET Framework Class Library, and the libraries of Java, Ruby, and many other languages. Some of the older languages, like COBOL and Fortran, are among those that also have very plentiful standard libraries, delivering the benefits of accelerated development while reducing risk and improving compliance with existing standards. And there are a lot of programming tools out there. Programming tools enable an application developer to code, test, debug, and maintain programs that might otherwise have to be worked through entirely by hand each and every time the program has to be investigated and brought up to date.
Now, one thing that exists in many environments is what is called a runtime. A runtime environment, in the days of DOS on the PC, would create a small, fully self-contained environment holding all the components needed to run, say, a game program, without troubling the rest of the computer. In doing so, it isolated everything contained within the runtime and made the program appear to own the entire computer.
Now, this concept, of course, has been expanded. A Java Virtual Machine is itself a runtime: by having all the necessary components within it, it enables a Java applet to run fully self-contained. It also sandboxes the application to prevent it from interfering with any operations being executed elsewhere in the system. And then we have one of our least favorite types of software: malicious software, or malware.
Now, malware, of course, is a program that has hostile intent and executes in ways we absolutely do not want. It compromises data and programs to the point where they become untrustworthy or no longer available, and generally it uses the resources of the system it has attacked in ways we would not want. Of these, viruses are the largest class. So let's investigate the types of malware we've got. The malware types, and I'm sure you're well familiar with these, are worms, hoaxes, Trojans, distributed denial-of-service zombies, logic bombs, spyware and adware, pranks, and botnets.
Over the next few slides, we'll describe each of these types. Worms are things that transport themselves. Hoaxes are not actually software, but messages that provoke a response in the people who receive them. Whenever we see headlines about something that "broke the internet," it's usually the response of people typing and sending messages, taking up a lot of bandwidth rather uselessly, in response to whatever the hoax said. The same is true of pranks. And spyware and adware, while perhaps not malware themselves, are sources of great irritation, and they consume space and resources that might be better used elsewhere. So let's investigate some of the other types.
Now, viruses are very aptly named. Even though they're a digital counterpart, they conform to the definition of a virus rather well, because a virus is defined by its ability to reproduce; and, like its biological counterpart, it must have a host program or file in which to do that. It is spread through the use of that host program or file because it cannot spread on its own. But once it attaches itself, invading, so to speak, the DNA of that program or file, it replicates and is then spread through the dispersion of its container.
Now, the kind of advice we always give, we see here on this slide, and yet failure to follow it still accounts for why most of this malware succeeds. This advice is given to users in the form of training and issued in the form of policies by company management. The advice, of course, is: do not double-click on attachments, which activates the program and opens the file. Doing so will activate a virus or other malware embedded within it.
We should be able to inspect the content of attachments before any of this happens, so that we get an idea of what the attachment is and whether or not it contains any form of live code. And if we blindly adopt the most widely used products as company standards without investigating them further, and without using a due diligence process to properly select them, we may find ourselves less secure than we might wish.
Therefore, we want to use different kinds of tools. Scanners, of course, and we want to be sure we have a way of detecting changes. There can be heuristic scanners embedded in the malware-protection products to look for behavior, and if used, they can be tuned so that false positives and false negatives are greatly reduced. There are activity monitors looking for various kinds of behavior, things behaving outside of normal boundaries. Having anti-malware policies is part of a normal policy program, but policies themselves only establish direction; they don't do anything to scan the software.
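The activity-monitor idea, flagging behavior outside normal boundaries, can be sketched as a toy baseline-and-deviation check. This is an illustrative simplification: the function, the three-standard-deviation threshold, and the notion of a numeric "activity measurement" are all assumptions, not how any particular product works.

```python
import statistics

def flag_anomalies(baseline, observed, k=3.0):
    """Learn a baseline from normal activity measurements and return the
    observed readings that fall more than k standard deviations from the
    baseline mean. k=3 is a common illustrative choice, not a product setting."""
    mean = statistics.mean(baseline)
    stdev = statistics.stdev(baseline)
    return [x for x in observed if abs(x - mean) > k * stdev]
```

This is also the essence of the zero-day point made next: nothing here requires a signature or definition of the attack, only a notion of what normal looks like.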
Doing reputation monitoring, monitoring for zero-day or zero-hour types of malware, is also a practice that needs to be developed and put in place. Zero-days, of course, cannot be actively scanned for because, as the name indicates, no one has a definition for them yet. However, looking for non-normal behavior, looking for symptoms that indicate abnormal behavior in your system, can very easily be taken to indicate that something non-normal, a zero-day, is in fact at work, and so it should still be part of a program. And that brings us to the end of this section of Domain 8: Software Development Security. Be sure to return for the final chapter of the CISSP examination preparation review seminar. Thank you.
Mr. Leo has been in Information Systems for 38 years, and an Information Security professional for over 36 years. He has worked internationally as a Systems Analyst/Engineer, and as a Security and Privacy Consultant. His past employers include IBM, St. Luke's Episcopal Hospital, Computer Sciences Corporation, and Rockwell International. A NASA contractor for 22 years, from 1998 to 2002 he was Director of Security Engineering and Chief Security Architect for Mission Control at the Johnson Space Center. From 2002 to 2006 Mr. Leo was the Director of Information Systems, and Chief Information Security Officer for the Managed Care Division of the University of Texas Medical Branch in Galveston, Texas.
Upon attaining his CISSP certification in 1997, Mr. Leo joined ISC2 (in a professional role) as Chairman of the Curriculum Development Committee, and served in this role until 2004. During this time, he formulated and directed the effort that produced what became, and remains, the standard curriculum used to train CISSP candidates worldwide. He has maintained his standards as a professional educator, training and certifying nearly 8,500 CISSP candidates since 1998, and nearly 2,500 candidates in HIPAA compliance certification since 2004. Mr. Leo is an ISC2 Certified Instructor.