This course is the fourth of four installments covering Domain 1 of the CSSLP, and focuses on software development methodologies.
- Learn about the secure development lifecycle and the implications it has on your software
- Understand the various software development methods for keeping your environments secure
- Learn about the software development lifecycle
This course is designed for those looking to take the Certified Secure Software Lifecycle Professional (CSSLP) certification, or for anyone interested in the topics it covers.
Any experience relating to information security would be advantageous, but not essential. All topics discussed are thoroughly explained and presented in a way allowing the information to be absorbed by everyone, regardless of experience within the security field.
If you have thoughts or suggestions for this course, please contact Cloud Academy at email@example.com.
Now, the development methods that we're going to consider are meant to improve how we do this, to make the way we put software together more routine and more systematized, if you will. Any development method is going to be a methodological approach as opposed to something more ad hoc, which of course is what we're attempting to migrate away from. It means that we're going to examine all the various ingredients that are going to be used to produce the ultimate endpoint program, including languages, testing, examination tools, project management methods, risk registers, and a host of other things.
Now, the methods that we have used over time, we're going to introduce and discuss briefly. We will talk about the family of waterfall methods, the spiral type, prototyping, and then agile. But first let's examine the systems development lifecycle. We're going to have a concept definition. What is the thing that we're trying to produce? We're going to look at functional requirements determination because we need to look at what are the actions it's going to take. And of course, along with that will be the non-functional ones that represent state or conditions that must be present in order for the program to function correctly.
Then we're going to go into a deeper level and start specifying the details of the controls and the interactions. Once we have these put together, we're going to do a design review and start looking at how has it taken shape? Is there anything we've missed? Do we need to add something or change something? From there we go into the build phase.
Then we do a code review walk-through as we produce elements. This is to make sure that what we have produced does what we think it's supposed to and produces the deliverables and the products and the functions that we need. As we do this, we will compile them all, and then ultimately do testing. We'll do testing at the unit level, the stream level, the threat level, the process level, all the way up to the system level, and then ultimately the customer acceptance test.
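As a quick sketch of the lowest rung of that testing ladder, the unit level: a unit test exercises a single element in isolation before the pieces are compiled together for higher-level testing. The function and checks here are hypothetical examples, not part of the course material.

```python
def normalize_username(raw: str) -> str:
    """Trim whitespace and lowercase a username before storage.

    A hypothetical 'element' of the program under development.
    """
    return raw.strip().lower()


def test_normalize_username():
    # Unit-level tests check one element in isolation, confirming it
    # produces the deliverables and functions we expect of it.
    assert normalize_username("  Alice ") == "alice"
    assert normalize_username("BOB") == "bob"


test_normalize_username()
```

Checks like these are repeated at each level of assembly, up through the system level and the customer acceptance test.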
Then we'll move it into its operations phase, and then we will maintain it through change management. So that is to introduce the normal systems development cycle over which we're going to lay any of our design and development methods. So here we have the classic waterfall lifecycle method. This method has been around for a very long time and did not originate with software, of course. It was a project management method that showed how we were going to do things in a sequential way. And with each phase leading to the next one in sequential order, we also developed the notion of the feedback loop, where we ask the questions: what did we do? Did we do what we intended to do? And did we achieve the kind of results that we intended to achieve? If not, we go back, we check it again, maybe do it again, until we get an answer that everything we did was complete and accomplished everything we intended to.
So each phase depends upon the prior one completing successfully, and then moving into the next phase. We go through the standard lifecycle from conception to initiation, analysis and design, and in software, we then shift to coding and development, then testing, and then operationalization and maintenance.
From there, we moved into a more iterative cycle, having learned the lessons that everything in software development is going to be iterative to a degree and not sequential and going only in one direction. So we developed a number of different ones. The spiral model, the meta model and prototyping all became part of this particular methodology.
So as you see with the spiral model, we begin at the center, and as it literally spirals out, a new prototype is produced at the end of each spiral cycle. Then as we test, we will find we need to put it back through another cycle. And then as it spirals out, we're doing all of these things in an iterative fashion, developing a new prototype with each turn on the spiral until we come up with the prototype that meets all the criteria, has met all of the requirements, performs as intended, and is prepared for operationalization.
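That turn-by-turn rhythm can be sketched as a simple loop. Here, `requirements_met` is a hypothetical stand-in for the testing and evaluation performed on each prototype; none of the names come from the course itself.

```python
def build_prototype(cycle: int) -> str:
    """One turn of the spiral produces a new prototype."""
    return f"prototype-{cycle}"


def spiral(requirements_met, max_cycles: int = 10):
    """Iterate outward, turn by turn, until a prototype meets all criteria.

    requirements_met: a callable standing in for testing and evaluation.
    Returns the accepted prototype, or None if no turn produced one.
    """
    for cycle in range(1, max_cycles + 1):
        prototype = build_prototype(cycle)
        if requirements_met(prototype):
            return prototype  # prepared for operationalization
    return None  # criteria never met within the allotted cycles
```

Each pass through the loop corresponds to one turn of the spiral; the loop ends only when a prototype satisfies the evaluation, mirroring the iterative fashion described above.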
Now, as I said earlier, many of the project management methods put in place to design, build, and deliver software were really built for managing other kinds of products, things that were more predictable, such as dams, railroads, buildings, and roads. Over time, it became evident that traditional project management methods were inadequate, being ill-suited to the highly variable nature of software design and build.
So at Carnegie Mellon University, the Software Engineering Institute came up with the Capability Maturity Model, which has since spread across a wide variety of fields and is now known as CMMI, or CMM Integration. We still begin with level one, called Initial, where if we get it right, it's probably an accident. That means that the process of development at this level is informal and largely ad hoc.
So we go through the process of getting to the end of level one and moving into level two, where now having learned our lessons, it now becomes more repeatable. Our products, our process becomes more about planning and tracking what we're actually doing, more of a formalized process than it was in level one. Completing level two, we move to level three, which is now formally defined, and our processes, having been put through various forms of development, refinement, flaw finding, and so on, are now much more well-defined and employed on a regular basis.
So unlike level one, where, as I said, getting it right was almost an accident, at level three, we're getting it right the majority of the time now as a programmed response to our development processes having become much more organized. Moving from level three to level four, it now becomes quantitatively managed and controlled like many other of the more mature processes our business uses.
By the time we get to level five, Optimizing, it is a case of focusing on the continued delivery and maintenance of the processes built up through level four, which have produced far improved quality in our software programs. Now we focus on continually refining and developing the process itself.
Now within each of the five levels of the CMMI, we employ the IDEAL model developed by the very same people at the Software Engineering Institute. The IDEAL steps are used in each phase of the CMMI to assure that everything done in each level is accomplished before moving on to the next one. The process here focuses on everything to be done at the given level. The intention is to discover and implement the requirements at each level so that, like the beginning of the process itself, at the beginning of each of the levels, we initiate to discover everything that we can about how we are at that level starting out.
Then we diagnose whatever is going on to find flaws, and pick out the places where we can improve. Then we establish the foundation for what we're going to do in this level and how we're going to take what is good and improve upon it, formalize it, and how we are going to eliminate the various things that prove to be weaknesses or flaws. Then we act to operationalize all of that within the context of that level. And then from it all, we have the continuous learning cycle so that as we move into the succeeding phase, we take our lessons learned and operationalize them to improve our performance in the next level.
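The narration above walks through the five phases whose initials spell the model's name. As a small sketch, here they are as an ordered structure; the one-line descriptions paraphrase the text above rather than quoting the SEI's own definitions.

```python
# The five IDEAL phases, in order, with a short paraphrase of each.
IDEAL_PHASES = [
    ("Initiating", "discover where we stand as the level begins"),
    ("Diagnosing", "find flaws and pick out places to improve"),
    ("Establishing", "lay the foundation: formalize strengths, plan fixes"),
    ("Acting", "operationalize the improvements within the level"),
    ("Learning", "carry lessons learned into the next level"),
]


def ideal_acronym(phases) -> str:
    """The phase initials spell the model's name."""
    return "".join(name[0] for name, _ in phases)
```

The same five steps are repeated inside each CMMI level before moving on to the next.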
Now we come to the lifecycle model of agile, currently a very popular and very productive method of developing software. Agile development is based on the Agile Manifesto, which states four core values supported by twelve principles; the examples that follow are drawn from the values. The manifesto emphasizes individuals and interactions over processes and tools. In other words, we're not putting it down to rigor of process. We're looking at the highest and best use of individuals and the interactions that produce the products; rather than make things conform to a specific tool set or a specific process, we work through it so that we produce the product and measure the product's success against the model it was based on.
We also look at working software over comprehensive documentation. A common complaint in development, in IT and in security specifically, is: are you trying to write the great American novel, or are you trying to produce something that is actually usable and isn't itself an unnecessary complication? By valuing working software over comprehensive documentation, we write what we need to describe what we're doing, but not to the point that we've written something the length of "War and Peace". Instead, we go with something far briefer, far more practical, and thus far more usable, which also doesn't create a maintenance problem all its own.
We look at customer collaboration over contract negotiation. Change management has for many decades been one of the bugaboos of software development and of projects of all kinds. Customer collaboration is an ongoing process of doing, seeing, evaluating, and then acting to modify where necessary. By collaborating with the customer, we deal with change more effectively, adapting it to what we're doing, so that as things change and new requirements arise, we can integrate them into our work rather than worrying about whether it's a term in the contract, whether or not we can do that, and all the bureaucracy that typically goes along with contract negotiation and change management. And by responding effectively to change rather than being bound to a rigid plan, we're better able to make the software itself adapt to the changing world and circumstances in which it finds itself, rather than being stuck with something bound to a rigid plan, brittle and breaking at the least provocation.
Along with agile, we find that we have DevOps or development and operations. DevOps seeks to combine the three elements of software development, operations management, and quality assurance, so that we get the best of all three from the beginning in a continuum of steps and processes, all the way through operations, with the according feedback loops to make sure that everybody is on the same page at all times.
In this way, each component informs the others to make for a smoother, better integrated flow. When we take that a step further, we integrate security with it all. By aligning this with agile and the project management techniques agile brings, we bring together what might be considered a perfect world: all of the operational elements we need come together much earlier in the cycle, integrated in a way that takes the best of each and combines it with the best of the others, so that we get a better product as a result.
It must be said that moving from what we conceive of as traditional development to a DevOps or DevSecOps approach can bring about some cultural disruption and cultural change. Historically, these elements were often in competition or in opposition to each other. By bringing them together in a way that integrates what they do, the lessons learned, and the experience each can bring to the overall problem, we integrate not only the working elements but also the people involved, their experience, knowledge, and contributions, to ultimately produce a better operating, more realistically integrated program that serves the business with much greater efficiency and effectiveness.
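One way to picture security integrated into the flow rather than bolted on at the end is a staged pipeline in which security checks sit alongside build, test, and deploy, and any failure feeds the work back to development. The stage names and checks below are illustrative assumptions, not a real CI configuration or a tool the course prescribes.

```python
from typing import Callable, List, Tuple

def run_pipeline(stages: List[Tuple[str, Callable[[], bool]]]) -> List[str]:
    """Run stages in order; stop at the first failure.

    Stopping early models the feedback loop: failed work goes back to
    development instead of proceeding toward production.
    """
    completed = []
    for name, check in stages:
        if not check():
            break
        completed.append(name)
    return completed

# Hypothetical DevSecOps-style stage list: security checks are
# interleaved with development and operations stages, not appended last.
stages = [
    ("build", lambda: True),
    ("unit-tests", lambda: True),
    ("static-security-scan", lambda: True),
    ("deploy-to-staging", lambda: True),
    ("dynamic-security-scan", lambda: True),
    ("deploy-to-production", lambda: True),
]
```

A failing security scan midway through such a pipeline would halt the later deployment stages, which is exactly the early feedback the integration is meant to provide.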
Mr. Leo has been in Information Systems for 38 years, and an Information Security professional for over 36 years. He has worked internationally as a Systems Analyst/Engineer, and as a Security and Privacy Consultant. His past employers include IBM, St. Luke’s Episcopal Hospital, Computer Sciences Corporation, and Rockwell International. A NASA contractor for 22 years, from 1998 to 2002 he was Director of Security Engineering and Chief Security Architect for Mission Control at the Johnson Space Center. From 2002 to 2006, Mr. Leo was the Director of Information Systems and Chief Information Security Officer for the Managed Care Division of the University of Texas Medical Branch in Galveston, Texas.
Upon attaining his CISSP certification in 1997, Mr. Leo joined ISC2 as Chairman of the Curriculum Development Committee, and served in this role until 2004. During this time, he formulated and directed the effort that produced what became, and remains, the standard curriculum used to train CISSP candidates worldwide. As a professional educator, he has trained and certified nearly 8,500 CISSP candidates since 1998, and nearly 2,500 in HIPAA compliance certification since 2004. Mr. Leo is an ISC2 Certified Instructor.