Security Quality Assurance Testing
The course is part of this learning path
This course covers section one of CSSLP Domain Five and looks at security quality assurance testing. You'll learn about important and foundational concepts on the process and execution of testing, topics regarding quality and product integrity, and various other considerations.
Obtain a solid understanding of the following topics:
- Security testing use cases
- Software quality assurance standards
- Testing methodologies and documentation
- Problem management
- The impact of environmental factors on security
This course is intended for anyone looking to develop secure software as well as those studying for the CSSLP certification.
Any experience relating to information security would be advantageous, but not essential. All topics discussed are thoroughly explained and presented in a way allowing the information to be absorbed by everyone, regardless of experience within the security field.
Now, some environmental factors that we need to consider. One is attack surface evaluation. Here, we need to validate that what we imagine or suspect our attack surface to be is in fact that, or else determine what it actually is. The attack surface typically comprises the place, or multiple places, where unauthenticated users can run or input code against the system that we're testing, and thus extract data from it.
Now, the security testing that we've been doing is about confirming resiliency, with the intention of discovering whether the software can be made to fail, or can permit or even enable exploitation. Security testing validates the presence and effectiveness of the security controls in the test artifact and in its environment. Through this, we are able to discern what the attack surface is and validate what is genuinely a problem within it. That, in turn, should lead to a program of shrinking the attack surface to the smallest feasible size.
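The idea of validating and then shrinking the attack surface can be sketched in a few lines. This is a minimal illustration, not part of the course material: the entry-point names and the `surface_to_shrink` helper are hypothetical, and a real evaluation would draw its "observed" set from scans and test results rather than a literal.

```python
# Minimal sketch: model the attack surface as a set of entry points and
# compare what testing actually observed against what the design intends.
# All endpoint names below are hypothetical examples.

def surface_to_shrink(observed: set, intended: set) -> set:
    """Entry points found during testing but absent from the intended
    design -- candidates for removal when shrinking the attack surface."""
    return observed - intended

# "Observed" would come from scanning/testing; "intended" from the design.
observed = {"/login", "/api/v1/orders", "/debug/heapdump", "tcp:8081"}
intended = {"/login", "/api/v1/orders"}

print(sorted(surface_to_shrink(observed, intended)))
```

Anything the test effort finds that the design never intended to expose is, by definition, surface that can feasibly be removed.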
Now, among the test artifacts, one environmental factor to consider is that we never use actual live data or live modules. Of course, this poses a bit of a problem, or at least a puzzle, because what we're trying to do is work with something that simulates real conditions as nearly as possible. We need to use something that authentically represents real-world, live operations, yet we can never use the actual live data itself. With anything less, we would have to question whether we were getting true representations of the cause, effect, and behaviors that we would encounter in the real environment.
To do this, we have to create proper test data, and there is a life cycle involved in doing so. The artifacts that we're using as test objects must be protected, like everything else in our project, to make sure that they stay authentic and that they produce reliable results every time they're used, so that we can be sure that what we're working with is a true and fair representation of real conditions.
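One common way to satisfy both requirements, realistic in shape but never real in content, is seeded synthetic data. This is a hedged sketch, assuming a simple record layout; the field names are hypothetical, and the course does not prescribe any particular generator. The fixed seed is what gives the "reliable results every time they're used" property.

```python
# Minimal sketch: generate synthetic test records that resemble live data in
# shape but contain no real values. A fixed seed makes every run reproduce
# the exact same data set. Field names here are hypothetical.
import random
import string

def make_test_records(n: int, seed: int = 42) -> list:
    rng = random.Random(seed)  # deterministic: same seed -> same records
    records = []
    for i in range(n):
        records.append({
            "record_id": f"TEST-{i:05d}",  # clearly synthetic identifiers
            "name": "".join(rng.choices(string.ascii_uppercase, k=8)),
            "balance": round(rng.uniform(0, 10_000), 2),
        })
    return records

# Two runs with the same seed yield identical data sets.
assert make_test_records(3) == make_test_records(3)
```

Because the generator is deterministic, the data set itself becomes a versionable artifact: regenerating it from the same seed is a built-in authenticity check.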
Now, this has to be looked at the way any other craftsman would look at it, for the people doing this work, such as yourself, are indeed craftsmen. We need to be sure that we calibrate and sharpen our tools, and test data is certainly one of the most important subjects of this sharpening. We have to protect it from any modification not made under proper configuration management. If we test with inaccurate or poorly built data, our results are going to be equally poor. So the test data needs to be properly constructed and then protected very much as the actual operational system and its data would be.
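Protecting a test data set from unmanaged modification can be as simple as baselining a cryptographic digest of it and re-verifying before each run. A minimal sketch, assuming the data is held as JSON-serializable records; the variable names and the idea of storing the digest with the configuration management record are illustrative assumptions, not a mandated procedure.

```python
# Minimal sketch: record a SHA-256 digest when a test data set is baselined
# under configuration management, then verify the digest before each test
# run so any out-of-band modification is detected.
import hashlib
import json

def digest(records: list) -> str:
    """Canonical SHA-256 digest of a test data set."""
    canonical = json.dumps(records, sort_keys=True).encode("utf-8")
    return hashlib.sha256(canonical).hexdigest()

baseline = [{"record_id": "TEST-00001", "balance": 10.0}]
baselined_digest = digest(baseline)  # stored alongside the CM record

# Before a test run: confirm nothing changed outside configuration control.
assert digest(baseline) == baselined_digest

tampered = [{"record_id": "TEST-00001", "balance": 9999.0}]
assert digest(tampered) != baselined_digest  # modification is detected
```

Any legitimate change to the data then goes through configuration management, which simply re-baselines the digest.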
Mr. Leo has been in Information Systems for 38 years, and an Information Security professional for over 36 years. He has worked internationally as a Systems Analyst/Engineer, and as a Security and Privacy Consultant. His past employers include IBM, St. Luke’s Episcopal Hospital, Computer Sciences Corporation, and Rockwell International. A NASA contractor for 22 years, from 1998 to 2002 he was Director of Security Engineering and Chief Security Architect for Mission Control at the Johnson Space Center. From 2002 to 2006 Mr. Leo was the Director of Information Systems, and Chief Information Security Officer for the Managed Care Division of the University of Texas Medical Branch in Galveston, Texas.
Upon attaining his CISSP certification in 1997, Mr. Leo joined ISC2 in a professional role as Chairman of the Curriculum Development Committee, and served in that role until 2004. During this time, he formulated and directed the effort that produced what became, and remains, the standard curriculum used to train CISSP candidates worldwide. He has maintained his standards as a professional educator, training and certifying nearly 8,500 CISSP candidates since 1998, and nearly 2,500 in HIPAA compliance certification since 2004. Mr. Leo is an ISC2 Certified Instructor.