Virtualization and cloud technologies

Difficulty: Beginner
Duration: 2h 45m
Students: 3345
Ratings: 4.6/5
Description

Course Description 

This course introduces the basic ideas of computing, networking, communications, security, and virtualization, and will provide you with an important foundation for the rest of the course.  

 

Learning Objectives 

The objectives of this course are to provide you with an understanding of: 

  • Computer system components, operating systems (Windows, Linux and Mac), different types of storage, file systems (FAT and NTFS), and memory management. The core concepts and definitions used in information security 
  • Switched networks, packet switching vs circuit switching, packet routing and delivery, internetworking standards, the OSI model and its 7 layers. The benefits of information security  
  • TCP/IP protocol suite, types of addresses, physical address, logical address, IPv4, IPv6, port address, specific address, network access control, and how an organization can make information security an integral part of its business 
  • Network fundamentals, network types (advantages and disadvantages), WAN vs LAN, DHCP 
  • How data travels across the internet. End to end examples for web browsing, sending emails, using applications - explaining internet architecture, routing, DNS 
  • Secure planning, policies, and mechanisms, Active Directory structure, introducing Group Policy (containers, templates, GPO), security and network layers, IPsec, SSL/TLS (flaws and comparisons), SSH, Firewalls (packet filtering, stateful inspection), application gateways, ACLs 
  • VoIP, wireless LAN, Network Analysis and Sniffing, Wireshark 
  • Virtualization definitions, virtualization models, terminologies, virtual models, virtual platforms, what is cloud computing, cloud essentials, cloud service models, security and privacy in the cloud, multi-tenancy issues, infrastructure vs data security, privacy concerns 

 

Intended Audience 

This course is ideal for members of cybersecurity management teams, IT managers, security and systems managers, information asset owners, and employees with legal compliance responsibilities. This course acts as a foundation for more advanced managerial or technical qualifications. 

  

Prerequisites  

There are no specific prerequisites for this course; however, a basic knowledge of IT, an understanding of the general principles of information technology security, and awareness of the issues involved with security control activity would be advantageous. 

 

Feedback 

We welcome all feedback and suggestions - please contact us at support@cloudacademy.com if you are unsure about where to start or if you would like help getting started. 

Transcript

Welcome to this video on Virtualization and Cloud Technologies. 

In it you’ll learn about what we mean by these two terms, and what they mean for you in terms of security considerations. Specifically, you will learn about Virtualization, Containers, Cloud computing, Cloud services, Security considerations, Privacy concerns and Auditing. 

This diagram shows the basic differences between a physical machine and a virtual machine. 

In a physical machine there is just one operating system, talking directly to the hardware such as the CPU or memory, and an application running on top of this operating system. 

In a virtual machine, we still have the same base hardware, but we now have a virtualization layer sitting between the operating system and the hardware, which allows for multiple operating systems. Each operating system will request that the virtualization layer hand over a portion of the underlying hardware resources to be used solely by that individual operating system. 

There are several ways to achieve the virtual machine architecture. In this diagram, you can see two different takes on implementing virtualization. On the right, you can see the same setup that we discussed previously.  

A bare metal environment is a type of virtualization environment in which the virtualization hypervisor is directly installed and executed from the hardware. It eliminates the need for a host operating system by directly interfacing with the underlying hardware to accomplish virtual machine specific processes. 

A bare metal environment is also called a Type 1 environment. On the left, you can see that the computer is set up in a standard manner, with an operating system sitting on top of the hardware.  

The big difference here is that we have inserted the virtualization layer on top of the main, or host, operating system. You can have multiple virtual machines sitting here, all requesting access to hardware resources through the virtualization layer. 
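As a brief aside for readers who want to experiment, hypervisors of either type rely on hardware virtualization extensions (Intel VT-x or AMD-V). The following minimal Python sketch, which assumes a Linux host where CPU capabilities are exposed in /proc/cpuinfo, simply checks whether those extensions are advertised; it is illustrative only and not part of the course material.

    # Minimal sketch (assumes a Linux host): check whether the CPU advertises
    # hardware virtualization extensions (Intel VT-x = "vmx", AMD-V = "svm").
    def virtualization_flags(cpuinfo_path="/proc/cpuinfo"):
        with open(cpuinfo_path) as f:
            tokens = f.read().split()
        return {flag for flag in ("vmx", "svm") if flag in tokens}

    if __name__ == "__main__":
        flags = virtualization_flags()
        if flags:
            print("Hardware virtualization extensions found:", ", ".join(sorted(flags)))
        else:
            print("No hardware virtualization extensions detected.")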

From a resilience perspective, there is a clear difference between the two. On the left, if the host operating system suffers any sort of catastrophic failure, it will also take down any of the virtual machines that are running on it. 

On the right, the failure of one VM will have no effect on any of the others around it. Virtual machines rely on a full operating system being used, but Containers are far more lightweight, using just enough functionality to present one single application. 

Containers and container platforms provide many advantages over traditional virtualization. Isolation is done at the kernel level without the need for a guest operating system, so containers are much more efficient, fast, and lightweight.  

Allowing applications to become encapsulated in self-contained environments comes with many advantages, such as quicker deployments, scalability, and closer parity between development environments. 

Docker is currently the most popular container platform. Although the idea of isolating environments dates quite far back, and there has been other container software in the past, Docker appeared on the market at the right time, and was open source from the beginning, which likely led to its current market domination. 

Docker provides the Docker Engine, a runtime that allows you to build and run containers, and Docker Hub, a service for storing and sharing images. 
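As an optional illustration of that workflow, the sketch below uses the Docker SDK for Python (the docker package, a tooling choice assumed here rather than prescribed by the course) to pull an image from Docker Hub and run it as a short-lived container.

    # Minimal sketch using the Docker SDK for Python ("pip install docker").
    # Assumes a local Docker Engine is installed and running.
    import docker

    client = docker.from_env()              # connect to the local Docker Engine
    client.images.pull("alpine:latest")     # fetch an image from Docker Hub
    output = client.containers.run(         # run a throwaway container
        "alpine:latest", "echo Hello from a container", remove=True
    )
    print(output.decode().strip())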

This diagram shows the differences between a VM and a Container. 

A traditional virtual machine consists of a guest operating system, on which multiple applications are installed. 

With containers such as Docker, multiple containers are installed on the host operating system, each container running a single application.  

The container presents a view of the operating system to the application.  

Each containerized application can be rapidly developed and deployed in an automated manner with auto-discovery of other applications, auto-scaling, ease of upgrades and recovery. 

Tools such as Kubernetes are used to orchestrate container deployments and play an important role in enforcing the security of the container. 
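As one small example of how an orchestrator can be queried for security-relevant settings, the sketch below uses the official Kubernetes Python client (an assumed tool choice, with a kubeconfig already set up) to list pods and report whether each one is configured to run as a non-root user.

    # Minimal sketch using the official Kubernetes Python client
    # ("pip install kubernetes"); assumes a kubeconfig is available locally.
    from kubernetes import client, config

    config.load_kube_config()            # read cluster credentials from ~/.kube/config
    v1 = client.CoreV1Api()

    for pod in v1.list_pod_for_all_namespaces().items:
        sec = pod.spec.security_context  # pod-level security settings, may be None
        run_as_non_root = sec.run_as_non_root if sec else None
        print(f"{pod.metadata.namespace}/{pod.metadata.name}: runAsNonRoot={run_as_non_root}")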

The deployment of container technology brings a number of security concerns.  

Many of these concerns are exactly the same as any other network connected device, but due to the nature of containers the mitigation may be slightly different. 

Malicious containers are similar to malicious applications that you may inadvertently install onto your mobile phone. You should always use containers from trusted sources, or registries.  

Docker Content Trust: Before a publisher pushes an image to a remote registry, Docker Engine signs the image locally with the publisher’s private key. When you pull this image, Docker Engine uses the publisher’s public key to verify that the image you are about to run is exactly what the publisher created, has not been tampered with, and is up to date.  
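Content trust is switched on from the client side via the DOCKER_CONTENT_TRUST environment variable. The sketch below shows one possible way to enable it from Python before pulling an image; the image name is only an example, and the approach assumes the Docker CLI is installed.

    # Minimal sketch: require signed images for a pull by setting the
    # DOCKER_CONTENT_TRUST environment variable before invoking the Docker CLI.
    import os
    import subprocess

    env = dict(os.environ, DOCKER_CONTENT_TRUST="1")   # enforce content trust
    result = subprocess.run(
        ["docker", "pull", "alpine:latest"],           # example image only
        env=env, capture_output=True, text=True
    )
    print(result.stdout or result.stderr)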

One of the hot technologies of the past decade has been Cloud Computing.  

It seems like the whole world has been desperate to find some reason to implement something ‘in the cloud’, so what is it? 

Cloud computing is a general term for the delivery of hosted services over the internet, as detailed in the NIST definition. 

Cloud computing enables companies to consume computer resources in the same way you would use a household utility - such as gas, electricity or water. These computer resources are accessed over the internet, such as a virtual machine (VM), storage solutions or an application. These resources are used instead of having to build and maintain computing infrastructures on-site. 

As with any other new technology, cloud computing comes with many new acronyms and concepts. 

Cloud service providers, or CSPs, can supply cloud services in a number of ways, with each model giving the customer a different level of control over their data and cloud experience. 

At the basic level, Infrastructure as a Service means that the CSP merely supplies the hardware, and the means to enable the customer to install their own operating system and software. 

Up from this, Platform as a Service provides the hardware and the operating system, allowing the customer to install and manage their own application. 

Finally, Software as a Service supplies the hardware, operating system and application, allowing the customer to make use of the application but not to maintain it in any way. 

If this seems confusing, several analogies for these models are available on the Internet, such as Pizza as a Service and Car as a Service, which do a great job of explaining the various models using more relatable subjects. 

The cloud model matrix details the four ways in which a cloud solution can be deployed: 

  • A Public cloud is available for the public at large to access. It is deployed on the premises of the CSP, rather than within your own network and behind your firewall. 
  • A Private cloud will be deployed within your own network, and on your premises. Your firewalls will protect it. 
  • A Community cloud is similar to a Private cloud, except that it is shared amongst a community of private customers, all of whom have shared requirements for cloud computing. 
  • A Hybrid cloud is a mixture of Public cloud and Private/Community cloud. 

In a moment we will look at the risks associated with these models, correlated with the type of service they provide. 

Whilst it may be very easy for the CEO of an organization to tell their IT department that ‘we must be doing something in the cloud’, there are several very important considerations. 

The cloud does allow for some great economies of scale, not least the costs of power and cooling for the equipment required, along with the supply and maintenance of that kit.  

A CSP will always be able to supply and run equipment more cheaply than an organization can for itself. But will the service be suitable for the customers of the entity buying the cloud service provision?  

Will it give timely responses to requests? Will it be reliable, both in its everyday operation but also in the connection to the service? If data is being held in the cloud, how secure is it and who is responsible for that security? Can we trust our cloud service provider to vet their staff to the same standard that we would? 

The concentration of data in one location is a natural target for cyber threat actors, so does the CSP take extra security precautions? 

This diagram shows the risk levels generated at the intersections between service models and deployment models. 

When data is held in the cloud, the CSP is responsible for its security.  

We are relying on a trust relationship existing between us and the CSP, perhaps even directly linking our internal network to the cloud infrastructure. There have been instances of highly sophisticated cyber threat actors abusing the trust relationship between CSPs and their customers in order to access the customer’s internal networks and steal information. 

Even if we run our own private cloud, we still have the usual network security issues to consider. 

In order to mitigate some of this risk, the National Cyber Security Centre (NCSC), a UK government organization, published 14 principles for cloud security.  

They are:  

Data in transit protection: User data moving between networks should be adequately protected against tampering and eavesdropping. 

Asset protection and resilience: User data, and the assets storing or processing it, should be protected against attack or damage. 

Separation between users: Malicious or compromised users shouldn’t be able to access or impact other users.  

Governance framework: Security needs to be dealt with at a governance level.  

Operational security: The service must be operated securely.  

Personnel security: Service provider personnel must be trustworthy.  

Secure development: From inception, services should be designed to be secure.  

Supply chain security: The service provider must make sure that its supply chain is secure.  

Secure user management: Service providers must empower customers to manage their own users.  

Identity and authentication: Only authenticated and authorized users should have access to the service.  

External interface protection: Any external interface must be identified and defended properly.  

Secure service administration: Administration services must be very secure as they have privileged access to the service.  

Audit information for users: The service should provide the audit records you need to monitor the access to your service.  

Secure use of the service: Take responsibility for using the service securely, as poor use of the service can compromise its security.  

Having considered the security issues around cloud, let’s look at exactly why we have those security concerns. The crux of the matter is protecting the privacy of our data – although we must acknowledge that usually it’s not ‘our’ data, per se. It’s data that belongs to our customers or suppliers. The introduction of the General Data Protection Regulation (GDPR) and the Data Protection Act 2018 means that data privacy is now a central priority for all organizations.   

Beyond any legal considerations we also have further issues. Who owns the uploaded data? Are any rights lost in uploading data? There are regular Internet rumours circulating about Facebook owning any photos that you upload to it. These rumours are not true in the sense that the authors intend, but a quick read of Facebook’s terms and conditions shows that Facebook (and other similar websites) can make use of your photographs and other intellectual property in whatever way they see fit. 

Can the service provider claim ownership or use uploaded content in marketing or for other purposes? What rights does the provider have to access customer data, and what rules govern such access? Should the provider ensure the customer is not abusing the service or breaking the law? What can the provider do with customer data for research, service improvement, customer tracking or marketing purposes?

Which law applies? If a UK company keeps data about German citizens with a US-based cloud provider, stored in a data centre in India, who is able to access the data and under which laws?

What will the provider do in the event of a cyber incident? Will they inform their customers, and if so in what timescale? Will the customer be able to instigate their own incident handling and investigation process? 

As you can imagine, the big questions addressed on the previous slide are putting pressure on governments and industry. The UK government has tasked the NCSC, in conjunction with the industry, to come up with plans to address these major concerns. The cloud industry is being asked to provide assurance that its services are safe, secure and reliable. 

One approach to ensuring that a cloud service remains safe, secure, and reliable is to implement a Cloud Continuous Compliance Framework. An effective Continuous Compliance Framework needs to include controls in all three domains, or areas of concern: 

  • Prevention – make your cloud secure from the start. Create solid policies relating to your development process and include Continuous Integration/Delivery, or CI/CD, testing within your development timeline. This type of testing means that code is subject to rigorous evaluation prior to its release to the world. 
  • Detection – Monitor your cloud infrastructure regularly, and check where your data is actually stored (a brief sketch of such a check follows this list). 
  • Remediation – Where you have deployed serverless functions or containers, know how to remediate any problems that might occur with them.  
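As an example of the detection control mentioned above, a scheduled script could report where your cloud storage actually resides. The sketch below assumes AWS and the boto3 SDK purely for illustration (the course does not prescribe a provider); it lists S3 buckets and the region each one lives in.

    # Minimal detection sketch (assumes AWS, "pip install boto3", and
    # credentials already configured): report where each S3 bucket is stored.
    import boto3

    s3 = boto3.client("s3")
    for bucket in s3.list_buckets()["Buckets"]:
        name = bucket["Name"]
        # get_bucket_location returns None for buckets in us-east-1
        region = s3.get_bucket_location(Bucket=name)["LocationConstraint"] or "us-east-1"
        print(f"{name}: {region}")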

One way to ensure that you have a procedure for mitigating any problem is to write this into a runbook, which can be followed by anyone who needs to respond to any issue. 

Whilst it is good practice to have internal policies and procedures to help in securing your cloud, there are times when you will need to get some external validation. In some cases, it may be a statutory or regulatory requirement. 

A cloud security audit will be performed by an external third party, who will review all of the soft protections that are in place, such as policies and procedures, as well as the more technical protections such as firewalls or account security. 

Methods to achieve this can vary but can include investigating the methods used to access the cloud data, or the code that makes the cloud application actually work.  

The third party could conduct a penetration test against the application, looking to see if they can access data that should not be available to them or anyone else without the correct authorization.  

Any test like this would generally not be carried out against the cloud service provider themselves – they will arrange their own penetration tests to verify the security of their infrastructure. 

This brings us to the end of this video.

About the Author

Paul began his career in digital forensics in 2001, joining the Kent Police Computer Crime Unit. In his time with the unit, he dealt with investigations covering the full range of criminality, from fraud to murder, preparing hundreds of expert witness reports and presenting his evidence at Magistrates’, Family and Crown Courts. During his time with Kent, Paul gained an MSc in Forensic Computing and CyberCrime Investigation from University College Dublin.

On leaving Kent Police, Paul worked in the private sector, carrying on his digital forensics work but also expanding into eDiscovery work. He also worked for a company that developed forensic software, carrying out Research and Development work as well as training other forensic practitioners in web-browser forensics. Prior to joining QA, Paul worked at the Bank of England as a forensic investigator. Whilst with the Bank, Paul was trained in malware analysis, ethical hacking and incident response, and earned qualifications as a Certified Malware Investigator, Certified Security Testing Associate - Ethical Hacker and GIAC Certified Incident Handler. To assist with the team’s malware analysis work, Paul learnt how to program in VB.Net and created a number of utilities to assist with the de-obfuscation and decoding of malware code.
