This course is designed to give you a solid understanding of containers and how they are used in Azure DevOps. It begins by looking at how Docker containers are used to create deployable images, at microservices, and at the various container-related services available in Azure, including Azure Container Instances, Azure Kubernetes Service, Azure Container Registry, Azure Service Fabric, and Azure App Service.
The course also looks at Dockerfiles and Docker multi-stage builds before finishing with a hands-on demonstration of how to create an Azure Container Registry.
For any feedback relating to this course, please contact us at support@cloudacademy.com.
Learning Objectives
- Learn about Docker and its role in deploying containerized apps
- Understand how microservices can be used for deploying apps
- Learn about the container-related services available in Azure
- Learn about using multi-stage builds when working with Docker
- Gain a practical understanding of how to create an Azure Container Registry
- Gain a practical understanding of how to add Docker support to an application
Intended Audience
This course is intended for DevOps professionals who wish to learn how to use containers to design and implement strategies for developing application code and infrastructure that allow for continuous integration, testing, delivery, monitoring, and feedback.
Prerequisites
To get the most from this course, you should have a basic understanding of the Azure platform and of container concepts.
Hi there. Welcome to Microservices and Containers. Although containers are most commonly used to simplify DevOps by streamlining developer-to-test-to-production flows, there are other uses for them. Microservices, for example, are a quickly growing use case for containers.
The term “Microservices” refers to an application development strategy where each part of an application is actually deployed as a completely self-contained component (or microservice). The microservices that comprise an application can then be individually scaled and updated.
The easiest way to explain the concept of microservices is to use an example scenario.
Let’s imagine for a second that your organization is the author of a large, monolithic tax application. As part of the development of the next revision of the software, your organization wants to migrate it to a collection of microservices.
Now, the current app might include a piece of code that performs a specific tax calculation in certain circumstances – and this code may exist in several spots within the app. Whenever tax laws are passed or changed, the tax calculation has to be updated, which means the same change has to be made everywhere that calculation is performed within the app.
By moving to a collection of microservices, the application could instead publish a notification whenever a tax calculation needs to be made in response to some scenario. The microservices involved in that calculation subscribe to those notifications and then do whatever they need to do to perform the calculation. You’d likely have one specific microservice that does the actual calculation.
Whenever a new tax law is enacted that affects tax calculations, you would only need to update the microservice(s) responsible for the calculations. You wouldn’t need to make changes throughout the app code base.
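To make the publish/subscribe idea more concrete, here is a minimal, self-contained Python sketch (not taken from the course). The topic name, handler, and tax rate are all hypothetical; in a real deployment each subscriber would run in its own container, and the notifications would travel over a message broker such as Azure Service Bus rather than an in-memory dictionary.

```python
# A minimal, in-process sketch of the notification idea described above.
# Hypothetical names throughout; in a real deployment each handler would run
# in its own container and notifications would travel over a message broker
# (for example, Azure Service Bus) instead of a Python dictionary.

from collections import defaultdict

subscribers = defaultdict(list)  # topic name -> list of handler functions

def subscribe(topic, handler):
    """Register a handler (standing in for a microservice) for a topic."""
    subscribers[topic].append(handler)

def publish(topic, message):
    """Notify every handler subscribed to the topic."""
    for handler in subscribers[topic]:
        handler(message)

def tax_calculation_service(message):
    """The one place the tax rules live; a law change is a change here only."""
    rate = 0.07  # hypothetical flat rate for illustration
    tax = round(message["amount"] * rate, 2)
    print(f"Tax on order {message['order_id']}: {tax}")

subscribe("tax.calculation.requested", tax_calculation_service)

# Elsewhere in the application, some scenario triggers a calculation request:
publish("tax.calculation.requested", {"order_id": "A-1001", "amount": 250.00})
```

Because only the tax-calculation service knows the tax rules, a change in tax law means updating that one subscriber; the parts of the app that publish the request don’t change at all.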
While you might run a development or testing environment on a single server, or with a single instance of each microservice, production is typically a different story. There, you’ll likely want to scale out to multiple instances across a cluster of servers rather than running everything on one machine, and to scale back in when the load drops. You may also want different teams within your development department to be able to independently work on, and update, the microservices they’re responsible for.
This is where microservices can really shine. By leveraging Docker containers for more complex microservice-based applications, organizations can become more agile, because those microservices can be quickly scaled out and in to meet the load on the application. At the same time, the resource and namespace isolation that containers offer ensures that one microservice instance doesn’t interfere with any other instance. This makes it easier to design a solid microservice architecture, which in turn allows organizations to handle the management, deployment, orchestration, and patching needs of container-based services while limiting the risks to availability.
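As a rough illustration of scale-out (again, not from the course), the sketch below uses the Docker SDK for Python to start several identical instances of a single containerized microservice. The image name tax-calculator:latest is hypothetical, and in practice an orchestrator such as Azure Kubernetes Service would manage the number of replicas for you.

```python
# Sketch: scaling one containerized microservice out to several instances.
# Requires the Docker SDK for Python ("pip install docker") and a running
# Docker daemon. "tax-calculator:latest" is a hypothetical image name.

import docker

client = docker.from_env()

DESIRED_INSTANCES = 3  # raise to scale out, lower to scale in

for i in range(DESIRED_INSTANCES):
    container = client.containers.run(
        "tax-calculator:latest",     # hypothetical microservice image
        name=f"tax-calculator-{i}",  # each instance is isolated by Docker
        detach=True,                 # run in the background
    )
    print(f"Started instance {container.name}")
```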
Tom is a 25+ year veteran of the IT industry, having worked in environments as large as 40k seats and as small as 50 seats. Throughout the course of a long and interesting career, he has built an in-depth skillset that spans numerous IT disciplines. Tom has designed and architected small, large, and global IT solutions.
In addition to the Cloud Platform and Infrastructure MCSE certification, Tom carries several other Microsoft certifications. His ability to see things from a strategic perspective allows Tom to architect solutions that closely align with business needs.
In his spare time, Tom enjoys camping, fishing, and playing poker.