The Docker Containers Webinar: on October 19, I held Part 1 of a three-part webinar series on Docker. For those of you who could not attend, this post summarizes the webinar material. It also includes some additional items I've added based on the Q&A session. Finally, I'll highlight some of the best audience questions and wrap up with our plans for Part 2. You can watch Part 1 here.
Docker & Containers
I’m guessing that you’ve heard about Docker by now. Docker is taking the industry by storm by making container technologies accessible to IT professionals. First of all, let’s start with the basics: What is Docker, what are Docker containers, and how are they related?
Docker is one part of the suite of tools provided by Docker Inc. to build, ship, and run containers. Docker containers start from Docker images. A Docker image includes everything needed to start a process in an isolated container: the application code, supporting libraries, and any other binaries the process requires. A Docker container is a running instance of a Docker image. Containers build on Linux kernel features such as cgroups (control groups) and namespaces (originally via LXC, Linux Containers) to fully isolate the container from other processes (or containers) running on the same kernel. That's a lot to unpack. Here's how Docker Inc. describes Docker:
“Docker containers wrap a piece of software in a complete filesystem that contains everything needed to run: code, runtime, system tools, system libraries – anything that can be installed on a server. This guarantees that the software will always run the same, regardless of its environment. — What’s Docker?“
Docker Containers Examples
Here are some real-world examples. Say your team is building several different web applications, which are most likely written in different languages. One application may be written in Ruby and another in Node.js.
Each application requires its own system packages and supporting libraries. Deploying these kinds of polyglot applications makes infrastructure more complex. Docker solves this problem by allowing each team to package the entire application as a Docker image. That image can then be used to start the application, and it runs the same regardless of the environment. The benefits are a clean hand-off between development and production (build images, then deploy them), development and production parity, and infrastructure standardization.
Best of all, each Docker container is fully isolated from the others, so engineers can allocate more or fewer compute resources (such as CPU, memory, or I/O) to individual containers. This ensures that each container has exactly the resources it requires.
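As a concrete illustration, here is a minimal sketch of what such an image definition might look like for the Node.js application mentioned above. The base image tag, the file names, and the port are assumptions for illustration, not details from the webinar:

```dockerfile
# Hypothetical Dockerfile for the Node.js web application example.
FROM node:18-alpine

WORKDIR /app

# Install dependencies first so this layer is cached between builds
COPY package.json package-lock.json ./
RUN npm ci --omit=dev

# Copy the application source and declare how to start it
COPY . .
EXPOSE 3000
CMD ["node", "server.js"]
```

Building this file with `docker build` produces an image that starts the same way on a laptop, a CI machine, or a production host.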
First-time users and those considering Docker (or any other container technology) ask the question: What’s the difference between containers and virtual machines?
Containers vs. Virtual Machines
Naturally, this question came up in the webinar. I answered it in the Q&A on the community forum as well:
“Docker’s about page has a good summary. Docker containers (and other container technologies) work by isolating processes and their resources using kernel features. This allows running multiple containers on a single kernel. Virtual machines are different: there are multiple independent kernels running on a single hypervisor, and each kernel sees a complete set of virtualized hardware. Nothing is virtualized in Docker’s case. Containers and VMs also have different compute footprints, notably in memory: containers use less memory because they don’t need to run an entire operating system. Finally, containers are also intended to run a single process, while virtual machines can run many processes. Docker focuses on “operating system” virtualization while virtual machines focus on hardware virtualization.”
In summary, containers run a single process, while virtual machines may run any number of processes. Containers share a single kernel; virtual machines run on a hypervisor. Containers require less memory because you don't need to allocate memory to a completely separate kernel. Both technologies allow resource control.
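The resource control mentioned above is exposed directly through `docker run` flags. This is an illustrative sketch; the image name is a placeholder:

```shell
# Illustrative only: cap a container at half a CPU core and 256 MB of RAM.
docker run --cpus="0.5" --memory="256m" my-web-app
```

The kernel's cgroups enforce these limits on the container's process, which is how multiple containers can safely share one host.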
Use Cases for the Development Phase
Building software is one of the most complex human activities. It is constantly changing and full of complications. Complexity multiplies when engineers use multiple languages and data stores. As a result, workflows become more complicated and bootstrapping new team members never goes as planned. Containers may be applied to the software development phase for drastic productivity increases, especially in polyglot teams. Docker is a great tool to leverage during the development phase. Here are some examples:
- Automating development environments. Say you're building a C program. You'll need a bunch of libraries and other tools installed on the system. This environment can be captured in a Dockerfile and committed to source control. As a result, every team member will have the same environment regardless of their own system.
- Managing data stores. Perhaps one project depends on database version A while another runs on version B. Running both versions side by side may not be possible with your package manager. However, it's trivial to start a container for each version and point each application at its own container.
- Improving cross-OS development. Consider a team using Linux, OSX, and Windows. Building the application natively on each platform creates many problems. Instead, if you package the application with a Dockerfile, every team member can always run the same thing.
- Development & production parity. Build and use an image in development. Then use it for staging and production. You can be certain that the same code is running the same way.
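The first point above can be sketched as a Dockerfile for the hypothetical C program. The base image, the package, and the build commands are illustrative assumptions:

```dockerfile
# Hypothetical development image for a C program that needs extra libraries.
FROM gcc:13

# Install the libraries the build requires (libcurl is just an example)
RUN apt-get update && apt-get install -y libcurl4-openssl-dev

WORKDIR /src
COPY . .

# Build and run the program; assumes the project ships a Makefile
RUN make
CMD ["./app"]
```

Committing this file to source control means a new team member's setup is one `docker build` away, on any host OS.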
Use Cases for the Deployment Phase
Building software is only half the battle. After we've built it, we've got to deploy it. This is where containers really shine. I'm a bit production-biased these days, so I'll list the most important (and my favorite!) point first:
- Standardizing deployment infrastructure. This one is massive! DevOps and traditional teams can build standardized infrastructure to run and scale any application. Even if a new language comes out, it’s no problem. Deploy with Docker and it doesn’t matter what’s inside the container. Running and orchestrating containers in production is the hottest topic right now. Watch this space.
- Isolating CI builds. CI systems can be fickle. Each project may change the machine in some way: You may need to install some random software or drop artifacts everywhere. Don’t even get me started on project dependencies. With containers, all of these problems are a thing of the past. Run each build in an ephemeral container and throw it away afterward. No fuss, no muss.
- Testing new versions. It's a happy day: the newest version of Language X was just released and it's time to migrate. You just want to test it out, so you set up a virtual machine to avoid breaking your existing setup. That is a resource-heavy and time-consuming process. Docker makes this easy. Simply change the image tag from language:x to language:y.
- Distributing software. You’ve just finished your tool in language X. Unfortunately, your tool has a ton of dependencies that your users may not be knowledgeable enough to install. Build a Docker image and push it to a Docker registry. Now anyone can pull down your image and run your software. This is especially nice for handing builds over to your QA team.
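The last point above can be sketched as a short command sequence. The registry user and image names are placeholders for illustration:

```shell
# Illustrative only: package the tool and its dependencies into an image
docker build -t myuser/mytool:1.0 .

# Publish it to a Docker registry
docker push myuser/mytool:1.0

# On any user's machine (only Docker is required):
docker run --rm myuser/mytool:1.0
```

Users never install the tool's dependencies; they pull one image and run it.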
Installation & Toolchain
Docker can be installed on Windows, OSX, and Linux systems. The Windows and OSX versions run the Docker daemon inside a small Linux virtual machine, and the Docker client is configured to talk to that virtual machine. On Linux, the distribution's package manager makes it easy to install Docker directly. Once Docker is installed, you can start using the larger Docker toolchain components.
Everything is built on top of the Docker Engine. The Docker Engine is the daemon running on a computer that manages all of its containers. The docker command is a client: it makes API requests to the Docker Engine over HTTP. This means that Docker follows a client/server model.
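One way to see the client/server split in action: on Linux the Engine's API listens on a Unix socket by default, so any HTTP client can talk to it, not just the docker CLI. This is an illustrative sketch, assuming a default local installation:

```shell
# Illustrative: query the Engine's HTTP API directly over its Unix socket.
# This returns the same version information as `docker version`.
curl --unix-socket /var/run/docker.sock http://localhost/version
```

The docker CLI is doing exactly this kind of request under the hood for every command you run.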
Next comes Docker Registry. The Docker Registry is an image store. Users can push images to the registry so that other users can pull images to their installations. Users may employ the official registry for distributing public images. Paid plans are available if you need private images. You can also host your own Docker registry. The Docker community maintains a set of official images, including those for databases like MySQL, PostgreSQL, MongoDB, and many languages. Odds are, there is an official image for your use case.
Docker Compose is a tool for developing and shipping multi-container applications based on a configuration file. You’ll definitely come into contact with this common tool. Docker Compose does all the heavy lifting and makes it easier to share and develop more complex applications.
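As a sketch, a two-container application (a web service plus a Redis cache) might be described like this. The service names, the image tag, and the port mapping are assumptions for illustration:

```yaml
# Hypothetical docker-compose.yml for a two-container application.
version: "2"
services:
  web:
    build: .          # build the image from the local Dockerfile
    ports:
      - "3000:3000"   # publish the web service on the host
    depends_on:
      - redis         # start the cache before the web service
  redis:
    image: redis:7    # official Redis image pulled from the registry
```

A single `docker-compose up` then builds, starts, and wires together both containers.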
Docker Machine is a tool for bootstrapping Docker hosts. A Docker host is a machine that runs the Docker Engine. Docker Machine can create machines on cloud providers like AWS, Azure, GCP, and Rackspace. It can also create “local” machines using VirtualBox or VMWare.
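A typical Docker Machine session might look like the following illustrative sketch; the machine name dev is a placeholder:

```shell
# Illustrative: create a local Docker host inside VirtualBox
docker-machine create --driver virtualbox dev

# Point the docker CLI at the new host's Engine
eval "$(docker-machine env dev)"

# Subsequent docker commands now run against the "dev" machine
docker info
```

Swapping the driver (for example to a cloud provider) provisions a remote host with the same workflow.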
It’s hard to cover these tools well in a text format. Therefore, I recommend that you check out the Introduction to Docker course or watch the demo in the webinar. Both of these resources demonstrate basic Docker functionalities and how to use Docker Compose to build a multi-container application.
Part 2: From Dev to Production
The first session introduced the Docker concept and how to develop applications using Docker. The next session will focus on deploying Docker applications. I’ll cover production orchestration tools and wrap up with a cool demo on creating a multi-stage application with Docker Compose and Docker Machine. The webinar is currently planned for November, so stay tuned for the announcement. I hope to see you there!