Docker Image Security: Get it in Your Sights

For organizations and individuals alike, the adoption of Docker is increasing exponentially with no signs of slowing down. Why is this? Because Docker provides a whole host of features that make it easy to create, deploy, and manage your applications. This useful technology is especially essential as your organization scales and your infrastructure grows. 

Want to jump right in and explore everything about Docker? Try the Cloud Academy learning path Docker in Depth. You’ll get courses, labs, quizzes, and exams — everything you need in one place, minus the coffee.

Docker in Depth learning path

Technically speaking, Docker is a set of PaaS (Platform-as-a-Service) products that use OS-level virtualization to deliver software in containers that communicate with one another. When these container ecosystems aren’t designed cautiously and managed with care, however, they can lead to some serious security issues.

While Docker offers a wide range of benefits, the security challenges that come with containerized environments shouldn’t be overlooked. To help you increase the security standards of your Docker-based containerized environments, we’ve outlined our best practices to maintain Docker image security below.

Docker Image Security Best Practices

Always Verify Images Prior to Using

By default, Docker allows you to pull Docker images without validating their authenticity first. This exposes you to Docker images with unverified author and origin details. To avoid this, use the following command to temporarily enable Docker Content Trust:

export DOCKER_CONTENT_TRUST=1

Always ensure that the image you are pulling is published by a reliable publisher and that no third parties have modified it. Docker-certified images are provided by trusted partners and curated by the official Docker Hub, and they are the safer choice for code that will reach your production environment. To add another layer of protection, Docker allows images to be signed using Docker Notary. Notary verifies the image signature and prevents an image from running if its signature is invalid.
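As a quick illustration, a session with Content Trust enabled might look like this (the image name is just an example, and the commands assume the Docker CLI is installed):

```shell
# Enable Docker Content Trust for this shell session only
export DOCKER_CONTENT_TRUST=1

# Guarded so the snippet is harmless on hosts without the Docker CLI
if command -v docker >/dev/null 2>&1; then
    # With DCT on, this pull fails unless the tag carries a valid signature
    docker pull nginx:latest

    # Inspect the signatures attached to a tag
    docker trust inspect --pretty nginx:latest
fi
```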

Efficiently Handle the Container Life Cycle

The way you handle the container life cycle largely determines your Docker image security. When updating a container, it is recommended not only to check the updated layer for security issues, but also to test the entire stack.

Establish a Thorough and Standardized Access Management Solution

When you have a strong access management solution for your Docker implementation — and really, across your cloud infrastructure — your containers can (and should) operate with minimal privileges and access in order to reduce risk. Organizations can use Role-Based Access Control (RBAC) and Active Directory solutions to manage permissions for the entire organization easily and effectively.

Find and Fix Open-Source Vulnerabilities

Open-source resources are extremely popular in Docker. The upside is that they are free to use and highly modifiable. The downside is potential undiscovered vulnerabilities, which can easily take your system down. Does that mean you shouldn’t use these free, readily available resources? Of course not! But do your due diligence first and use them with caution. One way to do this is with tools that continuously scan and monitor for vulnerabilities across all in-use Docker image layers, such as Snyk.
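For instance, with the Snyk CLI installed and authenticated, scanning all layers of an image might look like this (the image and Dockerfile names are examples):

```shell
# Skip gracefully on machines where the Snyk CLI isn't installed
if command -v snyk >/dev/null 2>&1; then
    # Test every layer of the image; --file lets Snyk suggest base-image fixes
    snyk container test nginx:latest --file=Dockerfile

    # Keep monitoring the image for newly disclosed vulnerabilities
    snyk container monitor nginx:latest
fi
```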

Limit the System Resources Consumed by Containers

Another good Docker security practice is to implement limits on the system resources that are to be consumed by containers. This not only helps in reducing performance impacts, but also lessens the risk of DoS (denial of service) attacks.
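As a sketch, the main resource-limit flags look like this (the image name "my-app" is hypothetical):

```shell
# --memory: hard cap on RAM
# --cpus: at most half a CPU core
# --pids-limit: caps the number of processes (a simple fork-bomb guard)
# Guarded so the snippet is harmless on hosts without the Docker CLI
if command -v docker >/dev/null 2>&1; then
    docker run -d \
        --memory="256m" \
        --cpus="0.5" \
        --pids-limit=100 \
        my-app:latest
fi
```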

No Root User by Default/Least Privileged Access

When no USER is specified in the Dockerfile, the container executes as the root user by default. This means the running container could potentially have root access on the Docker host. This is a problem! Running the application as root increases the exposed attack surface and makes the application more vulnerable to exploitation. To avoid this undesirable scenario, create both a dedicated user and a dedicated group in the Docker image. Then use the USER directive in the Dockerfile to ensure the container runs the application with the least privilege possible, as is best practice.
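A minimal sketch of this pattern on an Alpine base (the names "appuser" and "appgroup" are purely illustrative):

```dockerfile
FROM alpine:3.19

# Create a dedicated, unprivileged user and group
RUN addgroup -S appgroup && adduser -S appuser -G appgroup

WORKDIR /app
COPY --chown=appuser:appgroup . .

# From here on, everything runs without root privileges
USER appuser
CMD ["./start.sh"]
```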

Use a Linter

Using a linter not only helps you avoid common mistakes, but also establishes best-practice guidelines that can be followed in an automated, convenient way. One recommended linter for Docker is Hadolint, which parses a Dockerfile and generates warnings for anything that doesn’t follow its best-practice rules. It’s even more useful when used in conjunction with an Integrated Development Environment (IDE).
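Two common ways to run Hadolint, assuming a Dockerfile in the current directory: the native binary, or the official container image if you'd rather not install anything.

```shell
# Use the native binary if present, otherwise fall back to the
# official Hadolint container image
if command -v hadolint >/dev/null 2>&1; then
    hadolint Dockerfile
elif command -v docker >/dev/null 2>&1; then
    docker run --rm -i hadolint/hadolint < Dockerfile
fi
```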

Use Multi-Stage Builds and Secrets Manager

Even when deleted, sensitive data like an SSH private key or a token may still persist in the layer it was added to, due to caching. This poses a serious security risk. The solution is to keep secrets out of the Dockerfile by using multi-stage builds. Docker’s support for multi-stage builds lets you fetch and manage private data in an intermediate stage that is discarded later, ensuring that sensitive information never reaches the final image. You can also use a secrets manager for the same purpose.
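A minimal multi-stage sketch (a Go build is assumed purely for illustration): the build stage can hold tools and fetched private data, while the final image carries only the compiled artifact. For secrets specifically, BuildKit's `--mount=type=secret` is another option worth exploring.

```dockerfile
# Build stage: source, toolchain, and any fetched private data live
# only in this intermediate stage, which is discarded afterwards
FROM golang:1.22 AS build
WORKDIR /src
COPY . .
RUN go build -o /out/app .

# Final stage: only the compiled binary is carried over
FROM alpine:3.19
COPY --from=build /out/app /usr/local/bin/app
ENTRYPOINT ["app"]
```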

You should also be wary of a recursive copy. When you have files containing sensitive information in the folder, either remove them or use .dockerignore.
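A `.dockerignore` along these lines (the entries are examples) keeps common sources of leaked secrets and local clutter out of the build context:

```
# .dockerignore — exclude secrets and local clutter from the build context
.env
*.pem
id_rsa*
.git/
node_modules/
```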

Whenever Possible, Avoid ADD and Use COPY Instead

Copying files from the host into a Docker image at build time can be done with either the ADD or the COPY instruction. Both perform a recursive copy of local files, but ADD does more: it can also fetch resources from remote URLs and automatically extract local tar archives into the destination.

If you do reference resources by URL, fetch them over secured TLS connections (HTTPS) for enhanced security, and validate their source and integrity before using them.

When you ADD an archive, it is automatically extracted into the destination directory. This is behavior you generally do not want, because it raises the risk of “zip bomb” and “Zip Slip” vulnerabilities. COPY lets you separate fetching an archive from a remote location from unpacking it, handling each as an explicit step in its own layer, which also optimizes the image cache. Avoid ADD whenever possible, as it leaves you more exposed to attacks via remote URLs and malicious archives.
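The contrast can be sketched like this (the URL and checksum are placeholders): instead of letting ADD fetch and unpack in one opaque step, download over HTTPS, verify, and extract explicitly.

```dockerfile
FROM alpine:3.19

# Risky: ADD fetches remote resources and auto-extracts local tars
# ADD https://example.com/release.tar.gz /opt/

# Preferred: explicit fetch over HTTPS, checksum verification,
# deliberate extraction, cleanup
RUN wget -q https://example.com/release.tar.gz \
    && echo "<expected-sha256>  release.tar.gz" | sha256sum -c - \
    && tar -xzf release.tar.gz -C /opt \
    && rm release.tar.gz

# COPY does exactly one thing: copy local files into the image
COPY app-config.yml /etc/app/config.yml
```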

Yes to Minimal Base Images

When a project doesn’t require general system libraries or utilities, pick a minimal base image rather than a full-fledged operating system. According to Snyk’s 2019 State of Open Source Security report, many of the popular Docker images featured on Docker Hub contain known vulnerabilities. Minimal base images ship only the system libraries and tools needed to run the project, and therefore minimize the attack surface. Less attack surface means less risk.
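The difference is often a one-line change (version tags and the Python app are examples):

```dockerfile
# Full OS base: hundreds of packages you may never use
# FROM ubuntu:22.04

# Minimal base: just enough to run the application
FROM alpine:3.19
RUN apk add --no-cache python3
COPY app.py /app/app.py
CMD ["python3", "/app/app.py"]
```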

Docker Image Security Tips

Employ a Robust Third-Party Security Tool

When using containers from untrusted public repositories, it is vital to assess the degree of security risk first. Use a multi-purpose security tool that offers dev-to-production security features for assessing risk.

Pay Attention to Vulnerabilities

You should have a robust vulnerability management program that performs multiple checks during the entire container life cycle. It must include quality gates that help detect access issues, as well as any weak points that could be exploited, from development through production environments.

Monitor and Assess Container Activity

Monitoring the container ecosystem to detect and manage suspicious activity must not be overlooked. Container monitoring tools offer real-time reports, which help you respond quickly to security breaches.

Enable Docker Content Trust

Introduced in Docker 1.8, the Docker Content Trust feature helps in verifying the authenticity, integrity, and publication date of all Docker images from the Docker Hub Registry.

Use the Docker Bench for Security Script

To further secure your Docker server and containers, run the Docker Bench for Security script. It checks for a plethora of configuration best practices when deploying Docker containers.
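Per the project's README, the script can be cloned and run directly on the Docker host:

```shell
# Fetch and run Docker Bench for Security on the Docker host
# (guarded so the snippet is harmless where git or docker are absent)
if command -v git >/dev/null 2>&1 && command -v docker >/dev/null 2>&1; then
    git clone https://github.com/docker/docker-bench-security.git
    cd docker-bench-security
    sudo sh docker-bench-security.sh
fi
```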

Be Careful When Using a Web Server 

Always check the parameters carefully when using a web server and API for creating containers. This will prevent you from creating undesirable, or even harmful, containers.

Avoid Using the Default Bridge Network When Using a Single-Host App with Networking

When using a single-host app with networking, avoid the default bridge network. If you use the default bridge network and then publish a port, every container on that network becomes undesirably accessible. Other technical disadvantages also make it a non-recommended practice for production use.
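The alternative is a user-defined bridge network, sketched here (the network, container, and image names are illustrative):

```shell
# Create a user-defined bridge network and attach containers to it;
# guarded so the snippet is harmless on hosts without the Docker CLI
if command -v docker >/dev/null 2>&1; then
    docker network create app-net

    # These two containers can reach each other by name on app-net,
    # but are isolated from containers on the default bridge
    docker run -d --network app-net --name api my-api:latest
    docker run -d --network app-net --name db postgres:16
fi
```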

More Important Docker Image Security Tips:

  • When you only need to read from volumes, mount them as read-only. There are several ways to do this; choose the one that best fits your process and requirements.
  • Run other processes on your Docker server inside containers as well, rather than directly on the host.
  • Secure API endpoints with HTTPS or SSH when exposing a REST API.
  • Never store sensitive data in a container itself; keep it in volumes.
  • For serving HTTPS, use Let’s Encrypt certificates.
  • Keep Docker, your system libraries, and utilities up to date for the latest bug fixes and security enhancements.
  • Consider Docker Enterprise when dealing with multiple or large teams and/or many Docker containers.
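For the first tip above, the `:ro` volume suffix (and, where possible, a read-only root filesystem) looks like this; the image name and paths are hypothetical:

```shell
# Guarded so the snippet is harmless on hosts without the Docker CLI
if command -v docker >/dev/null 2>&1; then
    docker run -d \
        --read-only \
        -v /srv/app/config:/etc/app:ro \
        my-app:latest
fi
```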

In conclusion

Docker is, without a doubt, one of the best options for cloud-powered technologies and applications that are meant to be deployed quickly and scaled easily. To make the most of containerized environments, though, you need to double-check your security measures.

Hopefully, you will find the security practices and tips put together in this article useful. Always remember: it’s essential to continuously raise your security standards to keep your applications, data, and—most importantly—your clients safe at all times.

Written by

Simran Arora

Simran, born in Delhi, pursued a Bachelor's in Computer Science degree in India. Curious and passionate about technology, she continued her education to earn a Master's in Computer Science degree in Silicon Valley, California. Simran now works as a freelance technical content developer.

