Defining Development Environments
Working with Kubernetes Locally
Integrating GCP into our Development Workflow
Deploying to Google Cloud Platform
In this course, we will explore some of the tools available to build and manage development environments intended for deployment on Google Cloud Platform products. We will also demonstrate how to easily push builds from our local machine to Google-hosted services.
We will start the course by covering the different types of development environments and their purposes. We will touch briefly on popular software methodologies and frameworks as they relate to choices in number and type of development environments.
This course will focus on container-based application development environments, tools, and services. We will first walk through installing and using Docker and Kubernetes on your local machine. Then we will explore how to push projects to Google Cloud Run and Google Kubernetes Engine.
Writing applications using Kubernetes or Cloud Run can be further streamlined with Google Cloud Code, which provides direct IDE support for development on these platforms. We will examine how to install and use Google Cloud Code with Visual Studio Code.
- Understand the types of development environments and when to use them
- Install a container-based local development environment
- Add Google Cloud Code support to VS Code
- Push code from a local development environment and run it on Google Cloud Platform using:
  - Google Cloud Run
  - Google Kubernetes Engine
  - Google Deployment Manager
- Programmers interested in developing containerized applications on Google Cloud Platform
- Solo developers new to working on a development team
- Anyone preparing for the Google Professional Cloud DevOps Engineer certification
To get the most out of this course, you should:
- Have a Google Cloud Platform account
- Have Google Cloud SDK installed and initialized
- Be familiar with IAM role management for GCP resources
- Have Visual Studio Code, Python 3, and Git installed
Knowledge of Python would also be beneficial for scripting with GCP, but it's not essential.
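A quick way to confirm the tooling prerequisites is to check each one from a terminal. This is just a sketch; exact version output will differ per machine, and on Windows the Python launcher may be `python` rather than `python3`:

```shell
# Verify each prerequisite is installed and on the PATH
gcloud --version    # Google Cloud SDK
code --version      # Visual Studio Code
python3 --version   # Python 3 (use `python --version` on Windows)
git --version       # Git
```

If any command is not found, install that tool before continuing with the course.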
Working with containers can be more complicated and less intuitive than working with virtual machines, but can often provide superior scalability and resource efficiency. Let’s get started with containers by installing Docker on our local development machine.
We’ll be working on a Windows 10 machine, so first things first, we need to make sure Windows is up to date. Docker Desktop 3 uses Windows Subsystem for Linux 2 (WSL 2) to power its containers and no longer requires Hyper-V as a backend. WSL 2 is quickly becoming the standard for containerized development on Windows 10, with significant performance benefits over version 1, but it is only available on Windows 10 version 1903 or higher. While it’s possible to keep working with Docker on Hyper-V, a number of Visual Studio Code extensions for working with containers require WSL 2.
Now we can grab the Docker Desktop installer from docker.com and run it once it’s downloaded. The installer will enable the Windows components needed for WSL 2, assuming we’re on an up-to-date Windows 10 build that supports it. A reboot may be required to enable these components. After the installation completes, you may see a message about installing a WSL 2 kernel update; just follow the link in the popup to download and run the update installer.
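Once the installation (and any reboot) is done, a quick sanity check from PowerShell or Command Prompt confirms everything is in place. Note that `wsl --status` is only available in more recent WSL releases; on older builds, `wsl -l -v` serves a similar purpose:

```shell
# Confirm Docker Desktop's CLI is installed and on the PATH
docker --version

# Confirm the WSL kernel is installed and see which WSL version is the default
wsl --status
```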
Next, as strange as this sounds, we need to install a Linux distribution from the Microsoft Store. There are several distros available, and you can install and use them all side by side if you want, but we’ll be focusing on Ubuntu for this series because it is the distribution used by Google.
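As an alternative to the Store, recent Windows 10 builds with an up-to-date WSL let you install a distro directly from the command line. This is a sketch that assumes a WSL release new enough to support the `--install` flag:

```shell
# Show distros available for command-line installation
wsl --list --online

# Install Ubuntu without going through the Microsoft Store
wsl --install -d Ubuntu
```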
If we take a look in the Docker settings now, we should see the “Use the WSL 2 based engine” option enabled. We might still find a message under the WSL Integration settings telling us that we don’t have a WSL 2 distro, though. We can run `wsl -l -v` to list all installed Linux distros and display which WSL version each uses. We can see our Ubuntu distro is on version 1, so we can upgrade it with `wsl --set-version Ubuntu 2`. We can then run `wsl --set-default-version 2` to make sure we are using WSL 2 by default in the future. Since Ubuntu is the main distro we’ll be using with Google, we can also set it as our default distro with `wsl --set-default Ubuntu`. A quick Docker restart, and we can see Ubuntu is now available as a WSL Integration in Docker.
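The steps above can be summarized as the following sequence, run from PowerShell or Command Prompt (this assumes the distro shows up as `Ubuntu` in the output of the first command):

```shell
# List installed distros and the WSL version each one uses
wsl -l -v

# Convert the Ubuntu distro from WSL 1 to WSL 2
wsl --set-version Ubuntu 2

# Make WSL 2 the default for any distros installed later
wsl --set-default-version 2

# Make Ubuntu the default distro
wsl --set-default Ubuntu
```

The conversion step can take a few minutes for a large distro; the other commands return immediately.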
Docker Desktop has its own implementation of Kubernetes built in, though it is not enabled by default. For our purposes, we can leave it disabled. Later in this course we will be working with minikube on our development machine, which handles Kubernetes orchestration on its own without needing this feature turned on. Enabling it would also install Docker Desktop’s own instance of kubectl, which could cause PATH conflicts and confusion later when we install the same tool using the Google Cloud SDK.
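When we do install kubectl later, it will come in as a Cloud SDK component. As a preview (a minimal sketch, assuming the Google Cloud SDK is already installed and initialized):

```shell
# Install kubectl as a Google Cloud SDK component
gcloud components install kubectl

# On Windows, list every kubectl found on the PATH to spot conflicts
where kubectl

# Confirm which version the PATH resolves to
kubectl version --client
```

If `where kubectl` lists more than one entry, the first one wins, which is exactly the kind of PATH confusion that leaving Docker Desktop’s Kubernetes disabled helps avoid.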
We now have Windows Subsystem for Linux 2 and Docker 3 installed, and we are ready to start deploying containers on our development machine. In the next video, we’ll learn a little more about container management, why exactly we need Kubernetes, and what Kubernetes actually does.
Arthur spent seven years managing the IT infrastructure for a large entertainment complex in Arizona where he oversaw all network and server equipment and updated many on-premise systems to cloud-based solutions with Google Cloud Platform. Arthur is also a PHP and Python developer who specializes in database and API integrations. He has written several WordPress plugins, created an SDK for the Infusionsoft API, and built a custom digital signage management system powered by Raspberry Pis. Most recently, Arthur has been building Discord bots and attempting to teach a Python AI program how to compose music.