Defining Development Environments
Working with Kubernetes Locally
Integrating GCP into our Development Workflow
Deploying to Google Cloud Platform
In this course, we will explore some of the tools available to build and manage development environments intended for deployment on Google Cloud Platform products. We will also demonstrate how to easily push builds from our local machine to Google-hosted services.
We will start the course by covering the different types of development environments and their purposes. We will touch briefly on popular software methodologies and frameworks as they relate to choices in number and type of development environments.
This course will focus on container-based application development environments, tools, and services. We will first walk through installing and using Docker and Kubernetes on your local machine. Then we will explore how to push projects to Google Cloud Run and Google Kubernetes Engine.
Writing applications using Kubernetes or Cloud Run can be further streamlined with Google Cloud Code, which provides direct IDE support for development on these platforms. We will examine how to install and use Google Cloud Code with Visual Studio Code.
- Understand the types of development environments and when to use them
- Install a container-based local development environment
- Add Google Cloud Code support to VS Code
- Push code from a local development environment and run on Google Cloud Platform using:
- Google Cloud Run
- Google Kubernetes Engine
- Google Deployment Manager
- Programmers interested in developing containerized applications on Google Cloud Platform
- Solo developers new to working on a development team
- Anyone preparing for the Google Professional Cloud DevOps Engineer certification
To get the most out of this course, you should:
- Have a Google Cloud Platform account
- Have Google Cloud SDK installed and initialized
- Be familiar with IAM role management for GCP resources
- Have Visual Studio Code, Python 3, and Git installed
Knowledge of Python would also be beneficial for scripting with GCP, but it's not essential.
In the previous lecture group, we learned how to use Visual Studio Code to edit and test our containerized Python application in our development environment. We also connected VS Code to our GCP account and deployed an application for testing on both GKE and Cloud Run.
We are now ready to push our application to a live production environment, but first we need to get serious about version control. We need to clearly define the software development life cycle for our project so changes flow smoothly from start to finish.
For our simple demo project, we only needed DEVELOPMENT, STAGING, and PRODUCTION environments. We have already seen how to work with our project on our local machine, and how to deploy our project to run on GCP.
By using a source repository and tagging our revisions, we can easily identify which version is still in development, which build is being tested in staging, and which build is our current stable production version. Tagging our builds this way ensures each environment runs its own distinct version of our project's code.
We can safely make changes to our development version without impacting our staging or production environments until we are ready to update versions there.
Since we are working with GCP, Google Cloud Source Repositories is the obvious choice for tracking our code changes. If you're familiar with Git, then you basically already know how to use Cloud Source Repositories, as it's essentially a private Git repository with built-in integrations with other Google Cloud products.
It can also connect to GitHub or Bitbucket, so you can still benefit from the deployment automation features in Google Cloud Source Repositories even if you prefer to host your code elsewhere.
We won’t spend too much time talking about Git here, but we can use branching with Git to further help us keep our code organized between our different environments and trigger automation events in our deployment process.
For more complex applications with larger development teams, you will likely use a combination of both branches and tags to keep code revisions organized. First, let’s just get our Cloud Source Repository connected to VS Code.
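As a sketch, the tag-and-branch workflow described above might look like this (the branch and tag names here are illustrative, not prescribed):

```shell
# Create a long-lived branch for staging work (name is illustrative)
git checkout -b staging

# Mark the build currently being tested in staging
git tag -a v1.2.0-rc1 -m "Release candidate for staging"

# After testing passes, tag the same commit as the stable production release
git tag -a v1.2.0 -m "Stable production release"

# Push the branch and tags so deployment triggers can see them
git push origin staging --tags
```

Annotated tags like these are what a Cloud Build trigger can later match on to kick off an automated deployment.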
We don’t have any version control in place for our project currently, so let’s click “Initialize Repository”. You will need Git installed locally for Visual Studio Code’s source control features to work. Next, we need to head over to our GCP account, find “Source Repositories”, then click “Add repository”.
We’ll select “create new repository”, then continue here. Then we’ll give a name to our repository, and assign it to a GCP project. Next we’ll be given some authentication options to get our code into our new repository. We’ll follow the instructions to manually generate credentials for Git on our local machine here.
Once that’s done, we can grab the URL for our new repository, go back to VS Code, and add it as a remote in our source control there. Now we can commit our changes, push, and if we refresh our browser, we can see our repository is now there on Cloud Source Repositories.
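Under the hood, the steps we just walked through map onto a few Git commands. This sketch uses the gcloud credential helper (one of the authentication options; the console's manually generated credentials work too), and the project ID, repository name, and remote name are placeholders:

```shell
# Initialize version control locally (what VS Code's "Initialize Repository" does)
git init

# Use the Cloud SDK as Git's credential helper so pushes can authenticate
git config credential.helper gcloud.sh

# Add the Cloud Source Repositories URL as a remote
# (PROJECT_ID and REPO_NAME are placeholders for your own values)
git remote add google https://source.developers.google.com/p/PROJECT_ID/r/REPO_NAME

# Commit and push to the new repository
# (the default branch may be master or main depending on your Git version)
git add .
git commit -m "Initial commit"
git push google master
```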
We can run Google Cloud Build using our Cloud Source Repository to build and deploy our application to production automatically, which we’ll demonstrate in the next lecture. Depending on the scope of the application though, it may be more convenient or cost-effective to build our containers locally, then push them to Google Artifact Registry instead.
Creating a new repository on Artifact Registry is much the same as on Cloud Source Repositories. Artifact Registry is replacing Google Container Registry, as it has expanded support for many more artifact types.
Artifact Registry is very new and not well documented yet though, so there isn’t a lot of good information available on how to get it connected to your IDE.
Even the instructions provided by Google here aren’t very clear, and there’s a lot of confusion about authenticating with the repository to get started. There’s a very simple way to get this working though: we can use gcloud auth login to authenticate with our user credentials when running Cloud SDK commands on our machine.
As long as our user account has the correct IAM permissions on the GCP project, we no longer have to worry about service accounts or access tokens on our development machine. Then we just need to run the gcloud auth configure-docker command and pass it a list of any regions we want to be able to use. This registers Docker on our computer to use our Google credentials when interacting with these registries.
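Put together, the two commands look like this (us-central1 and us-east1 are example regions; list whichever Artifact Registry regions you plan to push to):

```shell
# Authenticate the Cloud SDK with your own user account
gcloud auth login

# Register Docker to use those Google credentials for specific
# Artifact Registry hosts (comma-separated list of regional hostnames)
gcloud auth configure-docker us-central1-docker.pkg.dev,us-east1-docker.pkg.dev
```

configure-docker writes an entry to your Docker config so that pushes and pulls against those hostnames are authenticated automatically.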
Artifact Registry doesn’t show up as a connected registry in the Docker extension in VS Code, but we are still able to interact with it now. Let’s go to our Dockerfile and build our image, including the full address of our new repository in the tag name.
Now if we push that build, we can see Docker has connected to our Artifact Registry repository and is now pushing our container image to Google. If we hop back over to our browser window afterwards, we can see our container image is now in the Artifact Registry!
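As a sketch, building and pushing with a fully qualified Artifact Registry tag might look like this (the project, repository, and image names are placeholders):

```shell
# Tag format: REGION-docker.pkg.dev/PROJECT_ID/REPOSITORY/IMAGE:TAG
docker build -t us-central1-docker.pkg.dev/PROJECT_ID/my-repo/my-app:v1 .

# Push the image; Docker uses the Google credentials registered
# earlier with gcloud auth configure-docker
docker push us-central1-docker.pkg.dev/PROJECT_ID/my-repo/my-app:v1
```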
In the next video, we’ll complete our CI/CD pipeline by creating a Google Cloud Build Trigger to automatically update our production environment to our latest stable app version.
Arthur spent seven years managing the IT infrastructure for a large entertainment complex in Arizona, where he oversaw all network and server equipment and migrated many on-premises systems to cloud-based solutions on Google Cloud Platform. Arthur is also a PHP and Python developer who specializes in database and API integrations. He has written several WordPress plugins, created an SDK for the Infusionsoft API, and built a custom digital signage management system powered by Raspberry Pis. Most recently, Arthur has been building Discord bots and attempting to teach a Python AI program how to compose music.