Anthos is an enterprise-grade solution from Google aimed at modernizing and unifying your entire server infrastructure, wherever it currently exists. Anthos encompasses a broad spectrum of components, yet it's still quite new, so good documentation and training material remain scarce. All of this can make Anthos seem daunting to learn, but this course aims to show you that the very purpose of Anthos is to simplify your infrastructure complexities for you.
Learning Objectives
- Understand what Anthos is and does
- Identify how Anthos fits in with other existing hybrid and multi-cloud solutions
- Investigate options to modernize existing infrastructure configurations to use Anthos
- Learn about the key components that make up Anthos, and how to configure them
- Build and test a modern microservice application for Anthos on GCP
- Create a CI/CD pipeline for deploying to Anthos
Intended Audience
- Developers interested in learning about the latest in modern cloud-based development strategies
Prerequisites
- Familiarity with Kubernetes and GKE
- Have a Google Cloud Platform account
- Have the Google Cloud SDK installed and initialized
- Have Git installed
It is also highly recommended that you have Docker Desktop and Visual Studio Code installed.
Creating a CI/CD pipeline for Anthos with Google Cloud Build. In the previous lecture, we learned how easily we can create and test a new microservice locally, then deploy it to Cloud Run for Anthos without even leaving our IDE using the Google Cloud Code extension. In this final lecture, we'll complete our DevOps pipeline by adding version control to our project, and automating deployment of new builds to a production environment whenever we create new versions of our application code.
To start this process, first we need to add version control to our project source code. If you already have Git installed, enabling version control on your project in VS Code is as simple as clicking here, then clicking on Initialize Repository. Let's save our changes, then commit to our local repository. Now that we have a working Git repository on our development machine, we need to create a repository on GCP where we can push new versions of our code. We could navigate through the Google Cloud Console to set up a Cloud Source Repository, or we can do it more quickly from the terminal.
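For those working outside VS Code, the same initialization can be done from the terminal. This is a minimal sketch; the directory name, file, and commit identity are placeholders for illustration:

```shell
# Create a throwaway project directory for illustration
mkdir -p my-service
cd my-service
echo "# My Service" > README.md

# Initialize a local Git repository, stage everything, and make the first commit
git init
git add .
git -c user.name="Demo User" -c user.email="demo@example.com" commit -m "Initial commit"
```

In VS Code, clicking Initialize Repository runs the equivalent of `git init` behind the scenes; committing from the Source Control panel performs the `git add` and `git commit` steps.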
First, we need to make sure that the services we need are enabled on our project with these commands. Then we can authenticate our local Git installation with our Google account using this credential helper command. Now, let's create a repository on Google Cloud Source Repositories with this command. Next, we add it as a remote to our local repository, then push our local repository code to our new remote repository with these commands. Be sure to change the project ID here to match your own project. Now that this is set up, we can also issue these push commands to our remote repository directly from VS Code.
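As a sketch, the terminal commands for these steps might look like the following. The project ID `my-project-id` and repository name `my-app-repo` are placeholders you would replace with your own values:

```shell
# Enable the APIs needed for Cloud Source Repositories and Cloud Build
gcloud services enable sourcerepo.googleapis.com cloudbuild.googleapis.com

# Authenticate Git with your Google account via the gcloud credential helper
git config --global credential.https://source.developers.google.com.helper gcloud.sh

# Create a new repository on Google Cloud Source Repositories
gcloud source repos create my-app-repo

# Add the new repository as a remote, then push the local code to it
git remote add google https://source.developers.google.com/p/my-project-id/r/my-app-repo
git push --all google
```

The remote name `google` is just a convention; any name works as long as you use it consistently in later push commands.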
If we go back to the Google Cloud Console and check our Cloud Source Repositories, we can see our application code is now there. Now that we have our source code connected to Cloud Source Repositories, we can create a Cloud Build trigger that monitors this repository for changes and performs some actions when a new version number is detected. We just have to navigate to Cloud Build in the Google Cloud Console, click on Create trigger, and fill out a few fields here. We're telling Cloud Build that when a new version number is detected in our Cloud Source Repository, it should read the configuration files in that repository and build a new Docker container for our application.
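If you prefer the command line over the console, a trigger like this can also be created with `gcloud`. This is only a sketch; the repository name, tag pattern, image name, and Dockerfile path are assumptions you would adjust to match your project:

```shell
# Create a Cloud Build trigger that fires on new version tags (e.g. v1.0.0)
# and builds an image from the Dockerfile at the repository root
gcloud beta builds triggers create cloud-source-repositories \
  --repo=my-app-repo \
  --tag-pattern="v.*" \
  --dockerfile=Dockerfile \
  --dockerfile-image=gcr.io/my-project-id/my-service
```

Once the trigger exists, pushing a new tag (for example, `git tag v1.0.1` followed by `git push google --follow-tags`) should kick off a build automatically.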
Let's go back to VS Code now and test our new pipeline. Let's save and commit our changes, add a new tag to our version, then use the Git: Push, Follow Tags command to push our changes to our Google Source Repository. Now, if we go back to the Cloud Console, we can see our new tagged version is in our repository, and our Cloud Build trigger has detected this and created a new container build automatically. This trigger only builds our new container image, though; it doesn't actually deploy it to Cloud Run for us. Let's go back to VS Code and add a cloudbuild.yaml file to our project. You'll need to update this file to match your own service name, cluster, and cluster location.
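A cloudbuild.yaml for this kind of build-and-deploy pipeline might look roughly like the sketch below. The service name `my-service`, cluster name `my-cluster`, and location `us-central1-a` are placeholders; `$PROJECT_ID` and `$TAG_NAME` are substitutions Cloud Build fills in at build time:

```yaml
steps:
  # Build the container image, tagged with the Git tag that triggered the build
  - name: 'gcr.io/cloud-builders/docker'
    args: ['build', '-t', 'gcr.io/$PROJECT_ID/my-service:$TAG_NAME', '.']

  # Push the image to Container Registry
  - name: 'gcr.io/cloud-builders/docker'
    args: ['push', 'gcr.io/$PROJECT_ID/my-service:$TAG_NAME']

  # Deploy the pushed image to Cloud Run for Anthos on the GKE cluster
  - name: 'gcr.io/cloud-builders/gcloud'
    args:
      - 'run'
      - 'deploy'
      - 'my-service'
      - '--image=gcr.io/$PROJECT_ID/my-service:$TAG_NAME'
      - '--platform=gke'
      - '--cluster=my-cluster'
      - '--cluster-location=us-central1-a'

images:
  - 'gcr.io/$PROJECT_ID/my-service:$TAG_NAME'
```

The `images` section tells Cloud Build which built images to record in the build results after all steps complete.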
Next, we need to update our build trigger and tell it that instead of just building a container from the Dockerfile in our repository, it should follow the instructions in our cloudbuild.yaml file whenever a new version is detected. Let's save and commit our changes and push to our remote repository, and we can see our updated trigger running. If this build fails the first time, you probably need to enable service account permissions for Cloud Build, which we can do from the settings here. Let's try pushing a version once more, and we can see Cloud Build now deploying our container to Cloud Run for Anthos on our GKE cluster.
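If you'd rather grant those permissions from the terminal instead of the Cloud Build settings page, something along these lines may work. The project ID and project number shown are placeholders, and the exact roles required can vary by deployment target:

```shell
# Allow the Cloud Build service account to deploy workloads to the GKE cluster.
# The service account is named <PROJECT_NUMBER>@cloudbuild.gserviceaccount.com.
gcloud projects add-iam-policy-binding my-project-id \
  --member="serviceAccount:123456789012@cloudbuild.gserviceaccount.com" \
  --role="roles/container.developer"
```

Enabling the corresponding toggles on the Cloud Build settings page applies equivalent role bindings for you.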
Arthur spent seven years managing the IT infrastructure for a large entertainment complex in Arizona, where he oversaw all network and server equipment and migrated many on-premises systems to cloud-based solutions on Google Cloud Platform. Arthur is also a PHP and Python developer who specializes in database and API integrations. He has written several WordPress plugins, created an SDK for the Infusionsoft API, and built a custom digital signage management system powered by Raspberry Pis. Most recently, Arthur has been building Discord bots and attempting to teach a Python AI program how to compose music.