CI/CD: Integrating with GitHub and Google Cloud
Putting it all Together
This course explores how to build microservices in Go and deploy them on Kubernetes to create reusable components that can be fully managed in the cloud. We'll talk about what a microservice is and its overall architecture. We'll then take a look at the Go programming language, its benefits, and why it works well for building microservices.
You'll also get an introduction to Kubernetes, including what it is, what it's used for, and the key components needed to take a microservice from code to being exposed on the internet.
We'll then combine these three topics in an example use case where we'll build a microservice in Go and deploy it on Kubernetes. Finally, we'll look at CI/CD integration with GitHub and Google Cloud and how you can automate your deployments.
- Learn about microservices and their overall architecture
- Learn about the Go programming language and why it's good for building microservices
- Understand how Kubernetes can be used to deploy microservices
- Learn about CI/CD with GitHub and Google Cloud
This course is intended for engineers throughout the tech stack or anyone who wants to get their feet wet in DevOps and learn how programs can be managed in the cloud.
There are no essential prerequisites for this course. However, we recommend that you have:
- Experience with at least one high-level programming language, such as Java, Python, or Ruby
- A conceptual understanding of Linux containers and/or Docker
Our final topic for this course is CI/CD, or continuous integration/continuous delivery, with GitHub and Google Cloud. Fortunately, these two work well together and make it easy to get connected and up and running.
There will be times when you're manually updating Kubernetes and it feels as if you're spending more time waiting for instances to be created or destroyed than on actual development. You also run the risk of pushing to the wrong environment, which can cause you even more problems.
So what's the process if you need to make a change? What happens when other developers extend the code and need to push an update? If a dedicated team is working on a microservice, you could have multiple commits a day that need to be deployed and tested, and that can get very complicated, very quickly. We can make our lives easier by adding some automation and leveraging continuous delivery.
If you're new to continuous integration/continuous delivery, or CI/CD, this is the process that bridges the gap between development and operations, or DevOps. CI/CD is an automated pipeline you create to get your application from code to deployed. You'll usually have CI/CD handle the following tasks: fetching the code base, building the code, running tests, tagging a release, deploying the application, and handling operations and monitoring. Then the cycle repeats itself.
You'll find a lot of different CI/CD tools available, ranging from open-source solutions such as Jenkins to paid solutions like Travis CI or CircleCI. The CI/CD product available on Google Cloud is called Cloud Build, and as you might expect, it ties in nicely with the other products Google Cloud offers. This is great for us, since we're using Google Cloud as our main cloud provider. But if your tech stack is located elsewhere, Cloud Build also allows you to work across multiple cloud providers.
Cloud Build lets you build and deploy across different programming languages, such as Go, Python, or Node.js, so you can easily build all the components of your application. It also allows you to fully customize each of your pipelines. The builds are quite fast, and once you've set everything up, the automation takes care of the rest.
To get connected, it's best to already be logged in to your GitHub account before starting this procedure, because you'll be initiating the connection from the Google Cloud side. Start by navigating to the left-side menu and, from the list of products, select Cloud Build; it'll be under Tools. It will ask you to set up build triggers. A trigger, as the name suggests, is fired off by GitHub to inform Cloud Build that something has happened.
You can create build triggers to customize which builds run on specific repository events. For example, you can set up build triggers to execute only on pull requests, pushes to master, or release tags. You can take this one step further by specifying a different build configuration for each trigger, letting you customize which build steps run depending on the branch, tag, or pull request the change was made to.
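As a rough sketch, a push-to-master trigger like the one just described can also be expressed in the YAML format accepted by `gcloud builds triggers import`. The owner and repository names below are placeholders, not values from the course:

```yaml
# Hypothetical Cloud Build trigger definition, importable with
# `gcloud builds triggers import --source=trigger.yaml`.
# Owner/repo names are placeholders.
name: build-on-master-push
github:
  owner: example-org          # GitHub organization or user (placeholder)
  name: example-repo          # repository name (placeholder)
  push:
    branch: ^master$          # fire only on pushes to master
filename: cloudbuild.yaml     # build config to run for this trigger
```

A tag-based trigger would use `tag:` instead of `branch:` under `push:`, mirroring the release-tag scenario mentioned above.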
To make it even more flexible, Cloud Build allows you to customize when it runs based on which files changed: there's no need to build if only documentation was updated. To get started, select Triggers from the Cloud Build menu. We're actually going to skip creating a trigger for the moment, since we still need to connect our GitHub repositories.
At the top of the page, you'll see Manage Repositories. In the middle of the page, click the button to Connect Repository. Follow the steps by selecting GitHub, authorizing Cloud Build to access your repositories, and finally choosing which repositories you want to have built automatically.
Let's take a look at an example of what the build pipeline might look like. This is an example workflow based on the timezone example we did earlier. You'll notice that in this example it's been expanded to use a production and a test instance, and to show multiple branches being developed on.
This shows how you can customize the builds to build and push to the Container Registry on any branch update, but only deploy to the QA and prod clusters when changes are pushed to specific branches. However, regardless of which branch is pushed to GitHub, Cloud Build will always run the build and test processes and push to the Container Registry.
So how do we actually build something? This can be done in one of two ways: a Dockerfile or a cloudbuild.yaml file in the root of your repository. If you have a very simple build process, you can tell Cloud Build to just use a Dockerfile. If you have a more complex setup, or really just need more than one step, you'll want to use the cloudbuild.yaml file. This is an example of the build process for one of my projects.
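For the simple end of that spectrum, a minimal one-step cloudbuild.yaml that just builds the repository's Dockerfile might look something like this. The image name `my-service` is a placeholder; `$PROJECT_ID` and `$SHORT_SHA` are substitutions Cloud Build fills in automatically for triggered builds:

```yaml
# Minimal hypothetical cloudbuild.yaml: build the repo's Dockerfile
# and push the resulting image. "my-service" is a placeholder name.
steps:
  - name: 'gcr.io/cloud-builders/docker'
    args: ['build', '-t', 'gcr.io/$PROJECT_ID/my-service:$SHORT_SHA', '.']
images:
  - 'gcr.io/$PROJECT_ID/my-service:$SHORT_SHA'   # pushed after the steps succeed
```

Anything beyond this single step is where the cloudbuild.yaml approach starts to pay off.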
In the example below, the first few steps are purely preparation. The first step sets up environment variables to be used throughout the entire build process. The second step grabs SSH keys from Google Cloud Secret Manager. The remaining steps pull down the Docker images needed for the build; the actual building of the application is done through a Dockerfile, with those steps running inside a Docker image that contains the Docker runtime. The final step pushes the container to the registry and informs Kubernetes that there's a change and it needs to be updated.
Even though the entire build procedure is done using the Dockerfile, we're still leveraging Cloud Build to prep everything we need for a successful build; in this case, grabbing SSH keys from Secret Manager and setting up environment variables. This can easily be expanded to run tests using your favorite test suite or to add notifications to your Slack channel.
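The actual build file from the project isn't reproduced here, but a hedged sketch of the steps just described — fetching a secret, building via the Dockerfile, pushing, and updating Kubernetes — could look roughly like this. The service, cluster, zone, and secret names are all placeholders:

```yaml
# Hypothetical cloudbuild.yaml sketch of the flow described above.
# All names (timezoner, build-ssh-key, prod-cluster, zone) are placeholders.
steps:
  # Fetch an SSH key from Secret Manager into a shared volume.
  - name: 'gcr.io/cloud-builders/gcloud'
    entrypoint: 'bash'
    args:
      - '-c'
      - 'gcloud secrets versions access latest --secret=build-ssh-key > /root/.ssh/id_rsa'
    volumes:
      - name: 'ssh'
        path: /root/.ssh
  # The application build itself is still driven by the repo's Dockerfile,
  # run inside an image that contains the Docker runtime.
  - name: 'gcr.io/cloud-builders/docker'
    args: ['build', '-t', 'gcr.io/$PROJECT_ID/timezoner:$SHORT_SHA', '.']
    volumes:
      - name: 'ssh'
        path: /root/.ssh
  # Push the image, then point the running Deployment at the new tag.
  - name: 'gcr.io/cloud-builders/docker'
    args: ['push', 'gcr.io/$PROJECT_ID/timezoner:$SHORT_SHA']
  - name: 'gcr.io/cloud-builders/kubectl'
    args: ['set', 'image', 'deployment/timezoner',
           'timezoner=gcr.io/$PROJECT_ID/timezoner:$SHORT_SHA']
    env:
      - 'CLOUDSDK_COMPUTE_ZONE=us-central1-a'       # placeholder zone
      - 'CLOUDSDK_CONTAINER_CLUSTER=prod-cluster'   # placeholder cluster
```

The last step is what "informs Kubernetes that there's a change": updating the Deployment's image triggers a rolling update in the cluster.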
The last piece of the puzzle is actually creating the trigger in Google Cloud Build. You'll need to give it a name and, optionally, a description. Referring to our project from earlier, this one will be called production-timezoner. In our case, we'll want the build to fire on a push to a branch. Select the repository from GitHub you want to build and define which branch the trigger should listen on.
In our example, we only want this trigger to fire on pushes to the master branch. If we instead wanted it to fire for all other branches, there's a handy Invert Regex checkbox. You can also add filters to ignore certain files, but we're going to skip that part for now.
Lastly, tell Cloud Build which instructions to follow. It's best to leave the build configuration set to auto, as it will prioritize cloudbuild.yaml files over Dockerfiles. If you want different triggers for the same branch, you can have multiple cloudbuild.yaml files and select the appropriate build file for each trigger.
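To illustrate the multiple-build-file idea, two hypothetical triggers watching the same branch can each point at a different build file through the `filename` field. Everything below is a placeholder sketch, not configuration from the course:

```yaml
# Two hypothetical trigger definitions for the same branch, each
# selecting a different build file. All names are placeholders.
---
name: master-deploy
github:
  owner: example-org
  name: timezoner
  push:
    branch: ^master$
filename: cloudbuild.yaml           # full build and deploy
---
name: master-integration-tests
github:
  owner: example-org
  name: timezoner
  push:
    branch: ^master$
filename: cloudbuild-tests.yaml     # tests only
```

Both triggers fire on the same push; each runs its own pipeline.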
Once this trigger is created, you will have a fully automated deployment in the cloud. This gives developers more time to focus on development. It also removes the complexity of manually making deployments to different environments, and provides you with automated builds, unit tests, and container management.
We've only covered the very basics of Cloud Build. There is a lot more that it can provide to you and your team, ranging from Slack notifications to publishing external releases. I highly recommend you look into Cloud Build and see how you can incorporate it into your current development cycle.
With the Cloud Build trigger finally in place, let's recap our final deployment flow:
- A developer pushes code to a repository on GitHub.
- GitHub fires an event to Google Cloud Build.
- Cloud Build follows the instructions to build the application and run tests against it.
- The newly built container is pushed to the Container Registry with the proper tags.
- Kubernetes is informed of the change and deploys the new image to the cluster.
And this is the beauty of DevOps.
Calculated Systems was founded by experts in Hadoop, Google Cloud, and AWS. Calculated Systems enables code-free capture, mapping, and transformation of data in the cloud, based on Apache NiFi, an open-source project originally developed within the NSA. Calculated Systems accelerates time to market for new innovations while maintaining data integrity. With cloud automation tools, deep industry expertise, and experience productionizing workloads, development cycles are cut down to a fraction of their normal time. The ability to quickly develop large-scale data ingestion and processing decreases the risk companies face in long development cycles. Calculated Systems is one of the industry leaders in Big Data transformation and education on these complex technologies.