Kubernetes has become one of the most common container orchestration platforms. Google Kubernetes Engine makes managing a Kubernetes cluster much easier by handling most system management tasks for you. It also offers more advanced cluster management features. In this course, we’ll explore how to set up a cluster using GKE as well as how to deploy and run container workloads.

Learning Objectives

  • What Google Kubernetes Engine is and what it can be used for
  • How to create, edit and delete a GKE cluster
  • How to deploy a containerized application to GKE
  • How to connect to a GKE cluster using kubectl

Intended Audience

  • Developers of containerized applications
  • Engineers responsible for deploying to Kubernetes
  • Anyone preparing for a Google Cloud certification

Prerequisites

  • Basic understanding of Docker and Kubernetes
  • Some experience building and deploying containers

So you should now have everything you need to get started playing around with Google Kubernetes Engine.  Let me do a quick review of everything we covered.

First, we started out with a review of some Kubernetes basics.  I talked about what a cluster is, and how it is composed of a control plane and worker nodes.  You tell the control plane what you want to do, and then the work is split up and handed off to the nodes.
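As a quick refresher, once you are connected to a cluster you can see this split for yourself with a couple of standard kubectl commands (these assume kubectl is already pointed at a running cluster):

```shell
# Show the control plane's API endpoint and core cluster services.
kubectl cluster-info

# List the worker nodes that actually run your workloads.
kubectl get nodes
```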

Second, I talked about pods and workloads.  When working with a Kubernetes cluster, you aren’t going to see a list of running containers.  Instead, you will see pods.  A pod is the smallest deployable unit of computing in Kubernetes.  Typically, they are composed of a single container.  But they can have multiple containers as well.  Generally, you won’t be creating pods yourself.  Instead, you will create a workload and that will create the pods for you.  A workload represents the application you wish to run and can require multiple pods and multiple containers.
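To make that concrete, here is a sketch of what this looks like on the command line. The deployment name and container image are just example placeholders (the image shown is Google's public hello-app sample):

```shell
# You see pods, not raw containers.
kubectl get pods

# Creating a workload (here, a Deployment) creates the pods for you.
kubectl create deployment hello-app \
    --image=us-docker.pkg.dev/google-samples/containers/gke/hello-app:1.0

# The Deployment is the workload; it manages the pods on your behalf.
kubectl get deployments
```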

Third, I showed you how to create and manage clusters in GKE.  You learned how to create clusters using both the console and the command line.  I also showed you how to add and remove nodes using node pools, and covered the various autoscaling options for both pods and nodes.
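For reference, the command-line versions of those steps look roughly like this. The cluster name, node pool name, and zone are all example values you would replace with your own:

```shell
# Create a three-node GKE cluster.
gcloud container clusters create demo-cluster \
    --zone us-central1-a \
    --num-nodes 3

# Add a second node pool to the cluster.
gcloud container node-pools create extra-pool \
    --cluster demo-cluster \
    --zone us-central1-a \
    --num-nodes 2

# Enable node autoscaling on a node pool.
gcloud container clusters update demo-cluster \
    --zone us-central1-a \
    --node-pool extra-pool \
    --enable-autoscaling --min-nodes 1 --max-nodes 5
```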

Fourth, I showed you how to deploy your applications to a GKE cluster using both the console and the command line.  And fifth, I gave some pointers on detecting and fixing any issues you might encounter on GKE.  You saw how to access logging and monitoring, as well as how to create an alert.
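A minimal command-line deployment, end to end, might look like the sketch below. Again, the cluster name, zone, deployment name, image, and ports are example values; the image is Google's public hello-app sample, which listens on port 8080:

```shell
# Fetch credentials so kubectl talks to your cluster.
gcloud container clusters get-credentials demo-cluster --zone us-central1-a

# Deploy the application and expose it behind a load balancer.
kubectl create deployment hello-app \
    --image=us-docker.pkg.dev/google-samples/containers/gke/hello-app:1.0
kubectl expose deployment hello-app --type=LoadBalancer --port 80 --target-port 8080

# Autoscale the pods based on CPU utilization.
kubectl autoscale deployment hello-app --min 1 --max 5 --cpu-percent 80

# When troubleshooting, pull logs straight from the deployment's pods.
kubectl logs deployment/hello-app
```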

Well, that’s all I have for you today.  Remember to give this course a rating, and if you have any questions or comments, please let us know.  Thanks for watching, and make sure to check out our many other courses on Cloud Academy!


About the Author
Daniel began his career as a Software Engineer, focusing mostly on web and mobile development. After twenty years of dealing with insufficient training and fragmented documentation, he decided to use his extensive experience to help the next generation of engineers.

Daniel has spent his most recent years designing and running technical classes for both Amazon and Microsoft. Today at Cloud Academy, he is working on building out an extensive Google Cloud training library.

When he isn’t working or tinkering in his home lab, Daniel enjoys BBQing, target shooting, and watching classic movies.