
Hyperparameter Tuning



Machine learning is a hot topic these days and Google has been one of the biggest newsmakers. Google’s machine learning is being used behind the scenes every day by millions of people. When you search for an image on the web or use Google Translate on foreign language text or use voice dictation on your Android phone, you’re using machine learning. Now Google has launched AI Platform to give its customers the power to train their own neural networks.

This is a hands-on course where you can follow along with the demos using your own Google Cloud account or a trial account.

Learning Objectives

  • Describe how an artificial neural network functions
  • Run a simple TensorFlow program
  • Train a model using a distributed cluster on AI Platform
  • Increase prediction accuracy using feature engineering and hyperparameter tuning
  • Deploy a trained model on AI Platform to make predictions with new data



  • December 20, 2020: Completely revamped the course due to Google AI Platform replacing Cloud ML Engine and the release of TensorFlow 2.
  • November 16, 2018: Updated 90% of the lessons due to major changes in TensorFlow and Google Cloud ML Engine. All of the demos and code walkthroughs were completely redone.

In the last lesson, I talked about how to engineer features to improve a machine learning model. Another way to improve a model is by experimenting with the hyperparameters. These are essentially the settings for the training run. For example, the number of hidden layers is a hyperparameter. It’s part of the model, but it’s something you set ahead of time, rather than something that the model learns during its training run. Another example is the batch size.
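To make the distinction concrete, here's a minimal sketch in plain Python (not AI Platform code, and the numbers are made up for illustration). The learning rate, batch size, and number of epochs are hyperparameters: you pick them before training starts and they never change. The weight `w` is what the training run actually adjusts.

```python
import random

def train(data, learning_rate=0.1, batch_size=4, epochs=20):
    """Fit y = w * x by gradient descent on mean squared error.

    learning_rate, batch_size, and epochs are hyperparameters:
    set before the run and never changed by it. The weight w is
    what the model learns during training.
    """
    w = 0.0
    for _ in range(epochs):
        random.shuffle(data)
        for i in range(0, len(data), batch_size):
            batch = data[i:i + batch_size]
            # Gradient of mean((w*x - y)^2) with respect to w
            grad = sum(2 * (w * x - y) * x for x, y in batch) / len(batch)
            w -= learning_rate * grad
    return w

# Data generated from y = 3x, so training should recover w close to 3
data = [(x, 3.0 * x) for x in [1.0, 2.0, 3.0, 4.0]]
w = train(data)
```

Changing any of the three hyperparameters changes how (and how well) training converges, which is exactly why they're worth tuning.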

Hyperparameters are not the same as parameters. When you hear references to parameters in neural networks, that means the weights that the model learns during training. In contrast, hyperparameters are set manually and do not change during training.
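One way to see the relationship: the hyperparameters you choose (such as layer sizes) determine how many parameters the model will have to learn. The helper below is a hypothetical illustration, counting the weights and biases in a fully connected network.

```python
def count_parameters(input_size, hidden_units, output_size):
    """Count trainable parameters in a fully connected network.

    The layer sizes are hyperparameters you choose up front; the
    parameters they imply (one weight per connection plus one bias
    per neuron) are what training adjusts.
    """
    sizes = [input_size] + hidden_units + [output_size]
    # Each layer has (inputs * outputs) weights plus (outputs) biases
    return sum(sizes[i] * sizes[i + 1] + sizes[i + 1]
               for i in range(len(sizes) - 1))

# A network with 4 inputs, hidden layers of 16 and 8 units, and 3 outputs:
# (4*16 + 16) + (16*8 + 8) + (8*3 + 3) = 243 parameters
count_parameters(4, [16, 8], 3)
```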

Deciding what settings to use for the various hyperparameters often comes down to guesswork. To tune a hyperparameter, you need to do an entire training run and see how it performs, then adjust the hyperparameter and do another training run, and so on. This can be time-consuming and tedious, so AI Platform provides a way to tune hyperparameters automatically. It does require code changes, though. It can also be costly because AI Platform has to run many trials with different values for the hyperparameter in question to see which value gives the best results. For this reason, you can set the maximum number of trials to run.
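One of the code changes is accepting the hyperparameter values that AI Platform supplies: for each trial, the service launches your trainer with a different value for each tuned hyperparameter, passed as a command-line argument. A sketch of that part, with illustrative flag names (they must match the parameter names in your tuning configuration):

```python
import argparse

def get_args(argv=None):
    """Parse the hyperparameter flags supplied to each tuning trial.

    The flag names here are examples, not the course's actual code;
    AI Platform passes one argument per tuned hyperparameter.
    """
    parser = argparse.ArgumentParser()
    parser.add_argument('--learning-rate', type=float, default=0.01)
    parser.add_argument('--batch-size', type=int, default=32)
    return parser.parse_args(argv)

# In trial N, AI Platform might invoke the trainer like this:
args = get_args(['--learning-rate', '0.001', '--batch-size', '64'])
```

The other change is reporting the resulting metric back to the service after training, so it can compare trials (for example, via the cloudml-hypertune helper library).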

Here’s how it works. First, you tell it what you’re trying to optimize, which is called the hyperparameter metric. This is typically set to “accuracy”. That is, you want your model to get the highest accuracy possible. Then you tell it which hyperparameters you want to tune. You have to be careful not to choose too many because that could dramatically increase the number of trial runs and the associated cost.
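Both of those choices go into the tuning job's configuration. Here's a sketch of what such a configuration file can look like (the parameter names and ranges are made-up examples, not values from this course):

```yaml
trainingInput:
  hyperparameters:
    goal: MAXIMIZE                    # get the highest metric value possible
    hyperparameterMetricTag: accuracy # the hyperparameter metric to optimize
    maxTrials: 10                     # cap on training runs, to limit cost
    params:
      - parameterName: learning-rate
        type: DOUBLE
        minValue: 0.0001
        maxValue: 0.1
        scaleType: UNIT_LOG_SCALE
      - parameterName: batch-size
        type: INTEGER
        minValue: 16
        maxValue: 128
        scaleType: UNIT_LINEAR_SCALE
```

Notice that every entry under `params` multiplies the search space, which is why tuning many hyperparameters at once drives up the number of trials and the cost.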

Finally, you need to tell it which search algorithm to use, that is, how it should decide which hyperparameter values to try. There are three choices: Random, Grid, and Bayesian. Random is pretty self-explanatory. It chooses hyperparameter values at random. A grid search is the opposite. It tries all values within a grid that you define. In contrast to these simple approaches, a Bayesian search is a sophisticated algorithm that makes intelligent guesses as to which are the best values to try. It’s the default algorithm because it’s usually the most efficient and cost-effective.
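The difference between the first two algorithms is easy to show in plain Python. This toy sketch (with a hypothetical `objective` function standing in for a full training run's accuracy) tries random combinations in one case and every combination in a grid in the other; a Bayesian search would instead build a statistical model of the objective from past trials and use it to pick promising values, which is harder to show in a few lines.

```python
import itertools
import random

def objective(lr, units):
    """Toy stand-in for the accuracy a full training run would produce."""
    return 1.0 - (lr - 0.01) ** 2 - ((units - 64) / 1000) ** 2

def grid_search(lr_values, unit_values):
    """Grid: try every combination of the values you define."""
    return max(itertools.product(lr_values, unit_values),
               key=lambda p: objective(*p))

def random_search(n_trials, seed=0):
    """Random: sample each hyperparameter independently at random."""
    rng = random.Random(seed)
    trials = [(rng.uniform(0.001, 0.1), rng.randint(16, 128))
              for _ in range(n_trials)]
    return max(trials, key=lambda p: objective(*p))

best = grid_search([0.001, 0.01, 0.1], [32, 64, 128])
```

A grid search is exhaustive but its cost grows multiplicatively with each hyperparameter; a random search covers wide ranges cheaply but can miss the best region, which is why the smarter Bayesian approach is the default.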

And that’s it for this lesson.

About the Author

Guy launched his first training website in 1995 and he's been helping people learn IT technologies ever since. He has been a sysadmin, instructor, sales engineer, IT manager, and entrepreneur. In his most recent venture, he founded and led a cloud-based training infrastructure company that provided virtual labs for some of the largest software vendors in the world. Guy’s passion is making complex technology easy to understand. His activities outside of work have included riding an elephant and skydiving (although not at the same time).