Cloud platforms are continuing to grow and evolve. There was a time when cloud platforms consisted of a few core services: virtual machines, blob storage, relational databases, etc. Cloud platforms are now much more complex, with services being built on top of other services. Kubernetes Engine, for example, runs on top of Compute Engine and integrates with the Container Registry, load balancers, and other services. With so many services of varying levels of complexity, it can be overwhelming to develop cloud-based solutions.
Throughout this course, we’ll cover some of the topics that will help you to integrate your applications with Google Cloud Platform’s compute services and REST API.
If you have any feedback related to this course, please contact us at support@cloudacademy.com.
Learning Objectives
- Implementing service discovery with Kubernetes Engine and Compute Engine
- Configuring applications with instance metadata
- Authenticating users with Identity Aware Proxy
- Using the CLI and Cloud Shell
- Integrating with the GCP API
Intended Audience
- Developers looking to integrate with GCP compute services
Prerequisites
To get the most out of this course, you should already have some development experience and an understanding of Google Cloud Platform.
Hello and welcome! When developing systems that leverage external services, we as developers need to make sure we understand how to manage that integration.
Google presents its consumable cloud-based services through REST APIs. Each service has its own endpoint with a name that doubles as a base URL for the API.
Services consist of resources, methods, and types. Resources are specific components of a service; for example, two of the resources for Cloud Pub/Sub are topics and subscriptions. Each resource has one or more methods, and methods follow the REST naming conventions: the common tasks of listing resources, getting a single result, creating, updating, and deleting are named list, get, create, update, and delete, respectively.
Each method has an assigned HTTP verb that hints at the safety of the method as well as its idempotency. According to Google's design guidelines, the verbs GET and HEAD are considered safe, and GET, PUT, and DELETE are idempotent; the rest are not. Google publishes a discovery document for each service. This is a machine-readable file which defines the resources, methods, and types of the service, and it details the data structures for requests and responses, including data types.
Google uses this file to auto-generate client libraries in specific programming languages. When you need to interact with a service programmatically, Google advises using these client libraries, though we can also interact directly with the REST API.
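To get a sense of what a discovery document contains, here's a minimal Python sketch that fetches one and lists its top-level resources. The Discovery Service URL format and the use of the requests library are my assumptions, not something shown in the course.

```python
# A minimal sketch: fetch a service's discovery document and list its resources.
# The URL format is an assumption based on the public Discovery Service.
import requests

DISCOVERY_URL = "https://www.googleapis.com/discovery/v1/apis/pubsub/v1/rest"

doc = requests.get(DISCOVERY_URL).json()

# Top-level resources defined by the service, each with its methods.
for resource_name, resource in doc.get("resources", {}).items():
    methods = resource.get("methods", {})
    print(resource_name, "->", ", ".join(methods) or "(nested resources)")
```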
Let's talk about some of the specific aspects of developing with the REST API: batching, restricting data, pagination, caching, and error handling.
Some services such as Compute Engine and Cloud Storage include a batch URL that allows us to send multiple requests inside of a multipart request. This lets multiple supported methods be called inside a single batch. The calls still count as individual requests; however, batching can be useful in select circumstances.
Other services might instead include their own batch version of a specific method.
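As a rough sketch of the batch URL approach, here's how it might look with the google-api-python-client library. The project and zone values are placeholders, and treat the calls as an illustration under those assumptions rather than the course's own example.

```python
# A hedged sketch of batching Compute Engine calls with google-api-python-client.
from googleapiclient.discovery import build

compute = build("compute", "v1")  # uses Application Default Credentials

def on_response(request_id, response, exception):
    # Called once per request in the batch.
    if exception is not None:
        print(request_id, "failed:", exception)
    else:
        print(request_id, "returned", len(response.get("items", [])), "instances")

# Each call added here still counts as an individual request against quotas.
batch = compute.new_batch_http_request(callback=on_response)
for zone in ["us-central1-a", "us-central1-b"]:
    batch.add(compute.instances().list(project="my-project", zone=zone))
batch.execute()
```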
The API includes a parameter named fields that we can use to specify which resource keys to return. By setting the fields parameter to voices.languageCodes, we can have the API return only the language codes.
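For example, here's a hedged sketch of that call against the Text-to-Speech voices endpoint using the requests library; the endpoint, the API key authentication, and the exact fields syntax are assumptions based on the parameter described above.

```python
# A minimal sketch of restricting returned fields on a REST call.
import requests

API_KEY = "your-api-key"  # placeholder

resp = requests.get(
    "https://texttospeech.googleapis.com/v1/voices",
    params={"fields": "voices.languageCodes", "key": API_KEY},
)
resp.raise_for_status()
print(resp.json())  # only the languageCodes of each voice are returned
```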
The REST API uses a built-in pagination system that is based on page tokens. Using the Compute Engine API as an example, I'm passing in a maxResults value here, which limits the number of results returned. Scrolling down to the bottom, notice this nextPageToken here. We can pass this value to the pageToken parameter, which will return the next page. If we were actually implementing this from inside an application, we could follow the nextPageToken value until it no longer exists, at which point we've hit our final page.
Keep in mind, not all APIs are going to page data in this exact same way, though it will be roughly similar.
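Put together, a sketch of following page tokens from application code might look like this. The project, zone, and access token are placeholders, and the endpoint is the Compute Engine instances list discussed above.

```python
# A hedged sketch of following nextPageToken with the Compute Engine REST API.
import requests

PROJECT, ZONE = "my-project", "us-central1-a"  # placeholders
TOKEN = "your-access-token"  # e.g. from `gcloud auth print-access-token`
URL = f"https://compute.googleapis.com/compute/v1/projects/{PROJECT}/zones/{ZONE}/instances"

params = {"maxResults": 10}
while True:
    resp = requests.get(URL, params=params, headers={"Authorization": f"Bearer {TOKEN}"})
    resp.raise_for_status()
    page = resp.json()
    for instance in page.get("items", []):
        print(instance["name"])
    # Keep following nextPageToken until it's no longer present -- the final page.
    if "nextPageToken" not in page:
        break
    params["pageToken"] = page["nextPageToken"]
```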
When we're developing with different services, we need to consider the service's limitations as well as cost. Caching can help with both. There are too many possible caching implementations to cover here, but it's worth mentioning.
Error handling is an important part of application development. When we interact with external services, it's important to consider how their errors are going to impact us. When the REST API is unavailable, it returns a status code of 503. In cases like this, we can use exponential back-off, retrying the request with increasing delays to give the service time to recover. The HTTP status code of 429 is returned when a resource quota is close to, or has reached, its limit.
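A minimal back-off sketch, assuming the requests library; a real implementation would also cap the total wait time and honor any Retry-After header the service returns.

```python
# A minimal sketch of exponential back-off for transient 503/429 responses.
import random
import time
import requests

def get_with_backoff(url, max_retries=5, **kwargs):
    delay = 1.0
    for _ in range(max_retries):
        resp = requests.get(url, **kwargs)
        if resp.status_code not in (429, 503):
            return resp
        # Wait, then retry with a doubled delay plus a little jitter.
        time.sleep(delay + random.random())
        delay *= 2
    return resp
```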
By default, most APIs are disabled in a given project. In order to use an API, we need to enable it through the console or the CLI. Once an API is enabled, we can interact with it directly or through the client libraries.
The client libraries are built using the idioms of the given language and are installed using the tooling of the language. Let's take a look at a few examples showing how to use the client libraries. The documentation for each library tends to have at least a few examples. And this is the Google Translate Python API. The client libraries tend to follow this pattern here, where we import the library, create a client and then use the client to interact with the API. The client is responsible for managing the connection between our code and the API. And that includes things like authentication. Notice that this doesn't have any credentials.
By default, the client libraries look for credentials in the known locations that Google services use, which means this is going to attempt to use the service account for the given service. This makes it easy to get up and running in environments such as Cloud Functions, Compute Engine, and any other service that supports Application Default Credentials.
If we're not running on Google Cloud, we can alternatively specify the credentials, which can be generated from a service account JSON file.
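Here's a hedged sketch of that import, create-a-client, call pattern using the google-cloud-translate library's v2 client; the service account file path is a placeholder and not part of the course material.

```python
# A sketch of the import -> client -> call pattern with the Translate v2 client.
from google.cloud import translate_v2 as translate

# With no arguments, the client uses Application Default Credentials --
# for example, the service account of the environment it runs on.
client = translate.Client()

# Off of Google Cloud, we can point the client at service account credentials instead:
# client = translate.Client.from_service_account_json("service-account.json")  # placeholder path

result = client.translate("Hello, world!", target_language="es")
print(result["translatedText"])
```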
Recall that there are two forms of batching with GCP. One is the service's batch URL, and the other form is specific to a given resource. Using the translate API as an example, we can see there's a TranslateText method and there's a BatchTranslateText method.
The translate text method accepts input and returns our results. The batch method accepts the input and returns an operation. A successfully returned operation allows us to know that the bulk operation is being processed without having to wait for it to complete.
We can use the returned operation to query the API about its status. Operations are widely used with GCP to allow for asynchronous tasks. So if you notice that a method returns an operation, know that you can use that operation to poll for its status.
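A sketch of what working with that returned operation might look like using the v3 Translation client. The project, location, and Cloud Storage paths are placeholders, and the parameter names are assumptions based on the batchTranslateText method.

```python
# A hedged sketch of a long-running operation from BatchTranslateText.
from google.cloud import translate_v3 as translate

client = translate.TranslationServiceClient()
parent = "projects/my-project/locations/us-central1"  # placeholder project

operation = client.batch_translate_text(
    request={
        "parent": parent,
        "source_language_code": "en",
        "target_language_codes": ["es"],
        "input_configs": [
            {"gcs_source": {"input_uri": "gs://my-bucket/input.txt"}, "mime_type": "text/plain"}
        ],
        "output_config": {"gcs_destination": {"output_uri_prefix": "gs://my-bucket/output/"}},
    }
)

# The call returns immediately with an operation we can poll;
# result() blocks until the bulk translation finishes.
print("Still running:", not operation.done())
response = operation.result(timeout=600)
print("Translated characters:", response.translated_characters)
```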
Previously, I showed how to limit returned fields by setting the fields parameter of the API. While the different client libraries might surface these standard parameters in different ways, once you've located the option for your language, you can use it to filter the returned data. This might take some effort to maintain, though it could reduce a lot of network traffic if whatever you're doing generates a lot of unneeded return data.
Pagination is going to follow the idioms of the given language. Notice here in this Go code, we have an iterator that breaks when it sees the error of Done.
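For comparison, the Python libraries tend to hide the page tokens behind an iterator. This Cloud Storage sketch is an illustration rather than code from the course, and the bucket name is a placeholder.

```python
# A sketch of client-library pagination: the iterator fetches new pages
# transparently as we consume it -- no manual pageToken handling required.
from google.cloud import storage

client = storage.Client()
for blob in client.list_blobs("my-bucket"):  # placeholder bucket name
    print(blob.name)
```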
Depending on the language you use for your client library, you might end up programming against the gRPC API rather than the REST API. For example, it appears that the Go API uses gRPC. The reason I mention this is simply because gRPC requires the connection to be closed once it's no longer required, where the REST API is stateless and doesn't have a persistent connection to close.
There are some Google recommended best practices when using these client libraries. They recommend that you reuse the client throughout the lifecycle of the application and not to create a new client for each use.
The client performs authentication when the first request is made, and then it saves that token. So by reusing the client, we prevent it from needing to authenticate over and over again. When the access token expires, the client's going to refresh it automatically, so we're not losing anything by doing this. They also recommend pinning the API version in our code. A pinned version allows us to ensure that upstream changes aren't going to break our code. Now that said, we should also upgrade as often as is practical for the application. The more time that passes between upgrades, the more drift we're going to have between our current state and the future state.
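As a sketch of those two recommendations together, assuming a discovery-based Python client and a Cloud Functions-style handler; the project name is a placeholder.

```python
# A hedged sketch of client reuse and version pinning.
from googleapiclient.discovery import build

# Created once at module load and pinned to v1, so upstream API changes
# don't silently break us and we don't re-authenticate on every request.
compute = build("compute", "v1")

def handler(request):
    # Reuses the module-level client; its cached access token is refreshed
    # automatically when it expires.
    zones = compute.zones().list(project="my-project").execute()  # placeholder project
    return ", ".join(zone["name"] for zone in zones.get("items", []))
```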
Alright, with that, let's wrap up this lesson. Thank you so very much for watching, and I will see you in another lesson.
Ben Lambert is a software engineer and was previously the lead author for DevOps and Microsoft Azure training content at Cloud Academy. His courses and learning paths covered Cloud Ecosystem technologies such as DC/OS, configuration management tools, and containers. As a software engineer, Ben’s experience includes building highly available web and mobile apps. When he’s not building software, he’s hiking, camping, or creating video games.