Modern software systems have become increasingly complex. Cloud platforms have helped tame some of the complexity by providing both managed and unmanaged services. So it’s no surprise that companies have shifted workloads to cloud platforms. As cloud platforms continue to grow, knowing when and how to use these services is important for developers.
This course is intended to help prepare individuals seeking to pass the Google Cloud Professional Cloud Developer Certification Exam. The Cloud Developer Certification requires a working knowledge of building cloud-native systems on GCP. That covers a wide variety of topics, from designing distributed systems to debugging apps with Stackdriver.
This course focuses on the third section of the exam overview, more specifically the first five points, which cover deploying applications using GCP compute services.
Learning Objectives:
- Implement appropriate deployment strategies based on the target compute environment
- Deploy applications and services on Compute Engine and Google Kubernetes Engine
- Deploy an application to App Engine
- Deploy a Cloud Function
Intended Audience:
- IT professionals who want to become cloud-native developers
- IT professionals preparing for Google’s Professional Cloud Developer Exam
Prerequisites:
- Software development experience
- Docker experience
- Kubernetes experience
- GCP experience
Hello and welcome. In this lesson, we'll be talking about Cloud Functions. More specifically, we'll talk about deploying them. Cloud Functions are Google's event-triggered serverless compute offering, and they're executed in response to certain events. Cloud Functions fall into two groups: functions that are triggered directly and functions that are triggered indirectly. In the documentation, Google refers to these as HTTP functions and background functions. HTTP functions are executed directly in response to an HTTP request. Common HTTP verbs are supported, and each function supports TLS. Background functions are executed indirectly in response to cloud infrastructure events, things such as a change to objects inside of a storage bucket, Pub/Sub events, etc. Now, because these functions run in the background, we as developers need to make sure our code terminates once it's complete. The method for terminating a function differs between runtimes. Cloud Functions supports three runtimes: Node.js, Python, and Go. Let's talk about code for a minute before we actually start deploying.
When we create Cloud Functions, we need to specify an entry point. The entry point function parameters differ between HTTP and background functions, as well as between runtimes. With Node.js HTTP functions, you use Express, and the entry point function accepts a request and a response, which is probably going to be familiar to most Node developers. Background functions have a different entry point signature: they accept the event data and the event context. Dependencies are specified in package.json, which is used by npm, so again, this is all familiar to Node developers. Python uses Flask for HTTP functions; its entry point accepts a Flask request and can return a Flask response. Python background entry points accept event data and context. Dependencies are handled by pip and specified in the requirements.txt file. Go uses the standard net/http library for HTTP functions, accepting a ResponseWriter and a Request. For background functions, it accepts the built-in context and an event struct. Dependencies are handled by Go modules or by vendoring the dependencies.
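To make those two Python signatures concrete, here's a minimal sketch. The function names (`hello_http`, `on_event`) are illustrative, not from the lesson; in a real deployment the HTTP function's `request` would be a flask.Request and `context` would carry metadata such as the event ID.

```python
def hello_http(request):
    # HTTP trigger: 'request' behaves like a flask.Request; returning a
    # string (or a Flask response) becomes the HTTP response body.
    name = request.args.get("name", "World")
    return f"Hello {name}!"


def on_event(data, context):
    # Background trigger: 'data' is the event payload and 'context'
    # carries metadata such as event_id and event_type.
    return f"Event {context.event_id}: {data.get('name', 'unknown')}"
```

Note that the same entry point shapes apply regardless of which service produced the event; only the contents of the payload change.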
All right, so that's a high-level look at the function types and runtimes, and with that in mind, let's create a couple of functions. First, let's check out HTTP functions, and then let's test out a background function that uses Cloud Storage as its trigger. We're going to use the console to create this; however, at the end, we'll check out how it would be created on the command line. First up, we need a name, and for each project, the function name has to be regionally unique. The trigger drop-down here determines the type of function, and changing it will change the form accordingly. By default, the assumption is that HTTP functions should be public, leaving this as an unauthenticated call; however, you can change that. Cloud Functions offers the inline code editor for basic functionality and testing, though it also supports uploading zipped files, and it can pull from Cloud Source Repositories as well. Let's use the default runtime here, which is Node 8. This placeholder code is pretty basic: it's just going to print 'Hello World!' or a user-supplied message, and this field here is asking for the entry point name. There are additional settings that we can configure: we can set the region, a timeout, and a max number of function instances, which helps control cost. We can set the service account that's used by the code inside of the function, and we can set a VPC connector, allowing us network access. Importantly, we have environment variables that we can pass to our code. Creating this function will deploy our placeholder code and make it accessible from its own URL.
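As a quick sketch of how those environment variables reach our code: they show up in the function's process environment, so in Python we can read them with the standard library. `GREETING` here is a hypothetical variable name, not one from the lesson.

```python
import os


def hello_env(request):
    # GREETING is a hypothetical environment variable, set in the
    # console's environment-variables section when creating the function.
    greeting = os.environ.get("GREETING", "Hello")
    return f"{greeting}, World!"
```

Because we fall back to a default with `os.environ.get`, the function still works if the variable was never configured.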
Let's test this by drilling in and clicking the URL to trigger the code, and we have the 'Hello World!' text that we would expect. Here in the console, creating a function performs an initial deployment, and editing a function is how we perform subsequent deployments. The command-line interface combines these into one concept: the gcloud functions deploy command. When using the CLI to deploy a function, you pass in the name of the function, the runtime, the trigger, and the flags we saw in the console. The runtime only needs to be included when you initially set it up or when you want to change it, and the trigger is the same as the drop-down: it's a list of HTTP, Cloud Storage, etc. The flags are those contextual fields that change based on the trigger type.
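For reference, a CLI deployment along those lines might look like the following. The function name and region are illustrative; as mentioned, the runtime flag is only required on the first deployment or when changing runtimes.

```shell
# Hypothetical example: deploy a public HTTP-triggered function
# named hello-world on the Node 8 runtime used in the demo.
gcloud functions deploy hello-world \
  --runtime nodejs8 \
  --trigger-http \
  --allow-unauthenticated \
  --region us-central1
```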
Let's deploy another function. This time we're going to use a Cloud Storage trigger. We'll create a function that will execute in response to a Cloud Storage object being deleted, and it's just going to print the name of the file that was deleted. We'll set the trigger to Cloud Storage and the Event Type to Delete, though you can see here there are other types, and I'm going to use a bucket that was created previously for another demo. There's nothing special about it; there isn't any special setting that you need to configure for these events to work. So this function is going to listen for delete events from Cloud Storage, and it's going to run this code here, which will print the deleted file's name. Let's jump ahead to when it's complete. Okay, now let's test this by deleting a file, and then we'll head over to Stackdriver so that we can see the logs. All right, it took a moment and some refreshing, but here are the results from the function. All functions require a trigger. When using the command-line interface, the initial deployment requires a trigger; after that, you can omit it unless you want to change it. Each function can only have one trigger, and each trigger may have its own required parameters, such as the storage bucket name in our example. All right, that is going to wrap up this lesson. Thank you so much for watching, and I will see you in another lesson.
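A Python version of the delete-handler described above might look like this sketch; the function name and bucket/object values are illustrative. Cloud Storage delete events deliver the object's metadata as the event payload, including the bucket and object names.

```python
def on_file_deleted(event, context):
    # 'event' is the Cloud Storage object metadata: 'name' is the object
    # path and 'bucket' is the bucket it was deleted from.
    message = f"Deleted: gs://{event['bucket']}/{event['name']}"
    print(message)  # shows up in the function's Stackdriver logs
    # Returning terminates the background function.
    return message
```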
Ben Lambert is a software engineer and was previously the lead author for DevOps and Microsoft Azure training content at Cloud Academy. His courses and learning paths covered Cloud Ecosystem technologies such as DC/OS, configuration management tools, and containers. As a software engineer, Ben’s experience includes building highly available web and mobile apps. When he’s not building software, he’s hiking, camping, or creating video games.