
Microsoft Azure Functions vs. Google Cloud Functions vs. AWS Lambda

Fight for Serverless Cloud Domination Continues

In this post, we’ll take a close look at Azure Functions and extend the analysis that my colleague Alex started in an earlier post on Google Cloud Functions vs. AWS Lambda. Rather than declaring a “winner,” we will highlight the key points of interest in serverless custom code execution as implemented in each cloud platform, comparing Microsoft Azure Functions, Google Cloud Functions, and AWS Lambda.
Serverless is one of the hottest topics in cloud computing. As we have recently discussed on the blog, serverless doesn’t actually mean that no servers are involved, it just means that developers don’t have to worry about servers or any other infrastructure issues or operational details. Instead, serverless computing allows developers to focus on the code they build to expose different functions.
Among the major cloud platforms, Microsoft Azure’s release of its Azure Functions in May 2016 follows the “serverless revolution” that started with AWS Lambda in 2014, and the release of Google Cloud Functions in February 2016.

Introducing Azure Functions

Prior to the official release of Azure Functions, Microsoft Azure already had a variety of serverless services: its platform as a service (PaaS) offerings performed some functions for you and allowed you to scale up as needed. As a developer, you could size and pay based on what you need to process, not on the physical resources you use. What Azure was still missing, however, was a serverless custom code execution service equivalent to AWS Lambda and Google Cloud Functions.
But what about execution? When does your application really need to run some code?
In an application, your code typically does not run continuously unless it has many data batches to process. Some code runs because something has happened and you need to handle it. Basically, your application can be described in terms of events. Even if you don’t have natural events, you can create them with a timer that triggers your code at predefined intervals, indefinitely. “Events” mean that you run code only when you really need it to run. Conversely, if you find yourself generating artificial events just to keep your code running continuously, this is probably not the right model to choose.
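To make the timer case concrete, here is a sketch of a function.json timer binding (the binding name is illustrative; the schedule field uses Azure’s six-field NCRONTAB syntax, here firing every five minutes):

```json
{
  "bindings": [
    {
      "type": "timerTrigger",
      "direction": "in",
      "name": "myTimer",
      "schedule": "0 */5 * * * *"
    }
  ],
  "disabled": false
}
```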
Even when you need to scale out to handle many requests, for much of the time, at sub-second (millisecond) granularity, your application is just waiting. This is an incredible waste of time and money when you are traditionally charged by the minute. With serverless, you have a single piece of code that is triggered to handle an event, and you are charged based on the effective time it executes and the effective resources (CPU and memory) it requests. Sub-second billing is both a requirement and a defining feature of serverless computing.
This is what Azure Functions is all about: abstraction from the server, event-driven development, and sub-second billing.

Azure Functions runs as a worker role, a job that runs in the background. Worker roles in App Services are implemented with WebJobs, a feature of Web Apps, because they run on the same tenant as a deployed Web App. That’s why you can think of Azure Functions as an “Azure WebJobs SDK as a Service”: you write a WebJob in a PaaS manner, without having to consider how a WebJob works or how it is structured. Azure Functions inherits the two execution models of Azure WebJobs, continuous running and triggered running, but only the triggered model is implemented; for continuously running tasks, you have to work with WebJobs directly.

Scalability, availability, and resource limits

Azure Functions relies on the Azure App Services infrastructure. Single functions are aggregated into Function Apps, which contain multiple functions sharing memory allocated in multiples of 128 MB, up to 1.5 GB.
There are two hosting options for Azure Functions: the classic App Service Plan, where you manage resources yourself, and the new Consumption Plan.
In the App Service Plan, you choose the size of the dedicated instances underlying the service and how they scale (manually or with autoscaling). Autoscaling allocates machines based on usage metrics, for example, CPU load over a five-minute window. The service is priced by the hour and charged by the minute. This is the best choice if you already have an underutilized service plan, or if you expect to run your functions almost continuously.
With the new Consumption Plan, you do not control the host size or the scaling: instances are added or removed automatically to minimize the total time a request waits to be handled. Billing is sub-second, based on the number of actual executions of the Function App and the duration of each execution (not idle time) multiplied by the reserved memory, measured in GB-seconds.
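The billing math can be sketched as follows (the function is ours, not an official calculator; the rates are the list prices quoted in the comparison table later in this post, and the monthly free grant is ignored for simplicity):

```javascript
// Consumption Plan cost sketch:
//   cost = invocations × per-invocation rate + GB-seconds × per-GB-second rate
// where GB-seconds = invocations × average duration (s) × reserved memory (GB).
function consumptionCost(invocations, avgDurationSec, memoryGb) {
  const PER_MILLION_INVOCATIONS = 0.20; // $ per 1M invocations (after free tier)
  const PER_GB_SECOND = 0.000016;       // $ per GB-second (after free tier)
  const gbSeconds = invocations * avgDurationSec * memoryGb;
  return (invocations / 1e6) * PER_MILLION_INVOCATIONS + gbSeconds * PER_GB_SECOND;
}

// e.g. one million 100 ms executions with the minimum 128 MB allocation
const monthly = consumptionCost(1e6, 0.1, 0.128);
```

For one million 100 ms executions at 128 MB, that works out to about $0.20 for the invocations plus roughly $0.20 for the 12,800 GB-seconds consumed.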
Resource limits are not clearly documented at the moment. From various sources, it appears that Azure Functions backs a Function App with a maximum of 10 instances.
Execution is limited to 5 minutes per invocation. There are interesting discussions in forums where some argue this can be a limitation, especially for long-running tasks.

Supported Languages and Dependencies management

On top of the WebJobs service, Azure has added the new WebJobs Script runtime. It implements the Azure Functions host, support for dynamic compilation, and the language abstraction that enables multi-language support. You can experiment with several languages, such as F#, Python, Batch, PHP, and PowerShell, and new languages will be added in the future; for now, only C# and Node.js are officially supported.
For C#, your code is written in .csx files, the C# scripting format introduced in the .NET world with the advent of Roslyn and Roslyn-based compilers. C# is treated as a dynamic language whose assembly is generated just before the execution request.
All dependencies are managed with the NuGet service. Each function contains a project.json file where all required libraries are declared; they are downloaded automatically on first execution.
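As a sketch (the package name and version are illustrative), a project.json declaring a NuGet dependency looks like this:

```json
{
  "frameworks": {
    "net46": {
      "dependencies": {
        "Newtonsoft.Json": "10.0.3"
      }
    }
  }
}
```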
You can find more details on .NET Core on our blog.
Node.js has been supported natively by App Services from day one, for both JavaScript and its superset, TypeScript.

Deployments and versioning

Azure Functions relies on the App Services infrastructure for deployment, which already supports various code sources such as Bitbucket, Dropbox, GitHub, and Visual Studio Team Services. Once this support is configured, it is no longer possible to edit functions from the portal: the code becomes read-only.
Deployments work on a per-app basis, not per function. Therefore, when you deploy, you impact all of your functions. The following structure represents the folder containing all of the code for a Function App. As you can see, the content of wwwroot includes a host.json file that defines global and common parameters for the app, plus one folder per function.
Different functions in the same app can be implemented in different languages.

| - host.json
| - mynodefunction
| | - function.json
| | - index.js
| | - node_modules
| | | - ... packages ...
| | - package.json
| - mycsharpfunction
| | - function.json
| | - run.csx

There is no direct support for versioning in the app: you can only run the latest deployed version of your functions. Versioning is, of course, supported through the source code repository, such as a Git repository. As Functions becomes pervasive in many different scenarios, the service will probably support versioning in a more structured way, with versions and aliases for requirements like testing or staging.

Invocations, events, and logging

Azure Functions supports the event-driven approach: you can trigger a function whenever something interesting happens within the cloud environment. Many different services can trigger a function, and many of these triggers come directly from the WebJobs underpinnings of Azure Functions. HTTP triggers are supported both as a REST API and as a WebHook.

All triggers are bound to function parameters and natively supported by the environment, so you don’t need manual configuration: the environment helps you with the mapping. Each function describes its bindings in a function.json file, where you configure the name, trigger type, direction, and source-specific settings.

{
  "bindings": [
    {
      "authLevel": "function",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req"
    },
    {
      "type": "http",
      "direction": "out",
      "name": "$return"
    }
  ],
  "disabled": false
}

Azure Functions inherits all App Services logging features. On the remote console, you can inspect diagnostic messages such as debug output, function timestamps, and invocations. You can also connect to a streaming log to see the full content at the HTTP level.

Load testing and statistics

We tested a Function App with the same code that we used in our previous post comparing Google Cloud Functions with AWS Lambda (i.e. generating one thousand MD5 hashes per invocation).
We configured a linearly incremental load of 5 minutes, up to about 70 requests per second.
Please note that we have not re-executed the tests on AWS Lambda and Google Cloud Functions, so we cannot say whether these platforms have improved since last year. To summarize, AWS averaged 400 to 600ms, substantially independent of the increasing load. Google Cloud Functions had a better response time, averaging around 200ms, though it increased rapidly after reaching 60 req/s.
Below, we have plotted a chart showing the average response time and the average number of requests per second. As in the previous test, we deployed the function in the North Europe region.
The average response time remains under 400ms, and mostly below 300ms. We noticed a change in behavior similar to Google’s at about 60 req/s, when the average response time increases from roughly 80-100ms to about 200ms. We are still investigating the strange peaks in response time that occurred twice in this test, which affected the number of requests handled per second.

Microsoft Azure Functions vs Google Cloud Functions: Function code compatibility 

There is no support or effort to make Azure Functions compatible with AWS Lambda and Google Cloud Functions. From the code point of view, the function content can be refactored to be mostly independent from the infrastructure. Trigger messages are handled automatically and mapped to parameters.
Language support differs because of the different developer communities. Node.js is supported by all of the platforms (with the obvious differences and similarities in how function content is implemented), and thanks to the open source release of .NET Core, C# has also gained broader interest.
You can improve portability by moving business logic into separate functions, packages, or libraries, keeping the function body itself for parameter handling and invocation of that logic. Then, when you really need to migrate, only minor changes are required to adapt your code.
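The idea above can be sketched as follows (the handlers and parameter shapes are illustrative, not taken from a real migration): portable business logic lives in a plain function, and each platform gets only a thin adapter.

```javascript
// Portable business logic: no cloud-provider types involved.
function greet(name) {
  return 'Hello, ' + (name || 'world');
}

// Thin Azure Functions-style adapter: maps the trigger to the logic.
function azureHandler(context, req) {
  context.done(null, { status: 200, body: greet(req.query && req.query.name) });
}

// Thin AWS Lambda-style adapter: same logic, different plumbing.
function lambdaHandler(event, context, callback) {
  callback(null, {
    statusCode: 200,
    body: greet(event.queryStringParameters && event.queryStringParameters.name)
  });
}
```

Migrating then means rewriting only the adapter, not the logic.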


We can summarize the functionalities discussed in this post with the following table:

| Feature | AWS Lambda | Google Cloud Functions | Azure Functions |
| --- | --- | --- | --- |
| Scalability & availability | Automatic scaling (transparent) | Automatic scaling | Manual or metered scaling (App Service Plan), or sub-second automatic scaling (Consumption Plan) |
| Max # of functions | Unlimited | 1,000 per project | Unlimited |
| Concurrent executions | 1,000 per account, per region (soft limit) | No limit | No limit |
| Max execution time | 300 sec (5 min) | 540 sec (9 min) | 300 sec (5 min) |
| Supported languages | JavaScript, Java, C#, and Python | JavaScript only | C#, JavaScript, F#, Python, Batch, PHP, PowerShell |
| Dependencies | Deployment packages | npm (package.json) | npm, NuGet |
| Deployments | ZIP upload only (to Lambda or S3) | ZIP upload, Cloud Storage, or Cloud Source Repositories | Visual Studio Team Services, OneDrive, local Git repository, GitHub, Bitbucket, Dropbox, external repository |
| Environment variables | Yes | Not yet | App Settings and ConnectionStrings from App Services |
| Versioning | Versions and aliases | Cloud Source branch/tag | No built-in versioning (use a source repository) |
| Event-driven | S3, SNS, SES, DynamoDB, Kinesis, CloudWatch, Cognito, API Gateway, CodeCommit, etc. | Cloud Pub/Sub or Cloud Storage Object Change Notifications | Blob, Event Hub, generic WebHook, GitHub WebHook, Queue, HTTP, Service Bus Queue, Service Bus Topic, Timer triggers |
| HTTP(S) invocation | API Gateway | HTTP trigger | HTTP trigger |
| Orchestration | AWS Step Functions | Not yet | Azure Logic Apps |
| Logging | CloudWatch Logs | Stackdriver Logging | App Services monitoring |
| Monitoring | CloudWatch & X-Ray | Stackdriver Monitoring | Application Insights |
| In-browser code editor | Yes | Only with Cloud Source Repositories | Functions environment, App Service editor |
| Granular IAM | IAM roles | Not yet | IAM roles |
| Pricing | 1M requests free, then $0.20/1M invocations, plus $0.00001667/GB-s | 1M requests free, then $0.40/1M invocations, plus $0.00000231/GB-s | 1M requests free, then $0.20/1M invocations, plus $0.000016/GB-s |

Azure Functions is the logical evolution of PaaS programming in Azure and the serverless proposal for custom code execution. As with many other services in Azure, it will improve over time as development continues with feedback from early adopters.
If you enjoyed this post, feel free to comment and let us know what you think of the serverless revolution. We are happily using Lambda functions in the Cloud Academy platform as well, and we can’t wait to see what will happen next (if you don’t know it yet, have a look at the Serverless Framework).

Written by

Marco Parenzan is a Research Lead for Microsoft Azure in Cloud Academy. He has been awarded three times as a Microsoft MVP on Microsoft Azure. He is a speaker in major community events in Italy about Azure and .NET development and he is a community lead for 1nn0va, an official Microsoft community in Pordenone, Italy. He has written a book on Azure in 2016. He loves IoT and retrogaming.
