Microsoft Azure offers services for a wide variety of compute-related needs, including traditional compute resources like virtual machines, as well as serverless and container-based services. In this course, you will learn how to design a compute infrastructure using the appropriate Azure services.
Some of the highlights include:
- Designing highly available implementations using fault domains, update domains, availability sets, scale sets, availability zones, and multi-region deployments
- Ensuring business continuity and disaster recovery using Azure Backup, System Center DPM, and Azure Site Recovery
- Creating event-driven functions in a serverless environment using Azure Functions and Azure Logic Apps
- Designing microservices-based applications using Azure Container Service, which supports Kubernetes, and Azure Service Fabric, which is Microsoft’s proprietary container orchestrator
- Deploying high-performance web applications with autoscaling using Azure App Service
- Managing and securing APIs using Azure API Management and Azure Active Directory
- Running compute-intensive jobs on clusters of servers using Azure Batch and Azure Batch AI
Learning Objectives
- Design Azure solutions using virtual machines, serverless computing, and microservices
- Design web solutions using Azure App Service
- Run compute-intensive applications using Azure Batch
Intended Audience
- People who want to become Azure cloud architects
- People preparing for a Microsoft Azure certification exam
Prerequisites
- General knowledge of IT architecture
Serverless computing is a hot trend. The idea is that you don’t have to worry about the infrastructure underlying the service, such as virtual machines, because the service takes care of that for you. Although a wide variety of Azure services work this way, such as the Bot Service and Stream Analytics, the one people usually mean when they say “serverless” is Azure Functions. That’s because with Azure Functions, you can write almost any kind of code you want executed, and the service will do the rest. It’s more of a general-purpose serverless environment than the others.
Azure Functions is event-driven. In other words, your code runs only when it’s triggered by a certain event. For example, if you need to process images as users upload them to a Blob storage container, you can use the BlobTrigger to execute your image-processing code. There are triggers for events in many other Azure services as well, such as Event Hubs, Service Bus, and Cosmos DB. Triggers don’t always come from other services, though. For example, with the HTTPTrigger, you can call your function from an application by sending it an HTTP request. Yet another possibility is to have your function run on a fixed schedule, such as every day at midnight, using the TimerTrigger. Note that a function can have only one trigger.
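To make this concrete, here’s a minimal sketch of a blob-triggered function using the Azure Functions Python worker. The function name, parameter name, and processing logic are placeholders; the trigger itself is declared separately in a configuration file, which we’ll look at in a moment.

```python
import logging

import azure.functions as func  # from the azure-functions package

# A minimal sketch of a blob-triggered function. The parameter name
# "myblob" must match the "name" property of the trigger binding
# declared in function.json (shown in the next example).
def main(myblob: func.InputStream):
    # Runs each time a blob is added to the watched container.
    logging.info("Processing blob: %s (%s bytes)", myblob.name, myblob.length)
    data = myblob.read()  # the raw bytes of the uploaded file
    # ... your image-processing logic would go here ...
```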
A trigger will include input data, such as the name of the new blob that fired the trigger. You can easily refer to this data in your code because Azure Functions automatically creates what it calls an input binding. This saves you the trouble of having to write code to connect to the input source.
You can also create your own input and output bindings, either in the Azure portal or in the function.json file. Bindings use a declarative syntax, so you don’t have to specify how to connect to the data source or sink. You only have to give basic details about its type and where it is.
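For instance, a function.json for the image-processing scenario above might look something like this (the container names are made up). Notice how declarative it is: nothing here describes how to connect, only the type of each binding and where the data lives.

```json
{
  "bindings": [
    {
      "name": "myblob",
      "type": "blobTrigger",
      "direction": "in",
      "path": "uploads/{name}",
      "connection": "AzureWebJobsStorage"
    },
    {
      "name": "outputblob",
      "type": "blob",
      "direction": "out",
      "path": "thumbnails/{name}",
      "connection": "AzureWebJobsStorage"
    }
  ]
}
```

The {name} token from the trigger’s path can be reused in the output binding, so a file uploaded as uploads/cat.png would be written out as thumbnails/cat.png.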
A typical function gets triggered, performs some simple operations, and then ends, but in many cases you need to do something more complex. For example, suppose you need to run a sequence of functions in a particular order, where the output of one function is the input to the next. You could write code yourself to maintain state and orchestrate the execution of the various functions, but it’s much simpler to use Durable Functions, an extension to Azure Functions that lets you write orchestrator functions. An orchestrator maintains the workflow’s state for you and can call other functions synchronously or asynchronously, as in the sketch below.
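Here’s a rough sketch of an orchestrator using the Python Durable Functions library. The activity names (“Resize” and “Watermark”) are hypothetical; each would be a separate activity function in the same function app.

```python
import azure.durable_functions as df  # from the azure-functions-durable package

# A sketch of an orchestrator that chains two hypothetical activity
# functions, feeding the output of one into the next. Durable Functions
# checkpoints the workflow's state at each yield, so the orchestrator
# can be restarted without losing its place.
def orchestrator_function(context: df.DurableOrchestrationContext):
    resized = yield context.call_activity("Resize", "photo.png")
    result = yield context.call_activity("Watermark", resized)
    return result

main = df.Orchestrator.create(orchestrator_function)
```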
Most functions don’t run continuously. They’re triggered when an event occurs, and they run only briefly. That’s why the most common pricing plan for Azure Functions is the Consumption plan. Under this model, Azure allocates compute resources when the function is triggered and removes them when the function finishes. It even scales out to handle high loads. You pay only for the resources used while your functions are running.
This works very well most of the time, but there are circumstances when you need to use an App Service plan. Under this model, your functions run on dedicated VMs. Here are some of the reasons why you might want to do this:
- Your function will run almost continuously. In this case, it would be cheaper to use dedicated VMs than pay-as-you-go.
- Your function needs to run for longer than 10 minutes, which is the maximum allowed under the Consumption plan (see the host.json note after this list).
- Your function needs to run on Linux. The Consumption plan only supports Windows.
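As a point of reference, the per-invocation time limit comes from the functionTimeout setting in the function app’s host.json file. On the Consumption plan it defaults to five minutes and can be raised to at most ten; on an App Service plan the limit can be raised further or effectively removed.

```json
{
  "version": "2.0",
  "functionTimeout": "00:10:00"
}
```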
Microsoft also provides a simpler service if you don’t want or need to write custom code. It’s called Azure Logic Apps. Like Azure Functions, it’s invoked using triggers. The difference is that the actions executed by Logic Apps are not written in code. For example, an action can send an email or push an item onto a queue.
With Logic Apps, you can create workflows visually in the Azure portal. For example, suppose you want to make a copy of every file that gets uploaded into a Blob storage container. First, you create a Logic App. Then you create a trigger. There are lots available, so it’s usually easiest to search for the one you need. I’ll type “blob”. This trigger gets invoked when a blob is added or modified, which is what we want. Then you have to select the container where the trigger will look for new or modified files.
Next, you add an action. You can either search for an action or narrow down the list by selecting the connector first. Then select the “Copy blob” action. Now paste the URL for your storage account. To get the path of the specific blob from the previous step, you can go over to the dynamic content list and select “Path”.
Then you put in the path of the container where you want the blob to be copied to. This time, we need to select the name from the dynamic content list.
That’s all you have to do. No coding required. Just save it and run it. Now whenever you upload a file to the source blob container, it will get copied to the destination blob container.
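It’s worth knowing that the visual designer is just a front end for a JSON workflow definition, which you can inspect in the portal’s code view. A heavily simplified skeleton of a two-step workflow like this one might look roughly like the following; the connector-specific inputs are omitted, and the trigger and action names are just the display names the designer generates.

```json
{
  "definition": {
    "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
    "triggers": {
      "When_a_blob_is_added_or_modified": {
        "type": "ApiConnection",
        "recurrence": { "frequency": "Minute", "interval": 1 },
        "inputs": {}
      }
    },
    "actions": {
      "Copy_blob": {
        "type": "ApiConnection",
        "runAfter": {},
        "inputs": {}
      }
    }
  }
}
```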
Logic Apps is integrated with lots of other Azure services, such as Event Grid and Machine Learning. It also has connectors for products and services from many other vendors, ranging from Twitter to Salesforce. It’ll even connect to on-premises systems, such as an Oracle database.
And that’s it for serverless solutions.
Guy launched his first training website in 1995 and he's been helping people learn IT technologies ever since. He has been a sysadmin, instructor, sales engineer, IT manager, and entrepreneur. In his most recent venture, he founded and led a cloud-based training infrastructure company that provided virtual labs for some of the largest software vendors in the world. Guy’s passion is making complex technology easy to understand. His activities outside of work have included riding an elephant and skydiving (although not at the same time).