The Azure cloud is a collection of resources that work in unison to deliver a product or a service. Hardware is virtualized, and services can be created and destroyed at the stroke of a key. In the context of DevOps, resources can be "spun up" as part of a pipeline. When resources are deployed multiple times, it is crucial that those deployments are consistent, repeatable, and automatable. Deploying manually through the Azure portal isn't practical at scale. Azure Resource Manager (ARM) provides an interface for processing resource templates that specify resource deployments.
In this course, we look at how those templates can be built and deployed. We start with a simple template and move on to more complex examples to illustrate many of the useful features available when deploying resources with templates. This course contains plenty of demonstrations from the Azure platform so you can see exactly how to use Azure Resource Manager in practice.
If you have any feedback on this course, please reach out to us at email@example.com.
- Understand what Azure Resource Manager (ARM) is and its use cases
- Learn about the different ARM templates available and how they can be used
- Deploy databases using an ARM template
- Export a template and create templates using QuickStart templates
- Deploy resources using a script
This is a beginner-level course aimed at anyone who wants to learn more about managing and configuring their Azure environment, and fast-tracking their deployments.
To get the most out of this course, you should have a basic understanding of the Azure platform.
Obviously, using a default parameter value in our linked template, which is essentially the same as hard coding the value, is completely unacceptable and almost pointless. Before I show you how to use a parameter file to change parameters dynamically, I'll just delete the web app and plan. Let's go back to Visual Studio Code and create a new JSON file called deploy parameters; from the autocomplete, I will choose the parameter template. We just need the app name and plan name parameters, with values, inside this parameter file.
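A parameter file along these lines would match what's described above; this is a minimal sketch, and the parameter names (appName, planName) and their values are assumptions based on the transcript rather than the course's exact files:

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "appName": { "value": "myWebApp" },
    "planName": { "value": "myAppServicePlan" }
  }
}
```

Each entry under "parameters" is matched by name to a parameter declared in the ARM template at deployment time, so the template itself no longer needs hard-coded defaults.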
With the parameters set up, I'll save and close the parameter file and go back to appsplandeploy.json to add the plan name parameter to the website deploy template. The plan name parameter in this template, website deploy, is where the plan name comes in from externally. We then pass it to the linked or nested template via a parameter within the linked template resource below.
Now, in the parameters section of the linked template resource, I will add the plan name parameter and pass it the planName parameter value. In reality, this is a contrived and circular scenario, as the plan name parameter is passed into the linked template and then retrieved again through its outputs value, but it gives you an idea of how to use parameters in linked templates, building on the previous example. Now, to deploy the resources, we've got the parameters in a file. That file will be passed as a parameter to the New-AzResourceGroupDeployment command. The parameter values are pulled out of the JSON file and matched by name to the parameters defined in the ARM template file. Those parameters can, in turn, be passed to linked or nested templates, again matching on name.
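The linked template resource described above might look like this; it's a sketch only, and the deployment name, apiVersion, and the linkedTemplateUri variable are assumptions, not the course's actual template:

```json
{
  "type": "Microsoft.Resources/deployments",
  "apiVersion": "2021-04-01",
  "name": "linkedPlanDeploy",
  "properties": {
    "mode": "Incremental",
    "templateLink": {
      "uri": "[variables('linkedTemplateUri')]"
    },
    "parameters": {
      "planName": { "value": "[parameters('planName')]" }
    }
  }
}
```

The "parameters" object inside "properties" is what hands the outer template's planName value down to the linked template, which must declare a parameter of the same name.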
Let's switch over to the PowerShell command prompt. First, I'll test the deployment, specifying the resource group name, the ARM template file, and the parameters file. Having tested that successfully, I'll now redeploy by executing New-AzResourceGroupDeployment with the template and parameter file arguments. We can see in both the PowerShell console and the portal that the deployment succeeded.
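The test-then-deploy sequence above uses the Az PowerShell module's Test-AzResourceGroupDeployment and New-AzResourceGroupDeployment cmdlets; in this sketch the resource group name and file paths are placeholders, not the values used in the demo:

```powershell
# Validate the template and parameter file against the resource group first
Test-AzResourceGroupDeployment `
  -ResourceGroupName "myResourceGroup" `
  -TemplateFile ".\appsplandeploy.json" `
  -TemplateParameterFile ".\deployparameters.json"

# If validation returns no errors, run the actual deployment with the same arguments
New-AzResourceGroupDeployment `
  -ResourceGroupName "myResourceGroup" `
  -TemplateFile ".\appsplandeploy.json" `
  -TemplateParameterFile ".\deployparameters.json"
```

Testing first catches template and parameter errors without touching any resources, which is why the transcript validates before redeploying.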
Hallam is a software architect with over 20 years' experience across a wide range of industries. He began his software career as a Delphi/Interbase disciple but switched his allegiance to Microsoft and its deep and broad ecosystem. While Hallam has designed and crafted custom software using web, mobile, and desktop technologies, he believes good-quality, reliable data is the key to a successful solution. The challenge of quickly turning data into useful information for digestion by humans and machines has led Hallam to specialize in database design and process automation. Showing customers how to leverage new technology to change and improve their business processes is one of the key drivers keeping Hallam coming back to the keyboard.