Microsoft Azure Storage Accounts are cloud-based data storage repositories for a variety of data types, including blobs, queues, tables, and files. Managing the data in these accounts is often the responsibility of the application developer who uses this data. This course focuses on blob storage and the tools and methods developers can use to manage blobs in Azure Storage Accounts.
The course begins with a brief review of Azure Storage Accounts and then drills down into the details of blob storage services, highlighting the different kinds of blobs.
The course then focuses on moving blobs between storage containers within a storage account and moving blobs between different storage accounts, using the AzCopy tool, using PowerShell, and programmatically using C#.NET. Next, the course dives into blob properties and metadata and how to set and retrieve this information using the Azure Portal, PowerShell, and programmatically in C#.NET. The course then moves into blob leasing, what it is used for, and how to obtain and manage blob leases using the Azure CLI, the REST API, and C#.NET. The last topic in this course covers data archiving and retention by leveraging storage tiers, the new Lifecycle Management feature in the Azure Portal, and the immutable storage policies feature.
- Moving items in blob storage between storage containers
- Moving items in blob storage between storage accounts
- Setting and retrieving blob properties and metadata
- Implementing blob leasing
- Implementing data archiving and retention
- Azure developers who design and build cloud-based applications and services
- People preparing for Microsoft’s Azure AZ-203 exam
You’ll need to have a basic understanding of Azure, have some experience developing scalable solutions, and be skilled in at least one cloud-supported programming language.
Azure Storage offers different storage tiers, allowing blob objects to be stored cost-effectively. The tiers offered for block blob storage accounts include Premium, Hot, Cool, and Archive, while general-purpose v2 storage accounts offer the Hot, Cool, and Archive tiers.
The Premium storage tier is hosted on high-performance solid-state drives, optimized for low latency and high transaction rates, and is intended for frequently accessed data. This tier has the highest cost of the four tiers.
The Hot storage tier is optimized for storing frequently accessed data and is designed for data that is actively read and modified. It incurs a higher storage cost than the Cool or Archive tiers but has the lowest access cost.
The Cool tier has a lower storage cost than the Hot tier but a higher access cost. This tier is designed for data that will remain in the tier for at least 30 days, such as short-term backups.
The Archive tier has the lowest storage cost but the highest access cost. This tier is intended for data that will remain in the tier for at least 180 days and can tolerate several hours of retrieval latency. Data in this tier is offline and cannot be copied, overwritten, or modified, and snapshots cannot be taken of blobs in this tier. Blob metadata does remain available, allowing for the retrieval of blob properties, metadata, and blob lists.
To access the data in the Archive tier, the blob must first be rehydrated. This is done by changing the tier for the blob to either Cool or Hot. Note that it can take up to 15 hours for the rehydration process to complete.
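Under the hood, both a tier change and a rehydration are the same Set Blob Tier REST operation: a PUT against the blob with `comp=tier` in the query string and the target tier in the `x-ms-access-tier` header (with an optional `x-ms-rehydrate-priority` header). The sketch below builds such a request; the account, container, and blob names are placeholders, and the authorization header (SAS token or shared key signature) that a real call would need is omitted.

```python
def build_set_tier_request(account, container, blob, tier, rehydrate_priority=None):
    """Build the method, URL, and headers for the Set Blob Tier REST operation.

    Rehydrating an archived blob is just a Set Blob Tier call that targets
    Hot or Cool; x-ms-rehydrate-priority selects Standard or High priority.
    Authentication (SAS or shared key) is intentionally left out of this sketch.
    """
    url = (f"https://{account}.blob.core.windows.net/"
           f"{container}/{blob}?comp=tier")
    headers = {
        "x-ms-access-tier": tier,      # target tier: Hot, Cool, or Archive
        "x-ms-version": "2019-02-02",  # an API version that supports rehydrate priority
    }
    if rehydrate_priority is not None:
        headers["x-ms-rehydrate-priority"] = rehydrate_priority
    return "PUT", url, headers

# Rehydrate an archived blob back to the Hot tier at standard priority.
method, url, headers = build_set_tier_request(
    "stgacc", "archived-images", "photo1.jpg", "Hot",
    rehydrate_priority="Standard")
```

Issuing the same request with `x-ms-access-tier: Archive` would move an online blob into the Archive tier instead.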
Storage tiers can be used as part of an archiving policy for blobs, moving less frequently accessed blobs to progressively lower-cost storage tiers. The storage tier for a blob can be changed from the Azure portal at the blob level by opening the blob's properties and choosing the desired tier from the dropdown list.
PowerShell can be used to change the storage tier for a single blob, or for all blobs in a container, by using the Get-AzureStorageBlob cmdlet. By piping the returned objects to the Where-Object filter, you can filter the blobs against any of the blob properties, most notably the LastModified property.
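The age-based filtering that Where-Object performs against LastModified can be sketched as a small helper. This is an illustrative Python function, not part of any Azure SDK: it takes (name, last-modified) pairs, mirroring the Name and LastModified properties returned by Get-AzureStorageBlob, and returns the blobs old enough to move to a cheaper tier.

```python
from datetime import datetime, timedelta, timezone

def tiering_candidates(blobs, days, now=None):
    """Return names of blobs not modified in the last `days` days.

    `blobs` is a sequence of (name, last_modified) pairs; blobs whose
    LastModified timestamp is older than the cutoff are candidates for
    a lower-cost storage tier.
    """
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(days=days)
    return [name for name, last_modified in blobs if last_modified < cutoff]

# Hypothetical listing: one recently modified blob, one stale blob.
now = datetime(2019, 6, 1, tzinfo=timezone.utc)
blobs = [
    ("report.pdf",  datetime(2019, 5, 25, tzinfo=timezone.utc)),  # 7 days old
    ("archive.zip", datetime(2019, 1, 1,  tzinfo=timezone.utc)),  # ~150 days old
]
# Blobs untouched for 30+ days are candidates for the Cool tier.
cool = tiering_candidates(blobs, days=30, now=now)
```

Running the same helper with `days=180` would pick out candidates for the Archive tier.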
In this demonstration, I'll show you how to change the storage tier for blobs using PowerShell. First I need to provide the storage context to the New-AzureStorageContext cmdlet, consisting of the storage account name, which I'll store in a variable called stgacc, and the storage account key, which I'll store in its own variable. I'll also store the container I wish to work with in its own variable.
Now I'm ready to execute the New-AzureStorageContext cmdlet. I'll pass in my storage account name and the storage account key, and then I'll store the result in a variable called ctx. So I'll execute that much of it.
Now I'm ready to retrieve the blobs that I wish to work with. I'll call the Get-AzureStorageBlob cmdlet, pass in the container and the context, and store the returned blobs in a variable so that I can work with them in the next command.
Now I'm ready to change the tier for the blobs. I access the ICloudBlob property for each blob and call its SetStandardBlobTier method, setting the tier to Cool. And I'll go ahead and execute that.
Switching to the Azure portal and navigating to the archived images container, I can see that the access tier has indeed changed. It was originally set to Hot and is now set to Cool.
Lifecycle management is another Azure storage account feature that can be used for implementing a retention and archive policy. This tool allows you to create rule-based policies for general-purpose v2 and blob storage accounts, to transition data to appropriate storage tiers and expire data at the end of its lifecycle.
Lifecycle management policies can be managed using PowerShell, the Azure Command Line Interface, REST APIs, or client libraries for .NET, Python, Node.js, and Ruby. A policy consists of a version and a collection of rules. Each rule consists of a filter set and an action set. The filter set limits the rule actions to a filtered set of objects in a container, or to a specific object name. The action set defines the storage tier or the delete action for the objects defined in the filter set.
Lifecycle management can automate the task of placing objects in the appropriate storage tier, allowing, for example, data that hasn't been modified in the last 30 days to be moved to the Cool tier, and data that hasn't been modified for 180 days to be moved to the Archive tier.
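A policy like the one just described can be sketched as the JSON document that lifecycle management consumes: a collection of rules, each with a filter set and an action set, where the conditions are evaluated against the blob's last-modified time. The rule name, prefix, and day counts below are placeholders; treat this as an illustrative shape of the policy schema rather than a policy to deploy as-is.

```python
import json

# A lifecycle management policy: tier block blobs to Cool after 30 days
# without modification, to Archive after 180 days, and delete them after
# roughly 7 years. The rule name and prefixMatch value are placeholders.
policy = {
    "rules": [
        {
            "enabled": True,
            "name": "age-out-blobs",
            "type": "Lifecycle",
            "definition": {
                # Filter set: which objects the rule applies to.
                "filters": {
                    "blobTypes": ["blockBlob"],
                    "prefixMatch": ["archived-images/"],
                },
                # Action set: tier transitions and expiry for those objects.
                "actions": {
                    "baseBlob": {
                        "tierToCool":    {"daysAfterModificationGreaterThan": 30},
                        "tierToArchive": {"daysAfterModificationGreaterThan": 180},
                        "delete":        {"daysAfterModificationGreaterThan": 2555},
                    }
                },
            },
        }
    ]
}
print(json.dumps(policy, indent=2))
```

The same document could then be submitted through the portal's Lifecycle Management blade, PowerShell, the Azure CLI, or the management REST API.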
Immutable storage for Azure blobs is an Azure storage setting that provides two types of retention policies: time-based retention and legal holds. These policies can be used for regulatory compliance, to secure document retention, and to ensure that documents critical to litigation or criminal investigations are retained in a tamper-proof state.
A legal hold policy places existing blobs in the container in a non-writable, non-deletable state until the legal hold is cleared. But blobs can be created and read. Legal holds are associated with a user-defined tag that is used as an identifier string.
Time-based retention policies place the blobs in a container in a non-writable, non-deletable state for a specified interval. A time-based retention policy must be locked for the blobs to be in an immutable state. In order to be SEC compliant, policies should be locked within 24 hours.
Immutable storage is configured from the Azure portal. Simply navigate to the storage container and, from the Settings blade, choose Access policy. Select Add policy in the Immutable blob storage section and choose the type of policy, time-based or legal hold.
- Course Introduction
- Creating an Azure Storage Account
- Moving Blobs Between Storage Containers
- Moving Blobs Between Storage Accounts
- Setting and Retrieving Blob Properties and Metadata
- Implementing Blob Leasing
- Implementing Data Archiving and Retention
Jeff is a technical trainer and developer residing in Arizona, USA. He has been a Microsoft Certified Trainer for the past 18 years, providing in-house development and training on Microsoft server operating systems, PowerShell, SQL Server and Azure. When he’s not developing and delivering courses on Azure, he’s photographing galaxies, nebulae and star formations from his computer-automated observatory in Chino Valley, Arizona using a 14” Schmidt Cassegrain telescope.