  5. Design and Implement a Storage Strategy for Azure 70-532 Certification

Monitoring Storage

Contents

  • Implement Azure Storage Blobs and Azure Files
  • Implement Storage Tables
  • Implement Azure Storage Queues
  • Manage Access
  • Monitor Storage
  • Implement SQL Databases
  • Conclusion
Overview

Difficulty: Intermediate
Duration: 1h 18m
Students: 151

Description

Course Description

This course teaches you how to work with Azure Storage and its associated services.

Course Objectives

By the end of this course, you'll have gained a firm understanding of the key components that comprise the Azure Storage platform. Ideally, you will achieve the following learning objectives:

  • Understand the various components of Azure storage services.
  • Implement and configure Azure storage services.
  • Manage access and monitor your implementation.

Intended Audience

This course is intended for individuals who wish to pursue the Azure 70-532 certification.

Prerequisites

You should have work experience with Azure and general cloud computing knowledge.

This Course Includes

  • 1 hour and 17 minutes of high-definition video.
  • Expert-led instruction and exploration of important concepts surrounding Azure storage services.

What You Will Learn

  • An introduction to Azure storage services.
  • How to implement Azure storage blobs and Azure files.
  • How to implement storage tables.
  • How to implement storage queues.
  • How to manage access and monitor storage.
  • How to implement SQL databases.  

Transcript

Hello and welcome back. We'll now cover how you can monitor Azure Storage using metrics and logging. In this section we'll discuss metrics and logging and how they can be explored using the analytics features provided through Azure Storage Analytics. We'll then look at configuring the retention policy for the logs and metrics generated by Azure. And finally, we'll look at how you can analyze the metrics and log data.

Azure Storage Analytics provides a vast amount of metrics and logging data. This data is made available through the Azure Portal, which offers many facilities to help explore key metrics and logs. Monitoring metrics can be subdivided into categories useful for checking on service health, capacity, availability, and performance. Log data can be stored on a local file system or in a blob container. These logs can be loaded into the Microsoft Message Analyzer, a tool that consumes log files and displays log data in a visual format, making it easy to filter, search, and group log data into useful sets that you can use to analyze errors and performance issues.

Let's look at some of the monitoring features in detail. Metrics can be set up to collect data on requests made to storage, covering both transactions and overall storage capacity. Metrics allow you to monitor the performance of storage in a similar way to using Performance Monitor on Windows. By default these metrics are not enabled, but they can be enabled using code or through the portal. The results are stored in Azure tables against the specified storage account, and you can choose how frequently the metrics are collected, for example hourly or every minute.
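As a hedged illustration of the "enabled using code" option, here is a minimal sketch using the azure-storage-blob v12 Python SDK rather than the tooling shown in the course; the connection string and the seven-day retention period are placeholder assumptions.

```python
# Minimal sketch: enable hourly and per-minute Storage Analytics metrics for
# the Blob service via the azure-storage-blob (v12) Python SDK.
# The connection string and 7-day retention are placeholder assumptions.
from azure.storage.blob import BlobServiceClient, Metrics, RetentionPolicy

service = BlobServiceClient.from_connection_string(
    "<your-storage-account-connection-string>")  # placeholder

retention = RetentionPolicy(enabled=True, days=7)
hour_metrics = Metrics(enabled=True, include_apis=True, retention_policy=retention)
minute_metrics = Metrics(enabled=True, include_apis=True, retention_policy=retention)

# Apply the metrics settings to the Blob service of this storage account.
service.set_service_properties(hour_metrics=hour_metrics,
                               minute_metrics=minute_metrics)
```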

You can configure logging on all requests made, for all types of access such as read, write, or delete operations. By default, this logging is not enabled. Once configured, the logs are stored as blobs in a container called $logs. Retention can be configured separately for each service: blobs, files, tables, or queues. By default, nothing is deleted until the volume of data reaches 20 terabytes; after that, metrics and logging data will cease to be collected until some other data is deleted. When you specify a retention period, you can choose to keep the data from zero to 365 days, with expired data automatically deleted after this time.
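As a companion sketch, the same v12 Python SDK can turn on request logging with a retention policy. This is an illustrative assumption, not the course's demo; the 30-day retention is a placeholder within the 0-365 day range.

```python
# Minimal sketch: enable request logging (reads, writes, deletes) for the
# Blob service with a 30-day retention policy, using azure-storage-blob (v12).
from azure.storage.blob import (BlobServiceClient, BlobAnalyticsLogging,
                                RetentionPolicy)

service = BlobServiceClient.from_connection_string(
    "<your-storage-account-connection-string>")  # placeholder

logging_settings = BlobAnalyticsLogging(
    read=True, write=True, delete=True,                       # log all access types
    retention_policy=RetentionPolicy(enabled=True, days=30))  # within 0-365 days

# The resulting logs are written as blobs into the $logs container.
service.set_service_properties(analytics_logging=logging_settings)
```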

The Azure portal provides comprehensive facilities for analyzing metrics, including text and graphical displays. Metrics are shown, and can be analyzed, separately for each storage type. Logs are produced as delimited text files, which can be downloaded and analyzed using any suitable tool. One option is to use Excel to build a purpose-built logging summary. There's also a tool called the Azure HDInsight Log Analysis Toolkit, or LAT, which provides more sophisticated log analysis, though it has not been updated recently. A newer option is to use the Microsoft Message Analyzer, which provides facilities to filter, search, and group log data.
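If you'd rather skim a downloaded log without any dedicated tool, here is a rough Python sketch that treats each entry as a semicolon-delimited record. The field positions used for operation type and HTTP status are assumptions based on the v1.0 log format and should be checked against the format documentation; the file path is a placeholder.

```python
# Sketch: summarise a downloaded Storage Analytics log file with plain Python.
# Entries are semicolon-delimited; the field indices below (operation type at
# position 3, HTTP status at position 5) are assumptions to double-check.
import csv
from collections import Counter

operations, statuses = Counter(), Counter()

with open(r"C:\logs\example.log", newline="", encoding="utf-8") as f:  # placeholder path
    for row in csv.reader(f, delimiter=";"):
        if len(row) < 5:
            continue                 # skip blank or malformed lines
        operations[row[2]] += 1      # operation type, e.g. GetBlob
        statuses[row[4]] += 1        # HTTP status code

print("Top operations:", operations.most_common(5))
print("Top HTTP statuses:", statuses.most_common(5))
```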

In this demo we'll cover setting up Azure Storage to generate metrics and logging, and then we'll demonstrate how you can use the Message Analyzer to examine the contents of a log. Let's now have a look at how to view metrics in the Azure portal. We've got the storage account open already, and if you click on the graph, it will bring up a focus panel that includes the Edit chart option. This allows you to select the metrics and the time frame displayed on the graph. Let's add the metric for success. Here are all the metrics that are available to select; the ones you can't select are grayed out. If we go down to Success, click it, and click OK, there should be a small change on the graph once it's finished reloading: there are 31 success entries, and you can see them just here where my mouse cursor is right now.
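The numbers behind the portal charts are also stored in metrics tables within the storage account, so they can be queried directly. The following is a rough sketch only: the table and column names ($MetricsHourPrimaryTransactionsBlob, TotalRequests, Success, Availability) follow the Storage Analytics metrics schema as commonly documented, and the azure-data-tables package is my assumption, not something used in the demo.

```python
# Rough sketch: read the hourly Blob transaction metrics that Storage
# Analytics writes into the $MetricsHourPrimaryTransactionsBlob table,
# using the azure-data-tables package. Table and column names are
# assumptions to verify against your own account.
from azure.data.tables import TableClient

table = TableClient.from_connection_string(
    "<your-storage-account-connection-string>",           # placeholder
    table_name="$MetricsHourPrimaryTransactionsBlob")

for entity in table.list_entities():
    # RowKey distinguishes user traffic from system (Analytics) traffic.
    if entity["RowKey"].startswith("user;"):
        print(entity["PartitionKey"],                      # hour bucket, e.g. 20240101T1400
              "total:", entity.get("TotalRequests"),
              "success:", entity.get("Success"),
              "availability:", entity.get("Availability"))
```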

To enable logging using the portal, select the storage account and then Diagnostics. So let's go to Settings and click on Diagnostics. You then have the option to set up logs for the blob, table, or queue storage types. Once you activate these options, it may take a few hours before logs become available in the $logs container.

To analyze logs you need to download them. You can use the AzCopy utility, which is part of the Azure Storage tools and is typically installed in the folder C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy. If you open a command prompt in this folder, you can run AzCopy with the /Source flag pointing at the $logs container of your storage account (for example https://<account>.blob.core.windows.net/$logs), the /Dest flag set to C:\logs, and the /SourceKey flag set to the access key of the storage account. If you run this command, it will download the data into date and time subfolders under the destination folder specified, C:\logs in this case, and you can see the output that we've got here.
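If you prefer to stay in code rather than use AzCopy, a sketch like the following downloads the same $logs blobs with the azure-storage-blob v12 Python SDK; the destination folder and the name prefix (Blob-service logs for one day) are placeholders, not values from the demo.

```python
# Sketch: download Storage Analytics logs from the $logs container with the
# azure-storage-blob (v12) SDK instead of AzCopy.
import os
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string(
    "<your-storage-account-connection-string>")            # placeholder
logs = service.get_container_client("$logs")

for blob in logs.list_blobs(name_starts_with="blob/2024/01/01/"):  # placeholder prefix
    local_path = os.path.join(r"C:\logs", blob.name.replace("/", os.sep))
    os.makedirs(os.path.dirname(local_path), exist_ok=True)
    with open(local_path, "wb") as f:
        f.write(logs.download_blob(blob.name).readall())
    print("downloaded", blob.name)
```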

You can then use the Message Analyzer to inspect these logs. This tool can be downloaded from the Microsoft Download Center. From the Tools menu, select Asset Manager. In the Asset Manager dialog, start the downloads and then search for Azure storage. You can then sync all displayed items to install the Azure Storage related items, and then restart the Message Analyzer.

You can then load the downloaded files into the Message Analyzer. Use File, New Session, and then click Blank Session. In the New Session dialog, enter a name for your analysis session. In the Session Details panel, click on the Files button, click Add Files, browse to the location where you downloaded the logs, select the logs for the time range you want to analyze, and then click Open. In the Session Details panel, set the Text Log Configuration drop-down for each server-side log file to Azure Storage Log to ensure the logs are parsed correctly, and then select the Start button, which triggers the process. The log details will then be shown in summary format, as you can see here. To get more details, you can use the layout options; here we show the results after choosing the layout Storage log, key columns.

Stay tuned for the next section, where we'll discuss the topic of Azure SQL Database.

About the Author

Isaac has been using Microsoft Azure for several years now, working across the various aspects of the service for a variety of customers and systems. He’s a Microsoft MVP and a Microsoft Azure Insider, as well as a proponent of functional programming, in particular F#. As a software developer by trade, he’s a big fan of platform services that allow developers to focus on delivering business value.