
Monitor and Debug a Cloud Service

Overview

Difficulty: Intermediate
Duration: 1h 1m
Students: 173

Description

This course shows you how to use Azure's Cloud Service platform-as-a-service offering.

Course Objectives

By the end of this course, you'll have gained a firm understanding of the key components that comprise the Azure Cloud Service platform. Specifically, you will learn:

  • How to design and develop an Azure Cloud Service. 
  • How to configure and deploy a cloud service. 
  • How to monitor and debug a cloud service. 

Intended Audience

This course is intended for individuals who wish to pursue the Azure 70-532 certification.

Prerequisites

You should have work experience with Azure and general cloud computing knowledge.

This Course Includes

  • 1 hour and 2 minutes of high-definition video.
  • Expert-led instruction and exploration of important concepts surrounding Azure cloud services.

What You Will Learn

  • A general overview of Azure cloud services.
  • How to design, deploy, and manage cloud services.
  • How to monitor your cloud services and debug them. 

Transcript

Hello, and welcome back. In this section we'll see how we're able to monitor and debug cloud services.

We'll start by showing how we're able to configure diagnostic information and logging, as well as how we're able to profile the resources that our cloud service is using. Next, we'll see a number of debugging techniques, including remote debugging a cloud service instance running in the cloud directly from Visual Studio, and we'll see how we're able to remote desktop into a cloud service. We'll see how we're able to get IntelliTrace events from the Azure service into Visual Studio, and finally, we'll see how we're able to debug a cloud service locally using the local emulator.

There are a wide variety of diagnostics that can be captured, from raw metrics to event, infrastructure, and application logs. In addition, there's also support for crash dumps, web server, and other failure logs. Note that the data is mostly captured to Azure storage tables whose names begin with the WAD prefix. The only exception is the IIS logs, which are written as blobs to the wad-iis-logfiles container.
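If you want to inspect this data directly, the storage cmdlets from the classic Azure PowerShell module can list the relevant tables and blobs; in the sketch below, the storage account name and key are placeholders rather than values from this course.

    # Placeholder values: substitute your diagnostics storage account name and key.
    $key = "<storage-account-key>"
    $ctx = New-AzureStorageContext -StorageAccountName "mydiagstorage" -StorageAccountKey $key
    Get-AzureStorageTable -Context $ctx | Where-Object { $_.Name -like "WAD*" }   # e.g. WADLogsTable, WADPerformanceCountersTable
    Get-AzureStorageBlob -Container "wad-iis-logfiles" -Context $ctx              # the IIS logs are written here as blobs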

In this demo, we'll see how to configure a cloud service to work with diagnostics. Within our cloud service, we can configure the diagnostics data for a given role by opening the diagnostics.wadcfgx file in the cloud service project. Here, we are able to specify any configuration relating to the diagnostics we might want to capture, such as crash dumps, performance counters, and even Windows event logs. As you can see, modifying an XML file by hand can be quite laborious, so Visual Studio allows you to do this via a GUI. By opening the role properties and choosing Configure Diagnostics, we are presented with a window where we're able to specify the diagnostics configuration visually.
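As an aside, if you'd rather script this than use the GUI, the classic Azure PowerShell module's Set-AzureServiceDiagnosticsExtension cmdlet can push a diagnostics configuration file to a deployed role; the service name, role name, storage account, and file path below are example values, not ones from this demo.

    # Example values throughout; assumes the classic Azure (Service Management) PowerShell module.
    $key = "<storage-account-key>"
    $storageCtx = New-AzureStorageContext -StorageAccountName "mydiagstorage" -StorageAccountKey $key
    Set-AzureServiceDiagnosticsExtension -ServiceName "CloudAcademy" -Role "WorkerRole1" `
        -StorageContext $storageCtx -DiagnosticsConfigurationPath ".\WorkerRole1\diagnostics.wadcfgx"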

Once we have deployed our cloud service, we can access the diagnostics data by navigating through the Server Explorer in Visual Studio and selecting the specific role we're interested in. We can then right-click on that role to view more diagnostics data. We can see here any events that have occurred, such as errors, app logs, crash dumps, and infrastructure logs. We can also choose to export the data to CSV or view all data from the table. And this concludes the demo.

There are four key methods of profiling an Azure cloud service supported in Visual Studio. Firstly, we can use CPU sampling, which allows us to collect CPU statistics, such as CPU usage, in a low-impact manner. We're also able to use full instrumentation, which records details of method entry and exit, as well as the associated timings, which allows us to see the methods that are doing the most work. We're also able to track the memory usage of the application by profiling its memory allocation. If we have a multithreaded application, we're also able to profile concurrency, which allows us to easily detect deadlocks and see which locks are causing contention.

In order to configure profiling, we need to specify which profiling information we want to be collected when we deploy the application. As we publish the application, we need to mark that the cloud service should be profiled. We can also select the type of profiling we want by clicking the Settings button, which presents us with a window allowing us to choose the specific profiling type.

Profiling reports are viewed in a similar way to the diagnostics information. We navigate through the Cloud Explorer to the specific cloud role we're interested in, then right-click and choose to see the profiling data. This opens the Visual Studio profiling window, which presents the profiling information that was collected.

We can also remote debug a .NET cloud service application from Visual Studio. This is extremely useful if you need to identify an issue with an application that's already been deployed, as from your local machine you can debug code running remotely in Azure. Firstly, we need to ensure that we have a build of our application that we're able to debug. To do this, we need to publish a debug build of our application. This is done in the same way as when we deployed our cloud service previously, except that we need to specify the debug configuration. Next, we need to navigate to the advanced settings and specify that Azure should enable remote debugging for all roles in the cloud service. This will install the remote debugger and allow us to connect to it from our Visual Studio instance. Finally, once we've successfully deployed the application, we can locate it within the Server Explorer of Visual Studio and select Attach Debugger. We can choose to attach the debugger to a specific role, which means breakpoints will be hit on any instance, or we can attach the debugger to a specific instance, where breakpoints will only be hit on that specific VM. Having attached the debugger, we're then able to hit breakpoints as though we were running the cloud service locally.

There may be occasions when you need to remote onto a cloud service machine. For example, you might need to diagnose the machine by hand, particularly if you installed some software as a startup task.

In this demo, we'll see how we're able to remote desktop into a cloud service instance which is running in Azure. I'm logged into the manage.windowsazure.com portal. This is not the same portal as we've been using throughout these demos; it is in fact the previous version of the Azure management portal. Whilst most things can be done with the newer portal, some features have not yet been migrated over, and remote desktop is one of them, so we'll be using the old portal to configure remote desktop for the cloud service. I first navigated into the portal, clicked on the cloud services node, and then selected Cloud Academy, the cloud service that we created earlier. From there, I went to Configure and selected Remote Desktop on the bottom taskbar to enable remote desktop.

We're now presented with a popup window, which allows us to configure the remote desktop settings. The first choice is whether to enable remote desktop for all roles or just a specific role; the default is all roles. We then need to specify a username and password, which will be used to authenticate when we try to remote into our instance. Then we need to configure a certificate for remote desktop. If you already have a certificate in the cloud service, you can use that, or you can create a new one.

Finally, we pick the expiry date for the certificate. This stops any remote attempts after the date specified. Then we hit the tick to apply the changes. Now that the changes have been applied, I've navigated to the Instances tab and selected the worker role. I can now click the Connect button on the bottom taskbar to download an RDP profile to my machine that I can use to log in. Having run the profile, it's now asking for my credentials in order to log in. After I enter my username and password, I'll be logged into the machine. And this concludes the demo.

We can also configure remote desktop using PowerShell. We first need to store the credentials used for logging in. We can do this by storing the result of the Get-Credential cmdlet in a variable; this cmdlet securely prompts us for a username and password. Having retrieved our credentials, we can use the Set-AzureServiceRemoteDesktopExtension cmdlet to enable remote desktop for the given service with the provided credentials. We can also specify that Azure should enable remote desktop when we publish a new deployment of the cloud service. When we choose to publish our application, we can enable remote desktop, and clicking Settings then allows us to configure the username and password as we saw in the management portal, as well as the expiry date of the account.
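Put together, the commands look roughly like this; the service name and expiry value are examples rather than values from the demo.

    # Assumes the classic Azure (Service Management) PowerShell module; example service name and expiry.
    $cred = Get-Credential                     # securely prompts for the remote desktop username and password
    Set-AzureServiceRemoteDesktopExtension `
        -ServiceName "CloudAcademy" `
        -Credential $cred `
        -Expiration (Get-Date).AddMonths(1)    # remote desktop access stops after this date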

We can also use IntelliTrace logs as a means of debugging and monitoring cloud services. As before, we need to enable the functionality at deployment time: as we deploy the application, we need to specify that we should enable IntelliTrace logging. Having deployed the application, we can navigate through the Cloud Explorer to the specific instance, right-click on it, and view the IntelliTrace logs.

The Azure SDK ships with two emulators, available to use locally. We can use the classic full compute emulator, which allows for multiple instances and therefore more production-like workloads; however, it requires administrator access. We also have the option of using Emulator Express, which is the newer emulator. It is limited to running single instances of services, but does not require administrator privileges to run.

To debug using the local emulator, we need to specify that we're using Emulator Express for local development in the properties of the cloud service. We can then build and run our cloud service like any other application by clicking Debug or pressing F5, which starts the Azure emulator. To access the compute emulator, we locate the Azure icon in the notification tray, right-click it, and select Show Compute Emulator. We then get an overview of the deployment currently running in the emulator, including its instances. We're able to click on any instance and get a live-streamed view of any trace messages being output by the application.

If we need to use Azure storage, we're able to run an implementation of the storage emulator locally. If the compute emulator is running, we can access the storage emulator by right-clicking on the icon in the notification tray and clicking Start Storage Emulator. If it's not running, we can search in the Start menu for the Azure Storage Emulator. This will then start the storage emulator and allow us to access it from the local machine.
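The storage emulator can also be started from a command prompt if the tray icon isn't handy; note that the executable name and install path below are assumptions and vary between SDK versions.

    # The install path and executable name differ across SDK releases; this path is an assumption.
    & "C:\Program Files (x86)\Microsoft SDKs\Azure\Storage Emulator\AzureStorageEmulator.exe" start
    & "C:\Program Files (x86)\Microsoft SDKs\Azure\Storage Emulator\AzureStorageEmulator.exe" status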

The Azure storage emulator is also accessible over HTTP at the address shown here. The local emulator provides access to blobs, tables, and queues, with nearly all features of a real storage account, and access is also available directly from Visual Studio. All of the services listen on localhost and are simply differentiated by the port they listen on. If you're using the Azure Storage SDKs, then using the connection string UseDevelopmentStorage=true will handle the local URLs for you.
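As a quick sketch, the classic Azure storage cmdlets can also target the emulator via the -Local switch, which points at the well-known local development endpoints; the container name below is just an example.

    # A minimal sketch using the classic Azure storage cmdlets; -Local targets the local
    # development storage account (by default blob on 127.0.0.1:10000, queue on 10001, table on 10002).
    $localCtx = New-AzureStorageContext -Local
    New-AzureStorageContainer -Name "demo" -Context $localCtx    # create a container in the emulator
    Get-AzureStorageContainer -Context $localCtx                 # list containers to confirm it exists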

This concludes the Design and Implement Cloud Services course, part of the Azure 70-532 exam prep by Cloud Academy. We hope you enjoyed it, and look forward to assisting you with your learning in the future.

About the Author

Isaac has been using Microsoft Azure for several years now, working across the various aspects of the service for a variety of customers and systems. He’s a Microsoft MVP and a Microsoft Azure Insider, as well as a proponent of functional programming, in particular F#. As a software developer by trade, he’s a big fan of platform services that allow developers to focus on delivering business value.