Demo
Difficulty
Intermediate
Duration
15m
Students
297
Ratings
5/5
Description

This course covers how to use the text analytics features in Azure to detect language, as well as how to retrieve and process key phrases, entities, and sentiment from a text. We'll provide you with a practical understanding of these features through a real-life demonstration on the Azure platform.

Learning Objectives

  • Understand what text analytics is and its use cases
  • Create the Azure resources for carrying out text analytics
  • Use Azure to retrieve and process phrases, entities, and sentiment from a text

Intended Audience

This course is meant for developers or architects who would like to know more about how to use the text analytics capabilities of Azure Cognitive Service for Language to understand and process text content.

Prerequisites

To get the most out of this course, you should have basic Azure experience, knowledge of Azure Cognitive Services, and some developer experience, including familiarity with terms such as REST API and SDK.

Transcript

So we’re here in the Azure Portal, and now it’s time to create the Azure Resource to use Text Analytics. To do that, I’ll click here at the top, and then select “Create a Resource”. Then in the search box, let’s type “Text Analytics”, and as you can see, Text Analytics is no longer available as an independent resource. I’ll then type “Cognitive Services”, then click on it, and in the description, you can see that this is a consolidated Cognitive Service resource for Vision, Language, Search, and Speech, which would be useful if I wanted to consolidate my billing and resource management in a single place. 

But, as I won’t be using any of these other resources, let’s click on the Back button, and this time I’ll type “Language Service” and click on it. Here in the description, you see that this is the new service that consolidates Language, QnA Maker, and LUIS, so that’s the one I want. Let’s click on Create, and on the next screen I’m provided with a few extra options, not related to Text Analytics, so I’ll just click here to continue. 

On the next screen, let’s select our “CloudAcademy” resource group, and I can leave the default region selected. For the name, I’ll type “CALS” (for Cloud Academy Language Service), and for pricing tier, I’ll select the free level, as it’s perfect for demo environments. Then I’ll confirm both checkboxes below, and click on Review and Create, and then on the Create button.

After a few seconds, my Language Service resource is created, so let’s click here on “Go to Resource”. As with any Cognitive Service resource, I need the API key to be able to access it, so let’s click here on “Keys and Endpoints”, and then copy “Key 1” to the clipboard. Then, let’s switch here to the root of the Language Service documentation at docs.microsoft.com. 

What is great about Cognitive Services is that you can test most, if not all, of the APIs from within the documentation page, without necessarily having to use external tools such as Postman. They all follow the same structure, which makes it really easy to find what you need. For example, let’s expand Language Detection, then Reference, then REST APIs, and then let’s select the current API in production, which is 3.1. This opens the API documentation page, and here on the left side, I can see all the different methods available.

On the page I’m currently on, for Language Detection, I can scroll down and see all the documentation for it, including the format of the request URL, required and optional parameters, what the JSON request body should look like, and the types of responses I can get. But if I scroll back to the top, here’s a button where I can TEST my requests. Let’s click on it.

On the next screen that opens, I’ll type my resource name, “CALS”, and then in the Headers section, I’ll paste the subscription key that I copied from the Azure Portal. Then if I scroll a little bit, notice that I already have a JSON request body, which I can change if I want. But the default one is already perfect as an example, as it contains sentences in English, French, and Spanish in the same JSON body, so let’s click here on SEND. Notice that the request also has a countryHint for the first sentence, as you can see here.

Then you see that the Language Service correctly returned the detected languages, along with the ISO language name and confidence score for each sentence… pretty cool, huh?
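The same language-detection call can be made outside the documentation page. Here’s a minimal sketch in Python using the third-party `requests` library, assuming a v3.1 Language resource; the endpoint and key below are placeholders you’d replace with the values from your own resource’s “Keys and Endpoint” blade:

```python
import json

# Placeholders -- replace with the values from your resource's
# "Keys and Endpoint" blade in the Azure Portal.
ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"
API_KEY = "<your-key-1>"

def build_language_request(texts, country_hints=None):
    """Build the JSON body for the v3.1 /languages endpoint.

    Each document needs a unique string id; countryHint is optional
    and is keyed here by the document's position in the list."""
    hints = country_hints or {}
    return {
        "documents": [
            {"id": str(i + 1), "text": t,
             **({"countryHint": hints[i]} if i in hints else {})}
            for i, t in enumerate(texts)
        ]
    }

def detect_language(texts, country_hints=None):
    """POST the documents to the Language Detection method."""
    import requests  # third-party; pip install requests
    url = f"{ENDPOINT}/text/analytics/v3.1/languages"
    headers = {
        "Ocp-Apim-Subscription-Key": API_KEY,
        "Content-Type": "application/json",
    }
    body = build_language_request(texts, country_hints)
    resp = requests.post(url, headers=headers, data=json.dumps(body))
    resp.raise_for_status()
    return resp.json()
```

With a real key in place, `detect_language(["Hello world", "Bonjour tout le monde"], {0: "US"})` would send the same kind of mixed-language request as the default sample in the documentation page.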

Let’s now switch to Sentiment and test it. I could click here on the Sentiment link and then on the test button, but an easier way is to just come here to the URL and replace Languages with Sentiment. I’ll then type “CALS” as the resource name, and then paste my API key down below. As you can see, the sample text is pretty positive, so let’s just click on SEND at the bottom to test that.

In the results, notice that we have two levels of sentiment detection. One is about each sentence – for example, “Great Atmosphere” is 100% positive. The other level, as you can see here at the top, is about the sentiment for the entire text, which is based on each individual sentence. 
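To make that two-level structure concrete, here’s a small Python helper that pulls both the document-level and sentence-level sentiment out of a v3.1 /sentiment response; the sample response below is an illustrative sketch, not the exact payload from this demo:

```python
def summarize_sentiment(response):
    """Return (document_sentiment, [(sentence_text, sentiment)])
    for each document in a v3.1 /sentiment response."""
    summaries = []
    for doc in response["documents"]:
        sentences = [(s["text"], s["sentiment"]) for s in doc["sentences"]]
        summaries.append((doc["sentiment"], sentences))
    return summaries

# Illustrative response shape -- trimmed to the fields used above.
sample = {
    "documents": [{
        "id": "1",
        "sentiment": "positive",  # document-level sentiment
        "sentences": [
            {"text": "Great atmosphere.",
             "sentiment": "positive",  # sentence-level sentiment
             "confidenceScores": {"positive": 1.0, "neutral": 0.0,
                                  "negative": 0.0}},
        ],
    }]
}

doc_sentiment, sentences = summarize_sentiment(sample)[0]
# doc_sentiment -> "positive"
# sentences    -> [("Great atmosphere.", "positive")]
```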

We could repeat the same process for each of the other methods – Entity Recognition, Entity Linking, and Key Phrase Detection. But, for those, I want to show you how to use Postman instead, so let’s switch to it.

Here I have three tabs open, one for each of the remaining methods. To make things easier, I already filled in all the information, sent the requests, and got the results. At the top, notice that I’m sending a POST request to the URL of my API – remember, I can get all this information from the API page we just came from. For each of these methods, the only information that I need on the Headers tab is the Content-Type and the API key, as you can see here. On the Body tab, notice that I have the JSON body with some text to be processed.
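For reference, the three Postman requests share the same shape. A sketch of the URLs and headers in Python (the resource name and key are placeholders, and the paths follow the v3.1 REST API):

```python
# Placeholders -- substitute your own resource name and key.
RESOURCE = "CALS"
API_KEY = "<your-key-1>"
BASE = f"https://{RESOURCE}.cognitiveservices.azure.com/text/analytics/v3.1"

# One endpoint per remaining method, matching the three Postman tabs.
METHODS = {
    "key_phrases": f"{BASE}/keyPhrases",
    "entity_recognition": f"{BASE}/entities/recognition/general",
    "entity_linking": f"{BASE}/entities/linking",
}

# The only headers each POST needs, as shown on the Headers tab.
HEADERS = {
    "Content-Type": "application/json",
    "Ocp-Apim-Subscription-Key": API_KEY,
}
```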

Here on Key Phrases, notice that from the sentence above, Text Analytics was able to extract “Cloud Academy”, “Great Way”, and “Microsoft Azure”. That can give me a quick overview of what the text is about. Let’s now switch to Entity Recognition, and here you can see that it detected “Trip” as an event, “Seattle” as a location, and “Last Week” as a time range.
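A small helper like the following could group those recognized entities by category; the response shape here is a simplified sketch of what the v3.1 entity-recognition endpoint returns, not the exact payload from the demo:

```python
def entities_by_category(response):
    """Group recognized entities by category from a v3.1
    /entities/recognition/general response."""
    grouped = {}
    for doc in response["documents"]:
        for ent in doc["entities"]:
            grouped.setdefault(ent["category"], []).append(ent["text"])
    return grouped

# Illustrative response shape -- trimmed to the fields used above.
sample = {
    "documents": [{
        "id": "1",
        "entities": [
            {"text": "trip", "category": "Event", "confidenceScore": 0.7},
            {"text": "Seattle", "category": "Location", "confidenceScore": 0.9},
            {"text": "last week", "category": "DateTime", "confidenceScore": 0.8},
        ],
    }]
}

grouped = entities_by_category(sample)
# grouped -> {"Event": ["trip"], "Location": ["Seattle"],
#             "DateTime": ["last week"]}
```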

Finally, let’s see Entity Linking. Here you can see that the request is referring to Mars, the Roman god of war. And down here, I can see that Text Analytics not only understood the correct reference but even gave me a link to the Wikipedia article related to Mars – not the planet, but the Roman god.
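To pull that Wikipedia link out programmatically, one could walk the entities in the linking response. Again, the sample below is an illustrative sketch of the v3.1 response shape, not the exact payload from the demo:

```python
def wikipedia_links(response):
    """Collect (entity name, Wikipedia URL) pairs from a v3.1
    /entities/linking response."""
    links = []
    for doc in response["documents"]:
        for ent in doc["entities"]:
            if ent.get("dataSource") == "Wikipedia":
                links.append((ent["name"], ent["url"]))
    return links

# Illustrative response shape -- trimmed to the fields used above.
sample = {
    "documents": [{
        "id": "1",
        "entities": [{
            "name": "Mars (mythology)",
            "matches": [{"text": "Mars", "confidenceScore": 0.9}],
            "dataSource": "Wikipedia",
            "url": "https://en.wikipedia.org/wiki/Mars_(mythology)",
        }],
    }]
}

links = wikipedia_links(sample)
# links -> [("Mars (mythology)",
#            "https://en.wikipedia.org/wiki/Mars_(mythology)")]
```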

About the Author
Students
2753
Courses
4

Emilio Melo has been involved in IT projects in over 15 countries, with roles ranging across support, consultancy, teaching, project and department management, and sales—mostly focused on Microsoft software. After 15 years of on-premises experience in infrastructure, data, and collaboration, he became fascinated by Cloud technologies and the incredible transformation potential they bring. His passion outside of work is to travel and discover the wonderful things this world has to offer.