
Azure Services

Overview

Difficulty: Intermediate
Duration: 1h 14m
Students: 112

Description

This course is focused on the portion of the Azure 70-534 certification exam that covers designing an advanced application. You will learn how to create compute-intensive and long-running applications, select the appropriate storage option, and integrate Azure services in a solution.

Transcript

Welcome back. In this lesson we're going to talk about some of the different services that we can use to build out our solutions. The goal of this lesson is to introduce you to some of the services that are useful, though rather scenario-specific. This won't be a deep dive into any of these, since most of these services could take up an entire course. We're going to talk about things such as Azure Search, machine learning, Event Hub, and more.

First up, let's talk about Azure Search. This is going to be a service that allows us to integrate search functionality into our own applications. It provides features such as autosuggestion and autocompletion, faceted searches, result highlighting, and spell correction, as well as geospatial searches. Azure Search also integrates with DocumentDB and other Azure data sources, and this is going to allow us to create data-driven applications that benefit from these advanced search features. Having talked about Azure Search just now and DocumentDB earlier, let's take a look at what a data-driven application might look like.
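Before we get to that example, here's a minimal sketch of what a raw query against the Azure Search REST API can look like, assuming the requests package. The service name, index name, key, and api-version below are placeholders to adapt, not values from this course.

```python
import requests

# Hypothetical service and index names; replace with your own.
SERVICE = "mysearch"
INDEX = "products"
API_KEY = "<your-query-key>"

url = "https://{0}.search.windows.net/indexes/{1}/docs".format(SERVICE, INDEX)
params = {
    "api-version": "2016-09-01",   # era-appropriate version; check current docs
    "search": "waterproof jacket", # full-text search terms
    "highlight": "description",    # ask the service to highlight matches
    "facet": "category",           # faceted navigation counts
}
headers = {"api-key": API_KEY}

response = requests.get(url, params=params, headers=headers)
for doc in response.json().get("value", []):
    print(doc)
```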

In this example we have an Ingestion Web Role, and this could be any website that accepts data from end users or internet-connected devices and then persists that data to DocumentDB. Azure Search will poll DocumentDB on some schedule to check for any newly added documents that haven't already been loaded into the search index, and then it's going to import them. Search will classify the data against the defined schema and parse and index the text. The last web role represents some form of user interface for the search functionality. It uses Azure Search to allow users to search, or maybe it handles some sort of autocomplete search box. Either way, the user sends their search, it checks Azure Search, and once it locates the document in the index, it can then fetch that document from DocumentDB. So this is a fairly simple data-driven web application and one that is fairly common.
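As a rough sketch of that last hop, the UI tier could take the IDs returned by a search call like the one above and read the full documents from DocumentDB. The pydocumentdb usage below reflects the DocumentDB Python SDK of this era, and the endpoint, key, database, and collection names are all hypothetical.

```python
from pydocumentdb import document_client

# Hypothetical endpoint, key, and names; replace with your own.
DOCDB_URI = "https://mydocdb.documents.azure.com:443/"
DOCDB_KEY = "<master-key>"

client = document_client.DocumentClient(DOCDB_URI, {"masterKey": DOCDB_KEY})

# Suppose the search call returned these document ids.
hit_ids = ["doc-123", "doc-456"]

for doc_id in hit_ids:
    # Name-based document link: dbs/{database}/colls/{collection}/docs/{id}
    link = "dbs/mydb/colls/mycoll/docs/{0}".format(doc_id)
    document = client.ReadDocument(link)
    print(document["id"], document)
```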

Next up, let's cover some services that we might use as part of IoT processing, and that's Internet of Things processing. Let's start with machine learning. Azure's machine learning platform is a fully managed service that allows us to build and deploy predictive analytics apps. It gives us an intuitive graphical interface to build out our models and even deploy them as a web service with little effort. This allows us to integrate our machine learning initiatives into our apps by creating a web service endpoint that we can consume easily.
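As a hedged sketch of consuming such an endpoint, here's what calling an Azure ML request-response web service might look like. The endpoint URL, API key, and input column names (temperature, humidity) are made up; the "Inputs"/"GlobalParameters" envelope follows the Azure ML Studio request format of this era.

```python
import json
import requests

# Hypothetical endpoint and key from the Azure ML web service dashboard.
ENDPOINT = "https://ussouthcentral.services.azureml.net/workspaces/<ws>/services/<svc>/execute?api-version=2.0"
API_KEY = "<api-key>"

# Input column names here are illustrative; use your experiment's schema.
payload = {
    "Inputs": {
        "input1": {
            "ColumnNames": ["temperature", "humidity"],
            "Values": [["22.5", "0.61"]],
        }
    },
    "GlobalParameters": {},
}

response = requests.post(
    ENDPOINT,
    data=json.dumps(payload),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer " + API_KEY,
    },
)
print(response.json())  # scored results come back as JSON
```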

Okay, the next service is going to be Event Hub. Event Hub is a data ingestion service designed to be both highly scalable and highly available. It can create partitions to receive data in parallel from millions of devices. It supports multiple protocols for receiving data, such as HTTP and HTTPS, as well as AMQP, which is a popular protocol for Internet of Things solutions. Data gets processed by throughput units, which are basically a cap on throughput: ingress is measured at up to one megabyte of data per second or 1,000 events per second, and egress is measured at two megabytes per second. There's also a soft cap of 20 throughput units per Azure subscription, and what soft cap means is that you can petition support to increase it if you need to. Event Hub allows multiple applications to process the same data through the concept of consumer groups, and this is really cool. Different apps can process the same ingested data at their own pace and at the same time. And like everything else in Azure, we have a feature-rich API that allows us to process that data programmatically.
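As a sketch of the producer side, here's what sending an event could look like with the azure-eventhub Python package (a newer SDK than this course's era, so treat the package choice as an assumption). The connection string and hub name are placeholders.

```python
import json
from azure.eventhub import EventHubProducerClient, EventData

# Placeholder connection string; take this from the portal.
CONN_STR = "Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=...;SharedAccessKey=..."

producer = EventHubProducerClient.from_connection_string(
    CONN_STR, eventhub_name="telemetry"  # hypothetical hub name
)

# Batch events so a single send stays within the size limits.
with producer:
    batch = producer.create_batch()
    batch.add(EventData(json.dumps({"deviceId": "sensor-42", "temp": 22.5})))
    producer.send_batch(batch)
```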

Now the next service in the IoT family is Stream Analytics. This is an important part of the family because it provides us with a stream processing platform, and it's offered as a service, so this is really cool. What that means for us is that we get to focus on our stream processing code rather than worrying about how the system is set up.

If you're not familiar with stream processing, let me try and describe it with an analogy. Imagine you wanted to measure how much water was in a pool. Now by and large, the volume isn't going to change that often, so you measure it and you have the results. You can go back and measure it again periodically to check for any changes if you wanted to. The periodic checks work well because the data is slowly changing. However, what if you wanted to measure how much water flows through a specific river at any given point? For that you'd need to be measuring constantly, and that's what streaming data is. It's data that is constantly changing and regularly being analyzed.

So Stream Analytics allows you to send massive amounts of events through it, and it's going to analyze that data. There are a lot of use cases for this sort of service, such as fraud detection, and that can be things like credit card fraud or telecommunications fraud, where maybe a cloned SIM card is being used or some other form of identity theft, as well as things such as health and fitness, processing data from fitness devices to identify potential health issues. And we could go on with examples like this all day.

With a large volume of internet-connected devices, there's a lot of data that we can use to improve the lives of users. Stream Analytics uses a SQL-like syntax, which reduces the learning time for developers. It can handle millions of events per second. Events are processed in order with at-least-once delivery. It can persist data to DocumentDB, SQL, Service Bus, Azure Storage, and even PowerBI, which we'll talk about in a moment. So Stream Analytics is going to allow you to switch from batch processing to real-time stream processing for some of the tasks that are well suited for it.

Okay, the next service in the IoT family is PowerBI, which is a tool that's going to allow you to turn data into useful visuals. It integrates with most data sources and will allow you to have a centralized dashboard for the information you need.
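To give a feel for the SQL-like syntax mentioned a moment ago, here's a hedged sketch. The query string below is a hypothetical Stream Analytics job query (the input/output names and fields are made up), and the plain Python that follows simulates what its tumbling window computes over a handful of sample events.

```python
from collections import defaultdict

# A hypothetical Stream Analytics query: average temperature per device
# over non-overlapping 30-second ("tumbling") windows.
QUERY = """
SELECT DeviceId, AVG(Temperature) AS AvgTemp
INTO output
FROM input TIMESTAMP BY EventTime
GROUP BY DeviceId, TumblingWindow(second, 30)
"""

# A pure-Python simulation of what that tumbling window computes.
events = [  # (device id, seconds since start, temperature) sample data
    ("sensor-1", 3, 21.0), ("sensor-1", 12, 23.0),
    ("sensor-1", 31, 25.0), ("sensor-2", 8, 19.0),
]

windows = defaultdict(list)
for device, ts, temp in events:
    windows[(device, ts // 30)].append(temp)  # bucket into 30-second windows

for (device, win), temps in sorted(windows.items()):
    print(device, "window", win, "average", sum(temps) / len(temps))
```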

Okay, we've talked a lot about different IoT services. Now let's show an example of how they might interact in a basic IoT process. In this diagram we have a basic IoT architecture to handle and process data from disparate devices located all over the world. The devices send data to Event Hub through AMQP, which is that protocol we talked about earlier. We process the event data with Azure Stream Analytics, filtering, aggregating, and comparing data, and we can use reference data from static files, which can be useful to decode, filter, and augment event data. Stream Analytics supports multiple data stores to persist processed data for subsequent offline processing. And we can reuse that processed data to make offline experimentation with machine learning possible, and then train functions to be reused inside of Stream Analytics. Because Stream Analytics can send output in multiple formats to different data stores, we can also integrate with things such as PowerBI to create real-time dashboards for events. So this setup allows us to ingest data from devices around the world, process the data in real time as well as some offline, and visualize the results in a dashboard. Now while there are a few moving pieces here, the results make the effort worthwhile.
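On the consumption side of that pipeline, a downstream app might read the same ingested stream through its own consumer group, which is the mechanism described earlier. As before, the azure-eventhub package choice, connection string, and names are assumptions for illustration.

```python
from azure.eventhub import EventHubConsumerClient

# Placeholder connection string; take this from the portal.
CONN_STR = "Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=...;SharedAccessKey=..."

def on_event(partition_context, event):
    # Each app reads at its own pace via its own consumer group.
    print(partition_context.partition_id, event.body_as_str())
    partition_context.update_checkpoint(event)

consumer = EventHubConsumerClient.from_connection_string(
    CONN_STR,
    consumer_group="dashboard",  # hypothetical consumer group name
    eventhub_name="telemetry",
)

with consumer:
    # Blocks and invokes on_event as events arrive, starting from the
    # beginning of the stream.
    consumer.receive(on_event=on_event, starting_position="-1")
```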

Okay, that's going to wrap up this lesson. In our next lesson, we're going to talk about next steps, so if you're ready, let's check that out.

About the Author

Students: 31,532
Courses: 29
Learning paths: 16

Ben Lambert is the Director of Engineering and was previously the lead author for DevOps and Microsoft Azure training content at Cloud Academy. His courses and learning paths covered Cloud Ecosystem technologies such as DC/OS, configuration management tools, and containers. As a software engineer, Ben’s experience includes building highly available web and mobile apps.

When he’s not building the first platform to run and measure enterprise transformation initiatives at Cloud Academy, he’s hiking, camping, or creating video games.