This course covers the Design advanced applications part of the 70-534 exam, which is worth 20–25% of the exam. The intent of the course is to help fill in any knowledge gaps that you might have, and help prepare you for the exam.
Welcome back. In this lesson we're going to talk about some of the different services that we can use to build out our solutions.
The goal of this lesson is to introduce you to some of the services that are useful, though they're rather scenario-specific. This won't be a deep dive into any of these, since most of these services could take up an entire course. We're gonna talk about things such as Azure Search, Machine Learning, Event Hub and more.
First up let's talk about Azure Search. This is gonna be a service that allows us to integrate search functionality into our own applications, things such as autosuggestion and autocompletion, faceted searches, result highlighting and spell correction as well as geospatial searches.
Azure Search also integrates with DocumentDB and other Azure data sources, and this is gonna allow us to create data-driven applications that benefit from these advanced search features. Having talked about Azure Search just now and DocumentDB earlier, let's take a look at what a data-driven application might look like.
In this example we have an ingestion web role, and this could be any website that accepts data from end users or Internet connected devices and then persists that data to DocumentDB. Azure Search will poll DocumentDB on some schedule to check for any newly added documents that haven't already been loaded into Search, and then it's gonna import them.
Search will classify the data from the defined schema and parse and index the text. Then the last web role represents some form of user interface for search functionality; it uses Azure Search to allow users to search, or maybe it handles some sort of autocomplete search box functionality.
Either way the user sends their search, it checks Azure Search and once it locates the document from the index it can then fetch that document from DocumentDB. So this is a fairly simple data-driven web application and one that is fairly common.
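To make that search step a bit more concrete, here's a rough Python sketch of how an app might build a query URL for the Azure Search REST API. The service name, index name, and API version here are placeholder assumptions, and a real app would also send its query key in an api-key header; check the Azure Search documentation for the current request format.

```python
from urllib.parse import quote


def build_search_url(service_name, index_name, query, api_version="2016-09-01"):
    """Construct the URL for a simple Azure Search REST query.

    service_name, index_name, and api_version are illustrative placeholders.
    """
    return (
        f"https://{service_name}.search.windows.net"
        f"/indexes/{index_name}/docs"
        f"?api-version={api_version}&search={quote(query)}"
    )


# An app would send a GET to this URL (with an "api-key" header),
# then fetch the matching documents from DocumentDB by their IDs.
url = build_search_url("contoso-search", "products", "red shoes")
```

The hypothetical `contoso-search` service and `products` index stand in for whatever your own deployment would use.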
Next up, let's cover some services that we might use as part of an IoT solution, that's Internet of Things. Let's start with Machine Learning. The Azure Machine Learning platform is a fully managed service that allows us to build and deploy predictive analytics apps.
It allows us to use an intuitive graphic interface, to build out our models and even easily deploy them as a web service. This allows us to integrate our Machine Learning initiatives into our apps by creating a web service endpoint that we can consume easily.
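As a rough sketch of what consuming such an endpoint might look like, here's a helper that builds the JSON scoring payload. The "Inputs" shape shown mirrors the classic Azure ML web service request format, but treat the exact structure, column names, and feature values as illustrative assumptions for your own published service.

```python
import json


def build_scoring_request(features):
    """Build a JSON payload for a published ML web service endpoint.

    The request shape varies by service version; this mirrors the classic
    "Inputs" format and the feature names are hypothetical.
    """
    return json.dumps({
        "Inputs": {
            "input1": {
                "ColumnNames": list(features),
                "Values": [list(features.values())],
            }
        },
        "GlobalParameters": {},
    })


# A real app would POST this payload to the endpoint URL with a Bearer key.
payload = build_scoring_request({"temperature": 72.0, "humidity": 0.41})
```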
Okay, the next service is gonna be Event Hub. Event Hub is a data ingestion service designed to be both highly scalable and highly available. It can create partitions to receive data in parallel from millions of devices. It supports multiple protocols to receive data, such as HTTP and HTTPS as well as AMQP, which is a popular protocol for Internet of Things solutions.
Data gets processed by throughput units, which is basically a cap on throughput that we measure as ingress of up to one megabyte of data per second or 1,000 events per second, and for egress it's measured as two megabytes per second.
And there's also a soft cap of 20 throughput units per Azure subscription, and what a soft cap means is that you can petition Support to increase it if you need to.
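The throughput unit arithmetic above is easy to sketch in code. This is just the back-of-the-envelope calculation implied by those per-unit limits, not an official sizing tool, and the Azure limits themselves may change over time.

```python
import math


def required_throughput_units(ingress_mb_s, ingress_events_s, egress_mb_s):
    """Estimate Event Hub throughput units needed, using the per-unit
    limits described above: 1 MB/s or 1,000 events/s of ingress, and
    2 MB/s of egress. Illustrative only; verify current Azure limits.
    """
    return max(
        math.ceil(ingress_mb_s / 1.0),       # ingress data cap per unit
        math.ceil(ingress_events_s / 1000),  # ingress event cap per unit
        math.ceil(egress_mb_s / 2.0),        # egress data cap per unit
    )


# e.g. 3 MB/s in, 2,500 events/s in, 5 MB/s out -> 3 units
units = required_throughput_units(3, 2500, 5)
```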
Event Hub allows multiple applications to process the same data through the concept of consumer groups, and this is really cool, different apps can process the same ingested data at their own pace and at the same time. And like everything else in Azure we have a feature-rich API that will allow us to process that data programmatically.
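To illustrate the consumer group idea, here's a toy in-memory model, not the real Event Hub client, where each named group keeps its own offset into the same event log, so a fast dashboard and a slow archiver can read the same data independently. All names here are made up for the sketch.

```python
class MiniEventHub:
    """Toy model of consumer groups: each group tracks its own offset
    into a shared event log, so groups consume at their own pace."""

    def __init__(self):
        self.events = []
        self.offsets = {}  # consumer group name -> next index to read

    def send(self, event):
        self.events.append(event)

    def receive(self, group, max_count=10):
        start = self.offsets.get(group, 0)
        batch = self.events[start:start + max_count]
        self.offsets[group] = start + len(batch)
        return batch


hub = MiniEventHub()
for i in range(5):
    hub.send({"device": "sensor-1", "reading": i})

fast = hub.receive("dashboard", max_count=5)  # reads all 5 events at once
slow = hub.receive("archiver", max_count=2)   # has only read 2 so far
```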
Now, the next service in the IoT family is Stream Analytics. This is an important part of the family because it provides us with a stream processing platform, and it's platform as a service, so this is really cool. What that means for us is that we get to focus on stream processing code, rather than worrying about how the system is set up.
If you're not familiar with stream processing let me try and describe it with an analogy. Imagine you wanted to measure how much water was in a pool. Now, by and large the volume isn't gonna change that often, so you measure it and you have the results. You can go back and measure it again periodically to check for any changes if you wanted to.
The periodic checks work well because the data is slowly changing. However, what if you wanted to measure how much water flows through a specific river at any given point? For that, you'd need to be measuring constantly, and that's what streaming data is: data that is constantly changing and regularly being analyzed.
So, Stream Analytics allows you to send massive amounts of events through it, and it's gonna analyze that data. There are a lot of use cases for this sort of service, such as fraud detection, whether that's credit card fraud or telecommunications fraud where maybe a cloned SIM card is being used or some other form of identity theft, as well as things such as health and fitness, processing data from fitness devices to identify potential health issues.
Now, we could go on with examples like this all day; with a large volume of Internet connected devices there's a lot of data that we can use to improve the lives of users. Stream Analytics uses a SQL-like syntax, which reduces the learning time for developers. It can handle millions of events per second, events are processed in order with at-least-once delivery, and it can persist data to DocumentDB, SQL, Service Bus, Azure Storage and even Power BI, which we'll talk about in a moment.
So, Stream Analytics is going to allow you to switch from batched processing to real-time stream processing for some of the tasks that are well suited for it.
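To make the stream processing idea concrete, here's a tiny Python sketch of a tumbling-window count, the kind of fixed, non-overlapping time-window aggregate a Stream Analytics query expresses with its TumblingWindow function. The function name and event format here are just illustrative, not a real Stream Analytics API.

```python
from collections import defaultdict


def tumbling_window_counts(events, window_seconds):
    """Group (timestamp, payload) events into fixed, non-overlapping
    windows and count events per window; a plain-Python sketch of a
    tumbling-window aggregate."""
    counts = defaultdict(int)
    for ts, _payload in events:
        window_start = (ts // window_seconds) * window_seconds
        counts[window_start] += 1
    return dict(counts)


# Five events spread across two 30-second windows
events = [(0, "a"), (10, "b"), (29, "c"), (31, "d"), (45, "e")]
result = tumbling_window_counts(events, 30)  # {0: 3, 30: 2}
```

A real streaming engine would do this continuously over unbounded input; the batch version just shows the windowing arithmetic.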
Okay, the next service in the IoT family is Power BI, which is a tool that is gonna allow you to turn data into useful visuals. It integrates with most data sources and will allow you to have a centralized dashboard for the information you need.
Okay, we've talked a lot about different IoT services, now let's show an example of how they might interact with a basic IoT process. In this diagram we have a basic IoT architecture to handle and process data from disparate devices located all over the world.
The devices send data to Event Hub through AMQP, which is that protocol that we talked about earlier. We process the event data with Azure Stream Analytics, filtering, aggregating and comparing data, and we can use reference data from static files, which can be useful to decode, filter and augment event data.
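That reference-data step amounts to joining each streamed event against a static lookup, much like a Stream Analytics query can JOIN a stream against a reference input. Here's a minimal Python sketch; the device IDs, sites, and models are all hypothetical.

```python
# Static reference data, e.g. loaded from a CSV file in blob storage
DEVICE_INFO = {
    "dev-01": {"site": "Seattle", "model": "TH-200"},
    "dev-02": {"site": "London", "model": "TH-100"},
}


def augment(event):
    """Join a streamed event against static reference data, attaching
    the site and model for the event's device (if known)."""
    info = DEVICE_INFO.get(event["device_id"], {})
    return {**event, **info}


enriched = augment({"device_id": "dev-01", "temp": 21.5})
```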
Stream Analytics supports multiple data stores to persist processed data for subsequent offline processing, and we can reuse that processed data to experiment with Machine Learning offline, and then the trained models can be reused inside of Stream Analytics.
Because Stream Analytics can send output in multiple formats to different data stores we can also integrate with things such as Power BI to create real-time dashboards for events.
So this setup allows us to ingest data from devices around the world, process the data in real-time as well as offline, and visualize the results in a dashboard. And while there are a few moving pieces here, the results make the effort worthwhile.
Okay, that's going to wrap up this lesson. In our next lesson we'll talk about implementing messaging applications. So, if you're ready to keep going then let's get started with the next lesson.
About the Author
Ben Lambert is the Director of Engineering and was previously the lead author for DevOps and Microsoft Azure training content at Cloud Academy. His courses and learning paths covered Cloud Ecosystem technologies such as DC/OS, configuration management tools, and containers. As a software engineer, Ben’s experience includes building highly available web and mobile apps.
When he’s not building the first platform to run and measure enterprise transformation initiatives at Cloud Academy, he’s hiking, camping, or creating video games.