
Deciphering Data - Optimizing Data Communication to Maximize Your ROI

Developed with
Microsoft

Contents

Deciphering Data - Optimizing Data Communication to Maximize Your ROI
Main Presentation - 48m 33s

The course is part of this learning path

Main Presentation
Overview
Difficulty
Intermediate
Duration
49m
Students
15
Ratings
5/5
Description

Data collection by itself does not provide business value. IoT solutions must ingest, process, make decisions, and take actions to create value. This course focuses on data acquisition, data ingestion, and the data processing aspect of IoT solutions to maximize value from data.

Learning Objectives

  • Learn the fundamentals of IoT
  • Understand IoT data architecture
  • Learn the common data patterns for processing IoT data
  • See these concepts applied to real-world examples

Intended Audience

This course is intended for anyone looking to improve their understanding of Azure IoT and its benefits for organizations.

Prerequisites

To get the most out of this course, you should already have a working knowledge of Microsoft Azure.

Transcript

Hello, and welcome to the extended version of the Deciphering Data session. My name is Dave Glover and I'm a Cloud Developer Advocate based in Sydney, Australia, with a focus on the Internet of Things. This session is a follow-on from the Deciphering Data Architecture session, and in this session I'll give a combination of presentation and demos to, hopefully, help you better understand how you can go and build an IoT solution with Azure.

In this session, I can't possibly cover every scenario, but what I'm gonna try and do is demonstrate and show commonly used IoT patterns in Azure. As for the agenda for the session: we're gonna talk about Azure IoT services, we're gonna look at options for stream processing, we're gonna look at data visualization with Power BI, and plug and play and IoT Central.

The first thing we'll look at is Azure IoT services. Now, as a quick recap from the Deciphering Data Architecture session, remember we have essentially three components that make up an IoT solution: we've got things that generate data, we've got insights built on that data, and we've got actions built on those insights. So let's talk more about things. We've got IoT devices that are capable of directly connecting to a cloud gateway. We've also got devices which we might need to connect via a gateway, that being a field gateway for constrained devices, or a custom cloud gateway for devices in other clouds. And we've also got intelligent edge devices providing the ability to go and run things like machine learning models, or do local stream processing.

Remember, if you're a developer, you should be thinking about message types. Are we sending telemetry messages, or raw messages, or other types of messages? For the data types, is it simple text or is it binary? What serialization are you gonna be using: JSON, Avro, Protobuf, et cetera? And remember, it's very important to add metadata to these messages, 'cause this significantly simplifies the data processing in the cloud. Next, we've got the insights. And of course, we wanna do things like stream processing of the data as it's flowing into the system, we invariably will need to do data transformation, and of course we wanna store that data. And we wanna think about the volume and the value of the data, as this will guide us as to where we're gonna be storing our data. Are we gonna put the data in a warm path or a cold path? There are performance and cost implications of doing that. And then finally, we've got actions.

So invariably, what we want to do is integrate those insights into our business systems, to maybe help us make better decisions based on data, or maybe find opportunities to automate. And of course, we would like to be able to do things like batch analytics, or maybe train a machine learning model that we could then go and deploy into the cloud, or maybe deploy onto the edge to provide local intelligence. Next, we're gonna talk about how you go about implementing an Azure IoT solution. The scenario I've chosen is a Smart Factory. The reality is the services and patterns I'm gonna be discussing are really applicable across a whole wide range of IoT solutions.

So as we know, we've got things that generate messages, and those messages will be sent to a cloud gateway. In the world of Azure, the cloud gateway service is the Azure IoT Hub service. IoT Hub is an internet-scale, secure messaging service. It supports bi-directional communications, it supports message enrichment, and it supports partitioning. I'm gonna talk about message enrichment and message partitioning in more detail shortly. As that data flows through the system, we want to do stream processing on it, and these are common services that you would use for stream processing: there is Stream Analytics, there is Apache Spark Streaming, which is part of Azure Databricks, and there is Azure Functions — common services used for stream processing. And then invariably, you wanna be able to store that data, and these are common services that you would go and use for storing data: you've got Azure Storage, Cosmos DB, putting data into an Event Hub, and things like that. And then for the actions behind this, the idea is that you would be able to go and use the insights that you've generated to drive things like smart reports, to better understand how your IoT data is flowing through the system, and maybe to be able to make better business decisions on that IoT data. Or maybe you want to go and integrate that data using things like Logic Apps, which are no-code options, and integrate the IoT data that's flowing through the system with backend business systems. And then ideally, you want those backend business systems to be able to make decisions, to then go and send a message back through IoT Hub, back to your Smart Factory, down to the device in the factory, to say, hey, let's turn off the air conditioning system because the factory is closed for the day. As I said, I'm gonna talk about all these in more detail.

So the first thing I wanna talk about is message enrichment. So you can see on the left-hand side, we've got a JSON-based message sent by a device that's monitoring an HVAC system. An HVAC is a heating, ventilation, and air conditioning system. And you can see that the device developer has captured the temperature, the humidity, and the pressure from the HVAC system, and the device developer has also added in some metadata. So for the properties that have been added, we've got an appID of HVAC, the type of message is a telemetry message, the format that the data has been serialized in is JSON, and the schema version for this is one. And that message will flow into Azure IoT Hub.
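To make the shape of that message concrete, here's a minimal Python sketch of the same idea: a JSON body plus application properties carrying the metadata. The property names follow the slide (appID, type, format, version); the actual demo code is C# using the Azure IoT device SDK, so treat this as an illustration of the pattern, not the demo's code.

```python
import json

def build_telemetry_message(temperature, humidity, pressure):
    """Build an HVAC telemetry message: a JSON body plus the
    application properties (metadata) described on the slide."""
    body = json.dumps({
        "temperature": temperature,
        "humidity": humidity,
        "pressure": pressure,
    })
    # The metadata travels as message properties, not inside the body,
    # so the cloud can route on it without deserializing the payload.
    properties = {
        "appID": "hvac",      # which application the message belongs to
        "type": "telemetry",  # telemetry vs. alert, raw, etc.
        "format": "json",     # serialization format of the body
        "version": "1",       # schema version of the body
    }
    return body, properties

body, props = build_telemetry_message(22.5, 52, 1013)
```

Keeping the metadata outside the body is what makes the routing and enrichment described later possible.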

Before I go too much further, I wanna talk about another concept, which is called a device twin. Every IoT device that is registered with Azure IoT Hub has a device twin document. A device twin document is a JSON document which has metadata in it, which can be used to describe the device. And in this case, we've got tags, and you can see in this tag, I've got a deployment location. So in this case here, I'm saying that this device is in the Sydney factory, and the longitude and latitude have been recorded. And the idea of the way this would work is that you're going to deploy the sensor into the factory, and then you'd have a process, an application, that would then say, okay, this device has been deployed, now I need to update its device twin document with this information. There's additional metadata about this device, because now we know which factory it has been deployed to, and we know the longitude, latitude, et cetera. So that information goes into the device twin document, or this JSON document.
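As a rough sketch, the tags section of such a device twin document might look like this — the field names and values here are illustrative, reconstructed from the description, not copied from the demo:

```json
{
  "tags": {
    "deploymentLocation": {
      "name": "Sydney factory",
      "latitude": -33.87,
      "longitude": 151.21
    }
  }
}
```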

Now, I can use the tag information, and I can use that for message enrichment. So what I can do in IoT Hub is say, as these messages flow through, I want to enrich these messages with data that you can find in the JSON document, in the device twin document. And you can see on the right-hand side, I've got that message that's just flowing through: the body of the message stays the same, and you can see that there were three new fields added to that message — name, latitude, and longitude have now been added. And now I've got that metadata attached to that message as it's flowing through the system. And that can now become very useful metadata, for example, for driving reports, or maybe partitioning data, or doing a whole range of things.
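Here's a small Python sketch of what that enrichment amounts to: copying selected twin tag values onto the message properties as the message passes through. This mimics the behavior for illustration — it is not IoT Hub's actual API — and the tag layout is the hypothetical one sketched above.

```python
def enrich(properties, twin_tags):
    """Return message properties with name, latitude, and longitude
    copied in from the device twin's tags. The message body itself
    is left untouched, as in IoT Hub message enrichment."""
    location = twin_tags["deploymentLocation"]
    enriched = dict(properties)  # don't mutate the original properties
    enriched["name"] = location["name"]
    enriched["latitude"] = location["latitude"]
    enriched["longitude"] = location["longitude"]
    return enriched

tags = {"deploymentLocation": {"name": "Sydney factory",
                               "latitude": -33.87, "longitude": 151.21}}
out = enrich({"appID": "hvac", "type": "telemetry"}, tags)
```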

So the next thing I wanna talk about is message routing, or partitioning. If you think about the Smart Factory, that Smart Factory is gonna have a whole range of sensors. It's gonna have sensors that are monitoring the HVAC system. It's gonna have sensors that are monitoring security systems, or lighting systems, or process systems, or production line systems. There's gonna be a whole range of sensors that are monitoring activities that are going on inside a factory. And of course, all those messages will have different application endpoints, they'll have different schemas, they might've been serialized in different formats, and things like that.

So what we need to do now is to separate out those messages. Normally, you'd probably separate out by application: you'd say, okay, all the HVAC messages — I want to partition those messages and I want to store those in a queue which is for all HVAC messages. The same deal would be for security messages: all the security messages that come through, I wanna store those in another location. The lighting messages, the production control systems, et cetera. You wanna basically separate out that data, and normally you'd probably do it by application, but you could do it by location, or device type, or the message type — is it a telemetry message, is it an alert message, et cetera.

So what you can define in Azure IoT Hub are rules — it's basically using rules for message routing. And you can see an example here. So I've got a rule that I've set up, called appID, where appID equals HVAC and the type of message is telemetry. So as those messages flow up through IoT Hub, I've got these routing rules that get applied to all these messages that go through, and I can start separating, or partitioning, out these messages.
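In IoT Hub, such a rule is written as a SQL-like routing query over the message's application properties. As a sketch of the logic only — not the Hub's query syntax — the predicate is simply:

```python
def matches_hvac_telemetry(properties):
    """The routing rule from the slide: route the message when
    appID = 'hvac' AND type = 'telemetry' in its properties."""
    return (properties.get("appID") == "hvac"
            and properties.get("type") == "telemetry")
```

A message from a security sensor (appID of security) would fail this rule and be caught by its own route instead.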

So in this case here, this was an HVAC telemetry message and the rule was true, and I then forward that message into an HVAC telemetry Event Hub. The same deal would be for a message that came from a security system, in which case the appID would be security, for example, and then I would know to route all those messages off into a security telemetry Event Hub. The same deal for lighting. And I might wanna go and say, well, all messages that come into the system — I actually wanna put those into Azure Storage, as an audit. And I would normally put those into a cold path, something like Azure Storage. I've got a lot of data, huge volumes of data, and I need to be able to store it, but I probably don't need to be able to query on that data or have really fast access to it. So you'd normally just go and put that into storage. It's a very cost-effective way of storing large amounts of data.

So the next thing I wanna talk about is stream processing. So we've enriched our data, we've now partitioned our data, and in this case here, we're just looking at the HVAC data — the data that's coming through from the heating, ventilation, and air conditioning system. So in that Event Hub, I have got a whole lot of messages that have been flowing through that system. So I've got these HVAC telemetry messages that are going into my stream processing system. And as I mentioned before, these are the common services that you might use for stream processing in Azure.

So there are broadly three: we've got Stream Analytics, we've got Apache Spark Streaming, and we've got Azure Functions. Now, I'm gonna give you a rough rule of thumb for where you might use each one of these services. If your requirements are relatively kind of normal, and your experience is really quite data centric — so the skill set in your organization, for example, is a lot of database skill sets — then Stream Analytics can be a great choice, because it's very SQL-oriented. I'm gonna drill into this a bit more in a moment, and you'll see that the techniques and the skill sets that you require to go and query a database are the same skill sets that you'd use to query data that's in flight and moving through a system.

Databricks is more data science. So if you're already invested in Apache Spark and Azure Databricks, then maybe your skill set inside your organization is more geared towards Databricks and more data science oriented, and it makes sense to go and use the Spark Streaming capabilities that are part of Databricks to go and do your stream processing. If you are more developer centric, then Azure Functions are great, because they provide a very cost-effective way of having data coming through a system and being able to process that stream, or process that message — maybe you wanna do some transformation of that message, and maybe go and store it in various locations. The other way that I think about stream processing is stateful and stateless.

Now, the first two — Stream Analytics, and Databricks with Apache Spark Streaming — are stateful-based stream processing. So, for example, they are capable of saying, let's say I want to average data over a period of time — let's pick five minutes. So we've got maybe thousands of messages coming into the stream processing system, and I want to maintain state for five minutes of all those messages that are passing into that system, and I wanna get an average — the temperature, for example — of all those messages. I'm more interested in the aggregate form of those messages rather than the individual messages. And that's where stateful stream processing can be very useful, and that's where Stream Analytics and Databricks kind of excel.
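To illustrate what that stateful, windowed aggregation means, here's a minimal Python sketch that buckets timestamped temperature readings into fixed five-minute tumbling windows and averages each bucket. It runs over a finished list for illustration, whereas Stream Analytics or Spark Streaming do this continuously over live data.

```python
from collections import defaultdict

def tumbling_average(events, window_seconds=300):
    """Average (timestamp, temperature) events over fixed,
    non-overlapping (tumbling) windows; returns a dict keyed
    by each window's start time in seconds."""
    buckets = defaultdict(list)
    for ts, temp in events:
        # Every event maps into exactly one window.
        buckets[(ts // window_seconds) * window_seconds].append(temp)
    return {start: sum(vals) / len(vals) for start, vals in buckets.items()}

# Three readings in the first 5-minute window, one in the second.
averages = tumbling_average([(0, 20.0), (60, 22.0), (299, 24.0), (300, 30.0)])
```

The "state" the talk refers to is exactly those buckets: readings held in memory until the window closes and the aggregate is emitted.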

Azure Functions are much more about being stateless. So you've got a message coming through; it deals with that message, it doesn't understand the context of this message versus all the other messages that are flowing in from the HVAC system. It just understands, hey, this is a message, I'm gonna do some transformation on that message and then pass that message on and store it somewhere, or maybe pass it into another queue. So that's, hopefully, roughly a rule of thumb.
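By contrast, a stateless transform in the spirit of an Azure Function looks at one message at a time. A hedged Python sketch — the field renaming here is just an illustrative transformation, not anything from the demo:

```python
import json

def handle_message(raw):
    """Process a single message with no knowledge of any other
    message in the stream: parse it, reshape it, pass it on."""
    msg = json.loads(raw)
    # Example transformation: rename fields before storing or forwarding.
    return json.dumps({"tempC": msg["temperature"], "rh": msg["humidity"]})

out = handle_message('{"temperature": 22.5, "humidity": 52, "pressure": 1013}')
```

Because no state is carried between calls, this kind of processing scales out trivially, which is part of why it's so cost-effective.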

So you've got Stream Analytics, which is more SQL-centric; we've got Databricks, which is more data science centric; and we've got Azure Functions, which are more developer-centric. We've got Stream Analytics and Databricks, which are more stateful-based stream processing, and we've got Azure Functions, which are more stateless. Hopefully, that's clear where you might use those stream technologies. And then of course, you wanna store or process that data and pass it out through that stream process. Where you put this data really comes down to the value and the volume of the data that you're dealing with. If you have a large volume of low value data, you'd normally go and put this into your cold path storage. And in Azure, this is Azure Storage, and it's very cost-effective storage for storing large volumes of data. And you would then be able to use that data maybe for training machine learning models, or batch analytics, or maybe for audit processes, or things like that.

The other place you might wanna go and put data is into the warm path. So the example I've got here is Cosmos DB. And the warm path is much more about interactive reports. So this store is normally quite, what I call, queryable. You can define queries against it and get data back in a short amount of time. So Cosmos DB is a NoSQL solution, but there's nothing to stop you using something like SQL Server or Oracle, or whatever it might be, as a backend data store for storing this data in a warm path. And in this case here, I'm gonna be using it to drive reports. But of course, it could be for business integration and things like that.

What I would strongly recommend is: really think about your data. I've seen solutions where the developers have decided to go and put all the data into the warm path — audit data, everything, into the warm path — and that can result in a lot of issues, performance issues. You've now suddenly got a database which is bulging with data and then really can't do the things that you really wanted it to do, which is normally interactive reporting or business integration. So you wanna make sure that you separate that out, and the other issue with that is it can get very expensive.

So you really wanna think about: okay, this data here, it's audit data, I wanna go and put this in the cold path. This data here, I wanna keep for a limited amount of time, and I wanna put it in the warm path, 'cause I wanna use it for more interactive use. And we've got Synapse and Databricks — both of those solutions are really around dealing with big data, or data warehousing. So again, the output from that stream process might be into a data warehousing, big data kind of scenario, where you might be going and training machine learning models and things like that, or doing large-scale analytics. And the output might also be into another queue. In this case, I'm moving the message into an Event Hub, and I'm gonna use a Logic App. And as I mentioned, a Logic App is a low-code, no-code solution where you can define basically a set of rules and say, okay, I've got a message here, it's for the HVAC system, and I've got a set of rules here, and this is how I go and integrate this message into a backend business process.

What I wanna talk about now in more detail is Stream Analytics jobs. And as I mentioned before, this is kind of more SQL-centric, and it's a great way to get on board with stream processing. So you can see over here on the left-hand side, I've got two inputs. I've got the input that I partitioned into the HVAC telemetry queue — so that's one input — and I've also got some reference data. In this case, the reference data is weather data. So what I do is I have an Azure Function that I run every 15 minutes, which goes and grabs the current weather conditions around the factory on a 15-minute basis. And the idea behind it is I'm ultimately going to join that weather data with the telemetry that I've got from the sensor that's monitoring the HVAC system in the factory. I'm gonna join that data together, because I thought it might be interesting that if I knew the outside temperature of the factory and the inside temperature, then maybe I can start making smarter decisions about what I do with the HVAC system — the air conditioning system inside the factory.

So for example, if I know the temperature's dropped by five degrees outside, then maybe I can start making some smart decisions and reduce the air conditioning level inside the building, because I know that the outside temperature is dropping. And then what I do is I have a query. So in this case, as I said, the query looks really like SQL, and I'm gonna show you this in a moment. And basically, as the data's flowing into the system, you can run a query against that data. And for example, you can aggregate that data: I'm interested in the average temperature of the data that's flowing into the system over a five-minute period, for example.

Okay, so the output of that query, which is doing this aggregation — I'm actually passing that into three other queries. The first query is an alert query, which says, hey, if the temperature goes above 40 degrees Celsius, I'm gonna send an alert message, because I wanna send that message into a service system to say, hey look, the HVAC system needs to be serviced, because the temperature has got unacceptably high inside the factory.

The other thing I'm gonna do, from the output of this first query into the second query here, is join the telemetry data with the weather data, and I'm gonna write that joined data into the HVAC Cosmos DB data store. And you can see I'm doing the same principle here: I'm gonna join the weather data with the telemetry data, and I'm gonna store that into the HVAC storage. Maybe I'm gonna use that for training a machine learning model, or running batch analytics against it at some point in the future.

As I mentioned before, Stream Analytics is very SQL-centric in its approach to dealing with data. So in the same way that you'd define a SQL query against a relational data store — or a NoSQL data store, for that matter — it's very similar concepts. These are the fields I wanna select; I wanna describe where this data is coming from, so you just say, hey look, this data is coming from the HVAC telemetry Event Hub queue; I wanna say where — in this case here, I'm gonna say where the version of this telemetry message is version one of the schema; and then I wanna do a group by. And in this case down here, what I'm doing is a group by, but with a tumbling window, and in this case I've set it to one minute.

So what's gonna happen is, I might have thousands of messages appearing on this Event Hub that I'm gonna run through the stream processing. And what I'm gonna do is average those messages flowing into that system over that one minute. So I've got thousands of messages coming in, and I'm interested to aggregate that data and come up with the average temperature, the average pressure, the average humidity, on a minute-by-minute basis, because that's more interesting. And in fact, it would probably normally be more like 10 minutes, or 15 minutes, or an hour — whatever makes sense for the business scenario. Okay, so, kind of end to end, hopefully that made a reasonable amount of sense.
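Put together, the aggregation query described here might look like the following Stream Analytics SQL. The input and output names are hypothetical stand-ins, and the shape is reconstructed from the description, not copied from the demo:

```sql
SELECT
    AVG(temperature) AS avgTemperature,
    AVG(pressure)    AS avgPressure,
    AVG(humidity)    AS avgHumidity,
    System.Timestamp AS windowEnd
INTO
    [hvac-output]
FROM
    [hvac-telemetry]
WHERE
    version = '1'
GROUP BY
    TumblingWindow(minute, 1)
```

The GROUP BY TumblingWindow(minute, 1) clause is what turns a plain SQL aggregation into a continuous, windowed one over the in-flight stream.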

So we've got things generating data, and that data is being sent to Azure IoT Hub. IoT Hub has the ability to go and enrich messages, and the case I used was, hey look, I wanna go and enrich that message from that device with its geolocation. I then wanna go and partition out those messages, because remember, from that factory I've got messages coming from a wide range of sensors, all really targeting different applications; they have different schemas, they might be encoded in different formats, and I need to basically separate those out. So that's where you go and use message partitioning. I've got stream processing, because maybe the aggregate values of the data are more interesting to me, or maybe I need to change the shape: maybe I've got messages that are coming through with one schema, but I need to store them with a different schema, and I might wanna be able to transform that data as it's moving through the system. And then of course, I've got endpoints for storage.

So again, to really emphasize: think about your data. Is this data really cold path data, or is this more warm path data? Do I need data for more interactive reports, or am I interested in storing this data for the long term and I need cost-effective storage, in which case I'll use the cold path? Is this data destined for data warehousing or big data analytics, or is this data destined to go into another part of a business process? So the data might go into an Event Hub, and then be passed through into a Logic App, and then integrated into a backend business system, which will then be able to make automation decisions — for example, to send a message back through IoT Hub, back down to the device, to control that device from the cloud.

So that's the end of the slide session. What we're gonna do now is a demonstration of how this all hangs together. So I'm gonna start this demonstration from the Azure management portal, and I've actually gone into the IoT Hub management blade. And you can see in here, I've got various monitoring information. I can see how many messages are flowing into the system. You can look at rates of messages, total messages sent today, et cetera, and the number of devices connected.

So this is the IoT Hub blade info, and what I'm gonna do now is go down to IoT devices, and as you'll have spotted, there's only one device defined in the system; it's called raspberry RPI-net-core. And I'm gonna go into that device, and you'll find various information about it: key information, connection information, and things like that. But I'm gonna go to the device twin. Remember, I said that the device twin is a JSON document that is maintained for all devices that are registered in the IoT Hub, and this JSON document will contain metadata. In the tags section, from your application, you can add data into the JSON document, and in this case here, I have an application which has added in the display name for the device — I have a friendly name for this one, which is called Sydney Opera House — and I've added the longitude, latitude, and the weather location for this device.

Now, what I'm gonna do is go across to Visual Studio Code, and I've actually got a .NET Core application running here, which is connected up to that IoT Hub, for that device. Now, this code is available. There are two learning paths I've written to support this talk. There is the .NET Core IoT learning path, where you can write C#-based applications that you can run on Linux, macOS, Windows, and Raspberry Pi. The other learning path is the Azure Sphere learning path, and you can use that code base to learn how you'd go and build a C-based application for environments where security is absolutely critical — and that's where you'd use an Azure Sphere device, for devices that require high degrees of security. So there are two labs there that you can go through to understand more about how data finds its way from the device up into the cloud — I keep saying up, but to the cloud service. I will have links to both of these hands-on labs in the resources at the end of these slides.

So what I've got here — I've jumped back to the C# example, which, as I said, you can run on your desktop, super easy to run on Mac, Linux, and Windows. And what I'm gonna do here is just step through the code, and you can see that I've got some telemetry coming through. So I've got temperature — if I hover over that, the temperature is 90 degrees, humidity is 52, and the pressure is 1190. And what I'm gonna do is create a JSON message for that.

So I've serialized that message into a JSON format. If I hover over that JSON, you'll see there's the JSON message. And then what I'm gonna do is create that message, and then add in the metadata properties. So I'm gonna say that the metadata for this is: the appID is HVAC, the type of message is gonna be a telemetry message, the schema version of this message is version one, and the format of this message is JSON.

So I'm using JSON serialization for this message. And then I will send that message to IoT Hub. And this repeats every two seconds and will send data up to IoT Hub. So I've just taken the breakpoint off, so this application is now gonna be sending telemetry to IoT Hub every two seconds. A useful tool that you can download — just search across the internet, and I'll put this link in the resources — is the Azure IoT Explorer. And this provides a window into IoT Hub, so you can start viewing the telemetry flowing through the system. Now, I'm just gonna stop that.

So you can see here that I've just paused that, and you can see that I've got the message that the device has sent. So I've got temperature, humidity, pressure, and I've put a message ID in there as well. And you can see these are the properties. The first four properties have come from the device — those are those app properties, that metadata that, as a developer, I added to this device. Now, you will see that this message also has four additional properties: it's got the longitude, latitude, weather location, and name. That metadata did not come from the device. This data was enriched in IoT Hub. So we're gonna talk a bit about that.

Now, I'm gonna pop across — back to the IoT Hub management blade — and scroll down here to message routing. In the message routing blade, you'll see that there is an enrich messages section. Remember, before, I showed you the JSON document for each one of these devices, and in that JSON document there was longitude, latitude, weather location, and name — and remember, that information would get there after the device had been deployed; you'd normally have a management app which would manage the state of that JSON document for this device. And then this is gonna appear as longitude, latitude, weather location, and name in the data stream.

So if I pop back to the Explorer, you'll see that those are the names that have been used, and it's grabbed the data from that JSON document. So the next thing I wanna talk about is routes. And remember, this is where you'd go and separate out the data from different styles or classes of devices: I've got the HVAC devices, I've got the security devices, the devices monitoring lighting, et cetera. So in this case here, I've only got one route defined. And if I go into that route, you'll see that I've got a query — this is the rule — and it says that if the appID is HVAC and the type of message is telemetry, if that's true, then I want you to go and route that message to this endpoint. And if I just come up to this dropdown box up here, you'll see that it's gonna be the HVAC telemetry endpoint. And just going back to the Explorer, you'll see that the appID is HVAC and — just making this a bit bigger — you'll see that the message type is telemetry. So that's what the rule is being keyed off, and so it's making a decision, and the message is going to that endpoint.

Now, if you were keen of eye, you would have spotted up here that I've also got these custom endpoints, and this is where you define all the endpoints: I wanna put all the HVAC telemetry onto this queue, I wanna put all the lighting onto another queue, et cetera. So you'd go and define the endpoints in here, and you basically just link those routing rules with these custom endpoints. And that's how the data finds its way onto those queues. So the next thing I wanna talk about is Stream Analytics. I've now got this data, which I've enriched and which I've partitioned, and I've sent all the HVAC data onto the HVAC telemetry queue.

So I'm now in the Stream Analytics management portal, and if I scroll down a bit, you can look at the activity of that job: how many messages are flowing into that Stream Analytics job, how many messages are going out of it, et cetera. So I can define the inputs. Remember, in the talk I said I had two inputs. I had the HVAC telemetry — these are the messages that I've partitioned off onto that queue — and I've got the weather data. And remember, I collect weather data every 15 minutes and store it into a storage location, and this is where that data is coming from; this is just pointing to that data. And I define my outputs, and remember, I had an output for alerts, and this will go into an alert queue, which will then get picked up by a service system.

Remember, I said I had another output, which was a Cosmos DB. So in this case here, I'm just defining the location and the security context for the Cosmos DB, and remember, I'm streaming the data into storage as well, so this is just defining the location and the security context for the Azure Storage. And if I go and look at the query — now, you can actually edit the query while the job is running, and the job is running at the moment — this is the query, and as I said before, it's very SQL-like, so you've got your select, from, where, group by kind of notions. And then, if I scroll down a bit further, you can see I'm doing a join, and remember, I'm doing that join with the weather data, joining by the weather ID. So the weather location on the telemetry message, I'm joining to the weather ID in the storage message — the data that I grabbed from OpenWeatherMap. So that's the data that's running through there. And then, as I said, one of the outputs is going to Cosmos DB.

So I've got another blade opened up here, and this is the Cosmos DB management blade. And you'll see I've already got some data items sitting in there from other devices — these are actually from Azure Sphere devices. And since I've started this and that job is running, if I just refresh on that data, you'll see that now I've got the data in there from the .NET Core application that I'm running on my Windows desktop. But as I said, that application you can run on Mac, Linux, and Windows, and on your Raspberry Pi. And you will see that now I've got my telemetry coming in: I've got my temperature, I've got my pressure, I've got my humidity, and remember, I'm augmenting this with Open Weather Map data.

So I've got my Open Weather Map temperature, pressure, humidity, precipitation, wind, cloud, et cetera. So you end up with quite rich data that you can start doing some really quite interesting things with. And the next thing I'm gonna do is show you how you can actually use Power BI to report on that data in the Cosmos DB. Okay, so now we've got our IoT data inside Cosmos DB, what I'm gonna show you is how you can go and use the Power BI Desktop Designer to build a report using that data. I'm gonna go through this pretty quickly, just to give you a bit of a sense of the art of the possible for using the Power BI Desktop Designer to go and build a report.
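Before we jump into Power BI, here's roughly the shape of one of those enriched documents sitting in Cosmos DB — the field names are illustrative, not the exact schema from the demo:

```json
{
  "deviceId": "hvac-desktop-01",
  "temperature": 22.4,
  "pressure": 1012.3,
  "humidity": 48,
  "weather": {
    "temperature": 18.1,
    "pressure": 1010.0,
    "humidity": 62,
    "precipitation": 0,
    "windSpeed": 3.6,
    "cloudCover": 40
  }
}
```

It's this combination of device telemetry plus the joined-in weather fields that Power BI will report over.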

So I'm gonna select a data source — in this case, it's from Azure, and I'm gonna select Azure Cosmos DB and click on Connect. I'm gonna paste in the URL to that database, I'm gonna name the database, which in this case is called Factory 10, and the collection, which is called HVAC State, and click on Connect. Now, I've already authenticated against this data source, so it won't ask me again. I'm gonna click on Transform Data and expand that data. Now, it's gonna bring back all the fields in that data store, and I don't want them all. I'm just gonna select temperature, humidity, name, longitude, and latitude, and disable the column-name prefix.

Next, I wanna do a transform on that data, and I'm just doing an auto-detect on the columns in this table — it basically works out whether these are numeric, or text, or dates, et cetera. Now, we'll close and apply that, and this will take me back to the designer view of the Power BI Desktop Designer. And the first thing I need to go and do is tell the system which fields are my longitude and latitude fields for mapping. And now what I'm gonna do is add a visualization. So from the right-hand side, I'm gonna click on Map. I'm just gonna expand that out a bit, and now I'm gonna set the latitude for this map, the longitude, and the value that I wanna plot — in this case, it's gonna be temperature. By default, it does a sum, but I actually want the average.

Okay, so now I've done that, the next thing I need to go and do is set the format. And in this case here, I'm gonna select the color, just to make it pop a bit more. I'm gonna select a scaling factor for the bubbles, I'm gonna select the map style — I'm just gonna use grayscale — and I'm gonna turn on shadow, and I'm gonna set a border and set the radius for the corners to be 10, so we get nice curved corners for the map visualization.

So the next visualization I'm gonna add is a gauge, and we'll just drag that up to the top there and resize it round about there, and I'm gonna set temperature to be the value. Now, by default, it wants sum, but I want average. And the next thing I need to go and do is format this: I'm gonna turn the data labels off, and I'm gonna set the gauge axis. So the minimum temperature I wanna display is zero, and the maximum is gonna be 50 degrees centigrade, and I'm gonna set the target to 21, which is kind of like a nice, comfortable temperature. I'm gonna copy and paste that visualization, and just drag this down here and line it up. And in this visualization, I'm gonna put in the humidity, and I'm just gonna go across to the Format pane and across to the Gauge Axis. In this case, I'm gonna select the maximum to be a hundred percent, and we'll say 50% humidity is gonna be comfortable. And in fact, I need to go and set the average for that visualization as well.

Now, the last visualization I'm gonna add is a table, and we'll just drag that out and line that up, and move that up to create a bit more space. And then we'll select the name field, and we're gonna select longitude — oops, no — we're gonna select temperature and humidity. So those are all the fields we're really gonna plot. And one of the things I wanna add in is a filter for this page, so I'm gonna drag and drop name to be a filter on this page.

So the next thing I'm gonna do is a little work around making the report look pretty. We're gonna select the Format Painter, select the map, and then we're just gonna apply the same format that we're using on the map to the visualization for the gauge, and do the same for the other gauge, and we'll do the same for the table. And then the last thing I wanna go and do is select a theme — I'm gonna select a dark theme. So that's all I need to go and do, and we're just gonna do a Publish. It will save this, and we'll save this as HVAC 11. So we'll save that, and what I'm gonna do now is publish this up to my workspace. And what's happening now is that the Power BI Desktop Designer is connecting up to the Power BI portal, and it's published that report. I'll click on that report to open it, and now you'll see the report loading up inside the web portal, and you can see I've got clickable surfaces on there, and I can go around and select different data points, which is kinda nice. And over on the right-hand side, you can go and do filters and things like that.

So I can go and say, for example, select the home office. The beauty, obviously, of having this report sitting up in Power BI is that I can then collaborate with other people or create dashboards and things like that. So it makes for a really powerful way of being able to visualize your IoT data. So the next thing I wanna talk about is, I think, a really exciting initiative, which is called IoT Plug and Play. The closest analogy is if you buy a printer and you plug it into your computer, you can just print from Word to that printer, and you don't have to worry about which manufacturer made that printer — the printer can express its capabilities as a color printer, as a black and white printer, et cetera. And that's the same concept with IoT Plug and Play.

So with IoT Plug and Play, as a device developer, you define an interface, and it's basically a contract which describes the properties and the telemetry and the commands, and there's a schema for you to define these. You can define what's called a capability model, which is a collection of interfaces which represent that thing. So for example, it could be a device model for a sensor that you're bringing to the marketplace. And there's a description language for defining these, called the Digital Twins Definition Language (DTDL). It's based on open standards called JSON-LD and RDF, and you can find out more about the specification at the link at the bottom of the page.

Now, to give you an example here, this is actually the Plug and Play schema that I set up for the C# example that I demonstrated before. And you can see that I've got an ID for this schema — these have to be unique — so this one here is the DTMI for my glovebox HVAC. And you'll see, if I come down here, that there is a schema which describes the capabilities of the sensor. So in here you can see I've got the type, which is telemetry; the name is temperature; the display name is Room Temperature; the schema is double; and the unit is Celsius. And it's the same deal for the telemetry down here, which is pressure in this case: Ambient Air Pressure, double, and hPa. So the idea behind this is that, as a device developer, you can build a device and describe the Plug and Play model for your device, and then as a solution developer, you can go and build backend solutions which accept Plug and Play specifications, automatically ingest the capabilities of that device, and make it much simpler to go and build IoT solutions. So do check out Plug and Play, because I think it's absolutely fascinating.
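As a rough sketch of what such a schema looks like, here's a minimal DTDL v2 interface in that shape. The `@id`, names, and units here are illustrative, not the exact model from the demo (note that in DTDL v2, a `unit` is attached via a semantic type such as `Temperature` or `Pressure` alongside `Telemetry`, and `millibar` is equivalent to hPa):

```json
{
  "@context": "dtmi:dtdl:context;2",
  "@id": "dtmi:example:Hvac;1",
  "@type": "Interface",
  "displayName": "HVAC",
  "contents": [
    {
      "@type": ["Telemetry", "Temperature"],
      "name": "temperature",
      "displayName": "Room Temperature",
      "schema": "double",
      "unit": "degreeCelsius"
    },
    {
      "@type": ["Telemetry", "Pressure"],
      "name": "pressure",
      "displayName": "Ambient Air Pressure",
      "schema": "double",
      "unit": "millibar"
    }
  ]
}
```

A solution that understands Plug and Play can read this contract and wire up dashboards and ingestion for the device automatically.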

So the last thing I wanna talk about is something called IoT Central. Now, IoT Central is a fully managed IoT service, so no cloud development or expertise is required, and there is full support in there for devices and for monitoring. It's extensible, so you can extend the telemetry that's flowing into IoT Central — you can connect it with Flow, Dynamics, webhooks, et cetera — there are dashboards and analytics, and it's really easy to get on board and give it a try. IoT Central itself is actually built on Azure IoT Hub. There's an implementation of Plug and Play built into it, and there's Stream Analytics underlying it, as well as Azure Time Series Insights, and those building blocks go to make up IoT Central. So we're gonna have a quick demonstration of how IoT Central works. So this is the IoT Central dashboard — it's fully configurable, and you can put information on there that you deem to be high priority. I've got a number of devices set up in here. The device I'm gonna look at is a carbon dioxide monitor; from the dropdown, I'll click on that, and if I expand this, you can see that these are the readings of carbon dioxide in the room where I'm working over close to the last 12 hours. And you can see, while I've been presenting, the carbon dioxide levels have gone up to really quite unhealthy levels. I can go and look at analytics — underpinning this is Time Series Insights. I can look at the particular group, in this case carbon dioxide, look at the parts per million, analyze this, and you can see I can now drill into this. I can look at the last seven days of data — this data is all real — and hey, look, I might be interested in this particular area.

And finally, I can find out what was happening at that particular time, and you can kind of drill into that data. As I said, this is underpinned by Time Series Insights. We have the concept of templates, and these templates are actually Plug and Play templates. So we have an interface, and that interface defines the capabilities of this device. So you can go and express, in this case, what the temperature looks like. If I click on that dropdown, you can see that this temperature is of schema float and the display unit is Celsius. It's the same deal for carbon dioxide, which has a schema of float; and if I look at humidity, you'll see again the schema is float and the display unit is a percentage. So that's a very quick introduction to Azure IoT Central. Do check it out, 'cause it's a fantastic way to quickly get IoT solutions up and running — it's all underpinned by the things I've been talking about: IoT Hub, Plug and Play, Stream Analytics — and it's a solution that you can really go and customize, and it helps demonstrate the value that IoT can bring to an organization very quickly.

So that's the end of the session. These are the resources for the session. You'll find links to the slides and to the session code that I used — the C# and the Azure Sphere C code — to help you build applications which can talk to Azure IoT Hub. There are resources there for the overall session, and do check out the series of curated links I put together for the session, at aka.ms/iot20/learn. These are a series of Microsoft Learn modules which will really help you dig into building solutions and understanding the data architecture for an IoT application. We also have something called the AZ-220 IoT Developer Certification. That certification is an absolutely fantastic opportunity to learn across a whole range of disciplines when you're building IoT solutions for Azure, so I very much recommend checking that out. Those resources are all available as part of Microsoft Learn, so be sure to go and search for that and find an amazing set of resources to really help you in your learning journey with Azure. So thank you very much, and I really hope you enjoyed the session.

About the Author
Microsoft Learn
Training Provider

This open-source content has been provided by Microsoft Learn under the Creative Commons public license, which can be accessed here. This content is subject to copyright.