Designing Data Flows in Azure
Data Flow Basics
Designing a Data Flow Solution
This Designing Data Flows in Azure course will enable you to implement best practices for data flows in your own team. Starting from the basics, you will learn how data flows work from beginning to end. While we recommend some prior familiarity with what data flows are and how they are used, this course includes demonstration lectures to make sure you have fully grasped the concepts. By better understanding the key components available in Azure for designing and deploying efficient data flows, you will enable your organization to reap the benefits.
This course is made up of 19 comprehensive lectures including an overview, demonstrations, and a conclusion.
Learning Objectives
- Review the features, concepts, and requirements that are necessary for designing data flows
- Learn the basic principles of data flows and common data flow scenarios
- Understand how to implement data flows within Microsoft Azure
Intended Audience
- IT professionals who are interested in obtaining an Azure certification
- Those looking to implement data flows within their organizations
Prerequisites
- A basic understanding of data flows and their uses
Related Training Content
For more training content related to this course, visit our dedicated MS Azure Content Training Library.
In this lesson, let's take a 30,000-foot view of the modern data warehouse and how data flows through it. Source data can come in many forms. It can come from logs, files, and even media, and it often comes from many types of devices and applications. As this data comes in, whether it's structured or unstructured, it needs to be combined. For this process, you can leverage Azure Data Factory. Azure Data Factory is essentially the engine that provides the overall flow control. It acts as the orchestrator for the entire process because it helps ingest all the different types of data. Once ingestion is done, it stores the data in Azure Blob storage. Doing so offers the ability to maintain data in its raw format for long periods of time, thanks to the low cost of that storage. This allows you to go back later and run new or different transformations on the data to get answers to new questions that crop up. Next in the flow comes Azure Databricks.
Choices include Python, Scala, and Spark SQL, among others. Alternatively, you could use HDInsight instead. Either offering provides a scalable analytics capability that's used to clean and transform the data, which is then stored in the SQL Data Warehouse. Whether it's Databricks or HDInsight doing the work, the data is cleaned up so that it follows a schema that can be used to derive business answers via analysis. To perform the analysis, you can use Azure Analysis Services. To facilitate querying of the completed analysis, you can use a service such as Power BI. With something like Power BI, you can produce operational reports and analytical dashboards. It essentially leverages Analysis Services under the hood, which in turn queries Azure SQL Data Warehouse. As you can see, in such a design, data flows in from its original sources. It's then stored and cleaned up. Next, the data is remodeled and transformed before being stored in the final analysis format. Analysis is then run against the data. This overall process is a typical way in which you would pull together several different components to design a modern data warehouse.
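To make the end-to-end flow concrete, here is a minimal Python sketch that models each stage of the pipeline as a plain function. Every function name and sample record here is a hypothetical stand-in for the Azure service it represents, not a real SDK call; a production pipeline would use the Azure SDKs and services themselves.

```python
from typing import Any

def ingest(sources: list[dict[str, Any]]) -> list[dict[str, Any]]:
    """Combine structured and unstructured records from many sources
    (logs, files, media) into one raw collection -- the Data Factory role."""
    return [record for source in sources for record in source["records"]]

def store_raw(raw: list[dict[str, Any]]) -> list[dict[str, Any]]:
    """Persist data in its original format so it can be re-transformed
    later -- the Blob storage role. Here we simply keep it in memory."""
    return list(raw)

def transform(raw: list[dict[str, Any]]) -> list[dict[str, Any]]:
    """Clean the raw data so it follows a schema suitable for analysis --
    the Databricks/HDInsight role. Records without a value are dropped."""
    cleaned = []
    for record in raw:
        if "value" in record:
            cleaned.append({"id": record.get("id"),
                            "value": float(record["value"])})
    return cleaned

def report(warehouse: list[dict[str, Any]]) -> dict[str, Any]:
    """Aggregate the warehoused data for a dashboard -- the Analysis
    Services / Power BI role."""
    return {"rows": len(warehouse),
            "total_value": sum(r["value"] for r in warehouse)}

# Two hypothetical sources; one record lacks a value and is cleaned out.
sources = [
    {"name": "app-logs", "records": [{"id": 1, "value": "10"}, {"id": 2}]},
    {"name": "csv-files", "records": [{"id": 3, "value": "2.5"}]},
]
warehouse = transform(store_raw(ingest(sources)))
print(report(warehouse))  # {'rows': 2, 'total_value': 12.5}
```

The point of the sketch is the separation of stages: because `store_raw` keeps data in its original form, `transform` can be rerun later with different logic to answer new questions, just as the lecture describes for Blob storage.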
About the Author
Tom is a 25+ year veteran of the IT industry, having worked in environments as large as 40k seats and as small as 50 seats. Throughout the course of a long and interesting career, he has built an in-depth skill set that spans numerous IT disciplines. Tom has designed and architected small, large, and global IT solutions.
In addition to the Cloud Platform and Infrastructure MCSE certification, Tom also carries several other Microsoft certifications. His ability to see things from a strategic perspective allows Tom to architect solutions that closely align with business needs.
In his spare time, Tom enjoys camping, fishing, and playing poker.