Designing Data Flows in Azure
Data Flow Basics
Designing a Data Flow Solution
This course is part of several learning paths.
This Designing Data Flows in Azure course will enable you to implement best practices for data flows in your own team. Starting from the basics, you will learn how data flows work from beginning to end. Although we recommend some prior familiarity with data flows and how they are used, this course includes demonstration lectures to make sure you have fully grasped the concept. By better understanding the key components available in Azure for designing and deploying efficient data flows, you will help your organization reap their benefits.
This course is made up of 19 comprehensive lectures including an overview, demonstrations, and a conclusion.
Learning Objectives
- Review the features, concepts, and requirements that are necessary for designing data flows
- Learn the basic principles of data flows and common data flow scenarios
- Understand how to implement data flows within Microsoft Azure
Intended Audience
- IT professionals who are interested in obtaining an Azure certification
- Those looking to implement data flows within their organizations
Prerequisites
- A basic understanding of data flows and their uses
Related Training Content
For more training content related to this course, visit our dedicated MS Azure Content Training Library.
Azure Databricks is an alternative to HDInsight. It's an Apache Spark-based analytics service in Azure that allows you to deploy data analytics and artificial intelligence. It allows you to pull together data at virtually any scale, and you can then obtain insights via features such as analytical dashboards and operational reports. With Azure Databricks, you can transform data into actionable insights via advanced machine learning tools, combining all kinds of data at any scale. It also allows you to build and deploy custom machine learning models.

As part of a data flow, Azure Databricks offers you the ability to deploy an easily-autoscaled Spark environment within minutes. It supports languages such as Python, Scala, R, and SQL, in addition to several deep learning frameworks and libraries such as TensorFlow and PyTorch. Its native integration with Azure AD and other Azure services allows you to build a modern data warehouse, complete with machine learning as well as real-time analytics.

The image on your screen does a good job of depicting the role that Databricks plays in a typical data flow: it fits into the transformation and analysis stage. Once the ingested data has been processed through Databricks, analyzed, and transformed, it can be served up via services such as Power BI and Azure Analysis Services. What Databricks does, essentially, is provide an analytics platform that applies intelligence to data, which in turn yields valuable insights into the collected data. Moving forward, you can expect to see more and more modern workflows leveraging Azure Databricks.
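To make the ingest, transform, and serve stages described above concrete, here is a minimal, illustrative Python sketch of that pattern. All names and data here are hypothetical: in a real Azure data flow, ingestion might come from a service such as Event Hubs, the transformation stage would run as Spark code in Databricks, and the serving stage might feed Power BI or Azure Analysis Services.

```python
# Illustrative sketch of the ingest -> transform -> serve data flow pattern.
# The function names and sample data are hypothetical, not an Azure API.

from collections import defaultdict


def ingest(raw_events):
    """Ingestion stage: parse raw CSV-like records into structured rows."""
    rows = []
    for line in raw_events:
        region, amount = line.split(",")
        rows.append({"region": region.strip(), "amount": float(amount)})
    return rows


def transform(rows):
    """Transformation stage (the Databricks role): aggregate amounts by region."""
    totals = defaultdict(float)
    for row in rows:
        totals[row["region"]] += row["amount"]
    return dict(totals)


def serve(totals):
    """Serving stage: shape the aggregates for a dashboard such as Power BI."""
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)


raw = ["east, 100.0", "west, 250.0", "east, 50.0"]
report = serve(transform(ingest(raw)))
print(report)  # [('west', 250.0), ('east', 150.0)]
```

In Databricks itself, the `transform` step would typically be a Spark DataFrame operation (for example, a group-by and aggregation) so it can scale across a cluster, but the stage boundaries are the same.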
Tom is a 25+ year veteran of the IT industry, having worked in environments as large as 40k seats and as small as 50 seats. Throughout the course of a long and interesting career, he has built an in-depth skill set that spans numerous IT disciplines. Tom has designed and architected small, large, and global IT solutions.
In addition to the Cloud Platform and Infrastructure MCSE certification, Tom also carries several other Microsoft certifications. His ability to see things from a strategic perspective allows Tom to architect solutions that closely align with business needs.
In his spare time, Tom enjoys camping, fishing, and playing poker.