Designing Data Flows in Azure
Data Flow Basics
Designing a Data Flow Solution
This Designing Data Flows in Azure course will enable you to implement best practices for data flows in your own team. Starting from the basics, you will learn how data flows work from beginning to end. Although we recommend some prior familiarity with what data flows are and how they are used, this course contains several demonstration lectures to make sure you really get to grips with the concept. By better understanding the key components available in Azure for designing and deploying efficient data flows, you will enable your organization to reap the benefits.
This course is made up of 19 comprehensive lectures including an overview, demonstrations, and a conclusion.
Learning Objectives
- Review the features, concepts, and requirements that are necessary for designing data flows
- Learn the basic principles of data flows and common data flow scenarios
- Understand how to implement data flows within Microsoft Azure

Intended Audience
- IT professionals who are interested in obtaining an Azure certification
- Those looking to implement data flows within their organizations

Prerequisites
- A basic understanding of data flows and their uses
Related Training Content
For more training content related to this course, visit our dedicated MS Azure Content Training Library.
In this first demonstration, we're going to create an Azure Data Factory, and then we'll launch the Data Factory user interface so that we can create a pipeline in the upcoming demonstrations. Note that, at this time, the Data Factory UI is only supported in Microsoft Edge and Google Chrome, so you'll need to use one of those two browsers. As you can see on your screen here, I'm in my Azure portal, in my AzureCourse resource group. To create the Data Factory, I simply click Create a resource and then scroll down to Analytics. Under Analytics, I'll find Data Factory, so I'll select it and begin the process here.

On the New data factory screen, we're going to give our Azure Data Factory a name. The name that we give it needs to be globally unique. What this means is that it needs to be unique across the entire Azure landscape, not just within your own tenant or subscription. So I'm going to call this MyDataFactory, and then I'll give it a suffix. We then need to tell it which subscription to deploy to, and we're going to use Pay-As-You-Go, along with my existing resource group called AzureCourse. We're offered three different versions: version one, version two, and version two with data flow. We're going to use version two here for our demonstration.
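Because the name must be globally unique, Azure only rejects a taken or malformed name when you submit the creation request. As a rough local sketch of the published Data Factory naming rules (3 to 63 characters; letters, numbers, and hyphens only; starting and ending with a letter or number), a helper like the hypothetical `is_valid_factory_name` below can catch formatting problems before you ever reach the portal. It is not part of any Azure SDK, and Azure itself remains the final authority on validity and uniqueness:

```python
import re

# Sketch of Azure Data Factory naming rules: 3-63 characters,
# letters/numbers/hyphens only, must start and end with a letter or number.
# is_valid_factory_name is a hypothetical helper, not an Azure SDK function;
# it cannot check global uniqueness, only the name's format.
FACTORY_NAME_PATTERN = re.compile(r"^[A-Za-z0-9](?:[A-Za-z0-9-]{1,61}[A-Za-z0-9])?$")

def is_valid_factory_name(name: str) -> bool:
    """Return True if the name looks acceptable for a data factory."""
    return 3 <= len(name) <= 63 and FACTORY_NAME_PATTERN.match(name) is not None

print(is_valid_factory_name("MyDataFactory2024"))  # True: letters and digits only
print(is_valid_factory_name("-bad-start"))         # False: may not begin with a hyphen
```

A check like this is purely cosmetic; even a well-formed name can still fail at creation time if someone else has already claimed it.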
Now, the differences between these three versions are that version two adds a bit more functionality over version one, while also making some of the tasks you performed in version one even easier. Version two with data flow builds upon version two by allowing you to create data flows and pipelines without any coding; it's intended to be a totally visual design solution. However, since it's in preview, things are likely to change as bugs are identified and suggestions are made. So we're going to use version two for this demonstration. As far as location goes, we'll deploy our Data Factory into the East US location. Now, what I do want to point out is that if I drop down the location box here, you'll only see a subset of Azure locations. That's because only a subset of Azure locations supports Data Factory. So we'll keep East US, and then we'll click Create. We'll pin this to our dashboard, and now we can see that our Data Factory has been provisioned. In the next demonstration, we'll create a pipeline using the Data Factory user interface.
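Under the hood, the portal turns these choices into a single Azure Resource Manager PUT against the `Microsoft.DataFactory/factories` resource type. The sketch below assembles that request's URL and body from the values chosen in the demonstration. `build_factory_request` is a hypothetical helper for illustration only, the subscription ID is a placeholder, and in practice you would use the Azure CLI or the `azure-mgmt-datafactory` package rather than hand-building REST calls:

```python
# Sketch of the ARM REST call the portal issues when you click Create.
# build_factory_request is a hypothetical helper; the subscription ID used
# below is a placeholder, not a real subscription.
API_VERSION = "2018-06-01"  # a published API version for Data Factory V2

def build_factory_request(subscription_id: str, resource_group: str,
                          factory_name: str, location: str) -> tuple[str, dict]:
    """Return the (url, body) pair for a PUT that creates a V2 data factory."""
    url = (
        "https://management.azure.com"
        f"/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}"
        f"/providers/Microsoft.DataFactory/factories/{factory_name}"
        f"?api-version={API_VERSION}"
    )
    # The body carries the chosen region; only a subset of regions
    # supports Data Factory, which is why the portal's list is short.
    body = {"location": location}
    return url, body

url, body = build_factory_request("00000000-0000-0000-0000-000000000000",
                                  "AzureCourse", "MyDataFactory2024", "eastus")
print(url)
print(body)
```

Sending this request (with a valid bearer token) is equivalent to clicking Create in the portal; the visual version picker simply maps to properties in the same resource definition.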
About the Author
Tom is a 25+ year veteran of the IT industry, having worked in environments as large as 40,000 seats and as small as 50 seats. Throughout the course of a long and interesting career, he has built an in-depth skill set that spans numerous IT disciplines. Tom has designed and architected small, large, and global IT solutions.
In addition to the Cloud Platform and Infrastructure MCSE certification, Tom also carries several other Microsoft certifications. His ability to see things from a strategic perspective allows Tom to architect solutions that closely align with business needs.
In his spare time, Tom enjoys camping, fishing, and playing poker.