In this course, we're going to review the features, concepts, and requirements that are necessary for designing data flows, and how to implement them in Microsoft Azure. We're also going to cover the basics of data flows, common data flow scenarios, and what is involved in designing a typical data flow.
Learning Objectives
- Understand key components that are available in Azure that can be used to design and deploy data flows
- Know how the components fit together
Intended Audience
This course is intended for IT professionals who are interested in earning Azure certification and for those who need to work with data flows in Azure.
Prerequisites
To get the most from this course, you should have at least a basic understanding of data flows and what they are used for.
Hello and welcome back. What we're going to do in this demonstration is create an Azure data factory. This is the first part of the lab we're going to work through in these demonstrations. Now, on the screen here I'm logged into my Azure portal. I'm at my home page and I'm logged in as my global admin. I can get to the data factories in a couple of different ways. I can create a resource, I can search for them in the search bar, or I can just select them from my recently used resources here.
So, we'll just take the easy way here and we'll go to data factories. And we can see we have no data factories here right now. So, what we're going to do is go ahead and create our data factory. Now, on this create data factory page, what we need to do is provide some project details. We need to tell Azure what subscription this is going into, what resource group it's going into, and we need to provide some instance details. So, for this demonstration here, we'll go into the Berks Batteries subscription. And what we'll do for the resource group is deploy into the BerksRG resource group.
Now, once we've selected the resource group, we need to provide the region, name, and version for our instance. We'll leave the default East US selected here. And what we'll do is call this MyDataFactory. Now, you'll notice the data factory name is already taken. And that's because this name needs to be globally unique, meaning it needs to be unique across the entire Azure landscape. So, I'll just be lazy here and change it to MyDataFactory1973. And we're good there. And then you can choose V1, which is being deprecated, or V2, which is recommended.
So, we'll select V2. And for this demonstration we don't need to provide any of the Git information, networking information, tags, or advanced settings. We can just go ahead and review and create here. And what it does is it errors out and tells me to go back to the Git configuration. What we have to do is check the box to configure Git later if we're going to use it. So, I'll go ahead and review and create. What it does is it validates our information, and then we can go ahead and create the data factory.
At this point, the data factory has been deployed. We'll click 'Go to resource' here, and that takes us to our data factory page. Now that we have our data factory provisioned, we'll move into the next demonstration, where we'll walk through the process of creating a pipeline with a data flow activity.
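As a side note, the same provisioning can be scripted instead of done through the portal. Below is a minimal sketch using the azure-identity and azure-mgmt-datafactory Python packages; the subscription ID is a placeholder you would replace with your own, while the resource group, factory name, and region match the values used in the demonstration. This is an illustrative sketch, not part of the portal walkthrough itself.

```python
# Minimal sketch: provision the same data factory with the Azure Python SDK.
# Assumes: pip install azure-identity azure-mgmt-datafactory
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import Factory

subscription_id = "<your-subscription-id>"   # placeholder, not from the demo
resource_group = "BerksRG"                   # resource group used in the demo
factory_name = "MyDataFactory1973"           # must be globally unique across Azure
region = "eastus"                            # default East US region from the demo

# DefaultAzureCredential picks up credentials from the environment,
# a managed identity, or an existing Azure CLI login.
credential = DefaultAzureCredential()
adf_client = DataFactoryManagementClient(credential, subscription_id)

# Create (or update) the data factory; the SDK targets Data Factory V2,
# the version recommended in the portal.
factory = adf_client.factories.create_or_update(
    resource_group,
    factory_name,
    Factory(location=region),
)
print(f"Provisioned data factory: {factory.name} in {factory.location}")
```

Scripting the deployment like this can be handy when you need to recreate the factory in other environments, but the portal steps above are all you need for this lab.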
Tom is a 25+ year veteran of the IT industry, having worked in environments as large as 40k seats and as small as 50 seats. Throughout the course of a long and interesting career, he has built an in-depth skillset that spans numerous IT disciplines. Tom has designed and architected small, large, and global IT solutions.
In addition to the Cloud Platform and Infrastructure MCSE certification, Tom also carries several other Microsoft certifications. His ability to see things from a strategic perspective allows him to architect solutions that closely align with business needs.
In his spare time, Tom enjoys camping, fishing, and playing poker.