Power BI has changed the business intelligence (BI) landscape forever, enabling BI professionals and regular Excel users alike to work with big data and build beautiful and insightful dashboards. Integral to the Power BI platform is the data model.
This course discusses the fundamental concepts of data model design in Power BI. We explore Microsoft's recommended best practices, along with common data modeling complications and their solutions. By the end of the course, learners will be able to develop data models strategically to achieve the best performance.
Learning Objectives
- Identify the primary components of data models and describe flat schemas and star schemas
- Diagnose common data modeling issues in Power BI models and recommend appropriate solutions
- Implement smart solutions to common data model challenges and avoid common mistakes
Intended Audience
- Business professionals whose job requires them to design and build data models in Power BI
- Anyone preparing to take the Microsoft DA-100 exam
Prerequisites
To get the most out of this course, you should be familiar with preparing data using Power BI.
The term granularity refers to the level of detail within your model's data. High granularity means you can see many minute details, while low granularity means you see fewer details and focus more on the bigger picture. There are pros and cons to each, so every model will have its own definition of ideal granularity.
Each table will have its own level of granularity too, defined by its smallest measurable increment. A date table that goes down to the day will support analysis of change over time at the daily level. This might be perfect for a marketing company evaluating the efficacy of its campaigns across each month and year, but other industries require much more frequent measurements, such as power companies, which often measure down to the nanosecond.
That high granularity is necessary for power companies, but if the marketing company measured in nanoseconds, its data model would be preposterously large and unnecessarily slow. When determining the appropriate level of granularity for your model, ask: what level of detail do report consumers and report authors expect? And will your model's insights be compared with, or presented alongside, results from other analysis systems? If so, the granularity should be as similar as possible.
Once you've decided on the right level of granularity for your model, make sure all dimension tables and the fact table can be measured as increments of that granularity.
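To make the idea of re-graining concrete, here is a minimal sketch in Python with pandas showing how event-level (high-granularity) data can be rolled up to daily granularity to match a day-level date table. The table and column names (`campaign_clicks`, `clicked_at`) are hypothetical, purely for illustration:

```python
import pandas as pd

# Hypothetical high-granularity fact data: one row per click event,
# timestamped to the second.
campaign_clicks = pd.DataFrame({
    "clicked_at": pd.to_datetime([
        "2023-05-01 09:15:00", "2023-05-01 14:02:30",
        "2023-05-02 08:45:12", "2023-05-02 19:30:05",
        "2023-05-02 21:10:44",
    ]),
    "clicks": [1, 1, 1, 1, 1],
})

# Re-grain to daily granularity so the fact table can join cleanly
# to a date table whose smallest increment is one day.
daily = (
    campaign_clicks
    .set_index("clicked_at")
    .resample("D")["clicks"]
    .sum()
)
print(daily)
```

In Power BI itself you would typically perform this kind of aggregation in Power Query or at the source, but the principle is the same: every table should be expressible in increments of the model's chosen grain.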
Chelsea Dohemann is a Senior Technical Trainer and Microsoft Certified Master with almost a decade of experience in technology training. She has taught an array of applications from Microsoft products including Office 365 web apps, Microsoft Office Suite, Power BI, VBA for Excel, and SharePoint to Adobe Acrobat Pro and Creative Cloud. Being a persistent learner herself, Chelsea is acutely in-tune with the challenges of learning. She presents her topics in plain language, with real-world examples, reducing complex concepts down to their simple parts.