Typically, the life cycle of a Power BI dataset doesn't end with publishing. This course looks at managing Power BI datasets after they have been deployed. More often than not, the source data changes regularly, so we look at the refresh functionality available within the Power BI service and other cloud-based methods for keeping your data fresh. Not all source data is cloud-based and readily accessible to the Power BI service, so we'll see how to use data gateways to access on-premises data, which, perhaps surprisingly, also includes data residing on cloud-hosted virtual machines.
Sometimes you'll need to grant access to datasets beyond just viewing the data. We'll see how you can permit users and user groups to repurpose a dataset for reports they publish. Row-level security groups are a good way to partition data for different audiences, but assigning domain users to each group can be tedious. You'll see how to assign users to row-level security groups by leveraging Office 365 user group membership. The course finishes by going through the global options in Power BI Desktop to see how to customize the user experience and improve performance.
- Learn how to implement a data gateway to access on-premises data sources
- Learn to map Office 365 user groups to Power BI row-level security groups
- Permit other report designers to use a deployed dataset for their own purposes
- Understand the different options for keeping deployed data fresh
- Learn about the global file options in Power BI Desktop
This course is designed for anyone who wants to learn how to manage their Power BI datasets after they have been deployed.
To get the most out of this course, you should be comfortable using Power BI Desktop and have some knowledge of publishing reports to PowerBI.com.
After you've published your report and/or dataset, chances are you'll need to keep the data up to date. There are several data refresh scenarios depending on the data's format and original source. In a workspace, a dataset has two columns: Refreshed, showing when the dataset was last refreshed, and Next refresh, showing when the next refresh is due. Currently, there is no refresh schedule in place for SalesAndMargin, hence the N/A for not applicable. The last on-demand manual refresh was attempted on the 4th of April but was unsuccessful, as indicated by the red warning triangle. We can configure a refresh schedule by clicking the rectangle with the refresh symbol at the bottom left. Alternatively, you can start an on-demand manual refresh by clicking the circular arrow to its left.

Clicking Schedule refresh takes us to the dataset settings page, where we can expand the Scheduled refresh section. The refresh frequency is daily or weekly, and you can refresh a dataset eight times a day with a standard Power BI account, or 48 times a day, that is, every half hour, with a premium account. You can add scheduled refresh times with the Add another time link, but to be honest, adding 48 times like this is tedious, so I'll just do a few. At the top of the settings page, there is a hyperlink to the refresh history.

With the schedule in place, it's a case of hurry up and wait. One thing to be aware of is that refreshes won't happen exactly at the specified time, but usually within five minutes of it. There we go. Not quite 11:33, and the dataset has been refreshed. Looking at the history, the refresh type says Scheduled.
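Rather than clicking Add another time 48 times, you can set the whole schedule in one call to the Power BI REST API's Update Refresh Schedule endpoint, which accepts a list of times. This is a minimal sketch, assuming you'll attach your own authentication and workspace/dataset ids before sending; it only builds the request body.

```python
import json

# Build all 48 half-hour refresh slots: "00:00", "00:30", ..., "23:30".
times = [f"{h:02d}:{m:02d}" for h in range(24) for m in (0, 30)]

# Body shape for PATCH .../datasets/{datasetId}/refreshSchedule.
# The days and time zone shown here are illustrative values.
body = json.dumps({
    "value": {
        "enabled": True,
        "days": ["Monday", "Tuesday", "Wednesday", "Thursday", "Friday"],
        "times": times,
        "localTimeZoneId": "UTC",
    }
})

print(len(times), times[0], times[-1])
```

Sending this body with `Invoke-PowerBIRestMethod -Method Patch` (as shown in the next section for refreshes) would replace the schedule in one step instead of 48 clicks.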
You can invoke a dataset refresh through the REST API via a PowerShell command, code, or a Power Automate flow. Let's look at using Invoke-PowerBIRestMethod via PowerShell. First, I need to install the Microsoft Power BI management module by running PowerShell in administrator mode. Next, I'll log in to Power BI. The Invoke-PowerBIRestMethod command takes three parameters: Url is the REST API endpoint with the workspace and dataset GUIDs embedded in it, Method is Post, and Body is a JSON payload with some extra parameters. In this case, I've said to notify me by email if the refresh fails, with up to three retry attempts. Okay, let's run that. Looking at the refresh history, we can see that it ran successfully with the refresh type of Via API.
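The same request the PowerShell cmdlet sends can be sketched in Python. The GUIDs below are placeholders for your own workspace and dataset ids, and the snippet only constructs the URL and body; actually posting it requires an Azure AD bearer token in the Authorization header, which Connect-PowerBIServiceAccount handles for you on the PowerShell side.

```python
import json

# Placeholder GUIDs -- substitute your own workspace and dataset ids.
workspace_id = "00000000-0000-0000-0000-000000000000"
dataset_id = "11111111-1111-1111-1111-111111111111"

# The refreshes endpoint that Invoke-PowerBIRestMethod posts to.
url = (f"https://api.powerbi.com/v1.0/myorg/groups/{workspace_id}"
       f"/datasets/{dataset_id}/refreshes")

# "Email me if the refresh fails, retry up to 3 times", mirroring
# the JSON body described above.
body = json.dumps({"notifyOption": "MailOnFailure", "retryCount": 3})

print(url)
print(body)
```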
You can trigger a dataset refresh based on changes to the data source using Power Automate. I'll create a new flow starting with a blank automated cloud flow. The flow is called RefreshViaGateway, and it will trigger a refresh when data is added to a SQL Server table. The SQL Server is on-premises, and I'm accessing it through an on-premises data gateway. The database has a table called RefreshViaFlow that will trigger the update when a record is inserted into it. The second step is Power BI: Refresh a dataset, where I select the workspace and dataset, so that's OnPremData and SalesAndMargin. Right, it looks like my trial Power Automate license isn't going to be capable of running this job, so I'll quickly upgrade to a per-user license. Next, I need to turn on the flow before inserting a record into the SQL Server table. The table used to trigger the flow needs a unique Id column and a timestamp column. If you don't want to add those columns to the table you're monitoring for changes, you can create a dedicated table, as I have here, and use a SQL trigger to insert records into it. I'll insert a record via SQL Server Management Studio to kick off the flow. Heading back to Power Automate and refreshing the run history, we can see the job ran within a minute of the data change. Quite impressive. Back in Power BI's refresh history, the flow job shows up as a refresh via API.
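The dedicated table and SQL trigger described above might look like the following T-SQL sketch. All object names are illustrative (dbo.Sales stands in for whatever table you're monitoring), and the ROWVERSION column plays the role of the timestamp column the flow trigger requires.

```sql
-- Dedicated table the flow watches: it needs a unique Id column
-- and a timestamp (ROWVERSION) column, per the flow trigger's rules.
CREATE TABLE dbo.RefreshViaFlow (
    Id        INT IDENTITY(1,1) PRIMARY KEY,
    ChangedAt ROWVERSION
);
GO

-- Illustrative trigger on the monitored table: each insert into
-- dbo.Sales writes a row into RefreshViaFlow, which fires the flow.
CREATE TRIGGER dbo.trg_Sales_RefreshViaFlow
ON dbo.Sales
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;
    INSERT INTO dbo.RefreshViaFlow DEFAULT VALUES;
END;
```

Because both columns in RefreshViaFlow are auto-generated, the trigger can insert with DEFAULT VALUES and the monitored table stays untouched.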
Another scenario is refreshing a Power BI Desktop .pbix file from an external location like OneDrive or SharePoint. Refreshing from OneDrive circumvents the need for a gateway, but refreshing the file from OneDrive isn't the same as refreshing the file's contents. You can schedule PowerBI.com to re-read the file as often as you like, but you'll keep seeing the same stale data in the report unless you have a mechanism for refreshing the data inside the file. Automatically updating the file's contents on OneDrive is outside the scope of PowerBI.com's refresh abilities. In this situation, you can create a Power Automate flow to copy the file to OneDrive, and to automatically refresh the file's contents, you may have to use a third-party solution like Power Update from PowerOn.
Hallam is a software architect with over 20 years' experience across a wide range of industries. He began his software career as a Delphi/Interbase disciple but changed his allegiance to Microsoft with its deep and broad ecosystem. While Hallam has designed and crafted custom software using web, mobile, and desktop technologies, he believes good-quality, reliable data is the key to a successful solution. The challenge of quickly turning data into useful information for digestion by humans and machines has led Hallam to specialize in database design and process automation. Showing customers how to leverage new technology to change and improve their business processes is one of the key drivers keeping Hallam coming back to the keyboard.