Finding Power BI Semantic Model Refresh Operations In Gateway Logs

The logs for your on-premises data gateway contain a lot of information that is useful for troubleshooting and performance tuning. The contents of the logs are documented here, and there are several great solutions for reporting on them: there's a Microsoft-provided template and Rui Romano's more sophisticated monitoring solution here (there's also another, even better community solution being worked on that will be publicly available soon). To do any troubleshooting or performance tuning, though, you need to link the data generated by Power BI in Log Analytics or Profiler traces to the data in the gateway logs. How can you do this?

As an example, let's take the refresh of an Import mode semantic model that uses an on-premises SQL Server database as its source. If you have Log Analytics connected to your workspace, you can run a KQL query something like this to find the events generated by Power BI for your semantic model refresh:

PowerBIDatasetsWorkspace
| where TimeGenerated > ago(1h)
| where OperationName in ("CommandBegin", "CommandEnd", "ProgressReportBegin", "ProgressReportEnd")
//| where OperationName in ("ExecutionMetrics")
| project OperationName, OperationDetailName, EventText, TimeGenerated, XmlaRequestId 
| order by TimeGenerated asc

All the events associated with the refresh command will have the same value in the XmlaRequestId column. If you're using a Profiler trace to capture the same information, the RequestID column will contain the same value.
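
On a busy workspace it can be fiddly to pick out the right XmlaRequestId by eye, so a small variation on the query above can help. This is just a sketch, and it assumes the refresh command's text (which appears in EventText for the CommandEnd event) contains the word "refresh":

PowerBIDatasetsWorkspace
| where TimeGenerated > ago(1h)
| where OperationName == "CommandEnd"
| where EventText contains "refresh" // assumption: the TMSL refresh command text mentions "refresh"
| summarize RefreshCompleted = max(TimeGenerated) by XmlaRequestId
| order by RefreshCompleted desc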

If you then export the gateway logs and look in the Query Execution Report log file, for example (I did it with Power Query – it's just a CSV file), you can match the XmlaRequestId value from Log Analytics to the value in the RequestId column in the log file.

In this example my semantic model contains two tables sourced from an on-premises SQL Server database, so there are two entries in the Query Execution Report log file, one for each table. Once you've found the matching rows in the Query Execution Report log file, you can find all kinds of useful information, such as how much time (if any) it took to read data from the source or to spool data to disk. You can do the same thing with the other gateway log files: for example, the Open Connection Report will tell you how long it took to open any connections to the data source.
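
The matching itself is just an equi-join on those two id columns. Purely to illustrate the shape of it – and assuming, hypothetically, that you had loaded the Query Execution Report rows into Log Analytics as a custom table (the table name and duration column names below are invented for the example; in practice you can just do the lookup in Power Query as described above) – the join would look something like this:

PowerBIDatasetsWorkspace
| where TimeGenerated > ago(1h)
| where OperationName == "CommandEnd"
| project XmlaRequestId, TimeGenerated
| join kind=inner (
    // GatewayQueryExecution_CL is a hypothetical custom table holding the exported CSV rows;
    // the duration column names are illustrative only
    GatewayQueryExecution_CL
    | project RequestId, DataReadingDurationMs, SpoolingDurationMs
) on $left.XmlaRequestId == $right.RequestId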

If you have a DirectQuery model connected to an on-premises data source then you can do the same thing: all the events associated with a DAX query (such as the Query Begin/End events) will have the same XmlaRequestId, and again all you need to do is find that value in the RequestId column in the gateway logs.
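
A sketch of the equivalent Log Analytics query for the DirectQuery case might look like this (it assumes the Query Begin/End events show up with the OperationName values QueryBegin and QueryEnd):

PowerBIDatasetsWorkspace
| where TimeGenerated > ago(1h)
| where OperationName in ("QueryBegin", "QueryEnd")
| project OperationName, OperationDetailName, EventText, TimeGenerated, XmlaRequestId
| order by TimeGenerated asc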
