Export the results of best practice analyzer from all models
In this post, we will see how to automatically export the results of Best Practice Analyzer for all our models.
This post covers SQL Server Analysis Services, Azure Analysis Services, and Power BI datasets, the latter only if you have access to the BIM files. Automatically exporting Best Practice Analyzer results from the Power BI service requires a different approach, especially on the authentication side, which I won’t cover in this post.
What is Best Practice Analyzer
Best Practice Analyzer (BPA) is a feature of Tabular Editor that allows you to define rules based on your model’s metadata and then scan the entire model to identify issues according to these predefined rules.
Once the scan is completed, all identified issues are revealed, and some of them can be resolved automatically. Here are the results of the Best Practice Analyzer:
While running the BPA on demand can be very useful for developers, obtaining a comprehensive overview of all our models at once can provide better insight into the overall quality of the models at a company level.
Setting BPA rules and storing the JSON rule file
The BPA rules can be defined in a JSON file and customized as much as you wish, provided you have sufficient knowledge of JSON and the TOM hierarchy.
Alternatively, you can download the BPA rules from the Microsoft GitHub repository, which is frequently updated with new content, and keep only the rules that are relevant to your organization.
Once you are satisfied with the definition of the rules, you need to upload the BPARules.json file to the designated directory on the machine where the BPA will run.
- %AppData%\..\Local\TabularEditor to make the rules available only to you.
- %ProgramData%\TabularEditor to make the rules available to everyone on your local machine.
Run a CMD command
Given that this post focuses on automating and exporting the results of the Best Practice Analyzer, let’s explore a straightforward method to export one model at a time using CMD.
To export the results of a single model, you can use the following command:
TabularEditor.exe yourServer yourDB -A > bparesults.txt
It is also possible to run the Best Practice Analyzer against BIM files instead of directly from the models. Here’s how you can do it:
TabularEditor.exe "c:\Projects\SSAS\AdventureWorks\Model.bim" -A > bparesults.txt
For more command lines related to Tabular Editor, you can refer to the documentation available here.
Export the results of best practice analyzer from a server
To export the Best Practice Analyzer results for all models on your server, you can execute this PowerShell script:
# Load the Analysis Services assembly
[Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices")
# Variable for your SSAS server name
$SSASServerName = "your server"
# Connect to the SSAS server
$SSASServer = New-Object Microsoft.AnalysisServices.Server
$SSASServer.Connect($SSASServerName)
# Set the location where TabularEditor.exe is located
Set-Location -Path "C:\TabularEditor"
# Loop through the SSAS databases and run the BPA against each one
foreach ($database in $SSASServer.Databases) {
    $cube_name = $database.ID
    ./TabularEditor.exe $SSASServerName $cube_name -A | Out-File -FilePath "C:\TabularEditor\bpa\${cube_name}.txt"
}
This script generates one file per model in the directory configured above. You can rerun the script to obtain updated BPA results, as it will overwrite the existing files.
Make sure that the machine from which you’re running the command has network access to the target server; you usually need to open ports 2383 and 2382. You can check the documentation here to configure the firewall for your scenario. And of course, the login that executes the script needs sufficient permissions on the target server.
Export the results of best practice analyzer from BIM files
You can also execute the script against BIM files, which makes it possible to handle Power BI Premium datasets, provided you store the BIM files of your datasets before publishing them to the Power BI service, which may be the case if you have implemented CI/CD pipelines with Azure DevOps, for example.
# Set the directory path
$directoryPath = "Your_Directory_Path_Here"
# Get all .bim files recursively from the directory
$bimFiles = Get-ChildItem -Path $directoryPath -Recurse -Filter "*.bim"
# Path to TabularEditor.exe
$tabularEditorPath = "C:\TabularEditor\TabularEditor.exe"
# Loop through each .bim file
foreach ($file in $bimFiles) {
    # Print the name of the current file being processed (uncomment for debugging)
    #Write-Host "Processing file: $($file.FullName)"
    # Construct the full path of the file
    $filePath = $file.FullName
    # Name of the folder containing the .bim file, used to disambiguate output files
    $parentFolderName = $file.Directory.Name
    # 1st option: construct the output file path (use if you have given your .bim files distinct names)
    #$outputFilePath = "C:\TabularEditor\bpa\" + $file.BaseName + ".txt"
    # 2nd option: construct the output file path (use if all .bim files are named "Model.bim")
    $outputFilePath = "C:\TabularEditor\bpa\" + $parentFolderName + "_" + $file.BaseName + ".txt"
    # Execute the TabularEditor command and write the output to the file
    & $tabularEditorPath $filePath -A | Out-File -FilePath $outputFilePath
}
By default, BIM files are named “Model.bim”. If you don’t rename these files with distinct names, keep the first option in the script commented out and use the second, which prefixes the output name with the folder name of the BIM file. Here’s an example of how I export the results of Best Practice Analyzer for all my BIM files, using the folder name as a prefix, since all my BIM files were named “Model.bim”.
Schedule the script to run every day
To schedule our script, we can use the Windows Task Scheduler, which is a simple, free and reliable tool for running scripts daily.
Let’s create the task in the task scheduler.
Let’s begin by creating a dedicated folder for scripts related to SSAS. It will make it easier to monitor all jobs related to SSAS if we need to schedule additional scripts in the future.
Once the SSAS folder is created, follow these steps:
- Create a new task in the Task Scheduler.
- Give the task a meaningful name to easily identify its purpose.
- Define a trigger for the task, setting it to run “on a schedule” daily, and specify the desired execution time.
- In the action section, choose ‘Start a Program’ and input the following information:
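As an illustration, assuming the PowerShell script above was saved as C:\TabularEditor\ExportBPA.ps1 (a hypothetical path and file name; use your own), the action fields could look like this:

```
Program/script:      powershell.exe
Add arguments:       -ExecutionPolicy Bypass -File "C:\TabularEditor\ExportBPA.ps1"
Start in (optional): C:\TabularEditor
```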
Add any additional conditions as needed, or leave them at default settings.
After creating and scheduling the task, it’s best to run it initially to ensure that the script works as expected.
Expose the result to Power BI
Now the final task consists of reading, cleaning, and loading the exported BPA results into Power BI.
First, we need to create a parameter called “FolderPath” to facilitate its execution across different environments.
Create a Folder Connection
As we saw earlier, exporting the Best Practice Analyzer results generates several files.
Therefore, we need to create a folder connection pointing to the path where the files were generated.
After loading the files, you should obtain a table with the binary data of each file.
Thus, we need to read and transform the text data into columns.
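For readers who want to prototype this cleanup step outside Power Query, here is a minimal Python sketch of the same idea (the folder path and function name are placeholders; the actual transformation in the PBIX is written in M):

```python
from pathlib import Path

def read_bpa_results(folder: str) -> dict[str, list[str]]:
    """Read every exported BPA text file and keep only the violation lines."""
    results = {}
    for file in sorted(Path(folder).glob("*.txt")):
        lines = file.read_text(encoding="utf-8").splitlines()
        # Drop header rows such as "Running Best Practice Analyzer..."
        # and separator lines; keep only the rule violations.
        results[file.stem] = [line for line in lines if " violates rule " in line]
    return results
```

Each key is then the model name (taken from the file name), with its violation lines ready to be split into columns.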
The M code
You can get the M code directly by downloading the Power BI file from my GitHub.
The M code simply removes the unnecessary rows and then splits columns using separators such as square brackets or apostrophes.
Depending on the Tabular Version you’re using and how you’ve named the BPA rules, you may need to make adjustments. I typically specify the [Category] and [Severity] within the Description of the rule, as it makes it easier to read and parse.
Here is an example of my rule description: “[Performance] [2] Do not use floating point data types”
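To illustrate how such a description can be split, here is a small Python sketch of the parsing logic (the regular expression is my own illustration, not the M code from the PBIX, and it assumes the [Category] [Severity] convention described above):

```python
import re
from typing import Optional

# Matches lines such as:
# Column 'Sales'[Amount] violates rule "[Performance] [2] Do not use floating point data types"
# Straight and curly quotes are both accepted.
LINE_RE = re.compile(
    r'^(?P<kind>\w+) (?P<obj>.+?) violates rule ["“”]'
    r'\[(?P<category>[^\]]+)\] \[(?P<severity>\d+)\] (?P<rule>.+?)["“”]$'
)

def parse_bpa_line(line: str) -> Optional[dict]:
    """Split one BPA output line into object, category, severity and rule name."""
    match = LINE_RE.match(line.strip())
    if match is None:
        return None  # header rows and separators simply do not match
    result = match.groupdict()
    result["severity"] = int(result["severity"])
    return result
```

Lines whose rule description lacks the [Severity] tag will not match this pattern, so add the tag to your rules file if you want severity in the report.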
The Power BI
I’m not much of a visuals person, so you can certainly build something better than this, but here is an overview of all the models along with the date of the last exported BPA result.
Here is a more detailed page, which lets you drill into specific models, categories, or rules.
You can download the PBIX file here.
Configure the data source
The final step involves creating the data source connection on the Power BI portal. Depending on where you store the exported BPA result files, you may require a different data source type.
In my scenario, a folder connection suffices, as I directly run and export the results of Best Practice Analyzer on the gateway machine. However, in your scenario, configuring a data source connection may not be necessary if your files are stored in the cloud.
Conclusion
In this post, I covered how to export the results of best practice analyzer from all Analysis Services models or BIM files. This can be very useful to have a global report refreshed on a daily basis to alert your teams when their models do not comply with the best practice rules that you have put in place within your company.
4 thoughts on “Export the results of best practice analyzer from all models”
Hi,
This is really interesting!
I tested this with the standard BPARules.json and it works just fine to extract the data with PowerShell.
I tried to use the BPA.pbix Power BI report and changed the FolderPath, but there seems to be some issue with the “Severity” information (1, 2, 3), as it doesn’t seem to be included in the extracts.
Example from extract:
Tabular Editor 2.23.0 (build 2.23.8823.23768)
——————————–
Loading model from server …
Running Best Practice Analyzer…
=================================
Column ‘APDV010_PART_DIM'[APDV010_PART_ID] violates rule “[Performance] Do not use floating point data types”
Column ‘APDV010_PART_DIM'[PART_QTY_IN_LVL1] violates rule “[Performance] Do not use floating point data types”
Column ‘APDV010_PART_DIM'[PART_QTY_IN_LVL2] violates rule “[Performance] Do not use floating point data types”
Column ‘APDV010_PART_DIM'[FORECAST10W] violates rule “[Performance] Do not use floating point data types”
Am I missing some part of the setup…
hi Ake,
Thanks for the comment! Yes, that’s something I added myself in the extracted JSON rule file. You can either add it too or remove that part of the M code, but if you’re not sure what to remove, I’d advise adding the [Severity] in the rule file as I explained in the post. For example: “[Performance] [2] Do not use floating point data types”, where [2] is the severity.
hi
i have an issue.
I’ve installed TE 2, have a model.bim file on my machine, and have already downloaded bpa.json, but when I run the script in PowerShell I get this error:
TabularEditor.exe : The term ‘TabularEditor.exe’ is not recognized as the name of a cmdlet, function, script file, or
operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try
again.
At line:2 char:1
+ TabularEditor.exe “d:\Model.bim” -A > bparesults.txt
+ ~~~~~~~~~~~~~~~~~
+ CategoryInfo : ObjectNotFound: (TabularEditor.exe:String) [], CommandNotFoundException
+ FullyQualifiedErrorId : CommandNotFoundException
hi Mahdi,
Can you copy/paste your script here?