The VertiPaq Analyzer is one of those great community tools that I really cannot live without. It gives me an overview of all elements in my model and identifies potential performance problems by showing the storage requirements of each column. So when I see expensive columns, the first question that arises is: “Do I really need this column, or could I delete it?” Luckily, this can now be answered with my new Power BI Cleaner tool. It shows the usage of all columns (and measures) within the tables of the VertiPaq Analyzer.
Power BI Cleaner tool
So whenever there is no entry in the column “Where Used”, you can go ahead and eliminate the column (or measure) from the model. Well, with one exception actually: fields used in the definition of incremental refresh policies are currently not identified. So make sure to consider this before running wild 😉
In the table “Measures” you can spot which measures might not be used:
And as a general recommendation: always keep your old version in place and save the cleaned-up file as a new version, just in case.
To learn more about the used elements, you can drill through to the detail pages “Where Used Direct” and “Where Used Indirect”:
“Where Used Direct” will show where the elements are directly used:
And the next page, “Where Used Indirect”, will show elements that use the selection indirectly, further down the dependency chain:
Please note that the drill-through pages will only return values for columns and measures that have been used, or that are referenced by used items (directly or indirectly). So if a column or measure is referenced only by another column or measure that isn’t used anywhere in the report, it will not be shown. This is intentional and reflects the purpose of the tool: it is a cleanup helper, not a documentation tool.
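Conceptually, this is a reachability question over the model’s dependency graph: an element counts as used if a report element references it directly, or if some chain of DAX references leads from a directly used element down to it. A minimal Python sketch of that idea (the actual logic lives in the tool’s Power Query queries; names and data here are purely illustrative):

```python
def used_items(direct_uses, dependencies):
    """direct_uses: items referenced directly by the report (visuals, filters, ...).
    dependencies: item -> set of items its DAX expression references.
    Returns every item reachable from a direct use, i.e. everything used
    directly or indirectly; all remaining items are deletion candidates."""
    used, stack = set(), list(direct_uses)
    while stack:
        item = stack.pop()
        if item in used:
            continue
        used.add(item)
        # follow the DAX references of this item as well (indirect usage)
        stack.extend(dependencies.get(item, ()))
    return used

# Hypothetical example model: [Sales Amount] appears in a visual,
# [Margin] exists but is never placed on the report.
deps = {
    "[Sales Amount]": {"'Sales'[Net Price]", "'Sales'[Quantity]"},
    "[Margin]":       {"[Sales Amount]", "[Cost]"},
}
print(used_items({"[Sales Amount]"}, deps))
```

Here the two Sales columns show up as used indirectly, while [Margin] and [Cost] stay unused: even though [Margin] references [Sales Amount], nothing in the report ever uses [Margin] itself.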
What’s (not) covered
The areas covered for usage detection are the following:
- Calculated columns
- Filters (on visual, page, report and drillthrough-level)
- Conditional formatting
And again – NOT covered are:
- Incremental refresh policies
Further known limitations:
- Columns that are referenced only by their name (without the preceding table name) will NOT be detected.
- False positives will be generated for measures that have the same name as a column: an unused measure will be flagged as used if its column counterpart is used!
- Some custom visuals produce bad metadata, which can lead to errors as well. The Network Navigator, for example, renames the input table to “Table1” in the metadata, so its fields will not be discoverable through this tool.
- Column and measure names containing parentheses “()” will currently not be picked up (this limitation might be lifted in future updates).
Please let me know if there are areas that I’ve missed and I’ll see if I can include them as well.
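The first two limitations follow from how name-based reference detection works. A rough Python illustration (the tool’s actual parsing happens in Power Query; function and names here are hypothetical): a fully qualified reference like 'Table'[Column] can be matched reliably, but a bare [Amount] is indistinguishable from a measure of the same name, which is exactly where the missed references and false positives come from.

```python
import re

def find_column_references(dax, tables, fields):
    """Detect qualified 'Table'[Field] references in a DAX expression.
    Bare [Field] references are deliberately ignored: without the table
    prefix they cannot be told apart from a measure named Field."""
    found = set()
    for table in tables:
        for field in fields:
            # matches 'Table'[Field] as well as Table[Field]
            pattern = r"'?" + re.escape(table) + r"'?\[" + re.escape(field) + r"\]"
            if re.search(pattern, dax):
                found.add((table, field))
    return found

expr = "CALCULATE(SUM('Sales'[Amount]), [Amount] > 0)"
print(find_column_references(expr, ["Sales"], ["Amount"]))
```

The qualified 'Sales'[Amount] is detected, but the bare [Amount] in the filter is not attributed to the column; conversely, a measure that happens to be called Amount would match wherever the column name matches.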
How to use
- Export a vpax file with DAX Studio and paste its path into the parameter “FilePathVPAX”. (! This is a preview feature that you have to enable as described here: https://www.sqlbi.com/blog/marco/2019/09/15/vertipaq-analyzer-2-0-preview-1/ )
- Save your pbix as a pbit and paste the pbit’s path into the parameter “FilePathPBIT” (this enables the “Where Used” column on the “Tables” page)
- Refresh All
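The reason the tool needs a pbit rather than the pbix is that the template’s metadata is readily accessible: a .pbit file is essentially a zip archive, and the report definition sits in “Report/Layout” as UTF-16-LE encoded JSON. A minimal Python sketch of that access pattern, using a fake in-memory pbit (the real file contains far more metadata, and the tool itself does this in Power Query):

```python
import io
import json
import zipfile

# Build a tiny fake "pbit" in memory: a zip with a Report/Layout entry
# containing UTF-16-LE encoded JSON, mimicking the real file layout.
fake_layout = {"sections": [{"displayName": "Page 1", "visualContainers": []}]}

buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("Report/Layout", json.dumps(fake_layout).encode("utf-16-le"))

# Reading it back is exactly what a pbit-based parser has to do:
with zipfile.ZipFile(buf) as z:
    layout = json.loads(z.read("Report/Layout").decode("utf-16-le"))

print([s["displayName"] for s in layout["sections"]])  # the report's page names
```

From this Layout JSON, the visuals, their fields, and their filters can then be walked to build the “Where Used” information.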
Or check out this video for a detailed walkthrough:
How does it work
The tool uses the library and functions of the VertiPaq Analyzer and many of the metadata-parsing algorithms of my Power BI Comparer tool. Finally, I want to thank Marco Russo for his support and for confirming that it is totally OK to adapt their open-sourced solutions into other tools.
All the logic of this solution sits in the Power Query queries. So if you want to know how it all works, just open the Query Editor and study them.
Custom visuals with bad metadata can lead to errors; I will upload a hardened version soon.
Download file: PowerBICleaner.zip
Enjoy and stay queryious 😉