How to hack yourself in Power BI (and Power Pivot?)

Reading Gerhard Brueckl’s post on how to visualize SSAS calculation dependencies reminded me of my post about a similar technique from December last year.

His solution has features that would benefit my version as well:

  1. Directly connect to the model to be analysed, without the clumsy export of measures to txt via DAX Studio
  2. Including calculated columns? No! Who creates calculated columns in DAX in PBI anyway? Do them in the query editor using M instead (more functions, better compression, easier migration of the model to SSAS once needed)

So wouldn’t it be cool if we could just add a documentation page to our current model – “all in one” so to speak? Here you find how to “hack”-connect with Excel to your current Power BI Desktop-Model.

So what works with Excel should work with PBI as well – we just need to connect via the query editor, using M. And of course: as we’re hacking ourselves here (i.e. the file we’re currently working on), we need to save our changes in order to make them show up.

Unfortunately the port number changes every time you re-open your PBI file, so you have to change that number in your code whenever you want to refresh the report (it’s in the first line of the query). That’s all you need to update. Although the code/name of your model changes as well, this is taken care of automatically: a DMV fetches the value of column [Value] in the first row – the Catalog here:

Database = AnalysisServices.Database("localhost: "&Port, "", [Query="select * from $SYSTEM.DISCOVER_PROPERTIES"]){0}[Value]

where Port = "YourCurrentPortNumber" (as text!)

So yes: You can run DMVs directly against your current file from Power BI – no need to go via DAX Studio!

Then retrieve the data you’re after. Unfortunately the DISCOVER_CALC_DEPENDENCY DMV didn’t return any results (neither in PBI nor in Excel/Power Query) – I would have preferred to use Gerhard’s visualizations as well – so I continued with my file and the MDSCHEMA_MEASURES DMV:

DMV = AnalysisServices.Database("localhost: "&Port, Database, [Query="select * from $SYSTEM.MDSCHEMA_MEASURES"])
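Putting the pieces together, a minimal standalone query could look like the sketch below (the port number `50293` is made up – replace it with whatever your current instance uses):

```m
let
    // assumption: replace with your current port number (as text!)
    Port = "50293",
    // fetch the catalog name from the first row of DISCOVER_PROPERTIES
    Database = AnalysisServices.Database("localhost: "&Port, "", [Query="select * from $SYSTEM.DISCOVER_PROPERTIES"]){0}[Value],
    // run the DMV against the discovered catalog
    DMV = AnalysisServices.Database("localhost: "&Port, Database, [Query="select * from $SYSTEM.MDSCHEMA_MEASURES"])
in
    DMV
```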

You’ll find the full code in the file attached:

OK – this was just for fun. It’s completely unsupported … so anything could happen!

But you could use the M code in Excel to connect to a PBI instance in order to generate a flat-file export from your PBI data, including measures! In Excel you could even hold the changed port ID in a cell in your sheet and connect it as a parameter to your query. In that case you don’t even have to edit the queries in order to update your results.
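For instance, a DAX query passed through M could export a table together with a measure. This is just a sketch: `Product[Category]` and `[Total Sales]` are placeholder names, and `Port`/`Database` are assumed to come from the steps shown earlier (note the doubled quotes needed to escape `"` inside an M string):

```m
Export = AnalysisServices.Database("localhost: "&Port, Database,
    [Query="evaluate ADDCOLUMNS(VALUES(Product[Category]), ""Total Sales"", [Total Sales])"])
```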

Thinking about it: Would this also be a way to create calculated tables in Power Pivot (put the code into the command area where I addressed the DMV’s)? I could hack all night long… or leave that for someone else 🙂

Enjoy & stay queryious 🙂

P.S.: If you’re not running DAX Studio, here you find a different method to retrieve the current port-ID.

Edit 2016-10-26: There has actually been a much better description of this process by Chris Webb here (apart from the retrieval of the DB-name :-)) . You might also be interested in some new schemas that Meagan Longoria has beautifully documented here.

Edit 2016-11-13: Mim posted some very nice code in the comments to automatically connect to your current model, which I modified a bit: You can optionally pass your DAX query to the function, so if you leave it empty you will be taken to the step where you can manually expand your fields:

(optional Dax_Query as text) =>
let
      Source_Port = Folder.Files("C:\Users"),
      msmdsrv = Table.SelectRows(Source_Port, each [Name] = "msmdsrv.port.txt"),
      #"Sorted Rows" = Table.Buffer(Table.Sort(msmdsrv,{{"Date created", Order.Descending}})),
      Select_Last_Instance_Active = Table.FirstN(#"Sorted Rows",1),
      #"Combined Binaries" = Binary.Combine(Select_Last_Instance_Active[Content]),
      #"Imported Text" = Table.FromColumns({Lines.FromBinary(#"Combined Binaries",null,null,1252)}),
      PortID = Table.TransformColumns(#"Imported Text",{},Text.Clean){0}[Column1],
      Database = AnalysisServices.Database("localhost: "&PortID, "", [Query="select * from $SYSTEM.DISCOVER_PROPERTIES"]){0}[Value],
      Import_SSAS = if Dax_Query = null then AnalysisServices.Database("localhost: "&PortID, Database){[Id="Model"]}[Data][Data]{0} else AnalysisServices.Database("localhost: "&PortID, Database,[Query=Dax_Query])
in
      Import_SSAS
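If you save that function as a query named, say, `GetCurrentModel` (the name is up to you, and `MyTable` below is just a placeholder), invoking it could look like this:

```m
let
    // with a DAX query: returns the query result directly
    Measures = GetCurrentModel("evaluate VALUES(MyTable)"),
    // with null: returns the whole model, ready for manual expansion
    Model = GetCurrentModel(null)
in
    Measures
```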

Comments (23)

  1. Excellent post, I am really becoming intrigued with the possibilities in terms of leveraging Power BI’s data model and not losing access to pivot tables, as well as far more effectively documenting data models (completely).

    If you are not running DAX Studio or don’t have the freedom to install utilities on your machine, opening the Windows PowerShell cmd prompt and pasting these two lines will output the remote port being used by Power BI Desktop for your current session (better yet we could save this to a file to be loaded in a parameters table):

    $processID = Get-Process | Where-Object{$_.ProcessName -eq 'PBIDesktop'} | Select -ExpandProperty Id
    Get-NetTCPConnection | Where-Object{$_.OwningProcess -eq "$processID" -and $_.RemotePort -gt 0} | Select-Object RemotePort -Unique

    (PS – Thank you very much for answering many of my questions on the Power Query forums)


    • Wow!
      Thank you very much for posting this very valuable tip (restricted access to utilities is really not uncommon)!!


  2. @Leaning Tree
    Using Power Shell is way better
    Thank you very much for this tip

    “better yet we could save this to a file to be loaded in a parameters table…”
    How can this be achieved ?


    • Sam

      paste this code in a notepad and save it as .bat

      for /f "tokens=2 delims=," %%F in ('tasklist /nh /fi "imagename eq msmdsrv.exe" /fo csv') do (
      set var=%%F
      )

      for /f "tokens=2 delims= " %%h in ('netstat -ano ^| findstr ESTABLISHED ^| findstr %var%') do (
      set var1=%%h
      )

      echo %var1% > PBD_Port.csv

      it will generate a csv file with the port number; no admin rights are required.

      I guess Power BI Desktop now becomes our personal SSAS server!! Exciting times indeed.


  3. fantastic, at last, i can connect to my powerbi model, and dynamically change the database name !!! thanks a lot for this solution


    • Great that this helped!
      BTW: I’ve just added some additional resources that you might find helpful as well.


        • Wow – thanks Mimoune!

          So with just one PBI instance open we can fully dynamically hack ourselves now: Use the FromFolder-method, address the AnalysisServicesWorkspace-level, filter on msmdsrv.port.txt as the name of the file and retrieve the port number:

          Lines.FromBinary(Table.SelectRows(Folder.Files("C:\Users\..YourUserName..\AppData\Local\Microsoft\Power BI Desktop\AnalysisServicesWorkspaces"), each ([Name] = "msmdsrv.port.txt"))[Content]{0},null,null,1200){0}

          Love it 🙂


          • I am using a slightly different script here
            Source_Port = Folder.Files("C:\Users"),
            msmdsrv = Table.SelectRows(Source_Port, each [Name] = "msmdsrv.port.txt"),
            #"Removed Other Columns" = Table.SelectColumns(msmdsrv,{"Content", "Name", "Date modified"}),
            Select_Last_Instance_Active = Table.FirstN(#"Removed Other Columns",1),
            #"Combined Binaries" = Binary.Combine(Select_Last_Instance_Active[Content]),
            #"Imported Text" = Table.FromColumns({Lines.FromBinary(#"Combined Binaries",null,null,1252)}),
            #"Cleaned Text" = Table.TransformColumns(#"Imported Text",{},Text.Clean),
            Port = #"Cleaned Text"{0}[Column1],
            Database = AnalysisServices.Database("localhost: "&Port, "", [Query="select * from $SYSTEM.DISCOVER_PROPERTIES"]){0}[Value],

            Dax_Query = "evaluate SUMMARIZE(cobra,[tag])",

            Import_SSAS = AnalysisServices.Database("localhost: "&Port, Database,[Query=Dax_Query])

            i don’t really need the user name
            i much prefer to include the dax query to retrieve the selection i want.

            i am sure there are some extra steps not needed.


          • Thank you mim, that’s very smart! Allowed myself to “consolidate” it with my previous suggestion 🙂

  4. Imke

    i failed to appreciate your last comment “where you can manually expand your fields”

    actually i just realized that when you expand the fields and keep it as a connection only, you create a traditional pivot table – the data will be persistent even if you close Power BI Desktop.

    it means you can create reports and share them with the client; they can slice and dice without a data model. that’s a game changer in certain scenarios:

    the client only has Excel 2010
    you don’t want to share the data model but only the results
    the reports will be very light in size


  5. Can anyone explain how to gain access to my “SSAS” db? Once I follow the steps given above, I am being asked to provide my credentials. As this instance of “SSAS” is really just my Power BI file, how do I provide this information?


  6. David

    unfortunately, you have to do the authentication each time, as Power BI generates a new port ID every time; i am trying to find an option in M where you can assign a default authentication mode,


  7. Hi Imke,

    @mim’s query was taking a while to execute on my machine, probably my file system cache was too cold, so I’ve adapted your short query with a small optimization for current user.

    CurrentUser = Table.FirstN(Table.Sort(Folder.Contents("C:\Users"),{{"Date accessed", Order.Descending}}),1)[Name]{0},
    Port = Lines.FromBinary(Table.SelectRows(Folder.Files("C:\Users\"&CurrentUser&"\AppData\Local\Microsoft\Power BI Desktop\AnalysisServicesWorkspaces"), each ([Name] = "msmdsrv.port.txt"))[Content]{0},null,null,1200){0}

    I assume that the current user is the last one that has accessed their user folder “C:\Users”. This is not a true optimization because apparently NTFS can take up to an hour to update that value.

    By the way thanks @mim for your suggestion to check this post again


    • Hi Igor,
      thank you, that’s brilliant! We can even give it a name: CurrentUser.Name 🙂
      That’s very useful for user specific pre-filtering of imports as well. Will you blog about it?


  8. Hi Igor,

    maybe it is wise to check why it is slow in your case; the CurrentUser trick is very nice, but i can’t make it work – the access dates are not updated in real time,

    maybe because i am using SSD, i did not notice it is slow 🙂


  9. I just came across your post when googling a solution for my situation.

    I have a situation here which I think could potentially use the method discussed in this post, but I am not 100% sure though, so I would like to hear your view. I used PQ to cleanse and summarize a dataset which was subsequently loaded to Excel’s Data Model, where I have built my power pivot tables. Is it possible to use a separate Excel file to build power pivot tables based on this Data Model, which resides in a different Excel file? So it is kinda like Front End: Excel, Back End: another Excel’s Data Model created by PQ. If so, how can I access this Data Model? Many Thanks!


    • Hi Gordon,
      you can only connect to a Power BI data model, but not to an Excel model.
      But it’s just some clicks to transfer an Excel data model to Power BI.
      If you want to connect to a Power BI Desktop model, that Power BI file has to stay open on your machine for as long as you want to connect to it from Excel. This is a hack that’s not supported by Microsoft.
      The “official” way to connect from Excel to a central Power BI model is to publish your model to the Power BI service and use the Power BI Publisher for Excel to connect directly to it. That’s hassle-free and supported.

