Use R to export data from Power BI

Edit 03 Aug 2016: With the July 2016 release you can now run your R-scripts directly in the query editor (so no row limit any more!). No need to go via the visuals. But this limits you to exporting datasets from the query editor, so no DAX.

Edit 22 Jan 2016: Currently this method is limited to 150k rows! It will drop all others without warning!

Edit 25 Jan 2016: 150k rows is the limit on all R-scripts at the moment (link). A warning sign will be shown on the visual. I will create a new blogpost if this changes – so stay tuned (by following this blog, Twitter or LinkedIn).

With the December release, Microsoft enabled the R feature to access Power BI’s content. So instead of plotting this data to the screen, could we just export it to a file?

Yes we can:

write.table(trim(dataset), file="your filepath & filename.txt", sep = "\t", row.names = FALSE);

write.table has a lot of options, which are explained here. I’ve only used a few: defining exactly where the output file shall be stored (file=), setting tab as the delimiter (sep="\t") and skipping the row names, which would just be an index in our example.

In addition to that, I had to get rid of trailing whitespace that somehow sneaked into the table, using trim(dataset) from the gdata package. So the full R-code looks like this:

require(gdata)
write.table(trim(dataset), file="your filepath & filename.txt", sep = "\t", row.names = FALSE)
plot(dataset);

Here you have to make sure to turn the slashes in the filepath around: if your path looks like this: C:\Users\Imke\Desktop\Rfile.txt, you need to write it like this: C:/Users/Imke/Desktop/Rfile.txt
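The reason is that R treats the backslash as an escape character, so you can either flip the slashes or double them – both spellings point to the same file:

```r
# Forward slashes work on Windows as well:
path1 <- "C:/Users/Imke/Desktop/Rfile.txt"
# Alternatively, escape each backslash by doubling it:
path2 <- "C:\\Users\\Imke\\Desktop\\Rfile.txt"
```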

Let’s take this as a start that will probably fulfill many users’ need to quickly export data from Power BI to simple text files. Further use cases I see are:

  • Workaround for exporting Power Query results with over 1.1 million rows (import your Excel Power Queries into Power BI).
  • Solve performance problems that arise from overly complicated transformations: now we can buffer temporary results in external files and re-import them, breaking the refresh chain.
  • Incremental load scenarios: if your data grows over time and the transformation before loading into the model takes quite some time, it could make sense to perform the transformations only once: load the results to the external files and fill the data model with an append of the newly transformed data and the “just imported, already transformed” data from the external file(s). A further option comes in very handy here: append=TRUE will append the exported data instead of overwriting the file. Just add col.names=FALSE, because you only need the column names once.
  • Web-harvesting: keeping data that will be replaced by newer data with the next download: see above.
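A minimal sketch of that incremental append (the file path is just a placeholder):

```r
require(gdata)
# Subsequent runs: append the newly transformed rows to the existing file.
# col.names = FALSE avoids repeating the header row on every append.
write.table(trim(dataset), file = "C:/Temp/export.txt", sep = "\t",
            row.names = FALSE, col.names = FALSE, append = TRUE)
```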

But watch out: These R-features are still in preview mode, so not safe to use in production.

Export to SQL-server? – worth checking out.

Are you new to R and worried about the effort or trouble of installing it? No need to be: just install R with standard settings (1), set the options in Power BI (1), install R-Studio with standard settings (2) and install the gdata package (2). It shouldn’t take more than 15 minutes:

  1. How to download and activate R-features in Power BI: here
  2. Easiest way to install packages is via R-Studio:

[Image: installing a package via R-Studio]
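Alternatively, the package can be installed with a single line in the R console:

```r
install.packages("gdata")
```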

Then you start a new visual for the export: choose the R-visual and check the fields that you want to see in your table.

[Image: R-visual with the selected fields]

This will create the dataset that the R-script references.

For further details on how to get started with R-visuals, check out Dustin Ryan’s excellent blogpost.

Edit 2016-01-04: If you’re using the PBI service, you can export your data to csv directly. But this is currently limited to 10k rows.

Edit 2016-01-12: Just found this brilliant, comprehensive guideline on “How to Learn R”, with learning paths and many resources!

Edit 2017-03-11: Here you’ll find a detailed description on how to export to SQL-server.

Enjoy and stay queryious 🙂

Comments (35)

  1. Pingback: Use R to export data from Power BI (and Power Query) | MS Excel | Power Pivot | DAX

  2. Pingback: Use R to export data from Power BI (and Power Query) – The BIccountant | CompkSoft

    • Indirectly yes: the only thing from Power BI that can be passed on to the R-script is the dataframe, i.e. the table or underlying data of the visualisation. Everything from your data can be in there – including the measures.

  3. Hi! I am trying to use the r script in the query editor like this:

    require(gdata)
    write.table(trim(dataset), file="your filepath & filename.txt", sep = "\t", row.names = FALSE);

    but how can I define the data I want in the dataset?

    dataset<-(column names…)?

  4. Hi Imke!
    tried to export DAX measure values via a visualisation (so I cannot use the query editor?), but got an error (something like “unexpected end of data” or “too early end of data” – can’t translate it to English)… Any thoughts?
    Just made an R-visual and added measures, or a row and measures – no way

    • Hi Maxim,
      unfortunately no idea. Did you test the syntax in R Studio? Maybe it’s due to different separators or language settings.

      But if I’m not mistaken it should also be possible to use an M script now: hack into your current model as described here: http://www.thebiccountant.com/2016/04/09/hackpowerbi/
      But instead of using a DMV, write a DAX query. If you create a parameter for the current port number people could even refresh it without having to edit the M-code.

      • Thanks for the answer, Imke! Did not test it (forgot about this option, for sure); there was only MS R Open on that computer, and it is used only as the R engine for Power BI (I am a total noob in R, can’t write my own scripts).
        Will check this post also, thanks!

  5. Excellent article, Imke. It would be great if you could describe how to automate the date range for “Transformed newest data” based on the last date of the “Just imported already transformed” data. I’m stuck at this point and it’d be great if you could share some insight. Thanks.

    • Thx Deepak!
      Re-reading it I must admit that this is not particularly clear. What I tried to describe here was the idea of combining 2 steps into one, which might not actually be the best idea: load the latest data and append it to the csv, and at the same time combine the (previously existing) csv with the new data in the same file.

      Instead, I would now recommend doing one step after the other:
      1) Import your new data, transform and append to the csv. Now your csv should have all data you need.
      2) Import your new csv and build your report on it.

      If you need to check what the latest data in your csv is, you have basically 2 options:
      1) Check the latest date in the file itself (but this would require the import of the file, which could take long in this particular scenario) or
      2) Check the latest update date of the file itself (if that’s enough) by using the From-Folder method. This will display your csv file’s metadata without the need to fully import it.
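      A minimal sketch of that second option in M (folder path and file name are just placeholders):

```m
let
    // List the files in the export folder; this reads only metadata,
    // not the file contents
    Source = Folder.Files("C:\Temp"),
    Export = Table.SelectRows(Source, each [Name] = "export.csv"),
    // "Date modified" comes for free with the folder listing
    LastUpdate = Export{0}[Date modified]
in
    LastUpdate
```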

      Hope this answers your question?

  6. Using the R code/script above from the query editor worked great. Thanks for sharing this. Hopefully MS will create an export to csv or excel from the query editor in a future release.

  7. Imke

    thanks for your code! I tried the same but exported to SQLite – this little database is really fast!!!

  8. # 'dataset' holds the input data for this script
    library(DBI)
    library(RSQLite)
    driver = dbDriver("SQLite")
    con = dbConnect(driver, dbname = "C:/sqlite/database.db")
    dbSendQuery(con, "DROP TABLE IF EXISTS Table1")
    dbWriteTable(con, "Table1", dataset)
    dbDisconnect(con)
    dbUnloadDriver(driver)

    you need to create the database first, which is very easy in sqlite:

    sqlite3.exe database.db

    actually, in a standalone scenario and for the purpose of storing data, sqlite is a good replacement for MS SQL Server

  9. Imke

    another option that I just tried: it works with massive files and does not crash, unlike DAX Studio – I tested it with a 4-million-row table and it works perfectly 🙂

    it uses PowerShell to connect to Power BI Desktop and export to csv. For the moment I manually input the port number and the database name, but I guess it is easy to get those automatically 🙂

    this means one can use Power BI Desktop as a solid ETL tool !!!!

    http://www.kasperonbi.com/dump-the-results-of-a-dax-query-to-csv-using-powershell/

    so here is the workflow:

    load a lot of different data, do some transformations, then load it to the data model and export to csv.

    take that CSV and load it into the final data model, or an SQL database.

  10. Imke

    I vaguely remember seeing that blog post but I had the same reaction as you: it sounds too technical,
    especially when he starts speaking about SQL Server.

    I will try to add the port number – the code is already available, but I have zero knowledge of PowerShell,

    just think about this scenario: do the heavy lifting in Power BI Desktop, export the final result to csv, import it into sqlite and consume it in whatever client you want – Excel, maybe Qlik!!!!!!

    • Hi mim,
      I’m running out of superlatives here !!
      Saying this is totally super awesome is still an understatement 🙂

      As I hadn’t worked with PowerShell before, I was a bit irritated when I ran the script and the PowerShell window showed a pretty empty blue screen with just a blinking cursor. And because I hadn’t loaded my query to the data model yet, no csv file had been created. But no error message was shown either, as in this mode the code seems to be hidden.
      But when I opened PowerShell separately and copied the code in, things became clearer. When running the script from there, I found my file under ..Users/Imke.

      BTW: When exporting in csv format, large datasets will not be loaded completely!! If loading as txt instead, everything works fine (just be patient until the file is fully written – you can open it while it is still loading, and then the data might not be complete!).

      So thank you again so much Mim! I’m very sure that this will be extremely helpful for very many users out there!!

      • thanks Imke, I am happy you like it – I was afraid the script would not work because of local language settings 🙂

        unfortunately with very large datasets the script does not work. I had a 4-million-row table and it exports fine with 5 columns; if I add another column the export fails, because the SSAS instance embedded in Power BI Desktop is configured to time out after 30 seconds and there is no way to change this configuration. OK, maybe in theory we could change it using SSDT (or SSMS), but in any case the change would disappear when you close PBD.

        I will try changing the export to txt and see the difference.

        In the worst case, maybe we can export in chunks of 3 million rows.

        It is interesting that I now do like working with PBI, but only for SSAS and Power Query. I think 2017 will be very interesting for the Microsoft BI stack.

        • Thx mim – very interesting to know where the limitations lie.
          Yes, very much looking forward to what 2017 will bring us here as well 🙂

  11. If this is used in combination with this query from here: http://www.thebiccountant.com/2015/12/28/how-to-export-data-from-power-bi-and-power-query/ , it is possible to use Power Query to write back out any table from the workbook, including DAX calculated columns.

    (Dax_Query as text) =>

    let
    Source_Port = Folder.Files("C:\Users"),
    msmdsrv = Table.SelectRows(Source_Port, each [Name] = "msmdsrv.port.txt"),
    Select_Last_Instance_Active = Table.FirstN(msmdsrv,1),
    #"Combined Binaries" = Binary.Combine(Select_Last_Instance_Active[Content]),
    #"Imported Text" = Table.FromColumns({Lines.FromBinary(#"Combined Binaries",null,null,1252)}),
    PortID = Table.TransformColumns(#"Imported Text",{},Text.Clean){0}[Column1],
    Database = AnalysisServices.Database("localhost: "&PortID, "", [Query="select * from $SYSTEM.DISCOVER_PROPERTIES"]){0}[Value],
    Import_SSAS = if Dax_Query = "" then AnalysisServices.Database("localhost: "&PortID, Database){[Id="Model"]}[Data][Data]{0} else AnalysisServices.Database("localhost: "&PortID, Database,[Query=Dax_Query]),
    #"Run R Script" = R.Execute("# 'dataset' holds the input data for this script#(lf)write.table(dataset, file='your filepath & filename.csv', sep=""\t"", col.names = TRUE, row.names = FALSE)",[dataset=Import_SSAS])
    in
    #"Run R Script"

    • Hey yes – that’s cool! Thx a lot Mike!
      This is the link where the script can be found that allows you to paste the DAX-query: http://www.thebiccountant.com/2016/04/09/hackpowerbi/
      So to wrap up:
      Mike’s script allows you to export anything from your current PBIX file (incl. DAX columns and measures !!) straight to csv or even SQL Server – with a little help from R 😉

  12. Hi, I cannot access R in Power Query – any hints? All the sites and downloads seem to be specific to Power BI.
