How Power Query can return clickable hyperlinks with friendly names to Excel

When you use Power Query as an Excel automation tool rather than just to feed the data model, you might want to return clickable hyperlinks that carry friendly names. This doesn’t work out of the box, but with a little tweak it works fine:

The trick

Return a text string that contains the Excel (!) formula for hyperlinks, preceded by an apostrophe ( ' ). After the data has been loaded to the sheet, select the column and replace '= with = to activate your Excel formula:

Activate the HYPERLINK formula by replacing ‘= with =

You can then format the column to “Hyperlink”:

Format as Hyperlinks

And make sure that future refreshes are automatically formatted as well:

Make sure format sticks

This of course works with all other Excel formulas as well. So if you’re still struggling a bit to calculate your numbers in Power Query, you can simply create a column that holds your Excel formula, which you “activate” by the replacement 🙂
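For example, to hand a calculation back to Excel, a column like the following would work (the column “Amount” and the ROUND formula are made-up illustrations, not part of the workbook above):

// hypothetical example: returns the text  '=ROUND(123*1.19, 2)  which becomes a live Excel formula after the replacement
Table.AddColumn(Source, "ExcelGross", each "'=ROUND(" & Text.From([Amount]) & "*1.19, 2)")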

The formula

This formula creates the new column, bringing the content of the columns “URL” and “Friendly Name” into the Excel formula:

Table.AddColumn(Source, "ExcelLink", each "'=HYPERLINK(""" & [URL] & """, """ & [Friendly Name] & """)")

Quotation marks and escape signs

You’ll probably notice the multitude of quotation marks: in M, a quotation mark marks both the start and the end of a text value. So what happens when the text itself needs to contain a quotation mark, like here: ‘=HYPERLINK(” ? How do you prevent that quotation mark from being mistaken for the end of the text value, which would cut the string off at ‘=HYPERLINK( ? You escape it by doubling it: two quotation marks inside a text value keep one literal quotation mark in the resulting text string.
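A quick illustration of the doubling rule (a throwaway example, not from the workbook):

// two quotation marks inside a text value produce one literal quotation mark
Example = "'=HYPERLINK(""http://example.com"", ""Click me"")"
// evaluates to:  '=HYPERLINK("http://example.com", "Click me")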

Subscribers can download the file here: ClickableHyperlinks.xlsx

Enjoy & stay queryious 🙂

How to open a complex JSON record in Power BI and Power Query

Today I’ll show you a very useful technique for dealing with a JSON record that contains a wild mixture of different elements like this:

If you click on one of the expandable elements, their content will be shown, but you’ll lose all the “surrounding” information (metadata) that is visible now. This is often an issue, regardless of whether you want to create multiple tables from it to build a star schema or just need a handful of fields or a denormalized table. But with a little help from M, you’re good to go:

Table.FromRecords( { MyJsonRecord } )

will return this:

With this move, every expansion of one of the expandable elements will keep the existing data in place:

Create one big flat table

Simply expand one element after another to create a denormalized table.

Create star schema

For multiple tables, keep this query and reference it to create your (sub-)tables. Always keep the Id column as the key (!) to combine all the tables in your data model later. (Provided you use this in a function for multiple entities/series.)

Best is to play with it yourself, so just paste some sample code into the Advanced Editor:
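Here is a minimal sketch you could use, with a purely made-up record (the field names are invented, just to mimic a JSON record with mixed elements):

let
    // hypothetical record: scalar metadata next to an expandable record and an expandable list
    MyJsonRecord = [
        Id = 1,
        Name = "Sample",
        Details = [Colour = "red", Size = "L"],
        Measurements = {[Date = "2020-01-01", Value = 10], [Date = "2020-01-02", Value = 12]}
    ],
    // wrapping the record in a list keeps the scalar fields in place as columns
    AsTable = Table.FromRecords( { MyJsonRecord } )
in
    AsTable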

 

If your JSON-record has a different structure with “just” header and data in different fields, this technique will be more suitable for you: http://www.thebiccountant.com/2016/04/23/universal-json-opener-for-quandl/

Enjoy & stay queryious 🙂

How to import from Excel with cell coordinates in Power Query and Power BI

There might be occasions where you want to import data from Excel into Power Query or Power BI using cell coordinates, like a range from E3 to G9 for example (“A1 cell reference style”). The function I provide below also caters for the potential pitfalls of this task that Maxim Zelensky has described in his article.

Background

If your worksheet has one leading empty row and column, the import will ignore them and automatically return the range starting from B2. So to fetch the range E3:G9 you have to delete the first row and the first 3 columns. But as Maxim has found out, leftover formatting on empty cells leads to those empty rows and columns being imported as well. So the number of rows and columns to delete will vary and is hard/impossible to predict.

Method

The range that PowerQuery or PowerBI will import is already stored in the Excel file’s sheet data, and the XML looks like this (“Sample3” from Maxim’s data):

The imported range is E1 to J12, as the first rows contain formatting instructions and will therefore be imported as well. In the 3rd row, E3 shows up with the first value, which is enclosed in “<v>” tags.

This is how it looks in the Xml.Table in the query editor:

The task is to calculate the number of rows and columns to delete, considering the individual offset that is caused by the formatted empty cells.
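Just to illustrate the arithmetic with made-up coordinates (the downloadable function derives all of these numbers from the sheet XML automatically): if the import effectively starts at E1 and the desired range is E3:G9, the offsets work out like this:

let
    importedFirstRow = 1,   // the import effectively starts at E1
    importedFirstCol = 5,   // column E
    desiredFirstRow  = 3,   // E3
    desiredLastRow   = 9,   // G9
    desiredFirstCol  = 5,   // column E
    desiredLastCol   = 7,   // column G
    rowsToSkip = desiredFirstRow - importedFirstRow,    // 2
    rowsToKeep = desiredLastRow  - desiredFirstRow + 1, // 7
    colsToSkip = desiredFirstCol - importedFirstCol,    // 0
    colsToKeep = desiredLastCol  - desiredFirstCol + 1  // 3
in
    [SkipRows = rowsToSkip, KeepRows = rowsToKeep, SkipColumns = colsToSkip, KeepColumns = colsToKeep]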

Code

So I’ve cooked these ingredients together into a pretty massive piece of code that you can download here: fnImportFromExcelCellCoordinates.txt

How to use the function


New M-function: Table.TransformColumnTypesToFirstRowsTypes for PowerBI and PowerQuery

The following function automatically transforms all columns to the types that have been detected in the cells of the table’s first row, provided they come as number, date or text (but you can add additional type conversions if you need them).

It also has some rough edges: if the first value is empty, the column will be converted to text. Also, it contains the (improved) logic from this article: so if a date is written in a way that it could also be a number, it will be converted as a number. To minimize the room for errors here, I’ve converted the values to text first, but this is still something to watch out for. In very many cases, though, it will just do what you have long been looking for:

Use cases:

  • You don’t want to use the automatic but static/hard-coded type conversion in the 2nd step (because you know your table is going to have more columns in the future and you want to cater for proper type conversion of them as well)
  • You’ve lost your column types due to some other command (like Table.ReplaceValues)

M-Code

Code to download: TableColumnTypesToFirstRowsTypes.txt
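The full logic is in the download above; just to give an idea of the approach, here is a minimal sketch (not the downloadable code, and the sample table is invented) that detects the type of each value in the first row and builds the transformation list from it:

let
    // hypothetical sample table whose column types have been lost (all "any"),
    // but whose values still carry their native types
    Source = #table({"Date", "Amount", "Name"},
                    {{#date(2020, 1, 1), 100, "A"}, {#date(2020, 1, 2), 200, "B"}}),
    FirstRow = Table.First(Source),
    // map each column name to the type found in its first-row value
    Transformations = List.Transform(Table.ColumnNames(Source), (col) =>
        {col,
         if Value.Is(Record.Field(FirstRow, col), type date) then type date
         else if Value.Is(Record.Field(FirstRow, col), type number) then type number
         else type text}),
    Result = Table.TransformColumnTypes(Source, Transformations)
in
    Result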

 

Enjoy & stay queryious 😉

How to expand a column that cannot be expanded in Power BI and Power Query in Excel

Especially when working with JSON data, you might end up with a column that has elements of mixed types in it. The expand-column arrows will be missing, but some elements still need to be expanded, like here:

But there is an easy way to fix it:

Transform to expandable column

Table.TransformColumns(Source, {{"Column1", each if Value.Is(_, type list) then _ else {_} }} )

It transforms “Column1” of table “Source” by checking whether the content of each row ( _ ) is of type list: if it is, that value ( _ ) is kept, and if not, it is turned into a list (by framing it in curly brackets: {_} ).

Syntax for tables

Table.TransformColumns(Source, {{"Column1", each if Value.Is(_, type table) then _ else #table({"Column1"}, {{_}} ) }} )

Syntax for records

Table.TransformColumns(Source, {{"Column1", each if Value.Is(_, type record) then _ else [a=_] }} )
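As a minimal, self-contained illustration of the list variant (with made-up sample data, not the file from this post), you could test it like this:

let
    // hypothetical sample: some rows hold lists, others plain scalar values
    Source = #table({"Column1"}, {{{1, 2, 3}}, {"single value"}, {{4, 5}}}),
    // wrap every non-list value into a list so the column becomes expandable
    MadeExpandable = Table.TransformColumns(Source,
        {{"Column1", each if Value.Is(_, type list) then _ else {_}}}),
    // now the column can be expanded to new rows
    Expanded = Table.ExpandListColumn(MadeExpandable, "Column1")
in
    Expanded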

File for Subscribers to download: HTExpandColumnThatCannotBeExpanded.zip

This technique should be applied to columns where the expandable elements all have the same structure. If that’s not the case, you should use this technique instead.

Enjoy & stay queryious 😉

Table.TransformColumns – alternative in PowerBI and PowerQuery in Excel

This article shows a trick for a little problem that annoyed me for quite some time: how to get Table.TransformColumns to transform the values of a column with a reference to a value (from the same row) of another column?

1 Replace text by a value from another column

So instead of adding a new column where the “*” is replaced by the value from the column “WildcardValue”, I just want to perform the replacement operation in the original “Text” column, so that I don’t have to rename and delete the other columns later:

So far, I had always ended up fighting with the Table.TransformColumns function and got quite frustrated because I couldn’t find a way to reference the (row) value of the other column. As it turns out, I was fighting the wrong target here, because Table.ReplaceValue is actually the saviour for this challenge:

1.1 Check column “Text” -> Transform -> Replace Values

So in (4) we just fill in a dummy value, which we’re then going to replace with a reference to the desired column: “each [WildcardValue]”, like this:
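The resulting step could look something like this (a sketch, assuming the value to search for is the “*” wildcard and using the column names from above):

// replace the "*" in column "Text" with the value from column "WildcardValue" of the same row
ReplacedValue = Table.ReplaceValue(
    Source,
    "*",                     // value to search for
    each [WildcardValue],    // replacement: a reference to another column of the current row
    Replacer.ReplaceText,    // replace occurrences within the text
    {"Text"}                 // only apply this to the "Text" column
)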


Non-linear Break-Even Analysis in PowerBI

A break-even analysis tells you at which value of the parameter in question your profit calculation will turn positive (link). Here we need to sell at least 173 units at a given price of 20 before we’ve recovered all our costs:

If your variable costs are constant, you can solve it by this formula:

BreakEvenQuantity = Total Fixed Costs / (Unit Sale Price - Unit Variable Costs)

You’ll find tons of examples of how to do this in Excel, like here.
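A quick worked example with made-up numbers (not taken from the dataset below): with total fixed costs of 3,000, a unit sale price of 20 and constant unit variable costs of 2, the break-even quantity is 3,000 / (20 - 2), rounded up to 167 units:

let
    TotalFixedCosts   = 3000,
    UnitSalePrice     = 20,
    UnitVariableCosts = 2,
    // round up, as you cannot sell a fraction of a unit
    BreakEvenQuantity = Number.RoundUp(TotalFixedCosts / (UnitSalePrice - UnitVariableCosts))
in
    BreakEvenQuantity   // 167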

Non-linear cost structure

But in real life, the variable costs often depend on certain quantities, as you get discounts for purchasing large amounts. The following table shows a cost structure with fixed costs in row 1. The 3rd column, “FixOrQty”, indicates whether the cost item is fixed or dependent on the quantity (Qty). The 2nd row contains a variable cost that is constant at 2 for all quantities. Rows 3 and 4 show a variable cost of 8 for quantities up to 100; if you purchase more than 100, the cost is lowered to 2 for all additional quantities. Rows 5-7 have a similar structure, but with 3 quantity ranges:

Solve with goal-seek algorithm


Import multiple files from Dropbox folder into PowerBI and Excel (via PowerQuery) at once

Below you’ll find a video where you can see how easy it is to import multiple files from a Dropbox folder into PowerBI or Excel at once.

There are 2 different methods to grant access to your Dropbox: grant access to the whole Dropbox or to a (newly created) folder only. I will present the folder method, as granting access to your whole Dropbox is really dangerous in my eyes, unless you are prepared to share it all publicly: the generated token will allow anyone to read your data. That includes all those people to whom you’ve sent that beautiful dashboard, forgetting that it contained your token…

To make it super-easy for you, I’ve created a function that you can download here: fnDropboxFolder

The code for it comes from this thread in the PowerBI forum, which contains some additional useful information and a link to a solution with a custom connector for PowerBI (designed by Igor Cotruta), making it easy to deploy in a corporate environment.

Just watch how it works:

Some screenshots to follow along:


Bill of Material (BOM) Explosion Part2: Costing in Excel and PowerBI

Following up on the BOM explosion: a comment reminded me that I had forgotten to present the costing techniques to calculate the total costs of each (sub-)product.

Reversing the aggregation direction

What I had shown was how to “aggregate” from parent level down to child level to retrieve the total quantity of each component within a BOM (“How many of each (sub-)component do we have to order (or build) for that bike?”) (1).

Now we reverse the aggregation direction and aggregate the total (!) quantities back up to the parents (2).

And as this alone doesn’t make too much sense economically, the second aggregation will be of their prices (3). This will give us the sum of all part costs (“How much will the order of all the parts cost us?”). This is also very useful for planning purposes or for reconciling prices of intermediate products with your master data.

And if your model holds sales data as well, you can calculate the total costs of your total sales within each period (4).

VAR 1: Using classical hierarchies


Performance difference between Excel and PowerBI with M is huge!

I knew that the performance of M in the query editor of PowerBI was much better than in Excel, but only today did I discover how incredible the difference actually is:

If you want to apply the BOM solution I’ve posted here, you’ll soon discover that the performance in Excel starts to suck with large datasets. Performance decreases exponentially, and my sample dataset with 4 levels and 100k rows didn’t go through: 16 GB of RAM constantly at the limit, unable to do any other task at the same time.

In contrast, performance in PowerBI totally blew me away: memory management is different. The rise in RAM consumption always stayed below 3 GB, even with my largest dataset (a 5-level BOM table with 1 million (!) rows that exploded to 3.8 million rows). The CPU didn’t break a sweat either, so I was easily able to perform other tasks on my laptop at the same time.
