Comfort Functions for Easy Profit & Loss statements in Power BI and Excel – Part 2

Here come some long-awaited comfort functions for part 2 of my easy P&L series. In the first part I presented the general principle of how to work with a structure using an accounts-group-table. Today I will present 2 alternatives to define the reports without specifying single accounts. So if new accounts are added to the chart of accounts, you don't have to adjust your report definitions: Just make sure to fill in all the fields in your account-group-table and you're ready to go 🙂

No need to specify single accounts

So you only need to adjust your report definitions if you add new group items. If that’s still too much, take the 2nd solution, which will even eliminate that requirement:

  1. Individual Account Layout: Just define each subtotal and determine for which subtotals single accounts shall be shown

No more specification of individual accounts

How to use it:

How to use Individual Report Layout

  2. Ultrashort Account Layout: Further simplification of just defining the groups (hierarchy) that shall be shown (with an option to filter on one of them)

No need to define individual group items

How to use it:

How to use Ultrashort Layout

These 2 different layouts will both produce the same reports incl. all accounts – just like in the first example. So you can choose which layout-style suits you best – actually, you can use all 3 in parallel. You just have to make sure to grab your pivot-rows from the correct tables and, in Excel, to grab the matching measures, as each layout has its own bridge-table (which needs to be used in the measures):
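As a minimal sketch (the measure and bridge-table names here are my own, illustrative naming – the pattern follows part 1 of this series), each layout gets its own wrapped measure that filters through that layout's bridge-table:

// one wrapped measure per layout, each using its own bridge-table
ActIndividual := CALCULATE ( [Act], AccountsAllocationIndividual )
ActUltrashort := CALCULATE ( [Act], AccountsAllocationUltrashort )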


How it works


Easy Profit & Loss (and other account) statements in Power BI and Excel – Part 2

Welcome to part 2 of my series on easy Profit & Loss and other account statements in Power BI and Excel. In the first part I introduced the general principle of creating asymmetrically shaped reports that use just one measure per column (you should read that article first in order to understand this post).

How the technique works

This technique capitalises on the aggregation power of the VertiPaq engine and creates a bridge-table ("AccountsAllocation") between your DimAccount-table and the ReportsAccountsLayout-table. In it, every line of your report is matched with all accounts that belong to its (sub-)totals. This table can get very long, but the engine handles this easily.

Different use case: Account-groups-tables

In the first example we worked with a chart of accounts that had a parent-child hierarchy defining all the subtotals of the report. In this example we're working with a different setup, using the good old DimAccountsGroups-table: just one row per account, with the columns coming in pairs, containing the group criterion and the sort order for the report:
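A minimal sketch in M of what such a table could look like (all column, group and account names here are invented for illustration):

let
    // one row per account; each group level comes as a pair of columns:
    // the group criterion and its sort order for the report
    DimAccountsGroups = #table(
        {"AccountKey", "Group1", "Group1Sort", "Group2", "Group2Sort"},
        {
            {1000, "Revenue", 1, "Gross Profit", 1},
            {4000, "COGS",    2, "Gross Profit", 1},
            {6000, "Opex",    3, "EBIT",         2}
        }
    )
in
    DimAccountsGroups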

Individual Report-Layout

We also need a second table (ReportsAccountsLayout) that holds the definitions of the report-layouts like this:
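Again just a hypothetical sketch (the column names are invented – the actual layout-definition columns may differ):

let
    // one row per report line: where it sorts, which group it aggregates
    // and whether its single accounts shall be shown underneath
    ReportsAccountsLayout = #table(
        {"ReportLine", "Sort", "GroupColumn", "GroupValue", "ShowAccounts"},
        {
            {"Revenue",      1, "Group1", "Revenue",      true},
            {"Gross Profit", 2, "Group2", "Gross Profit", false}
        }
    )
in
    ReportsAccountsLayout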


Dynamic & bulk type transformation in Power Query, Power BI and M

This is just a quick code-share-of-the-day of different scenarios for dynamic type transformation of multiple columns at once.

The syntax to transform the format of 2 columns (“Column1” and “Column2”) in a table (“Source”) looks like this:
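Converting both to type text, for example:

Table.TransformColumnTypes(Source, {{"Column1", type text}, {"Column2", type text}})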

Now if you would like to do one of the following:

1. Force all columns of the table to be transformed to one type (text)

The second argument of Table.TransformColumnTypes is a list of lists, each of whose elements contains 2 items: the name of the column to transform and the type to be applied.

For this dynamic approach we start with a list of the table's column names (Table.ColumnNames) and transform it into a list of lists, using List.Transform with an each-expression and curly brackets again, like this: each {_, type text}. This operation iterates through every element of the list (List.Transform) and performs the action that follows the "each" on each element (which is represented by the underscore: _).
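Put together (a sketch of the whole step):

Table.TransformColumnTypes(Source, List.Transform(Table.ColumnNames(Source), each {_, type text}))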

2. Transform all newly added columns of a table to one specific type

Imagine you have a table with different column types where users can add new columns with random names. You want these columns automatically to be converted to text:
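A minimal, self-contained sketch (step and column names are invented for illustration – the original table is simulated in the first step):

let
    Source = #table({"KnownColumn"}, {{1}}),
    // users add columns with random names somewhere in between:
    AddedStuff = Table.AddColumn(Source, "NewColumn1", each 42),
    // newly added columns = current column names minus the original ones
    NewColumnNames = List.Difference(Table.ColumnNames(AddedStuff), Table.ColumnNames(Source)),
    // convert just those columns to text
    Result = Table.TransformColumnTypes(AddedStuff, List.Transform(NewColumnNames, each {_, type text}))
in
    Result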

Same procedure as the first, just that you need to identify the newly added column names first. For that you can use List.Difference with the two lists of column names to compare, as in the NewColumnNames step above.

3. Transform all columns whose name are in a list to one specific type

Let's close with the easiest case, which you'd probably be able to figure out by yourself: Say your query returns a dynamic list of column names somewhere, which shall then all be converted to a specific type:
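A sketch, with MyColumnNameList standing in for the dynamic list your query returns:

Table.TransformColumnTypes(Source, List.Transform(MyColumnNameList, each {_, type text}))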

You can directly reference the List as the first argument of the List.Transform-command.

You can download the file here: ChgTypeOfColumns.xlsx

Enjoy & stay queryious 🙂


Easy Profit and Loss and other (account) scheme reports in Power BI and Power Pivot using DAX

This is about an easy way to create typical finance reports like Profit and Loss using DAX that (unlike all other solutions I've come across so far) can be handled with very basic knowledge of this language.


The trick

The trick that makes my solution so easy lies in the fact that it requires no aggregation functions from the output mediums like:

  • pivot-tables: which struggle with asymmetric logic and are not available in Power BI so far
  • cube functions: which are not available in Power BI so far

So we have to build the details as well as all aggregations into the solution itself and must not rely on any aggregation functions. (For Excel this means: we have to turn off subtotals as well as grand totals in our pivot tables. For Power BI it means: hurray! Finally a solution where the lack of pivot-tables doesn't matter.)

How to

The aim is to create a table/matrix with (account) details and aggregations in the rows and different slices of time intervals or comparisons in the column sections. As for the columns, this will be covered by measures like [Actuals], [Budget], [PreviousPeriods], [Differences in all shapes…]. And – as the values within a column should all follow the same logic – I'd prefer to use only one measure per column – one that is fully sliceable and works on all (sub-)totals, of course. … Ok – so some dreams later I found it:

MyMagicMeasure := CALCULATE([StandardMeasure], AccountsAllocation)

So you just wrap simple measures like Act=SUM(Fact[Amount]), Plan=SUM(Plan[Amount]), DiffPlan_Act=[Plan]-[Act] … into the CALCULATE together with the bridge-table as the filter-argument:
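Spelled out as a sketch (the names of the wrapped measures are my own, illustrative naming):

// simple base measures
Act := SUM ( Fact[Amount] )
Plan := SUM ( Plan[Amount] )
DiffPlan_Act := [Plan] - [Act]

// wrapped versions that work on every report row incl. all (sub-)totals
ActReport := CALCULATE ( [Act], AccountsAllocation )
PlanReport := CALCULATE ( [Plan], AccountsAllocation )
DiffPlan_ActReport := CALCULATE ( [DiffPlan_Act], AccountsAllocation )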

This is the many2many-technique in its simplest form (PostFromMarcoRusso). It all goes via simple aggregation over all accounts found in the filter context.


Our bridge-table "AccountsAllocation" contains one row per simple account and multiple rows for the (sub-)totals – one for each account that belongs to them:
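A hypothetical extract (account numbers invented for illustration):

ConsKey    AccountKey_
1          1000
1          1100
1          1200
2          1000
3          1100

Here report row 1 would be a subtotal aggregating accounts 1000, 1100 and 1200, while rows 2 and 3 are single-account rows.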


The ConsKey stands for the row in our report (1) and the AccountKey_ holds the account numbers that are going to be aggregated (many for the sub-totals and 1 for the single-account rows). So all we need is this unique and simple aggregation on AccountKey for every row in the report – with a filter from the Reports-table via our bridge-table to DimAccounts, which then filters our fact tables: 1 -> many -> 1 -> many.


Advanced type detection in Power BI and Power Query: Table.ClassifyMixedColumnTypes

This is not a proper blogpost, just a quick share of a function I’ve created today which I think will be very useful for others as well:

Automatic type detection will assign only one type to a column, but sometimes there are multiple types in it, and then the type "any" will be applied, which means no type at all. In these cases, the types of the individual elements/rows in those columns are often the key to the necessary ETL-transformation operations to be performed. So you would like to be able to write statements like this:

Table.AddColumn(PreviousStep, "Event", each if Type.Is(Value.Type([MyColumn]), type text) then ThisEvent else AnotherEvent)

(read: add a column "Event" whose value depends on whether the type of the current row's value in MyColumn is text: then ThisEvent, else AnotherEvent)

My advanced type detection function will identify and allocate the different data types. It tries a couple of type conversion operations; in case of success it applies the type to the individual cell/record field, and in case of failure it moves on to the next try. If none of the type conversions succeeds, it returns the original state (any). It takes your table and a list of the column names on which it shall be applied as input parameters.

(Table, ListOfColumnNames) =>
let
    // set the selected columns to type text first, then try to convert each cell:
    // number first, then date, then text – if all conversions fail, keep the value as it is
    Table.ClassifyMixedColumnTypes.m = Table.TransformColumns(
        Table.TransformColumnTypes(Table, List.Transform(ListOfColumnNames, each {_, type text})),
        List.Transform(ListOfColumnNames, each {_, each try Number.From(_) otherwise try Date.From(_) otherwise try Text.From(_) otherwise _})
    )
in
    Table.ClassifyMixedColumnTypes.m

So this function will apply different data types (expand the list of tries as you like/need) within one column on a row-by-row basis – something I've been after for quite a while 🙂
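An invocation could look like this (assuming the function has been saved as a query named Table.ClassifyMixedColumnTypes; step and column names are illustrative):

Classified = Table.ClassifyMixedColumnTypes(Source, {"MixedColumn1", "MixedColumn2"})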

Thank you so much Bill Szysz for showing me how to use the List.Transform-trick in order to bulk-apply transformations!

If you're as lazy as me, you could be tempted to pass the full Table.ColumnNames of your table to the ListOfColumnNames parameter – but this might slow your query down! (Guessing – not much practical experience gained yet.)

A warning at the end: this is of course error-prone, as some strings like "3.2" are ambiguous and can get converted to a date although they should be numbers in the specific context (and vice versa). So you should check properly and – if needed – perform some additional transformations beforehand in order to guarantee the correct outcome. But at least in my case it did exactly what was needed.

Please share your thoughts/experiences with this function & stay queryious 🙂

How to analyse M-functions step-by-step?

This is a quick tutorial on how to analyse M-functions step-by-step.

Queries that have been transformed into a function (1) can be invoked in the query editor (2), and they will then return their output for you to check (3). But they will not show you how they did it. In the query editor, where you would normally see the single steps and their individual output when clicking on them, you will just see one line for the function and one for the invocation (4):

[Screenshot: a function query and its invocation in the query editor]

So what's the quickest way to make them reveal all their secrets for us to see?

Yes, we could edit the code and replace every parameter that has been passed. But this can get a bit tedious, especially if you haven't written the code yourself and don't know where and how often the parameters are used. A very quick and safe way is to replicate the parameters within the let-expression by copy-&-pasting like this:

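A minimal sketch of the idea (function and values invented for illustration): the parameter declaration is commented out and the parameters are redefined as ordinary steps at the top of the let-expression:

// (StartDate, EndDate) =>
let
    StartDate = #date(2017, 1, 1),
    EndDate = #date(2017, 1, 31),
    // from here on the original function body runs unchanged, step by step,
    // against the values defined above
    NumberOfDays = Duration.Days(EndDate - StartDate)
in
    NumberOfDays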

That way you just define them once and don’t have to care about how often they are used within the following query. And it is very quick to do as you can just copy the parameter-expression and comma-separate your inputs.


I bet you want to check this out on Chris Webb’s latest blogpost here: DynamicDateSelections.zip

Enjoy & stay queryious 🙂

Multiple replacements or translations in Power BI and Power Query

A common task when cleaning data is to perform multiple replacements at once. When dealing with "translation tables" you might have come across Chris Webb's solution using List.Generate to deal with this. This works fine – but might deliver unexpected results if you don't consider its implications properly. Say we add lines like those shown below; then we might expect the result that is highlighted in yellow:


Multiple replacements using 2 different techniques

So hey – what happened here? Our catamaran turned into a bearamaran and the chair turned into chwater.
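To see what's going on, here is a minimal sketch of the sequential-replacement idea (a List.Accumulate version rather than Chris's exact List.Generate code; translation entries invented for illustration). Each replacement is applied to the output of the previous one – and Text.Replace happily matches substrings:

let
    Translations = #table({"From", "To"}, {{"cat", "bear"}, {"air", "water"}}),
    // fold all replacements over the input, one after the other
    ReplaceAll = (input as text) =>
        List.Accumulate(
            Table.ToRecords(Translations),
            input,
            (state, current) => Text.Replace(state, current[From], current[To])
        ),
    Result = {ReplaceAll("catamaran"), ReplaceAll("chair")}  // {"bearamaran", "chwater"}
in
    Result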

How to hack yourself in Power BI (and Power Pivot?)

Reading Gerhard Brueckl’s post on how to visualize SSAS calculation dependencies reminded me of my post about a similar technique from December last year.

His solution has some features that my version would benefit from as well:

  1. Directly connect to the model to be analysed, without the clumsy export of measures to txt via DAX Studio
  2. Include calculated columns? No!: Who does calculated columns in DAX in Power BI anyway? Do them in the query editor using M instead (more functions, better compression, easier migration of the model to SSAS once needed)

So wouldn’t it be cool if we could just add a documentation page to our current model – “all in one” so to speak? Here you find how to “hack”-connect with Excel to your current Power BI Desktop-Model.

So what works with Excel should work with Power BI as well – just that we need to connect via the query editor, using M. And of course: as we're hacking ourselves here (i.e. the file we're currently working on), we need to save our changes in order for them to show up.
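A hedged sketch of what such a connection could look like (the port number and database name are placeholders: the local Analysis Services instance behind Power BI Desktop listens on a random port that changes with every session and hosts one database with a GUID name – both can be looked up e.g. with DAX Studio):

let
    // DMV query against the model of the currently open Power BI Desktop file;
    // replace port and database name with the values of your session
    Source = AnalysisServices.Database(
        "localhost:12345",
        "guid-of-your-model",
        [Query = "SELECT * FROM $SYSTEM.MDSCHEMA_MEASURES"]
    )
in
    Source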
