Create a 445 Calendar using Power Query

March 19, 2015

by Matt Allington

In my last blog post I showed how easy it is to create a standard calendar for Power Pivot using Power Query.  Most of my customers, however, don’t use a standard calendar but instead use a 445 calendar (which is very common in the retail industry).  A lot of people shared their 445 calendars with me after my last post, and that reminded me that although the concept of a 445 calendar is very common, everyone seems to have different rules on how the calendar works.

Differences in 445 calendars include:

  1. What month you start the financial year
  2. What day of the week is the start of the week.
  3. 365 days divided by 7 gives 52 weeks with 1 day left over (2 in a leap year).  Different companies handle the extra day(s) in different ways.

So there are almost as many permutations as there are companies, and I don’t want to go down the path of trying to write one calendar that will work for everyone.  Instead, this post covers the techniques I used to solve the 445 calendar problem for one of my customers.  If you are so inclined, you can combine these techniques with some of your own to meet your own 445 needs.  To give you an idea, it took me about an hour to think through the problems, research the functions and build a working calendar.  If you copy my techniques, you should get a head start on your own 445 calendar.

The rules of this particular calendar are:

  1. The first day of the financial year is the Monday on or before 1 May.
  2. The number of weeks in each month follows the pattern 454 (4 for May, 5 for June, 4 for July), which then repeats.
  3. The calendar weeks, months and years mirror the financial data.  So Mon 31 Dec 2013 is the start of financial week 36.  Even though the date is actually in the year 2013, it is treated as part of the first calendar week of 2014.
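To give a flavour of how rule 1 can be handled, here is a sketch in Power Query M that finds the Monday on or before 1 May for a given year.  This is my own illustrative query, not the one from the sample workbook:

```
let
    // Monday on or before 1 May for a given year.
    // With Day.Monday passed as the first day of the week,
    // Date.DayOfWeek returns 0 for a Monday, so subtracting the
    // offset rolls any date back to the preceding Monday.
    StartOfFinancialYear = (year as number) as date =>
        let
            FirstMay = #date(year, 5, 1),
            Offset   = Date.DayOfWeek(FirstMay, Day.Monday),
            Monday   = Date.AddDays(FirstMay, -Offset)
        in
            Monday
in
    StartOfFinancialYear
```

For example, 1 May 2015 fell on a Friday, so StartOfFinancialYear(2015) gives Monday 27 April 2015.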

Here is how I did it.  But first a word of warning.  The steps are easiest to follow if you first download the sample workbook (link at the bottom) and step through the Applied Steps as you read my explanations in this post.  If you are not a Power Query expert and you just read the post, it will be quite hard to understand.  So do yourself a favour: download the workbook and step through it as you read the post.


What is Power Pivot’s #1 Competitor?

January 21, 2014

Tableau versus Excel.  Not Tableau versus Power Pivot.  That is telling, yes?

This Picture is a Hint.  An Admittedly Annoying Hint That Hounds me on Facebook.

“OK, way to make it super-obvious, Rob.  It’s Tableau, right?”

Actually, no.  It’s not Tableau.  And the Tableau advertisement above basically proves my point.

By far, the biggest “competitor” to Power Pivot is…  Excel itself. 

In other words, lack of awareness that Power Pivot even EXISTS is still the biggest “competitor” to Power Pivot today.

The Tableau marketing department is smart.  They know that “normal” Excel is their chief competitor.  And they know that “normal” Excel has some frustrating weaknesses when it comes to data analysis.

So they go right for the throat.  I salute and admire their savvy.  Which brings me to a movie quote.



September 5, 2012

Guest post by David Churchward [Twitter]


Gantt Hours measure debugging with DAX Studio – isn’t it pretty!

At the end of my last post in the Gantt Chart series, GANTT CHART WITH RESOURCE LOADING, I said that I would return to explain how the Hours measure worked.  I took a brief diversion (a foray, almost) into the world of CRM PIPELINE FUNNEL CHARTS, but I’m now back to explain the workings of that Hours measure.  You can revisit PART1 and PART2 of this series to recap.

In the interim, I’ve been working on some “fine tuning” of that Hours measure.  I’ll come onto the final solution in due course, but I’ll work through the workings of the Hours measure as we left it initially to explain why it needed tuning up!

For those amongst you with an appreciation of SQL, you might think of this as creating the DAX equivalent of a LEFT JOIN between tables, with a BETWEEN thrown in for good measure.


CRM Pipeline Funnel Chart (and something on Themes)

September 3, 2012

Guest post by David Churchward [Twitter]

Pipeline Funnel Chart in Excel PowerPivot

Pipeline Funnel Chart

For those of you waiting on the explanation of the SUMX / SUMMARIZE measure in GANTT POST 2, I have to put you on hold for a while longer.  Sorry.  In truth, I’m working through some performance aspects with that measure.

For now, I thought I would take the opportunity to expose the Pipeline Funnel Chart.  It’s really quite straightforward, but it’s incredible how few people know that it’s available to everyone, doesn’t require any real manipulation and YES, it’s a standard PowerPivot chart!

For those of you that want to dig straight in, YOU CAN DOWNLOAD THE WORKBOOK HERE


Gantt Chart with Resource Loading

August 29, 2012

Guest post by David Churchward [Twitter]

Gantt Chart with Resource Loading Report

Gantt Chart with Resource Loading Report

The response to POST 1 of this Gantt Chart mini series was swift and it didn’t take long for Vegard to contact me with a query regarding resource loading.  Vegard had quickly taken the template and adapted it to include resource hours.  The issue existed with getting the total hours for each resource to behave the way he wanted.  Naturally, individuals will normally work for a set expected amount of time per day and if tasks are planned which exceed that expectation, it’s obvious that they might not get the work done!

In this post, I’ll start to explain how to adapt the model from the previous post to include resource loading and associated subtotals per individual resource.


Gantt Chart in PowerPivot

August 28, 2012

Guest post by David Churchward [Twitter]

Gantt Chart in PowerPivot

Gantt Chart in PowerPivot, fully dynamic and sliceable!

It’s a rare diversion from the normal Financial stuff that I subject you to!  Having run numerous projects and found myself writing endless task lists in Excel so that I can distribute to other people, it suddenly occurred to me that PowerPivot can do this.

Not only can PowerPivot do Gantt charts, I think it destroys other software in this game.  Sure, MS Project is always going to be there and I’m very fond of it, but is anyone familiar with this:

“Thanks for sending that through, but I don’t have MS Project.  Can you give me an Excel version or PDF?”

Maybe you haven’t, but I find myself continuously confronted with distributing Excel task lists and producing my own Gantt charts in Excel to grab screenshots for presentations.  PowerPivot gives me (and you) the means to monitor and update tasks quickly and easily, but it also gives you the means to distribute via SharePoint – awesome!

If you want to just get on with it, GRAB THE FILE HERE and start using it.  There are some brief instructions on how to use it in the file.  Otherwise, read on …


Tiered Commissions, Income Taxes, and other “Tiered Rate” Calculations in PowerPivot

July 23, 2012

Guest post by David Churchward [Twitter]

Commissions Part 3 Report

I mentioned in my LAST POST that I would return with a variation on that commission calculation.  Two months later and very much overdue, here it is!  Apologies to those of you who have been waiting on me for this.

I left off with a calculation that applied a commission rate based on a rates table.  That rate was applied to the total value.

But what about a scheme where one rate applies to the first N dollars (or pounds, euros, etc.), then another rate applies for the next M dollars, and so on?  Income taxes are calculated this way, for instance.  And being able to solve this sort of thing in Excel very efficiently (using PowerPivot) will be generally quite useful.

In this post, I’ll explain how to do this in a model that I refer to as Base Plus.  And while I will discuss this purely in the context of tiered commissions, keep in mind that it applies to taxes and many other things as well.
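As a taster of where this is heading, the core of a tiered calculation can be sketched in DAX.  This is my own illustrative sketch, not the measure from the post: the Rates table and its Band_From / Band_To / Rate columns are assumed names, and the two-argument MIN / MAX variants need a modern DAX engine:

```
Base Plus Commission :=
SUMX (
    Rates,
    Rates[Rate]
        * MAX ( 0, MIN ( [Sales_Value], Rates[Band_To] ) - Rates[Band_From] )
)
```

Each band contributes its rate applied only to the slice of [Sales_Value] that falls inside that band, which is exactly how income tax brackets work: with bands of 1% up to £10,000 and 2% from £10,000 to £20,000, sales of £11,000 would yield 1% of £10,000 plus 2% of £1,000.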


Commission Calculations in PowerPivot Part 2

May 15, 2012

Guest post by David Churchward [Twitter]

Team and Manager Commissions Report

You may recall in my last post, COMMISSION CALCULATIONS IN POWERPIVOT, we got to the point where we could dynamically calculate the sales value and attributable commission rate that should be applied based on time, value and team parameters, reading from a Rates table.

In this post, we’ll complete the commission calculation, providing a different value for individuals and the team manager.

Where are we?

Just to recap, we got to the point in my last post where we had calculated [Sales_Value] and [Comm_Rate] as below:


Commission Calculations in PowerPivot

May 9, 2012

Guest post by David Churchward [Twitter]

Commissions Report

Firstly, I have to be clear that I’m not presenting a “one-size-fits-all” approach to sales commission calculations here.  That wouldn’t be possible because commission schemes vary extensively and, in my experience, some schemes aren’t even based on a logic that can derive a mathematical answer!  However, I’m presenting an approach here that will hopefully provide a template approach that can be modified to a number of different scenarios.

The Target Outcome

The scheme that I’m using in this example operates as follows:

  1. Commission is paid monthly based on the achievement in that month
  2. As a salesperson sells more, accelerators trigger.  That is to say, for example, a salesperson may receive 1% of sales up to £10,000 and 2% of sales between £10,000 and £20,000.  The 2% is payable on the whole value.  Therefore, if a salesperson sells £11,000 then they would receive 2% of the full £11,000.
  3. Percentages and bands can change monthly at management discretion
  4. This is a monthly threshold so each salesperson is reset to zero at the start of each month.
  5. Each product group carries a separate set of bands and rates
  6. Managers receive commission at a different rate to the sales team based on the total sales for their team.
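To give a feel for rule 2, the accelerator lookup can be sketched in DAX along these lines.  The Rates table and its column names here are my own illustrative assumptions, not the measures built in the post:

```
Comm_Rate :=
CALCULATE (
    MAX ( Rates[Rate] ),
    FILTER (
        Rates,
        [Sales_Value] >= Rates[Band_From]
            && [Sales_Value] < Rates[Band_To]
    )
)

Commission := [Sales_Value] * [Comm_Rate]
```

With bands of 1% up to £10,000 and 2% from £10,000 to £20,000, sales of £11,000 land in the second band, so the commission would be 2% of the full £11,000, i.e. £220.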


Aged Debtors–Dynamic Banding in DAX

April 17, 2012

Guest post by David Churchward [Twitter]

Aged Debtors Report

Having hit the P&L and Cash Flow in previous posts, it seems only reasonable to move on to Balance Sheet aspects.  The die-hard “non-accountant” Excel Pros and programmers amongst you are probably experiencing a sudden bout of narcolepsy, but let me assure you that this gives us the perfect opportunity to explore dynamic banding in DAX, so please prop those matchsticks in place for the minute!

In this post, I’ll construct a dynamic Aged Debtors report.  This will calculate debt ageing values, showing outstanding debts by customer at user defined points in time.

The Accounting Terminology Bit

Let’s get this done quickly.  Debtors are those customers (normally) who owe the company money.  This is often legitimate as it is normal to offer a customer a period of time to process your invoice and pay.  However, it’s not unusual for this period to extend for a number of reasons.  I won’t go into this now because I could waffle on too long with experiences in this area and I’ve never even worked in Credit Control!

In short, an aged debtor report details all outstanding debts, by customer, categorised into timeframe buckets to show how old the debt is.

Historically, it wasn’t unusual for an aged debtor report to be something that could only be run as at “now”.  This means that accountants everywhere had a tiny window of opportunity to capture this information and preserve it in their archives with a wealth of information lost in hidden folders everywhere.  PowerPivot gives us the opportunity to derive a report that holds all of this information with historical ageing available based on user selection and the ability to profile customers to see who tends to take their time to pay and maybe highlight some reasons why.

The Dataset

The transaction process probably looks something like this:

  1. Invoice is created at a point in time
  2. Payment is received from the customer and a payment transaction is created
  3. Payment is “allocated” to the invoice

Allocations Fact Table

Aged Debtors Allocations Fact Table

My core table here is an Allocations table.  This table details all relevant transactions including invoices (and credit notes), payments and allocations.  These transactions might look like this:

Aged Debtors Invoice

An invoice is raised for £10k on 15th March 2012 with an invoice reference (or invoice number if you like) of 1100.

Aged Debtors Payment

Payment is received one month later with a reference of PAY1

Aged Debtors Allocation

The invoice and payment are allocated against each other meaning that the system has linked the payment PAY1 with the invoice 1100 and the outstanding debt is therefore zero and the payment is fully cleared down.

Customers Table

My customers table simply holds customer code and name, but it might hold other information about the customer such as customer contact and credit limit information.  Unfortunately, I can’t display this table here as it holds real customer information alongside the scrambled name that I ultimately show on this report… sorry!  (It sort of defeats the object if I give that away in a screenshot!)

Bands Table

Outstanding debts need to be categorised into distinct bands.  This is normally representative of months, but it can be anything.  I’ve created a linked table in Excel which provides a Name together with From and To parameters.  For example, 0-30 represents debts that are 0 (From) to 30 (To) days old.

Aged Debtors Bands Table

Dates Tables

I use 2 dates tables.  One is linked to the allocation date and the other is linked to the invoice date.  I’m using V1 for this so I need both tables, but in V2 you can hold one dates table and then use the V2 function USERELATIONSHIP to choose which field links through to the Dates table.

These tables are simply a list of sequential dates covering the timeframe of my analysis.  The invoice dates table holds an additional column called Month_End_Date simply to use as a logical slicer to put on my report.

Table Relationships

My relationships look like this:

Aged Debtors Relationships

You’ll notice that the Alloc_Fact table links through to Customers and both of the Dates tables, but there is NO relationship to Bands.  Bands is a standalone table that is used for report headings and parameters without any defined relationship.

Onto the DAX

My report is going to hold a slicer for Month_End_Date.  This is an “effective” run date.  That is to say, the user is going to select a time point up to which transactions and allocations will be included; any transactions after that date should be ignored.

Therefore, I create a measure called Debtor_Value which aggregates underlying transactions and allocations up to that date.
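The measure itself appeared as a screenshot in the original post.  As a rough sketch of the pattern described below (the Alloc_Fact value column name is my assumption), it looks something like this:

```
Debtor_Value :=
CALCULATE (
    SUM ( Alloc_Fact[Value] ),
    FILTER (
        Alloc_Dates,
        COUNTROWS (
            FILTER (
                Alloc_Dates,
                EARLIER ( Alloc_Dates[Date] ) <= MAX ( Inv_Dates[Month_End_Date] )
            )
        ) > 0
    )
)
```

The nested FILTER creates a second row context over Alloc_Dates, which is why EARLIER() is needed to reach back to the outer row’s date while MAX() picks up the slicer selection.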

This measure uses a FILTER() on the Alloc_Dates table, using COUNTROWS() to specify those dates that fit the criterion of being prior to the date selected on the slicer.

Why use FILTER() and COUNTROWS()?

It’s a reasonable question to ask why I can’t use a simple Month_End_Date filter as opposed to using FILTER() and COUNTROWS().  I have to hold my hands up and put a call out to Rob and The Italians for a technical reason, but in my mind, I know that I’m using the same field (Month_End_Date) in the evaluation, once from the slicer and once from my row set.  To distinguish between the two, I need to use EARLIER() to reference my row set and MAX() to call the value from my slicer.  To do so, I have to use a filter context that creates a table expression to identify all of the dates that ARE to be used in my measure.

Incidentally, if you want more on EARLIER(), Rob’s Rat Analytics sums it up.

Thanks to Alberto Ferrari here as this filter expression is something I picked up from him on the Mr Excel forum.

Aged Debtors UnAged

Irrespective of the technical explanation, I know that these values are correct, but the allocation to specific ageing buckets is obviously NOT.

Ageing Debtors

In order to allocate invoices to the correct ageing buckets, we need to calculate how old the invoice is and then calculate which bucket this relates to.  The age of the invoice has to reference the selected Month_End_Date.
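The original measure was again a screenshot.  A hedged sketch of the banding logic, with column names assumed and simplified relative to the MAXX construction discussed below, might look like:

```
Debtor_Value_Aged :=
IF (
    COUNTROWS ( Bands ) = 1,
    CALCULATE (
        [Debtor_Value],
        FILTER (
            Inv_Dates,
            MAX ( Inv_Dates[Month_End_Date] ) - Inv_Dates[Date] >= MAX ( Bands[From] )
                && MAX ( Inv_Dates[Month_End_Date] ) - Inv_Dates[Date] <= MAX ( Bands[To] )
        )
    )
)
```

Subtracting two dates in DAX yields the difference in days, so each invoice date is kept only when its age relative to the slicer’s Month_End_Date falls inside the single band in context.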

The COUNTROWS() evaluation determines that we have one band to work with.  This eliminates any confusion over the aggregation level that we’re working at and ensures that we will get one result for the FROM and TO elements of the band.

CALCULATE() then takes over to use our previous measure [Debtor_Value] within a filter context that determines the invoices (or rather invoice dates) to use.

The FILTER() element of this expression is where things get tricky.  If I’m honest, in the first iteration of this measure, I tried to filter invoices.  My rationale was that each invoice carried a specific date.  From this date, I could determine how old the invoice was at any given point in time.  And, you’ll be pleased to know that it worked.  But there was a problem!  Because I’m carrying over 300,000 invoices in my dataset (which isn’t actually that many), any SUMX or MAXX is going to iterate over 300,000 records and therefore may not be very quick.  In addition, as my invoice count increases, so my performance is going to continue to degrade.

In an enlightened moment, I realised that I only had roughly 1,500 dates and the number of dates increases at a much slower rate than the number of invoices.  Could I use the same approach purely on dates and ignore the invoice number?  Ehhh………YES!

Essentially, MAXX is iterating over the table Inv_Dates and working out if the difference, in days, between the selected Month_End_Date on the slicer and Inv_Dates[Date] is greater than or equal to the Bands[From] AND also less than or equal to the Bands[To].  Where this is the case, this measure uses the value from [Debtor_Value] subject to the FILTER().  Where this is NOT the case, the measure uses the value from [Debtor_Value] but DOESN’T apply the FILTER() expression.

And, there you have it!

Aged Debtors Report

What’s Next

With these measures in place, wouldn’t it be great to mash this up with an invoicing profile and maybe even a dissection by product or industry sector to give a true profile on a customer?  I’ll be back with a load of mashups on this as soon as I can.

Incidentally, as I alluded to earlier, this analysis is a real dataset so I unfortunately can’t release the Excel.  Sorry.

My Experiences with Hosted PowerPivot, Part One

February 21, 2012

Guest post by David Churchward

Pivotstream Dashboard Application

One of Azzurri Communications Ltd’s PowerPivot
Applications Running in the Browser

Six Months Ago:  The “Lightbulb” Moment

Almost exactly six months ago, after being a long time reader of this blog, I emailed Rob and asked him a question regarding something that I just couldn’t get my head around in DAX – Banding!  He kindly responded, and his answer solved my problem, so I asked him if I could return the favour somehow.  He asked if I’d mind writing it up as a guest post, which I did.

Now, double-digit guest posts later, I’m amazed at how far I’ve come in short order.  Something definitely “clicked” for me that day, and my grasp of PowerPivot’s capabilities expanded rapidly.  It felt like that moment that I imagine pianists reach where they can suddenly play by ear, because whilst I could conquer most things in DAX, it didn’t seem to quite “flow” – until that day!

Up until that point I had viewed PowerPivot as a “private” tool – something that was useful for me in my work, a supplement to other tools and methods.  But starting six months ago, I came to understand that PowerPivot could, and SHOULD, be used to improve or replace most of our existing Business Intelligence and Analytics tools.

Step One:  Azzurri Deploys its First “On Premise” SharePoint Server

At Azzurri, I am fortunate to enjoy two critical flavours of support:

  1. Our executive team is open-minded to progressive ways of doing things (so long as there is a solid value proposition).
  2. My tech team is a crack outfit who will bring me the moon if I ask for it, but tend to make reasonable alternative suggestions such as building data warehouses.

In other words, Azzurri is the perfect sort of place to deploy PowerPivot for SharePoint, bordering on a textbook example.  There aren’t many companies of Azzurri’s size where I could explain the benefits of a PowerPivot server, win people over, and have a server deployed two weeks later.  But that is precisely what happened!

Step Two:  Start Emailing Rob Again

We didn’t just deploy the server, we immediately began USING it for serious work.  And that led to questions – questions about performance.  Questions about hardware.  Questions about customisation.  Questions about refresh.  Questions about “core and thin” workbooks.

Rob and I had a friendly correspondence going at that point, so I started sending those questions over.  I even looped him into email chains with our tech team, and we talked through a number of issues and optimisations.

Step Three:  Try Out Hosted PowerPivot in Parallel

Everything I do in Systems Development, especially with my Finance background, is about Cost V Benefit, ROI, IRR and payback.  With this in mind, I started wondering whether it made sense for us to develop PowerPivot for SharePoint expertise of our own.

We had originally decided to go with our own SharePoint deployment because we had the required licences and a particularly clever team who I had every faith could deliver.  This seemed obvious as SharePoint was already in operation at Azzurri.  My initial view was that it must be relatively straightforward to bring all of the BI tools into the equation.

Two weeks into the process, however, I was already seeing that things might not be as straightforward as I had first hoped.  Performance was the first major barrier that I hit, and I couldn’t be entirely sure what kind of investment in hardware might be required to alleviate it.  Out of nowhere, the PowerPivot Gallery started playing games (which turned out to be an IE9 issue), and then I was introduced to Kerberos which, it turns out, isn’t a breakfast cereal that I was yet to encounter!

I knew about the Pivotstream Hosted Solution of course, and I still wasn’t ready to commit to hosting, but I decided that running a trial in parallel made a lot of sense, especially since I was particularly aware that my tech team needed to be doing other things.

I’m very glad that we decided on a trial, because step four was to switch over full-time.

Goodbye “Do it Yourself”, Hello SaaS

The journey I’ve been on as a customer of Pivotstream has validated for me that the SaaS model together with the capability of PowerPivot makes for a more compelling business solution for reporting and BI than any alternative that I can find.

I’ve been particularly conscious of making sure that my tech team spend their time where they can really drive business value – building Data Warehouses, ETL and efficient business processes.  It was clear that time spent developing SharePoint Server was time not spent adding value elsewhere.  There’s no doubt in my mind that they would have delivered, but I knew that they could deliver more value elsewhere to more than offset any cost of hosting.

Summary of Our Experience

Once I had taken the decision to try out Pivotstream’s hosted solution, it became clear that “elapsed time” was no longer going to be a constraint on the project.  On that same day, Azzurri had its own Pivotstream site in full working order, with admin and consumer accounts set up for the trial.  It was now down to me to start making this a fully functional dashboard.

Naturally, I had workbooks at the ready and I loaded a few up immediately.  I started sniffing around some of the additional features that I could now start playing with.  Before I knew it, I was canvassing Pivotstream for direction on Query String URL filtering (an awesome feature for drilling across to other dashboards).  A handy guidance document found its way into my inbox and I was away.

I was supplied with a program to split core and thin workbooks, another gem that just saves time and aggravation.  I served up a query about web part layouts and, the next day, a new page layout was deployed straight to our site.

Immediately, the focus of what I was delivering was about end user usability as opposed to finding ways around potential (and in some cases very evident) performance issues.  The performance of the Pivotstream solution was immediately apparent: some of my more “chunky” analyses didn’t even make it flinch.  My in-house SharePoint Server could take upwards of a minute to open these workbooks, whereas the hosted solution barely registers seconds.

Within a matter of days, I realised that the limits of this solution only existed to the extent of the limits of my imagination in creating dashboards.  All of a sudden, my focus turned to making sure that full value was derived and, to that end, I started spreading the word within Azzurri.  Some initial training took place, and I immediately recognised that these clever individuals I was working with had even more insight bursting to get out.  The fact that we were playing in Excel meant that they could immediately relate to what they were being shown.  I had hit that fantastic point in the project where momentum starts taking over, and this was probably less than two weeks after starting the trial.

Speed (both of implementation and application), elimination of complexity and additional value adding applications delivered in a scalable data-centre model with an OPEX cost model sums it up for me.  Now, it’s just about making the dashboards deliver the real benefit to the business – insight!

As I’ve been writing this, Rob’s reminded me of a comment I made back towards when this whole thing started:

“My key driver is laziness so I’m always looking for quicker and better ways to do things. In doing so, I find myself working non-stop so I may have my driver wrong or I’m failing miserably!”

The reality is that Hosted PowerPivot does do it quicker and better.  I’m working non-stop because the results speak for themselves and I fundamentally “get it”.  The reality is that my driver was wrong!

Prepayments & Deferred Revenue

January 24, 2012


Guest Post by David Churchward 

Prepayments & Deferred Revenue Layout Example

You’ll often hear us Accountants referring to things like adjusting for timing differences or prepaying costs or deferring revenue.  This is often interpreted as “massaging the numbers”, but, believe me, there is a very reasonable theory behind it.  These sort of adjustments can be made for a number of reasons, but the main underlying concept is that adjustments have to be made to ensure that the transaction is reflected in the period to which it relates.

In this post, I’ll explain how prepayments and revenue deferrals work together with the DAX measures that drive these calculations.

What are Timing Differences?

It’s worthwhile just taking the time to understand what we’re trying to do here.  Take, for example, a support contract that you bill to your customer in advance for 12 months.  You invoice the customer in advance and the customer duly pays.  However, as per accounting principles, you have to reflect the fact that the invoice relates to 12 months’ worth of support.  In its simplest sense, your Profit and Loss should therefore see 1/12th of that revenue in each reporting month.  Since the invoice is for a full 12 months, we have to create transactions to defer the element that relates to future periods, entering the balance onto the Balance Sheet.

As a quick example of the calculation, at the end of the first month to which the support contract relates, you would defer 11/12ths of the invoice value:

Deferral Value = Invoice Value * (Number of Future Months / Total Months on Invoice)

We would therefore see 1/12th of the invoice value on our Profit & Loss and the remaining 11/12ths would be sat on our balance sheet.

Referring to Prepayments separately from Deferred Revenue (or Deferred Costs) for that matter is simply a matter of terminology.  The underlying concept of the calculation is the same.  I consider everything to be a Prepayment in this sense.  The only difference is where the transactions sit in the Profit & Loss and Balance Sheet statements.

Note – Accountants may defer revenue or costs for other reasons.  This post is simply a calculation to reflect the timing difference adjustment.

Days or Months

In the simplified example above, I’ve used a monthly concept that allows for a simple allocation of the transaction value to the Profit & Loss in equal monthly values.  However, as we know, the number of days in each month can differ.  For this calculation to be more accurate, we should pro-rata the calculation based on the number of days in that month.

The DAX measures that I show in this post will allow for selection of Monthly Deferral / Prepay or Daily Deferral / Prepay.  Which you use depends on your accounting policy.

Data Structure

My data structure is as follows:

Fact Table (Prepayments)

I have a core fact table which contains all transactions that require a prepayment or deferred revenue adjustment.  I have called this table Prepayments.

Prepayments Fact Table

The key elements to this table are as follows:

Date – This is the transaction date of the invoice posting

Nominal_PandL – This is the Profit & Loss Nominal Ledger account that the invoice was posted to

Nominal_Prepay_BS – This is the Balance Sheet Nominal Ledger account where the prepayment or deferral will be posted to.

Start / End – These are the start and end dates of the transaction.  If the invoice relates to a period from 1st Jan 2012 to 31st Dec 2012 then these dates would be 01/01/2012 and 31/12/2012 respectively.

Method – This is either MONTH or DAY depending on the level of granularity that you want your calculation to be calculated at.

PandL_Values – This is the total value of the invoice charge.

Prepay_Start / Prepay_End – I’ll come onto these alternative dates shortly.  In brief, these exist because your prepayment timeframe may differ from the strict dates to which the invoice relates.

Days Table

Prepayments Days Table

This table holds a sequential, unbroken list of dates (Date) together with associated attributes including Month End Date, Previous Month End Date, Month Start Date and Next Month Start Date.  These attributes become relevant in our measures and, whilst they could be calculated at run time, it’s easier to take care of them this way.  The purist would be correct in suggesting that there’s a potential performance impact, but dealing with these aspects in our measures may confuse the issue for now!

Months Table

Prepayments Months Table

Similar to the Days table, this table holds a collection of attributes relevant at the month level including Month End Date, Previous Month End Date, Month Start Date and Next Month Start Date.

Start Dates Table

Prepayments Start Dates Table

This is another sequential list of dates which is joined to the Start field on the Prepayments table.  I’ll come on to the PrepayStartdate field shortly.

End Dates Table

Prepayments End Dates Table

This is another sequential list of dates which is joined to the End field on the Prepayments table.  You’ll notice the absence of a third field which, again, I’ll come on to.

Table Relationships

Prepayments Table Relationships

The only relationships that exist are between the Prepayments table and the Start_Dates and End_Dates tables.

Dates That Span Periods

A charge period can be anything.  In its simplest form, it would cover complete months, starting on the first day of a month and ending on the last day of a month.  However, things are never that simple.  Unfortunately, you could get a charge for 3 months from 14th Jan 2012 to 13th Apr 2012, for example.  The time between these two dates is indeed 3 months, but the charge covers 4 discrete calendar months.

If you’re using a Days method of apportionment, this isn’t a problem.  However, if you’re apportioning on the basis of months, you have 3 options in this example:

  1. Take the 3 equal months charges as Feb, Mar and Apr
  2. Take the 3 equal months charges as Jan, Feb and Mar
  3. Take equal monthly charges for Feb and Mar with a pro-rata charge for Jan and Apr.

In truth, if you’re going to even attempt option 3, you would probably be better off jumping straight to the Days apportionment and circumventing the problem!  Options 1 and 2 depend on accounting policy and precedent.  The prudent option would probably be to go for option 2 for costs (taking them as early as possible) and option 1 for Revenue (taking it later rather than earlier).  On the basis of consistency, I feel it’s best to prescribe this in your accounting policies and then use the same method for everything.  Having said that, the best option is almost certainly to take a Days approach.

For the purpose of this post, I’m going to use option 1.  This means that I’ll translate any start date that isn’t the first day of the month to the first day of next month.  I create two calculated fields on my Prepayments table as follows:
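A sketch of what those two calculated columns might look like, given the relationships to the Start_Dates and End_Dates tables (the PrepayEnddate column name on End_Dates is an assumption, mirroring the PrepayStartdate field described below):

```
Prepay_Start = RELATED ( Start_Dates[PrepayStartdate] )
Prepay_End   = RELATED ( End_Dates[PrepayEnddate] )
```

RELATED follows the many-to-one relationship from each Prepayments row to the matching date row, pulling back the adjusted date.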

These fields retrieve an alternative date from the Start_Dates and End_Dates tables.  For any mid-month start date, the Start_Dates table holds a PrepayStartdate of the first of the next month.

The Months Approach

Firstly, let’s just remember our simplified equation to give us the deferral value:

Deferral Value = Invoice Value * (Number of Future Months / Total Months on Invoice)

And now we’ll jump straight into the DAX
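A sketch of the shape of this measure, written with modern VAR syntax and with EOMONTH standing in for the month-end lookup columns (the measure name and exact structure are assumptions; the author’s workbook may differ in detail):

```
Prepayments_Months :=
IF (
    HASONEVALUE ( Months[MonthEndDate] ),          -- only evaluate for a single report month
    VAR ReportMonthEnd = MAX ( Months[MonthEndDate] )
    RETURN
        CALCULATE (
            SUMX (
                Prepayments,
                -- only rows whose prepayment window spans the report month
                IF (
                    Prepayments[Prepay_Start] <= ReportMonthEnd
                        && Prepayments[Prepay_End] > ReportMonthEnd,
                    VAR EndMonth   = EOMONTH ( Prepayments[Prepay_End], 0 )
                    VAR StartMonth = EOMONTH ( Prepayments[Prepay_Start], 0 )
                    VAR FutureMonths =
                        COUNTROWS (
                            FILTER (
                                ALL ( Months ),
                                Months[MonthEndDate] >= ReportMonthEnd
                                    && Months[MonthEndDate] <= EndMonth
                            )
                        ) - 1
                    VAR TotalMonths =
                        COUNTROWS (
                            FILTER (
                                ALL ( Months ),
                                Months[MonthEndDate] >= StartMonth
                                    && Months[MonthEndDate] <= EndMonth
                            )
                        )
                    RETURN
                        Prepayments[PandL_Values] * FutureMonths / TotalMonths
                )
            ),
            Prepayments[Method] = "MONTH"         -- apply the months method only
        )
)
```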

The first two IF statements in this measure determine when we want the rest of the measure to evaluate.  We want this measure to evaluate when:

  1. The number of month end dates on our report is 1 (because anything else is likely to return an error)
  2. The prepayment start and end dates fit within the range of our report.  This, essentially, tells the report which month end dates to display and the rest of the measure uses these dates to evaluate the correct result to display.

We then use a CALCULATE statement to calculate our basic equation:

Deferral Value = Invoice Value * (Number of Future Months / Total Months on Invoice)

Invoice Value = SUMX ( Prepayments, Prepayments[PandL_Values] )

Number of Future Months =
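A sketch of this fragment, evaluated inside the SUMX row context (MAX picks up the single report month end from the filter context; EOMONTH stands in for the month-end lookup column):

```
COUNTROWS (
    FILTER (
        ALL ( Months ),
        Months[MonthEndDate] >= MAX ( Months[MonthEndDate] )
            && Months[MonthEndDate] <= EOMONTH ( Prepayments[Prepay_End], 0 )
    )
) - 1
```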

This uses the COUNTROWS function to count the entries in the Months table that sit between the MonthEndDate of our report and the MonthEndDate of the end date of our transaction.  I then subtract 1 because the report’s MonthEndDate is the current reporting month, not a future month.

Total Months on Invoice =
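A sketch of this fragment on the same assumptions (EOMONTH in place of the month-end lookup columns):

```
COUNTROWS (
    FILTER (
        ALL ( Months ),
        Months[MonthEndDate] >= EOMONTH ( Prepayments[Prepay_Start], 0 )
            && Months[MonthEndDate] <= EOMONTH ( Prepayments[Prepay_End], 0 )
    )
)
```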

Once again, I’m counting the number of entries in the Months table that sit between the start and end dates of our prepayment timeframe.

I then apply a filter to the CALCULATE statement to ensure that this measure is only applied to transactions where the Method is set to “MONTH”.

So why do we have to use SUMX for the Invoice Value?

The simple answer is that we need to evaluate this equation at the transaction level of granularity and then SUM the results.  The main reason is that we’re using a division, which is only meaningful at that level and becomes meaningless once the numerator and denominator have been aggregated.

In order to show this, I’ll explain a simple example of converting USD to GBP with some theoretical exchange rates:

SUMX Example

| USD | Exchange Rate | GBP |
|----:|--------------:|----:|
| 100 | 2.00 | 50 |
| 150 | 1.50 | 100 |
| **250** | **X** | **Y** |

What are the values for X and Y?  We obviously know that Y needs to be 150, being the sum of the two GBP values.  However, if we use SUM, we wouldn’t get that answer.  The exact result depends on how you condition your measure, but it won’t be 150: it’s more likely to be 150/2=75 (where MAX is used in the equation), 250/3.5=71.43 (where SUM is used) or 250/1.75=142.86 (where SUM and AVERAGE are used).  The correct value only comes about when you use SUMX, which evaluates the equation at the granularity of the table you specify (in our case the base-level fact table, Prepayments) and sums the underlying outcomes.

This is a very brief outline of a very complex and clever function.  In short, if you want to simply add the outcome of underlying equations, SUMX is the way to go.
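In DAX terms, with a hypothetical FX table holding the USD and Rate columns from the example above, the difference looks like this:

```
-- Wrong: aggregates each column first, then divides once
GBP Wrong := SUM ( FX[USD] ) / SUM ( FX[Rate] )    -- 250 / 3.5 = 71.43

-- Right: divides row by row, then sums the row-level results
GBP Right := SUMX ( FX, FX[USD] / FX[Rate] )       -- 50 + 100 = 150
```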

It’s worth checking out one of my favourite ever posts to start getting to grips with SUMX – SUMX() – The 5 point palm exploding fxn technique.

The Days Approach

This uses the same methods and concepts as the Months approach.  The only difference is that we’re evaluating the number of days in the relevant periods of time.
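A sketch along the same lines (the measure name is an assumption, and simple date arithmetic stands in for the Days table lookups the author used; subtracting one DAX date from another yields a day count):

```
Prepayments_Days :=
IF (
    HASONEVALUE ( Months[MonthEndDate] ),
    VAR ReportMonthEnd = MAX ( Months[MonthEndDate] )
    RETURN
        CALCULATE (
            SUMX (
                Prepayments,
                IF (
                    Prepayments[Prepay_Start] <= ReportMonthEnd
                        && Prepayments[Prepay_End] > ReportMonthEnd,
                    Prepayments[PandL_Values]
                        * ( Prepayments[Prepay_End] - ReportMonthEnd )                 -- future days
                        / ( Prepayments[Prepay_End] - Prepayments[Prepay_Start] + 1 ) -- total days
                )
            ),
            Prepayments[Method] = "DAY"
        )
)
```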

Prepayments Total

We now have two measures to evaluate our Days Prepayments and our Months Prepayments.  We can bring these together in one measure as follows:
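A sketch of that combined measure, assuming the measure names Prepayments_Months and Prepayments_Days from above; iterating the fact table with SUMX forces each transaction’s Months and Days results to be evaluated individually (via context transition) and then added up:

```
Prepayments_Total :=
SUMX (
    Prepayments,
    [Prepayments_Months] + [Prepayments_Days]
)
```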

Once again, I have to use SUMX to ensure that this measure sums the total of the underlying calculations.

Prepayments Without Values Not Prepaying

The “Values Not Prepaying”

So we’ve dealt with the reducing balances associated with Monthly and Daily Prepayments and Deferred Revenues.  However, what about a situation where the invoice charge relates to future months that haven’t yet been reached?  For example, we receive an invoice in January that relates to April and beyond, as in the screen shot above where the last entry has a transaction date of 10th Jan but relates to a period April to October.  In this instance, we have to prepay (or defer) the full amount of the invoice for January, February, March and April.
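A sketch of such a measure, assuming the transaction date column is called Tran_Date (a guess, as the real column name isn’t stated):

```
Values_Not_Prepaying :=
IF (
    HASONEVALUE ( Months[MonthEndDate] ),
    VAR ReportMonthEnd = MAX ( Months[MonthEndDate] )
    RETURN
        CALCULATE (
            SUMX ( Prepayments, Prepayments[PandL_Values] ),
            -- invoiced on or before the report month end...
            FILTER ( Prepayments, Prepayments[Tran_Date] < ReportMonthEnd ),
            -- ...but the prepayment window hasn't started yet
            FILTER ( Prepayments, Prepayments[Prepay_Start] > ReportMonthEnd )
        )
)
```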

In this measure, we once again use SUMX to ensure that we SUM the outcome of the underlying evaluations.  However, we apply two filters to limit the selection to situations where certain criteria hold.  This is an alternative to preceding the CALCULATE with an IF statement to conduct this evaluation first.

  1. The first filter keeps rows where the transaction date is less than the MonthEndDate of the report column.
  2. The second filter keeps rows where the Prepay_Start date is greater than the MonthEndDate of the report column.

We then add this into our Prepayments_Total measure as:
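Assuming the measure names sketched above, the extended total becomes:

```
Prepayments_Total :=
SUMX (
    Prepayments,
    [Prepayments_Months] + [Prepayments_Days] + [Values_Not_Prepaying]
)
```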

Prepayments Balance Sheet Layout

And there you have it.  Our Balance Sheet layout above shows the total value of each transaction that needs to be prepaid or deferred in each month.

What This Means to Accountants

For those Accountants amongst you, you’ll no doubt have cottoned on to the fact that, if you’re careful, this could be your prepayments and deferred revenue reconciliations conquered.  The process that I’ve always implemented is as follows:

  1. Ensure that all transactions are posted to the Profit and Loss in full, as if the P&L was going to take the whole value in one period.
  2. Capture prepayment start and end dates on each transaction
  3. Extract the data into a PowerPivot application such as this.
  4. Refresh to give you the values that need to be prepaid (as per the above screenshot)
  5. Journal the appropriate values between the Profit and Loss and Balance Sheet on a reversing journal.
  6. Next month the previous transaction will reverse and you do the same again.

By doing this, you’ve got a self-reconciling Balance Sheet for Prepayments and Revenue Deferrals (and, for that matter, any transaction that has a straight-line release; for example, you could use this for elements of Fixed Asset Depreciation, but that’s a whole other post).

And One More Thing

You may have noticed in the first graphic that there was also a Profit & Loss impact section to the report.  I’ll come onto this in a future post but it uses similar concepts to the measures in this post.  With this in place, you’ve got elements of a forecast P&L, you’re on course for Recurring Revenue Assurance analysis and you’re putting in place the building blocks of a full Finance Business Intelligence Application that’s just dying to be exposed in SharePoint as per Rob’s post In the Browser Aesthetics Yield a Greater Return.