Mike Allen Systems

Doing Smarter Things With IT

EMA Calculations with Power Query

This is my first attempt at an EMA calculation in Power Query, and it is a work in progress. My intention is to use this work as a basis for stock-market technical indicators such as Bollinger Bands.

I have taken my EMA example from https://stockcharts.com/school/doku.php?id=chart_school:technical_indicators:moving_averages which I find very useful on the subject. I was also informed by Reza Rad at https://radacad.com/list-accumulate-hidden-gem-of-power-query-list-functions-in-power-bi which showed me how to use List.Accumulate; I just needed to find an application for that knowledge, and here it is.
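List.Accumulate is a fold: it walks a list carrying a running state forward. The same EMA recurrence can be sketched in Python with functools.reduce. This is only an illustration of the calculation, not the Power Query itself, and the input numbers are made up:

```python
from functools import reduce

def ema(closes, period=10):
    """EMA seeded with the SMA of the first `period` closes, then folded
    over the remaining closes - the same shape as List.Accumulate."""
    k = 2 / (period + 1)  # smoothing factor; roughly 0.1818 for a 10-period EMA
    sma = sum(closes[:period]) / period
    return reduce(lambda state, close: (close - state) * k + state,
                  closes[period:], sma)

# SMA of 1..10 is 5.5; one EMA step toward 11 gives exactly 6.5
print(ema(list(range(1, 12))))
```

Each step of the fold is `(close - state) * k + state`, which matches the `(current - state) * EMA_Factor + state` accumulator in the query below.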

The data I have used is from the stockcharts.com EMA example, which helped me validate my solution; I started from their downloadable data. I've put some comments in the code; the magic is in the List.Accumulate, of course. The input data is in an Excel workbook, in a table called "DataTable", with two columns, "Date" and "Close":

     let
         Source = Excel.Workbook(File.Contents("emadata.xlsx"), null, true),
         DataTable_Table = Source{[Item="DataTable",Kind="Table"]}[Data],
         #"Changed Type" = Table.TransformColumnTypes(DataTable_Table,{{"Date", type date}, {"Close", type number}}),
         EMA_Factor = 0.1818,
         Average = 10,
         // Add an Index to our close price table
         #"Add Index" = Table.AddIndexColumn(#"Changed Type", "Index", 1, 1),
         // Create a list of the closing prices
         ClosePriceList = #"Add Index"[Close],
         // Calculate the SMA, the Simple Moving Average
         #"Add SMA" = Table.AddColumn(#"Add Index", "SMA", each if [Index] >= Average then List.Sum(List.Range(ClosePriceList,[Index]-Average,Average))/Average else null),
         // Get the first SMA value to be the first EMA
         StartEMA = Table.Range(#"Add SMA",Average-1,1)[SMA]{0},
         // Calculate the Exponential Moving Average
         #"Add EMA" = Table.AddColumn(#"Add SMA", "EMA", each if [Index] >= Average then List.Accumulate(List.Range(ClosePriceList,Average,[Index]-Average),StartEMA,(state,current)=>((current - state) * EMA_Factor) + state) else null)
     in
         #"Add EMA"

Here’s a picture of the final output. Any feedback is greatly appreciated; thank you for reading:


Aging Debt using Power Query Lists

Problems that require access to previous rows of data can be difficult in Power BI; one way of resolving this is to translate columns into lists and then process the lists. This is an example that performs debtor aging, where we have one payment and five aging buckets into which we wish to allocate that payment. Please refer to my previous post http://mikeallen.com.au/post/Creating-an-Aging-Calendar-in-Power-Query for how the aging buckets are created, and assume that transactions have been grouped into similar buckets numbered B1 through B5.

Here is some sample data:


Here are the payments:


The method I’m going to use to age involves comparing the payment to the cumulative debt on the account, going from B5, the oldest, to B1, the newest, so I need to get the data to look like this:


When I have this data I then apply this formula to each line to calculate the aging value:

if [Payment] >= [Cumulative] then 0
else if [Payment] < [Cumulative] and [Payment] > ([Cumulative] - [Actual])
        then [Cumulative] - [Payment] else [Actual]

This handles the three cases of fully paid, partially paid and unpaid buckets in one calculation, which I think is pretty neat!
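The three branches can be sketched in Python, with made-up numbers, to see each case fire. This is only an illustration of the formula, not the Power Query:

```python
def age_bucket(payment, cumulative, actual):
    """Unpaid amount left in one bucket, given the payment and the
    cumulative debt up to and including this bucket."""
    if payment >= cumulative:
        return 0                     # bucket fully covered by the payment
    elif payment > cumulative - actual:
        return cumulative - payment  # bucket partially covered
    else:
        return actual                # payment ran out before this bucket

# a payment of 70 against buckets of 50, 40 and 30 (cumulative 50, 90, 120):
print(age_bucket(70, 50, 50), age_bucket(70, 90, 40), age_bucket(70, 120, 30))  # 0 20 30
```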

Here is the Power Query that does all this; it is a beta, and I’m still refining it. The data is held in an Excel workbook in the format shown above:

     let
         Source = Excel.Workbook(File.Contents("BucketAging.xlsx"), null, true),
         Buckets_Sheet = Source{[Item="Buckets",Kind="Sheet"]}[Data],
         #"Promoted Headers" = Table.PromoteHeaders(Buckets_Sheet, [PromoteAllScalars=true]),
         #"Removed Other Columns" = Table.SelectColumns(#"Promoted Headers",{"Account", "Type", "Value"}),
         GroupRowsByAccount = Table.Group(#"Removed Other Columns", {"Account"}, {{"DataTable", each _, type table [Account=number, Type=text, Value=number]}}),
         MergePayments = Table.NestedJoin(GroupRowsByAccount, {"Account"}, Payments, {"Account"}, "Payments", JoinKind.LeftOuter),
         ExpandPaymentValue = Table.ExpandTableColumn(MergePayments, "Payments", {"Value"}, {"Value"}),
         RenamedPaymentColumn = Table.RenameColumns(ExpandPaymentValue,{{"Value", "Payment"}}),
         AddBucketNumbers = Table.AddColumn(RenamedPaymentColumn, "Bucket", each List.Numbers(1,5)),
         ExpandBuckets = Table.ExpandListColumn(AddBucketNumbers, "Bucket"),
         AddAgingList = Table.AddColumn(ExpandBuckets, "AgingList", each [DataTable][Value]),
         AddCumulativeValue = Table.AddColumn(AddAgingList, "Cumulative", each List.Sum(List.FirstN([AgingList],[Bucket]))),
         AddActualValue = Table.AddColumn(AddCumulativeValue, "Actual", each [AgingList]{[Bucket]-1}),
         AddAgingCalc = Table.AddColumn(AddActualValue, "Aging", each if [Payment] >= [Cumulative] then 0
                         else if [Payment] < [Cumulative] and [Payment] > ([Cumulative] - [Actual])
                                 then [Cumulative] - [Payment] else [Actual]),
         AddBucketList = Table.AddColumn(AddAgingCalc, "AgingBuckets", each [DataTable][Type]),
         AddAgingBucket = Table.AddColumn(AddBucketList, "AgingBucket", each [AgingBuckets]{[Bucket]-1})
     in
         AddAgingBucket

That’s all expanded out, so remove the columns that you don’t want.

I’ve tried to make the code self-documenting. It is conceptually fairly simple: you group the data down to a row for each account, expand it out for each bucket (in reverse order), then convert the values into a list and perform arithmetic on the list values while extracting the values back out again. Have a play; lists in Power Query make it possible to do computation across rows of data.
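The whole per-account walk - running totals down the bucket list, then the three-way test per bucket - can be sketched in plain Python. The bucket values are made up, ordered oldest (B5) first:

```python
from itertools import accumulate

def allocate(payment, bucket_values):
    """Allocate one payment across aging buckets using running totals,
    mirroring the Cumulative and Actual columns in the query."""
    aged = []
    for cumulative, actual in zip(accumulate(bucket_values), bucket_values):
        if payment >= cumulative:
            aged.append(0)                     # fully paid bucket
        elif payment > cumulative - actual:
            aged.append(cumulative - payment)  # partially paid bucket
        else:
            aged.append(actual)                # unpaid bucket
    return aged

# a 70 payment against buckets of 50, 40, 30, 0, 10 (oldest first)
print(allocate(70, [50, 40, 30, 0, 10]))  # [0, 20, 30, 0, 10]
```

Note that the aged amounts sum to the total debt minus the payment, which is a handy reconciliation check.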

This technique has been tested on several thousand accounts and performs well. Any feedback is appreciated; this information is provided with no guarantee whatsoever :-)


Creating an Aging Calendar in Power Query

Getting the best out of your data in Power BI often requires the creation of specialized calendars that help you analyze your data the way that you need it. I’ve been creating calendars using Power Query. Another problem I’ve had is where to store simple reference data, like the aging buckets for an aging calendar. So I combined these two issues, and this is my solution. I must emphasize that this is not the only way of solving this problem and may not suit your requirements, but take my thoughts and see if you find them useful.

Here is the Power Query:

     let
         // Example buckets - adjust the names and day ranges to suit your needs
         AgingTable = #table(
             {"Bucket", "order", "start", "end"},
             {{"Current", 1, 0, 30}, {"30 Days", 2, 31, 60}, {"60 Days", 3, 61, 90}, {"90+ Days", 4, 91, 365}}),
         EndDate = Table.AddColumn(AgingTable, "EndDate", each Date.AddDays(DateTime.Date(DateTime.LocalNow()),-[start])),
         StartDate = Table.AddColumn(EndDate, "StartDate", each Date.AddDays(DateTime.Date(DateTime.LocalNow()), -[end])),
         Duration = Table.AddColumn(StartDate, "Duration", each Duration.Days([EndDate]-[StartDate])+1),
         Datelist = Table.AddColumn(Duration, "DateList", each List.Dates([StartDate],[Duration],#duration(1,0,0,0))),
         #"Expanded DateList" = Table.ExpandListColumn(Datelist, "DateList"),
         #"Removed Columns" = Table.RemoveColumns(#"Expanded DateList",{"start", "end", "EndDate", "StartDate", "Duration"}),
         #"Renamed Columns" = Table.RenameColumns(#"Removed Columns",{{"DateList", "Date"}})
     in
         #"Renamed Columns"

This is fairly simple, but put together it creates, I think, an elegant solution for an aging calendar. There’s an order column in the AgingTable, so that you can get your buckets in the right sequence on reports. I’ve started the dates from today, but you can use other options, such as parameters. You can also insert this query into a Power BI service dataflow, which I’m starting to favor for my Power BI calendars. Another nice technique here is the use of a list to generate the dates; lists are one of the most powerful features of Power Query.
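The expand-a-date-range-per-bucket idea can be sketched outside Power Query too. This Python version is illustrative only; the bucket names and day ranges are hypothetical, and `today` is pinned so the output is stable:

```python
from datetime import date, timedelta

def aging_calendar(buckets, today=None):
    """Expand (bucket, order, start, end) rows into one row per date,
    like the List.Dates expansion; start/end are days back from today."""
    today = today or date.today()
    rows = []
    for bucket, order, start, end in buckets:
        end_date = today - timedelta(days=start)
        start_date = today - timedelta(days=end)
        for i in range((end_date - start_date).days + 1):
            rows.append((bucket, order, start_date + timedelta(days=i)))
    return rows

cal = aging_calendar([("Current", 1, 0, 30), ("30 Days", 2, 31, 60)],
                     today=date(2019, 6, 30))
print(len(cal))  # 31 dates for Current plus 30 for 30 Days = 61
```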

Let me know what you think; all feedback is greatly appreciated. Oh, and this advice is offered without any guarantee :-)

Improved Xero Invoice extraction using Linx

Having successfully extracted Xero invoices using Linx, I decided to drill deeper and improve my understanding and methods.

My aim is still to extract the invoices and load them into Power BI, but now I have the additional target of loading the invoices into CosmosDB and then loading the CosmosDB collection into Power BI, and eventually Azure Analysis Services.

I improved my understanding of Invoice Type, realizing that the data covers both Accounts Receivable and Accounts Payable.

Here’s a sample of the data:

     {
         "InvoiceType": "ACCPAY",
         "ContactName": "Melrose Parking",
         "Partition": "201709"
     }

Key takeaways:

    1. The ‘id’ field is critical and must be named exactly that to map to the id in CosmosDB.
    2. The Url, while null in the example, is the URL of the invoice PDF.
    3. I’ve added a ‘Partition’ field for CosmosDB and Analysis Services use.
    4. I’ve trimmed the time off InvoiceDate and DueDate.
    5. All the JSON formatting is done in Linx.
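Points 1, 3 and 4 can be sketched together. In my setup the shaping is done in Linx, but the same record can be built in Python; the helper name and field values here are hypothetical:

```python
import json
from datetime import datetime

def invoice_record(invoice_id, invoice_type, contact, invoice_date):
    """Shape one invoice for CosmosDB: an 'id' named exactly that,
    the time trimmed off the date, and a YYYYMM Partition key."""
    dt = datetime.fromisoformat(invoice_date)
    return {
        "id": invoice_id,                      # CosmosDB maps on 'id'
        "InvoiceType": invoice_type,
        "ContactName": contact,
        "InvoiceDate": dt.date().isoformat(),  # time trimmed off
        "Partition": dt.strftime("%Y%m"),      # e.g. "201709"
    }

rec = invoice_record("inv-001", "ACCPAY", "Melrose Parking", "2017-09-14T00:00:00")
print(json.dumps(rec))
```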

Here’s a pic of the Linx project:


That InvoiceItem is a type I’ve defined with the format for the JSON.

My thinking is that as the invoice status changes I can update CosmosDB using the id; the CosmosDB API has ‘Upsert’ functionality which will do this neatly. The intention is to use Linx, on a daily basis, to extract the Xero invoices that have changed in the last day, upsert them to CosmosDB, and then refresh the Power BI / Analysis Services model.

I’m using a very simple C# program to do the upsert; this is all dynamic, with very little code:

     foreach (dynamic invoice in invoices)
         await this.client.UpsertDocumentAsync(
             UriFactory.CreateDocumentCollectionUri(databaseName, collectionName), invoice, options);

This code doesn’t care what the invoice looks like; CosmosDB has no schema, but I insert into a collection called ‘Invoices’ with a consistent layout.
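The upsert semantics - insert if the id is new, replace if it already exists - are what make the daily refresh safe to re-run. A minimal Python sketch of that behaviour, simulated with a plain dict rather than CosmosDB:

```python
def upsert(collection, doc):
    """Insert-or-replace keyed on 'id' - the semantics CosmosDB's
    upsert provides, simulated here with a dict."""
    collection[doc["id"]] = doc

invoices = {}
upsert(invoices, {"id": "inv-001", "Status": "DRAFT"})
upsert(invoices, {"id": "inv-001", "Status": "PAID"})  # same id: replaced, not duplicated
print(len(invoices), invoices["inv-001"]["Status"])  # 1 PAID
```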

It is also still possible to load the JSON directly into Power BI; the query is very simple:

     let
         Source = Json.Document(File.Contents("C:\Users\Mike\Documents\LinxOutput\invoice.json")),
         InvoiceTable = Table.FromList(Source, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
         #"Expanded Column1" = Table.ExpandRecordColumn(InvoiceTable, "Column1", {"InvoiceDate", "InvoiceNumber", "Total", "Status", "id", "SubTotal", "TotalTax", "TotalDiscount", "DueDate", "ContactId", "ContactName", "UpdatedDate", "Reference", "AmountDue", "AmountPaid", "InvoiceType", "Url"}, {"InvoiceDate", "InvoiceNumber", "Total", "Status", "id", "SubTotal", "TotalTax", "TotalDiscount", "DueDate", "ContactId", "ContactName", "UpdatedDate", "Reference", "AmountDue", "AmountPaid", "InvoiceType", "Url"}),
         #"Changed Type" = Table.TransformColumnTypes(#"Expanded Column1",{{"InvoiceDate", type date}, {"DueDate", type datetime}})
     in
         #"Changed Type"

If you’ve got this far, thank you for your interest; there’ll be more on Xero, Linx, CosmosDB and Power BI in the near future.

Get Invoices from Xero to Power BI using Linx

I have a side project exploring low-cost and low-code ways of interfacing with Xero. I’ve extracted data using Linx from https://linx.software/, a useful low-cost software solution.

You need to create a private Xero app at https://developer.xero.com/documentation/auth-and-limits/private-applications ; this is actually fairly simple. Generating the certificate is the most difficult part; see my blog post https://mikeallen.com.au/post/Application-Authenticate-using-publicprivate-key-pairs.

Once that is done, set up a Linx job with two steps:

1/ GetInvoices


It’s all pretty self-explanatory, and it will generate a nice JSON file for you.

Import the JSON into Power BI using Power Query. The date formats are a bit tricky because there’s a time zone in there; I’ve fixed the ‘Date’ column to make it just a date.
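The date fix amounts to parsing the zoned timestamp and keeping only the date part, which is what the cast from `datetimezone` to `date` does in the query. A Python illustration, with a made-up value:

```python
from datetime import date, datetime

def to_plain_date(value):
    """Strip the time and zone offset from an ISO datetime string -
    the analogue of casting the 'Date' column to type date."""
    return datetime.fromisoformat(value).date()

print(to_plain_date("2019-06-30T00:00:00+10:00"))  # 2019-06-30
```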

Here’s the query:

     let
         Source = Json.Document(File.Contents("C:\Users\Mike\Documents\LinxOutput\LinxInvoices.json")),
         #"Converted to Table" = Table.FromList(Source, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
         #"Expanded Column1" = Table.ExpandRecordColumn(#"Converted to Table", "Column1", {"Type", "Contact", "Date", "DueDate", "Status", "LineAmountTypes", "LineItems", "SubTotal", "TotalTax", "Total", "TotalDiscount", "UpdatedDateUTC", "CurrencyCode", "CurrencyRate", "InvoiceID", "InvoiceNumber", "Reference", "BrandingThemeID", "Url", "SentToContact", "ExpectedPaymentDate", "PlannedPaymentDate", "HasAttachments", "Payments", "CreditNotes", "Prepayments", "Overpayments", "AmountDue", "AmountPaid", "FullyPaidOnDate", "AmountCredited"}, {"Type", "Contact", "Date", "DueDate", "Status", "LineAmountTypes", "LineItems", "SubTotal", "TotalTax", "Total", "TotalDiscount", "UpdatedDateUTC", "CurrencyCode", "CurrencyRate", "InvoiceID", "InvoiceNumber", "Reference", "BrandingThemeID", "Url", "SentToContact", "ExpectedPaymentDate", "PlannedPaymentDate", "HasAttachments", "Payments", "CreditNotes", "Prepayments", "Overpayments", "AmountDue", "AmountPaid", "FullyPaidOnDate", "AmountCredited"}),
         #"Expanded Contact" = Table.ExpandRecordColumn(#"Expanded Column1", "Contact", {"ContactID", "ContactNumber", "AccountNumber", "ContactStatus", "Name", "FirstName", "LastName", "EmailAddress", "SkypeUserName", "ContactPersons", "BankAccountDetails", "TaxNumber", "AccountsReceivableTaxType", "AccountsPayableTaxType", "Addresses", "Phones", "IsSupplier", "IsCustomer", "DefaultCurrency", "UpdatedDateUTC", "XeroNetworkKey", "SalesDefaultAccountCode", "PurchasesDefaultAccountCode", "SalesTrackingCategories", "PurchasesTrackingCategories", "PaymentTerms", "ContactGroups", "Website", "BrandingTheme", "BatchPayments", "Discount", "Balances", "HasAttachments"}, {"ContactID", "ContactNumber", "AccountNumber", "ContactStatus", "Name", "FirstName", "LastName", "EmailAddress", "SkypeUserName", "ContactPersons", "BankAccountDetails", "TaxNumber", "AccountsReceivableTaxType", "AccountsPayableTaxType", "Addresses", "Phones", "IsSupplier", "IsCustomer", "DefaultCurrency", "UpdatedDateUTC.1", "XeroNetworkKey", "SalesDefaultAccountCode", "PurchasesDefaultAccountCode", "SalesTrackingCategories", "PurchasesTrackingCategories", "PaymentTerms", "ContactGroups", "Website", "BrandingTheme", "BatchPayments", "Discount", "Balances", "HasAttachments.1"}),
         #"Changed Type1" = Table.TransformColumnTypes(#"Expanded Contact",{{"Date", type datetimezone}}),
         #"Removed Columns" = Table.RemoveColumns(#"Changed Type1",{"ContactID", "ContactNumber", "AccountNumber", "ContactStatus", "FirstName", "LastName", "EmailAddress", "SkypeUserName", "ContactPersons", "BankAccountDetails", "TaxNumber", "AccountsReceivableTaxType", "AccountsPayableTaxType", "Addresses", "Phones", "IsSupplier", "IsCustomer", "DefaultCurrency", "UpdatedDateUTC.1", "XeroNetworkKey", "SalesDefaultAccountCode", "PurchasesDefaultAccountCode", "SalesTrackingCategories", "PurchasesTrackingCategories", "PaymentTerms", "ContactGroups", "Website", "BrandingTheme", "BatchPayments", "Discount", "Balances", "HasAttachments.1", "Status", "LineAmountTypes", "LineItems", "TotalDiscount", "CurrencyCode", "CurrencyRate", "InvoiceID", "BrandingThemeID", "Url", "SentToContact", "ExpectedPaymentDate", "PlannedPaymentDate", "HasAttachments", "Payments", "CreditNotes", "Prepayments", "Overpayments"}),
         #"Changed Type" = Table.TransformColumnTypes(#"Removed Columns",{{"Date", type date}, {"DueDate", type datetimezone}, {"UpdatedDateUTC", type datetimezone}, {"FullyPaidOnDate", type datetimezone}}),
         #"Merged Queries" = Table.NestedJoin(#"Changed Type",{"Type"},INV_CR,{"Type"},"INV_CR",JoinKind.LeftOuter),
         #"Expanded INV_CR" = Table.ExpandTableColumn(#"Merged Queries", "INV_CR", {"INV_CR"}, {"INV_CR.1"}),
         #"Removed Columns1" = Table.RemoveColumns(#"Expanded INV_CR",{"Type"}),
         #"Renamed Columns" = Table.RenameColumns(#"Removed Columns1",{{"INV_CR.1", "INV_CR"}})
     in
         #"Renamed Columns"

I added an INV_CR table to convert Type 0 to CR and 1 to INV.
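The lookup-and-join can be sketched as a simple mapping. In the query it is a merge against a two-row table; in Python terms (illustrative data only):

```python
# Map the numeric invoice Type to a label, like the INV_CR lookup table
INV_CR = {0: "CR", 1: "INV"}

rows = [{"Type": 1, "Total": 100.0}, {"Type": 0, "Total": -20.0}]
for row in rows:
    row["INV_CR"] = INV_CR[row.pop("Type")]  # replace Type with its label

print([row["INV_CR"] for row in rows])  # ['INV', 'CR']
```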

This is a work in progress; my aim is to load the data into CosmosDB and get it all into the cloud. Feel free to get in touch if you have questions about this.

That Power Query should work nicely in Excel as well, so there are a few possibilities here.

Power BI, the new desktop multi-tool

I’ve been creating short videos for Power BI and Excel users for around a year now, publishing one a month to my YouTube channel. I’m starting to get a feel for what people are looking for in Power BI, and I am surprised. My most recent video, Power BI - scraping the web, is a real hit! This says to me that not only is Power BI a superb business intelligence tool, it’s also handy for data extraction and manipulation. The other real lesson is that observing how people approach a new app can indicate areas of potential growth.