Power BI DAX – Getting the Value of the Previous Non-NULL Row

If you are an avid report designer or user, you may have wanted to see the percent difference between one row and the previous one. Sounds easy, right? Well, let’s break it down. Percent difference follows a consistent formula: (current value – previous value) / previous value. Current value? Easy peasy. Previous value? That’s where this blog comes into play. It’s fairly simple to get the previous month or previous date period value because you can tell Power BI exactly what the filter should be (September minus one month is always August). However, there may not have been any invoices in August, so how can we compare September with the last month that had an invoice?

Let’s set the scene for this example. A business user has requested to compare each day’s orders with the previous day that had orders. To get the previous day with orders, we can create a measure called Last Day Order Quantity (see below). Within this measure, you’ll notice we capture the measure Order Quantity in a variable. Pulling it into a variable saves its value for the current row context, so we can make sure we are only comparing rows that have orders. The next variable (LastDayOrdered) filters the ship date table to grab the last date where 1) there are orders (Order Quantity is not blank) and 2) the current day (aka the row we are currently on) also has orders. The final variable does the work of calculating our order quantity for the last day that contained orders.

Last Day Order Quantity = 
VAR Orders = [Order Quantity] //grab the order quantity for the current row
VAR LastDayOrdered = MINX(FILTER(ALL('ShipDate'), 'ShipDate'[DaysAgo] > MAX('ShipDate'[DaysAgo]) && [Order Quantity] <> BLANK() && Orders <> BLANK()), 'ShipDate'[DaysAgo])
VAR LastDayOrders = CALCULATE([Order Quantity], FILTER(ALL('ShipDate'), 'ShipDate'[DaysAgo] = LastDayOrdered))
RETURN
LastDayOrders

As you can see in the screenshot above, there is a gap in ship dates between 1/25/2014 and 1/28/2014, and Last Day Order Quantity pulls in the amount from 1/25/2014 (1) instead of grabbing it from 1/27/2014 (0). Sweet! That makes finding the % difference in order quantity very simple. Below is the full code from this example, along with a more parameterized version containing plenty of comments to help you adapt it as needed. Happy coding!

Final Measure:

% Difference Order Quantity = 
VAR Orders = [Order Quantity] //grab the order quantity for the current row
VAR LastDayOrdered = MINX(FILTER(ALL('ShipDate'), 'ShipDate'[DaysAgo] > MAX('ShipDate'[DaysAgo]) && [Order Quantity] <> BLANK() && Orders <> BLANK()), 'ShipDate'[DaysAgo])
VAR LastDayOrders = CALCULATE([Order Quantity], FILTER(ALL('ShipDate'), 'ShipDate'[DaysAgo] = LastDayOrdered))
RETURN
DIVIDE(Orders - LastDayOrders, LastDayOrders, 0)

Commented version:

% Difference = 
//Make a variable to grab the value of your measure for the current row (we will use this later)
VAR Desired_Measure = [Order Quantity]

//Now we need to grab the time where this desired measure was not null
//In my example, ship date is the column that I want to see the previous order quantity sliced by
VAR Last_Time_Measure_Had_Value = 
    // I use MINX here to ensure I get the previous day, not the next day. Feel free to flip this as desired.
    MINX(FILTER(ALL('ShipDate') 
        //this makes sure that I'm only grabbing ship dates that are before my current ship date.
        ,'ShipDate'[DaysAgo] > MAX('ShipDate'[DaysAgo])
        //this makes sure that the options for days ago only include ones that we had orders on, AKA the desired measure is not NULL for this day. 
        //DO NOT USE Desired_Measure here because desired_measure will only look at your current row and we are trying to look at all the rows to make sure we are grabbing a non-NULL one. 
        && [Order Quantity] <> BLANK() 
        //this checks that the current row is not blank. 
        && Desired_Measure <> BLANK())
    //I need this variable to return the smallest number of days ago (hence the MINX at the beginning) that meets the criteria above
    //For your use, you should swap DaysAgo with whatever field you are hoping to slice and dice by
    ,'ShipDate'[DaysAgo])

//This final variable calculates my desired measure (order quantity) and filters my table for the last time that measure had a value.
VAR Last_Instance_Of_Calculated_Measure = CALCULATE([Order Quantity],FILTER(ALL('ShipDate'),'ShipDate'[DaysAgo]=Last_Time_Measure_Had_Value))

//Easy peasy now! We take our current day's orders minus the order count from the last day that had orders, divided by that same last-day order count, to get the % growth day over day. Phew!
RETURN  
DIVIDE(Desired_Measure - Last_Instance_Of_Calculated_Measure,Last_Instance_Of_Calculated_Measure,0)

I Wrote a Book – Hands-On SQL Server 2019 Analysis Services

While this is not the first time I have authored, it is the first book I wrote as the sole author. Analysis Services is the product I built my business intelligence career on, so I was happy to take on the project when Packt approached me.

One of my favorite questions is how much research time I put in for this book. The real answer is almost 20 years. I started working with Analysis Services when it was called OLAP Services, and that was a long time ago. Until Power Pivot for Excel and tabular model technology were added to the mix, I worked in the multidimensional model. I was one of the few, or so it seems, who enjoyed working in the multidimensional database world, including working with MDX (multidimensional expressions). However, I was very aware that tabular models with the VertiPaq engine were the model of the future. Analysis Services has continued to be a significant part of the BI landscape, and this book gives you the opportunity to try it out for yourself.

This book is designed for those who have been involved in business intelligence work but have mostly used self-service or end-user tools. Now you are ready to take your model to the next level, and that is where Analysis Services comes into play. As part of Packt’s Hands-On series, I focused on getting going with Analysis Services from install to reporting. Microsoft has developer editions of the software that let you do a complete walk-through of everything in the book in a step-by-step fashion. You will start the process by getting the tools installed, downloading sample data, and building out a multidimensional model. Once you have that model built out, we build a similar model using tabular model technology. We follow that up by building reports and visualizations in both Excel and Power BI. No journey is complete without working through security and administration basics. If you want to learn by doing, this is the book for you.

If you are interested in getting the book, you can order it from Amazon or Packt. From November 20, 2020 through December 20, 2020, you can get a 25% discount using this code – 25STEVEN – or by using this link directly.

I want to thank the technical editors who worked with me to make sure the content and the steps worked as expected – Alan Faulkner, Dan English, and Manikandan Kurup. Their attention to detail raised the quality of the book significantly and was greatly appreciated.

I also have to thank Tazeen Shaikh, who was a great content editor to work with. When she joined the project, my confidence in the quality of the final product increased as well. She helped me sort out some of the formatting nuances and coordinated the needed changes to the book. Her work on the book was greatly appreciated. Finally, many thanks to Kirti Pisat, who kept me on track in spite of COVID impacts throughout the writing of the book this year.

I hope you enjoy the book!

Welcome to 3Cloud …

A little over a month ago, Pragmatic Works Consulting became part of 3Cloud, the final step in a merger that brought 3Cloud, Applied Cloud Services, and Pragmatic Works Consulting together over a period of a few months. Let’s look at the journey.

June 30, 2020: 3Cloud Receives Growth Equity from Gryphon Investors
July 30, 2020: 3Cloud Acquires Applied Cloud Services
September 9, 2020: 3Cloud Acquires Pragmatic Works Consulting

From June through September, 3Cloud went from about 70 people to 170. We are now the largest Azure “pure-play” consulting company in the United States. And this is just the beginning…

So, what are my thoughts on this?

I am truly excited about the opportunity to grow a consulting company that is focused on Azure. Sure, there is a lot of change, but change is not bad. As we bring our three companies together to become one, there are challenges and successes. All of us have already benefited from the merging of skills and teams to create a more complete solution team for our customers.

I look forward to seeing how we evolve over the next few months and years. Exciting times are ahead!

Customer Impact

While working for a data-centric company like Pragmatic Works was great, the shift to cloud technologies and Azure data services required us to expand our capabilities beyond data and SQL Server. This merger allows us to immediately add value for our customers by adding application development and infrastructure capabilities to our toolbox. Beyond that, 3Cloud and ACS bring a mature managed services offering, including the ability to host and manage customer resources in Azure (CSP).

I think our customers get a significant boost in services as we become a more complete Azure company.

Some Final Thoughts

I will miss working directly with Brian Knight and Tim Moolic, two of the founding partners at Pragmatic Works. Their vision helped shape a great organization over 12 years. In case you did not realize it, Pragmatic Works will continue on as a training organization. You can still expect excellent technical training from the team there. We all will continue to learn and grow with their support.

If you are interested in joining our team or learning more about what 3Cloud offers, reach out to me at shughes@3cloudsolutions.com. I look forward to seeing you on a webinar or working with you in the future.

Azure SQL Database Elasticity – Presentation Notes

This blog covers the content and points to the code used to create the demos in my Azure SQL Database Elasticity presentations. As of today, I have presented this at the Minnesota SQL Server User Group (PASSMN) in September 2020 and as a webinar for 3Cloud in October 2020.

Elastic Queries

Elastic queries allow developers to interact with data from multiple databases on the Azure SQL Database platform, including Synapse. Elastic queries are often compared to PolyBase, which is currently implemented in SQL Server 2019 and Azure Synapse. The key difference is that elastic queries only allow you to interact with other Azure SQL Databases, not Hadoop or other database implementations (e.g. Teradata or Oracle). Part of the confusion comes from the fact that the implementations look very similar: both toolsets use external tables in SQL Server to interact with the connected data sources. However, PolyBase requires additional components to run, whereas elastic queries are ready to go without additional setup.
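To make the external table pattern concrete, here is a minimal sketch of an elastic query setup for vertical partitioning. This is not the demo code; the server, credential, table, and column names are placeholders I made up for illustration.

-- Run in the database that will query the remote data.
-- All names below are illustrative placeholders.
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password>';

CREATE DATABASE SCOPED CREDENTIAL ElasticQueryCred
WITH IDENTITY = '<remote sql login>', SECRET = '<remote sql password>';

-- TYPE = RDBMS points at another Azure SQL Database, which is what
-- distinguishes elastic queries from the PolyBase data source types.
CREATE EXTERNAL DATA SOURCE RemoteDataSrc
WITH (
    TYPE = RDBMS,
    LOCATION = '<servername>.database.windows.net',
    DATABASE_NAME = '<remote database name>',
    CREDENTIAL = ElasticQueryCred
);

-- The external table definition must match the schema of the remote table.
CREATE EXTERNAL TABLE dbo.CustomerInformation
(
    CustomerID INT NOT NULL,
    CustomerName VARCHAR(50) NOT NULL,
    Company VARCHAR(50) NOT NULL
)
WITH (DATA_SOURCE = RemoteDataSrc);

Once the external table exists, you query it like any local table and the engine retrieves the rows from the remote database.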

Be aware that elastic queries are still in preview. Also, elastic queries are included in the cost of Azure SQL Database in the Standard and Premium tiers.

Elastic Query Strategies

Elastic queries support three key concepts, and these will influence how you implement the feature.

  1. Vertical partitioning. This concept uses complete tables in separate databases. It could be a shared date table or the dimensions in a data warehouse solution. Vertical partitioning is a method to scale out data solutions and one way to use Azure SQL Database for larger data workloads.
  2. Horizontal partitioning or sharding. Whereas vertical partitioning keeps tables whole, horizontal partitioning shards, or spreads, the data from a single table across multiple Azure SQL Databases. This is the most complex type of partitioning because it requires a shard map, which is typically implemented with .NET or Java applications.
  3. Data virtualization. This concept is a mix of the partitioning solutions used to achieve the goal of virtualizing the data. The idea with data virtualization is that we can use a single Azure SQL Database to interact with data from multiple databases. While the concept is limited by the fact that only Azure SQL Databases can participate, look for more improvements as the product matures.

Elastic Query Demo

The demo used in the presentations is configured as shown here:

Three S1 Azure SQL Databases on the same Azure SQL Server. I used ADF (Azure Data Factory) to move Fact.Purchase to WideWorldDW_2 and the three related dimensions (dimDate, dimStockItem, dimSupplier) to WideWorldDW_3. I then used WideWorldDW_3 to implement the external tables to work with the data. WideWorldImportersDW-Standard was the original restore of the sample database; it is the source of the data but is not used in the demos.
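To show how the demo pieces fit together, here is a rough sketch of a query run in WideWorldDW_3 that joins the external Fact.Purchase table (whose data lives in WideWorldDW_2) to the local dimension tables. The schema and column names are assumptions for illustration; the linked demo code is the authoritative version.

-- Run in WideWorldDW_3: Fact.Purchase is an external table pointing at WideWorldDW_2,
-- while dimDate and dimSupplier are local tables. Column names are illustrative.
SELECT
    d.[Calendar Year],
    s.[Supplier],
    SUM(p.[Ordered Quantity]) AS [Total Ordered Quantity]
FROM [Fact].[Purchase] AS p
    INNER JOIN dbo.dimDate AS d
        ON p.[Date Key] = d.[Date]
    INNER JOIN dbo.dimSupplier AS s
        ON p.[Supplier Key] = s.[Supplier Key]
GROUP BY d.[Calendar Year], s.[Supplier]
ORDER BY d.[Calendar Year], s.[Supplier];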

One note on the demo: I did not include the ADF jobs. Use the Copy activity to move the tables to the target databases. You can find more information here.

The demo code to set up the environment can be found here.

Elastic Jobs

Elastic jobs are the alternative to SQL Server Agent jobs in Azure SQL Database. While Agent is included in Azure SQL Managed Instance, the rest of the platform needed an option to create jobs, and elastic jobs solve that issue. This feature is currently in preview as well and is included with Azure SQL Database; the only additional cost is the dedicated job database required to support elastic jobs.

The best comparison is still with SQL Server Agent. Elastic jobs are structured as jobs, which have job steps, and the only limitation at the moment is that job steps must be T-SQL. Jobs can be created in the Azure portal, with PowerShell, with REST, or with T-SQL.
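To give a feel for the T-SQL option, here is a minimal sketch of creating an elastic job with the stored procedures in the job database. The target group, server, credential, job name, and command are placeholders for illustration, not the demo from the presentation.

-- Run in the elastic job database. All names are illustrative placeholders.
EXEC jobs.sp_add_target_group @target_group_name = 'DemoTargets';

EXEC jobs.sp_add_target_group_member
    @target_group_name = 'DemoTargets',
    @target_type = 'SqlDatabase',
    @server_name = '<servername>.database.windows.net',
    @database_name = '<target database>';

EXEC jobs.sp_add_job
    @job_name = 'SampleMaintenanceJob',
    @description = 'Illustrative job definition';

-- Job steps must be T-SQL.
EXEC jobs.sp_add_jobstep
    @job_name = 'SampleMaintenanceJob',
    @command = N'UPDATE STATISTICS dbo.SomeTable;',  -- placeholder command
    @credential_name = 'JobRunCredential',
    @target_group_name = 'DemoTargets';

-- Kick the job off on demand; schedules can be added with jobs.sp_update_job.
EXEC jobs.sp_start_job @job_name = 'SampleMaintenanceJob';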

Elastic Transactions

One of the key pieces originally missing from the Azure SQL Database rollout was the cross-database transactions that SQL Server supports through MSDTC. Elastic transactions add this functionality to Azure SQL Database and are built into the platform. The functionality is application driven and currently supported in the latest .NET libraries. While there is no hard limit, Microsoft currently recommends using this for distributed transactions across 100 or fewer databases due to potential performance issues.

There are a few limitations to be aware of:

  • Only supports Azure SQL Databases
  • Only supports .NET transactions
  • Does not support T-SQL Distributed transactions
  • Does not support WCF transactions

Wrap Up

Microsoft continues to improve the functionality in Azure SQL Database. These elastic features are part of that process. While I typically do not have many uses for distributed transactions, we have actively implemented elastic queries and elastic jobs for customers and look to use them more in the future.

Hopefully you too will be able to use the elastic functionality as you continue to embrace the Azure data platform.