Power BI Adding Dynamic Hierarchies – XMLA, TOM, C#

This post is a continuation of my adventure into the Tabular Object Model and how we can use it to make Power BI scalable and incorporate it into existing .NET applications. As a quick refresher, the Tabular Object Model can be accessed through the XMLA endpoint in Power BI Premium workspaces. My previous posts have covered code for adding, deleting, and adjusting columns and measures, but this one will address how to manipulate hierarchies.

Power BI hierarchies are a powerful and easy way to enable end users to dig deeper into their visuals and data. While hierarchies can be a useful resource for self-serve analytics, maintaining them can be a pain as new levels get added or removed. Thankfully, if you have Power BI Premium you can use the XMLA endpoint to add code to existing .NET applications that dynamically adds or removes levels from hierarchies as they are created or removed in your application.

Unfortunately, while we can manipulate, add, and delete hierarchies and their levels, visuals that already contain the hierarchy will not automatically pick up any new levels or ordinal changes.

Microsoft TOM Documentation

If you are new to using C# and the Tabular Object Model (TOM), please check out the previous blog post (https://dataonwheels.wordpress.com/2021/10/15/power-bi-meets-programmability-tom-xmla-and-c/) for both an introduction to the topic and detailed instructions on getting the C# portion of this demo stood up. Please reference the DataOnWheels GitHub page for sample PBIX files and C# packages, but note you will need a Power BI Premium workspace with XMLA endpoint write-back enabled in order to run this entire demo.

Power BI Hierarchies

To start out, let’s make sure we understand the components of a hierarchy that we will need to replicate using our TOM script. In the Power BI Desktop app, creating a hierarchy is fairly simple. For example, let’s say I want to have end users drill down from category to subcategory. To do this, I would hover over the category column, click on the three dots next to it, and select “Create hierarchy”.

Next, go to the subcategory column and you’ll notice a new option called “Add to existing hierarchy”. Select our newly created hierarchy (by default, it will be named after the top level of the hierarchy), and it will add subcategory underneath category within the hierarchy. Pretty neat stuff, but also very manual.

From this, we can see that there are a few components to a hierarchy that we will need to address in our TOM script (a small sketch of how these map to TOM objects follows the list):
1. Name
2. Levels
3. Order of levels (Ordinal)
4. Column in each level
5. Name of level
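
Each of these components corresponds to a property on the TOM Hierarchy and Level objects. Here is a minimal, illustrative sketch of that mapping – not part of the demo script itself – assuming a table_product Table object is already in scope (as it will be once we connect to the model below):

// Sketch only: each numbered component above maps to a property below
Hierarchy hierarchy_sketch = new Hierarchy()
{
    Name = "Category Hierarchy"                        // 1. Name
};
Level level_sketch = new Level()
{
    Name = "Category",                                 // 5. Name of level
    Ordinal = 0,                                       // 3. Order of levels (0-based ordinal)
    Column = table_product.Columns.Find("Category")    // 4. Column in each level
};
hierarchy_sketch.Levels.Add(level_sketch);             // 2. Levels
table_product.Hierarchies.Add(hierarchy_sketch);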

Using TOM to See Hierarchies, Levels, Ordinals, and Source Columns

Now that the data model contains a hierarchy, we can publish it up to a Premium-enabled workspace in the Power BI service and see it using our TOM script. I won’t go into detail on building out this script from scratch, so please reference this blog post for a complete walkthrough on connecting to your workspace and building a simple C# application to use with this demo.
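
If you just need a refresher, the connection boilerplate looks roughly like the snippet below. The workspace URL, dataset ID, and table name are placeholders you would swap for your own values; the full working version is in the files on the DataOnWheels GitHub.

using Microsoft.AnalysisServices.Tabular;

// Connect to the Premium workspace through the XMLA endpoint
string workspaceConnection = "powerbi://api.powerbi.com/v1.0/myorg/YOURWORKSPACE";
string connectString = $"DataSource={workspaceConnection};";
Server server = new Server();
server.Connect(connectString);

// Grab the model behind the published dataset and the table that holds the hierarchy
// (use the ID or name listed for your dataset when you enumerate server.Databases)
Model model = server.Databases["YOUR_DATASET_ID"].Model;
Table table_product = model.Tables["Product"];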

To list out the hierarchies in the data model, you will need something like this script in your code (the entire zip file is in the DataOnWheels GitHub for reference):

// List out the hierarchies in the product table
foreach (Hierarchy hierarchy in table_product.Hierarchies)
{
    Console.WriteLine($"Hierarchies: {hierarchy.Name}");
}

And poof there it is! Our Category Hierarchy! Next we will have our script list out the levels within the hierarchy.

// List out the levels in our Category hierarchy
Hierarchy hierarchy_category = table_product.Hierarchies["Category Hierarchy"];
foreach (Level level_categoryhierarchy in hierarchy_category.Levels)
{
    Console.WriteLine($"Category Hierarchy Levels: {level_categoryhierarchy.Name}");
}

Great, and the next piece will be the ordinal, or the order in which the hierarchy levels are placed. I’m going to adjust the previous code so now it will tell us the ordinal/order of each level before it gives us the name. Notice that this starts at 0, not 1.

// List out the levels in our Category hierarchy
Hierarchy hierarchy_category = table_product.Hierarchies["Category Hierarchy"];
foreach (Level level_categoryhierarchy in hierarchy_category.Levels)
{
    Console.WriteLine($"Category Hierarchy Level {level_categoryhierarchy.Ordinal}: {level_categoryhierarchy.Name}");
}

And for our final piece of the puzzle, the column name that this level of the hierarchy comes from.

// List out the levels in our Category hierarchy
Hierarchy hierarchy_category = table_product.Hierarchies["Category Hierarchy"];
foreach (Level level_categoryhierarchy in hierarchy_category.Levels)
{
    Console.WriteLine($"Category Hierarchy Level {level_categoryhierarchy.Ordinal}: {level_categoryhierarchy.Name} from {level_categoryhierarchy.Column.Name}");
}

Editing a Hierarchy Using TOM

Let’s switch it up and begin editing our existing hierarchy by changing the name of the hierarchy, the name of the levels, and the source columns, and by swapping the ordinals. Typically you will not need to do any or all of these things, but it may be useful in rare use cases.

To start, we will rename the hierarchy itself. It will be important to reference the Category Hierarchy by its lineage tag after we rename it; the lineage tag won’t change even after you change the name property of the hierarchy itself. Please note your lineage tag will be different from mine, so first run the script that lists the lineage tag next to the name (below), then replace that value in the rest of the code wherever the lineage tag is referenced.

// List out the hierarchies in the product table
foreach (Hierarchy hierarchy in table_product.Hierarchies)
{
    Console.WriteLine($"Hierarchies: {hierarchy.Name}, Lineage Tag = {hierarchy.LineageTag}");
}
// List out the levels in our category hierarchy
Hierarchy hierarchy_category = table_product.Hierarchies.FindByLineageTag("9aeadacd-d48d-48cb-948f-16700e030fe7");
foreach (Level level_categoryhierarchy in hierarchy_category.Levels)
{
    Console.WriteLine($"Category Hierarchy Level {level_categoryhierarchy.Ordinal}: {level_categoryhierarchy.Name} from {level_categoryhierarchy.Column.Name}");
}

In the Power BI service, we can check if this rename effort was successful by entering edit mode.

Success! Let’s try changing the name of a level next then swap the order around.

//Hierarchies:
//Editing an existing hierarchy originally called Category Hierarchy
{
    hierarchy_category.Name = "Category Hierarchy Rename Test"; //this renames the hierarchy, note the lineage tag will remain unchanged
    Console.WriteLine($"Category Hierarchy Renamed");
}
//Editing an existing hierarchy level
Level level_Category = hierarchy_category.Levels.FindByLineageTag("fe12a6fc-1023-43f9-bfdc-c59f65435323");
Level level_Subcategory = hierarchy_category.Levels.FindByLineageTag("fbb4aa00-35dc-4490-bc40-3190b354ea54");
{
    level_Category.Name = "Category Test";
    level_Subcategory.Name = "Subcategory Test";
    Console.WriteLine($"Category Hierarchy Levels Renamed");
}

Awesome! Okay, now for the final piece of the puzzle – switching the ordinals to make subcategory the top of the hierarchy. Note, you will need to start at level 0. Also, if you are experiencing errors in saving the model, make sure you are out of edit mode in the Power BI service. While it’s helpful to be in that mode to see your changes, it will be impossible to make additional changes via XMLA until you are out of it.


//Hierarchies:
//Editing an existing hierarchy originally called Category Hierarchy
{
    hierarchy_category.Name = "Category Hierarchy Rename Test"; //this renames the hierarchy, note the lineage tag will remain unchanged
    Console.WriteLine($"Category Hierarchy Renamed");
}
//Editing an existing hierarchy level
Level level_Category = hierarchy_category.Levels.FindByLineageTag("fe12a6fc-1023-43f9-bfdc-c59f65435323");
Level level_Subcategory = hierarchy_category.Levels.FindByLineageTag("fbb4aa00-35dc-4490-bc40-3190b354ea54");
{
    level_Category.Name = "Category Test";
    level_Category.Ordinal = 1;
    level_Subcategory.Name = "Subcategory Test";
    level_Subcategory.Ordinal = 0;

    Console.WriteLine($"Category Hierarchy Levels Renamed & Reordered");
}

// List out the levels in our category hierarchy
foreach (Level level_categoryhierarchy in hierarchy_category.Levels)
{
    Console.WriteLine($"Category Hierarchy Level {level_categoryhierarchy.Ordinal}: {level_categoryhierarchy.Name} Lineage Tag: {level_categoryhierarchy.LineageTag} from {level_categoryhierarchy.Column.Name}");
}

Boom! Now we have proven we can reorder the levels as well as rename them and the hierarchy itself.

Adding Hierarchy Levels & Hierarchies via TOM

Now we are finally ready to add a brand new level into our hierarchy! In the sample data, the model column should go below subcategory in my hierarchy. To add a level to the hierarchy we will need a few items – the name of the level, the ordering of the level, and the column it should reference. You can add a lineage tag as well (Power BI will not generate one unless you made the level in the Desktop application). Don’t forget to add the level you’ve created to the hierarchy’s Levels collection, or else it will only exist in memory and never get added to the model.

//Hierarchies:
//Editing an existing hierarchy originally called Category Hierarchy
{
    hierarchy_category.Name = "Category Hierarchy Rename"; //this renames the hierarchy, note the lineage tag will remain unchanged
    Console.WriteLine($"Category Hierarchy Renamed");
}
//Editing an existing hierarchy level
Level level_Category = hierarchy_category.Levels.FindByLineageTag("fe12a6fc-1023-43f9-bfdc-c59f65435323");
Level level_Subcategory = hierarchy_category.Levels.FindByLineageTag("fbb4aa00-35dc-4490-bc40-3190b354ea54");
{
    level_Category.Name = "Category";
    level_Category.Ordinal = 1;
    level_Subcategory.Name = "Subcategory";
    level_Subcategory.Ordinal = 0;

    Console.WriteLine($"Category Hierarchy Levels Renamed & Reordered");
}
//Adding a new level to the hierarchy if it doesn't already exist
if (hierarchy_category.Levels.ContainsName("Model"))
{
    Console.WriteLine($"Hierarchy Level Exists");
}
else
{
    Level level_Model = new Level()
    {
        Name = "Model",
        Ordinal = 2,
        Column = table_product.Columns.Find("Model")
    };
    hierarchy_category.Levels.Add(level_Model);
    Console.WriteLine($"Hierarchy Level Added");
}
Let’s try making our own hierarchy from scratch. To review, we will need a name for our new hierarchy, the name of each level, the order of the levels, and the source column for each level. We will also need to explicitly add the new hierarchy to the model and then add the levels to that hierarchy.

//Add a new hierarchy if it doesn't already exist
if (table_product.Hierarchies.ContainsName("New Hierarchy"))
{
    Console.WriteLine($"New Hierarchy Exists");
}
else
{
    Hierarchy hierarchy_new = new Hierarchy()
    {
        Name = "New Hierarchy",
    };
    table_product.Hierarchies.Add(hierarchy_new);
    Console.WriteLine($"Hierarchy Added");

    Level level_one = new Level()
    {
        Name = "Model",
        Ordinal = 0,
        Column = table_product.Columns.Find("Model")
    };
    Level level_two = new Level()
    {
        Name = "Product",
        Ordinal = 1,
        Column = table_product.Columns.Find("Product")
    };

    hierarchy_new.Levels.Add(level_one);
    hierarchy_new.Levels.Add(level_two);
    Console.WriteLine($"Levels added to new hierarchy");
}

Awesome! Now we know we can programmatically add hierarchies, add levels, rearrange levels, rename levels, and point levels to different columns. This won’t apply to many use cases of Power BI, but for those of you embedding a Power BI solution into your application, this should offer greater flexibility and integration with your existing .NET applications.


Power BI: Creating an Info Button with UNICHAR()

Reports are only as useful as they are easily understandable. When making reports for executives and other business leaders, it is vital to ensure the context and insights of the report are easy to understand at a glance. As cultivators of data, we have a responsibility to make sure it is used and interpreted correctly; otherwise, data can quickly be used to come to incorrect conclusions. When users don’t understand data, they often distrust any data-driven decisions and go back to using instinct to make important business decisions.

While it’s important to give context to reports, the method of doing so may seem cumbersome. Some options include explaining the report to every new user or taking up valuable report space with explanations and context. For the end user experience, it’s often best to replicate something they are already familiar with, like a webpage. Many websites contain options for giving additional details to curious end users, and they typically signal these resources with an “i” or information icon. We can replicate that functionality in Power BI by using UNICHAR characters and a hidden page.

The steps below walk through how to add an information icon to the report, make a tooltip page containing your additional information, and enable the tooltip so users can hover over the icon and see the information.

1. Make the following measure in your measures table:
Info Button = UNICHAR(128712)

2. Create a card visual and add this measure to it. You’ll see an information icon displayed on the card.

3. Create an additional page with a text box containing your additional information.

4. Hide this information page and configure it to be used as a tooltip. You may also want to resize it so that when users view it as a tooltip on the info icon it will cover the entire report. To hide the page, right-click on the page name and select “Hide Page”. To get to the page configurations, select a blank section of the page (this avoids any visual-specific settings) and open the format pane (the paintbrush icon). For resizing, I recommend setting the page size to the Custom type, then adjusting the pixels until it nearly fits your text box. Note: the tooltip will automatically align top left, so it’s best to align your text box to the top left so you don’t lose it as you resize.

5. Add this info page as a tooltip for the card visual containing the info button. To do this, select your card visual, go to the format pane, and turn tooltips on. Once tooltips are on, set the tooltip type to report page, then select the page you made earlier with your additional information.

Now when you hover over the icon, the tooltip will display your additional information. In the example below, I’ll mess with the pixels more on the info page until it’s the appropriate size for the amount of information I want to convey.

And that’s all there is to it! One valid alternative to this is to make a bookmark that would lead end users to the information page. Personally, I prefer this tooltip method so end users do not lose their place within the report and it is small enough to be put on every page of the report.

Feel free to check out the PBIX used for this blog post in the Data On Wheels GitHub Repo.

Power BI Embedded: Stress Testing & Capacity Planning

When deciding if Power BI is the correct reporting platform to meet your business’s needs, price is a large factor. Power BI pricing models are based around SKUs and v-cores, but how do you know how many you will need? Unfortunately, the ratio of capacity to expected usage is not easily determined.

For example, one instance may have a very large data model that takes a lot of memory and CPU time to refresh, 20 users at peak viewing times, hourly refreshes, and queries that are all very simple and allow for query folding. Another business may have six smaller data models, 950 users at peak viewing times, daily refreshes, and very complex queries populating the data models. All of these elements impact the usage at any given time, making predicting overall CPU needs nearly impossible. Thankfully, stress testing your capacity gives us an option that is not purely reactive.

This blog will walk through how to stress test your capacity, the elements of capacity planning, and how to understand the results of the stress test.

Prerequisites

  • Access to the Premium Capacity Utilization and Metrics App/Report (Utilization and Metrics Report Instructions)
  • Access to the reports you wish to test against
  • Decent CPU and memory to run concurrent browser windows
  • An elevated PowerShell console (i.e. “Run as Administrator”)
  • Load Test Tool (instructions below)

Building the Test

  1. Navigate to the GitHub for the Load Test Tool and download the zip file containing the PowerShell script. There are two load testing tools: the default one tests the “worst case scenario” where all of your users log on at the same time and continually mess around with filters so Power BI is forced to ignore its cache.
    The Realistic Load Test Tool operates similarly to the default load test tool, but instead of testing the initial load time, it tests programmable functionality that end users would likely perform, such as changing slicers and filters and navigating through bookmarks, with some “think time” between operations. For this demo, we’ll walk through the standard load test tool to keep it simple. Please refer to the ReadMe file for further instructions on using the realistic load test tool.

    NOTE: This download contains an unsigned PowerShell script. You must first use the Set-ExecutionPolicy Unrestricted command in order to allow running of unsigned scripts. It also requires the “MicrosoftPowerBIMgmt” Power BI PowerShell modules to be installed from here.

2. Unzip the file into a folder on your desktop (or within a VM) and navigate to the Initiate_Load.ps1 file. Right-click on the file and run it with PowerShell.

3. The PowerShell script will walk you through a number of prompts:

  • How many reports do you want to configure?
    • This will determine how many reports are pinged concurrently. Keep in mind, you can ping reports from different workspaces in the same PowerShell run. Type a number then hit enter.
  • Authentication
    • A pop-up will appear, use the login with access to the workspaces you wish to test.
  • Select workspace index from above
    • Type the number next to the workspace where the report you wish to stress test resides then hit enter.
  • Select report index from above
    • Type the number next to the report you wish to stress test then hit enter.
  • Filters require FilterTable, FilterColumn, MinimumValue, MaximumValue in FilterColumn
    • This will be how the script circumvents the caching feature in Power BI service. By providing a table, column, min, and max values, the script is able to pass different filter contexts to the loading reports. I recommend using a date table or fact table that interacts with the entire data model and use a numerical field (minimum and maximum do not make sense for text fields).
    • Make sure there are no spaces between the variables and the commas.
    • Example: I have a table called “Invoice Date”, a column called “MonthsAgo” that I would like to filter on. I would type “Invoice Date,MonthsAgo,0,4”
  • Enter number of instances to initiate for this report
    • This is the number of browser windows that will open and ping the report(s) you have designated. Type in a number then hit enter. I recommend starting with 10 then moving up to make sure you won’t crash your machine with browser windows.
  • Do you want to launch configured reports?[y/n]
    • If you are satisfied with the parameters set for the script, type “y” then enter. If you wish to abort the test run, type “n” then enter.

4. To test how refreshes may impact the user experience (and vice versa), go into the Power BI online service and manually refresh the Power BI datasets that power the reports you are stress testing. While the Load Testing Tool is great for testing interactive operations, testing background operations will need to be done outside this tool. There are REST APIs that can trigger Power BI dataset refreshes (Refresh PBI Dataset REST API), but we won’t cover that in this blog post.

5. If the number of browser windows exceeds your computer’s capabilities, the windows will time out and you will need to refresh the browser pages individually to get them to run again. I recommend only allowing 10 pages to be active at a time. You may notice there is a little counter in the upper left hand corner of the windows. This tells you how many times that window has pinged the Power BI service. To stop each window from pinging, you will need to close each window individually. All the windows will time out an hour after the initial script is run, since the token the API generates expires after one hour.

Tests are now completed! You will need to wait roughly 45 minutes before manually refreshing the dataset attached to the Capacity & Metrics report.

Considerations for Power BI Capacity Planning

Power BI capacity planning and management is no small task. Microsoft recommends setting your capacity size based on the size of your Power BI items (datasets, dataflows, etc.) because this will directly impact the speed of operations within a SKU (planning capacity in advance). This is a good rule of thumb, but unfortunately this method does not account for extremely high levels of interaction against a small number of items.

For example, let’s say you have one certified dataset and eight reports that use that dataset. There are 20-30 people viewing each report because it’s month end and all the analysts, account managers, and executives are prepping their presentations with screenshots from these reports. The strain on this dataset (and your capacity) will be as if there were 160-240 users interacting with the dataset. Now scale this up – imagine there are 100 people looking at each report. The usage against this dataset/item quickly grows, so your (hopefully) optimized certified dataset impacts the CPU much more than a dataset with only one report on it. That is why we must consider both background and interactive operations when estimating optimal SKU/CPU.

Capacity in Power BI is determined by a few variables:

  • Schema and size of data models used (background & interactive operation)
  • The number and complexity of required queries (background operation)
  • The hourly distribution of the usage of your application (interactive operation)
  • Data refresh rates and duration (background operation)

Thankfully, the Gen2 metrics app provides insight into the current state of the capacity based on each of these variables.

Reading the Results

The Premium Capacity Utilization and Metrics report measures your capacity usage for the previous 14 or 28 days, depending on the visual. This report, much like capacity metrics themselves, can be complex and hard to understand. Microsoft has provided some documentation on the contents of this report, but let’s turn our focus to the elements relating to the results of the Load Testing Tool stress tests we ran above.

To start out, make sure you have refreshed the dataset powering the Capacity Metrics report (wait until 45 minutes have passed since running the tests to ensure the test results will be in the report).

Once you enter the report, to narrow the report down to the latest tests, pop open the filter pane and adjust the “Filters on all pages” Date filter to the date of the tests.

If you did multiple rounds of testing, it will be important to know what time you ran each test. The chart in the upper right hand side of the overview page is extremely useful for identifying if any of your tests resulted in a spike of CPU above the CPU limit. For example, I did 3 tests but only one resulted in a spike of 168% CPU. To learn more about this spike, we can right click on the spiked bar and drill through to the time point detail.

This is by far my favorite portion of the report. In this drill through, you are able to see the number of interactive operations that occurred within this 30 second interval on top of all the background operations in the past 24 hours. Let’s walk through what it all means.

The limitations of your current SKU are important to know when estimating the available workload. In our example, the SKU assigned is A1 which allows for 30 seconds of capacity CPU. Keeping that in mind, let’s see how many seconds of CPU our 611 operations resulted in.

50.6 seconds, resulting in 169.98% of the capacity. Wow! Because we used the Load Testing Tool, the user for all these interactions will show as the Power BI User. However, in a production environment, you should be able to tell if there is a specific user crazily clicking and filtering the report within a 30 second window. Thankfully, the user was able to successfully load the report with each filter and experienced no failures from the service, but we should likely scale our capacity if this behavior is expected consistently.

In our use case, the background operations (refreshing the data model) only accounted for 0.53% of the capacity over the past 24 hours. It’s important to keep in mind that while the total CPU seconds is much larger than for the interactive operations, the background operations are calculated over a 24 hour time frame.

If you tested multiple datasets at a time, I recommend sorting by the artifact then holding down shift and cross sorting by the CPU(s) field. That way, you can tell if one dataset is consuming more resources than another (look at the average CPU seconds to determine that).


Using the three dots in the upper right-hand corner of the visual, you can export these results to Excel or CSV to examine further or to stack results from multiple tests and compare.

For the example above, I would recommend increasing the SKU for more CPU to accommodate the large number of interactive users that are expected. I would also recommend looking closely at the data models tested and ensuring query folding is occurring and a good star schema is in place.


Power BI DAX Getting the Value of Previous Non-NULL Row

If you are an avid report designer or user, you may have wanted to see the percent difference between one row and the previous one. Sounds easy, right? Well, let’s break it down. Percent difference follows a consistent formula: (current value – previous value)/previous value. Current value? Easy peasy. Previous value? That’s where this blog comes into play. It’s fairly simple to get the previous month’s value or previous date period’s value because Power BI can be told exactly what the filter should be (September minus one month is always August). However, there may not have been any invoices in August, so how can we compare September with the last month that had an invoice?

Let’s set the scene for this example. A business user has requested to compare each day’s orders with the previous day they had orders. To get the previous day that had orders, we can create a measure called Last Day Order Quantity (see below). Within this measure, you’ll notice we use a variable to calculate the measure Order Quantity. By pulling this into the variable, it will save the row context for Order Quantity so we can make sure that we are only looking at rows that have orders. The next variable (LastDayOrdered) filters the ship date table to grab the last date where 1. there are orders (Order Quantity is not blank) and 2. the current day (aka the row we are currently on) also has orders. The final variable does the work of calculating our order quantity for the last day that contained orders.

Last Day Order Quantity = 
VAR Orders = [Order Quantity] //grab the charge amount for this line
VAR LastDayOrdered = MINX(FILTER(ALL('ShipDate'),'ShipDate'[DaysAgo]>MAX('ShipDate'[DaysAgo]) &&[Order Quantity] <> BLANK() && Orders<>BLANK()),'ShipDate'[DaysAgo])
VAR LastDayOrders = CALCULATE([Order Quantity],FILTER(ALL('ShipDate'),'ShipDate'[DaysAgo]=LastDayOrdered))
RETURN  
LastDayOrders

As you can see in the example, there is a gap in ship dates between 1/25/2014 and 1/28/2014, and the last day order quantity pulls in the amount from 1/25/2014 (1) instead of grabbing it from 1/27/2014 (0). Sweet! Now that makes finding the % difference in order quantity very simple. Below is the full code from this example and also a more parameterized version with tons of comments to help you use it as needed. Happy coding!

Final Measure:

% Difference Order Quantity = 
VAR Orders = [Order Quantity] //grab the charge amount for this line
VAR LastDayOrdered = MINX(FILTER(ALL('ShipDate'),'ShipDate'[DaysAgo]>MAX('ShipDate'[DaysAgo]) &&[Order Quantity] <> BLANK() && Orders<>BLANK()),'ShipDate'[DaysAgo])
VAR LastDayOrders = CALCULATE([Order Quantity],FILTER(ALL('ShipDate'),'ShipDate'[DaysAgo]=LastDayOrdered))
RETURN  
DIVIDE(Orders-LastDayOrders,LastDayOrders,0)

Commented version:

% Difference = 
//Make a variable to grab the value of your measure at your current line (we will use this later)
VAR Desired_Measure = [Order Quantity]

//Now we need to grab the time where this desired measure was not null
//In my example, ship date is the column that I want to see the previous order quantity sliced by
VAR Last_Time_Measure_Had_Value = 
    // I use MINX here to ensure I get the previous day, not the next day. Feel free to flip this as desired.
    MINX(FILTER(ALL('ShipDate') 
        //this makes sure that I'm only grabbing ship dates that are before my current ship date.
        ,'ShipDate'[DaysAgo] > MAX('ShipDate'[DaysAgo])
        //this makes sure that the options for days ago only include ones that we had orders on, AKA the desired measure is not NULL for this day. 
        //DO NOT USE Desired_Measure here because desired_measure will only look at your current row and we are trying to look at all the rows to make sure we are grabbing a non-NULL one. 
        &&[Order Quantity] <> BLANK() 
        //this checks that the current row is not blank. 
        && Desired_Measure<>BLANK())
    //I need this variable to return the smallest number of days ago (hence the MINX at the beginning) that meets the criteria above
    //For your use, you should swap daysago with whatever field you are hoping to slice and dice by
    ,'ShipDate'[DaysAgo])

//This final variable calculates my desired measure (order quantity) and filters my table for the last time that measure had a value.
VAR Last_Instance_Of_Calculated_Measure = CALCULATE([Order Quantity],FILTER(ALL('ShipDate'),'ShipDate'[DaysAgo]=Last_Time_Measure_Had_Value))

//Easy peasy now! We can take our current day's orders minus the number of orders on the last day we had orders, divided by the number of orders on the last day we had orders, to get the % growth day over day. Phew!
RETURN  
DIVIDE(Desired_Measure - Last_Instance_Of_Calculated_Measure,Last_Instance_Of_Calculated_Measure,0)

Power BI Adding Translations to Rename Columns – XMLA, TOM, C#

If you are new to using C# and the Tabular Object Model (TOM), please check out the previous blog post (https://dataonwheels.wordpress.com/2021/10/15/power-bi-meets-programmability-tom-xmla-and-c/) for both an introduction to the topic and detailed instructions on getting the demo stood up.

For the TOM and XMLA experts, imagine this. Your customer wants to dynamically rename columns without using the Power BI Desktop and would prefer all existing report visuals not get broken by the new name. Impossible? Not with TOM, XMLA, and translations within Power BI.

If you’ve ever tried to change a column name in a Power BI source, you’ve likely run into an error on any visuals that contained the renamed column. And when you hit “See details”, it will tell you the column that you simply renamed is no longer available for your visual.

So how do we get around that? Translations. Translations are typically used to translate report entities into other languages, changing depending on what language the end user has set in their browser. However, we can hijack this functionality to rename columns without having to impact the data model. It’s a bit confusing why this works, but imagine this: you build a Lego pyramid, then learn that one of the blocks needs to be changed from blue to green. You have a couple of options: you can take apart the entire pyramid (akin to reopening the PBIX in Power BI Desktop and changing all of your visuals), OR you can take a green marker and color that blue brick green (adding a translation from blue to green).

If you don’t need to put this code into C#, the Tabular Editor is an excellent tool for adding translations to your data model (https://tabulareditor.com/creating-multilingual-power-bi-datasets/). However if you would like to programmatically update column names using C#, feel free to use the script below in your solution.

At a high level, here’s the hierarchy of entities used:
Workspace – Dataset – Data Model – Cultures – Object Translations
Workspace – Dataset – Data Model – Table – Column – Translated Properties

Note: There can only be one translated property per culture.

To add translations, we first need to set which culture this translation belongs to. For this example, we will use “en-US” because that is the default browser language we want these names applied to. The code snippet below will list out all the cultures (aka website language codes) that are configured in this data model and list out all the translated objects (data columns in this case) that already exist.
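
This snippet is excerpted from the full C# script at the end of this post:

// List of translations on the model
foreach (Culture culture in model.Cultures)
{
    Console.WriteLine($"Existing Culture: {culture.Name}");
}

// Let's get a list of the existing translations within the en_US culture
Culture enUsCulture = model.Cultures.Find("en-US");

foreach (ObjectTranslation objectTranslation in enUsCulture.ObjectTranslations)
{
    Console.WriteLine($"Translated Object: {objectTranslation.Value}");
}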

After setting the culture/language, narrow down the column that this translation will be applied to and create a variable for the translation object. The translation object consists of two parts: the metadata object (a column in this example) and the property of that metadata that we want to translate (the caption in this example, which is essentially the display name).
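
Again excerpted from the full script below:

// Narrow down what column within this culture/language you would like to add the translation to
MetadataObject dataColumn = table_product.Columns.Find("Description"); //this needs to always be the original column name within the data model.
ObjectTranslation proposedTranslation = enUsCulture.ObjectTranslations[dataColumn, TranslatedProperty.Caption];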

Once we have these elements, we can check to see if this column already has a translation for this culture. If it does, this script will remove the old translation to allow for overwriting. If it does not, it will add the new translation to the culture within the data model.
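
Here is that check-and-overwrite logic, also excerpted from the full script:

// Only one translation per entity per culture.
if (proposedTranslation != null)
{
    Console.WriteLine($"Translation Exists for this Culture & Column combo");
    enUsCulture.ObjectTranslations.Remove(proposedTranslation); //need to remove the existing translation to overwrite it
    ObjectTranslation overwriteTranslation = new ObjectTranslation()
    {
        Object = dataColumn,
        Property = TranslatedProperty.Caption,
        Value = "Blue"
    };
    enUsCulture.ObjectTranslations.Add(overwriteTranslation);
}
else
{
    ObjectTranslation newTranslation = new ObjectTranslation()
    {
        Object = dataColumn,
        Property = TranslatedProperty.Caption,
        Value = "Blue"
    };
    enUsCulture.ObjectTranslations.Add(newTranslation);
}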

And that’s it!

Here’s what it looks like in the service. Don’t forget to refresh your report page if you have it open for the new name to appear. There’s no need to refresh the dataset.

Full C# code:

using System;
using Microsoft.AnalysisServices.Tabular;



namespace PowerBI_TOM_Testing
{
    class Program
    {
        static void Main()
        {

            // create the connect string - powerbi://api.powerbi.com/v1.0/myorg/WORKSPACE_NAME
            string workspaceConnection = "powerbi://api.powerbi.com/v1.0/myorg/YOURWORKSPACE";
            string connectString = $"DataSource={workspaceConnection};";

            // connect to the Power BI workspace referenced in connect string
            Server server = new Server();
            server.Connect(connectString);
            // enumerate through datasets in workspace to display their names
            foreach (Database database in server.Databases)
            {
                Console.WriteLine($"ID : {database.ID}, Name : {database.Name}, CompatibilityLevel: database.CompatibilityLevel}, Last Updated : {database.LastSchemaUpdate}");
            }
            
            // enumerate through tables in one database (use the database ID from previous step)
            Model model = server.Databases["bb44a298-f82c-4ec3-a510-e9c1a9a28af2"].Model; 
            
            //if you don't specify a database, it will only grab models from the first database in the list
            foreach (Table table in model.Tables)
            {
                Console.WriteLine($"Table : {table.Name} IsHidden? : {table.IsHidden}");

            }
           
            // Specify a single table in the dataset
            Table table_product = model.Tables["Product"];

            
            
            // List out the columns in the product table
            foreach (Column column in table_product.Columns)
            {
                Console.WriteLine($"Columns: {column.Name}");
             }


            //Translations can be used to rename existing columns without rebuilding the model. This also updates any visuals that use that column. 
            // List of translations on the model
            foreach (Culture culture in model.Cultures)
            {
                Console.WriteLine($"Existing Culture: {culture.Name}"); 
            }

            // Let's get a list of the existing translations within the en_US culture
            Culture enUsCulture = model.Cultures.Find("en-US");
            
            foreach (ObjectTranslation objectTranslation in enUsCulture.ObjectTranslations) 
            {
                Console.WriteLine($"Translated Object: {objectTranslation.Value}");
            }
            // Narrow down what column within this culture/language you would like to add the translation to
            MetadataObject dataColumn = table_product.Columns.Find("Description"); //this needs to always be the original column name within the data model.
            ObjectTranslation proposedTranslation = enUsCulture.ObjectTranslations[dataColumn, TranslatedProperty.Caption];

            // Only one translation per entity per culture.
            if (proposedTranslation != null)
            {
                Console.WriteLine($"Translation Exists for this Culture & Column combo");
                enUsCulture.ObjectTranslations.Remove(proposedTranslation); //need to remove the existing translation to overwrite it
                ObjectTranslation overwriteTranslation = new ObjectTranslation()
                {
                    Object = dataColumn,
                    Property = TranslatedProperty.Caption,
                    Value = "Blue"
                };
                enUsCulture.ObjectTranslations.Add(overwriteTranslation);
            }
            else
            {
                ObjectTranslation newTranslation = new ObjectTranslation()
                {
                    Object = dataColumn,
                    Property = TranslatedProperty.Caption,
                    Value = "Blue"
                };
                enUsCulture.ObjectTranslations.Add(newTranslation);
            }

            

            // List out the translations to see what they are now that we have run the script
            foreach (ObjectTranslation objectTranslation in enUsCulture.ObjectTranslations)
            {
                Console.WriteLine($"Final Translated Object: {objectTranslation.Value}");
            }

            model.SaveChanges(); //make sure this is the last line!
       


        }
    }
}

Additional Resources:

https://www.kasperonbi.com/setting-up-translations-for-power-bi-premium/
https://tabulareditor.com/creating-multilingual-power-bi-datasets/
https://www.sqlbi.com/tools/ssas-tabular-translator/
https://docs.microsoft.com/en-us/analysis-services/tabular-models/translations-in-tabular-models-analysis-services?view=asallproducts-allversions
https://docs.microsoft.com/en-us/dotnet/api/microsoft.analysisservices.tabular.culture?view=analysisservices-dotnet
https://docs.microsoft.com/en-us/dotnet/api/microsoft.analysisservices.tabular.culture.objecttranslations?view=analysisservices-dotnet#Microsoft_AnalysisServices_Tabular_Culture_ObjectTranslations