SQL Saturday Atlanta 2023 Recap

This past weekend, I had the pleasure of attending and speaking at SQL Saturday Atlanta! If you’re in the area, I highly recommend connecting with the local user group and getting to know fellow data nerds near you. Thank you to everyone who made it out; it was great to see such a large in-person SQL Saturday event post-pandemic! Big shout out to the volunteers, organizers, and sponsors who made it all happen, and thank you for the time, hard work, and commitment that will go toward rebuilding the data community. This was the first conference where I felt confident enough in my presentation to attend a session in every timeslot! Below are my takeaways from the sessions I attended, but there were many other incredible sessions as well; I recommend checking out the schedule for any that interest you and reaching out to the speakers.

Attended Session Takeaways

  • Practical Use Cases for Composite Models (Kevin Arnold)
    • I had never thought of using composite models to pull in an enterprise model alongside a thin slice specific to the report’s use case. It’s a genius method for maintaining a focused enterprise model while meeting the needs of your end users.
    • Perspectives can be used with personalized visuals instead of hiding columns, so end users are not overwhelmed by column and measure options.
    • Field parameters can also be used/created by end users for a curated experience that meets their business needs without impacting the larger audience of your enterprise model. If you haven’t heard of them (I hadn’t), I highly recommend checking out this link.
  • Planning Steps for a Power BI Report (Belinda Allen)
    • Always ask stakeholders what their experience is with Power BI, it will help put all their questions and assumptions in context.
    • Ask your stakeholder for the scope of success. If they can’t define what success is for the project, you have the wrong person or the wrong client.
    • Show nothing in a needs gathering session; listen and take notes. Like watching the movie before reading the book, showing a report too early severely limits the imagination necessary for an impactful report.
    • Ask who is maintaining the data currently and who will continue to do so.
    • Check out PowerBI.tips Podcast.
    • Ask if they want data access or data analytics. This will let you know if a visual report is a waste of resources for them and/or if a paginated report or something similar better fits their needs.
    • Check out Chris Wagner’s blog; he has a great slide deck for a wireframing session with the success owner after the needs gathering session.
    • Host office hours or something similar to foster ongoing user growth.
    • After the project, always ask whether you achieved the defined success benchmarks. Try to give them a concrete ROI (i.e., x hours saved = x dollars saved based on average salary).
    • Linktr.ee/msbelindaallen
  • Introduction to Azure Synapse Studio Development Tools (Russel Loski)
    • A Synapse workspace allows you to create T-SQL and Python notebooks on top of items in Azure Data Lake Storage, like CSV and Parquet files.
    • Notebooks allow markdown to be side-by-side with code
    • ctrl + space will bring up snippets to use within a notebook
    • There is no indexing since it’s serverless, so prepare for some wait time.
    • You can promote column headers using the HEADER_ROW = TRUE option.
  • DataOps 101 – A Better Way to Develop and Deliver Data Analytics (John Kerski)
    • Check out The DataOps Manifesto and read the 18 DataOps Principles.
    • Principles are repeatable and adaptable to new technologies
    • Make everything reproducible
    • Versioning and automated testing are keys to building sustainable solutions
    • Check out the DataOps Cookbook and pbi-tools
  • Power BI Performance in 6 demos (Patrick LeBlanc & Adam Saxton from Guy in a Cube)
    • To reduce the “other” line item in performance analyzer, limit the number of visual objects on a page.
    • Optimize DAX when the line item is over 120 milliseconds.
    • SLA for page loads is 5 seconds.
    • Using drop-downs in your slicers will delay DAX from running for those visual objects. That deferred execution helps speed up the initial load.
    • Tooltips run DAX behind the scenes on the initial load for visuals (you can see this by copying the DAX query into DAX Studio). To delay this execution until it’s needed, use a tooltip page.
    • If the storage engine in DAX Studio is over 20 milliseconds, there’s opportunity to optimize.
    • Variables limit the number of times a sub-measure will be run and speed up DAX queries behind visuals.
    • Keep in mind while performance tuning, Power BI desktop has 3 caches – visual, report, and analysis services engine. You can clear all caches within the desktop tool except visual. To clear that cache, you need to close and reopen the PBIX file.

My Session

I cannot express enough how grateful I am to everyone who was able to make it to my session! Having so many established professionals in the field approach me afterwards to tell me how well it went was a dream come true. If you’re interested in reviewing the slides and code, please check out my GitHub folder for everything you’ll need to recreate the demo we went through. Missed it? No worries! I’ll be presenting this topic at SQLBits and the Power BI Cruise, so come join me! I’m also open to presenting at various user groups; feel free to reach out to me at kristyna@dataonwheels.com.

Again, thank you so much to everyone who made this weekend possible, and please connect with me on LinkedIn and Twitter! I’d love to stay connected!

Power BI Adding Dynamic Hierarchies – XMLA, TOM, C#

This post is a continuation of my adventure into the Tabular Object Model and how we can use it to make Power BI scalable and incorporate it into existing .NET applications. Quick refresher, the Tabular Object Model can be accessed through the XMLA endpoint in Power BI Premium workspaces. My previous posts have covered code around adding, deleting, and adjusting columns and measures, but this one will address how to manipulate hierarchies.

Power BI hierarchies are a powerful and easy way to enable end users to dig deeper into their visuals and data. While hierarchies can be a useful resource for self-serve analytics, maintaining them can be a pain as new levels get added or removed. Thankfully, if you have Power BI Premium, you can use the XMLA endpoint to add code to existing .NET applications that dynamically adds or removes levels from hierarchies as they are created or removed in your application.

Unfortunately, while we can manipulate, add, and delete hierarchies and their levels, visuals already containing the hierarchy will not be automatically adjusted with any new levels/ordinals.

Microsoft TOM Documentation

If you are new to using C# and the Tabular Object Model (TOM), please check out the previous blog post (https://dataonwheels.wordpress.com/2021/10/15/power-bi-meets-programmability-tom-xmla-and-c/) for both an introduction to the topic and detailed instructions on getting the C# portion of this demo stood up. Please reference the DataOnWheels GitHub page for sample PBIX files and C# packages, but note you will need a Power BI Premium workspace with XMLA endpoint write-back enabled in order to run this entire demo.

Power BI Hierarchies

To start out, let’s make sure we understand the components of a hierarchy that we will need to replicate using our TOM script. In the Power BI Desktop app, creating a hierarchy is fairly simple. For example, let’s say I want to have end users drill down from category to subcategory. To do this, I would hover over the category column, click on the three dots next to it, and select “create hierarchy”.

Next, go to the subcategory column and you’ll notice a new option called “add to existing hierarchy”. Select our newly created hierarchy (default will be named after the top level in the hierarchy), and it will add subcategory underneath category within the hierarchy. Pretty neat stuff but also very manual.

From this, we can see that there are a few components to a hierarchy that we will need to address in our TOM script:
1. Name
2. Levels
3. Order of levels (Ordinal)
4. Column in each level
5. Name of level

Using TOM to See Hierarchies, Levels, Ordinals, and Source Columns

Now that the data model contains a hierarchy, we can publish it up to a Premium-enabled workspace in the Power BI service and see it using our TOM script. I won’t go into detail on building this script from scratch, so please reference this blog post for a complete walkthrough on connecting to your workspace and building a simple C# application to use with this demo.

To list out the hierarchies in the data model, you will need something like this script in your code (the entire zip file is on the DataOnWheels GitHub for reference):

// List out the hierarchies in the product table
foreach (Hierarchy hierarchy in table_product.Hierarchies)
{
    Console.WriteLine($"Hierarchies: {hierarchy.Name}");
}

And poof, there it is! Our Category Hierarchy! Next, we will have our script list out the levels within the hierarchy.

// List out the levels in our Category hierarchy
Hierarchy hierarchy_category = table_product.Hierarchies["Category Hierarchy"];
foreach (Level level_categoryhierarchy in hierarchy_category.Levels)
{
    Console.WriteLine($"Category Hierarchy Levels: {level_categoryhierarchy.Name}");
}

Great, and the next piece is the ordinal, or the order in which the hierarchy levels should be placed. I’m going to adjust the last snippet so it tells us the ordinal/order of each level before it gives us the name. Notice that this starts at 0, not 1.

// List out the levels in our Category hierarchy
Hierarchy hierarchy_category = table_product.Hierarchies["Category Hierarchy"];
foreach (Level level_categoryhierarchy in hierarchy_category.Levels)
{
    Console.WriteLine($"Category Hierarchy Level {level_categoryhierarchy.Ordinal}: {level_categoryhierarchy.Name}");
}

And for our final piece of the puzzle, the column name that this level of the hierarchy comes from.

// List out the levels in our Category hierarchy
Hierarchy hierarchy_category = table_product.Hierarchies["Category Hierarchy"];
foreach (Level level_categoryhierarchy in hierarchy_category.Levels)
{
    Console.WriteLine($"Category Hierarchy Level {level_categoryhierarchy.Ordinal}: {level_categoryhierarchy.Name} from {level_categoryhierarchy.Column.Name}");
}

Editing a Hierarchy Using TOM

Let’s switch it up and begin editing our existing hierarchy by changing the name of the hierarchy, the names of the levels, and the source columns, and by swapping the ordinals. Typically you will not need to do any or all of these things, but they may be useful in rare use cases.

To start, we will rename the hierarchy itself. It will be important to reference the Category Hierarchy by its lineage tag after we rename it; the lineage tag won’t change even after you change the name property of the hierarchy. Please note your lineage tag will be different from mine, so first run the script below that lists the lineage tag next to the name, then replace that portion in the rest of the code wherever the lineage tag is referenced.

// List out the hierarchies in the product table
foreach (Hierarchy hierarchy in table_product.Hierarchies)
{
    Console.WriteLine($"Hierarchies: {hierarchy.Name}, Lineage Tag = {hierarchy.LineageTag}");
}
// List out the levels in our category hierarchy
Hierarchy hierarchy_category = table_product.Hierarchies.FindByLineageTag("9aeadacd-d48d-48cb-948f-16700e030fe7");
foreach (Level level_categoryhierarchy in hierarchy_category.Levels)
{
    Console.WriteLine($"Category Hierarchy Level {level_categoryhierarchy.Ordinal}: {level_categoryhierarchy.Name} from {level_categoryhierarchy.Column.Name}");
}

In the Power BI service, we can check if this rename effort was successful by entering edit mode.

Success! Let’s try changing the name of a level next then swap the order around.

//Hierarchies:
//Editing an existing hierarchy originally called Category Hierarchy
{
    hierarchy_category.Name = "Category Hierarchy Rename Test"; //this renames the hierarchy, note the lineage tag will remain unchanged
    Console.WriteLine($"Category Hierarchy Renamed");
}
//Editing existing hierarchy levels
Level level_Category = hierarchy_category.Levels.FindByLineageTag("fe12a6fc-1023-43f9-bfdc-c59f65435323");
Level level_Subcategory = hierarchy_category.Levels.FindByLineageTag("fbb4aa00-35dc-4490-bc40-3190b354ea54");
{
    level_Category.Name = "Category Test";
    level_Subcategory.Name = "Subcategory Test";
    Console.WriteLine($"Category Hierarchy Levels Renamed");
}

Awesome! Okay, now for the final piece of the puzzle: switching the ordinals to make subcategory the top of the hierarchy. Note, you will need to start at level 0. Also, if you are experiencing errors in saving the model, make sure you are out of edit mode in the Power BI service. While it’s helpful to be in that mode to see your changes, it is impossible to make additional changes via XMLA until you are out of it.


//Hierarchies:
//Editing an existing hierarchy originally called Category Hierarchy
{
    hierarchy_category.Name = "Category Hierarchy Rename Test"; //this renames the hierarchy, note the lineage tag will remain unchanged
    Console.WriteLine($"Category Hierarchy Renamed");
}
//Editing existing hierarchy levels
Level level_Category = hierarchy_category.Levels.FindByLineageTag("fe12a6fc-1023-43f9-bfdc-c59f65435323");
Level level_Subcategory = hierarchy_category.Levels.FindByLineageTag("fbb4aa00-35dc-4490-bc40-3190b354ea54");
{
    level_Category.Name = "Category Test";
    level_Category.Ordinal = 1;
    level_Subcategory.Name = "Subcategory Test";
    level_Subcategory.Ordinal = 0;

    Console.WriteLine($"Category Hierarchy Levels Renamed & Reordered");
}

// List out the levels in our category hierarchy
foreach (Level level_categoryhierarchy in hierarchy_category.Levels)
{
    Console.WriteLine($"Category Hierarchy Level {level_categoryhierarchy.Ordinal}: {level_categoryhierarchy.Name} Lineage Tag: {level_categoryhierarchy.LineageTag} from {level_categoryhierarchy.Column.Name}");
}

Boom, now we have proven we can reorder the levels as well as rename them and the hierarchy itself.

Adding Hierarchy Levels & Hierarchies via TOM

Now we are finally ready to add a brand new level to our hierarchy! In the sample data, the model column should go below subcategory in my hierarchy. To add a level to the hierarchy, we will need a few items: the name of the level, the ordinal of the level, and the column it should reference. You can add a lineage tag as well (Power BI will not add one unless you made the level in the desktop application). Don’t forget to add the level you’ve created to the hierarchy, or else it will stay in cache and never get added.

//Hierarchies:
//Editing an existing hierarchy originally called Category Hierarchy
{
    hierarchy_category.Name = "Category Hierarchy Rename"; //this renames the hierarchy, note the lineage tag will remain unchanged
    Console.WriteLine($"Category Hierarchy Renamed");
}
//Editing existing hierarchy levels
Level level_Category = hierarchy_category.Levels.FindByLineageTag("fe12a6fc-1023-43f9-bfdc-c59f65435323");
Level level_Subcategory = hierarchy_category.Levels.FindByLineageTag("fbb4aa00-35dc-4490-bc40-3190b354ea54");
{
    level_Category.Name = "Category";
    level_Category.Ordinal = 1;
    level_Subcategory.Name = "Subcategory";
    level_Subcategory.Ordinal = 0;

    Console.WriteLine($"Category Hierarchy Levels Renamed & Reordered");
}
//Adding a new level to the hierarchy if it doesn't already exist
if (hierarchy_category.Levels.ContainsName("Model"))
{
    Console.WriteLine($"Hierarchy Level Exists");
}
else
{
    Level level_Model = new Level()
    {
        Name = "Model",
        Ordinal = 2,
        Column = table_product.Columns.Find("Model")
    };
    hierarchy_category.Levels.Add(level_Model);
    Console.WriteLine($"Hierarchy Level Added");
}

Let’s try making our own hierarchy from scratch. To review, we will need a name for our new hierarchy, the names of the levels, the order of the levels, and the column for each level. We will also need to explicitly add the new hierarchy to the table, then add the levels to that hierarchy.

//Add a new hierarchy if it doesn't already exist
if (table_product.Hierarchies.ContainsName("New Hierarchy"))
{
    Console.WriteLine($"New Hierarchy Exists");
}
else
{
    Hierarchy hierarchy_new = new Hierarchy()
    {
        Name = "New Hierarchy",
    };
    table_product.Hierarchies.Add(hierarchy_new);
    Console.WriteLine($"Hierarchy Added");

    Level level_one = new Level()
    {
        Name = "Model",
        Ordinal = 0,
        Column = table_product.Columns.Find("Model")
    };
    Level level_two = new Level()
    {
        Name = "Product",
        Ordinal = 1,
        Column = table_product.Columns.Find("Product")
    };

    hierarchy_new.Levels.Add(level_one);
    hierarchy_new.Levels.Add(level_two);
    Console.WriteLine($"Levels added to new hierarchy");
}

Awesome! Now we know we can programmatically add hierarchies, add levels, rearrange levels, rename levels, and point levels to different columns. This won’t apply to many use cases of Power BI, but for those of you embedding a Power BI solution into your application, this should offer greater flexibility and integration with your existing .NET applications.

Additional Resources:

Power BI Adding Translations to Rename Columns – XMLA, TOM, C#

If you are new to using C# and the Tabular Object Model (TOM), please check out the previous blog post (https://dataonwheels.wordpress.com/2021/10/15/power-bi-meets-programmability-tom-xmla-and-c/) for both an introduction to the topic and detailed instructions on getting the demo stood up.

For the TOM and XMLA experts, imagine this. Your customer wants to dynamically rename columns without using the Power BI Desktop and would prefer all existing report visuals not get broken by the new name. Impossible? Not with TOM, XMLA, and translations within Power BI.

If you’ve ever tried to change a column name in a Power BI source, you’ve likely run into this error on any visuals that contained the renamed column. And when you hit “See details”, it will tell you that the column you simply renamed is no longer available for your visual.

So how do we get around that? Translations. Translations are typically used to translate report entities into other languages, changing depending on what language the end user has set in their browser. However, we can hijack this functionality to rename columns without impacting the data model. It is a bit confusing why this works, so imagine this: you build a Lego pyramid, but learn that one of the blocks needs to be changed from blue to green. You have a couple of options: you can take apart the entire pyramid (akin to reopening the PBIX in Power BI Desktop and changing all of your visuals), or you can take a green marker and color that blue brick green (adding a translation from blue to green).

If you don’t need to put this code into C#, the Tabular Editor is an excellent tool for adding translations to your data model (https://tabulareditor.com/creating-multilingual-power-bi-datasets/). However if you would like to programmatically update column names using C#, feel free to use the script below in your solution.

At a high level, here’s the hierarchy of entities used:
Workspace – Dataset – Data Model – Cultures – Object Translations
Workspace – Dataset – Data Model – Table – Column – Translated Properties

Note: There can only be one translated property per culture.

To add translations, we first need to set which culture the translation belongs to. For this example, we will use “en-US” because that is the default browser language we want these names applied to. The code snippet below will list out all the cultures (aka website language codes) configured in this data model and list out all the translated objects (data columns in this case) that already exist.
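That snippet, excerpted from the full script at the end of this post, looks like this:

```csharp
// List of cultures (translation languages) configured on the model
foreach (Culture culture in model.Cultures)
{
    Console.WriteLine($"Existing Culture: {culture.Name}");
}

// List out the existing translations within the en-US culture
Culture enUsCulture = model.Cultures.Find("en-US");

foreach (ObjectTranslation objectTranslation in enUsCulture.ObjectTranslations)
{
    Console.WriteLine($"Translated Object: {objectTranslation.Value}");
}
```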

After setting the culture/language, narrow down the column that the translation will be applied to and create a variable for the translation object. The translation object consists of two parts: the metadata object (a column in this example) and the property of that metadata we want to translate (the caption in this example, which is essentially the display name).

Once we have these elements, we can check to see if this column already has a translation for this culture. If it does, this script will remove the old translation to allow for overwriting. If it does not, it will add the new translation to the culture within the data model.
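Condensed from the full script below (the full version spells out the if/else branches separately), the check-and-overwrite logic looks like this:

```csharp
// Narrow down which column this translation will be applied to;
// always reference the original column name within the data model
MetadataObject dataColumn = table_product.Columns.Find("Description");
ObjectTranslation proposedTranslation = enUsCulture.ObjectTranslations[dataColumn, TranslatedProperty.Caption];

// Only one translation per object property per culture
if (proposedTranslation != null)
{
    Console.WriteLine($"Translation Exists for this Culture & Column combo");
    enUsCulture.ObjectTranslations.Remove(proposedTranslation); // remove the old translation to allow overwriting
}
enUsCulture.ObjectTranslations.Add(new ObjectTranslation()
{
    Object = dataColumn,
    Property = TranslatedProperty.Caption,
    Value = "Blue"
});
```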

And that’s it!

Here’s what it looks like in the service. Don’t forget to refresh your report page if you have it open for the new name to appear. There’s no need to refresh the dataset.

Full C# code:

using System;
using Microsoft.AnalysisServices.Tabular;



namespace PowerBI_TOM_Testing
{
    class Program
    {
        static void Main()
        {

            // create the connect string - powerbi://api.powerbi.com/v1.0/myorg/WORKSPACE_NAME
            string workspaceConnection = "powerbi://api.powerbi.com/v1.0/myorg/YOURWORKSPACE";
            string connectString = $"DataSource={workspaceConnection};";

            // connect to the Power BI workspace referenced in connect string
            Server server = new Server();
            server.Connect(connectString);
            // enumerate through datasets in workspace to display their names
            foreach (Database database in server.Databases)
            {
                Console.WriteLine($"ID : {database.ID}, Name : {database.Name}, CompatibilityLevel : {database.CompatibilityLevel}, Last Updated : {database.LastSchemaUpdate}");
            }
            
            // enumerate through tables in one database (use the database ID from previous step)
            Model model = server.Databases["bb44a298-f82c-4ec3-a510-e9c1a9a28af2"].Model; 
            
            //if you don't specify a database, it will only grab models from the first database in the list
            foreach (Table table in model.Tables)
            {
                Console.WriteLine($"Table : {table.Name} IsHidden? : {table.IsHidden}");

            }
           
            // Specify a single table in the dataset
            Table table_product = model.Tables["Product"];

            
            
            // List out the columns in the product table
            foreach (Column column in table_product.Columns)
            {
                Console.WriteLine($"Columns: {column.Name}");
             }


            //Translations can be used to rename existing columns without rebuilding the model. This also updates any visuals that use that column. 
            // List of translations on the model
            foreach (Culture culture in model.Cultures)
            {
                Console.WriteLine($"Existing Culture: {culture.Name}"); 
            }

            // Let's get a list of the existing translations within the en-US culture
            Culture enUsCulture = model.Cultures.Find("en-US");
            
            foreach (ObjectTranslation objectTranslation in enUsCulture.ObjectTranslations) 
            {
                Console.WriteLine($"Translated Object: {objectTranslation.Value}");
            }
            // Narrow down what column within this culture/language you would like to add the translation to
            MetadataObject dataColumn = table_product.Columns.Find("Description"); //this needs to always be the original column name within the data model.
            ObjectTranslation proposedTranslation = enUsCulture.ObjectTranslations[dataColumn, TranslatedProperty.Caption];

            // Only one translation per entity per culture.
            if (proposedTranslation != null)
            {
                Console.WriteLine($"Translation Exists for this Culture & Column combo");
                enUsCulture.ObjectTranslations.Remove(proposedTranslation); //need to remove the existing translation to overwrite it
                ObjectTranslation overwriteTranslation = new ObjectTranslation()
                {
                    Object = dataColumn,
                    Property = TranslatedProperty.Caption,
                    Value = "Blue"
                };
                enUsCulture.ObjectTranslations.Add(overwriteTranslation);
            }
            else
            {
                ObjectTranslation newTranslation = new ObjectTranslation()
                {
                    Object = dataColumn,
                    Property = TranslatedProperty.Caption,
                    Value = "Blue"
                };
                enUsCulture.ObjectTranslations.Add(newTranslation);
            }

            

            // List out the translations to see what they are now that we have run the script
            foreach (ObjectTranslation objectTranslation in enUsCulture.ObjectTranslations)
            {
                Console.WriteLine($"Final Translated Object: {objectTranslation.Value}");
            }

            model.SaveChanges(); //make sure this is the last line!
        }
    }
}

Additional Resources:

https://www.kasperonbi.com/setting-up-translations-for-power-bi-premium/
https://tabulareditor.com/creating-multilingual-power-bi-datasets/
https://www.sqlbi.com/tools/ssas-tabular-translator/
https://docs.microsoft.com/en-us/analysis-services/tabular-models/translations-in-tabular-models-analysis-services?view=asallproducts-allversions
https://docs.microsoft.com/en-us/dotnet/api/microsoft.analysisservices.tabular.culture?view=analysisservices-dotnet
https://docs.microsoft.com/en-us/dotnet/api/microsoft.analysisservices.tabular.culture.objecttranslations?view=analysisservices-dotnet#Microsoft_AnalysisServices_Tabular_Culture_ObjectTranslations

Power BI Meets Programmability – TOM, XMLA, and C#

For anyone who read the title of this and immediately thought, “Oh no, I can’t do C#! Since when do I need to be in app dev to do Power BI?!” never fear, I had the same panic when writing it. I recently did another blog post that uses TMSL to accomplish a similar goal (TMSL Blog), but there are some added benefits to using TOM and C# instead of TMSL and SQL.

Recently, a client suggested that they would like to update their Power BI model schema through a pipeline triggered by their application. They allow end users to create custom UDFs (user defined fields) on the fly and also delete them. Normally, Power BI developers would have to open the PBIX file in the Power BI Desktop application and refresh the data model there to pull in the new columns. However, we have another option using the XMLA endpoint, TOM, and C#.

To start, let’s define a couple key terms.

TOM = Tabular Object Model. The TOM can be used inside numerous scripting languages to manipulate the data model. In this case, we are going to use C# so that the code can be called by a larger variety of applications.

TMSL = Tabular Model Scripting Language. TMSL can be used inside SSMS, and is very easy to manipulate, but does not lend itself well to C#-based applications and automation.

Limitations: You cannot export the PBIX file from the service once the XMLA updates have been made. For adding columns to the model, that’s not a big problem since those would be added in once you opened the desktop tool again. The problem comes if you create or edit visuals in the online service that you don’t want to overwrite in future iterations.

Tools needed:

Notes:

  • Ensure you have a data source you can add columns to if you are following the example below
  • Save a copy of your PBIX report so you can make visual edits in the future. Once you edit a data model using the XMLA endpoint, you can no longer export it as a PBIX file from the online Power BI service

Process:

  1. First, create and publish a Power BI Report to the online service. No need to add any visuals, but make sure you have at least one table you have access to edit the columns in to follow along with this demo. You will need a Power BI Pro license and access to publish to a Premium workspace.
  2. Next, add a column to your data source that does not currently exist in your Power BI report. For example, make a column in Excel or SQL called “New Column Test” with the letter “a” filled in for every row. I will make one called “Description” in my example.
  3. Unfortunately, Power BI does not refresh the schema in the service, so it will not pull in the new column unless you open the report in Power BI Desktop, refresh it there, and republish. One way around this is to use the XMLA endpoint on the Premium workspace and add the column to the model using the TOM (Tabular Object Model) in C#. Before we walk through each of those steps, keep in mind that doing this will prevent the Power BI dataset from ever being downloaded as a PBIX file again. So it’s best to keep a local copy of the PBIX file for any visual updates that need to be made, or simply use this dataset as a certified dataset to be used in multiple reports.
  4. Open the premium workspace, select settings, and go to the “Premium” tab to copy the workspace connection.

5. Here comes the scary part, but hey, it’s October, the month for tackling our fears! So here we go; time to make a basic C# application. Open Visual Studio (ensure you have .NET 5.0 and .NET Core installed as well) and navigate to File –> New Project, choose the Console Application template (it should be the top one), pick any name you’d like (e.g., PowerBI_TOM_Testing), select .NET 5.0 for your framework, then hit create. Phewf, you have your app, yay! Under the View tab, go ahead and select Solution Explorer and you should see it pop open on the right side of your screen.

6. Double-click on “Program.cs” to open your project. Now, go under the Tools tab to NuGet Package Manager then to Manage NuGet Packages for Solution. This is where we get to inform our application of the packages of code we want to use.

7. Go to Browse and search for Microsoft.AnalysisServices.NetCore.retail.amd64, and two options should pop up. Go ahead and hit “Install” for each of them. Once you’re done, double-check the install by hopping over to the Installed tab and making sure they are both there (be sure to clear your search first).

8. Go ahead and close this window, go back to the Program.cs tab, and let’s try out a script that uses our XMLA endpoint! Swap out PowerBI_TOM_Testing with whatever you named your project in step 5, and swap out powerbi://api.powerbi.com/v1.0/myorg/POC with the link you copied in step 4. You should see zero errors show up at the bottom. If not, double-check that you have all of the brackets and semicolons.

using System;
using Microsoft.AnalysisServices.Tabular;

namespace PowerBI_TOM_Testing
{
    internal class Program
    {
        static void Main(string[] args)
        {

            // create the connect string
            string workspaceConnection = "powerbi://api.powerbi.com/v1.0/myorg/POC";
            string connectString = $"DataSource={workspaceConnection};";

            // connect to the Power BI workspace referenced in connect string
            Server server = new Server();
            server.Connect(connectString);

            // enumerate through datasets in workspace to display their names
            foreach (Database database in server.Databases)
            {
                Console.WriteLine(database.Name);
            }
        }
    }
}

9. To run it and get back the datasets in your workspace, simply hit the green arrow at the top. A sign-in window will pop open, so sign into your Power BI account and watch it go! Wait for the debugging window to finish running and you should see a list of all the datasets in your workspace.

10. Okay time to add a column into the data model!

For this section, I am going to add some conditional logic so the script knows what to do if the column already exists. Fair warning: there’s also a bit of script that adds a measure for you. You can delete that section of code, or use it as a template for adding measures to your data model. For more example code, please check out the Power BI Development Camp (https://github.com/PowerBiDevCamp/Tabular-Object-Model-Tutorial/blob/main/Demos/Learning-TOM/Learning-TOM/DatasetManager.cs).

Notes are included as comments in the code.

Important note: the SaveChanges() command has to come AFTER the refresh request. If you put the refresh request after SaveChanges(), you will end up with a column that has zero data in it.
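To make that ordering concrete, here is a minimal sketch (reusing the workspace URL, dataset ID, and table name from this post’s example; yours will differ) showing the refresh being queued before SaveChanges():

```csharp
using System;
using Microsoft.AnalysisServices.Tabular;

namespace PowerBI_TOM_Testing
{
    internal class RefreshOrderSketch
    {
        static void Main()
        {
            // connect to the workspace (placeholder URL from this post's example)
            Server server = new Server();
            server.Connect("DataSource=powerbi://api.powerbi.com/v1.0/myorg/POC;");

            // grab the model and table (placeholder IDs/names from this post's example)
            Model model = server.Databases["bb44a290-f82c-4ec3-a510-e9c1a9a28af2"].Model;
            Table table_product = model.Tables["Product"];

            // 1) Queue the refresh FIRST...
            table_product.RequestRefresh(RefreshType.Full);

            // 2) ...then SaveChanges() commits the schema change and the queued
            // refresh together. Reversing these commits the new column with no
            // data in it, because the refresh is never sent to the service.
            model.SaveChanges();
        }
    }
}
```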

Here’s the full script, including the script for adding a measure. Please feel free to utilize the additional resources for more examples and assistance. Paste pieces of the code below into your Visual Studio and enjoy watching your data magically appear in your data model.

using System;
using Microsoft.AnalysisServices.Tabular;


namespace PowerBI_TOM_Testing
{
    internal class Program
    {
        static void Main()
        {

            // create the connect string
            string workspaceConnection = "powerbi://api.powerbi.com/v1.0/myorg/POC";
            string connectString = $"DataSource={workspaceConnection};";

            // connect to the Power BI workspace referenced in connect string
            Server server = new Server();
            server.Connect(connectString);

            // enumerate through datasets in workspace to display their names
            foreach (Database database in server.Databases)
            {
                Console.WriteLine($"ID : {database.ID}, Name : {database.Name}, CompatibilityLevel: {database.CompatibilityLevel}");
            }
            // connect to the model in one dataset (use the database ID from the previous step);
            // if you don't specify a database, it will only grab the model from the first database in the list
            Model model = server.Databases["bb44a290-f82c-4ec3-a510-e9c1a9a28af2"].Model;

            // enumerate through the tables in the model
            foreach (Table table in model.Tables)
            {
                Console.WriteLine($"Table : {table.Name}");
            }
           
            // Specify a single table in the dataset
            Table table_product = model.Tables["Product"];

            

            // List out the columns in the product table
            foreach (Column column in table_product.Columns)
            {
                Console.WriteLine($"Columns: {column.Name}");
             }

            // Adding our column if it doesn't already exist
            if (table_product.Columns.ContainsName("Testing")) //this looks to see if there is a column already named "Testing"
            {
                Console.WriteLine($"Column Exists");
                table_product.Columns.Remove("Testing"); //if the column exists, this will remove it
                Console.WriteLine($"Column Deleted");
                Column column_testing = new DataColumn() //this will add back the deleted column
                {
                    Name = "Testing",
                    DataType = DataType.String,
                    SourceColumn = "Description"
                };
                table_product.Columns.Add(column_testing);
                Console.WriteLine($"Column Created!");
            }
            else
            {
                Column column_testing = new DataColumn() //this will add the column
                {
                    Name = "Testing",  //name your column for Power BI
                    DataType = DataType.String, //set the data type
                    SourceColumn = "Description" //this must match the name of the column in your source
                };
                table_product.Columns.Add(column_testing);
                Console.WriteLine($"Column Created!");
            }




            // List out the columns in the product table one more time to make sure our column is added
            foreach (Column column in table_product.Columns)
            {
                Console.WriteLine($"Columns: {column.Name}");
            }



            // Add a measure if it doesn't already exist in a specified table called product
            if (table_product.Measures.ContainsName("VS Test Measure"))
            {
                Measure measure = table_product.Measures["VS Test Measure"];
                measure.Expression = "\"Hello Again World\""; //you can update an existing measure using this script
                Console.WriteLine($"Measure Exists");
            }
            else
            {
                Measure measure = new Measure() 
                {
                    Name = "VS Test Measure",
                    Expression = "\"Hello World\"" //you can also use DAX here
                };
                table_product.Measures.Add(measure);
                Console.WriteLine($"Measure Added");
            }


 
            table_product.RequestRefresh(RefreshType.Full);
            model.RequestRefresh(RefreshType.Full);
            model.SaveChanges();



        }
    }
}


Additional Resources:

Power BI: Adding Columns to a Published Data Model using the XMLA Endpoint & TMSL

Goal of this demo: Update a Power BI model schema by adding a column to the data model without opening a PBIX file and ensure the scheduled refresh still works.

Why would this be useful? Updating the schema in the desktop tool requires an entire refresh of the data model, which can take a while if your model is large. Also, app developers could systematically add new elements to existing data models using a formulaic TMSL script run through SSMS, saving your report designers time when new fields need to be added.

Limitations: You cannot export the PBIX file from the service once the XMLA updates have been made. For adding columns to the model, that’s not a big problem since those would be added in once you opened the desktop tool again. The problem comes if you create or edit visuals in the online service that you don’t want to overwrite in future iterations.

Tools needed:

  • A data source you can edit (Excel will work, this demo uses a SQL view)
  • Power BI Desktop
  • Power BI Premium Workspace (Premium Per User should also work)
  • Power BI Pro License
  • SSMS (SQL Server Management Studio)

Notes:

  • Ensure you have a data source you can add columns to if you are following the example below
  • Save a copy of your PBIX report so you can make visual edits in the future. Once you edit a data model using the XMLA endpoint, you can no longer export it as a PBIX file from the online PBI service

Process:

  1. First, create and publish a Power BI Report to the online service. No need to add any visuals, but make sure you have at least one table you have access to edit the columns in to follow along with this demo. You will need a Power BI Pro license and access to publish to a Premium workspace.
  2. Next, add a column to your data source that does not currently exist in your Power BI report. For example, make a column in Excel or SQL called “Testing” with the number 1 filled in for every row.
  3. Unfortunately, Power BI does not refresh the schema in the service, so it will not pull in the new column unless you open the report in Power BI Desktop, refresh it there, and republish. One way around this is to connect to the premium workspace through the XMLA endpoint and add the column to the model’s JSON definition using TMSL scripting in SSMS. Before we walk through each of those steps, keep in mind that doing this will prevent that Power BI dataset from ever being downloaded as a PBIX file again. So, it’s best to keep a local copy of the PBIX file for any visual updates that need to be made, or simply use this dataset as a certified dataset shared by multiple reports.
  4. Open the premium workspace, select settings, and go to the “Premium” tab to copy the workspace connection.

5. Open SSMS and select “Analysis Services” for your server type. In the server name, paste in the workspace connection string that you copied in step 4. Authentication will be Azure Active Directory with MFA. The user name and password will be the same email and password you use to access Power BI.

6. Under databases, you’ll see all the datasets present in that workspace. Expand the database with the name of your dataset and expand the tables. Note, there will be tables present that you don’t recognize: for every date column in your data model, Power BI builds a hidden date table behind the scenes that will now be exposed. Navigate to the table you want to add the column to and right-click it.

7. Navigate through by hovering over “Script Table as” then “CREATE OR REPLACE To” then select “New Query Editor Window”. This will open a script to adjust the data model of that table in TMSL (tabular model scripting language).

8. Now here’s the tricky part. It’s easiest if you already have a column in your data model with the same data type as the one you want to add, so you can copy/paste its JSON object from the existing script. My example is for an integer column, but you can do this for any data type. Scroll down in the code until you start to see your column names. In my dataset, I have a column named “Custom1” that is the same type as my “Testing” column. All you have to do is copy and paste the code of your sample column, then swap out any place where it says “Custom1” (aka whatever your sample column name is) with the name of your new column.

(Screenshots in the original post show the sample column’s JSON and the new column’s JSON side by side.)
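As a rough sketch (the column names and data types here are just this post’s example, and the lineage tag value is a placeholder for whatever GUID already exists in your script), the relevant slice of the table’s “columns” array would end up looking something like this, with the new “Testing” object pasted after the existing “Custom1” object:

```json
{
  "columns": [
    {
      "name": "Custom1",
      "dataType": "int64",
      "sourceColumn": "Custom1",
      "lineageTag": "<existing-guid-left-as-is>"
    },
    {
      "name": "Testing",
      "dataType": "int64",
      "sourceColumn": "Testing"
    }
  ]
}
```

Notice the new object carries no lineageTag line.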

9. Delete the line of code that says “lineageTag” from your new section of code. The lineage tag only matters when editing an existing column; Power BI will generate a new lineage tag once this column is officially added to the schema.

10. Hit “Execute” or F5 to push the schema change to the data model in the service. The message pane at the bottom will run through a few items, but the final response should confirm the command completed successfully.

11. The final step is to refresh your data model in the Power BI service on demand or by using the script below. To run this script, you’ll need to select your main table then select “New Query”. You should see your measures and metadata populate as the analysis service cube is exposed. Once you see that, you can copy/paste the code below to refresh the table (this is also how you can refresh one table at a time if needed, hint hint). Execute and your new column will now have data in it, yay!

{
  "refresh": {
    "type": "automatic",
    "objects": [
      {
        "database": "YOUR DATASET NAME HERE",
        "table": "YOUR TABLE NAME HERE"
      }
    ]
  }
}
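If you’d rather force a complete reload instead of letting the engine decide, TMSL accepts other refresh types as well; for example, swapping “automatic” for “full” reprocesses the data and recalculates everything (same placeholders as above):

```json
{
  "refresh": {
    "type": "full",
    "objects": [
      {
        "database": "YOUR DATASET NAME HERE",
        "table": "YOUR TABLE NAME HERE"
      }
    ]
  }
}
```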

12. Test it out! Go into your report in the service, hit the Edit button, and update your report with your new column! But remember, you no longer have the option to download the PBIX file, so any changes that need to be made to the data model (i.e. new measures) need to be done through the XMLA endpoint, and any visual changes must be made in the online service.

Additional Resources: