Power BI Adding Translations to Rename Columns – XMLA, TOM, C#

If you are new to using C# and the Tabular Object Model (TOM), please check out the previous blog post (https://dataonwheels.wordpress.com/2021/10/15/power-bi-meets-programmability-tom-xmla-and-c/) for both an introduction to the topic and detailed instructions on getting the demo stood up.

For the TOM and XMLA experts, imagine this: your customer wants to rename columns dynamically, without opening Power BI Desktop, and would prefer that existing report visuals not break under the new names. Impossible? Not with TOM, XMLA, and translations within Power BI.

If you’ve ever tried to change a column name in a Power BI source, you’ve likely run into an error on every visual that contained the renamed column. And when you hit “See Details”, it tells you that the column you simply renamed is no longer available for your visual.

So how do we get around that? Translations. Translations are typically used to translate report entities into other languages, changing depending on what language the end user has set in their browser. However, we can hijack this functionality to rename columns without touching the data model. It may seem odd that this works, so imagine this: you build a Lego pyramid, but learn that one of the blocks needs to change from blue to green. You have a couple of options: you can take apart the entire pyramid (akin to reopening the PBIX in Power BI Desktop and changing all of your visuals), or you can take a green marker and color that blue brick green (adding a translation from blue to green).

If you don’t need to put this code into C#, Tabular Editor is an excellent tool for adding translations to your data model (https://tabulareditor.com/creating-multilingual-power-bi-datasets/). However, if you would like to programmatically update column names using C#, feel free to use the script below in your solution.

At a high level, here’s the hierarchy of entities used:
Workspace – Dataset – Data Model – Cultures – Object Translations
Workspace – Dataset – Data Model – Table – Column – Translated Properties

Note: There can only be one translation per object property per culture.

To add translations, we first need to set which culture the translation belongs to. For this example, we will use “en-US” because that is the default browser language we want these names applied to. The code snippet below lists out all the cultures (aka website language codes) configured in this data model, along with all the translated objects (data columns in this case) that already exist.
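Here is the relevant excerpt from the full script at the end of this post:

// List the cultures (aka website language codes) configured on the model
foreach (Culture culture in model.Cultures)
{
    Console.WriteLine($"Existing Culture: {culture.Name}");
}

// List the existing translations within the en-US culture
Culture enUsCulture = model.Cultures.Find("en-US");
foreach (ObjectTranslation objectTranslation in enUsCulture.ObjectTranslations)
{
    Console.WriteLine($"Translated Object: {objectTranslation.Value}");
}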

After setting the culture/language, narrow down the column that the translation will be applied to and create a variable for the translation object. The translation object consists of two parts: the metadata object (a column in this example) and the property of that metadata we want to translate (the caption in this example, which is essentially the display name).
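The corresponding lines from the full script:

// Narrow down what column within this culture/language you would like to add the translation to
// (this always needs to be the original column name within the data model)
MetadataObject dataColumn = table_product.Columns.Find("Description");
ObjectTranslation proposedTranslation = enUsCulture.ObjectTranslations[dataColumn, TranslatedProperty.Caption];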

Once we have these elements, we can check to see if this column already has a translation for this culture. If it does, this script will remove the old translation to allow for overwriting. If it does not, it will add the new translation to the culture within the data model.
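Condensed slightly from the full script, the core logic looks like this:

// Only one translation per object property per culture
if (proposedTranslation != null)
{
    // Remove the existing translation so it can be overwritten
    enUsCulture.ObjectTranslations.Remove(proposedTranslation);
}
enUsCulture.ObjectTranslations.Add(new ObjectTranslation()
{
    Object = dataColumn,
    Property = TranslatedProperty.Caption,
    Value = "Blue" // the new display name
});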

And that’s it!

Here’s what it looks like in the service. Don’t forget to refresh your report page if you have it open for the new name to appear. There’s no need to refresh the dataset.

Full C# code:

using System;
using Microsoft.AnalysisServices.Tabular;



namespace PowerBI_TOM_Testing
{
    class Program
    {
        static void Main()
        {

            // create the connect string - powerbi://api.powerbi.com/v1.0/myorg/WORKSPACE_NAME
            string workspaceConnection = "powerbi://api.powerbi.com/v1.0/myorg/YOURWORKSPACE";
            string connectString = $"DataSource={workspaceConnection};";

            // connect to the Power BI workspace referenced in connect string
            Server server = new Server();
            server.Connect(connectString);
            // enumerate through datasets in workspace to display their names
            foreach (Database database in server.Databases)
            {
                Console.WriteLine($"ID : {database.ID}, Name : {database.Name}, CompatibilityLevel: database.CompatibilityLevel}, Last Updated : {database.LastSchemaUpdate}");
            }
            
            // enumerate through tables in one database (use the database ID from previous step)
            Model model = server.Databases["bb44a298-f82c-4ec3-a510-e9c1a9a28af2"].Model; 
            
            // if you don't specify a database, it will grab the model from the first database in the list
            foreach (Table table in model.Tables)
            {
                Console.WriteLine($"Table : {table.Name} IsHidden? : {table.IsHidden}");

            }
           
            // Specify a single table in the dataset
            Table table_product = model.Tables["Product"];

            // List out the columns in the product table
            foreach (Column column in table_product.Columns)
            {
                Console.WriteLine($"Columns: {column.Name}");
            }

            //Translations can be used to rename existing columns without rebuilding the model. This also updates any visuals that use that column. 
            // List of translations on the model
            foreach (Culture culture in model.Cultures)
            {
                Console.WriteLine($"Existing Culture: {culture.Name}"); 
            }

            // Let's get a list of the existing translations within the en-US culture
            Culture enUsCulture = model.Cultures.Find("en-US");
            
            foreach (ObjectTranslation objectTranslation in enUsCulture.ObjectTranslations) 
            {
                Console.WriteLine($"Translated Object: {objectTranslation.Value}");
            }
            // Narrow down what column within this culture/language you would like to add the translation to
            MetadataObject dataColumn = table_product.Columns.Find("Description"); //this needs to always be the original column name within the data model.
            ObjectTranslation proposedTranslation = enUsCulture.ObjectTranslations[dataColumn, TranslatedProperty.Caption];

            // Only one translation per entity per culture.
            if (proposedTranslation != null)
            {
                Console.WriteLine($"Translation Exists for this Culture & Column combo");
                enUsCulture.ObjectTranslations.Remove(proposedTranslation); //need to remove the existing translation to overwrite it
                ObjectTranslation overwriteTranslation = new ObjectTranslation()
                {
                    Object = dataColumn,
                    Property = TranslatedProperty.Caption,
                    Value = "Blue"
                };
                enUsCulture.ObjectTranslations.Add(overwriteTranslation);
            }
            else
            {
                ObjectTranslation newTranslation = new ObjectTranslation()
                {
                    Object = dataColumn,
                    Property = TranslatedProperty.Caption,
                    Value = "Blue"
                };
                enUsCulture.ObjectTranslations.Add(newTranslation);
            }

            

            // List out the translations to see what they are now that we have run the script
            foreach (ObjectTranslation objectTranslation in enUsCulture.ObjectTranslations)
            {
                Console.WriteLine($"Final Translated Object: {objectTranslation.Value}");
            }

            model.SaveChanges(); // make sure this is the last line!
       


        }
    }
}

Additional Resources:

https://www.kasperonbi.com/setting-up-translations-for-power-bi-premium/
https://tabulareditor.com/creating-multilingual-power-bi-datasets/
https://www.sqlbi.com/tools/ssas-tabular-translator/
https://docs.microsoft.com/en-us/analysis-services/tabular-models/translations-in-tabular-models-analysis-services?view=asallproducts-allversions
https://docs.microsoft.com/en-us/dotnet/api/microsoft.analysisservices.tabular.culture?view=analysisservices-dotnet
https://docs.microsoft.com/en-us/dotnet/api/microsoft.analysisservices.tabular.culture.objecttranslations?view=analysisservices-dotnet#Microsoft_AnalysisServices_Tabular_Culture_ObjectTranslations

Power BI: Data Model Optimization

Saving the Day from Delay Part 1

Optimizing your data model can be a daunting task. If you read the intro to this series, you know one of the most efficient and sustainable solutions to a bogged-down data model is to remove native date queries and use a date table. This post digs into how and why this speeds up performance, both in refreshes and in the Power BI service, and into the role of date and time keys.

The key to optimization is compression. An efficiently compressed data model is a lean, mean, query-running machine. There are two types of compression: horizontal and vertical. Horizontal compression occurs on a row-by-row basis, while vertical compression occurs column by column. Power BI uses the VertiPaq engine, a vertical compression model, to compress data inside the data model. While vertical compression is more CPU intensive, it is also more efficient, as it finds the best option for compression based on the data type in the column (values/whole numbers are most efficient). Data Mozart does an in-depth look at this process that I highly recommend reviewing if you have more questions (link at the bottom of this post).
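To make the column-by-column idea concrete, here is a toy illustration (mine, not VertiPaq's actual algorithm) of run-length encoding, one of the tricks columnar engines use: a column full of repeated values collapses into a handful of (value, count) pairs.

using System;
using System.Collections.Generic;

class RunLengthDemo
{
    // A toy run-length encoder: each (value, count) pair replaces a run of
    // repeated values. Columnar storage compresses well because a single
    // column tends to contain long runs of the same value.
    static List<(string, int)> Encode(string[] column)
    {
        var runs = new List<(string, int)>();
        foreach (string value in column)
        {
            if (runs.Count > 0 && runs[^1].Item1 == value)
                runs[^1] = (value, runs[^1].Item2 + 1);
            else
                runs.Add((value, 1));
        }
        return runs;
    }

    static void Main()
    {
        // Eight rows of a "Region" column collapse to three runs
        string[] region = { "East", "East", "East", "West", "West", "North", "North", "North" };
        foreach (var (value, count) in Encode(region))
            Console.WriteLine($"{value} x {count}");
    }
}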

Vertical compression is significantly slower on datetime columns than on value columns. DateKeys are your best friend in compressing your data model: they capture vital date information but store it in a value format (the most optimal format, think whole number). Converting all your primary date fields to DateKeys will allow all calculations using those fields to run much faster, as VertiPaq can process the requests more efficiently.
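As a quick illustration of the encoding (shown here in C# for clarity; the next post covers building these keys in Power Query), a DateKey typically stores a date as a whole number in YYYYMMDD form, and a TimeKey stores the time separately as HHMMSS:

using System;

class DateKeyDemo
{
    static void Main()
    {
        DateTime orderDate = new DateTime(2023, 4, 15, 13, 42, 7);

        // Encode the date portion as a whole number (YYYYMMDD).
        // Whole numbers compress far better in VertiPaq than datetime values.
        int dateKey = orderDate.Year * 10000 + orderDate.Month * 100 + orderDate.Day;

        // A TimeKey captures the time portion separately (HHMMSS),
        // keeping the cardinality of each column low.
        int timeKey = orderDate.Hour * 10000 + orderDate.Minute * 100 + orderDate.Second;

        Console.WriteLine($"DateKey: {dateKey}, TimeKey: {timeKey}"); // DateKey: 20230415, TimeKey: 134207
    }
}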

Next post we’ll cover making time and date keys in the Power Query Editor.

External Resources:
https://data-mozart.com/inside-vertipaq-compress-for-success/
https://github.com/AnytsirkGTZ/TimeTable_MCode/blob/main/MQuery%20Time

Power BI Data Security – Power BI Report Server

Power BI Report Server was released as a way to host reports on premises. It was one of the most requested features for Power BI. Power BI Report Server offers a subset of Power BI service capabilities and as such does not have the rich collaborative or security options seen in the service (online). I have a number of other thoughts on this tool, but that is not the purpose of this post. I am often asked in sessions on Power BI Data Security about Power BI Report Server. The problem is that Power BI Report Server is not what Power BI was intended to be, and as such its security is very different.

Power BI Report Server Uses Files and Folder Security Model

Unlike the Power BI service, which leverages the Office 365 security model with workspaces and apps, Power BI Report Server only supports deploying Power BI Desktop files as Power BI reports in SQL Server Reporting Services. Each file is uploaded to the report server and viewed by opening the report.

(Image: the Power BI Report Server portal)

You have essentially three layers of access to the report file security in Power BI Report Server.

  1. The portal itself can be secured. You can and should limit access to the reports by only allowing specific users or groups access to the report portal.
  2. Folders can be used to provide more granular security over a group of assets in the report portal. In the image above, I created a folder called PBI Secure Reports. A specific AD group has access to this folder. If a user does not have permissions to the folder, the folder does not show up in the portal and they cannot access the folder or the assets, including Power BI reports, stored in this folder.
  3. Individual reports can be secured as well. I never recommend this option as it becomes administratively difficult to manage. However, the capability is there if a single asset needs to be secured in this fashion.

These options work for any asset stored in the Report Portal and are not limited to Power BI reports.

Power BI Report Server Report Nuances

If you have read many of my posts around Power BI Data Security, I have discussed gateways, workspaces, and even Office 365 groups. The following items are uniquely related to Power BI reports stored in the report server.

Data Sources

Because it is not possible to use a Reporting Services shared data source with your Power BI reports at this time (not sure if this will change), the data source information needs to be managed per report. This is not dissimilar to the Power BI service; however, for on-premises data we use the gateway with the service. There is no common or shared data access feature in Power BI Report Server. Data sources are included in the report and can be managed in the deployed report by clicking the ellipsis on the icon and selecting Manage.

(Image: managing data sources for a deployed report)

Only the report creator, Content Managers, and Publishers have permissions to see and modify this information.

Protecting Data Sources

One concern raised is whether data sources with credentials are downloaded when the Power BI file is downloaded. The first key idea is that only users with the Publisher and Content Manager roles can download the files.

When a Power BI report file is downloaded, you are prompted for credentials when refreshing data. Passwords are encrypted, so they must be reentered when the file is opened or the data is refreshed in the desktop.

Mapping SSRS Roles to Power BI Functions

The portal, folders, and files are secured using SSRS roles. Here is a high-level summary of those roles and how they impact Power BI reports:

  • Browser – This is similar to a read only function or if you would deliver the report as a Power BI App in the service. This should be used by anyone who needs access to the report, but does not create content.
  • Content Manager – This is for a content creator with admin privileges. This role can manage content and user access.
  • Publisher – This is for content creators who do not need to manage users. Most content creators fall into this category.
  • Report Builder – This role does not apply to Power BI reports, which use Power BI Desktop to develop reports.

Avoid Deploying to the Portal Home Page

I would recommend you not allow Power BI reports to be deployed to the primary portal, but create folders to manage the group of content creators and managers based on the department or group level needs.

Reports deployed to the home page of the portal should be managed as enterprise assets. Use the folders and related security groups to manage who can deploy at this level.

Wrap Up

I would like to extend a special thanks to Kathy Vick, Principal Consultant at Pragmatic Works. She provided guidance on implementing Power BI Report Server. Thanks, Kathy, for the help on this topic. Check out Kathy’s Twitter (@MsKathyV) and LinkedIn account to learn more.

Check out more about Power BI Data Security in my blog series.

PowerPoint–My Dashboard and Report Design Tool

At some point I think that I am becoming a Microsoft Office specialist as opposed to a BI Architect.  All of this work in Excel and now PowerPoint.  Okay, done with the ramblings.  As I have noted in a couple of previous posts, I am working with a team on the Modern Apps Live! conference which is in Vegas next week.  Well, this is another “lesson learned” that I wanted to pass along as a result of doing that work.  (Hope to see you there.)

Using PowerPoint 2013

So I had to create two types of data visualizations for this conference.  Usually, I would use paper or a whiteboard to sketch it out and then proceed to make it a reality.  Somewhere along the way, I heard that Microsoft uses PowerPoint to lay out UIs.  Not sure if it is true or not, but it seemed easier and less expensive than Blend or Visio, so I thought I would give it a try.

So, I first needed to create a summary report for a poll within the app that was created.  I used the standard tools within PowerPoint such as tables, charts, text boxes, and images to mock up my report.  What I liked was that I was able to add notations to the mockup for future reference.


I had some frustration creating the charts as I wanted them to be representative.  But overall not a bad experience.  The next task I was taking on was working with the dashboards I was going to create in Excel 2013.  I still wanted to lay it out so I knew what I would be trying to design.  This was when I stumbled onto the Storyboarding menu.


I actually like using the shapes in this toolset better.  Turns out this is available when you install Visual Studio Ultimate, Visual Studio Premium (my version), or Visual Studio Test Professional.  More on that can be found on MSDN – Storyboard Using PowerPoint.  This can be integrated into TFS and directly associated to work items.  I am not a UX expert, but I like the ability to add tabs like I will have in Excel and there is even a SharePoint page background.


However, as you can see, even if you don’t have Storyboarding you can still effectively build up a PowerPoint slide to look like the report, dashboard, or even SharePoint page.  I was not sure if I would be able to embrace this, but in the end I really like the simplicity and using PowerPoint allows for comments, versioning in SharePoint, and other mechanisms to support dashboard design.

I also wanted to pass along another blog post I found from Jason Zander on the Windows Azure team on the same subject:  My Favorite Features: Creating Storyboards with PowerPoint.  Hopefully this gives you another simple way to mock up reports and dashboards when you can’t find that User Experience Pro.

Oracle Tips for MSBI Devs #3: Choosing Drivers

When working with Oracle, drivers are truly a pain to get working correctly.  I will discuss my preferred choice and why for the following tools – SSIS, SSAS, and SSRS.

SSIS Drivers

Without much question, you should use the Attunity tools for working with Oracle data in the Data Flow task.  In SSIS 2008, the SSIS Connector is free and can be found here:  http://www.microsoft.com/download/en/search.aspx?q=oracle%20connector.  It includes the connection manager, source component, and destination component.  Without a doubt this is the only way to work with Oracle data components in the Data Flow task.  (NOTE: I cannot find the SSIS 2012 equivalent at the moment.  However, Matt Masson’s blog post after PASS Summit 2011 notes more work is being done with Attunity.)  UPDATE: I wrote this prior to a blog post from Matt Masson on support for SSIS 2012 with v2.0 of the Microsoft Connector.  Check out Matt’s update on this: http://blogs.msdn.com/b/mattm/archive/2012/04/04/microsoft-connectors-v2-0-for-oracle-and-teradata-now-available.aspx.

However, this connector does you no good when working with the Execute SQL task.  In SSIS 2008, I use the OLE DB provider from Oracle to create the connection used with Execute SQL Task.  In my work with procedures in my first tip, I used the OLE DB provider with 2012 as well and it worked fine.
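For reference (this sketch is mine, not from the original post), connecting through Oracle's OLE DB provider from .NET looks roughly like the following; the TNS alias and credentials are placeholders, and the same Provider=OraOLEDB.Oracle connection string format is what an OLE DB connection manager uses:

using System;
using System.Data.OleDb;

class OracleOleDbDemo
{
    static void Main()
    {
        // OraOLEDB.Oracle is Oracle's own OLE DB provider; the Data Source
        // is a TNS alias (MyTnsAlias, scott, and tiger are placeholders).
        string connectString =
            "Provider=OraOLEDB.Oracle;Data Source=MyTnsAlias;User Id=scott;Password=tiger;";

        using (OleDbConnection connection = new OleDbConnection(connectString))
        {
            connection.Open();
            // A trivial query to confirm the connection works
            using (OleDbCommand command = new OleDbCommand("SELECT SYSDATE FROM DUAL", connection))
            {
                Console.WriteLine($"Connected. Server time: {command.ExecuteScalar()}");
            }
        }
    }
}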

SSAS Drivers

When working with SSAS 2008 and, until I know differently, SSAS 2012, I would recommend using the Oracle OLE DB driver from Oracle.  It is not the fastest driver I have seen: third-party drivers are marginally faster, and Oracle’s .NET driver is faster as well.  However, the .NET driver has an unchangeable active query timeout of one hour; if any processing run exceeds that, it will unceremoniously drop the connection.  For these reasons, I have stuck with the OLE DB provider from Oracle, which is not necessarily the fastest, but it has provided consistent results for the right price and has been the least painful to work with.

SSRS Drivers

It is with SSRS that I have seen mixed results, primarily because of the better performance of the Oracle .NET driver.  If you can guarantee that your reports will return their data in under an hour, this seems to be the best option.  However, if you want to manage a single driver set across all tools, you may find that managing the OLE DB driver as the only driver makes sense in your organization.

Test, Test, Test

I have given you my experience using the drivers above.  However, you may find value in purchasing a third party driver or you may find a different experience when you implement in your environment.  Be sure to test and understand the implications in maintenance and system cost when choosing different drivers across your solutions.