SQL Bits 2023 Recap

I had an amazing time adventuring to Wales with other members of the data community last week! It was my first international conference, and I had heard so many incredible things about it, but it still blew away all my expectations. I'm so honored to have met so many incredible people throughout the week! Thank you to friends new and old for making it a conference to remember.

Below are the links to GitHub for my session as well as some notes from the sessions I attended. Disclaimer: these notes are not even close to the real thing. Be sure to check out these sessions if you see them being held at your local user group or SQL Saturday in the future! I tend to take very bullet-point-oriented notes to best organize my thoughts; hopefully they spark your interest and guide you to professionals who know these specific technologies within the data realm.

Check out the agenda and feel free to reach out to folks whose topics interest you! Many speakers make their slide decks easily accessible to folks who can't make it to the conference.

Power BI Meets Programmability with Yours Truly

Thank you so much to everyone who made it out to my session!

It was incredible to have 48 people in person and another 21 virtual! It was a humbling experience to see that room fill up and know they were all there to learn something new and exciting to help with their day-to-day work. Can’t wait to look through the feedback and learn where to tweak my presentation to make an even bigger impact in the future!

  • GitHub with slide deck, sample C# code, and sample PBIX file (with the Excel file that powers it): GitHub Link
  • Category of blog posts related to TOM and C#: https://dataonwheels.wordpress.com/category/c-tom-and-power-bi/
  • Fun tip someone showed me after the session:
    If you use /* and */ to comment out chunks of code (like I do frequently during the session), you can close the block with --*/ instead. That way, re-enabling the block only requires commenting out the opening /* , because the --*/ then reads as an ordinary line comment. Pretty neat!
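
    The tip as written uses a SQL-style line comment (--); in C# (the session's language), // plays the same role. A minimal sketch of the trick, with filler Console lines standing in for the real block:

    using System;

    class CommentTrickDemo
    {
        static void Main()
        {
            /*
            Console.WriteLine("Block is off: this never runs.");
            //*/  // while the /* above is open, this whole line sits inside the block comment

            ///*  // comment out just the opener and the block turns on...
            Console.WriteLine("Block is on: this runs.");
            //*/  // ...because this closer is now an ordinary line comment
        }
    }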

Supercharge Power BI with Azure Synapse Analytics with Mathias Halkjaer

Limitations of PBI

  • Source load minimization doesn’t exist in most cases (lots of queries against the same source)
  • Out of sight transformation layer
  • Not the best tool for data warehouse tasks
  • No logs or output from quality checks
  • Doesn’t handle humongous data very well

Enchantments needed for PBI from Synapse

  • Time-travel (historical data & change tracking)
  • Enlarge (massive scale)
  • Counter spell (reverse ETL)
  • Wish (supports multiple programming languages)
  • Divination (AI)
  • Regenerate (CI/CD and source control)

Azure Synapse

  • A toolbox full of tools: serverless SQL, data warehouse, Spark engine, etc.

Data Lake

  • Landing zone for data
  • Cheap
  • Redundant
  • Quick and dirty analytics
  • Performance
  • Most efficient way to load data into data warehouse
  • Easily consumed

Report Design & Psychology (Quick Tips) with Jon Lunn

Dual process theory: intuitive vs attentive

You can mentally process up to 15 items intuitively. Past that, processing gets pushed to the attentive system.

Things can move from system 2 (attentive) to system 1 (intuitive) with repetition (think driving or typing).

Layout: left to right, top to bottom. Move the most important KPI to the top left. Remove distractions (excess colors, 3D visuals, pie charts, etc.). Titles in the upper left make it easier to know what's contained within the visual.

Tip: to visualize counts as dots, use dice patterns to move reading them to a system 1 process.

Ratios (how many; relative) vs percentages (how likely; chance, risk, probability). Both are good options to add context to a number. We usually understand ratios much better, because "how many" is more intuitive than "how likely."

Intuitive systems are more prone to errors and bias. If you want to hide something, go for a percentage.

Use the power of 10s (e.g., "8 out of 10 preferred it"). Round and use a ratio to move it into an intuitive process.

Power BI Datamarts What, How, Why with Marthe Moengen

You can manage RLS roles within a datamart and join tables via a GUI similar to the Power BI Desktop modeling view (in preview only). You can also use a SQL endpoint to connect to your datamart.

Why? Access data through a SQL endpoint and do simple aggregations using a SQL script instead of M. There are potential challenges since this is in preview.
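
As a rough illustration of that SQL endpoint idea, here is a minimal sketch assuming the Microsoft.Data.SqlClient NuGet package; the server, database, and table names are placeholders for the values on your datamart's settings page:

using System;
using Microsoft.Data.SqlClient;

class DatamartEndpointDemo
{
    static void Main()
    {
        // Placeholder connection string: copy the real SQL endpoint from the
        // datamart's settings in the Power BI service.
        string connectionString =
            "Server=your-datamart-endpoint.datamart.pbidedicated.windows.net;" +
            "Database=YourDatamart;" +
            "Authentication=Active Directory Interactive;";

        using SqlConnection connection = new SqlConnection(connectionString);
        connection.Open();

        // A simple aggregation in SQL instead of M (table name is a placeholder).
        using SqlCommand command = new SqlCommand(
            "SELECT COUNT(*) FROM YourTable;", connection);
        Console.WriteLine(command.ExecuteScalar());
    }
}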

It keeps getting easier to move Oracle and Open Source workloads to Azure with David Levy

  • Cloud migration phases:
    • Discover
    • Assess
    • Migrate
    • Optimize
    • Secure & Manage
  • Azure Migrate: central hub of tools for datacenter migration
    • Primary tool for the discover and assess phases
    • Azure Database Migration Service: Gen 2 gives end users much more control over how many resources each migration gets
  • Use Oracle Assessments in Azure Data Studio to get recommendations for SKU size
  • Oracle databases in Oracle Cloud can use OCI Interconnect to pull data back and forth with Azure IaaS
  • A lot of people move to PostgreSQL to pay less and get open-source resources
  • Migration strategy
    • Rehost
    • Refactor
    • Rearchitect
    • Replace
  • There's no better way to run software than SaaS. Why invest the time and resources when the vendor can maintain it for you and let you focus on your business?
  • The Oracle Assessments in Azure Data Studio show all the database details, SKU recommendations, and feature compatibility. Each section has a migration effort attached, which is very nice
  • Azure is a great place for PostgreSQL:
    • High availability
    • Maintained
    • The PostgreSQL migration extension can run an assessment in ADS to gauge migration readiness
    • Integrated Assessment
    • Easy set up and config in ADS
  • To migrate PostgreSQL:
    • pg_dump scripts out the PostgreSQL schema
    • psql allows creating databases and tables
    • Use the PostgreSQL to Azure Database for PostgreSQL Online Migration Wizard
    • You can start cutover with pending changes
  • Azure Database for MySQL
    • Flexible Server is now available; it allows mission-critical apps to use zone redundancy and fine-grained maintenance scheduling
    • Azure Database Migration Service (ADMS)
      • Migrates schema and data
      • Preview = replicate changes
      • You can do this as an online migration
  • Optimizing your environment
    • Costs during & after migration
      • During: use the Azure TCO Calculator, Azure Migrate, Azure Hybrid Benefit & Reserved Instances, and join the Azure Migration & Modernization Program
        • Buy reservations early; you'll lose money otherwise
        • AMMP = Azure Migration & Modernization Program; provides coaching on best practices
    • Make sure you gather good data from trouble times so your migrated instance is ready for the worst-case scenario
    • Execute iteratively

5 Things You Can Do to build better looking reports – James McGillivray

  • Design is subjective
  • No clear end point
  • No one solution
  • Design is often seen as less important
  • Blog: jimbabwe.co.za
  • Tip Techniques
    • Grid Layout
      • Pro: left brain, less layout tweaking happens, looks professional
      • Con: can look boxy
      • Gutters = white space between visuals
      • Margins = white space along edge of report
    • Maximize Real Estate
      • Pro: bigger visuals = more value, fewer visuals = fewer distractions
      • Con: often hard to convince stakeholders; not all questions are answered by default
    • Beautiful Colors
      • Pro: use color to convey meaning, highlight data, and show continuous data; palette types include qualitative, sequential, and divergent
      • Con: many pitfalls, accessibility concerns, can be polarizing
    • Consider the Audience
      • Pro: most important elements first, remove unneeded elements, hierarchy view (summary on top, grouped middle, detail on bottom)

Identifying and Preventing Unauthorized Power BI Gateways with Angela Henry

  • The problem: gateways are available to all and are not only for PBI, which results in chaos
  • How do we find gateways?
    • The "Manage connections & gateways" page lets us identify gateways. You need to be an Azure AD Global admin, Power BI service admin, or gateway admin
  • Reason to restrict: governance

SQL Saturday Atlanta 2023 Recap

This past weekend, I had the pleasure of attending and speaking at SQL Saturday Atlanta! If you're in the area, I highly recommend connecting with the local user group and getting to know fellow data nerds near you. Thank you to everyone who was able to make it out; it was great to see such a large in-person SQL Saturday event post-pandemic! Big shout out to the volunteers, organizers, and sponsors who made it all happen. Thank you all for your time, hard work, and commitment to rebuilding the data community. This was the first conference where I felt confident enough in my presentation to attend a session in every timeslot! Below are some takeaways from the sessions I attended, but there were a lot of other incredible sessions; I recommend checking out the schedule for any that interest you and reaching out to the speakers.

Attended Session Takeaways

  • Practical Use Cases for Composite Models (Kevin Arnold)
    • Never thought of using composite models for pulling in an enterprise model with a thin slice specific to the use case of the report. Genius method for maintaining a focused enterprise model while meeting the needs of your end users.
    • Perspectives can be used with personalized visuals instead of hiding columns, so end users are not overwhelmed by column and measure options.
    • Field parameters can also be used/created by end users for a cultivated experience that meets their business needs without impacting the larger audience of your enterprise model. If you haven't heard of them (I hadn't), I highly recommend checking out this link.
  • Planning Steps for a Power BI Report (Belinda Allen)
    • Always ask stakeholders what their experience is with Power BI; it will help put all their questions and assumptions in context.
    • Ask your stakeholder for the scope of success. If they can’t define what success is for the project, you have the wrong person or the wrong client.
    • Show nothing in a needs-gathering session; listen and take notes. Like watching the movie before reading the book, showing something will severely limit the imagination necessary for an impactful report.
    • Ask who is maintaining the data currently and who will continue to do so.
    • Check out PowerBI.tips Podcast.
    • Ask if they want data access or data analytics. This will let you know if a visual report is a waste of resources for them and/or if a paginated report or something similar better fits their needs.
    • Check out Chris Wagner's blog; he has a great slide deck for a wireframing session with the success owner after the needs-gathering session.
    • Host office hours or something similar to foster ongoing user growth
    • After the project, always ask if you achieved the defined success benchmarks. Try to give them a concrete ROI (i.e., x hours saved = $x saved based on average salary).
    • Linktr.ee/msbelindaallen
  • Introduction to Azure Synapse Studio Development Tools (Russel Loski)
    • Synapse workspaces allow you to create T-SQL and Python notebooks off items in Azure Data Lake Storage like CSV and Parquet files.
    • Notebooks allow markdown to live side by side with code
    • Ctrl + Space will bring up snippets to use within a notebook
    • No indexing since it's serverless, so prepare for some wait time.
    • We can promote column headers using the HEADER_ROW = TRUE option
  • DataOps 101 – A Better Way to Develop and Deliver Data Analytics (John Kerski)
    • Check out The DataOps Manifesto and read the 18 DataOps principles
    • Principles are repeatable and adaptable to new technologies
    • Make everything reproducible
    • Versioning and automated testing are keys to building sustainable solutions
    • Check out the DataOps Cookbook and pbi-tools
  • Power BI Performance in 6 demos (Patrick LeBlanc & Adam Saxton from Guy in a Cube)
    • To reduce the “other” line item in performance analyzer, limit the number of visual objects on a page.
    • Optimize DAX when the line item is over 120 milliseconds.
    • SLA for page loads is 5 seconds.
    • Using drop downs in your slicer will delay DAX from running for that visual object. That deferred execution aids in speeding up the initial load.
    • Tooltips run DAX behind the scenes on the initial load for visuals (you can see this by copying the DAX query into DAX Studio). To delay this execution until it's needed, use a tooltip page.
    • If the storage engine time in DAX Studio is over 20 milliseconds, there's opportunity to optimize.
    • Variables limit the number of times a sub-measure will be run and speed up DAX queries behind visuals.
    • Keep in mind while performance tuning, Power BI desktop has 3 caches – visual, report, and analysis services engine. You can clear all caches within the desktop tool except visual. To clear that cache, you need to close and reopen the PBIX file.

My Session

I cannot express enough how grateful I am for everyone who was able to make it to my session! To have so many established professionals in the field approach me afterwards to tell me how well it went was a dream come true. If you're interested in reviewing the slides and code, please check out my GitHub folder for everything you will need to recreate the demo we went through. Missed it? No worries! I'll be presenting this topic at SQLBits and on the Power BI Cruise, so come join me! I'm also open to presenting at various user groups; feel free to reach out to me at kristyna@dataonwheels.com.

Again, thank you so much to everyone who made this weekend possible, and please connect with me on LinkedIn and Twitter! I'd love to stay connected!

Introducing… Data On Rails!

Data On Rails is a Data On Wheels project designed to give a platform to up-and-coming professionals in the data community. This project is the brainchild of Steve and Kristyna Hughes, with the intent to support members of the data community who are excited to contribute but not sure where to start. Data On Rails is intended to be a launchpad for people's brands and community involvement. Our goal is to see the next generation of data professionals and data community leaders accelerate their industry presence.

Steve and Kristyna are happy to provide a platform similar to the one Kristyna used to launch her community involvement and career. Building on the Data On Wheels brand and leaning on champions within the data community, our goal is to promote and propel the thought leadership of a motivated group of individuals who need a platform to launch from.

Promote and propel thought leadership!

Data On Rails provides a platform for blogs as well as video content that can be shared and leveraged to build one's resume. With the help of champions, we can also promote opportunities for speaking and other content creation throughout the community. Data On Rails promotes and shares content through its channels; however, it is on the content creators to create valuable content and build their brands as they move forward with their careers. We expect many of these individuals to build their presence in the community and launch their own blogs or video channels in the future.

Thank you to those of you who are supporting these content creators as they grow their skills and share what they learn. If you are interested in becoming a content creator, please reach out to Steve or Kristyna Hughes, and we will walk you through what it takes to be set up in Data on Rails.

Power BI Embedded: Using Row-Level Security for Multilingual Reports

Making multilingual reports in Power BI requires a lot of different elements. Translations can be added to PBIX files to translate column names, visual titles, etc., but these translations will not impact the data itself. One way to translate the dimensions themselves (star schema introduction) is to have a translation table that contains a display language and a value column for each ID in the dimension. In the example below, the translation table would replace the original table, and we could apply RLS to the Language column.

Data Model Edits

To make RLS work in this manner, that is, to filter down one table and not the entire data model, a couple of things need to be set up. First, you will need a bridge table for every translated table that has a unique list of the IDs. Second, the RLS filter needs to be added to each of the tables that has a translation. Lastly, the relationships need to be re-established so that the translated tables are only related to their bridge tables.

Add RLS Filters

In order to apply RLS dynamically in an embedded Power BI scenario, the locale needs to be passed through the embedded token as an argument within the UPN (dynamic RLS instructions). Unfortunately, Power BI can only receive one value for each embedded token, so if you need true RLS on the report along with dynamic filtering for the translation, you will need to pass both arguments through the embedded token and separate them later. For example, if the UPN of the user was example@test.com, you can program the embedded token to send the default browser locale sent by that user and the UPN of the signed-in user in this format: UPN|locale.
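
Here is a minimal sketch of generating such a token, assuming the Microsoft.PowerBI.Api NuGet package and an already-authenticated PowerBIClient; the ids and the role name are placeholders for your own values:

using System;
using System.Collections.Generic;
using Microsoft.PowerBI.Api;
using Microsoft.PowerBI.Api.Models;

public static class EmbedTokenHelper
{
    // Builds an embed token whose effective identity carries both the UPN and
    // the user's browser locale in the UPN|locale format described above.
    public static EmbedToken GetTokenWithLocale(
        PowerBIClient client, // assumed already authenticated against the Power BI REST API
        Guid groupId,         // workspace (group) id - placeholder
        Guid reportId,        // report id - placeholder
        string datasetId,     // dataset the RLS roles live on - placeholder
        string upn,           // e.g. "example@test.com"
        string locale)        // e.g. "en-US", taken from the browser
    {
        EffectiveIdentity identity = new EffectiveIdentity(
            username: $"{upn}|{locale}",                 // the UPN|locale format
            datasets: new List<string> { datasetId },
            roles: new List<string> { "Translation" });  // your RLS role name(s)

        GenerateTokenRequest request = new GenerateTokenRequest(
            accessLevel: "View",
            identities: new List<EffectiveIdentity> { identity });

        return client.Reports.GenerateTokenInGroup(groupId, reportId, request);
    }
}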

It looks a bit goofy, but the pipe delimiter will allow us to separate the two RLS arguments via the DAX function PATHITEM(). With PATHITEM(), we can tell the RLS filter to choose either what's before the pipe (the UPN in this case) or what's after (the locale) to pass through for the role filter. The filter for the example above would be:

[Language] = PATHITEM(USERPRINCIPALNAME(),2)
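
For the true RLS role on the same token, a parallel filter can grab what's before the pipe instead; here, [Email] is a stand-in for whatever column holds your users' UPNs:

[Email] = PATHITEM(USERPRINCIPALNAME(),1)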

Once you have applied the Language filter to all of the tables with the Language field, you'll need to make some adjustments to your data model.

Reconnect Relationships

Per the diagram below, you will need to connect the bridge table into your existing data model. Remember, the ONLY relationship your translation table should have is to the bridge table. Every other fact table can join to the bridge table. The joins are as follows: the translation table has ONE bidirectional many-to-one (* to 1) join to the bridge table, and the bridge table joins to each fact table in a single-direction one-to-many (1 to *) relationship, with the bridge table filtering the fact tables.
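
You can set these relationships up in the modeling view, but if you would rather script them, the TOM library from the C# section later in this post can create them too. A rough sketch, where the workspace, database ID, and table and column names are all placeholders for your own model:

using Microsoft.AnalysisServices.Tabular;

// Connect to the workspace the same way as in the C# section below.
Server server = new Server();
server.Connect("DataSource=powerbi://api.powerbi.com/v1.0/myorg/YOUR_WORKSPACE;");
Model model = server.Databases["YOUR_DATABASE_ID"].Model;

Table translationTable = model.Tables["Product Translations"]; // placeholder names
Table bridgeTable = model.Tables["Product Bridge"];
Table factTable = model.Tables["Sales"];

// Translation table: ONE bidirectional many-to-one (* to 1) join to the bridge table.
model.Relationships.Add(new SingleColumnRelationship()
{
    Name = "TranslationToBridge",
    FromColumn = translationTable.Columns["ProductID"],
    FromCardinality = RelationshipEndCardinality.Many,
    ToColumn = bridgeTable.Columns["ProductID"],
    ToCardinality = RelationshipEndCardinality.One,
    CrossFilteringBehavior = CrossFilteringBehavior.BothDirections
});

// Bridge table filters each fact table through a single-direction 1 to * join.
model.Relationships.Add(new SingleColumnRelationship()
{
    Name = "BridgeToSales",
    FromColumn = factTable.Columns["ProductID"],
    FromCardinality = RelationshipEndCardinality.Many,
    ToColumn = bridgeTable.Columns["ProductID"],
    ToCardinality = RelationshipEndCardinality.One,
    CrossFilteringBehavior = CrossFilteringBehavior.OneDirection
});

model.SaveChanges();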

NOTE: do not use CROSSFILTER() in measures involving tables that have RLS applied. Such measures will break in the online Power BI service.

And that’s all there is to it! Hopefully this saves you time when building a multilingual, embedded report. Happy coding!

2022 PASS Data Community Summit Recap: Power BI Meets Programmability – XMLA, TOM, C#

Thank you to everyone who made it out to PASS Data Community Summit! I have greatly enjoyed meeting so many fellow data nerds and have learned so much from all of you. Speaking in person is always a joy, but this crowd was one of the best. Thank you for laughing with me and asking great, engaging questions. If you haven't already, please connect with me on LinkedIn and reach out with any additional questions or fun stories as you begin to scale your Power BI! If you have seen this presentation before, please check out the section where we use actual DAX inside a calculated column and measure instead of just text. It was an adventure to update the code the day of presenting, but well worth it to add a little more flair to the presentation.

As promised, below is the code used in my presentation as well as a link to my GitHub with the presentation and zip file to run the code on your own laptop. Happy coding everyone!

Abstract

Tune in to learn how to programmatically add columns and measures to Power BI data models using TOM, XMLA, and C#! It is rare to find a Power BI developer who has a background in C#, but C# and other programming languages offer a lot of automation and scalability that is lacking in Power BI development. The XMLA endpoint is a powerful tool available in the online Power BI service that allows report developers to connect to their data models and adjust a variety of entities outside the Power BI Desktop application. For example, the XMLA endpoint can be used within a pipeline triggered by an application to update a Power BI model schema. This allows end users to create and delete custom UDFs (user-defined fields) on the fly. Similarly, developers can create and use translations for customer-specific column renames without worrying about breaking visuals and complicated data models.

Links

Final C# Code (now with DAX calculated columns!)

using System;
using Microsoft.AnalysisServices.Tabular;

namespace TOM_Testing_Live
{
    internal class Program
    {
        static void Main(string[] args)
        {
//-------------- Connect to PBI Premium Workspace ---------------- //

            // create the connect string - powerbi://api.powerbi.com/v1.0/myorg/WORKSPACE_NAME
            string workspaceConnection = "powerbi://api.powerbi.com/v1.0/myorg/Happy%20Coding";
            string connectString = $"DataSource={workspaceConnection};";

            // connect to the Power BI workspace referenced in connect string
            Server server = new Server();
            server.Connect(connectString);


//---------------- List out current state before we make changes -------------------//

            // enumerate through datasets in workspace to display their names
            foreach (Database database in server.Databases)
            {
                Console.WriteLine($"ID : {database.ID}, Name : {database.Name}, CompatibilityLevel: {database.CompatibilityLevel}, Last Updated : {database.LastSchemaUpdate}");
            }

            // enumerate through tables in one database (use the database ID from previous step)
            Model model = server.Databases["5d161e6b-697a-49b6-a3e6-7d19b940a8fd"].Model;

            //if you don't specify a database, it will only grab models from the first database in the list
            foreach (Table table in model.Tables)
            {
                Console.WriteLine($"Table : {table.Name} IsHidden? : {table.IsHidden}");
            }

            // Specify a single table in the dataset
            Table table_product = model.Tables["Product"];

            // Specify a single table in the dataset (our measure table)
            Table table_measures = model.Tables["_Measures"];

            // List out the columns in the product table
            foreach (Column column in table_product.Columns)
            {
                Console.WriteLine($"Columns: {column.Name}");
            }

            // List out the measures in the product table
            foreach (Measure productmeasures in table_product.Measures)
            {
                Console.WriteLine($"Measures: {productmeasures.Name}");
            }

            // List of translations on the model
            foreach (Culture culture in model.Cultures)
            {
                Console.WriteLine($"Existing Culture: {culture.Name}");
            }

            // List out the hierarchies in the product table
            foreach (Hierarchy hierarchy in table_product.Hierarchies)
            {
                Console.WriteLine($"Hierarchies: {hierarchy.Name}, Lineage Tag = {hierarchy.LineageTag}");
            }

            // List out the levels in our category hierarchy
            if (table_product.Hierarchies.ContainsName("Category Hierarchy"))
                {
                Hierarchy hierarchy_category_ = table_product.Hierarchies["Category Hierarchy"];
                foreach (Level level_categoryhierarchy in hierarchy_category_.Levels)
                {
                    Console.WriteLine($"Category Hierarchy Level {level_categoryhierarchy.Ordinal}: {level_categoryhierarchy.Name} Lineage Tag: {level_categoryhierarchy.LineageTag} from {level_categoryhierarchy.Column.Name}");
                }
            }
            else
                {
                Console.WriteLine($"No Category Hierarchy");
                }


        
 //----------------Deleting columns and measures ----------------//
   /*         
           // Deleting a column if it exists
             if (table_product.Columns.ContainsName("Testing"))
             //this looks to see if there is a column already named "Testing"
             {
                 Console.WriteLine($"Column Exists");
                 table_product.Columns.Remove("Testing"); //if the column exists, this will remove it
                 Console.WriteLine($"Column Deleted");
             }
             else
             {
                 Console.WriteLine($"Column Does Not Exist");
             } 

             // Deleting a measure if it exists
             if (table_measures.Measures.ContainsName("Test Measure"))
             //this looks to see if there is a measure already named "Test Measure"
             {
                  Console.WriteLine($"Measure Exists");
                  table_measures.Measures.Remove("Test Measure"); //if the measure exists, this will remove it
                  Console.WriteLine($"Measure Deleted");
              }
              else
              {
                  Console.WriteLine($"Measure Does Not Exist");
              }

*/
//--------------- Adding columns and measures --------------------//            

              // Adding our column if it doesn't already exist
                if (table_product.Columns.ContainsName("Testing")) //this looks to see if there is a column already named "Testing"
                {
                    Console.WriteLine($"Column Exists");
                }
                else
                {
                    Column column_testing = new DataColumn() //this will add the column
                    {
                        Name = "Testing",  //name your column for Power BI
                        DataType = DataType.String, //set the data type
                        SourceColumn = "Product", //this must match the name of the column in your source
                        LineageTag = "product-testing-column"
                    };
                    table_product.Columns.Add(column_testing);
                    Console.WriteLine($"Column Created!");
                }

                //List the data sources in the model
                foreach (DataSource datasource in model.DataSources)
                {
                    Console.WriteLine($"Datasource : {datasource.Name}");
                }

                // Add a measure if it doesn't already exist in a specified table 
                if (table_measures.Measures.ContainsName("Test Measure"))
                {
                    Measure measure = table_measures.Measures["Test Measure"];
                    measure.Expression = "\"Hello there\""; //you can update an existing measure using this script
                    Console.WriteLine($"Measure Exists");
                }
                else
                {
                    Measure measure = new Measure()
                    {
                        Name = "Test Measure",
                        Expression = "\"Hello World\"", //you can also use DAX here
                        LineageTag = "test-measure"
                    };
                    table_measures.Measures.Add(measure);
                    Console.WriteLine($"Measure Added");
                }

//--------------- Adding columns and measures with DAX functions! --------------------//            

            // Adding our column if it doesn't already exist
            if (table_product.Columns.ContainsName("List Price w Tax")) //this looks to see if there is a column already named "Testing"
            {
                Console.WriteLine($"Calculated Column Exists");
            }
            else
            {
                Column column_testing = new CalculatedColumn() //this will add the column
                {
                    Name = "List Price w Tax",  //name your column for Power BI
                    DataType = DataType.Decimal, //set the data type
                    Expression = "'Product'[List Price] * 1.05", //this is the DAX used to create your new column!
                    LineageTag = "testing_w_dax-column"
                };
                table_product.Columns.Add(column_testing);
                Console.WriteLine($"Calculated Column Created!");
            }


            // Add a measure if it doesn't already exist 
            if (table_measures.Measures.ContainsName("DAX Measure Example"))
            {
                Measure measure = table_measures.Measures["DAX Measure Example"];
                measure.Expression = "SUM(Sales[Total Product Cost])+100"; //you can update an existing measure using this script
                Console.WriteLine($"DAX Measure Exists");
            }
            else
            {
                Measure measure = new Measure()
                {
                    Name = "DAX Measure Example",
                    Expression = "SUM(Sales[Total Product Cost])", 
                    LineageTag = "test-measure-with-dax"
                };
                table_measures.Measures.Add(measure);
                Console.WriteLine($"DAX Measure Added");
            }




//------------------- Manipulating Hierarchies ---------------------//

                        //Editing an existing hierarchy originally called Category Hierarchy
                        //On the first run, look it up by "Category Hierarchy"; once it has been renamed, subsequent runs must reference the new name
                        Hierarchy hierarchy_category = table_product.Hierarchies["Category Hierarchy Rename"];
                        {
                            hierarchy_category.Name = "Category Hierarchy Rename"; //this renames the hierarchy, note the lineage tag will remain unchanged
                            Console.WriteLine($"Category Hierarchy Renamed");
                         }

                        //Editing an existing hierarchy level 
                          Level level_Category = hierarchy_category.Levels.FindByLineageTag("ca792793-d3c3-4b5d-9bee-2b46c01833bb");
                          Level level_Subcategory = hierarchy_category.Levels.FindByLineageTag("4304e645-bb8f-4d7e-b25c-f629be2110d8");
                          {
                              level_Category.Name = "Category";
                              level_Category.Ordinal = 1;
                              level_Subcategory.Name = "Subcategory";
                              level_Subcategory.Ordinal = 0;
                              Console.WriteLine($"Category Hierarchy Levels Renamed & Reordered");
                          }

                        //Adding a new level to the hierarchy if it doesn't already exist
                          if (hierarchy_category.Levels.ContainsName("Model"))
                          {
                              Console.WriteLine($"Hierarchy Level Exists");
                          }
                          else
                          {
                              Level level_Model = new Level()
                              {
                                  Name = "Model",
                                  Ordinal = 2,
                                  Column = table_product.Columns.Find("Model"),
                                  LineageTag = "hierarchy-model-column"
                              };
                              hierarchy_category.Levels.Add(level_Model);
                              Console.WriteLine($"Hierarchy Level Added");
                          }


                         //Add a new hierarchy if it doesn't already exist
                           if (table_product.Hierarchies.ContainsName("New Hierarchy"))
                           {
                               Console.WriteLine($"New Hierarchy Exists");
                           }
                           else
                           {
                               Hierarchy hierarchy_new = new Hierarchy()
                               {
                                   Name = "New Hierarchy",
                                   LineageTag = "new-hierarchy-lineage-tag"
                               };
                               table_product.Hierarchies.Add(hierarchy_new);
                               Console.WriteLine($"Hierarchy Added");
                         //Creating levels to add to the new hierarchy
                               Level level_one = new Level()
                               {
                                   Name = "Model",
                                   Ordinal = 0,
                                   Column = table_product.Columns.Find("Model"),
                                   LineageTag = "product-model-column"
                               };
                               Level level_two = new Level()
                               {
                                   Name = "Product",
                                   Ordinal = 1,
                                   Column = table_product.Columns.Find("Product"),
                                   LineageTag = "product-product-column"
                               };
                               hierarchy_new.Levels.Add(level_one);
                               hierarchy_new.Levels.Add(level_two);
                               Console.WriteLine($"Levels added to new hierarchy");
                           }

             
            //-------------------------- Translations ------------------------------//

            //Translations can be used to rename existing columns without rebuilding the model. This also updates any visuals that use that column.          
            
                        // List of translations on the model
                        foreach (Culture culture in model.Cultures)
                        {
                            Console.WriteLine($"Existing Culture: {culture.Name}");
                        }
                        // Let's get a list of the existing translations within the en_US culture
                        Culture enUsCulture = model.Cultures.Find("en-US");

                        foreach (ObjectTranslation objectTranslation in enUsCulture.ObjectTranslations)
                        {
                            Console.WriteLine($"Translated Object: {objectTranslation.Value}");
                        }
                        // Narrow down what column within this culture/language you would like to add the translation to
                        MetadataObject dataColumn = table_product.Columns.Find("SKU"); //this needs to always be the original column name within the data model.
                        ObjectTranslation proposedTranslation = enUsCulture.ObjectTranslations[dataColumn, TranslatedProperty.Caption];

                        // Only one translation per entity per culture.
                        if (proposedTranslation != null)
                        {
                           Console.WriteLine($"Translation Exists for this Culture & Column combo");
                           enUsCulture.ObjectTranslations.Remove(proposedTranslation); //need to remove the existing translation to overwrite it
                           ObjectTranslation overwriteTranslation = new ObjectTranslation()
                           {
                               Object = dataColumn,
                               Property = TranslatedProperty.Caption,
                               Value = "Cool Stuff"
                           };
                           enUsCulture.ObjectTranslations.Add(overwriteTranslation);
                        }
                        else
                        {
                           ObjectTranslation newTranslation = new ObjectTranslation()
                           {
                               Object = dataColumn,
                               Property = TranslatedProperty.Caption,
                               Value = "Total Rad"
                           };
                           enUsCulture.ObjectTranslations.Add(newTranslation);
                        }             

                       // List out the translations to see what they are now that we have run the script    
                       foreach (ObjectTranslation objectTranslation in enUsCulture.ObjectTranslations)
                       {
                           Console.WriteLine($"Final Translated Object: {objectTranslation.Value}");
                       }
            
            //------------------- List out end state --------------------------------//

            // List out the columns in the product table one more time to make sure our column is added
            foreach (Column column in table_product.Columns)
            {
                Console.WriteLine($"Columns: {column.Name}");
            }         

            // List out the measures in the table one more time to make sure our measure is added
            foreach (Measure productmeasures in table_measures.Measures)
            {
                Console.WriteLine($"Measures: {productmeasures.Name}");
            }

            // List out the hierarchies in the product table
            foreach (Hierarchy hierarchy in table_product.Hierarchies)
            {
                Console.WriteLine($"Hierarchies: {hierarchy.Name}, Lineage Tag: {hierarchy.LineageTag}");
                foreach (Level level_hierarchy in hierarchy.Levels)
                {
                    Console.WriteLine($"Level {level_hierarchy.Ordinal}: {level_hierarchy.Name}, Lineage Tag: {level_hierarchy.LineageTag} from {level_hierarchy.Column.Name}");
                }
            }

//-------------- Refresh our version of the data model then push/save changes back to the model in the PBI service -------------//
            table_product.RequestRefresh(RefreshType.Full);
            table_measures.RequestRefresh(RefreshType.Full);
            model.RequestRefresh(RefreshType.Full);
            model.SaveChanges(); 
            //make sure this is the last line! Note, this line will not work unless your dataset has proper data source credentials connected in the service

            Console.WriteLine($"Script Complete!");


        }
    }
}