All posts by Steve

Starting and Stopping SQL Server with PowerShell

Have you ever had this issue? You are trying to explore new features of SQL Server, or you want to install the latest version for some testing, but SQL Server is a resource hog and kills the performance of your PC. I have been evaluating SQL Server 2019, including both flavors of Analysis Services. That means I am installing the Developer edition of SQL Server on my laptop (a Yoga 900) running Windows 10.

I have dealt with this issue in the past by opening the Services window, setting the Startup Type to Manual, and then turning the services on when I wanted to work with them. In my current use case, I did not want to manage that with a lot of clicks. So, my thought was that this should be easy with PowerShell. Of course, nothing is easy when it is not something you do every day. So I started digging.

I found it is definitely possible, but you need to know the steps that matter, including making sure that PowerShell will execute scripts on your PC. So here are the steps and the code.

Empowering Your PC to Run Your PowerShell

The first issue is that PowerShell will not execute scripts on your PC until you allow it. You will need to open PowerShell as an Administrator; you can leave the window open once we are done here. The following command allows scripts created on your PC to be executed on your PC.

Set-ExecutionPolicy RemoteSigned

Shutting Down SQL Server with PowerShell

I chose to create two PowerShell files that allow me to execute these steps for all the services I am managing. In my examples, I have three instances of SQL Server – Database Engine Services, Analysis Services – Multidimensional Mode, and Analysis Services – Tabular Mode.

The code required to shut down SQL Server is noted below.

Set-Service 'MSSQL$DOWSQL2019' -StartupType Disabled   # keep the service from coming back on reboot
Stop-Service -Name 'MSSQL$DOWSQL2019' -Force           # stop it now, along with dependent services

Let’s break it down. The first line sets the StartupType to Disabled, which prevents the service from restarting on a reboot – you will need to be intentional about restarting it. The second line is the command to stop the service. The -Force flag will also shut down dependent services, such as SQL Agent, if you have them running.

You will need to know the service name. SQL Server Database Engine services are typically named “MSSQL$” followed by the instance name. If you are using Analysis Services, the naming uses “MSOLAP$” as the prefix.
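If you are not sure of the exact service names on your machine, you can list the likely candidates first. This is a small sketch using the standard Get-Service cmdlet; the wildcard patterns assume the default naming described above, and the StartType column requires Windows PowerShell 5.0 or later.

Get-Service -Name 'MSSQL$*', 'MSOLAP$*' | Select-Object Name, Status, StartType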

You can run these commands directly in your PowerShell window; I did this while testing them. I wrote them in Visual Studio Code but had issues executing them there. You may be fine – I just wanted to share my experience. Once I had my three instances scripted, I saved them in a file called SQLServerOff.ps1.
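As an illustration, here is a minimal sketch of what such a file could contain. Only MSSQL$DOWSQL2019 appears in the examples above; the two MSOLAP$ names are placeholders you would replace with your own Analysis Services instance names.

# SQLServerOff.ps1 – disable and stop each SQL Server service in the list.
# The MSOLAP$ names are placeholders – substitute your own instance names.
$services = 'MSSQL$DOWSQL2019', 'MSOLAP$DOWMULTI', 'MSOLAP$DOWTABULAR'

foreach ($service in $services) {
    Set-Service -Name $service -StartupType Disabled
    Stop-Service -Name $service -Force
}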

Turning Them on with PowerShell

The process and the code are similar. You first need to enable the services and then start them up.

Set-Service 'MSSQL$DOWSQL2019' -StartupType Manual   # allow the service to be started again
Start-Service -Name 'MSSQL$DOWSQL2019'               # start it now

I chose the Startup Type “Manual” so I still need to turn the services on myself. You could set it to Automatic if you want them to start on a reboot, for instance. I then saved these commands in a file called SQLServerOn.ps1.
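The matching startup file would follow the same pattern as the shutdown sketch above, again with placeholder Analysis Services names.

# SQLServerOn.ps1 – re-enable and start each SQL Server service in the list.
$services = 'MSSQL$DOWSQL2019', 'MSOLAP$DOWMULTI', 'MSOLAP$DOWTABULAR'

foreach ($service in $services) {
    Set-Service -Name $service -StartupType Manual
    Start-Service -Name $service
}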

Running the PS1 Files

There are a couple of options for executing your file. While testing, I used the PowerShell window I had open. To execute your file, replace “YOUR PATH” with the full path to the script.

PS C:\WINDOWS\system32> & "YOUR PATH\SQLServerOn.ps1"

While that is a good pattern, I was still looking for a “one-click” solution. Some more digging turned up a way to execute a PowerShell file from a shortcut. I did this by going to the desktop, right-clicking an empty space, and creating a new, blank shortcut. Here is the command you can use for the shortcut:

powershell.exe -noexit -File "YOUR PATH\SQLServerOn.ps1"

Add the snippet above, with the proper path for your code, as the shortcut location. Once you have named your shortcut and saved it, open its properties, go to the advanced settings, and set it to run as Administrator (this will not work without that setting turned on). You can now pin the shortcut somewhere convenient, such as your taskbar.

While I focused on SQL Server services, this approach will work for other services as well – see the sketch below. Enjoy!
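If you end up managing several sets of services, one way to generalize the two files is a single parameterized script. This is only a sketch – the script name and parameters are hypothetical, not something from the steps above.

# ToggleServices.ps1 (hypothetical name) – turn any list of services on or off.
param(
    [Parameter(Mandatory)][ValidateSet('On', 'Off')][string]$Mode,
    [Parameter(Mandatory)][string[]]$ServiceNames
)

foreach ($name in $ServiceNames) {
    if ($Mode -eq 'On') {
        Set-Service -Name $name -StartupType Manual
        Start-Service -Name $name
    }
    else {
        Set-Service -Name $name -StartupType Disabled
        Stop-Service -Name $name -Force
    }
}

You would call it from an elevated PowerShell window like this: & ".\ToggleServices.ps1" -Mode Off -ServiceNames 'MSSQL$DOWSQL2019'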

Power BI, Dataflows, CDM, and Dynamics 365 – My Journey

Journey into Dataflows, Dynamics, and CDM

My company, Pragmatic Works, is working through a Dynamics 365 migration. As part of that migration, we need reports. We have made the decision, seemingly obvious given who we are, to use Power BI for all of our internal reporting. Through the process, I have been working with our migration team a lot and have been tasked to handle key reporting for the consulting team. We are implementing both CRM and PSA (Project Service Automation) as part of the rollout. I am responsible for reporting that supports the non-Sales operations for the consulting organization. This series of posts will follow my journey to get a good solution in place. I will give you the resources I used, the advice from the pros on my team, and anything else I can share to help your journey along as well.

I want to caveat that this is my journey through the process. I am sure some mistakes will be made along the way, but we should all learn together. I am not doing this in a vacuum; I have some very talented Power BI team members helping me with strategy and design. Their contributions will be reflected throughout the process, but I want to give them credit up front as well.

Evaluating Power BI Dataflows vs Power BI Shared Datasets

I started the process by trying to determine the best option for building data models that can be used for ongoing report creation within our company. I was familiar with shared datasets, and with the latest improvements in the service, they seemed like a good place to start. However, I had also been talking with the team about Power BI Dataflows (and Azure Data Factory Dataflows, but those are not relevant here). I put it out to the group above to discuss the pros and cons. Overall, the team pointed out that Dataflows would be the best fit for Dynamics; however, there is not much out there on Dataflows in action.

Brian Knight and I were having a different conversation about the Common Data Model (CDM) and PowerApps for other projects. During that conversation, Brian also mentioned that the CDM was ideal for working with Dynamics data.

Coming full circle, the team agreed that CDM with Dynamics and Dataflows is a good way to go. And so starts the journey. I will be documenting the research and progress along the way here. I will be posting whenever I can. Here we go!

Power Platform Conference – 7/23-24/2019

Pragmatic Works is hosting a 2-day Power Platform online conference. If you are unable to get out to an onsite conference but need some help with Microsoft Flow, PowerApps, or Power BI, this is a great opportunity to learn from experts from around the world.

Information About the Conference

This is a 2-day virtual conference covering Power BI, PowerApps and Flow. The cost is $49 for all sessions, recordings and bonus materials. There will be 12 sessions each day.

Dates: July 23 & 24, 2019

Times: 7 a.m. to 7 p.m. ET

Website: https://powerplatformconference.com/

I hope to see many of you there virtually!

It’s Been a While …

Last Post – SQL Saturday MN 2018

When last I wrote, it was after I spoke at SQL Saturday in Minnesota, where I co-presented a pre-con with Josh Owen. Prior to that was a few weeks of blogs on Azure and Azure Data Week. As Pragmatic Works became an Azure-focused company, I, too, was learning much about Azure and what it takes to move workloads to the cloud.

The New Role – Director of Consulting

In October 2018, I was promoted to Director of Consulting at Pragmatic Works. Over the past six months or so, I have had the chance to lead and grow the consulting team at Pragmatic Works. This has been a great growth opportunity for me and my career. I am not perfect by any stretch, but the team is awesome! Pragmatic Works Consulting continues to grow and do great work.

The Beat Goes On …

As I look forward to the next few months, there are some great things happening at home. I get to see my youngest daughter graduate from high school, and my second child and oldest son will graduate from Liberty University next month and then marry his college sweetheart. It will be a busy summer and fall at my house, but it will be great as well. I love watching the family grow up and move to the next stage of life.

As Pragmatic Works continues to grow this year, I look forward to seeing new team members join our team and existing team members expand their skills and capabilities. In the last two years, Pragmatic Works has transformed itself from a SQL Server solution company focused on Microsoft Business Intelligence into an Azure solutions company specializing in bringing customers to Azure and Azure Data Services. It is an exciting time in technology, and we are a part of that excitement.

Here’s to a great year!

SQL Saturday #796 – Minnesota, 2018

First, many thanks to the SQL Saturday and MNPass team for putting on another great event and letting me participate.

I co-presented with Joshuha Owen (@JoshSQL) in a precon on Modern Enterprise Data Warehousing on Azure. Thanks to those who attended and participated in the conversations around changing the way we implement data warehouse capabilities in Azure. Josh and I will be talking more about this in the future.

Now, for those of you who attended my Saturday presentation on Consumption Based Architecture, I wanted to get you the slide deck and reference materials here. Thanks again for attending.

The slide deck from the session is here.

I also have a blog post on this topic: Consumption Based Architecture for Modern Data Analytics. Feel free to join the conversation there as well.

ERPs and the Consumption Based Architecture Conversation

One of the key topics that came up during the session was handling ERPs with minimal change. The key issue surrounding ERP solutions is the data structure in those systems. Whether you work with SAP, JackHenry, or Dynamics, the data model is very complex and definitely not user friendly. In Consumption Based Architecture, we try to minimize data transformation and reshaping, but ERP solutions are by nature cryptic and complex – by definition, they are not consumable. So in the consumable space, we typically recommend using the vendor-supplied solutions, such as JHKnow, SAP BW, and so on. These solutions provide a vendor-managed interpretation of the ERP data for reporting and other uses.

Security in this Architecture

A question was raised during the session about how to secure this architecture, and it does not have a simple answer. Each solution may implement security differently. For instance, an Oracle database may use database user names and not have AD integration. This means you need to determine how to secure your consumable space. If you pick Azure Active Directory, for example, you would move data to AAD-compliant structures in Azure such as Azure SQL Database, Azure SQL Data Warehouse, and Azure Databricks. You might then need a tool like GoldenGate with CDC to keep a SQL database updated so you can apply security to it. This allows you to centralize security for your consumable data. Whatever you do, plan for security and build what you need to support it.

Thanks again everyone for joining us at SQL Saturday.