SQL PASS Summit – Day 3 (the rest)

After an info-filled keynote, on to the rest of the day.  I decided to attend the Bare Metal Instructor Lab and work through a full installation of SQL Server 2012 over the next 3 days.  However, I will likely switch to hands-on labs of my choice for the rest of the week.

The lab started with the installation of SQL Server 2012 on 3 servers using prebuilt scripts.  It was also the first time I have ever worked on Windows Server Core, which has no GUI.  This is the seriously lightweight version of Windows Server.  To build a configuration file from setup, you can cancel a GUI-based install right at the end and it will generate this file.  The file can then be used for command-line installs.  I had some extra time before lunch, so I fired up a different hands-on lab – Exploring Project “Crescent” (which is now Power View).  It was a quick demo, and that tool is still really cool.
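As a rough sketch of how that config-file install works (the parameter values here are illustrative, not the ones from the lab), the generated ConfigurationFile.ini is a plain INI file along these lines:

```ini
; Excerpt of a setup-generated ConfigurationFile.ini (illustrative values)
[OPTIONS]
ACTION="Install"
FEATURES=SQLENGINE
INSTANCENAME="MSSQLSERVER"
; QUIET runs setup with no UI at all - handy on Server Core
QUIET="True"
IACCEPTSQLSERVERLICENSETERMS="True"
```

You then point setup at it from the command line, something like `Setup.exe /ConfigurationFile=ConfigurationFile.ini`, which is what makes repeatable installs on GUI-less Server Core practical.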

My next session was What Does it Take to Add MDS to Your Data Warehouse/Data Mart.  The presenters, Yair Helman and Tyler Graham from Microsoft, demonstrated how to manage dimensions within SQL Server Master Data Services (MDS) in Denali.  Here are some of the things I learned in this session:

  • MDS has an Excel add-in.  From Excel, you can do the initial data build in MDS.  The data is managed on the server while you work with it in Excel.
  • All of this metadata, and the data itself, is also available in the web-based MDS UI.  Not only can it be viewed there, it can be manipulated there as well.
  • You can create and manage business rules about the data in MDS.  Once created, a rule can be applied to a portion of the data or to all of it.  In both cases, you can see what failed the business rule.  Once you make changes to the data, you can annotate the changes in bulk or individually when publishing them back into the system.
  • This appears to be a significant upgrade over the 2008 version of MDS.  The MDS service requires SQL Server 2012, but can use an underlying SQL Server 2008 Database Engine.  If you have an old version or were using Denali CTP 3, you can upgrade to the new version easily.  Kudos to Microsoft on improving this tool rapidly.
  • They recommend using the data warehouse as a starting point for this tool as it is fairly clean.  Start small and continue to build it out over time.  However, I am not sure how significant the impact to cubes would be.

Final session of the day was Introduction to Data Mining in SQL Server Analysis Services by Brian Knight.

  • Data Mining has financial benefits.  Guess I need to get better at this.
  • Data Mining does not require a cube.  A table will suffice.
  • The two core algorithms to know are Decision Tree and Clustering.  Clustering is primarily for fraud detection; Decision Tree is primarily for answering why a decision was made.
  • Check out http://sqlserverdatamining.com.
  • There is a lot of trial and error when configuring the algorithm and validating your hypothesis about the data.  You can usually prototype a solution in a week and have it ready after trial and error in about a month.
  • SQL Server Data Mining can deliver 80-90% of the data mining capabilities SAS delivers.  The basics and most common usages are there, for a bit less coin.
  • DMX is really not that hard, and it is very functional.  It is actually simpler than MDX.
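To give a taste of why DMX feels approachable, a prediction query reads a lot like T-SQL.  This is only a sketch; the mining model and column names below are the ones from Microsoft's Adventure Works data mining tutorial, not anything shown in the session:

```sql
-- Score new prospects against a trained decision-tree mining model,
-- returning the predicted outcome and its probability for each case.
SELECT
  t.[CustomerKey],
  Predict([Bike Buyer])            AS PredictedBuyer,
  PredictProbability([Bike Buyer]) AS Probability
FROM [TM Decision Tree]
PREDICTION JOIN
  OPENQUERY([Adventure Works DW],
            'SELECT CustomerKey, Age, YearlyIncome
             FROM dbo.ProspectiveBuyer') AS t
ON  [TM Decision Tree].[Age]           = t.[Age]
AND [TM Decision Tree].[Yearly Income] = t.[YearlyIncome]
```

The PREDICTION JOIN maps incoming columns to the model's inputs, and that is most of what there is to it – which is the sense in which DMX is simpler than MDX.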

Besides these sessions and the labs, I spent some time visiting the vendors and got some good information on PDW and other data warehouse appliances.  These are exciting tools for bringing BI solutions to market much faster.

Thus ends the training from Day 3, on to networking and socializing with my SQL peers.

SQL PASS Summit – Day 2

Today started for me with the SQL Saturday Round Table.  Andy Warren led the conversation around how SQL Saturdays continue to grow and improve.

One of the key points is that we want to see SQL Saturday expand into new cities.  Are you interested in having one?  Check out http://sqlsaturday.com for details.  (And of course, I am working with SQL Saturday #99 in Minnesota and hope to see many of you there: http://sqlsaturday.com/99/eventhome.aspx.)

Quick sidebar to SQL Saturday 99 attendees. One of the new features you should take advantage of is the Schedule Builder on the event site.  I encourage you to get on and build your schedule for the event.

After the SQL Saturday round table was done, I spent some time with other organizers and look forward to working with them in the future.  I managed to invite myself to a couple of other organizational meetings for PASS chapter leadership.  Do you participate in your local SQL Server/PASS user group?  There are great learning and networking opportunities at these events.  Join us sometime at the Minnesota SQL Server User Group (http://minnesota.sqlpass.org).  Looking for a user group in your area?  Check out http://www.sqlpass.org for groups near you.

Enjoyed the Welcome Reception and follow up at SQL Karaoke.  Looking forward to the first full day.

SQL PASS Summit – Some of What I Learned Day 1

I made it to the PASS Summit in Seattle, and I am going to take you along for the ride.  I will pass along some of the highlights I learned and who did the teaching.

Today I attended the pre-conference session with Brent Ozar (@brento) on the basics of SANs and virtualization for DBAs.  This is an area where I am not as strong as I would like to be, but I am always trying to get or give advice on solutions with these technologies.  Here are some tidbits I gleaned from the session today.  Nice work, Brent.

SAN basics:

  • Remove as much work as possible from TempDB, as more and more system work is being done there.  You should consider “scratch” databases to support some of this work instead.
  • Avoid RAID 0; you will definitely get an outage, since a single drive failure takes down the whole array.
  • RAID 5 – 1 write turns into 2 reads and 2 writes.  RAID 6 adds another read and write on top of that.  Thus RAID 5, while it costs less money, costs more in performance.  I knew this before; now I understand why.
  • Shared arrays in SANs are easier to manage than Dedicated arrays when you can create a big pool of 30-50 drives to be shared by many servers.
  • SAN snapshots are a great tool for backups, particularly in the data warehouse.  Be sure the snapshot process is integrated with SQL Server.  This is not cloning, which is a bad idea.  Snapshots should not take longer than 10 seconds.
  • Always review the SAN vendor documentation.  They usually have recommendations for setting SQL Server up correctly on their product.
  • Data compression is a good thing.  Especially for OLAP data.
  • Be sure to check the pathing in your SAN.  It is frequently the real bottleneck on the SAN.
  • Use stats from your backup jobs to monitor SAN performance over time.  You can find this information in msdb.  This lets you easily see changes to the SAN that affected overall performance.
  • SQLIO is used to test storage in general.  It is not just a SQL Server tool; as a matter of fact, SQL Server is not required.
  • For more details check out Brent’s site on SANs: http://BrentOzar.com/go/san
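To make that RAID write penalty concrete, here is a quick back-of-the-envelope calculation.  This is only a sketch: the drive count and per-drive IOPS figures are made up, and the penalties (2 for RAID 10, 4 for RAID 5, 6 for RAID 6) are the standard rule-of-thumb values, not measurements:

```python
# Rough effective-IOPS estimate for an array under different RAID levels.
# write_penalty = total disk I/Os generated by one logical write
# (RAID 5: 2 reads + 2 writes = 4; RAID 6: 3 reads + 3 writes = 6).

def effective_iops(raw_iops, read_pct, write_penalty):
    """Logical IOPS the array can sustain at a given read/write mix."""
    write_pct = 1.0 - read_pct
    return raw_iops / (read_pct + write_pct * write_penalty)

raw = 10 * 180  # e.g. 10 drives at ~180 IOPS each (illustrative numbers)
for name, penalty in [("RAID 10", 2), ("RAID 5", 4), ("RAID 6", 6)]:
    rate = effective_iops(raw, read_pct=0.7, write_penalty=penalty)
    print(f"{name}: ~{rate:.0f} effective IOPS")
```

Even at a read-heavy 70/30 mix, the same spindles deliver noticeably fewer logical IOPS on RAID 5 than on RAID 10, and fewer still on RAID 6 – which is the sense in which the cheaper RAID level costs more in performance.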

Virtualization basics:

  • Task manager is worthless in a virtualized environment.
  • VMware has some great new counters to see how the host is working.
  • In some cases, assigning fewer virtual CPUs to each SQL instance may actually perform better by reducing CPU contention.
  • Memory reservations in VMware for SQL Server are a great plan.  Successful shops will get this taken care of, and good VM admins will balance it with failover requirements.

Check out Brent’s website for more great information on SQL Server:  http://www.BrentOzar.com.

I wrapped up the night at Lowell’s in Pike Place Market at a networking dinner.  Met some SQL guys from around the world and right next door.

Looking forward to tomorrow’s luncheon with SQL Saturday planners.