PASS Summit 2012 Wrap Up

13 11 2012

Wow, what a week. Once again, PASS put on a great event packed with training and networking for the SQL Server community. If you followed my countdown, you know some of what I love about PASS. Last year I blogged every day, but I did not do that this week. So, what was different for me? For one, I volunteered much more this year than last, and I was privileged to speak twice. I also spent more time meeting new people and catching up with friends, which was great as well. Enjoy my wrap-up of the week.

Tuesday – Leadership Meetings, Welcome Reception, and some Karaoke

Before the event officially kicked off, I joined community leaders from around the world for a series of leadership meetings. First was a meeting on SQL Saturdays, which was an opportunity to see the immense growth of these free training events across the U.S. and throughout the world. What a great opportunity for SQL Server professionals to improve their skills, and for those passionate about the community to grow as leaders by running these events. Many ideas were shared among the team, including a panel on how to effectively run a SQL Saturday on a tight budget.

Once that was completed, the Regional Mentors enjoyed lunch together and a chance to share what we do to support the user groups in our regions. I particularly enjoyed spending time with Regional Mentors from Germany, Holland, and Portugal, which further highlighted the international scope and reach of PASS. This was followed by the Chapter Leaders meeting, which was organized as a series of round tables that the chapter leaders could move through. I worked at the table focused on leadership with Ryan Adams ( B | T ) from the North Texas SQL Server User Group (NTSSUG). We had a number of good conversations about building leadership teams for user groups and what it takes to have an effectively led user group. Check out the NTSSUG site for the by-laws sample we discussed multiple times.

All of these meetings were followed by the Welcome Reception, of which I caught only a small portion as I was dropping my backpack at my hotel and working my way back. After the reception, I headed out to Bush Garden with a number of others. During that time, Jes Borland ( T ) managed to get a microphone in my hand, and I had my first round of karaoke. Yes, I actually did sing, and I had fun doing it. All in all, a good time was had by all.

Wednesday – SQL Around the World, Microsoft Announcements, Tabular Models, and Magenic Team Dinner

This was the true kickoff of the event, though many looked at the keynote as the kickoff. Before that even began, I was working in the Community Zone encouraging people to participate in the SQL Around the World community activity. It was a great game: you needed to find 10 people from 10 different countries and learn something interesting about them or their country. I found a dancer and someone who had ridden a cheetah as a kid. I also surprised an attendee from the Czech Republic when she mentioned her home town, only to have me let her know I had been there many years ago. It was a fun conversation. If you played and have other cool stories, let me know. It was amazing that well over 50 different countries were represented at PASS.

Next, Ted Kummert delivered the first keynote of the day, filled with announcements concerning SQL Server, including the following:

  • Hekaton: the project code name for a new in-memory OLTP engine
  • The columnstore index will become updatable
  • The next version of PDW (Parallel Data Warehouse) will be out in the first half of 2013
  • PolyBase: allows you to query across multiple types of data sources, such as SQL Server and Hadoop, with T-SQL
  • DAX queries will be able to query SSAS cubes

He also highlighted some recent announcements related to the SQL Server stack:

  • Microsoft HDInsight Server CTP: Hadoop for Windows Server
  • Windows Azure HDInsight Service Preview: Hadoop for Azure
  • Power View and PowerPivot fully implemented in Excel 2013

After the keynote, I hit a session on Big Data and Hive put on by SQLCAT, which was very informative. My big takeaway was to use EXTERNAL rather than INTERNAL tables when working with Hive; see the sketch after this paragraph for the difference. I then went to do final prep for my tabular model session. In that session, “Building a Tabular Model Database”, I present on what tabular and in-memory are, and then open a Visual Studio project and create a database. I think it went well, and the attendees seemed to enjoy the upbeat nature for an end-of-day session. The night wrapped up with dinner with the Magenic team (seven of us), a good chance to grow relationships with colleagues from offices around the country.
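For anyone new to Hive, here is a minimal HiveQL sketch of that distinction; the table names, columns, and HDFS path are hypothetical, for illustration only:

-- EXTERNAL: Hive tracks only the metadata; dropping the table leaves
-- the underlying files in place, which is usually what you want for
-- data shared with other Hadoop jobs.
CREATE EXTERNAL TABLE sales_ext (id INT, amount DOUBLE)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/data/sales';

-- Managed (internal): Hive owns the storage, so DROP TABLE deletes
-- the data files as well.
CREATE TABLE sales_managed (id INT, amount DOUBLE);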

Thursday – Community Zone and DAX as a Query Language

Thursday was a fairly low-key day for me. Once again I spent time in the Community Zone, where I had the opportunity to talk with a few people about creating user groups in their areas. As always, I like to see people interested in growing their local communities.

I also attended Alberto Ferrari’s session on DAX. The biggest surprise to me was that you can now query a tabular model with DAX directly from SSMS. I am not sure I am convinced that it is a full query language yet, but it is definitely getting closer. The key to it all is the EVALUATE statement, which, somewhat ironically, you write in the MDX query window. Here is just a taste of DAX as a query:

EVALUATE
    'DimCurrency'
ORDER BY
    'DimCurrency'[CurrencyAlternateKey]

What I found interesting is that you can create columns, build measures, and perform many other operations against the tabular model using DAX queries. Because these constructs are evaluated at query time, they do not increase the memory the model uses for storage. Look for more on this in later blog posts as I delve into in-memory storage and usage when working with DAX.
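To give a flavor of where this goes, here is a sketch that defines a measure inline and adds a calculated column to the result, all evaluated at query time; it assumes the usual Adventure Works tabular tables and relationships:

DEFINE
    MEASURE 'FactInternetSales'[Total Sales] =
        SUM ( 'FactInternetSales'[SalesAmount] )
EVALUATE
    ADDCOLUMNS (
        'DimCurrency',
        "Sales in Currency", [Total Sales]
    )
ORDER BY
    'DimCurrency'[CurrencyAlternateKey]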

Friday – More Community Zone, HDInsight, Paul White, and Window Functions

Last day. I spent more time in the Zone and really did enjoy it, as I continued to meet more people. I was even present when a contract was completed for the Shanghai user group. Very cool indeed. I then attended a session on HDInsight by Mike Flasco from Microsoft. This is very cool stuff: you can create a simple Hadoop cluster on your desktop to test the technology. Microsoft and Hortonworks have done a great job of bringing Hadoop data into the Microsoft stack.

On my way to present the final session of the day and of the conference, I stopped in for the second half of Paul White’s ( B ) optimizer presentation. In a word (or two): mind-blowing! Wow, who knew the optimizer did all those things? I was highly impressed and think he should consider a precon on the subject next year. Unlike some three-hour presentations, he could have gone longer without stretching his content. Nice work, Paul. I then followed that with my own presentation on window functions in T-SQL. For the second time, I had the last slot of the last day. I think the presentation went well, even though we were all worn out from a content-filled week. It was fun to try ideas from the audience in the demos; that always makes for a more interesting session. I will be doing a follow-up post on what I learned from some of the attendees on the subject as well, proving once again that this is a user community event. We all have something to contribute! (If you attended this session, you will find links to the blogs on the subject here.)
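As a taste of the window functions topic, here is a minimal sketch against the AdventureWorks sample database, using the window frame syntax added in SQL Server 2012 to compute a running total:

-- Running total of order amounts using the 2012 window frame syntax.
SELECT
    SalesOrderID,
    OrderDate,
    TotalDue,
    SUM(TotalDue) OVER (
        ORDER BY OrderDate, SalesOrderID
        ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW
    ) AS RunningTotal
FROM Sales.SalesOrderHeader;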

What’s Next?

Coming in April is the new Business Analytics conference in Chicago, followed by the PASS Summit in Charlotte, North Carolina. Of course, your local user groups will continue to meet, with regional SQL Saturdays sprinkled throughout the year as well. How will you participate and contribute in 2013? We look forward to seeing you all again soon.





Oracle Tips for MSBI Devs #4: Executing an Oracle Stored Proc with IN Parameters in SSIS Execute SQL Task

1 05 2012

The first tip I published discussed how to execute an Oracle procedure with no parameters.  In this tip, I will discuss a technique that works with IN parameters in an Oracle stored procedure.

After many unsuccessful attempts at executing a stored procedure with parameters, the following pattern was developed by one of my peers working in a blended environment, Brian Hanley (T | B).  With his permission, I have documented the solution here for your use.

The Solution:

The solution involves using variables and building the SQL script to be executed in a Script task.


Here is the syntax for the procedure used in the examples:

CREATE OR REPLACE PROCEDURE SCOTT.spDelete1 (DEPTNUMBER INT) IS
BEGIN
    DELETE FROM DEPT WHERE DEPTNO = DEPTNUMBER;
END spDelete1;

Create variables

Create variables to hold the name of the procedure, any parameters, and the finished script.  In my example, the procedure has only one parameter, so I use just three variables: StoredProcName, SPVar1, and SPStatement.  If it fits your needs, you can also separate the user/schema name into its own variable.


The variable used for the statement (SPStatement) has been set up to use string formatting with C#.
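As an example (the exact text is yours to define), a format string along these lines yields a PL/SQL anonymous block once the placeholders are filled in:

BEGIN {0}({1}); END;

With StoredProcName set to SCOTT.spDelete1 and SPVar1 set to 10, the script below produces BEGIN SCOTT.spDelete1(10); END;, which the Execute SQL task can then run against the Oracle connection.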

Prepare the Statement Variable

Use the Script task to build the statement variable (SPStatement).  Start by adding the variables to the Script task.  Be sure to add the statement variable to the ReadWriteVariables collection.


The following script sets up the variable.  As noted above, the C# String.Format function is used to update the statement variable.

public void Main()
{
    // Build the finished statement by injecting the procedure name and
    // parameter value into the SPStatement format string.
    Dts.Variables["SPStatement"].Value = String.Format(Dts.Variables["SPStatement"].Value.ToString()
        ,Dts.Variables["StoredProcName"].Value.ToString()
        ,Dts.Variables["SPVar1"].Value.ToString()                
        );

    // Raise an information event so the finished statement shows up in the log.
    String msg = "SPStatement: " + Dts.Variables["SPStatement"].Value.ToString();
    Boolean refire = true;

    Dts.Events.FireInformation(0, "SPStatement", msg, String.Empty, 0, ref refire);

    Dts.TaskResult = (int)ScriptResults.Success;
}

Setting up the Execute SQL Task

In the Execute SQL task, set the SQLSourceType property to Variable and set the SourceVariable property to the name of the statement variable.  In the case of our example, this is the SPStatement variable.


Versions:

This tip has been confirmed to work in both SQL Server 2008 R2 Integration Services and SQL Server 2012 Integration Services.  Testing was done against Oracle 11g.





Oracle Tips for MSBI Devs #3: Choosing Drivers

24 04 2012

When working with Oracle, drivers are truly a pain to get working correctly.  I will discuss my preferred choice, and why, for each of the following tools: SSIS, SSAS, and SSRS.

SSIS Drivers

Without much question, you should use the Attunity connector for working with Oracle data in the Data Flow task.  For SSIS 2008, the connector is free and can be found here:  http://www.microsoft.com/download/en/search.aspx?q=oracle%20connector.  It includes the connection manager, source component, and destination component, and it is without a doubt the best way to work with Oracle data in the Data Flow task.  (NOTE: I cannot find the SSIS 2012 equivalent at the moment.  However, Matt Masson’s blog post after PASS Summit 2011 notes more work is being done with Attunity.)  UPDATE: I wrote this prior to a blog post from Matt Masson on support for SSIS 2012 with v2.0 of the Microsoft Connector.  Check out Matt’s update on this: http://blogs.msdn.com/b/mattm/archive/2012/04/04/microsoft-connectors-v2-0-for-oracle-and-teradata-now-available.aspx.

However, this connector does you no good when working with the Execute SQL task.  In SSIS 2008, I use the OLE DB provider from Oracle to create the connection used with the Execute SQL task.  In my work with procedures in my first tip, I used the Oracle OLE DB provider with SSIS 2012 as well, and it worked fine.
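For reference, a minimal connection string for the Oracle OLE DB provider looks something like the following; the data source alias and credentials are placeholders for your environment:

Provider=OraOLEDB.Oracle;Data Source=ORCL;User ID=scott;Password=********;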

SSAS Drivers

When working with SSAS 2008 and, until I know differently, SSAS 2012, I recommend using the OLE DB driver from Oracle.  It is not the fastest driver available; third-party drivers are marginally faster, and the Oracle-provided .NET driver is faster as well.  However, the .NET driver has an unchangeable active query timeout of one hour, and if any of your processing exceeds that, it will unceremoniously drop the connection.  The OLE DB provider has given me consistent results for the right price, so I have stuck with it: not necessarily the fastest, but the least painful to work with.

SSRS Drivers

It is with SSRS that I have seen mixed results, primarily because of the better performance of the Oracle .NET driver.  If you can guarantee that your reports will return their data in under an hour, it seems to be the best option.  However, if you want to manage a single driver set across all tools, you may find that standardizing on the OLE DB driver makes sense in your organization.

Test, Test, Test

I have shared my experience using the drivers above.  However, you may find value in purchasing a third-party driver, or you may have a different experience when you implement in your environment.  Be sure to test and understand the implications for maintenance and system cost when choosing different drivers across your solutions.





X on XMLA: iii. Basic DDL Functions in XMLA (Create, Alter, Delete)

10 04 2012


XMLA can be used to manage the structure of your multidimensional databases.  While many developers use Visual Studio (BIDS) to deploy changes, as systems move to production or need to be more carefully managed, XMLA comes into play.

Some of the most common DDL-type uses for XMLA include partition management, deploying changes, and promoting between environments.  In all of these cases, objects within Analysis Services need to be created, altered, or deleted.

Before we dig into the details, I want to call out that the Execute method will be used and that the full syntax is not required when using SQL Server Management Studio (SSMS).  (See X on XMLA: ii. Basic Structure of XMLA for more details.)  Furthermore, in SSMS you can generate Create, Alter, and Delete XMLA by right-clicking a deployed object and choosing the Script To option.  If you have questions about syntax, definitely use this feature to discover more about the syntax and the object you are working with.

The following sections use the Sales Channel dimension from the Adventure Works sample database to illustrate the command syntax.  (This sample database is available on CodePlex.)

Create

The Create command is used to create new objects in the database.  The two required child elements are ParentObject and ObjectDefinition.  The ParentObject specifies the list of objects that make up the parent.  In the sketch below, the parent object is the database, because dimensions in Analysis Services belong to the database.  If we were creating a partition, however, the parent would be structured database, then cube, then measure group, because a partition belongs to a specific measure group.  The order of the parent objects matters, as they are read top to bottom in the XMLA.

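Here is a sketch of the command shape.  The object IDs are illustrative, and a real script would include the full dimension definition (attributes, hierarchies, and source bindings), which is easiest to generate with the Script To option in SSMS:

<Create xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <ParentObject>
    <DatabaseID>Adventure Works DW 2008R2 SE</DatabaseID>
  </ParentObject>
  <ObjectDefinition>
    <Dimension>
      <ID>Sales Channel</ID>
      <Name>Sales Channel</Name>
      <!-- full dimension definition omitted for brevity -->
    </Dimension>
  </ObjectDefinition>
</Create>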

Alter

The Alter command is used to modify existing objects in the database.  It uses two child elements, Object and ObjectDefinition.  The Object element identifies the object being targeted.  Like the ParentObject, it is structured from top to bottom, with the last object in the list being the object targeted for alteration.

The ObjectDefinition specifies the changes to make to the object.  The changes MUST include all of the parts of the structure you want to keep the same.  I cannot emphasize enough how important it is to keep this in mind: you cannot send a simple delta via XMLA; you must send the new version of the object in its entirety.  This is true at all levels, including the database.  Where this commonly creates issues is with user objects at the database level and with partitions in measure groups.  If you have specific users at the database level that differ between environments, which you should have, you need to update the Alter script for each environment.  If you are modifying a measure group to which you have added partitions, you must make sure those partitions are in your script as well.

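Again as a sketch, with illustrative IDs: the Object element walks down to the dimension, while the ObjectDefinition must carry the complete new definition:

<Alter xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <DatabaseID>Adventure Works DW 2008R2 SE</DatabaseID>
    <DimensionID>Sales Channel</DimensionID>
  </Object>
  <ObjectDefinition>
    <Dimension>
      <ID>Sales Channel</ID>
      <Name>Sales Channel</Name>
      <!-- the ENTIRE dimension definition you intend to keep goes here -->
    </Dimension>
  </ObjectDefinition>
</Alter>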

Delete

The Delete command is used to remove existing objects from the database.  It has only one child element, Object.  This is the simplest of the commands covered here and only needs the proper object reference.  Like the ParentObject and Object elements above, the list must be in the proper order to delete the correct object.  Use caution, as the Object element spans everything from the database down to the targeted object; if you execute the command at the database level, you will delete your database.

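A sketch of the shape, again with illustrative IDs (remember, whatever the last element identifies is what gets deleted):

<Delete xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <DatabaseID>Adventure Works DW 2008R2 SE</DatabaseID>
    <DimensionID>Sales Channel</DimensionID>
  </Object>
</Delete>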

Results

The execution results are not always clear.  When the command succeeds, you will see something like the following empty return element:

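<return xmlns="urn:schemas-microsoft-com:xml-analysis">
  <root xmlns="urn:schemas-microsoft-com:xml-analysis:empty" />
</return>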

Not really conclusive.  Error messages are also returned in an XML format and often contain relevant information as to why the script failed.  Be sure to read them closely, as sometimes multiple errors are returned and the root cause may not be plainly evident.





PASS for Today (and Tomorrow)

20 03 2012

I just wanted to plug two events that offer free SQL Server training.

First, Thomas LaRock (@sqlrockstar) and Jason Strate (@stratesql) are teaming up to bring you “Choose Your Own Adventure – Performance Tuning”. Join us at the Microsoft Technology Center in Edina, MN, or online for this great adventure, which starts at 3:00 PM CDT. More details can be found at http://minnesota.sqlpass.org.

Is that all? you ask. No. Starting tonight at 0:00 GMT (7:00 PM CDT) is 24 Hours of PASS. Check out the awesome roster of speakers, including the likes of Denny Lee from SQLCAT, Marco Russo, and Dejan Sarka. These sessions run through the night and will be closed-captioned in 15 languages, making it a truly international event.  Oh, did I also mention I will be speaking as well?

Take advantage of these free training opportunities led by leaders within the SQL community.  We all look forward to seeing you at either or both of these events.





X on XMLA: i. Using XMLA

26 01 2012


In my previous post, I introduced the “X on XMLA” series.  In this post, I will give more of an introduction to XMLA by discussing how I have used it in my experience with SSAS.

What is XMLA? XML for Analysis, or XMLA, is the API used to interact with Analysis Services and other, similar multidimensional database servers.  The goal was to establish a standard API for this purpose, and it has been widely accepted by the leading OLAP vendors.  You can find out more about the standard at XMLAforAnalysis, a good resource for generic XMLA topics, including the current specification document and a newsfeed.

A key point about XMLA is that it is, as its name states, XML.  The API uses SOAP protocols to communicate with the servers.  (Refer to the specification above if you would like more details.)  I like this because XML is easy to read.  While it can be annoyingly verbose at times, you can easily identify what you are doing, which reduces the learning curve.

Uses of XMLA.  Now that you have a cursory introduction to XMLA, what is it commonly used for?  I have primarily used XMLA for loading data and deploying objects.  I use SQL Server Management Studio (SSMS) extensively to interact with my SSAS databases, so I do not use the interrogation or query functions within XMLA that often.  However, you should be familiar with all of these uses.

Deploy.  I use XMLA to deploy new objects and alter existing objects on a regular basis.  In particular, this is a standard way to test deployments and to script deployment operations for production releases.  XMLA supports Create, Alter, and Delete commands for every object in SSAS.  In SSMS you can generate any of these scripts using the shortcut menus, as noted in the image below.  You should become familiar with XMLA as a deployment method, as it is a simple way to promote changes between environments.

[Image: XMLA Create Script Menu]

Load.  The Process commands are used to load data in a variety of ways, including a full replace or reload, an incremental load, and updates.  XMLA also supports commands that will drop and recreate indexes and aggregations if that is required.  SSAS exposes additional processing options through XMLA that are not available in SSMS or Business Intelligence Development Studio (BIDS).  The script below is a basic example of processing a dimension using XMLA.


<Process xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <DatabaseID>Adventure Works DW 2008R2 SE</DatabaseID>
    <DimensionID>Dim Time</DimensionID>
  </Object>
  <Type>ProcessFull</Type>
</Process>


Interrogate.  While I have not used XMLA for this activity as much as the two noted above, I will discuss it in detail in a later post in the series.  The Discover method allows you to retrieve information about an SSAS instance and related database objects.  For instance, you can use XMLA to return a list of the available databases on the target server.
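As a sketch, that particular request (runnable from an XMLA query window in SSMS) looks like this:

<Discover xmlns="urn:schemas-microsoft-com:xml-analysis">
  <RequestType>DBSCHEMA_CATALOGS</RequestType>
  <Restrictions />
  <Properties />
</Discover>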

Query.  XMLA can also retrieve data from the cube by executing MDX against the target.  I do not usually have a need to query through XMLA, but I will discuss this usage in the series as well.
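Here, too, is a minimal sketch; the MDX statement is wrapped in an Execute command, with the target database supplied as a property (the measure and database names are from the Adventure Works sample):

<Execute xmlns="urn:schemas-microsoft-com:xml-analysis">
  <Command>
    <Statement>
      SELECT [Measures].[Internet Sales Amount] ON COLUMNS
      FROM [Adventure Works]
    </Statement>
  </Command>
  <Properties>
    <PropertyList>
      <Catalog>Adventure Works DW 2008R2 SE</Catalog>
    </PropertyList>
  </Properties>
</Execute>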

Hopefully this gives you an idea of how powerful XMLA is and why it is a necessary tool for any BI developer working with multidimensional database servers.





Introducing “X on XMLA”

19 01 2012


XMLA or XML for Analysis is used in SQL Server Analysis Services and other multidimensional data systems to manage the server.  It is a cross between DDL and Server Scripts.

I am starting a new blog series on XMLA called “X on XMLA”. I have had to use this scripting language extensively in my projects and would like to discuss how I implemented it in my SSAS work.  The goal of this series is to provide an introduction to XMLA with practical application shown throughout.

The series will consist of ten (X) entries, which is where the pithy name comes from.  Here is the current list of planned topics; as more topics are developed, this list will be filled out.  Topics will include processing, creating, altering, deleting, and deploying SSAS objects with XMLA.  As the entries are completed, this post will be updated with links to the other entries in the series.

i. Using XMLA
ii. Basic Structure of XMLA
iii. Basic DDL Functions in XMLA (Create, Alter, Delete)
iv. Deploying Databases with XMLA
v. Creating XMLA from Visual Studio Projects
vi. Processing and Out-of-Line Bindings in XMLA
vii. Partition Management with XMLA
viii. SSIS and XMLA
ix. Executing a Select with MDX in XMLA
x. Using the Discover Method

XMLA is very powerful and I hope this series will help you realize the potential and power of using XMLA with SQL Server Analysis Services.





Upgrading Denali CTP3 to SQL Server 2012 RC0

4 12 2011

When I started looking into the upgrade path for this, I saw a couple of notes online claiming it was not possible.  It turns out there is a Connect item on this issue (https://connect.microsoft.com/SQLServer/feedback/details/709371/no-option-to-upgrade-from-sql-denali-ctp3-to-sql-2012-rc0-version#details).  In that item was a workaround using the SQL Server 2008 R2 upgrade option.  When you start the RC0 installer, choose the Upgrade option, which is the last item in the list.


This launches the setup wizard and starts the Setup Support Rules check, after which you will be prompted to select the instance you wish to upgrade.  In my case I have two named instances on CTP3, DENALI and TABULAR, as well as the shared components, including SSIS.  (NOTE:  The CTP3 version number is 11.0.1.1440.19.)  I started by upgrading my DENALI instance, which had all of the services installed.  On this instance, Analysis Services was installed to support multidimensional databases.  (The TABULAR instance has only a tabular Analysis Services instance.)

In the Select Features dialog, you are not able to change the selected features when upgrading from CTP3 to RC0.


My first “gotcha”: this may negatively affect my SharePoint 2010 install, in particular integrated Reporting Services.  I chose to risk it and continue.


I left the instance name and accepted the supplied instance ID in the next step.  I made no changes in the next three steps: Disk Space Requirements, Server Configuration, and Error Reporting.

The second issue I ran into was related to Visual Studio 2010.  In order to pass the next step, I needed to update it to Service Pack 1.


The installer for Visual Studio 2010 SP1 can be found here:  http://www.microsoft.com/download/en/details.aspx?id=23691. After I had that installed, I had to reboot and then continue with the SQL Server 2012 install.  This allowed me to pass the Upgrade Rules check, and I was then ready to upgrade.

After getting no errors during the upgrade, I was required to reboot.  Now to check the instance.


Looking good.  I was also able to work with the updated SQL Server 2012 RC0 components in SharePoint.

All in all, it appears the upgrade completed successfully, as noted in the comments on the Connect item.  I hope you have a similar experience.

UPDATE: I did run into an issue when trying to execute 2012 SSIS packages from Management Studio; an error regarding logging was raised.  This issue has been posted on Connect, and you can find the information here: http://bit.ly/vTUjcr.  I have not tried the workaround yet, as it requires fully uninstalling CTP3 and then reinstalling.







