Rochester – Welcome to the PASS MN Family

22 01 2014

It started as a vision for expanding the reach of the Minnesota SQL Server User Group a year ago. At the time, Paul Timmerman ( T | B ) and I were discussing the fact that only one user group existed in Minnesota. As we looked at other areas and talked with SQL pros from St. Cloud and Rochester, a dream was born — what if PASSMN, a large and very mature user group, helped kick off meetings in other areas? Throughout the past year, Paul has been working with Patrick out of Rochester to make this a reality.

Image source: http://binged.it/1eB310K

Tonight was our first meeting in Rochester. We had an awesome turnout, with over twenty people attending. Everyone is already looking forward to the next meeting.

Paul presenting to the crowd

To begin with, the Rochester group will be supported by PASSMN as an extension group. The meetings are planned to be quarterly, and we will be working with local and regional speakers to participate. Our hope is that the group will grow and become a stand-alone chapter. However, the goal of PASS is community and training. Whether the group is independent or an extension group, the goal is to meet those needs within the SQL Server development community wherever we can.


For more information about PASS MN and PASS MN – Rochester meetings, check out http://minnesota.sqlpass.org/.





Traveling and Talking to Wrap Up 2013

16 08 2013

I am writing this blog post en route to New York City for SQL Saturday #235. This begins a fairly busy travel and speaking schedule for me this year. Having spent much of the summer at home with my family, which was great, I now embark on some trips, primarily out east.

For starters, I am kicking off my busy end of summer and beginning of fall in New York City. I am looking forward to being there for the second time, and it should be a great event. Since I am the Regional Mentor for the Northeast, this trip also puts me in that region for an event. I will be speaking on Building BI Solutions with Microsoft Excel.

Next up is some quality time in Boston. I have a contract that will put me in downtown Boston for about four weeks. After that trip, I skip a week and then return to speak at CodeMastery, a Magenic event, in late September. If you are in the Boston area, I will be speaking on the difference between the SQL Server Analysis Services models – the Multidimensional and Tabular Model Smackdown. More details on this event will be coming soon, but pencil it in for Thursday 9/26.

While not traveling far, I am speaking at SQL Saturday #238 in Minneapolis right before PASS Summit in Charlotte. This is right in my backyard, so to speak, and I look forward to another great event in the Upper Midwest. I do not yet know which sessions were picked, but I look forward to seeing you there.

October will be spent at PASS Summit. While I was accepted to speak this year, I also look forward to meeting a number of people at the Community Zone and at the other events at the conference. I will follow that up by going to SQL Saturday # BI Edition in Charlotte. While I have not received word about speaking, I plan to attend the event either way. It should be a nice way to wrap up my time there.

In November, I am headed to Atlanta for another CodeMastery event. This should be a nice chance to visit and spend some time with the community there.


Upon returning from Atlanta, I am back in Minneapolis for a CodeMastery event here at home. Having had a major role in running this event over the past couple of years, I am truly looking forward to speaking at it. Check out http://codemastery.com for details on all of the CodeMastery events as they are announced.

Modern Apps Live! and SQL Server Live!

Finally, at least at this time, I will be presenting five times right before Thanksgiving. This will be my first time presenting at SQL Server Live! and my second time presenting at Modern Apps Live!. Join us for this awesome Live! 360 event in Orlando.

I look forward to meeting new people and visiting with old friends throughout all of these events and travels. If you see me there, say Hi and ask for one of my “famous” superhero cards! See you around.





PASSMN–Passing the Baton: A Board Transition Story

12 12 2012

Each year the Minnesota SQL Server User Group elects new board members and resets its leadership team.  I have been on the board for the past three years, during which we switched term lengths to two years with staggered elections, but more about that later.

During the PASS Summit this past year, I had the privilege of hosting a table during the Chapter Leaders meeting on building boards and recruiting leadership for SQL Server user groups.  First, I have to say, leadership teams and boards come in many flavors, sizes, and so on.  For instance, some groups are managed by one or two people who are truly passionate about the community and really enjoy leading the groups.  Other groups, such as ours, have regular elections with as few as three and as many as eight people on the board.  While I may spend more time later discussing the pros and cons of the various approaches, this post is about the peaceful transition within our user group.

First, a bit of history: I have worked with the Minnesota SQL Server User Group for a number of years.  We joined PASS quite a while ago, though a few years after the group started, and joining did not change our organizational structure.  We have always had a board of four or more.  We currently have six roles: Chair, Corporate Relations, Programs, Membership and Treasurer, Technology, and SQL Saturday.  We added the SQL Saturday role last year to have someone focused entirely on getting that program rolling, and Paul (@mnDBA) did great!  I have held the Corporate Relations role, and this year I was the Chair.

Elections and Role Distribution

Prior to last year, our terms were one year.  However, that meant we rotated the entire board at least once, which is very difficult.  So, starting last year, we rotate half of the board each year, with terms running two years.  What is really cool about our election process is that we are able to recruit new board members each year, allowing PASSMN to take on new ideas without issue.  We have also done a good job of mixing consulting and non-consulting members, and we try to limit any one company's participation to one board member.  This process has been successful for years, as the blend has driven changes in meeting times, content direction, and even sponsorships through the years.

Another unique part of our board election is that members are voted onto the board, not into a role.  This is strategic, as it allows the members to rotate roles and work with each other to find a good fit within the board.  It also means that the nominees understand they will work with their fellow members to distribute responsibility.

Transition Lunch

That leads us to today.  We had our transition lunch on 12/12/12 at 12:12. (Yes, that was the actual appointment time, and we survived.)  Once again we had a good time as volunteers, reflecting on what happened in the past year, letting the new board select roles, and handing off insights where we have them.  And thus, it is the beginning of the end for me as a board member.

It’s Been Great!

Personally, it has been and will continue to be great working with such a good group of people.  Thanks to Andy (@SQLQuill) and Tim (@tjplas) for their time these past couple of years, and welcome to Shirley (@SQLLatina), Will (@williamweber), and Mike (@SQLMD), who join returning members Paul, Josh (@SQLJosh), and Bill (@billpreachuk).  I plan to continue working with the board and helping with some of their new initiatives this next year.  Let's keep the tradition of quality and fun going.





SQL Saturday #149 and CodeMastery–Minnesota Events

18 09 2012

We are less than two weeks away from SQL Saturday #149 in Minneapolis on September 29, 2012, with two preconference sessions on September 28.  In case you haven't heard, we are holding the main event on a Saturday this year; yes, the precons are on Friday.  Check out the details here.  I am really excited about this event, as we have a great group of local, regional, and national speakers.  There are nine rooms in use, so go out to the site and build your schedule.

The following Tuesday, Magenic is hosting CodeMastery with a BI track at the Microsoft Technology Center in Edina, MN.  This event includes sessions on managing the BI stack in SharePoint and on xVelocity.  The other track is Windows 8 development, with sessions on WinRT and game development.

I’m a Speaker at Both Events

Besides plugging these two awesome events on their own merits, I am also a speaker at both.  Here is what I will be speaking on at each event:

SQL Saturday #149: A Window into Your Data: Using SQL Window Functions

In this session, I will walk through the window functions enabled by the OVER clause in SQL Server.  Come join me as we celebrate the SQL Server 2012 release of analytic functions and the expansion of aggregate functionality to support tasks such as running totals and previous row values.  Thankfully, this is a demo-heavy session, as it is one of the last sessions of the day.
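To give a feel for what the session covers, here is a minimal sketch of a running total and a previous-row lookup using SQL Server 2012 window functions. The dbo.Sales table and its columns are hypothetical and used only for illustration.

-- Hypothetical table for illustration: dbo.Sales (SalesDate date, Amount money)
SELECT
    SalesDate,
    Amount,
    -- Running total ordered by date
    SUM(Amount) OVER (ORDER BY SalesDate
                      ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW) AS RunningTotal,
    -- Previous row's amount (0 when there is no prior row)
    LAG(Amount, 1, 0) OVER (ORDER BY SalesDate) AS PreviousAmount
FROM dbo.Sales
ORDER BY SalesDate;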

CodeMastery: Data Mining with the Tools You Already Have

The next week, I will be presenting on the data mining tools Microsoft has made available to us in SSAS and Excel.  The goal of this session is to help developers understand how to implement data mining algorithms in their business intelligence solutions.

I look forward to seeing you at both events.  They are priced right, FREE!





Upcoming BI and Data Events with Magenic in Fall 2012

14 08 2012

As I started looking over my event calendar, I was amazed at how many events Magenic is sponsoring, hosting, or providing speakers for over the next couple of months.  As always, one of my goals is to make everyone aware of opportunities to increase their knowledge in BI and data, or even to network with other data pros.  Not all of these are Magenic events, but Magenic has stepped up to sponsor or provide speakers for each of them.  I will update this blog with more details as they become available, but I want to get you thinking about the possibilities to learn and connect.

BI Summit (Chicago, August 23, 2012)

This event is coming up very soon.  Register today to hear about BI topics that matter to you as a customer and in particular as a business user.  Ted Krueger ( B | T ) and Aaron Lowe ( B | T ) will be showing off some of the new Microsoft BI capabilities.  I will be joining them for the round table at the end of the day.  Sign up today as space is limited!

Code Mastery (Chicago, September 12, 2012)

Not to be outdone, Magenic's Code Mastery event comes to Chicago a few weeks later.  There is an opening session by Ted Krueger discussing query tuning for developers and admins alike.  Then, a full track will be offered on a variety of BI topics.  The speakers include yours truly, James Phillips ( T ), and Aaron Lowe.

SQL Saturday #156 (Providence, September 15, 2012)

Join James Phillips at this event as well.  If you see him at Code Mastery in Chicago and then at SQL Saturday in Providence, let him know; I am sure we would all be interested in how and why you traveled the country to hear him.  He will be speaking on SSIS 2012 and loading data warehouses.

SQL Saturday #149 (Minneapolis, September 28-29, 2012)

We are loaded for bear for this event.  Steve Hughes and Ted Krueger will be conducting preconference sessions.  Beyond that, a number of Magenicons have submitted sessions.  We expect the final list before the end of the week, and I will update this post when the list is finalized.  Magenic is also a Gold Sponsor for this event.  Oh, and this year the precons are on Friday with the full event on Saturday; we thought we would try something new.

8/15/2012 UPDATE:  Magenic speakers announced.  Ted Krueger, Aaron Lowe, Jeff Prom, and I have sessions at this event.

Code Mastery (Minneapolis, October 2, 2012)

This event is still being finalized, but hold the date!  We have a BI track that includes implementing BI with SharePoint.  Our speakers include Jeff Prom ( B ), Josh Owens ( T ), Curtis Smith and Greg Moser ( T ).  Look for more details coming soon.

Minnesota SQL Server User Group (Minneapolis, October 14, 2012)

Magenic is the sponsor for October's user group meeting, and Jeff Prom will be speaking on Data Quality Services.

PASS Summit (Seattle, November 6-9, 2012)

And the Summit wraps up a busy fall season.  I have two sessions at this conference.  Ted will be part of Ask the Experts and Birds of a Feather.  A number of other Magenicons will also be in attendance.  We look forward to wrapping up this fall's data and BI community events at the Summit.

Join Magenic’s data pros at any of these conferences or meetings.  Magenic is a huge promoter of business intelligence and data-related education within the SQL Server community.  We look forward to seeing you at any of these events.





O, There’s the Data: Using OData in SSIS

23 07 2012


The Open Data Protocol (OData) is an open specification created by Microsoft to enable exposing data in a standard way from a variety of sources.  OData is natively supported in many of Microsoft’s products, including PowerPivot, Excel 2013, the SQL Server 2012 Analysis Services Tabular Model, Windows Communication Foundation (WCF), and Entity Framework, to name a few.  Furthermore, Microsoft uses OData to expose data feeds from the Windows Azure Data Marketplace as well.

I pursued adding an OData source to SSIS as a result of Mark Souza’s presentation at the Minnesota SQL Server User Group in April 2012.  I posed a question about easier interaction with Oracle, and he mentioned that OData would be a good way to solve that issue.  This led me to put together a presentation, which I delivered for PASSMN in July 2012, entitled O, There’s My Data: The Open Data Protocol.  At that presentation, I reviewed the “pain and agony” a data pro goes through putting together a data feed using Entity Framework in C# and exposing it with WCF.  For the most part, with the help of .NET pros at Magenic including Dave Stienessen ( B ) and Sergey Barskiy ( B ), I was able to create my first entity model and expose it using WCF.  After that, I worked on how to consume the feed without purchasing a third-party tool.  Here is the rest of the story.

Using ATOM as Shown in a Channel 9 Exercise

While looking for ways to bring an OData feed into an SSIS package, I came across a Hands-on Lab on Channel 9.  Although the focus was on Reporting Services, I was able to use the steps to create a package that reads a feed and makes the data available to the ETL process.  In a nutshell, this exercise involves three tasks – creating an ATOM file using an HTTP connection manager pointed at the OData feed, processing the ATOM file, and loading the data.  While you are creating this package, you should run each step after you have created it so the files it produces are available to the following steps.


Task 1 – Create ATOM File (Script Task)

In the Main method, I used the following code, which was copied and adapted from the Channel 9 exercise. (NOTE: the code as presented in the exercise has an error; the object declaration needs to be condensed to one line to work properly.)

public void Main()
{
    // Get the unmanaged connection from the HTTP connection manager (named TestSvc here)
    object nativeObject = Dts.Connections["TestSvc"].AcquireConnection(null);

    // Create a new HTTP client connection
    HttpClientConnection connection = new HttpClientConnection(nativeObject);

    // Save the file from the connection manager to the local path specified
    string filename = "C:\\Source\\SSIS 2012 Projects\\ODataIntegration\\Departments.atom";
    connection.DownloadFile(filename, true);

    Dts.TaskResult = (int)ScriptResults.Success;
}

This task will create an ATOM file that will be used in the next step to retrieve the data.

Task 2 – Process ATOM File (XML Task)

This task uses the new ATOM file to create an XML file containing the data.  It uses the XSLT operation type, pointing to the File Connection Manager created in the previous step as the source.  This results in another File Connection Manager to support the destination XML file with the data.  Finally, as in the exercise, a second operand set of XML is used to clear unsupported headers.  Admittedly, I copied this straight from the example; from reading it, it appears to copy each element and attribute by local name, stripping the namespace information the downstream XML Source cannot handle.

(Screenshot: the XML Task Editor settings I used.)

Here is the code from the Channel 9 exercise used in the SecondOperand property:

<?xml version="1.0" encoding="utf-8" ?>
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:output method="xml" indent="no" />
  <xsl:template match="/|comment()|processing-instruction()">
    <xsl:copy>
      <xsl:apply-templates />
    </xsl:copy>
  </xsl:template>
  <xsl:template match="*">
    <xsl:element name="{local-name()}">
      <xsl:apply-templates select="@*|node()" />
    </xsl:element>
  </xsl:template>
  <xsl:template match="@*">
    <xsl:attribute name="{local-name()}">
      <xsl:value-of select="." />
    </xsl:attribute>
  </xsl:template>
</xsl:stylesheet>

Task 3 – Load Data (Data Flow Task)

The final task is a straightforward data load using the XML Source Component pointed at the XML file I created.  Then I created a matching table in a database, which I used as the destination.
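For reference, here is a minimal sketch of the kind of destination table this load targets.  The table name and column types are assumptions based on the Departments output used later in this post, not the exact DDL from my environment.

-- Illustrative destination table for the Departments feed (names and types are assumptions)
CREATE TABLE dbo.DepartmentsFromOData
(
    DeptID    smallint     NOT NULL,
    DeptName  nvarchar(50) NOT NULL,
    GroupName nvarchar(50) NOT NULL
);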

Wrap Up on the ATOM Feed Option

This approach works with both SSIS 2008 and SSIS 2012.  I tested most of the work in 2012, but the code in the illustration supports 2008.  This option does require that the package write at least two files to the server to work correctly.  In some enterprise environments this will not be acceptable, either because the data now rests on the server for a period of time or because the admins do not want files created on the server.

Using a Custom SSIS Source to Get the Data

NOTE: This is the preferred solution, but it is not available in SSIS 2008, which uses the .NET 2.0 Framework.  This solution requires the .NET 3.5 Framework.

This version uses a custom SSIS source to connect to the OData feed and populate the data flow pipeline.  I did not find this option illustrated anywhere and used help from Dave and Sergey to put it together.  I spent many hours trying to solve this issue, and at the end of the day it is fairly simple.  So, hopefully, this will save you some time as well.

This package has only one task in the control flow – a Data Flow Task, which contains the rest of the code.  In the data flow, I have a Script Component implemented as a source and a Row Count with a data viewer on the pipeline to check results.


This was my first experience creating a custom source.  I used a post from SSIS Talk – SSIS Using a Script Component as a Source – as a reference.  If you need help creating your first script source, check it out.

Be sure to set up your outputs prior to creating the script, or you will not have them available to map to in the code.  You also need to add the HTTP Connection Manager you are using to point to your OData feed.

Add References, Using Statements, and Declarations

Once you have the basics set up, you need to add some references, including a service reference to the targeted data service and a reference to System.Data.Services.Client.  These are the key references for the code we are implementing.


Once you have these references, you will need to add the following using statements to the Namespaces region.

using System.Data.Services.Client;
using SC_68e99fec2dce4cd794450383662f6ac7.TestSvc;

The SC_ reference is the internal name for your script component and will be different from mine, although it will likely be in the same format.

Next, you need to add the following declarations in the ScriptMain class as shown here.

public class ScriptMain : UserComponent
{
    // The service URI and data service context used to query the OData feed
    private Uri svcUri = new Uri("http://localhost/ODataSQL/TestSvc1.svc");
    private AdventureWorksEntities context;

AdventureWorksEntities comes from the service reference I created; you will need to know the context name for the service reference you are using.

The Working Code: Using DataServiceQuery

In the CreateNewOutputRows method of the SSIS script, you add code that runs a DataServiceQuery and pushes the results into the data flow pipeline.  In my case, my output was called Departments, which created the DepartmentsBuffer reference you see in the code; it carries the output fields I defined for my source.  Here is the code I used to implement the solution.

public override void CreateNewOutputRows()
{
    // Connect to the OData service and query the Departments feed
    context = new AdventureWorksEntities(svcUri);
    DataServiceQuery<Department> dept = context.Departments;

    // Add a row to the output buffer for each department returned
    foreach (Department d in dept)
    {
        DepartmentsBuffer.AddRow();
        DepartmentsBuffer.DeptID = d.DepartmentID;
        DepartmentsBuffer.DeptName = d.Name;
        DepartmentsBuffer.GroupName = d.GroupName;
    }
}

This queries the service and returns the rows; that is all it really took to solve the problem.  While this solution does not work in SSIS 2008, if you are planning to use a lot of OData, I would recommend using it as another reason to upgrade to SQL Server 2012.

SSIS Needs an OData Source Component

What I found interesting is that Microsoft does not have a native method to load OData feeds into the Data Flow Task in SSIS.  I have since created a Connect item to see if we can get this added.  Vote here if you agree.

Resources Used throughout the Process

Connecting to Windows Azure SQL Database Through WCF

Loading Data from an ATOM Data Feed into SQL Server

SSIS – Using a Script Component as a Source

DataServiceContext Class

Chris Woodruff – 31 Days of OData Blog Series

PASSMN Presentation – July 17, 2012

Consuming SharePoint Lists via OData and SSIS – Uses Linq





Great User Group Meeting Tuesday, Now on to Madison and SQL Saturday #118

19 04 2012

Minnesota SQL Server User Group Meeting Review (4/17/12)

First of all, I have to thank the awesome speakers who presented at our user group meeting on Tuesday night, April 17.  Thanks to Mark Souza ( T | B ) for a great question and answer session.  He was open about SQL Server and where it is heading.  One item that interested me is that SQL Azure was running on the SQL Server 2012 platform in production by December 2011.  Microsoft is committed to releasing changes through the SQL Azure platform at a faster clip, which will allow them to continually improve the quality of releases for the boxed version as well.  (Speaking of which, the boxed version is not going away any time soon.)

Mark also spoke about leaving data at its source and using tools to bring the data together for reporting and analysis.  He touted the Hadoop integration efforts as a key part of this strategy.  If you have followed my blog for a bit, you know that I do a lot of work with Oracle and MSBI, so I had to ask when a good set of Oracle drivers would be available.  While he did not have a good answer at the time, he did suggest OData as a data access option to look at.  Look for a future blog post as I explore this protocol to ease some of my Oracle integration work.

After Mark finished answering the variety of questions asked by the crowd, we handed the microphone to Itzik Ben-Gan ( T | B ).  As usual, he took an entire hour to help us understand that we did not know all of the possible ways to use the APPLY operator.  I also learned about the DBCC OPTIMIZER_WHATIF command, which allows you to change the hardware settings the optimizer assumes when it creates a plan.  For instance, DBCC OPTIMIZER_WHATIF(1, 16) tells the optimizer to assume 16 processors when building its plan.
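As a rough sketch of how you might experiment with this, the following assumes the AdventureWorks sample database and a disposable test instance.  DBCC OPTIMIZER_WHATIF is undocumented and unsupported; my understanding is that the setting is scoped to the current session and that passing 0 reverts to the actual hardware, so treat those details as assumptions rather than documented behavior.

-- Undocumented and unsupported: run only on a test instance.
-- Ask the optimizer to assume 16 processors when compiling plans in this session.
DBCC OPTIMIZER_WHATIF(1, 16);

-- Compile a query (or just inspect its estimated plan) to see whether
-- the plan choice changes under the pretended hardware.
SELECT SalesOrderID, SUM(LineTotal) AS OrderTotal
FROM Sales.SalesOrderDetail
GROUP BY SalesOrderID;

-- Revert the processor count to the actual hardware (assumed behavior of 0).
DBCC OPTIMIZER_WHATIF(1, 0);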

SQL Saturday #118 – Madison, WI (4/21/12)

Finally, I will be out at SQL Saturday #118 in Madison this weekend.  I have two presentations there – A Window Into Your Data: Using SQL Window Functions and Performance Monitoring and Tuning in SSAS.  Maybe I will see you there.







