A Month, Almost Two Months Offline–Was the Silence Deafening?

17 06 2014

The past couple of months have been very busy at home and at work, so I have fallen behind a bit on my blogging. The big event was the graduation of my oldest child, Kristyna, from Burnsville High School. It was our first grad party, and a lot had to happen to make it go.

Before that party, we had our annual camping trip with friends. During that same period, I spoke at SQL Saturday in Orange County, where my wife joined me. She helped out at SQL Saturday, and we were able to visit my daughter's college choice – Biola University. I followed that up by organizing an internal technical conference at Magenic, Tech Summit. I was only able to attend part of that weekend's events as I wrapped up the weekend at Boy Scout camp.

All in all, my family took precedence, followed by increased work responsibilities. Up next is my sabbatical, which will be six weeks off at the end of the summer. I will try to get a few of the updates in my queue loaded up here, then more silence.

My advice to everyone: enjoy your family. They will matter more than work and contributions here.

Enjoy some pics from my months off…

Huntington Beach with my wife, Sheila (celebrated 21 years on June 5):

Huntington Beach

Our deck, ready for rain and for the grad party (yes, we are Scouts):

Deck for Grad Party

Here is my daughter Kristy with the family at her graduation:

Kristy and Family at Graduation

Andrew’s Dutch Oven cake at our Memorial Day Campout:

Drew and Cake

Finally, check out my Fun Info page to see my youngest daughter’s new video contribution.

Thanks for indulging me as I celebrate some extra family time here and at home. Next up, another Excel BI Tip.





Techfuse, a New Laptop, and How Microsoft Azure Helped Save the Day

24 04 2014

On Tuesday, April 22, I had the opportunity to speak at the Techfuse conference in Minneapolis. I was presenting a session on the differences between tabular and multidimensional models with a focus on the developer's experience. My deck has ten slides, including title, references, and bio. The rest of the time is spent in Visual Studio building out equivalent models using SSAS Tabular and SSAS Multidimensional.

The previous week, I had been issued a new laptop from my company, a Dell 7440. It is a very nice machine, and I had it all set for the presentation. About 11 AM (I was scheduled to speak at 1:15 PM), it occurred to me that I did not recall seeing a VGA port on it, only HDMI. Next question: did the projectors at the convention center support HDMI? Answer: no. Now I had about an hour and a half to resolve this issue. Simple enough, I decided – I would head downtown and get the converter from Office Depot, which was about 8 blocks away. I could do that and get some exercise in.

I took off at about 11:30. First, I stopped at Target, since it was closer. No luck. So on to Office Depot. Keep in mind that Office Depot sells laptops like mine with only HDMI support, so it stands to reason that they would carry the converter. No such luck. I was able to get HDMI converted to DVI, but that would not help, as I later found out. They directed me to Radio Shack, where I promptly picked up a DVI-to-VGA converter. Now I had three pieces that, when strung together, should have supported my needs. I headed back to the convention center and arrived with 30 minutes to spare. Working with the AV guy, we got it all plugged in, only to still have it not work. It turns out you need a converter that converts the digital signal to analog for use with the older projectors. Now what?

The moderator for my room offered me her laptop to use for the presentation, which was AWESOME! So now I had a way to give the presentation – all ten slides. However, she did not have Visual Studio with SSDT for BI and SQL Server installed, which was fine, because I didn't expect her to.

Here is where Azure comes in. I had created a VM with SQL Server Tabular installed along with Visual Studio 2012 and the SQL Server Data Tools for BI. So, I fired up the VM right before I gave the presentation. I warned the crowd about what had happened and decided to push the demos to the end of the presentation so everyone could leave if nothing worked and all the material could still be covered.

I was able to get into the VM and fire up Visual Studio. Since the demo was a live build of a tabular model and a multidimensional model, I used a database I had created in SQL Azure as the data source, and we built the models live. Granted, we were not able to do a complete multidimensional model because the database was not formatted as a star schema, but that helped highlight the difference in what needs to be done prior to development. Overall it went very well (I think; surveys are forthcoming…). At the end of the day, without the work I had been doing in Azure, I would not have been able to demo, and it would have been a very short presentation.

Some lessons learned:

  • Be sure to have what you need to support presenting in a variety of scenarios. I should have made sure to have a converter prior to the conference as most convention centers and other facilities haven’t upgraded their projectors yet.
  • I will likely set up Azure VMs to support more demos, just in case. It is always good to have a backup plan, though doing the demos over a wireless connection would have been painful.
  • Roll with it. Don't give up; try to make the best of a bad situation. People understand that things don't always go perfectly. At the end of the day, I came to talk about multidimensional and tabular model development. I could have opened the floor up for discussion and done Q&A. Make the most of every situation.




Oracle Tips for MSBI Devs #6: Supporting SSAS Tabular Development

14 04 2014

As SQL Server Analysis Services Tabular Models become more popular, models will use Oracle databases as sources. One of the key issues whenever you work with Oracle is understanding how to properly configure the necessary components to enable development.

Getting Started

If you have worked with Oracle before, you are very aware of a few things you need to be successful. First, you need to install the Oracle client. Here is where the details get messy. When you are working with the MSBI tools, you will be using SQL Server Data Tools in Visual Studio, which is still only 32 bit. Of the BI tools in SSDT, only SSIS has run modes supporting both 32 bit and 64 bit configurations. As a result, you need to install the 32 bit Oracle client in order to develop your tabular model.

Once that has been installed you will need to update the TNSNAMES.ORA file with the servers you will be targeting during development. Ideally, your Oracle DBAs have a file for you to use so you don’t need to create one. One nice thing is that the Oracle 12c client updates the PATH environment variable with the location of the bin folder. (Yes, Oracle still uses environment variables.) I would also recommend adding or using the TNS_ADMIN variable to specify the location of the TNSNAMES.ORA file. (See http://www.orafaq.com/wiki/TNS_ADMIN for details.)
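As a quick sketch of that variable setup, you can set it machine-wide from an elevated command prompt; the path below is only an example, so point it at the folder that actually holds your TNSNAMES.ORA:

REM Sets TNS_ADMIN machine-wide (requires an elevated prompt)
REM The path is an example; use your client's network\admin folder
setx TNS_ADMIN "C:\oracle\network\admin" /M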

NOTE: It took me many hours to work through a variety of configuration issues related to the Oracle client install. A couple of reinstalls, reboots, TNSNAMES.ORA tweaks, and lots of fruitless searching were all required to get this working. Be warned, working with Oracle clients is neither fun nor simple.

The Issue

Now that you have the 32 bit client installed you can connect to the Oracle database through the tabular model designer. As shown below, you can connect to Oracle through the Table Import Wizard.

image

You will be able to successfully test the connection as noted here.

image

And you will be able to execute a query and get results. You can also use the option to select tables and views.

image

However, once you decide to import the data you will encounter the following error:

image

The issue is that while you can do most of your work within Visual Studio using the 32 bit client, the import process targets the SQL Server tabular instance you specified when you created the project. While the 32 bit version of SQL Server is still available, most of us would not install it, even in our development environments. If you do not encounter this error, you are either using the 32 bit version of SQL Server or you have the 64 bit Oracle client installed (more on that next). As long as Visual Studio is only 32 bit and you choose to use the 64 bit version of SQL Server, you will see this issue.

The Resolution

The resolution is fairly simple. You need to download and install the 64 bit Oracle client as well. I would recommend that you get it installed and then reboot your development PC. While this may not be required, it seems to have helped me with a number of connectivity issues. Be prepared for some "interesting" issues, as you will now have more than one Oracle home installed and the potential for many types of ORA-XXXXX errors. Once you are up and running, you should be able to develop tabular models built on Oracle databases.

Some Parting Thoughts

First, I want to be clear that I think Oracle is a solid database platform. However, I have never been at a client site or on a project where the connectivity or client installs were totally correct or functional without some work between the Oracle team and the BI development team. I think the .NET driver is supposed to be better, and I may try that out for a later post (when I have the hours to spare).

I did the testing for this completely on Azure (and my Surface). I set up an Oracle VM and a SQL Server VM on Azure. The Microsoft team put together a great reference on setting up your Oracle VM. Check it out. I also did a previous post on setting up Oracle in an Azure VM. Both VM types can be pricey, but in a testing environment the cost was not too bad. I encourage you to use Azure for these types of scenarios. But be sure to turn the VMs off when you are done.





Ten Years and Counting …

14 03 2014

On March 8, 2014, I reached ten years of service at Magenic Technologies. When I started in 1999 (don’t do the math just yet), Magenic had only been around a few years and it was my first consulting job. I stayed at Magenic for about 5 1/2 years. During that time, I went from an application development consultant in Visual Basic to a Business Intelligence Architect specializing in SQL Server Analysis Services and MDX.

At that point in my career, I was beginning to travel more and take on a much larger leadership role as Magenic grew beyond that first office in Minneapolis, MN. I went to work for one of our customers, XATA Corporation, as their Data Architect. For the next 4 1/2 years, I learned a lot about transportation management and how important data is in that area. (My Twitter handle and domain name, DataOnWheels, was established during this time.) I also learned what it meant to live with your code and architecture after the consultant left. While at XATA, I continued to be involved in the SQL Server community, which was very helpful for the next step.

There came a time in my stint at XATA when it was clear it was time for me to move on. I started the process of looking at what was available in the market. When I went to Magenic's career site, I found that my job, or its nearest equivalent, was available. I called Carole Cuthbertson, "V-mom", and asked about coming back. We had lunch a week later, and the transition back to Magenic was set in motion. (So, if you leave a company you like for good reasons, never burn bridges! You might want to go back.)

When I told my wife that I was going back, she was glad. This kind of surprised me, as I had been doing a lot of traveling before I left. She noted that I was happier working at Magenic, which was better for our family. That was the final push I needed, and I returned to Magenic. As I come up on this anniversary, I am really at a great point in my career. I truly enjoy being a Practice Lead at Magenic and still consider it a great place to work.

I was recently talking with a customer who pointed out that communicating ideas in 3 points is very effective. Ironically, I probably should have known that from all of the ministry training I went through. So, I thought I would conclude this post with the three reasons why I enjoy working for Magenic.

1. Family

As I noted above, my wife was happy when I went back to Magenic. What I realized is that no matter how hard it got, Magenic would do what they could to help your family while you worked for them. I realize that everyone's experience is different, but I saw them work with consultants who were going through tough times with their families.

During my first couple of years there, I lost both of my maternal grandparents over two successive Christmases. I was close to my grandparents, so when the news came, I still remember Tim Wold, my project manager and Magenic's employee #1, telling me to go take care of my family and that he would work out the details on his end. That was characteristic of management and leaders throughout my time here. Family matters to the owners, Greg Frankenfield and Paul Fridman, and that is carried through the company.

My family has always been treated very well and they enjoy participating at the various events that the company has had from the Christmas party to the summer picnics.

2. Opportunity

When I joined Magenic, I had no idea what it meant to be a consultant. With mentoring and coaching, I was able to grow from a consultant to a practice lead. I am a technical leader in the company and have been able to expand my career throughout the years with their help. One of the biggest career changes was moving from application development to business intelligence. This has significantly shaped my career today in ways I would have never thought 15 years ago. During my time there, I was also encouraged to give back both internally and externally. This has opened up some great opportunities through the years.

3. Appreciation

Appreciation can take on a couple of different forms. Of course, being paid to do what you love is awesome! Even better when it pays well. However, there is more to appreciation than the paycheck. Magenic has an interesting culture which I had to learn through the years. Because we are hired to go work at customer sites, it was not always known what or how we were doing. Magenic is a place where you can, and should, speak up about what you're doing. While it seems counterintuitive and self-serving at times, letting managers and other leaders know what I was doing paid dividends for my career. Just as importantly, they were able to understand more about what was going on.

In my experience with Magenic, these are the 3 key reasons that I returned and stay today. I am now at a place in my career where I need to give recognition to those to whom it is due more than receive it. I have the opportunity to impact others in their careers and to help Magenic continue to grow.

I encourage all of you who read this to look at where you work. What is the culture like? Do you like what you do? Can you see yourself there in 5, 10, 15 years? Keep your core values in mind as you seek to change companies or careers. If you have had a job move conversation with me, I have likely told you something like "The grass is always greener on the other side, but a lot of times it's because of the manure."





Setting Up Tabular Models on Windows Azure

12 03 2014

In my last post, I discussed how to set up Oracle in Windows Azure. During a customer call, there were questions about using SQL Server Analysis Services (SSAS) tabular models with Azure. This post will walk through setting up an Azure VM and deploying a tabular model to it.

If you do not have a Windows Azure account, you can use a trial account with your Microsoft or Live account. Check out http://www.windowsazure.com for details on how to "try it free."

Setting Up the VM in Azure

From the Management Portal on your Azure account, select Virtual Machines on the left, then Add at the bottom. On the next screen, choose to create your VM from the gallery. You should see the Choose an Image option as seen below. As you can see, I have the SQL Server 2012 SP1 Enterprise image highlighted. You will need to use the Enterprise edition, as Tabular does not run on Standard. Note that the Data Warehousing image is NOT the Business Intelligence Edition of SQL Server.

image

You can also choose to create a “blank” VM and load up SQL Server on your own. I chose to use the image to speed up the process – which it did substantially.

After selecting the image, the next few steps guide you through setting up the VM. For the most part, the defaults will work fine. Be aware that once this is turned on, you will be charged for it running. It is up to you to make sure you understand the costs, even if you are using the free trial.

During the setup steps, you will create the VM and its related cloud service. Remember that the account you create is your admin account for the VM, and you will need those credentials to Remote Desktop into the VM. The last setup page covers the Endpoints. Leave the defaults; we will add an endpoint for our tabular model later.

At this point, it will take a few minutes to set up your new VM. Once it is set up, open a Remote Desktop session into it. If you look at the services or at the SQL Server Configuration Manager, you will notice that everything except a tabular instance has been set up for you. As a result, I would not recommend using this gallery image for a production deployment. You should look at creating your own template if you want a more locked down and refined setup.

Setting Up the Tabular Instance in Azure

As noted before, the tabular instance is not set up. The installation media is on the server, so you can run that to install your tabular instance. I won’t walk through the install process, but this was great to find because that meant I did not have to copy media to my VM.

Making the Tabular Instance Available

This section covers the tedious tasks required to make your tabular instance available for querying outside of the VM. There are three basic steps to getting your instance “online”: setting the port number in SSAS, updating the firewall, and adding endpoints. I will walk you through the steps I used to get this done followed by some references that helped me get here.

Setting the Port Number in SSAS

By default, SSAS instances, both multidimensional and tabular, use dynamic ports. In order to allow connections through the firewall and endpoints, the port number needs to be fixed. I used guidance from TechNet and the following steps to set the port.

    1. Opened the Task Manager to get the PID for MSOLAP$<<instance name>>.
    2. Ran netstat -ao -p TCP to get a list of ports used by current processes. Once I had identified my port number, I also noted the server IP address, which is required in the next step.
    3. I chose to confirm that I had the correct port by connecting to the instance using the IP address and port number.
    4. Next, we have to go old school and modify the msmdsrv.ini file. The typical install path for this file is C:\Program Files\Microsoft SQL Server\<<instance name>>\OLAP\Config\msmdsrv.ini.
    5. Open the file in notepad and find the <Port>0</Port> tag.
    6. Change the port number to the port number that was identified above. (Technically we could have used any open port number. I chose to do this because I was sure the port number was available.)
    7. Save the changes and restart the service.
    8. Once again confirm you can connect to the server with SSMS using the IP address and port number.

Now you have set up the SSAS instance to use a fixed port number.
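For reference, the relevant fragment of msmdsrv.ini ends up looking something like the sketch below; 2383 is only an example value, so use the port you confirmed above:

<ConfigurationSettings>
  <!-- ...all other settings left unchanged... -->
  <Port>2383</Port>
</ConfigurationSettings>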

Updating the Firewall

Now that we have a port number, we can create a firewall rule. We access the firewall rules from the Server Manager. In the Windows Firewall console, we will be creating a new Inbound Rule.

image

  1. The rule type is Port.
  2. We apply the rule to TCP and specify the port we defined above.
  3. On the Action tab, we select Allow the Connection. (If you are planning to use this in a production environment, you will need to verify your connection requirements.)
  4. Next, we clear any profile we don't want the rule to apply to.
  5. Finally, we name the rule with a descriptive name.

Now we have the firewall rule in place to allow external connections for the tabular instance.
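If you prefer scripting to clicking through the console, an equivalent rule can be created from an elevated command prompt; this sketch again assumes the example port 2383:

REM Inbound allow rule for the fixed SSAS port (example port 2383)
netsh advfirewall firewall add rule name="SSAS Tabular 2383" dir=in action=allow protocol=TCP localport=2383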

Adding Endpoints

The final step to making the instance available is to add the endpoint in Azure. In the Windows Azure portal, we need to go to the VMs page again, select our VM, and open the ENDPOINTS tab. From here we create a new endpoint.

  1. We are creating a standalone endpoint.
  2. Next, we select the TCP protocol and add the port number to both the private and public port textboxes.
  3. Finally, we apply the changes.

We are now ready to test connectivity.

References

Setting up VM Endpoints

Configuring Windows Firewall

Configuring Windows Firewall with SSAS

Connecting to the Tabular Instance in Azure

So to verify this works, I wanted to connect to the model with SSMS on my desktop. However, SSAS uses Windows authentication, and I am not on the same domain. (My IT staff is likely still laughing about me joining my VM to our domain.)

Thankfully, Greg Galloway (blog) reminded me of how to set up runas to use different credentials to connect to SSAS. Here is the syntax I used to connect to the tabular instance on Azure using a command window:

runas /netonly /user:<<VM name>>\<<username>> "C:\Program Files (x86)\Microsoft SQL Server\110\Tools\Binn\ManagementStudio\Ssms.exe"

This allowed me to open SSMS using those credentials. Next, I used the VM address plus the port to connect. You will see that the Windows credentials in the connection dialog are not what you entered on the command line. This is expected; the credentials you entered on the command line are passed through to the VM.

Deploying a Tabular Model to Azure

I attempted three different ways to deploy my model to the Azure VM. Two worked, one did not.

Deploying from Visual Studio on My Desktop

My first attempt was to deploy from Visual Studio on my desktop. I set the deployment properties to point to the Azure instance using the same credentials I had used with SSMS. I also set up a command line execution to use runas, as I had with SSMS.

It appeared I could connect, but I continually got the permissions error shown below. After much frustration, I gave up on it and moved on to the next option.

image

Using XMLA to Deploy

This is the most straightforward way to deploy an SSAS database to a different server. I used SSMS to generate the Create Database XMLA statement. Because I had not deployed the model locally, I needed to modify the XMLA statement to remove the user name and GUID from the database name and database ID. (AdvWorksTab1_SteveH_<<Some GUID>>)

In a bit of irony, I could use the SSMS connection from my desktop, opened with runas as described above, to deploy the database to the VM.

The reality is that this is an easy and acceptable way to deploy SSAS databases to production environments, whether in Azure or in your own datacenter.
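For reference, here is a heavily trimmed sketch of the shape of that XMLA statement after the name and ID are cleaned up; SSMS generates the full database definition between the Database tags:

<Create xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <ObjectDefinition>
    <Database>
      <ID>AdvWorksTab1</ID>
      <Name>AdvWorksTab1</Name>
      <!-- full model definition generated by SSMS goes here -->
    </Database>
  </ObjectDefinition>
</Create>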

Deploying from Visual Studio on the VM

The final method was deploying from Visual Studio on the VM itself. I used VisualStudio.com (TFS Online) to transfer the source code from my desktop to the VM. I had to install the TFS client on the VM, but SSDT with the BI tools was already there.

  1. Installed the VS 2010 TFS Client: http://www.microsoft.com/en-us/download/details.aspx?id=329
  2. Then installed Visual Studio SP1  http://social.msdn.microsoft.com/Forums/vstudio/en-US/4e4851dc-eb29-4081-9484-d38a6efa07ee/unable-to-connect-to-tfs-online?forum=TFService
  3. Finally installed VS2010 Team Foundation Server Compatibility GDR (KB2662296) http://www.microsoft.com/en-us/download/details.aspx?id=29082

Now it will connect to TFS Online. I got the latest from my project and pointed the project to my tabular instance.

Be sure to check the impersonation settings.

Next, I deployed the project to the local tabular instance on the VM and it worked great. This might make sense for development, but I would not use this method in a production environment.

Some Closing Thoughts

I was amazed at how simple it was to create the VM and set up tabular in Azure. Knowing what I know now, I would be able to set up a usable instance fairly quickly and deploy a database using XMLA without much effort. That was very nice.

Doesn’t work with Office 365

I started this project to determine the connectivity capability with Office 365. Well, that does not work in my current configuration. I was able to create a workbook on my desktop using my Azure tabular model and Excel, and it worked just as you would expect. However, when I deployed the workbook to Office 365, the data refresh always failed.

image

My next thought was to use a data gateway, but those only work with the SQL Server database engine and Oracle, not SSAS. This is still a significant obstacle to Power BI fully taking advantage of existing BI assets at a company.

Using Azure Active Directory

My next step would be to use Azure Active Directory services to try to get Windows Authentication to work throughout. But that is for a later project and post.





Oracle Tips for MSBI Devs #5: Working with Oracle on Windows Azure

23 02 2014

As you have likely noticed in my series, Oracle Tips for MSBI Devs, I have done a lot of work with Oracle through the years while delivering BI solutions. One of the pain points of working with Oracle in development is setting up an Oracle development server. Even though I have installed Oracle a number of times, it never seems to be an easy process.

So, I decided to try out the Oracle Virtual Machine template in Windows Azure. I will walk through the setup process here. I need to use Oracle as a data source for some SSIS development.

Setting Up the VM

From the Windows Azure portal, select the Virtual Machines tab then “Create a Virtual Machine”. This will open up the variety of options available to create the VM. Select the FROM GALLERY option which will open another dialog.

image

On the next screen, you pick the edition of Oracle you want to use for the VM. (NOTE: at the moment, Oracle images are in preview. Microsoft recently announced that Oracle VMs will be available on March 12. You can find more information here.)

image

I will be using the Oracle 11g R2 Standard Edition on Windows Server 2008 R2. The next step is to name the VM and pick its size. The minimum size for this is Small, which is what I used. I then completed the setup, including setting up the endpoints and creating a new user.

I had originally tried to use Oracle 12c, but significant changes have been made to support multitenancy, which makes setup considerably more tedious, with very few good examples available on the web. Most of the advice given by Oracle pros was to "Read the _____ Manual!" While "sensible", I just needed a simple dev environment. This is one of the significant advantages of working with SQL Server: community help is abundant and usually pleasant. For instance, Microsoft recently published a document for setting up the Oracle 12c VM. I used it to work through some of the setup instructions below.

Once the initialization was complete, I used the Connect button on the Azure dashboard to open an RDP connection to the VM. One thing to keep in mind: be sure to keep track of the user name and password you created, as this is your admin account and you will need it to log in to the VM. Now you have a running VM. At this point, I went and found the Oracle tools that I typically use and pinned them to the task bar.

Creating and Loading an Oracle Schema

Because I always forget some of these steps, and I really don't want to read the manual, I have listed the steps I used to create a simple set of data. This is not good enough for production, but it is a starting point.

Step 1: Create a Listener. This is required before you can create a database. To do this open the Oracle Net Configuration Assistant. From here you can create your first listener. I left the default settings for the setup.

Step 2: Create the database. (This is the equivalent of an instance in SQL Server.) I used the Database Configuration Assistant for Oracle Database to create my first database on the server. This can be found in the Oracle home directory on the Start menu. I chose the General Purpose template for my database. Most of the steps make some sense. I did choose to add the sample schemas, as this is the easiest way to verify that I can connect and work with the data. After all of this, the database will be created based on your choices.

Step 3: Using SQL*Plus, I connected to the SYSTEM schema. The user name in this case is "SYSTEM". Now we are ready to create a user of our own.
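As a sketch, assuming you named the database ORCL when you created it, that SQL*Plus connection looks like this:

sqlplus SYSTEM/<<add password here>>@ORCL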

Step 4: Create a new user and schema. (This is similar to a SQL Server database, not a SQL Server schema.) This will give you a location to create tables for loading data in the next steps. In accordance with typical Oracle support, you can read about how to do this here: http://docs.oracle.com/cd/B19306_01/server.102/b14200/statements_8003.htm#i2065278. Or I can give you a good starting script and save you time.

CREATE USER someusername
IDENTIFIED BY <<add password here>>
DEFAULT TABLESPACE example
QUOTA 10M ON example
TEMPORARY TABLESPACE temp
QUOTA 5M ON system;

You're welcome! This will create a user with quotas on existing tablespaces. This is NOT a production level script, and it is barely good enough for development. But in my case, I am using Oracle as a source and don't plan to do much development work on it, so it meets my needs. If you need more insight, I guess you will need to read the documentation.

Step 5: Create a table and add rows. I continued to use the SYSTEM login and created a couple of simple tables within my new schema, then used simple INSERT INTO statements to add data. A sketch follows.
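Here is a minimal example of what that can look like; the table name and columns are made up for illustration, and the owner is the user created in Step 4:

-- Example table in the new schema
CREATE TABLE someusername.SALES_SAMPLE
( SALE_ID    NUMBER PRIMARY KEY,
  SALE_DESC  VARCHAR2(100),
  SALE_AMT   NUMBER(10,2) );

INSERT INTO someusername.SALES_SAMPLE VALUES (1, 'First sale', 100.00);
INSERT INTO someusername.SALES_SAMPLE VALUES (2, 'Second sale', 250.50);
COMMIT;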

Now you have some basic data to work with to test connectivity with SSIS or SSAS.

Making the Oracle Database Accessible

In order to access your Oracle database on the VM you need to enable the port. You do this by going to the Azure portal and selecting the VMs tab. Once there, go to the Endpoints tab. You may recall that when you created the VM, you were asked about the Remote Desktop and PowerShell ports. Here are the steps to create the Endpoint to support Oracle.

  1. Click Add to open the Add Endpoint dialog.
  2. On the first page, leave the default which will add a stand-alone endpoint.
  3. On the second page you need to add a name (I used “Oracle”), select the TCP protocol, and put port 1521 in both the private and public port textboxes.

Once completed you should see the new endpoint in the list of available endpoints as shown below.

image

Connecting SSIS to that Oracle Database

Now that we have data in the database, it is time to create the connection in SSIS so we can load data and run queries.

The first thing I needed to do was load the client tools. I used the newer Oracle Data Access Components (ODTwithODAC12012). Once that was loaded, I added the following entry to my TNSNAMES.ORA file (look for it in a directory like c:\app\<username>\product\12.1.0\client_1\Network\Admin):

ORACLEDW =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = <servername>.cloudapp.net)(PORT = 1521))
    (CONNECT_DATA =
      (SERVER = DEDICATED)
      (SERVICE_NAME = ORACLEDW)
    )
  )

The key parts to get right are the HOST, PORT, and SERVICE_NAME shown above.

Once TNS was in place, I was able to create an ODP.NET connection to the database and proceed to load the data.
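As a rough sketch, that ODP.NET connection string simply uses the TNS alias as its data source, here with the example user created earlier:

Data Source=ORACLEDW;User Id=someusername;Password=<<add password here>>;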

I know that some of this has been simplistic, but it is great that I don't have to install Oracle myself. This functionality makes Azure even more appealing as a hosting solution.





Rochester – Welcome to the PASS MN Family

22 01 2014

It started as a vision for expanding the reach of the Minnesota SQL Server User Group a year ago. At the time, Paul Timmerman ( T | B ) and I were discussing the fact that only one user group existed in Minnesota. As we looked at other areas and talked with SQL pros from St. Cloud and Rochester, a dream was born: what if PASSMN, a large and very mature user group, helped kick off meetings in other areas? Throughout the past year, Paul has been working with Patrick out of Rochester to make this a reality.

(Image source: http://binged.it/1eB310K)

Tonight was our first meeting in Rochester. We had an awesome turnout, with over twenty people attending. Everyone is already looking forward to the next meeting.

Paul and the crowd
Paul presenting

To begin with, the Rochester group will be supported by PASSMN as an extension group. The meetings are planned to be quarterly, and we will be working with local and regional speakers to participate. Our hope is that the group will grow and become a stand-alone chapter. However, the goal of PASS is community and training. Whether independent or an extension group, the goal is to meet those needs within the SQL Server development community wherever we can.

PASSMNLogo

For more information about PASS MN and PASS MN – Rochester meetings, check out http://minnesota.sqlpass.org/.







