I presented at three events in the past 10 days.
- Magenic Lunch & Learn – SSRS Training – Overview of SQL Server Reporting Services functionality.
- Minnesota SQL Server User Group – SQL Azure – a good discussion on SQL Azure, its capabilities, and its cost effectiveness for our solutions.
- Techfuse 2011 – Performance Monitoring and Tuning with SSAS – a look at the query execution workflow and the tools to monitor and tune Analysis Services.
I hope these provide value to you. Feel free to comment here with questions related to any of these presentations.
Last week, I got a look at the new Microsoft Technology Center in Minneapolis. Very cool place. I am glad to see one here that we will be able to use to help our customers with their POCs. They have a training room, a number of POC areas, and a server farm you can leverage.
The Grand Opening for the MTC is Tuesday, April 5. If you are able to take time to check out this cool new facility, I encourage you to do so. You can find out more at http://www.microsoft.com/en-us/mtc/locations/minneapolis.aspx.
At my current customer, we are putting together a sandbox MSBI environment with SharePoint as well. We installed and configured SharePoint, SQL Server, Analysis Services, and integrated Reporting Services. I was preparing to put together a PowerPivot demo in SharePoint and started to install it. So here is where the fun began.
First, you need to use the SQL Server 2008 R2 installation media to add PowerPivot for SharePoint to the farm.
Next, you need to log in using your Farm admin account and, from what I could piece together, that user MUST have the following privileges:
- SysAdmin on SQL Server
- Local and Domain Administrator
- SharePoint Farm Administrator
Of course, all of this violates any concept of least privilege. While this level of access may not have been absolutely necessary, it is definitely where I ended up to get this working. (The moral of this story is to install PowerPivot on new farms only?)
Here are some of the links I used to get me pointed in the right direction. I would be interested in hearing if anyone else has had this issue and resolved it differently.
Now that it is running, I can put the rest of my demo together. This sure seemed more painful than it had to be.
February PASSMN Meeting & Newsletter
Sponsored by Digineer
8300 Norman Center Drive, 9th Floor, Bloomington, MN 55437
February 15th, 2011
3:00 – 5:00
Please click here for meeting details and to RSVP
High Availability & DR Options for SQL Server
Tim Plas, Virteva
A comparison of SQL HA & DR options, by a practitioner who has implemented & managed all the SQL HA and DR approaches (& various combinations thereof). Tim is an operational DBA, charged with keeping SQL servers up & running & optimized, for managed-services customers. We will compare trade-offs between the various SQL HA & DR options: for complexity, usability, hardware requirements, licensing, failover speed, initial costs, ongoing support costs, staff skill requirements, etc.
Also, as you may be aware, Microsoft has announced a set of very powerful “AlwaysOn” features for the upcoming version of SQL (“Denali”), features popularly referred to as “HADRON” (“High Availability Disaster Recovery always ON”). We’ll provide a brief overview of those features now, and will have a full presentation on that later in the year.
In the first two parts of this topic, I discussed how data is managed in SQL Azure and what the cost would be to an organization. In this installment, I wanted to propose some data solutions that might be a fit for a SQL Azure database.
Here is some quick history that is shaping my thoughts on this. While at Magenic, my focus is on business intelligence solutions on the Microsoft SQL Server product stack. Prior to returning to Magenic, I worked at XATA Corporation as the data architect for a hosted, SaaS solution serving the trucking industry. My first suggestion comes out of this background. In SaaS (Software as a Service), customers often use tools provided by the vendor for specific tasks, and those tasks generate data. Most SaaS solutions offer some form of reporting or query-based analysis for the data that is collected. However, some users require a greater level of interaction with the data. The issue is that the data is stored in the SaaS vendor’s multi-tenant data store, which is usually not conducive to having ad hoc queries run against it. This has led to one of the most common solutions to this problem – export the data to the customer. The problem is that the customer must now host the data on premise and is often responsible for merging new data as it comes from the vendor.

In this scenario, SQL Azure could act as the “go-between” between the multi-tenant data store and the customer. This allows the vendor to provide a value-added BI service that the customer can leverage in a number of ways, including reports, Excel, and even Access. The vendor keeps the data synchronized, and the customer can leverage the data as needed.
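To make the go-between idea concrete, here is a minimal sketch of the incremental sync the vendor would run. This is an illustration only, not an actual pipeline: sqlite3 stands in for both the multi-tenant store and the customer’s SQL Azure database, and the trips table, tenant IDs, and column names are all invented for the example. The sync keeps a high-water mark so only new rows move on each pass.

```python
import sqlite3

def sync_tenant(source, target, tenant_id):
    """Copy one tenant's rows that the per-customer target has not seen yet."""
    target.execute(
        "CREATE TABLE IF NOT EXISTS trips (id INTEGER PRIMARY KEY, miles REAL)"
    )
    # High-water mark: only pull rows newer than what we already have.
    last_id = target.execute("SELECT COALESCE(MAX(id), 0) FROM trips").fetchone()[0]
    rows = source.execute(
        "SELECT id, miles FROM trips WHERE tenant_id = ? AND id > ?",
        (tenant_id, last_id),
    ).fetchall()
    target.executemany("INSERT INTO trips (id, miles) VALUES (?, ?)", rows)
    target.commit()
    return len(rows)

# Multi-tenant source with two customers' data mixed together (invented data).
source = sqlite3.connect(":memory:")
source.execute("CREATE TABLE trips (id INTEGER, tenant_id INTEGER, miles REAL)")
source.executemany(
    "INSERT INTO trips VALUES (?, ?, ?)",
    [(1, 100, 12.5), (2, 200, 40.0), (3, 100, 7.25)],
)

# Per-customer target, standing in for that customer's SQL Azure database.
target = sqlite3.connect(":memory:")
copied = sync_tenant(source, target, tenant_id=100)
print(copied)  # only tenant 100's rows are copied
```

Running the sync a second time copies nothing, since the high-water mark already covers every row for that tenant – which is exactly the "vendor keeps the data synchronized" behavior described above.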
Beyond the SaaS BI ad hoc solution, SQL Azure can be used to support development of solutions that require a shared data store without putting the data on each desktop. In keeping with the concept of the cloud being “anywhere”, SQL Azure can also be used to support distributed solutions that require a SQL data store to function.
Obviously, there are still issues with using SQL Azure as a primary, production data store due to the lower SLAs from Microsoft. However, it is not too early to investigate creative ways that will help your business leverage a relational database in the cloud. I would encourage you to check out the free trial options from Microsoft to experiment with the platform.
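If you do try the trial, connecting to a SQL Azure database looks much like any other ODBC connection, with a couple of quirks: the server is reached over TCP port 1433, encryption should be on, and the login is typically given as user@server. The snippet below only builds the connection string – the server name, database, and credentials are placeholders, not a real account.

```python
# Sketch of an ODBC connection string for SQL Azure.
# All identifiers here are placeholders, not a real deployment.
server = "myserver.database.windows.net"  # hypothetical server name

conn_str = (
    "Driver={SQL Server Native Client 10.0};"
    f"Server=tcp:{server},1433;"   # SQL Azure is reached over TCP 1433
    "Database=CustomerSandbox;"    # hypothetical database name
    "Uid=sandbox_user@myserver;"   # logins are typically user@server
    "Pwd=<password>;"
    "Encrypt=yes;"                 # always encrypt the connection
)
print(conn_str)
```

With a driver such as pyodbc installed, passing this string to `pyodbc.connect(conn_str)` would return a connection you can query like any other SQL Server database.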