Troubleshooting a Data Type Issue during SSAS Processing

Situation

I have built a cube on a series of views in SQL Server that mimic similar views in Oracle.  The views create a “virtual star schema” over an operational data store that consists of imports from various mainframe sources.  There is no plan at the moment to move to a stored star schema, so views are being used to fill the gap.  During the development cycle, performance issues arose in the Oracle environment.  While those issues were being investigated, the data store tables used to support the SSAS project were moved to a SQL Server instance in the interim.  We retained the views to minimize the impact on delivery once the Oracle issues are resolved.

I am currently deploying changes only from Visual Studio (see the image below for the Project Configuration settings).  After deploying, I use SSMS with XMLA to process the dimensions and measure groups in a targeted fashion (a sample process command follows the image).

[Image: Project Configuration settings in Visual Studio]
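
For reference, a targeted dimension process issued from SSMS looks something like the following XMLA.  This is a minimal sketch; the database and dimension IDs are hypothetical placeholders rather than the actual objects from this project.

<Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Process>
    <Object>
      <DatabaseID>MySsasDb</DatabaseID>
      <DimensionID>Dim Customer</DimensionID>
    </Object>
    <!-- ProcessUpdate picks up dimension data changes without fully reprocessing dependent partitions -->
    <Type>ProcessUpdate</Type>
  </Process>
</Batch>

A measure group can be processed the same way by swapping the Object element for a CubeID/MeasureGroupID pair and using a type such as ProcessFull.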

Issue

Along the way, the data types in the views had to be adjusted to support the more robust data typing in SQL Server.  As a result of the data type changes, I started to see errors during processing noting that the data was being truncated.

Troubleshooting

I tried a number of things to resolve this particular error, fixing various other, unrelated issues along the way.  Here are the activities I tried:

  • Rebuilt all dimensions.
  • Reviewed ALL relationships, keys, and measures for offending data types.  Some of this was reviewed through the BIDS Helper tool’s Dimension Data Type Discrepancy Check.
  • Reduced the data set.  This returned the error faster, but also allowed for quicker processing during the resolution of the issue.
  • Eliminated relationships and measures.  As the issue was clearly in the measure group, I removed the relationships and the currency measure from the measure group.  This worked.

Resolution

The last step was the key piece.  Removing the relationships and then adding them back in reset all of the relationship values.  While I had adjusted the data types manually, it appears that I needed to remove all of the relationships and then add them back in as I processed.  That resolved the issue I was encountering.

The moral of this story is that the BIDS Helper tool showed me where the error was likely occurring, but as is the case with many of the BI tools in Visual Studio, the change did not propagate to all of the required locations within the XMLA until I retouched them all.  Now if I can just get the memory to release properly, but that is a topic for another day.

SSAS 2008 R2 Operations Guide Released by SQLCAT

If you work in SSAS, you should become very familiar with this new guide from the SQL Server Customer Advisory Team (aka SQLCAT).

http://msdn.microsoft.com/en-us/library/hh226085.aspx

They have also previously published the Analysis Services 2008 Performance Guide.

http://sqlcat.com/whitepapers/archive/2009/02/15/the-analysis-services-2008-performance-guide.aspx

Together these resources provide valuable insight for anyone developing solutions with or managing instances of Analysis Services. 

Simple batch script to generate XMLA and deploy SSAS DB

As I was working through some issues with my cube design, I realized that I wanted to deploy my SSAS databases using XMLA in SSMS.  This can be easily scripted from SSMS, but I had not deployed the database yet.  As a result, I created a batch file to build the XMLA used to deploy a database.  This script can be used for automated builds, or even automated deployments if that fits your model.  I don’t write a lot of batch files, so you may want to update the syntax as needed.

The script I used is here (I saved this as a .bat file on my desktop):

—————SCRIPT START———————-

REM Use this batch file to build and create the database XMLA script

ECHO ON
ECHO Build started . . .
"C:\Program Files (x86)\Microsoft Visual Studio 9.0\Common7\IDE\devenv.exe" "<<location of your SSAS solution or project>>" /build development /out ssasbuild.log

ECHO Build completed . . .
ECHO XMLA Script generation started

Microsoft.AnalysisServices.Deployment.exe "<<project path>>\bin\<<ProjectName>>.asdatabase" /d /o:"<<target location and file name>>.xmla"

ECHO XMLA Script generation complete . . .
ECHO OFF

ECHO Completed
PAUSE

—————SCRIPT END———————-

The two key executables are devenv.exe and Microsoft.AnalysisServices.Deployment.exe.

Devenv.exe is the Visual Studio executable, and it may not be in the same location on your machine as on mine.  If you have multiple versions of Visual Studio installed, be sure to pick the correct one.  The parameters used are:

  • The location of the SSAS solution (.sln) or project (.dwproj).  Keep in mind that if you build the solution, you will build all of the projects in your solution.
  • /build selects which configuration you are planning to build.  This is a part of the project properties.  In my case, I picked the development build.
  • /out specifies the destination of the log file.  This log file will log the warnings and errors that occur during the build process.

The next executable is the actual deployment utility, Microsoft.AnalysisServices.Deployment.exe.  Building the project produces an .asdatabase file, and this utility generates an XMLA script from that file.  If you built multiple projects in the previous step, you will need to repeat this step for each database you wish to create an XMLA script for.  The parameters used are:

  • The location of the .asdatabase file.  You will usually find it in the bin folder of your project after you have built the project.
  • /d specifies that the deployment utility should not connect to the target server while building the XMLA, which is what we want in this scenario.
  • /o specifies the file location and name of the XMLA output.

(NOTE:  The ECHO and PAUSE statements were used to make this more friendly for my use and are not required to build or deploy the XMLA script.)

Once you have the XMLA file, you can open it in SSMS and choose the target SSAS instance you want to deploy to.  You can also make any changes required prior to deployment, such as updating data source connection strings.
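
For example, the generated script contains the database’s data source definitions.  The excerpt below is a rough sketch of where the connection string lives (elements omitted; the IDs, provider, and server name are hypothetical):

<Create xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <ObjectDefinition>
    <Database>
      <DataSources>
        <DataSource>
          <ID>MyDataSource</ID>
          <Name>MyDataSource</Name>
          <!-- Update this value when deploying to a different environment -->
          <ConnectionString>Provider=SQLNCLI10.1;Data Source=DEVSERVER;Integrated Security=SSPI;Initial Catalog=MyODS</ConnectionString>
        </DataSource>
      </DataSources>
    </Database>
  </ObjectDefinition>
</Create>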

MSDN reference for the deployment utility:  http://msdn.microsoft.com/en-us/library/ms162758.aspx

I also referred to Vidas Matelis’s FAQ on creating the same kind of script for SSAS 2005 while constructing my SSAS 2008 script:  http://www.ssas-info.com/analysis-services-faq/29-mgmt/104-how-create-a-script-that-generates-xmla-script-from-solution-project.

Installing PowerPivot on an Existing Farm

At my current customer, we are putting together a sandbox MSBI environment that includes SharePoint.  We installed and configured SharePoint, SQL Server, Analysis Services, and integrated Reporting Services.  I was preparing to put together a PowerPivot demo in SharePoint and started to install it.  Here is where the fun began.

First, you need to use the SQL Server 2008 R2 installation and choose the SQL Server PowerPivot for SharePoint setup role, selecting the option to add it to an existing farm.

Next, you need to log in using your farm admin account, and from what I could piece together, that user MUST have the following privileges:

  • SysAdmin on SQL Server
  • Local and Domain Administrator
  • SharePoint Farm Administrator

Of course, all of this violates any concept of minimum privileges for users.  While this level of access may not have been absolutely necessary, it is definitely where I ended up to get this working.  (The moral of this story is to install PowerPivot on new farms only?)

Here are some of the links I used to get me pointed in the right direction.  I would be interested in hearing if anyone else has had this issue and resolved it differently.

Now that it is running I can put the rest of my demo together.  This sure seemed more painful than it had to be.

SQL Azure’s place in the data world (Part 3)

In the first two parts of this topic, I discussed how data is managed in SQL Azure and what the cost would be to an organization.  In this installment, I wanted to propose some data solutions that might be a fit for a SQL Azure database.

Here is some quick history that shapes my thoughts on this.  At Magenic, my focus is on business intelligence solutions on the Microsoft SQL Server product stack.  Prior to returning to Magenic, I worked at XATA Corporation as the data architect for a hosted SaaS solution serving the trucking industry.  My first suggestion comes out of that background.  In SaaS (Software as a Service), customers often use tools provided by the vendor for specific tasks, and those tasks generate data.  Most SaaS solutions offer some form of reporting or query-based analysis of the data that is collected.  However, some users require a greater level of interaction with the data.  The issue is that the data is stored in the SaaS vendor’s multi-tenant data store, which is usually not conducive to having ad hoc queries run against it.  This has led to one of the most common solutions to the problem: export the data to the customer.  The customer must then host the data on premise and is often responsible for merging new data as it arrives from the vendor.

In this scenario, SQL Azure could act as the “go-between” for the multi-tenant data store and the customer.  This would allow the vendor to provide a value-added BI service that the customer can leverage in a number of ways, including reports, Excel, and even Access.  The vendor keeps the data synchronized, and the customer leverages the data as needed.

Beyond the SaaS ad hoc BI solution, SQL Azure can be used to support the development of solutions that require a shared data store without putting the data on each desktop.  In keeping with the concept of the cloud being “anywhere”, SQL Azure can also be used to support distributed solutions that require a SQL data store to function.

Obviously, there are still issues with using SQL Azure as a primary, production data store due to the lower SLAs from Microsoft.  However, it is not too early to investigate creative ways that will help your business leverage a relational database in the cloud.  I would encourage you to check out the free trial options from Microsoft to experiment with the platform.