Recovering SharePoint 2010 with BI on Windows 7

While working with Microsoft BI projects, I am often required to deliver the visualizations through SharePoint.  On my current project, I have a SharePoint 2010 development environment on Windows 7 that runs SharePoint and all of the SQL Server services.  Last week I was preparing to demo our progress when the site became unresponsive.  The demo site showed the wonderfully descriptive message:  “An unexpected error has occurred.”  So, I attempted to open the management console, but that page could not be found (404 error).  I turned on the expanded error messaging by modifying the web.config file and checked the log files.  The errors pointed to the inability to connect to a database.  Upon further investigation I discovered that three of our databases had been marked suspect, including the admin and content databases.
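For reference, the expanded error messaging comes from a couple of edits to the web application's web.config (typically under C:\inetpub\wwwroot\wss\VirtualDirectories\<port>).  This is a trimmed sketch of just the relevant settings, not the complete file:

```xml
<!-- Trimmed web.config sketch: only the error-reporting settings are shown -->
<configuration>
  <SharePoint>
    <!-- CallStack="true" replaces SharePoint's generic error page with the real stack trace -->
    <SafeMode CallStack="true" AllowPageLevelTrace="false" />
  </SharePoint>
  <system.web>
    <!-- Turn off custom errors so ASP.NET shows full error details -->
    <customErrors mode="Off" />
  </system.web>
</configuration>
```

Remember to revert these settings once you are done troubleshooting.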

At this point, I tried to restore my full SharePoint backup, which works in all situations except recovering the admin database.  Based on the online resources, this is to be expected.  After conferring with coworkers, I determined that the admin database was unrecoverable and that the best bet was a reinstall of SharePoint.  After working through some options, I was able to get the site back up and running without recreating the entire solution.  The rest of this post walks through the recovery scenario and calls out the nuances of having BI components as part of the content.


First, the SharePoint farm must be backed up.  I did this through the Central Administration console. 
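The same full farm backup can also be taken from the SharePoint 2010 Management Shell.  This is a sketch using the standard cmdlet; the backup directory is a placeholder for a folder that the farm and SQL Server service accounts can write to:

```powershell
# Load the SharePoint snap-in if running from a plain PowerShell console
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Full farm backup (the directory path is an example)
Backup-SPFarm -Directory \\server\backups\spfarm -BackupMethod Full
```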


Reinstalling SharePoint

The next step involves uninstalling SharePoint.  I did this by running Setup and choosing the Remove Option.  (The Repair option did not work.) Once I had SharePoint uninstalled, I proceeded to reinstall it.  After SharePoint was installed, I ran the Configuration wizard. This step recreated the Admin database.  Once this was complete, I verified that the starter site and the Central Administration site were operational. 

Recovering the Farm

You can recover the farm from either the Central Administration site or by using PowerShell.  In this case, I used Central Administration as it is easier to select and understand the options. 
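For completeness, the PowerShell equivalent is Restore-SPFarm.  This sketch assumes the same backup folder used for the farm backup; -RestoreMethod Overwrite corresponds to restoring with the same configuration:

```powershell
# Load the SharePoint snap-in if running from a plain PowerShell console
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Restore the most recent farm backup in the folder (the path is an example)
Restore-SPFarm -Directory \\server\backups\spfarm -RestoreMethod Overwrite
```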

I first browsed to the Backup and Restore menu page. From this page, I selected the “Restore from a backup” option under the Farm Backup and Restore header.

NOTE:  In order to run the backup or restore farm functions, the Administration service must be running.  If it is not running, start it in the Windows Services console.
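On my installs, the service in question is the SharePoint 2010 Administration service, whose Windows service name is SPAdminV4; it can also be started from a PowerShell prompt:

```powershell
# Start the SharePoint 2010 Administration service (service name assumed to be SPAdminV4)
Start-Service SPAdminV4
```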

There are three pages or steps to recovering the backup. 

Step 1.  Select the backup to restore.


Step 2.  Select the components to restore.


For my purposes, I selected the Farm option, which restores all of the farm’s components from the backup.

Step 3.  Select Restore Options.


The first two selections are the most important.  I selected “Restore content and configuration” and “Same configuration”.  This uses the same application and site names for the full restore of your site.  The “New configuration” option would have required me to rename all of the applications and services I needed to restore, and there were too many of them for that to be practical.

When I restored in both the client environment and my personal development environment, I needed to remove the following service applications before running the restore:

  • Application Registry Service
  • Web Analytics Service Application
  • User Profile Service Application
  • Search Service Application

When I did not remove these, my restore failed due to duplicate names.  You may run into a similar issue, so removing them before the restore is easiest.  These service applications should be removed from the Central Administration site.  Once they are removed, run the restore.
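Removing the service applications can also be scripted.  This is a hedged sketch using the standard cmdlets; the display name is an example, and note that -RemoveData also deletes the service application's databases, so decide whether that fits your restore:

```powershell
# Load the SharePoint snap-in if running from a plain PowerShell console
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Find a service application by display name (example name shown) and remove it with its data
$app = Get-SPServiceApplication | Where-Object { $_.DisplayName -eq "Web Analytics Service Application" }
Remove-SPServiceApplication -Identity $app -RemoveData
```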

Bringing BI Back Online

At this point, my site would have been operational, except that I had a Reporting Services webpart on the home page of my site.  Because of that, I saw the error below.


As the error message notes, the issue is with the Reporting Services webpart: the full restore did not restore the Reporting Services integration, which had to be redone manually. 

Once Reporting Services was reintegrated, I was able to open the site.  However, neither the Reporting Services nor the PerformancePoint webparts were working. 


These issues are primarily security related.  To fix the Reporting Services issue, I opened the data source and reset the credentials.  Once the credentials were reestablished, the report webpart came back online. 

To resolve the PerformancePoint issue, I first went to Central Administration to refresh the Secure Store Service key.  I was able to use my passphrase to refresh the key.  (NOTE: if necessary, you can create a new key.)  Next, I went into the PerformancePoint Service Application settings and reset the unattended service account. 
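Both of these fixes can also be made from the Management Shell.  This sketch assumes default-style proxy type names and an example unattended account, so adjust both for your farm:

```powershell
# Load the SharePoint snap-in if running from a plain PowerShell console
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Refresh the Secure Store key with the original passphrase
$sss = Get-SPServiceApplicationProxy | Where-Object { $_.TypeName -like "*Secure Store*" }
Update-SPSecureStoreApplicationServerKey -ServiceApplicationProxy $sss -Passphrase "YourPassphrase"

# Reset the PerformancePoint unattended service account (account name is an example)
$pps = Get-SPServiceApplicationProxy | Where-Object { $_.TypeName -like "*PerformancePoint*" }
Set-SPPerformancePointSecureDataValues -ServiceApplicationProxy $pps `
    -DataSourceUnattendedServiceAccount (Get-Credential "DOMAIN\svc_pps")
```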

Once these steps were complete, the site was fully restored.  While the full backup is extremely useful for site failures, it will not recover all of the BI components and as such feels very incomplete.  I hope this saves someone some time later.

Troubleshooting a Data Type Issue during SSAS Processing


I have built a cube on a series of views in SQL Server that mimicked similar views in Oracle.  The views create a “virtual star schema” over an operational data store that consists of imports from various mainframe sources.  There is no plan at the moment to move to a stored star schema, so views are being used to fill the gap.  During the development cycle, performance issues arose in the Oracle environment.  While those issues were being investigated, the data store tables used to support the SSAS project were moved to a SQL Server instance for the interim.  We retained the views to minimize the impact for delivery once the Oracle issues were resolved.

I am currently deploying changes only from Visual Studio (see the image below for the Project Configuration settings).  After that, I use SSMS with XMLA to process the dimensions and measure groups in a targeted fashion.
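For reference, a targeted process from SSMS is just an XMLA Process command; the database and dimension IDs below are placeholders for your own object IDs:

```xml
<!-- Fully process a single dimension; swap ProcessFull for other process types as needed -->
<Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Process>
    <Object>
      <DatabaseID>MySsasDatabase</DatabaseID>
      <DimensionID>MyDimension</DimensionID>
    </Object>
    <Type>ProcessFull</Type>
  </Process>
</Batch>
```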



The issue was that, along the way, the data types had to be adjusted to support the more robust data typing in SQL Server.  As a result of the data type changes, I started to see errors during processing noting that the data was being truncated.


I tried a number of things to resolve this particular error.  Along the way I fixed various other, unrelated issues.  Here are the activities I tried while working to resolve the issue:

  • Rebuilt all dimensions.
  • Reviewed ALL relationships, keys, and measures for offending data types.  Some of this was reviewed through the BIDSHelper tool – Dimension Data Type Discrepancy Check.
  • Reduced the data set.  This returned the error faster, but also allowed for quicker processing while resolving the issue.
  • Eliminated relationships and measures.  As the issue was clearly in the measure group, I removed the relationships and the currency measure from the measure group.  This worked.


The last step was the key piece: eliminating the relationships reset all of the relationship values.  While I had corrected the data types manually, it appears that I needed to remove all of the relationships and then add them back in as I processed.  This appears to have resolved the issue I was encountering.

The moral of this story is that the BIDSHelper tool showed me where the error was likely occurring, but as is the case with many of the BI tools in Visual Studio, the change was not propagating to all of the locations required within the XMLA until I retouched them all.  Now if I can just get the memory to release properly, but that is a topic for another day.

SSAS 2008 R2 Operations Guide Released by SQLCAT

If you work in SSAS, you should become very familiar with this new guide from the SQL Server Customer Advisory Team (aka SQLCAT).

They have also previously published the Analysis Services 2008 Performance Guide.

Together these resources provide valuable insight for anyone developing solutions with or managing instances of Analysis Services. 

Simple batch script to generate XMLA and deploy SSAS DB

As I was working through some issues with my cube design, I realized that I wanted to deploy my SSAS databases using XMLA in SSMS.  This can be easily scripted from SSMS, but I had not deployed the database yet.  As a result, I created a batch file to build the XMLA used to deploy a database.  This script can be used for automated builds or even automated deployments if that fits into your model.  I don’t write a lot of batch files so you may want to update the syntax as needed.

The script I used is here (I saved this as a .bat file on my desktop):

—————SCRIPT START———————-

REM Use this batch file to build and create the database XMLA script

ECHO Build started . . .
"C:\Program Files (x86)\Microsoft Visual Studio 9.0\Common7\IDE\devenv.exe" "<< location of your SSAS solution or project >>" /build development /out ssasbuild.log

ECHO Build completed . . .
ECHO XMLA Script generation started

Microsoft.AnalysisServices.Deployment.exe "<< project path >>\bin\<<ProjectName>>.asdatabase" /d /o:"<<target location and file name>>.xmla"

ECHO XMLA Script generation complete . . .

ECHO Completed
PAUSE

—————SCRIPT END———————-

The two key executables are devenv.exe and Microsoft.AnalysisServices.Deployment.exe.

Devenv.exe may not be in the same location as mine.  This is the Visual Studio executable.  If you have multiple versions of Visual Studio installed, be sure to pick the correct one.  The parameters used are:

  • The location of the SSAS solution (.sln) or project (.dwproj).  Keep in mind that if you build the solution, you will build all of the projects in your solution.
  • /build selects which configuration you are planning to build.  This is a part of the project properties.  In my case, I picked the development build.
  • /out specifies the destination of the log file.  This log file will log the warnings and errors that occur during the build process.

The next executable is the actual deployment executable – Microsoft.AnalysisServices.Deployment.exe.  Building the project produces an .asdatabase file, and this executable generates an XMLA script from that file.  If you built multiple projects in the previous step, you will need to repeat this step for each database you wish to create an XMLA script for.  The parameters used are:

  • The location of the .asdatabase file.  You will usually find it in the bin folder of your project after you have built the project.
  • /d specifies that the deployment executable should not connect to the target server while building the XMLA, which is what we want in this scenario.
  • /o specifies the file location and name of the XMLA output.

(NOTE:  The ECHO and PAUSE statements were used to make this more friendly for my use and are not required to build or deploy the XMLA script.)

Once you have the XMLA file, you can open it in SSMS and choose the target SSAS instance you want to deploy to.  You can also make changes required prior to the deployment such as data source connection strings.

MSDN reference for the deployment utility:

I also referred to Vidas Mitalis’s FAQ on creating the same script for SSAS 2005 here while constructing my SSAS 2008 script:

March Presentations Uploaded on SkyDrive

I presented at three events in the past 10 days. 

  • Magenic Lunch & Learn – SSRS Training – Overview of SQL Server Reporting Services functionality.
  • Minnesota SQL Server User Group – SQL Azure – a good discussion on SQL Azure, its capabilities, and its cost effectiveness for our solutions.
  • Techfuse 2011 – Performance Monitoring and Tuning with SSAS – a look at the query execution workflow and the tools to monitor and tune Analysis Services.

I hope these provide value to you.  Feel free to comment here with questions related to any of these presentations.