February 2011 - Posts - Sparks from the Anvil

The most dangerous SQL Script in the world!

In my last blog entry, I outlined how to automate SQL Server database builds from concatenated SQL Scripts. However, I did not mention how I ensure the database is clean before I rebuild it. Clearly a simple DROP/CREATE DATABASE would suffice, but you may not have permission to execute such commands, especially in a corporate environment controlled by a centralised DBA team. That said, you should at least have database owner permissions on the development database so you can actually do your job! With those, you can employ my universal "drop all" script, which will clear down your database before you run your SQL Scripts to rebuild all the database objects.

Why start with a clean database?

During the development process, it is all too easy to leave old objects hanging around in the database which can have unforeseen consequences. For example, when you rename a table you may forget to delete the old table and change all the related views to use the new table. Clearly this will mean an end-user querying the views will get the wrong data and your reputation will take a nose dive as a result!

Starting with a clean, empty database and then building all your database objects from SQL Scripts, using the technique outlined in my previous blog entry, means you know exactly what you have in your database. The database can then be repopulated using SSIS and bingo: you have a data mart "to go".

My universal "drop all" SQL Script

To ensure you start with a clean database run my universal "drop all" script which you can download from here: 100_drop_all.zip

By using the database catalog views, the script finds and drops all of the following database objects:

  1. Foreign key relationships
  2. Stored procedures
  3. Triggers
  4. Database triggers
  5. Tables
  6. Functions
  7. Partition schemes
  8. Partition functions
  9. XML Schema Collections
  10. Schemas
  11. Types
  12. Service broker services
  13. Service broker queues
  14. Service broker contracts
  15. Service broker message types
  16. SQLCLR assemblies

There are two optional sections to the script: drop users and drop roles. You may use these at your peril, particularly as you may well remove your own permissions!

Note that the script has a verbose mode which displays the SQL commands it is executing. This can be switched on by setting @debug=1.

Running this script against one of the system databases is certainly not recommended! So I advise you to keep a USE database statement at the top of the file.

Good luck and be careful!!

Posted by DrJohn with no comments

SSIS: Building SQL databases on-the-fly using concatenated SQL scripts

Over the years I have developed many techniques which help automate the whole SQL Server build process. In my current process, where I need to build entire OLAP data marts on-the-fly, I make regular use of a simple but very effective mechanism to concatenate all the SQL Scripts together from my SSMS (SQL Server Management Studio) projects. This proves invaluable because in two clicks I can redeploy an entire SQL Server database with all tables, views, stored procedures etc. Indeed, I can also use the concatenated SQL scripts with SSIS to build SQL Server databases on-the-fly.

You may be surprised to learn that I often redeploy the database several times per day, or even several times per hour, during the development process. This is because the deployment errors are logged and you can quickly see where SQL Scripts have object dependency errors. For example, after changing a table structure you may have forgotten to change any related views. The deployment log immediately points out all the objects which failed to build so you can fix and redeploy the database very quickly. The alternative approach (i.e. doing changes in the database directly using the SSMS UI) would require you to check all dependent objects before making changes. The chances are that you will miss something and wonder why your app returns the wrong data – a common problem caused by changing a table without re-creating dependent views.

Using SQL Projects in SSMS

A great many developers fail to make use of SQL Projects in SSMS (SQL Server Management Studio). To me, they are an invaluable way of organizing your SQL Scripts. The screenshot below shows a typical SSMS solution made up of several projects – one project for tables, another for views etc. The key point is that the projects naturally fall into the right order in the file system because of the numeric prefix in the project name. The same prefix on the script filenames ensures that the SQL Scripts are concatenated together in the order in which they need to be executed. Hence the script filenames start with 100, 110 etc.

Concatenating SQL Scripts

To concatenate the SQL Scripts together into one file, I use notepad.exe to create a simple batch file (see example screenshot) which uses the TYPE command to write the content of the SQL Script files into a combined file. As the SQL Scripts are spread across several folders, I simply invoke the TYPE command multiple times, appending the output into the combined file.

If you are unfamiliar with batch files, you may not know that a single angled bracket (>) writes the output of a program into a file, while two angled brackets (>>) append the output to the file. So the command line

DIR > filelist.txt

would write the output of the DIR command into a file called filelist.txt. In the example shown above, the concatenated file is called SB_DDS.sql.

If, like me, you place the concatenated file under source code control, then the source code control system will change the file's attribute to "read-only", which in turn would cause the TYPE command to fail. The ATTRIB command can be used to remove the read-only flag.
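
To make the idea concrete, here is a minimal sketch of the concatenation step. It is written as a POSIX shell script rather than a Windows batch file purely so it is easy to try anywhere: cat with > and >> behaves just like TYPE, and chmod stands in for ATTRIB. The folder and file names are invented for the illustration; only the SB_DDS.sql name comes from the example above.

```shell
#!/bin/sh
# Sketch of the "concatenate numbered SQL scripts" batch file in POSIX shell.
# Folder/file names below are made up; 100_/110_ prefixes force the ordering.
set -e

mkdir -p demo/100_Tables demo/110_Views
printf 'CREATE TABLE t1 (id INT);\nGO\n'            > demo/100_Tables/100_t1.sql
printf 'CREATE VIEW v1 AS SELECT id FROM t1;\nGO\n' > demo/110_Views/110_v1.sql

OUT=demo/SB_DDS.sql

# source control may have left the target read-only (ATTRIB -R in a batch file)
if [ -f "$OUT" ]; then chmod u+w "$OUT"; fi

# the first folder overwrites (>), subsequent folders append (>>)
cat demo/100_Tables/*.sql  >  "$OUT"
cat demo/110_Views/*.sql   >> "$OUT"
```

The shell globs expand in lexical order, which is exactly why the numeric filename prefixes matter.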

Using SQLCmd to execute the concatenated file

Now that the SQL Scripts are all in one big file, we can execute the script against a database using SQLCmd using another batch file as shown below:

SQLCmd has numerous options, but the batch file shown above simply executes the SB_DDS.sql file against the SB_DDS_DB database on the local machine and logs any errors to a file called SB_DDS.log. So after executing the batch file you can simply check the error log to see if your database built without a hitch. If you have errors, then simply fix the source files, re-create the concatenated file and re-run SQLCmd to rebuild the database. This two-click operation allows you to quickly identify and fix errors in your entire database definition.
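
For reference, the batch file boils down to a single SQLCmd invocation along these lines (a sketch, not the exact file: the server and database names follow the example above, and the flags are SQLCmd's standard ones – -S server, -d database, -E trusted connection, -i input script, -o output log):

```shell
sqlcmd -S localhost -d SB_DDS_DB -E -i SB_DDS.sql -o SB_DDS.log
```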

Using SSIS to execute the concatenated file

To execute the concatenated SQL script using SSIS, you simply drop an Execute SQL task into your package and set the database connection as normal and then select File Connection as the SQLSourceType (as shown below). Create a file connection to your concatenated SQL script and you are ready to go.


Tips and Tricks

  • Add a new-line at end of every file
    The most common problem encountered with this approach is that the GO statement on the last line of one file is placed on the same line as the comment at the top of the next file by the TYPE command. The easy fix to this is to ensure all your files have a new-line at the end.
  • Remove all USE database statements
    SQLCmd identifies which database the script should be run against, so you should remove all USE database commands from your scripts - otherwise you may get unintentional side effects!!
  • Do the Create Database separately
    If you are using SSIS to create the database as well as create the objects and populate the database, then invoke the CREATE DATABASE command against the master database using a separate package before calling the package that executes the concatenated SQL script.  
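
The first tip can even be automated. Below is a small illustrative shell sketch (file names invented) that appends a newline to any script missing one, so a trailing GO can never be glued onto the first line of the following file:

```shell
#!/bin/sh
# Sketch: ensure every .sql file ends with a newline before concatenation.
set -e

mkdir -p demo_nl
printf 'SELECT 1;\nGO'   > demo_nl/bad.sql   # missing the final newline
printf 'SELECT 2;\nGO\n' > demo_nl/good.sql  # already ends with a newline

for f in demo_nl/*.sql; do
  # tail -c 1 prints the file's last byte; the $(...) substitution strips a
  # trailing newline, so a non-empty result means no newline at end-of-file
  if [ -n "$(tail -c 1 "$f")" ]; then
    printf '\n' >> "$f"
  fi
done
```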


Posted by DrJohn with 2 comment(s)

SSIS: Deploying OLAP cubes using C# script tasks and AMO

As part of the continuing series on Building dynamic OLAP data marts on-the-fly, this blog entry will focus on how to automate the deployment of OLAP cubes using SQL Server Integration Services (SSIS) and Analysis Services Management Objects (AMO).

OLAP cube deployment is usually done using the Analysis Services Deployment Wizard. However, I dismissed this option for a variety of reasons. Firstly, invoking external processes from SSIS is fraught with problems as (a) it is not always possible to ensure SSIS waits for the external program to terminate; (b) we cannot log the outcome properly and (c) it is not always possible to control the server's configuration to ensure the executable works correctly. Another reason for rejecting the Deployment Wizard is that it requires the 'answers' to be written into four XML files. These XML files record the three things we need to change: the name of the server, the name of the OLAP database and the connection string to the data mart. Although it would be reasonably straightforward to change the content of the XML files programmatically, this adds another layer of complication and obscurity to the overall process.

When I first investigated the possibility of using C# to deploy a cube, I was surprised to find that there are no other blog entries about the topic. I can only assume everyone else is happy with the Deployment Wizard!

SSIS "forgets" assembly references

If you build your script task from scratch, you will have to remember how to overcome one of the major annoyances of working with SSIS script tasks: the forgetful nature of SSIS when it comes to assembly references. Basically, you can go through the process of adding an assembly reference using the Add Reference dialog, but when you close the script window, SSIS "forgets" the assembly reference so the script will not compile. After repeating the operation several times, you will find that SSIS only remembers the assembly reference when you specifically press the Save All icon in the script window. This problem is not unique to the AMO assembly and has certainly been a "feature" since SQL Server 2005, so I am not surprised it is still present in SQL Server 2008 R2!

Sample Package

So let's take a look at the sample SSIS package I have provided, which can be downloaded from here: DeployOlapCubeExample.zip. Below is a screenshot of the package after a successful run.

Connection Managers

The package has three connection managers:

  • AsDatabaseDefinitionFile is a file connection manager pointing to the .asdatabase file you wish to deploy. Note that this can be found in the bin directory of your OLAP database project once you have clicked the "Build" button in Visual Studio
  • TargetOlapServerCS is an Analysis Services connection manager which identifies both the deployment server and the target database name.
  • SourceDataMart is an OLEDB connection manager pointing to the data mart which is to act as the source of data for your cube. This will be used to replace the connection string found in your .asdatabase file.

Once you have configured the connection managers, the sample should run and deploy your OLAP database in a few seconds. Of course, in a production environment, these connection managers would be associated with package configurations or set at runtime.

When you run the sample, you should see that the script logs its activity to the output screen (see screenshot above). If you configure logging for the package, then these messages will also appear in your SSIS logging.

Sample Code Walkthrough

Next let's walk through the code. The first step is to parse the connection string provided by the TargetOlapServerCS connection manager and obtain the name of both the target OLAP server and also the name of the OLAP database. Note that the target database does not have to exist to be referenced in an AS connection manager, so I am using this as a convenient way to define both properties.

We now connect to the server and check for the existence of the OLAP database. If it exists, we drop the database so we can re-deploy.


if (svr.Connected)
{
    // Drop the OLAP database if it already exists
    Database db = svr.Databases.FindByName(olapDatabaseName);
    if (db != null)
        db.Drop();

    // ... rest of script
}

Next we start building the XMLA command that will actually perform the deployment. Basically this is a small chunk of XML which we need to wrap around the large .asdatabase file generated by the Visual Studio build process.

// Start generating the main part of the XMLA command

XmlDocument xmlaCommand = new XmlDocument();

xmlaCommand.LoadXml(string.Format("<Batch Transaction='false' xmlns='http://schemas.microsoft.com/analysisservices/2003/engine'><Alter AllowCreate='true' ObjectExpansion='ExpandFull'><Object><DatabaseID>{0}</DatabaseID></Object><ObjectDefinition/></Alter></Batch>", olapDatabaseName));


Next we need to merge the two XML files, which we can do by simply setting the InnerXml property of the ObjectDefinition node as follows:

// load OLAP Database definition from .asdatabase file identified by connection manager
XmlDocument olapCubeDef = new XmlDocument();
olapCubeDef.Load(Dts.Connections["AsDatabaseDefinitionFile"].ConnectionString);

// merge the two XML files by obtaining a reference to the ObjectDefinition node
XmlNode oaRootNode = xmlaCommand.SelectSingleNode("//ns1:ObjectDefinition", nsManager);
oaRootNode.InnerXml = olapCubeDef.InnerXml;


One hurdle I had to overcome was removing detritus from the .asdatabase file left by the Visual Studio build. Through an iterative process, I found I needed to remove several nodes as they caused the deployment to fail. The XMLA error message read "Cannot set read-only node: CreatedTimestamp" or similar. When I compared the XMLA generated by the Deployment Wizard with that generated by my code, these read-only nodes were missing, so clearly I just needed to strip them out. This was easily achieved using XPath to find the relevant XML nodes, of which I show one example below:

foreach (XmlNode node in rootNode.SelectNodes("//ns1:CreatedTimestamp", nsManager))
{
    node.ParentNode.RemoveChild(node);
}

Now we need to change the database name in both the ID and Name nodes using code such as:

XmlNode databaseID = xmlaCommand.SelectSingleNode("//ns1:Database/ns1:ID", nsManager);

if (databaseID != null)
    databaseID.InnerText = olapDatabaseName;

Finally we need to change the connection string to point at the relevant data mart. Again this is easily achieved using XPath to search for the relevant nodes and then replace the content of the node with the new name or connection string.

XmlNode connectionStringNode = xmlaCommand.SelectSingleNode("//ns1:DataSources/ns1:DataSource/ns1:ConnectionString", nsManager);

if (connectionStringNode != null)
    connectionStringNode.InnerText = Dts.Connections["SourceDataMart"].ConnectionString;


Finally we need to perform the deployment using the Execute XMLA command and check the returned XmlaResultCollection for errors before setting the Dts.TaskResult.

XmlaResultCollection oResults = svr.Execute(xmlaCommand.InnerXml);


// check for errors during deployment

foreach (Microsoft.AnalysisServices.XmlaResult oResult in oResults)
{
    foreach (Microsoft.AnalysisServices.XmlaMessage oMessage in oResult.Messages)
    {
        if (oMessage.GetType().Name == "XmlaError")
        {
            // a genuine error - log the message and flag the failure
            HadError = true;
        }
    }
}

If you are not familiar with XML programming, this may all seem a bit daunting, but persevere as the sample code is pretty short.

If you would like the script to process the OLAP database, simply uncomment the lines in the vicinity of the Process method. Of course, you can extend the script to perform your own custom processing and even to synchronize the database to a front-end server. Personally, I like to keep deployment and processing separate, as the code can become overly complex for support staff.

If you want to know more, come see my session at the forthcoming SQLBits conference.

Posted by DrJohn with 2 comment(s)

Building dynamic OLAP data marts on-the-fly

At the forthcoming SQLBits conference, I will be presenting a session on how to dynamically build an OLAP data mart on-the-fly. This blog entry is intended to clarify exactly what I mean by an OLAP data mart, why you may need to build them on-the-fly and finally outline the steps needed to build them dynamically. In subsequent blog entries, I will present exactly how to implement some of the techniques involved.

What is an OLAP data mart?

In data warehousing parlance, a data mart is a subset of the overall corporate data provided to business users to meet specific business needs. Of course, the term does not specify the technology involved, so I coined the term "OLAP data mart" to identify a subset of data which is delivered in the form of an OLAP cube, which may be accompanied by the relational database upon which it was built. To clarify, the relational database is specifically created and loaded with the subset of data, and then the OLAP cube is built and processed to make the data available to the end-users via standard OLAP client tools.

Why build OLAP data marts?

Market research companies sell data to their clients to make money. To gain competitive advantage, market research providers like to "add value" to their data by providing systems that enhance analytics, thereby allowing clients to make best use of the data. As such, OLAP cubes have become a standard way of delivering added value to clients. They can be built on-the-fly to hold specific data sets and meet particular needs and then hosted on a secure intranet site for remote access, or shipped to clients' own infrastructure for hosting. Even better, they support a wide range of different tools for analytical purposes, including the ever popular Microsoft Excel.

Extension Attributes: The Challenge

One of the key challenges in building multiple OLAP data marts based on the same 'template' is handling extension attributes. These are attributes that meet the client's specific reporting needs, but do not form part of the standard template. Now clearly, these extension attributes have to come into the system via additional files and ultimately be added to relational tables so they can end up in the OLAP cube. However, processing these files and filling dynamically altered tables with SSIS is a challenge as SSIS packages tend to break as soon as the database schema changes.

There are two approaches to this: (1) dynamically build an SSIS package in memory to match the new database schema using C#, or (2) have the extension attributes provided as name/value pairs so the file's schema does not change and can easily be loaded using SSIS. The problem with the first approach is that it requires an awful lot of complex C# code. The problem with the second approach is that name/value pairs are useless to an OLAP cube; so they have to be pivoted back into a proper relational table somewhere in the data load process WITHOUT breaking SSIS. How this can be done will be the subject of a future blog entry.

What is involved in building an OLAP data mart?

There are a great many steps involved in building OLAP data marts on-the-fly. The key point is that all the steps must be automated to allow for the production of multiple OLAP data marts per day (i.e. many thousands, each with its own specific data set and attributes).

Now most of these steps have a great deal in common with standard data warehouse practices. The key difference is that the databases are all built to order. The only permanent database is the metadata database (shown in orange) which holds all the metadata needed to build everything else (i.e. client orders, configuration information, connection strings, client specific requirements and attributes etc.). The staging database (shown in red) has a short life: it is built, populated and then ripped down as soon as the OLAP Data Mart has been populated. In the diagram below, the OLAP data mart comprises the two blue components: the Data Mart which is a relational database and the OLAP Cube which is an OLAP database implemented using Microsoft Analysis Services (SSAS). The client may receive just the OLAP cube or both components together depending on their reporting requirements.


So, in broad terms the steps required to fulfil a client order are as follows:

Step 1: Prepare metadata

  • Create a set of database names unique to the client's order
  • Modify all package connection strings to be used by SSIS to point to new databases and file locations.

Step 2: Create relational databases

  • Create the staging and data mart relational databases using dynamic SQL and set the database recovery mode to SIMPLE as we do not need the overhead of logging anything
  • Execute SQL scripts to build all database objects (tables, views, functions and stored procedures) in the two databases

Step 3: Load staging database

  • Use SSIS to load all data files into the staging database in a parallel operation
  • Load extension files containing name/value pairs. These will provide client-specific attributes in the OLAP cube.

Step 4: Load data mart relational database

  • Load the data from staging into the data mart relational database, again in parallel where possible
  • Allocate surrogate keys and use SSIS to perform surrogate key lookup during the load of fact tables

Step 5: Load extension tables & attributes

  • Pivot the extension attributes from their native name/value pairs into proper relational tables
  • Add the extension attributes to the views used by OLAP cube

Step 6: Deploy & Process OLAP cube

  • Deploy the OLAP database directly to the server using a C# script task in SSIS
  • Modify the connection string used by the OLAP cube to point to the data mart relational database
  • Modify the cube structure to add the extension attributes to both the data source view and the relevant dimensions
  • Remove any standard attributes that are not required
  • Process the OLAP cube

Step 7: Backup and drop databases

  • Drop staging database as it is no longer required
  • Backup data mart relational and OLAP database and ship these to the client's infrastructure
  • Drop data mart relational and OLAP database from the build server
  • Mark order complete

Start processing the next order, ad infinitum.

So my future blog posts and my forthcoming session at the SQLBits conference will all focus on some of the more interesting aspects of building OLAP data marts on-the-fly such as handling the load of extension attributes and how to dynamically alter the structure of an OLAP cube using C#.

Posted by DrJohn with 3 comment(s)