SQLBits X–Registrations are open
07 December 11 04:14 PM | MartinBell | with no comments

Today we opened registrations for SQLBits X. Everyone registered on SQLBits.com should have received an email, but in case you didn’t, here is the information:

Dear Sir/Madam,
We have two pieces of big news for you this week. In the words of organiser Tim Kent: “It’s gonna be massive.”

Here at SQLBits we are so honoured and delighted to announce that Microsoft recognises SQLBits X as the official UK SQL Server 2012 Technical Launch Event.

And, if that’s not enough…

Registration for SQLBits X - The UK SQL Server 2012 Technical Launch Event, is officially open! We simply could not wait and decided to open up registration immediately.


Oh, and also… we have received submissions from Microsoft’s Cathy Dumas, Elad Ziklik and Matt Masson, as well as MVPs Denny Cherry, Adam Machanic and Stacia Misner. Not to mention the array of other Microsoft Certified Masters, Microsoft Most Valuable Professionals and Microsoft staff who will be attending. In the meantime, check out the sessions that have already been submitted, both from the best SQL Server speakers in the world and from new speakers.

If you are new to SQLBits, allow us to explain what you can expect at a SQLBits event.

The conference will take place between the 29th and 31st of March 2012 at the Novotel London West. The event will be the biggest and best yet; we have capacity for 900 people and expect to sell out really quickly.

The format for SQLBits X is the same as always with leading speakers from all around the world – with the added buzz of the UK SQL Server 2012 Technical Launch Event thrown in for good measure.

Thursday will be a series of full day sessions delivered by renowned presenters followed by Fusion-io’s follow-up to last year’s highly entertaining Crappy Code Games in the evening.

Friday is a day filled with regular sessions delivered by our highest rated speakers and Microsoft superstars and will feature an all new SQLBits Social event that should not be missed!

As always we will conclude proceedings on the Saturday with our free Community Day.

We want as many SQL Server professionals as possible to take advantage of this unique event. To that end, for the fourth conference in a row, we are freezing the prices to attend.

The prices for SQLBits X are as follows; register early to take advantage of our early bird discounts.

All prices are exclusive of VAT. Prices are in UK Pounds Sterling (GBP). 

Packages: Deep Dive; Full Conference (Thursday & Friday); Community Day.

Booking tiers (cut-off at midnight on the date shown):

Early Bird – 5th Jan 2012
Standard rate – 4th Mar 2012
Last Minute – thereafter

Discounts are available from local user groups. Contact your user group organiser for details.

Extra Discount - 20% off when 6 or more people book for the whole conference.

We are also really excited to announce that the SQLBits website is now running on SQL Server 2012. There are some really exciting additions that we are hoping to bring you based on the new functionality in SQL Server 2012. Why not download SQL Server 2012 RC0 now to experience the new functionality for yourself?

It is a testament to previous SQLBits attendees and sponsors that we have received this recognition. We hope we can thank all attendees, both new and old, at the official UK SQL Server 2012 Technical Launch Event!


SQLBits Committee

Calling DTUtil recursively from Powershell
01 December 11 10:47 PM | MartinBell | with no comments

I’ve used DTUtil numerous times to load SSIS packages, but usually it is to load one or more packages to the same location. From the command prompt you can use a command such as

FOR %i IN (*.dtsx) DO DTUTIL -File "%i" -Decrypt SecretPassword -DestServer SQLServerInstance -Encrypt SQL;"%~ni";5

If you are not familiar with DTUtil, check out the documentation on MSDN.

Using the above command is fine if you have all the packages in one directory and they all go to the same destination. If you have multiple directories containing files, then you will need to repeat the command for each directory. With PowerShell it is very easy to recurse through directories and process each file, and there are many examples available, such as this one, that will give you a quick start.

The main issue when using DTUtil is that if you try to copy a package to a folder that doesn’t exist, DTUtil will not automatically create the folder, so for each directory you need to check the existence of the destination folder. To do this I have used DTUTIL with the /FE flag; checking the $LASTEXITCODE variable then tells me whether the folder exists.
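As a rough sketch of the approach (not the full script linked below): the server name, password and root folder here are placeholders, it assumes packages sit one folder level below the root, and the /FE and /FC folder options are as described in the DTUtil documentation, so verify them against your version:

# Deploy every .dtsx under $root to a matching SSIS folder on the
# destination server, creating the folder first if it doesn't exist.
# Server, password and root path are placeholders.
$root     = 'C:\Packages'
$server   = 'SQLServerInstance'
$password = 'SecretPassword'

Get-ChildItem -Path $root -Filter *.dtsx -Recurse | ForEach-Object {
    # Assumes one folder level below the root
    $folder = Split-Path -Path $_.DirectoryName -Leaf

    # /FE checks whether the folder exists; the result is left in $LASTEXITCODE
    & dtutil /SourceServer $server /FE "SQL;\$folder" | Out-Null
    if ($LASTEXITCODE -ne 0) {
        # /FC creates the folder under the root of the MSDB package store
        & dtutil /SourceServer $server /FC "SQL;\;$folder" | Out-Null
    }

    # Copy the package, re-encrypting with the server storage (level 5)
    & dtutil /File "$($_.FullName)" /Decrypt $password /DestServer $server /Encrypt "SQL;\$folder\$($_.BaseName);5"
}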

The whole script can be found HERE. There are other ways of doing this, but combining this utility with Powershell gives you a quick solution to the problem.

Filed under: ,
Using Powershell to remove strings in files
31 October 11 10:12 PM | MartinBell | 1 comment(s)

I was on a SQL Server 2005 site the other day, which was using batch scripts to create their databases from files in version control. The process works well (if not that quickly!), and you can easily tell if a script has worked by the fact that the output file has zero length. This means you can quickly sort the output files and concentrate on the scripts which have failed. Unfortunately it was not that simple on this system because of the high occurrence of messages like:

Cannot add rows to sys.sql_dependencies for the stored procedure because it depends on the missing table '<Some table>'. The stored procedure will still be created; however, it cannot be successfully executed until the table exists.

So I decided to create a Powershell script to remove the occurrences of these strings in the output files. After a short while I ended up with this, which did the job (this may wrap!):

$Files = Get-ChildItem "C:\temp\database" "*.out" -rec
foreach ($file in $Files) {
(Get-Content $file.PSPath) | where { $_ -inotmatch "Cannot add rows to sys.sql_dependencies" } | Add-Content -LiteralPath ($file.PSPath -replace '\.out','.txt')
Remove-Item -Force -Path $file.PSPath
}

I started off trying to replace the strings in the current file, but had issues with the file already being open. I then tried writing only the lines that didn’t contain a given string to an interim file, and once processed I renamed this intermediate file. This was fine unless a file contained only lines that had the string in them, as those files would be left alone, so I added the Remove-Item and kept the interim files.

If you need to check for additional strings, you can add additional checks in the Where-Object script block with the -and operator, e.g.

$Files = Get-ChildItem "C:\temp\database" "*.out" -rec
foreach ($file in $Files) {
(Get-Content $file.PSPath) | where { $_ -inotmatch "Cannot add rows to sys.sql_dependencies" -and $_ -inotmatch "The module will still be created; however, it cannot run successfully until the object exists." } | Add-Content -LiteralPath ($file.PSPath -replace '\.out','.txt')
Remove-Item -Force -Path $file.PSPath
}
Filed under:
Exciting times for the UK SQL Server community
10 September 11 11:37 AM | MartinBell | with no comments

At the end of this month we are going to have two extraordinary weeks of community activities in the UK. It will start with the SQLBits 9 “Query Across the Mersey” at the Adelphi Hotel in Liverpool from 29th September to 1st October. 

This will follow the established format of a pre-conference training day on the Thursday, a day of paid-for sessions with a keynote speech on the Friday, and the Community Day on the Saturday. During the conference we will have a Community Corner; please pay it a visit and find out what is happening in your area and across the country. The conference will also see the launch of the first SQL Relay, a series of events culminating at the Microsoft office at Cardinal Place, London with Itzik Ben-Gan (SQL Server author, MVP and co-founder of SolidQ) on the 6th October.

Between the 3rd and 6th of October, 16 user groups will be holding free meetings across the UK. The schedule is as follows:

3rd October - Manchester, Surrey, Kent, Birmingham     
4th October - Leeds, Bristol, London, Hertfordshire, Maidenhead     
5th October - Edinburgh, Exeter, Essex, Southampton, Cambridge, Cardiff      
6th October - London Cardinal Place – Itzik Ben Gan

It is great to see three new user groups holding their inaugural meetings; these are Cambridge, Essex (Basildon) and Maidenhead. The community is growing and gaining strength which can only be good for SQL Server professionals.

Contact details and registration instructions for the above events are available on the UK SQL Server User Group website: http://www.sqlserverfaq.com

If you don’t see an event near you, why not start your own group? Visit the Community Corner at SQLBits or come along to one of the SQL Relay events and find out what’s involved. If you can’t make those, drop me an email.

My BIG problem with SQL Server Denali!
27 August 11 01:48 PM | MartinBell | 1 comment(s)

In the last couple of weeks I’ve done three talks on new features for developers in SQL Server Denali and hit the same problem each time: I ran out of time. The reason is that there are so many interesting new features coming in the next version of SQL Server that it is difficult to decide what to leave out! This will probably mean that in future I will have to do more specific sessions, and it will take a lot of discipline not to go off on a tangent and talk about the other great new things due to be released.

If you haven’t looked at CTP3 of SQL Server Denali, you can download it from the Microsoft website. If you want to see some pictures of Juneau, here are some; and if you want to download the SQL Server Developer Tools you can do so from here.

Filed under: , ,
Volume Freespace
25 July 11 10:08 PM | MartinBell | with no comments

In my last blog post I talked about a problem where incorrect permissions on a mount point directory caused a spurious error message regarding space on the root drive. Whilst investigating the problem I needed to find out the space available on the mounted drive. Unlike normal directories, the information isn’t immediately obtainable when you look at the properties of the mount point directory in Windows Explorer. With mount points you have to take a further step and view the properties of the mounted drive itself.

So instead of having to trudge through each mount point and then view the properties of the mounted drive, I thought I would use PowerShell instead. There are loads of postings/articles on how to do this, but this is the command I used:

Get-WmiObject Win32_Volume | select @{name="Mount Point";expression={$_.Caption}},@{name="Free Space (GB)";e={$_.FreeSpace/(1024*1024*1024)}} | Sort-Object -Property "Free Space (GB)" -Descending | Format-Table -Auto

There are many interesting properties for Win32_Volume, to list them use:

Get-WmiObject Win32_Volume | Get-Member -MemberType Properties
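For example, Capacity and Label are among the properties returned, so a sketch along these lines (property names as documented for Win32_Volume) would show the percentage of free space per volume:

# Show each volume's mount point, label, capacity and percentage free,
# skipping volumes that report no capacity
Get-WmiObject Win32_Volume |
    Where-Object { $_.Capacity -gt 0 } |
    Select-Object Caption, Label,
        @{ Name = 'Capacity (GB)'; Expression = { [math]::Round($_.Capacity / 1GB, 1) } },
        @{ Name = '% Free';        Expression = { [math]::Round(100 * $_.FreeSpace / $_.Capacity, 1) } } |
    Format-Table -AutoSize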

Filed under:
Mount point permission issues
06 July 11 09:56 PM | MartinBell | 2 comment(s)

I thought I’d post about a problem I encountered the other day, because the error message was misleading and the solution was not very obvious.

When restoring a database from a backup I started getting error messages like:

Msg 3257, Level 16, State 1, Line 1
There is insufficient free space on disk volume 'C:\' to create the database. The database requires 423457128448 additional free bytes, while only 113891102720 bytes are available.
Msg 3119, Level 16, State 4, Line 1
Problems were identified while planning for the RESTORE statement. Previous messages provide details.
Msg 3013, Level 16, State 1, Line 1
RESTORE DATABASE is terminating abnormally.

This message didn’t make a great deal of sense, as the volume I was restoring the database onto had plenty of space available, and I wasn’t actually writing anything directly to the C:\ drive. What made the problem more perplexing was that I had restored this database from the backup loads of times previously. As the message was reporting a lack of space, I tried using trace flag 3104, which disables the check for space, but this had no effect. I therefore incorrectly concluded that trace flag 3104 was no longer working, so I asked fellow MVPs if they had come across this problem.
Simon Sabin suggested that the issue might be something to do with the permissions of the service account, but how that related to my problem wasn’t clear, so I thought I would check what was different between the current database and the one I was restoring. Using RESTORE FILELISTONLY I could get a list of the files and the sizes of each of them. Comparing it with the current files revealed only a couple had changed. The most significant difference was in one of the log files, so I compared the mounted volume with the other mounted volumes and found that, for the directory where the log file volume was mounted, the service account did not have explicit permissions granted to it, whilst all the other directories for the mounted volumes did. e.g.

If the path that the volume is mounted on is C:\data\db\ldffiles then:

cacls C:\data\db\ldffiles

listed entries such as:

BUILTIN\Administrators:(OI)(CI)F

but was missing an entry for the SQL Server service account, the domain account WINDOM\db_sql_svc, which was present for all the other mount point directories.

By granting change permission to the service account, the problem no longer manifested itself.

This left me wondering why it had not manifested itself before.

I wrote myself a test script that would create log files of a given size, back up the database, drop the second log file and re-create it with a smaller size, and then try to restore the database. Using this script and altering the initial file sizes and the replaced file size, I tried to find a threshold which caused the problem. Unfortunately I didn't obtain consistent results, so I cannot give a specific size to look out for. The tests do show that there is some value somewhere between 90 and 100 GB at which the problem usually starts to manifest itself. The following is a list of passes for given file sizes against the size of file expansion (values in GB):

Original Size Min Expansion Max Expansion
100 93 99
155 90 99
159 94 99
200 93 100
250 93 100

There does not seem to be an obvious connection to the amount of free space on the volume.
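The test script described above can be sketched roughly as follows; the database name, paths and sizes are placeholders (the real script iterated over different size combinations), and note that ALTER DATABASE … REMOVE FILE requires the log file to be empty:

-- Sketch of the repro: create a database with two log files on the
-- mounted volume, back it up, re-create the second log file at a
-- smaller size, then restore. Names, paths and sizes are placeholders.
CREATE DATABASE TestRestore
ON (NAME = TestRestore_data, FILENAME = 'C:\data\db\mdffiles\TestRestore.mdf')
LOG ON (NAME = TestRestore_log1, FILENAME = 'C:\data\db\ldffiles\TestRestore_log1.ldf', SIZE = 100GB),
       (NAME = TestRestore_log2, FILENAME = 'C:\data\db\ldffiles\TestRestore_log2.ldf', SIZE = 100GB);

BACKUP DATABASE TestRestore TO DISK = 'C:\backups\TestRestore.bak';

-- Re-create the second log file at a smaller size
ALTER DATABASE TestRestore REMOVE FILE TestRestore_log2;
ALTER DATABASE TestRestore ADD LOG FILE
    (NAME = TestRestore_log2, FILENAME = 'C:\data\db\ldffiles\TestRestore_log2.ldf', SIZE = 5GB);

-- The restore now has to grow the log file back, which is where the
-- spurious "insufficient free space on disk volume 'C:\'" error appeared
RESTORE DATABASE TestRestore FROM DISK = 'C:\backups\TestRestore.bak' WITH REPLACE;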

It is possible for the problem to be present and then suddenly manifest itself, as in my case. In these circumstances you will usually look for something that has changed as the source of the problem, but in this case nothing obvious had changed, so you can waste a significant amount of time trying to pin down a non-existent difference. It is possible that this problem has been an issue since the original configuration, so if you are currently using mount points you may just want to check the permissions on the mount point directories.

As I could not find feedback on Connect that mentioned this, I added the following bug.

Filed under: ,
Edinburgh User Group Meeting–Wednesday 29th June
26 June 11 09:01 AM | MartinBell | with no comments

On Wednesday we have two guest speakers at the Edinburgh SQL Server User Group Meeting at the Microsoft Offices at Waverley Gate. Tony Rogerson will be presenting a session on using NoSQL to produce a cheap high performance, highly scalable BI system and Colin Mackay will be presenting on SQL Injection Attacks and how to prevent them.

18:30 - 18:50 - Introduction, Networking and Food

18:50 - 19:40
How to get throughput of over 1GBytes per second for less than 2.5K using commodity kit - Tony Rogerson SQL MVP
In this session Tony will talk about the recent paradigm brought into play through the NoSQL movement for dealing with high performance, high scalability requirements but at a fraction of traditional costs. Tony will demonstrate a box built using commodity kit and will talk about how it may be used to good effect in a Business Intelligence setting. It will be an open discussion.

19:40 - 20:00 – Break

20:00 - 20:50
SQL Injection Attacks (and how to prevent them) - Colin Angus Mackay
With recent reports of a man convicted of stealing the details of 130 million credit cards by use of SQL Injection Attacks, isn't it time to find out how to defend your systems against them? In this talk Colin Mackay will show you what a SQL Injection Attack is, what they look like, how they work, and most importantly how to harden your application and database security in order to defend your systems against them.

Although the technologies used in this talk are SQL Server and the .NET Framework, the general ideas presented apply to any database that uses SQL as a query language, and to any framework that may interact with that database.

Colin Angus Mackay
Colin Angus Mackay is a software developer specialising in Microsoft technologies, located in Glasgow, Scotland. A former Microsoft MVP (C#) for four years running, he is the Treasurer of Scottish Developers, a Code Project MVP for five years running, and has co-organised four DDD Scotland events. When not involved in software-related pursuits he is an amateur photographer (which generally involves wondering why his camera's autofocus mechanism chooses the potted plant off to the side rather than the main subject).

20:50 - 21:00 – Close

Pizza and drinks will be supplied in the breaks, and a feedback prize draw will be held.

Sign up for the event at SQLServerFAQ.com

Filed under:
Finding out how much space is used by filestream files
25 June 11 11:10 PM | MartinBell | with no comments

At the Manchester user group the other night a question from the floor was “How do you determine the space used by the Filestream files” so I thought I would look into it.     

I found the same question posted on Stack Overflow and the following was given as a means to get an approximate answer.

SELECT SUM(DATALENGTH(filestreamcolumn)) FROM filestreamtable;

A follow-up reply also points you to the sys.database_files system view to retrieve the size of the filestream filegroup, but it also points out that this does not necessarily give you an accurate value for the total space used, and it doesn't tell you the amount of space available on the volume. This seemed like an ideal situation in which to use PowerShell, as it involves both the file system and SQL Server, so I then looked into ways of doing it that way.
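A sketch of that sys.database_files approach, for reference (column names as documented; the size column is reported in 8 KB pages, and how accurately it reflects the container's contents is the caveat mentioned above):

-- Report the size recorded for each FILESTREAM container in the
-- current database, converting 8 KB pages to MB
SELECT name, physical_name, size * 8 / 1024 AS size_mb
FROM sys.database_files
WHERE type_desc = 'FILESTREAM';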

I found this article on Allen White’s blog which does most of the SMO work I need to carry out the task, but to get the directory size I would need to do something like this script, which I found on TechNet.
After putting it all together, I came up with this script. Most of my additional code is in the Filestream-Path procedure, which uses WMI to get the volume information; but before that I split the filegroup path into its ancestry using the split operator. The script then goes through each ancestor from the oldest to the youngest, and if volume information is returned then it must be a disk mount point. The last volume found will be the most recent ancestor.
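The two building blocks can be sketched like this (a sketch only, not the linked script: the filestream path is a placeholder, and Win32_Volume captions are assumed to carry a trailing backslash as documented):

# Sketch: total size of the files under a filestream container folder,
# plus the free space on the volume it lives on. The path is a placeholder.
$path = 'C:\data\db\filestream'

# Directory size: sum the length of every file underneath the folder
$usedBytes = (Get-ChildItem -Path $path -Recurse -Force |
    Measure-Object -Property Length -Sum).Sum

# Walk the path's ancestry from the root downwards; the last ancestor
# that matches a Win32_Volume caption is the mount point for the path
$ancestors = @()
$current = $path
while ($current) {
    $ancestors = ,$current + $ancestors
    $current = Split-Path -Path $current -Parent
}
$volume = $null
foreach ($dir in $ancestors) {
    $match = Get-WmiObject Win32_Volume |
        Where-Object { $_.Caption -eq ($dir.TrimEnd('\') + '\') }
    if ($match) { $volume = $match }
}

"{0:N1} MB used; {1:N1} GB free on {2}" -f ($usedBytes / 1MB),
    ($volume.FreeSpace / 1GB), $volume.Caption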

Note that when running this script you will get errors if you don’t have sufficient permissions on the folders.

Filed under: ,
PAL for a DBA
14 May 11 10:05 PM | MartinBell | with no comments

There are many articles about which performance counters you should collect when analysing the performance of SQL Server. Even once you have decided what counters to collect, analysing them to find issues can also be difficult and time consuming. This is where a tool such as PAL (Performance Analysis of Logs) proves itself invaluable. PAL is available on CodePlex and by default comes with a template for analysing SQL Server 2005/2008. To make sure you collect the performance counters analysed by PAL, you can create a logman input file by using the export function.



The export produces a list of counters that you can use with the logman utility to create a counter log, which can then be used with logman or Performance/System Monitor. Alternatively, you can use SQLDiag to collect your performance monitor counters and analyse them with PAL.
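As a sketch, assuming the exported counter list was saved as SQLCounters.txt (the collection name and output path are placeholders), creating and running a counter log with logman looks roughly like this:

REM Create a counter collection from the exported counter file
REM (-cf counter file, -si sample interval, -o output path)
logman create counter SQLBaseline -cf SQLCounters.txt -si 00:00:15 -o C:\PerfLogs\SQLBaseline

REM Start the collection, and stop it once you have captured enough
logman start SQLBaseline
logman stop SQLBaseline

The resulting log file can then be fed straight into PAL for analysis.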

Filed under: