My blog is moving

I've neglected blogging for far too long. I had what you might call a crisis of faith. That and a ton of work! Time to reboot, and I think a more modern blog platform is in order.

So I'm starting again here:

I'd like to thank Tony and all at SQL Blogcasts for enabling me to get started. 

Posted by DavidWimbush | with no comments

Fun with upgrading and BCP

I just had trouble using BCP out via xp_cmdshell. Probably serves me right, but that's a different issue. I got a strange error message, 'Unable to resolve column level collations', which turned out to be a bit misleading. I wasted some time comparing the collations of the server, the database and all the columns in the query. I got so desperate that I even read the Books Online article. Still no joy, but then I tried the interweb.

It turns out that calling bcp without qualifying it with a path causes Windows to search the folders listed in the Path environment variable - in that order - and execute the first version of BCP it can find. But when you do an in-place version upgrade, the new paths are added on the end of the Path variable, so you don't get the latest version of BCP by default.

To check which version you're getting, execute bcp -v at the command line. The version number will correspond to SQL Server version numbering (e.g. 10.50.n = 2008 R2). To examine and/or edit the Path variable, right-click on My Computer, select Properties, go to the Advanced tab and click on the Environment Variables button. If you change the variable you'll have to restart the SQL Server service before it takes effect.
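If you want to see the search-order behaviour for yourself, here's a minimal sketch on a Unix shell (the directories and version strings are made up for illustration; Windows resolves the Path variable the same way):

```shell
# Two fake 'bcp' executables that just report a version number.
mkdir -p /tmp/sqlold/bin /tmp/sqlnew/bin
printf '#!/bin/sh\necho 9.00\n'  > /tmp/sqlold/bin/bcp
printf '#!/bin/sh\necho 10.50\n' > /tmp/sqlnew/bin/bcp
chmod +x /tmp/sqlold/bin/bcp /tmp/sqlnew/bin/bcp

# Old directory listed first (as after an in-place upgrade): old version runs.
env PATH="/tmp/sqlold/bin:/tmp/sqlnew/bin:$PATH" bcp

# New directory listed first: new version runs.
env PATH="/tmp/sqlnew/bin:/tmp/sqlold/bin:$PATH" bcp
```

The first matching executable on the path wins, so the cure is simply to move the new folder ahead of the old one (or qualify bcp with its full path).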


Know your target market

Most companies think they understand their target market, but some really do. I have quite a few flash drives but I wouldn't give any of them a high score for good looks. Until now!

I was passing a well-known electronics retailer when this caught my eye: 32GB, £14.99, brand name Intenso. AND IT'S TRANSPARENT!!

About 90 seconds later I owned one.


The unknown unknowns

I was spring cleaning some old scripts and I came across my first attempt at establishing a backup strategy. 12 hours later I'm still feeling a bit queasy. When I first put on the DBA hat, we were running with the simple recovery model and doing a full backup every night. I did some homework and thought I'd got it all figured out. My killer improvement? Use full recovery and every night do a log backup and then a full backup. That way we could recover to any point in time. Brilliant!*

Since then I've learned a lot. I know I'm much better than I was. But I can never quite shake that small nagging doubt: how much of what I'm doing now - believing it's good practice - will I one day look back on and shudder? But this is a good thing. It keeps me on my toes, continually re-evaluating what I'm doing, and hungry to learn.


*Just to be clear, I'm being sarcastic. This is NOT a good solution. If you have any doubts:

  1. Read Books Online about recovery models and backups.
  2. Read what Paul Randal has written about it in his blog and in TechNet Magazine.
  3. Check out Ola Hallengren's database maintenance solution before you write your own.

Reporting on report usage revisited

Nearly three years ago I wrote about how to query the ReportServer database and see who is running which reports. Since then I've made it a bit more focused. Which reports are being used the most? Which users are the busiest and which reports are they looking at? Here's what I came up with.

(NB: This is for 2008 or 2008 R2 Reporting Services in standalone mode. I've no idea what you get in SharePoint mode.)


Report Heatmap

Which reports are being used the most?

If you click on a report you see...

Report Usage

Who has been using that report?



Report User Heatmap

Who are the busy users?

If you click on a user you see...

Report User History

Which reports has that user been running?


Feel free to download the RDL files and stored procedures and use them yourself.


Strange gotcha with column names

Here's a funny thing I discovered while working with a table I created by importing data from Excel. (I know, I only have myself to blame.) A number of the column names ended up with a trailing space. They were Y/N type columns where the value was either Y or null. I decided to update all the nulls to Ns so the nulls wouldn't trip me up later. So I typed a query like this: update <table> set <column> = isnull(<column>, 'N'). I noticed Intellisense had underlined the column name but I knew what I was doing and assumed it was still refreshing its cache. The update ran fine but when I tried to select that column I got a 'no such column' error. WTF?

I can only assume this is a side-effect of the way you can update things like rowset functions and not just tables. I've done a bit of playing around and found that (surprise surprise) some variants will work and some will fail. Try it for yourself:

create table dbo.TestTable (
    [TestColumn ] varchar(20) null
);

--This update will give the result: (0 row(s) affected)
update dbo.TestTable
set TestColumn = 'hello';

--But this update will fail:
--Msg 207, Level 16, State 1, Line 2
--Invalid column name 'TestColumn2'.
update dbo.TestTable
set TestColumn2 = 'hello';

drop table dbo.TestTable;

I suppose the moral of the story is: concentrate and check that updates affect the number of rows you expect. (Damn, that's not going to work!)
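Incidentally, this isn't a SQL Server exclusive. Here's a sketch of the same trap reproduced in SQLite from Python (a stand-in I'm using purely for illustration, since its identifier matching is whitespace-exact too): a quoted column name with a trailing space can only be selected by its exact quoted name.

```python
import sqlite3

con = sqlite3.connect(":memory:")
# Note the trailing space inside the quoted column name.
con.execute('create table t ("TestColumn " varchar(20))')
con.execute('insert into t ("TestColumn ") values (?)', ("hello",))

# The exact quoted name, trailing space included, works fine.
print(con.execute('select "TestColumn " from t').fetchone())  # ('hello',)

# The bare name without the space does not resolve.
try:
    con.execute("select TestColumn from t")
except sqlite3.OperationalError as e:
    print(e)
```

Whatever the engine, the safest fix is to rename the imported columns to strip the stray whitespace as soon as you spot it.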



SQL Server 2012 edition comparison details are published

Interesting stuff, particularly if you're doing BI. BISM tabular and Power View will not be in Standard Edition, only in the new - presumably more expensive - Business Intelligence Edition. That kind of makes sense as you need a fairly pricey edition of SharePoint to really get all the benefits, but it's a shame there won't be some kind of limited version in Standard Edition. And Always On will be in Standard Edition but limited to 2 nodes. I really expected Always On to be Enterprise-only so this is a great decision. It allows those of us working at a more modest scale to benefit and raises the fault tolerance of SQL Server as a product to a new level.

Read all about it here:


Denali Side-by-Side Gotcha

Don't try this at home. I just got my new (new to me anyway) laptop set up nicely and then installed an instance of Denali CTP3 to start exploring. RTFM? Not me! The first thing I thought I'd play with was the tabular model and Project Crescent as this has huge potential for Reporting & BI people. So I click New Project and it says you can only do this on a default instance. Erm, but my default instance is 2008R2. Oh well, back to the (un)installer.

An annoying problem with Nulls

I just lost half an hour of my life which I'll never get back on this. Don't make the same mistake. I extracted some customer details for a mailing (it's OK, they ticked the box that they want to hear about our special offers) but Marketing complained that some of the address data contained nulls. I checked the spreadsheet I'd given them and they were right. So I checked my query and followed the trail back to the source record for an example. The value in AddressLine2 was Null. But my select statement said isnull(AddressLine2, '') as AddressLine2 and it was returning Null. WTF?! The sharp-eyed reader has probably already noticed that the value was the string 'Null', not an actual NULL. I got there in the end but it took a while. Good job it wasn't upper case or I might have had to ask one of the developers for help!

If you get this problem and it's upper case you could try using the lower() function which will change the case of 'NULL' but not NULL:

declare @a varchar(10) = null;
declare @b varchar(10) = 'NULL';
select lower(@a) as a, lower(@b) as b;
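The same trap exists anywhere a missing value has a distinct representation from the text 'NULL'. A tiny illustration in Python (the address data is invented): a "replace the missing values" step happily leaves the literal string alone.

```python
# One real missing value (None) and one impostor (the string 'NULL').
address_lines = ["Flat 2", None, "NULL"]

# The equivalent of isnull(col, ''): only genuine missing values get replaced.
cleaned = ["" if line is None else line for line in address_lines]
print(cleaned)  # ['Flat 2', '', 'NULL'] - the impostor string survives
```

So if a null-handling function appears to be doing nothing, check whether you're looking at a real NULL or just four suspicious characters.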


Failing report subscriptions

We had an interesting problem while I was on holiday. (Why doesn't this stuff ever happen when I'm there?) The sysadmin upgraded our Exchange server to Exchange 2010 and everyone's subscriptions stopped. My Subscriptions showed an error message saying that the email address of one of the recipients is invalid. When you create a subscription, Reporting puts your Windows user name into the To field and most users have no permissions to edit it. By default, Reporting leaves it up to Exchange to resolve that into an email address. This only works if Exchange is set up to translate aliases or 'short names' into email addresses. It turns out this leaves Exchange open to being used as a relay, so it is disabled out of the box. You now have three options:

  1. Open up Exchange. That would be bad.
  2. Give all Reporting users the ability to edit the To field in a subscription. a) They shouldn't have to, it should just work. b) They don't really have any business subscribing anyone but themselves.
  3. Fix the report server to add the domain. This looks like the right choice and it works for us. See below for details.


This fix works as long as you have:

  • A single email domain name.
  • A clear relationship between the Windows user name and the email address. eg. If the user name is joebloggs, then joebloggs@domainname needs to be the email address or an alias of it.

Warning: Saving changes to the rsreportserver.config file will restart the Report Server service which effectively takes Reporting down for around 30 seconds. Time your action accordingly.

Edit the file rsreportserver.config (most probably in the folder ..\Program Files[ (x86)]\Microsoft SQL Server\MSRS10_50[.instancename]\Reporting Services\ReportServer). There's a setting called DefaultHostName which is empty by default. Enter your email domain name without the leading '@'. Save the file. This domain name will be appended to any destination addresses that don't have a domain name of their own.
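For reference, the relevant fragment of the config file looks something like this (the domain name here is a made-up placeholder; substitute your own):

```xml
<!-- rsreportserver.config: appended to any destination address     -->
<!-- that has no domain of its own. 'example.com' is a placeholder. -->
<DefaultHostName>example.com</DefaultHostName>
```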
