October 2010 - Posts

SQLBits 6 Videos now available online
21 October 10 09:53 AM | MartinBell | 2 comment(s)

Thanks to a lot of hard work by Darren, the SQLBits VI videos are now available for viewing.

Check out the sessions at
http://www.sqlbits.com/information/event6/PublicSessions.aspx. There's lots of great content and information.

SQLBits VII feedback competition
18 October 10 07:51 PM | MartinBell | with no comments

Anyone wishing to submit feedback should do so by midnight on Friday 22nd October if they wish to be included in the prize draw for the Xbox 360.


The draw will be made on Saturday 23rd October at the Modern .NET Day.

Using sp_rename
16 October 10 08:23 PM | MartinBell | 2 comment(s)

In my previous post I talked about using sp_rename to rename columns and primary keys/indexes. There was a time when most people recommended that you don’t use sp_rename, and if you do an internet search for sp_rename you will see articles such as this one that Andras wrote back in 2006. Things do move on, so when you find a recommendation like this, I would recommend checking that it still applies to the version of SQL Server you are using. You may also find that an article does not explain the whole reason, or is vague about the consequences. For example, why is the procedure text held in syscomments so important? If SSMS has a workaround when scripting or modifying a procedure, what other problems could it cause? Certain tools may use the text in syscomments to compare two versions of a procedure, but then I would ask if, and why, such tools were being used on production systems. Surely the definitive stored procedure/view definitions are the ones in version control?
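You can see the syscomments issue for yourself with a minimal sketch (the procedure name here is purely illustrative): rename a procedure with sp_rename and the stored definition text still contains the old name, so anything that scripts the procedure from that text gets it wrong.

```sql
-- Hypothetical procedure, used only to demonstrate the stale definition text
CREATE PROCEDURE dbo.usp_OldName AS SELECT 1;
GO

EXEC sp_rename 'dbo.usp_OldName', 'usp_NewName';
GO

-- The definition still reads "CREATE PROCEDURE dbo.usp_OldName"
SELECT OBJECT_DEFINITION(OBJECT_ID('dbo.usp_NewName'));
```

sp_rename itself warns about exactly this when you run it ("Changing any part of an object name could break scripts and stored procedures").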

Re-creating views or stored procedures will be less of an issue than dropping/re-creating primary keys, but even with primary keys what you should do is not a black-and-white decision. If there is little data in the table, or if there are no other indexes on the table, you may still want to drop/create the index. If that is the case, you would need to remind me again why you want to script the objects from a production system!!

So whether you should use sp_rename in the problem cases above is not clear-cut, and it may be the case that you make two different decisions depending on what the target system is!

How far can development tools take you – part 2?
12 October 10 03:19 PM | MartinBell | 2 comment(s)

After my last post about how poor the SQL generated by the schema compare tool in Visual Studio 2010 was, I thought I would look at how a database project would handle the same situation. So I fired up Visual Studio and created a blank SQL Server 2008 database project called DBProject.

[screenshot]

This gave me the following project:

[screenshot]

I then looked at the project properties:

[screenshot]

and changed the target database in the deployment options to point to my testDB_V1 database.

[screenshot]

As I had already created the database, I imported the objects from it, but you could instead add the source files from your source code control system.

[screenshot]

This requires you to create a new connection or edit an existing one:

[screenshot]

Once completed, this creates a file for the T1 table:

[screenshot]

The file contains the following SQL code:

CREATE TABLE [dbo].[T1] (
    [id]             INT      NOT NULL,
    [longcolumnname] CHAR (1) NOT NULL,
    [othercolumn]    CHAR (1) NULL,
    PRIMARY KEY CLUSTERED ([id] ASC)
        WITH (ALLOW_PAGE_LOCKS = ON, ALLOW_ROW_LOCKS = ON, PAD_INDEX = OFF, IGNORE_DUP_KEY = OFF, STATISTICS_NORECOMPUTE = OFF)
);

If you now try to deploy the solution, you will get a script that checks that the server/database exists and sets the database options.

Looking at the schema view you can expand the nodes into the columns of the T1 table:

[screenshot]

Right-clicking on the longcolumnname column will give you a Refactor menu option, from which you can rename the column:

[screenshot]

[screenshot]

Now if you create a deployment script it will give you the following SQL statements to rename the column:

PRINT N'The following operation was generated from a refactoring log file 113fc586-f76d-41bd-aa2e-934496ff3852';

PRINT N'Rename [dbo].[T1].[longcolumnname] to shortercolumnname';
GO

EXECUTE sp_rename @objname = N'[dbo].[T1].[longcolumnname]', @newname = N'shortercolumnname', @objtype = N'COLUMN';
GO

This is far better than the schema compare tool, but then it does have knowledge about the context in which the change has been made.

So what about renaming the system generated PK?

If you try to change the name in the same way, you will not get the rename option on the Refactor menu:

[screenshot]

If you try to change it by modifying the script to the following SQL:

CREATE TABLE [dbo].[T1] (
    [id]             INT      NOT NULL,
    [shortercolumnname] CHAR (1) NOT NULL,
    [othercolumn]    CHAR (1) NULL,
    CONSTRAINT PK_T1 PRIMARY KEY CLUSTERED ([id] ASC) WITH (ALLOW_PAGE_LOCKS = ON, ALLOW_ROW_LOCKS = ON, PAD_INDEX = OFF, IGNORE_DUP_KEY = OFF, STATISTICS_NORECOMPUTE = OFF)
);

The subsequent deployment script will contain the following SQL to drop and recreate the Primary Key:

PRINT N'Dropping PK__T1__3213E83F7F60ED59...';
GO

ALTER TABLE [dbo].[T1] DROP CONSTRAINT [PK__T1__3213E83F7F60ED59];
GO

PRINT N'Creating PK_T1...';
GO

ALTER TABLE [dbo].[T1]
    ADD CONSTRAINT [PK_T1] PRIMARY KEY CLUSTERED ([id] ASC)
    WITH (ALLOW_PAGE_LOCKS = ON, ALLOW_ROW_LOCKS = ON, PAD_INDEX = OFF, IGNORE_DUP_KEY = OFF, STATISTICS_NORECOMPUTE = OFF);
GO

Because of the issue scripting renamed system-generated primary keys that I mentioned in my previous post, this may be the solution you decide to use; hopefully it won’t take too long to run!


How far can development tools take you?
11 October 10 07:27 PM | MartinBell | 5 comment(s)

End-to-end database development is a great idea, but after a recent project that used the schema comparison tool in Visual Studio 2010 to generate upgrade scripts I had to question how feasible this actually is!

The reason I say this is that the quality of the script generated using the schema comparison tool in VS 2010 is not always that good. Not everyone will want to totally change the way they currently work and embrace database projects in their entirety, especially if it entails a significant cost to do so. Because the schema comparison tool works to a set of pre-defined options, the code generated is usually the “safest”, but even then it is not necessarily correct, and quite often this also means that it is the least efficient.

Here is an example, consider the following version of a given database:

CREATE DATABASE TestDB_V1
GO

USE TestDB_V1
GO

CREATE TABLE T1 (
    id int NOT NULL PRIMARY KEY CLUSTERED,
    longcolumnname char(1) NOT NULL,
    othercolumn char(1) NULL
)
GO

CREATE DATABASE TestDB_V2
GO

USE TestDB_V2
GO

CREATE TABLE T1 (
    id int NOT NULL CONSTRAINT PK_T1 PRIMARY KEY CLUSTERED,
    shortercolumnname char(1) NOT NULL,
    othercolumn char(1) NOT NULL
)
GO

Here we have two versions of a database that contain the T1 table. Between version 1 and version 2 the column longcolumnname was renamed to shortercolumnname (sic) and the primary key was renamed from the system-generated name to a friendlier one.

If you then compare the two databases in Visual Studio 2010 you will get the following results:


[screenshot]

You may already see that there may be problems, but without knowing that the column and the PK were actually renamed, how will the tool determine the most efficient way to do the upgrade?

If you generated the script you would get the following:

PRINT N'Dropping PK__T1__3213E83F7F60ED59...';
GO
ALTER TABLE [dbo].[T1] DROP CONSTRAINT [PK__T1__3213E83F7F60ED59];
GO
PRINT N'Starting rebuilding table [dbo].[T1]...';
GO
/*
The column [dbo].[T1].[longcolumnname] is being dropped, data loss could occur.

The column [dbo].[T1].[shortercolumnname] on table [dbo].[T1] must be added, but the column has no default value and does not allow NULL values. If the table contains data, the ALTER script will not work. To avoid this issue, you must add a default value to the column or mark it as allowing NULL values.
*/
GO
SET TRANSACTION ISOLATION LEVEL SERIALIZABLE;
SET XACT_ABORT ON;
BEGIN TRANSACTION;

CREATE TABLE [dbo].[tmp_ms_xx_T1] (
    [id]                INT      NOT NULL,
    [shortercolumnname] CHAR (1) NOT NULL,
    [othercolumn]       CHAR (1) NULL
);

ALTER TABLE [dbo].[tmp_ms_xx_T1]
    ADD CONSTRAINT [tmp_ms_xx_clusteredindex_PK_T1] PRIMARY KEY CLUSTERED ([id] ASC) WITH (ALLOW_PAGE_LOCKS = ON, ALLOW_ROW_LOCKS = ON, PAD_INDEX = OFF, IGNORE_DUP_KEY = OFF, STATISTICS_NORECOMPUTE = OFF);

IF EXISTS (SELECT TOP 1 1
           FROM   [dbo].[T1])
    BEGIN
        INSERT INTO [dbo].[tmp_ms_xx_T1] ([id], [othercolumn])
        SELECT   [id],
                 [othercolumn]
        FROM     [dbo].[T1]
        ORDER BY [id] ASC;
    END

DROP TABLE [dbo].[T1];
EXECUTE sp_rename N'[dbo].[tmp_ms_xx_T1]', N'T1';
EXECUTE sp_rename N'[dbo].[tmp_ms_xx_clusteredindex_PK_T1]', N'PK_T1', N'OBJECT';
COMMIT TRANSACTION;
SET TRANSACTION ISOLATION LEVEL READ COMMITTED;
GO

If you allowed the option to accept data loss, you would end up losing all the data in shortercolumnname. The technique of creating a new table and sucking the data across was previously used by Enterprise Manager when you did this sort of thing in the GUI. There were many complaints about it, and a lot of people came to grief when something that appeared to be a simple column rename brought systems to their knees for hours. If you now profile Management Studio when doing this operation, you will see that it uses sp_rename instead:

EXEC dbo.sp_rename @objname=N'[dbo].[T1].[longcolumnname]', @newname=N'shortercolumnname', @objtype=N'COLUMN'

Therefore all the code that is used to create the new table, suck the data into it and the subsequent renames could be replaced with the single rename statement.

If you change the primary key name in Management Studio that also uses sp_rename:

EXEC sp_rename N'[dbo].[T1].[PK__T1__3213E83F7F60ED59]', N'PK_T1', N'INDEX'

So in theory there is no need to drop the existing key, which could save a significant amount of time. Unfortunately there is a downside to this. If you then try to script the key, you will get the following SQL:

ALTER TABLE [dbo].[T1] ADD PRIMARY KEY CLUSTERED (
     [id] ASC
)
WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, SORT_IN_TEMPDB = OFF, IGNORE_DUP_KEY = OFF, ONLINE = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
GO

i.e. the name is not scripted.
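To see why, you can look at the is_system_named flag in sys.key_constraints (a quick sketch against the example table): after the rename with the INDEX objtype, the constraint still reports itself as system-named, which is presumably why the scripter omits the name.

```sql
SELECT name, is_system_named
FROM sys.key_constraints
WHERE parent_object_id = OBJECT_ID('dbo.T1');
```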

I compared the system catalogs for the two constraints in the testDB_V1 and testDB_V2 databases:

SELECT *
FROM sysconstraints c
JOIN sys.objects o ON c.constid = o.object_id
WHERE o.name = 'PK_T1';

and noted that for testDB_V1 the status value in sysconstraints is 133665 and for testDB_V2 it is 2593. Both of these values are far higher than the values documented in Books Online. This appears to be a “feature”, so I raised an item on Connect:

https://connect.microsoft.com/SQLServer/feedback/details/612575/pks-do-not-script-correctly-when-renamed-using-sp-rename#details

<Update>

20120510 - This Connect item was closed with the comment “fixed in the next release of SQL Server”. I checked SQL Server 2008 R2 SP1 and it didn't work, but SQL Server 2012 RTM did. Therefore, for older versions, use the workaround suggested by David B Howard.


20101110 - An interesting comment on my connect item relating to the problem with sp_rename by David B Howard is that using OBJECT as the objtype in the sp_rename resets the is_system_named value in sys.key_constraints and can be used as a workaround.
</Update>
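Based on David's comment, the workaround on older versions would look something like this (a sketch using the constraint name from the example above; note that with the OBJECT objtype the constraint is addressed as schema.constraint rather than being table-qualified):

```sql
-- Renaming via the OBJECT objtype resets is_system_named,
-- so the key scripts with its new name afterwards
EXEC sp_rename N'dbo.PK__T1__3213E83F7F60ED59', N'PK_T1', N'OBJECT';
```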

Back to the original reason for this blog post… If you are going to use a tool to generate your upgrade scripts, then you should check the scripts it generates, as they may be inefficient and may even produce some undesired results.

Once you start questioning or changing these scripts the use of the tool is devalued, so I don’t think it will mean the demise of the T-SQL expert just yet!!

In my next post I look at how database projects handle these two situations.

Another post-SQLbits blog post
11 October 10 10:48 AM | MartinBell | 1 comment(s)

Thanks to everyone who’s taken the time to email, tweet etc. to tell us how much they enjoyed SQLBits in York. I hope you all had fun and learnt a great deal.

For the Saturday we had 195 badges left unclaimed. Saturday may be free for delegates, but 195 no-shows equates to between £7,000 and £10,000 of unnecessary costs (depending on the venue). This amount is around the total budget of a regional DDD event.

I would love to make the list of no-shows available to name and shame these people. I’ve already checked that the domains
www.inconsideratecretins.com and www.inconsideratecretins.info are available. I’ll have to see what the lawyers say about it!

In the meantime, what I can do is, when I’m called in to conduct a technical interview (as I quite often am), make sure I check the list beforehand. Of course, the interviewee won’t know that until it’s too late.

