Identity Columns- How close to the edge are you?
09 May 11 09:27 PM | MartinCroft | with no comments

Day 9. How close is your data to the edge? Integer identity columns that increment by 1 have been known to run out of numbers. I have seen it happen on a few occasions on highly transactional systems, but there is really no excuse for getting caught out. The following code easily shows you where your identity columns are.

 

--2005
SELECT SCHEMA_NAME(schema_id) AS [Schema],name AS [Table]
FROM sys.tables
WHERE OBJECTPROPERTY(object_id,'TableHasIdentity') = 1
ORDER BY [Schema], [Table];
GO
--2000
SELECT Name from sysobjects 
WHERE OBJECTPROPERTY(id,'TableHasIdentity') = 1

 

The code below will show all your tables that have an identity seed on them and how full they are, percentage-wise. The trick is to catch them before they hit 100% and bring down the database because it can't insert any more rows!

There are several options to fix it: change the data type to bigint, reseed the values to a lower number (if you don't keep all the data, because of archiving for example, and you don't need the IDs to stay unique), or even start again on a negative identity number to double the capacity! Somehow that last one seems wrong; if it had been planned correctly it would already be the right data type. Hey, these things happen when you inherit systems.
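As a rough sketch of those options (the table and column names here are made up, and in real life the identity column is usually the primary key, so constraints and any referencing foreign keys have to be dealt with first):

--Hypothetical table/column names, shown roughly in order of preference
ALTER TABLE dbo.MyTable ALTER COLUMN MyID bigint;        --widen the column (a size-of-data operation, can block for a while)
DBCC CHECKIDENT ('dbo.MyTable', RESEED, 1);              --reseed lower if the old IDs have been archived off and uniqueness isn't needed
DBCC CHECKIDENT ('dbo.MyTable', RESEED, -2147483648);    --or start from the negative end to double the capacity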

Exact number data types that use integer data.

bigint

Integer (whole number) data from -2^63 (-9223372036854775808) through 2^63-1 (9223372036854775807). Storage size is 8 bytes.

int

Integer (whole number) data from -2^31 (-2,147,483,648) through 2^31 - 1 (2,147,483,647). Storage size is 4 bytes. The SQL-92 synonym for int is integer.

smallint

Integer data from -2^15 (-32,768) through 2^15 - 1 (32,767). Storage size is 2 bytes.

tinyint

Integer data from 0 through 255. Storage size is 1 byte.

 SQL 2000 code 
SELECT
    QUOTENAME(USER_NAME(t.uid))+'.'+QUOTENAME(t.name)AS TableName,
    c.name AS ColumnName,
    CASE c.xtype
    WHEN 127 THEN 'bigint'
    WHEN 56 THEN 'int'
    WHEN 52 THEN 'smallint'
    WHEN 48 THEN 'tinyint'
    END AS 'DataType',
    IDENT_CURRENT(USER_NAME(t.uid)+'.'+ t.name) AS CurrentIdentityValue,
    CASE c.xtype
    WHEN 127 THEN (IDENT_CURRENT(USER_NAME(t.uid)+'.'+ t.name)* 100.)/ 9223372036854775807
    WHEN 56 THEN (IDENT_CURRENT(USER_NAME(t.uid)+'.'+ t.name)* 100.)/ 2147483647
    WHEN 52 THEN (IDENT_CURRENT(USER_NAME(t.uid)+'.'+ t.name)* 100.)/ 32767
    WHEN 48 THEN (IDENT_CURRENT(USER_NAME(t.uid)+'.'+ t.name)* 100.)/ 255
    END AS 'PercentageUsed'
FROM
    syscolumns AS c
INNER JOIN
    sysobjects AS t ON t.id = c.id
WHERE
    COLUMNPROPERTY(t.id, c.name,'isIdentity')= 1
AND
    OBJECTPROPERTY(t.id,'isTable')= 1
ORDER BY
    PercentageUsed DESC
 
 SQL  2005/2008 
SELECT
    QUOTENAME(SCHEMA_NAME(t.schema_id))+'.'+ QUOTENAME(t.name)AS TableName,
    c.name AS ColumnName,
    CASE c.system_type_id
    WHEN 127 THEN 'bigint'
    WHEN 56 THEN 'int'
    WHEN 52 THEN 'smallint'
    WHEN 48 THEN 'tinyint'
    END AS 'DataType',
    IDENT_CURRENT(SCHEMA_NAME(t.schema_id)+'.'+ t.name) AS CurrentIdentityValue,
    CASE c.system_type_id
    WHEN 127 THEN (IDENT_CURRENT(SCHEMA_NAME(t.schema_id) + '.' + t.name)* 100.)/ 9223372036854775807
    WHEN 56 THEN (IDENT_CURRENT(SCHEMA_NAME(t.schema_id) + '.' + t.name) * 100.) / 2147483647
    WHEN 52 THEN (IDENT_CURRENT(SCHEMA_NAME(t.schema_id) + '.' + t.name)* 100.)/ 32767
    WHEN 48 THEN (IDENT_CURRENT(SCHEMA_NAME(t.schema_id) + '.' + t.name)* 100.) / 255
    END AS 'PercentageUsed',
    CAST(CAST(GETDATE()as VARCHAR(12))as datetime) as ReportTime
FROM
    sys.columns AS c
INNER JOIN
    sys.tables AS t ON t.[object_id] = c.[object_id]
WHERE
    c.is_identity = 1
ORDER BY
    PercentageUsed DESC
TempDB–What size was it?
08 May 11 05:04 PM | MartinCroft | with no comments

Day 8. As you might know, TempDB is created every time the SQL Server service is restarted, but do you know what size it will start at next time you restart SQL? This script shows you. It's amazing how many production servers are never sized up from the default set when SQL was installed, and it's quite surprising just how small that is: 8MB! So if you have an 8GB data file set to 10% growth, guess how many times it has to grow by 10% to get from 8MB to 8GB (lots), which can cause performance problems.

SELECT
  alt.filename
  ,alt.name
  ,alt.size * 8.0 / 1024.0 as originalsize_MB
  ,files.size* 8.0 / 1024.0 as currentsize_MB
FROM
  master.dbo.sysaltfiles  alt 
INNER JOIN 
  tempdb.dbo.sysfiles files ON alt.fileid = files.fileid
WHERE
  dbid = db_id('tempdb')
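As a back-of-the-envelope check on "lots": the number of 10% growth events needed to get from 8MB to 8GB is roughly log(8192/8)/log(1.1), which SQL can work out for you:

--About 73 autogrow events per file to get from the 8MB default to 8GB at 10% growth
SELECT CEILING(LOG(8192.0 / 8.0) / LOG(1.1)) AS GrowthEvents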
Merge – an under-used statement
07 May 11 04:54 PM | MartinCroft | 1 comment(s)

 

Day 7. The MERGE statement is one that quite a few DBAs are aware of and use, but it is under-used by developers, I think because they are not aware of all the scenarios it can be used in. I recently had a variety of scenarios where developers had asked my opinion, or for a review of some code, and found MERGE was perfect to use. I have put together some examples that show the different things that can be done with MERGE, from the obvious to the more specialist situations.

This example uses AdventureWorks and also shows some other lesser-used bits of T-SQL. The full script can be found here

The code to populate the example tables with dummy data:

--=====================================================================================
--Populate dummy data for example using CROSS JOIN to generate multiple rows
-- and Row_number to create unique customer numbers 
--=====================================================================================
INSERT INTO CustomersCache (Name,City,AnnualEarnings) 
SELECT 
	Name + ' ' + CAST(ROW_NUMBER() OVER(PARTITION BY Name ORDER BY Name )AS VARCHAR(3)) ,City,10000
FROM
(
SELECT TOP 10
      [Name]
 FROM 
	[AdventureWorks].[Sales].[Store] 
)a
CROSS JOIN  
(
SELECT DISTINCT TOP 30 
	[City]
FROM 
	[AdventureWorks].[Person].[Address]
)b 

 

The populate script uses a CROSS JOIN to simplify creating dummy data. It's very easy to populate lots of rows using a CROSS JOIN, but I wanted them to be distinct, so I used

    Name + ' ' + CAST(ROW_NUMBER() OVER(PARTITION BY Name ORDER BY Name )AS VARCHAR(3))

to give me a unique row

[Screenshot: unique customer rows]

 

So we have two tables: CustomersCache and Customers. The cache table is populated and the Customers table is empty.
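The CREATE TABLE statements aren't reproduced here; going by the columns the later statements use, they would look roughly like this sketch (a reconstruction, not the original definitions):

--Rough reconstruction of the two example tables
CREATE TABLE CustomersCache
( Name           VARCHAR(50) NOT NULL,
  City           VARCHAR(30) NOT NULL,
  AnnualEarnings INT,
  ModifiedDate   DATETIME NULL
)
CREATE TABLE Customers
( CustID         INT IDENTITY(1,1),
  Name           VARCHAR(50) NOT NULL,
  City           VARCHAR(30) NOT NULL,
  AnnualEarnings INT,
  ModifiedDate   DATETIME NULL,
  Active         BIT DEFAULT (1)
)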

Running the following basic MERGE statement:

--=====================================================================================
--Simple MERGE statement any rows that dont exist in 
--TARGET table will be populated 
--=====================================================================================
MERGE Customers AS TARGET
USING CustomersCache AS SOURCE  ON(TARGET.Name = SOURCE.NAME)
WHEN NOT MATCHED THEN
INSERT (Name, City, [AnnualEarnings], ModifiedDate)
VALUES (Name, City, [AnnualEarnings], ModifiedDate);
	

The trigger shows we have inserts occurring, and checking both tables we would find they are now in sync.
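(The trigger itself isn't shown in the post; something simple like this hypothetical one would do the job of reporting what fired, and it's this trigger that has to be dropped later before OUTPUT can be used with MERGE.)

--Hypothetical trigger, just to report which action the MERGE fired and how many rows
CREATE TRIGGER trg_Customers_Action ON Customers
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    DECLARE @i INT, @d INT
    SELECT @i = COUNT(*) FROM inserted
    SELECT @d = COUNT(*) FROM deleted
    PRINT CASE WHEN @i > 0 AND @d = 0 THEN 'INSERT: ' + CAST(@i AS VARCHAR(10)) + ' row(s)'
               WHEN @i > 0 AND @d > 0 THEN 'UPDATE: ' + CAST(@i AS VARCHAR(10)) + ' row(s)'
               ELSE 'DELETE: ' + CAST(@d AS VARCHAR(10)) + ' row(s)' END
END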

OK, let's simulate a scenario where customer accounts no longer exist. The easiest approach is to delete some rows out of the cache table.

 

DELETE TOP(5) FROM CustomersCache
OUTPUT deleted.*

 

Using DELETE TOP(5) we can delete just 5 rows without the need for a WHERE clause, and using OUTPUT (another under-rated and under-used piece of T-SQL; I'll blog about it in the next couple of weeks) we can see the rows it has deleted.

[Screenshot: deleted rows]

OK, let's show how to deactivate the customers:

MERGE Customers AS TARGET
USING CustomersCache AS SOURCE  ON(TARGET.Name = SOURCE.NAME)
WHEN NOT MATCHED BY SOURCE THEN
UPDATE SET Active =0, ModifiedDate =GETDATE();

So any rows that don't match between source and destination are now deactivated and the modified date updated (shown below). The trigger shows the updates occurring.

[Screenshot: deactivated rows]

OK, now let's delete another row and also modify some data. This time I'll show how you can delete a random row using ORDER BY NEWID(), and OUTPUT again; notice the placement of the OUTPUT keyword, it comes after the DELETE statement and not at the end of the query.

--Delete a Random row from Source 
DELETE TOP(1) CustomersCache
OUTPUT deleted.*
WHERE Name = (SELECT TOP 1 Name from CustomersCache ORDER BY newiD())

And also add in a new company

 

--Add New Company
INSERT INTO CustomersCache
SELECT  TOP 1
	'Martins New Company 1',
	CITY,
	AnnualEarnings,
	GETDATE() 
FROM 
	CustomersCache
--ok lets amend some values as well. Random of course.
--The code below gives a random value between 1 and 1000,
-- also using the 2008 compound assignment syntax: instead of a = a + 1 you can write a += 1
 
 UPDATE 	TOP(1) CC
SET AnnualEarnings+= (1+ CAST(1000*RAND()AS INT)),
    ModifiedDate =GETDATE()
OUTPUT 
    deleted.Name,
    deleted.AnnualEarnings As Before,
    inserted.AnnualEarnings AS After
FROM CustomersCache CC
WHERE ModifiedDate IS NULL 

OK, that just updates 1 row, and before anyone pulls me up, I did say amend some values. If you add

GO 50

at the end of the code and run it all, the code runs 50 times. This is a very useful trick for generating lots of rows quickly, or for dropping into someone's code at the bottom of a script if they have left their machine unlocked. It only works in Management Studio, as GO is the batch separator:

SELECT 'I am an IDIOT'

GO 5000

You get the idea; not saying I would necessarily use the word idiot!

* Warning: if you highlight a lot of statements and add a GO 50, it will execute THEM ALL 50 times. That's why in my full script you won't find the GO 10; I add it when demonstrating each section. *

Right, let's MERGE them into the Customers table (sorry, couldn't resist). I am now going to use OUTPUT with the MERGE statement, so two cool features together. However, you can't use OUTPUT with MERGE if you have triggers on the table, so I drop the triggers first.

Here's the code that will insert new rows, update any amendments along with the modified date, and deactivate the rows that no longer exist.

MERGE Customers AS TARGET
USING CustomersCache AS SOURCE  ON(TARGET.Name = SOURCE.NAME)
WHEN NOT MATCHED THEN
INSERT (Name, City, [AnnualEarnings], ModifiedDate)
VALUES (Name, City, [AnnualEarnings], ModifiedDate)
WHEN MATCHED AND
(
	TARGET.[AnnualEarnings] <> SOURCE.[AnnualEarnings]
)
THEN
UPDATE
SET TARGET.[AnnualEarnings]=SOURCE.[AnnualEarnings],
TARGET.ModifiedDate =GETDATE()
WHEN NOT MATCHED BY SOURCE THEN
UPDATE SET Active =0, ModifiedDate =GETDATE()
OUTPUT $Action AS Action,
	   Inserted.Custid,
	   Inserted.Name,
	   Deleted.ModifiedDate,
	   Deleted.AnnualEarnings,
	   Inserted.AnnualEarnings;

With OUTPUT and the MERGE statement you can use the $Action keyword, which lets you see the changes. This can be very useful if you want to take another action, an UPDATE or a DELETE, dependent on what occurred. I did use this mechanism for a complex next-customer/policy-number scenario based on two tables, which I'll look to blog about over the coming weeks.

which outputs (umm) the following

[Screenshot: OUTPUT $Action results]

Conditional MERGE

OK, let's say we are interested in auditing our table so that we keep a history of any changes, but only updates; we're not interested in deletes.

We create an Auditing table and update 5 rows (remember the warning: this one is safe to run as part of the entire script because a GO precedes the statement we want to run 5 times). We are again outputting the $Action value.

CREATE TABLE Auditing
(
	[AuditID] [smallint] IDENTITY(1,1) NOT NULL,
	[Name] VARCHAR(50) NOT NULL,
	[City] VARCHAR(30) NOT NULL,
	[EarningsBefore] INT, 
	[EarningsAfter] INT, 
	[ModifiedDate] [datetime]  NULL,
	[Type] VARCHAR(10),	
	Active BIT DEFAULT (1)
)
	GO
	
	--Update 5 records using GO 
	UPDATE 	TOP(1) CC
	SET AnnualEarnings+= (1+ CAST(1000*RAND()AS INT)),
		ModifiedDate =GETDATE()
	OUTPUT deleted.Name,deleted.AnnualEarnings As Before,inserted.AnnualEarnings AS After
	FROM CustomersCache CC
	WHERE ModifiedDate IS NULL
	GO 5
	
	INSERT INTO Auditing(Name,City,[EarningsBefore],[EarningsAfter],ModifiedDate, Type)
	SELECT Name,City,EarningsBefore,EarningsAfter,ModifiedDate, Action
	FROM
	(
	MERGE Customers AS TARGET
	USING CustomersCache AS SOURCE  ON(TARGET.Name = SOURCE.NAME)
	
	WHEN NOT MATCHED THEN
	INSERT (Name, City, [AnnualEarnings], ModifiedDate)
	VALUES (Name, City, [AnnualEarnings], ModifiedDate)
	
	WHEN MATCHED AND
	(
		TARGET.[AnnualEarnings] <> SOURCE.[AnnualEarnings]
	)
	THEN
	UPDATE
	SET TARGET.[AnnualEarnings]=SOURCE.[AnnualEarnings],
	TARGET.ModifiedDate =GETDATE()
	--Output changes to Auditing table showing before and after picture 
	OUTPUT $Action AS Action,
		   Inserted.Name,
		   Inserted.City,
		   Deleted.AnnualEarnings AS EarningsBefore,
		   Inserted.AnnualEarnings AS EarningsAfter ,
		   Deleted.ModifiedDate
		   
) AS D
WHERE Action = 'UPDATE';

and inserting into the Auditing table where the action is an UPDATE. The target table is still getting updated, but we are capturing all the updates that occur.

Quick and Simple Deletes
06 May 11 06:45 PM | MartinCroft | with no comments

Day 6, and a Friday, so keep it simple. There are occasions where you want to delete a huge table and truncation is just not an option for various reasons, but deleting from the table creates a huge transaction log. You don't want to over-engineer the solution, so here's a simple example that takes a copy of the table and deletes in chunks, of 1000 in this case, but you can use whatever suits.

--DELETES 
USE AdventureWorks
IF EXISTS ( SELECT 1 FROM sysobjects 
WHERE name ='SalesOrderDetailCopy') 
DROP TABLE SalesOrderDetailCopy
SELECT * INTO SalesOrderDetailCopy 
FROM Sales.SalesOrderDetail
WHILE 1=1
BEGIN
	DELETE TOP(1000)
	FROM SalesOrderDetailCopy  
	IF @@ROWCOUNT < 1000 BREAK
END 
SELECT COUNT(*) 
FROM SalesOrderDetailCopy
Traffic lights in Reporting services
05 May 11 07:07 PM | MartinCroft | with no comments

Day 5. A pretty easy thing to set up, but it can look very effective in a report. You can use gauges, but for some health-check reports I wanted to go for the old traffic light system, to help the management at work easily digest whether their SQL servers are critical (red), nearly there (amber) or running OK (green).

First things first, you need some traffic light images, preferably one with the red light on, one with the green on, etc; I think you get the idea (a link to the ones I used is at the bottom of the post).

First, in a new report, we need to add the images we are going to use. Right click and select Add Image.

[Screenshot: adding an image to the report]

Browse to the location of the images held on your computer and add them in one by one.

[Screenshot: images loaded into the report]

Here I have loaded 6 images as described above. I have called them the same as the colour of the traffic light to aid clarity.

Add a table into the report design pane and drag the red image into the first column. The image properties box will open up as shown below.

[Screenshot: image properties dialog]

Click on the expression button next to "Use this image" and paste in the following code:

[Screenshot: traffic light image expression]

In the query from the database I am passing through the colour of the alert in the Value column, so Red, Green or Amber. Whichever one it is, the image for that colour is displayed. Simple!
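The expression in the screenshot isn't reproduced in the text, but assuming the dataset field is called Value and the embedded images are named Red, Amber and Green as described above, it would be something along these lines:

=Switch(Fields!Value.Value = "Red", "Red",
        Fields!Value.Value = "Amber", "Amber",
        Fields!Value.Value = "Green", "Green")

In fact, because the images are named the same as the colours, =Fields!Value.Value on its own would probably also do the job.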

So you then get the following based on the values passed in.

[Screenshot: traffic light report output]

You can also expand on this with smaller images to give an instant view of what failed, using the same principle.

 

[Screenshot: mini alert images]

The images used can be accessed from here

SQL Server Top 10 Queries
04 May 11 08:29 PM | MartinCroft | with no comments

Day 4, and still rehashing some old posts I needed to migrate from another site. I was in the garden burning wood and posting the latest blog when the bench collapsed around me.

[Photo: the collapsed bench]

This is a list of the top 10 queries I use on a frequent basis that are relatively simple, either one-liners or next to one-liners. These are in reverse order from 10 down to my favourite, 1.

10 Job Information

You want to quickly access job information; you don't want to have to trawl through joining MSDB.dbo.sysjobs and MSDB.dbo.sysjobsteps, you just want a quick overview of which jobs are enabled/disabled, which job was updated yesterday, or when that job last ran or will next run.

--Name, enabled, description, owner, modified date, etc
EXEC MSDB.dbo.sp_help_job
--Or specify a job_id and get step details and schedule times
EXEC MSDB.dbo.sp_help_job '34DD4F82-423C-46E9-9E9A-BF7786'
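If you need the job_id to pass in, sysjobs (mentioned above) gives you it; the job name here is just an example:

SELECT job_id, name, enabled
FROM MSDB.dbo.sysjobs
WHERE name LIKE '%Backup%'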

9 Quick Search

Trying to work out where a particular column or table is used in a procedure? You can check the dependencies, or another option is to just quickly check syscomments for the text you are after. There are several procedures out on the web that add a high degree of searchability, but this is fast and easy.

USE NorthWind;
GO
SELECT
      OBJECT_NAME(id), text
FROM
      syscomments
WHERE
      text LIKE '%sales%'

8 Traces running

I have seen it before: you're running a trace, the trace hangs, is it still running? How can it be, when you have closed the crashed Profiler? Always safer to check. Or is Big Brother watching? One way to check! See BOL for the output of this function.

SELECT * FROM ::fn_trace_getinfo(default) 

7 Am I sysadmin

Maybe not useful for everyone, but I have had uses for this multiple times, especially with a variety of access accounts and SQL 2005/2008's ability to switch environments quickly (right click and change connection). I find it useful to know whether the account I am logged in with has sysadmin. As is a usual theme with SQL, there are various ways of finding this out; here is the one I use.

--System Admin: 1 yes, god-like powers; 0 no, I can't drop that database

SELECT is_srvrolemember('sysadmin') [Sysadmin]

6 System Uptime

Is the system running like a dog? People start asking when SQL was last rebooted (I.T.'s magic wand: let's reboot the server), or when SQL was last started. There are several ways of finding out this information (you can use top tip 5 as well!) but this is an easy approach: basically, see when SPID 1 logged in. If you use DATEDIFF you can get SQL to tell you how many days; how many DBAs can count? I even use SELECT 10+20 to work out calculations, it's far too slow opening up the calculator.

--Login time for SPID 1
SELECT DATEDIFF(dd, login_time, GETDATE()) Uptime, login_time
FROM master..sysprocesses
WHERE spid = 1
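On 2008 onwards the same information is also exposed in a DMV, which saves remembering the SPID 1 trick:

--SQL 2008 onwards
SELECT sqlserver_start_time,
       DATEDIFF(dd, sqlserver_start_time, GETDATE()) AS UptimeDays
FROM sys.dm_os_sys_info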

5 Errorlogs

Sometimes useful when investigating an issue. The SQL error logs can be accessed from Object Explorer, but it can be quicker this way, and especially if it's a long log it opens far faster.

--Read Error log, takes an integer value for the error log number

exec master..xp_readerrorlog 

Useful for finding out a variety of information quickly. The log gets recycled when the server restarts; the header shows where these logs are actually kept, the version of SQL and a variety of messages. If databases are in recovery it's a good place to look to get an idea of how long it's going to take.
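For example, to read the previous log rather than the current one, pass the log number:

--0 (or no parameter) is the current log, 1 the one before, and so on
exec master..xp_readerrorlog 1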

4 Statistics

Lifted directly from the pages of the SQL bible, or BOL as it's known. Things are running pants and you're trying to work out what has changed; how up to date are the statistics? This will tell you.

--STATS_DATE code from BOL
SELECT
'Index Name' = i.name, 'Statistics Date' = STATS_DATE(i.id, i.indid)
FROM
sysobjects o
JOIN
sysindexes i ON o.id = i.id

I tend to add ORDER BY 2 DESC onto the end, to order by the statistics that were updated most recently.
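In other words:

--Most recently updated statistics first
SELECT
'Index Name' = i.name, 'Statistics Date' = STATS_DATE(i.id, i.indid)
FROM
sysobjects o
JOIN
sysindexes i ON o.id = i.id
ORDER BY 2 DESC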

3 Disk Space

Another one of those problem-solving procedures. Used quite frequently on development systems, as you really should have no excuse for production systems running out of space, unless it is the log drive and something untoward has occurred. It's one of the first procedures run when a developer says my database restore won't work; 90GB doesn't fit in 45GB of free space, funnily enough!

--List disk information, useful for those users filling logs!

EXEC Master.dbo.xp_fixeddrives 

2 Short Cut Keys

Life-savers when reviewing production incidents. How often do you get "it's not working" with little or no information, across multiple systems where you're unsure of the exact schema? So it is useful to know the shortcut keys; these are some of the ones I use daily. You can also assign your own, and by highlighting SQL I can quickly pull up lots of useful info.

--Get the stored procedure text of system proc sp_who: just highlight sp_who & press Ctrl+F1

USE MASTER;
GO
sp_helptext sp_who

--Get information on tables: highlight Region & press Alt+F1
USE NorthWind
GO
Region

1 Quick Blocking

The piece of code that I probably use more than most; so simple, but so helpful in times of crisis. It was basically ripped off many years ago from the system stored procedure sp_blockcnt, which just tells you the number of blocked processes on the server. This query can be written from memory, with no need for fancy solutions on production boxes that you can't roll code out to.

--Blocking processes

SELECT 
* 
FROM 
MASTER.dbo.sysprocesses 
WHERE blocked <> 0
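And if all you want is the count that sp_blockcnt used to give, wrap it in a COUNT:

--Number of blocked processes right now
SELECT COUNT(*) AS BlockedProcesses
FROM MASTER.dbo.sysprocesses
WHERE blocked <> 0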
PowerShell- SQLps
03 May 11 07:42 PM | MartinCroft | with no comments

Day 3 of a blog a day. I was hoping for more time for this post, but we'll see how it goes. I looked through some of my Google Documents and found some interesting bits that I'd collected around PowerShell, so I thought I'd put them in a blog for future reference and have a play around with it.

And as this is SQL related, I'll stick to SQLps. It can be invoked from a command prompt or, for those not of an older generation, with the Run command. SQLPS runs a slimmed-down PowerShell session with the SQL Server PowerShell provider and cmdlets (more on these later) loaded and registered.

[Screenshot: SQLps console]

Console window after running SQLps and doing a Dir.

PowerShell can be used to connect to SQL2008, SQL 2005 from SP2 and SQL 2000 from SP4

The SQL Server 2008 version of SQL Server Management Studio supports starting PowerShell sessions from the Object Explorer tree.

[Screenshot: PowerShell option in Management Studio's Object Explorer]

Right click a database or table, or at any level in Object Explorer, and select PowerShell. In the shot above this opens a command-line PowerShell session within that database's tables, with the ability to change directories for a variety of objects.

Powershell scripts can also be executed through SQL Agent in SQL 2008.

CmdLets

Cmdlets are single-function commands that can be quite powerful; however, security is locked down out of the box, and ironically unlocking it uses a cmdlet.

Security

The OS won't run PowerShell scripts by default, so we need to adjust the execution policy to allow PowerShell scripts to run. You set the execution policy through a cmdlet. To get the current execution policy, run

get-ExecutionPolicy

running the following will give help on the cmdlet

set-ExecutionPolicy -?

Short cut keys

It might be a good point to mention a few shortcut keys. If you typed in

set-ex

and pressed Tab, it would auto-complete the word. If there were multiple possible combinations you could cycle through all the possible options.

F7 – shows the history of the last 50 commands in a popup window, and you can cursor through the commands run.

F8 – more functionality than arrow up; you can search for a specific command already run, e.g. type d then press F8 and it brings up the last command starting with d; to get the next d command press F8 again.

F9 – allows entry of a command number.

There are multiple execution policies, the default being Restricted, which permits individual commands but won't run scripts. More details can be found by running set-ExecutionPolicy -?

Setting the execution policy to RemoteSigned will allow scripts to be executed.

set-ExecutionPolicy  RemoteSigned

You might come across the following issue if running Vista / Windows 7:

[Screenshot: access denied error from set-ExecutionPolicy]

If this is the case, run SQLps as administrator: Start, Run, type SQLps, and when it comes up right click and Run As Administrator. Once logged in, run the set-ExecutionPolicy, and when you open it back up next time it should be set.

This was a brief foray into PowerShell; I'll look to go more in depth in some future posts, I do have 31 days in May. This task is looking slightly ugly, like the looks the wife has been giving me.

Maximum date in a row (not a column)
02 May 11 03:04 PM | MartinCroft | 2 comment(s)

Day 2 of a post every day! I was asked recently how you could get the latest/earliest date field across a row of data. Not the most usual request; it tends to be the maximum value in a column, and in that case it's just a straightforward GROUP BY and MAX. However,

[Screenshot: example data for the maximum value in a row]

given the example above, the user wanted to take the minimum date of birth for each driver. The following solution could be used.

USE tempdb
GO
IF  EXISTS (SELECT * FROM sys.objects 
WHERE object_id = OBJECT_ID(N'[dbo].[DriverDetails]') AND type in (N'U'))
DROP TABLE [dbo].[DriverDetails]
GO
CREATE TABLE DriverDetails 
( DriverID INT IDENTITY(1,1),
  DriverName VARCHAR(20),
  Driver1DOB DATETIME,
  Driver2DOB DATETIME,
  Driver3DOB DATETIME
)  
INSERT INTO DriverDetails VALUES 
('Bob','01 May 1986','01 June 1986','01 April 1986'), 
('Mary','01 May 1976','01 June 1946','01 April 1986'), 
('John','01 May 1966','01 June 1956','01 April 1946'), 
('Jenny','01 May 1956','01 June 1966','01 April 1986') 
SELECT
      *
FROM
      DriverDetails a
JOIN  
(
--Unpivot the data and then do a MAX, MIN ON Driver
SELECT DriverID,MIN(DOB)AS MinDOB,MAX(DOB) AS MaxDOB
FROM 
   (SELECT DriverID,Driver1DOB,Driver2DOB,Driver3DOB
   FROM DriverDetails) p
UNPIVOT
   (DOB FOR Employee IN 
      (Driver1DOB,Driver2DOB,Driver3DOB)
)AS unpvt
GROUP BY DriverID
) b
ON a.DriverID =b.DriverID;
Blog a Day May - Productivity
01 May 11 05:19 PM | MartinCroft | with no comments

Productivity 

I have decided, after not managing to blog even once this year, to blog each day in May. The simple premise is that I have got to blog at least once daily (can't see me doing it more than once, but who knows) and it has to be useful content; maybe the first one can be excused till I get back into the swing of things.

In an attempt to get more productive I have come across several useful tools. Dropbox: you can sign up for a free account and get 2GB of free space. This lets you add a folder to every machine you use that keeps all items in that folder in sync across all your machines and phones and in the cloud. So if I drop in a SQL script or Word document it's synced between all my machines, and I can also access it through my Android phone (other smartphones are available!). If you get 8 friends to sign up (it's free) you can get another 250MB per friend, up to another 8GB; now it becomes useful, unless it's blocked at work!

There is an action-management system called Getting Things Done (GTD). The basic principle of the Getting Things Done method rests on the idea that a person needs to move tasks out of the mind by recording them externally. That way, the mind is freed from the job of remembering everything that needs to be done, and can concentrate on actually performing those tasks. You set next physical actions, which is where Springpad comes in; it can be accessed through the web or an Android app, and possibly the other one as well!

One thing I had seen all over the show was Delicious, but I had never actually looked into what it is. Well, it's a pretty cool tool for recording bookmarks and finding new and interesting resources. You can add tags to make finding bookmarks easier, and you can add it to the browser toolbar and click "Bookmark on Delicious" as you're about to leave the office and pick it up again at home. Another useful one (Android only) is Chrome to Phone: install it into Chrome and with a single click whatever webpage is on display is sent directly to your phone, great as you're dashing for the train.

Anyway, I should at least post something SQL related, so here it is.

Retrieve the nth smallest or greatest number in a table

DECLARE @item TABLE (ItemID INT IDENTITY(1,1), n INT)
--Some randomish numbers
INSERT INTO @item (n) VALUES (1),(2),(4),(8),(56),(128),(256)
--(n - 1) will return the nth smallest number
--changing the < to > will get the nth largest number
SELECT n FROM @item a
WHERE (5 - 1) = (SELECT COUNT(*) FROM @item b WHERE b.n < a.n)
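An alternative on 2005 onwards is ROW_NUMBER, flipping the ORDER BY to get the nth largest instead:

--Same thing with ROW_NUMBER: the 5th smallest value of n
SELECT n
FROM (SELECT n, ROW_NUMBER() OVER (ORDER BY n) AS rn FROM @item) x
WHERE rn = 5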

I am sure I'll be a dab hand at this by the end of May, but I had so much fun with this first post: Chrome does not handle the formatting, IE9 kept crashing as I was trying to paste in the SQL code and the formatting was horrendous, then I tried Firefox and eventually managed it.

SQL 2008 top 5 things for a successful Upgrade and Migration
31 October 10 07:21 AM | MartinCroft | 1 comment(s)

We have completed several upgrades from SQL 2000 and SQL 2005 to SQL 2008, and I thought I would blog about the top 5 things for a successful upgrade. There are lots of things that make an upgrade a success; these are just some of them.

  1. Get a great project team together. This might seem obvious, but it can make or break the project. Get expertise in all the areas the project covers: SQL, SAN storage, VMware, infrastructure, SSIS, depending on what the project involves. Have people available on phone support if possible. Get a good crossover of skill sets.
  2. It goes without saying: TEST. If you're lucky enough to have like-for-like infrastructures then it's easier, but on high-usage OLTP/OLAP systems which serve lots of users and lots of apps this is not always easy. Test to the maximum that's achievable; there are usually business constraints and resource constraints. If it's not tested, it needs to be raised as a risk. You could blog entirely on testing.
  3. Document everything. I have a subset of documents I believe are a must for an upgrade/migration. In no particular order, as it would probably take me far too long to decide which document matters most:
    • External Influences Document - an important document that details all aspects of the SQL Server and all external influences, such as applications that connect, services, Windows applications, web.config files, URLs for web applications, usage, user accounts for apps, external SQL Servers that connect through linked servers, DTS packages, SSIS packages, other non-SQL connectivity etc., and any outputs to other SQL Servers, application servers, web servers, file structures and FTP. This document should contain a diagram of all the systems that are touched and a high level of detail around the connectivity, plus SQL Server access to the server: groups, users and permissions. During upgrades the DBA team are often the main players while the individual teams who develop the applications on the server take a back seat, and this is the document you need their input on, so that there are no surprises down the line. It will be used for configuring the test environments, knowing the scope of the project and starting to build up requirements for the go-live steps.

    • Go-live steps spreadsheet - the list of actions for the migration, numbered, with the server the action is carried out on and what the action is, plus references to scripts, the owner, who carried out the step, the time, and who checked the step (dependent on the project!).

    • Lessons Learned Document - this is very useful and should not be seen as a witch hunt at the end of a project, but more as input to the next one: what went wrong, how can we avoid it next time, how can we be more efficient next time. We reviewed Lessons Learned documentation from previous projects when creating the go-live steps, to give us the best information to succeed in the project.
    • Risk and Issues logs, Highlight reports - it goes without saying we need to keep the focus of the project with the business, and cover ourselves when the business puts down constraints that don't allow you to do everything you would in a perfect world.
  4. Learn from experience - I mentioned the Lessons Learned document; learn from it. Over the last several months we upgraded 5 core systems from SQL 2000 & SQL 2005. Each one presented different challenges, with different complexity, but from each one we learned and took that learning into the next.
  5. Use the technology available to you once upgraded: DMVs, the Data Collector, Resource Governor. I'll be blogging a set of DBA procedures that utilise the DMVs to show what's running, who's connected, which indexes are being used, which aren't and how much space they would save, and which procedures are executed, how frequently, with average times etc.
There are many other elements that contribute to a successful upgrade, but I believe these are some of the more important ones.