February 2007 - Posts

While writing some notification code around the conflict detection in merge replication I hit a problem: the sp_send_dbmail call I was executing was not sending out the email even though I knew Database Mail was working. Basically, in the merge conflict-resolution stored procedure, if you decide to choose the subscriber as the winner and the subscriber has removed the row, then for some reason the agent rolls back the stored procedure's transaction (enough on that, I shall leave merge conflicts for another day).

Back to the task in hand; the SQL below will only send the Test 1 and Test 3 messages, even though all three report that the mail was queued.

So, be careful in those maintenance routines - you may never get the email if it is sent within a transaction and that transaction is rolled back. It's a good and bad thing - mostly good, I guess, because it means the sending of emails really is transactionally aware; this is probably down to Database Mail essentially being a Service Broker queue.

A real-life example: if you've used sp_send_dbmail in a trigger to notify you of an RI problem or some other event, and the trigger rolls back, then you won't get the email.
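A minimal sketch of that trigger scenario (the table, trigger and column names are made up for illustration):

```sql
CREATE TRIGGER trg_check_qty ON dbo.orders AFTER INSERT
AS
IF EXISTS ( SELECT * FROM inserted WHERE qty < 0 )
BEGIN
    EXEC msdb.dbo.sp_send_dbmail
        @profile_name = 'REPLICATION',
        @recipients   = 'tonyrogerson@torver.net',
        @subject      = 'Bad qty on orders';

    --    Rolling back the trigger rolls back the queued email too,
    --    so the notification never actually leaves the server
    ROLLBACK TRAN;
END
```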

EXEC msdb.dbo.sp_send_dbmail
    @profile_name = 'REPLICATION',
    @recipients = 'tonyrogerson@torver.net',
    @body = 'The stored procedure finished successfully.',
    @subject = 'Test 1';

BEGIN TRAN

EXEC msdb.dbo.sp_send_dbmail
    @profile_name = 'REPLICATION',
    @recipients = 'tonyrogerson@torver.net',
    @body = 'The stored procedure finished successfully.',
    @subject = 'Test 2';

ROLLBACK TRAN    --    Test 2 is un-queued by the rollback and never sent

EXEC msdb.dbo.sp_send_dbmail
    @profile_name = 'REPLICATION',
    @recipients = 'tonyrogerson@torver.net',
    @body = 'The stored procedure finished successfully.',
    @subject = 'Test 3';

If I have time over the weekend I'll write up the workaround; basically it's to make sure the email is sent outside of the transaction - that means calling an external process (a CLR procedure, for instance).

Hi all,

I know it's short notice, but I've done a few of these events now, all with exceptional feedback, and given that Kimberly and Paul are both in Europe and could make 6th March, I just had to book them (Reading, UK)!

This only got confirmed over the weekend so I've not had a chance to get the registration process ready.

Put the date in your diary and, if you are interested, book through here: http://sqlserverfaq.com/controls/event/eventbooking.aspx. The cost for this full-day event with two of the most well-known and technical people in the SQL Server world is £199 ex VAT, which includes refreshments and a light lunch.

Crucial Database Maintenance Techniques for Databases of All Sizes

Kimberly L. Tripp;
Founder, SQLskills.com

Paul S. Randal;
Principal Lead Program Manager, Microsoft Corp.

Often databases are designed to handle complex business logic and to hold critical data. Once built, they're put into production and often perform well... for a while. When the load increases and the data volume becomes larger and larger, performance degrades and problems ensue. What can be done to ensure smooth operations? How can you minimize data loss in the event of a disaster? What are the most crucial maintenance operations to automate, how do they work and why are they important, what are the best practices for running them and how can you ensure they're working correctly? Maintenance is the key to having a system that's healthy, recoverable and performant. There are lots of areas to cover, but here are the most important:

                Database files (shrink, grow, virtual log files, log size/management)

                Consistency checks (CHECKDB, which includes the others, v. CHECKTABLE/CHECKALLOC/CHECKCATALOG - why might you want to break down your checks?)

                Fragmentation (defrags v. rebuilds)

                Statistics (create and update)

                Backup/Restore (recovery models, options/granularity, strategies)

This workshop targets best practices and goals for Database Administrators who are wearing many hats, as well as full-time System Administrators. The workshop will vary from 200-400 level, covering ALL of the key concepts: what each feature is, how it works, why it's important and best practices in how and when to run it.

Why should you attend this workshop? Learn from industry expert Kimberly L. Tripp as well as a key source on the SQL Server Development Team, Paul S. Randal. Paul is the former Storage Engine Access Methods Team Lead for SQL Server 2005 and the developer who wrote DBCC SHOWCONTIG (2000), DBCC INDEXDEFRAG (2000) and DBCC CHECKDB/repair for SQL Server 2005. Currently, Paul runs the Program Management team that controls the Core Storage Engine, but part of his role includes directly interfacing with customers to help understand their needs and challenges. He's relatively new to the speaking circuit but has had rave reviews at TechEd (seven sessions in 2006) and ITForum! If you're interested in getting more details from and about Paul, check out his content on the SQL Server Storage Engine Team blog: http://blogs.msdn.com/sqlserverstorageengine/default.aspx.


Part I: Database Creation/Space Management Best Practices

Part II: Table Structures, Fragmentation and Statistics

Part III: Designing a Backup Strategy for Effective Disaster Recovery

Part IV: Secrets of Fast Detection and Recovery from Database Corruptions


Paul S. Randal
Principal Lead Program Manager, Microsoft Corp.

Paul started in the industry in 1994 working for DEC on the VMS file system and check/repair tools. In 1999 he moved to Microsoft to work on SQL Server, specifically on DBCC.

For SQL Server 2000, he concentrated on index fragmentation (writing DBCC INDEXDEFRAG, DBCC SHOWCONTIG) and various algorithms in DBCC CHECKDB. During SQL Server 2005 development he was the lead developer/manager of one of the core dev teams in the Storage Engine, responsible for data access and storage (DBCC, allocation, indexes & heaps, pages/records, text/LOB storage, snapshot isolation, etc). He also spent several years rewriting DBCC CHECKDB/repair.

Since SQL Server 2005 shipped, Paul has managed the Program Management team for the core Storage Engine to become more focused on customer/partner engagement and feature set definition. Paul regularly presents at conferences around the world on high-availability, disaster recovery and Storage Engine internals. His popular blog is at http://blogs.msdn.com/sqlserverstorageengine.

Kimberly L. Tripp
Founder, SQLskills.com

Kimberly L. Tripp is a SQL Server MVP and a Microsoft Regional Director and has worked with SQL Server since 1990. Since 1995, Kimberly has worked as a Speaker, Writer, Trainer and Consultant for her own company SYSolutions, Inc. (www.SQLskills.com). In consulting, Kimberly tackles real-world problems focused on performance, tuning, system availability and minimizing data loss. In writing, Kimberly contributes regularly to MSDN and SQL Server Magazine, and co-authored the MSPress title SQL Server 2000 High Availability. In speaking, feedback on Kimberly's sessions usually notes her gift for discussing complex technical topics with ease and humor. Prior to starting SYSolutions, Kimberly held positions at Microsoft including Subject Matter Expert/Trainer for Microsoft University and Technical Writer for the SQL Server Development Team. You can get more details about Kimberly on SQLskills.com and her blog: www.SQLskills.com/blogs/Kimberly.

Here you go Dominic, for SQL Server 2005 this will work.

The big problem we have with SQL Server of old is that it's been difficult to kick people out and keep them kicked out; even using the "Single User Mode" stuff it's difficult, because often somebody gets in before you can take control of the database.

There is another way; there is a TSQL statement we can use ->

DENY CONNECT SQL TO [someuser]            --    Prevents somebody logging on

GRANT CONNECT SQL TO [someuser]           --    Allows them to log on

The DENY CONNECT only prevents somebody from connecting; it does not affect current connections. So if your user is already logged on then you need to do something else (ring them first, ask nicely and then KILL the connection; or, if you need to get to the pub, KILL first and take the earache later).

The procedure below automates this process for you; it's pretty basic and I could do a lot more to jazz it up, but it's a start...

CREATE PROC control_user_access
      @op varchar(5),
      @db sysname
AS
/*
      Author: Tony Rogerson, 2007

      Feel free to use at will but don't rip it off as your own!
*/
IF @op NOT IN ( 'GRANT', 'DENY' )
      RAISERROR( '@op takes values GRANT or DENY', 16, 1 )
ELSE
BEGIN
      --    Prevent Logins from getting into SQL Server.
      DECLARE @sql varchar(max)
      SET @sql = '
      declare logins_cur cursor global for
            select sp.name
            from [' + @db + '].sys.database_principals du
                  inner join master.sys.server_principals sp on sp.sid = du.sid
            where du.principal_id > 1'

      EXEC( @sql )      --    GLOBAL cursor so it is still visible after the EXEC finishes

      DECLARE @login_name sysname

      OPEN logins_cur

      FETCH NEXT FROM logins_cur INTO @login_name

      WHILE @@FETCH_STATUS = 0
      BEGIN
            SET @sql = @op + ' CONNECT SQL TO [' + @login_name + ']'
            PRINT @sql
            EXEC( @sql )

            FETCH NEXT FROM logins_cur INTO @login_name
      END

      CLOSE logins_cur
      DEALLOCATE logins_cur

      IF @op = 'DENY'
      BEGIN
            --    Kill the sessions already connected to the database
            DECLARE sessions_cur CURSOR FOR
                  SELECT spid
                  FROM master..sysprocesses     --    Have to use this because the DMVs don't give me the dbid
                  WHERE dbid = DB_ID( @db )

            OPEN sessions_cur

            DECLARE @session_id int

            FETCH NEXT FROM sessions_cur INTO @session_id

            WHILE @@FETCH_STATUS = 0
            BEGIN
                  SET @sql = 'KILL ' + CAST( @session_id AS varchar(10) )
                  PRINT @sql
                  EXEC( @sql )

                  FETCH NEXT FROM sessions_cur INTO @session_id
            END

            CLOSE sessions_cur
            DEALLOCATE sessions_cur
      END
END

To use it, just pass @op as DENY or GRANT depending on what you are doing and then specify the database you are going to work on.
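For example (the database name here is made up):

```sql
--    Lock everyone out of the database and kill their current sessions...
EXEC control_user_access @op = 'DENY', @db = 'SomeDatabase';

--    ...do your maintenance, then let them back in
EXEC control_user_access @op = 'GRANT', @db = 'SomeDatabase';
```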

There are other ways of doing this but I wanted to throw this one into the mill.


We provide turnkey solutions to the mass alarm monitoring industry. Our products can run on one of two database servers, Informix and SQL Server. The majority of our customers use Informix, and we instigated a project to move all existing SQL Server customers onto Informix. However, one of our customers has a corporate policy which insists on SQL Server, and they want to move from Informix to SQL Server. Although we have a SQL Server schema for our database, it is quite a bit out of date and needs to be brought in line with the Informix schema. Although the primary function would be to bring the SQL Server database into line with the Informix schema, we could see the role developing to include the following responsibilities.

                Develop and Review Stored Procedures, Views, Schema Changes and Referential Integrity

                Manage Database Upgrade Scripts

                Liaise with Informix DBA and gain knowledge of the Informix RDBMS.

                Administer internal Development\QA\Support Databases

                Review and Improve customer administration tasks.

If you are interested then please send your CV to JN32154987@monitorsoft.com

AGENCIES : Please note that we operate a preferred supplier list for agencies, and will not deal with agencies who are not on this list. Any unsolicited emails will be deleted, and will result in the agency in question being added to our email blacklist.

I've been spending a couple of weeks on a file migration into SQL Server where the client's data is stored in files written using VB's GET and PUT (fixed length records).

I was at a loss trying to import the Currency data type until I found this really useful piece of information that shows how Currency is actually stored -> http://articles.techrepublic.com.com/5100-3513_11-5885671.html (thanks to Peter Aitken).

Anyway, the crunch is that Currency is stored using 8 bytes as a BIGINT in our terms but scaled by 4 decimal places.

So, all you do is import the data as BIGINT, convert it to DECIMAL( 28, 4 ) and divide the number by 10,000.
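As a quick sketch of the conversion (the variable here stands in for a value imported from the VB file as BIGINT):

```sql
--    A VB Currency value loaded raw as a bigint; the last 4 digits
--    are the implied decimal places
DECLARE @raw_amount bigint = 1234567891234;

--    Scale down by 10^4 to recover the real value
SELECT CAST( @raw_amount AS decimal(28,4) ) / 10000 AS currency_value;
--    gives 123456789.1234
```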

I know this isn't much detail, but my mate Trevor Dwyer has commented that I haven't posted since the 11th; the M25 is taking my time at the moment, so I will be back to normal output in the next few weeks.