SQL Server, like any other software, unfortunately has bugs, and thankfully these are often resolved in subsequent Service Packs & hotfixes.

One such bug was the incorrect growth of the log file: if the growth increment was any multiple of 4GB, the file didn't grow by the expected growth size.

 

For those who didn't know:

Log growths are split into a number of equally sized VLFs, with that number of VLFs dependent on the overall growth increment size.

For a growth increment of less than 64MB, the file grows in 4 equally sized VLFs; for a growth increment between 64MB and 1GB, it grows in 8 VLFs; and for growths greater than 1GB, it grows in 16 VLFs.
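As a quick illustration of those rules (the database and log file names here are hypothetical, not from the post), a single 512MB growth falls in the 64MB-1GB band, so it should be split into 8 VLFs of 64MB each:

```sql
-- Hypothetical database [DemoDB] with log file [DemoDB_log]
-- DBCC LOGINFO returns one row per VLF, so the row count before/after
-- the growth shows how many VLFs that growth added
DBCC LOGINFO ('DemoDB');

-- Grow the log by 512MB in one step (64MB-1GB band => 8 new VLFs expected)
ALTER DATABASE [DemoDB]
    MODIFY FILE (NAME = N'DemoDB_log', SIZE = 1536MB);   -- assuming it was 1024MB

DBCC LOGINFO ('DemoDB');   -- expect 8 additional rows, each VLF ~64MB
```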


Paul Randal blogged about this bug way back in May 2010 - that's when I learned about the 4GB increment issue

http://www.sqlskills.com/BLOGS/PAUL/post/Bug-log-file-growth-broken-for-multiples-of-4GB.aspx

I recommend that you also read this entry on VLFs by Kimberly: http://www.sqlskills.com/BLOGS/KIMBERLY/post/Transaction-Log-VLFs-too-many-or-too-few.aspx

Subsequently my growths have been primarily 2GB growths, giving 16 VLFs of 128MB each (once this hotfix is in place I can perhaps use the 4GB interval, as I want larger VLFs)
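For anyone wanting to do the same, the growth increment is set per log file (the names below are placeholders); with a fixed 2048MB growth, each auto-grow event then adds 16 VLFs of 128MB:

```sql
-- Placeholder names: database [DemoDB], log file [DemoDB_log]
-- A fixed 2GB increment sidesteps the 4GB-multiple bug on unpatched builds
ALTER DATABASE [DemoDB]
    MODIFY FILE (NAME = N'DemoDB_log', FILEGROWTH = 2048MB);
```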

Finally, a Connect item can help highlight such issues and ultimately even get us a fix - I try to review these regularly and vote for those I agree should be resolved

http://connect.microsoft.com/SQLServer/feedback/details/481594/log-growth-not-working-properly-with-specific-growth-sizes-vlfs-also-not-created-appropriately


Well, for SQL Server 2008 R2 this is now addressed with SP1 CU4 or CU11; details are documented in this KB: http://support.microsoft.com/kb/2633151

It doesn't seem that there is a fix for SQL Server 2008 yet - if one does become known, I will blog those details also

Posted by NeilHambly | with no comments

Things are getting really busy for me on the SQL events front already (I have plans to do a lot of SQL events this year, to try to match my huge 2011 tally)

I have to try to find time to plan and organize all the various tasks needed just to fit it all in (need to get my travel plans sorted - I will have to try using TripIt, as Jen Stirrup has recommended)

and I still need to find a few evenings to run some local User-Group events in London over the next few months

I'm looking for companies to host our user-group evenings - so would you like to have a SQL user-group event in London @ your own location? (Ideal for your SQL team to attend)

It just needs a room capable of seating 30-100 people for ~3 hours (6-9PM) during a mid-week evening (Tuesday - Thursday) and should not be difficult for attendees to get to .. close to a tube station (within 5-10 minutes) is always helpful

We'll provide the speakers (you can help choose the topics for the evening) and we also provide the dinner & refreshments + SWAG for the attendees - if interested, please email me @ Neil.Hambly@Hotmail.co.uk

 

So with this many events already in my calendar over the next few months, it is going to be a very exciting time, but a tad exhausting as well

I will get to see so many of my SQL Friends over these events and undoubtedly meet a few new ones

Feb 2012

13th   Online MCM Overview event

Robert L Davis (MCM Program Lead) hosts this 90 minute session on the MCM cert

23rd   SQL UG (North Acton, London)

Presenting on Extended Events & Hosting the Evening


 March 2012

7th  Online "Virtual launch SQL Server 2012" 

A whole day of events (30) .. should be a real fun day (I have it booked off so I can watch a lot of this without distractions)

16th- 17th  SQLSat #115 Lisbon, Portugal

Confirmed Speaker "Compression" + a LT on WMI

21st @ 11AM GMT 24 Hours of PASS Virtual event  http://www.sqlpass.org/24hours/spring2012/

Confirmed Speaker "SQL Server 2012 Memory Management" 

22nd -24th  SQLSat #105 Dublin, Ireland

Confirmed Speaker "Compression"

29th- 31st  SQLBits X London

Confirmed Speaker "Compression" + Volunteer + attending Adam's Pre-Con (excellent)

April 2012

19th SQL Server User Group (London)  @ IMGROUP EC4Y 8DP

Another of our user-group events; this one I'm just hosting, with Tony Rogerson MVP speaking and another session from IMGROUP on a Big Data topic

http://www.sqlserverfaq.com/events/384/London-UG-meeting-on-BI-topics-With-Tony-Rogerson-MVP-and-a-session-on-BIG-Data-from-IMGROUP.aspx

20th - 21st  SQLSat #107 Houston,  USA

Still awaiting outcome of my submissions & Idera ACE 2012 competition

May 2012

9TH - 12TH  SQLRally  Dallas, USA

Confirmed Speaker "Compression"

14th - 18th SQLSkills IE2 Course London

Attending the week 2 (IE2) course, having done week 1 (IE1) in July 2011

21st - 25th SQL Relay 2012 

Our UK week-long series of full-day SQL events (a roadshow with a SQL 2012 focus). I'll be @ the London event @ least, perhaps others as well

24th - 26th SQLDay Wroclaw, Poland

Confirmed Speaker with the "Why are we Waiting.." session that I did @ SQLPASS Summit 2011

I'm only there on the Saturday 26th, as I'm in the UK for the Thursday/Friday sessions

Have more planned for June onwards... but that is a story for another day

Posted by NeilHambly | with no comments

 

As some of you probably know - I speak @ a lot of SQL conferences, but getting session submissions picked can be tough @ times (especially for the larger events)

Sometimes there are community voting rounds .. for a few precious slots @ these larger events

 

I now have one of those voting rounds for my Pre-Con submission @ the next SQLRally event, May 8th - May 11th in Dallas, TX (this is one of the largest SQL conferences)

My Pre-Con submission "Performance Trace & Event Collection in SQL Server" was selected for the voting round (that in itself is a big deal)

 

So please can I kindly ask you to spare just a minute of your time & vote for my session

(it will be tough competition to secure one of the spots)

http://www.sqlpass.org/sqlrally/2012/dallas/CommunityChoice.aspx

Just log in (free to register if not already a member) and select the session(s) you want to vote for - the voting ends 9th Feb 5PM (CST)

 

For those interested, my upcoming confirmed events I'll be presenting @ (some outside of the UK) are as follows

  • London 23rd FEB @ UK SQL User-Group (North Acton) ..Presenting on Extended Events (replacement for Profiler)

 

"Effective Data Management Using Data Compression" - a session I have presented before @ the SQL-in-the-City (Red Gate) conference

  • Lisbon (Portugal) Mar 17th @ SQLSaturday #115
  • Dublin (Ireland) Mar 24th @ SQLSaturday #105 - Also Ireland SQL 2012 Technical launch
  • London April 1st @ SQLBits X - Also UK SQL 2012 Technical launch

"Why are we Waiting.." - a session I did @ SQLPASS (it had 310 attendees, the 9th highest session attendance of the conference from 160 sessions)

  • Wroclaw (Poland) May 26th @ SQLDay

THANK YOU for your support

Posted by NeilHambly | with no comments

Following on from my posting on cardinality earlier this month, I will likely want a way to quickly evaluate the stats history for a database

http://sqlblogcasts.com/blogs/neilhambly/archive/2012/01/14/show-me-some-cardinality.aspx

I can of course automate the process with something like the following script, which can be run against a database to provide a summary of the statistics for that database

Have a Happy Stats day

SET NOCOUNT ON

CREATE TABLE #STATS
(
 TABLE_CATALOG       VARCHAR(128)
,TABLE_SCHEMA        VARCHAR(128)
,TABLE_NAME          VARCHAR(128)
,STATISTICS_CATALOG  VARCHAR(128)
,STATISTICS_SCHEMA   VARCHAR(128)
,STATISTICS_NAME     VARCHAR(128)
,STATISTICS_TYPE     INT
,COLUMN_NAME         VARCHAR(128)
,COLUMN_GUID         VARCHAR(128)
,COLUMN_PROPID       VARCHAR(128)
,ORDINAL_POSITION    INT
,SAMPLE_PCT          INT
,LAST_UPDATE_TIME    DATETIME
,NO_OF_RANGES        INT
,COLUMN_CARDINALITY  INT
,TUPLE_CARDINALITY   INT
,TABLE_CARDINALITY   INT
,AVG_COLUMN_LENGTH   INT
)
GO

DECLARE @DBName VARCHAR(64), @SchemaName VARCHAR(64), @TableName VARCHAR(64), @IndexName VARCHAR(64)

DECLARE curStats CURSOR FAST_FORWARD READ_ONLY FOR
       SELECT
              QUOTENAME(DB_NAME()) AS DBName,
              QUOTENAME(Schema_name(SO.schema_id)) AS SchemaName,
              QUOTENAME(OBJECT_NAME(SO.OBJECT_ID)) AS TableName,
              SI.name AS IndexName
       FROM sys.indexes SI
       INNER JOIN sys.objects SO ON SI.object_id = SO.object_id
       WHERE SI.name IS NOT NULL
       AND Schema_name(SO.schema_id) NOT LIKE 'sys'
       AND SO.type = 'U'

OPEN curStats
FETCH NEXT FROM curStats INTO @DBName, @SchemaName, @TableName, @IndexName
WHILE @@FETCH_STATUS = 0
BEGIN
       DECLARE @SQLCmd NVARCHAR(2000)

       SELECT @SQLCmd =
       'INSERT #STATS EXEC (' + CHAR(39) + 'DBCC SHOW_STATISTICS ('
       + CHAR(39) + CHAR(39) + @DBname + '.' + @SchemaName + '.' + @TableName
       + CHAR(39) + CHAR(39) + ',' + CHAR(39) + CHAR(39) + @IndexName + CHAR(39)
       + CHAR(39) + ') WITH STAT_HEADER JOIN DENSITY_VECTOR' + CHAR(39) + ');'

       --SELECT @SQLCmd
       EXEC (@SQLCmd)

       FETCH NEXT FROM curStats INTO @DBName, @SchemaName, @TableName, @IndexName
END

CLOSE curStats
DEALLOCATE curStats
GO

SELECT *
FROM #STATS
WHERE SAMPLE_PCT <> 100 OR LAST_UPDATE_TIME < GETDATE()-1
ORDER BY TABLE_SCHEMA ASC, TABLE_NAME ASC, STATISTICS_NAME ASC, ORDINAL_POSITION ASC

SET NOCOUNT OFF

Posted by NeilHambly | with no comments

I know I'm a Geek @ heart and I do get that "Ooooh" moment when I find something different in a release (and I haven't just read it in a blog, book or another source)

One such moment occurred last week. I'm in a study group for the MCM on Yammer ( https://www.yammer.com/sqlmcmstudygroup )

One of our members, Martin Carins (who is a newly minted MCM @ that), posted a question that got a lot of responses; I thankfully even managed to answer the question correctly. It was to do with unshrinkable transaction logs .. the answer is available in the thread if you're interested

Anyway, my post here is that we used fn_dblog to review the contents of the transaction log; I tried the example script that Martin posted and ran it on SQL 2008 R2 and also my 2012 RC0 builds

The results clearly showed some speed improvements when run on 2012 RC0 (but this wasn't the main highlight for me .. that was the output of fn_dblog)

The screen shot below shows the exact same query that was run on the 2 builds .. both created a new database and inserted some dummy data to do the backup & transaction log shrink tests

Results on the LEFT are for the 2012 RC0 build (11.0.1750) and the results on the RIGHT are for the 2008 R2 Build (10.50.2500)

Notice we have a new column, RecoveryUnitID (I haven't explored this one yet .. but will do so)

and also the query that provides the breakdown of operations has many more results (again, interesting to investigate @ a later date when I have some spare time)

And finally, the 2012 RC0 has fewer used pages (216 vs 256) and fewer active VLFs - so again this looks worthy of further investigation

Below are segments of the script that provided these results

DBCC SHRINKFILE(ActiveVLFDemo_Log,1,TRUNCATEONLY)
DBCC LOGINFO

SELECT Operation,COUNT(*) FROM fn_dblog(NULL,NULL)
GROUP BY Operation HAVING COUNT(*) > 1
ORDER BY COUNT(*) DESC

Posted by NeilHambly | with no comments

In the world of data, cardinality is one issue you will already have encountered or will do @ some stage; having up-to-date statistical information is vital for the query optimizer to choose a good query plan

So @ times you may have to explore the Tuple Cardinality & Table Cardinality values

In the screen shot below, the top set of values is after a statistics update has been run; the set below it was the previous state

UPDATE STATISTICS schemaname.tablename WITH FULLSCAN, INDEX

  • TUPLE_CARDINALITY = TABLE_CARDINALITY for the 1st row (Note: this is a clustered Primary Key index)
  • The 2nd - 4th rows are for another non-clustered index, which has 2 columns
      • The 2nd & 3rd rows represent these 2 columns (notice the ORDINAL_POSITION)
      • The 4th row is the non-clustered index's internal CI key
  • The TUPLE_CARDINALITY values were considerably different before/after the statistics update
      • The 2nd row has TUPLE_CARDINALITY = 33
          • Matches the number of distinct values for the leading key column of the NC index
      • The 3rd row has TUPLE_CARDINALITY = 1058
          • Matches the number of distinct values for both columns in the NC index
      • The 4th row has TUPLE_CARDINALITY = 30496916
          • Previously this was 229923, just 0.75% of the table rows ** That was my red flag **
          • After the update it now matches the number of distinct values for the CI key

So clearly something was not as accurate as it could be, and the clustered & non-clustered index tuples seemed to be a long way from the values I expected them to be

The following code was used to capture these values

DROP TABLE #STATS
GO

CREATE TABLE #STATS
(
 TABLE_CATALOG       VARCHAR(128)
,TABLE_SCHEMA        VARCHAR(128)
,TABLE_NAME          VARCHAR(128)
,STATISTICS_CATALOG  VARCHAR(128)
,STATISTICS_SCHEMA   VARCHAR(128)
,STATISTICS_NAME     VARCHAR(128)
,STATISTICS_TYPE     INT
,COLUMN_NAME         VARCHAR(128)
,COLUMN_GUID         VARCHAR(128)
,COLUMN_PROPID       VARCHAR(128)
,ORDINAL_POSITION    INT
,SAMPLE_PCT          INT
,LAST_UPDATE_TIME    DATETIME
,NO_OF_RANGES        INT
,COLUMN_CARDINALITY  INT
,TUPLE_CARDINALITY   INT
,TABLE_CARDINALITY   INT
,AVG_COLUMN_LENGTH   INT
)
GO

INSERT #STATS EXEC ('DBCC SHOW_STATISTICS (''<dbname>.<schemaname>.<tablename>'',''<clusteredindexname>'') WITH STAT_HEADER JOIN DENSITY_VECTOR');

INSERT #STATS EXEC ('DBCC SHOW_STATISTICS (''<dbname>.<schemaname>.<tablename>'',''<nonclusteredindexname>'') WITH STAT_HEADER JOIN DENSITY_VECTOR');

Just replace the values to suit your requirements:

<dbname>.<schemaname>.<tablename>
<clusteredindexname>
<nonclusteredindexname>

I have used DBCC SHOW_STATISTICS WITH STAT_HEADER JOIN DENSITY_VECTOR; other options are available, however these would not have given me the information I wanted:

  • WITH STAT_HEADER
  • WITH DENSITY_VECTOR
  • WITH HISTOGRAM
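Each option can also be run on its own; using the same placeholder names as the script above, HISTOGRAM is the one that returns the step-by-step value distribution rather than the header and density rows used here:

```sql
-- Placeholder object names - substitute your own table and statistics/index name
DBCC SHOW_STATISTICS ('<schemaname>.<tablename>', '<indexname>') WITH STAT_HEADER;     -- one summary row: rows, rows sampled, last updated
DBCC SHOW_STATISTICS ('<schemaname>.<tablename>', '<indexname>') WITH DENSITY_VECTOR;  -- density per column prefix
DBCC SHOW_STATISTICS ('<schemaname>.<tablename>', '<indexname>') WITH HISTOGRAM;       -- up to 200 range steps for the leading column
```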

I would look to evaluate the accuracy of the statistics on a table as part of any performance troubleshooting; if values are not as expected, this may be an area to investigate further

This technique could also be used to gather statistics metrics easily @ regular intervals for baselining purposes
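As a sketch of that baselining idea (the permanent table and its name are my own additions, not part of the script above), keep a capture timestamp alongside a subset of the #STATS columns and append a snapshot after each run:

```sql
-- Hypothetical baseline table - adjust the columns to whichever metrics you want to track
CREATE TABLE dbo.STATS_BASELINE
(
 CAPTURE_TIME        DATETIME NOT NULL DEFAULT GETDATE()
,TABLE_SCHEMA        VARCHAR(128)
,TABLE_NAME          VARCHAR(128)
,STATISTICS_NAME     VARCHAR(128)
,SAMPLE_PCT          INT
,LAST_UPDATE_TIME    DATETIME
,TUPLE_CARDINALITY   INT
,TABLE_CARDINALITY   INT
);

-- After each run of the capture script, append a snapshot from #STATS;
-- comparing rows by CAPTURE_TIME then shows how the cardinality values drift
INSERT dbo.STATS_BASELINE (TABLE_SCHEMA, TABLE_NAME, STATISTICS_NAME, SAMPLE_PCT, LAST_UPDATE_TIME, TUPLE_CARDINALITY, TABLE_CARDINALITY)
SELECT TABLE_SCHEMA, TABLE_NAME, STATISTICS_NAME, SAMPLE_PCT, LAST_UPDATE_TIME, TUPLE_CARDINALITY, TABLE_CARDINALITY
FROM #STATS;
```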

Posted by NeilHambly | with no comments

If you're like me - knowing which build you should have deployed, when we look @ the myriad of possible Service Packs & Cumulative Updates, can quickly become confusing

What actions should I take to "PATCH" a server to the appropriate build? So here is a quick script that can easily be updated as new builds are made available, and easily customized for your own environment

-- This script only reports on your current version and recommends a course of action - the usual disclaimer about running scripts in your environment applies

-- As per Aaron's article - please ensure this is suitable for your environment

*** Updated script [1st March 2012] to add SQL 2012 and CU4 | CU5 for the SQL Server 2008 R2 version ***

 

;With CTE_SQLEditions([Major],[Minor],[Build],[BuildMinor])
AS
(
select
  parsename(convert(varchar,serverproperty ('productversion')),4) As Major,
  parsename(convert(varchar,serverproperty ('productversion')),3) As Minor,
  parsename(convert(varchar,serverproperty ('productversion')),2) As Build,
  parsename(convert(varchar,serverproperty ('productversion')),1) As Buildminor
)
Select @@SERVERNAME, *
,CASE
      WHEN Major = 9                      -- SQL 2005
      THEN
            CASE
                  WHEN Build < 5000 THEN 'Upgrade to Service Pack 4 (9.0.5000), also consider applying the latest CU3 for SP4'
                  WHEN Build = 5000 THEN 'SP4 installed, consider applying the latest CU3 for SP4 (9.0.5266)'
                  WHEN Build = 5254 THEN 'SP4-CU1 installed, consider applying the latest CU3 for SP4 (9.0.5266)'
                  WHEN Build = 5259 THEN 'SP4-CU2 installed, consider applying the latest CU3 for SP4 (9.0.5266)'
                  WHEN Build = 5266 THEN '** SP4-CU3 installed, watch for newer CU releases for SP4'
            END
      WHEN Major = 10 And Minor = 0       -- SQL 2008
      THEN
            CASE
                  WHEN Build < 5416 THEN 'Upgrade to Service Pack 3 (10.0.5416), also consider applying the latest CU2 for SP3'
                  WHEN Build = 5416 THEN 'SP3 installed, consider applying the latest CU2 for SP3 (10.0.5768)'
                  WHEN Build = 5766 THEN 'SP3-CU1 installed, consider applying the latest CU2 for SP3 (10.0.5768)'
                  WHEN Build = 5768 THEN '** SP3-CU2 installed, watch for newer CU releases for SP3'
            END
      WHEN Major = 10 And Minor = 50      -- SQL 2008 R2
      THEN
            CASE
                  WHEN Build < 2500 THEN 'Upgrade to Service Pack 1 (10.50.2500), also consider applying the latest CU5 for SP1'
                  WHEN Build = 2500 THEN 'SP1 installed, consider applying the latest CU5 for SP1 (10.50.2806)'
                  WHEN Build = 2769 THEN 'SP1-CU1 installed, consider applying the latest CU5 for SP1 (10.50.2806)'
                  WHEN Build = 2772 THEN 'SP1-CU2 installed, consider applying the latest CU5 for SP1 (10.50.2806)'
                  WHEN Build = 2789 THEN 'SP1-CU3 installed, consider applying the latest CU5 for SP1 (10.50.2806)'
                  WHEN Build = 2796 THEN 'SP1-CU4 installed, consider applying the latest CU5 for SP1 (10.50.2806)'
                  WHEN Build = 2806 THEN '** SP1-CU5 installed, watch for newer CU releases for SP1'
            END
      WHEN Major = 11 And Minor = 0       -- SQL 2012
      THEN
            CASE
                  WHEN Build < 1103 THEN 'Upgrade to the latest Release Candidate (11.00.1750), keep an eye out for RC1 or RTM'
                  WHEN Build = 1440 THEN 'CTP3 installed, consider moving to the latest Release Candidate RC0 (11.00.1750)'
                  WHEN Build = 1750 THEN 'RC0 installed, keep an eye out for RC1 or RTM'
                  WHEN Build > 1750 THEN '** You''re ahead of me ... I''ll try to catch up'
            END
END  AS [Recommendations]
FROM CTE_SQLEditions

 

 

Posted by NeilHambly | 1 comment(s)

I'm starting on my promise of providing 1 (sometimes more) technical blog post each & every week during 2012

 

So with this being Week #1 of 2012, I thought it would be nice to start with a new system stored procedure in the SQL 2012 (CTP3 | RC0) builds

It is called, nicely enough, sp_server_diagnostics. So any guesses on what this does?

Well, here is the link to BOL on this: http://msdn.microsoft.com/en-us/library/ff878233(v=sql.110).aspx and the tag line for it from BOL on its purpose:

"Captures diagnostic data and health information about SQL Server to detect potential failures. The procedure runs in repeat mode and sends results periodically. It can be invoked from either a regular or a DAC connection."

Incidentally, during my session @ SQLPASS Summit 2011 on wait statistics, I used this new SP, because one of the 5 data values returned provides an information snapshot of the TOP 10 waits by count & by duration, for both preemptive & non-preemptive waits - this means we get 4 sets of 10 results on top waits, and I wanted to track the most prevalent wait_types


Below is code I have put together to capture data @ 10 second intervals, and then track the various wait_types from snapshot to snapshot

 

USE tempdb

--DROP TABLE dbo.tmp_sp_server_diagnostics
CREATE TABLE dbo.tmp_sp_server_diagnostics
       (
       [create_time] datetime,
       [component_type] nvarchar(20),
       [component_name] nvarchar(20),
       [state] int,
       [state_desc] nvarchar(20),
       [data] xml
       )

/*
This doesn't work if we also supply the @Repeat_interval
*/
INSERT dbo.tmp_sp_server_diagnostics
EXEC sys.sp_server_diagnostics
  --    @Repeat_interval = 10

Select * From tempdb.dbo.tmp_sp_server_diagnostics

/*
Note: as we can't use the @Repeat_interval - if you wanted to capture & store data, you can't simply use the INSERT method
So we have to collect the data from the output of the sp_server_diagnostics SP via Extended Events

## Remember to set your own filename path in the following code ##
*/

CREATE EVENT SESSION [diag]
ON SERVER
       ADD EVENT [sp_server_diagnostics_component_result] (set collect_data=1)
       ADD TARGET package0.event_file
              (SET filename=N'C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\Log\diag.xel')
GO

ALTER EVENT SESSION [diag] ON SERVER STATE = start
GO

/*
We can now run with the @Repeat_interval parameter supplied and the XE session will capture the information
*/

EXEC sys.sp_server_diagnostics
       @Repeat_interval = 10

/*
We can now analyze the results of the runs completed
*/

--DROP TABLE TopWaits
CREATE TABLE TopWaits ([RowN] INT, [XMLData] xml);

;WITH CTE_SD ([Name],[Package],[Time],[Component],[State],[State_desc],[Data])
AS(
SELECT
  xml_data.value('(/event/@name)[1]','varchar(max)')                            AS [Name]
, xml_data.value('(/event/@package)[1]', 'varchar(max)')                        AS [Package]
, xml_data.value('(/event/@timestamp)[1]', 'datetime')                          AS [Time]
, xml_data.value('(/event/data[@name=''component'']/value)[1]','varchar(max)')  AS [Component]
, xml_data.value('(/event/data[@name=''state'']/value)[1]','int')               AS [State]
, xml_data.value('(/event/data[@name=''state_desc'']/value)[1]','varchar(max)') AS [State_desc]
, xml_data.query('(/event/data[@name="data"]/value/*)')                         AS [Data]
FROM
       (
    SELECT  OBJECT_NAME as event,CONVERT(xml, event_data) as xml_data
    FROM  sys.fn_xe_file_target_read_file
              ('C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\Log\diag*.xel', NULL, NULL, NULL)
    ) AS XEventData
)
,[ParsedData] ([RowN],[Data])
AS
(
SELECT
ROW_NUMBER() OVER(PARTITION BY [Component] ORDER BY [Time] DESC) AS [RowN],
CAST([Data] AS XML) AS [Data]
FROM CTE_SD
WHERE [Component] = 2 -- <queryProcessing ... >
)
INSERT INTO TopWaits([RowN],[XMLData])
Select [RowN], [Data]
from ParsedData

Select
     Prev_interval.RowN
    ,Prev_interval.Wait_Type
    ,Prev_interval.WaitsCnt - Next_interval.WaitsCnt AS WaitsCnt
    ,Prev_interval.AvgWT - Next_interval.AvgWT       AS AvgWT
    ,Prev_interval.MaxWT - Next_interval.MaxWT       AS MaxWT
FROM
(
SELECT
       RowN,
    c.p.value('@waitType', 'varchar(50)')     [Wait_Type],
    c.p.value('@waits', 'int')                [WaitsCnt],
    c.p.value('@averageWaitTime', 'int')      [AvgWT],
    c.p.value('@maxWaitTime', 'int')          [MaxWT]
FROM TopWaits
CROSS APPLY XMLData.nodes('/queryProcessing/topWaits/nonPreemptive/byCount/*') c(p)
) AS Prev_interval ([RowN],[Wait_Type],[WaitsCnt],[AvgWT],[MaxWT])
INNER JOIN
(
SELECT
       RowN,
    c.p.value('@waitType', 'varchar(50)')     [Wait_Type],
    c.p.value('@waits', 'int')                [WaitsCnt],
    c.p.value('@averageWaitTime', 'int')      [AvgWT],
    c.p.value('@maxWaitTime', 'int')          [MaxWT]
FROM TopWaits
CROSS APPLY XMLData.nodes('/queryProcessing/topWaits/nonPreemptive/byCount/*') c(p)
) AS Next_interval ([RowN],[Wait_Type],[WaitsCnt],[AvgWT],[MaxWT])
ON Prev_interval.[Wait_Type] = Next_interval.[Wait_Type]
and Prev_interval.RowN +1 = Next_interval.RowN

 

Posted by NeilHambly | 3 comment(s)

So having completed my review of my 2011 goals, I have put together this year's 2012 goals below

 

SQL Conferences & Training Events

  • SQL Master Week # 2 (May 2012) Master Immersion Event on Performance
  • SQLPass Summit (will attend my 2nd Summit - Hopefully once again as a speaker)
  • SQLRally (Dallas) 2012
  • SQLRally Europe (if any event is planned)
  • SQLSaturday #105 (Dublin) I will be Presenting @ this event
  • SQLSaturday hopefully another 1-2 events (Chicago + ????)
  • SQL BITS X / XI Events – hoping I will be able to present & attend @ these events
  • SQLDay 2012 My 1st Event in Poland (Straight after SQLBits X)
  • Present a 2-day Training Course (London)

SQLFAQ Events

  • SQL FAQ London (8-10 Events) organize those & also present @ a few of those
  •  Further presentations @ 4-6 non-London locations, including perhaps a full day event

Exams

Study & Pass the following 3 remaining Pre-Requisite Exams
  • 70-450  & 70-433 & 70-451 - Planned for Feb/March
  • Exam 88-970: SQL Server 2008 Microsoft Certified Master: Knowledge Exam
  • Exam 88-971: SQL Server 2008 Microsoft Certified Master: Lab Exam
  • Read 3+ SQL Technical Books
  • Pass the MCM SQL Server 2008 in 1st half 2012

Work on compiling my "MCM Study Notes" (OneNote) for MCM certification and keep it going throughout the year

http://cid-c2dc5b5a2f112857.office.live.com/view.aspx/SQL%20MCM%20Training/SQL%20MCM%20Training

Writing

  • Complete my chapter as co-author for my SQL Server 2012 book
  • Write and hopefully get published another 2 Articles or contribute to another book
  • Technical reviewer role – hopefully also have a chance to become a technical reviewer

Community

  • LinkedIn online resources, both mine and the UK SQL Server FAQ group
  • Forum Activity (Regular  activity) with a target 100+ questions answered
  • Twitter - Regular activity on various Tags build following (target 1000 followers)
  • Blogs (min 25 technical in the year) + add a few non-technical as well
 
Posted by NeilHambly | 2 comment(s)
In January 2011 I posted the goals I had set myself for 2011, so just how did I fare on those? See my progress report below. I will also be posting my 2012 goals in the next few days

 

SQL Conferences & Training Events (All done - Nice)

  • SQLCruise (June 2011) Alaska - I'm booked on this one already!!
      • Done, it was an amazing time attending the SQL Cruise; I learnt a lot about SQL and other stuff, not to mention an amazing Alaska adventure
  • SQL Master Week #1 (June 2011) Master Immersion Event on Internals and Design
      • Done, I attended the week-long course and it was undoubtedly the best training course I have ever attended; I'm still learning from it
  • SQLPass Summit (Really must go to this year's event)
      • Done, I not only attended the conference but was also privileged enough to be a presenter with my "Why are we Waiting..." session, which incidentally had the 9th highest attendance, with 310 attendees, @ a conference of over 160 sessions
  • SQL BITs 8 / 9 Events - present & attend @ 1 or hopefully both of these
      • Done, I attended both events and was again fortunate enough to be selected as a presenter, with Parts I & II of my "Why are we Waiting…" sessions

 

SQLFAQ Events (All done - Was a lot of fun doing these events)

  • SQL FAQ London (10-12 events) organize these & also present @ some of them, aiming to do a min of 8 events this year
      • Done, only 8 events (just made my 8-event goal)
  • More presentations @ 6-8 other locations (non-London), including maybe even doing 1 full-day event (if I can)
      • Done, I presented @ another 10 non-London UG events, including 2 mini UG tours
  • XMAS 2011 event this year, to organize a good event and prepare for it well in advance
      • Done, we had an event where some of us got together for drinks, but it was not a big event
 

Exams (not as many as I had planned - although a lot of prep work done for the exams)

Study & Pass the following 4 Pre-Requisite Exams (Hmmm, time for some serious study sessions)

  • 70-432 - Passed
  • 70-450 - Planned for Feb
  • 70-433 - Planned for Feb/March
  • 70-451 - Planned for March

Work on compiling my "MCM Study Notes" (OneNote) for MCM certification and keep it going throughout the year

Still have a lot to add to this resource. I have a lot of material I want to add, so will be allowing some time (regularly) to do just that and fill out content for each section (minimum 2 pages) during 2012  http://cid-c2dc5b5a2f112857.office.live.com/view.aspx/SQL%20MCM%20Training/SQL%20MCM%20Training

 

  • Exam 88-970: SQL Server 2008 Microsoft Certified Master: Knowledge Exam (Pass)
      • Planned for March
  • Exam 88-971: SQL Server 2008 Microsoft Certified Master: Lab Exam (Pass, or @ least make an attempt)
      • Planned (if I pass the Knowledge Exam on the 1st go) for March/April
  • READ: 3+ SQL technical books
      • Done, actually read about 5 books (although some chapters were skipped; have a lot more to read through)
  • Ultimate GOAL is to obtain MCM SQL Server 2008 in 2011 or the 1st half of 2012
      • Still on target for the 1st half of 2012 - going to be a heavy schedule but achievable
 

Writing (Not much achieved in this category, aside from getting the opportunity to contribute towards a MSFT SQL Server 2012 book)

  • 2+ published articles
      • Didn't get any published, but didn't really pursue this one, so will push it into my 2012 goals
  • Complete an app design for a SQL product (it's top secret as yet; it's still in my mental design phase)
      • This was shelved, as it would take a lot of time I was unable to dedicate to finish it to a suitable level - maybe will look @ this later in 2012
  • Work on a 100-page SQL eBook (exactly 100 pages) on an in-depth SQL topic (again, will furnish more details in due course)
      • Took a different direction, as I'm writing a chapter for a SQL 2012 book instead - very excited to have that opportunity
      • Will provide more details once the book is nearing completion (expected to be in the 2nd Qtr. of 2012)

 

Community (Some of these achieved)

  • Build further relationships & promote the online resources: LinkedIn, the SQLFAQ website, SQLBlogcasts
      • Maintained some activity on all of these, but not nearly enough for my liking; LinkedIn was my main activity area
  • Forum activity (weekly activity)
      • Nope, I didn't focus on this; answered a few questions, and really would like to get regular @ this
  • Twitter - regular activity on various tags, building a following (target 500-600 followers)
      • With 691 followers on 1st Jan 2012, I achieved the target, with 6647 tweets to-date; I have been regular in my activity in this FUN area (yes, I'm a Twitter junkie)
  • Blogs (min 30 in the year) mostly technical - but not all
      • Big fail here, with only 18 posts - some technical and some non-technical - this year I will have to make up for my 2011 shortfall big time
 

So on reflection, not too bad, with most of the goals I had set being achieved.

Those I didn't achieve, or decided not to pursue, will be worked into my 2012 goals. I feel I have built a good foundation with my SQL community work during 2011 and plan to expand on that during 2012, along with my MCM SQL Server 2008 certification goal.

 

Posted by NeilHambly | 1 comment(s)