
SQL Server Blogs

Voices from the UK SQL Server Community

Jason Massie's SQL blog

The problem with local variables

Have you ever been writing a query and just cannot get it to use the right index? This could be one of the reasons why. Let's use this query with local variables as our example.

declare @Start datetime
declare @End datetime
select @Start = '2004-08-01 00:00:00.000'
select @End = '2004-07-28 00:00:00.000'
select ProductID from sales.SalesOrderDetail
where ModifiedDate >= @End and ModifiedDate <= @Start
 

[Screenshot: execution plan with a clustered index scan]

The query produces a clustered index scan, yet we have an index on ModifiedDate. There are many reasons why SQL Server might not use this index but, for this post, we will assume we have eliminated them. Now, if we hard-code the dates instead, we get this plan.
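For reference, here is the hard-coded version of the query (same dates as above, written as literals):

```sql
-- Same query, with the dates as literals instead of local
-- variables; this is the version that gets the index seek.
select ProductID from sales.SalesOrderDetail
where ModifiedDate >= '2004-07-28 00:00:00.000'
  and ModifiedDate <= '2004-08-01 00:00:00.000'
```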

[Screenshot: execution plan with an index seek plus bookmark lookup]

So why is it doing this? The query optimizer cannot accurately use the statistics to estimate how many rows will be returned when the predicate uses local variables, because their values are not known at compile time. Let's look at how we can tell there is a problem with the cardinality estimates. In the query with the local variables, the optimizer estimates 10918.5 rows, so it chooses the index scan. In the query with hard-coded literals, the estimated and actual row counts match.
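One way to see the estimates next to the actuals without the graphical plan is SET STATISTICS PROFILE, which returns the plan as a result set with EstimateRows and Rows columns (a sketch of the same repro):

```sql
set statistics profile on

declare @Start datetime
declare @End datetime
select @Start = '2004-08-01 00:00:00.000'
select @End = '2004-07-28 00:00:00.000'

-- Compare the EstimateRows column (the ~10918.5 guess) against
-- the Rows column (the actual count) for the scan operator.
select ProductID from sales.SalesOrderDetail
where ModifiedDate >= @End and ModifiedDate <= @Start

set statistics profile off
```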

[Screenshot: estimated vs. actual row counts showing the cardinality misestimate]

How can we fix this? There are several ways. This is another situation that makes the case for stored procedures or parameterized queries.

create proc pDemo01
@Start datetime,
@End datetime
as
select ProductID from sales.SalesOrderDetail
where ModifiedDate >= @End and ModifiedDate <= @Start


exec pDemo01 '2004-08-01 00:00:00.000', '2004-07-28 00:00:00.000'

The stored proc generates the proper plan, because the optimizer can sniff the actual parameter values at compile time. However, you will run into the same problem if you modify a parameter within the stored proc, e.g. select @Start = @Start - 90. In that case, you should use sp_executesql. What if you cannot use a stored proc because it is a third-party app, or for some other reason? A covering index is probably the solution. Once we create this index, it will always be used:
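A sketch of the sp_executesql approach for the modified-parameter case (pDemo02 and the -90 offset are illustrative): the new value is computed first, then passed in as a parameter of the inner statement, where it can be sniffed like a normal stored proc parameter.

```sql
create proc pDemo02
@Start datetime,
@End datetime
as
-- Modifying @Start directly in the outer query would put us back
-- in the local-variable situation, so compute the value first...
select @Start = @Start - 90

-- ...then pass it as a parameter of the dynamic statement so the
-- optimizer sees the final value at compile time.
exec sp_executesql
    N'select ProductID from sales.SalesOrderDetail
      where ModifiedDate >= @End and ModifiedDate <= @Start',
    N'@Start datetime, @End datetime',
    @Start = @Start, @End = @End
```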

create index ix01 on sales.SalesOrderDetail(ModifiedDate) include (ProductID)

We could use a plan guide, or an index hint with FORCESEEK (SQL Server 2008), but performance will be really bad when we really do need to get 10k rows. The same problem can happen with stored procedures, but that is another post.
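For completeness, a sketch of the FORCESEEK table hint mentioned above (SQL Server 2008 and later; use with care for exactly the reason just given):

```sql
declare @Start datetime
declare @End datetime
select @Start = '2004-08-01 00:00:00.000'
select @End = '2004-07-28 00:00:00.000'

-- Force a seek on the table regardless of the row estimate.
select ProductID from sales.SalesOrderDetail with (forceseek)
where ModifiedDate >= @End and ModifiedDate <= @Start
```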

To get deeper into this subject, check out this.

 *Cross posted from http://statisticsio.com *
Comments

 

GrumpyOldDBA said:

This isn't exactly unknown territory and has been a possible issue as far back as I can remember. If you can't properly parameterise, then your only option is to use a plan guide - however, having used plan guides to resolve one set of problems, they actually caused another set. If result sets can vary (e.g. it may require a seek for one range vs. a scan for another), the normal option is to set the plan guide to recompile to make sure a correct plan is generated each time. Can be tricky with third-party apps which use in-line / dynamic SQL (and I wonder where LINQ will fit in with this?). Perhaps we should put more pressure on management not to buy "generic" apps which use in-line SQL?

January 25, 2008 10:14 AM
 

JasonMassie said:

You're right. The only fix that works in all situations is a covering index.

The problem is a lot of generic apps are things like Siebel, SAP etc. The industry-standard stuff...

January 25, 2008 6:55 PM

About JasonMassie

Jason is a SQL Server Consultant for the professional services organization of Terremark (Formerly Data Return LLC) where he has worked for the last 8 1/2 years. Jason is an MCITP Database Administrator\Database Developer as well as an MCDBA on 7.0 and 2000. You can read his blog at http://statisticsio.com. He lives in Irving, TX. He enjoys time with his wife and three daughters as well as making electronic music. He can be reached at http://linkedin.com/in/jasonmassie or jmassie@terremark.com