
I want to write a query in a stored proc with many filters but I want to avoid dynamic SQL.

Say my parameters are nullable (@filter1, @filter2, @filter3...). One way I might solve this is:

SELECT col1, col2, col3
FROM dbo.MyTable
WHERE col1 = ISNULL(@filter1, col1)
  AND col2 = ISNULL(@filter2, col2)
  AND col3 = ISNULL(@filter3, col3)

This would apply each filter only when the corresponding parameter is not NULL. The questions are: 1) Is this a good practice? 2) Will the optimizer optimize the col1 = col1 comparison out, or will this hurt query performance?

Comments

  • The performance is going to suck, because there are so many possible combinations that a query plan is not likely to be cached. Dynamic SQL is a much better option. Commented Jun 23, 2010 at 20:09
  • Erland Sommarskog put together an excellent article on this type of problem. I strongly advise reading through it. Commented Jun 23, 2010 at 20:14

5 Answers


About optimizing the conditions: what you must realize is that a compiled plan has to satisfy any variable value. So when the plan is generated, SQL Server must create an access plan that works when @filter1 is NULL and also works when @filter1 is not NULL. The result is almost always a scan.

The articles linked by Tom H. go into this in much detail.
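For illustration, a minimal sketch of why the plan tends to be a scan. The table and index names here are hypothetical, not from the question:

-- Hypothetical table and index for illustration.
CREATE TABLE dbo.Demo (col1 int, col2 int, col3 int);
CREATE INDEX IX_Demo_col1 ON dbo.Demo (col1);

DECLARE @filter1 int = 42;

-- The cached plan for this statement must also be correct when
-- @filter1 IS NULL (i.e. when every row qualifies), so the optimizer
-- typically chooses a scan of dbo.Demo rather than a seek on IX_Demo_col1.
SELECT col1, col2, col3
FROM dbo.Demo
WHERE col1 = ISNULL(@filter1, col1);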


ISNULL can hurt index usage, so I wouldn't say it's ideal, but if you need the functionality described above, I'm not sure there is a way around it.

Can you look at your execution plan to see whether the indexes you would expect to be used are actually being used?
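On SQL Server, one way to check is the STATISTICS session settings. The procedure name and parameter values below are hypothetical:

SET STATISTICS IO ON;    -- report logical reads per table
SET STATISTICS XML ON;   -- return the actual execution plan as XML

-- Run the proc with a representative combination of filters.
EXEC dbo.MySearchProc @filter1 = 42, @filter2 = NULL, @filter3 = NULL;

SET STATISTICS XML OFF;
SET STATISTICS IO OFF;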

Comments

  • This construct often works OK. COALESCE can be the killer because of how it treats datatypes.
  • Interesting; any recommended reading on ISNULL vs. COALESCE?

1) Is this a good practice? 2) Will the optimizer optimize the col1 = col1 comparison out, or will this hurt query performance?

Yes, it's a good practice.

Some RDBMSes will optimize it out, some won't. None will if you're calling it as a prepared statement.

Don't prematurely optimize; odds are, for most things, the difference in cost will be negligible, or, if not, it can be made negligible with appropriate indexes.

Concentrate on writing code that clearly expresses what you're doing. In my opinion, this idiom is clear and concise.

Comments

  • I'm a big proponent of not trying to prematurely optimize, but in this case the performance impact can often be huge, and indexes are unlikely to help because the query is potentially doing something different every time. This type of functionality is usually system-wide (i.e., you need to do dynamic searches on many different tables), so it's good to know your general approach before you've written 50 SPs that you then have to rewrite.
  • This is not premature optimization; it is designing for performance, which should be done on every database. When there are known differences in performance between techniques, the best-performing one should be chosen from the start. Databases are notoriously difficult to refactor, and it is too late once performance has become an issue. Premature optimization doesn't mean no optimization.

If you expect this table to grow to any substantial size, this is not a good idea: the query optimizer will not cache a usable execution plan, and it sucks at dealing with situations like this because it can't easily tell at compile time what the execution path will be.

You would be much better off generating the query on the client side with the correct filters in the WHERE clause, instead of trying to write a single catch-all query.
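A rough sketch of the same idea done server-side with parameterized dynamic SQL via sp_executesql, assuming SQL Server 2008+ (the proc body is illustrative; table and column names follow the question). Only the filters that are actually set end up in the WHERE clause, so each combination gets its own cached, parameterized plan:

DECLARE @sql nvarchar(max) = N'SELECT col1, col2, col3 FROM dbo.MyTable WHERE 1 = 1';

-- Append a predicate only for the parameters that were supplied.
IF @filter1 IS NOT NULL SET @sql += N' AND col1 = @filter1';
IF @filter2 IS NOT NULL SET @sql += N' AND col2 = @filter2';
IF @filter3 IS NOT NULL SET @sql += N' AND col3 = @filter3';

-- sp_executesql allows declared parameters that the statement doesn't use.
EXEC sp_executesql @sql,
     N'@filter1 int, @filter2 int, @filter3 int',
     @filter1, @filter2, @filter3;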

Comments

0

In my experience (from running some benchmarks on large tables), the following:

(col1 = @filter1 OR @filter1 IS NULL)

is much faster than:

col1 = ISNULL(@filter1, col1)
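Spelled out for all three filters (table name as in the question). On SQL Server 2008+ you might also try adding OPTION (RECOMPILE), which compiles a fresh plan using the actual parameter values on each call; treat that as something to benchmark for your workload, not a given:

SELECT col1, col2, col3
FROM dbo.MyTable
WHERE (col1 = @filter1 OR @filter1 IS NULL)
  AND (col2 = @filter2 OR @filter2 IS NULL)
  AND (col3 = @filter3 OR @filter3 IS NULL)
OPTION (RECOMPILE);  -- optional: trade compile cost for a value-specific plan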

