Madam Zu Zu

Reputation: 6615

Optimizing stored procedure with multiple "LIKE"s

I am passing in a comma-delimited list of values that I need to compare against the database.

Here is an example of the values I'm passing in:

@orgList = "1123, 223%, 54%"

To use the wildcards I think I have to use LIKE, but the query runs for a long time and only returns 14 rows. The results are correct, but it's just taking forever, probably because I'm using the join incorrectly.

Can I make it better?

This is what I do now:

declare @tempTable Table (SearchOrg nvarchar(max))

insert into @tempTable
    select * from dbo.udf_split(@orgList) as split

-- this splits the values at the comma and puts them in a temp table

-- then I do a join on the main table and the temp table to do a like on it.... 
-- but I think it's not right because it's too long. 

select something 
from maintable gt
join @tempTable tt on gt.org like tt.SearchOrg
where
    ayear = ISNULL(@year, ayear)
    and (ayear >= ISNULL(@yearR1, ayear) and ayear <= ISNULL(@yearr2, ayear))
    and adate = ISNULL(@Date, adate)
    and (adate >= ISNULL(@dateR1, adate) and adate <= ISNULL(@DateR2, adate))

The final result would be all rows where maintable.org is 1123, or starts with 223, or starts with 554.

The reason for my date craziness is that sometimes the stored procedure checks only for a year, sometimes for a year range, sometimes for a specific date, and sometimes for a date range... everything that's not used is passed in as null.

Maybe the problem is there?

Upvotes: 1

Views: 877

Answers (3)

Gordon Linoff

Reputation: 1270723

Your query may be difficult to optimize. Part of the question is what is in the where clause: you probably want to apply those filters first, and then do the join using like. Or you can try to make the join faster, and then do a full scan on the results.

SQL Server should optimize a like comparison of the form 'abc%' -- that is, where the wildcard is at the end. (See here, for example.) So you can start with an index on maintable.org. Fortunately, your examples meet this criterion. However, if you have '%abc' -- where the wildcard comes first -- then the optimization won't work.
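
For example, a plain index on the column is enough for the trailing-wildcard patterns in the question. This is only a sketch; the index name is made up, and maintable/org come from the question:

-- Sketch only: index name is illustrative.
CREATE INDEX IX_maintable_org ON maintable (org);

-- A trailing wildcard can seek on that index...
SELECT org FROM maintable WHERE org LIKE '223%';

-- ...while a leading wildcard forces a scan.
SELECT org FROM maintable WHERE org LIKE '%223';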

For the index to work best, it might also need to take into account the conditions in the where clause. In other words, adding the index is suggestive, but the rest of the query may preclude the use of the index.
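
As a hedged illustration only, an index keyed on org plus the filtered year/date columns (names taken from the question; "something" stands in for whatever the query actually selects) might look like this:

-- Sketch only: adjust the key and INCLUDE lists to the real schema.
CREATE INDEX IX_maintable_org_year_date
    ON maintable (org, ayear, adate)
    INCLUDE (something);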

And, let me add, the best solution for these types of searches is to use the full text search capability in SQL Server (see here).
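
A minimal sketch of the full text approach, assuming maintable already has a unique index (called PK_maintable here, which is a made-up name) and full text search is installed on the instance:

-- Hypothetical names; one-time setup.
CREATE FULLTEXT CATALOG ftMainCatalog AS DEFAULT;
CREATE FULLTEXT INDEX ON maintable (org) KEY INDEX PK_maintable;

-- Prefix-term search, roughly the same intent as org LIKE '223%'.
SELECT something
FROM maintable
WHERE CONTAINS(org, '"223*"');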

Upvotes: 1

usr

Reputation: 171226

Such a dynamic query, with optional filters and LIKE patterns driven by a table (!), is very hard to optimize because almost nothing is statically known. The optimizer has to create a very general plan.

You can do two things to speed this up by orders of magnitude:

  1. Play with OPTION (RECOMPILE). If the compile times are acceptable, this will at least deal with all the optional filters (but not with the LIKE table).
  2. Do code generation and EXEC sp_executesql the code. Build a query with all LIKE clauses inlined into the SQL so that it looks like this: WHERE a LIKE @like0 OR a LIKE @like1 ... (not sure if you need OR or AND). This allows the optimizer to get rid of the join and just execute a normal predicate (see the sketch after this list).
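
A minimal sketch of option 2, reusing the names from the question (maintable, org, something, @tempTable). Inlining the patterns as escaped literals is just one way to do the code generation; it assumes at least one pattern is present:

-- Build one LIKE predicate per pattern, then run the generated statement.
DECLARE @tempTable table (SearchOrg nvarchar(max));
INSERT INTO @tempTable VALUES (N'1123'), (N'223%'), (N'54%');

DECLARE @predicates nvarchar(max) = N'';

SELECT @predicates = @predicates
    + CASE WHEN @predicates = N'' THEN N'' ELSE N' OR ' END
    + N'gt.org LIKE N''' + REPLACE(SearchOrg, N'''', N'''''') + N''''
FROM @tempTable;

DECLARE @sql nvarchar(max) =
    N'SELECT something FROM maintable gt WHERE ' + @predicates;

EXEC sp_executesql @sql;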

Upvotes: 1

Richard Deeming

Reputation: 31238

Try something like this:

DECLARE @tempTable TABLE
(
   -- Since the column is a varchar(10), you don't want to use nvarchar here.
   SearchOrg varchar(20)
);

INSERT INTO @tempTable
SELECT * FROM dbo.udf_split(@orgList);

SELECT
   something
FROM
   maintable gt
WHERE
   some where statements go here
   AND EXISTS
   (
      SELECT 1
      FROM @tempTable tt
      WHERE gt.org LIKE tt.SearchOrg
   );

Upvotes: 1
