Dmitriy

Reputation: 664

How to optimize SQL query?

I have 2 tables ('keys' consists of about 6 fields, 'stats' consists of about 65 fields).

I want to insert rows into both tables without duplicating the phrase text. I use something like this:

UPDATE Keys SET CommandType = 'ADDED', CommandCode = @CommandCode WHERE 
KeyText = @KeyText AND Tab_ID = @TabID AND CommandType = 'DELETED';

INSERT INTO Keys (IsChecked, KeyText, AddDateTime, Tab_ID, KeySource_ID, CommandCode, CommandType)
SELECT 0, @KeyText, datetime(), @TabID, @KeySourceID, @CommandCode, 'ADDED'
WHERE NOT EXISTS (SELECT 1 FROM Keys WHERE Tab_ID = @TabID AND KeyText = @KeyText);

INSERT INTO Statistics (Key_ID)
SELECT ID FROM Keys WHERE KeyText = @KeyText AND Tab_ID = @TabID AND (CommandType IS NULL OR CommandType <> 'DELETED') AND 
NOT EXISTS (SELECT 1 FROM Statistics WHERE Key_ID = (SELECT ID FROM Keys WHERE KeyText = @KeyText AND Tab_ID = @TabID AND (CommandType IS NULL OR CommandType <> 'DELETED') LIMIT 1));

How can I optimize it? I created indexes for all the fields used in this query. Maybe you can recommend a solution?
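For reference, the insert-if-not-exists pattern above can be exercised end-to-end. Here is a minimal sketch using Python's sqlite3 module (assuming SQLite, which the `datetime()` and `LIMIT` syntax suggest), with a composite index on (Tab_ID, KeyText) so the NOT EXISTS probe is an index lookup rather than a full scan. Table and column names follow the question; the schema is reduced to the relevant fields.

```python
import sqlite3

# In-memory database; schema reduced to the columns the query touches.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE Keys (
    ID INTEGER PRIMARY KEY,
    KeyText TEXT,
    Tab_ID INTEGER,
    CommandType TEXT
);
-- Composite index covering the WHERE clause of the NOT EXISTS probe.
CREATE INDEX idx_keys_tab_text ON Keys (Tab_ID, KeyText);
""")

def add_key(conn, key_text, tab_id):
    # Insert only when no row with the same (Tab_ID, KeyText) exists.
    conn.execute("""
        INSERT INTO Keys (KeyText, Tab_ID, CommandType)
        SELECT ?, ?, 'ADDED'
        WHERE NOT EXISTS (
            SELECT 1 FROM Keys WHERE Tab_ID = ? AND KeyText = ?
        )
    """, (key_text, tab_id, tab_id, key_text))

add_key(conn, "phrase", 1)
add_key(conn, "phrase", 1)   # duplicate: silently skipped
count = conn.execute("SELECT COUNT(*) FROM Keys").fetchone()[0]
print(count)  # → 1
```

The second call inserts nothing, so the table holds a single row despite two calls.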

Thanks for the help, and sorry for my bad English.

Upvotes: 0

Views: 1068

Answers (3)

Dmitriy

Reputation: 664

The problem was bad indexes on my tables. I rebuilt them and replaced some query parameters with static content, and now it works great!

Upvotes: 0

Devin Garner

Reputation: 1416

You can combine the insert and update into a single MERGE statement. If you want to copy modifications of Keys into Statistics, you can use an OUTPUT clause.

You'd have to add your index definitions to the question for anyone to comment on their effectiveness, but basically you want a single index on each table that contains all of the columns in your WHERE clause, with include columns for anything in your SELECT that is not in the WHERE clause.

The best way to optimize is to get an estimated/actual query plan and see which parts of the query are slow. In SQL Server this is done from the "Query" menu. Basically, look out for anything that says "scan": that usually means you're missing an index. "Seek" is good.
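The scan-vs-seek distinction can be illustrated in text without SQL Server's graphical plan viewer. The sketch below uses SQLite's EXPLAIN QUERY PLAN as a stand-in (its SCAN vs SEARCH output plays the analogous role); the table and column names follow the question, and the schema is reduced for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE Keys (ID INTEGER PRIMARY KEY, KeyText TEXT, Tab_ID INTEGER)"
)

query = "SELECT ID FROM Keys WHERE Tab_ID = ? AND KeyText = ?"

# Without a supporting index, the plan detail reports a full-table SCAN.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query, (1, "x")).fetchall()
print(plan_before)  # detail column contains 'SCAN ...'

# After adding a composite index, the plan becomes a SEARCH
# (SQLite's equivalent of a seek).
conn.execute("CREATE INDEX idx_keys ON Keys (Tab_ID, KeyText)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query, (1, "x")).fetchall()
print(plan_after)   # detail column contains 'SEARCH ... USING ... INDEX idx_keys ...'
```

Each returned row is a tuple whose last field is the plan detail string; the before/after difference is the same signal the graphical plan gives you in SQL Server.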

However, a query plan is mostly helpful for fine-tuning. In this case, using a different algorithm (like merge/output) will make a more drastic difference.

In SQL Server, the results would look somewhat like this:

INSERT INTO [Statistics] (Key_ID)
SELECT ID FROM
(
    MERGE [Keys] AS Target
    USING (
        SELECT @KeyText AS KeyText, @TabID AS TabID, @CommandCode AS CommandCode, @KeySourceID AS KeySourceID, 'ADDED' AS CommandType
    ) AS Source
    ON (Target.KeyText = Source.KeyText AND Target.Tab_ID = Source.TabID)
    WHEN MATCHED AND Target.CommandType = 'DELETED' THEN
        UPDATE SET Target.CommandType = Source.CommandType, Target.CommandCode = Source.CommandCode
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (IsChecked, KeyText, AddDateTime, Tab_ID, KeySource_ID, CommandCode, CommandType)
        VALUES (0, Source.KeyText, GETDATE(), Source.TabID, Source.KeySourceID, Source.CommandCode, Source.CommandType)
    OUTPUT $action, INSERTED.ID
) AS Changes (Action, ID)
WHERE Changes.Action = 'INSERT'
AND NOT EXISTS (SELECT 1 FROM [Statistics] b WHERE b.Key_ID = Changes.ID);

Upvotes: 1

Milimetric

Reputation: 13549

Creating indices slows down insert and update queries because the index must be updated along with the data. To optimize your particular insert statements, get rid of any indices you don't need for your typical select statements. Then work on simplifying those "and not exists" clauses. Those are the only source of any performance gains you're going to get. Try creating indices to speed that subquery up once you've simplified it.
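As an illustration of simplifying the "not exists" clause: the question's third statement repeats the same Keys lookup inside the NOT EXISTS subquery, and correlating Statistics.Key_ID with the outer Keys.ID removes that duplication. A sketch in SQLite via Python (table names from the question, schema reduced to the relevant fields):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE Keys (ID INTEGER PRIMARY KEY, KeyText TEXT, Tab_ID INTEGER, CommandType TEXT);
CREATE TABLE Statistics (Key_ID INTEGER);
CREATE INDEX idx_stats_key ON Statistics (Key_ID);  -- speeds up the correlated probe
""")
conn.execute(
    "INSERT INTO Keys (KeyText, Tab_ID, CommandType) VALUES ('phrase', 1, 'ADDED')"
)

def add_stats_row(conn, key_text, tab_id):
    # Correlate on k.ID instead of repeating the full Keys lookup
    # inside the NOT EXISTS subquery.
    conn.execute("""
        INSERT INTO Statistics (Key_ID)
        SELECT k.ID FROM Keys k
        WHERE k.KeyText = ? AND k.Tab_ID = ?
          AND (k.CommandType IS NULL OR k.CommandType <> 'DELETED')
          AND NOT EXISTS (SELECT 1 FROM Statistics s WHERE s.Key_ID = k.ID)
    """, (key_text, tab_id))

add_stats_row(conn, "phrase", 1)
add_stats_row(conn, "phrase", 1)   # second call inserts nothing
n = conn.execute("SELECT COUNT(*) FROM Statistics").fetchone()[0]
print(n)  # → 1
```

The correlated form gives the planner one obvious index probe per candidate row, which is exactly the subquery the index on Statistics (Key_ID) then accelerates.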

Upvotes: 1
