Raj

Reputation: 321

How to avoid deadlocks while multiple processes are working on the same table in SQL

Can we avoid deadlocks by creating different database users for different processes?

e.g. one user for communicating with API 'ABC', one user for communicating with API 'PQR', and another user for processing the system data brought in by APIs 'ABC' and 'PQR'? All of these users would work on the same tables.

Upvotes: 5

Views: 12199

Answers (5)

singhswat

Reputation: 906

No. First find the deadlock victim; have a look at this article. In most cases a missing or badly designed index is what causes the deadlock...

If you can post your deadlock details, we can suggest the best possible solution.

Based on what you have asked, it is better to set the deadlock priority.
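
As a rough sketch of what that looks like (the surrounding batch is just illustrative), the less critical session marks itself as the preferred deadlock victim so the more important process survives:

-- Run this in the less critical session (e.g. a background batch job) so that,
-- if a deadlock occurs, SQL Server chooses this session as the victim.
SET DEADLOCK_PRIORITY LOW;

-- ... your batch / transaction here ...

-- Reset to the default afterwards if the session is reused.
SET DEADLOCK_PRIORITY NORMAL;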

Upvotes: 0

cloudsafe

Reputation: 2506

It may not be suitable in all cases, but we try to handle the processing logic in a stored procedure and use 'sp_getapplock' to prevent the procedure's transaction from being run by more than one caller at a time.
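
A sketch of that pattern (the procedure and resource names are made up for illustration); the application lock is taken inside the transaction, so only one caller runs the critical section at a time and the lock is released automatically on commit or rollback:

CREATE PROCEDURE dbo.ProcessIncomingData
AS
BEGIN
    SET NOCOUNT ON;
    BEGIN TRANSACTION;

    DECLARE @lockResult int;
    -- Serialize callers on an arbitrary resource name.
    EXEC @lockResult = sp_getapplock
        @Resource    = 'ProcessIncomingData',
        @LockMode    = 'Exclusive',
        @LockOwner   = 'Transaction',
        @LockTimeout = 10000;          -- wait up to 10 seconds

    IF @lockResult < 0                 -- negative return values mean the lock was not granted
    BEGIN
        ROLLBACK TRANSACTION;
        RAISERROR ('Could not acquire application lock.', 16, 1);
        RETURN;
    END

    -- ... processing logic against the shared tables goes here ...

    COMMIT TRANSACTION;
END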

Upvotes: 0

Duc Filan

Reputation: 7157

You can add a rowversion (timestamp) column to the table to help maintain the integrity of the database when multiple users are updating rows at the same time. It also lets you know how many rows, and which rows, were updated without re-querying the table.

CREATE TABLE MyTest (myKey int PRIMARY KEY, myValue int, RV rowversion);  

You can then use the following sample Transact-SQL statements to implement optimistic concurrency control on the MyTest table during the update.

DECLARE @t TABLE (myKey int);  
UPDATE MyTest  
SET myValue = 2  
    OUTPUT inserted.myKey INTO @t(myKey)   
WHERE myKey = 1   
    AND RV = [row-version-value];  -- the rowversion value previously read for this row
IF (SELECT COUNT(*) FROM @t) = 0  
    BEGIN  
        RAISERROR ('error changing row with myKey = %d'  
            ,16 -- Severity.  
            ,1 -- State   
            ,1) -- myKey that was changed   
    END;  
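
A typical usage pattern, as a sketch (the variable names here are just illustrative), is to read the row's current RV value first and pass it back in on the update:

DECLARE @rv binary(8), @currentValue int;

-- Read the row together with its current row version.
SELECT @currentValue = myValue, @rv = RV
FROM MyTest
WHERE myKey = 1;

-- ... application works with @currentValue ...

-- The update only succeeds if nobody else changed the row in the meantime.
UPDATE MyTest
SET myValue = 2
WHERE myKey = 1
  AND RV = @rv;

IF @@ROWCOUNT = 0
    RAISERROR ('Row with myKey = 1 was changed by another session.', 16, 1);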

Upvotes: 0

Ryan Gadsdon

Reputation: 2378

I guess that might seem to help because you would have different users running different processes, but it wouldn't really fix a deadlock problem. A deadlock is where two entities access the same pieces of data, the data gets blocked, and neither can finish its transaction. It's a catch-22 situation where both are waiting for the other to finish, but neither can. Creating different users for different processes might reduce some contention, but it isn't really practical as a fix.

[Deadlock diagram]

A normal approach/best practice is to program the system to acquire locks in a consistent order whenever entities access the same resources. This prevents transactions from falling into a deadlock scenario: if one transaction is using a piece of data, another transaction trying to access it is simply forced to wait for the first to finish before it can proceed.
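
A minimal sketch of what a consistent locking order looks like (TableA, TableB and their columns are hypothetical): because both transactions touch TableA before TableB, one just waits for the other instead of each holding a lock the other needs.

-- Process 1: always touch TableA first, then TableB
BEGIN TRANSACTION;
    UPDATE dbo.TableA SET Col1 = Col1 + 1 WHERE Id = 1;
    UPDATE dbo.TableB SET Col1 = Col1 + 1 WHERE Id = 1;
COMMIT TRANSACTION;

-- Process 2: same order, so it blocks on TableA instead of deadlocking
BEGIN TRANSACTION;
    UPDATE dbo.TableA SET Col1 = Col1 - 1 WHERE Id = 1;
    UPDATE dbo.TableB SET Col1 = Col1 - 1 WHERE Id = 1;
COMMIT TRANSACTION;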

Upvotes: 1

M.Ali

Reputation: 69554

Deadlocks happen because different sessions fight over the same resources (tables, indexes, rows, etc.). SQL Server doesn't care who owns the sessions; it can be the same user with multiple sessions or several different users. So creating multiple users solely to avoid deadlocks isn't going to help.

Things that can help:

  1. Access objects in the same order.
  2. Avoid user interaction in transactions.
  3. Keep transactions short and in one batch.
  4. Use a lower isolation level (with caution).
  5. Use a row versioning-based isolation level.
  6. Set the READ_COMMITTED_SNAPSHOT database option ON to enable read-committed transactions to use row versioning (see the sketch after this list).
  7. Use snapshot isolation if possible (be aware it will hammer the hell out of your tempdb).
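
As a sketch of points 5-7 (replace YourDb and the table name with your own), row versioning is switched on at the database level and snapshot isolation is then requested per session:

-- Enable read-committed snapshot so ordinary READ COMMITTED readers use row versions
-- instead of shared locks (changing this requires brief exclusive access to the database).
ALTER DATABASE YourDb SET READ_COMMITTED_SNAPSHOT ON;

-- Allow snapshot isolation to be requested explicitly.
ALTER DATABASE YourDb SET ALLOW_SNAPSHOT_ISOLATION ON;

-- In a session that wants a consistent view without blocking writers:
SET TRANSACTION ISOLATION LEVEL SNAPSHOT;
BEGIN TRANSACTION;
    SELECT * FROM dbo.SomeSharedTable;  -- reads a row version, takes no shared locks
COMMIT TRANSACTION;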

Have a look at this article: Minimizing Deadlocks

Upvotes: 5
