Reputation: 5931
So I've got a class Client that has_many :transactions. Both models have monetized fields (money-rails gem). In class Transaction I've got after_create :add_customer_balance. It should add transaction.amount to the balance of transaction.client.
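Roughly, the setup looks like this (a sketch; the balance_cents / amount_cents column names are assumptions based on how money-rails stores monetized fields):
class Client < ActiveRecord::Base
  has_many :transactions
  monetize :balance_cents          # exposes client.balance as a Money object
end

class Transaction < ActiveRecord::Base
  belongs_to :client
  monetize :amount_cents
  after_create :add_customer_balance

  private

  # naive read-modify-write version, which is where the race comes from
  def add_customer_balance
    client.balance += amount
    client.save
  end
end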
The problem I'm facing is the situation where 2 transactions are created at the same moment. Let's have a look at this situation:
Variant 1:
time / process / code
0:01 / P1 / client = Client.find(1)
0:01 / P2 / client = Client.find(1)
0:02 / P1 / client.balance += 100
0:02 / P1 / client.save # SQL: update clients set balance = 200 where id = 1
0:03 / P2 / client.balance += 200
0:03 / P2 / client.save # SQL: update clients set balance = 300 where id = 1
Variant 2:
time / process / code
0:01 / P1 / client = Client.find(1)
0:01 / P2 / client = Client.find(1)
0:02 / P1 / Client.where(id: 1).update_all(...) # SQL: update clients set balance = balance + 100 where id = 1
0:03 / P2 / Client.where(id: 1).update_all(...) # SQL: update clients set balance = balance + 200 where id = 1
Result:
Client.find(1).balance = 400
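In callback form, Variant 2 would be a single SQL statement, something like this (a sketch, again assuming a balance_cents column):
def add_customer_balance
  # the addition happens in the database, so nothing stale is read into Ruby first
  Client.where(id: client_id).update_all(["balance_cents = balance_cents + ?", amount_cents])
end
The catch is that update_all skips validations and callbacks and doesn't refresh the in-memory client object.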
My question is: how do I prevent the first situation?
I'm looking for a solution that increments the balance field and immediately saves it to the database.
I tried increment! but it doesn't seem to prevent the race condition:
def increment!(attribute, by = 1)
  increment(attribute, by).update_attribute(attribute, self[attribute])
end
Upvotes: 3
Views: 910
Reputation: 84114
A transaction on its own won't help you here. The save process is already wrapped in a transaction (the before_save and after_save callbacks plus the actual save), but even if you included the find in your transaction
Client.transaction do
  client = Client.find(1)
  client.balance += 100
  client.save
end
you would still be at risk. It's easy to see this by adding a call to sleep of random duration between the find and the save. When the save executes, an exclusive lock will be acquired on the row. This would block calls to find occurring in other transactions (so they would only see the post-save value), but if the client row has already been retrieved elsewhere it won't force that copy to reload.
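For example, something like this (just a sketch to make the race easy to reproduce; run it in two consoles at roughly the same time):
Client.transaction do
  client = Client.find(1)
  sleep rand(5)          # widen the window between the read and the write
  client.balance += 100
  client.save!
end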
There are 2 common approaches to this sort of problem: pessimistic locking and optimistic locking.
The first, pessimistic locking, looks like this:
Client.transaction do
  client = Client.lock.find(1)
  client.balance += 100
  client.save
end
What this does is lock the row at the point of retrieval - any other attempt to call find with a lock on that client will block until the end of the transaction. It's called pessimistic because even though the risk of a collision is low, you assume the worst case and lock every time. There is a performance penalty, since it blocks all attempts to read that row, even ones that weren't going to do an update. It's still the case that if this runs in parallel with
client = Client.find(1) #no call to lock here!
#some lengthy process
client.balance += 1
client.save
then you'll end up with bad data: the entire find-lock-update process could happen in the gap between when that row was fetched and when it is updated. Therefore all of the places where you update the balance would need to use lock.
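Applied to the callback from the question, that might look something like this (a sketch; it assumes the Transaction model holds the amount and belongs_to :client):
class Transaction < ActiveRecord::Base
  belongs_to :client
  after_create :add_customer_balance

  private

  # with_lock opens a transaction, re-reads the client row with SELECT ... FOR UPDATE,
  # and yields; concurrent callbacks on the same client serialize at this point
  def add_customer_balance
    client.with_lock do
      client.balance += amount
      client.save!
    end
  end
end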
The second approach is optimistic locking: you add a lock_version column to your model (it must be an integer with a default of 0). Calls to save will then execute queries of the form
UPDATE clients set .... lock_version = 5 where id = 1 and lock_version = 4
With each save, lock_version is incremented by 1. If no rows are updated (ie there is a mismatch on the lock_version) then ActiveRecord::StaleObjectError is raised.
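Adding the column is an ordinary migration, for example (a sketch; only the clients table name comes from the question):
class AddLockVersionToClients < ActiveRecord::Migration
  def change
    # Rails' optimistic locking convention: an integer lock_version column defaulting to 0
    add_column :clients, :lock_version, :integer, default: 0, null: false
  end
end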
Applying this to your example
0:01 / P1 / client = Client.find(1) #lock_version is 1
0:01 / P2 / client = Client.find(1) #lock_version is 1
0:02 / P1 / client.balance += 100
0:02 / P1 / client.save # update clients
# set balance = 200, lock_version = 2
# where id = 1 and lock_version = 1
0:03 / P2 / client.balance += 200
0:03 / P2 / client.save # update clients
# set balance = 300, lock_version = 2
# where id = 1 and lock_version = 1
The second update will match no rows, and so the exception is raised. At this point you should reload the client object and try again.
It's called optimistic because we assume that most of the time there won't be simultaneous updates: in the happy case the overhead is minimal. The downside is that any call to save can result in ActiveRecord::StaleObjectError, and it can be a bit of a pain handling all of those.
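One common way of handling it is a small reload-and-retry loop, along these lines (a sketch; the retry limit is arbitrary):
def add_customer_balance
  retries = 0
  begin
    client.reload                  # pick up the latest balance and lock_version
    client.balance += amount
    client.save!
  rescue ActiveRecord::StaleObjectError
    retries += 1
    retry if retries < 3           # give up after a few attempts instead of looping forever
    raise
  end
end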
The documentation for these is at http://api.rubyonrails.org/classes/ActiveRecord/Locking/Optimistic.html and http://api.rubyonrails.org/classes/ActiveRecord/Locking/Pessimistic.html
Upvotes: 5
Reputation: 10997
If I'm understanding your question correctly, this sounds like a textbook case for using transactions:
Transactions are protective blocks where SQL statements are only permanent if they can all succeed as one atomic action. The classic example is a transfer between two accounts where you can only have a deposit if the withdrawal succeeded and vice versa. Transactions enforce the integrity of the database and guard the data against program errors or database break-downs. So basically you should use transaction blocks whenever you have a number of statements that must be executed together or not at all.
Active Record supports transactions; you can read more about them here.
Here is the example from the documentation:
ActiveRecord::Base.transaction do
  david.withdrawal(100)
  mary.deposit(100)
end
In this case, if the withdrawal from david's account fails, the deposit to mary's account is not executed. Likewise, if the withdrawal succeeds and the deposit fails, the withdrawal is rolled back and no action is taken. Either everything works or nothing does, and it happens as an atomic operation - meaning nothing else can access the database before the transaction finishes (in either success or failure).
Upvotes: 2