Reputation: 351
I have the following relational DB:
class Search
  has_many :items, dependent: :destroy
end

class Item
  has_many :sub_items, dependent: :destroy
end

class SubItem
end
Items can range from 100 to 10,000 per search. SubItems average around 100 per item.
Trying to keep my memory low I used the following:
class Search
  before_destroy :destroy_items_in_batches

  def destroy_items_in_batches
    self.try(:items).try(:in_batches, of: 300).try(:destroy_all)
  end
end

class Item
  before_destroy :destroy_transactions_in_batches

  def destroy_transactions_in_batches
    self.try(:sub_items).try(:in_batches, of: 100).try(:destroy_all)
  end
end
It still wasn't enough, so I added a select into the query to reduce the memory loaded. Since I'm destroying the records anyway, all I care about are the IDs I need to remove from the database.
self.try(:items).try(:select, :id).try(:in_batches, of: 300).try(:destroy_all)
self.try(:sub_items).try(:select, :id).try(:in_batches, of: 100).try(:destroy_all)
Anything else I can do to reduce the memory usage? (Besides the obvious solution of decreasing the batch size.)
Upvotes: 0
Views: 384
Reputation: 9692
Using destroy_all actually instantiates each object in memory and runs each record's callbacks before deleting just that one row from the database. Very inefficient.
Instead, as suggested by Marc, you should use delete_all, in this order:
SubItem.where(conditions).delete_all
Item.where(conditions).delete_all
Search.where(conditions).delete_all
This will run one SQL statement per table, like DELETE FROM sub_items WHERE conditions... Three SQL statements instead of (potentially) hundreds or thousands. It should be a lot faster.
Upvotes: 1
Reputation: 1352
I would suggest that you replace destroy_all with delete_all. delete_all deletes the records directly in the database without loading them into memory or reading them at all.
One caveat is that no ActiveRecord callbacks will be run, so you need to deal with that yourself if necessary.
Upvotes: 1