Reputation: 9643
I'm regenerating a Redis database in a Sidekiq job. The thing is, it takes a lot of time (2-3 minutes) to finish, and the job runs quite often.
In addition, this Redis instance is a master to many slave instances, and the way I'm regenerating it is basically a flush followed by regeneration. When that happens, all the slave instances copy the flushed DB, so during that 2-3 minute window the slaves serve empty data.
How can I preserve a Redis DB while it is being regenerated in Rails, so that the slaves won't copy a flushed DB?
The code below is the method I currently use:
class PlacementsGeneratorJob < ApplicationJob
  queue_as :high_priority

  def perform(*args)
    redis = Redis.new url: ENV['PLACEMENTS_STORE_URL']
    redis.flushdb
    redis.hset '_default_', 'excluded_ua', Settings.default.excluded_ua.to_json
    Campaign.all.each do |campaign|
      redis.hset '_campaigns_', campaign.id, campaign.settings['search_engines'].to_json
    end
  end
end
Upvotes: 0
Views: 86
Reputation: 46419
I assume you are flushing because it's easier to regenerate from scratch than to figure out the incremental transformation. Obviously it would be best if you could transform via updates, but if you can't, you could regenerate into a separate store and then transfer a dump of it into the live database. This lets you keep the database online the whole time, with only a short period of read-only access.
i.e.:
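A minimal sketch of that idea (not the original answer's code): regenerate into a scratch Redis database, then copy each key's dump into the live DB with DUMP/RESTORE. Using DB 1 as the scratch area is an assumption, as is that PLACEMENTS_STORE_URL doesn't already pin a DB index. RESTORE is replicated like any other write, so the slaves are updated key by key and never see an empty database.

class PlacementsGeneratorJob < ApplicationJob
  queue_as :high_priority

  def perform(*args)
    url     = ENV['PLACEMENTS_STORE_URL']
    live    = Redis.new(url: url)         # the DB the slaves replicate
    scratch = Redis.new(url: url, db: 1)  # assumption: DB 1 is free to use as a staging area

    # Rebuild everything in the scratch DB; the live DB keeps serving the old data meanwhile.
    scratch.flushdb
    scratch.hset '_default_', 'excluded_ua', Settings.default.excluded_ua.to_json
    Campaign.all.each do |campaign|
      scratch.hset '_campaigns_', campaign.id, campaign.settings['search_engines'].to_json
    end

    # Transfer the dump of every regenerated key into the live DB, overwriting old values
    # (replace: true needs a redis-rb version that supports the REPLACE option).
    new_keys = scratch.keys('*')
    new_keys.each do |key|
      live.restore(key, 0, scratch.dump(key), replace: true)
    end

    # Drop keys that no longer exist in the regenerated set.
    (live.keys('*') - new_keys).each { |key| live.del(key) }
  end
end

KEYS is acceptable here because the dataset is only a handful of hashes; on a large DB you would iterate with SCAN instead. On Redis 4.0+ another option is to build the new data in the scratch DB and swap the two databases atomically with SWAPDB (redis-rb exposes swapdb); SWAPDB is replicated, so the slaves swap as well.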
Upvotes: 1