Reputation: 32190
I have a page which renders a lot of partials. I fragment cache them all, which makes it very fast. Hooray!
The thing is, because of the number of partials, the first run, when the cache is being written, takes so long that the request times out (subsequent requests are really fast).
I also use sidekiq (but the question is relevant to any background processor)
Is there a way to write those partials to the cache in a background process so that users who miss the cache (due to expiration) won't hit a timeout? That is, I would go over all the partials and recache those whose cache has expired (or is about to expire).
Upvotes: 5
Views: 635
Reputation: 1447
I was working on a project and had a similar problem. Actually, it was a problem with only one page, which loaded slowly right after the cache was cleared. I solved it another way (I didn't have anything like Sidekiq, so it may not be the right solution for you, but maybe it will be helpful).
What I did, right after clearing the cache, was call the open()
method with the problematic URL as a parameter, like:
open('http://my-website/some-url')
So, after clearing the cache, that URL was requested and the cache was recreated automatically. We solved the problem quickly that way. I know that a background worker would be a better solution, but this worked for me.
Just to note, our cache was cleared by cron, not manually.
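To flesh out that idea a bit: a minimal sketch of a warm-up step you could run from cron right after clearing the cache. The URL is a placeholder, and the `fetcher` parameter is my own addition so the logic can be exercised without network access; by default it does a real GET via open-uri (`URI.open` is the non-deprecated form of `open` for URLs).

```ruby
require 'open-uri'

# Placeholder list of the pages whose fragment caches should be rebuilt.
WARM_URLS = ['http://my-website/some-url'].freeze

# Re-warm the cache by requesting each page: the first hit renders the
# partials and writes the fragments, so real users get cache hits.
# `fetcher` is injectable purely for testability (an assumption of this
# sketch, not part of the original answer).
def warm_cache(urls, fetcher: ->(url) { URI.open(url).read })
  urls.each do |url|
    fetcher.call(url)
  rescue StandardError => e
    # A failed warm-up shouldn't abort the rest of the list.
    warn "cache warm-up failed for #{url}: #{e.message}"
  end
end
```

Run `warm_cache(WARM_URLS)` as the last step of the cron job that expires the cache.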
UPDATE
Or, if you want to clear the cache manually, you can call open('http://my-website/some-url') after clearing it,
but from a Sidekiq job (I didn't try this; it's only an idea).
Of course, my problem was with only one page; if you want to cover the whole website, it makes things more complicated.
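To illustrate the "do it in the background" part of that idea without depending on any particular job library, here is a sketch using a plain worker thread and a queue. In a real app you would replace `WarmupQueue` (a hypothetical name) with a Sidekiq worker whose `perform(url)` does the same GET; the `fetcher` lambda is again an assumption added for testability.

```ruby
require 'open-uri'

# Stand-in for a background job system: enqueue a URL, and a worker
# thread performs the warm-up GET off the request path, so the user who
# triggered the cache clear never waits on the re-render.
class WarmupQueue
  def initialize(fetcher: ->(url) { URI.open(url).read })
    @queue = Queue.new
    @worker = Thread.new do
      # A nil sentinel tells the worker to shut down.
      while (url = @queue.pop)
        begin
          fetcher.call(url) # rendering the page rewrites the expired fragments
        rescue StandardError => e
          warn "warm-up failed for #{url}: #{e.message}"
        end
      end
    end
  end

  def enqueue(url)
    @queue << url
  end

  # Block until all queued warm-ups have been processed.
  def drain
    @queue << nil
    @worker.join
  end
end
```

With Sidekiq, `enqueue` would become `MyWarmupWorker.perform_async(url)` and Sidekiq's own process would play the role of the worker thread.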
Upvotes: 0
Reputation: 17991
I only know of the preheat gem, but I think it is still not sophisticated enough for my needs. Plus, it hasn't been maintained in ages.
Upvotes: 0