Fragment caching with background worker

I have a page with a lot of partials. I fragment-cache them all, which makes it very fast. Hooray!

The thing is, because of the number of partials, the first request that has to write the cache takes so long that it times out (all other requests are very fast).

I am also using Sidekiq (but the question applies to any background worker).

Is there a way to keep these partial caches warm from a background process, so that users who hit a cache miss (due to expiration) don't get a timeout? That is, go through all the partials and, for the ones whose cache has expired (or will expire soon), re-render them?
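To make the idea concrete, here is a minimal sketch of the kind of background warming I have in mind (assuming Rails 5+ for ApplicationController.renderer, and Sidekiq; the Product model and the products/product partial are just placeholders for my real partials):

# Hypothetical sketch: a Sidekiq worker that re-renders the partials in the
# background so their fragment caches are already warm when a user arrives.
class WarmFragmentCachesWorker
  include Sidekiq::Worker

  def perform
    Product.find_each do |product|
      # Rendering the partial outside a request should populate the same
      # fragment cache entry that the cache block in the view writes,
      # provided caching is enabled in this environment.
      ApplicationController.renderer.render(
        partial: "products/product",
        locals: { product: product }
      )
    end
  end
end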

+3




2 answers


I was working on a project and had a similar problem. In my case it was only an issue on one page, and only right after the cache was cleared. I solved it in a different way (I didn't have anything like Sidekiq, so it might not be the right solution for you, but it might still be helpful).

What I did was call open() right after clearing the cache, passing the problematic URL as a parameter:

open('http://my-website/some-url')

      

So, after the cache is cleared, this URL is requested and a fresh cache is created automatically. That resolved the issue quickly. I know a background worker would be a better solution, but this worked for me.



Just to note that our cache was cleared by a cron job, not manually.
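For illustration, a minimal sketch of that setup, assuming the cache is cleared from a rake task driven by cron (the task name and URL are placeholders, not the original code):

require 'open-uri'

namespace :cache do
  desc "Clear the cache and immediately warm it again"
  task warm: :environment do
    Rails.cache.clear
    # Requesting the page re-renders every fragment and repopulates the cache.
    open('http://my-website/some-url').read
  end
end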

UPDATE

Or, if you want to clear the cache manually, you could follow the cache clearing with a call to open('http://my-website/some-url'), but from a Sidekiq job (I haven't tried this, it's just an idea).
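A rough sketch of that Sidekiq variant could look like this (untested, the class name and URL are placeholders; enqueue the worker right after expiring the cache):

require 'open-uri'

class WarmPageCacheWorker
  include Sidekiq::Worker

  def perform(url = 'http://my-website/some-url')
    # Hit the page so its fragment caches are rebuilt before a real user asks for it.
    open(url).read
  end
end

# e.g. right after expiring the cache:
# WarmPageCacheWorker.perform_async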

Of course, my problem was just a single page; if you want to do this for the whole site, things get more complicated.

0




The only thing I know of is the preheat gem, but I don't think it's sophisticated enough for my needs. Plus, it hasn't been maintained in ages.



0

