Speed up flushing of file cache

I have a big site with the file cache enabled. When there isn’t much editing on the website via the Panel, the cache can grow quite big. As a result, the cache flush that happens when saving a page in the Panel takes too long and times out, without having flushed the cache.

Emptying that folder over SSH takes forever too.

Has anyone experienced this before who can help me out, please? Thanks!

Do you SSH directly into the server or go via an FTP client? It should be faster with a simple delete command on the server.

That won’t solve the problem with the cache flush when someone saves a page, though. :thinking:

SSH via terminal. Listing the cache directory via FTP takes forever too. It’s a really big cache…

Not really run into this myself, but I know you can flush the cache programmatically with cache::flush(); shouldn’t be too hard to make a little dashboard widget & button to flush on demand rather than waiting for a page to be saved. You could probably do it on an admin-only route as well, just hit the URL in the browser to flush.

I don’t know a huge amount about the caching… when you save a page, does it rebuild the cache for that specific page or for every page? I’m guessing every page, since it’s taking ages to complete.

I wonder if increasing the amount of RAM allocated to PHP will help.

Hm, but if it times out in the Panel, the command itself will also time out.

No, the cache is only rebuilt for single pages when they are opened in the browser.

My thinking is that you’re splitting the job in two this way. If you flush the cache via the button, then save the page, it’s doing the job in two separate steps rather than a flush and rebuild in one go.

Adding a button to flush the cache kinda defeats its purpose, no? I’m afraid pushing that button will time out too, since cache::flush() is used by the Panel to flush the cache too…

I’m thinking the problem has something to do with too many files in a single folder?

But as I said, it doesn’t rebuild, so that won’t really help.

Can you increase the memory_limit?
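For reference, that would be a php.ini change along these lines (the value is just an example — pick one that fits your server):

```
; php.ini — raise the per-request memory ceiling (example value)
memory_limit = 512M
```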

Maybe use custom caching…

@bnomei As far as I’m aware you have built big sites? What is your solution to caching?

I now flush the cache when this problem occurs by SSH-ing into the web server and deleting everything in the cache folder manually via find -delete, which is the fastest way to do it via a command in my experience. But it still takes a lot of time…
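For anyone searching later, this is roughly the command I mean — a sketch that assumes the cache lives in site/cache (adjust the path for your setup):

```shell
CACHE=site/cache            # assumption: adjust to your cache path
mkdir -p "$CACHE"           # make sure the directory exists for the demo
# -type f deletes only files, keeping the directory tree and its
# permissions intact; -delete avoids spawning one `rm` process per file.
find "$CACHE" -type f -delete
```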

It’s an annoying problem, since it basically renders the Panel useless when the cache is in this bloated state.

Maybe you can run a cron job that flushes the cache once in a while, so it doesn’t get that huge in the first place? Maybe once a month at night?
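A sketch of what that crontab entry could look like (the path and schedule are assumptions — run crontab -e and adjust):

```
# flush the file cache at 03:30 on the first day of each month
30 3 1 * * find /var/www/site/cache -type f -delete
```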

It’s not really a php memory issue AFAIK. Deleting the cache folder contents takes a lot of time via SSH as well, so a timeout can’t be avoided.

Maybe that’s an option, but it kinda defeats having a cache mechanism, right?

Well, yes, but if the whole cache gets deleted anyway when a page is updated… it doesn’t make such a big difference. And it only affects the first visitor on a page. Maybe not ideal, but then this whole page cache is not ideal.

I would think it’s actually much faster to just nuke the cache folder from the terminal and recreate the empty cache folder with the right permissions immediately after, rather than getting it to dig through the contents.
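Something like this sketch — the path and permissions are assumptions for your setup:

```shell
CACHE=site/cache             # assumption: adjust to your cache path
mkdir -p "$CACHE"            # demo: make sure the folder exists
touch "$CACHE/demo.cache"    # demo file so there is something to delete
mv "$CACHE" "$CACHE.old"     # renaming is instant, even for a huge tree
mkdir -p "$CACHE"            # site immediately gets a fresh empty cache
chmod 755 "$CACHE"           # assumption: match your server's permissions
rm -rf "$CACHE.old" &        # slow recursive delete runs in the background
```

That way only the rename happens in the foreground; nothing ever has to wait on the actual delete.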

I am interested in this approach too tbh. I’ve already asked this here a while ago: Cache setup tips

I haven’t tested this, but if it is faster, then it should be the default Kirby behaviour imho :wink:

@texnixe, I have even tried setting up memcached, but I can’t get it working somehow. Do you have any experience with this? Any tips to debug it, e.g. how can I see what’s inside the memcached “database”?

Well, you can actually run bash scripts from PHP, so if there’s a way to stop the default process of cache flushing, you could run a bash script from a panel hook on page save to nuke & recreate the folder instead (I think… you would need to do it before save, if there’s an appropriate hook for that). I have a feeling that might mean tinkering with the source code, though.