Cache part of a page?

This is a broad question, so I'm open to all kinds of suggestions.

I’m converting a site from WP to Kirby, making use of the licenses I’ve bought.

The slow part of my site looks like this:

It’s quite static. It won’t change much over time, but the rest of the site will. Therefore I came up with the idea of caching just this part of the page.

Why it’s slow

I’m parsing a large blueprint, and that takes time. The parsed data is then used to display the table above.

To clarify what I’m after, here are some sub-questions:

  • Is it a good idea to cache just a part of a page?
  • Would it be best to cache as html or some kind of query cache?
  • How would you solve it?

You can use the cache class to cache the data you parsed from the blueprint (this code belongs in a controller):

$data = cache::get('cachekey');
if(!$data) {
  // get data somehow
  $data = somefunction();
  
  // cache it
  cache::set('cachekey', $data);
}

return compact('data');

Please note that you need to enable the cache so that Kirby can store the data in the site/cache directory. You could however use c::set('cache.ignore', ['*']); to disable the page cache while still using the data cache.


I just create a file in site/cache myself and read the data (JSON in my case) back when needed, so I don’t need to enable the cache for the complete site.

$data = null;
$cacheFile = kirby()->roots()->cache().DS.md5('cache-for-something').'.json';
if(!f::exists($cacheFile) || time() - f::modified($cacheFile) > c::get('cache.refresh', 360)) { // seconds
    $data = []; // get data somehow
    f::write($cacheFile, json_encode($data));
} else {
    $data = json_decode(f::read($cacheFile), true);
}

If it’s slow and almost static: yes, cache it. But beware the root of evil. :wink:

I identify time sinks in my code on localhost using Xdebug timestamps. Things pop up like: it’s not a good idea to run $site->index()->filterBy() on every page load if your content folder has 10k+ pages. Just iterating with a for loop and storing the resulting URIs (a result that stays static for 1–2 hours) in a cache was way faster.
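For what it’s worth, that URI-caching approach could look roughly like this in a Kirby 2 controller. This is only a sketch: the template name 'project' and the cache key are made up, and I’m assuming cache::set()’s third parameter (expiry in minutes) from the Kirby 2 cache class.

```php
// Sketch: cache the URIs of matching pages instead of filtering
// the full $site->index() on every request.
$uris = cache::get('project-uris');

if(!$uris) {
  $uris = array();
  foreach($site->index() as $p) {
    // 'project' is a hypothetical template name
    if($p->template() == 'project') $uris[] = $p->uri();
  }
  // expire after 120 minutes (the result is static for 1-2h anyway)
  cache::set('project-uris', $uris, 120);
}

// rebuild page objects from the cached URIs
$projects = array_map('page', $uris);
```

Rebuilding the page objects with the page() helper is cheap compared to walking the whole index.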

@lukasbestle I made a quick test in a snippet just to see how it works, and it does. At first I wondered how to clear the cache for a given cache key, but then I realized I only need to set it again to overwrite it. I might reset the cache once a week, or whenever something else triggers it.

I also like the built in approach.

@bnomei Thanks! If I run into trouble with the built-in solution, yours is probably a neat alternative.

root of evil

Yes, I always keep optimization in mind when coding. In this case I could make it 5 times faster, but even then it would still be too slow. I know the parsing is the bottleneck, and I still need the parsing.

I have plans to move from my own blueprint plugin to GitHub - AugustMiller/kirby-architect: 📐 Easily reference Blueprint data from anywhere in your Kirby application. by @AugustMiller in the future, if some features are implemented (support for blueprints and definitions set via the registry). It has some kind of cache as well, and I need both.

Thanks! :slight_smile:

Just for completeness, this is my final result:

Config

c::set('cache', true);
c::set('cache.ignore', ['*']);

Controller

<?php
return function($site, $pages, $page) {
  $cache = cache::get($page->slug()); //My slugs are unique
  if( ! $cache ) {
    $cache = array(
      'foo' => 'bar',
    );
    cache::set($page->slug(), $cache);
  }
  return array(
    'cache' => $cache,
  );
};

Template/snippet

echo $cache['foo'];

Cache expiration is always difficult. Your current solution won’t update the cache once the original data has changed. So you either need to clear the cache manually to rebuild the information or use the expiration time param for cache::set(). Best would be to check whether the blueprint file has changed (filemtime()), but if it doesn’t change often, one of the other approaches would perform better.
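The filemtime() check could be sketched like this. The blueprint path and the parseBlueprint() helper are hypothetical stand-ins for whatever parsing routine is actually in use; the idea is just to store the mtime alongside the parsed data.

```php
// Sketch: store the blueprint's mtime next to the parsed data and
// rebuild the cache whenever the file on disk is newer.
$blueprint = kirby()->roots()->blueprints().DS.'mypage.yml'; // hypothetical path
$mtime     = f::modified($blueprint);

$cached = cache::get('blueprint-data');
if(!$cached || $cached['mtime'] < $mtime) {
  $cached = array(
    'mtime' => $mtime,
    'data'  => parseBlueprint($blueprint) // your own parsing routine
  );
  cache::set('blueprint-data', $cached);
}

$data = $cached['data'];
```

This way the cache never serves stale data after a blueprint edit, at the cost of one stat() call per request.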

Your current solution won’t update the cache

Yes, I know. I’ll probably compare dates/timestamps, or trigger a “delete cache file” action when new data is created.

Or use a cronjob to update this (and maybe other future caches) on a fixed schedule, like once every night.
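A nightly cron entry for that could look like this (the path is hypothetical; adjust it to your install). It simply deletes the cached JSON files so they get rebuilt on the next request:

```shell
# Sketch of a crontab entry: wipe the cached JSON files in site/cache
# every night at 03:00 so they are rebuilt on the next page view.
0 3 * * * rm -f /var/www/mysite/site/cache/*.json
```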