Caching and limiting external API JSON requests

I want to show the current weather at the client's office. I adapted a solution found on this excellent forum. One thing I am struggling to get my head around is how to limit how often the external data is requested. Is there a built-in caching API I can use, or should I look at caching everything in Kirby and then setting up a cron job to clear the cache every hour?

<?php
$data = [];
$url = 'http://api.openweathermap.org/data/2.5/weather?q=London&units=metric&APPID=1234';
$request = remote::request($url);
if (!empty($request->content)) {
    $data = json_decode($request->content, true);
}
?>
<?php if (!empty($data) && $data['cod'] == 200): ?>
    <?= round($data['main']['temp']) ?>°C
    <?= $data['weather'][0]['description'] ?>
    <?= $data['wind']['speed'] ?>
<?php else: ?>
    <?= $data['message'] ?? 'Weather data is currently unavailable' ?>
<?php endif ?>

There are several ways to solve this:

1. Save it to a file

See this post to get an idea of how to write data to files
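The basic idea can be sketched in plain PHP without any Kirby helpers: only re-request the URL when the cached file is missing or older than a time-to-live. The function name, cache path, and TTL below are my own choices for illustration, not something from the linked post:

```php
<?php
// Fetch $url at most once per $ttl seconds, caching the raw JSON in $cacheFile.
// cachedFetch(), $cacheFile and $ttl are illustrative, not from the original post.
function cachedFetch(string $url, string $cacheFile, int $ttl = 3600): ?string
{
    $stale = !file_exists($cacheFile) || time() - filemtime($cacheFile) > $ttl;

    if ($stale) {
        $json = @file_get_contents($url);
        if ($json !== false) {
            file_put_contents($cacheFile, $json);
            return $json;
        }
        // Request failed: fall through and serve the stale copy if one exists.
    }

    return file_exists($cacheFile) ? file_get_contents($cacheFile) : null;
}
```

Serving the stale copy on a failed request is a deliberate choice here: a slightly outdated temperature is usually better than an error message on the page.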

2. Use the Kirby cache

See the post above in the same thread.
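For reference, a rough sketch of what this could look like on Kirby 3, where cache instances expose `get()` and `set($key, $value, $minutes)`. The cache name `weather`, the key `london`, and the 60-minute lifetime are my assumptions, not from that thread:

```php
<?php
// Sketch, assuming Kirby 3 and a cache named 'weather' available via kirby()->cache().
// Cache name, key and lifetime are assumptions for illustration.
$cache = kirby()->cache('weather');
$data  = $cache->get('london');

if ($data === null) {
    // Nothing cached (or expired): make the external request once.
    $response = Remote::get('http://api.openweathermap.org/data/2.5/weather?q=London&units=metric&APPID=1234');
    $data     = json_decode($response->content(), true);
    $cache->set('london', $data, 60); // keep for 60 minutes
}
```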

3. Save it to a custom site field

This is something I did a while ago. You need a textarea field to store the data and another field to store the timestamp of the last update. Then you check whether the data should be read from the field or whether a new request should be made.
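A hedged sketch of that field-based approach, Kirby 2 style. The field names `weatherdata` and `weatherupdated` are my invention and would need matching fields in `site.txt`:

```php
<?php
// Assumption: site.txt has 'weatherdata' and 'weatherupdated' fields.
$site   = site();
$maxAge = 3600; // refresh at most once per hour

if ($site->weatherupdated()->isEmpty() || time() - (int)$site->weatherupdated()->value() > $maxAge) {
    $json = file_get_contents('http://api.openweathermap.org/data/2.5/weather?q=London&units=metric&APPID=1234');
    if ($json !== false) {
        // Store the raw JSON and the fetch time back into the site fields.
        $site->update([
            'weatherdata'    => $json,
            'weatherupdated' => time(),
        ]);
    }
}

$data = json_decode($site->weatherdata()->value(), true);
```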

The following file-based variant seems to work:

$data_ai   = null;
$cacheFile = kirby()->roots()->cache() . DS . md5('weather_cache') . '.json';

// Refresh when the cache file is missing or older than cache.refresh seconds (default: 1 hour)
if (!f::exists($cacheFile) || time() - f::modified($cacheFile) > c::get('cache.refresh', 3600)) {
    $url  = 'http://api.openweathermap.org/data/2.5/forecast?lat=43.834591&lon=4.360860&lang=fr&units=metric&APPID=cd9fa9a6628053cf11ee83972f73b&cnt=5';
    $data = file_get_contents($url);
    if ($data !== false) {
        $data_ai = json_decode($data);
        f::write($cacheFile, $data);
    }
} else {
    $data_ai = json_decode(f::read($cacheFile));
}

and then read values from the resulting stdClass like

$data_ai->list[0]->weather[0]->description