Get all thumbs or other assets recursively

I’m trying to get a list of all thumb assets recursively, i.e. the entire thumbs folder and subfolders.
I only need the filenames.
I’m using the toolkit’s dir::read(kirby()->roots()->thumbs()) but that only lists the top level.
I know I could do some looping through the results, but as the toolkit has provided me with exactly-what-I-want magic for every other of my needs, I thought I’d check in and see if I’m missing a trick?

I might be wrong, but I think you need to go through the pages and access the images on those pages, which in turn will give you the matching thumbs. That might be a slow process if your site is large, so I'm hoping there's a better way.

What's your use case? What's the context of what you're trying to achieve?

Looking through the cheatsheet, I can't see a way to go deeper than the top level of the thumbs folder.

Hm, I don’t think this works without a loop:

$folder = new Folder(kirby()->roots()->thumbs());
foreach($folder->children() as $child) {
  foreach($child->files() as $file) {
    echo $file->filename();
  }
}

But this won’t work if you have more levels.

@jimbobrjames I don’t need to regenerate them, I can be sure that they exist as files already.
I do have a page that crawls all subpages, requesting thumb urls, but it is pretty slow as you say, and feels a bit redundant.
My use case is fairly unique: I'm making zip archives of all files.
As well as thumbs, I'll be doing the same for the static assets folder; the same principle applies.

@texnixe that'll do it, but the trouble is that the folder is nested as deep as the content, i.e. many levels!

Could you not use a recursive snippet here, like the tree menu does? It will just keep calling itself until it runs out of levels. That way you don't have to loop through each depth manually.
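In plain PHP, a recursive helper along those lines might look like this (a sketch; `listFilesRecursively()` is a made-up name, not a toolkit function):

```php
// Sketch: walk a folder and its subfolders, collecting filenames only.
// listFilesRecursively() is a hypothetical helper, not part of the toolkit.
function listFilesRecursively($path) {
    $filenames = [];
    foreach (scandir($path) as $entry) {
        if ($entry === '.' || $entry === '..') continue;
        $full = $path . DIRECTORY_SEPARATOR . $entry;
        if (is_dir($full)) {
            // keep calling itself until it runs out of levels
            $filenames = array_merge($filenames, listFilesRecursively($full));
        } else {
            $filenames[] = $entry;
        }
    }
    return $filenames;
}
```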

I’d use PHP functions in this case, from SO:

function getDirContents($path) {
    $rii = new RecursiveIteratorIterator(new RecursiveDirectoryIterator($path));

    $files = array();
    foreach ($rii as $file) {
        if (!$file->isDir()) {
            $files[] = $file->getFilename();
        }
    }

    return $files;
}

dump(getDirContents(kirby()->roots()->thumbs()));

Ah OK, so there is an actual PHP thing called RecursiveDirectoryIterator

$Directory = new RecursiveDirectoryIterator(kirby()->roots()->thumbs());
$Iterator  = new RecursiveIteratorIterator($Directory);
$filenames = [];
foreach ($Iterator as $name => $object) {
  $filenames[] = $name;
}

I've been having so much fun using the toolkit, I forgot about actual PHP!

@texnixe beat me to it.
But it looks so long and ugly compared to @bastianallgeier's style. I want dir::tree()!

But wait a minute… you say you're creating a zip file. Do you even need to get all the file names? Can't you just zip the thumbs folder with PHP?
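For instance, PHP's built-in ZipArchive can zip a whole folder recursively when combined with the iterators above. A sketch (`zipFolder()` is a hypothetical helper, not a Kirby or toolkit function):

```php
// Sketch: zip an entire folder recursively with PHP's ZipArchive.
// zipFolder() is a hypothetical helper, not part of Kirby or the toolkit.
function zipFolder($source, $zipPath) {
    $source = rtrim($source, '/');
    $zip = new ZipArchive();
    $zip->open($zipPath, ZipArchive::CREATE | ZipArchive::OVERWRITE);
    $iterator = new RecursiveIteratorIterator(
        new RecursiveDirectoryIterator($source, FilesystemIterator::SKIP_DOTS)
    );
    foreach ($iterator as $file) {
        if (!$file->isDir()) {
            // store entries with paths relative to the source folder
            $relative = substr($file->getPathname(), strlen($source) + 1);
            $zip->addFile($file->getPathname(), $relative);
        }
    }
    return $zip->close();
}
```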

Internally, the toolkit probably wouldn’t do anything else. Just hide the function away in a plugin file and you will have just a function call in your template.

@jimbobrjames er yeah, good point!
It’s because I was re-using a zip function that was expecting an array of paths, for instance to just zip all video content.
Maybe it’s time to stop for the night…

The folder class has a zip function, so

$folder = new Folder(kirby()->roots()->thumbs());
$folder->zip('path_to_zip_file');

should do the job if you don't need the zip to contain a flat list of files.

lol… never mind… the thumbs folder will always contain only images, so there's no need to filter or find anything. You're safe to just zip it directly, I think.

And so to :bed:

Just to add something - this might have been easier with a bash script. That way you can tie the script to a cron job so you get an automatic backup at a timed interval.

The zip terminal command has an exclude flag. For example:

Assuming your site is in a public folder.

To zip recursively but exclude certain folders:

zip -r myarchive.zip /path/to/public -x "/path/to/public/site/*" "/path/to/public/panel/*" "/path/to/public/kirby/*" 

To zip recursively but exclude certain file types:

zip -r myarchive.zip /path/to/public -x "*.txt" "*.php" "*.yml"

The two combined:

zip -r myarchive.zip /path/to/public -x "/path/to/public/site/*" "/path/to/public/panel/*" "/path/to/public/kirby/*" "*.txt" "*.php" "*.yml" 
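For the cron side, a crontab entry wrapping one of those commands might look like this (the schedule and paths are placeholders):

```
# Hypothetical crontab entry: run the backup every night at 02:00.
# Paths are placeholders; adjust to your server layout.
0 2 * * * zip -r /backups/myarchive.zip /path/to/public -x "*.php"
```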

You can even kick off bash scripts from PHP with shell_exec(), so in theory you can tie that to a panel button in a dashboard widget for on-demand archiving, as well as using the same script in a cron job.
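Calling the zip command from PHP could look like this (a sketch; `buildZipCommand()` is a made-up helper, and the paths are placeholders):

```php
// Sketch: build the zip command safely, then run it via shell_exec().
// buildZipCommand() is a hypothetical helper; paths are placeholders.
function buildZipCommand($source, $target) {
    // escapeshellarg() guards against spaces and shell metacharacters
    return sprintf('zip -r %s %s', escapeshellarg($target), escapeshellarg($source));
}

// e.g. from a dashboard widget:
// shell_exec(buildZipCommand(kirby()->roots()->thumbs(), '/path/to/backup.zip'));
```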

The above is untested but written based on the docs for the zip command. It should work.

What I'm asking myself is why you want to zip those thumbs in the first place. Thumbs are always regenerated from the originals if they are missing, so they are not the sort of files you would usually back up at all?

Well, for context, this isn’t a usual website scenario.
The final result is a web app that runs on iPads using the Kiosk Pro app - essentially a chromeless browser which can serve local files for offline use. But no PHP, of course. The web app (Angular) is essentially a JSON file of all the content.
I use Kirby on the online/server version, so that clients can update content. To transfer to the iPad, I currently use npm scripts to zip up all the files (different zips depending on file type and update frequency), including a static version of the JSON, which is the only active Kirby template. The iPad app listens for zip updates (that's a Kirby template too) and updates accordingly.
The main trouble with running npm/gulp on a cron schedule to create zips is that it's quite slow, seems memory-intensive on the server, and also means updates aren't triggered immediately. Whereas I figured Kirby has all the hooks and modification dates to do it only when content changes, and immediately. And, though not rigorously tested yet, it feels like it makes zip files much faster.
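For the hooks idea, a plugin snippet along these lines could trigger the rebuild right after a panel edit. This is only a sketch against Kirby 2's `kirby()->hook()` registration; `regenerateZips()` is a hypothetical stand-in for whatever rebuilds the archives, so it isn't runnable as-is:

```php
// Sketch: goes in a plugin file. regenerateZips() is a hypothetical
// helper standing in for whatever rebuilds the zip archives.
kirby()->hook('panel.page.update', function($page) {
  regenerateZips();
});
```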

So there you go, that’s how I use Kirby to build iPad apps!