From time to time, we've had issues with image thumbnails not being generated correctly. This is most likely a server issue (timeout, memory, whatever): the final image has the proper ratio and size but is only partly rendered, e.g. only the upper quarter or upper half, and the rest is grey. Usually, on the next load, everything is fine again.
With caching enabled, though, Kirby stores this broken image.
Is there a way for Kirby to determine if thumb generation was actually successful before generating the cache file?
What thumb driver are you using? Maybe switching to IM will clear up the issue. You would have to check that the file is valid, and the problem is that most images, particularly JPEGs, can still be read even if half the file is missing. As long as the header is intact, the decoder will read as much as it can. So the trick is deciding how corrupt is corrupt; a successful generation may not equate to a good generation.
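For reference, switching should just be a config change, assuming Kirby 3's option names and that the `convert` binary is available on your server:

```php
<?php
// site/config/config.php
// Switch the thumb driver from GD (the default) to ImageMagick.
// Assumes Kirby 3 config syntax and an installed `convert` binary.
return [
    'thumbs' => [
        'driver' => 'im',
    ],
];
```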
We haven't set anything specifically, so I guess we use GD by default.
Is ImageMagick a drop-in replacement, or do I have to keep any side effects in mind when switching?
That sounds like there is nothing I can really do then, right? Because the images are served, the headers must be fine.
You need it installed on the server, but it usually is. GD is more common, but in my experience with image-heavy sites, the results are better with IM.
Yes, I'm afraid that's probably the case. The JPEG format is actually pretty forgiving (it's how criminals get caught, even after they have tried to shred their hard drives). You can even hide files inside JPEGs and still have the image read back just fine in a graphics program without revealing the secret. It's called steganography.
Are the images too large in width and height? Maybe your server (shared hosting etc.) does not allow you enough memory to work on the image? Both GD and IM work with uncompressed data; try saving the image as TIFF to get a rough estimate of a file's memory requirement.
Also try running the image through an optimizer like ImageOptim before thumb creation to remove fancy color profiles.
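As a rough sanity check (the bytes-per-pixel and overhead figures are my assumptions, not exact numbers), you can compare the estimated decoded size against the PHP memory limit:

```php
<?php
// Rough estimate of the memory needed to decode an image for resizing.
// Assumes ~4 bytes per pixel (RGBA) plus some overhead; actual usage varies
// between GD and ImageMagick and between versions.
$file = '/path/to/photo.jpg'; // hypothetical example path

[$width, $height] = getimagesize($file);
$estimatedMb = $width * $height * 4 * 1.8 / 1024 / 1024;

printf(
    "~%d MB needed to decode, memory_limit is %s\n",
    round($estimatedMb),
    ini_get('memory_limit')
);
```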
Yes, the images are large but optimized with ImageOptim already.
We raised the memory limit already and so far everything is fine. I just hoped there was a way to exclude broken images from the cache (which doesn't seem to be the case).
I'm not sure that would work. If the header is OK, it would get a MIME type back (I think), but the rest of the file might be corrupt or partially missing. I think you would have to check for an end-of-file marker instead; no idea how to actually do that. JPEGs can still be read if there is no end-of-file marker, but if it is there, then you know the file is complete.
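Something along these lines might work as a heuristic; it just checks whether the file ends with the JPEG EOI (end of image) marker. Not bulletproof, since some writers append trailing bytes after the marker, but a truncated file usually fails it:

```php
<?php
// Heuristic completeness check: a well-formed JPEG ends with the EOI marker
// 0xFF 0xD9. A partially written or partially decoded file is usually missing it.
function jpegLooksComplete(string $path): bool
{
    $handle = fopen($path, 'rb');
    if ($handle === false) {
        return false;
    }

    fseek($handle, -2, SEEK_END);   // jump to the last two bytes
    $tail = fread($handle, 2);
    fclose($handle);

    return $tail === "\xFF\xD9";    // EOI marker
}
```

You could run this on every generated thumb and delete it (so it gets regenerated on the next request) whenever the check fails.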
I can't get rid of the feeling that this is something Kirby should handle (somehow). Thumbnail generation keeps failing, which is a problem because I have handed the website over to my client. I can't keep coming back to the site, checking every single image and triggering thumbnail regeneration for years to come.
There is no real pattern as to which images fail. Yes, the image files are large. But they don't always fail, only occasionally.
Are the thumbnails generated synchronously or asynchronously? I think generation fails more often when many thumbnails are generated at once (i.e. when opening a page in the frontend after uploading a larger batch of images), so I assume synchronously, at least to a certain extent. If so, could there be a Kirby option to limit the number of images processed at once?
Running into the same problem with one client. Found this Stack Overflow thread discussing possible ways to detect these corrupt images.
I'd hope there is a better way to detect this while the thumbs are being generated, since the discussed technique is not perfect (false positives for images that legitimately contain that exact solid grey) and needs a lot of additional processing.
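For what it's worth, this is roughly what that heuristic could look like with the Imagick extension. The strip height and threshold are guesses, and as said above, an image that really ends in a flat grey area will be a false positive:

```php
<?php
// Sketch of the "solid grey bottom" heuristic, assuming the Imagick extension:
// crop the bottom strip of the generated thumb and check whether it contains
// any detail at all. A failed decode leaves a completely flat fill there.
function bottomStripIsFlat(string $path, float $stripRatio = 0.25): bool
{
    $image  = new Imagick($path);
    $width  = $image->getImageWidth();
    $height = $image->getImageHeight();

    $stripHeight = max(1, (int)($height * $stripRatio));
    $image->cropImage($width, $stripHeight, 0, $height - $stripHeight);

    // A standard deviation near zero on every channel means no detail.
    foreach ($image->getImageChannelStatistics() as $channel) {
        if ($channel['standardDeviation'] > 1.0) {
            return false;
        }
    }

    return true;
}
```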
There is also an issue on GitHub regarding this. (In some environments the resulting images seem to be corrupted, in others the source is simply copied, but I would guess for the same reason.)
The size passed to -define jpeg:size= has to be 2x the final thumb size to prevent aliasing, and it only works with JPEGs.
In the author's test, scaling a 10000x10000 px JPEG with this hint used 35 MB of RAM instead of 720 MB and took 300 ms instead of 3 s of CPU time. That's amazing.
I usually advise my clients to upload their images in full quality so that I have all the freedom in choosing srcsets (also in the future; who knows what Apple will come up with next).
My standard srcset widths are: 640, 1280, 2560, 3840 and 5120.
Photos my clients upload are often around 6000 px wide.
So for every thumb width <= 2560, the -define jpeg:size= hint would kick in and save a lot of RAM and processing time.
The author also recommends vipsthumbnail, which is an even more efficient tool for resizing images. Unfortunately its adoption isn't very wide yet, and on most of my clients' servers I don't have access to install tools.
Adding -define jpeg:size= to the ImageMagick darkroom looks like low-hanging fruit to improve resizing performance, and it could help prevent the broken-thumbs problem.
Interesting. It is possible to extend the thumb driver yourself in a plugin, so you should be able to override and extend the current feature without needing it to be added to the core.
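Something like the following, maybe. This is only a rough sketch: it assumes Kirby 3's thumb component (which gets the source path, destination path and thumb options and returns the destination path), it ignores crop/format options, and the plugin name and quality fallback are made up. The idea is to add the jpeg:size hint and refuse to hand a broken file to the cache:

```php
<?php
// Rough sketch of a custom thumb component; check the docs for the exact
// component signature and option names in your Kirby version.
Kirby::plugin('acme/safe-thumbs', [
    'components' => [
        'thumb' => function ($kirby, string $src, string $dst, array $options): string {
            $width  = $options['width']  ?? 640;
            $height = $options['height'] ?? $width;

            // The jpeg:size hint at ~2x the target keeps decode memory low.
            $cmd = sprintf(
                'convert -define jpeg:size=%dx%d %s -thumbnail %dx%d -quality %d %s',
                $width * 2,
                $height * 2,
                escapeshellarg($src),
                $width,
                $height,
                $options['quality'] ?? 90,
                escapeshellarg($dst)
            );
            exec($cmd, $output, $status);

            // Refuse to return a broken thumb (see the EOI check earlier in
            // this thread), so Kirby never caches a half-grey file.
            if ($status !== 0 || is_file($dst) === false) {
                throw new Exception('Thumb generation failed for ' . $src);
            }

            return $dst;
        },
    ],
]);
```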
I sometimes encounter a single huge image clogging up the thumbs job, leading to multiple failed resizings, like a 25000x4000 px rendering delivered by architects. That's most easily solved with:
accept:
  maxheight: 10000
  maxwidth: 10000
in the file blueprint. I think 10000x10000 px source images are future-proof enough for the next century.