I wonder whether anyone has tried setting up a Kirby site with a few thousand pages - and whether search is still fast?
I have a new project: an online shop for technical products (small parts) where visitors will find the item they need either by typing in a product code directly or by searching through product descriptions. The catalogue is quite large: they currently have almost 2,500 products and expect to add more before launch, so it'll be close to 3,000.
Normally, I'd assume a project like this needs a database. My instinct is that searching through 3,000 flat files would not be fast enough - and searching is the main function of the site, so it DOES need to be fast.
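As a rough sanity check of that instinct (this is a plain Python simulation, not Kirby - Kirby adds parsing and object overhead on top of raw file reads, so treat the timing as a lower bound), here is a naive substring scan over 3,000 small generated text files, each roughly the size of a product page:

```python
import os
import random
import tempfile
import time

# Generate 3,000 small "product" files in a temp directory.
root = tempfile.mkdtemp()
for i in range(3000):
    body = (
        f"Title: Product {i}\nCode: SKU-{i:05d}\n"
        + " ".join(random.choices(["widget", "bolt", "nut", "flange"], k=200))
    )
    with open(os.path.join(root, f"{i}.txt"), "w") as f:
        f.write(body)

# Naively open and scan every file for a product code.
start = time.perf_counter()
hits = []
for name in os.listdir(root):
    with open(os.path.join(root, name)) as f:
        if "SKU-01234" in f.read():
            hits.append(name)
elapsed = time.perf_counter() - start
print(f"{len(hits)} hit(s) in {elapsed:.3f}s")
```

On any reasonably modern machine with the files in the OS disk cache, a scan like this finishes in a small fraction of a second, which suggests raw file I/O is unlikely to be the bottleneck at 3,000 pages.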
But today I've come across discussions about flat-file CMSs where developers describe using things like Grav for similar sites - setting up sites with TENS of thousands of pages while still keeping the CMS search under two seconds…
Has anyone had any experience with this in Kirby? Do we have any benchmarks or info?