Some years ago I created a website using Kirby 2. It was one of my first projects using Kirby. The website's main purpose is to present various products categorised by type.
Recently the client contacted me saying that he, along with some visitors, had noticed the site running slow. I ran some tests and found that the slow page loads are not related to front-end issues (although there is room for improvement there) but mostly due to a large TTFB.
I wanted to test PHP execution time, so I ran some tests using microtime() and found that some pages need up to 12 seconds to render. What I find strange, though, is that I can open a page, get microtime() to report a very slow render time, then hit reload and get something like 0.05s, as if the rendering of the page were being cached. This is a problem because I can't make measurements in a predictable manner: I have no way to reproduce the slow render time on demand.
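For reference, this is roughly how I'm timing things (a minimal standalone sketch; in the real site the measured block is the page render rather than a sleep):

```php
<?php
// microtime(true) returns the current time as a float of seconds,
// so subtracting two calls gives elapsed wall-clock time.
$start = microtime(true);

usleep(50000); // stand-in for the actual page rendering work

$elapsed = microtime(true) - $start;
printf("render: %.3fs\n", $elapsed);
```

Running the same measurement several times in a row is what produces the wildly different numbers described above.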
I'm hoping someone has some insight into why this is happening and how I can test it in a more reliable way.
Thanks in advance.
I think you chose the wrong category for your question (Kirby 3, instead of Kirby 2).
Could it be that there was some server-side change, for example PHP caching being disabled because of a wrong PHP configuration? Or is there maybe a general server-side issue, like too little RAM to execute PHP requests quickly? That could explain why it is so fast when the frontend is cached.
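A quick way to check the opcode-cache side of that theory (a hedged sketch; drop it into any PHP page on the server):

```php
<?php
// A disabled opcode cache forces PHP to recompile every script on each
// request, which can inflate TTFB noticeably on shared hosting.
// "Zend OPcache" is the extension's registered name.
echo extension_loaded('Zend OPcache')
    ? "OPcache loaded\n"
    : "OPcache not loaded\n";
```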
Have you set caching to true in your config?
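In Kirby 2 that would be a config fragment along these lines (the `cache.ignore` line is optional, just shown for completeness):

```php
<?php
// site/config/config.php (Kirby 2)
c::set('cache', true);

// optionally exclude pages that must stay dynamic
c::set('cache.ignore', array('sitemap'));
```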
Thanks, I changed the category.
I am testing with the browser cache disabled, so the frontend is still being served fresh when I get fast loading times. I can also replicate this in my local environment, although the differences are less dramatic (I guess because I am using a far more powerful computer than the shared hosting). Either way, the problem is the irregularity.
No, caching is disabled. I can see how caching the pages could help the situation, but at this stage I just want to pinpoint the problem so I have something to compare against once optimisations (maybe including caching) take place.
Did the client add new content over time or did the content stay the same between when everything was fine and now?
And is it a big site that uses $site->index() a lot?
Sure. The site was working fine with a few products. As the list grew bigger, it started to slow down. I am quite sure this has to do with unoptimised code on my side, but I can't understand why it sometimes loads fast, since no caching is taking place on the server side or the front side.
I would say it is medium to large sized. I don't use $site->index() anywhere, but I am using $page->index() on pages with a lot of children, and I am sure this is the main problem. But as I stated above, my question really is: why isn't it always slow, given that no caching is taking place?
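For what it's worth, the usual fix for that bottleneck is to avoid the full recursive walk where possible (a hedged sketch; the exact filters depend on your blueprint):

```php
<?php
// $page->index() recursively visits every descendant page, so its cost
// grows with the whole subtree as products are added.
$all = $page->index();

// If one level of the tree is enough, children() skips the recursion,
// and pagination keeps the per-request work bounded.
$products = $page->children()->visible()->paginate(20);
```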
That leaves us with server-side caching then, but that doesn't fit either if the same happens in your local environment.
I am using MAMP in my local environment. Could MAMP be doing some caching as well?