Hello, all.
My website has been running slowly for a while, but I’d been attributing that to living somewhere with unreliable, bad internet. Having recently arrived back at my house, where the internet is fast, I notice that my website is still slow.
Slow as in: a 40-second TTFB (time to first byte).
I decided to try the var_dump($page) debugging approach suggested in the Kirby docs. The blob of information is so large that my text editor hangs when I try to scroll through it, and my web browser can’t “view source” without hanging either. When I finally managed to save the page as an HTML file, I found it was 50 MB.
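For anyone who hits the same wall: instead of rendering the whole dump in the browser, you can capture it with output buffering and just report its size. This is a minimal standalone sketch; in an actual Kirby template you would pass $page instead of the sample array used here as a stand-in.

```php
<?php
// Sketch: measure the size of a var_dump without opening it anywhere.
// $value is a placeholder; in a Kirby template you'd use $page instead.
$value = ['title' => 'Home', 'children' => range(1, 100)];

ob_start();             // start capturing output
var_dump($value);       // dump goes into the buffer, not the browser
$dump = ob_get_clean(); // retrieve and discard the buffer

printf("dump is %d bytes\n", strlen($dump));
```

That way you get a single number per request and can watch how the dump size changes as you disable plugins or snippets, without hanging an editor on a 50 MB file.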
Something told me this wasn’t right. It seemed that every post, page, and image had its own entry, even though none of that was actually used. I discovered the “scan for CSS and JS files in post pages and add them to the header” script, and when I took that out, the saved source dropped to 21.4 MB.
How big should the $page variable be? (Or the $site variable, for that matter?) I know it’s probably different for each site, but is there a target size? Or a “this is always too big” size?
Thanks a bunch.