Get a better pagespeed score

Hi,

I ran my latest Kirby site through https://gtmetrix.com and got a rather mediocre C rating.

Then I fiddled around with the .htaccess a little and added some cache and gzip rules:

# ----------------------------------------------------------------------------------------------------------------------------------------------
<IfModule mod_expires.c>
# Enable expirations
ExpiresActive On 
# Default directive
ExpiresDefault "access plus 1 month"
# My favicon
ExpiresByType image/x-icon "access plus 1 year"
# Images
ExpiresByType image/gif "access plus 1 month"
ExpiresByType image/png "access plus 1 month"
ExpiresByType image/jpg "access plus 1 month"
ExpiresByType image/jpeg "access plus 1 month"
ExpiresByType image/svg+xml "access plus 1 month"
# CSS
ExpiresByType text/css "access plus 1 month"
# Javascript
ExpiresByType application/javascript "access plus 1 year"
</IfModule>
# ----------------------------------------------------------------------------------------------------------------------------------------------
<IfModule mod_headers.c>
  <FilesMatch ".(js|css|xml|gz|html|woff2|woff|otf|eot|ttf|svg|jpg|png|gif|jpeg)$">
    Header append Vary: Accept-Encoding
  </FilesMatch>
</IfModule>
# ----------------------------------------------------------------------------------------------------------------------------------------------
<IfModule mod_mime.c>
    AddType application/javascript          js
    AddType application/vnd.ms-fontobject   eot
    AddType application/x-font-ttf          ttf ttc
    AddType font/opentype                   otf
    AddType application/x-font-woff         woff
    AddType image/svg+xml                   svg svgz 
    AddEncoding gzip                        svgz
</IfModule>
# ----------------------------------------------------------------------------------------------------------------------------------------------
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/plain text/css application/json
    AddOutputFilterByType DEFLATE application/javascript
    AddOutputFilterByType DEFLATE text/xml application/xml text/x-component
    AddOutputFilterByType DEFLATE application/xhtml+xml application/rss+xml application/atom+xml
    AddOutputFilterByType DEFLATE image/x-icon image/svg+xml application/vnd.ms-fontobject application/x-font-ttf font/opentype
</IfModule>
# ----------------------------------------------------------------------------------------------------------------------------------------------
Options +MultiViews
# ----------------------------------------------------------------------------------------------------------------------------------------------
#Header unset ETag
#FileETag None

and now the score looks like this:

I’m no .htaccess pro, but I’m happy with the result.

Does anybody else have more hints & hacks to speed up page load with Kirby?

Best regards,
Svnt

4 Likes

I use https://developers.google.com/speed/pagespeed/. If it gives a good score it might give SEO advantages as well.

Other than that I do a few simple things.

Only 1 js file and 1 css file

Some people like to have one base CSS file and then one CSS file per template. That means fewer bytes need to be loaded, but it also means two HTTP requests instead of one. In most cases the extra HTTP request costs more time than the extra bytes of a single CSS file that contains everything. The larger CSS file is also generally cached by the browser, which means the CSS for the next page loads faster.

This could be debated, but that’s how I work, and Google PageSpeed agrees that it’s a good idea.

Images

A really simple way to optimize load time on an image-heavy page is to remove images or paginate them. I used to save images as PNG because I like quality, but I’ve stopped doing that because compressed JPGs are so much smaller. So, use JPG. If you have a logo, use SVG if possible: it scales and is a small format.

Compress html

There are 3 Kirby plugins for compressing html:

I think all of them work more or less fine.
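
If you just want to see what such plugins do under the hood, the basic trick is an output-buffer callback that strips whitespace between tags. This is only a crude sketch (not taken from any of the plugins, and the naive regex will mangle whitespace-sensitive markup like <pre> or <textarea>):

<?php
// Rough sketch of output-buffer HTML "compression": strip whitespace between tags.
// Not one of the Kirby plugins above; it ignores <pre>, <textarea> etc.
ob_start(function ($html) {
    return trim(preg_replace('/>\s+</', '><', $html));
});

echo "<html>\n  <body>\n    <p>Hello</p>\n  </body>\n</html>\n";

ob_end_flush(); // prints: <html><body><p>Hello</p></body></html>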

Caching

Kirby has a built-in page cache. For heavy function calls it’s also possible to cache just a part of the page.
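
For reference, in Kirby 2 the full-page cache is switched on in site/config/config.php, and for an expensive fragment you can roll a tiny file cache yourself. The helper below is just a generic sketch; cachedFragment() and the cache path are made-up names, not a Kirby API:

<?php
// site/config/config.php - enable Kirby 2's built-in page cache
// (check the docs of your Kirby version for the exact options)
c::set('cache', true);

// Generic sketch for caching one expensive part of a page.
function cachedFragment($key, $ttl, callable $build) {
    $file = __DIR__ . '/../cache/fragments/' . md5($key) . '.html';
    if (is_file($file) && filemtime($file) > time() - $ttl) {
        return file_get_contents($file);   // still fresh: reuse it
    }
    $html = $build();                      // the expensive work happens here
    @mkdir(dirname($file), 0755, true);
    file_put_contents($file, $html);
    return $html;
}

// Usage in a template: cache the rendered tag cloud for 10 minutes
// echo cachedFragment('tagcloud', 600, function () use ($site) {
//     return snippet('tagcloud', array('site' => $site), true);
// });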

Don’t use jquery or other frameworks

http://youmightnotneedjquery.com/

It’s also debatable, but I’ve started writing the JavaScript for my sites without jQuery. More and more JS plugins no longer depend on jQuery. Skipping jQuery saves about 100 kB of JavaScript.

I also try to avoid CSS and JS frameworks, because they probably contain 5% of what you need and 95% of what you don’t.

Always keep speed in your mind

If you keep speed in mind from the first line of code, it will probably be a fast site, because you will avoid all the things that have a price in terms of speed. Ask yourself: is this needed by users, or is it just for fanciness?

There are many things you can do for the performance of a site, but these are a few ways I like to do it. A good hosting provider is also important.

7 Likes

I do not agree with @jenstornell’s suggestion about avoiding libs and frameworks. Like he said, it’s debatable. Combined into one file each for CSS and JS, they compress very well when gzipped and are cacheable by the browser (with a proper .htaccess). Focus on creating things with ease instead of limiting yourself too much.

The biggest impact, IMHO, comes from unoptimized images uploaded into the CMS. At least create proper thumbs, or use an optimizer plugin (Kirby has a few).
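
For example, a minimal template sketch that always outputs a resized thumb instead of the raw upload (assuming Kirby 2’s resize() file method; the filename, width and quality are just examples):

<?php
// Template sketch: output a resized thumb instead of the original upload.
// Assumes Kirby 2's $file->resize($width, $height, $quality) - check your version.
$image = $page->image('header.jpg'); // 'header.jpg' is just an example filename

if ($image) {
  $thumb = $image->resize(1200, null, 80); // max 1200px wide, JPEG quality 80
  echo '<img src="' . $thumb->url() . '" alt="' . $image->alt()->html() . '">';
}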

That said, I did drop jQuery years ago and just use vanilla JS if I need JS at all. It works well for me.

Hint: code/test your sites on a hotel wifi :wink:

Nice talk: https://www.youtube.com/watch?v=FhbMNV-FAIQ

Money quote:

“Perfection is achieved, not when there is nothing more to add, but when there is nothing left to take away.”
Antoine de Saint-Exupery

2 Likes

My two cents, since I’ve been doing this a lot lately.

What really matters:

  1. HTTP/2 - most web servers are HTTP/2 ready. You also need HTTPS for HTTP/2, but since Let’s Encrypt is free, it’s just about not being lazy. HTTP/2 really makes the whole bundling thing irrelevant, because there are no longer limits on concurrent requests like in HTTP/1. We are actually moving away from bundling: when you bundle two libs together with your scripts, the client has to re-download the whole thing every time you update anything. When you don’t bundle, only the updated part has to be downloaded.

  2. Minifying (compressing) HTML does not give you big gains if you are already using gzip. Often you save a few kB and end up with ugly markup that sometimes breaks things. I wouldn’t do it.

  3. Minifying CSS with cssnano, on the other hand, does make quite a big difference. Minifying JavaScript (with UglifyJS) does too. Do it.

  4. Images - if you are making an image-heavy site, every code optimisation is almost negligible compared to the size of the images. The best image compressor is mozjpeg. There is ImageKit for Kirby that can use it; it will save you a lot. Also, using srcset for responsive images can greatly save bandwidth (a rough template sketch follows after this list). There is a lot going on in image optimisation (WebP and Guetzli), but right now the best practical thing is mozjpeg :).

  5. First load time - critical CSS (basically inlining part of your CSS) makes a huge difference. Google “critical CSS” for how to do it. If you are not writing too much CSS, you can just inline the whole CSS in the head. We do that on a few sites. We write all CSS by hand without any frameworks, so the sizes are really small. If you have a 5 kB CSS file, just inline it. It will make the site feel faster.
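
To illustrate point 4, here is a rough Kirby 2 template sketch for building a srcset from thumbs. The widths and the sizes attribute are arbitrary examples, and $file->resize() is assumed to be available in your setup:

<?php
// Sketch: build a srcset from Kirby thumbs so the browser can pick a fitting size.
// Assumes Kirby 2's $file->resize(); the widths below are arbitrary examples.
$image  = $page->image();
$widths = array(480, 800, 1200);

if ($image) {
  $srcset = array();
  foreach ($widths as $w) {
    $srcset[] = $image->resize($w)->url() . ' ' . $w . 'w';
  }
?>
  <img src="<?= $image->resize(800)->url() ?>"
       srcset="<?= implode(', ', $srcset) ?>"
       sizes="(max-width: 800px) 100vw, 800px"
       alt="<?= $image->alt()->html() ?>">
<?php } ?>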

About the frameworks: I don’t think JS libs like jQuery or Lodash make your site slower. On the other hand, things like Bootstrap and Foundation will make it slower. Those big UI frameworks are hell.

Also, there is this nifty tool, https://developers.google.com/web/tools/lighthouse/, that combines a lot of cool stuff.

1 Like

There is a similar thread here.

1 Like

Checking whether you actually need jQuery is a good idea. But not using it, even when it would make for a better user experience, just to save a few KB (less than 30 KB) seems a bit silly though. :grimacing:

edit: added kb estimation

BTW, you can simulate network conditions in Chrome DevTools. It makes things more visible. :slight_smile:

1 Like

@krisa

  1. HTTP/2 looks interesting. I’m not sure if Google PageSpeed would like files that are not bundled anymore (untested). Because Google PageSpeed also gives SEO points, I often do what they suggest.

  2. Google PageSpeed likes it. Good for SEO, but I agree that if it breaks things it’s not worth it. iksi/kirby-compress (which compresses Kirby’s HTML output) can break in some cases, but I don’t think the others do.

  3. Just by changing from PNG to JPG, I went from around 250 kB to around 50 kB. I think JPG compression does make a great difference. But yes, you should start by resizing the images first.

  4. What happens on the second page load? No inline CSS? Or do you always have the inline CSS in the head?

About the frameworks: I don’t think JS libs like jQuery or Lodash make your site slower.

Yes, they do, especially on slow or bad connections, but to make it feel as fast as possible, jQuery should be loaded from a CDN, for example Google Hosted Libraries.

Also there is this nifty tool https://developers.google.com/web/tools/lighthouse/

I’ve just tested it and it seems like a great Google extension. Thanks! :slight_smile:

@bnomei

I think there are other benefits to not being dependent on jQuery as well. Many people add it because they simply can’t do anything without jQuery. For me it was a great learning experience to step away from it. I think http://www.vanillalist.com/ is a good start for anyone who needs vanilla (jQuery-free) plugins; the JavaScript Plugins Repository is another one.

less than 30KB

At least in 2009. :wink:

1 Like

ok 32kb

1 Like

Shrink it by two thirds with almost the same features… use: http://zeptojs.com/

1 Like

I’ve made a local pack script for our blog/site editors and they love it. It would be great to have this compression in the Kirby panel… but anyway…

LOL, “Guetzli” is Swiss German for sweets, candy :candy:

1 Like

@bnomei

A real example: my site Lånera is completely without jQuery. On the start page it has some tags called “Filter” and “Visa”. It includes filters, range sliders, toggles etc.

  • The size of the JS for these functions is only 14.8 KB gzipped.
  • The JS for the whole site is 15.7 KB (I noticed I forgot to merge Kirby Nja, about 0.9 KB).

If I had been using jQuery, it would probably be around 47.7 KB (32 + 15.7).

I understand that you don’t see the gain. But if we start somewhere and make small adjustments here and there, we can go from a very slow site to a fast one over time. “Många bäckar små”, as we say in Sweden; roughly translated: “many small streams end up in a big lake”. It means that many small things eventually add up to a big thing. In this case, many small speed changes will make a big impact on your site.

@Svnt

google/guetzli

Yes, there are not many Kirby plugins for these kinds of tasks. It would be nice to see more of them. I don’t know how well ImageKit compresses images compared to that.

use: http://zeptojs.com/

Or the modular DOMtastic, only 4 KB gzipped. It kind of looks like jQuery as well:

$('.item').append('<p>more</p>');

HTTP/2 is far more loved by Google than any HTML compression. I don’t think PageSpeed ranks HTML compression higher. Actually, I don’t think Google themselves compress their HTML (YouTube is not compressed and neither are their docs).

Google is pushing HTTP/2 hard. PageSpeed is not a canonical source; it often lags behind what Google says is best practice.

With critical CSS you just have it in there; you download it every time. You might think this is wasteful, but from my experience it’s the best solution for the fastest load. If you want to try this, a good tool is http://www.webpagetest.org/, where you can see a timeline. You might test your site without inlined CSS and then inline the generated critical CSS: https://jonassebastianohlsson.com/criticalpathcssgenerator/

But it’s all about the types of websites I do. I mostly make custom-designed (small custom CSS) presentation/content sites, often with a lot of images. When you have 5 big photos on every page… 10 kB doesn’t matter. I wouldn’t do it this way in a web app, for example.

Also, KBs are not the most important thing for fast page loads. It’s about the number of requests (HTTP/2 solves a lot of that), and if you use JS it’s about writing it in a proper non-blocking way (oh, and it’s a good idea to load JS async). Lazy loading images is important.

On Guetzli: one day it might be cool, but right now it’s just plain worse than mozjpeg. It is literally 1000 times slower (it takes something like 30 minutes to encode one regular-sized image) and size-wise it’s often even bigger. It might one day get very optimized and yield better results, but right now the best compression is mozjpeg, period. And ImageKit can use it (the problem with ImageKit is the license; I just can’t use it on small projects).

BTW @jenstornell, you will get a faster render if you use critical CSS, load the Roboto font async with JS (there is a small loader for that) and use lazy loading for images. I mean, that’s what PageSpeed is saying, and actually…
https://lånera.se/assets/images/people-grid.jpg - when you optimize this photo (jpegtran, mozjpeg, whatever) you shave off 30 kB without any visible difference. The webfont is also fun, check it: you are downloading it synchronously (I assume so that you don’t get that nasty flash of type), but you are also downloading Cyrillic, Greek, Vietnamese… all in two weights. You save what… hundreds of KBs if you subset the font properly.

That’s sort of my point: you micro-manage a KB here and a KB there, and then you lose it all on one header image and webfont. It’s important to pick your battles.

I don’t think 30 kB matters that much in times of hero images and webfonts.

1 Like

With critical CSS you just have it in there.

Alright, I’m considering implementing it. I’ve been thinking about it for quite a while; it just feels a bit wrong to add 5-10 KB to each page, but I understand the benefit.

And ImageKit can use it (the problem with imagekit is license, i just cant use it on small projects).

Maybe it would make sense to have a license for unlimited sites @fabianmichael? Then small projects would not be a problem if you already have an unlimited license. It could also have a line that says “Reselling not allowed”.

you load roboto font async with js

I loaded it async before, but I could not get around the font flash effect where the system font was shown first and then replaced with the loaded font. Even if a flash effect is unavoidable, I tried to find a way to cache the font for the next page, but I did not succeed at that. The flash effect appeared on every page. Until I find a solution, I will keep it like this.

when you optimize this photo (jpegtran, mozjpeg, whatever) you shave off 30 kB without any visible difference.

I don’t use ImageKit at the moment, or anything else (except for Kirby thumbs, where you can set the quality), and I’m not too fond of doing these things manually. But yes, I could probably optimize the static images that will not change over time.

but you are also downloading Cyrillic, Greek, Vietnamese

That’s good advice. I need to look into that. :slight_smile:

About the fonts:

You will not get around the font switching. The best you can do is make it so the fonts switch on first load and are fine afterwards. The problematic thing is when the fonts “blink”: font 1, nothing, font 2. You can also hide the fonts using a global class (when loading with JS) and unhide them when the fonts are loaded. In any case, from my experiments, loading things async with JS gave me faster loads (the page rendered faster), even when I was just waiting for the font (as would happen when loading a webfont synchronously).

It’s not too much work, but honestly a lot of the time I am lazy and do it the same way you did. The best approach is to design the website using system fonts :)).

Sidenote
Just a note: with critical CSS it’s often much less than 5-10 KB. The generators look at what is on your screen when you arrive; it might be just a few rules. The 10 KB example was me inlining the whole CSS file when it’s small, because regenerating the critical CSS every time something changes is too much work :D. (You can set up automatic critical CSS generation, but then you need a headless browser like PhantomJS and it’s just more pain. I’ve settled on using online generators, and I do it just before the site is finished.)

1 Like

You can set up automatic critical CSS generation, but then you need a headless browser like PhantomJS and it’s just more pain

If I keep using Gulp tasks, it would be possible to have the CSS task also create a file called critical.css or something and include it inline in header.php (roughly like the sketch below). For that to work I need a clear separation between critical CSS and non-critical CSS.

For larger stylesheets, what is critical CSS? Some say it’s the things above the fold. But what is that? Screens have different resolutions. My screen is 3840x2160; that’s my fold. I really think there is no single fold, because all screens are different. Maybe we should think in terms of layout instead and generate the skeleton for the site as critical CSS? And perhaps the global top things like the menu and logo.
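
Continuing the header.php idea: once the build task writes a critical.css, inlining it is only a few lines. This is just a sketch with made-up file paths; loading the full stylesheet asynchronously (loadCSS, rel="preload" etc.) is not shown here:

<?php
// snippets/header.php (excerpt) - a sketch only; the paths are made-up examples.
// Inline the small critical.css that the build task writes ...
$critical = kirby()->roots()->index() . '/assets/css/critical.css';
if (is_file($critical)) {
  echo '<style>' . file_get_contents($critical) . '</style>';
}

// ... and still load the full stylesheet. Loading it asynchronously
// (loadCSS, rel="preload", etc.) is left out of this sketch.
echo css('assets/css/main.css');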

Thought I would chip in.

Personally I use sitespeed.io to generate a performance report. It’s a command-line tool. By default, this will only run on one page (i.e. the URL you feed it), so be sure to set the number of runs and the crawl depth. By default it will run the test in Chrome, but it can use other browsers.

I’ve recently been looking at ways to improve site performance. There are a number of reasonably easy ways you can do this.

Images

This is always an uphill battle, particularly when clients are allowed to upload their own content. With a little education on optimising sites, together with some occasional housekeeping, you can keep things in order. I use rsync to keep my local copy up to date with the live version; I also use it for deploys. Once I pull down from live, I run ImageOptim CLI on the content folder. This will recursively rip through the site and optimise the images. You can run this on its own from the terminal or as part of your build (I do it with NPM scripts).

Scripts

Recently I figured out how to reduce load time by splitting code up on a per-page basis. Kirby makes this dead easy thanks to its JavaScript autoload feature described here. This means that using Browserify together with the factor-bundle plugin, you can greatly reduce page load. Essentially, Browserify will look through all your JS files and find duplicate and common code. It will then generate a common.js file, together with a file for each page that has JavaScript. I do this with NPM scripts, but you could probably do it from the command line or Gulp/Grunt. The command took me hours to figure out, so I’ll share it in case it helps anyone. Just name your files after your page templates, e.g. home.js or contact.js, and Browserify will do the rest:

browserify src/js/*.js -p [ factor-bundle -o 'uglifyjs > public/assets/js/templates/`basename $FILE` -b' ] | uglifyjs > public/assets/js/common.js -b

If you want to minify the code as well, this can be done by adding the -c and -m switches, which compress the code and mangle function names, etc.

browserify src/js/*.js -p [ factor-bundle -o 'uglifyjs -c -m > public/assets/js/templates/`basename $FILE`' ] | uglifyjs -c -m > public/assets/js/common.js

CSS

Some tools to optimise CSS have already been mentioned, like cssnano. The way I do this is through PostCSS, which can run a whole bunch of tools against your code. I use UnCSS, mqpacker and cssnano.

The way UnCSS works is to compare the CSS to the HTML and strip out all unused CSS. For this to work, UnCSS needs a page list to work from. Kirby’s API can help here. Set up a page template called uncss.php and add this code to it:

<?php
// Template that outputs a JSON list of all page URLs (up to three levels deep)
// so UnCSS knows which pages to scan.
header('Content-type: application/json; charset=utf-8');

$root        = $site->find('home');
$firstlevel  = $site->children()->visible();
$secondlevel = $site->children()->children()->visible();
$thirdlevel  = $site->children()->children()->children()->visible();

$json   = array();
$json[] = (string)$root->url();

foreach ($firstlevel as $article) {
    $json[] = (string)$article->url();
}

foreach ($secondlevel as $article) {
    $json[] = (string)$article->url();
}

foreach ($thirdlevel as $article) {
    $json[] = (string)$article->url();
}

echo json_encode($json);

Next, create an empty page in your content folder that uses that template. You can then fetch the page list into a JSON file with the following command:

curl 'http://example.dev/uncss' > uncss.json

Then simply feed it into your PostCSS Config like so:

require('postcss-uncss')({
  html: JSON.parse(require('fs').readFileSync('./uncss.json', 'utf-8')),
  ignore: [/\.(ignore|these|classes)/]
}),

That will give you drastically smaller CSS files. Notice the ignore option: UnCSS cannot execute JavaScript, so things like image sliders and other plugins that add classes to the page will break, because UnCSS will strip those classes out. You need to tell it to ignore them. The regex above matches several class names at once. :slight_smile:

Hopefully this helps somebody.

Happy optimising :slight_smile:

3 Likes

You are exactly right about the Gulp task. The problem is that you have to have a PhantomJS dependency in the Gulp task, which is like 80 MB, and that bothered me :D. It’s been a while though, and I think there are ways to link a global PhantomJS so you don’t have to have it in every project. Also, there is headless Chrome now, and I know stuff like critical CSS generation is going to happen with it.

About the fold: I wouldn’t worry about it too much. I thought the same, but the generator basically looks at the top of the page and gets the most important CSS. It’s the basics like backgrounds, headers, basic typography. It seems to just work well. Really, the whole point is to give the user something during those few hundred milliseconds before they get your real CSS. From my experience, it is so much better when the first thing you see is a block of the page with the right colors and proportions than just a white something that jumps to the proper design afterwards.

The only problem can be if you have wildly different CSS from one page to another, since it obviously scans just the page you give it (mostly the homepage). But this was also not an issue for me, because sites are mostly not like that.