Multisite setup flexible for local to remote server?


// /site.php

$kirby   = kirby();
$domain  = server::get('server_name');
$domains = array('', '');

if(in_array($domain, $domains)) {

  // custom roots
  $kirby->roots->site    = __DIR__ . DS . 'site' . DS . $domain;
  $kirby->roots->content = __DIR__ . DS . 'content' . DS . $domain;
  $kirby->roots->avatars = __DIR__ . DS . 'assets' . DS . 'avatars' . DS . $domain;
  $kirby->roots->thumbs  = __DIR__ . DS . 'thumbs' . DS . $domain;

  // custom urls
  $kirby->urls->index    = url::scheme() . '://' . $domain;
  $kirby->urls->content  = $kirby->urls->index . '/content/' . $domain;
  $kirby->urls->avatars  = $kirby->urls->index . '/assets/avatars/' . $domain;
  $kirby->urls->thumbs   = $kirby->urls->index . '/thumbs/' . $domain;
}


I was thinking about saving some files (because I regularly sync, do backups and so on), and having several project stages on my localhost adds quite a number of files for me to sync between here and there. So I have been looking at the multi-site setup tutorial and wondering how that would work out when transitioning between localhost and going online…

The same goes for having multiple installations from Composer (such as mpdf), which pushes a single Kirby folder towards 10k files…

I read the code above as checking the URL and picking the matching folders for content…

So let’s say my site is accessible locally (usually “”), but obviously the live URL is different. The folder would then need to be renamed manually (and the config file changed again).

So is there any kind of way to make this slightly more flexible, so that I can dynamically provide a local as well as a remote domain?

Alternatively: develop on localhost with “one installation for all projects” and publish live without much hassle (or manual work)…


Why not set the domains in your config and then fetch them via c::get() from the corresponding config files?

I haven’t done anything with multisites so far, so I am just looking out for options…

So your solution would be:

// site config folder
// sample config.xxxxx.php
c::set('domains', array('localdomain'));

// site.php
$domain  = server::get('server_name');
$domains = c::get('domains');

// if(........
// .........
// ........

Just trying to avoid having to manually change folders and such… it should still be “upload and done”.
I’m still not getting the “live” domain part… I guess it will load according to whichever domain is being used, so if domain.ext is requested, it would get the matching content. Or would I just leave the

$domain = server::get('server_name');

line as it is?

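If both the local and the live hostname map to the same folder key, the server_name lookup can stay exactly as it is; only the lookup table knows about both environments. Here is a minimal sketch of that idea in plain PHP (not verified against Kirby’s helpers; all domain names and folder keys below are placeholders):

```php
<?php
// Sketch only: map every hostname -- local and live -- to one shared
// folder key, so no folders need renaming when deploying.
// All domain names below are placeholder examples.
function resolveSiteFolder(string $host, array $map): ?string {
    return $map[$host] ?? null;
}

$map = array(
    'mysite.test'    => 'mysite',  // local development hostname
    'www.mysite.com' => 'mysite',  // live hostname, same folder key
);

// In site.php the resolved key would then replace $domain in the roots, e.g.:
// $kirby->roots->content = __DIR__ . DS . 'content' . DS . $folder;
$folder = resolveSiteFolder('www.mysite.com', $map);
echo $folder . "\n"; // prints "mysite"
```

With this, uploading to the live server changes nothing in the code: the live hostname simply resolves to the same folder key as the local one.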

Maybe I have a problem following you. What exactly do you want to achieve? Different folder structures on local and remote? Or something else? Maybe you can give an example.

The idea of the multi-site setup is like this:

  • you have one Kirby installation
  • all domains point to that Kirby installation
  • each domain may have different content, site or thumbs folders.

The example site.php from the docs refers to a setup like this:


Right now I have a few project folders summing up to a total of 71,000 files…


  • assets/
  • content/
  • site/
  • vendors/


  • assets/
  • content/
  • site/
  • vendors/

Most of the time, the vendors part is quite similar: I have three different sites on my localhost, each with its own Composer file, so I end up having many of the same Composer packages several times, with something like mpdf adding thousands of files that pile up when processing backups.

So I am looking to simplify everything, ending up with something similar to a multisite so that the Kirby installation and things like mpdf exist just once… With a multisite I’d only have one Composer folder but could call e.g. mpdf when needed.

While it looks like a multisite config could be the right thing, I still wonder about a seamless transition from local to live server…

Let’s say the multisite setup is used locally, but on the live server the sites could still end up on different servers, i.e. each multisite installation features only one live domain anyhow.

My only issue would be having a different URL after uploading, as I am not too clear about that, since we have set up content/domain folders…


  • content/domain-1
  • content/domain-2
  • assets/domain-1

So my strategy is to get everything sorted and reduce the overall file count…


In short:

  • Locally, have everything sorted and slim
  • Remotely, the sites might not all end up on the same server… (not always) (I think this could still be done, as I would only need to upload the specific files)

I don’t think Kirby multi-site setup would work for your vendors folder.

Locally you could probably work with symlinks to certain folders, but you would then have to include those files on the remote server.
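As a sketch of the symlink idea (all paths here are made-up examples, not from an actual setup):

```shell
# Sketch: keep one shared vendor folder and symlink it into each project,
# so the heavy Composer packages exist on disk only once.
# All paths are illustrative placeholders.
mkdir -p shared-kirby/vendor projects/site-a
ln -s "$(pwd)/shared-kirby/vendor" projects/site-a/vendor

# Each project now sees the same vendor folder through the link:
ls -ld projects/site-a/vendor
```

When deploying, the link target’s files still need to reach the remote server, e.g. by telling rsync to copy the linked files themselves with `--copy-links`.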

Does it not have any access outside of each individual domain subfolder?

For example, I have the vendors folder with some JavaScript, and in my


I’d include it from vendors/javascriblib/dist/javascriptfile.js?

In the end I’d upload everything I need: all except the domain folders I don’t need on that server.
It’s kind of neatly separated anyhow…

Well, yes, maybe. Why don’t you just test if it works for you? After all, it’s not that difficult to set up the site.php and move some folders around.

In my head, there are a few solutions here. I haven’t tried this with Kirby, but I have tried it with other CMSes that have a good chunk of shared code.

  1. Symlink the plugins and fields folders and anything else common into your project. I’m assuming this will also work with the vendor folder for Composer. If you prefer to use the Kirby CLI to install things, you could set up a dummy site containing all your common plugins, fields, and the vendor folder, and symlink from there. In theory you could symlink the panel and kirby folders too, so that you only have to keep one copy up to date.

  2. Forgive me if this is wrong, as I have not really worked with Composer, but it seems to be similar to other package managers. Can you not install things globally so they are accessible everywhere, rather than in each project?

  3. Why not use rsync? It only transfers new and changed files, which means it won’t be transferring 71,000 files each time. I use a bash script that I trigger from an npm script to do this:


#!/bin/bash
ERRORSTRING="Error. Please make sure you've indicated correct parameters"

if [ $# -eq 0 ]; then
  echo "$ERRORSTRING"

elif [[ "$1" == "live" ]]; then
  if [[ -z "$2" ]]; then
    echo "Running live dry-run"
    rsync --dry-run -az --force --delete --progress --exclude-from=rsync_exclude.txt -e "ssh -p22" ./your/local/source/folder YOURUSERNAME@XX.XXX.XXX.XX:/path/to/site/on/server/from/volume/root
  elif [[ "$2" == "go" ]]; then
    echo "Running live actual deploy"
    rsync -az --force --delete --progress --exclude-from=rsync_exclude.txt -e "ssh -p22" ./your/local/source/folder YOURUSERNAME@XX.XXX.XXX.XX:/path/to/site/on/server/from/volume/root
  else
    echo "$ERRORSTRING"
  fi

elif [[ "$1" == "staging" ]]; then
  if [[ -z "$2" ]]; then
    echo "Running staging dry-run"
    rsync --dry-run -az --force --delete --progress --exclude-from=rsync_exclude.txt -e "ssh -p22" ./your/local/source/folder YOURUSERNAME@XX.XXX.XXX.XX:/path/to/site/on/server/from/volume/root
  elif [[ "$2" == "go" ]]; then
    echo "Running staging actual deploy"
    rsync -az --force --delete --progress --exclude-from=rsync_exclude.txt -e "ssh -p22" ./your/local/source/folder YOURUSERNAME@XX.XXX.XXX.XX:/path/to/site/on/server/from/volume/root
  else
    echo "$ERRORSTRING"
  fi
fi

And the npm scripts:

"deploy:live:sim": "./deploy live",
"deploy:live": "./deploy live go",
"deploy:staging:sim": "./deploy staging",
"deploy:staging": "./deploy staging go"

Remember to make the bash script executable. The :sim tasks will show you what would get transferred without actually doing it, so you can trial a deploy without changing anything; it’s a preview. You can also create an rsync_exclude.txt file alongside the bash script where you define paths and files you want rsync to skip. Do this one exclusion per line, like you would with a .gitignore file.
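For example, an rsync_exclude.txt might look like this (the entries are illustrative; adjust them to your own project):

```
.git/
node_modules/
site/cache/
thumbs/
.DS_Store
```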