I use dploy.io.
I just use rsync, e.g.:
rsync -avz --delete --exclude-from=deploy-exclude.txt src dest
If you’re not scared of the command line, that’s a really simple way. I still use Git for source management, but you don’t want to deploy all your Git-related files, nor some other development-only stuff.
With rsync, you can exclude files or folders (as in the example above), for instance:
.git
.gitignore
etc.
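To make this concrete, here is what such an exclude file might look like. Only `.git` and `.gitignore` come from the post above; the other entries are illustrative examples you would adapt to your own project:

```text
.git
.gitignore
deploy-exclude.txt
.DS_Store
node_modules
```

Each line is a pattern that rsync skips when `--exclude-from=deploy-exclude.txt` is passed.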
That’s the beauty: you have two folders, a bare Git repository and a work tree in a different folder.
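A minimal sketch of that two-folder setup, run once on the server. This is not anyone’s confirmed setup from this thread; all paths are placeholders and assume a POSIX shell with Git installed:

```shell
#!/bin/sh
# Set up a bare repo plus a separate work tree (all paths are placeholders).
set -e
REPO="$PWD/site.git"       # bare repository you push to
WORKTREE="$PWD/www/site"   # web root the hook checks out into

mkdir -p "$WORKTREE"
git init --bare "$REPO"

# post-receive hook: every push checks master out into the web root.
# The heredoc delimiter is unquoted so the absolute paths get baked in.
cat > "$REPO/hooks/post-receive" <<EOF
#!/bin/sh
GIT_WORK_TREE="$WORKTREE" git --git-dir="$REPO" checkout -f master
EOF
chmod +x "$REPO/hooks/post-receive"
```

After this, pushing to `site.git` updates the files under `www/site` without any `.git` folder living in the web root.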
As I still haven’t really gotten into Git, could you two (@BenSeitz & @dinko) elaborate on this? If you’d use Git, how? And if not, why, and how does rsync work (does it sync constantly, or do you run that line in the terminal every time you want to update, or …)?
@distantnative Deploying with Git is great if you are already using Git for version control; otherwise it seems a bit too much.
So once you have committed to your local repository and everything is working fine, you can push your local commits to your remote repository on the server (or to a beta server, to test in a live environment).
It’s not syncing constantly; it only syncs when you push.
I’ve found this method really handy.
- https://github.com/mislav/git-deploy is used to easily set up the remote servers as git remote repos.
- Then I just use `git push production master`.
- I’m load balanced across multiple production servers, so I set each one up as production1, production2, etc., and then I edit my `.git/config` file in the repo to list all four production servers under the production remote, so `git push production master` pushes to all four servers.
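For reference, such a `.git/config` section might look like this (the hostnames are invented). Git pushes to every `url` listed under a remote, while fetches only use the first one:

```text
[remote "production"]
	url = deploy@production1.example.com:site.git
	url = deploy@production2.example.com:site.git
	url = deploy@production3.example.com:site.git
	url = deploy@production4.example.com:site.git
```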
One of these days I might figure out how to trigger an update on the servers when I push master to github, but I like this method.
I also use Git to push-deploy to the server (if the server supports it); however, I’m still trying to figure out the best way to back up content changes on the production server to a remote repo, or rather to set up a workflow that includes a staging and a production environment plus a (separate) content backup repo.
By the way, is there a way to automatically pull the submodules once the repo is uploaded?
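One common approach, assuming the server-side copy is a regular clone that gets updated with a pull (not a setup confirmed anywhere in this thread): chain a submodule update onto the pull in whatever script or hook runs on the server, e.g.

```shell
git pull && git submodule update --init --recursive
```

`--init` picks up submodules that were added since the last update, and `--recursive` descends into nested ones. With a bare repo plus detached work tree this gets fiddlier, since `git submodule` expects to run inside a normal work tree.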
I use DeployHQ for most of my projects.
It basically allows you to always have a copy of a branch of your choice of any private or public GitHub or BitBucket repository deployed within seconds after pushing to that branch.
Combined with the Kirby Git Submodule workflow it makes managing Kirby websites a breeze.
I haven’t used dploy.io or DeployHQ but I’ve heard of them and I’ll have a look as they look very nice! But I just want to suggest rsync, for the unbeatable price of zero for a lifetime membership.
In any case…here’s how I do it:
Let’s say I have a new feature I want to publish.
- I commit and push on Git (to a master or feature branch).
- I have a Grunt task that does a few things (like CSS and JS building).
- I have a deploy script that uses `rsync` to publish the changes to the live (or preview) server. For example, I exclude the `content` folder and some other stuff. Approximately like so:
#!/bin/bash
if [ "$1" = "live" ]
then
echo "Deploying on live server..."
rsync -avz --delete --exclude-from=deploy-exclude.txt . live_server_and_path
elif [ "$1" = "preview" ]
then
echo "Deploying on preview server..."
rsync -avz --delete --exclude-from=deploy-exclude.txt . preview_server_and_path
else
echo "Please specify a target parameter [live, preview]."
fi
Once you have a live site, you need a way of getting the live content back to your dev site. For that, I use another sync script that runs `rsync` in the other direction. It syncs my local content folder (and some other stuff like avatars, accounts, etc.) with the live server. Approximately like so:
rsync -avz --delete live_server_and_path .
Obviously, there are lots of configuration options for rsync; you can bend it to your needs.
While I was looking for Dploy.io and DeployHQ, I stumbled upon Dploy, a Node.js tool that helps you deploy through FTP/SFTP.
Could be handy for those limited shared hostings.
I just `git pull` on the server. If my client is running on hosting that doesn’t support SSH or Git, I first urge them to move; if that doesn’t work, I’ll sync over FTP with Transmit.
Somebody might find this useful: Dandelion. It’s a script I found a couple of years ago and it works quite nicely for FTP deploys. Basically, it uploads just the changed files from your local repo.
It doesn’t work with submodules though.
Is anyone having luck getting git-ftp to work with submodules?
Edit: to be more specific: `git ftp init ...` begins to do its thing but gets some errors:
curl: (25) Failed FTP upload: 553
It seems that all submodules (including kirby and panel) are uploaded as 0 KB files.
As you can see here in the buffering, it’s reading each submodule folder as a file to upload, not as a directory.
Sun May 3 11:53:31 PDT 2015: [25 of 38] Buffered for upload 'content/site.txt'.
Sun May 3 11:53:31 PDT 2015: [26 of 38] Buffered for upload 'index.php'.
Sun May 3 11:53:31 PDT 2015: [27 of 38] Buffered for upload 'kirby'.
Sun May 3 11:53:31 PDT 2015: [28 of 38] Buffered for upload 'license.md'.
Sun May 3 11:53:31 PDT 2015: [29 of 38] Buffered for upload 'panel'.
Sun May 3 11:53:31 PDT 2015: [30 of 38] Buffered for upload 'readme.md'.
The strange thing is that I have this working for one repository and not another…
A problem when you’re stuck with FTP is that it doesn’t understand symlinks. Tools like DeployHQ and the like can’t deploy them. So if you want to manage some plugins with submodules + symlinks (like Uniform), you’ll have to create the symlinks manually on the server, or find another approach.
I just tested both Dploy.io and DeployHQ with free accounts on a shared hosting via FTP.
Dploy.io is slow. Deploying the whole site the first time took a bit more than 9 minutes. As a comparison, I also uploaded the whole site with Transmit (not the fastest FTP client around) and it took 3m 35s. I also didn’t find the interface very intuitive or clear.
DeployHQ in comparison is fast. Crazy fast. The same site took 1m 18s to deploy. I don’t know how they do it. I like its interface a lot more; it is always obvious what you deploy, from where, to which commit. Post-commit hooks are nicely explained…
Your mileage may vary, but for me the winner is DeployHQ by a mile.
Edit: Dploy.io is in the USA, DeployHQ in the UK. I’m in France, which could explain why the latter is faster.
Sorry to revive an old topic if it’s not relevant anymore… I use the post-receive hook method @BenSeitz described, and it’s very fast and reliable in my experience. As @dinko pointed out, it means deploying Git-related files to the production server, which some people might not want, since it uses disk space unnecessarily there. And as you also pointed out, it doesn’t sync the other way around to reflect content changes on the production server back to a remote repo (that just happens not to be a problem for me, since in my workflow content is never modified directly on the production server; this does seem to be discussed here, though).
What I’m wondering is whether I really need a separate post-receive hook (as I have now) for each of the core, toolkit and panel repos. When I update the infrastructure, I have to push three Git repos to the production server via those hooks. Is there a better way? It’s not an issue for me, since I only do that a couple of times a year, and on one server; I’m just curious to understand whether I’m doing something stupid.
EDIT: I just found this thread in the forum, which seems to say that pushing just once while taking submodules into account is not totally straightforward if you want to separate the repos from the work tree on the production server… I also second @Mattrushka’s idea that a ‘small & simple deployment guide for Kirby’ would probably be helpful.
I’m still using my rsync process and I’m very happy with it. I haven’t tried any of the other suggested tools… I meant to, but I guess my motivation wasn’t high enough because I’m happy with my process.
Whatever way you choose, I think it’s useful not to deploy all the files and folders you have in dev mode. I think it’s good to have some sort of build process with Grunt, Gulp, custom scripting or whatever you like.
As far as content backup is concerned, @texnixe: `rsync` can do this, either manually or automatically with some sort of cron task. You could also `zip` or `tar` your target files and folders and `scp` them somewhere. Lots of options (tyranny of choice).
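As a sketch of the tar + scp + cron variant, a crontab entry like this (hostname and paths are made up) would archive the content folder nightly and copy it off-site. Note that `%` must be escaped as `\%` inside a crontab:

```text
# m h dom mon dow  command
0 3 * * * tar -czf /tmp/content-backup.tar.gz -C /var/www/site content && scp /tmp/content-backup.tar.gz backup@backuphost:backups/content-$(date +\%F).tar.gz
```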
Edit: forgot to mention that you obviously need SSH for `rsync` to work. While more and more providers offer SSH access, clients may only have simple webspace with FTP access.
I’m looking for the best way to deploy Kirby sites to my server, and this guide seems like a good way to do it; however, I have some questions.
Ideally I’d like changes to the content that are made on the server through the Panel to be saved/backed up to another remote server. Would using Auto-Git to push to a private GitHub repo be safe/recommended?
If this is the case, what other items should I be putting in the `.gitignore`? As, from what I understand, I’ll have two Git repos.
.DS_Store
/site/accounts
/assets/avatars
/assets/scss
/thumbs/*
/node_modules
/bower_components