Deploying via Git

There are many services that upload your files to your FTP server whenever you push a commit.

But if you’re able to install Git on your server, you can simply do this yourself with a hook: https://www.digitalocean.com/community/tutorials/how-to-set-up-automatic-deployment-with-git-with-a-vps
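The hook from that tutorial boils down to a few lines. A minimal sketch of a post-receive hook, assuming a bare repo at /var/repo/site.git and a web root at /var/www/site (both paths are made-up examples):

```shell
#!/bin/sh
# hooks/post-receive inside the bare repo:
# check the pushed commit out into the web root on every push
git --work-tree=/var/www/site --git-dir=/var/repo/site.git checkout -f master
```

Don’t forget to make it executable with chmod +x hooks/post-receive.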

For me, this works like a charm.
How do you deploy? A paid service, git-ftp, the same thing I use, or good old FTP?

Let me know and share your experience :slight_smile:

2 Likes

I just upload everything via SFTP with key-based auth. There is no FTP service on my VPS.

So far I’ve done most things through FTP, but I’m looking for a new web host with Git support, especially so I can deploy dev showcases for clients directly.

And until then, I’ll stick with a Git-to-FTP workaround like http://dploy.io

I use dploy.io.

1 Like

I just use rsync, e.g.:

rsync -avz --delete --exclude-from=deploy-exclude.txt src dest

If you’re not scared of the command line, that’s a really simple way. I still use Git for source management, but you don’t want to deploy your Git-related files (or certain other files) anyway.

With rsync, you can exclude files or folders (as in the example), for instance

.git
.gitignore

etc.

3 Likes

That’s the beauty of it: you have two folders, a bare Git repository and a working tree in a different location :slight_smile:
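If you want to see the bare-repo-plus-working-tree setup in action without a server, you can reproduce the whole thing locally in a temp directory. Everything below is a self-contained sketch; all repo names, paths, and file contents are invented:

```shell
set -e
tmp=$(mktemp -d)

# "server" side: a bare repository plus a separate working tree (the web root)
git init --bare -q "$tmp/site.git"
git -C "$tmp/site.git" symbolic-ref HEAD refs/heads/master
mkdir "$tmp/www"

# post-receive hook: check the pushed commit out into the working tree
cat > "$tmp/site.git/hooks/post-receive" <<EOF
#!/bin/sh
git --work-tree=$tmp/www --git-dir=$tmp/site.git checkout -f master
EOF
chmod +x "$tmp/site.git/hooks/post-receive"

# "local" side: commit a file and push it to the bare repo
git init -q "$tmp/local"
cd "$tmp/local"
git symbolic-ref HEAD refs/heads/master
git config user.email dev@example.com
git config user.name dev
echo "<?php // hello" > index.php
git add index.php
git commit -qm "initial commit"
git push -q "$tmp/site.git" master

# the hook has now populated the working tree
ls "$tmp/www"
```

The point is that the Git bookkeeping stays in site.git while www only ever contains the checked-out files, exactly what you want a web server to see.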

As I still haven’t really gotten into Git: could you two (@BenSeitz & @dinko) elaborate on this? If you use Git, how? And if not, why not, and how does rsync work for you: does it sync constantly, or do you type that line into the terminal every time you want to update?

@distantnative Deploying with Git is great if you are already using Git for version control; otherwise it seems a bit much.
So once you have committed to your local repository and everything is working fine, you push your local commits to the remote repository on your server (or to a beta server, to test in a live environment).

It doesn’t sync constantly, only when you push :slight_smile:
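Concretely, that’s just a one-time remote setup plus a push whenever you want to deploy (the server address and path below are placeholders, not real values):

```shell
# one-time: point a "production" remote at the repo on your server
git remote add production deploy@example.com:/var/repo/site.git

# every deploy afterwards is just:
git push production master
```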

I’ve found this method really handy.

  1. https://github.com/mislav/git-deploy makes it easy to set up the remote servers as Git remotes.
  2. Then I just use git push production master.
  3. I’m load-balanced across multiple production servers, so I set each one up as production1, production2, etc., and then edit the .git/config file in the repo to list all four production servers under the production remote, so that git push production master pushes to all four.
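For reference, the multi-server remote from step 3 looks roughly like this in .git/config (hostnames and paths are made up). When a remote has several url entries, git push sends to every one of them:

```ini
[remote "production"]
    url = deploy@production1.example.com:/var/repo/site.git
    url = deploy@production2.example.com:/var/repo/site.git
    url = deploy@production3.example.com:/var/repo/site.git
    url = deploy@production4.example.com:/var/repo/site.git
```

Note that fetches only use the first url; the fan-out applies to pushes.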

One of these days I might figure out how to trigger an update on the servers when I push master to GitHub, but I like this method.

3 Likes

I also use Git to push-deploy to the server (if the server supports it). However, I’m still trying to figure out the best way to back up content changes on the production server to a remote repo, or rather to set up a workflow that includes a staging and a production environment plus a (separate) content backup repo.

By the way, is there a way to automatically pull the submodules once the repo is uploaded?
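Regarding the submodule question: as far as I know there is no fully automatic pull, but running git submodule update --init --recursive after the checkout does the trick, and that line can go into your deploy hook or script. Here is a self-contained local sketch (all repo names and paths are invented; the protocol.file.allow override is only needed because this demo uses local file paths, which newer Git versions block for submodules by default):

```shell
set -e
tmp=$(mktemp -d)

# a stand-in "plugin" repository that will become a submodule
git init -q "$tmp/plugin"
cd "$tmp/plugin"
git symbolic-ref HEAD refs/heads/master
git config user.email dev@example.com
git config user.name dev
echo "<?php // plugin" > plugin.php
git add . && git commit -qm "plugin"

# the main site repository, with the plugin added as a submodule
git init -q "$tmp/site"
cd "$tmp/site"
git symbolic-ref HEAD refs/heads/master
git config user.email dev@example.com
git config user.name dev
git -c protocol.file.allow=always submodule add "$tmp/plugin" plugins/plugin
git commit -qm "add plugin submodule"

# a fresh clone (think: the repo on your server) has empty submodule dirs...
git clone -q "$tmp/site" "$tmp/deploy"
cd "$tmp/deploy"
# ...until you pull them in explicitly:
git -c protocol.file.allow=always submodule update --init --recursive
```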

1 Like

Hi,

I personally use www.deployhq.com.

You should take a look at dploy.io. There is no cap on how many deployments you can make.

I use DeployHQ for most of my projects.

It basically allows you to always have a copy of any branch of your choice, from any private or public GitHub or Bitbucket repository, deployed within seconds of pushing to that branch.

Combined with the Kirby Git Submodule workflow it makes managing Kirby websites a breeze.

1 Like

I haven’t used dploy.io or DeployHQ, but I’ve heard of them and I’ll have a look, as they look very nice! But I just want to suggest rsync, for the unbeatable price of zero for a lifetime membership. :wink:

In any case…here’s how I do it:

Let’s say I have a new feature I want to publish.

  1. I commit and push on Git (to a master or feature branch).
  2. I have a Grunt task that does a few things (like CSS and JS builds).
  3. I have a deploy script that uses rsync to publish the changes to the live (or preview) server. For example, I exclude the content folder and some other stuff. Approximately like so:
#!/bin/bash

if [ "$1" = "live" ]
then
    echo "Deploying on live server..."
    rsync -avz --delete --exclude-from=deploy-exclude.txt . live_server_and_path
elif [ "$1" = "preview" ]
then
    echo "Deploying on preview server..."
    rsync -avz --delete --exclude-from=deploy-exclude.txt . preview_server_and_path
else
    echo "Please specify a target parameter [live, preview]."
fi

Once you have a live site, you need a way to get the live content back to your dev site. For that, I use another script that runs rsync in the other direction: it syncs my local content folder (plus some other stuff like avatars, accounts, etc.) from the live server. Approximately like so:

rsync -avz --delete live_server_and_path .

Obviously, there are lots of configuration options for rsync; you can bend it to your needs.

4 Likes

While I was looking for Dploy.io and DeployHQ, I stumbled upon Dploy, a Node.js tool that helps you deploy through FTP/SFTP.
Could be handy for those limited shared hosting plans.

git pull

If my client is running on hosting that doesn’t support SSH or Git, I first urge them to move; if that doesn’t work, I sync over FTP with Transmit.

1 Like

Somebody might find this useful: Dandelion. It’s a script I found a couple of years ago, and it works quite nicely for FTP deploys. Basically, it uploads just the changed files from your local repo.
It doesn’t work with submodules, though.

Is anyone having luck getting git-ftp to work with submodules?

Edit: to be more specific:

git ftp init ... begins to do its thing but gets some errors:
curl: (25) Failed FTP upload: 553
It seems that all submodules (including kirby and panel) are uploaded as 0 KB files.
As you can see in the buffer log below, git-ftp reads each submodule folder as a file to upload, not as a directory.

Sun May  3 11:53:31 PDT 2015: [25 of 38] Buffered for upload 'content/site.txt'.
Sun May  3 11:53:31 PDT 2015: [26 of 38] Buffered for upload 'index.php'.
Sun May  3 11:53:31 PDT 2015: [27 of 38] Buffered for upload 'kirby'.
Sun May  3 11:53:31 PDT 2015: [28 of 38] Buffered for upload 'license.md'.
Sun May  3 11:53:31 PDT 2015: [29 of 38] Buffered for upload 'panel'.
Sun May  3 11:53:31 PDT 2015: [30 of 38] Buffered for upload 'readme.md'.

The strange thing is that I have this working for one repository and not another…

A problem when you’re stuck with FTP is that it doesn’t understand symlinks, so tools like DeployHQ and the like can’t deploy them. If you want to manage some plugins with submodules + symlinks (like Uniform), you’ll have to create the symlinks manually on the server or find another approach.

I just tested both Dploy.io and DeployHQ with free accounts on a shared hosting via FTP.

Dploy.io is slow. Deploying the whole site for the first time took a bit more than 9 minutes. As a comparison, I also uploaded the whole site with Transmit (not the fastest FTP client around), and it took 3m 35s. I also didn’t find the interface very intuitive or clear.

DeployHQ, in comparison, is fast. Crazy fast. The same site took 1m 18s to deploy; I don’t know how they do it. I like its interface a lot more: it is always obvious what you deploy, from where, and up to which commit. Post-commit hooks are nicely explained…

Your mileage may vary, but for me the winner is DeployHQ by a mile.

Edit: Dploy.io is hosted in the USA, DeployHQ in the UK. I’m in France, which could explain why the latter is faster for me.