Managing a Client's Content Folder on a Live Site

Hi guys,

I have a question about keeping the content folder on the live site always up to date.

Currently, my dev setup involves me working on a site locally, linking it to a git repo and pushing up any changes.

When it's time for the site to go live, I SSH into the server and perform a git pull. Everything works.
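In practice that step is just a one-liner along these lines (the host and path here are placeholders, not my actual server):

ssh user@example.com 'cd /var/www/site && git pull'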

My issue now is that after a few months, once the client has gone into their panel and made many content changes, he wants me to do some updates. So far I've been FTPing into the server, copying his updated content folder and replacing my local one with it.

This method works but is very tedious. Another solution was to gitignore the content folder and link that folder to its own repo, either as a separate project or as a submodule. Then, when it was time to download the client's latest content, I would SSH into the server and git push from inside the content folder.
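Roughly, that setup looked like this (the folder name and repo URL below are just examples, not my exact config):

# option A: ignore the folder in the site repo and manage it as an independent repo
echo "content/" >> .gitignore

# option B: track it as a submodule pinned to a specific commit
git submodule add git@gitlab.com:client/content.git content

# then on the server, publishing the client's latest edits back up:
cd content
git add -A
git commit -m "Client content changes"
git push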

Right now I am using GitLab, and their structure is a little different: you can't have repos inside repos, so my nested content repo doesn't work anymore.

I'm just looking for ideas/solutions on how others handle this.

Sorry for the long read,
Cheers!

There are plugins that let you push changes from the remote site to a repo, for example the AutoGit plugin.


You can also use rsync, which will pull down the differences between your local copy and the remote copy. I use a bash script for this, triggered via npm scripts in my package.json:

"deploy:live:sim": "./deploy live",
"deploy:live": "./deploy live go",
"deploy:staging:sim": "./deploy staging",
"deploy:staging": "./deploy staging go",
"content:sync:sim": "./deploy sync",
"content:sync": "./deploy sync go",

These should be self-explanatory: the :sim commands show what will change without actually doing it, deploy pushes the site up, and content:sync pulls down just the content folder.
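The script itself is essentially an rsync wrapper. Here's a minimal sketch of the idea; the host, paths, and flags are placeholders rather than the exact script:

#!/usr/bin/env bash
set -euo pipefail

TARGET="${1:?usage: ./deploy {live|staging|sync} [go]}"
CONFIRM="${2:-}"

REMOTE="user@example.com"        # placeholder host
REMOTE_ROOT="/var/www/site"      # placeholder path

FLAGS=(-avz --delete --exclude=.git)
if [ "$CONFIRM" != "go" ]; then
  FLAGS+=(--dry-run)             # the :sim variant: report changes, transfer nothing
fi

case "$TARGET" in
  live)    rsync "${FLAGS[@]}" ./ "$REMOTE:$REMOTE_ROOT/" ;;
  staging) rsync "${FLAGS[@]}" ./ "$REMOTE:${REMOTE_ROOT}-staging/" ;;
  sync)    rsync "${FLAGS[@]}" "$REMOTE:$REMOTE_ROOT/content/" ./content/ ;;
  *)       echo "unknown target: $TARGET" >&2; exit 1 ;;
esac

The key detail is --dry-run: the same command runs either way, and the "go" argument is the only thing that makes it transfer files for real.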

If you are new to rsync, there is a good guide here.
