Reputation: 33593
I have a website with many big image files. The source (as well as the images) is maintained with git. I wish to deploy it via FTP to a Bluehost-like cheap server.
I do not wish to deploy the whole website each time (so that I won't have to upload too many unchanged files over and over), but roughly to find the files that changed since the last deployment and upload only those.
It is similar in spirit to svn2web, but I want it for a DVCS. A Mercurial alternative would also be considered.
It's a pretty simple script to write, but I'd rather not reinvent the wheel if a similar script already exists on the web.
Capistrano and fab seem to know only how to push the whole revision in their SCM integration, so I don't think I can currently use them.
Upvotes: 10
Views: 7166
Reputation: 5361
For GitHub users, you can use the FTP Deploy Action.
Just add the following workflow to .github/workflows/main.yml:
on: push
name: 🚀 Deploy website on push
jobs:
  web-deploy:
    name: 🎉 Deploy
    runs-on: ubuntu-latest
    steps:
    - name: 🚚 Get latest code
      uses: actions/checkout@v2

    - name: 📂 Sync files
      uses: SamKirkland/FTP-Deploy-Action@<version>
      with:
        server: <ftp_server>
        username: <ftp_username>
        password: ${{ secrets.ftp_password }}
On each push to master, only the files changed since the last run will be uploaded to the FTP server. Note that the ftp_password secret has to be defined in the repository's settings under Secrets.
Upvotes: 0
Reputation: 16243
The command git-ftp push from git-ftp seems to work quite well.
Install it:
sudo apt-get install git-ftp
After installing it, configure your FTP account:
git config git-ftp.url ftp.example.net
git config git-ftp.user your-ftp-user
git config git-ftp.password your-secr3t
Then run it for the first time, which uploads everything:
git-ftp init
And then, for every change, you just need:
git add -A
git commit -m "update"
git-ftp push
The files will be uploaded to the FTP user's home directory (~/).
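If the site should land in a subfolder rather than the home directory, the target path can be included in the URL. The public_html folder below is only an example; use whatever directory your host actually serves files from:
git config git-ftp.url ftp://ftp.example.net/public_html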
Upvotes: 0
Reputation: 4127
The git-ftp script might be what you are looking for. It takes the changes in a local git repository and syncs them to a remote git repo over FTP.
I used it by creating a git repo with the --bare option and putting it on my FTP server.
Then I ran ./git-ftp.py. It prompts for the FTP username, password, FTP host, local git repo path and remote git repo path (the location of the bare repository).
It then connects to the FTP git repo and sends only the diffs (it uses the git-python library to get the information it needs).
The script has a few issues: it always seems to prompt for the username details, and I had to comment out line 68,
#ftp.voidcmd('SITE CHMOD 755 ' + node.name)
but those things can be easily fixed.
Alternative
If you are on a *nix platform, an alternative is to use curlftpfs. It mounts your FTP account as a directory, on which you can do all the normal git operations (push, pull). Of course this solution isn't git specific.
You need to create the repo shared over FTP with the --bare option as mentioned above, and run git update-server-info inside it before sharing it over FTP.
Caution: this isn't a good idea if you plan to have multiple users write to your git repo, as FTP has no mechanism to lock access; you will end up with a corrupt repo. Test before taking it to production.
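For reference, a minimal sketch of the curlftpfs route described above, assuming a mount point of ~/ftp-mount and a repository named site.git (both made up for illustration, as are the credentials):
# mount the FTP account as a local directory
mkdir -p ~/ftp-mount
curlftpfs ftp-user:[email protected] ~/ftp-mount
# one-time setup: a bare repository on the share
git init --bare ~/ftp-mount/site.git
(cd ~/ftp-mount/site.git && git update-server-info)
# afterwards, push to it like any other local remote; only new objects are transferred
git remote add ftpmirror ~/ftp-mount/site.git
git push ftpmirror master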
Upvotes: 7
Reputation: 11
You might just as well use wput (wput --timestamping --reupload --dont-continue), which is like wget, but for FTP uploading.
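A minimal sketch of how that could look; the local ./public directory, the credentials and the public_html target path are all made up for illustration:
# upload the local build directory, re-sending only files whose timestamps changed
wput --timestamping --reupload --dont-continue ./public/ ftp://user:[email protected]/public_html/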
Upvotes: 1
Reputation: 1324208
Another option would be to use git archive.
Of course, as mentioned by Joey in his "git archive as distro package format":
The tricky part of using a git (or other rcs) archive as distribution source package format is handling pristine upstream tarballs.
One approach would be to try to create a git archive that didn't include objects present in the upstream tarball. Then, to unpack the source package, you'd unpack the upstream tarball, convert the files in it into git objects and add them into the .git directory.
This seems like it might be possible to implement, but you'd need to know quite a lot about git internals to remove the redundant objects from the git repo and regenerate them from the tarball.
Another approach would be to keep the pristine upstream tarball in the git archive, and then the source package would consist entirely of the git archive. This doesn't have the same nice minimal bandwidth upload behavior -- unless you can "git push" your changes to do the upload.
Storing a lot of upstream tarballs in git wouldn't be efficient, but the script pristine-tar takes care of that:
pristine-tar can regenerate a pristine upstream tarball using only a small binary delta file and a copy of the source which can be a revision control checkout.
The package also includes a pristine-gz command, which can regenerate a pristine .gz file.
The delta file is designed to be checked into revision control along-side the source code, thus allowing the original tarball to be extracted from revision control.
More details are in the header of the pristine-tar perl script.
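As a side note, and not something from Joey's article: git archive accepts pathspecs, so it can also be combined with the diff-based approach from the other answers to pack only the changed files into a tarball and upload that in one go. A minimal sketch, where $deployed stands for the last deployed revision:
# pack only the files changed since the last deployed revision;
# note that deleted files are not represented in the resulting tarball
git archive -o update.tar HEAD $(git diff --name-only $deployed HEAD)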
Upvotes: 1
Reputation: 4244
You can store the latest deployed revision somewhere in a file, then you can simply get the names of the changed files:
$ git diff --name-only $deployed $latest
Substitute the corresponding SHA-1 hashes; $latest can also simply be "master", for example.
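A minimal sketch of how the whole loop could look, assuming a .deployed-rev bookkeeping file and using curl for the upload (the host, credentials and public_html path are placeholders):
#!/bin/sh
deployed=$(cat .deployed-rev)          # revision uploaded last time
latest=$(git rev-parse master)         # revision to deploy now

# upload only the files that changed between the two revisions;
# --ftp-create-dirs makes curl create missing remote directories
# (deleted files are not handled here)
git diff --name-only "$deployed" "$latest" | while read -r f; do
    curl --ftp-create-dirs -T "$f" "ftp://user:[email protected]/public_html/$f"
done

echo "$latest" > .deployed-rev         # remember what we deployed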
Upvotes: 2