Reputation: 7611
I have a piece of (someone else's) software which has an annoying tendency to destroy its workspace every now and then, corrupting important files if there is a hard shutdown while it is running.
It is already invoked via a wrapper script, so my response was to stick a
tar -czf backups/workspace_$(date +%F_%R).tar.gz workspace/
before the program is run (the workspace isn't that large -- less than 100M). I can use this solution (I will just need to put something in place to clean out old backups), but it seems inelegant, because most of the time very little changes in these workspaces. Most of the files in question are binary.
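For the cleanup step, a sketch of one common approach: delete tarballs older than a fixed age with find. The directory name and the 7-day cutoff are assumptions, not anything from the setup above.

```shell
backup_dir=backups   # hypothetical location of the tarballs
mkdir -p "$backup_dir"
# Delete backup tarballs last modified more than 7 days ago
find "$backup_dir" -name 'workspace_*.tar.gz' -mtime +7 -delete
```

Run this from the wrapper script just before (or after) taking the new backup.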
Yes, I know that the proper backup system is a "better" choice, but I would like to not use it for this.
The obvious solution is to use revision control: git. I have only used git manually, so I'm slightly unsure about automating it.
Question 1: After setting up the repo, is
git add workspace
git commit -m "backup on `date`"
going to do what I'm looking for?
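For context, here is a self-contained sketch of what that add-and-commit step does (a temp directory stands in for the real workspace parent, and the file names are illustrative; it assumes `git init` has been run once beforehand):

```shell
set -e
parent=$(mktemp -d)                 # stand-in for the real parent directory
mkdir "$parent/workspace"
echo data > "$parent/workspace/file.bin"
cd "$parent"
git init -q                         # one-time setup in the real wrapper
git config user.email backup@example.invalid
git config user.name backup
# The backup step itself: -A also stages deletions (see the answers below)
git add -A workspace
git commit -q -m "backup on $(date)"
```

In a real wrapper you would also want `|| true` on the commit, since `git commit` exits non-zero when nothing has changed.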
Question 2: Is there a better way that I'm not seeing? (NOT using large-scale backups--I want an incremental revision control scheme)
Upvotes: 1
Views: 95
Reputation: 129566
git add -A -- some/dir
will stage all modifications restricted to some/dir
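A quick demonstration in a throwaway repo that the `-A` form stages deletions inside the directory as well (names are illustrative):

```shell
set -e
repo=$(mktemp -d); cd "$repo"
git init -q
git config user.email demo@example.invalid
git config user.name demo
mkdir some
echo a > some/kept
echo b > some/gone
git add -A -- some
git commit -q -m initial
rm some/gone                 # delete a tracked file
git add -A -- some           # the deletion is staged too
git commit -q -m backup
```

After the second commit, some/gone is no longer tracked while some/kept still is.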
Upvotes: 1
Reputation: 11920
git add will not track deleted files; if you are looking to track deleted files as well, you can use git add -A. Otherwise, git add workspace + git commit will track new and modified files within workspace.
You could use cron to back up changes at a regular interval, which might provide better protection from data loss. Instead of losing everything changed since the application started, you would only lose the last 15 minutes/hour/day, depending on what interval you use for cron.
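A crontab entry for this might look like the following (the path is a placeholder for wherever the workspace's parent repository actually lives; note that literal `%` characters must be escaped as `\%` in crontab lines, though plain `date` here uses none):

```
# Every 15 minutes: stage everything under workspace/ and commit if changed
*/15 * * * * cd /path/to/parent && git add -A workspace && git commit -q -m "backup on $(date)" >/dev/null 2>&1
```

The trailing redirect keeps cron from mailing output on the frequent "nothing to commit" case.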
Upvotes: 1