Reputation: 2666
I am trying to use GitHub for some academic work, as a replacement for heavy-weight cloud storage systems like Dropbox and Google Drive. This means I'd like to save PDF figures and manuscripts to my git repo, so that collaborators can clone/pull them from my private repo (I keep code and .tex files in the same repo, so git still feels appropriate). Unfortunately, the .git folder can quickly get massive, since git saves all the old, unwanted versions of scientific figures (we generate a lot before settling on something publication-worthy).
Would it be possible to have git automatically remove the version history of all binary (.pdf, .png, etc.) files on each commit? That is, for certain filetypes, git would keep only the files present in the latest commit.
Alternatively, is there a simple command I can run periodically to wipe out the binary file version history, whenever .git starts to get massive?
Upvotes: 1
Views: 153
Reputation: 83557
Git is not intended as a backup service; it is built as a version control system. That means every file tracked by git carries its full history of changes, and all of those changes are stored inside the repository's .git directory.
You should not use Git as a replacement for Dropbox or Google Drive: they serve entirely different purposes.
If you truly need version control for large files, then you should look at git-lfs. Once again, this is a version control system, not a file storage system: it still keeps every version of the file, but it stores the large file contents outside the normal git object database, so your clone stays smaller. If you just need file storage, use an existing cloud storage service instead.
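If you do go the git-lfs route, the setup is small: after a one-time `git lfs install`, you tell LFS which patterns to manage with `git lfs track`, which records them in a `.gitattributes` file that you commit. A minimal sketch (the file patterns here are examples matching the question; adjust for your own filetypes):

```
# .gitattributes — entries like these are written by running, e.g.:
#   git lfs track "*.pdf"
#   git lfs track "*.png"
*.pdf filter=lfs diff=lfs merge=lfs -text
*.png filter=lfs diff=lfs merge=lfs -text
```

Note that this only affects files added after the patterns are tracked; binaries already in your history stay where they are unless you rewrite history.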
Upvotes: 2