Reputation: 174
My group works on a project that includes some large data files which are integral to our development process and need to be versioned. However, there are times (like when I'm at home or in Starbucks) when I do a pull and don't really want to bother with small changes to those big files. One idea I've had is moving all the big files to a branch (e.g., full-scale-testing) and only switching to it when I'm ready to deal with them. I'm wondering if anyone else has come up with a better way of dealing with this type of situation.
Upvotes: 3
Views: 598
Reputation: 1323833
git-lfs/git-lfs issue 227 touched on this:

> For completeness: if you want to not have the repo auto-get everything, you have to add a `.lfsconfig` file in the repo with `fetchexclude=*`.
> You can then get the files using `git fetch --all`.
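
A minimal sketch of that setup, assuming the repo root is the current directory and using `git config -f` to write the file (`.lfsconfig` uses plain git-config syntax, and LFS reads the `lfs.fetchexclude` key from it):

```sh
# Commit a .lfsconfig that tells LFS to skip all objects on fetch/pull.
git config -f .lfsconfig lfs.fetchexclude "*"
git add .lfsconfig
git commit -m "Skip automatic LFS downloads"

# When you actually need the big files: --all ignores the exclude list,
# and checkout then swaps the pointer files for the real content.
git lfs fetch --all
git lfs checkout
```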
Or:

> You can also run any of the LFS "download" commands (`clone`, `pull`, `fetch`) with the `-X` flag and a pattern of files to exclude.
> Similarly, the `-I` flag accepts a glob of files to include.
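
For instance (the paths here are only illustrations of the glob patterns):

```sh
# Pull everything except the LFS objects under data/ (hypothetical path):
git lfs pull -X "data/*"

# Or restrict the download to just the files you need right now:
git lfs pull -I "assets/small/*"
```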
Finally, you also have `git lfs install --skip-smudge`, which:

> Skips automatic downloading of objects on clone or pull. This requires a manual "`git lfs pull`" every time a new commit is checked out on your repository.

And the `lfs.fetchinclude` / `lfs.fetchexclude` settings, which persist that kind of path filtering in your Git config.
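
A sketch of how those pieces combine (the path is illustrative):

```sh
# One-time, per machine: never auto-download LFS content on clone/pull.
git lfs install --skip-smudge

# Commits now check out as small pointer files; when you actually want
# the data (e.g. back on a fast connection), download and materialize it:
git lfs pull

# The lfs.fetchinclude / lfs.fetchexclude keys narrow what `pull` grabs;
# comma-separated globs, set per repository:
git config lfs.fetchinclude "small-assets/*"
```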
Upvotes: 1