Network Effects

Reputation: 174

What's the best strategy to control the use of large files in a git repo?

My group works on a project that includes some large data files which are integral to our development process and need to be versioned. However, there are times (like when I'm at home or at Starbucks) when I do a pull and don't really want to bother with small changes to those big files. One idea I've had is moving all the big files to a branch (e.g., full-scale-testing) and only switching to it when I'm ready to deal with the big files. I'm wondering if anyone else has come up with a better way of dealing with this type of situation.

Upvotes: 3

Views: 598

Answers (1)

VonC

Reputation: 1323833

git-lfs/git-lfs issue 227 touched on that issue:

For completeness: if you don't want the repo to automatically get everything, you have to add a .lfsconfig file in the repo with fetchexclude=*

You can then get the files using git fetch --all
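As a rough sketch, that would mean committing something like this to the repository root (the .lfsconfig file name and the lfs.fetchexclude key follow Git LFS's config conventions; treat the exact spelling as an assumption drawn from the issue comment):

    # .lfsconfig at the repository root (sketch)
    # Tells Git LFS not to fetch any LFS objects automatically
    [lfs]
        fetchexclude = *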

Or:

You can also run any of the LFS "download" commands (clone, pull, fetch) with the -X flag and a pattern of files to exclude.
Similarly, the -I flag accepts a glob of files to include.
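For example (a sketch; the data/ paths are only placeholders for wherever your large files live):

    # Pull everything except the large data files
    git lfs pull -X "data/*"

    # Later, download only the subset you actually need
    git lfs pull -I "data/full-scale/*"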

Finally, you also have:

Skips automatic downloading of objects on clone or pull. This requires a manual "git lfs pull" every time a new commit is checked out on your repository.
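That description matches the --skip-smudge option of git lfs install; a minimal sketch of that workflow would be:

    # Disable the LFS smudge filter so clone/pull stop downloading objects automatically
    git lfs install --skip-smudge

    # Download the large objects explicitly when a checkout actually needs them
    git lfs pull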

Upvotes: 1
