Reputation: 657
When a GitHub repository (for example user/myRepository) has large data files managed by Git LFS, cloning it (with git-lfs installed on the client) with git clone https://github.com/user/myRepository.git
downloads the whole repository, including the large files, which can take quite a long time.
Is there a way to skip the large files when you only want to quickly clone the code?
I naively tried running:
git lfs uninstall
before:
git clone https://github.com/user/myRepository.git
but it still took a long time and the large files were cloned anyway.
I'm looking for a simple workflow like this:
If I want the large files, I do:
git lfs install
git clone https://github.com/user/myRepository.git
If I want a fast clone and don't need the large files, I do:
git lfs uninstall
git clone https://github.com/user/myRepository.git
Upvotes: 1
Views: 1208
Reputation: 76599
You can set the environment variable GIT_LFS_SKIP_SMUDGE=1
while cloning, and then run git lfs pull
later to download the large files if you want to check them out.
The environment variable is documented in the git-lfs-config(5)
manual page.
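As a sketch of that workflow (the repository URL is the asker's example; the LFS pointer files replace the large files in the working tree until you pull them):

```shell
# Clone without downloading LFS objects; only small pointer files
# are checked out in place of the large files.
GIT_LFS_SKIP_SMUDGE=1 git clone https://github.com/user/myRepository.git
cd myRepository

# Later, download the actual large files when you need them:
git lfs pull
```

Because the skip is controlled per invocation by the environment variable, you don't need to toggle git lfs install / git lfs uninstall globally.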
Upvotes: 1