Reputation: 810
I don't want to push >100MB files to my repos as data connectivity is a constraint for me.
Is there any way, any script that automatically removes >100MB files (irrespective of their file format) from my commits?
The solution should preferably:

- print a warning along with a list of the files it is removing from commits (see the sketch below)
- not require me to type long commands (git or otherwise)
- be simple and easy to use with any new repo
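Something like this rough, untested pre-commit hook is what I picture (assuming a POSIX shell, filenames without newlines, and 100MB as the cutoff):

```sh
#!/bin/sh
# .git/hooks/pre-commit -- warn about and unstage any staged file >100MB.
limit=$((100 * 1024 * 1024))

# List files staged for this commit (added, copied, or modified).
git diff --cached --name-only --diff-filter=ACM | while IFS= read -r f; do
    [ -f "$f" ] || continue
    size=$(wc -c < "$f")
    if [ "$size" -gt "$limit" ]; then
        echo "WARNING: removing $f ($size bytes) from this commit"
        git reset -- "$f"   # unstage the file; it stays on disk
    fi
done
```

It would have to be dropped into `.git/hooks/pre-commit` and made executable (`chmod +x`) in each new repo.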
P.S.
I know there is a 100MB limit on file size, and we get an error while pushing larger files to the GitHub server.
I am not interested in pushing data through the git lfs service.
I have been omitting data file types in my .gitignore file. However, I often like to commit *.pkl (Python pickle) files that are <100MB.
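For reference, the kind of type-based omission I mean looks like this (an illustrative .gitignore line, not my exact file), and the problem is that such a pattern is all-or-nothing: it also blocks the small pickle files I do want to commit:

```
# exclude pickle files by type -- size plays no role here
*.pkl
```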
Upvotes: 1
Views: 1030
Reputation: 2394
If this happened to you: you ran

```
git revert
```

which creates a new commit where the large file doesn't exist, but leaves the large file in the repository's history of commits.
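For example (the commit hash here is hypothetical):

```sh
git revert abc1234   # adds a new commit undoing abc1234;
                     # the 100MB blob introduced there stays in history
```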
Then doing this may solve your problem:

```
git reset --soft HEAD^
git reset
```

Repeat this until you reach the commit that contains the >100MB file. Suppose you made 3 more commits after the large-file commit (so there are now 4 pending commits); then you have to run these two commands 4 times.
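Concretely, the 4-commit case looks like this (a sketch; the commit layout is assumed):

```sh
# History: HEAD, HEAD~1, HEAD~2 are the later small commits;
# HEAD~3 is the commit that added the >100MB file.
git reset --soft HEAD^ && git reset   # 1st run: undo the newest commit
git reset --soft HEAD^ && git reset   # 2nd run
git reset --soft HEAD^ && git reset   # 3rd run
git reset --soft HEAD^ && git reset   # 4th run: undo the large-file commit
# All changes are now unstaged in the working tree; re-add everything
# except the large file and commit again.
```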
For more details, you can click here
Upvotes: 3