Reputation: 946
I have several files that are too big to commit to GitHub, and now I am forced to rewrite the history using git filter-branch.
I have four CSV files above 100 MB that I would prefer to just remove from the history. The files are named story_a002.csv, input_0019.csv, charlet.csv, and model_892.csv.
I know I can use the command below, but I have to run it once per file. Once the rewrite is complete I have to push the changes; unfortunately, I have to rewrite the history for all of them first, otherwise GitHub rejects the push with a "file too large" error.
git filter-branch -f --index-filter 'git rm --cached --ignore-unmatch <filename>'
Is there a way to chain this so I can rewrite once and do a single push?
Upvotes: 1
Views: 402
Reputation: 1324198
First, you might consider the newer git filter-repo, which is intended to replace the old git filter-branch and BFG.
It has many usage examples, including path-based filtering:
To keep all files except these paths, just add --invert-paths:
git filter-repo --path aFileToRemove --invert-paths
Try it with multiple --path options (to remove all your files in one command), and then force push.
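For example, here is a sketch of a single rewrite covering the four files from your question; the branch name main and the remote URL are placeholders you would adjust to your repository:

# remove every historical version of the four CSV files in one pass
git filter-repo --invert-paths \
  --path story_a002.csv \
  --path input_0019.csv \
  --path charlet.csv \
  --path model_892.csv

# filter-repo may remove the origin remote as a safety measure, so re-add it if needed
git remote add origin <your-repo-url>
git push --force origin main

Because every commit that touched those files is rewritten, a single force push afterwards is enough to update the remote.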
Upvotes: 4