Reputation: 1061
I am trying to reduce my repo size and push it to a remote new-remote. What I tried is:

git checkout --orphan clean
git rm --cached (on those large files)
git reflog expire --expire=now --all
git gc --aggressive --prune=now

I was hoping that this way I would only need to remove the large files in HEAD, since there is no history contained in this branch. However, after I did git push, I found that "Compressing objects" was reduced but "Counting objects" remained the same.

Could I know why this is the case, and which is the better indicator of whether new-remote would end up with a shrunken repo? Would git filter-branch be needed in my case?
Upvotes: 1
Views: 1016
Reputation: 1328152
If you start from scratch in a new orphan branch, I would check the actual size of the repository by cloning the remote repository you just pushed to: cloning it into a new local folder gives you the real size of your new repository.
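For example, a minimal way to check (the remote URL and folder name below are placeholders for yours):

# fresh clone of the remote you just pushed to (example URL)
git clone https://example.com/your/new-remote.git size-check
cd size-check

# object-store statistics: object count, size on disk, packed size
git count-objects -v -H

# or simply measure the .git folder
du -sh .git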
But if that remote repository still includes all the other branches (and not just the new orphan branch), you would still get the large files referenced by past commits.
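To see which large blobs are still reachable from those other branches, standard plumbing commands are enough (sizes are in bytes; adjust head -20 as needed):

# list the 20 largest blobs reachable from any ref, with their paths
git rev-list --objects --all \
  | git cat-file --batch-check='%(objecttype) %(objectname) %(objectsize) %(rest)' \
  | awk '$1 == "blob" {print $3, $4}' \
  | sort -rn \
  | head -20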
I would then recommend to:

- install git filter-repo (Python-based)
- run, for instance, git filter-repo --strip-blobs-bigger-than 2M (content-based filtering)
- git push --mirror to the new remote, and make sure to notify any collaborator on that repository (see the sketch below)
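A minimal sketch of that workflow, assuming example URLs and keeping the 2M threshold from above (git filter-repo rewrites history, so work on a fresh clone):

# work on a fresh clone, as git filter-repo expects (otherwise it asks for --force)
git clone https://example.com/your/old-repo.git repo-slim
cd repo-slim

# rewrite history, dropping every blob larger than 2 MB
git filter-repo --strip-blobs-bigger-than 2M

# filter-repo removes the origin remote; point the repo at the new remote (example URL)
git remote add new-remote https://example.com/your/new-remote.git

# push (and prune) all rewritten refs to the new remote
git push --mirror new-remote

Since --mirror replaces every ref on that remote with the rewritten history, collaborators will need to re-clone rather than pull.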
Upvotes: 1