Reputation: 563
I have a git repository containing 11 different and independent projects (don't ask me why the **** they are all in one repository). Because some of the projects contain many assets, GitLab reports the size of the repo as about 14.3 GB, which causes huge checkout times (up to 20 minutes on our CI/CD system).
Because we only build one of the projects at a time, I want to split the projects into separate repositories. And because Project A does not need the commits that only touch files of Project B, I also want to clean up the whole history.
I already tried different ways:

- git filter-branch --prune-empty, but I want to keep the file structure.
- git filter-branch --index-filter --prune-empty with git rm --cached --ignore-unmatch (roughly the command sketched below), but I can still recover old files.
- The BFG's --delete-folders. Great result, but I can only provide a glob/regex, and some projects contain folders with the names of other projects (bad naming...) which are also wiped out...

The best would be a tool/command working like the BFG, but which allows me to provide paths to delete, or better, paths to keep.
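For reference, the index-filter attempt looked roughly like this (the removed project paths here are just placeholders for whatever does not belong to the target repo):

git filter-branch --index-filter \
    'git rm -r --cached --ignore-unmatch "Project B" "Project E"' \
    --prune-empty -- --all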
Example of the file structure:
./
+- Project A/
+- Project B/
+- UI Projects/
| +- Foo/
| +- Bar/
+- Project E/
| +- Foo/
| +- Bar/
+- Build/
| +- build_a/
| +- build_b/
| +- build_foo/
| +- build_bar/
| +- build_e/
My requirement is to keep only the paths belonging to each project (e.g. ./Project A/ and ./Build/build_a/ for Repo A).
Any suggestions?
Upvotes: 2
Views: 2347
Reputation: 2695
The following tree-filter satisfies your requirements:
find . ./Build -maxdepth 1 -path . -o -path ./Build -o -path "./Project A" -o -path ./Build/build_a -o -exec rm -rf {} +
Replace Project A and build_a with the actual project name. You can add other paths to keep, following the example of the ./Build folder.
Pass it to the --tree-filter option of filter-branch:
git filter-branch --tree-filter '...' --tag-name-filter cat --prune-empty -- --all
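Put together, with the find expression quoted as the tree-filter argument, the whole invocation would look roughly like this:

git filter-branch --tree-filter 'find . ./Build -maxdepth 1 -path . -o -path ./Build -o -path "./Project A" -o -path ./Build/build_a -o -exec rm -rf {} +' --tag-name-filter cat --prune-empty -- --all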
Upvotes: 2
Reputation: 45649
Well... you're kind of missing a bigger piece of the problem here, but I'll come back to that. To address your question as asked:
Of the options you've tried, filter-branch is the one that should have worked. (Be advised that git has a new tool, filter-repo, that they recommend over filter-branch; but I haven't taken the time to switch over, and it sounds like you have a nearly-working filter-branch procedure anyway, so I'll address the answer using filter-branch...)
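For completeness, the filter-repo way to express "keep only these paths" would look roughly like this; I haven't used it myself, so double-check against the filter-repo documentation:

git filter-repo --path "Project A" --path "Build/build_a"

By default, filter-repo keeps only the paths named with --path, so this is conceptually the inverse of the delete-everything-else tree-filter above.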
So, you say you could still recover the deleted files after using filter-branch with index-filter. There are several possible reasons for that, but generally the point is that git tries to avoid losing data unless it's really sure you no longer want it. So:
- filter-branch creates a set of "backup refs" (under refs/original/) whenever it rewrites a repo's refs. Those "backup refs" can still reach the old history.
- The reflogs can also still reach the old commits.

The easiest way to do away with all of that is to reclone from the repo where you did the clean-up. If you really want to clean it up in place, you need to (1) delete the refs under the original namespace; (2) expire or delete the reflogs - I've always had trouble getting git to expire them, but if all else fails rm -r .git/logs; (3) run gc. For this type of operation I use gc --force --aggressive --prune=now.
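If you go the in-place route, the sequence is roughly the following sketch (assuming the filter-branch rewrite has already been done):

# (1) delete the backup refs filter-branch left under refs/original/
git for-each-ref --format='%(refname)' refs/original/ | xargs -n 1 git update-ref -d
# (2) expire the reflogs (or, if that refuses to work, rm -r .git/logs)
git reflog expire --expire=now --all
# (3) garbage-collect the now-unreachable objects
git gc --force --aggressive --prune=now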
Now... the bigger problem is, if the combined history of 11 projects is 14.3 GB, then the history of each project is (on average) over 1 GB - and that's still ridiculous. You have a deeper problem. Splitting the repos is, IMO, a good idea (I'm not a fan of the "monorepo" trend); but you should also be trying to reduce the overall size of the repo.
Most likely you have large binary files under source control. Very rarely is that advisable. If you do need to do it, you should use a tool like git lfs to keep the core repo small and manageable. But if you're just storing build artifacts, or dependencies, or something like that, you would be better served to look into an artifact repository (Artifactory, Nexus, ...). This may require improved build tooling to manage dependency versions.
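If LFS turns out to be the right fit, the basic setup is only a few commands; here is a minimal sketch, where "*.psd" stands in for whatever large binary types you actually have:

git lfs install
git lfs track "*.psd"
git add .gitattributes
git commit -m "Track large binaries with Git LFS"

Note that this only affects new commits; moving the existing blobs in your history over to LFS is a separate step (git lfs migrate).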
Upvotes: 3