Reputation: 1579
I have a folder where I keep all of my Git repos. I usually just do git pull
to get my changes, but now that I have over 50 repos it becomes a burden to have to do this for every folder.
How can I run a command that will loop through every repo and update it for me?
Upvotes: 7
Views: 8448
Reputation: 31
For better performance when dealing with many repositories, you can employ find
and xargs
to run commands in parallel. This example uses up to 4 concurrent processes:
find /foo/bar/ -mindepth 1 -maxdepth 20 -type d -print0 | xargs -0 -I {} -P 4 bash -c '[ -d "{}/.git" ] && echo "{}" && git --git-dir="{}/.git" fetch --all 2>/dev/null'
# note: if a repo path contains characters special to the shell, the command will complain, e.g.
# bash: line 1: /foo/bar/${env.LOCAL_REPO}/.git: bad substitution
Of course you can substitute git fetch
with git stash
followed by git pull
or whatever other command makes sense for your particular scenario. I personally prefer this approach because it caters for setups where the git
repositories are hosted on GitLab, which allows nested repositories (via subgroups).
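The substitution mentioned above might look like the following sketch. It also passes the directory as a positional argument to bash -c instead of interpolating {} into the script, which sidesteps the bad-substitution problem with special characters; repos_root is a placeholder you should adjust:

```shell
repos_root=/foo/bar                  # adjust to the folder holding your repos
if [ -d "$repos_root" ]; then
  find "$repos_root" -mindepth 1 -maxdepth 20 -type d -print0 |
    xargs -0 -n 1 -P 4 bash -c '
      [ -d "$1/.git" ] || exit 0     # skip directories that are not repos
      echo "$1"
      git -C "$1" stash              # set local changes aside
      git -C "$1" pull 2>/dev/null   # then bring in the latest commits
    ' _
fi
```

Here `_` fills `$0` so each directory arrives safely as `$1`, even when the path contains `$`, backticks, or quotes.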
Repositories using git submodules
are deliberately not taken into account.
Upvotes: 1
Reputation: 1579
In Bash you can run this command, which will loop through every repo in your working directory, stash your changes, and pull the latest commits from origin.
for d in */; do cd "$d" || continue; git stash; (git pull &); cd ..; done
Some things to note:
(git pull &)
opens a subshell and executes the pull in the background
Upvotes: 13
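A variant of the same loop, sketched below, skips folders that are not Git repositories and adds a final `wait`, so the prompt comes back only after every background pull has finished:

```shell
# loop over every subfolder, run the pulls concurrently, then wait for all of them
for d in */; do
  [ -d "$d/.git" ] || continue       # skip folders that are not Git repos
  ( cd "$d" && git stash && git pull ) &
done
wait                                 # block until every background pull is done
```

Running the whole pull inside the `( ... )` subshell means each background job changes directory independently, so no trailing `cd ..` is needed.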