Reputation: 11287
All the questions I found want to avoid timeouts in git push/pull; in my case I want to force them. My pushes and pulls all go over SSH to remote machines that might be unavailable at some point. For example, I have a script that pushes to two remote public repos. I don't want this script to hang forever when it pushes to the first repo and that machine is unavailable. Instead, after some timeout I want the push to fail so the script can continue with the second repo.
Any options here?
Upvotes: 13
Views: 10398
Reputation: 8808
There are git config settings for using HTTP for transfers:

http.lowSpeedLimit, http.lowSpeedTime: If the HTTP transfer speed, in bytes per second, is less than 'http.lowSpeedLimit' for longer than 'http.lowSpeedTime' seconds, the transfer is aborted. Can be overridden by the GIT_HTTP_LOW_SPEED_LIMIT and GIT_HTTP_LOW_SPEED_TIME environment variables. (See the documentation for git config.)

E.g., add to your .gitconfig:
[http]
    # Abort if transfer speed is less than 1000 bytes/second for more than 5 seconds
    lowSpeedLimit = 1000
    lowSpeedTime = 5
or, using environment variables, e.g.,
$ GIT_HTTP_LOW_SPEED_LIMIT=1000 GIT_HTTP_LOW_SPEED_TIME=5 git pull
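You can also set the same values from the command line instead of editing .gitconfig by hand (git config writes the [http] section for you):

```shell
# Persist the limits in ~/.gitconfig; drop --global to set them per-repository
git config --global http.lowSpeedLimit 1000
git config --global http.lowSpeedTime 5
```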
Upvotes: 1
Reputation: 118039
I don’t think you can do an automatic fail-over with built-in features. But since Git just uses SSH underneath, it should work to add a ConnectTimeout option for the machines in question in your .ssh/config (cf. man ssh_config). Then something like git push foo || git push bar in the shell should do what you want.
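A runnable sketch of that pattern, using two local bare repositories as stand-ins for the two SSH remotes (the remote names foo/bar and the directory names are placeholders; with real SSH hosts you would additionally put a "Host ... / ConnectTimeout 10" entry in ~/.ssh/config so an unreachable machine fails fast instead of hanging):

```shell
set -e
tmp=$(mktemp -d)
git init --bare -q "$tmp/a.git"    # stands in for the first public repo
git init --bare -q "$tmp/b.git"    # stands in for the second

# A working repo with one commit to push
git init -q "$tmp/work" && cd "$tmp/work"
git config user.email you@example.com && git config user.name you
echo demo > file && git add file && git commit -qm "initial commit"
git remote add foo "$tmp/a.git"
git remote add bar "$tmp/b.git"
branch=$(git symbolic-ref --short HEAD)

# The fail-over pattern: a failed (or timed-out) push to one remote
# does not stop the push to the other.
for remote in foo bar; do
    git push -q "$remote" "$branch" || echo "push to $remote failed; continuing" >&2
done
```

With real remotes, ConnectTimeout makes SSH abort the connection attempt after the given number of seconds, so the `||` branch fires instead of the script hanging.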
Upvotes: 14