Reputation: 11391
I have a local repository that was not pushed to Bitbucket before.
My working folder (with the local .git folder) has grown to 1.7 GB, so I decided to push it to Bitbucket as an additional backup using
git push
but that always fails with the following error:
> git push
Pushing to [email protected]:workspace/repository.git
Enumerating objects: 62975, done.
Counting objects: 0% (1/62975)
Counting objects: 1% (630/62975)
...
Counting objects: 99% (62346/62975)
Counting objects: 100% (62975/62975)
Counting objects: 100% (62975/62975), done.
Delta compression using up to 12 threads
Compressing objects: 0% (1/33144)
Compressing objects: 1% (332/33144)
...
Compressing objects: 99% (32813/33144)
Compressing objects: 100% (33144/33144)
Compressing objects: 100% (33144/33144), done.
Writing objects: 0% (1/62975)
Writing objects: 1% (632/62975)
Writing objects: 1% (1094/62975), 3.45 MiB | 2.79 MiB/s
...
Writing objects: 29% (18265/62975), 70.41 MiB | 1.13 MiB/s
Writing objects: 29% (18282/62975), 71.57 MiB | 1.16 MiB/s
client_loop: send disconnect: Broken pipe
fatal: the remote end hung up unexpectedly
fatal: the remote end hung up unexpectedly
Based on similar issues, I have already tried updating ~/.ssh/config to this:
> cat ~/.ssh/config
Host *
ServerAliveInterval 600
TCPKeepAlive yes
IPQoS=throughput
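(As a sanity check, the configuration that OpenSSH actually resolves for the Bitbucket host can be inspected with the standard ssh -G option; the host name below is just the Bitbucket Cloud SSH endpoint:)
> ssh -G bitbucket.org | grep -iE 'serveraliveinterval|tcpkeepalive|ipqos'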
The error happens after 30-60 seconds - it always fails at a different position.
What can cause that problem, and how can it be fixed?
Upvotes: 2
Views: 826
Reputation: 1326686
I don't think this is related to the SSH key, which does authenticate you correctly to Bitbucket.
This is more likely linked to a Bitbucket repository size limit, as documented by Atlassian: if you have a single giant commit of more than 1 GB, that push would fail to upload.
Try using a tool like github/git-sizer on your local repository to evaluate not just the global size of the repository, but also the size of its largest objects.
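As a sketch, assuming git-sizer is installed and on the PATH, run it from the top of the working tree (the path below is a placeholder); the --verbose flag prints the full set of statistics, including the largest blob and the largest single commit:
> cd /path/to/local/repository
> git-sizer --verbose
The rows flagged with a high "level of concern" in that report point at the objects or commits most likely to be tripping the size limit on push.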
Upvotes: 1