anti

Reputation: 3125

How can I add large files to a Git repo?

I have a Microsoft Azure / Visual studio online repo managed with Git. I am using the Git GUI application to manage it.

I have a couple of files that are 535 MB and 620 MB in size. I would like to add these to the repo.

I have enabled Git Large File Storage (LFS) support, and I have set the global post buffer with the command:

git config --global http.postBuffer 1048576000

No matter what I do, I cannot seem to add these files. The commit is fine, but when I push to the remote branch, I get:

POST git-receive-pack (547584390 bytes)
error: RPC failed; HTTP 503 curl 22 The requested URL returned error: 503
fatal: the remote end hung up unexpectedly
fatal: the remote end hung up unexpectedly
Everything up-to-date

As far as I know, adjusting the buffer like this should work in this case. What am I missing?

Upvotes: 1

Views: 2493

Answers (2)

anti

Reputation: 3125

After fighting with Git for weeks, I have finally solved this.

The answer for me was to clone the repo again using SSH instead of HTTP.
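For reference, a rough sketch of what that looks like (the SSH URL below uses the current Azure DevOps format; the organization/project/repository parts are placeholders, not taken from the original repo):

git remote set-url origin git@ssh.dev.azure.com:v3/{organization}/{project}/{repository}

or, to re-clone from scratch over SSH:

git clone git@ssh.dev.azure.com:v3/{organization}/{project}/{repository}

After that, the same push goes over SSH and is no longer subject to the HTTP post buffer limit.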

Upvotes: 1

VonC

Reputation: 1323115

Activating LFS locally (git-lfs.github.com as you mention) is a good first step.
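In case it helps, activating it locally is a one-time step per machine (this assumes the git-lfs binary is already installed):

git lfs install

which wires the LFS smudge/clean filters into your Git configuration.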

Also check the prerequisites and limitations in the Azure DevOps documentation: Azure Repos / "Use Git Large File Storage (LFS)".

Finally, if you just added/committed the large file, it is better to reset that commit (assuming you don't have any other work in progress), and then track it through LFS:

git reset @~
git lfs track myLargeFile
git add .gitattributes myLargeFile
git commit -m "Add large file through LFS"
git push

Upvotes: 1
