Reputation: 3476
Does Git have a limit on how much data can be stored in a first push? I am moving all of my websites into Git, and some of them contain years of data. I downloaded the site, created a repository and ran
$ git add .
At some point it says that nothing can be written to the index, basically locking it up so that nothing can go through. Is there something I am missing?
The total size of the site is 1GB. I am sure I could cut that down, but it's mostly user PDFs and media presentations. Is Git the right solution here, or is Mercurial?
Upvotes: 1
Views: 986
Reputation: 1767
One possible reason might be that your pack file (which is how Git saves snapshots) exceeds your underlying file system's maximum file size; I believe that is 4GB for FAT32, for example. If that is the case, a setting of, say, 200MB for pack.packSizeLimit might solve your problem.
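If you want to try that, the setting can be applied per repository with git config (the 200MB value here is just the illustrative figure from above, not a tuned recommendation):
$ git config pack.packSizeLimit 200m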
Another possible reason might be ownership/permission issues: for example, you ran git init as root but are trying to git add as a regular user.
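You can check whether that is the case and, if so, hand the repository back to your regular account (youruser is a placeholder for your actual username):
$ ls -la .git
$ sudo chown -R youruser .git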
Upvotes: 0
Reputation: 72637
Git does not have any theoretical limits on the number of files, or the total size of files, in a commit or repository.
However, there are definitely practical limits to the size of a repository; Linus Torvalds mentions these in a message here.
As the size of the repository grows, the time taken to do things will also grow. There are a few answers about that, such as this one.
There are also a few questions which suggest that physical limitations (memory, specifically) can impose limits on some actions in a repository; there's a thread about that here, although these kinds of issues may be addressed through workarounds in git.
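Those workarounds usually amount to telling Git to use less memory when packing, for example via settings like these (the values are only illustrative):
$ git config pack.windowMemory 100m
$ git config pack.packSizeLimit 100m
$ git config pack.threads 1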
Addressing the question: from the comments, the error that you're getting (fatal: index file corrupt) isn't related to the aggregate size of the files you're trying to add to the index.
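A commonly suggested recovery step (assuming nothing has been committed yet, so the index can simply be rebuilt from the working tree) is to delete the corrupt index and stage the files again:
$ rm -f .git/index
$ git add .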
Upvotes: 2