Reputation: 8693
How does SVN (or Git) or another version control system perform if I have, say, tens of TB of data being managed?
What are the main things that I would need to consider when going to such large repositories?
Upvotes: 3
Views: 690
Reputation: 1112
I assume that, besides SVN and Git, you are also considering other version control systems for your large projects. If you are going to manage relatively large repositories, you may want to check the following factors.
Data storage: whether the version control tool stores its data directly on the file system or uses a database such as SQL Server as the back-end. A plain file-system store is generally more fragile.
Security: including the management of access permissions, database encryption, how easy it is to back up the repository, and security at the network level. After all, source code is your most treasured asset.
Concurrent connections: this depends on how many developers you have. Since the projects are large, I assume many developers (possibly located in different countries) will be working on them, in which case you should check whether the version control tool can handle heavy concurrent access gracefully.
Upvotes: 0
Reputation: 682
This was covered on our forum a while back, with responses from some Subversion committers; here it is for reference: http://www.svnforum.org/threads/39795-Is-there-any-inherent-Subversion-repository-size-limit
The short answer is that there isn't an upper limit, except that imposed by the filesystem.
In practice, though, you're going to hit problems with a repository as massive as the one you propose, especially if you need to do things like svnadmin dump/load, or need to search through a very busy log history.
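For a sense of what that dump/load cycle looks like, here is a rough sketch (the repository paths and revision range are hypothetical); on a multi-terabyte repository each of these steps can run for hours or days, which is exactly why repository size becomes painful:

```shell
# Dump a revision range incrementally rather than the whole history at
# once, so a failure partway through does not force a restart from r0.
# (Paths and the revision range below are made up for illustration.)
svnadmin dump /srv/svn/bigrepo -r 0:10000 --incremental > bigrepo-0-10000.dump

# Load the dump into a freshly created repository.
svnadmin create /srv/svn/bigrepo-new
svnadmin load /srv/svn/bigrepo-new < bigrepo-0-10000.dump
```

Dumping in revision ranges like this also lets you move the history in chunks, which is usually more practical than a single monolithic dump file at this scale.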
Upvotes: 1