Reputation: 79
Hello everybody and I hope you're having a good day!
I have a small problem which I've searched for a solution to no avail. Maybe somebody here could help me? The problem is this:
I am using a script I wrote that loops through all the repository folders and performs a hotcopy of each repo to a network location. All works really well apart from one repo, which fails with the following error:
svnadmin: Can't open file 'E:\repositories\20100831_repository_xyz\db\revs\0\235': The system cannot find the file specified.
Unable to backup the repository.
I don't mind losing the revision, but I obviously can't export and import (losing all revisions). Also, I really need to get it fixed, as it's breaking my nightly backups! 500 GB used in 4 weeks...
Anybody have a working solution?
Upvotes: 0
Views: 510
Reputation: 4856
First of all, you're better off using svnadmin dump, as it will package each repo with its metadata: revisions, changes, users, etc. Your "down-and-dirty" method is risky because there is the possibility of losing some files while transferring them over the network. You can then safely transport all the repos over an intranet, the Internet, or whatever, and either load them into another SVN instance or just keep the files.
This is how the command works:
svnadmin dump REPOSITORY_NAME > out_file.dump
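Restoring the dump on another machine is the mirror-image command, and since your error says rev file 235 is the one that's missing, you can also dump a revision range to salvage everything before it. A hedged sketch (paths are placeholders; the 0:234 range is inferred from your error message):

```shell
# Restore a dump into a fresh repository:
svnadmin create /tmp/restored_repo
svnadmin load /tmp/restored_repo < out_file.dump

# Salvage only the revisions before the damaged one (235):
svnadmin dump -r 0:234 /path/to/damaged_repo > partial.dump
```

Loading partial.dump into a new repository loses rev 235 and later, which you said you can live with.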
What I use is a cron job that scans my svn_root directory (where all the root repo dirs are) with ls -1, gets the name of every directory (except ./ and ../), and performs a repository dump on each of them. After that is done, I safely move the files to another server via ssh.
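The cron job above can be sketched roughly like this (SVN_ROOT and BACKUP_DIR are placeholder paths; adjust for your layout):

```shell
#!/bin/sh
# Dump every repository under SVN_ROOT to BACKUP_DIR -- placeholder paths.
SVN_ROOT=/var/svn
BACKUP_DIR=/var/backups/svn

for repo in "$SVN_ROOT"/*/; do
    # basename strips the path and trailing slash, leaving the repo name.
    name=$(basename "$repo")
    # --quiet suppresses the per-revision progress messages on stderr.
    svnadmin dump --quiet "$repo" > "$BACKUP_DIR/$name.dump"
done
```

Afterwards, scp or rsync the *.dump files to the backup server over ssh.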
And another thing: if you want another working server that is kept up to date (as a slave/mirror), you can use repository hooks to sync it with the master. Drop a comment on this if you would like me to explain that; it is not hard at all.
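For the mirror idea, the usual approach is svnsync driven from a post-commit hook. A minimal sketch, assuming a hypothetical mirror URL and that the mirror was initialized once beforehand:

```shell
#!/bin/sh
# post-commit hook sketch -- the mirror URL below is a placeholder.
# One-time setup (run manually, not from the hook):
#   svnsync init MIRROR_URL MASTER_URL
REPOS="$1"   # path of this repository (passed in by Subversion)
REV="$2"     # revision that was just committed
# Push new revisions to the mirror; the trailing & keeps the
# commit from blocking while the sync runs.
svnsync sync --non-interactive http://mirror.example.com/svn/repo &
```

The hook lives in REPOS/hooks/post-commit on the master and must be executable.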
Upvotes: 2