Reputation: 4603
I'm moving hosting from one Linux server to another. Both run cPanel, but my source host has disabled the backup function within cPanel due to some issues it was causing and refuses to re-enable it.
I have 36 GB of content that I need to transfer from one server to the other.
I'm wondering if connecting to the shell with SSH and using wget to download all the data to the new server is a good idea.
Does anyone foresee any issues with this approach given the vast quantity of content? Any tips?
Upvotes: 0
Views: 4909
Reputation: 18193
You might have an easier time using scp. Since you have SSH access, it should work. With scp you can copy a directory recursively, so it might be as simple as the command below. For example, log in to the destination server (the server you wish to copy the files to) and try this command:
scp -r username@source_host:/path/to/source/directory .
The dot at the end represents the current directory you are in on the destination server. You can also specify a path instead of the dot to copy the files to some other location on the destination server.
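For instance, assuming a hypothetical destination directory of /home/newuser/site_content on the new server (substitute whatever path you actually want there), it would look like this:

scp -r username@source_host:/path/to/source/directory /home/newuser/site_content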
wget can also copy a directory recursively, but the difference is that those files need to be accessible via FTP or HTTP, whereas scp can copy any files on the file system you have access to.
Having read the wget man page, I'd say a recursive download is really only useful with an ftp:// URL. With HTTP, it will follow the links in the document, which may not include all of your files. Frankly, I've never used wget to download more than one file, and I always use scp for copying multiple files/directories.
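For completeness, if your source host does expose the files over FTP, a recursive wget would look something like the line below (the credentials and path here are placeholders; substitute your own):

wget -r ftp://username:password@source_host/path/to/source/directory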
Upvotes: 3