Reputation: 61
I want to understand the best way to copy files from one remote server to another remote server using Python.
My setup looks something like this:
+--------------+
| Server A |
+--------------+
| Build Server |
+--------------+
|
|
+-------------+
| Server B |
+-------------+
| Python Code |
+-------------+
|
|
+------------+
| Server C |
+------------+
| App Server |
+------------+
I have a few RPMs stored on the build server. These binaries need to be transferred to the App server so that I can install them on that box.
Currently I am using Python's Paramiko library (sftp.get and sftp.put): I get the binaries from Server A to Server B and then transfer them from Server B to Server C (a minimal sketch of this is below). Is there any way I could structure my code so that the binaries can be transferred directly from Server A to Server C?
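For context, this is roughly what the current two-hop code looks like; the hostnames, usernames and file paths are placeholders, not my real values:

import paramiko

# Hop 1: pull the RPM from the build server (Server A) down to Server B.
ssh_a = paramiko.SSHClient()
ssh_a.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh_a.connect("server-a.example.com", username="builduser")
sftp_a = ssh_a.open_sftp()
sftp_a.get("/home/builduser/rpms/app-1.0.rpm", "/tmp/app-1.0.rpm")
sftp_a.close()
ssh_a.close()

# Hop 2: push the same file from Server B up to the app server (Server C).
ssh_c = paramiko.SSHClient()
ssh_c.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh_c.connect("server-c.example.com", username="appuser")
sftp_c = ssh_c.open_sftp()
sftp_c.put("/tmp/app-1.0.rpm", "/home/appuser/rpms/app-1.0.rpm")
sftp_c.close()
ssh_c.close()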
To be more precise, I would like to do something like this:
scp -3 user1@remote1:/home/user1/file1.txt user2@remote2:/home/user2/file1.txt
This kind of avoids the intermediate hop.
Suggestions/Improvements are much appreciated!
Upvotes: 1
Views: 1508
Reputation: 690
I would use rsync to handle this problem. You might be able to just directly call scp from Python using the subprocess module, or try an existing Python module that wraps or implements rsync. It will be much easier to call scp via subprocess than to perform all of the required operations via paramiko.
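As a rough sketch (reusing the hosts, users and paths from the question; adjust to your setup), the scp -3 call could be driven from Python like this:

import subprocess

# scp -3 copies remote-to-remote by routing the data through this
# machine (Server B) without writing it to local disk.
subprocess.run(
    [
        "scp", "-3",
        "user1@remote1:/home/user1/file1.txt",
        "user2@remote2:/home/user2/file1.txt",
    ],
    check=True,  # raise CalledProcessError if scp exits non-zero
)

Note that this only runs unattended if Server B can authenticate to both remotes non-interactively (e.g. via SSH keys or an agent).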
Upvotes: 1