Reputation: 1032
I am trying to copy several files from a remote server to my local drive using scp in a Bash script. Here's the relevant part of the code:
scp -r -q $USR@$IP:/home/file1.txt $PWD
scp -r -q $USR@$IP:/home/file2.txt $PWD
scp -r -q $USR@$IP:/root/file3.txt $PWD
However, the problem is that EVERY time it copies a file, it asks for the server's password, which is the same each time. I want it to ask only once and then copy all my files.
And please, do not suggest rsync or setting up key-based authentication, since I do not want to do that. Are there any other ways...? Any help would be appreciated.
Upvotes: 3
Views: 5683
Reputation: 6158
Well, in this particular case you can fetch the two files under /home in a single transfer with a glob:
scp -q $USR@$IP:/home/file[1-2].txt $PWD
Note that file3.txt lives under /root, so a glob like file[1-3].txt on /home would miss it.
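If you also need /root/file3.txt, the classic SCP protocol lets the remote shell expand a single quoted argument into several paths, so one invocation (and therefore one password prompt) can cover files from different directories. A sketch, using the same $USR and $IP variables as the question:

```shell
# One connection, one password prompt: the remote shell splits the quoted
# path list into three files. (With OpenSSH 9+, the default SFTP mode does
# not do this remote expansion; pass -O to force the classic SCP protocol.)
scp -q "$USR@$IP:/home/file1.txt /home/file2.txt /root/file3.txt" "$PWD"
```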
Upvotes: 0
Reputation: 5596
You can use an expect script or sshpass:
sshpass -p 'password' scp ...
#!/usr/bin/expect -f
spawn scp ...
expect "password:"
send "ur_password\r"
expect eof
A disadvantage is that your password is then stored in plaintext.
Upvotes: 1
Reputation: 297
I'm assuming that if you can scp files from the remote server, you can also ssh in and create a tarball of the remote files.
The -r flag is for recursively copying entire directories, but you're listing distinct files in your commands, so -r is superfluous.
Try this from the bash shell on the remote system:
$ mkdir /home/file_mover
$ cp /home/file1.txt /home/file_mover/
$ cp /home/file2.txt /home/file_mover/
$ cp /root/file3.txt /home/file_mover/
$ tar -cvf /home/myTarball.tar /home/file_mover/
Then, back on your local machine, fetch the tarball with a single scp (one password prompt):
$ scp -q $USR@$IP:/home/myTarball.tar $PWD
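Once the scp finishes, the tarball still has to be unpacked locally. A self-contained sketch of that step (the mkdir/echo/rm lines merely simulate the fetched archive; the final tar -xf is the only command you would actually run):

```shell
# tar strips the leading "/" when archiving, so extraction recreates
# home/file_mover/ under the current directory, not in /home.
mkdir -p home/file_mover
echo demo > home/file_mover/file1.txt
tar -cf myTarball.tar home/file_mover/   # stand-in for the fetched archive
rm -r home                               # pretend we are on the local machine
tar -xf myTarball.tar                    # what you'd run after the scp
cat home/file_mover/file1.txt            # prints "demo"
```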
Upvotes: 0