Reputation: 11
My /tmp directory can have hundreds of files. I want to transfer them to another server, but only two FTP instances should run at a time.
I created one FTP script, which takes a file name as its input variable and starts the trfr process.
Main Script: creates a list file and then two variables, var1 and var2, which take the first two file names.
Inside a while loop I then run ksh ftp.sh "$var1" & ksh ftp.sh "$var2".
Issue:
While the var1 job is running, I can't start var2 again, because the command does not come out of the while loop.
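For reference, here is a minimal sketch of the pattern described above; ftp.sh, the list file path, and the loop body are hypothetical reconstructions, since the actual main script was not posted:
#!/bin/ksh
# Hypothetical reconstruction of the main script described above
ls /tmp > /var/tmp/filelist
while [ -s /var/tmp/filelist ]; do
    var1=$(sed -n '1p' /var/tmp/filelist)
    var2=$(sed -n '2p' /var/tmp/filelist)
    ksh ftp.sh "$var1" &    # first transfer runs in the background
    ksh ftp.sh "$var2"      # second runs in the foreground: the loop blocks here
    sed '1,2d' /var/tmp/filelist > /var/tmp/filelist.new
    mv /var/tmp/filelist.new /var/tmp/filelist
done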
Upvotes: 1
Views: 77
Reputation: 207465
You can also do it like this. First generate a list of all the files, sorted by size, so that each of your FTP processes sends similarly sized files and one doesn't take much longer than the other. Then start two transfers at a time and wait for both to finish, until all the files have been transferred.
#!/bin/bash
cd /tmp
# Generate list of all files sorted into increasing size
du -- * | sort -n | cut -f 2- |
while IFS= read -r f; do
    echo "$f"
    trfr "$f" &
    # Start a second transfer if another file remains
    if IFS= read -r f; then
        echo "$f"
        trfr "$f" &
    fi
    wait    # for both transfers to finish
done
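Since the transfers are started in pairs, a small file can finish early and leave one slot idle until its larger partner completes. If your bash is 4.3 or newer, a variant with wait -n starts the next transfer as soon as either one finishes; this is just a sketch, still assuming the same trfr command:
#!/bin/bash
cd /tmp
# Group the loop and the final wait in one subshell so the wait
# can see the background jobs started inside the loop
du -- * | sort -n | cut -f 2- | {
    running=0
    while IFS= read -r f; do
        trfr "$f" &
        running=$((running + 1))
        if [ "$running" -ge 2 ]; then
            wait -n                 # block until either transfer finishes
            running=$((running - 1))
        fi
    done
    wait                            # drain the last transfers
}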
Upvotes: 1
Reputation: 207465
Consider using GNU Parallel, like this:
cd /tmp
parallel -j 2 trfr {} ::: *
The -j 2 makes only two run at once, and all the filenames are given after the :::. If that overflows your command line, you can use find or ls like this:
cd /tmp
find . -maxdepth 1 -type f -print0 | parallel -0 -j 2 trfr {}
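If you also want the size-sorted behaviour of the previous answer, the same sorted list can be piped straight into parallel; a sketch, again with the hypothetical trfr:
cd /tmp
du -- * | sort -n | cut -f 2- | parallel -j 2 trfr {}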
Upvotes: 2