Reputation: 64004
I have a file that looks like this:
ftp://url1/files1.tar.gz dir1
ftp://url2/files2.txt dir2
.... many more...
What I want to do for each line are these steps: create the directory named in the second column, cd into it, and download the file at the URL in the first column with wget.
But why doesn't this approach of mine work?
while(<>) {
chomp;
my ($url,$dir) = split(/\t/,$_);
system("mkdir $dir");
system("cd $dir");
system("wget $url"); # This doesn't get executed
}
What's the right way to do it?
Upvotes: 2
Views: 450
Reputation: 50284
Use native Perl solutions where possible:
cd can be done with chdir
mkdir can be done with mkdir
mkdir -p (don't die if dir exists, recursive creation) can be done with File::Path, which comes with Perl
wget can be done with LWP::Simple
How I would implement this:
use File::Spec::Functions qw(catfile); # adds a '/' between things (or '\' on Windows)
use LWP::Simple qw(mirror);
use File::Path qw(mkpath);
use File::Basename;
use URI;
while (<>) {
chomp;
my ($url, $dir) = split /\t/;
mkpath($dir);
# Use the 'filename' of the $url to save
my $file = basename(URI->new($url)->path);
mirror($url, catfile($dir, $file));
}
If you do this, you get error handling you can actually control: mkpath will die on failure, and mirror returns an HTTP status code you can check, instead of a failed wget going unnoticed.
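For instance, a minimal sketch (not part of the original answer; the URL and target path below are only placeholder values) of checking the status that mirror returns:
use strict;
use warnings;
use LWP::Simple qw(mirror);
use HTTP::Status qw(is_error);
# Placeholder values; in the loop above they come from the input file.
my $url  = 'ftp://url1/files1.tar.gz';
my $file = 'dir1/files1.tar.gz';
my $status = mirror($url, $file);   # mirror returns an HTTP status code
warn "Could not fetch $url: got status $status\n" if is_error($status);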
Upvotes: 13
Reputation: 881363
I'll tell you one thing that's wrong: the system("cd $dir") call
will create a sub-shell, change into the directory within that sub-shell, and then exit.
The process running Perl will still be in its original directory.
I'm not sure if that's your specific problem, since "# This doesn't get executed"
is a little light on detail :-)
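Here is a minimal sketch showing the difference (the paths are only examples), using Cwd to print the current directory of the Perl process:
use strict;
use warnings;
use Cwd qw(getcwd);
print getcwd(), "\n";   # e.g. /home/pax
system("cd /tmp");      # the cd happens in a child shell, which then exits
print getcwd(), "\n";   # still /home/pax
chdir("/tmp") or die "chdir failed: $!";   # changes the Perl process itself
print getcwd(), "\n";   # now /tmp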
One possible fix is:
system("mkdir $dir && cd $dir && wget $url");
That will do the whole lot in one sub-shell so shouldn't suffer from the problems mentioned.
In fact, this script works fine:
use strict;
use warnings;
system ("mkdir qwert && cd qwert && pwd && cd .. && rmdir qwert");
outputting:
/home/pax/qwert
Upvotes: 4