Reputation: 33
I have a folder containing over 1500 files with the extension .fna, scattered across different sub-folders. I was wondering if there is a simple way in Perl to collect all these files and store them in a different location?
Upvotes: 0
Views: 1053
Reputation: 1459
As File::Find is recommended everywhere, let me add that there are other, sometimes nicer, options, such as https://metacpan.org/pod/Path::Iterator::Rule or the traverse function of Path::Class.
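A minimal sketch of the Path::Iterator::Rule approach, assuming the files should be moved into a single flat target directory (the `/source/dir` and `/target/dir` paths are placeholders; Path::Iterator::Rule is a CPAN module, not core Perl):

```perl
use strict;
use warnings;
use Path::Iterator::Rule;               # CPAN module, install separately
use File::Copy qw(move);
use File::Basename qw(basename);

# Match only plain files whose name ends in .fna, anywhere under /source/dir
my $rule = Path::Iterator::Rule->new->file->name('*.fna');
my $next = $rule->iter('/source/dir');

while ( defined( my $file = $next->() ) ) {
    move( $file, '/target/dir/' . basename($file) )
        or warn "Could not move $file: $!";
}
```

Note that flattening into one directory will clobber files that share a name across sub-folders; if that matters, check `-e` on the destination first.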
Upvotes: 2
Reputation: 33
Sorry for the late response; I was away at a conference. Here is my code, which seems to work fine so far.
use strict;
use warnings;

open my $out, '>>', 'results7.txt' or die "Cannot open results7.txt: $!";
my $parent = "/home/denis/Denis_data/Ordered species";

opendir(my $par_dir, $parent) or die "Cannot open $parent: $!";
while (my $sub_folder = readdir($par_dir)) {
    next if $sub_folder =~ /^\.\.?$/;   # skip . and .. (the dots must be escaped)
    my $path = "$parent/$sub_folder";
    next unless -d $path;               # skip anything that isn't a directory
    chdir($path) or die "Cannot chdir to $path: $!";
    system 'perl', 'batch_hmm.pl';
    print $out "$path\n";
}
closedir($par_dir);
closedir($par_dir);
I will also try the File::Find option. The above looks quite messy.
Upvotes: 0
Reputation: 593
Without much more information to go on, you don't need a Perl script for something as simple as this.
Here's a *nix one-liner:
find /source/dir -name "*.fna" -exec mv -t /target/dir '{}' \+ -print
Upvotes: 1
Reputation: 4709
use strict;
use warnings;
use File::Find;

my @files;
find(\&search, '/some/path');   # find() takes a directory, not a glob pattern
doSomethingWith(@files);
exit;

sub search {
    # $_ is the current file name; $File::Find::name is its full path
    push @files, $File::Find::name if /\.fna$/;
    return;
}
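Since the question asks to store the files in a different location, the collected paths still need to be moved. A minimal sketch using the core File::Copy module (the `move_files` name and the `$target` directory are assumptions for illustration, not part of the original answer):

```perl
use strict;
use warnings;
use File::Copy qw(move);
use File::Basename qw(basename);

# Move each file in @files into $target, keeping only the base name
sub move_files {
    my ($target, @files) = @_;
    for my $file (@files) {
        move( $file, "$target/" . basename($file) )
            or warn "Could not move $file: $!";
    }
}
```

Call it with the target directory and the list built by the search sub, e.g. `move_files('/target/dir', @files);`.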
Upvotes: 1
Reputation: 390
Which OS are you using? If it's Windows, I think a simple xcopy command would be a lot easier. Open a console window and type "xcopy /?" to get the info on this command. It should be something as simple as:
xcopy directory1\*.fna directory2 /s
Upvotes: 1