SSh

Reputation: 189

Perl filehandles in loop error

I am trying to split a fastq file into many output files based on matches listed in a second file, so there can be more than 500 output files. I give two input files - one with the matches and the other with the content (fastq) - and write each match to its own file. The code below works perfectly but cannot create more than 502 files; above 502 it says it cannot make the file. Is that the limit on filehandles in a loop? I need to create more than 502 files at a time. Any solutions? Thanks

my @fh_array;
foreach (@file_names) {
    chomp $_;
    local *OUT;
    open(OUT, ">", "$_.txt") or die "cannot write file '$_.txt': $!";
    push @fh_array, *OUT;
}

# In a for loop over @fh_array, printing to the stored handles works:
print {$fh_array[0]} "Hello";

Upvotes: 0

Views: 198

Answers (3)

mob

Reputation: 118605

Pseudo-filehandles (in-memory filehandles opened on a scalar reference) do not count against your system limit of open filehandles. Depending on your memory requirements, it might be feasible to write your output to memory and then write your memory to disk (one file at a time) at the end of your script.

For example:

my (@fh_array, %output);
foreach my $file_name (@file_names) {
    $output{$file_name} = '';
    open my $fh, '>', \$output{$file_name} or die "cannot open in-memory handle: $!";
    push @fh_array, $fh;
}

# main part of program, print to handles in @fh_array
... print {$fh_array[185]} "Hello\n" ...

# end of program. Copy from memory to disk with one open fh at a time
close $_ for @fh_array;
foreach my $file_name (keys %output) {
    open my $fh, '>', $file_name or die "cannot write file '$file_name': $!";
    print $fh $output{$file_name};
    close $fh;
}

There are almost surely better ways to design your program that avoid keeping so many filehandles open simultaneously, or that avoid this extra complexity.
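For instance, one such design (my own sketch, not part of this answer) combines the in-memory buffering above with on-demand appends: lines are buffered per output file in a hash, and a bucket is flushed in append mode once it grows past a threshold, so at most one real filehandle is open at any moment. The buffered_print/flush_one names and the $flush_size value are illustrative assumptions:

use strict;
use warnings;

my %buffer;
my $flush_size = 1_000_000;    # flush a bucket once it holds ~1 MB (arbitrary)

# Collect lines destined for $file; flush to disk when the bucket gets large.
sub buffered_print {
    my ($file, @lines) = @_;
    $buffer{$file} //= '';
    $buffer{$file} .= join '', @lines;
    flush_one($file) if length($buffer{$file}) > $flush_size;
}

# Append one bucket to its file, then empty the bucket.
sub flush_one {
    my ($file) = @_;
    return unless length($buffer{$file} // '');
    open my $fh, '>>', $file or die "Cannot append to '$file': $!";
    print $fh $buffer{$file};
    close $fh;
    $buffer{$file} = '';
}

# At the end of the program, flush whatever is still buffered:
flush_one($_) for keys %buffer;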

Upvotes: 0

TLP

Reputation: 67900

While there may be limitations on your system that prevent this many file handles from being open at the same time, the fact remains that this is a brute-force solution that is neither necessary nor very practical. I will wager that you will not notice a significant performance difference if you simply open a file when you need it, rather than keeping all of your files open at once.

For example:

write_to_file($file, @lines);

sub write_to_file {
    my ($file, @lines) = @_;
    # Open in append mode just long enough to write this batch of lines.
    open my $fh, ">>", $file or die "Cannot open file '$file' for appending: $!";
    print $fh @lines;
    close $fh or die "Cannot close file '$file': $!";
}
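In the asker's situation, that sub could be driven from the main loop along these lines (a hypothetical sketch, not part of this answer; the input file name and the way the match key is derived from each record are assumptions for illustration only):

# Hypothetical sketch: assumes each fastq record is four lines and, purely for
# illustration, takes the match key from the record header.
use strict;
use warnings;

open my $in, '<', 'reads.fastq' or die "Cannot read reads.fastq: $!";   # assumed input name
while (my $header = <$in>) {
    my @record = ($header, scalar <$in>, scalar <$in>, scalar <$in>);   # 4-line fastq record
    my ($match) = $header =~ /^@(\S+)/;    # assumed: key taken from the header
    next unless defined $match;
    write_to_file("$match.txt", @record);  # file named after its match, opened only briefly
}
close $in;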

Upvotes: 0

Hellmar Becker

Reputation: 2972

I do not think this is a Perl problem. I ran it with 1000 files, no problem. There is a per-process limit on the number of open files, which you can display (and possibly set) using ulimit.
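As a quick way to see where that limit actually kicks in, here is a throwaway probe (my own sketch, not part of this answer) that keeps opening files until open() fails; the reported count will sit a few below ulimit -n because stdin, stdout and stderr already occupy descriptors:

use strict;
use warnings;

my @handles;
my $count = 0;
while (1) {
    # Keep opening files until the per-process descriptor limit makes open() fail.
    open my $fh, '>', "probe_$count.txt" or last;
    push @handles, $fh;
    $count++;
}
print "Opened $count files before open() failed: $!\n";

# Clean up the probe files.
close $_ for @handles;
unlink map { "probe_$_.txt" } 0 .. $count - 1;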

Upvotes: 1
