Nari

Reputation: 53

Out of memory! Perl

I am reading 28 files in one Perl program.
Each file is about 2 MB in size.
I have loaded them into 28 arrays and am writing 28 output files.
Each output file contains all the arrays concatenated, except the array for the current file. After 11 output files, each about 70 MB in size, I get an "Out of memory!" message. How can I increase the memory limit?

What I tried:
I closed each file handle after fetching its data into an array, but it did not help. Please suggest solutions.

Upvotes: 1

Views: 5675

Answers (1)

amon

Reputation: 57590

Assuming that you have four files A B C D, you then want to create four files so that
File 1 contains B C D,
File 2 contains A C D,
File 3 contains A B D, and
File 4 contains A B C.

What you are currently doing is loading every file into an array (just using strings would spare a little memory), and then printing each output file consecutively.

You could also open all output files, then open each input file in sequence and print it to every non-corresponding output file. This keeps only one file in memory at any time.

use strict; use warnings;

my @in  = qw(A B C D);
my @out = qw(1 2 3 4);

# Open all output files up front and keep their handles.
my @outhandles = map { open my $fh, ">", $_ or die $!; $fh } @out;

for my $i (0 .. $#in) {
   open my $fh, "<", $in[$i] or die $!;
   my $content = do { local $/; <$fh> };  # slurp the whole input file
   for my $j (0 .. $#outhandles) {
      # Write this file's content to every output except its own.
      print {$outhandles[$j]} $content unless $i == $j;
   }
}

Memory could be reduced further if you'd print line by line, e.g. `print {$outhandles[$j]} $_ while <$fh>`, instead of slurping the input files. Note that the loops must then be restructured so each input handle is only read through once.
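The streaming variant could be sketched like this (same file names as above; the tiny sample inputs are created inline here just to make the sketch self-contained, standing in for the real 2 MB files). Each line read from the input is fanned out to every non-corresponding output handle, so at most one line is held in memory at a time:

```perl
use strict;
use warnings;

my @in  = qw(A B C D);
my @out = qw(1 2 3 4);

# Create small sample inputs (stand-ins for the real files).
for my $name (@in) {
    open my $fh, ">", $name or die $!;
    print {$fh} "$name\n";
}

my @outhandles = map { open my $fh, ">", $_ or die $!; $fh } @out;

for my $i (0 .. $#in) {
    open my $fh, "<", $in[$i] or die $!;
    while (my $line = <$fh>) {            # one line in memory at a time
        for my $j (0 .. $#outhandles) {
            print {$outhandles[$j]} $line unless $i == $j;
        }
    }
}

close $_ for @outhandles;
```

The output is identical to the slurping version; only the peak memory use changes, from one whole file to one line.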

Test

$ mkdir test; cd test;
$ for file in {A..D}; do echo $file >$file; done
$ perl ../script.pl
$ ls
1  2  3  4  A  B  C  D
$ for file in `ls`; do echo == $file; cat $file; done
== 1
B
C
D
== 2
A
C
D
== 3
A
B
D
== 4
A
B
C
== A
A
== B
B
== C
C
== D
D

Upvotes: 7
