Reputation: 115
I have a Perl script that I need to modify. The script opens, reads, and seeks through two large ASCII files (several GB each). Since it does that quite a bit, I would like to load both files completely into RAM. The easiest way to do this without modifying the script much would be to load each file into memory such that the resulting variable can be treated just like a file handle - so that, for example, seek still moves to a specific byte position. Is that possible in Perl?
Update: Using File::Slurp as proposed below works only for small files. If a file is larger than about 2 GB, the slurped buffer comes back empty.
Minimum example:
#!/usr/bin/env perl
use strict;
use warnings;
use File::Slurp 'read_file';
my $fn = "testfile";
# slurp into a buffer, open the buffer as an in-memory file, read first line:
# (read_file croaks on error by default, so no "or die" is needed here)
read_file( $fn, buf_ref => \my $file_contents_forests );
open( my $filehandle, "<", \$file_contents_forests )
    or die "Could not open buffer: $!\n";
my $line = "the first line:" . <$filehandle>;
print $line . "\n";
close($filehandle);
# open the file on disk directly, read first line:
open( my $forests, "<", $fn ) or die "Could not open file: $!\n";
$line = "the first line:" . <$forests>;
print $line;
close($forests);
The output of the two methods is identical as long as the file is smaller than 2 GB. For larger files, slurping returns an empty line.
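As a workaround that avoids File::Slurp entirely, Perl's built-in slurp idiom (reading with the input record separator `$/` undefined) reads a whole file in one go and, on a 64-bit perl, is not subject to the ~2 GB limit observed above. A minimal sketch, using a small temporary file to stand in for the multi-GB testfile:

```perl
#!/usr/bin/env perl
use strict;
use warnings;
use File::Temp 'tempfile';

# Write a small sample file (stands in for the multi-GB testfile).
my ( $tmp_fh, $tmp_name ) = tempfile( UNLINK => 1 );
print {$tmp_fh} "first line\nsecond line\n";
close($tmp_fh);

# Built-in slurp: with $/ undefined, <$fh> reads the whole file at once.
my $contents = do {
    open my $fh, '<:raw', $tmp_name or die "Could not open $tmp_name: $!";
    local $/;    # enable slurp mode for this block only
    <$fh>;
};

# The buffer can then be opened as an in-memory file, as before.
open my $mem_fh, '<', \$contents or die "Could not open buffer: $!";
my $line = <$mem_fh>;
print "the first line:$line";
```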
Upvotes: 0
Views: 1453
Reputation: 98398
Read in the file:
use File::Slurp 'read_file';
read_file( "filename", buf_ref => \my $file_contents );
and open a filehandle to it:
open my $file_handle, '<', \$file_contents or die "Could not open buffer: $!";
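A handle opened on a scalar reference supports seek, tell, and readline like an ordinary file handle, so existing seek-based code should keep working unchanged. A small self-contained sketch (the buffer contents here are made up for illustration):

```perl
#!/usr/bin/env perl
use strict;
use warnings;

my $file_contents = "alpha\nbeta\ngamma\n";
open my $file_handle, '<', \$file_contents or die "Could not open buffer: $!";

# seek() works with byte offsets into the scalar, just as with a disk file.
seek( $file_handle, 6, 0 ) or die "seek failed: $!";    # 0 = SEEK_SET
my $second = <$file_handle>;    # reads the line starting at byte 6
print $second;

# tell() reports the current byte position within the buffer.
my $pos = tell($file_handle);
print "position: $pos\n";
```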
Upvotes: 5