Reputation: 29988
I am generating relatively large files using Perl. The files I am generating are of two kinds:
1. Table files, i.e. textual files I print line by line (row by row), which contain mainly numbers. A typical line looks like:

   126891 126991 14545 12

2. Serialized objects I create and then store into a file using Storable::nstore. These objects usually contain some large hash with numeric values. The values in the object might have been packed to save on space (and the object unpacks each value before using it).
Currently I'm usually doing the following:
use IO::Compress::Gzip qw(gzip $GzipError);
# create normal, uncompressed file ($out_file)
# ...
# compress file using gzip
my $gz_out_file = "$out_file.gz";
gzip $out_file => $gz_out_file or die "gzip failed: $GzipError";
# delete uncompressed file
unlink($out_file) or die "can't unlink file $out_file: $!";
This is quite inefficient since I first write the large file to disk, then gzip reads it again and compresses it. So my questions are as follows:
1. Can I create a compressed file without first writing a file to disk? Is it possible to create a compressed file sequentially, i.e. printing line by line as in scenario (1) described earlier?
2. Does gzip sound like an appropriate choice? Are there any other recommended compressors for the kind of data I have described?
3. Does it make sense to pack values in an object that will later be stored and compressed anyway?
My considerations are mainly saving on disk space and allowing fast decompression later on.
Upvotes: 1
Views: 3804
Reputation: 1
IO::Compress::Gzip has an OO interface that can be used for this.
use strict;
use warnings;
use IO::Compress::Gzip qw($GzipError);

# Data is compressed as it is printed; no uncompressed copy hits the disk.
my $z = IO::Compress::Gzip->new('out.gz')
    or die "IO::Compress::Gzip failed: $GzipError";
$z->print($_, "\n") for 0 .. 10;
$z->close();
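Since the question also asks about fast decompression later, the matching IO::Uncompress::Gunzip module reads the file back line by line the same way; a minimal sketch (using the out.gz file from the example above):

use strict;
use warnings;
use IO::Uncompress::Gunzip qw($GunzipError);

# Stream the compressed file back, decompressing on the fly.
my $z = IO::Uncompress::Gunzip->new('out.gz')
    or die "IO::Uncompress::Gunzip failed: $GunzipError";
while (defined(my $line = $z->getline())) {
    print $line;
}
$z->close();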
Upvotes: 0
Reputation: 2328
You can also open() a filehandle to a scalar instead of a real file, and use this filehandle with IO::Compress::Gzip. I haven't actually tried it, but it should work. I use something similar with Net::FTP to avoid creating files on disk.
Since v5.8.0, Perl has built using PerlIO by default. Unless you've changed this (i.e., Configure -Uuseperlio), you can open filehandles directly to Perl scalars via:
open($fh, '>', \$variable) || ..
(from the documentation for open())
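A minimal sketch of that idea, assuming you want the gzipped data in memory rather than in a file (the variable names are illustrative):

use strict;
use warnings;
use IO::Compress::Gzip qw(gzip $GzipError);

# A filehandle opened onto a scalar: everything written to it lands in
# $gzipped in memory instead of on disk.
my $gzipped = '';
open(my $mem_fh, '>', \$gzipped) or die "can't open in-memory handle: $!";

my $line = "126891 126991 14545 12\n";
gzip \$line => $mem_fh or die "gzip failed: $GzipError";
close($mem_fh);

# $gzipped now holds the compressed bytes, e.g. ready for Net::FTP.

Note that IO::Compress::Gzip also accepts a scalar reference directly as its output (gzip \$line => \$gzipped), which skips the filehandle entirely.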
Upvotes: 2
Reputation: 231103
You can use IO::Zlib or PerlIO::gzip to tie a file handle to compress on the fly.
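A sketch of both options (IO::Zlib ships with Perl, while PerlIO::gzip is a separate CPAN module; the file names are illustrative):

use strict;
use warnings;
use IO::Zlib;

# Option 1: IO::Zlib returns a tied filehandle ("wb9" = write, maximum compression).
my $fh = IO::Zlib->new('table.txt.gz', 'wb9')
    or die "can't open table.txt.gz";
print $fh "126891 126991 14545 12\n";    # compressed as it is written
$fh->close;

# Option 2: with PerlIO::gzip installed, the :gzip layer works with plain open/print.
open(my $gz, '>:gzip', 'table2.txt.gz')
    or die "can't open table2.txt.gz: $!";
print {$gz} "126891 126991 14545 12\n";
close($gz) or die "close failed: $!";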
As for what compressors are appropriate, just try several and see how they do on your data. Also keep an eye on how much CPU/memory they use for compression and decompression.
Again, test to see how much pack helps with your data, and how much it affects your performance. In some cases, it may be helpful. In others, it may not. It really depends on your data.
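For example, one way to test is to compare the gzipped size of the same numbers as text versus packed binary; a rough sketch (the 'N*' template, i.e. 32-bit big-endian unsigned integers, is an assumption about the data):

use strict;
use warnings;
use IO::Compress::Gzip qw(gzip $GzipError);

my @values = (126891, 126991, 14545, 12);

# Plain text representation vs. packed binary representation.
my $text   = join(' ', @values) . "\n";
my $binary = pack('N*', @values);          # 4 bytes per value

# Compress each and compare the resulting sizes.
for my $data ($text, $binary) {
    my $compressed = '';
    gzip \$data => \$compressed or die "gzip failed: $GzipError";
    printf "raw: %3d bytes, gzipped: %3d bytes\n",
        length($data), length($compressed);
}

# To recover the packed values later:
my @restored = unpack('N*', $binary);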
Upvotes: 8