Reputation: 51
I'm new to Perl and am trying to convert a file into its binary representation and save the result as a text file. I've learned that this can be done with the unpack function, but unfortunately unpack only seems to work for small files. Can Perl convert a big file, say 1 GB, into binary code? Any help will be much appreciated. Here is my code:
use strict;
use warnings;
my $readFile = '/file directory';
open READ, $readFile or die "Can't open";
my $input = <READ>;
close READ;
# Convert file into binary code
my $bin = unpack 'B*',$input;
# Display how many megabytes were converted
print "\nRead ", length($bin)/8000000, ' Mbytes';
# Save the binary code into txt file
open(WRITE,">file.txt") or die "\nCan't create txt file";
print WRITE "$bin";
close WRITE;
Upvotes: 1
Views: 726
Reputation: 385657
Just do it in chunks:
#!/usr/bin/perl
use strict;
use warnings;

binmode STDOUT;

for $ARGV (@ARGV ? @ARGV : '-') {
    open(my $fh, $ARGV)
        or warn("Can't open $ARGV: $!\n"), next;
    binmode($fh);
    # Read 64 KB at a time; 'B*' maps each byte to 8 bits
    # independently, so chunking can't corrupt the output.
    while (sysread($fh, my $buf, 64*1024)) {
        print(unpack('B*', $buf));
    }
}
Usage:
script file.in >file.out
Upvotes: 3
Reputation: 35198
You have problems with large files because you're loading the entire file into memory.
I suggest you take advantage of a trick with $/ as explained in perlvar:

    Setting $/ to a reference to an integer, scalar containing an integer, or scalar that's convertible to an integer will attempt to read records instead of lines, with the maximum record size being the referenced integer number of characters. So this:

        local $/ = \32768; # or \"32768", or \$var_containing_32768
        open my $fh, "<", $myfile or die $!;
        local $_ = <$fh>;

    will read a record of no more than 32768 characters from $fh.
Therefore the following script would work:
#!/usr/bin/env perl
use strict;
use warnings;

use open IO => ':raw';  # If called as: tobinary.pl infile.bin > out.txt
binmode(STDIN);         # If called as: tobinary.pl < infile.bin > out.txt
binmode(STDOUT);

local $/ = \(1 << 16);  # Read fixed-size 64 KB records instead of lines

while (<>) {
    print unpack 'B*', $_;
}
Upvotes: 1