Reputation: 49
I can't seem to export more than 85,000 rows to an .xlsx file using Excel::Writer::XLSX: anything above that produces a 5KB file with no data in it. The script takes roughly the same amount of time to run whether it exports 85,000 records or 90,000 (about 40 seconds).
The file with 85,000 rows is 7.9MB but the file with 90,000 rows is only 5KB.
When I monitor the export with top, the Perl process shows only about 1% memory usage and disappears from the process list after a few seconds. I'm new to Perl, so I'm not sure whether there is another process I should be watching to see if something is running out of memory.
When exporting 85,000 rows or fewer, the Perl process shows about 7% memory usage and stays in the process list until the export is finished.
Upvotes: 1
Views: 1046
Reputation: 41664
I tried the following program to generate a 90,000 x 20 worksheet and the output file was fine. It came out a bit smaller than the size you reported (~6MB), but the format is compressed, so the final size depends on the input data:
#!/usr/bin/perl -w
use strict;
use warnings;

use Excel::Writer::XLSX;

my $workbook = Excel::Writer::XLSX->new( 'bigfile.xlsx' );

# Reduce memory usage by flushing each row to disk as it is completed.
$workbook->set_optimization();

my $worksheet = $workbook->add_worksheet();
$worksheet->set_column( 0, 50, 18 );

# Write a 90,000 x 20 block of test data.
for my $row ( 0 .. 90_000 - 1 ) {
    for my $col ( 0 .. 20 - 1 ) {
        $worksheet->write( $row, $col, "Row: $row Col: $col" );
    }
}

$workbook->close();

__END__
My only other suggestion is to make sure that you are using Excel::Writer::XLSX version 0.51 or later, since earlier versions had a memory leak in optimization mode.
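If you aren't sure which version you have installed, a quick way to check (assuming the module is reachable via your @INC) is a one-liner like this, which just prints the module's $VERSION:

perl -MExcel::Writer::XLSX -le 'print $Excel::Writer::XLSX::VERSION'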
After that, I think you should submit a bug report. Make sure you attach the output generated by the Excel::Writer::XLSX bug_report.pl program.
Upvotes: 2
Reputation: 41664
There are two things you can try to resolve this.
Upvotes: 1