Reputation: 8602
I am trying to use Parse::CSV to parse a simple CSV file with a header and two columns. The second column may contain commas, but I want to ignore them (i.e., not split on them). Is there any way to limit how many times it splits on commas? Here is what I have so far:
#!/usr/bin/perl
use Parse::CSV;

my $csv = Parse::CSV->new(file => 'file.csv');
while (my $row = $csv->fetch) {
    print $row->[0] . "\t" . $row->[1] . "\n";
}
Here is an example of what my data looks like:
1234,text1,text2
5678,text3
90,text4,text5
This should return:
1234 text1,text2
5678 text3
90 text4,text5
Upvotes: 0
Views: 220
Reputation: 24063
If you're really wed to Parse::CSV, you can do this using a filter:
use strict;
use warnings;
use 5.010;
use Parse::CSV;
my $parser = Parse::CSV->new(
    file   => 'input.csv',
    filter => sub { return [ shift @$_, join(',', @$_) ] },
);

while (my $row = $parser->fetch) {
    say join("\t", @$row);
}
die $parser->errstr if $parser->errstr;
Output:
1234 text1,text2
5678 text3
90 text4,text5
Note that performance will be poor, because Parse::CSV splits the columns for you, only for you to immediately join them back together again.
However, since it appears that you're not working with a true CSV (columns containing the delimiter aren't quoted or escaped in any way), why not just use split with a third argument to specify the maximum number of fields?
use strict;
use warnings;
use 5.010;

open my $fh, '<', 'input.csv' or die $!;
while (<$fh>) {
    chomp;
    my @fields = split(',', $_, 2);    # limit of 2: split on the first comma only
    say join("\t", @fields);
}
close $fh;
Upvotes: 1