Reputation: 83
Perl DBD::ODBC throwing error from sql server insert statement
My insert is throwing the error
DBD::ODBC does not yet support binding a named parameter more than once
even though binding parameters is not intended anywhere in this script. I've tried to answer this on my own, and continue to do so, but haven't had any luck with the answers to similar questions.
I've got a script that loads a csv (containing a combination of datetimes, guids, and comments/message strings) into a database; part of it is included below for further analysis if needed. The script gets hung up on this error 350,000+ lines (about 20 columns each) into the csv, but I've been unable to figure out exactly what is causing it.
I've tried setting
$self->{dbh}->{odbc_ignore_named_placeholders} = 1;
but with no success in getting past the error. All values in the csv are enclosed in single quotes, but some of those values are strings containing ':' and/or '?'.
I'm not looking to use bind parameters or named placeholders in this scenario -- only to get the raw data into the database.
There's quite a bit of printing in this part of the script as a result of troubleshooting, and the script overall hasn't been optimized yet -- the goal for now is just to get it working, so be kind.
This instantiates the "Connection" object. Since the connection is made here and assigned to $self->{dbh}, I should be able to assign 1 to {odbc_ignore_named_placeholders}, but it doesn't seem to be working.
sub New {
    my $class = shift;
    my $self  = {
        dbh          => DBI->connect( $dsn ),
        dropIfExists => 1,    # 1 == drop, 0 == do not drop or create
        debug        => 1,    # 1 == enabled, 0 == disabled
    };
    $self->{dbh}->{odbc_ignore_named_placeholders} = 1;
    bless $self, $class;
    return $self;
}
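For what it's worth, the same attribute can also be handed to connect() in its attribute hash, together with RaiseError so a failed connect or prepare dies immediately instead of quietly returning undef. A minimal sketch, assuming credentials live in the DSN as in the constructor above:
my $dbh = DBI->connect(
    $dsn,
    undef, undef,    # no separate user/pass; assumed to be carried in the DSN
    {
        RaiseError                     => 1,    # die on connect/prepare/execute failure
        odbc_ignore_named_placeholders => 1,    # DBD::ODBC: don't parse :name as a placeholder
    }
) or die $DBI::errstr;
Note that, as far as I can tell from the DBD::ODBC docs, this attribute only suppresses named (:name) placeholder parsing; a ? sitting outside a balanced string literal is still treated as a positional placeholder, so an unescaped quote in the data can still expose a : or ? to the placeholder scanner.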
Here's the important bit that builds the rows into an insert statement:
while ( my $row = $csv->getline( $fh ) ) {
    my @fields  = @{$row};
    my @newline = ();
    for ( @fields ) {
        my $str = $_;
        $str =~ s/'/''/g;   # double every embedded single quote, not just the first run
        my $newString = "'" . $str . "'";
        print $newString . NewLine;
        push( @newline, $newString );
    }
    my $line = join( ",", @newline );
    $values .= "(" . $line . "),";
    $MasterLineCount++;
    print FancyLine;
    print "MasterLineCount: " . $MasterLineCount . NewLine;
    print FancyLine;
    $count++;
    print FancyLine;
    print "Line count: " . $count . NewLine;
    print FancyLine;
    # sql server allows a maximum of 1000 values (rows) in a
    # single insert statement, so this stops the loop, and
    # prepares/executes the query. if the loop hits the end of
    # the file with a count < 999, it drops out of the
    # while loop, and checks if count > 0; if it is, it
    # prepares/executes the statement for the remaining
    # rows/lines
    if ( $count == 999 ) {
        $statementCount++;
        print FancyLine;
        print "Statement Count: " . $statementCount . NewLine;
        print FancyLine;
        my $insertStmt = "insert into " . $db_name . ".dbo." . $tableName . " (" . $colNames . ") values " . $values;
        chop( $insertStmt );   # remove the trailing comma
        # print "Insert Statement: " . NewLine;
        # print FancyLine;
        # print $insertStmt . NewLine;
        # print FancyLine;
        # load table
        $self->ExecuteStmt( $insertStmt );
        # reset values string
        $values = "";
        # reset count
        $count = 0;
    }
}
if ( $count > 0 ) {
    $statementCount++;
    print FancyLine;
    print "Statement Count: " . $statementCount . NewLine;
    print FancyLine;
    my $insertStmt = "insert into " . $db_name . ".dbo." . $tableName . " (" . $colNames . ") values " . $values;
    chop( $insertStmt );   # remove the trailing comma
    # load table
    $self->ExecuteStmt( $insertStmt );
    # reset values string
    $values = "";
    # reset count
    $count = 0;
}
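An alternative to the hand-rolled escaping above is DBI's quote() method, which wraps a value in single quotes and doubles any embedded ones. A sketch of how the row-building step could use it (same $self, @fields, and $values as above):
# build one row of the values list, letting the driver handle the escaping
my @quoted = map { $self->{dbh}->quote( $_ ) } @fields;
$values .= "(" . join( ",", @quoted ) . "),";
Since quote() produces balanced literals, this should keep any : or ? in the data safely inside its string.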
For clarity, this is ExecuteStmt():
sub ExecuteStmt {
    my $self = shift;
    my $stmt = shift;
    $self->{sth} = $self->{dbh}->prepare( $stmt )
        or die "prepare failed: " . $self->{dbh}->errstr;   # prepare returns undef on failure
    $self->{sth}->execute
        or die $stmt . NewLine() . FancyLine() . $self->{sth}->errstr;
}
Upvotes: 1
Views: 184
Reputation: 83
Alright, I figured out a way around it, and it's unrelated to the DBI module. When creating a new Text::CSV object, pass in the following parameter:
verbatim => 1,
This is what my Text::CSV constructor looks like now:
my $csv = Text::CSV_XS->new( {
    binary          => 1,
    eol             => $/,
    always_quote    => 1,
    verbatim        => 1,
    skip_empty_rows => 1,
} );
From the Text::CSV documentation:
> This is a quite controversial attribute to set, but makes some hard
> things possible.
>
> The rationale behind this attribute is to tell the parser that the
> normally special characters newline (NL) and Carriage Return (CR) will
> not be special when this flag is set, and be dealt with as being
> ordinary binary characters. This will ease working with data with
> embedded newlines.
>
> When verbatim is used with "getline", "getline" auto-chomp's every line.
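Here's a tiny self-contained sketch (hypothetical in-memory data) of what that buys you: with verbatim => 1, getline() auto-chomps each record, so no stray CR/NL ends up glued to the last field:
use Text::CSV_XS;

my $data = qq{'id-1','first comment'\n'id-2','second comment'\n};
open my $mem, '<', \$data or die $!;

my $parser = Text::CSV_XS->new( { binary => 1, verbatim => 1 } );
while ( my $row = $parser->getline( $mem ) ) {
    print join( '|', @{$row} ), "\n";    # records arrive auto-chomped
}
close $mem;
(The single quotes survive as part of the field data because Text::CSV's quote character is the double quote by default.)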
I will update this question/answer if I discover any unintended consequences relating to this change.
Upvotes: 1