user1807752

Reputation: 31

Perl script using DBI module with Oracle 8 stops processing

I'm having a problem with a Perl script that uses DBI module (version 1.601) connecting to Oracle 8. The OS is FreeBSD 7.0.

The script just opens a connection, loops through a CSV file inserting data one row/line at a time, then closes the connection and exits. A cron job executes this script every 30 minutes processing any incoming CSV files. On occasion, the script stops processing midway through (no new data inserted, does not consume any more cpu time) and it becomes necessary to kill it.

There appears to be a size threshold for the CSV file: anything under about 35,000 rows and the script executes and exits OK; otherwise it stalls and the process sits (seemingly) dormant/waiting.

I only have remote access to Oracle 8 through SQL*Plus. I checked the Oracle v$session view and see that the connections for these stalled processes don't get closed, so perhaps I'm hitting some resource limit? Running 'limits' on FreeBSD yields the following:

cputime          infinity secs
filesize         infinity kB
datasize           524288 kB
stacksize           65536 kB
coredumpsize     infinity kB
memoryuse        infinity kB
memorylocked     infinity kB
maxprocesses         5547
openfiles           11095
sbsize           infinity bytes
vmemoryuse       infinity kB

I'm unsure how to proceed. How or where can I look to narrow down the problem?

Any help is appreciated.


Here is the script, with some variable names changed and the input-checking code omitted:

#!/usr/bin/perl -w
use DBI;
use strict;

my $exitstatus = 0;
my $infile = shift;
open (INFILE, "$infile");

my $dbh = DBI->connect(
  'dbi:Oracle:sid=mysid;host=myhostname',
  'myusername',
  'mypassword',
  {
    RaiseError => 1,
    AutoCommit => 1
  }
) or die $DBI::errstr;

while (<INFILE>) {
  chomp;
  next if (/^#.*$/ || /^$/);
  my @columns = split /\t/;
  my ($var1, $var2, $var3) = @columns;
  eval {
    my $oProce = qq{
      BEGIN
        InsertStoredProc(
        field1 => $var1,
        field2 => $var2,
        field3 => $var3
        );
      END;
    };
    $dbh->do( $oProce );
  };
  if ( $@ ) {
    $exitstatus=1;
    warn "LINE: @columns\n";
    warn "Execution of stored procedure failed: $DBI::errstr\n";
    warn "################################################################################\n";
  }
}
$dbh->disconnect;
close INFILE;
exit $exitstatus;

Upvotes: 3

Views: 745

Answers (2)

bohica

Reputation: 5992

Not that I think it's relevant, but that code is pretty inefficient: you are re-parsing the SQL for every row inserted. Why not do:

my $sth = $dbh->prepare('BEGIN InsertStoredProc(?,?,?); END;');
while (<INFILE>) {
  chomp;
  next if (/^#.*$/ || /^$/);
  my @columns = split /\t/;
  #my ($var1, $var2, $var3) = @columns;
  eval {
       $sth->execute(@columns);
    #my $oProce = qq{
    #  BEGIN
    #    InsertStoredProc(
    #    field1 => $var1,
    #    field2 => $var2,
    #    field3 => $var3
    #    );
    #  END;
    #};
    #$dbh->do( $oProce );
  };
  if ( $@ ) {
    $exitstatus=1;
    warn "LINE: @columns\n";
    warn "Execution of stored procedure failed: $DBI::errstr\n";
    warn "################################################################################\n";
  }
}

The other suggestion of turning off AutoCommit will also speed up your code, as you will only be committing every N rows instead of every row. Why using AutoCommit would cause a hang makes no sense to me, though.

As for the point where it appears to hang: if you can reproduce it at will, try running the script with DBI_TRACE=15=x.log set in the environment and ora_verbose => 6 in the connect attributes. What is at the end of the log file when it hangs?
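A concrete invocation might look like this (the script name and CSV path are placeholders for your own):

```shell
# Enable DBI tracing at level 15, writing (appending) to x.log.
DBI_TRACE=15=x.log perl load_csv.pl incoming.csv

# After the hang, look at the tail of the trace to see the last
# DBI/OCI call that was attempted before the script stopped.
tail -n 40 x.log
```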

Upvotes: 4

user1126070

Reputation: 5069

Have you tried it without AutoCommit? Maybe you are creating too many transactions and Oracle stops processing further requests.

Try committing every 100 rows, and once more when there is no more data.
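A minimal sketch of that batching pattern, combining it with the prepared statement from the other answer (the SID, host, and credentials are placeholders; this is untested against a real Oracle 8 instance):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# Placeholder connection details; substitute your own.
# AutoCommit => 0 so we control transaction boundaries ourselves.
my $dbh = DBI->connect(
    'dbi:Oracle:sid=mysid;host=myhostname',
    'myusername', 'mypassword',
    { RaiseError => 1, AutoCommit => 0 },
) or die $DBI::errstr;

my $sth = $dbh->prepare('BEGIN InsertStoredProc(?,?,?); END;');

my $batch_size = 100;
my $rows       = 0;

open my $in, '<', shift or die "open: $!";
while (<$in>) {
    chomp;
    next if /^#/ || /^$/;
    $sth->execute( split /\t/ );
    # Commit every $batch_size rows instead of every row.
    $dbh->commit if ++$rows % $batch_size == 0;
}
$dbh->commit;    # flush the final partial batch
$dbh->disconnect;
close $in;
```

With RaiseError => 1 and AutoCommit off, you would also want a rollback in an error handler so a failed row doesn't leave a dangling open transaction.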

You could use truss -p your_program_pid (the FreeBSD equivalent of strace) to see what's going on and where your script stalls.

I hope this helps.

Upvotes: 0
