Mite Mite Kyle

Reputation: 5

How to optimize this script performing INSERTS into a database?

So I have already completed a script that inserts data into a MySQL table and moves those files into a directory until no files are left. There are around 51 files, and it takes around 9 seconds to complete the execution. So my question is: is there a better way to speed up the execution process?

The code is:

use strict;
use warnings;
use DBI;
use Data::Dumper;

our $DIR="/home/aimanhalim/LOG";
our $FILENAME_REGEX = "server_performance_";
# mariaDB config hash
our %db_config = ( "username"=>"root", "password"=> "", "db"=>"Top_Data", "ip" => "127.0.0.1", "port" => "3306");
main();

exit;


sub main
{
   my $start = time();
   print "Searching file $FILENAME_REGEX in $DIR...\n";
   opendir (my $dr , $DIR) or die "<ERROR> Cannot open dir: $DIR \n";
   while( my $file = readdir $dr )
   {
      print "file in $DIR: [$file]\n";
      next if (($file eq ".") || ($file eq "..") || ($file eq "DONE"));

      # Opening the file in the directory
      open(my $file_hndlr, '<', "$DIR/$file") or die "<ERROR> Cannot open file: $DIR/$file \n";

      #Making Variables.
      my $line_count = 0;
      my %data = ();
      my $dataRef = \%data;
      my $move = "$DIR/$file";
      print "$file\n";
      while (<$file_hndlr>) 
      { 
         my $line = $_;  
         chomp($line);
         print "line[$line_count] - [$line]\n";
         if($line_count == 0)
         {
            # get load average from line 0
            ($dataRef) = get_load_average($line,$dataRef);

            print Dumper($dataRef);

         }
         elsif ($line_count == 2)
         {
            # get CPU usage from line 2
            ($dataRef) = get_Cpu($line,$dataRef);
            print Dumper($dataRef);
         }

         $line_count++;
      }

      #insert db
      my ($result) = insert_record($dataRef,\%db_config,$file);
      my $Done_File="/home/aimanhalim/LOG/DONE";


sub insert_record {

   my ($data, $db_config, $file) = @_;
   my $result = -1;  # -1 - fail; 0 - success

   # connect to the MySQL database
   my $dsn = "DBI:mysql:database=".$db_config->{'db'}.";host=".$db_config->{'ip'}.";port=".$db_config->{'port'};
   my $username = $db_config->{'username'};
   my $password = $db_config->{'password'};

   my %attr = ( PrintError => 0, RaiseError => 1 );
   my $dbh = DBI->connect($dsn, $username, $password, \%attr) or die $DBI::errstr;

   print "We Have Successfully Connected To The Database \n";

   # *** prepare and bind of the INSERT statement snipped from the post ***
   $stmt->execute(@param_bind);
   $stmt->finish();
   print "The Data Has Been Inserted Successfully\n";

   # commit
   $dbh->commit();
   # return succ / if fail rollback and return fail
   $dbh->disconnect();

   $result = 0;
   return($result);

}
exit;

Edited:

So pretty much this is my code, with some snipping here and there.

I tried to put the 'insert_record' sub below the comment '#insert db', but I don't think that did anything :U

Upvotes: 0

Views: 95

Answers (1)

knittl

Reputation: 265547

You are connecting to the database for every file that you want to insert (if I read your code correctly, there seems to be a closing curly brace missing, so it won't actually compile as posted). Opening new database connections is (comparatively) slow.

Open the connection once, before inserting the first file, and re-use it for subsequent inserts into the database. Close the connection after your last file has been inserted. This should give you a noticeable speed-up.
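A rough sketch of that structure (the table name, column names, and the parse_file() helper below are placeholders, since the actual INSERT and parsing code aren't shown in the question):

use strict;
use warnings;
use DBI;

our $DIR = "/home/aimanhalim/LOG";
our %db_config = ( "username"=>"root", "password"=>"", "db"=>"Top_Data", "ip"=>"127.0.0.1", "port"=>"3306" );

# connect ONCE, before looping over the files
my $dsn = "DBI:mysql:database=$db_config{db};host=$db_config{ip};port=$db_config{port}";
my $dbh = DBI->connect($dsn, $db_config{username}, $db_config{password},
                       { PrintError => 0, RaiseError => 1 }) or die $DBI::errstr;

# prepare the statement once as well and re-use the handle
# (table/column names here are made up -- use whatever your real INSERT needs)
my $sth = $dbh->prepare("INSERT INTO server_performance (filename, load_avg, cpu) VALUES (?, ?, ?)");

opendir(my $dr, $DIR) or die "<ERROR> Cannot open dir: $DIR \n";
while (my $file = readdir $dr)
{
   next if (($file eq ".") || ($file eq "..") || ($file eq "DONE"));

   # parse_file() stands in for your existing per-line parsing loop
   my %data = parse_file("$DIR/$file");

   # re-use the one connection / statement handle for every file
   $sth->execute($file, $data{load_avg}, $data{cpu});

   # ... move the file to the DONE directory as you already do ...
}
closedir $dr;

# disconnect ONCE, after the last file
$dbh->disconnect();

The important part is only that DBI->connect and disconnect run once for the whole run, not once per file.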

(Depending on the amount of data, 9 seconds might actually not be too bad; but since there is no information on that, it's hard to say.)

Upvotes: 6
