Despicable me

Reputation: 700

Netezza Batch Insert is very slow even in Batch execute mode

I am referring to this documentation: http://www-01.ibm.com/support/docview.wss?uid=swg21981328. According to the article, inserts are faster if we use the executeBatch method (the Netezza JDBC driver may detect a batch insert and, under the covers, convert it to an external table load, which is faster). I have to execute millions of insert statements, and I am getting a speed of at most 500 records per minute per connection. Is there a better way to load data into Netezza faster over a JDBC connection? I am using Spark and a JDBC connection to insert the records. Why is the external table load not happening even though I am executing in batches? Below is the Spark code I am using:

Dataset<String> insertQueryDataSet = ...; // one INSERT statement per element
insertQueryDataSet.foreachPartition(partition -> {
    Connection conn = NetezzaConnector.getSingletonConnection(url, userName, pwd);
    conn.setAutoCommit(false);
    int insertBatchCount = 0;
    Statement statement = conn.createStatement();
    while (partition.hasNext()) {
        insertBatchCount++;
        statement.addBatch(partition.next());
        // flush and commit every 10,000 statements
        if (insertBatchCount % 10000 == 0) {
            LOGGER.info("Before executeBatch.");
            int[] execCount = statement.executeBatch();
            LOGGER.info("After executeBatch: " + execCount.length);
            LOGGER.info("Before commit.");
            conn.commit();
            LOGGER.info("After commit.");
        }
    }
    // execute and commit the remaining statements
    int[] remaining = statement.executeBatch();
    LOGGER.info("After executeBatch: " + remaining.length);
    conn.commit();
    conn.close();
});
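For reference, a fully parameterized version of the same loop would look something like the sketch below. The table name, the two-column layout, and the Dataset&lt;Row&gt; input are assumptions for illustration; as I understand the technote, the driver's external-table conversion may only trigger for PreparedStatement batches that use parameter markers rather than literal SQL strings.

import java.sql.Connection;
import java.sql.PreparedStatement;
import org.apache.spark.sql.Row;

// Sketch only: "my_table" and the (id, name) columns are hypothetical.
rowDataSet.foreachPartition(partition -> {
    Connection conn = NetezzaConnector.getSingletonConnection(url, userName, pwd);
    conn.setAutoCommit(false);
    PreparedStatement ps =
        conn.prepareStatement("INSERT INTO my_table (id, name) VALUES (?, ?)");
    int count = 0;
    while (partition.hasNext()) {
        Row row = partition.next();
        ps.setLong(1, row.getLong(0));     // bind each column instead of
        ps.setString(2, row.getString(1)); // concatenating literal SQL
        ps.addBatch();
        if (++count % 10000 == 0) {        // flush and commit every 10,000 rows
            ps.executeBatch();
            conn.commit();
        }
    }
    ps.executeBatch(); // flush the remainder
    conn.commit();
    conn.close();
});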

Upvotes: 1

Views: 602

Answers (1)

ajit junghare

Reputation: 165

I tried this approach (batch insert) but found it very slow, so instead I put all the data into CSV files and did an external table load for each CSV.

String insertReq = "Insert into " + tablename
        + " select * from external '" + filepath + "'"
        + " using (maxerrors 0, delimiter ',' y2base 2000"
        + " encoding 'internal' remotesource 'jdbc' escapechar '\\')";
jdbcTemplate.execute(insertReq);

Since I was using Java, the remote source is 'jdbc'; also note that the CSV file path goes in single quotes. Hope this helps. If you find a better approach than this one, don't forget to post. :)
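For completeness, here is a rough sketch of the per-partition flow in plain Java. The temp-file staging and the assumption that each element is already a CSV-formatted line are mine for illustration; tablename and jdbcTemplate are carried over from above.

import java.io.BufferedWriter;
import java.nio.file.Files;
import java.nio.file.Path;

// Sketch: stage one partition to a temp CSV, then load it with a single
// external-table insert instead of millions of row-by-row inserts.
Path csv = Files.createTempFile("nz_load_", ".csv");
try (BufferedWriter out = Files.newBufferedWriter(csv)) {
    while (partition.hasNext()) {
        out.write(partition.next()); // assumes each element is one CSV line
        out.newLine();
    }
}
String insertReq = "Insert into " + tablename
        + " select * from external '" + csv.toAbsolutePath() + "'"
        + " using (maxerrors 0, delimiter ',' y2base 2000"
        + " encoding 'internal' remotesource 'jdbc' escapechar '\\')";
jdbcTemplate.execute(insertReq); // one bulk load per staged file
Files.delete(csv);               // remove the staging file afterwards

Because remotesource is 'jdbc', the driver streams the local file to the appliance, so each CSV loads as one bulk operation instead of one round trip per row.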

Upvotes: 1
