Reputation: 66
I wrote a Java program that migrates data from one database to another, and it runs out of memory (OOM). What can I do to release memory?
I invoked garbage collection and saw that the memory in use shrinks. However, garbage collection did not help when the amount of data was large.
private void insertIntoTable(Connection con, ResultSet rs, String targetTable) {
    PreparedStatement pstmt = null;
    int batchSize = 50000;
    String insertSql = createInsertSql(targetTable);
    try {
        pstmt = con.prepareStatement(insertSql);
        int i = 0;
        while (rs.next()) {
            i++;
            int j = 0;
            for (String key : colMap.keySet()) {
                if (colMap.get(key).contains("TIMESTAMP")) {
                    pstmt.setTimestamp(++j, rs.getTimestamp(key));
                } else if (colMap.get(key).contains("DATE")) {
                    pstmt.setDate(++j, rs.getDate(key));
                } else if (colMap.get(key).contains("BLOB")) {
                    // Bind the bytes directly; reusing one Blob object across
                    // rows never frees it and batches stale data.
                    pstmt.setBytes(++j, rs.getBytes(key));
                } else {
                    pstmt.setString(++j, rs.getString(key));
                }
            }
            pstmt.addBatch();        // add the row to the batch
            pstmt.clearParameters(); // clear parameters for the next row
            if ((i % batchSize) == 0) {
                System.out.print("★");
                pstmt.executeBatch();
                pstmt.clearBatch();
                con.commit();
            }
        }
        System.out.println("★\ncount:\t" + i);
        System.out.println();
        pstmt.executeBatch(); // flush any remaining rows
        con.commit();
    } catch (SQLException e) {
        e.printStackTrace();
        try {
            con.rollback();
        } catch (SQLException e1) {
            e1.printStackTrace();
        }
    } finally {
        if (rs != null) {
            try {
                rs.close();
            } catch (SQLException ex) {
                // ignore close failure
            }
        }
        if (pstmt != null) {
            try {
                pstmt.close();
            } catch (SQLException ex) {
                // ignore close failure
            }
        }
    }
}
I want to migrate a large amount of data in a single run while keeping memory usage under control.
Upvotes: 1
Views: 81
Reputation: 1260
This happens because the batch size is too large.
Solution:
There is a method to define a maximum batch size, but the problem is that most database drivers don't implement it.
So you need to keep your own count of .addBatch() calls and execute the batch every 100 or 1000 rows, according to your application's resource capacity.
So, the problem here is the batch size being too large (50000).
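The counter-based flushing described above can be sketched with plain Java, no database needed. `BatchSizer` and `countFlushes` are hypothetical names; the counter increments stand in for the real `pstmt.addBatch()` and `executeBatch()` calls:

```java
public class BatchSizer {
    // Simulates the counter-based flushing the answer describes:
    // addBatch() is called once per row, and executeBatch() + commit()
    // fire whenever the pending count reaches batchSize, with one
    // final flush after the loop for any remainder.
    static int countFlushes(int totalRows, int batchSize) {
        int flushes = 0;
        int pending = 0;
        for (int row = 1; row <= totalRows; row++) {
            pending++;                  // stands in for pstmt.addBatch()
            if (pending == batchSize) { // stands in for executeBatch() + commit()
                flushes++;
                pending = 0;
            }
        }
        if (pending > 0) {
            flushes++;                  // final executeBatch() after the loop
        }
        return flushes;
    }

    public static void main(String[] args) {
        // 250,000 rows with batchSize 1000: 250 flushes, and at most
        // 1000 rows are ever buffered in the driver at once.
        System.out.println(countFlushes(250_000, 1_000)); // 250
        System.out.println(countFlushes(250_500, 1_000)); // 251
    }
}
```

In the question's code this amounts to lowering `batchSize` from 50000 to something like 1000, so the driver never buffers more than 1000 rows of bound parameters between commits.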
Upvotes: 1