Reputation: 1476
I want to optimize the performance of this SQL delete. If I populate the hashtable with one million keys, the method takes around a minute to run. How can I optimize this Java method for faster execution?
private HashMap<String, Boolean> selectedIds = new HashMap<>();

public void deleteSelectedIDs() throws SQLException {
    if (ds == null) {
        throw new SQLException("DataSource is not available");
    }
    Connection conn = ds.getConnection();
    if (conn == null) {
        throw new SQLException("Could not obtain a connection");
    }
    PreparedStatement ps = null;
    try {
        conn.setAutoCommit(false);
        boolean committed = false;
        try {
            String sqlDeleteQuery = "DELETE FROM ACTIVESESSIONSLOG WHERE ASESSIONID = ?";
            ps = conn.prepareStatement(sqlDeleteQuery);
            // One executeUpdate per selected id -- this is the slow part.
            for (Map.Entry<String, Boolean> entry : selectedIds.entrySet()) {
                if (Boolean.TRUE.equals(entry.getValue())) {
                    ps.setString(1, entry.getKey());
                    ps.executeUpdate();
                    ps.clearParameters();
                    entry.setValue(false); // mark as processed
                }
            }
            conn.commit();
            committed = true;
        } finally {
            if (!committed) {
                conn.rollback();
            }
        }
    } finally {
        if (ps != null) {
            ps.close();
        }
        conn.close();
    }
}
Upvotes: 0
Views: 218
Reputation: 41220
I think it'd be better to use a CallableStatement together with an Oracle stored procedure.
Snippet
SQL> create or replace type temp_tbl
  2  is
  3  table of varchar2(100);
  4  /
...
SQL> create or replace procedure stored_p
  2  ( list in temp_tbl )
  3  as
  4  begin
  5    delete from ACTIVESESSIONSLOG
  6    where ASESSIONID in (select column_value from table(list));
  7  end;
  8  /
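On the Java side the call could look roughly like the sketch below. This assumes the procedure takes only the collection parameter and that your driver supports the standard `createArrayOf` for the `TEMP_TBL` type; older ojdbc versions may need the driver-specific `OracleConnection.createOracleArray` instead.

```java
import java.sql.Array;
import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.SQLException;
import java.util.List;

public class ProcedureDelete {
    static final String CALL_SQL = "{ call stored_p(?) }";

    // Sketch: deletes the given session ids in one call to the stored procedure.
    public static void deleteIds(Connection conn, List<String> ids) throws SQLException {
        // Map the Java list onto the TEMP_TBL collection type defined in the database.
        // (If the driver does not implement createArrayOf for Oracle collections,
        // unwrap to OracleConnection and use createOracleArray instead.)
        Array list = conn.createArrayOf("TEMP_TBL", ids.toArray());
        try (CallableStatement cs = conn.prepareCall(CALL_SQL)) {
            cs.setArray(1, list);
            cs.execute(); // single round trip; the delete runs entirely in the DB
        } finally {
            list.free();
        }
    }
}
```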
Upvotes: 2
Reputation: 75406
Create a temporary table and insert all the ids into it. Then do a single delete of the rows found in the temporary table.
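A rough JDBC sketch of that idea; the table name `TMP_SESSION_IDS` and its `ID` column are illustrative, and the temporary table itself (e.g. an Oracle global temporary table with `ON COMMIT DELETE ROWS`) would be created once up front:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.List;

public class TempTableDelete {
    static final String INSERT_SQL = "INSERT INTO TMP_SESSION_IDS (ID) VALUES (?)";
    static final String DELETE_SQL =
        "DELETE FROM ACTIVESESSIONSLOG WHERE ASESSIONID IN (SELECT ID FROM TMP_SESSION_IDS)";

    public static void deleteIds(Connection conn, List<String> ids) throws SQLException {
        // Stage the ids in the temporary table, batched to keep round trips down...
        try (PreparedStatement insert = conn.prepareStatement(INSERT_SQL)) {
            for (String id : ids) {
                insert.setString(1, id);
                insert.addBatch();
            }
            insert.executeBatch();
        }
        // ...then remove all matching rows with one statement.
        try (PreparedStatement delete = conn.prepareStatement(DELETE_SQL)) {
            delete.executeUpdate();
        }
    }
}
```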
Upvotes: 2
Reputation: 99
Someone may have a better idea, but have you considered passing the list of keys to Oracle via a temp table and then performing the loop within a PL/SQL procedure? It would reduce network traffic and let the DB do the processing.
Upvotes: 3
Reputation: 500733
If your JDBC driver supports it, use batch processing. It is easy to use, and tends to work very well for this type of scenario.
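Applied to the method from the question, a minimal sketch looks like this: `selectedKeys` pulls out the ids whose flag is set, and `addBatch`/`executeBatch` replaces the per-row `executeUpdate`, so the whole batch goes to the server in far fewer round trips.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

public class BatchDelete {
    // Pure helper: the ids whose flag is true are the ones to delete.
    public static List<String> selectedKeys(Map<String, Boolean> selectedIds) {
        List<String> keys = new ArrayList<>();
        for (Map.Entry<String, Boolean> e : selectedIds.entrySet()) {
            if (Boolean.TRUE.equals(e.getValue())) {
                keys.add(e.getKey());
            }
        }
        return keys;
    }

    public static void deleteSelectedIds(Connection conn, Map<String, Boolean> selectedIds)
            throws SQLException {
        String sql = "DELETE FROM ACTIVESESSIONSLOG WHERE ASESSIONID = ?";
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            for (String key : selectedKeys(selectedIds)) {
                ps.setString(1, key);
                ps.addBatch();     // queue the delete instead of executing it now
            }
            ps.executeBatch();     // send the queued deletes together
        }
    }
}
```

Very large batches can be flushed periodically (e.g. call `executeBatch` every few thousand rows) to keep memory use bounded.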
Upvotes: 5