Reputation: 2043
In our firm, we have a use case to parse a CSV file containing 50k records. My task is to read the CSV file line by line, convert each line into objects, and then insert them into multiple tables. I was using the Spring JDBC batch mechanism to write the data into a table. My problem now is that I am unable to write data into multiple tables.
insert into role(name, code) values(?,?);
insert into person(first_name, last_name, description, role_id) values(?,?,?,?);
Can somebody tell me how I can get the last inserted role_id and then pass it to the second query? All of this is happening inside the JdbcTemplate batchUpdate method.
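For reference, the single-table batch write I have today looks roughly like this (simplified; Role and its getters are just placeholders for my actual objects):

    // Simplified version of the single-table batch insert
    List<Object[]> batchArgs = new ArrayList<>();
    for (Role role : roles) {
        batchArgs.add(new Object[] { role.getName(), role.getCode() });
    }
    jdbcTemplate.batchUpdate("insert into role(name, code) values(?,?)", batchArgs);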
Upvotes: 1
Views: 351
Reputation: 137104
Using the JDBC batch update mechanism, there is actually no way to get the generated keys in a driver-independent way. There is an issue on the Spring JIRA for this, but it is closed as "Won't Fix".
Your best bet would be to drop the batching mechanism and update the database line by line. The logic would become: insert the new role, get the last inserted role_id, and use it to insert the new person.
You can retrieve the last role_id using a GeneratedKeyHolder, like this:
KeyHolder holder = new GeneratedKeyHolder();
jdbcTemplate.update(new PreparedStatementCreator() {
    @Override
    public PreparedStatement createPreparedStatement(Connection connection) throws SQLException {
        PreparedStatement ps = connection.prepareStatement(sql, Statement.RETURN_GENERATED_KEYS);
        // set the parameter values on ps here
        return ps;
    }
}, holder);
Long newRoleId = holder.getKey().longValue();
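From there, the person insert is just a regular update that reuses that key as the foreign key (a sketch; the person object and its getters are assumed from the column list in your question):

    // Use the generated role_id as the foreign key for the person row
    jdbcTemplate.update(
        "insert into person(first_name, last_name, description, role_id) values(?,?,?,?)",
        person.getFirstName(), person.getLastName(), person.getDescription(), newRoleId);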
Upvotes: 1
Reputation: 21473
I'd just use a subselect for the id in the second insert. Instead of using:
insert into role(name, code) values(?,?);
insert into person(first_name, last_name, description, role_id) values(?,?,?,?);
Why not use (assuming roles are unique):
insert into role(name, code) values(?,?);
insert into person(first_name, last_name, description, role_id) values(?,?,?,(select id from role where name = ? and code = ?));
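That way you can keep the batch mechanism. A rough sketch of what the person batchUpdate call could look like (Person, persons, and the getters are assumptions, not your actual code):

    // Batch insert persons; the subselect resolves role_id from the extra name/code parameters
    String sql = "insert into person(first_name, last_name, description, role_id) "
            + "values(?,?,?,(select id from role where name = ? and code = ?))";
    jdbcTemplate.batchUpdate(sql, new BatchPreparedStatementSetter() {
        @Override
        public void setValues(PreparedStatement ps, int i) throws SQLException {
            Person p = persons.get(i);
            ps.setString(1, p.getFirstName());
            ps.setString(2, p.getLastName());
            ps.setString(3, p.getDescription());
            ps.setString(4, p.getRole().getName());  // subselect parameter: role name
            ps.setString(5, p.getRole().getCode());  // subselect parameter: role code
        }

        @Override
        public int getBatchSize() {
            return persons.size();
        }
    });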
Upvotes: 0