Reputation: 85
While inserting records using batch insert ( https://tool.oschina.net/uploads/apidocs/Spring-3.1.1/org/springframework/jdbc/core/simple/SimpleJdbcInsert.html#executeBatch(org.springframework.jdbc.core.namedparam.SqlParameterSource[]) ) into a Redshift table, the Spring framework falls back to one-by-one insertion, which takes much longer:
(main) org.springframework.jdbc.support.JdbcUtils: JDBC driver does not support batch updates
Is there any way to enable batch updates on a Redshift table?
If not, is there any way to improve insert performance in Redshift?
I tried adding ?rewriteBatchedStatements=true to the JDBC URL, but the behavior is the same.
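For reference, a minimal sketch of what the insert code looks like (the table name, columns, and record type are placeholders):

import javax.sql.DataSource;
import org.springframework.jdbc.core.namedparam.MapSqlParameterSource;
import org.springframework.jdbc.core.namedparam.SqlParameterSource;
import org.springframework.jdbc.core.simple.SimpleJdbcInsert;

// Placeholder table and columns; with the Redshift JDBC driver,
// executeBatch logs "JDBC driver does not support batch updates"
// and degrades to one INSERT per row.
SimpleJdbcInsert insert = new SimpleJdbcInsert(dataSource)
        .withTableName("my_table")
        .usingColumns("col1", "col2");

SqlParameterSource[] batch = new SqlParameterSource[records.size()];
for (int i = 0; i < records.size(); i++) {
    batch[i] = new MapSqlParameterSource()
            .addValue("col1", records.get(i).getCol1())
            .addValue("col2", records.get(i).getCol2());
}
insert.executeBatch(batch);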
Upvotes: 0
Views: 504
Reputation: 788
The recommended way of doing batch inserts is to use the COPY command. The common process is to unload data from Redshift to S3 using the UNLOAD command (in case the data you want to insert comes from a query result), and then to run a COPY command referencing the data location in S3. This is far more efficient than row-by-row inserts.
UNLOAD ('my SQL statement')
TO 's3://my-s3-target-location/'
IAM_ROLE 'arn:aws:iam::123456789012:role/MyRedshiftRole'
FORMAT PARQUET;

COPY my_target_table (col1, col2, ...)
FROM 's3://my-s3-target-location/'
IAM_ROLE 'arn:aws:iam::123456789012:role/MyRedshiftRole'
FORMAT PARQUET;
Here is the documentation:
UNLOAD: https://docs.aws.amazon.com/redshift/latest/dg/r_UNLOAD.html
COPY: https://docs.aws.amazon.com/redshift/latest/dg/r_COPY.html
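When the rows come from the application itself (as in the question) rather than from a Redshift query, the same idea still applies: stage the batch in S3 and issue a single COPY from Spring. A minimal sketch, assuming an existing JdbcTemplate and the AWS SDK v2; the bucket, key, role ARN, and table name are placeholders:

import java.nio.charset.StandardCharsets;
import org.springframework.jdbc.core.JdbcTemplate;
import software.amazon.awssdk.core.sync.RequestBody;
import software.amazon.awssdk.services.s3.S3Client;

// Render the batch as CSV and stage it in S3 (placeholder content).
String csv = "1,foo\n2,bar\n";
S3Client s3 = S3Client.create();
s3.putObject(b -> b.bucket("my-bucket").key("staging/batch.csv"),
        RequestBody.fromString(csv, StandardCharsets.UTF_8));

// One COPY loads the whole staged file instead of one INSERT per row.
jdbcTemplate.execute(
        "COPY my_target_table (col1, col2) " +
        "FROM 's3://my-bucket/staging/batch.csv' " +
        "IAM_ROLE 'arn:aws:iam::123456789012:role/MyRedshiftRole' " +
        "FORMAT AS CSV");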
Upvotes: 0