Reputation: 11
I have a source (staging) table with about a million records and a target (application) table with about a million records in PostgreSQL. The target table is used by a mobile application (synchronously).
I have to merge (INSERT ... ON CONFLICT) from the source table into the target table and also conditionally delete some records in the target table. This has to happen once every 4 hours (the data in the staging table comes from a different source system). I can't use stored procedures for code-maintenance reasons.
Can I use JDBC to execute multiple queries in parallel with different OFFSET and LIMIT values? Will it improve performance?
INSERT INTO test ...
SELECT ...
FROM test_stg
ORDER BY ...
OFFSET 0 ROWS FETCH FIRST 1000 ROWS ONLY
ON CONFLICT (constraint...)
DO UPDATE SET ...;
What is the best way to run the queries in parallel within a single (atomic) transaction from Java code?
Note: I can perform the delete on the target table once the merge from the staging table to the actual table has succeeded.
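On the atomicity part: a PostgreSQL transaction is bound to a single session, so statements that must commit together cannot be spread across parallel JDBC connections. A minimal sketch of the whole refresh as one transaction on one connection (the column names and the delete condition here are placeholders, not from the question):

```sql
BEGIN;

-- Merge everything from staging in a single statement; PostgreSQL can
-- parallelize the underlying scan itself, without OFFSET/LIMIT slicing.
INSERT INTO test (id, payload)
SELECT id, payload
FROM test_stg
ON CONFLICT (id)
DO UPDATE SET payload = EXCLUDED.payload;

-- Example conditional clean-up: drop target rows no longer in staging.
DELETE FROM test t
WHERE NOT EXISTS (SELECT 1 FROM test_stg s WHERE s.id = t.id);

COMMIT;
```

If either statement fails, a ROLLBACK (or the driver's `Connection.rollback()`) leaves the target table untouched, which is exactly the atomicity the question asks for.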
Upvotes: 1
Views: 1180
Reputation: 127347
I would DELETE and INSERT/UPDATE in one go, something like this:
WITH cte_delete AS (
DELETE FROM test_stg
RETURNING *
)
INSERT INTO test
SELECT *
FROM cte_delete
ON CONFLICT (...)
DO UPDATE ...;
I wouldn't start with any other performance optimisations. Keeping things simple and straightforward works best most of the time. And you can't fix a problem you don't have.
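The conditional delete on the target table that the question mentions can be chained into the same statement with another data-modifying CTE, so the whole refresh stays atomic without an explicit transaction block. A sketch, with placeholder column names and an assumed delete condition:

```sql
WITH cte_stage AS (
    -- drain the staging table
    DELETE FROM test_stg
    RETURNING *
), cte_merge AS (
    -- upsert the drained rows into the target
    INSERT INTO test
    SELECT * FROM cte_stage
    ON CONFLICT (id)
    DO UPDATE SET payload = EXCLUDED.payload
)
-- example clean-up condition; runs in the same atomic statement
DELETE FROM test
WHERE is_obsolete;
```

One caveat: all parts of a data-modifying WITH statement see the same snapshot, so the final DELETE does not see the rows the CTE just inserted. For a clean-up of pre-existing rows that is usually what you want, but check it against your actual delete condition.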
Upvotes: 1