Reputation: 55
What is a good way to insert a large amount of data into a Postgres table using Node?
We are using an API to fetch a JSON array with a lot of data (objects) from a third-party service, and we need to insert this data into our Postgres database using a Node library.
Upvotes: 0
Views: 728
Reputation: 11581
If you have a large amount of data (i.e., lots of rows) in JSON form, it will probably be faster to pass the JSON directly to Postgres and have it parse the array and insert the data into a table.
Example:
Table foo contains a single json column, data, and one row whose value is this array: [{"a":707,"b":12,"c":407}, {"a":221,"b":327,"c":19}, {"a":472,"b":797,"c":430}, {"a":515,"b":174,"c":198}, ..... up to 100k values]
Let's insert each JSON object in this array into a table "target", which has the corresponding columns a, b, c:
INSERT INTO target (a, b, c)
SELECT (j->>'a')::INTEGER, (j->>'b')::INTEGER, (j->>'c')::INTEGER
FROM (SELECT json_array_elements(data) AS j FROM foo) f;
This does the 100k rows in less than 1 s, which is a lot faster than issuing the same number of individual INSERT statements.
Postgres has many ways to manipulate JSON, so you can season to taste.
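On the Node side you don't even need a staging table like foo: you can hand the fetched array to Postgres as a single JSON parameter and unpack it server-side. Here is a minimal sketch, assuming the node-postgres (pg) library, a hypothetical target table with integer columns a, b, c, and connection settings taken from the usual PG* environment variables:

import { Client } from "pg";

// Hypothetical shape of the objects fetched from the 3rd-party API.
type Row = { a: number; b: number; c: number };

async function insertRows(rows: Row[]): Promise<void> {
  const client = new Client(); // reads PGHOST, PGUSER, PGDATABASE, ... from env
  await client.connect();
  try {
    // Send the whole array as one JSON parameter; Postgres unpacks it
    // server-side with json_array_elements, so this is a single round trip.
    const sql = `
      INSERT INTO target (a, b, c)
      SELECT (j->>'a')::integer, (j->>'b')::integer, (j->>'c')::integer
      FROM json_array_elements($1::json) AS j`;
    await client.query(sql, [JSON.stringify(rows)]);
  } finally {
    await client.end();
  }
}

If the array is very large, you may want to send it in chunks (say, a few tens of thousands of objects per query) to keep individual statements a manageable size.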
Upvotes: 1