Reputation: 1
Using the following SQL Code
copy (select row_to_json(t) from (select * from mytable) t) to '/home/soti/myfile.json';
I can export an arbitrary table to a JSON file. Each record is serialized on a single line, so if the table has 5000 rows, the file has 5000 lines, too.
Now I want to do the reverse: import a JSON file into the table. I.e. if the file contains 5000 lines, each line holding a single JSON document, 5000 rows should be inserted into the table.
Is there an easy way to do this?
Upvotes: 0
Views: 1367
Reputation: 398
It may not be the best solution, but you can import the data through a temp table with a single json column. As an example, consider a settings table that looks like this:
create table settings(id int8, key VARCHAR(50), value text);
First, create a temp table with a single json column:
create temporary table tmp(data json);
Then import the data with copy:
copy tmp(data) from '/tmp/myfile.json';
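One caveat not covered above: in copy's default text format, backslashes inside the JSON are treated as escape sequences and can corrupt the data. A common workaround, assuming PostgreSQL 9.0+ for the with (...) option syntax, is to use CSV mode with quote and delimiter characters that cannot occur in valid JSON text:

```sql
-- Import each line verbatim by picking quote/delimiter bytes
-- that never appear in the file's JSON content.
copy tmp(data) from '/tmp/myfile.json'
with (format csv, quote e'\x01', delimiter e'\x02');
```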
Now you can load your data with an insert ... select statement:
insert into settings(id, key, value)
select (data->>'id')::int8, data->>'key', data->>'value' from tmp;
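To illustrate what the ->> extraction does, here is the same expression run against a literal JSON value (sample data, not taken from the file):

```sql
-- ->> returns the field as text; the cast turns it back into int8.
-- Keys absent from the JSON simply yield NULL.
select (data->>'id')::int8 as id, data->>'key' as key, data->>'value' as value
from (values ('{"id": 1, "key": "theme", "value": "dark"}'::json)) as t(data);
-- one row: 1 | theme | dark
```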
Or, if you don't want to specify the columns manually, you can use json_populate_recordset as described here:
insert into settings
select p.*
from
json_populate_recordset(null::settings, (select json_agg(data) from tmp)) as p
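Note that this variant aggregates the whole staging table into a single JSON array before expanding it again. An alternative sketch, assuming PostgreSQL 9.3+ (where set-returning functions can be joined laterally), processes each line independently:

```sql
-- json_populate_record maps one JSON document onto the row type
-- of settings; fields missing from the JSON become NULL.
insert into settings
select p.*
from tmp, json_populate_record(null::settings, data) as p;
```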
And finally drop the temp table (temporary tables are also dropped automatically when the session ends):
drop table tmp;
Upvotes: 1