maxime.b28

Reputation: 59

How can I import a CSV file containing JSON into postgresql?

I am trying to import this CSV file:

HEADER
"{
  ""field1"": ""123"",
  ""field2"": ""456""
}"
"{
  ""field1"": ""789"",
  ""field2"": ""101""
}"

into my Postgres table.

However, it seems that the \copy my_table(my_text) from 'my_file.csv' command is creating a row for each line of the file.

This is what I get:

                       my_text
-----------------------------------------------------
 HEADER
 "{
   ""field1"": ""123"",
   ""field2"": ""456""
 }"
 "{
   ""field1"": ""789"",
   ""field2"": ""101""
 }"
(9 rows)

and this is what I expect:

{""field1"": ""123"", ""field2"": ""456""}
{""field1"": ""789"", ""field2"": ""101""}
(2 rows)
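For reference, a minimal table that reproduces this (the definition is an assumption, since the original DDL isn't shown):

```sql
-- hypothetical target table for the \copy above
CREATE TABLE my_table (my_text text);
```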

Upvotes: 2

Views: 1849

Answers (2)

jjanes

Reputation: 44137

The default for \copy is "text" format, not "csv" format. All you have to do is tell your \copy to use csv, and that there is a header line.

\copy my_table(my_text) from 'my_file.csv' csv header

Changing the escape character as the other answer suggests is unnecessary and in fact will break things. Your data will load, but will not be valid JSON.

You probably want to make this column type JSON or JSONB, not text. This will automatically validate your data as being valid JSON data, and in the case of JSONB will make future parsing of it faster.
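Putting the pieces together, a minimal sketch (the table and file names are taken from the question; the `jsonb` column type is the suggestion above, not something the asker confirmed):

```sql
-- jsonb validates each value as JSON on load and speeds up later parsing
CREATE TABLE my_table (my_text jsonb);

-- csv format handles the quoting and doubled quotes; header skips the first line
\copy my_table(my_text) from 'my_file.csv' csv header

-- once loaded, fields can be extracted directly
SELECT my_text->>'field1' FROM my_table;
```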

Upvotes: 0

Jim Jones

Reputation: 19603

Escaping the newline might do the trick:

\copy my_table(my_text) from my_file.csv csv header escape E'\n' quote '"'

Upvotes: 1
