Reputation: 2068
I have a pipe-delimited file (CSV-like):
value1|value2|value2....
value1|value2|value2....
value1|value2|value2....
value1|value2|value2....
and would like to load this data into a PostgreSQL table.
Upvotes: 30
Views: 148789
Reputation: 2200
(2025) Probably far too late for the OP, but for others who come across this question:
The existing answers all work great if you have a static file that you want to load into PostgreSQL. If, on the other hand, the file is being updated by other systems and you always want the latest version available as a PostgreSQL table, you can use the foreign data wrapper file_fdw.
First install the extension (postgres_fdw is a separate wrapper for remote PostgreSQL servers and is not needed here):
create extension file_fdw;
Then you must set up a server:
create server my_file foreign data wrapper file_fdw;
Finally, you create the foreign table:
create foreign table my_file_table(
field1 text,
field2 text,
field3 text)
server my_file options (filename '/path/to/my_file.txt', delimiter '|');
(Set the field names and types to match whatever is in your file.)
Then you will have a table that exactly mirrors what is in the file.
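For example, an ordinary query against the foreign table re-reads /path/to/my_file.txt on each execution, so the results always reflect the file's current contents (the filter value here is just an illustration):
select field1, field2 from my_file_table where field3 = 'some_value';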
If the file cannot be parsed by the options available at foreign table creation, it is also possible to read it in as whole lines and then build a view that uses PostgreSQL's text parsing functions to extract the fields. E.g. I needed the information from /etc/hosts in Postgres, so I made a foreign table etc_hosts_line holding the unparsed lines (with a server etc_hosts set up the same way as above):
create foreign table etc_hosts_line(hostline text)
server etc_hosts options (filename '/etc/hosts');
then I parsed them with a view:
create or replace view etc_hosts as
select (regexp_match(hostline,'^(\S+)\s+'))[1] as ip,
lower((regexp_match(hostline,'(\s+)(.+)'))[2]) as hostname from etc_hosts_line ;
(Note that there may be several hostnames per IP, e.g.
127.0.0.1 localhost localhost.localdomain
so queries against the hostname column should use LIKE rather than =.)
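For example, to look up the IP for a host in that view (the pattern here is just an illustration):
select ip from etc_hosts where hostname like '%localhost%';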
Upvotes: 0
Reputation: 414
COPY description_f (id, name) FROM 'absolutepath\test.txt' WITH (FORMAT csv, HEADER true, DELIMITER E'\t');
Example (SNOMED CT release files are tab-delimited, hence the E'\t' delimiter):
COPY description_f (id, name) FROM 'D:\HIVEWORX\COMMON\TermServerAssets\Snomed2021\SnomedCT\Full\Terminology\sct2_Description_Full_INT_20210131.txt' WITH (FORMAT csv, HEADER true, DELIMITER E'\t');
Upvotes: 0
Reputation: 18385
There's Pgloader, which uses the aforementioned COPY command and can load data from CSV (as well as MySQL, SQLite and dBase). It also uses separate threads for reading and copying the data, so it's quite fast. (Interestingly enough, it was rewritten from Python to Common Lisp and gained a 20 to 30x speedup; see the blog post.)
To load the CSV file, one writes a little configuration file like:
LOAD CSV
FROM 'path/to/file.csv' (x, y, a, b, c, d)
INTO postgresql:///pgloader?csv (a, b, d, c)
…
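The configuration file is then passed to the pgloader command-line tool, e.g. (assuming it was saved as csv.load):
pgloader csv.load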
Upvotes: 3
Reputation: 12363
Assuming your data is in the file values.txt
and you want to import it into the database table myTable,
the following query does the job:
COPY myTable FROM 'values.txt' (DELIMITER '|');
https://www.postgresql.org/docs/current/static/sql-copy.html
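Note that COPY ... FROM 'file' reads the file on the database server, so the path is resolved there. If the file is on the client machine instead, psql's \copy meta-command accepts the same syntax but reads the file client-side:
\copy myTable FROM 'values.txt' (DELIMITER '|')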
Upvotes: 19
Reputation:
The slightly modified version of COPY below worked better for me, where I specify the CSV format. It handles backslash characters in text without any fuss, whereas the default TEXT format treats them somewhat quirkily.
COPY myTable FROM '/path/to/file/on/server' (FORMAT csv, DELIMITER '|');
Upvotes: 27
Reputation:
Check out the COPY command of Postgres:
http://www.postgresql.org/docs/current/static/sql-copy.html
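For the pipe-delimited file in the question, a minimal invocation might look like this (table name and path are assumptions):
COPY myTable FROM '/path/to/file.txt' (FORMAT csv, DELIMITER '|');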
Upvotes: 6