Reputation: 5357
I need to bulk load a large file into PostgreSQL. I would normally use the COPY command, but this file needs to be loaded from a remote client machine. With MSSQL, I can install the local tools and use bcp.exe on the client to connect to the server.
Is there an equivalent way for PostgreSQL? If not, what is the recommended way of loading a large file from a client machine if I cannot copy the file to the server first?
Thanks.
Upvotes: 1
Views: 8567
Reputation: 2724
Use psql's \copy command to load the data:
$ psql -h <IP> -p <port> -U <username> -d <database>
database# \copy schema.tablename from '/home/localdir/bulkdir/file.txt' delimiter as '|'
database# \copy schema.tablename from '/home/localdir/bulkdir/file.txt' with csv header
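\copy reads the file on the client machine and streams it to the server over the normal connection, so nothing has to be copied to the server first. If the file is compressed, \copy can also read from a local command instead of a plain file (a sketch assuming psql 9.3 or newer; the .gz path is just an example):
$ # requires psql 9.3 or newer; the gzipped file name is illustrative
$ psql -h <IP> -p <port> -U <username> -d <database> -c "\copy schema.tablename from program 'gunzip -c /home/localdir/bulkdir/file.txt.gz' with csv header"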
Upvotes: 0
Reputation: 21
You can use the \copy command from the psql tool like:
psql -h IP_REMOTE_POSTGRESQL -d DATABASE -U USER_WITH_RIGHTS -c "\copy TABLE(FIELD_LIST_SEPARATED_BY_COMMAS) from 'FILE_ON_CLIENT_MACHINE' with csv header"
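For example (hypothetical host, database, table, and column names; a relative file name like this is resolved against the directory psql is run from):
$ # host, database, table, and columns below are made-up placeholders
$ psql -h 10.0.0.5 -d salesdb -U loader -c "\copy public.orders(order_id, customer_id, amount) from 'orders.csv' with csv header"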
Upvotes: 2
Reputation: 16417
Assuming you have some sort of client from which to run the query, you can use the COPY ... FROM STDIN form of the COPY command: http://www.postgresql.org/docs/current/static/sql-copy.html
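A minimal sketch of that approach, reusing the placeholders from the other answers: psql executes the COPY ... FROM STDIN statement and feeds it the local file from its own standard input.
$ # the server never sees the file path; the data is streamed over the connection
$ psql -h <IP> -p <port> -U <username> -d <database> -c "COPY schema.tablename FROM STDIN WITH CSV HEADER" < /home/localdir/bulkdir/file.txt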
Upvotes: 1
Reputation: 181290
The COPY command is supported in PostgreSQL protocol v3.0 (PostgreSQL 7.4 or newer). The only thing you need to use COPY from a remote client is a libpq-enabled client such as the psql command-line utility.
From the remote client run:
$ psql -d dbname -h 192.168.1.1 -U uname < yourbigscript.sql
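A sketch of what yourbigscript.sql could contain (hypothetical table and rows): a COPY ... FROM stdin statement followed by the data inline, terminated by \. on its own line, which is the same format pg_dump emits.
-- hypothetical table and rows; columns in the data block are tab-separated
COPY schema.tablename (id, name) FROM stdin;
1	alpha
2	beta
\.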
Upvotes: 3