Reputation: 6193
I have Postgresql on a server in a docker container. How can I connect to it from the outside, that is, from my local computer? What setting should I apply to allow that?
Upvotes: 596
Views: 888581
Reputation: 398
For anyone who might run into an issue similar to mine: I wanted to connect my Web API to a Postgres DB started via Docker Compose, and I didn't know what the connection string should be.
Given my Postgres service declared in the compose file like this:
db:
  image: postgres:15
  environment:
    - POSTGRES_DB=investors
    - POSTGRES_USER=postgres
    - POSTGRES_PASSWORD=postgres
  ports:
    - "5433:5432"   # Expose port 5432 in the container as port 5433 on the host
  volumes:
    - pgdata:/var/lib/postgresql/data
  healthcheck:
    test: ["CMD-SHELL", "pg_isready -U postgres"]
    interval: 10s
    timeout: 5s
    retries: 5
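Because of the "5433:5432" mapping, the container's Postgres is reachable on host port 5433; a quick way to sanity-check that from the host is psql, which accepts the same values in libpq's key=value form:
psql "host=localhost port=5433 dbname=investors user=postgres password=postgres"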
This is my connection string
"Host=localhost;Port=5433;Database=investors;Username=postgres;Password=postgres"
Upvotes: 0
Reputation: 1099
Assuming you have Postgres running in a Docker container on your local machine and have published the container's main port 5432 to host port 15432, you would use:
psql -h localhost -p 15432 -U postgres
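For reference, a docker run invocation that publishes container port 5432 on host port 15432 might look like the following (the container name and password here are placeholders, not from the original setup):
docker run --name pg-local -e POSTGRES_PASSWORD=mysecretpassword -d -p 15432:5432 postgres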
Upvotes: 0
Reputation: 1121
First of all, I prefer not to use the official image when a Bitnami image exists, because in my experience the Bitnami projects are more fully featured.
In this case we will use the bitnami/postgresql image from Docker Hub, for version 14+ (important).
External port (published on the host): 5050
Internal port (inside the container, for your apps on the server): 5060
Docker container name: pgdb
services:
  pgdb:
    image: bitnami/postgresql:latest
    container_name: pgdb
    restart: always
    environment:
      POSTGRES_USER: pgdb_user
      POSTGRES_PASSWORD: pgdb_password
      POSTGRES_DB: pgdb_database
      POSTGRESQL_PORT_NUMBER: 5060
    ports:
      - "5050:5060"
    volumes:
      - ./pg_hba.conf:/bitnami/postgresql/conf/pg_hba.conf
      - ./pgdata:/bitnami/postgresql/data
Here we use the scram-sha-256 auth method (Postgres 14+) for more security. The mounted pg_hba.conf:
# TYPE    DATABASE      USER    ADDRESS             METHOD
# Remote VPN/IP connections (use your VPN IP instead of xxx.xxx.xx.xxx)
host      all           all     xxx.xxx.xx.xxx/32   scram-sha-256
# Local Unix-socket connections
local     all           all                         scram-sha-256
# IPv4 local connections
host      all           all     127.0.0.1/32        scram-sha-256
host      all           all     172.16.0.0/12       scram-sha-256   # Docker internal network
host      all           all     192.168.0.0/16      scram-sha-256   # Local network
host      all           all     10.0.0.0/8          scram-sha-256   # Private network range
# IPv6 local connections
host      all           all     ::1/128             scram-sha-256
# Local replication connections
local     replication   all                         scram-sha-256
host      replication   all     127.0.0.1/32        scram-sha-256
host      replication   all     ::1/128             scram-sha-256
# Reject all other connections
host      all           all     all                 reject
Restart the container so the configuration is picked up:
docker restart pgdb
For local connections, use:
Host: pgdb
User: pgdb_user
Password: pgdb_password
Database: pgdb_database
Port: 5050 (if you use a custom mapping; 5432 if default)
To grab the default config files out of the container as a starting point:
docker cp pgdb:/opt/bitnami/postgresql/conf/pg_hba.conf ./my/path/
docker cp pgdb:/opt/bitnami/postgresql/conf/postgresql.conf ./my/path/
But mount the configs only into the /bitnami/postgresql/conf/ folder!
You can connect remotely only if your local [VPN] IP is xxx.xxx.xx.xxx (from the first rule of pg_hba.conf). You can also add a volume for postgresql.conf, as with pg_hba.conf, with the settings listen_addresses = '*' and port = 5060.
⭐⭐⭐ Hope that helps!
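Assuming the compose file and pg_hba.conf above, a remote check from the machine whose VPN IP is allowed would look roughly like this (<server-ip> stands for your server's address):
psql -h <server-ip> -p 5050 -U pgdb_user -d pgdb_database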
Upvotes: 0
Reputation: 502
To run Postgres on Docker:
docker run -e POSTGRES_PASSWORD=mysecretpassword -d -p 5432:5432 --name my_postgres_container postgres
To connect to the Postgres DB running inside the Docker container:
psql -h localhost -U postgres -d postgres
You will see the postgres=# prompt, meaning you are connected.
\dt
This will show the list of tables in the DB. Then, for example:
select * from "User";
Upvotes: 2
Reputation: 61
In 2024, if you are developing locally, you no longer need to set a password: just use POSTGRES_HOST_AUTH_METHOD in your environment variables (it's an unsafe option, so use it only for local testing). For example, in the docker-compose.yml file:
services:
  postgres:
    image: postgres:15-alpine
    ports:
      - 5432:5432
    environment:
      POSTGRES_DB: my_amazing_db
      POSTGRES_HOST_AUTH_METHOD: trust
Afterward, optionally, check that everything is okay:
docker ps -a
docker exec -it b5f2039251e1 bash
psql -h localhost -p 5432 -U postgres -l
You can now connect, for example with HeidiSQL.
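Because auth is set to trust, a client on the host can also connect without any password; for example with psql, using the values from the compose file above:
psql -h localhost -p 5432 -U postgres -d my_amazing_db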
Upvotes: 6
Reputation: 162
The answers in this thread helped me figure out how to do this in Docker Desktop, so I figured I would share that info. There are optional settings you can access when creating a new container, and you can set the host port there (a rough CLI equivalent is sketched below).
When you go to your containers you should see a value in the Ports column; that is how you know you can connect to the container from your local computer.
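That optional host-port field corresponds to the -p <host_port>:<container_port> flag on the command line; a minimal sketch, with the container name and password as placeholders:
docker run --name my-postgres -e POSTGRES_PASSWORD=changeme -d -p 5432:5432 postgres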
Upvotes: 0
Reputation: 2102
Connect to a local container running postgres
Install psql
brew search postgres
brew install postgresql
Start the Docker container
docker run --name postgres -e POSTGRES_DB=users \
-e POSTGRES_USER=john \
-e POSTGRES_PASSWORD=password \
-p 5432:5432 -d postgres
Connect to database
psql --host=localhost --username=john --dbname=users
Upvotes: 21
Reputation:
I'm assuming that you want to be able to view the data present in your container every time you connect to it from outside. To do this, you will have to persist data on the postgres image.
If you don't have persistent data, you will have to repeat everything you did the first time.
Steps 3, 5, 6, 7, and 8 answer your question directly.
Here is a detailed overview of the entire process I followed on Windows 10 PowerShell (the commands are the same on Linux and macOS):
Step 1: Start PowerShell in non-admin mode
Step 2: Download postgres docker image:
docker pull postgres:latest
Step 3: Start the docker container in detached mode and persist data on the postgres image by creating a volume and binding it to a destination.
(Note: 5432 is the default port, but state it explicitly to prevent connection errors from clients like pgadmin, dbeaver, etc.)
docker run --name postgres-test -e POSTGRES_PASSWORD=password \
-p 5432:5432 -v postgres-data:/var/lib/postgresql/data \
-d postgres:latest
Step 4: Check status of running containers
docker ps -a
Step 5: Go inside the container (postgres-test here) in interactive mode.
(Note: commands like ls, pwd, etc. can be executed here if you checked Linux containers during installation.)
docker exec -it postgres-test psql -U postgres
Step 6: Create sample data. At this point, you can play with psql commands in the following manner:
# CREATE DATABASE test;
# \c test
# CREATE TABLE test_table(something int);
# INSERT INTO test_table VALUES (123);
# SELECT * FROM test_table;
# \q
Step 7: Open a database client application like pgadmin or dbeaver and enter the below in the connection fields (a psql equivalent is shown after the list):
Host: localhost
Database: test
User: postgres
Password: password
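The same connection can be made from a terminal with psql instead of a GUI client (it will prompt for the password listed above):
psql -h localhost -p 5432 -U postgres -d test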
Step 8: Enter the query SELECT * FROM test_table in the query editor and you should be able to see the output 123.
Upvotes: 29
Reputation: 1021
I am using Django with Postgres in Docker containers. In the docker-compose file, add the following:
db:
  image: postgres:10-alpine
  environment:
    - POSTGRES_DB=app
    - POSTGRES_USER=postgres
    - POSTGRES_PASSWORD=supersecretpassword
  ports:
    - "6543:5432"
This ports setting maps container port 5432 to port 6543 on the host (it just needs to be different from 5432), which is accessible from your local machine. For myself, I connected DBeaver to it. This prevents port clashes between your app's requests and your local machine's requests.
At first, I got a message saying that port 5432 was already in use (by the django app), so I couldn't access it with pgAdmin or DBeaver.
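With that mapping, a host-side client only needs the remapped port; for example with psql, using the credentials from the compose snippet above:
psql -h localhost -p 6543 -U postgres -d app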
Upvotes: 85
Reputation: 61
After building my gateway/microservice application I had the same issue: I could not connect to the containerized PostgreSQL from HeidiSQL.
I solved it by simply specifying the PostgreSQL password in docker-compose.yml, as well as the port.
So find and open docker-compose.yml, set POSTGRES_PASSWORD (don't leave it empty), and specify the port "5432:5432":
services:
  microservice33-postgresql:
    environment:
      - POSTGRES_USER=microservice33
      - POSTGRES_PASSWORD=wwww
      - POSTGRES_HOST_AUTH_METHOD=trust
    ports:
      - 5432:5432
Link for reference and screenshots: post
Upvotes: 1
Reputation: 3239
I managed to get it to run on Linux.
Run the docker postgres image; make sure the port is published. I use alpine because it's lightweight.
docker run --rm -P -p 127.0.0.1:5432:5432 -e POSTGRES_PASSWORD="1234" --name pg postgres:alpine
Using another terminal, access the database from the host using the postgres URI:
psql postgresql://postgres:1234@localhost:5432/postgres
for mac users, replace psql with pgcli
Upvotes: 221
Reputation: 514
This one worked for me:
PGPASSWORD=postgres psql -h localhost -p 3307 -U postgres -d postgres
Use the above to load an initial script, as in:
PGPASSWORD=postgres psql -h localhost -p 3307 -U postgres -d postgres < src/sql/local/blabla.sql
Do note that I remap my ports, as in:
docker run -p3307:5432 --name postgres -e POSTGRES_PASSWORD=postgres -d postgres
Upvotes: 2
Reputation: 803
I know this is late, but if you used docker-compose like @Martin, these are the snippets that helped me connect to psql inside the container:
docker-compose run db bash
root@de96f9358b70:/# psql -h db -U root -d postgres_db
I cannot comment because I don't have 50 reputation. So hope this helps.
Upvotes: 27
Reputation: 27466
You can also access it through the docker exec command:
$ docker exec -it postgres-container bash
# su postgres
$ psql
Or
$ docker exec -it postgres-container psql -U postgres
Upvotes: 118
Reputation: 2990
docker ps -a
to get the container IDs, then:
docker exec -it <container_id> psql -U <username> -W
Upvotes: -2
Reputation: 85
First open a shell in the postgres container:
docker exec -it <container_name> bash
Then you will get a root prompt like root@868594e88b53:/#
From there, connect with the database connection URI:
psql postgresql://<username>:<databasepassword>@postgres:5432/<database>
Upvotes: 6
Reputation: 13456
For some reason port 5432 seemed to be blocked (most likely already in use on my host). I changed my port mapping from 5432:5432 to 5416:5432, and the following command worked to connect to the postgres database from outside its docker container:
psql -h localhost -p 5416 -U <my-user> -d <my-database>
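If the container is started with plain docker run rather than compose, the remapped publish flag would look roughly like this (container name and password are placeholders):
docker run --name pg -e POSTGRES_PASSWORD=mysecretpassword -d -p 5416:5432 postgres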
Upvotes: 22
Reputation: 423
One way to connect from localhost is to run the container with '--net host':
docker run --name some-postgres --net host -e POSTGRES_PASSWORD=mysecretpassword -d postgres
(With host networking the -p mapping is ignored, so it is omitted here; the server listens directly on the host's port 5432.) You can then access the server from your localhost without using exec, by using:
psql -h localhost -p 5432 -U postgres
Upvotes: 18
Reputation: 455
There are good answers here, but if you would like an interface for Postgres database management, you can install pgAdmin on your local computer and connect to the remote machine using its IP and the exposed Postgres port (5432 by default).
Upvotes: -3
Reputation: 257
I tried to connect from localhost (Mac) to a Postgres container. I changed the port in the docker-compose file from 5432 to 3306 and started the container (no idea why I did that :|).
Then I tried to connect to Postgres via PSequel and adminer, and the connection could not be established.
After switching back to port 5432, everything worked fine:
db:
  image: postgres
  ports:
    - 5432:5432
  restart: always
  volumes:
    - "db_sql:/var/lib/mysql"
  environment:
    POSTGRES_USER: root
    POSTGRES_PASSWORD: password
    POSTGRES_DB: postgres_db
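With the 5432:5432 mapping restored, a client on the host can reach it again, for example with psql and the credentials from this compose file:
psql -h localhost -p 5432 -U root -d postgres_db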
This was my experience and I wanted to share it. Perhaps someone can make use of it.
Upvotes: 9
Reputation: 30723
You can run Postgres this way (map a port):
docker run --name some-postgres -e POSTGRES_PASSWORD=mysecretpassword -d -p 5432:5432 postgres
So now you have mapped port 5432 of your container to port 5432 of your server (-p <host_port>:<container_port>), and your postgres is accessible from public-server-ip:5432.
To test it, run the postgres database (the command above) and check the container:
docker ps
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
05b3a3471f6f postgres "/docker-entrypoint.s" 1 seconds ago Up 1 seconds 0.0.0.0:5432->5432/tcp some-postgres
Go inside your container and create a database:
docker exec -it 05b3a3471f6f bash
root@05b3a3471f6f:/# psql -U postgres
postgres=# CREATE DATABASE mytest;
postgres=# \q
Go to your localhost (where you have some tool or the psql client).
psql -h public-ip-server -p 5432 -U postgres
(password mysecretpassword)
postgres=# \l
List of databases
Name | Owner | Encoding | Collate | Ctype | Access privileges
-----------+----------+----------+------------+------------+-----------------------
mytest | postgres | UTF8 | en_US.utf8 | en_US.utf8 |
postgres | postgres | UTF8 | en_US.utf8 | en_US.utf8 |
template0 | postgres | UTF8 | en_US.utf8 | en_US.utf8 | =c/postgres
So you're accessing the database (which is running in docker on a server) from your localhost.
In this post it's explained in detail.
Upvotes: 735
Reputation: 1013
I already had Postgres running on the host machine and didn't want to allow connections from the network, so I ran a temporary Postgres instance in a container and created a database in just two lines:
# Run PostgreSQL
docker run --name postgres-container -e POSTGRES_PASSWORD=password -it -p 5433:5432 postgres
# Create database
docker exec -it postgres-container createdb -U postgres my-db
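To then reach it from the host through the remapped port (the password is the one set above), something like this should work:
psql -h localhost -p 5433 -U postgres -d my-db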
Upvotes: 22
Reputation: 27466
In case it is a Django backend application, you can do something like this:
docker exec -it container_id python manage.py dbshell
Upvotes: 1