Joey Baruch

Reputation: 5209

Add an Airflow connection to a localhost database (Postgres running in Docker)

I have a dockerized Postgres running locally, to which I can connect via pgAdmin 4 and via psql.

Using the same connection details, I set up an Airflow connection in the UI:

[Screenshot: the Airflow connection form in the UI, filled in with those connection details]
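For context, the DAG reaches this connection through the Postgres provider hook, roughly like this (an illustrative sketch only; my_postgres stands in for whatever connection id was entered in the UI):

```python
from airflow.providers.postgres.hooks.postgres import PostgresHook

# "my_postgres" is a placeholder for the connection id configured in the UI above.
hook = PostgresHook(postgres_conn_id="my_postgres")
conn = hook.get_conn()  # calls psycopg2.connect(...) under the hood -- where the traceback below originates
cursor = conn.cursor()
cursor.execute("SELECT 1")
```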

However, when trying to load a DAG that uses that connection, it throws an error:

Broken DAG: [/usr/local/airflow/dags/s3upload.py] Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/airflow/providers/postgres/hooks/postgres.py", line 113, in get_conn
    self.conn = psycopg2.connect(**conn_args)
  File "/usr/local/lib/python3.7/site-packages/psycopg2/__init__.py", line 127, in connect
    conn = _connect(dsn, connection_factory=connection_factory, **kwasync)
psycopg2.OperationalError: could not connect to server: Connection refused
    Is the server running on host "127.0.0.1" and accepting
    TCP/IP connections on port 54320?

As mentioned, the Postgres instance is running and the port mapping is active, as proven by the successful pgAdmin and psql logins.
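A direct psycopg2 check with the same details also succeeds when run on the host (a sanity-check sketch only; the database name and credentials here are placeholders):

```python
import psycopg2

# Same host/port as the Airflow connection; dbname/user/password are placeholders.
conn = psycopg2.connect(
    host="127.0.0.1",
    port=54320,
    dbname="mydb",
    user="myuser",
    password="mypassword",
)
print(conn.get_dsn_parameters())
```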

Any ideas?

Upvotes: 3

Views: 4306

Answers (1)

paltaa

Reputation: 3244

Use host.docker.internal as the connection host. It points to your host machine's localhost rather than the container's own localhost, and it will work as long as the Postgres port is mapped to your host's 5432 port.
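For example, the same fix can be expressed through Airflow's AIRFLOW_CONN_<CONN_ID> environment-variable convention; a rough sketch with placeholder credentials, reusing the my_postgres connection id assumed above:

```python
import os
from airflow.providers.postgres.hooks.postgres import PostgresHook

# Placeholder credentials; the important part is the host: host.docker.internal instead of 127.0.0.1.
# This assumes the Postgres container publishes its port on the host's 5432; if it is published
# on 54320 as in the question, use that port here instead.
os.environ["AIRFLOW_CONN_MY_POSTGRES"] = (
    "postgres://myuser:mypassword@host.docker.internal:5432/mydb"
)

hook = PostgresHook(postgres_conn_id="my_postgres")
conn = hook.get_conn()  # should now reach the Postgres instance published on the host
```

(On a Linux host, host.docker.internal may need to be added explicitly, e.g. by starting the Airflow container with --add-host=host.docker.internal:host-gateway.)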

Upvotes: 9
