Cannot connect to Postgres database instance on Google Cloud - postgresql

I'm having trouble connecting to the database on Google Cloud.
In my Dockerfile, I'm calling a Python script to connect to the database, as follows:
Dockerfile:
....
ADD script.py /home/script.py
CMD ["/home/script.py"]
ENTRYPOINT ["python"]
Python script
import sqlalchemy as db
# x.x.x.x is the public ip of the database instance
engine = db.create_engine('postgresql+psycopg2://user:db_pass@x.x.x.x/db_name')
connection = engine.connect()
metadata = db.MetaData()
experiments = db.Table('experiments', metadata, autoload=True, autoload_with=engine)
But I keep getting this error:
sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) could not connect to server: Operation timed out
Is the server running on host "x.x.x.x" and accepting
TCP/IP connections on port 5432?
Can someone help me? Thank you!!
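Aside from network reachability, a related pitfall with URL-style DSNs like the one above is that special characters in the password (such as @ or #) must be percent-encoded, or the URL parses incorrectly. A minimal stdlib sketch, using a hypothetical password:

```python
from urllib.parse import quote_plus

# Hypothetical password containing a character that would break a plain URL
password = "db_pass#2024"
url = f"postgresql+psycopg2://user:{quote_plus(password)}@x.x.x.x/db_name"
print(url)  # the '#' is encoded as '%23' so the host part parses correctly
```

SQLAlchemy also offers `sqlalchemy.engine.URL.create()`, which handles this escaping for you.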

Related

Connect to local Postgres from docker airflow

I have installed Airflow with Postgres using Docker Compose. I am able to connect to Postgres from Airflow by defining the connection in the Airflow website. Now I want to do a different thing. I installed Postgres locally on my PC (not in Docker). I would like Airflow, running in Docker, to access the PC's Postgres. How can I achieve that?
I've tried creating a test DAG as follows:
from airflow import DAG
from airflow.providers.postgres.hooks.postgres import PostgresHook
from airflow.operators.python import PythonOperator
from datetime import datetime
import psycopg2

def execute_query_with_psycopg(my_query, **kwargs):
    print(my_query)
    conn_args = dict(
        host='localhost',
        user='postgres',
        password='qaz',
        dbname='postgres',
        port=5432)
    conn = psycopg2.connect(**conn_args)
    cur = conn.cursor()
    cur.execute(my_query)
    for row in cur:
        print(row)

with DAG(dag_id="test2",
         start_date=datetime(2021, 1, 1),
         schedule_interval="@once",
         catchup=False) as dag:
    task1 = PythonOperator(
        task_id="test2_task",
        python_callable=execute_query_with_psycopg,
        op_kwargs={"my_query": 'select 1'})
    task1
Nevertheless, I am getting the following error:
conn = _connect(dsn, connection_factory=connection_factory, **kwasync)
psycopg2.OperationalError: could not connect to server: Connection refused
Is the server running on host "localhost" (127.0.0.1) and accepting
TCP/IP connections on port 5432?
could not connect to server: Cannot assign requested address
Is the server running on host "localhost" (::1) and accepting
TCP/IP connections on port 5432?
[2022-05-31 18:30:06,005] {taskinstance.py:1531} INFO - Marking task as FAILED. dag_id=test2, task_id=test2_task, execution_date=20220531T175542, start_date=20220531T183005, end_date=20220531T183006
[2022-05-31 18:30:06,098] {local_task_job.py:151} INFO - Task exited with return code 1
Docker does not use the same network as your computer, so when you run a PostgreSQL server locally and a Docker container locally, they are not necessarily connected.
When you started the images using Docker Compose, the situation was different, because Docker Compose creates a network bridge between the containers it starts.
Regarding how to connect your local PostgreSQL to the Airflow Docker image, you can try to consult the following question:
Allow docker container to connect to a local/host postgres database
If you want your Airflow Docker container to use your local PC's network (the entire network), you can start it with the --network=host parameter, but use it with caution :)
For anyone having the same issue: I resolved it by specifying the host as follows:
host='host.docker.internal'
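Applied to the DAG above, only the host changes; everything else stays the same. A minimal sketch, reusing the credentials from the question (replace them with your own):

```python
# Inside a Docker container, host.docker.internal resolves to the host
# machine (Docker Desktop, or Linux with the host-gateway extra host),
# where the locally installed Postgres is listening.
conn_args = dict(
    host='host.docker.internal',
    user='postgres',
    password='qaz',
    dbname='postgres',
    port=5432)

# With a reachable server you would then connect as before:
# conn = psycopg2.connect(**conn_args)

# Shown here as a libpq-style DSN string for illustration:
dsn = " ".join(f"{k}={v}" for k, v in conn_args.items())
print(dsn)
```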

Could not connect to Azure cloud hyperscale citus

I tried connecting my local PostgreSQL to Azure via Hyperscale (Citus) but got this error.
psql: error: could not connect to server: could not connect to server:
Connection timed out (0x0000274C/10060)
Is the server running on host "mydemo-c.postgres.database.azure.com" (20.42.29.217) and accepting
TCP/IP connections on port 5432?
I tried the steps suggested earlier, like re-configuring the conf files and opening all listening ports, but it didn't work out. Any help?

Unable to Connect to AWS RDS Postgres from local system

I have created an AWS RDS Instance with Postgres 10.6
I am trying to connect to it from my local system using below command:
psql --host=dev.xyz.ap-south-1.rds.amazonaws.com --port=5432 --user="postgres" --password --dbname=abc
The inbound rules I have set are:
Allow TCP traffic on 5432 from anywhere.
Still, I am getting the below error:
psql: could not connect to server: Connection timed out
Is the server running on host "dev.xyz.ap-south-1.rds.amazonaws.com" (xxx.xxx.xxx.xxx) and accepting
TCP/IP connections on port 5432?
If Publicly accessible = No, then you will not be able to access the RDS database from outside the VPC.
This is because the DNS Name of the database will not resolve to an IP address.
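You can verify this symptom without psql: if the instance is not publicly accessible, the endpoint's DNS name fails to resolve (or resolves only to a private IP) from outside the VPC. A small stdlib sketch of such a resolution check:

```python
import socket

def resolves(hostname: str) -> bool:
    """Return True if the hostname resolves to an IP address."""
    try:
        socket.gethostbyname(hostname)
        return True
    except socket.gaierror:
        return False

# A private RDS endpoint queried from outside the VPC typically fails here;
# "localhost" is used only to demonstrate the check itself.
print(resolves("localhost"))
```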

How to start postgresql server in Heroku?

I'm trying to deploy my app on Heroku but I keep getting the same error I get locally when trying to start the server before starting the database:
psycopg2.OperationalError: could not connect to server: Connection refused
Is the server running on host "127.0.0.1" and accepting
TCP/IP connections on port 5432?
Normally, I would just type sudo service postgresql start and that would solve it but it doesn't seem to work on Heroku bash.
Any suggestions?
You need to check whether the PostgreSQL add-on is added to your project; if not, you will need to add it by running:
heroku addons:create heroku-postgresql:hobby-dev
For more options, follow this link: https://www.heroku.com/postgres
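Once the add-on is provisioned, there is no server for you to start: Heroku manages it and injects the connection string into the DATABASE_URL environment variable, which your app should read instead of assuming 127.0.0.1. A minimal sketch, with a hypothetical local fallback value:

```python
import os

# On Heroku, the Postgres add-on sets DATABASE_URL; the fallback here is a
# hypothetical value for local development only.
db_url = os.environ.get("DATABASE_URL", "postgresql://localhost:5432/postgres")
print(db_url)
```

Note that Heroku has historically supplied a postgres:// scheme, which newer SQLAlchemy versions reject; rewriting the prefix to postgresql:// is a common workaround.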

Not able to connect to postgres on aws ubuntu

When I run python3 manage.py makemigrations, it throws the following error:
File "/usr/local/lib/python3.5/dist-packages/psycopg2/__init__.py", line 164, in connect
conn = _connect(dsn, connection_factory=connection_factory, async=async)
django.db.utils.OperationalError: could not connect to server: Connection refused
Is the server running on host "localhost" (127.0.0.1) and accepting
TCP/IP connections on port 5432?
So I checked whether Postgres is working fine.
I tried sudo su - postgres.
It dropped me into the postgres command prompt, so it became
postgres@ip-10-254-3-58:~$
Now when I try psql I get the same error as when I run python3 manage.py makemigrations.
postgres@ip-10-254-3-58:~$ psql
psql: could not connect to server: No such file or directory
Is the server running locally and accepting connections on Unix domain socket "/var/run/postgresql/.s.PGSQL.5432"?
Please help me in fixing this issue.
TIA
Kind Regards,
Bharath AK