Connecting to AWS PostgreSQL from Neomodel Django - postgresql

I am trying to use the neomodel package in my Django code, which is designed as a back-end service. The problem I am facing is that I have already established a PostgreSQL connection from Django, but I am having difficulty using the same database setup with neomodel. From the official neomodel website, I can see:
from neomodel import db
db.set_connection('bolt://neo4j:neo4j@localhost:7687')
Is there any feasible solution for connecting neomodel to an external database for graph analytics? Any help would be appreciated.
Thanks,
Winston

Try to use
from neomodel import db as neodb
neodb.set_connection('bolt://neo4j:neo4j@localhost:7687')
This code imports the object called db from the neomodel namespace into the current namespace and gives it the alias neodb, so it does not clash with anything else named db in your Django code.
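To illustrate the expected shape of the connection URL (note the @ separating the credentials from the host, and that neo4j:neo4j is just the driver's default user and password), here is a minimal sketch that pulls the parts out with Python's standard library:

```python
from urllib.parse import urlparse

# A bolt URL follows the usual scheme://user:password@host:port layout.
url = "bolt://neo4j:neo4j@localhost:7687"
parts = urlparse(url)

print(parts.scheme)    # bolt
print(parts.username)  # neo4j
print(parts.password)  # neo4j
print(parts.hostname)  # localhost
print(parts.port)      # 7687
```

If any of those parts come out wrong, set_connection will be handed a URL the driver cannot use.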

Related

Issues connecting to a Google Cloud SQL instance from Google Cloud Run

I have a postgresql Google Cloud SQL instance and I'm trying to connect a FastAPI application running in Google Cloud Run to it, however I'm getting ConnectionRefusedError: [Errno 111] Connection refused errors.
My application uses the databases package for async database connections:
database = databases.Database(sqlalchemy_database_uri)
Which then tries to connect on app startup through:
@app.on_event("startup")
async def startup() -> None:
    if not database.is_connected:
        await database.connect()  # <-- this is where the error is raised
The documentation here suggests forming the connection string like so:
"postgresql+psycopg2://user:pass@/dbname?unix_sock=/cloudsql/PROJECT_ID:REGION:INSTANCE_NAME/.s.PGSQL.5432"
I've tried several different variations of the url, with host instead of unix_sock as the sqlalchemy docs seem to suggest, as well as removing the .s.PGSQL.5432 at the end as I've seen some other SO posts suggest, all to no avail.
I've added the Cloud SQL connection to the instance in the Cloud Run dashboard and added a Cloud SQL Client role to the service account.
I'm able to connect to the databases locally with the Cloud SQL Auth Proxy.
I'm at a bit of a loss on how to fix this, or even how to debug it, as there doesn't seem to be an easy way to ssh into the container and try things out. Any help would be greatly appreciated, thanks!
UPDATE
I'm able to connect directly with sqlalchemy with:
from sqlalchemy import create_engine
engine = create_engine(url)
engine.connect()
Where url is any of these formats:
"postgresql://user:pass@/db_name?host=/cloudsql/PROJECT_ID:REGION:INSTANCE_NAME"
"postgresql+psycopg2://user:pass@/db_name?host=/cloudsql/PROJECT_ID:REGION:INSTANCE_NAME"
"postgresql+pg8000://user:pass@/db_name?unix_sock=/cloudsql/PROJECT_ID:REGION:INSTANCE_NAME/.s.PGSQL.5432"
Is there something about the databases package's async nature that's causing issues?
Turns out this was a bug in the databases package. It should now be resolved by https://github.com/encode/databases/pull/423
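As a small illustration of the working URL shapes above, here is a sketch that assembles the host-style connection string. The helper name and the my-project:us-central1:my-db instance name are invented for the example; the password is percent-encoded so special characters don't break URL parsing:

```python
from urllib.parse import quote_plus

def build_cloudsql_url(user: str, password: str, db_name: str, instance: str) -> str:
    """Build a SQLAlchemy-style Postgres URL using the Cloud SQL unix socket.

    `instance` is the PROJECT_ID:REGION:INSTANCE_NAME connection name.
    """
    # The host part is left empty; the socket directory goes in the query string.
    return (
        f"postgresql+psycopg2://{user}:{quote_plus(password)}@/{db_name}"
        f"?host=/cloudsql/{instance}"
    )

url = build_cloudsql_url("user", "p@ss", "db_name", "my-project:us-central1:my-db")
print(url)
# postgresql+psycopg2://user:p%40ss@/db_name?host=/cloudsql/my-project:us-central1:my-db
```

Printing the assembled URL before handing it to the library is a cheap way to catch a stray @ or an unencoded password.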

How to connect a MongoDb atlas database with Jaspersoft studio

I'm developing a Java Spring Boot application with a MongoDB database (MongoDB Atlas) and trying to generate reports from the back end with Jasper reporting services. I have followed several tutorials on this, but all of them show how to connect to a local database. Since I'm using MongoDB Atlas, I'm wondering how to supply the Mongo URI while setting up the Data Adapter (see image).
If anyone knows a better approach to generating reports without using Jasper reporting, please mention that as well. Thanks in advance!
You need to have MongoDB Atlas properly set up before you can connect to it.
First, click on Database Access under Security in the MongoDB Atlas dashboard and create a new user.
Then click on Network Access, again under Security, and whitelist your IP (the screenshot shows the "allow access from anywhere" setting, which lets anyone connect to the db).
Finally, go to your cluster and click Connect, and you will be presented with a dialog. Copy the connection string from it and paste it into your Jaspersoft connection interface.
Make sure you replace the <password> placeholder with the password of the user created above.
That should work!
UPDATE: here is a screenshot of a successful connection in Jaspersoft Studio.
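One detail that often trips people up with Atlas connection strings: if the user's password contains characters that are not legal in a URI, it must be percent-encoded before being substituted for <password>. A minimal sketch of assembling the URI (the cluster host, user, and password below are made up):

```python
from urllib.parse import quote_plus

# Hypothetical Atlas credentials -- replace with your own.
user = "reportUser"
password = "p@ss/word"          # contains characters illegal in a URI
host = "cluster0.abcde.mongodb.net"
db_name = "mydb"

# mongodb+srv is the scheme Atlas connection strings use.
uri = (
    f"mongodb+srv://{user}:{quote_plus(password)}@{host}/{db_name}"
    "?retryWrites=true&w=majority"
)
print(uri)
```

The same encoded URI works wherever a Mongo URI is expected, including the Jaspersoft Data Adapter dialog.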

Neo4j import postgres data from VM

I'm trying to import a Postgres database from a VM, and use Neo4j to visualize the graph.
For a sample of such a database, I am able to load the data using ETL tools and run it on my local Windows machine using Neo4j desktop.
However, when I try to load the database from the VM (CentOS), I cannot load it via the ETL tool. It gives me this error:
Connection failed. SQL state: 08001, message: Connection to localhost:7687 refused. Check that the hostname and port are correct and that the postmaster is accepting TCP/IP connections.
And on my VM, my Neo4j status looks like this:
Am I doing this correctly? Or should I not be using the ETL tool to load the external database in the first place? What is the best approach in my case?
Please advise and thank you so much!
You need to specify the username and password appropriate for logging into the Neo4j server.

how can I migrate the database dump of MongoDB into PostgreSQL?

I replicated the application using PostgreSQL as the database, but later I came to know that the application was actually using MongoDB, and the dump of the app I was given is in JSON format, exported from MongoDB.
So any help with migrating the MongoDB dump into PostgreSQL would be much appreciated.
Thank you!
You can migrate MongoDB to PostgreSQL using MoSQL.
Follow the GitHub page of MoSQL; it explains how to do the migration.
See this blog post for more information.
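Since the dump is already JSON, a hand-rolled load is also workable if MoSQL doesn't fit. A rough sketch, assuming a mongoexport-style file with one JSON document per line (the field names and sample documents are invented for illustration):

```python
import json

def dump_to_rows(jsonl_text, fields):
    """Turn a mongoexport-style JSON-lines dump into tuples for an INSERT."""
    rows = []
    for line in jsonl_text.splitlines():
        if not line.strip():
            continue
        doc = json.loads(line)
        # Missing keys become NULLs rather than raising.
        rows.append(tuple(doc.get(f) for f in fields))
    return rows

# Two made-up documents standing in for the real dump.
sample = '{"_id": "1", "name": "Ada"}\n{"_id": "2", "name": "Grace"}'
rows = dump_to_rows(sample, ["_id", "name"])
print(rows)  # [('1', 'Ada'), ('2', 'Grace')]
```

The resulting tuples can then be fed to something like psycopg2's executemany against a matching table, e.g. INSERT INTO users (id, name) VALUES (%s, %s). Nested documents and Mongo's extended-JSON types ($oid, $date) need flattening first, which this sketch does not attempt.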

Connection to CloudBees database using MySQL Workbench

I've just uploaded my locally developed app to CloudBees. It works fine: I can load the web pages and it can access the database.
However, I cannot connect to its database (also provided by CloudBees) using MySQL Workbench or the command line tool. It always says
Can't connect to MySQL server on 'ec2-50-19-213-178.compute-1.amazonaws.com' (10060)
Any CloudBees configuration that I might be missing?
Double-check your database connection parameters using the SDK: bees db:info -p <databasename>
You should be able to connect to the DB using MySQL Workbench and other MySQL tools.
The MySQL forum has a collection of links covering various types of connections using MySQL Workbench. One is probably especially interesting for you, as it deals with Amazon RDS databases. Among other things, it shows which connection parameters are needed.
It turned out there were some firewall problems in the corporate router that prevented me from connecting before. I tried at home and it worked.