I am trying to import data from my local PostgreSQL database into Neo4j.
First, I load the JDBC driver into memory:
CALL apoc.load.driver("org.postgresql.Driver")
Then I run this query to ingest data from PostgreSQL:
WITH "jdbc:postgresql://localhost:5432/graph-test?user=kt" as url
CALL apoc.load.jdbc(url,"os.operating_systems") YIELD row AS line
MERGE (o:Os {name: line.name})
MERGE (of:OsFamily {name: line.familly})
MERGE (o)-[:FROM]->(of)
Unfortunately, I received this error:
Failed to invoke procedure `apoc.load.jdbc`: Caused by: java.net.ConnectException: Connection refused (Connection refused)
Could this error be caused by an incorrect URL, the JDBC plugin version, or something else?
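For what it's worth, "Connection refused" is a TCP-level failure: nothing reachable is listening on localhost:5432 from where the query actually runs (a common cause is Neo4j running in Docker while PostgreSQL listens on the host). A minimal sketch, assuming the PostgreSQL driver jar is on the classpath, to test the same URL outside of APOC:

import java.sql.Connection;
import java.sql.DriverManager;

public class JdbcUrlCheck {
    public static void main(String[] args) throws Exception {
        // Same URL as in the Cypher query; adjust host/port if Neo4j
        // runs in a container or on another machine.
        String url = "jdbc:postgresql://localhost:5432/graph-test?user=kt";
        try (Connection conn = DriverManager.getConnection(url)) {
            System.out.println("Connected: " + conn.getMetaData().getURL());
        }
    }
}

If this fails with the same "Connection refused", the problem is the database address or network path, not APOC or the plugin version.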
Related
I am facing the below exception while trying to connect to Snowflake from PySpark:
py4j.protocol.Py4JJavaError: An error occurred while calling o117.load.
: net.snowflake.client.jdbc.SnowflakeSQLException: !200051!
at net.snowflake.client.core.SFBaseSession.getHttpClientKey(SFBaseSession.java:321)
at net.snowflake.client.core.SFSession.open(SFSession.java:408)
at net.snowflake.client.jdbc.DefaultSFConnectionHandler.initialize(DefaultSFConnectionHandler.java:104)
at net.snowflake.client.jdbc.DefaultSFConnectionHandler.initializeConnection(DefaultSFConnectionHandler.java:79)
at net.snowflake.client.jdbc.SnowflakeConnectionV1.initConnectionWithImpl(SnowflakeConnectionV1.java:116)
at net.snowflake.client.jdbc.SnowflakeConnectionV1.<init>(SnowflakeConnectionV1.java:96)
at net.snowflake.client.jdbc.SnowflakeDriver.connect(SnowflakeDriver.java:172)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:208)
at net.snowflake.spark.snowflake.JDBCWrapper.getConnector(SnowflakeJDBCWrapper.scala:209)
It looks like you are behind a firewall or a proxy server. I suggest using the Snowflake connectivity diagnostic tool SnowCD to make sure that all Snowflake URLs are reachable. If you see any errors, you might want to check your firewall configuration or add a proxy configuration to the Spark connection.
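If a proxy does turn out to be the culprit, the Snowflake JDBC driver accepts proxy settings as connection properties. A rough sketch of testing connectivity through a proxy with plain JDBC, where the account URL, credentials, and proxy host/port are all placeholders:

import java.sql.Connection;
import java.sql.DriverManager;
import java.util.Properties;

public class SnowflakeProxyCheck {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("user", "MY_USER");          // placeholder
        props.put("password", "MY_PASSWORD");  // placeholder
        // Snowflake JDBC proxy parameters; host and port are assumptions.
        props.put("useProxy", "true");
        props.put("proxyHost", "proxy.example.com");
        props.put("proxyPort", "8080");
        String url = "jdbc:snowflake://myaccount.snowflakecomputing.com/";
        try (Connection conn = DriverManager.getConnection(url, props)) {
            System.out.println("Connected to Snowflake through proxy");
        }
    }
}

If this connects where the Spark job does not, pass the same proxy settings through to the Spark connector's options.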
I want to load a large dataset (750 GB) into Redshift. For this I use:
COPY LINEITEM FROM 's3://myBucket/'
CREDENTIALS 'aws_access_key_id=key;aws_secret_access_key=secret'
NULL AS '\000'
DELIMITER ','
REGION 'us-east-1'
ESCAPE;
After about 10 minutes I get
Unable to execute HTTP request: Connect to <some IP> failed: Connection refused (Connection refused)
I am able to load other datasets. What is the issue here?
It's likely that your connection was dropped due to a timeout. Please review the following document for steps to correct this issue:
"Troubleshooting connection issues in Amazon Redshift"
I am having trouble importing a SQL table into H2O.ai using the PostgreSQL JDBC driver on Ubuntu. I'm getting the following error:
ERROR MESSAGE:
SQLException: ERROR: relation "XXX" does not exist
Position: 22
Failed to connect and read from SQL database with connection_url: jdbc:postgresql://localhost:5432/...
I am executing H2O with the following command:
java -cp h2o.jar:/usr/share/java/postgresql-9.4.1212.jar water.H2OApp
The JDBC driver is installed, and I have already tried constructing the connection URL in several ways.
I'm using this one right now:
jdbc:postgresql://localhost:5432/XXX?&useSSL=false
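One thing worth ruling out before blaming the URL: "relation ... does not exist" comes from PostgreSQL itself, which folds unquoted identifiers to lower case, so a table created as "XXX" must be quoted when queried. A minimal sketch, with placeholder credentials, that checks the table over plain JDBC before involving H2O (the stray '&' after '?' in the URL above is dropped here):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class RelationCheck {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:postgresql://localhost:5432/XXX?useSSL=false";
        try (Connection conn = DriverManager.getConnection(url, "user", "pass");
             Statement stmt = conn.createStatement();
             // Quoting preserves case; without quotes PostgreSQL looks
             // for the lower-cased relation name "xxx".
             ResultSet rs = stmt.executeQuery("SELECT count(*) FROM \"XXX\"")) {
            if (rs.next()) {
                System.out.println("Rows: " + rs.getLong(1));
            }
        }
    }
}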
I am trying to export Aurora PostgreSQL to S3 through AWS Data Pipeline. However, I got this error: DriverClass not found for database:aurora
amazonaws.datapipeline.taskrunner.TaskExecutionException: Error copying record
    at amazonaws.datapipeline.activity.copy.SingleThreadedCopyActivity.processAll(SingleThreadedCopyActivity.java:65)
    at amazonaws.datapipeline.activity.copy.SingleThreadedCopyActivity.runActivity(SingleThreadedCopyActivity.java:35)
    at amazonaws.datapipeline.activity.CopyActivity.runActivity(CopyActivity.java:22)
    at amazonaws.datapipeline.objects.AbstractActivity.run(AbstractActivity.java:16)
    at amazonaws.datapipeline.taskrunner.TaskPoller.executeRemoteRunner(TaskPoller.java:136)
    at amazonaws.datapipeline.taskrunner.TaskPoller.executeTask(TaskPoller.java:105)
    at amazonaws.datapipeline.taskrunner.TaskPoller$1.run(TaskPoller.java:81)
    at private.com.amazonaws.services.datapipeline.poller.PollWorker.executeWork(PollWorker.java:76)
    at private.com.amazonaws.services.datapipeline.poller.PollWorker.run(PollWorker.java:53)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: DriverClass not found for database:aurora
    at private.com.amazonaws.services.datapipeline.database.RdsHelper.getDriverClass(RdsHelper.java:24)
    at amazonaws.datapipeline.database.ConnectionFactory.getRdsDatabaseConnection(ConnectionFactory.java:151)
    at amazonaws.datapipeline.database.ConnectionFactory.getConnection(ConnectionFactory.java:73)
    at amazonaws.datapipeline.database.ConnectionFactory.getConnectionWithCredentials(ConnectionFactory.java:278)
    at amazonaws.datapipeline.connector.SqlDataNode.createConnection(SqlDataNode.java:100)
    at amazonaws.datapipeline.connector.SqlDataNode.getConnection(SqlDataNode.java:94)
    at amazonaws.datapipeline.connector.SqlDataNode.prepareStatement(SqlDataNode.java:162)
    at amazonaws.datapipeline.connector.SqlInputConnector.open(SqlInputConnector.java:48)
    at amazonaws.datapipeline.connector.SqlInputConnector.<init>(SqlInputConnector.java:25)
    at amazonaws.datapipeline.connector.SqlDataNode.getInputConnector(SqlDataNode.java:79)
    at amazonaws.datapipeline.activity.copy.SingleThreadedCopyActivity.processAll(SingleThreadedCopyActivity.java:47)
The Data Pipeline node configuration is as follows:
type: RdsDatabase
Jdbc Driver Jar Uri: S3Url
The value of S3Url is the PostgreSQL driver downloaded from https://jdbc.postgresql.org/download.html and uploaded to a fixed S3 location.
According to the above error message, the PostgreSQL driver cannot be found. Where could this PostgreSQL JDBC driver be found, or is there a wrong configuration in Data Pipeline?
The issue was resolved after changing the PostgreSQL connection node as follows:
Type: JdbcDatabase
ConnectionString: jdbc:postgresql://.....
Jdbc Driver Class: org.postgresql.Driver
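For completeness, the "Jdbc Driver Class" value names the class Data Pipeline loads from the uploaded jar; with an RdsDatabase node of engine aurora it could not infer that class on its own. A small sketch of the equivalent lookup, with a placeholder Aurora endpoint and credentials:

import java.sql.Connection;
import java.sql.DriverManager;

public class DriverClassCheck {
    public static void main(String[] args) throws Exception {
        // Explicitly register the class named in "Jdbc Driver Class";
        // this is what the RdsDatabase node failed to resolve for
        // engine "aurora".
        Class.forName("org.postgresql.Driver");
        String url = "jdbc:postgresql://my-aurora-endpoint:5432/mydb"; // placeholder
        try (Connection conn = DriverManager.getConnection(url, "user", "pass")) {
            System.out.println("Driver resolved and connection opened");
        }
    }
}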
I am upgrading my Heroku database from Hobby Dev to Standard 0 (using the official instructions: https://devcenter.heroku.com/articles/upgrading-heroku-postgres-databases#upgrade-with-pg-copy-default).
All went well, until I promoted the new database and restarted the app. I then get the following error:
o.s.boot.SpringApplication : Application startup failed
...
org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'flywayInitializer' defined in class path resource [org/springframework/boot/autoconfigure/flyway/FlywayAutoConfiguration$FlywayConfiguration.class]: Invocation of init method failed; nested exception is org.flywaydb.core.api.FlywayException: Unable to obtain Jdbc connection from DataSource
...
Caused by: org.flywaydb.core.api.FlywayException: Unable to obtain Jdbc connection from DataSource
...
Caused by: org.postgresql.util.PSQLException: FATAL: no pg_hba.conf entry for host "54.xxx.xx.xxx", user "u94bf9vxxxxxx", database "d2mqk0b6xxxxxx", SSL off
...
If I swap back to the old database, everything works again. The only thing that I am changing is the promoted database.
Is there a difference between connecting to hobby and standard databases that I need to be aware of?
The relevant part of my application.yml looks as follows:
spring:
  datasource:
    driverClassName: org.postgresql.Driver
    url: ${JDBC_DATABASE_URL}
    username: ${JDBC_DATABASE_USERNAME}
    password: ${JDBC_DATABASE_PASSWORD}

flyway:
  enabled: true
  locations: classpath:db/migrations
Any suggestions on how I can debug this would be very welcome too.
It looks like you aren't connecting with SSL, which is required by Heroku PostgreSQL installs.
See Heroku's documentation on SSL for PostgreSQL.
See also Heroku's documentation on enabling SSL for JDBC connections.
You will need to add something like &ssl=true&sslfactory=org.postgresql.ssl.NonValidatingFactory to your JDBC URL.
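Concretely, this can be verified outside the app with a short sketch; the host, database, and credentials below are placeholders modeled on the obfuscated values in the error message:

import java.sql.Connection;
import java.sql.DriverManager;

public class HerokuSslCheck {
    public static void main(String[] args) throws Exception {
        String base = "jdbc:postgresql://ec2-54-xxx-xx-xxx"
                + ".compute-1.amazonaws.com:5432/d2mqk0b6xxxxxx";
        // Append the SSL parameters from the answer; use ? if the URL
        // has no query string yet, & if it already has one.
        String url = base + "?ssl=true&sslfactory=org.postgresql.ssl.NonValidatingFactory";
        try (Connection conn = DriverManager.getConnection(url, "u94bf9vxxxxxx", "password")) {
            System.out.println("Connected with SSL");
        }
    }
}

If this connects while the plain URL does not, append the same parameters to JDBC_DATABASE_URL (or let Heroku's generated JDBC_DATABASE_URL, which already includes them, be used as-is).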