I tried to generate a Liquibase XML changelog with the liquibase diff command.
I want to do this based on this documentation: http://www.liquibase.org/documentation/diff.html
When I try to execute the example command with my custom properties, including
--driver=org.postgresql.Driver,
I get this error:
"Unexpected error running Liquibase: java.lang.RuntimeException:
Cannot find database driver: org.postgresql.Driver"
You have to specify the classpath where the driver JAR file is located, using the classpath key (see the Liquibase documentation).
Your properties file should look like:
driver: org.postgresql.Driver
classpath: postgresql-driver.jar
url: jdbc:postgresql://localhost:5432/mydb
username: scott
password: tiger
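(The mydb database name above is a placeholder.) For reference, a diff run using this properties file might look like the following sketch; the defaults-file name and the reference connection values are assumptions, and the reference flags are those described in the diff documentation linked above:

# sketch: compare the target database from liquibase.properties
# against a reference database (names and credentials assumed)
liquibase --defaultsFile=liquibase.properties \
          --referenceUrl="jdbc:postgresql://localhost:5432/refdb" \
          --referenceUsername=scott \
          --referencePassword=tiger \
          diff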
I have created a new environment for my application and called it docker. I'm trying things out, so I set it up like this:
application-docker.yml
micronaut:
  application:
    name: time
  server:
    netty:
      access-logger:
        enabled: true
        logger-name: access-logger
datasources:
  default:
    url: jdbc:postgresql://db:5432/postgres
    driverClassName: org.postgresql.Driver
    username: postgres
    password: postgres
    schema-generate: CREATE_DROP
    dialect: POSTGRES
    schema: time
jpa.default.properties.hibernate.hbm2ddl.auto: update
flyway:
  datasources:
    default:
      enabled: true
      schemas: time
...
However, when I try to run my app like this:
java -jar target/timeshare-0.1.jar -Dmicronaut.environments=docker -Dcom.sun.management.jmxremote -Xmx128m
It fails... because it can't connect to localhost!
08:11:00.949 [main] INFO com.zaxxer.hikari.HikariDataSource - HikariPool-1 - Starting...
08:11:02.013 [main] ERROR com.zaxxer.hikari.pool.HikariPool - HikariPool-1 - Exception during pool initialization.
org.postgresql.util.PSQLException: Connection to localhost:5432 refused. Check that the hostname and port are correct and that the postmaster is accepting TCP/IP connections.
at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:285)
Why is it trying to connect to localhost? What am I missing?
It seems that Micronaut is not able to locate the application-docker.yml file and is therefore using the default one.
Note that you can pass, for example, -Dmicronaut.environments=not-existing-profile, and even though that profile does not exist, no error is shown.
So, make sure the application-docker.yml file is in the src/main/resources directory, and also that the file is really exported into the resulting JAR during the build and is located in the root of the archive:
target/timeshare-0.1-all.jar
├── com
├── META-INF
├── org
├── application-docker.yml
├── application.yml
├── logback.xml
...
How are you building the resulting JAR? If you use the shadowJar task, it must contain everything.
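To check, you can list the archive contents with the JDK's jar tool (the path is taken from the tree above):

jar tf target/timeshare-0.1-all.jar | grep 'application.*\.yml'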
Another option is to use the MICRONAUT_ENVIRONMENTS environment variable:
export MICRONAUT_ENVIRONMENTS=docker
But this behaves the same way as the -Dmicronaut.environments=docker startup option.
Another option is to specify the exact path to the application-docker.yml configuration file via the micronaut.config.files startup option:
java -jar target/timeshare-0.1-all.jar -Dmicronaut.config.files=/some/external/location/application-docker.yml
I am trying to export an Aurora PostgreSQL database to S3 through AWS Data Pipeline. However, I get this error: DriverClass not found for database:aurora
amazonaws.datapipeline.taskrunner.TaskExecutionException: Error copying record
    at amazonaws.datapipeline.activity.copy.SingleThreadedCopyActivity.processAll(SingleThreadedCopyActivity.java:65)
    at amazonaws.datapipeline.activity.copy.SingleThreadedCopyActivity.runActivity(SingleThreadedCopyActivity.java:35)
    at amazonaws.datapipeline.activity.CopyActivity.runActivity(CopyActivity.java:22)
    at amazonaws.datapipeline.objects.AbstractActivity.run(AbstractActivity.java:16)
    at amazonaws.datapipeline.taskrunner.TaskPoller.executeRemoteRunner(TaskPoller.java:136)
    at amazonaws.datapipeline.taskrunner.TaskPoller.executeTask(TaskPoller.java:105)
    at amazonaws.datapipeline.taskrunner.TaskPoller$1.run(TaskPoller.java:81)
    at private.com.amazonaws.services.datapipeline.poller.PollWorker.executeWork(PollWorker.java:76)
    at private.com.amazonaws.services.datapipeline.poller.PollWorker.run(PollWorker.java:53)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: DriverClass not found for database:aurora
    at private.com.amazonaws.services.datapipeline.database.RdsHelper.getDriverClass(RdsHelper.java:24)
    at amazonaws.datapipeline.database.ConnectionFactory.getRdsDatabaseConnection(ConnectionFactory.java:151)
    at amazonaws.datapipeline.database.ConnectionFactory.getConnection(ConnectionFactory.java:73)
    at amazonaws.datapipeline.database.ConnectionFactory.getConnectionWithCredentials(ConnectionFactory.java:278)
    at amazonaws.datapipeline.connector.SqlDataNode.createConnection(SqlDataNode.java:100)
    at amazonaws.datapipeline.connector.SqlDataNode.getConnection(SqlDataNode.java:94)
    at amazonaws.datapipeline.connector.SqlDataNode.prepareStatement(SqlDataNode.java:162)
    at amazonaws.datapipeline.connector.SqlInputConnector.open(SqlInputConnector.java:48)
    at amazonaws.datapipeline.connector.SqlInputConnector.<init>(SqlInputConnector.java:25)
    at amazonaws.datapipeline.connector.SqlDataNode.getInputConnector(SqlDataNode.java:79)
    at amazonaws.datapipeline.activity.copy.SingleThreadedCopyActivity.processAll(SingleThreadedCopyActivity.java:47)
The data pipeline node configuration is as below:
type: RdsDatabase
Jdbc Driver Jar Uri: S3Url
The value of S3Url is the PostgreSQL driver downloaded from this page, https://jdbc.postgresql.org/download.html, and uploaded to a fixed S3 location.
According to the error message above, the PostgreSQL driver cannot be found. Where should this PostgreSQL JDBC driver be placed, or is there a wrong configuration in the pipeline?
The issue was resolved after changing the PostgreSQL connection node as follows:
Type: JdbcDatabase
ConnectionString: jdbc:postgresql://.....
Jdbc Driver Class: org.postgresql.Driver
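Combined with the Jdbc Driver Jar Uri field from the original node, the working configuration might look like this sketch (the S3 path and connection details are placeholders):

Type: JdbcDatabase
ConnectionString: jdbc:postgresql://<aurora-endpoint>:5432/<database>
Jdbc Driver Class: org.postgresql.Driver
Jdbc Driver Jar Uri: s3://<your-bucket>/postgresql-<version>.jar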
My input: a collection ("demo1") in MongoDB (version 3.4.4).
My output: my data imported into a Hive database ("demo2") (version 1.2.1.2.3.4.7-4).
Purpose: create a connector between Mongo and Hive.
Error:
Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. com/mongodb/util/JSON
I tried two solutions, following these steps (but the error remains):
1) I created a local collection in Mongo (via Robomongo) connected to Docker.
2) I uploaded these versions of the JARs and added them in Hive:
ADD JAR /home/.../mongo-hadoop-hive-2.0.2.jar;
ADD JAR /home/.../mongo-hadoop-core-2.0.2.jar;
ADD JAR /home/.../mongo-java-driver-3.4.2.jar;
Unfortunately the error didn't change, and since I was unsure which versions were right for my export, I also tried these:
ADD JAR /home/.../mongo-hadoop-hive-1.3.0.jar;
ADD JAR /home/.../mongo-hadoop-core-1.3.0.jar;
ADD JAR /home/.../mongo-java-driver-2.13.2.jar;
3) I created an external table:
CREATE EXTERNAL TABLE demo2
(
id INT,
name STRING,
password STRING,
email STRING
)
STORED BY 'com.mongodb.hadoop.hive.MongoStorageHandler'
WITH SERDEPROPERTIES('mongo.columns.mapping'='{"id":"_id","name":"name","password":"password","email":"email"}')
TBLPROPERTIES('mongo.uri'='mongodb://localhost:27017/local.demo1');
The error returned in Hive:
Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. com/mongodb/util/JSON
How can I resolve this problem?
Copying the correct JAR files (mongo-hadoop-core-2.0.2.jar, mongo-hadoop-hive-2.0.2.jar, mongo-java-driver-3.2.2.jar) onto ALL the nodes of the cluster did the trick for me.
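If it helps, the copy can be scripted along these lines; the hostnames and the Hive lib directory are assumptions for illustration:

# copy the connector JARs to every node (hostnames and lib path assumed)
for node in node1 node2 node3; do
  scp mongo-hadoop-core-2.0.2.jar mongo-hadoop-hive-2.0.2.jar \
      mongo-java-driver-3.2.2.jar "$node:/usr/lib/hive/lib/"
done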
Other points to watch out for:
Follow all steps mentioned here religiously - https://github.com/mongodb/mongo-hadoop/wiki/Hive-Usage#installation
Adhere to the requirements given here - https://github.com/mongodb/mongo-hadoop#requirements
Other useful links
https://github.com/mongodb/mongo-hadoop/wiki/FAQ#i-get-a-classnotfoundexceptionnoclassdeffounderror-when-using-the-connector-what-do-i-do
https://groups.google.com/forum/#!topic/mongodb-user/xMVoTSePgg0
I want to initialise my datasource with a DDL script in my Spring Boot project (only during dev, of course). As mentioned in the docs here, I set the spring.datasource.schema property to the DDL script, which is in src/main/resources/postgresql/define-schema.sql.
spring:
  profiles: dev
  datasource:
    platform: postgresql
    driver-class-name: org.postgresql.Driver
    url: jdbc:postgresql://localhost:5432/postgres
    username: postgres
    password: ****
    initialize: true
    schema: ./postgresql/define-schema.sql
    continue-on-error: false
  jpa:
    hibernate:
      ddl-auto: validate
    generate-ddl: false
    show-sql: true
But the script won't be executed. I also tried putting it on the classpath root and calling it schema.sql ... Nothing happens.
The dev profile is selected; at least I see it in the log: The following profiles are active: dev
The application then fails on the JPA schema validation.
The only warning I get from Hibernate:
Found use of deprecated [org.hibernate.id.SequenceGenerator] sequence-based id generator; use org.hibernate.id.enhanced.SequenceStyleGenerator instead. See Hibernate Domain Model Mapping Guide for details.
But I don't think this has anything to do with the initialisation problem.
I've got spring-boot-starter-security in my dependencies, but it's not configured yet; could that be a source of the problem?
Does anybody recognise an obvious typo, a mistake, or anything else?
Looking forward to hearing from you!
Amp
Prefix the path to your SQL script with classpath:
Example:
spring.datasource.schema=classpath:/postgresql/define-schema.sql
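In the YAML layout used in the question, the same fix looks like:

spring:
  datasource:
    schema: classpath:/postgresql/define-schema.sql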
I am upgrading my Heroku database from hobby-dev to Standard 0 (using the official instructions: https://devcenter.heroku.com/articles/upgrading-heroku-postgres-databases#upgrade-with-pg-copy-default).
All went well until I promoted the new database and restarted the app. I then get the following error:
o.s.boot.SpringApplication : Application startup failed
...
org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'flywayInitializer' defined in class path resource [org/springframework/boot/autoconfigure/flyway/FlywayAutoConfiguration$FlywayConfiguration.class]: Invocation of init method failed; nested exception is org.flywaydb.core.api.FlywayException: Unable to obtain Jdbc connection from DataSource
...
Caused by: org.flywaydb.core.api.FlywayException: Unable to obtain Jdbc connection from DataSource
...
Caused by: org.postgresql.util.PSQLException: FATAL: no pg_hba.conf entry for host "54.xxx.xx.xxx", user "u94bf9vxxxxxx", database "d2mqk0b6xxxxxx", SSL off
...
If I swap back to the old database, everything works again. The only thing that I am changing is the promoted database.
Is there a difference between connecting to hobby and standard databases that I need to be aware of?
The relevant part of my application.yml looks as follows:
spring:
  datasource:
    driverClassName: org.postgresql.Driver
    url: ${JDBC_DATABASE_URL}
    username: ${JDBC_DATABASE_USERNAME}
    password: ${JDBC_DATABASE_PASSWORD}
flyway:
  enabled: true
  locations: classpath:db/migrations
Any suggestions on how I can debug this would be very welcome too.
It looks like you aren't connecting with SSL, which is required by Heroku PostgreSQL installs.
See Heroku's documentation on SSL for PostgreSQL.
See also Heroku's documentation on enabling SSL for JDBC connections.
You will need to add something like &ssl=true&sslfactory=org.postgresql.ssl.NonValidatingFactory to your JDBC URL.
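For example, the resulting URL might look like this (host, database, and credentials are placeholders):

jdbc:postgresql://<host>:5432/<database>?user=<user>&password=<password>&ssl=true&sslfactory=org.postgresql.ssl.NonValidatingFactory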