I am using Spring Boot 0.5.0.M6 with Spring Batch. The configuration uses @EnableBatchProcessing, with the datasource etc. configured in application.properties.
During the first run of the application everything works fine, but after I stop and restart the application, the following error is seen:
org.springframework.dao.DuplicateKeyException: PreparedStatementCallback; SQL [INSERT into BATCH_JOB_INSTANCE(JOB_INSTANCE_ID, JOB_NAME, JOB_KEY, VERSION) values (?, ?, ?, ?)]; Duplicate entry '1' for key 'PRIMARY'; nested exception is com.mysql.jdbc.exceptions.jdbc4.MySQLIntegrityConstraintViolationException: Duplicate entry '1' for key 'PRIMARY'
at org.springframework.jdbc.support.SQLErrorCodeSQLExceptionTranslator.doTranslate(SQLErrorCodeSQLExceptionTranslator.java:239)
at org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:73)
at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:659)
at org.springframework.jdbc.core.JdbcTemplate.update(JdbcTemplate.java:908)
at org.springframework.jdbc.core.JdbcTemplate.update(JdbcTemplate.java:969)
at org.springframework.jdbc.core.JdbcTemplate.update(JdbcTemplate.java:974)
When digging down, I observed the following lines in the logs:
2013-12-06 12:12:37 INFO ResourceDatabasePopulator:162 - Executing SQL script from class path resource [org/springframework/batch/core/schema-mysql.sql]
2013-12-06 12:12:37 INFO ResourceDatabasePopulator:217 - Done executing SQL script from class path resource [org/springframework/batch/core/schema-mysql.sql] in 13 ms.
The root problem here was that schema-drop-mysql.sql was not triggered but schema-mysql.sql was, thereby creating two entries in BATCH_JOB_SEQ.
To resolve this, I added
@EnableAutoConfiguration(exclude={BatchAutoConfiguration.class})
However, because of this I need to execute schema-mysql.sql explicitly. That is OK for now, but it will become a problem when the Spring Batch version is updated and its schema changes.
Hence I have a couple of questions:
1. How can I auto-configure batch so that it also executes schema-drop-mysql.sql before schema-mysql.sql?
2. Is there a way to configure this BatchDatabaseInitializer to run in a kind of "update" mode?
Regards
With the current version of the Spring Batch auto-configuration that isn't possible. With the upcoming version it will be possible to disable the automatic creation of the database tables by setting the spring.batch.initializer.enabled property to false in application.properties.
IMHO you shouldn't use the automatic creation/update features to create schemas; either do it yourself or use a tool like Liquibase or Flyway to do it in a more controlled way.
Also see https://stackoverflow.com/questions/8418814/db-migration-tool-liquibase-or-flyway
You can always execute schema-drop-mysql.sql yourself. As a workaround you could add a @PreDestroy method to your @Configuration class which executes this script (maybe you could even add it to a @Configuration class which is only enabled in a dev mode/profile), as sketched below.
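A minimal sketch of that workaround, assuming the auto-configured DataSource can be injected and that a "dev" profile should guard the cleanup (the class and profile names are illustrative, not part of the original setup):

import javax.annotation.PreDestroy;
import javax.sql.DataSource;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Profile;
import org.springframework.core.io.ClassPathResource;
import org.springframework.jdbc.datasource.init.DatabasePopulatorUtils;
import org.springframework.jdbc.datasource.init.ResourceDatabasePopulator;

@Configuration
@Profile("dev") // hypothetical profile so the Batch tables are only dropped in development
public class BatchSchemaCleanupConfiguration {

    @Autowired
    private DataSource dataSource;

    @PreDestroy
    public void dropBatchTables() {
        // Run Spring Batch's own drop script against the configured datasource on shutdown
        ResourceDatabasePopulator populator = new ResourceDatabasePopulator();
        populator.addScript(new ClassPathResource("org/springframework/batch/core/schema-drop-mysql.sql"));
        DatabasePopulatorUtils.execute(populator, dataSource);
    }
}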
Related
My application is based on Spring Boot 2.2.2.RELEASE and PostgreSQL. I am relying on Spring's AutoConfiguration as far as persistence is concerned. My application.properties file contains the following:
# Persistence
dbVendor=postgresql
# Basic connection options
spring.dataSource.driver-class-name=org.postgresql.Driver
spring.dataSource.url=jdbc:postgresql://is-0001/<database>
spring.dataSource.username=<username>
spring.dataSource.password=<password>
# Hibernate options
spring.jpa.properties.hibernate.dialect=org.hibernate.dialect.PostgreSQLDialect
spring.jpa.properties.hibernate.format_sql=false
spring.jpa.properties.hibernate.hbm2ddl.auto=create
spring.jpa.properties.hibernate.show_sql=false
spring.jpa.properties.hibernate.implicit_naming_strategy=<package>.ImplicitNamingStrategyImpl
spring.jpa.properties.hibernate.physical_naming_strategy=<package>.PhysicalNamingStrategyImpl
# Options to create sql scripts
spring.jpa.properties.javax.persistence.schema-generation.scripts.action=create
spring.jpa.properties.javax.persistence.schema-generation.scripts.create-target=/development/projects/<project>/backend/sql/setup/createDb.sql
#spring.jpa.properties.javax.persistence.schema-generation.scripts.drop-target=/development/projects/<project>/backend/sql/setup/dropDb.sql
For some reason the setting spring.jpa.properties.hibernate.hbm2ddl.auto=create is ignored by Spring - independent of its value - whereas the spring.jpa.properties.javax... properties are applied correctly, which is easy to verify by looking at the generated SQL files (createDb.sql and dropDb.sql).
Does anybody have an idea what the reason for this behaviour could be? I would really be thankful, as I have been trying to find the root cause of this issue for more than a day now.
Just as a side note: Spring Boot 2.0.5.RELEASE behaves the same.
You can have a look here:
https://docs.spring.io/spring-boot/docs/current/reference/html/howto.html#howto-initialize-a-database-using-hibernate
You need to set the property: spring.jpa.hibernate.ddl-auto=create
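For example, in application.properties (a minimal sketch; the assumption, following the linked how-to, is that this Spring Boot property takes the place of the pass-through hbm2ddl entry above):

# Let Spring Boot drive Hibernate's schema generation
spring.jpa.hibernate.ddl-auto=create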
I need to add the values below to my existing datasource through CLI commands on a JBoss EAP server:
<connection-property name="auto Commit">false</connection-property>
<transaction-isolation>TRANSACTION_READ_COMMITTED</transaction-isolation>
I have tried to use the command below, but it says duplicate resource.
TRANSACTION_READ_COMMITTED is the attribute value for transaction isolation.
transaction-isolation: this element specifies the java.sql.Connection transaction isolation level to use. The constants defined in the Connection interface are the possible values and include:
TRANSACTION_READ_UNCOMMITTED
TRANSACTION_READ_COMMITTED
TRANSACTION_REPEATABLE_READ
TRANSACTION_SERIALIZABLE
TRANSACTION_NONE
You can use the following command to add any of the above strategies:
/profile=full/subsystem=datasources/data-source=ExampleDS:add(transaction-isolation=<Strategy>)
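If the datasource already exists (which is presumably why the earlier attempt reported a duplicate resource), a hedged sketch along the following lines may help; ExampleDS and the /profile=full prefix are placeholders mirroring the command above, and the connection property name is copied verbatim from the XML:

# set the isolation level on an already existing datasource
/profile=full/subsystem=datasources/data-source=ExampleDS:write-attribute(name=transaction-isolation,value=TRANSACTION_READ_COMMITTED)
# add the connection property as a child resource of the datasource
/profile=full/subsystem=datasources/data-source=ExampleDS/connection-properties="auto Commit":add(value=false)

A server reload may be required before the new settings take effect.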
In an existing Grails 3.1.15 application, recently upgraded to Hibernate 5.1, an odd issue with afterInsert (or other) hooks started to appear. After some hours of testing, I could track this down to the Hibernate 5.1 + PostgreSQL combination - the issue is not reproducible with H2. To reproduce, create a simple application consisting of two domain classes - an Audit trail and a User - as shown here:
class Audit {
    Date dateCreated
    String auditData
}

class User {
    String name
    String email

    def afterInsert() {
        new Audit(auditData: "User created: $this").save()
    }
}
The code above works OK with Hibernate 4. However, if the application is upgraded to the hibernate5 plugin + Hibernate 5.1.x (tested with 5.1.0.Final and 5.1.5.Final), the scenario above will always lead to a ConcurrentModificationException when you attempt to save a new User instance. You can just use a scaffolded controller to reproduce. Note this only happens with PostgreSQL as the data source - with H2 it still works OK.
Now, according to the GORM docs (see chapter 8.1.3), one should use a new session when attempting to save other objects in beforeUpdate or afterInsert hooks anyway:
def afterInsert() {
    Audit.withNewSession() {
        new Audit(auditData: "User created: $this").save()
        /* no exception logged, but Audit instance not preserved */
    }
}
But this doesn't really resolve the issue with PostgreSQL. The exception is gone and the User instance is persisted, but the Audit instance is not saved. Again, this code works OK with H2. To really avoid the issue with PostgreSQL, you have to manually flush the session in afterInsert:
def afterInsert() {
    Audit.withNewSession() { session ->
        new Audit(auditData: "User created: $this").save()
        session.flush()
        /* finally no exceptions, both User and Audit saved */
    }
}
Question: is this a bug, or is this expected? I find it a bit suspicious that a manual flush is required for the Audit instance to be persisted - even more so when I see that it works without a flush on H2 and only seems to affect PostgreSQL. But I couldn't really find any reports - any pointers are appreciated!
For the sake of completeness, I tested with the following JDBC driver versions for PostgreSQL:
runtime 'org.postgresql:postgresql:9.3-1101-jdbc41'
runtime 'org.postgresql:postgresql:9.4.1208.jre7'
runtime 'org.postgresql:postgresql:42.0.0'
And for the upgrade to Hibernate 5.1, the following dependencies were used:
classpath "org.grails.plugins:hibernate5:5.0.13"
...
compile "org.grails.plugins:hibernate5:5.0.13"
compile "org.hibernate:hibernate-core:5.1.5.Final"
compile "org.hibernate:hibernate-ehcache:5.1.5.Final"
I am following the basic tutorial in the Spring Data Cassandra reference at http://docs.spring.io/spring-data/cassandra/docs/1.1.0.RC1/reference/html/ and I am running into the following exception:
java.lang.IllegalArgumentException: Environment must not be null!
at org.springframework.util.Assert.notNull(Assert.java:112)
at org.springframework.data.repository.config.RepositoryConfigurationSourceSupport.<init>(RepositoryConfigurationSourceSupport.java:50)
at org.springframework.data.repository.config.AnnotationRepositoryConfigurationSource.<init>(AnnotationRepositoryConfigurationSource.java:74)
at org.springframework.data.repository.config.RepositoryBeanDefinitionRegistrarSupport.registerBeanDefinitions(RepositoryBeanDefinitionRegistrarSupport.java:74)
at org.springframework.context.annotation.ConfigurationClassParser.processImport(ConfigurationClassParser.java:394)
at org.springframework.context.annotation.ConfigurationClassParser.doProcessConfigurationClass(ConfigurationClassParser.java:204)
at org.springframework.context.annotation.ConfigurationClassParser.processConfigurationClass(ConfigurationClassParser.java:163)
at org.springframework.context.annotation.ConfigurationClassParser.parse(ConfigurationClassParser.java:138)
at org.springframework.context.annotation.ConfigurationClassPostProcessor.processConfigBeanDefinitions(ConfigurationClassPostProcessor.java:284)
at org.springframework.context.annotation.ConfigurationClassPostProcessor.postProcessBeanDefinitionRegistry(ConfigurationClassPostProcessor.java:225)
at org.springframework.context.support.AbstractApplicationContext.invokeBeanFactoryPostProcessors(AbstractApplicationContext.java:630)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:461)
at org.springframework.context.annotation.AnnotationConfigApplicationContext.<init>(AnnotationConfigApplicationContext.java:73)
at com.strides.platform.domain.UserRepositoryDaoTest.<init>(UserRepositoryDaoTest.java:28)
I have completed the steps mentioned in the document:
1) Use Cassandra Properties
2) Create Java configuration
3) Create domain and repository classes
I have autowired the Environment variable in the test classes. I checked a couple of sample projects and am not sure what more needs to be done.
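For reference, the "Create Java configuration" step from that tutorial boils down to something like the sketch below. The class and package names are assumed from the Spring Data Cassandra 1.1.x line, and the keyspace and base package are placeholders, so treat the details as assumptions rather than the exact code from the document:

import org.springframework.context.annotation.Configuration;
import org.springframework.data.cassandra.config.java.AbstractCassandraConfiguration;
import org.springframework.data.cassandra.repository.config.EnableCassandraRepositories;

@Configuration
@EnableCassandraRepositories(basePackages = "com.strides.platform.domain") // package containing the repository interfaces
public class CassandraConfig extends AbstractCassandraConfiguration {

    // Keyspace the repositories and template should work against
    @Override
    public String getKeyspaceName() {
        return "mykeyspace"; // placeholder keyspace name
    }
}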
I've encountered this error message and found the problem only occurring when I used Spring Framework version 3.2.8.RELEASE.
My solution was to upgrade to version 3.2.9.RELEASE.
See also java.lang.IllegalArgumentException: Environment must not be null
Sorry for the big wall of text, but it's mostly logs.
I've been trying to get help on the Seam forums, but in vain.
I'm trying the setup mentioned in the title, but without success.
I have everything installed correctly, and the problems start with seam-gen.
This is my build.properties:
#Generated by seam setup
#Sat Aug 29 19:12:18 BRT 2009
hibernate.connection.password=abc123
workspace.home=/home/rgoytacaz/workspace
hibernate.connection.dataSource_class=org.postgresql.ds.PGConnectionPoolDataSource
model.package=com.atom.Commerce.model
hibernate.default_catalog=PostgreSQL
driver.jar=/home/rgoytacaz/postgresql-8.4-701.jdbc4.jar
action.package=com.atom.Commerce.action
test.package=com.atom.Commerce.test
database.type=postgres
richfaces.skin=glassX
glassfish.domain=domain1
hibernate.default_schema=Core
database.drop=n
project.name=Commerce
hibernate.connection.username=postgres
glassfish.home=C\:/Program Files/glassfish-v2.1
hibernate.connection.driver_class=org.postgresql.Driver
hibernate.cache.provider_class=org.hibernate.cache.HashtableCacheProvider
jboss.domain=default
project.type=ear
icefaces.home=
database.exists=y
jboss.home=/srv/jboss-5.1.0.GA
driver.license.jar=
hibernate.dialect=org.hibernate.dialect.PostgreSQLDialect
hibernate.connection.url=jdbc\:postgresql\:Atom
icefaces=n
./seam create-project works okay, but when I try generate-entities, I get the following...
generate-model:
[echo] Reverse engineering database using JDBC driver /home/rgoytacaz/postgresql-8.4-701.jdbc4.jar
[echo] project=/home/rgoytacaz/workspace/Commerce
[echo] model=com.atom.Commerce.model
[hibernate] Executing Hibernate Tool with a JDBC Configuration (for reverse engineering)
[hibernate] 1. task: hbm2java (Generates a set of .java files)
[hibernate] log4j:WARN No appenders could be found for logger (org.hibernate.cfg.Environment).
[hibernate] log4j:WARN Please initialize the log4j system properly.
[javaformatter] Java formatting of 4 files completed. Skipped 0 file(s).
This is problem no. 1. How do I fix this? What is this? I had to do this in Eclipse, and it worked there.
Then I import the seam-gen-created project into Eclipse and deploy it to JBoss 5.1. While my server starts, I noticed the following:
03:18:56,405 ERROR [SchemaUpdate] Unsuccessful: alter table PostgreSQL.atom.productsculturedetail add constraint FKBD5D849BC0A26E19 foreign key (culture_Id) references PostgreSQL.atom.cultures
03:18:56,406 ERROR [SchemaUpdate] ERROR: cross-database references are not implemented: "postgresql.atom.productsculturedetail"
03:18:56,407 ERROR [SchemaUpdate] Unsuccessful: alter table PostgreSQL.atom.productsculturedetail add constraint FKBD5D849BFFFC9417 foreign key (product_Id) references PostgreSQL.atom.products
03:18:56,407 ERROR [SchemaUpdate] ERROR: cross-database references are not implemented: "postgresql.atom.productsculturedetail"
03:18:56,408 INFO [SchemaUpdate] schema update complete
Problem no. 2: what are these cross-database references?
And what about this:
03:18:55,089 INFO [SettingsFactory] JDBC driver: PostgreSQL Native Driver, version: PostgreSQL 8.4 JDBC3 (build 701)
Problem no. 3: I've said in build.properties to use the JDBC4 driver, yet I don't know why Seam insists on using the JDBC3 driver. Where do I change this?
When I go to http://localhost:5443/Commerce and try to browse the auto-generated CRUD UI, I get this error: Error reading 'resultList' on type com.atom.Commerce.action.ProductsList_$$_javassist_seam_2
And this is what shows up in my server logs:
03:34:00,828 INFO [STDOUT] Hibernate:
select
products0_.product_Id as product1_0_,
products0_.active as active0_
from
PostgreSQL.atom.products products0_ limit ?
03:34:00,848 WARN [JDBCExceptionReporter] SQL Error: 0, SQLState: 0A000
03:34:00,849 ERROR [JDBCExceptionReporter] ERROR: cross-database references are not implemented: "postgresql.atom.products"
Position: 81
03:34:00,871 SEVERE [viewhandler] Error Rendering View[/ProductsList.xhtml]
javax.el.ELException: /ProductsList.xhtml: Error reading 'resultList' on type com.atom.Commerce.action.ProductsList_$$_javassist_seam_2
Caused by: javax.persistence.PersistenceException: org.hibernate.exception.GenericJDBCException: could not execute query
Problem no. 4: what is going on here? Cross-database references again?
Thanks for any help with any of my problems.
You did receive a few answers on the Seam forums (here and here), but you didn't follow up. Anyway, all these are actually caused by one problem:
As Stuart Douglas told you, you shouldn't use a catalog when connecting to PostgreSQL. To fix this, replace the property "hibernate.default_catalog=PostgreSQL" in your properties file with the property "hibernate.default_catalog.null=", so that your file looks like this:
...
model.package=com.atom.Commerce.model
hibernate.default_catalog.null= # <-- This is the replaced property
driver.jar=/home/rgoytacaz/postgresql-8.4-701.jdbc4.jar
...
You should be able to use seam generate-entities fine afterwards (assuming the rest of your configuration is correct). I'd recommend doing the generation into a clean folder.
Cross-database references occur when a query tries to access two or more different databases. PostgreSQL does not support this, and complains when there is more than one period in the table name, so in PostgreSQL.atom.productsculturedetail the PostgreSQL. prefix has to go. Hibernate adds this prefix when you tell it to use a default catalog, which we already fixed in step 1 above (by telling it not to use a catalog), so this problem should be gone after you regenerate your entities.
(Note that this is effectively the same as what Stuart Douglas told you: remove the catalog="PostgreSQL" attribute from the annotations on your entity classes.)
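Concretely, on a generated entity this amounts to dropping the catalog attribute from the JPA @Table annotation. The class and column below are only an illustrative sketch, not the actual generated code:

import javax.persistence.Entity;
import javax.persistence.Id;
import javax.persistence.Table;

@Entity
// Before: @Table(name = "products", catalog = "PostgreSQL", schema = "atom")
// After - without a catalog, Hibernate no longer emits the "PostgreSQL." prefix in queries:
@Table(name = "products", schema = "atom")
public class Products {

    @Id
    private Integer productId;

    // remaining fields and getters/setters omitted
}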
When you specified the postgresql-8.4-701.jdbc4.jar file in the properties file, this didn't mean that the driver supports JDBC4. Although the name of the file would suggest so, the driver's website clearly states that "The driver provides a reasonably complete implementation of the JDBC 3 specification". This shouldn't be a problem for you, as you're not using the driver directly (or at least you're not supposed to). The driver is sufficient for Hibernate to fulfill its requirements and provide the required functionality.
This issue is caused by the same problem as above: Hibernate is unable to read data from the database because of the incorrect query. Fixing the catalog problem should fix this issue as well.