Problem generating DDL in a Spring Data JPA application - jpa

We have a working Spring Boot (2.1.3) application. For local development we use:
jpa:
  hibernate:
    ddl-auto: create-drop
Now we need to generate a DDL file for our DB guys (preferably during the build). I tried setting these additional properties:
javax:
  persistence:
    schema-generation:
      create-source: metadata
      action: create
      create-target: create.sql
With these settings in place (ddl-auto changed to none) I started up my application. While it started fine, there is no create.sql file to be found.
Because I want the DDL file to be generated during the build, I added a test:
@RunWith(SpringRunner.class)
@DataJpaTest
@TestPropertySource(locations = "classpath:/testproperties/ddlgenerate.yml")
@AutoConfigureTestDatabase(replace = Replace.NONE)
public class GenerateDDL {

    @Autowired
    private EntityManager em;

    @Test
    public void generateDDL() {
        em.close();
        em.getEntityManagerFactory().close();
    }
}
I read somewhere that the DDL should be generated when the EntityManager is instantiated?!
The referenced classpath:/testproperties/ddlgenerate.yml only contains
spring:
  jpa:
    properties:
      javax:
        persistence:
          schema-generation:
            create-source: metadata
            action: create
            create-target: create.sql
The log indicates that the properties are loaded. The test is green, but there is still no DDL file.
So how do I get a DDL file generated (preferably during the build)?

The problem is that @TestPropertySource does not support YAML files as a source. Everything works as soon as the YAML is converted to a properties file.
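For reference, a rough sketch of the same settings flattened into a properties file (the keys simply mirror the YAML from the question; the file name is whatever @TestPropertySource points at):
spring.jpa.properties.javax.persistence.schema-generation.create-source=metadata
spring.jpa.properties.javax.persistence.schema-generation.action=create
spring.jpa.properties.javax.persistence.schema-generation.create-target=create.sql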

Related

SpringBootTest + JPA + Kafka - context is not loading properly during testing

I have a Spring Boot application with Kafka and JPA in it. I am using H2 as my in-memory database. I don't want Kafka to come up for every test class execution. I have two test classes: KafkaConsumerTest and JPATest. KafkaConsumerTest is annotated with @SpringBootTest and it loads the entire application perfectly and passes all the tests. However, for JPATest I don't want to bring up the entire application, just the few parts of the context needed to test the JPA-related changes. When I do that, it throws the following exception:
Caused by: java.lang.IllegalArgumentException: dataSource or dataSourceClassName or jdbcUrl is required.
at com.zaxxer.hikari.HikariConfig.validate(HikariConfig.java:958)
at com.zaxxer.hikari.HikariDataSource.getConnection(HikariDataSource.java:109)
at org.eclipse.persistence.sessions.JNDIConnector.connect(JNDIConnector.java:138)
at org.eclipse.persistence.sessions.DatasourceLogin.connectToDatasource(DatasourceLogin.java:172)
at org.eclipse.persistence.internal.sessions.DatabaseSessionImpl.setOrDetectDatasource(DatabaseSessionImpl.java:233)
at org.eclipse.persistence.internal.sessions.DatabaseSessionImpl.loginAndDetectDatasource(DatabaseSessionImpl.java:815)
at org.eclipse.persistence.internal.jpa.EntityManagerFactoryProvider.login(EntityManagerFactoryProvider.java:256)
at org.eclipse.persistence.internal.jpa.EntityManagerSetupImpl.deploy(EntityManagerSetupImpl.java:769)
I am passing the datasource with jdbcUrl in my application.yml file
src/test/resources/application.yml
spring:
  datasource:
    jdbcUrl: jdbc:h2:mem:mydb
    url: jdbc:h2:mem:mydb
    driverClassName: org.h2.Driver
    username: sa
  kafka:
    bootstrap-servers: ${spring.embedded.kafka.brokers}
KafkaConsumerTest.java
@RunWith(SpringRunner.class)
@SpringBootTest(classes = Application.class)
@DirtiesContext
@EmbeddedKafka(partitions = 1,
        topics = {"${kafka.topic}"})
public class KafkaConsumerTest {
JpaTest.java
@RunWith(SpringRunner.class)
@ContextConfiguration(initializers = ConfigFileApplicationContextInitializer.class, classes = {JPAConfiguration.class})
public class NotificationServiceTest {
I tried setting the loader to AnnotationConfigContextLoader.class, but it gave me the same error. I tried specifying application.yml explicitly using @TestPropertySource, but still got the same error:
@TestPropertySource(locations = {"classpath:application.yml"})
I think I am not loading the context properly here, and the values from application.yml are not being picked up or parsed.
Any suggestions on how to resolve this?
I was able to solve the issue. The cause was that the Spring context was not getting loaded properly for the other tests because I was not using @SpringBootTest. To bypass the error while loading the Spring Boot context only once, I created a base class like this:
@RunWith(SpringRunner.class)
@SpringBootTest(classes = Application.class)
@DirtiesContext
@EmbeddedKafka(partitions = 1,
        topics = {"${kafka.topic}"})
public abstract class AbstractSpringBootTest {
}
Now every test class has to extend this class, as in the following code. This way the Spring test context is loaded only once, provided it doesn't get dirtied during the test run.
public class MyTest extends AbstractSpringBootTest {
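Fleshed out a little, such a subclass might look like the sketch below (the repository and test names are hypothetical, purely to illustrate reusing the shared context):
public class NotificationJpaTest extends AbstractSpringBootTest {

    @Autowired
    private NotificationRepository repository; // hypothetical repository

    @Test
    public void savesAndReadsAnEntity() {
        // JPA-related assertions go here; the Spring Boot context is the
        // same one shared by every test extending AbstractSpringBootTest
    }
}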
Posting the solution that worked for me for other people's reference.

Spring Boot Postgres: Failed to determine a suitable driver class

I am trying to develop a web application using Spring Boot and a Postgres database. However, on connecting to the application, I am getting the error "Failed to determine a suitable driver class".
As advised in older posts, I have tried using different versions of the JDBC driver and also tried creating the bean for NamedParameterJdbcTemplate manually. I also validated that the libraries are present, accessible from Java code, and on the classpath. But it is still giving the same issue.
I am using Gradle to import all the JARs into the build path.
Here is the git repository for the code:
https://github.com/ashubisht/sample-sbs.git
Gradle dependency code:
apply plugin: 'idea'
apply plugin: 'org.springframework.boot'
apply plugin: 'io.spring.dependency-management'

dependencies {
    compile("org.springframework.boot:spring-boot-starter-web")
    compile("org.springframework.boot:spring-boot-starter-websocket")
    compile("org.springframework.boot:spring-boot-starter-jdbc")
    //compile("org.postgresql:postgresql")
    compile("org.postgresql:postgresql:9.4-1206-jdbc42")
    testCompile("org.springframework.boot:spring-boot-starter-test")
    testCompile group: 'junit', name: 'junit', version: '4.12'
}
Code for building the bean:
@Configuration
@PropertySource("classpath:application.properties")
public class Datasource {

    @Value("${db.driverClassName}")
    private String driverClass;

    @Value("${db.url}")
    private String url;

    @Value("${db.username}")
    private String username;

    @Value("${db.password}")
    private String password;

    @Bean
    public NamedParameterJdbcTemplate namedParameterJdbcTemplate() throws Exception {
        System.out.println(driverClass + " " + url + " " + username + " " + password);
        DriverManagerDataSource source = new DriverManagerDataSource();
        source.setDriverClassName(driverClass);
        source.setUrl(url);
        source.setUsername(username);
        source.setPassword(password);
        NamedParameterJdbcTemplate namedParameterJdbcTemplate = new NamedParameterJdbcTemplate(source);
        return namedParameterJdbcTemplate;
    }
}
Here is application.properties
server.port=8086
#spring.datasource.driverClassName=org.postgresql.Driver
#spring.datasource.url= jdbc:postgresql://localhost:5432/testdb
#spring.datasource.username=postgres
#spring.datasource.password=password
#spring.datasource.platform=postgresql
#spring.jpa.hibernate.ddl-auto=create-drop
db.driverClassName=org.postgresql.Driver
db.url=jdbc:postgresql://localhost:5432/testdb
db.username=postgres
db.password=password
The issue was resolved by creating two beans: a separate bean for the DataSource and one for the NamedParameterJdbcTemplate.
@Bean
public DataSource dataSource() {
    System.out.println(driverClass + " " + url + " " + username + " " + password);
    DriverManagerDataSource source = new DriverManagerDataSource();
    source.setDriverClassName(driverClass);
    source.setUrl(url);
    source.setUsername(username);
    source.setPassword(password);
    return source;
}

@Bean
public NamedParameterJdbcTemplate namedParameterJdbcTemplate() {
    NamedParameterJdbcTemplate namedParameterJdbcTemplate = new NamedParameterJdbcTemplate(this.dataSource());
    return namedParameterJdbcTemplate;
}
For me the issue was a misspelling of postgresql (it has only one "s"). Replace
spring.datasource.url=jdbc:postgres://localhost:5432/databaseName
spring.datasource.url=jdbc:postgressql://localhost:5432/databaseName
with
spring.datasource.url=jdbc:postgresql://localhost:5432/databaseName
Also check the same thing for the Hibernate dialect: replace PostgresSQLDialect with PostgreSQLDialect.
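For example, if the dialect is set explicitly in application.properties, the corrected entry would look roughly like this (assuming the common spring.jpa.properties.hibernate.dialect key):
spring.jpa.properties.hibernate.dialect=org.hibernate.dialect.PostgreSQLDialect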
Had the same problem.
The solution for me was to convert the application.properties file into application.yml.
For me the error was:
Failed to configure a DataSource: 'url' attribute is not specified and no embedded datasource could be configured.
Reason: Failed to determine a suitable driver class
Action:
Consider the following:
If you want an embedded database (H2, HSQL or Derby), please put it on the classpath.
If you have database settings to be loaded from a particular profile you may need to activate it (no profiles are currently active).
The issue was a missing profile, so I added the following property and it worked:
spring.profiles.active=dev
Please try this:
spring.r2dbc.url=r2dbc:postgresql://ip:port/datafeed?currentSchema=user_management
spring.r2dbc.username=username
spring.r2dbc.password=12345
spring.r2dbc.driver=postgresql
Hope this helps!
I got the same error. It happens when you install STS version 3. I found the solution to this problem by trial and error.
The error occurred because changes to application.properties were not being picked up: after changing the port number in application.properties to 9090, the console still showed the default port 8080 when running the application.
So do a Maven clean and a Maven build of your Spring Boot application. After that, run the application normally as a Spring Boot application; the database will get connected and the application will start.

Spring Boot with application managed persistence context

I am trying to migrate an application from EJB3 + JTA + JPA (EclipseLink). Currently, this application makes use of an application-managed persistence context because the number of databases is unknown at design time.
The application-managed persistence context allows us to control how EntityManagers are created (e.g. supply different datasource JNDI names to create the proper EntityManager for a specific DB at runtime).
E.g.
Map properties = new HashMap();
properties.put(PersistenceUnitProperties.TRANSACTION_TYPE, "JTA");
// the datasource JNDI name comes from configuration, with no prior knowledge about the number of databases
// currently the DB JNDI names are stored in an externalized file
// the datasources are set up by the operations team
properties.put(PersistenceUnitProperties.JTA_DATASOURCE, "datasource-jndi");
properties.put(PersistenceUnitProperties.CACHE_SHARED_DEFAULT, "false");
properties.put(PersistenceUnitProperties.SESSION_NAME, "xxx");

// create the proper EntityManager to connect to the database decided at runtime
EntityManager em = Persistence.createEntityManagerFactory("PU1", properties).createEntityManager();

// query or update the DB
em.persist(entity);
em.createQuery(...).executeUpdate();
When deployed in an EJB container (e.g. WebLogic), with the proper TransactionAttribute (e.g. TransactionAttributeType.REQUIRED), the container takes care of starting/ending/rolling back the transaction.
Now I am trying to migrate this application to Spring Boot.
The problem I encounter is that no transaction is started even after I annotate the method with @Transactional(propagation = Propagation.REQUIRED).
The Spring application is packaged as an executable JAR file and runs with embedded Tomcat.
When I try to execute those update APIs, e.g. EntityManager.persist(..), EclipseLink always complains:
javax.persistence.TransactionRequiredException: 'No transaction is currently active'
Sample code below:
// for data persistence
@Service
class DynamicServiceImpl implements DynamicService {

    // attempt to start a transaction
    @Transactional(propagation = Propagation.REQUIRED)
    public void saveData(String dbJndi, EntityA entity) {
        // this returns false: no transaction has been started
        TransactionSynchronizationManager.isActualTransactionActive();

        // create an EntityManager based on the given JNDI name to dynamically
        // determine which DB to save the data to
        EntityManager em = createEm(dbJndi);

        // save the data
        em.persist(entity);
    }
}

// RESTful service
@RestController
class RestController {

    @Autowired
    DynamicService service;

    @RequestMapping(value = "/saveRecord", method = RequestMethod.POST)
    public @ResponseBody String saveRecord() {
        // save the data
        service.saveData(...)
    }
}

// startup application
@SpringBootApplication
class TestApp {
    public static void main(String[] args) {
        SpringApplication.run(TestApp.class, args);
    }
}
persistence.xml
-------------------------------------------
<persistence-unit name="PU1" transaction-type="JTA">
    <properties>
        <!-- comment for spring to handle transaction??? -->
        <!-- <property name="eclipselink.target-server" value="WebLogic_10"/> -->
    </properties>
</persistence-unit>
-------------------------------------------
application.properties (just 3 lines of config)
-------------------------------------------
spring.jta.enabled=true
spring.jta.log-dir=spring-test # Transaction logs directory.
spring.jta.transaction-manager-id=spring-test
-------------------------------------------
My usage pattern does not follow the most typical use cases (e.g. a known number of DBs, as in Spring + JPA + multiple persistence units: Injecting EntityManager).
Can anybody give me advice on how to solve this issue?
Has anybody ever hit this situation where the DBs are not known at design time?
Thank you.
I finally got it to work with:
Enabling Tomcat JNDI and creating the datasource JNDI entries for each DS programmatically
Adding the transaction dependencies (a Gradle sketch follows below):
com.atomikos:transactions-eclipselink:3.9.3 (my project uses EclipseLink instead of Hibernate)
org.springframework.boot:spring-boot-starter-jta-atomikos
org.springframework:spring-tx
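As a rough Gradle sketch of those dependencies (using the older compile notation seen earlier in this thread; only the transactions-eclipselink version was stated above, the rest is left to Spring Boot's dependency management):
dependencies {
    compile("org.springframework.boot:spring-boot-starter-jta-atomikos")
    compile("org.springframework:spring-tx")
    compile("com.atomikos:transactions-eclipselink:3.9.3")
}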
You have pretty much answered the question yourself: "When deployed in an EJB container (e.g. WebLogic), with the proper TransactionAttribute (e.g. TransactionAttributeType.REQUIRED), the container will take care of the transaction start/end/rollback."
WebLogic is compliant with the Java Enterprise Edition specification, which is probably why it worked before, but now you are using Tomcat (in embedded mode), which is NOT.
So you simply cannot do what you are trying to do.
This statement in your persistence.xml file:
<persistence-unit name="PU1" transaction-type="JTA">
requires an Enterprise Server (WebLogic, Glassfish, JBoss etc.)
With Tomcat you can only do this:
<persistence-unit name="PU1" transaction-type="RESOURCE_LOCAL">
And you need to handle transactions yourself:
myEntityManager.getTransaction().begin();
... // do your transactional work
myEntityManager.getTransaction().commit();
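A minimal sketch of that pattern, reusing the property-map style from the question and assuming the persistence unit has been switched to RESOURCE_LOCAL (the JNDI name and entity are placeholders taken from the question):
Map<String, Object> properties = new HashMap<>();
properties.put(PersistenceUnitProperties.TRANSACTION_TYPE, "RESOURCE_LOCAL");
// non-JTA datasource looked up from Tomcat's JNDI tree (placeholder name)
properties.put(PersistenceUnitProperties.NON_JTA_DATASOURCE, "datasource-jndi");

EntityManager em = Persistence.createEntityManagerFactory("PU1", properties).createEntityManager();
EntityTransaction tx = em.getTransaction();
try {
    tx.begin();
    em.persist(entity);
    tx.commit();
} catch (RuntimeException e) {
    // roll back manually, since there is no container-managed transaction
    if (tx.isActive()) {
        tx.rollback();
    }
    throw e;
} finally {
    em.close();
}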

Autowiring issues with a class in Spring data Mongo repository

For a variety of reasons, I ended up using Spring Boot 1.2.0 RC2.
So a Spring Data Mongo application that worked fine in Spring Boot 1.1.8 is now having issues. No code was changed except for the bump to Spring Boot 1.2.0 RC2, which was driven by the snapshot version of Spring Cloud moving to this Spring Boot version.
The repository class is as follows
@Repository
public interface OAuth2AccessTokenRepository extends MongoRepository<OAuth2AuthenticationAccessToken, String> {
    public OAuth2AuthenticationAccessToken findByTokenId(String tokenId);
    public OAuth2AuthenticationAccessToken findByRefreshToken(String refreshToken);
    public OAuth2AuthenticationAccessToken findByAuthenticationId(String authenticationId);
    public List<OAuth2AuthenticationAccessToken> findByClientIdAndUserName(String clientId, String userName);
    public List<OAuth2AuthenticationAccessToken> findByClientId(String clientId);
}
This worked quite well before the bump in versions and now I see this in the log.
19:04:35.510 [main] DEBUG o.s.c.a.ClassPathBeanDefinitionScanner - Ignored because not a concrete top-level class: file [/Users/larrymitchell/rpilprojects/corerpilservicescomponents/channelMap/target/classes/com/cisco/services/rpil/mongo/repository/oauth2/OAuth2AccessTokenRepository.class]
I do have another Mongo repository that is recognized, but it was defined as a class implementation:
@Component
public class ChannelMapRepository { ... }
This one is recognized (I defined it as an implementation class as a workaround for another problem I had) and seems to work fine.
19:04:35.513 [main] DEBUG o.s.c.a.ClassPathBeanDefinitionScanner - Identified candidate component class: file [/Users/larrymitchell/rpilprojects/corerpilservicescomponents/channelMap/target/classes/com/cisco/services/rpil/services/Microservice.class]
Does anyone have an idea why? I looked up the various reasons why component scanning might not work, and none of them applies to my issue.
Try removing the @Repository annotation? Worked for me. This was reported as an issue on GitHub as well.
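In other words, the same interface from the question, just without the annotation; Spring Data's repository scanning picks it up on its own (a sketch based on the code above):
public interface OAuth2AccessTokenRepository extends MongoRepository<OAuth2AuthenticationAccessToken, String> {
    OAuth2AuthenticationAccessToken findByTokenId(String tokenId);
    // ... remaining finder methods unchanged
}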

OpenEJB + EclipseLink are not able to create tables on HSQL database

I have a vanilla Maven WAR project, using the Java EE web profile, that executes its unit/integration tests using OpenEJB. During OpenEJB start-up, instead of using the data source defined in jndi.properties, OpenEJB creates its own:
INFO - Auto-creating a Resource with id 'Default JDBC Database' of type 'DataSource for 'scmaccess-unit'.
INFO - Creating Resource(id=Default JDBC Database)
INFO - Configuring Service(id=Default Unmanaged JDBC Database, type=Resource, provider-id=Default Unmanaged JDBC Database)
INFO - Auto-creating a Resource with id 'Default Unmanaged JDBC Database' of type 'DataSource for 'scmaccess-unit'.
INFO - Creating Resource(id=Default Unmanaged JDBC Database)
INFO - Adjusting PersistenceUnit scmaccess-unit <jta-data-source> to Resource ID 'Default JDBC Database' from 'jdbc/scmaccess'
INFO - Adjusting PersistenceUnit scmaccess-unit <non-jta-data-source> to Resource ID 'Default Unmanaged JDBC Database' from 'null'
And then, further below, when it's time to create the tables - as per the create-drop strategy defined in the app's persistence.xml file - I see several errors like this:
(...) Internal Exception: java.sql.SQLSyntaxErrorException: type not found or user lacks privilege: NUMBER
Error Code: -5509
The jndi.properties file:
##
# Context factory to use during tests
##
java.naming.factory.initial=org.apache.openejb.client.LocalInitialContextFactory
##
# The DataSource to use for testing
##
scmDatabase=new://Resource?type=DataSource
scmDatabase.JdbcDriver=org.hsqldb.jdbcDriver
scmDatabase.JdbcUrl=jdbc:hsqldb:mem:scmaccess
##
# Override persistence unit properties
##
scmaccess-unit.eclipselink.jdbc.batch-writing=JDBC
scmaccess-unit.eclipselink.target-database=Auto
scmaccess-unit.eclipselink.ddl-generation=drop-and-create-tables
scmaccess-unit.eclipselink.ddl-generation.output-mode=database
And, the test case:
public class PersistenceTest extends TestCase {

    @EJB
    private GroupManager ejb;

    @Resource
    private UserTransaction transaction;

    @PersistenceContext
    private EntityManager emanager;

    public void setUp() throws Exception {
        EJBContainer.createEJBContainer().getContext().bind("inject", this);
    }

    public void test() throws Exception {
        transaction.begin();
        try {
            Group g = new Group("Saas Automation");
            emanager.persist(g);
        } finally {
            transaction.commit();
        }
    }
}
It looks like EclipseLink is trying to create a column with the type NUMBER, and that type does not exist in HSQL. Did you specify that type in your mappings? If yes, then fix that.
Otherwise it might help to add
<property name="eclipselink.ddl-generation" value="drop-and-create-tables"/>
<property name="eclipselink.create-ddl-jdbc-file-name" value="createDDL_ddlGeneration.jdbc"/>
<property name="eclipselink.drop-ddl-jdbc-file-name" value="dropDDL_ddlGeneration.jdbc"/>
<property name="eclipselink.ddl-generation.output-mode" value="both"/>
to your persistence.xml so you can see exactly which CREATE TABLE statements are generated. If EclipseLink is using NUMBER on its own for certain columns, you can tell it to use something else by putting the following annotation on the corresponding fields:
@Column(columnDefinition="NUMERIC")
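For instance, on the entity used in the test above, the mapping might look something like this (the annotated field is hypothetical, just to show where the annotation goes):
@Entity
public class Group {

    @Id
    @GeneratedValue
    private Long id;

    // force a column type HSQL understands instead of the generated NUMBER
    @Column(columnDefinition = "NUMERIC")
    private Long memberCount; // hypothetical field
}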