Can't use @Transactional in Spring Boot test - MongoDB

I've created a simple application using Spring Boot, Kotlin & MongoDB. Now I would like to use the @Transactional annotation in the test CustomerServiceIntegrationTest so that each test case rolls back any changes it makes to the database.
When I add @Transactional I get the message Failed to retrieve PlatformTransactionManager for @Transactional test.
Writing to the database otherwise works fine.
What did I miss?
https://github.com/urswiss/customer
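For what it's worth, that error usually means no PlatformTransactionManager bean is present; Spring Boot does not auto-configure one for MongoDB. A minimal sketch of what might be missing, assuming Spring Data MongoDB and a MongoDB deployment that actually supports transactions (i.e. a replica set); the configuration class name is illustrative:

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.MongoDatabaseFactory;
import org.springframework.data.mongodb.MongoTransactionManager;

@Configuration
public class MongoTransactionConfig {

    // Registers the transaction manager that @Transactional (tests included) looks for
    @Bean
    public MongoTransactionManager transactionManager(MongoDatabaseFactory dbFactory) {
        return new MongoTransactionManager(dbFactory);
    }
}

With such a bean in place, @Transactional on the test class should roll back each test's writes, provided the server supports multi-document transactions.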

Related

Hazelcast reactive support for WriteThrough in Hazelcast 4.2

We wanted to implement write-through/write-behind in Hazelcast using MapStore, with Postgres as the data store accessed through a reactive client, but we could not get it to work and have not found any sample for the reactive approach.
We are implementing the MapStore interface to achieve the write-through caching pattern.
We have to inject the repository into the MapStore and are trying to inject it with the @Autowired annotation.
But that does not work; the repository comes up as null.
Can we get a working example for Hazelcast 4.2 with Postgres as the data store in a reactive way?
Thanks
Jeni Ambrose
We tried autowiring the repositories in the MapStore implementation.
Here's one way:
@Bean
public Config config(AccountRepository accountRepository) {
    Config config = new ClasspathYamlConfig("hazelcast.yml");

    MapStoreConfig accountMapStoreConfig = new MapStoreConfig();
    accountMapStoreConfig.setInitialLoadMode(MapStoreConfig.InitialLoadMode.EAGER);
    accountMapStoreConfig.setEnabled(true);
    accountMapStoreConfig.setImplementation(new AccountMapLoader(accountRepository));

    MapConfig accountMapConfig = new MapConfig();
    accountMapConfig.setName("account");
    accountMapConfig.setMapStoreConfig(accountMapStoreConfig);

    config.getMapConfigs().put(accountMapConfig.getName(), accountMapConfig);
    return config;
}
You can keep most (or none!) of the config in a static file, load it, and amend it to add the Spring-managed bits.
Also, 4.2 is old; use 5.2.1 or whatever is newest by the time you get to this.
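For completeness, a rough sketch of what the AccountMapLoader used above could look like; Account and AccountRepository are placeholders for your own types, and a blocking repository is assumed because the MapLoader/MapStore SPI is synchronous (with a reactive repository you would have to block inside these methods):

import java.util.Collection;
import java.util.HashMap;
import java.util.Map;

import com.hazelcast.map.MapLoader;

public class AccountMapLoader implements MapLoader<Long, Account> {

    private final AccountRepository accountRepository;

    // The repository arrives via the constructor (see the @Bean method above),
    // so no @Autowired field injection is needed.
    public AccountMapLoader(AccountRepository accountRepository) {
        this.accountRepository = accountRepository;
    }

    @Override
    public Account load(Long key) {
        return accountRepository.findById(key).orElse(null);
    }

    @Override
    public Map<Long, Account> loadAll(Collection<Long> keys) {
        Map<Long, Account> result = new HashMap<>();
        for (Long key : keys) {
            Account account = load(key);
            if (account != null) {
                result.put(key, account);
            }
        }
        return result;
    }

    @Override
    public Iterable<Long> loadAllKeys() {
        // Returning null skips eager key pre-loading
        return null;
    }
}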

JPA EntityManagerFactory with AbstractRoutingDataSource containing multiple DB vendors?

So I have a perfectly good working example of using AbstractRoutingDataSource and JdbcTemplate with Oracle, Sybase and MsSql databases in the same running Spring Boot application. I use AOP and a custom annotation on the method so that it sets the data source name on the thread, and the AbstractRoutingDataSource then hands the correct data source to JdbcTemplate when a query runs (roughly as sketched below).
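That routing piece is typically little more than a ThreadLocal plus an override of determineCurrentLookupKey(); a rough sketch for reference (the class and key names are illustrative, not the asker's actual code):

import org.springframework.jdbc.datasource.lookup.AbstractRoutingDataSource;

public class VendorRoutingDataSource extends AbstractRoutingDataSource {

    // Set by the AOP aspect before the annotated method runs, cleared afterwards
    private static final ThreadLocal<String> CURRENT_KEY = new ThreadLocal<>();

    public static void setKey(String key) {
        CURRENT_KEY.set(key);
    }

    public static void clearKey() {
        CURRENT_KEY.remove();
    }

    @Override
    protected Object determineCurrentLookupKey() {
        return CURRENT_KEY.get();
    }
}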
Now the issue I am facing is how to configure the Hibernate dialects when configuring the EntityManagerFactoryBuilder, as these are obviously different and depend on the underlying active data sources (which can differ between environments). The code you would use to configure the EntityManagerFactory if all data sources were the same would be as follows.
@Bean
public LocalContainerEntityManagerFactoryBean entityManagerFactory(EntityManagerFactoryBuilder builder) {
    return builder
            .dataSource(dataSource)
            .packages("<the associated entity package name>")
            .build();
}
But when I start the Spring Boot application, I get the error below:
Caused by: org.hibernate.HibernateException: Access to DialectResolutionInfo cannot be null when 'hibernate.dialect' not set
Does anyone know a workaround for this, or is it simply not possible to have the same JPA entities and CrudRepository instances spread across multiple data sources from different vendors?
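No definitive answer here, but one workaround for the bootstrap error itself is to stop Hibernate from resolving the dialect off a connection by setting it explicitly on the builder. That only really makes sense when each EntityManagerFactory sits in front of one vendor (or dialect-compatible ones), so truly mixed vendors usually end up as separate persistence units. A hedged sketch extending the snippet from the question; the dialect value is an assumption and must match your Hibernate version:

@Bean
public LocalContainerEntityManagerFactoryBean entityManagerFactory(EntityManagerFactoryBuilder builder) {
    Map<String, Object> jpaProperties = new HashMap<>();
    // Assumption: pin the dialect so Hibernate does not need DialectResolutionInfo at startup
    jpaProperties.put("hibernate.dialect", "org.hibernate.dialect.Oracle12cDialect");

    return builder
            .dataSource(dataSource)
            .packages("<the associated entity package name>")
            .properties(jpaProperties)
            .build();
}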

How to Configure a Spring Boot App with a Mongo Production Database

I am creating a Spring Boot app with MongoDB and scratching my head a bit over how to set up the production database configuration.
With a SQL-based database, I'd be used to setting up a data source bean like this:
@Bean
public DataSource getDataSource()
{
    DataSourceBuilder dataSourceBuilder = DataSourceBuilder.create();
    dataSourceBuilder.driverClassName("org.h2.Driver");
    dataSourceBuilder.url("jdbc:h2:file:C:/temp/test");
    dataSourceBuilder.username("sa");
    dataSourceBuilder.password("");
    return dataSourceBuilder.build();
}
However:
It doesn't seem to be needed - my local app connects to a spun-up MongoDB instance without any explicit configuration.
It doesn't seem to be standard with Mongo, according to [this post][1].
I figured I'd give it a go to see if it would automagically configure itself in production, but I'm getting a DataAccessResourceFailureException. For context: this is on Heroku with the mLab MongoDB add-on.
I have no problem getting the URL, and I can certainly put that in an environment variable, but I'm just not sure what I need to add to my app to configure it.
Set the values in the application.properties file like below:
spring.data.mongodb.database = ${SPRING_DATA_MONGODB_DATABASE}
spring.data.mongodb.host = ${SPRING_DATA_MONGODB_HOST}
spring.data.mongodb.port = ${SPRING_DATA_MONGODB_PORT}
You can also use the @Value annotation to access a property in whichever Spring bean you're using:
#Value("${userBucket.path}")
private String userBucketPath;
The Externalized Configuration section of the Spring Boot docs explains all the details you might need.
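Alternatively, since mLab on Heroku typically provisions a complete connection string rather than separate host/port/database values, a single URI property may be all you need. The MONGODB_URI variable name is an assumption; check what the add-on actually exports to your config vars:

spring.data.mongodb.uri = ${MONGODB_URI}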

Having problems understanding Spring Boot and MongoDB

I'm developing an application with Spring Boot. I already have a RestController and a RabbitMQ component that, depending on the message I receive, gets some data from a MongoDB database and does some logic.
I set up the database as:
MongoClient mongoClient = MongoClients.create("mongodb://localhost:27017");
MongoDatabase db = mongoClient.getDatabase("databaseName");
MongoCollection<Document> collection = db.getCollection("collectionName");
Since I'm using Spring Boot, I wanted to do this the Spring Boot way and access the database in every Spring component (the RestController and the RabbitMQ component).
I already understood that I have to put the settings in application.properties.
What I don't get is how I access the database afterwards.
Am I supposed to write a @Configuration class?
And how can I do, for example, collection.find(eq("id",userID)).first() everywhere?
Use Spring Data MongoDB (the spring-boot-starter-data-mongodb starter and its repository abstraction). You barely have to write any code.
Just follow this
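To make that concrete, a hedged sketch with Spring Data MongoDB (the document class, collection name and field are placeholders): declare a repository interface and inject it into the RestController or the RabbitMQ component like any other bean; the equivalent of collection.find(eq("id",userID)).first() then becomes a repository call.

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.repository.MongoRepository;
import org.springframework.stereotype.Service;

@Document(collection = "collectionName")
class UserDocument {
    @Id
    private String id;
    // other fields plus getters/setters omitted
}

// Spring Data generates the implementation at runtime
interface UserRepository extends MongoRepository<UserDocument, String> {
}

@Service
class UserLookupService {

    private final UserRepository userRepository;

    UserLookupService(UserRepository userRepository) {
        this.userRepository = userRepository;
    }

    // Roughly collection.find(eq("id", userID)).first()
    UserDocument findUser(String userId) {
        return userRepository.findById(userId).orElse(null);
    }
}

The connection details (host, port, database or a URI) go into application.properties as in the previous answer; no manual MongoClient setup is needed.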

Problem with EJB + POJO Helpers + EntityManager

I'm working with EJBs... I do the following, and I don't know why the injected EntityManager is not working as one might expect.
EJB1 calls a method on EJB2 that writes to the DB.
When EJB2 returns, EJB1 sends a message to an MDB.
The MDB calls EJB3, which reads the DB and does some work.
My problem is that the EntityManager injected in all three EJBs with @PersistenceContext is not working properly. Calling persist() in EJB2 is not reflected in the EntityManager injected in EJB3.
What might be wrong?
I hope I made my problem clear enough.
I'm now working with container-managed transactions.
"My problem is that the EntityManager injected in all 3 EJBs with @PersistenceContext is not working properly. Calling persist() in EJB2 is not being reflected on the EntityManager injected in EJB3."
In a Java EE environment, the common case is to use a transaction-scoped, container-managed entity manager. With such an entity manager, the persistence context propagates as the JTA transaction propagates.
In your case, I suspect you're using a REQUIRES_NEW transaction attribute for the method of EJB3. So:
when invoking EJB3#bar(), the container suspends the transaction started for EJB2#foo() and starts a new transaction;
when invoking the entity manager from EJB3#bar(), a new persistence context is created;
since the transaction started for EJB2#foo() has not yet committed, its changes aren't "visible" to the new persistence context.
PS: Are you really creating new threads? If so, a small reminder: that is forbidden by the EJB spec.
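To make the propagation point concrete, a hedged sketch of EJB3 (the bean, entity and method names are illustrative): with REQUIRED, the default, the method joins the caller's JTA transaction and therefore shares its persistence context; with REQUIRES_NEW it gets a fresh transaction and a fresh persistence context that cannot see EJB2's uncommitted changes.

import javax.ejb.Stateless;
import javax.ejb.TransactionAttribute;
import javax.ejb.TransactionAttributeType;
import javax.persistence.Entity;
import javax.persistence.EntityManager;
import javax.persistence.Id;
import javax.persistence.PersistenceContext;

@Entity
class MyEntity {
    @Id
    Long id;
}

@Stateless
public class Ejb3Bean {

    @PersistenceContext
    private EntityManager em;

    // REQUIRED joins the transaction propagated from the caller; switching this
    // to REQUIRES_NEW suspends it and creates a new persistence context.
    @TransactionAttribute(TransactionAttributeType.REQUIRED)
    public void bar(Long entityId) {
        MyEntity entity = em.find(MyEntity.class, entityId);
        // ... read and do some work with the entity
    }
}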