Spring Data JPA | Dynamic runtime multiple database connection - spring-data-jpa

Use Case:
During JBoss server startup, one permanent database connection is already established using a Spring Data JPA configuration (XML-based approach).
Now that the application is up and running, the requirement is to connect to multiple databases whose connection strings are dynamic and only known at runtime.
How can this be achieved with Spring Data JPA?

One way to switch out your data source is to define a "runtime" repository that is configured with the "runtime" data source. But this will make client code aware of the different repos:
package com...runtime.repository;

public interface RuntimeRepo extends JpaRepository<OBJECT, ID> { ... }

@Configuration
@EnableJpaRepositories(
        transactionManagerRef = "runtimeTransactionManager",
        entityManagerFactoryRef = "runtimeEmfBean")
@EnableTransactionManagement
public class RuntimeDatabaseConfig {

    @Bean
    public DataSource runtimeDataSource() {
        DriverManagerDataSource rds = new DriverManagerDataSource();
        // set driver class name, username, password, url
        return rds;
    }

    @Bean
    public LocalContainerEntityManagerFactoryBean runtimeEmfBean() {
        LocalContainerEntityManagerFactoryBean factoryBean = new LocalContainerEntityManagerFactoryBean();
        factoryBean.setDataSource(runtimeDataSource());
        // set JpaVendorAdapter, jpaProperties
        return factoryBean;
    }

    @Bean
    public PlatformTransactionManager runtimeTransactionManager() {
        JpaTransactionManager jtm = new JpaTransactionManager();
        jtm.setEntityManagerFactory(runtimeEmfBean().getObject());
        return jtm;
    }
}
I have combined the code to save space; you would define the Java config and the repository interface in separate files, but within the same package.
To make client code agnostic of the repository type, implement your own repository factory, autowire that factory into the client code, and have it check the application state before returning the particular repository implementation; a rough sketch follows.
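A minimal sketch of such a factory, assuming a second PermanentRepo bound to the startup data source and a hypothetical isRuntimeMode() check for the application state (both names are illustrative, not from the original):

@Component
public class RepoFactory {

    @Autowired
    private RuntimeRepo runtimeRepo;        // repo bound to the runtime data source

    @Autowired
    private PermanentRepo permanentRepo;    // hypothetical repo bound to the startup data source

    public JpaRepository<OBJECT, ID> currentRepo() {
        // Hypothetical application-state check; replace with your own runtime condition.
        return isRuntimeMode() ? runtimeRepo : permanentRepo;
    }

    private boolean isRuntimeMode() {
        return true; // placeholder
    }
}

Client code then depends only on RepoFactory and never on a concrete repository interface.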

Related

Spring Boot ignores the MongoDB Atlas URI, trying to connect to hosts=[127.0.0.1:27017]

I have been working on an application with Spring WebFlux and reactive MongoDB. There I used MongoDB Atlas as the database and it worked fine.
Recently I had to introduce Mongo custom conversions to handle ZonedDateTime objects.
@Configuration
public class MongoReactiveConfiguration extends AbstractReactiveMongoConfiguration {

    @Override
    public MongoCustomConversions customConversions() {
        ZonedDateTimeReadConverter zonedDateTimeReadConverter = new ZonedDateTimeReadConverter();
        ZonedDateTimeWriteConverter zonedDateTimeWriteConverter = new ZonedDateTimeWriteConverter();
        List<Converter<?, ?>> converterList = new ArrayList<>();
        converterList.add(zonedDateTimeReadConverter);
        converterList.add(zonedDateTimeWriteConverter);
        return new MongoCustomConversions(converterList);
    }

    @Override
    protected String getDatabaseName() {
        return "stlDB";
    }
}
However, now I can no longer connect to MongoDB Atlas: the application ignores the property spring.data.mongodb.uri and tries to connect to a local server with the default configuration.
I tried
@EnableAutoConfiguration(exclude = {MongoReactiveAutoConfiguration.class})
but then it ignored the above conversions as well. Are there any other configurations to override in AbstractReactiveMongoConfiguration so it stops using the default server IP and port?
I had the same issue and could not find a solution other than configuring the converters differently, without extending AbstractReactiveMongoConfiguration:
@Configuration
public class MongoAlternativeConfiguration {

    @Bean
    public MongoCustomConversions mongoCustomConversions() {
        return new MongoCustomConversions(
                Arrays.asList(
                        new ZonedDateTimeReadConverter(),
                        new ZonedDateTimeWriteConverter()));
    }
}

Use multiple mongo DBs in same application for same model & same Repository

I need to implement a Spring Boot - MongoDB application where there are two Mongo DBs that have exactly the same database name and collections. Based on the user making the request, I need to choose whether to fetch data from DB1 or DB2 (the only difference in the Mongo URI is the host IP).
E.g. I need some way to create two MongoTemplates, like mTempA and mTempB, in my repository and, based on some condition, use either template to execute the query, as below:
@Repository
public class MyCustomRepository {

    private Logger logger = LoggerFactory.getLogger(MyCustomRepository.class);

    @Autowired
    private MongoTemplateA mongoTemplateA; // Need to know if this is possible & how

    @Autowired
    private MongoTemplateB mongoTemplateB; // Need to know if this is possible & how

    public List<MyModel> findByCriteria(MyRequest request) {
        List<MyModel> result;
        // Query query = <build query based on request>
        if (request.getUserType().equals("A")) {
            result = mongoTemplateA.find(query, MyModel.class);
        } else {
            result = mongoTemplateB.find(query, MyModel.class);
        }
        logger.debug("Result fetched with {} records", result.size());
        return result;
    }
}
I don't want two separate repositories (classes or interfaces) or different models. I just want two different MongoTemplates injected into a single repository.
Is this possible? If yes, please give some example code.
I have followed below tutorial:
https://dzone.com/articles/multiple-mongodb-connectors-with-spring-boot
As rightly pointed out by @Lucia, below is how it can be done:
Have two different configuration classes:
@Configuration
@EnableMongoRepositories(basePackages = "com.snk.repository", mongoTemplateRef = "mongoTemplateA")
public class MongoConfigA {
    // Configuration class for DB 1 access
}

@Configuration
@EnableMongoRepositories(basePackages = "com.snk.repository", mongoTemplateRef = "mongoTemplateB")
public class MongoConfigB {
    // Configuration class for DB 2 access
}
Add a class that reads the custom MongoDB properties from application.properties:
@ConfigurationProperties(prefix = "mongodb")
public class MultipleMongoProperties {

    private MongoProperties adb = new MongoProperties();
    private MongoProperties bdb = new MongoProperties();

    public MongoProperties getAdb() {
        return adb;
    }

    public MongoProperties getBdb() {
        return bdb;
    }
}
Add a configuration class to create mongoTemplates:
@Configuration
@EnableConfigurationProperties(MultipleMongoProperties.class)
public class MultipleMongoConfig {

    @Autowired
    private MultipleMongoProperties mongoProperties;

    @Bean(name = "mongoTemplateA")
    @Primary
    public MongoTemplate mongoTemplateA() {
        return new MongoTemplate(aDbFactory(this.mongoProperties.getAdb()));
    }

    @Bean(name = "mongoTemplateB")
    public MongoTemplate mongoTemplateB() {
        return new MongoTemplate(bDbFactory(this.mongoProperties.getBdb()));
    }

    @Bean
    @Primary
    public MongoDbFactory aDbFactory(final MongoProperties mongo) {
        return new SimpleMongoDbFactory(new MongoClientURI(mongo.getUri()));
    }

    @Bean
    public MongoDbFactory bDbFactory(final MongoProperties mongo) {
        return new SimpleMongoDbFactory(new MongoClientURI(mongo.getUri()));
    }
}
Add the declarations below to your service/repository:
@Autowired
@Qualifier("mongoTemplateA")
private MongoTemplate mongoTemplateA;

@Autowired
@Qualifier("mongoTemplateB")
private MongoTemplate mongoTemplateB;
Add the properties below to your application.properties:
mongodb.adb.uri=mongodb://user:pass@myhost1:27017/adb
mongodb.bdb.uri=mongodb://user:pass@myhost2:27017/bdb
If you have a Mongo replica set, the URIs can be set as:
mongodb.adb.uri=mongodb://user:pass@myhost1,myhost2,myhost3/adb?replicaSet=rsName
mongodb.bdb.uri=mongodb://user:pass@myhost1,myhost2,myhost3/bdb?replicaSet=rsName
Based on your logic, use either of the templates.
There are, though, a few catches:
Notice the @Primary annotation: one bean needs to be marked as primary. I haven't found a solution that works with no template marked as primary.
If either Mongo DB is down when the application is started/restarted, the application will not start/deploy. To avoid this, change @Autowired to @Autowired(required = false), as in the sketch after this list.
If either Mongo DB goes down while the application is already running, the application automatically uses the other one. So even if you want to use DB A, if it is down, requests are processed with DB B, and vice versa.
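A minimal sketch of that optional injection, reworking the repository from the question (the query parameter and the fallback logic are illustrative, not from the original answer):

@Repository
public class MyCustomRepository {

    @Autowired(required = false)
    @Qualifier("mongoTemplateA")
    private MongoTemplate mongoTemplateA;

    @Autowired(required = false)
    @Qualifier("mongoTemplateB")
    private MongoTemplate mongoTemplateB;

    public List<MyModel> findByCriteria(MyRequest request, Query query) {
        // Pick the preferred template, then fall back to whichever one was actually injected.
        MongoTemplate template = "A".equals(request.getUserType()) ? mongoTemplateA : mongoTemplateB;
        if (template == null) {
            template = (mongoTemplateA != null) ? mongoTemplateA : mongoTemplateB;
        }
        return template.find(query, MyModel.class);
    }
}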

Running Spring Batch test doesn't initialize database

Trying to create some end-to-end tests for a Spring Batch application, which works great on its own. I get an SQL error because the Spring Batch metadata tables are not being initialized: org.postgresql.util.PSQLException: ERROR: relation "batch_job_instance" does not exist
I have this in src/test/resources/application.properties:
spring.datasource.initialize=true
spring.datasource.initialization-mode=always
spring.datasource.platform=postgresql
spring.batch.initialize-schema=always
Which is the same as what I have in src/main/resources/application.properties, where it works.
This is the code I have for ApplicationTest:
@RunWith(SpringRunner.class)
@ContextConfiguration(classes = {
        TestConfiguration.class,
        JobCompletionNotificationListener.class,
        BatchConfiguration.class
})
@SpringBatchTest
public class ApplicationTests {

    @Autowired
    private JobLauncherTestUtils jobLauncherTestUtils;

    @Test
    public void testJob() throws Exception {
        JobExecution jobExecution = jobLauncherTestUtils.launchJob();
    }
}
I have a specific TestConfiguration to provide the DataSource bean.
@Configuration
@PropertySource("application.properties")
public class TestConfiguration {

    @Autowired
    private Environment env;

    @Bean
    public DataSource dataSource() {
        DriverManagerDataSource dataSource = new DriverManagerDataSource();
        dataSource.setDriverClassName(env.getProperty("spring.datasource.driverClassname"));
        dataSource.setUrl(env.getProperty("spring.datasource.url"));
        dataSource.setUsername(env.getProperty("spring.datasource.username"));
        dataSource.setPassword(env.getProperty("spring.datasource.password"));
        return dataSource;
    }
}
I was expecting all tables to be created (the internal Batch tables and the tables defined in schema-all.sql), but I get the following error: org.postgresql.util.PSQLException: ERROR: relation "batch_job_instance" does not exist.
I don't understand why everything works automagically in the main application but not in the test.
If a Spring test is missing the BatchDataSourceInitializer that Spring Boot auto-configures in the actual application, and you don't want to write a full @SpringBootTest, you can selectively add the Spring Batch auto-configuration with the annotation
@ImportAutoConfiguration(BatchAutoConfiguration.class)
This will then provide the initializer for the injected DataSource.
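Applied to the test class from the question, that would look roughly like this (a sketch; BatchAutoConfiguration comes from spring-boot-autoconfigure, everything else is unchanged from the question):

@RunWith(SpringRunner.class)
@ContextConfiguration(classes = {
        TestConfiguration.class,
        JobCompletionNotificationListener.class,
        BatchConfiguration.class
})
@ImportAutoConfiguration(BatchAutoConfiguration.class) // adds Batch auto-configuration, including the schema initializer
@SpringBatchTest
public class ApplicationTests {

    @Autowired
    private JobLauncherTestUtils jobLauncherTestUtils;

    @Test
    public void testJob() throws Exception {
        JobExecution jobExecution = jobLauncherTestUtils.launchJob();
    }
}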

How do I properly configure my Spring Boot application?

Using MongoDB with a Spring Data MongoDB backend, and Mongo repositories too.
This is my current configuration:
/** MONGO CLIENT *****************************************************/

@Override
protected String getDatabaseName() {
    return db;
}

@Override
public Mongo mongo() throws Exception {
    /* Too lazy to automate this, so I just switch it manually */
    return new Fongo("meh").getMongo(); // Using it for unit tests
    // return new MongoClient(url, port); // Using it for IT
}

@Override
protected Collection<String> getMappingBasePackages() {
    return Arrays.asList("com.foo");
}

/** BEANS ************************************************************/

@Bean
public Jackson2RepositoryPopulatorFactoryBean repositoryPopulator() {
    Resource foo1 = new ClassPathResource("collections/foo1.json");
    Resource foo2 = new ClassPathResource("collections/foo2.json");
    Jackson2RepositoryPopulatorFactoryBean factory = new Jackson2RepositoryPopulatorFactoryBean();
    factory.setResources(new Resource[] { foo1, foo2 });
    return factory;
}
The repository populator is what I added, and it is what gives me trouble.
When I compile and test my project I get a DuplicateKeyException, because, I guess, the repository populator triggers more than once.
These are the annotations that I use on my test classes:
@RunWith(SpringRunner.class)
@SpringBootTest
@AutoConfigureMockMvc
Is my application configured correctly? What is a reasonable solution to stop the repository populator from triggering multiple times?
Solution based on this guide (in Spanish, sorry): https://www.paradigmadigital.com/dev/tests-integrados-spring-boot-fongo
You need to separate the Fongo configuration from the Mongo one.
The Fongo configuration must be placed under test/.
Just take the example code from the guide as a base (using its MongoConfiguration.java too; my current config is wrong) and you will be fine.
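As a rough illustration of that separation (the class and database names are illustrative; the Fongo call is the one from the configuration above):

// Placed under src/test/java so it is only picked up by tests.
@Configuration
public class FongoTestConfiguration extends AbstractMongoConfiguration {

    @Override
    protected String getDatabaseName() {
        return "test-db"; // illustrative database name
    }

    @Override
    public Mongo mongo() throws Exception {
        // In-memory Fongo instance for unit tests, as in the original configuration.
        return new Fongo("meh").getMongo();
    }
}

The real MongoClient configuration then lives only under src/main/java.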

How to properly create Spring Cloud Task with custom parameters?

According to the samples here (specifically, the timestamp task), I have implemented a small task class:
@SpringBootApplication
@EnableTask
@EnableConfigurationProperties({ RestProcessorTaskProperties.class })
public class RestProcessorTaskApplication {

    public static void main(String[] args) {
        SpringApplication.run(RestProcessorTaskApplication.class, args);
    }

    @Autowired
    private RestProcessorTaskProperties config;

    // some fields and beans

    @Bean
    public CommandLineRunner run(RestTemplate restTemplate) {
        return args -> {
            // doing some stuff
        };
    }
}
and then I created the Properties class (in the same package):
@ConfigurationProperties("RestProcessor")
public class RestProcessorTaskProperties {

    private String host = "http://myhost:port";

    public String getHost() {
        return host;
    }

    public void setHost(String host) {
        this.host = host;
    }
}
But after I registered the task on my local Spring Cloud Data Flow server, I see numerous parameters that, I suppose, were added automatically. I mean parameters like:
abandon-when-percentage-full java.lang.Integer
abandoned-usage-tracking java.lang.Boolean
acceptors java.lang.Integer
access-to-underlying-connection-allowed java.lang.Boolean
and others...
Is it possible to somehow hide (or remove) them, so that when launching the task I can configure only the parameters that I added myself (the single host property in my example above)?
By default Spring Cloud Data Flow will show you all the available properties of a Boot application. However, you can create a whitelist of the properties that you wish to show.
Here is a link to the Spring Cloud Data Flow reference documentation that discusses how to do this: http://docs.spring.io/spring-cloud-dataflow/docs/current/reference/htmlsingle/#spring-cloud-dataflow-stream-app-whitelisting.
And here is link to the timestamp starter app that has an example of this: https://github.com/spring-cloud/spring-cloud-task-app-starters/tree/master/spring-cloud-starter-task-timestamp
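For reference, in the app starters of that era the whitelist is a properties file under META-INF listing the @ConfigurationProperties classes to expose. The file name and key below are from memory and may differ by release, and the package is illustrative, so verify against the linked docs and the timestamp starter:

# src/main/resources/META-INF/spring-configuration-metadata-whitelist.properties
configuration-properties.classes=com.example.RestProcessorTaskProperties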