Spring Batch partitioning example - spring-batch

I am new to Spring Batch. I have a working Spring Batch example without remote capability. Though I understand the concepts of remote partitioning and remote chunking, I didn't find any example that explains them step by step. Kindly suggest links or blogs.
I read data from one table, process it, and write it into another table.

Related

Can we write the Spring Batch metadata to multiple datasources?

I have a use case where I am using Spring Batch and writing to 3 different data sources based on the job parameters. All of this mechanism is working absolutely fine, but the only problem is the metadata: Spring Batch uses the default DataSource to write the metadata (this wiring is sketched below). So whenever I run a job, the transactional data always goes to the correct DB, but the batch metadata always goes to the default DB.
Is it possible to selectively write the metadata to the respective databases based on the job parameters?
@MichaelMinella, @MahmoudBenHassine, can you please help?
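For context, a minimal sketch of how the job repository's DataSource is wired, assuming Spring Batch 4.x and a hypothetical metadataDataSource bean. The repository's DataSource is fixed when the application starts, which is why the metadata cannot follow job parameters out of the box:

```java
import javax.sql.DataSource;
import org.springframework.batch.core.configuration.annotation.DefaultBatchConfigurer;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class BatchMetadataConfig {

    // Spring Batch writes its metadata through the DataSource handed to the
    // BatchConfigurer. This is set once at startup for the whole application,
    // not chosen per job execution.
    @Bean
    public DefaultBatchConfigurer batchConfigurer(
            @Qualifier("metadataDataSource") DataSource metadataDataSource) {
        return new DefaultBatchConfigurer(metadataDataSource);
    }
}
```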

Spring Batch with MongoDB and transactions

I have a Spring Batch application with two databases: one SQL DB for the Spring Batch metadata, and a MongoDB where all the business data is stored. The relational DB still uses DataSourceTransactionManager.
However, I don't think the Mongo writes are done within an active transaction with rollbacks. Here is the excerpt from the official Spring Batch documentation on MongoItemWriter:
A ItemWriter implementation that writes to a MongoDB store using an implementation of Spring Data's MongoOperations. Since MongoDB is not a transactional store, a best effort is made to persist written data at the last moment, yet still honor job status contracts. No attempt to roll back is made if an error occurs during writing.
However, this is not the case anymore: MongoDB introduced ACID transactions in version 4.
How do I go about adding transactions to my writes? I could use @Transactional on my service methods when I use ItemWriterAdapter. But I still don't know what to do with MongoItemWriter... What is the right configuration here? Thank you.
I have a Spring Batch application with two databases: one SQL DB for the Spring Batch meta data, and another which is a MongoDB where all the business data is stored.
I invite you to take a look at the following posts to understand the implications of this design choice:
How to java-configure separate datasources for spring batch data and business data? Should I even do it?
How does Spring Batch transaction management work?
In your case, you have a distributed transaction across two data sources:
SQL datasource for the job repository, which is managed by a DataSourceTransactionManager
MongoDB for your step (using the MongoItemWriter), which is managed by a MongoTransactionManager
If you want technical meta-data and business data to be committed/rolled back in the scope of the same distributed transaction, you need to use a JtaTransactionManager that coordinates the DataSourceTransactionManager and MongoTransactionManager. You can find some resources about the matter here: https://stackoverflow.com/a/56547839/5019386.
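Short of a full JTA setup, the baseline wiring looks like this: the step runs its chunk transactions with the MongoTransactionManager while the job repository keeps its DataSourceTransactionManager. A sketch assuming Spring Batch 4.x and a recent Spring Data MongoDB; the MyDocument type and the reader/writer beans are hypothetical. Note this gives two separate transactions, not one distributed transaction:

```java
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.ItemWriter;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.MongoDatabaseFactory;
import org.springframework.data.mongodb.MongoTransactionManager;

@Configuration
public class MongoStepConfig {

    // Requires MongoDB 4+ running as a replica set, otherwise transactions
    // fail at runtime.
    @Bean
    public MongoTransactionManager mongoTransactionManager(MongoDatabaseFactory factory) {
        return new MongoTransactionManager(factory);
    }

    // The step's chunk transactions run against MongoDB; the job repository
    // keeps its own SQL DataSourceTransactionManager.
    @Bean
    public Step mongoStep(StepBuilderFactory steps,
                          MongoTransactionManager mongoTxManager,
                          ItemReader<MyDocument> reader,
                          ItemWriter<MyDocument> writer) {
        return steps.get("mongoStep")
                .<MyDocument, MyDocument>chunk(100)
                .reader(reader)
                .writer(writer)
                .transactionManager(mongoTxManager)
                .build();
    }

    // Hypothetical domain type for illustration.
    public static class MyDocument { }
}
```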
BTW, there is a feature request to use MongoDB as a job repository in Spring Batch: https://github.com/spring-projects/spring-batch/issues/877. When this is implemented, you could store both business data and technical meta-data in the same datasource (so no need for a distributed transaction anymore) and you would be able to use the same MongoTransactionManager for both the job repository and your step.

Applying Drools rules using Spring Batch

We have a scenario where I have to get data from one database and update data in another database after applying business rules.
I want to use Spring Batch + Drools + Hibernate.
Can we apply rules in batch, given that we have a million records at a time?
I am not an expert on Drools; I am simply trying to give some context about Spring Batch.
Spring Batch is a Read -> Process -> Write framework, and what we do with Drools is the same as what we do in the Process step of Spring Batch, i.e. we transform a read item in an ItemProcessor.
The way Spring Batch helps you handle a large number of items is chunk-oriented processing: we read N items in one go, transform these items one by one in the processor, and then write a chunk of items in the writer - this way we are basically reducing the number of DB calls.
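A minimal sketch of such a chunk-oriented step, assuming Spring Batch 4.x Java configuration; the InputRecord type and the reader/processor/writer beans are hypothetical placeholders:

```java
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.ItemWriter;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class ChunkStepConfig {

    // Reads 500 items, processes them one by one, then hands all 500 to the
    // writer in a single call - one bulk write (and one commit) per chunk.
    @Bean
    public Step etlStep(StepBuilderFactory steps,
                        ItemReader<InputRecord> reader,
                        ItemProcessor<InputRecord, InputRecord> processor,
                        ItemWriter<InputRecord> writer) {
        return steps.get("etlStep")
                .<InputRecord, InputRecord>chunk(500)
                .reader(reader)
                .processor(processor)
                .writer(writer)
                .build();
    }

    // Hypothetical domain type for illustration.
    public static class InputRecord { }
}
```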
There is further scope for performance improvement by implementing parallelism via partitioning, etc., if your data can be partitioned on some criterion; a sketch of local partitioning follows.
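A sketch of local partitioning on an id range, assuming Spring Batch 4.x; the worker step and the id bounds are hypothetical, and each worker's reader would pick up its fromId/toId from the step execution context:

```java
import java.util.HashMap;
import java.util.Map;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.partition.support.Partitioner;
import org.springframework.batch.item.ExecutionContext;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.task.SimpleAsyncTaskExecutor;

@Configuration
public class PartitioningConfig {

    // Splits an id range into gridSize sub-ranges, one per partition.
    public static class IdRangePartitioner implements Partitioner {
        private final long minId;
        private final long maxId;

        public IdRangePartitioner(long minId, long maxId) {
            this.minId = minId;
            this.maxId = maxId;
        }

        @Override
        public Map<String, ExecutionContext> partition(int gridSize) {
            Map<String, ExecutionContext> partitions = new HashMap<>();
            long rangeSize = (maxId - minId) / gridSize + 1;
            for (int i = 0; i < gridSize; i++) {
                ExecutionContext context = new ExecutionContext();
                context.putLong("fromId", minId + i * rangeSize);
                context.putLong("toId", Math.min(minId + (i + 1) * rangeSize - 1, maxId));
                partitions.put("partition" + i, context);
            }
            return partitions;
        }
    }

    // The manager step fans the worker step out across 4 local threads.
    @Bean
    public Step managerStep(StepBuilderFactory steps, Step workerStep) {
        return steps.get("managerStep")
                .partitioner("workerStep", new IdRangePartitioner(1, 1_000_000))
                .step(workerStep)
                .gridSize(4)
                .taskExecutor(new SimpleAsyncTaskExecutor())
                .build();
    }
}
```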
So we read items in bulk, transform them one by one, and then write in bulk to the target database. I don't think Hibernate is a good tool for bulk update/insert at the write step - I would go with plain JDBC.
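One way to do that within Spring Batch is the framework's JdbcBatchItemWriter, which sends each chunk as a single batched statement. A sketch with a hypothetical trades table and Trade type:

```java
import javax.sql.DataSource;
import org.springframework.batch.item.database.JdbcBatchItemWriter;
import org.springframework.batch.item.database.builder.JdbcBatchItemWriterBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class JdbcWriterConfig {

    // Each chunk becomes one JDBC batch; named parameters are bound from the
    // item's getters via beanMapped().
    @Bean
    public JdbcBatchItemWriter<Trade> tradeWriter(DataSource dataSource) {
        return new JdbcBatchItemWriterBuilder<Trade>()
                .dataSource(dataSource)
                .sql("UPDATE trades SET status = :status WHERE id = :id")
                .beanMapped()
                .build();
    }

    // Hypothetical domain type; beanMapped() needs matching getters.
    public static class Trade {
        private long id;
        private String status;
        public long getId() { return id; }
        public String getStatus() { return status; }
    }
}
```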
Your Drools code comes into the picture at the transformation step, and that is going to be your custom code; its performance will have nothing to do with Spring Batch, i.e. how you initialize sessions, pre-compile rules, etc. You will have to plug in this code in such a way that you don't initialize the Drools session every time - that should be a one-time activity.
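A sketch of that one-time initialization in an ItemProcessor, assuming the Drools KIE API (org.kie.api) with a default stateless ksession defined in kmodule.xml; the Trade type is hypothetical. The rule base is compiled once when the bean is created, and only a cheap stateless session is opened per item:

```java
import org.kie.api.KieServices;
import org.kie.api.runtime.KieContainer;
import org.kie.api.runtime.StatelessKieSession;
import org.springframework.batch.item.ItemProcessor;

public class DroolsItemProcessor
        implements ItemProcessor<DroolsItemProcessor.Trade, DroolsItemProcessor.Trade> {

    private final KieContainer kieContainer;

    public DroolsItemProcessor() {
        // Loads and compiles the rules from the classpath once - the expensive part.
        this.kieContainer = KieServices.Factory.get().getKieClasspathContainer();
    }

    @Override
    public Trade process(Trade item) {
        // Stateless sessions are cheap; the precompiled rules fire against the item.
        StatelessKieSession session = kieContainer.newStatelessKieSession();
        session.execute(item); // rules may mutate the item in place
        return item;
    }

    // Hypothetical domain type for illustration.
    public static class Trade { }
}
```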

Slick 3 batch update

My requirement is to do a batch update to a table. I was able to do a batch insert but could not find a way to do a batch update in Slick 3. Any sample or link to a document would be very helpful. I tried searching the web but could not find a solution. I am using Slick 3 on PostgreSQL.
Why don't you just add whatever data you need to a Seq and run the query on it? That will do the batching for you. Link here. This applies to all operations, not only inserts.
Look at this SO question for falling back on standard JDBC to do the update batching; a sketch follows below.
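For reference, this is what plain-JDBC update batching looks like (shown in Java; the connection details, table, and values are hypothetical): one PreparedStatement, many addBatch() calls, one executeBatch() round trip.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class JdbcBatchUpdateExample {
    public static void main(String[] args) throws SQLException {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/mydb", "user", "password")) {
            conn.setAutoCommit(false);
            String sql = "UPDATE accounts SET balance = ? WHERE id = ?";
            try (PreparedStatement ps = conn.prepareStatement(sql)) {
                for (long id = 1; id <= 1000; id++) {
                    ps.setDouble(1, 100.0);
                    ps.setLong(2, id);
                    ps.addBatch();                // queue the statement
                }
                int[] counts = ps.executeBatch(); // single batched round trip
                System.out.println("updated rows: " + counts.length);
            }
            conn.commit();
        }
    }
}
```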
Also, there are some items the team is working on. See here.

How to chain two readers in Spring Batch using Java configuration

This is a Spring Batch problem.
I would like to read some information from a CSV, then use that to read from two different tables in a database, then perform an update on those rows. I have a reader that reads from a CSV, and I can write to two tables by making a composite writer.
I would prefer a solution that uses Java configuration (it's too bad so many examples on the Web use XML configuration and haven't been updated to Java configuration).
The more sample code you can provide, the better; in particular, if I had to use a listener or a processor, how would I perform the query and get the result?
What you're really looking for isn't chaining of readers but using an ItemProcessor to enrich the data that was read in from the CSV. I'd expect your step to be something along the lines of a FlatFileItemReader for the reader, your own custom ItemProcessor that enriches the object provided by the reader, and then (as you mentioned) a CompositeItemWriter that delegates the writes to the appropriate other writers; a sketch of the enriching processor follows.
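A sketch of such an enriching ItemProcessor, assuming a JdbcTemplate for the lookup; the CsvRecord type, table, and column names are hypothetical:

```java
import org.springframework.batch.item.ItemProcessor;
import org.springframework.jdbc.core.JdbcTemplate;

public class EnrichingItemProcessor
        implements ItemProcessor<EnrichingItemProcessor.CsvRecord, EnrichingItemProcessor.CsvRecord> {

    private final JdbcTemplate jdbcTemplate;

    public EnrichingItemProcessor(JdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = jdbcTemplate;
    }

    @Override
    public CsvRecord process(CsvRecord item) {
        // One lookup per item; the enriched item flows on to the composite writer.
        // Consider caching if the lookup data is small.
        String status = jdbcTemplate.queryForObject(
                "SELECT status FROM customers WHERE id = ?",
                String.class, item.customerId);
        item.status = status;
        return item;
    }

    // Hypothetical record read from the CSV.
    public static class CsvRecord {
        public long customerId;
        public String status;
    }
}
```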