Reader and writer - do they happen in parallel? - spring-batch

In spring batch, let's say we create a step using the following
stepBuilders.get("evalStep")
.<List<Long>, List<Long>>chunk(1)
.reader(reader())
.writer(writer())
.build();
When the reader is producing items, does the writer start processing them right away? Or does it wait for the reader to be done in its entirety first?
If it is one-after-the-other, is there some way to set it up so that they run in parallel?
Here is a diagram showing the desired solution: the reader and writer run in parallel.

The writer does not start until the reader has finished reading a chunk of data. This is explained in the documentation with sequence diagrams and code samples here: Chunk-oriented Processing.
If it is one-after-the-other, is there some way to set it up so that they run in parallel?
You can use a multi-threaded step to process chunks concurrently within the same step, or a partitioned step to process partitions in parallel. For more details about this, take a look at the documentation here: Scaling and Parallel Processing.
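For the multi-threaded step option, here is a minimal sketch, assuming Spring Batch 4.x builders and that the reader() and writer() beans from the question are safe to call from multiple threads (the chunk size, thread name prefix, and throttle limit are illustrative):

import java.util.List;

import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.context.annotation.Bean;
import org.springframework.core.task.SimpleAsyncTaskExecutor;

// Sketch: the same chunk-oriented step, but chunks are processed on multiple threads.
@Bean
public Step evalStep(StepBuilderFactory stepBuilders) {
    return stepBuilders.get("evalStep")
            .<List<Long>, List<Long>>chunk(10)
            .reader(reader())
            .writer(writer())
            .taskExecutor(new SimpleAsyncTaskExecutor("eval-"))
            .throttleLimit(4) // cap the number of concurrent chunk-processing threads
            .build();
}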

Related

How to process a List of items in Spring Batch using chunk-based processing | Bulk processing items in a chunk

I am trying to implement a Spring Batch job where processing a record requires 2-3 DB calls, which slows down the processing of records (the data set is 1 million records). If I go with chunk-based processing, it would process each record separately and performance would suffer. So I need to process 1000 records in one go as a bulk operation, which would reduce the DB calls and improve performance. But my question is: if I implement a Tasklet, I would lose restartability and the retry/skip features, and if I implement it using an AggregateInputReader, I am not sure what the impact would be on restartability and transaction handling.
As per the thread below, an AggregateReader should work, but I am not sure about its impact on transaction handling and restartability in case of failure:
Spring batch: processing multiple records at once
The first extension point in the chunk-oriented processing model that gives you access to the list of items to be written is the ItemWriteListener#beforeWrite(List items). So if you do not want to enrich items one at a time in an ItemProcessor, you can use that listener to do the enrichment for the entire chunk at once.
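As a rough sketch of that listener-based enrichment, assuming Spring Batch 4.x signatures (beforeWrite receives a List; in Spring Batch 5 it receives a Chunk instead), and with Order and EnrichmentDao as hypothetical placeholders for your own item type and bulk DB access:

import java.util.List;

import org.springframework.batch.core.ItemWriteListener;

public class OrderEnrichmentListener implements ItemWriteListener<Order> {

    private final EnrichmentDao enrichmentDao; // hypothetical bulk-lookup DAO

    public OrderEnrichmentListener(EnrichmentDao enrichmentDao) {
        this.enrichmentDao = enrichmentDao;
    }

    @Override
    public void beforeWrite(List<? extends Order> items) {
        // One bulk DB call for the whole chunk instead of 2-3 calls per item
        enrichmentDao.enrich(items);
    }

    @Override
    public void afterWrite(List<? extends Order> items) {
        // no-op
    }

    @Override
    public void onWriteError(Exception exception, List<? extends Order> items) {
        // no-op
    }
}

You can then register it on the step via the step builder's listener(...) method, so the enrichment runs once per chunk inside the chunk transaction.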

Spring batch job is thread-safe?

I need to parallelize a single step of a Spring Batch job. Before the step to be parallelized, tasklets run that put some results into the job's parameters.
The results produced by the tasklets are necessary to execute the Partitioner and the item components of the step to be parallelized.
A doubt has arisen that I really can't resolve. Since I can have the same job running simultaneously multiple times with different initial parameters, are the tasklets and the step's item components thread-safe?
No, tasklets and chunk-oriented step components are not thread-safe. If they are shared between multiple job instances/executions running concurrently, you need to make them thread-safe.
You can achieve this by using job-scoped steps and step-scoped readers/writers. You can also use the SynchronizedItemStreamReader and the (upcoming) SynchronizedItemStreamWriter to make readers and writers thread-safe. All item readers and writers provided by Spring Batch state in their Javadoc whether they are thread-safe.
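For example, a minimal sketch of wrapping a non-thread-safe reader in a SynchronizedItemStreamReader (the MyRecord type and the delegate bean are illustrative; any ItemStreamReader can be the delegate):

import org.springframework.batch.item.file.FlatFileItemReader;
import org.springframework.batch.item.support.SynchronizedItemStreamReader;
import org.springframework.context.annotation.Bean;

// Sketch: synchronize access to a non-thread-safe delegate so it can safely be
// shared by concurrent threads. MyRecord is a hypothetical item type.
@Bean
public SynchronizedItemStreamReader<MyRecord> synchronizedReader(
        FlatFileItemReader<MyRecord> delegateReader) {
    SynchronizedItemStreamReader<MyRecord> reader = new SynchronizedItemStreamReader<>();
    reader.setDelegate(delegateReader);
    return reader;
}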
You do not want to run multiple instances of the same job. It would be better to run multiple tasks or processes within the same step and/or job. You might want to look up job partitioning and/or remote chunking to do concurrent processing.
If it has to be isolated jobs, then you might have your concurrent jobs write out to, say, a message queue as their final (writer) step, and then have another job listen to and read from that queue.
https://docs.spring.io/spring-batch/2.1.x/cases/parallel.html

Chunk-oriented Processing without #StepScope

I have some confusion regarding @StepScope in chunk-oriented processing:
I have, let's say, 2 million records to read. So I want to run my Spring Batch application in chunks. Let's say 2000 items to read, process, and write, then go and read the 2001st through the 4000th item, process, write, etc.
The question is: if I don't use @StepScope, will the batch know that it has to read the 2001st item and not re-read what it has already read?
Yes, even without @StepScope the reader will read the next chunk and not re-read the same chunk again.
The step scope is actually required in order to use late binding of attributes from the job/step execution context. More details on this here: https://docs.spring.io/spring-batch/4.0.x/reference/html/step.html#late-binding
So if your reader does not need to access job parameters or attributes from the job/step execution context, it does not need to be step-scoped and will still read data chunk by chunk. In short, there is no relation between step scope and chunk-oriented processing.
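To illustrate the one case where step scope is needed, here is a rough sketch of late binding of a job parameter into the reader, assuming the Spring Batch 4.x API; the Person type and the inputFile parameter name are illustrative:

import org.springframework.batch.core.configuration.annotation.StepScope;
import org.springframework.batch.item.file.FlatFileItemReader;
import org.springframework.batch.item.file.builder.FlatFileItemReaderBuilder;
import org.springframework.batch.item.file.mapping.BeanWrapperFieldSetMapper;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.core.io.FileSystemResource;

// Sketch: the reader is step-scoped only because a job parameter is bound into
// it at runtime. Without such late binding, a plain singleton reader still
// reads chunk by chunk exactly the same way.
@Bean
@StepScope
public FlatFileItemReader<Person> reader(
        @Value("#{jobParameters['inputFile']}") String inputFile) {
    BeanWrapperFieldSetMapper<Person> fieldSetMapper = new BeanWrapperFieldSetMapper<>();
    fieldSetMapper.setTargetType(Person.class);
    return new FlatFileItemReaderBuilder<Person>()
            .name("personReader")
            .resource(new FileSystemResource(inputFile))
            .delimited()
            .names(new String[] {"firstName", "lastName"})
            .fieldSetMapper(fieldSetMapper)
            .build();
}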

Spring batch single threaded reader and multi threaded writer

Tried to find whether this was asked before but couldn't.
Here is the problem. The following has to be achieved via Spring Batch:
There is one file to be read and processed. The item reader is not thread-safe.
The plan is to have multi-threaded homogeneous processors and multi-threaded homogeneous writers ingest items read by a single-threaded reader.
Kind of like below:
           +----------> Processor #1 ----------> Writer #1
           |
Reader ----+----------> Processor #2 ----------> Writer #2
           |
           +----------> Processor #3 ----------> Writer #3
Tried AsyncItemProcessor and AsyncItemWriter, but holding a breakpoint on the processor resulted in the reader not executing until the breakpoint was released, i.e. single-threaded processing.
Task executor was tried like below:
<tasklet task-executor="taskExecutor" throttle-limit="20">
Multiple threads on the reader were launched.
Synchronising the reader also didn't work.
I tried to read about the partitioner, but it seemed complex.
Is there an annotation to mark the reader as single-threaded? Would pushing the read data into a global context be a good idea?
Please guide me towards a solution.
I guess nothing is built into the Spring Batch API for the pattern you are looking for. Some coding on your part would be needed to achieve it.
The ItemWriter.write method already receives a List of processed items, sized by your chunk size, so you can divide that list across as many threads as you like: spawn your own threads and pass a segment of the list to each thread to write (a rough sketch follows below).
The problem is with ItemProcessor.process(), as it processes items one by one, so you are limited to a single item and there isn't much threading you can do for a single item.
So the challenge is to write your own reader that can hand over a list of items to the processor instead of a single item, so you can process those items in parallel, and the writer then works on a list of lists.
In all of this setup, you have to remember that the threads you spawn are outside Spring Batch's read-process-write transaction boundary, so you will have to take care of that on your own: merging the processed output of all threads, waiting until all threads are complete, and handling any errors. All in all, it's very risky.
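As a rough sketch of that hand-rolled fan-out (Spring Batch 4.x ItemWriter signature; MyItem and doWrite() are hypothetical placeholders, and, as noted above, the worker threads run outside the chunk transaction, so error handling and rollback are entirely on you):

import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

import org.springframework.batch.item.ItemWriter;

public class FanOutItemWriter implements ItemWriter<MyItem> {

    private static final int THREADS = 4;

    private final ExecutorService pool = Executors.newFixedThreadPool(THREADS);

    @Override
    public void write(List<? extends MyItem> items) throws Exception {
        // Split the chunk the framework hands us into slices, one per thread.
        int sliceSize = (int) Math.ceil(items.size() / (double) THREADS);
        List<Future<?>> futures = new ArrayList<>();
        for (int start = 0; start < items.size(); start += sliceSize) {
            List<? extends MyItem> slice =
                    items.subList(start, Math.min(start + sliceSize, items.size()));
            futures.add(pool.submit(() -> doWrite(slice)));
        }
        // Wait for every slice; a worker failure fails the step, but work already
        // done by the other threads is NOT rolled back by Spring Batch.
        for (Future<?> future : futures) {
            future.get();
        }
    }

    private void doWrite(List<? extends MyItem> slice) {
        // hypothetical per-slice persistence logic
    }
}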
Making a item reader to return a list instead single object - Spring batch
Came across this with a similar problem at hand.
Here's how I am doing it at the moment. As @mminella suggested, a synchronized ItemReader with the FlatFileItemReader as the delegate. This works with decent performance. The code writes about ~4K records per second at the moment, but the speed doesn't depend entirely on the design; other attributes contribute as well.
I tried two other approaches to increase performance; both more or less failed.
A custom synchronized ItemReader that aggregates items, with a FlatFileItemReader as the delegate, but I ended up maintaining a lot of state, which caused a performance drop. Maybe the code needed optimization, or plain synchronization is simply faster.
Firing each insert PreparedStatement batch in a different thread didn't increase performance much either, but I am still counting on it in case I run into an environment where individual threads per batch would give a significant boost.

Spring batch job to process a dynamic set of files in parallel

I am trying to design a Spring Batch job that processes a dynamic set of files in parallel. Meaning, when the batch job itself is started, the number of files to process is not known; the files become available dynamically. The job should run and continue to process the files in parallel as and when a new file arrives, until it has finished processing all files.
I have gone through the Spring Batch project page, and from my understanding it looks like a multi-threaded step is suitable for my case. But the thing I am not sure of is whether it can support dynamic availability of the files to be processed.
Any inputs will be highly appreciated.
Thanks and regards,
Priya
You have a couple of options here:
MultiResourceItemReader - This ItemReader wraps another ItemReader, such as the FlatFileItemReader, and loops through the resources provided via an expression.
Partitioning - This option is better for parallel processing of files. Using the MultiResourcePartitioner, you can process files in parallel with all the restartability etc. features you'd normally get with Spring Batch.
You can read more about partitioning in the documentation here: http://docs.spring.io/spring-batch/trunk/reference/html/scalability.html
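A rough sketch of the partitioning option, assuming Spring Batch 4.x builders; the file pattern, bean names, and grid size are illustrative, and workerStep is assumed to be a chunk-oriented step whose step-scoped reader binds #{stepExecutionContext['fileName']} as its resource:

import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.partition.support.MultiResourcePartitioner;
import org.springframework.context.annotation.Bean;
import org.springframework.core.io.Resource;
import org.springframework.core.io.support.PathMatchingResourcePatternResolver;
import org.springframework.core.task.SimpleAsyncTaskExecutor;

// Sketch: one worker step execution per input file, executed in parallel.
// Note that the partitioner snapshots the matching files when the master step
// starts, so files arriving later need a new step/job execution to be picked up.
@Bean
public Step masterStep(StepBuilderFactory stepBuilders, Step workerStep) throws Exception {
    Resource[] inputFiles = new PathMatchingResourcePatternResolver()
            .getResources("file:/data/input/*.csv");

    MultiResourcePartitioner partitioner = new MultiResourcePartitioner();
    partitioner.setResources(inputFiles); // each resource becomes one partition

    return stepBuilders.get("masterStep")
            .partitioner("workerStep", partitioner)
            .step(workerStep)
            .gridSize(inputFiles.length)
            .taskExecutor(new SimpleAsyncTaskExecutor("partition-"))
            .build();
}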