I need to parallelize a single step of a Spring Batch job. Before the step to be parallelized, tasklets run that put some results into the job's parameters.
The results produced by the tasklets are necessary to execute the Partitioner and the items of the step to be parallelized.
There is a doubt I really can't resolve. Since I can have the same job running simultaneously multiple times with different initial parameters, are the tasklets and step items thread-safe?
No, tasklets and chunk-oriented step components are not thread-safe. If they are shared between multiple job instances/executions running concurrently, you need to make them thread-safe.
You can achieve this by using job-scoped steps and step-scoped readers/writers (@JobScope/@StepScope). You can also use the SynchronizedItemStreamReader and the (upcoming) SynchronizedItemStreamWriter to make readers and writers thread-safe. All item readers and writers provided by Spring Batch document their thread-safety in their Javadoc.
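For illustration, a minimal sketch of a step-scoped reader wrapped in a SynchronizedItemStreamReader; the bean name, input file and String item type are assumptions, not from the question:

import org.springframework.batch.core.configuration.annotation.StepScope;
import org.springframework.batch.item.file.FlatFileItemReader;
import org.springframework.batch.item.file.builder.FlatFileItemReaderBuilder;
import org.springframework.batch.item.support.SynchronizedItemStreamReader;
import org.springframework.context.annotation.Bean;
import org.springframework.core.io.FileSystemResource;

@Bean
@StepScope // a fresh reader per step execution, so concurrent executions don't share state
public SynchronizedItemStreamReader<String> reader() {
    FlatFileItemReader<String> delegate = new FlatFileItemReaderBuilder<String>()
            .name("itemReader")
            .resource(new FileSystemResource("input.txt")) // hypothetical input file
            .lineMapper((line, lineNumber) -> line)
            .build();
    SynchronizedItemStreamReader<String> reader = new SynchronizedItemStreamReader<>();
    reader.setDelegate(delegate); // serializes read() calls across worker threads
    return reader;
}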
You do not want to run multiple instances of the same job. It would be better to run multiple tasks or processes in the same step and/or job. You might want to look up job partitioning and/or remote chunking to do concurrent processing.
If it has to be isolated jobs, then you might have your concurrent jobs write out to, say, a message queue as their end (writer) step, and then have another job listen and read from that queue.
https://docs.spring.io/spring-batch/2.1.x/cases/parallel.html
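For illustration, a hedged sketch of that queue-as-writer idea using Spring Batch's JmsItemWriter; the queue name and the injected ConnectionFactory are assumptions:

import javax.jms.ConnectionFactory;
import org.springframework.batch.item.jms.JmsItemWriter;
import org.springframework.context.annotation.Bean;
import org.springframework.jms.core.JmsTemplate;

@Bean
public JmsItemWriter<String> queueWriter(ConnectionFactory connectionFactory) {
    JmsTemplate jmsTemplate = new JmsTemplate(connectionFactory);
    jmsTemplate.setDefaultDestinationName("job-results"); // hypothetical queue name
    JmsItemWriter<String> writer = new JmsItemWriter<>();
    writer.setJmsTemplate(jmsTemplate); // each item in the chunk becomes one message
    return writer;
}

Another job (or service) can then consume from that queue independently of the jobs producing into it.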
I am trying to understand how Kafka Streams works under the hood (to know it a little better), and came across a Confluent link, and it is really wonderful.
It mentions two terms: StreamThreads and StreamTasks.
I am not able to understand what exactly a StreamTask is.
Is it executed by a StreamThread?
As per the docs, a StreamThread can have multiple StreamTasks, so won't there be any data sharing, and won't this thread run slower? How does a StreamThread "run" multiple StreamTasks?
Any explanation in simple words would be of great help.
"Tasks" are a logical abstractions of work than can be done in parallel (ie, stuff that can be processed independent from each other). Kafka Streams basically creates a task for each input topic partition, because data in different partitions can processed independent from each other (it's a simplification, but holds if you have a single input topic; for joins it's a little bit different).
A StreamThread is basically a JVM thread. Task are assigned to StreamsThread for execution. In the current implementation, a StreamThread basically loops over all tasks and processes some amount of input data for each task. In between, the StreamThread (that is using a KafkaConsumer) polls the broker for new data for all its assigned tasks.
Because tasks are independent from each other, you can run as many thread as there are tasks. For this case, each thread would execute only a single task.
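To make the thread/task relationship concrete, here is a minimal sketch; the application id, bootstrap server and topic names are made up:

import java.util.Properties;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;

Properties props = new Properties();
props.put(StreamsConfig.APPLICATION_ID_CONFIG, "demo-app");           // hypothetical
props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");  // hypothetical
// With an input topic of, say, 4 partitions there are 4 tasks; 4 threads
// gives one task per thread, and a 5th thread would sit idle.
props.put(StreamsConfig.NUM_STREAM_THREADS_CONFIG, 4);

StreamsBuilder builder = new StreamsBuilder();
builder.stream("input-topic").to("output-topic");                     // hypothetical topics

KafkaStreams streams = new KafkaStreams(builder.build(), props);
streams.start();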
Is it possible to use Spring Batch as a regular job framework?
I want to create a device service (microservice) that has the responsibility to get events and trigger jobs on devices. The devices are remote, so it will take time for a job to complete, but it is not a batch job (not periodically running and not partitioning a large data set).
I am wondering whether Spring Batch can still be used as a job framework, or if it is only for batch processing. If the answer is no, which job frameworks (besides writing your own) are popular?
Job Description:
I need to execute, against a specific device, a job that will contain several steps. Each step will communicate with the device and wait for it to confirm that it executed the previous command given to it.
I need retry, recovery and scheduling features (I thought of combining Spring Batch with Quartz).
Regarding read-process-write, I basically get a command request regarding a device, do a few DB reads, and then start long waiting periods that all need to pass in order for the job/task to be successful.
Also, I can choose (and justify) a relevant IMDG/DB. Concurrency is outside the scope (it will be handled outside the job mechanism). An alternative that came to mind was Akka actors (the job for a device would create child actors as steps).
As far as I know, periodic runs and partitioning of large data sets are not primary requirements for using Spring Batch.
Spring Batch is basically a read-process-write framework where reading and processing happen item by item and writing happens in chunks (for chunk-oriented processing).
So you can use Spring Batch if your job logic fits into the read-process-write paradigm; the rest of the things seem secondary to me.
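A minimal sketch of such a chunk-oriented step (the step name, item type and in-memory reader are placeholders, not from the question):

import java.util.Arrays;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.item.support.ListItemReader;
import org.springframework.context.annotation.Bean;

@Bean
public Step deviceStep(StepBuilderFactory steps) {
    return steps.get("deviceStep")
            .<String, String>chunk(10)                                  // write in chunks of 10
            .reader(new ListItemReader<>(Arrays.asList("a", "b", "c"))) // stand-in reader
            .processor(item -> item.toUpperCase())                      // item-by-item processing
            .writer(items -> System.out.println(items))                 // whole chunk at once
            .build();
}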
Also, with Spring Batch, you should evaluate the part about the job repository: Spring Batch needs a database (either in memory or on disk) to store job metadata, and it is not optional.
I think you should explain more about why you need a job framework and what kind of logic you are running that you call a job, and I will revise my answer accordingly.
Multi Thread Execution
Parallelization
Multi-threading is optimal when the number of threads (in general, one Subjob counts as one thread) does not exceed the number of processors of the machine you use for parallel executions. Otherwise, some of the Subjobs have to wait until a processor is freed up.
Also note that parallelizing more Subjobs than you have CPUs gains nothing: the extra Subjobs wait for a free processor and only add scheduling overhead.
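This is not Talend-specific; the same principle in plain Java, sizing the pool to the processor count (a sketch):

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

// One worker thread per processor: extra threads beyond the CPU count
// just queue up and add scheduling overhead instead of speeding things up.
int cpus = Runtime.getRuntime().availableProcessors();
ExecutorService pool = Executors.newFixedThreadPool(cpus);

for (int i = 0; i < cpus; i++) {
    final int id = i;
    pool.submit(() -> System.out.println("subjob " + id + " running"));
}
pool.shutdown();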
Parallelization helps you to manage complex Job systems. It executes several subjobs simultaneously and synchronizes the execution of a subjob with other subjobs within the main Job.
The exact difference between the two: Parallelize-linked (or multi-thread enabled) subjobs run in parallel regardless of which ones finish first, whereas Synchronize-linked subjobs start to run only when all the parallelized subjobs have finished.
So Parallelization is best when you need some subjobs to run in parallel while another subjob starts only after all the parallelized subjobs have finished.
It also makes your job design more flexible.
For detailed information you can visit this link
I need a clarification.
Is it possible for us to run multiple instances of a job at the same time?
Currently, we have a single instance of a job at any given time.
If it is possible, please let me know how to do it.
Yes, you can. Spring Batch distinguishes jobs based on their JobParameters. So if you always pass different JobParameters to the same job, you will have multiple instances of the same job running.
A simple way is just to add a UUID parameter to each request to start a job.
Example:
final JobParametersBuilder jobParametersBuilder = new JobParametersBuilder();
// 'true' marks the parameter as identifying, so every UUID creates a new JobInstance
jobParametersBuilder.addString("instance_id", UUID.randomUUID().toString(), true);
jobLauncher.run(job, jobParametersBuilder.toJobParameters());
The boolean 'true' at the end signals to Spring Batch to use that parameter as part of the 'identity' of the job instance, so you will always get new instances with each 'run' of the job.
Yes, you can very much run steps in parallel, as also documented here.
But there are certain things to be considered.
Does your application logic need parallel execution? If you are going to run steps in parallel, you have to take care to build the application logic so that the work done by the parallel steps does not overlap (unless that is the intention of your application).
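A minimal sketch of parallel steps with a split; the job, flow and step names are assumptions:

import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.job.builder.FlowBuilder;
import org.springframework.batch.core.job.flow.Flow;
import org.springframework.context.annotation.Bean;
import org.springframework.core.task.SimpleAsyncTaskExecutor;

@Bean
public Job parallelJob(JobBuilderFactory jobs, Step step1, Step step2) {
    Flow flow1 = new FlowBuilder<Flow>("flow1").start(step1).build();
    Flow flow2 = new FlowBuilder<Flow>("flow2").start(step2).build();

    return jobs.get("parallelJob")
            .start(flow1)
            .split(new SimpleAsyncTaskExecutor()) // flow1 and flow2 run on separate threads
            .add(flow2)
            .end()
            .build();
}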
Yes, it's completely possible to have multiple instances (or executions) of a job run concurrently.
Can WWF handle high throughput scenarios where several dozen records are 'actively' being processed in parallel at any one time?
We want to build a workflow process which handles a few thousand records per hour. Each record takes up to a minute to process, because it makes external web service calls.
We are testing Windows Workflow Foundation to do this. But our demo programs show that the processing of each record appears to run in sequence, not in parallel, when we use parallel activities to process several records at once within one workflow instance.
Should we use multiple workflow instances or parallel activities?
Are there any known patterns for high performance WWF processing?
You should definitely use a new workflow per record. Each workflow only gets one thread to run in, so even with a ParallelActivity they'll still be handled sequentially.
I'm not sure about the performance of Windows Workflow, but what I heard about .NET 4 at Tech-Ed was that its Workflow components will be dramatically faster than the ones from .NET 3.0 and 3.5. So if you really need a lot of performance, maybe you should consider waiting for .NET 4.0.
Another option could be to consider BizTalk. But it's pretty expensive.
I think the common pattern is to use one workflow instance per record. The workflow runtime runs multiple instances in parallel.
One workflow instance runs one thread at a time. The parallel activity calls the Execute method of each activity sequentially on this single thread. You may still get a performance improvement from a parallel activity, however, if the activities are asynchronous and spend most of their time waiting for an external process to finish its work. E.g., if an activity calls an external web method and then waits for a reply, it returns from Execute and does not occupy the thread while waiting, so another activity in the Parallel group can start its job (e.g., also a call to a web service) at the same time.