Put a deadline in Spring Batch

In a Java program, I need to read from a database, take that data, make some REST calls, and write the data to a txt file (which has a header, data, and a footer).
The job starts Saturday night and needs to finish by the next morning. If it isn't finished, we need to close the file (writing the footer first) and start a new one.
I started looking at tools for this job. Spring Batch seems interesting.
I can split the job into reader, processor, and writer.
Is there something to check whether a job has reached its deadline?
The job will be launched with Jenkins.

I guess you must use a scheduler for that.
You should read the end date from the DB every minute or so, and
if (endDate.compareTo(new Date()) <= 0)
then the scheduler's job must stop the batch job.
You can use Quartz.
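Alternatively, the deadline check can live inside the job itself. Here is a minimal sketch (the deadline value and the wiring of the listener onto the step are assumptions, not part of the question): a ChunkListener that checks the clock before each chunk and asks Spring Batch to stop gracefully. Because the step stops cleanly, the FlatFileItemWriter is still closed, and a registered FlatFileFooterCallback gets to write the footer before the file is released.

import java.time.LocalDateTime;

import org.springframework.batch.core.ChunkListener;
import org.springframework.batch.core.scope.context.ChunkContext;

// Sketch: stop the step gracefully once a deadline passes.
public class DeadlineChunkListener implements ChunkListener {

    private final LocalDateTime deadline; // hypothetical: load from the DB or job parameters

    public DeadlineChunkListener(LocalDateTime deadline) {
        this.deadline = deadline;
    }

    @Override
    public void beforeChunk(ChunkContext context) {
        if (LocalDateTime.now().isAfter(deadline)) {
            // Marks the step execution so Spring Batch stops after this chunk;
            // the writer's close() still runs, so the footer callback fires.
            context.getStepContext().getStepExecution().setTerminateOnly();
        }
    }

    @Override
    public void afterChunk(ChunkContext context) { }

    @Override
    public void afterChunkError(ChunkContext context) { }
}

Register the listener on the chunk-oriented step; a follow-up Jenkins trigger (or the Quartz job above) can then launch the next job with a fresh file.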

Related

Autosys trigger same DataStage job multiple times with different invocation IDs

Here is what I am trying to do, not sure if it is possible:
Autosys gets File1:10pm and starts DataStage Job1.1:10pm.
Job1.1:10pm is still running.
Autosys gets File1:20pm; it needs to start the same Job1 but run it as Job1.1:20pm, even though Job1.1:10pm is still running, and not wait for it to finish, but go ahead and run.
Can Autosys call the same DataStage job every time it gets a new file and run it with the new timestamp as the invocation id, without waiting for the previous job to finish?
Thanks, y'all
Yes - absolutely - this is possible. To enable different invocation IDs you have to check the "multiple instance" property in the job's properties. With this you allow multiple simultaneous runs of the job.
The invocation ID can be a parameter as well when calling it from a sequence.
When your (multiple instance) job writes to a file, make sure that each filename is unique to avoid side effects from the multiple runs at the same time. This can be done by specifying DSJobInvocationId as part of the filename. Note that it is a parameter provided by DataStage which needs to be written exactly as shown, with the same upper- and lowercase letters. DataStage will then replace it with the content of your job invocation ID at runtime.
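For example, a target file path in the stage might look like the line below (the path itself is made up, and the #...# form is DataStage's usual parameter-substitution syntax in stage properties, so check how your particular stage resolves it):

/data/out/extract_#DSJobInvocationId#.txt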

When scheduling Spring Batch using Quartz or simple Spring scheduling, it's not starting from the beginning

I have a simple chunk-based Spring Batch job which I scheduled using Quartz, but the problem is that I have one print statement in the reader which gets printed only on the first run. I am new to this, so I want to understand: in a scheduled batch job, does reading happen only once?
My requirement is that each run should read data from the source. Can someone help me understand how the reader works in a scheduled environment? If any more information is required, please let me know.

Spring Batch: getting the executionId of a completed instance by its JobParameters

My software is choreographing a number of Spring Batch jobs. The output of a job is partially an input for the next job. It may happen that the entire process (the entire job chain) is restarted, even if one or more jobs in the chain have been successfully completed. In this case, when I try to run one of the jobs again with the same parameters, I get a JobInstanceAlreadyCompletedException, as expected. I could skip it and go on to the next job, but I would need to access the context of the completed instance in order to get the output produced by its steps and pass it over to the next job.
According to the JobExplorer APIs, this is only possible if you have the executionId of the completed instance. I can't get it from the JobInstanceAlreadyCompletedException, and it looks like there are no APIs for getting it from the already-used parameter list. Do you know a way to get this executionId given the parameters? Or to get access, in whatever way, to the completed instance's job context?
Why not put all these jobs into one main job and use JobSteps to integrate them? That way, already-completed sub-jobs will be treated as completed steps, which will not be started again. Moreover, all information is available in the job/step contexts, even if you restart. A sketch of this wiring follows.
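A minimal sketch of that composition, assuming two existing Job beans jobA and jobB (all bean names here are made up), using the JavaConfig builders:

import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class MainJobConfig {

    @Autowired private JobBuilderFactory jobs;
    @Autowired private StepBuilderFactory steps;
    @Autowired private JobLauncher jobLauncher;

    // Wrap each existing job as a JobStep: on a restart of mainJob,
    // a completed sub-job behaves like a completed step and is skipped.
    @Bean
    public Job mainJob(Job jobA, Job jobB) {
        Step stepA = steps.get("stepA").job(jobA).launcher(jobLauncher).build();
        Step stepB = steps.get("stepB").job(jobB).launcher(jobLauncher).build();
        return jobs.get("mainJob").start(stepA).next(stepB).build();
    }
}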
Another way would be to save all needed parameters and information to a file and use that to start the next job, instead of being dependent on the JobExecution info. Your last step could simply be a tasklet that writes an appropriate property file.
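A sketch of that last step (the file name and the choice of copying the whole job ExecutionContext are assumptions for illustration):

import java.io.FileOutputStream;
import java.util.Properties;

import org.springframework.batch.core.StepContribution;
import org.springframework.batch.core.scope.context.ChunkContext;
import org.springframework.batch.core.step.tasklet.Tasklet;
import org.springframework.batch.repeat.RepeatStatus;

public class HandoverTasklet implements Tasklet {

    @Override
    public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) throws Exception {
        Properties props = new Properties();
        // Copy whatever the next job needs out of this job's ExecutionContext.
        chunkContext.getStepContext().getJobExecutionContext()
                .forEach((key, value) -> props.setProperty(key, String.valueOf(value)));
        try (FileOutputStream out = new FileOutputStream("jobA-handover.properties")) {
            props.store(out, "values handed over to the next job");
        }
        return RepeatStatus.FINISHED;
    }
}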

Spring Batch/Integration dynamic poller/trigger

We have a job which polls for a file and a DB every M-F between 1 PM and 5 PM using a cron expression. During this time, if a file arrives, it downloads the file and invokes a job. This is working fine, and we have used Spring Integration and Batch.
Now we need some customization: we have multiple jobs, where job1 should poll like above, and once a file is processed successfully, it should stop polling.
The second requirement is: in case the file does not come during the polling period, we want to send a notification to the ops team so that they can take action.
Would that help? Exit Spring Integration when no more messages
You would be able to implement custom behavior in that advice, based on the polling result and the time of day.
Gary also mentions that conditional pollers are coming in the next versions:
http://docs.spring.io/spring-integration/docs/4.2.0.BUILD-SNAPSHOT/reference/html/messaging-channels-section.html#conditional-pollers
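A rough sketch of such an advice (the 5 PM cutoff and the notifyOps() helper are assumptions; the advice is registered in the poller's advice-chain, where proceed() returns whether the poll produced a message):

import java.time.LocalTime;

import org.aopalliance.intercept.MethodInterceptor;
import org.aopalliance.intercept.MethodInvocation;

public class PollingWindowAdvice implements MethodInterceptor {

    private volatile boolean fileSeen;

    @Override
    public Object invoke(MethodInvocation invocation) throws Throwable {
        Object result = invocation.proceed();
        if (Boolean.TRUE.equals(result)) {
            // A file was picked up; from here you could stop the adapter
            // (e.g. via SmartLifecycle) to satisfy "stop polling after success".
            fileSeen = true;
        } else if (!fileSeen && LocalTime.now().isAfter(LocalTime.of(17, 0))) {
            notifyOps(); // hypothetical helper: alert that no file arrived in the window
        }
        return result;
    }

    private void notifyOps() {
        // e.g. send an email or a message to an ops-alert channel
    }
}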

Retry failed writing operations without delaying other steps in Spring Batch application

I am maintaining a legacy application written using Spring Batch and need to tweak it to never lose data.
I have to read from various web services (one for each step) and then write to a remote database. Things go bad when the connection with the DB drops, because all items read from the web service are discarded (I can't read the same item twice), and the data is lost because it cannot be written.
I need to set up Spring Batch to keep the data already read in one step and retry the writing operation the next time the step runs. The same step cannot read more data until the write operation has successfully concluded.
When it is not able to write, the step should keep the read data and pass execution to the next step; after a while, when it's time for the failed step to run again, it should not read another item, retrying the failed writing operation instead.
The batch application should run in an infinite loop, and each step should gather data from a different source. Failed writing operations should be momentarily skipped (keeping the read data) so as not to delay other steps, but should resume from the write operation the next time they are called.
I have been researching various web sources aside from the official docs, but Spring Batch doesn't have the most intuitive docs I have come across.
Can this be achieved? If yes, how?
You can write the data you need to persist, in case the job fails, to the batch step's ExecutionContext. You can then restart the job with this data:
Step executions are represented by objects of the StepExecution class. Each execution contains a reference to its corresponding step and JobExecution, and transaction-related data such as commit and rollback count and start and end times. Additionally, each step execution will contain an ExecutionContext, which contains any data a developer needs persisted across batch runs, such as statistics or state information needed to restart.
More from: http://static.springsource.org/spring-batch/reference/html/domain.html#domainStepExecution
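As an illustration only (the item type, key name, and remote call are made up): a writer that stashes the current chunk in the step's ExecutionContext before attempting the remote write. Spring Batch persists the step context with the step metadata, so the pending items are available on restart instead of being lost.

import java.util.ArrayList;
import java.util.List;

import org.springframework.batch.core.StepExecution;
import org.springframework.batch.core.annotation.BeforeStep;
import org.springframework.batch.item.ExecutionContext;
import org.springframework.batch.item.ItemWriter;

public class BufferingWriter implements ItemWriter<String> {

    private ExecutionContext executionContext;

    @BeforeStep
    public void saveStepExecution(StepExecution stepExecution) {
        this.executionContext = stepExecution.getExecutionContext();
    }

    @Override
    public void write(List<? extends String> items) throws Exception {
        // Buffer the chunk first; if the remote write throws, the step fails
        // and the pending items are persisted for the next run to retry.
        executionContext.put("pending.items", new ArrayList<>(items));
        writeToRemoteDb(items);                   // may throw if the DB is down
        executionContext.remove("pending.items"); // success: nothing pending
    }

    private void writeToRemoteDb(List<? extends String> items) {
        // hypothetical remote database write
    }
}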
I do not know if this will be OK with you, but here are my thoughts on your configuration.
Since you have two remote sources that are open to failure, let us partition the overall system into two jobs (not two steps).
JOB A
Step 1: Tasklet
Check a shared folder for files. If files exist, do not proceed to the next step (see the sketch after this job description). This will make more sense in the description of JOB B.
Step 2: Web service to files
Read from your web service and write the results to flat files in the shared folder. Since you would be using flat files for the output, you solve your "all items read from the web service are discarded and the data is lost because it cannot be written" problem.
Use Quartz or an equivalent for the scheduling of this job.
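A sketch of Step 1 (the folder path and exit-status name are made up); the job flow would then route on FILES_PENDING to end the run before the web-service step:

import java.io.File;

import org.springframework.batch.core.ExitStatus;
import org.springframework.batch.core.StepContribution;
import org.springframework.batch.core.scope.context.ChunkContext;
import org.springframework.batch.core.step.tasklet.Tasklet;
import org.springframework.batch.repeat.RepeatStatus;

public class FolderCheckTasklet implements Tasklet {

    private static final File SHARED_FOLDER = new File("/data/shared");

    @Override
    public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) {
        File[] pending = SHARED_FOLDER.listFiles();
        if (pending != null && pending.length > 0) {
            // Signal the job flow to end here instead of calling the web service, e.g.
            // .start(checkStep).on("FILES_PENDING").end().from(checkStep).on("*").to(webServiceStep)
            contribution.setExitStatus(new ExitStatus("FILES_PENDING"));
        }
        return RepeatStatus.FINISHED;
    }
}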
JOB B
Poll the shared folder for generated files and create a job launch request with the file (the file's path as a job parameter). The Spring Integration project may help with this polling; a sketch follows below.
Step 1:
Read from the file, write the items to the remote DB, and move/delete the file if writing to the DB succeeds.
No scheduling is needed, since job launching originates from the polled files.
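For the polling-to-launch glue, Spring Batch Integration's JobLaunchRequest fits. A sketch (bean and parameter names invented), where a file-reading inbound adapter feeds this transformer and a job-launching gateway consumes its output:

import java.io.File;

import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.integration.launch.JobLaunchRequest;
import org.springframework.integration.annotation.Transformer;

public class FileToJobLaunchRequest {

    private final Job jobB;

    public FileToJobLaunchRequest(Job jobB) {
        this.jobB = jobB;
    }

    @Transformer
    public JobLaunchRequest toRequest(File file) {
        // The file path doubles as an identifying job parameter,
        // so each polled file yields its own job instance.
        JobParameters params = new JobParametersBuilder()
                .addString("input.file", file.getAbsolutePath())
                .toJobParameters();
        return new JobLaunchRequest(jobB, params);
    }
}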
Sample Execution
Time 0: No file in the shared folder.
Time 1: Read from the web service and write to the shared folder.
Time 2: JOB B file polling occurs and tries to write to the DB.
If successful, the system continues to execute.
If not, when JOB A tries to execute at its scheduled time, it will skip reading from the web service, since files still exist in the shared folder. It will keep skipping until JOB B consumes the files.
I did not want to go into implementation specifics, but Spring Batch can handle all of these situations. Hope this helps.