Batch Job exit status using Spring Cloud Task - spring-batch

I'm trying to set up a Spring Batch project to be deployed on a Spring Cloud Data Flow server, but first I must "wrap" it in a Spring Cloud Task application.
Spring Batch generates metadata (start/end, status, parameters, etc.) in the BATCH_ tables. Spring Cloud Task does the same, but in the TASK_ tables.
Reading the Spring Cloud Task documentation, it says that in order to pass the batch information to the task, you must set
spring.cloud.task.batch.failOnJobFailure=true and also:
To have your task return the exit code based on the result of the
batch job execution, you will need to write your own
CommandLineRunner.
So, any indications on how I should write my own CommandLineRunner?
For now, having only set the property, if I force the task to fail I get: Failed to execute CommandLineRunner ... Job UsersJob failed during execution for jobId 3 with jobExecutionId of 6
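A minimal sketch of such a runner, assuming a single Job bean in the context (the class name, parameter name, and exit-code mapping below are illustrative, not from the Spring Cloud Task docs): it launches the job, keeps the JobExecution, and maps its BatchStatus to the process exit code via Spring Boot's ExitCodeGenerator.

```java
import java.util.Date;

import org.springframework.batch.core.BatchStatus;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.boot.CommandLineRunner;
import org.springframework.boot.ExitCodeGenerator;
import org.springframework.stereotype.Component;

// Hypothetical sketch: launches the batch job and exposes its status as the exit code.
@Component
public class JobExitCodeRunner implements CommandLineRunner, ExitCodeGenerator {

    private final JobLauncher jobLauncher;
    private final Job job; // assumes exactly one Job bean in the context

    private JobExecution jobExecution;

    public JobExitCodeRunner(JobLauncher jobLauncher, Job job) {
        this.jobLauncher = jobLauncher;
        this.job = job;
    }

    @Override
    public void run(String... args) throws Exception {
        JobParameters params = new JobParametersBuilder()
                .addDate("runDate", new Date()) // parameter name is illustrative
                .toJobParameters();
        jobExecution = jobLauncher.run(job, params);
    }

    @Override
    public int getExitCode() {
        // 0 only if the job completed; any other outcome maps to a non-zero exit code
        return (jobExecution != null && jobExecution.getStatus() == BatchStatus.COMPLETED) ? 0 : 1;
    }
}
```

For the exit code to actually reach the OS, the main method also has to propagate it, e.g. `System.exit(SpringApplication.exit(SpringApplication.run(App.class, args)));`.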

Related

Spring cloud task in dataflow dashboard is complete but one partition of spring batch failed

I have a simple Spring Batch remote partitioning project running with Spring Cloud Task. I'm using Spring Cloud Data Flow to run the Spring Batch job.
Spring Boot version - 2.7.8
Spring Cloud Task - 2.4.5
Spring Cloud Data Flow - 2.10.1-SNAPSHOT
I'm running Spring Cloud Data Flow locally via Docker Compose, and I'm using the Spring Cloud local deployer as well.
My Spring Batch job moves data from one table to another within the same database. I'm partitioning 100 records into 4 partitions, and I intentionally made one of the partitions fail.
After processing, the batch_job_execution row shows the job as failed and batch_step_execution shows one failed partition. But the same is not reflected in the SCDF dashboard.
In the SCDF dashboard, the task and task execution are completed, while the job execution has a failed status.
Two questions:
How do I make sure SCDF reflects the right batch status in the dashboard?
How do I restart the failed job execution?
For the first question, I tried the spring.cloud.task.batch.fail-on-job-failure=true property in application.properties, but I get "Job must not be null nor empty" from TaskJobLauncherApplicationRunnerFactoryBean.
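As an unconfirmed guess, that error usually means the task runner could not resolve a job to launch; Spring Cloud Task also has a jobNames property for naming the job explicitly. A hypothetical application.properties sketch (the job name partitionedJob is a placeholder):

```properties
# Fail the task execution when the batch job fails, so SCDF shows the batch status
spring.cloud.task.batch.fail-on-job-failure=true
# Hypothetical: name the job to launch explicitly; 'partitionedJob' is a placeholder
spring.cloud.task.batch.job-names=partitionedJob
```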
For the second question, I tried relaunching the task using the REST API below:
curl 'http://localhost:9393/jobs/executions/1' -i -X PUT \
  -H 'Accept: application/json' \
  -d 'restart=true'
But it restarted all the partitions.

Spring batch process is getting invoked in BUILD step only via Jenkins pipeline

I have written a Spring Batch job and am trying to deploy it via our Jenkins pipeline. This pipeline first builds the code, creates an image, and then deploys to Kubernetes.
In my batch job, I look for a file in a specific directory, and if the file is not there, our process sends an email.
I am observing one odd thing: whenever my Jenkins pipeline runs, right after the build step I receive the file-unavailability email, whereas it should send an email or process the file only on its schedule. It seems the process is being triggered during the build step itself.
Is there any configuration required so that the Spring Batch process is invoked only at its scheduled time, and not while building?
If you are using Spring Boot, you need to set the property spring.batch.job.enabled to false, because by default, Spring Boot executes all jobs in the application context on startup.
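For example, in application.properties (this only disables the automatic launch at startup; the job can still be started explicitly by your scheduler or a JobLauncher):

```properties
# Prevent Spring Boot from running all Job beans automatically on startup
spring.batch.job.enabled=false
```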

How to use Spring Cloud Dataflow to get Spring Batch Status

I have been using Spring Batch and my metadata is in DB2. I have been using the Spring Batch Admin API (jars) to look at the current status of various jobs and to get details about a job, like the number of items read, commit count, etc. Now that Spring Batch Admin has been superseded by Spring Cloud Data Flow, how do I look at this information? Is there a good API set I could use?
Basically, in Spring Cloud Data Flow you first need to create a Spring Cloud Task that wraps your batch application; see the example in the Spring Cloud Task reference documentation.
With the help of Spring Cloud Task's @EnableTaskLauncher you can get the current status of a job, run the job, stop the job, etc.
You need to send a TaskLauncherRequest for it.
See the TaskLauncher APIs.
Edit:
To get the Spring Batch status, you first need the task execution id of the Spring Cloud Task, then use the Set<Long> getJobExecutionIdsByTaskExecutionId(long taskExecutionId) method of TaskExplorer.
See TaskExplorer for all the APIs. With the job execution ids it returns, use JobExplorer to get the status of the jobs.
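A hedged sketch of that lookup (assuming TaskExplorer and JobExplorer beans are already configured against the shared metadata database; the class name is a placeholder):

```java
import java.util.Set;

import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.explore.JobExplorer;
import org.springframework.cloud.task.repository.TaskExplorer;
import org.springframework.stereotype.Component;

// Sketch: resolve the batch job status for a given task execution id.
@Component
public class BatchStatusLookup {

    private final TaskExplorer taskExplorer;
    private final JobExplorer jobExplorer;

    public BatchStatusLookup(TaskExplorer taskExplorer, JobExplorer jobExplorer) {
        this.taskExplorer = taskExplorer;
        this.jobExplorer = jobExplorer;
    }

    public void printJobStatuses(long taskExecutionId) {
        // Task execution id -> ids of the batch job executions that task launched
        Set<Long> jobExecutionIds = taskExplorer.getJobExecutionIdsByTaskExecutionId(taskExecutionId);
        for (Long id : jobExecutionIds) {
            JobExecution jobExecution = jobExplorer.getJobExecution(id);
            System.out.printf("job execution %d -> %s%n", id, jobExecution.getStatus());
        }
    }
}
```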

Launching Spring Batch Task from Spring Cloud Data Flow

I have a web-based Spring Batch application. My batch job is kicked off by an API call. Here is my method, exposed as a web service:
@RequestMapping(value = "/v1/initiateEntityCreation", method = RequestMethod.GET)
public String initiateEntityCreation()
        throws JobExecutionAlreadyRunningException, JobRestartException, JobInstanceAlreadyCompleteException,
        JobParametersInvalidException, NoSuchJobException, JobInstanceAlreadyExistsException {
    Long executionId = jobOperator.start("InitiateEntityCreation", String.format("Date=%s", new Date()));
    return executionId.toString();
}
My batch job works fine, and I have a MySQL instance as my job repository. I have integrated Spring Cloud Data Flow into my batch application. I have the @EnableTask annotation and all necessary dependencies, and I have pointed my Spring Cloud Data Flow local server at my Spring Batch job repository instance.
Here is my command line for the SCDF server:
java -jar spring-cloud-dataflow-server-local-1.2.3.RELEASE.jar \
  --spring.datasource.url=jdbc:mysql://localhost:3306/springbatchdb \
  --spring.datasource.username=root --spring.datasource.password=password \
  --spring.datasource.driver-class-name=org.mariadb.jdbc.Driver
My local server is running and capturing all the job execution instances. I have registered my Spring Batch application with SCDF and defined a task for it.
When I try to launch the job from SCDF, I get "Task successfully executed", but my job is not actually executed.
If I check the task executions, I see StartTime N/A and EndTime N/A, and if I drill down into the task execution there are no batch jobs that have been run. Please let me know how we can launch a web-based Spring Batch job using Spring Cloud Data Flow.
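For reference, the usual flow registers the task app and launches it from the SCDF shell; the app name, task name, and Maven URI below are placeholders, not taken from the question:

```shell
# Hypothetical coordinates: replace the URI with your own artifact
dataflow:> app register --name entity-creation --type task --uri maven://com.example:entity-creation:1.0.0
dataflow:> task create entity-creation-task --definition "entity-creation"
dataflow:> task launch entity-creation-task
```

Note that SCDF launches the application's main method as a short-lived task; if the batch job is only started from a controller endpoint, nothing will trigger it during a task launch, which would explain a "successful" task with no job execution.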

Convert non-launchable job to launchable job in Spring Batch Admin

I have a Spring Batch job developed with Spring Boot (1.4.1.RELEASE).
It runs successfully from the command line and writes job execution data to MySQL. It shows up as a non-launchable job in Spring Batch Admin (2.0.0.M1, pointing to MySQL), and I can see the job execution metrics.
Now I'd like to turn it into a launchable job so I can run it from within Spring Batch Admin.
I wonder if anyone has done that before. The documentation has a section "Add your Own Jobs For Launching", but it does not specify where to add the implementation jar(s) for the job.
Is it spring-batch-admin/WEB-INF/lib?
With Spring Boot, the non-launchable job is one big, all-in-one executable jar. Its dependencies overlap with Spring Batch Admin's; for example, they both have spring-batch*.jar and spring*.jar, but in different versions.
Is there a way, like a job definition XML file, to keep them in separate contexts? Thank you.
Spring Batch Admin looks for your job definitions in the src/main/resources/META-INF/spring/batch/jobs folder. You can add your job-definition.xml file in that folder and define your batch jobs in that XML.
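A minimal job-definition.xml sketch (the job, step, and tasklet class names are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:batch="http://www.springframework.org/schema/batch"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="http://www.springframework.org/schema/beans
           http://www.springframework.org/schema/beans/spring-beans.xsd
           http://www.springframework.org/schema/batch
           http://www.springframework.org/schema/batch/spring-batch.xsd">

    <!-- Placeholder tasklet implementation -->
    <bean id="myTasklet" class="com.example.MyTasklet"/>

    <batch:job id="myJob">
        <batch:step id="myStep">
            <batch:tasklet ref="myTasklet"/>
        </batch:step>
    </batch:job>
</beans>
```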