Spring Cloud Task in Data Flow dashboard is complete but one partition of Spring Batch failed - spring-batch

I have a simple Spring Batch remote partitioning project running with Spring Cloud Task. I'm using Spring Cloud Data Flow to run the Spring Batch job.
Spring Boot version - 2.7.8
Spring Cloud Task - 2.4.5
Spring Cloud Data Flow - 2.10.1-SNAPSHOT
I'm running Spring Cloud Data Flow locally with Docker Compose, and I'm using the Spring Cloud local deployer as well.
I have a Spring Batch job that moves data from one table to another within the same database. I partition the 100 records into 4 partitions, and I deliberately made one of the partitions fail.
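For reference, the partitioner behind a split like this (100 records into 4 ranges) might look like the sketch below; the ID-range approach and the "minId"/"maxId" key names are illustrative assumptions, not taken from the actual project.

```
import java.util.HashMap;
import java.util.Map;

import org.springframework.batch.core.partition.support.Partitioner;
import org.springframework.batch.item.ExecutionContext;

public class RangePartitioner implements Partitioner {

    private static final int TOTAL_RECORDS = 100;

    @Override
    public Map<String, ExecutionContext> partition(int gridSize) {
        Map<String, ExecutionContext> partitions = new HashMap<>();
        int range = TOTAL_RECORDS / gridSize; // 25 records per partition for gridSize = 4
        for (int i = 0; i < gridSize; i++) {
            ExecutionContext context = new ExecutionContext();
            context.putInt("minId", i * range + 1);   // 1, 26, 51, 76
            context.putInt("maxId", (i + 1) * range); // 25, 50, 75, 100
            partitions.put("partition" + i, context);
        }
        return partitions;
    }
}
```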
After processing, the job in BATCH_JOB_EXECUTION is failed and one of the partitions in BATCH_STEP_EXECUTION is failed, but the same is not reflected in the SCDF dashboard.
In the SCDF dashboard, the task and task execution show completed, while the job execution shows a failed status.
Two questions:
1. How do I make sure SCDF reflects the right batch status in the dashboard?
2. How do I restart the failed job execution?
For the first question, I tried the spring.cloud.task.batch.fail-on-job-failure=true property in application.properties, but I get "Job must not be null nor empty" from TaskJobLauncherApplicationRunnerFactoryBean.java.
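One possible reading of that error (an assumption, not something the question confirms): with fail-on-job-failure=true, Spring Cloud Task's job-launching runner requires at least one Job bean in the application context, and in a remote-partitioning setup the worker side often boots without one because it only executes a step. A sketch of keeping the property on the manager while switching it off for the launched workers, loosely following the Spring Cloud Task partitioning sample (artifact coordinates, step name, and profile are placeholders, and the DeployerPartitionHandler constructor varies across versions):

```
import java.util.Arrays;

import org.springframework.batch.core.explore.JobExplorer;
import org.springframework.cloud.deployer.spi.task.TaskLauncher;
import org.springframework.cloud.task.batch.partition.DeployerPartitionHandler;
import org.springframework.cloud.task.batch.partition.PassThroughCommandLineArgsProvider;
import org.springframework.cloud.task.repository.TaskRepository;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.Resource;
import org.springframework.core.io.ResourceLoader;

@Configuration
public class ManagerPartitionConfig {

    @Bean
    public DeployerPartitionHandler partitionHandler(TaskLauncher taskLauncher,
            JobExplorer jobExplorer, TaskRepository taskRepository,
            ResourceLoader resourceLoader) {
        // Placeholder coordinates for the worker artifact
        Resource workerApp = resourceLoader
                .getResource("maven://com.example:partitioned-job:1.0.0");

        DeployerPartitionHandler handler = new DeployerPartitionHandler(
                taskLauncher, jobExplorer, workerApp, "workerStep", taskRepository);

        // Switch fail-on-job-failure off for the workers, which execute a
        // single step and may have no Job bean to hand to the task runner.
        handler.setCommandLineArgsProvider(new PassThroughCommandLineArgsProvider(
                Arrays.asList("--spring.profiles.active=worker",
                        "--spring.cloud.task.batch.fail-on-job-failure=false")));
        handler.setMaxWorkers(4);
        handler.setApplicationName("partitioned-job-worker");
        return handler;
    }
}
```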
For the second question, I tried relaunching the task using the REST API below:

curl 'http://localhost:9393/jobs/executions/1' -i -X PUT \
  -H 'Accept: application/json' \
  -d 'restart=true'

But it restarted all the partitions.
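For comparison, a restart can also be driven programmatically with Spring Batch's JobOperator; a minimal sketch (the class and method names are illustrative):

```
import org.springframework.batch.core.launch.JobOperator;
import org.springframework.stereotype.Component;

@Component
public class FailedJobRestarter {

    private final JobOperator jobOperator;

    public FailedJobRestarter(JobOperator jobOperator) {
        this.jobOperator = jobOperator;
    }

    public long restartFailedExecution(long failedJobExecutionId) throws Exception {
        // Returns the id of the new JobExecution. On a true restart, steps
        // persisted as COMPLETED are skipped and only FAILED/ABANDONED
        // partitions run again, unless allowStartIfComplete(true) is set.
        return jobOperator.restart(failedJobExecutionId);
    }
}
```

On a true restart Spring Batch should skip partitions whose step executions were persisted as COMPLETED, so if every partition ran again it is worth checking that the successful partitions were actually recorded as COMPLETED and that the worker step does not set allowStartIfComplete(true); neither detail can be confirmed from the question.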

Related

Running batch jobs in Pivotal Cloud Foundry

We have a requirement to migrate mainframe batch jobs to the PCF cloud, but because of the 3 R's of security (Rotate, Repave, Repair), the instance where a batch job is running as Spring Batch may be repaved or repaired and our running jobs terminated. In that scenario, how do we ensure that our jobs are not impacted during the repave/repair of a PCF instance? We are looking for the best way to migrate jobs to the PCF cloud; any help or suggestion would be really helpful.

Orchestration of batch jobs in a microservices architecture - SCDF

I have a microservice which has 5 embedded batch jobs that run every night at 00:00. I want to externalize those batches using Spring Cloud Data Flow. My questions are:
- How can I connect SCDF to the actual microservice for local deployment?
- Is there an alternative way to get a scheduler in SCDF for local deployment?
Spring Cloud Data Flow uses Spring Cloud Skipper to deploy and launch.
This question seems similar to your query. Does spring-cloud-dataflow provide support for scheduling applications defined as tasks?
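If the SCDF scheduler is not available for the local deployer, one workaround (a sketch under assumptions, not an official SCDF feature) is a small component in any always-on Spring application that launches the registered task through the documented SCDF REST endpoint POST /tasks/executions on a cron schedule; the server URL and the task name "nightly-batch" are placeholders:

```
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;
import org.springframework.web.client.RestTemplate;

// Requires @EnableScheduling on one of the application's configuration classes.
@Component
public class NightlyTaskLauncher {

    private final RestTemplate restTemplate = new RestTemplate();

    // Fires every night at 00:00
    @Scheduled(cron = "0 0 0 * * *")
    public void launchNightlyTask() {
        // SCDF's documented task-launch endpoint; returns the new task execution id
        restTemplate.postForObject(
                "http://localhost:9393/tasks/executions?name={name}",
                null, Long.class, "nightly-batch");
    }
}
```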

Batch Job exit status using Spring Cloud Task

I'm trying to set up a Spring Batch project to be deployed on a Spring Cloud Data Flow server, but first I must wrap it in a Spring Cloud Task application.
Spring Batch generates metadata (start/end, status, parameters, etc.) in the BATCH_ tables. Cloud Task does the same, but in the TASK_ tables.
Reading the documentation of Spring Cloud Task, it says that in order to pass the batch information to the task, you must set spring.cloud.task.batch.failOnJobFailure=true, and also:
To have your task return the exit code based on the result of the
batch job execution, you will need to write your own
CommandLineRunner.
So, any indications on how I should write my own CommandLineRunner?
For now, with only the property set, if I force the task to fail I get: Failed to execute CommandLineRunner ... Job UsersJob failed during execution for jobId 3 with jobExecutionId of 6
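For what it's worth, a minimal runner along those lines could look like the sketch below, assuming a single Job bean in the context; the class name and job-parameter handling are illustrative:

```
import org.springframework.batch.core.BatchStatus;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.boot.CommandLineRunner;
import org.springframework.boot.ExitCodeGenerator;
import org.springframework.stereotype.Component;

@Component
public class JobExitCodeRunner implements CommandLineRunner, ExitCodeGenerator {

    private final JobLauncher jobLauncher;
    private final Job job;
    private BatchStatus status = BatchStatus.UNKNOWN;

    public JobExitCodeRunner(JobLauncher jobLauncher, Job job) {
        this.jobLauncher = jobLauncher;
        this.job = job;
    }

    @Override
    public void run(String... args) throws Exception {
        // Unique parameter so each launch creates a new job instance
        JobParameters params = new JobParametersBuilder()
                .addLong("run.id", System.currentTimeMillis())
                .toJobParameters();
        JobExecution execution = jobLauncher.run(job, params);
        this.status = execution.getStatus();
    }

    @Override
    public int getExitCode() {
        // Non-zero exit code tells Spring Cloud Task / SCDF the task failed
        return this.status == BatchStatus.COMPLETED ? 0 : 1;
    }
}
```

For the exit code to actually propagate, the main method also needs something like System.exit(SpringApplication.exit(SpringApplication.run(MyApplication.class, args))), and Boot's automatic job launching (spring.batch.job.enabled) should be disabled so the job is not run twice; both points are assumptions about the surrounding application.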

How to use Spring Cloud Dataflow to get Spring Batch Status

I have been using Spring Batch, and my metadata is in DB2. I have been using the Spring Batch Admin API (jars) to look at the current status of various jobs and to get details about a job, like the number of items read, commit count, etc. Now that Spring Batch Admin has moved to Spring Cloud Data Flow, how do I look at this information? Is there a good API set I could use?
Basically, in Spring Cloud Data Flow, you first need to create a Spring Cloud Task that wraps your Batch application (see the example in the Spring Cloud Task documentation).
With the help of Spring Cloud Task's @EnableTaskLauncher you can get the current status of a job, run the job, stop the job, etc.
You need to send a TaskLauncherRequest for it.
See the APIs of TaskLauncher.
Edit:
To get the Spring Batch status, you first need the task execution id of the Spring Cloud Task. Then use the Set<Long> getJobExecutionIdsByTaskExecutionId(long taskExecutionId) method of TaskExplorer.
See TaskExplorer for all the APIs. With it, use JobExplorer to get the status of the jobs.
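Putting those two explorers together, a minimal sketch (the wiring assumes the TASK_ and BATCH_ tables share a data source, as they do by default, so the explorers can see the link):

```
import java.util.Set;

import org.springframework.batch.core.BatchStatus;
import org.springframework.batch.core.explore.JobExplorer;
import org.springframework.cloud.task.repository.TaskExplorer;
import org.springframework.stereotype.Component;

@Component
public class TaskJobStatusReader {

    private final TaskExplorer taskExplorer;
    private final JobExplorer jobExplorer;

    public TaskJobStatusReader(TaskExplorer taskExplorer, JobExplorer jobExplorer) {
        this.taskExplorer = taskExplorer;
        this.jobExplorer = jobExplorer;
    }

    public void printJobStatuses(long taskExecutionId) {
        // Job execution ids linked to the task via the TASK_TASK_BATCH table
        Set<Long> jobExecutionIds =
                taskExplorer.getJobExecutionIdsByTaskExecutionId(taskExecutionId);
        for (Long id : jobExecutionIds) {
            BatchStatus status = jobExplorer.getJobExecution(id).getStatus();
            System.out.println("Job execution " + id + " status: " + status);
        }
    }
}
```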

Spring batch admin and starting Master/Slave jobs

Is it possible to configure Spring Batch Admin to start master and slave jobs? We have one process as master and 3-4 slave nodes.
Spring Batch Admin is running in a separate JVM process, but all Spring Batch jobs use the same batch DB schema.
Spring Batch Admin only has the ability to launch locally deployed jobs. So while you can launch a job that has a master/slave configuration, the job that owns the master must be deployed locally. You could wire things up to launch remote jobs, but you'd have to do that wiring yourself.
That being said, Spring XD (http://projects.spring.io/spring-xd/) is a distributed runtime that is able to launch jobs that are remotely deployed.