Spring Cloud Data Flow UI - spring-batch

We have a Spring Batch application that is periodically triggered by a Task Command Line Runner. We are looking for a UI to view the job execution status. Can we use the Spring Cloud Data Flow UI dependency to get a UI view of these job executions?

You cannot use the SCDF GUI on its own, outside of SCDF; the two are tightly coupled.
When tasks and batch jobs are launched from SCDF, their executions are automatically tracked in the common datasource, and the SCDF GUI shows the task and batch-job details automatically as well [see task executions / job executions].
Whether you use a scheduler or launch the jobs manually, as long as the launch goes through SCDF, everything should just work.
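For orientation, here is a minimal sketch of a task application whose executions would be tracked this way. It assumes the Spring Cloud Task dependency is on the classpath; the class name is illustrative:

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.task.configuration.EnableTask;

@SpringBootApplication
@EnableTask // records each task execution in the task repository
public class BatchTaskApplication {

    public static void main(String[] args) {
        // Pointing spring.datasource.* at the database the SCDF server uses
        // is what makes these executions visible in the dashboard's
        // "Task executions" / "Job executions" views
        SpringApplication.run(BatchTaskApplication.class, args);
    }
}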

Related

What is the current recommended approach to manage/stop a spring-batch job?

We have some spring-batch jobs that are triggered by Autosys via shell scripts, as short-lived processes.
Right now there is no way to view what is going on in the spring-batch process, so I was exploring ways to view the status and manage (stop) the jobs.
Spring Cloud Data Flow is one of the options I was exploring, but it seems that it may not work when jobs are scheduled with Autosys.
What other options can I explore in this regard, and what is the recommended approach to manage spring-batch jobs now?
To stop a job, you first need the ID of the job execution to stop. This can be done using the JobExplorer API, which lets you explore the meta-data that Spring Batch keeps in the job repository. Once you have the job execution ID, you can stop it by calling the JobOperator#stop method; please refer to the Stopping a Job section of the reference documentation.
This is independent of the method used to launch the job (manually, via a scheduler, or from a graphical tool) and allows you to gracefully stop the job, leaving the repository in a consistent state (ready for a restart if needed).
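A minimal sketch of that flow, assuming the JobExplorer and JobOperator beans are already configured (the JobStopper class and method names are illustrative):

import java.util.Set;

import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.explore.JobExplorer;
import org.springframework.batch.core.launch.JobOperator;

public class JobStopper {

    private final JobExplorer jobExplorer;
    private final JobOperator jobOperator;

    public JobStopper(JobExplorer jobExplorer, JobOperator jobOperator) {
        this.jobExplorer = jobExplorer;
        this.jobOperator = jobOperator;
    }

    // Sends a stop signal to every running execution of the given job
    public void stopRunningExecutions(String jobName) throws Exception {
        // Query the job repository meta-data for executions still running
        Set<JobExecution> running = jobExplorer.findRunningJobExecutions(jobName);
        for (JobExecution execution : running) {
            // stop() only flags the execution; Spring Batch then stops it
            // gracefully at the next chunk boundary, leaving the repository
            // in a consistent, restartable state
            jobOperator.stop(execution.getId());
        }
    }
}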

How to stop a task in Spring Cloud Data Flow?

I have written a batch job using Spring Batch and executed it as a task in the Spring Cloud Data Flow admin. My problem is that when I launch the task, it cannot be canceled or paused; it runs at once and the status becomes COMPLETED.
It could be that you're running a batch job that completes rapidly.
Try a sample that mimics the behavior of a long-running batch job (e.g., a job that runs for 5 minutes or more); with that type of application, once it is launched, you can switch to the "Jobs" tab and select the "Stop the job" option to stop the currently running batch job.
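As a sketch of such a long-running job (this uses the classic JobBuilderFactory/StepBuilderFactory style; the bean and step names are illustrative):

import java.util.concurrent.atomic.AtomicInteger;

import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.repeat.RepeatStatus;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
@EnableBatchProcessing
public class LongRunningJobConfig {

    @Bean
    public Job longRunningJob(JobBuilderFactory jobs, StepBuilderFactory steps) {
        AtomicInteger ticks = new AtomicInteger();
        Step sleepStep = steps.get("sleepStep")
                .tasklet((contribution, chunkContext) -> {
                    // Sleep in short repetitions rather than one long sleep,
                    // so a stop request from the UI is honored between repeats
                    Thread.sleep(1000);
                    return ticks.incrementAndGet() < 300 // ~5 minutes total
                            ? RepeatStatus.CONTINUABLE
                            : RepeatStatus.FINISHED;
                })
                .build();
        return jobs.get("longRunningJob").start(sleepStep).build();
    }
}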

Launching task in Spring Cloud Dataflow with application properties

I have a Spring Cloud Task in SCDF that launches successfully with the task definition:
some-task --some.property=test
I'd like to set the some.property property at task launch instead, though. I thought I could do this by setting the deployment property app.*.some.property=test, but this doesn't work with either the local or Cloud Foundry task launchers/deployers.
The above deployment property works with streams, but not with tasks. Is it supposed to work with tasks, and if not, why?
Yes, we can pass properties while launching a task.
Task applications require the same database connection that the Data Flow server uses to log the steps and executions. I deployed the task below in local SCDF.
task create --definition "timestmp_custm --timestamp.format=\"dd.MM.yyyy\"" --name taskTimestmp2
task launch taskTimestmp2 --arguments "--spring.datasource.url=jdbc:mysql://localhost:3306/mydb --spring.datasource.username=root --spring.datasource.driver-class-name=org.mariadb.jdbc.Driver"
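For reference, a sketch of how the launched task application might bind such a property; this is modeled loosely on the timestamp sample, and the class name and property default are assumptions:

import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

import org.springframework.beans.factory.annotation.Value;
import org.springframework.boot.CommandLineRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.task.configuration.EnableTask;
import org.springframework.context.annotation.Bean;

@SpringBootApplication
@EnableTask
public class TimestampTaskApplication {

    public static void main(String[] args) {
        SpringApplication.run(TimestampTaskApplication.class, args);
    }

    @Bean
    public CommandLineRunner printTimestamp(
            @Value("${timestamp.format:yyyy-MM-dd HH:mm:ss}") String format) {
        // The bound value comes from the task definition
        // (--timestamp.format=...) or from arguments supplied at launch time
        return args -> System.out.println(
                LocalDateTime.now().format(DateTimeFormatter.ofPattern(format)));
    }
}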

How to integrate spring-xd batch jobs with Control-M scheduler

I'm trying to solve integration between Control-M scheduler and batch jobs running within spring-xd.
In our existing environment, Control-M agents run on the host and batch jobs are triggered via bash script from Control-M.
In the spring-xd architecture a batch job is pushed out into the XD container cluster and will run on an available container. This means, however, that I don't know which XD container the job will run on. I could pin it to a single container with a deployment manifest, but that goes against the whole point of the cluster.
One potential solution: run a VM outside the XD container cluster with the Control-M agent on it and trigger jobs through the XD API via a bash script. The script would need to wait for the job to complete, either by polling for job completion via the XD API or by waiting for an event that signals completion. A rough sketch of the polling half is below.
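This sketch assumes the admin server exposes a job-execution status endpoint; the URL, port, and status strings are placeholders to adapt to the REST API of your installation:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class JobCompletionPoller {

    // Hypothetical status endpoint; substitute the real XD REST API path
    private static final String STATUS_URL = "http://xd-admin:9393/jobs/executions/%d";

    public static void main(String[] args) throws Exception {
        long executionId = Long.parseLong(args[0]);
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(String.format(STATUS_URL, executionId)))
                .GET()
                .build();

        // Poll until the execution leaves the running state, then exit with
        // a code the enterprise scheduler (Control-M) can evaluate
        while (true) {
            String body = client.send(request, HttpResponse.BodyHandlers.ofString()).body();
            // (a real script would parse the JSON status field rather than
            // substring-matching the response body)
            if (body.contains("COMPLETED")) {
                System.exit(0); // success
            }
            if (body.contains("FAILED") || body.contains("STOPPED")) {
                System.exit(1); // failure or manual stop
            }
            Thread.sleep(10_000); // wait before the next poll
        }
    }
}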
Thinking further ahead, this could also be a solution for triggering batch jobs deployed in PCF.
In a previous life, I had the enterprise scheduler use Perl scripts to interact with the old Spring Batch Admin REST API to start jobs and poll for completion.
So, yes, the same technique should work fine with XD.
You can also tap into the job events.

Reload a spring batch Job

We need to change an already running job and be able to push the job change without restarting the server.
Is it possible to reload a Spring Batch job after the jobs/application context has been loaded?
The DefaultJobLoader allows you to reload the application context for your jobs.
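A minimal sketch, assuming XML-based job definitions and an available JobRegistry bean; the class name and resource path are placeholders:

import org.springframework.batch.core.configuration.JobRegistry;
import org.springframework.batch.core.configuration.support.DefaultJobLoader;
import org.springframework.batch.core.configuration.support.GenericApplicationContextFactory;
import org.springframework.core.io.ClassPathResource;

public class JobReloader {

    private final DefaultJobLoader jobLoader;

    public JobReloader(JobRegistry jobRegistry) {
        // The loader (re)registers jobs with the registry as they are loaded
        this.jobLoader = new DefaultJobLoader(jobRegistry);
    }

    public void reloadJob() throws Exception {
        // Rebuilds the child application context that holds the job
        // definition and re-registers its jobs, without a server restart
        jobLoader.reload(new GenericApplicationContextFactory(
                new ClassPathResource("jobs/my-job.xml")));
    }
}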
Dynamic job deployment and editing of deployed job configurations (without requiring a server restart) is a feature we implemented in the Trooper Batch profile (built on Spring Batch and Spring Batch Admin). Screenshots are here: https://github.com/regunathb/Trooper/wiki/Writing-Batch-jobs-in-Trooper