How do I start a job when the job name is not known at deployment time? - wildfly

I'm trying to start a batch job that isn't known at deployment time. (Admin users can define their own jobs via a REST API.)
I'm calling:
JobOperator jobOperator = BatchRuntime.getJobOperator();
which returns an instance of org.wildfly.extension.batch.jberet.deployment.JobOperatorService, and that class doesn't allow starting unknown jobs.
Javadoc says:
* Note that for each method the job name, or derived job name, must exist for the deployment. The allowed job names and
* job XML descriptor are determined at deployment time.
How can I start jobs that are not determined at deployment time?
Thanks in advance

You can adopt a naming convention for your batch jobs, so that the job name is in effect known at deployment time and passes the deployment-time validation. For instance, you can package a placeholder job in your application:
<?xml version="1.0" encoding="UTF-8"?>
<job id="submitted-job" xmlns="http://xmlns.jcp.org/xml/ns/javaee"
     xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
     xsi:schemaLocation="http://xmlns.jcp.org/xml/ns/javaee http://xmlns.jcp.org/xml/ns/javaee/jobXML_1_0.xsd"
     version="1.0">
<!-- this job is defined and submitted dynamically by the client -->
</job>
At runtime, the admin can then dynamically fill in the job content.
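As a minimal sketch of that idea: the helper below assembles a JSR-352 job XML document for an admin-submitted job at runtime (SubmittedJobXmlBuilder and its method are hypothetical names, not WildFly API; only the placeholder job id "submitted-job" comes from the example above). The application would persist this XML under the placeholder name and then start it with the standard JobOperator API.

```java
// Sketch: generate JSR-352 job XML for an admin-submitted job at runtime.
// The class and method names are illustrative; only "submitted-job" matches
// the placeholder packaged in the deployment.
public class SubmittedJobXmlBuilder {

    public static String buildJobXml(String jobId, String stepsXml) {
        return "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n"
             + "<job id=\"" + jobId + "\" xmlns=\"http://xmlns.jcp.org/xml/ns/javaee\" version=\"1.0\">\n"
             + stepsXml + "\n"
             + "</job>";
    }

    public static void main(String[] args) {
        String stepsXml =
              "  <step id=\"step1\">\n"
            + "    <batchlet ref=\"myBatchlet\"/>\n"
            + "  </step>";
        System.out.println(buildJobXml("submitted-job", stepsXml));
        // The application would then write this XML into the placeholder job
        // and start it with the standard JSR-352 API:
        // BatchRuntime.getJobOperator().start("submitted-job", new Properties());
    }
}
```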

Related

Spring Batch annotation based Job added as a step to XML based job

We have a Spring Batch project which is XML-based.
We need to create a new job and add it as a nested job to the existing XML-based job.
Is it possible to create the new job annotation-based and add it as a step to the existing XML-based job?
I have created a TaskletStep and tried adding it to the XML-based job as a step, and I am getting:
Cannot convert value of type 'org.springframework.batch.core.step.tasklet.TaskletStep' to required type 'org.springframework.batch.core.step.tasklet.Tasklet' for property 'tasklet': no matching editors or conversion strategy found
A tasklet is not the appropriate type for delegating step processing to a job; you should use a JobStep instead.
The main job can be defined in XML and refer to the "delegate" job (which could be a bean defined in XML or Java config). Here is an example:
<batch:job id="mainJob">
    <batch:step id="step">
        <batch:job ref="subjob"/>
    </batch:step>
</batch:job>
In this example, subjob could be a Spring Batch job defined in XML or Java config.
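For completeness, a minimal sketch of what the referenced subjob could look like in XML (the step id and the tasklet ref are assumptions for illustration, not part of the original question):

```xml
<batch:job id="subjob">
    <batch:step id="subjobStep">
        <batch:tasklet ref="myTasklet"/>
    </batch:step>
</batch:job>
```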

How to add the description of job in spring batch admin user interface?

Is there any way to add the description of job at the user interface of spring batch admin?
Although I tried to add a description to the job, Spring Batch Admin does not seem to display it.
I would like to know whether Spring Batch Admin supports this or not.
I know I'm late to the party, but I figured it out and it works flawlessly for me. All you have to do is:
Add a messages.properties file in your classpath (under src/main/resources).
Add yourJobName.description=Your description goes here in that file.
Override manager-context.xml for SBA by creating a file at src/main/resources/META-INF/spring/batch/servlet/override/manager-context.xml
The content of the newly created file should be:
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="http://www.springframework.org/schema/beans
           http://www.springframework.org/schema/beans/spring-beans-3.0.xsd">
    <!-- Override the messageSource bean in order to provide custom text content -->
    <bean id="messageSource" class="org.springframework.context.support.ResourceBundleMessageSource">
        <property name="basename" value="messages" />
    </bean>
</beans>
That's it. Your custom description shows up in SBA. Hope this helps someone who's looking for it.
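For illustration, assuming a job named importCustomersJob (a hypothetical name; substitute your real job id), the properties file would look like this:

```properties
# src/main/resources/messages.properties
importCustomersJob.description=Imports customer records from the nightly CSV drop
```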
There is no out-of-the-box ability to display the job's description. The description is only contained in the XML, and the data shown in the UI comes from the JobRepository. You'd have to extend the UI to add that functionality.

How to process CASes produced by CAS Multiplier concurrently

I am implementing a UIMA pipeline with a CAS Multiplier and UIMA AS. I have a Segmenter analysis engine (a CAS Multiplier) and an analysis engine (Annotator A). I created an aggregate analysis engine from the Segmenter and Annotator A, and then a UIMA AS deployment descriptor file, with the intention that the Segmenter produces CASes and Annotator A then processes those CASes concurrently. The contents of the aggregate analysis engine descriptor file and the deployment descriptor file are as follows:
AAE descriptor file:
<analysisEngineDescription xmlns="http://uima.apache.org/resourceSpecifier">
<frameworkImplementation>org.apache.uima.java</frameworkImplementation>
<primitive>false</primitive>
<delegateAnalysisEngineSpecifiers>
<delegateAnalysisEngine key="Segmenter">
<import location="../cas_multiplier/SimpleTextSegmenter.xml"/>
</delegateAnalysisEngine>
<delegateAnalysisEngine key="AnnotatorA">
<import location="AnnotatorA.xml"/>
</delegateAnalysisEngine>
</delegateAnalysisEngineSpecifiers>
<analysisEngineMetaData>
<name>Segmenter and AnnotatorA</name>
<description>Splits a document into pieces and runs Annotator on each
piece independently. All segments are output.</description>
<configurationParameters/>
<configurationParameterSettings/>
<flowConstraints>
<fixedFlow>
<node>Segmenter</node>
<node>AnnotatorA</node>
</fixedFlow>
</flowConstraints>
<capabilities>
<capability>
<inputs/>
<outputs>
<type allAnnotatorFeatures="true">com.trang.uima.types.Target</type>
<type allAnnotatorFeatures="true">com.eg.uima.types.IntermediateResult</type>
</outputs>
<languagesSupported>
</languagesSupported>
</capability>
</capabilities>
<operationalProperties>
<modifiesCas>true</modifiesCas>
<multipleDeploymentAllowed>true</multipleDeploymentAllowed>
<outputsNewCASes>true</outputsNewCASes>
</operationalProperties>
</analysisEngineMetaData>
</analysisEngineDescription>
Deployment descriptor file:
<?xml version="1.0" encoding="UTF-8"?><analysisEngineDeploymentDescription xmlns="http://uima.apache.org/resourceSpecifier">
<name>SegmenterAndBackTranstion</name>
<description>Deploys Segmenter and BackTranslation with 3 instances of BackTranslation</description>
<version/>
<vendor/>
<deployment protocol="jms" provider="activemq">
<casPool numberOfCASes="5" initialFsHeapSize="2000000"/>
<service>
<inputQueue endpoint="SegmentAnBackTranslationQueue" brokerURL="tcp://localhost:61616" prefetch="0"/>
<topDescriptor>
<import location="../../descriptors/langrid_uima/SegmenterAndBackTranslationAE.xml"/>
</topDescriptor>
<analysisEngine async="false">
<scaleout numberOfInstances="5"/>
<casMultiplier poolSize="8" initialFsHeapSize="2000000" processParentLast="false"/>
<asyncPrimitiveErrorConfiguration>
<processCasErrors thresholdCount="0" thresholdWindow="0" thresholdAction="terminate"/>
<collectionProcessCompleteErrors timeout="0" additionalErrorAction="terminate"/>
</asyncPrimitiveErrorConfiguration>
</analysisEngine>
</service>
</deployment>
</analysisEngineDeploymentDescription>
With this setup I run the pipeline; however, it seems the CASes are processed synchronously, one at a time.
Could anyone tell me what I am doing wrong? Is there a way to process the CASes produced by the CAS Multiplier concurrently?
Thank you very much!

Spring integration Configuration to wait for outcome of async batch job

I use the Spring Batch Admin project, in which I have a job that processes files from a particular folder asynchronously. Currently I run it via the Batch Admin UI by passing the relevant job parameters.
Now I am trying to automate this process by using a file inbound channel adapter. I have configured a service activator which invokes the batch job whenever it receives a file. I now have a new requirement to invoke another batch job once the first file-upload job is complete. To do this, I created another service activator that uses the output channel of the first service activator. But since the batch job runs asynchronously, the next batch job gets executed immediately. Is there a way for the second batch job to wait until the first batch job completes?
My current configuration is:
<file:inbound-channel-adapter id="filesIn" directory="file:${input.directory}" filename-pattern="*.csv" prevent-duplicates="true">
    <integration:poller id="poller" fixed-delay="10000"/>
</file:inbound-channel-adapter>

<integration:channel id="statusChannel"/>

<integration:service-activator input-channel="filesIn" output-channel="statusChannel"
    ref="handler" method="process"/>

<bean id="handler" class="AnalysisMessageProcessor">
    <property name="job" ref="A-importGlobalSettingsDataJob"/> <!-- 1st job -->
    <property name="requestHandler" ref="jobMessageHandler"/>
</bean>

<bean id="jobMessageHandler" class="org.springframework.batch.integration.launch.JobLaunchingMessageHandler">
    <constructor-arg ref="jobLauncher"/> <!-- Spring Batch Admin's async job launcher -->
</bean>

<integration:service-activator input-channel="statusChannel" ref="jobHandler" method="process"/>

<bean id="jobHandler" class="JobHandler"/> <!-- This job handler should be invoked only after the 1st batch job is completed. Currently I just print the exit status code of the 1st job. -->
Any help would be very much appreciated.
You basically have 2 options:
Actively Poll for the Job Execution Status
Trigger the next Batch Job (Part of a second Spring Integration flow?) in an event-driven approach using listeners
For the first approach, check out "Querying the Repository" (part of the Spring Batch reference documentation):
http://static.springsource.org/spring-batch/reference/html/configureJob.html#queryingRepository
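To illustrate the first option, here is a minimal, framework-free sketch of the polling idea. The Supplier stands in for a repository lookup of the job's status; the class and method names are illustrative, not Spring Batch API.

```java
import java.util.function.Supplier;

public class JobStatusPoller {

    // Polls a status source until it reports a terminal batch status or the
    // attempt budget runs out. In a real setup, statusSource would wrap a
    // JobExplorer lookup of the running JobExecution's status.
    public static String waitForCompletion(Supplier<String> statusSource,
                                           int maxAttempts,
                                           long sleepMillis) throws InterruptedException {
        String status = statusSource.get();
        for (int attempt = 1; attempt < maxAttempts && !isTerminal(status); attempt++) {
            Thread.sleep(sleepMillis);
            status = statusSource.get();
        }
        return status;
    }

    private static boolean isTerminal(String status) {
        // Terminal states mirror Spring Batch's terminal BatchStatus values.
        return "COMPLETED".equals(status) || "FAILED".equals(status) || "STOPPED".equals(status);
    }
}
```

The second service activator could call such a helper before proceeding, though the listener-based approach below avoids busy-waiting entirely.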
The second option would generally be best. Thus, I believe you may want to look into using a Spring Batch JobExecutionListener:
http://static.springsource.org/spring-batch/apidocs/org/springframework/batch/core/JobExecutionListener.html
Please check out the section "Providing Feedback with Informational Messages" in the following document:
https://github.com/ghillert/spring-batch-admin/blob/BATCHADM-160/spring-batch-integration/README.md#providing-feedback-with-informational-messages

Load jobs at startup to Spring Batch Admin

The Spring Batch Admin documentation mentions that jobs will be loaded if the job configuration files are located on the classpath under META-INF/spring/batch/jobs/*.xml.
Documentation
In the spring-batch-admin-sample that comes with STS, the jobs are loaded when the admin web application is deployed, from the file classpath:\META-INF\batch\module-context.xml, and it is bootstrapped at deployment. I'm not sure how that works...
While I can load the job configuration by uploading it in the user interface at http://localhost:8080/simple-batch-admin/configuration, some of my custom beans were not autowired for some reason. So the desirable behavior would be to load all the jobs when Admin is deployed.
Thank you in advance.
After several rounds of digging, I was able to load the job file. I had to place my job file in the /META-INF/spring/batch/jobs/ folder, not /META-INF/batch/. Also, in order for my jobLauncher, jobRepository, dataSource, etc. to be discovered at load time, I had to put them under src/main/resources/META-INF/spring/batch/bootstrap/**/.
This is all because of two files in spring-batch-admin-resources-1.2.0.RELEASE.jar, under org.springframework.batch.admin.web.resources:
servlet-config.xml
<import resource="classpath*:/META-INF/spring/batch/servlet/resources/*.xml" />
<import resource="classpath*:/META-INF/spring/batch/servlet/manager/*.xml" />
<import resource="classpath*:/META-INF/spring/batch/servlet/override/*.xml" />
which lets me add menus and controllers under src/main/resources/META-INF/spring/batch/servlet/override/*.xml,
and
webapp-config.xml
<import resource="classpath*:/META-INF/spring/batch/bootstrap/**/*.xml" />
<import resource="classpath*:/META-INF/spring/batch/override/**/*.xml" />
which is where I put my launch context.
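As a sketch, a job file picked up from /META-INF/spring/batch/jobs/ might look like this (the job, step, and tasklet ids are assumptions; the referenced tasklet bean would live in the launch context):

```xml
<!-- src/main/resources/META-INF/spring/batch/jobs/sample-job.xml -->
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:batch="http://www.springframework.org/schema/batch"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="http://www.springframework.org/schema/beans
           http://www.springframework.org/schema/beans/spring-beans-3.0.xsd
           http://www.springframework.org/schema/batch
           http://www.springframework.org/schema/batch/spring-batch-2.1.xsd">
    <batch:job id="sampleJob">
        <batch:step id="sampleStep">
            <batch:tasklet ref="sampleTasklet"/>
        </batch:step>
    </batch:job>
</beans>
```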