I use Quartz to schedule a custom job that runs daily at a specific time.
However, the machine running Mule may be down at that scheduled time, in which case the custom job does not run that day. So I wonder whether I can use JMX to invoke the Quartz custom job's execute() method manually.
The easiest approach is to use a <composite-source> so that your flow can be triggered both by Quartz and by HTTP. That way you can trigger it manually with a simple curl invocation.
Something like:
<flow name="dualTriggerFlow">
    <composite-source>
        <quartz:inbound-endpoint ...>
            ...
        </quartz:inbound-endpoint>
        <http:inbound-endpoint exchange-pattern="request-response"
            host="localhost" port="8081" path="/jobs/myjob/trigger" />
    </composite-source>
    ...
</flow>
Of course it depends on what type of Quartz job you're executing. I'm assuming an event-generator-job.
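With the HTTP endpoint configured as above, the manual trigger could then be as simple as (host, port, and path taken from the example config):
curl http://localhost:8081/jobs/myjob/trigger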
We have a Spring Batch project which is XML-based.
We need to create a new job and add it as a nested job to the existing XML-based job.
Is it possible to create the new job annotation-based and add it as a step to the existing XML-based job?
I have created a Tasklet step and tried adding it to the XML-based job as a step, and am getting:
Cannot convert value of type 'org.springframework.batch.core.step.tasklet.TaskletStep' to required type 'org.springframework.batch.core.step.tasklet.Tasklet' for property 'tasklet': no matching editors or conversion strategy found
A tasklet is not the appropriate type for delegating a step's processing to a job; you should use a JobStep instead.
The main job can be defined in XML and refer to the "delegate" job (which can be a bean defined in XML or Java config). Here is an example:
<batch:job id="mainJob">
    <batch:step id="step">
        <batch:job ref="subjob"/>
    </batch:step>
</batch:job>
In this example, subjob could be a Spring Batch job defined in XML or Java config.
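If you go the Java-config route for the delegating step itself, a minimal sketch could look like the following (the bean names and wiring are assumptions based on the XML above, not taken from the question):
import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.core.step.job.JobStep;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class SubJobStepConfig {

    // Wraps the delegate job in a Step so the main job can run it as one of its steps.
    @Bean
    public Step subJobStep(Job subjob, JobLauncher jobLauncher, JobRepository jobRepository) {
        JobStep step = new JobStep();
        step.setName("step");
        step.setJob(subjob);
        step.setJobLauncher(jobLauncher);
        step.setJobRepository(jobRepository);
        return step;
    }
}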
I'm trying to start a batch job which isn't known at deployment time. (Admin users can define their own jobs via a REST API.)
I'm calling:
JobOperator jobOperator = BatchRuntime.getJobOperator();
This returns an instance of org.wildfly.extension.batch.jberet.deployment.JobOperatorService,
which doesn't allow starting unknown jobs.
Its Javadoc says:
* Note that for each method the job name, or derived job name, must exist for the deployment. The allowed job names and
* job XML descriptor are determined at deployment time.
How can I start jobs that are not determined at deployment time?
Thanks in advance
You can adopt a convention in your batch job naming, so that the job name is known at deployment time and the deployment-time validation is satisfied. For instance, you can package a placeholder job in your application:
<?xml version="1.0" encoding="UTF-8"?>
<job id="submitted-job" xmlns="http://xmlns.jcp.org/xml/ns/javaee"
     xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
     xsi:schemaLocation="http://xmlns.jcp.org/xml/ns/javaee http://xmlns.jcp.org/xml/ns/javaee/jobXML_1_0.xsd"
     version="1.0">
    <!-- this job is defined and submitted dynamically by the client -->
</job>
At runtime, the admin can then dynamically fill in the job content.
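The placeholder can then be launched through the standard JSR-352 JobOperator; a minimal sketch, assuming a hypothetical job parameter that points at the dynamically submitted definition:
import java.util.Properties;
import javax.batch.operations.JobOperator;
import javax.batch.runtime.BatchRuntime;

public class SubmittedJobStarter {

    // Starts the packaged placeholder job; the deployment-time name check passes
    // because "submitted-job" is a known job XML name in the deployment.
    public long startSubmittedJob(String dynamicJobId) {
        JobOperator jobOperator = BatchRuntime.getJobOperator();
        Properties params = new Properties();
        params.setProperty("dynamicJobId", dynamicJobId); // hypothetical parameter
        return jobOperator.start("submitted-job", params);
    }
}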
I would like to put a hook somewhere in the following code/config to be able to spot a JobInstanceAlreadyCompleteException and then email the production support team that this occurred.
I have tried a JobExecutionListener#beforeJob() method in Spring Batch, but the JobInstanceAlreadyCompleteException is thrown before the job executes.
I am using this Spring Batch Integration configuration from the documentation:
<int:channel id="inboundFileChannel"/>
<int:channel id="outboundJobRequestChannel"/>
<int:channel id="jobLaunchReplyChannel"/>
<int-file:inbound-channel-adapter id="filePoller"
        channel="inboundFileChannel"
        directory="file:/tmp/myfiles/"
        filename-pattern="*.csv">
    <int:poller fixed-rate="1000"/>
</int-file:inbound-channel-adapter>
<int:transformer input-channel="inboundFileChannel"
        output-channel="outboundJobRequestChannel">
    <bean class="io.spring.sbi.FileMessageToJobRequest">
        <property name="job" ref="personJob"/>
        <property name="fileParameterName" value="input.file.name"/>
    </bean>
</int:transformer>
I want to handle JobInstanceAlreadyCompleteException for the case where the same CSV file name appears as a job parameter. Should I extend org.springframework.integration.handler.LoggingHandler?
I notice that class is the one reporting the error:
ERROR org.springframework.integration.handler.LoggingHandler - org.springframework.messaging.MessageHandlingException: org.springframework.batch.core.repository.JobInstanceAlreadyCompleteException: A job instance already exists and is complete for parameters={input.file.name=C:\Users\csv\file2015.csv}. If you want to run this job again, change the parameters.
The ERROR from org.springframework.integration.handler.LoggingHandler comes from the default errorChannel, which is reached from the <poller> on your <int-file:inbound-channel-adapter>.
So, to handle the error manually, you just need to specify your own error-channel there and go ahead with the email sending:
<int-file:inbound-channel-adapter>
    <int:poller fixed-rate="1000" error-channel="sendErrorToEmailChannel"/>
</int-file:inbound-channel-adapter>
<int-mail:outbound-channel-adapter id="sendErrorToEmailChannel"/>
Of course, you will have to do some ErrorMessage transformation before sending it over e-mail, but those are details of your target business logic implementation.
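As a rough sketch of such a transformation (class and method names here are hypothetical; in older Spring Integration versions, ErrorMessage lives in org.springframework.integration.message instead of org.springframework.messaging.support):
import org.springframework.messaging.support.ErrorMessage;

public class ErrorMailBodyTransformer {

    // The ErrorMessage payload is the MessageHandlingException thrown from the poller's flow;
    // its cause is the JobInstanceAlreadyCompleteException in this scenario.
    public String toMailBody(ErrorMessage errorMessage) {
        Throwable failure = errorMessage.getPayload();
        Throwable cause = failure.getCause() != null ? failure.getCause() : failure;
        return "Batch job launch failed: " + cause.getMessage();
    }
}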
I use the Spring Batch Admin project, in which I have a job that processes files from a particular folder asynchronously. Currently I run it via the Batch Admin UI by passing the relevant job parameters.
Now I am trying to automate this process using a file inbound channel adapter. I have configured a service activator that invokes the batch job whenever it receives a file. I now have a new requirement to invoke another batch job once the first file-upload job is complete. To do this, I created another service activator that uses the output channel of the first service activator. But since the batch job runs asynchronously, the next batch job is executed immediately. Is there a way for the second batch job to wait until the first batch job completes?
My current configuration is:
<file:inbound-channel-adapter id="filesIn" directory="file:${input.directory}"
        filename-pattern="*.csv" prevent-duplicates="true">
    <integration:poller id="poller" fixed-delay="10000"/>
</file:inbound-channel-adapter>
<integration:channel id="statusChannel"/>
<integration:service-activator input-channel="filesIn" output-channel="statusChannel"
        ref="handler" method="process"/>
<bean id="handler" class="AnalysisMessageProcessor">
    <property name="job" ref="A-importGlobalSettingsDataJob"/> <!-- 1st job -->
    <property name="requestHandler" ref="jobMessageHandler"/>
</bean>
<bean id="jobMessageHandler" class="org.springframework.batch.integration.launch.JobLaunchingMessageHandler">
    <constructor-arg ref="jobLauncher"/> <!-- Spring Batch Admin's async job launcher -->
</bean>
<integration:service-activator input-channel="statusChannel" ref="jobHandler" method="process"/>
<bean id="jobHandler" class="JobHandler"/> <!-- This job handler should get invoked only after the 1st batch job is completed. Currently I am just printing the exit status code of the 1st job. -->
Any help would be very much appreciated.
You basically have 2 options:
Actively Poll for the Job Execution Status
Trigger the next batch job (part of a second Spring Integration flow?) in an event-driven approach using listeners
For the first approach, check out "Querying the Repository" (part of the Spring Batch reference documentation):
http://static.springsource.org/spring-batch/reference/html/configureJob.html#queryingRepository
The second option would generally be best. Thus, I believe you may want to look into using a Spring Batch JobExecutionListener (a sketch follows below):
http://static.springsource.org/spring-batch/apidocs/org/springframework/batch/core/JobExecutionListener.html
Please check out the section "Providing Feedback with Informational Messages" in the following document:
https://github.com/ghillert/spring-batch-admin/blob/BATCHADM-160/spring-batch-integration/README.md#providing-feedback-with-informational-messages
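As a rough sketch of the listener approach (all class names and the channel wiring are assumptions; with older Spring Integration versions, use the org.springframework.integration equivalents of the messaging imports), a JobExecutionListener on the first job could publish to statusChannel only once that job has actually finished, so the second service activator fires at the right time:
import org.springframework.batch.core.BatchStatus;
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.JobExecutionListener;
import org.springframework.messaging.MessageChannel;
import org.springframework.messaging.support.MessageBuilder;

public class JobCompletionNotificationListener implements JobExecutionListener {

    private final MessageChannel statusChannel; // the channel feeding the second service activator

    public JobCompletionNotificationListener(MessageChannel statusChannel) {
        this.statusChannel = statusChannel;
    }

    @Override
    public void beforeJob(JobExecution jobExecution) {
        // nothing to do before the job starts
    }

    @Override
    public void afterJob(JobExecution jobExecution) {
        // Only signal downstream once the first job has really completed.
        if (jobExecution.getStatus() == BatchStatus.COMPLETED) {
            statusChannel.send(MessageBuilder
                    .withPayload(jobExecution.getExitStatus().getExitCode())
                    .build());
        }
    }
}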
I am using the Msmqdistributor service of Enterprise Library 3.1 to distribute logs from various applications. I have defined multiple listeners in categorySources/specialSources, but if one listener fails, the subsequent listeners are never executed.
Following is my config:
<specialSources>
    <allEvents switchValue="Warning" name="All Events">
        <listeners>
            <add name="Database Listener A" />
            <add name="Custom Trace Listener A" />
            <add name="Custom Trace Listener B" />
        </listeners>
    </allEvents>
    <notProcessed switchValue="Warning" name="Unprocessed Category" />
    <errors switchValue="Warning" name="Logging Errors &amp; Warnings"/>
</specialSources>
If I specify a wrong connection string for Database Listener A, it fails to insert logs into the database. But it also stops the subsequent Custom Trace Listener A and Custom Trace Listener B, so they are never executed when Database Listener A fails.
Can anybody help, please?
Thanks,
Mitesh Patel
This behavior seems to be "by design". See the answer to Microsoft Enterprise Library 4.1 Logging Fails on Windows XP SP3.
Of course, this doesn't really help you. One workaround would be to attach one listener per category. So for your 3 listeners you could add 3 categories. This works but is not particularly elegant.
Since you indicate that you are using 2 custom trace listeners, another approach to mitigate the issue would be to code the custom trace listeners to swallow any exceptions (not usually a good idea, though). You could also order your trace listeners from least likely to fail to most likely to fail (e.g. a database listener is more likely to fail than an event log listener, which is more likely to fail than a flat file listener). In your scenario, place the custom trace listeners before the database trace listener.
It is also good practice to add a trace listener to the errors special source. I'm not sure whether you have that in your app or not (the posted config defines the errors category but attaches no listener to it).