Quartz scheduler not running in war - GWT

I am using the Quartz scheduler in a GWT web application. My application is structured as follows.
I have two projects. One is the GWT client project (for the UI) and the other is a plain Java project for the server side (database interaction). The client project references the server project. At build time I create a war from the client project and add the server project's jar to the war folder.
I use the scheduler on the server side to auto-complete certain tasks. When I run the application locally (without the war) the scheduler works properly.
But when the war is deployed on the JBoss server, the scheduler does not run.
All of my scheduler-related code and quartz.jar live on the server side; there is no reference to Quartz in the client project. Could that be the problem?
Here is my scheduler code:
public class QuartzJob implements Job {
    public void execute(JobExecutionContext jobExecutionContext)
            throws JobExecutionException {
        JobDataMap map = jobExecutionContext.getJobDetail().getJobDataMap();
        ActivityTransactionSettingsMap map2 = (ActivityTransactionSettingsMap) map.get("task");
        if (map2.getAutoCompleteDate() != null) {
            WorkFlowFacade facade = new WorkFlowFacade();
            facade.completeAutoCompleteTask(map2);
            Scheduler scheduler = (Scheduler) map.get("scheduler");
            try {
                scheduler.shutdown();
            } catch (SchedulerException e) {
                e.printStackTrace();
            }
        }
    }
}
Scheduler scheduler = StdSchedulerFactory.getDefaultScheduler();
scheduler.start();

JobDataMap map2 = new JobDataMap();
map2.put("task", actsMap);
map2.put("scheduler", scheduler);

JobDetail job = newJob(QuartzJob.class)
        .withIdentity("job" + String.valueOf(actsMap.getId()))
        .usingJobData(map2)
        .build();

Trigger trigger = newTrigger()
        .withIdentity("trigger" + String.valueOf(actsMap.getId()))
        .startAt(actsMap.getAutoCompleteDate())
        .build();

scheduler.scheduleJob(job, trigger);
Or do I need to move the scheduler-related code into the client-side project?
I can't figure out how to solve this. Please help me out.

For the scheduler to run, something has to kick-start it. I am not sure how your startup process works, but you could start the scheduler from a servlet:
public class MySchedulerServlet extends GenericServlet {

    private static final long serialVersionUID = 1477091380142883153L;

    /**
     * Constant to represent property for the cron expression.
     */
    private static final String CRON_EXPRESSION = "0 0 0 ? * SUN";

    public void init(ServletConfig servletConfig) throws ServletException {
        super.init(servletConfig);
        // The Quartz scheduler
        Scheduler scheduler = null;
        try {
            // Initiate a scheduler factory
            SchedulerFactory schedulerFactory = new StdSchedulerFactory();
            // Retrieve a scheduler from the factory
            scheduler = schedulerFactory.getScheduler();
            // Initiate JobDetail with job name, job group and executable job class
            JobDetail jobDetail = new JobDetail("RetryJob", "RetryGroup", QuartzJob.class);
            // Initiate CronTrigger with its name and group name
            CronTrigger cronTrigger = new CronTrigger("cronTrigger", "triggerGroup");
            // Set up the cron expression
            CronExpression cexp = new CronExpression(CRON_EXPRESSION);
            // Assign the CronExpression to the CronTrigger
            cronTrigger.setCronExpression(cexp);
            // Schedule the job with the JobDetail and trigger
            scheduler.scheduleJob(jobDetail, cronTrigger);
            // Start the scheduler
            scheduler.start();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    public void service(ServletRequest servletRequest,
            ServletResponse servletResponse) throws ServletException, IOException {
    }
}
And in your web.xml, load the scheduler on startup. This works for me:
<servlet>
    <servlet-name>QuartzInitializer</servlet-name>
    <servlet-class>org.quartz.ee.servlet.QuartzInitializerServlet</servlet-class>
    <init-param>
        <param-name>shutdown-on-unload</param-name>
        <param-value>true</param-value>
    </init-param>
    <init-param>
        <param-name>start-scheduler-on-load</param-name>
        <param-value>true</param-value>
    </init-param>
    <load-on-startup>1</load-on-startup>
</servlet>
<servlet>
    <servlet-name>MySchedulerServlet</servlet-name>
    <servlet-class>com.servlet.MySchedulerServlet</servlet-class>
    <load-on-startup>2</load-on-startup>
</servlet>
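If you would rather not use a servlet at all, a ServletContextListener registered with a <listener> element in web.xml can do the same job. This is only a minimal sketch (the class name and shutdown behaviour are illustrative, not part of the original answer):
import javax.servlet.ServletContextEvent;
import javax.servlet.ServletContextListener;

import org.quartz.Scheduler;
import org.quartz.SchedulerException;
import org.quartz.impl.StdSchedulerFactory;

public class QuartzStartupListener implements ServletContextListener {

    private Scheduler scheduler;

    @Override
    public void contextInitialized(ServletContextEvent sce) {
        try {
            // start the default scheduler when the war is deployed
            scheduler = StdSchedulerFactory.getDefaultScheduler();
            scheduler.start();
        } catch (SchedulerException e) {
            throw new IllegalStateException("Could not start Quartz", e);
        }
    }

    @Override
    public void contextDestroyed(ServletContextEvent sce) {
        try {
            if (scheduler != null) {
                // wait for running jobs to finish before the war is undeployed
                scheduler.shutdown(true);
            }
        } catch (SchedulerException e) {
            e.printStackTrace();
        }
    }
}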

I used a thread in this case.
public class AutoCompleteTaskThread extends Thread {

    private ActivityTransactionSettingsMap taskMap;

    public AutoCompleteTaskThread(ActivityTransactionSettingsMap map) {
        this.taskMap = map;
    }

    @Override
    public void run() {
        try {
            new AutoCompleteTaskScheduler().ScheduleJob(taskMap);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
I use it as shown below. On my local machine this works fine, but on the JBoss server it does not.

I create an instance of the AutoCompleteTaskThread class and call its start method:
private void addAutoCompleteTask(ActivityTransactionSettingsMap newTask) {
    AutoCompleteTaskThread thread = new AutoCompleteTaskThread(newTask);
    thread.start();
}
So I start a new thread for every new task that needs to be scheduled.
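The AutoCompleteTaskScheduler class itself is not shown above; a minimal sketch of what its ScheduleJob method could look like, reusing the Quartz 2.x scheduling snippet from the question (the class and method names come from the thread code, everything else is an assumption):
import static org.quartz.JobBuilder.newJob;
import static org.quartz.TriggerBuilder.newTrigger;

import org.quartz.JobDataMap;
import org.quartz.JobDetail;
import org.quartz.Scheduler;
import org.quartz.SchedulerException;
import org.quartz.Trigger;
import org.quartz.impl.StdSchedulerFactory;

public class AutoCompleteTaskScheduler {

    public void ScheduleJob(ActivityTransactionSettingsMap actsMap) throws SchedulerException {
        Scheduler scheduler = StdSchedulerFactory.getDefaultScheduler();
        scheduler.start();

        JobDataMap map = new JobDataMap();
        map.put("task", actsMap);
        map.put("scheduler", scheduler);

        JobDetail job = newJob(QuartzJob.class)
                .withIdentity("job" + actsMap.getId())
                .usingJobData(map)
                .build();

        // fire once at the task's auto-complete date
        Trigger trigger = newTrigger()
                .withIdentity("trigger" + actsMap.getId())
                .startAt(actsMap.getAutoCompleteDate())
                .build();

        scheduler.scheduleJob(job, trigger);
    }
}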

Related

Reset scheduled job after completion

I have a scheduled job implemented with Spring Batch. Right now, when it finishes it doesn't start again because it is detected as completed. Is it possible to reset its state after completion?
@Component
class JobScheduler {

    @Autowired
    private Job job1;

    @Autowired
    private JobLauncher jobLauncher;

    @Scheduled(cron = "0 0/15 * * * ?")
    public void launchJob1() throws Exception {
        this.jobLauncher.run(this.job1, new JobParameters());
    }
}
@Configuration
public class Job1Configuration {

    @Autowired
    private JobBuilderFactory jobBuilderFactory;

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    @Bean
    public Job job1() {
        return this.jobBuilderFactory.get("job1")
                .start(this.step1()).on(STEP1_STATUS.NOT_READY.get()).end()
                .from(this.step1()).on(STEP1_STATUS.READY.get()).to(this.step2())
                .next(this.step3())
                .end()
                .build();
    }
}
I know I can set a job parameter with the time or the id, but that would launch a new execution every 15 minutes. I want to repeat the same execution until it completes without errors, and only then execute a new one.
You can't restart your job because you're setting the job status to COMPLETE by calling end() in .start(this.step1()).on(STEP1_STATUS.NOT_READY.get()).end().
You should instead either fail the job by calling .start(this.step1()).on(STEP1_STATUS.NOT_READY.get()).fail()
or stop the job by calling .start(this.step1()).on(STEP1_STATUS.NOT_READY.get()).stopAndRestart(step1())
Those options will mean the job status is either FAILED or STOPPED instead of COMPLETE, which means that if you launch the job with the same JobParameters, it will restart the previous job execution.
See https://docs.spring.io/spring-batch/docs/current/reference/html/step.html#configuringForStop
To launch the job in a way that handles restarting previous instances or starting a new instance, you could look at how the SimpleJobService in spring-batch-admin does it and modify the launch method slightly for your purposes. This requires you to specify an incremental job parameter that is used to launch new instances of your job.
https://github.com/spring-attic/spring-batch-admin/blob/master/spring-batch-admin-manager/src/main/java/org/springframework/batch/admin/service/SimpleJobService.java#L250
@Override
public JobExecution launch(String jobName, JobParameters jobParameters) throws NoSuchJobException,
        JobExecutionAlreadyRunningException, JobRestartException, JobInstanceAlreadyCompleteException,
        JobParametersInvalidException {
    JobExecution jobExecution = null;
    if (jobLocator.getJobNames().contains(jobName)) {
        Job job = jobLocator.getJob(jobName);
        JobExecution lastJobExecution = jobRepository.getLastJobExecution(jobName, jobParameters);
        boolean restart = false;
        if (lastJobExecution != null) {
            BatchStatus status = lastJobExecution.getStatus();
            if (status.isUnsuccessful() && status != BatchStatus.ABANDONED) {
                restart = true;
            }
        }
        if (job.getJobParametersIncrementer() != null && !restart) {
            jobParameters = job.getJobParametersIncrementer().getNext(jobParameters);
        }
        jobExecution = jobLauncher.run(job, jobParameters);
        if (jobExecution.isRunning()) {
            activeExecutions.add(jobExecution);
        }
    } else {
        if (jsrJobOperator != null) {
            // jobExecution = this.jobExecutionDao
            //     .getJobExecution(jsrJobOperator.start(jobName, jobParameters.toProperties()));
            jobExecution = new JobExecution(jsrJobOperator.start(jobName, jobParameters.toProperties()));
        } else {
            throw new NoSuchJobException(String.format("Unable to find job %s to launch",
                    String.valueOf(jobName)));
        }
    }
    return jobExecution;
}
I think the difficulty here comes from mixing scheduling with restartability. I would make each schedule execute a distinct job instance (for example by adding the run time as an identifying job parameter).
Now if a given schedule fails, it could be restarted separately until completion without affecting subsequent schedules. This can be done manually or programmatically in another scheduled method.
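A minimal sketch of that idea, mirroring the names from the question (the runTime parameter key is just an example): each scheduled launch passes the launch time as an identifying job parameter, so every schedule gets its own JobInstance that can be restarted on its own.
import java.util.Date;

import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

@Component
class TimestampedJobScheduler {

    @Autowired
    private Job job1;

    @Autowired
    private JobLauncher jobLauncher;

    @Scheduled(cron = "0 0/15 * * * ?")
    public void launchJob1() throws Exception {
        // the launch time is an identifying parameter, so each schedule
        // creates a fresh JobInstance that can be restarted independently
        JobParameters params = new JobParametersBuilder()
                .addDate("runTime", new Date())
                .toJobParameters();
        this.jobLauncher.run(this.job1, params);
    }
}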
This is the solution I came up with after all the comments:
@Component
class JobScheduler extends JobSchedulerLauncher {

    @Autowired
    private Job job1;

    @Scheduled(cron = "0 0/15 * * * ?")
    public void launchJob1() throws Exception {
        this.launch(this.job1);
    }
}

public abstract class JobSchedulerLauncher {

    @Autowired
    private JobOperator jobOperator;

    @Autowired
    private JobExplorer jobExplorer;

    public void launch(Job job) throws JobExecutionAlreadyRunningException, JobRestartException,
            JobInstanceAlreadyCompleteException, JobParametersInvalidException, NoSuchJobException,
            NoSuchJobExecutionException, JobExecutionNotRunningException, JobParametersNotFoundException,
            UnexpectedJobExecutionException {
        // Get the last instance
        final List<JobInstance> jobInstances = this.jobExplorer.findJobInstancesByJobName(job.getName(), 0, 1);
        if (CollectionUtils.isNotEmpty(jobInstances)) {
            // Get the last executions
            final List<JobExecution> jobExecutions = this.jobExplorer.getJobExecutions(jobInstances.get(0));
            if (CollectionUtils.isNotEmpty(jobExecutions)) {
                final JobExecution lastJobExecution = jobExecutions.get(0);
                if (lastJobExecution.isRunning()) {
                    this.jobOperator.stop(lastJobExecution.getId().longValue());
                    this.jobOperator.abandon(lastJobExecution.getId().longValue());
                } else if (lastJobExecution.getExitStatus().equals(ExitStatus.FAILED)
                        || lastJobExecution.getExitStatus().equals(ExitStatus.STOPPED)) {
                    this.jobOperator.restart(lastJobExecution.getId().longValue());
                    return;
                }
            }
        }
        this.jobOperator.startNextInstance(job.getName());
    }
}
My job now uses an incrementer, based on this one https://docs.spring.io/spring-batch/docs/current/reference/html/job.html#JobParametersIncrementer:
@Bean
public Job job1() {
    return this.jobBuilderFactory.get("job1")
            .incrementer(new CustomJobParameterIncrementor())
            .start(this.step1()).on(STEP1_STATUS.NOT_READY.get()).end()
            .from(this.step1()).on(STEP1_STATUS.READY.get()).to(this.step2())
            .next(this.step3())
            .end()
            .build();
}
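The CustomJobParameterIncrementor class is not shown; a minimal sketch of such an incrementer, modelled on Spring Batch's RunIdIncrementer (the run.id parameter name is an assumption), might look like this:
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.JobParametersIncrementer;

public class CustomJobParameterIncrementor implements JobParametersIncrementer {

    private static final String RUN_ID = "run.id";

    @Override
    public JobParameters getNext(JobParameters parameters) {
        long nextId = 1;
        if (parameters != null && parameters.getParameters().containsKey(RUN_ID)) {
            // bump the previous run id so a brand-new JobInstance is created
            nextId = parameters.getLong(RUN_ID) + 1;
        }
        return new JobParametersBuilder()
                .addLong(RUN_ID, nextId)
                .toJobParameters();
    }
}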
In my case the scheduler won't start two instances of the same job at the same time, so if I detect a running job in this code it means the server was restarted and left the job with status STARTED; that's why I stop it and abandon it.

Spring Cloud Data Flow Task Persist Arguments Between Executions

I'm experimenting with SCDF and successfully running Spring Batch jobs as tasks, but I'm having an issue with task arguments persisting. It seems that each time I execute the task I have to provide the command line arguments again.
In my use case I need the command line arguments to be set once and for all for a task.
Thank you
This is by design. The task application's arguments have to be passed every time the task application is launched, as command line arguments aren't meant to be propagated between subsequent task launches.
Only the task deployment properties you pass as Parameters are designed to be persisted and re-used across subsequent task launches. These deployment properties include the task application properties (the ones passed with the app. prefix) along with the platform-specific deployer properties (the ones with the deployer. prefix).
Given these design aspects, I agree there could be use cases (like yours) for passing the same arguments between task launches, so I suggest you create a story with your specific case here and we'll revisit the design to scope this out.
After some research I ended up using Parameters instead of Arguments.
First I created a Spring Batch application with multiple CommandLineRunners (two in my case): one for production, which uses application properties that are overridden by SCDF parameters, and one for the other environments (DEV, ...), which is launched with plain command line arguments or through the API.
First CommandLineRunner:
@Component
@Slf4j
@Profile("prod")
public class ProdJobCommandLineRunner implements CommandLineRunner {

    @Value("${jobname}")
    private String jobName;

    @Value("${argument1}")
    private String argument1;

    @Value("${argument2}")
    private String argument2;

    @Autowired
    private ApplicationContext context;

    @Autowired
    private JobLauncher jobLauncher;

    @Override
    public void run(String... args) {
        log.info("Begin Launching Job with Args {}", Arrays.asList(args));
        log.error("JOB NAME: " + jobName);
        if (!CollectionUtils.isEmpty(Arrays.asList(args))) {
            JobParametersBuilder jobParametersBuilder = new JobParametersBuilder();
            jobParametersBuilder.addString("argument1", argument1);
            jobParametersBuilder.addString("argument2", argument2);
            try {
                Job job = (Job) context.getBean(jobName);
                jobLauncher.run(job, jobParametersBuilder.toJobParameters());
            } catch (JobExecutionAlreadyRunningException | JobRestartException
                    | JobInstanceAlreadyCompleteException | JobParametersInvalidException e) {
                log.error("Exception ", e);
            }
        }
        log.info("End Launching Job with Args {}", Arrays.asList(args));
    }
}
Second CommandLineRunner:
@Component
@Slf4j
@Profile("!prod")
public class DefaultJobCommandLineRunner implements CommandLineRunner {

    @Autowired
    private ApplicationContext context;

    @Autowired
    private JobLauncher jobLauncher;

    @Override
    public void run(String... args) {
        log.info("Begin Launching Job with Args {}", Arrays.asList(args));
        if (!CollectionUtils.isEmpty(Arrays.asList(args))) {
            Map<String, String> params = parseJobArgs(args);
            JobParametersBuilder jobParametersBuilder = new JobParametersBuilder();
            if (Boolean.parseBoolean(params.getOrDefault("force_restart", "false"))) {
                jobParametersBuilder.addString("force_restart", LocalDateTime.now().toString());
            }
            try {
                String jobName = params.get("job_name");
                log.info("JOB NAME: " + jobName);
                Job job = (Job) context.getBean(jobName);
                jobLauncher.run(job, jobParametersBuilder.toJobParameters());
            } catch (JobExecutionAlreadyRunningException | JobRestartException
                    | JobInstanceAlreadyCompleteException | JobParametersInvalidException e) {
                log.error("Exception ", e);
            }
        }
        log.info("End Launching Job with Args {}", Arrays.asList(args));
    }

    private Map<String, String> parseJobArgs(String[] args) {
        Map<String, String> params = new HashMap<>();
        Arrays.asList(args).forEach(arg -> {
            String key = StringUtils.trimAllWhitespace(arg.split("=")[0]);
            String value = StringUtils.trimAllWhitespace(arg.split("=")[1]);
            params.put(key, value);
        });
        return params;
    }
}
Import the app in SCDF, say for example TESTAPP.
Create multiple tasks, depending on how many use cases you have, using the same imported app.
For each task, when launching it for the first time, set the Parameters following the naming convention:
app.<APP_NAME>.<property key>=<property value>
In this case, for example, it would be: app.TESTAPP.jobname=JOB_NAME
I hope this helps.

How to get quartz scheduler end event

I have the following code, where I start a Quartz scheduler:
internal static IScheduler MyQuartzScheduler = null;
private static async void StartProcessing()
{
try
{
Logger.Info("Starting Quartz");
StdSchedulerFactory factory = new StdSchedulerFactory();
MyQuartzScheduler = await factory.GetScheduler();
await MyQuartzScheduler.Start();
QuartzScedulerMessage = String.Format("Quart Scheduler Started on {0}", DateTime.Now);
}
catch (Exception ex)
{
QuartzScedulerMessage = ex.Message;
Logger.Fatal(ex.Message, ex);
}
}
My scheduler starts and works fine. I was just curious: if the scheduler crashes for any reason (for example, I have an ADO job store in my SQL database and the DB connectivity breaks), is there any way I can get a scheduler-ending or crash event?
I think you should try a scheduler listener in Quartz.NET. Here is the documentation:
Scheduler Listener
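For illustration, here is a minimal sketch of what such a listener looks like in Java Quartz (the Quartz.NET ISchedulerListener API is analogous; the class name and log output are assumptions):
import org.quartz.SchedulerException;
import org.quartz.listeners.SchedulerListenerSupport;

// Extends SchedulerListenerSupport so only the callbacks of interest need overriding.
public class CrashAwareSchedulerListener extends SchedulerListenerSupport {

    @Override
    public void schedulerError(String msg, SchedulerException cause) {
        // fired when the scheduler hits an internal error, e.g. job store / DB trouble
        System.err.println("Scheduler error: " + msg);
    }

    @Override
    public void schedulerShutdown() {
        // fired after the scheduler has shut down
        System.out.println("Scheduler shut down");
    }
}
It would be registered after the scheduler is created, e.g. scheduler.getListenerManager().addSchedulerListener(new CrashAwareSchedulerListener());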

ERROR org.quartz.core.JobRunShell - Job group1.JobReport threw an unhandled Exception

There is a strange problem when generating the report using the Quartz scheduler.
I can generate the report fine on its own; the method (reportsBean) works normally, but there is a problem when it runs through Quartz.
Any ideas, please? I don't know what else to try.
13449 [MyScheduler_Worker-1] ERROR org.quartz.core.JobRunShell - Job
group1.JobReport threw an unhandled Exception:
java.lang.NullPointerException at
com.changes.bean.ReportsBean.createPdfCriticalChanges(ReportsBean.java:104)
at com.changes.quartz.JobReport.execute(JobReport.java:36) at
org.quartz.core.JobRunShell.run(JobRunShell.java:202) at
org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:573)
13452 [MyScheduler_Worker-1] ERROR org.quartz.core.ErrorLogger - Job
(group1.JobReport threw an exception. org.quartz.SchedulerException:
Job threw an unhandled exception. [See nested exception:
java.lang.NullPointerException] at
org.quartz.core.JobRunShell.run(JobRunShell.java:213) at
org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:573)
Caused by: java.lang.NullPointerException at
com.changes.bean.ReportsBean.createPdfCriticalChanges(ReportsBean.java:104)
at com.changes.quartz.JobReport.execute(JobReport.java:36) at
org.quartz.core.JobRunShell.run(JobRunShell.java:202)
JobReport
public class JobReport implements Job {

    public void execute(JobExecutionContext context) throws JobExecutionException {
        //BasicConfigurator.configure();
        try {
            ReportsBean reportsBean = new ReportsBean();
            reportsBean.createPdfCriticalChanges();
            SimpleDateFormat dateFormat = new SimpleDateFormat("dd/MM/yyyy – hh:mm:ss");
            System.out.println("Rodou: " + dateFormat.format(new Date()));
        } catch (JRException | SQLException e) {
            e.printStackTrace();
        }
    }
}
quartz.properties
org.quartz.scheduler.instanceName = MyScheduler
org.quartz.threadPool.threadCount = 3
org.quartz.jobStore.class = org.quartz.simpl.RAMJobStore
org.quartz.plugin.jobInitializer.class = org.quartz.plugins.xml.XMLSchedulingDataProcessorPlugin
org.quartz.plugin.jobInitializer.fileNames = com/changes/quartz/quartz-config.xml
org.quartz.plugin.jobInitializer.failOnFileNotFound = true
web.xml
<!-- Inicio Quartz -->
<servlet>
<servlet-name>QuartzServlet</servlet-name>
<servlet-class>com.changes.quartz.servlet.QuartzServlet</servlet-class>
</servlet>
<servlet>
<servlet-name>QuartzInitializer</servlet-name>
<servlet-class>org.quartz.ee.servlet.QuartzInitializerServlet</servlet-class>
<init-param>
<param-name>config-file</param-name>
<param-value>quartz.properties</param-value>
</init-param>
<init-param>
<param-name>shutdown-on-unload</param-name>
<param-value>true</param-value>
</init-param>
<init-param>
<param-name>start-scheduler-on-load</param-name>
<param-value>true</param-value>
</init-param>
<load-on-startup>2</load-on-startup>
</servlet>
<!-- Fim Quartz -->
The problem is that you have a NullPointerException in your ReportsBean.
Since the signature of the Quartz execute method is
public void execute(JobExecutionContext context) throws JobExecutionException
Quartz can only deal with JobExecutionExceptions thrown from within this method, but in your case it gets an unexpected NullPointerException.
To solve this problem you should remove the cause of the NullPointerException.
From the source code above I couldn't figure out the cause of the exception, since it occurs inside your ReportsBean. Your ReportsBean.createPdfCriticalChanges method is certainly trying to access an uninitialized member.
This is line 104:
String report = FacesContext.getCurrentInstance().getExternalContext().getRealPath("/web/reports/criticalcr.jrxml");
Remember: it works outside of Quartz.
ReportsBean
public void createPdfCriticalChanges() throws JRException, SQLException {
    System.out.println("generating report...");
    String report = FacesContext.getCurrentInstance().getExternalContext().getRealPath("/web/reports/criticalcr.jrxml");
    JasperReport pathjrxml = JasperCompileManager.compileReport(report);
    //JasperReport pathjrxml = JasperCompileManager.compileReport("web/reports/criticalcr.jrxml"); // works when the schedule is started from the XML config ("web/reports/changetracker_criticalcr.jrxml")
    JasperPrint printReport = JasperFillManager.fillReport(pathjrxml, null, conn.getConn());
    JasperExportManager.exportReportToPdfFile(printReport, "/web/reports/changetracker_criticalcr.pdf"); // works when the schedule is started from the XML config ("web/reports/criticalcr.pdf")
    System.out.println("report generated!");
}
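One way to avoid this (a sketch, not the poster's code): resolve the report path where a ServletContext is available and hand it to the job through the JobDataMap, so the Quartz worker thread never touches FacesContext. The reportPath key and the createPdfCriticalChanges(String) signature are assumptions.
import org.quartz.Job;
import org.quartz.JobExecutionContext;
import org.quartz.JobExecutionException;

public class JobReport implements Job {

    public void execute(JobExecutionContext context) throws JobExecutionException {
        try {
            // "reportPath" would be put into the job data map wherever a ServletContext
            // is available (for example in quartz-config.xml or in the QuartzServlet),
            // e.g. getServletContext().getRealPath("/web/reports/criticalcr.jrxml")
            String reportPath = context.getMergedJobDataMap().getString("reportPath");

            ReportsBean reportsBean = new ReportsBean();
            reportsBean.createPdfCriticalChanges(reportPath); // refactored to take the path
        } catch (Exception e) {
            // wrap everything so Quartz reports a JobExecutionException
            // instead of an unhandled exception
            throw new JobExecutionException(e);
        }
    }
}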
It does not throw a JobExecutionException; use @PostConstruct and override the init method.
I had the same issue. I solved it by putting a time greater than the current time in the table that stores next_fire_time, and then restarting the server.
ReportsBean cannot simply be instantiated with new at that point. If it's a Spring application, use the @Autowired annotation to fix this issue:
@Autowired
ReportsBean reportsBean;
Then call reportsBean.yourMethod();.
I fixed this problem by creating a lookup utility; the job looked like this:
public class LimpezaColetaDadosPessoaJob implements Job {

    @Override
    public void execute(JobExecutionContext context) throws JobExecutionException {
        try {
            this.executar();
        } catch (final BusinessException e) {
        } catch (final Exception e) {
            e.printStackTrace();
        }
    }

    private void executar() throws NamingException {
        final ColetorDadosPessoasIntegracaoService service = LookupUtils
                .lookup(ColetorDadosPessoasIntegracaoService.class);
        service.executarLimpezaDaTabela();
    }
}
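The LookupUtils helper is not shown in this answer; since executar() declares NamingException, it presumably wraps a JNDI lookup. A minimal sketch under that assumption (the JNDI name pattern is purely illustrative):
import javax.naming.InitialContext;
import javax.naming.NamingException;

public final class LookupUtils {

    private LookupUtils() {
    }

    public static <T> T lookup(Class<T> type) throws NamingException {
        // resolve the service through JNDI; the real name pattern depends on the container
        InitialContext ctx = new InitialContext();
        return type.cast(ctx.lookup("java:module/" + type.getSimpleName()));
    }
}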

How to run the Quartz scheduler as soon as the server starts in JBoss 5.1.1

Currently I have added the quartz-1.8.6 and quartz-jboss-1.8.6 jars and the quartz-service.xml file to JBoss 5.1.1, added the Quartz 1.8.6 related jar files to the EAR application, and written the following code:
public void doGet(HttpServletRequest request, HttpServletResponse response)
        throws ServletException, IOException {
    try {
        InitialContext ctx = new InitialContext();
        SchedulerFactory sf = new StdSchedulerFactory();
        Scheduler sched = sf.getScheduler();
        JobDetail job = new JobDetail("job1", "group1", SimpleJob.class);
        CronTrigger trigger = new CronTrigger("trigger1", "group1", "job1", "group1", "0/5 * * * * ?");
        sched.addJob(job, true);
        Date ft = sched.scheduleJob(trigger);
        try {
            Thread.activeCount();
        } catch (Exception e) {
        }
        sched.start();
    } catch (Exception exc) {
        exc.printStackTrace();
    }
    .....
}
This code runs successfully once I open the link, but I need the cron task to run as soon as the server starts.