How to make a quartz job create another job to execute after it? - quartz-scheduler

I am trying to implement the following algorithm with Quartz and am not really sure whether it can be done. This is my first attempt at using Quartz.
User notification job - this job computes a monthly report and emails it to a user; it expects a user id and other parameters that are used to generate the customized user report.
There are potentially 10,000+ of these reports that need to be generated.
Monthly "who needs reports" job fires and:
1. searches the database for users that need to be sent a monthly report
2. for each user found, creates a JobDetail that will compute the monthly report and deliver it to a report sender that takes care of sending the report
3. schedules each of the JobDetails from step 2 to execute right after this job finishes
What I have not been able to figure out:
How to make sure that the monthly job executes in a single transaction, so that all users who need a monthly report are identified and jobs are scheduled to notify them.
How do I schedule a job so that it executes right after the job that created it?
I am using Spring 3.2 and Quartz 2.1.

Nice use case for Quartz.
You can try scheduling a new job from within the job class. This can be done by creating a new JobDetail and Trigger inside the execute() method, as in the sketch below.
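A minimal sketch under the assumptions of the question; UserReportJob, the group names, and the user-lookup method are hypothetical, not part of the Quartz API:

```java
import org.quartz.*;

// The monthly "report finder" job: from inside its own execute(),
// it schedules one report job per user found in the database.
public class MonthlyReportFinderJob implements Job {

    @Override
    public void execute(JobExecutionContext context) throws JobExecutionException {
        try {
            Scheduler scheduler = context.getScheduler();

            for (long userId : findUserIdsNeedingReports()) {
                JobDetail reportJob = JobBuilder.newJob(UserReportJob.class)
                        .withIdentity("userReport-" + userId, "monthly-reports")
                        .usingJobData("userId", userId)   // per-user parameters
                        .build();

                // startNow() fires as soon as a worker thread is free,
                // i.e. effectively right after this job completes.
                Trigger trigger = TriggerBuilder.newTrigger()
                        .withIdentity("userReportTrigger-" + userId, "monthly-reports")
                        .startNow()
                        .build();

                scheduler.scheduleJob(reportJob, trigger);
            }
        } catch (SchedulerException e) {
            throw new JobExecutionException(e);
        }
    }

    // Hypothetical lookup; replace with your own DAO/service call.
    private Iterable<Long> findUserIdsNeedingReports() {
        return java.util.Collections.emptyList();
    }
}

class UserReportJob implements Job {
    @Override
    public void execute(JobExecutionContext context) {
        long userId = context.getMergedJobDataMap().getLong("userId");
        // compute the report for this user and hand it to the report sender
    }
}
```

The single-transaction requirement is not addressed by this sketch; whether scheduling participates in a surrounding transaction depends on the job store and your Spring transaction setup.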

Related

Run an Azure Data Factory Pipeline Continuously

I have a requirement to incrementally copy data from one SQL table to another SQL table. The watermark or key column is an Identity column. My boss wants me to restart the load as soon as it's done...and as you know, the completion time may vary. In Azure Data Factory, the trigger options are Scheduled, Tumbling Window and Custom Event. Does anyone know which option would allow me to achieve this continuous running of the pipeline and how to configure it?
Create a new pipeline. Call it "run forever". Add an "until" activity with a never-true condition, e.g. @equals(1, 2).
Inside the until, execute the pipeline which copies between the tables. Ensure "wait for completion" is ticked.
If the table copy fails then "run forever" will fail and will have to be manually restarted. You likely do not want "run forever" on a schedule: if it does not fail, the scheduled invocations will simply queue up behind it.
Pipelines in ADF are batch-based. You can set up "micro-batches" of 1 minute with a schedule trigger or 5 minutes with a tumbling window.
ADF runs in batches. If you want to load the data continuously, you can go for Stream Analytics / Event Hubs, which handle real-time data.

Quartz capabilities

I am trying to create a Quartz scheduler using Java which will be able to call an API and pass in data.
I am totally new to Quartz, but I now understand the Job concept and how to create one, the Trigger concept and how to fire one, and how the Scheduler works.
What I am having difficulty with is how to pass in the information that is required by the API. I have an example where an API is called and the data is entered into the DB, but the information has been hard-coded into the class instead of being passed in via the JobDetail.
I.e., the user passes a message to the system which needs to be sent back to the user in 12 hours and not before, so what I was planning was to create a Job and a Trigger that set the execution time 12 hours out. How do I pass the message into the scheduler? Where should this message be stored? Is what I am trying to do possible? Have I misunderstood what Quartz is capable of doing?
Thank you for your time. Any assistance would be greatly appreciated.
Take a look at JobDataMap. If you are creating a new job for each user action, you can store the message in there, and it will be available during execution.
JobDataMap holds state information for Job instances.
JobDataMap instances are stored once when the Job is added to a scheduler. They are also re-persisted after every execution of jobs annotated with @PersistJobDataAfterExecution.
JobDataMap instances can also be stored with a Trigger. This can be useful in the case where you have a Job that is stored in the scheduler for regular/repeated use by multiple Triggers, yet with each independent triggering, you want to supply the Job with different data inputs.
The JobExecutionContext passed to a Job at execution time also contains a convenience JobDataMap that is the result of merging the contents of the trigger's JobDataMap (if any) over the Job's JobDataMap (if any).
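A minimal sketch of the first approach (a new job per user action), assuming a hypothetical SendMessageJob and the 12-hour delay from the question; the "message" key name is made up:

```java
import org.quartz.*;

public class MessageSchedulingExample {

    public static void scheduleMessage(Scheduler scheduler, String message)
            throws SchedulerException {
        // the message travels with the job in its JobDataMap
        JobDetail job = JobBuilder.newJob(SendMessageJob.class)
                .usingJobData("message", message)
                .build();

        // fire once, 12 hours from now and not before
        Trigger trigger = TriggerBuilder.newTrigger()
                .startAt(DateBuilder.futureDate(12, DateBuilder.IntervalUnit.HOUR))
                .build();

        scheduler.scheduleJob(job, trigger);
    }
}

class SendMessageJob implements Job {
    @Override
    public void execute(JobExecutionContext context) {
        // the merged map layers the trigger's JobDataMap over the job's
        String message = context.getMergedJobDataMap().getString("message");
        // call the API / send the message to the user here
    }
}
```

With a JDBC job store the JobDataMap contents are persisted, so the message survives a scheduler restart.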
In case you have a single job but create a new trigger for each user action, you can follow the solution given here.
A third option: for each user action, persist the message and the email send time to the database, then have a job that runs periodically and scans the database for records whose email is due, as in the sketch below.
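The polling variant could look roughly like this; the repository calls are hypothetical placeholders and the 5-minute interval is only an example:

```java
import org.quartz.*;

// One recurring job scans the database for messages whose send time has passed.
@DisallowConcurrentExecution // avoid overlapping scans
public class PendingMessagePollerJob implements Job {

    @Override
    public void execute(JobExecutionContext context) {
        // Hypothetical DAO calls, e.g.:
        // for (PendingMessage m : repository.findDueUnsent(new Date())) {
        //     emailService.send(m);
        //     repository.markSent(m);
        // }
    }

    // register once at start-up; the job then scans every 5 minutes
    public static void register(Scheduler scheduler) throws SchedulerException {
        JobDetail job = JobBuilder.newJob(PendingMessagePollerJob.class)
                .withIdentity("pendingMessagePoller")
                .build();
        Trigger trigger = TriggerBuilder.newTrigger()
                .withSchedule(SimpleScheduleBuilder.simpleSchedule()
                        .withIntervalInMinutes(5)
                        .repeatForever())
                .build();
        scheduler.scheduleJob(job, trigger);
    }
}
```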

Call external web service from within Crm 2016

I have some functionality I need to implement in Dynamics CRM 2016. I need to scan all records for a custom entity and update any record where a certain condition is true. This is a bit too complex to do via a workflow (I can't change the owner via a workflow step), so I'm thinking perhaps I could perform this logic in a custom plugin. I don't know if it makes sense to call this plugin from a workflow in CRM though, as I need to perform the logic on all records for this particular entity, and I need the logic to run regularly, i.e. daily/weekly. What's the best way to do this?
I figured this out. It was actually possible to do entirely within CRM. What I was trying to do was the following.
I have a custom entity called announcement, and it has a custom field called embargo date.
I needed to somehow check periodically if the embargo date has been reached, meaning, is the embargo date today? If so, then I needed to change the owner of this entity.
If the embargo date has not yet been reached, then I need to wait until it is, checking the date again everyday till it is reached.
I managed this with a workflow. I added my check conditions; if they were true, I assigned the record to another user.
If my conditions weren't true, I added a wait step to wait for 1 day, then another step to start a workflow, where I called the current workflow recursively. Meaning: if the conditions aren't true, have the workflow call itself again.

Dynamics CRM workflow failing with infinite loop detection - but why?

I want to run a plug-in every 30 minutes, to poll an external system for changes. I am in CRM Online, so I don't have ready access to a scheduling engine.
To run the plug-in, I have a 'trigger' entity with a timezone-independent date field ('WaitUntil'); the plug-in runs when that field is updated.
Updating the field also triggers a workflow, which in pseudocode has this logic:
If (Trigger_WaitUntil >= [Process-Execution Time])
{
    Timeout until Trigger:WaitUntil
    {
        Set Trigger_WaitUntil to [Process-Execution Time] + 30 minutes
        Stop Workflow with status of: Succeeded
    }
}
If (Trigger_WaitUntil < [Process-Execution Time])
{
    Send email   // tell an admin that the recurring task has self-terminated
    Stop Workflow with status of: Canceled
}
So, the behaviour I expect is that every 30 minutes, the 'WaitUntil' field gets updated (and the Plug-in and workflow get triggered again); unless the WaitUntil date is before the Execution time, in which case stop the workflow.
However, 4 hours or so later (probably 8 executions, although I haven't verified that yet) I get an infinite loop warning "This workflow job was canceled because the workflow that started it included an infinite loop. Correct the workflow logic and try again. For information about workflow".
My question is: why? Do workflows have a correlation id like plug-ins, which is being carried through to the child workflow? If so, is there any way I can prevent this while maintaining the current basic mechanism of using a single trigger record to manage the schedule? (I've seen other solutions in which workflows create new records, but then you have to go around tidying up the old trigger records as well.)
Yes, this behavior is well known. The only way to implement recurring workflows in Dynamics CRM without infinite-loop issues, using only OOB features, is the Bulk Deletion functionality. This article describes how to implement it - http://www.crmsoftwareblog.com/2012/08/using-the-bulk-deletion-process-to-schedule-recurring-workflows/
UPD: If you want to run your code every 30 minutes then you will have to create 48 bulk-delete jobs with corresponding start times like 12:00, 12:30, 1:00, ...
The current supported method for CRM is to use the Azure Scheduler.
Excerpt:
create a Web API application to communicate with CRM and our external provider, running on a shared (free) Azure web site, and also utilize the Azure Scheduler to manage the recurrence pattern.
The free version of the Azure Scheduler limits us to execution no more than once an hour and a maximum of 5 jobs. If you have a lot going on, $20 a month will get you executions every minute and up to 50 jobs - which sounds like a pretty good deal.
So if you wanted every 30 minutes, you could create two jobs: one on the half hour and one on the hour.
The Bulk Deletion is an interesting work around and something we've used before. It creates extra work and maintenance though so I try to avoid it if possible.
I would generally recommend building a Windows application and using the Windows scheduling feature (I know you said you don't have a scheduler available, but this is often forgotten). This approach works really well and is very easy to troubleshoot. Writing to logs and sending error email alerts makes it pretty robust. The server doesn't need to be accessible externally; it only needs to reach CRM. If you had CRM on-prem, you could just use the same server.
Azure Scheduler is a great suggestion. This keeps you in the cloud which is nice.
SSIS is another option if you already have KingswaySoft or Cozy Roc in place.
You could build a workflow that creates another record and cleans up after itself; however, this is really using the wrong tool for the job. Also, it's very easy for it to fail and then not initiate the next record.
There is a solution called "Scheduled Workflow Runner". You create a FetchXML query to define the record set to run against, and point it at an on-demand workflow that you want to run on each record.
http://alexanderdevelopment.net/post/2013/05/18/scheduling-recurring-dynamics-crm-workflows-with-fetchxml/

Creating a custom scheduler using spring quartz

I have a requirement to create a custom scheduler. I would like all the parameters defining the frequency that my jobs will run to be stored in db tables. This would allow my customers to change the frequency etc via a nice little webapp (webapp is a different application to my main one).
I know that with Quartz you can define all your job triggers programmatically, but is that only at start-up time? How would it work if my customer logs on and changes the schedule in the webapp? Am I able to re-define a job trigger in the original app by checking for changes periodically?
Does anyone know of any nice examples of this?
You have a bunch of methods in the Scheduler interface. JavaDoc here.
Replace an already-scheduled job with:
a new job added with replace=true via the addJob method,
OR
deleting the existing job (method: deleteJob) and then adding a new job with the modified details (addJob).
Replace already-scheduled triggers with:
rescheduleJob (if the JobDetail associated with the previous trigger is the same for the new trigger too),
OR
unscheduleJob followed by scheduleJob.
If used with Spring Boot you can set spring.quartz.overwrite-existing-jobs=true. A sketch of both replacement paths is below.
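A sketch of both replacement paths; the job class, keys, and group names here are made up, and it assumes a trigger with that key already exists for rescheduleJob to replace:

```java
import org.quartz.*;

public class RescheduleExample {

    public static void applyNewSchedule(Scheduler scheduler, String cron)
            throws SchedulerException {
        // Path 1: re-add the job with replace=true to overwrite its JobDetail.
        JobDetail updated = JobBuilder.newJob(ReportJob.class)
                .withIdentity("reportJob", "reports")
                .storeDurably() // addJob requires a durable job
                .build();
        scheduler.addJob(updated, true); // true = replace the existing job

        // Path 2: swap only the trigger; rescheduleJob keeps the same JobDetail.
        TriggerKey key = TriggerKey.triggerKey("reportTrigger", "reports");
        Trigger newTrigger = TriggerBuilder.newTrigger()
                .withIdentity(key)
                .forJob("reportJob", "reports")
                .withSchedule(CronScheduleBuilder.cronSchedule(cron))
                .build();
        scheduler.rescheduleJob(key, newTrigger);
    }
}

class ReportJob implements Job {
    @Override
    public void execute(JobExecutionContext context) { /* run the report */ }
}
```

A small polling component in the main app could call applyNewSchedule whenever it detects that the cron expression stored in the database has changed.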