How to maintain the Quartz trigger history in the database after firing in Java - quartz-scheduler

I am using SimpleTriggerBean and CronTriggerBean. After the triggers execute, I need to maintain a history of them, and for that I created a table named quartz_fired_triggred. How do I store the details of the fired triggers in that particular table?
Scheduler scheduler = StdSchedulerFactory.getDefaultScheduler();
JobExecutionContext jobExeContext = new JobExecutionContext(scheduler, firedBundle, job);

Set the property JobDetail.Durable = true, which instructs Quartz not to delete the Job when it becomes an “orphan” (when the Job no longer has a Trigger referencing it).
Please note that the triggers themselves are still deleted.
Reference : http://www.quartz-scheduler.net/documentation/faq.html
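The actual question, how to store a row in quartz_fired_triggred for every firing, is usually solved with a trigger listener rather than with durability. Below is a minimal sketch assuming the Quartz 2.x Java API and a plain JDBC DataSource; the listener class name and the history-table columns (trigger_name, trigger_group, job_name, fired_time) are illustrative and need to be adapted to your actual quartz_fired_triggred schema.
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.Timestamp;
import javax.sql.DataSource;
import org.quartz.JobExecutionContext;
import org.quartz.Trigger;
import org.quartz.listeners.TriggerListenerSupport;

// Writes one history row every time a trigger fires.
public class FiredTriggerHistoryListener extends TriggerListenerSupport {

    // Assumed to point at the database holding quartz_fired_triggred.
    private final DataSource dataSource;

    public FiredTriggerHistoryListener(DataSource dataSource) {
        this.dataSource = dataSource;
    }

    @Override
    public String getName() {
        return "firedTriggerHistoryListener";
    }

    @Override
    public void triggerFired(Trigger trigger, JobExecutionContext context) {
        // Column names are assumptions; adjust them to your table definition.
        String sql = "INSERT INTO quartz_fired_triggred "
                   + "(trigger_name, trigger_group, job_name, fired_time) VALUES (?, ?, ?, ?)";
        try (Connection con = dataSource.getConnection();
             PreparedStatement ps = con.prepareStatement(sql)) {
            ps.setString(1, trigger.getKey().getName());
            ps.setString(2, trigger.getKey().getGroup());
            ps.setString(3, context.getJobDetail().getKey().getName());
            ps.setTimestamp(4, new Timestamp(context.getFireTime().getTime()));
            ps.executeUpdate();
        } catch (Exception e) {
            // A failed history insert should not prevent the job from running.
            e.printStackTrace();
        }
    }
}
Register it once at startup, e.g. scheduler.getListenerManager().addTriggerListener(new FiredTriggerHistoryListener(dataSource)). On the older Quartz 1.x API that Spring's SimpleTriggerBean/CronTriggerBean target, the equivalent is scheduler.addGlobalTriggerListener(...), and the trigger and job names are read via getName()/getGroup() directly instead of via keys. (With the 2.x fluent API, the durability mentioned above is set with JobBuilder.newJob(...).storeDurably().)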

Related

Is there a way to do this inside Memgraph with triggers?

When calling a query module from a trigger statement, I expected the function to see the state of the graph before the query's changes occurred. However, this is not the case: the changes that were made are already visible in the graph state. Here is a dummy example of such behavior:
CREATE TRIGGER my_trigger ON CREATE
BEFORE COMMIT EXECUTE
CALL my_module.function(createdVertices, createdEdges) YIELD node, property
SET node.property = property;
Function my_module.function should execute after the following statement:
MATCH (n {id: 19}), (m {id: 10}), (u {id: 14})
CREATE (n)-[new_edge_1:RELATION]->(m), (n)-[new_edge_2:RELATION]->(u)
In my_module.function, we'll see a graph state that already contains the newly created new_edge_1 and new_edge_2. If our calculation depends on incremental changes, where we need the state of the graph before any changes occur, is there a way to do it inside Memgraph with triggers?
Unfortunately, there is no way to do that with triggers. During a single transaction, the changes of each query are visible to the next query, and triggers basically execute their statements as a query that is part of that transaction (when BEFORE COMMIT is used).
You could probably work with the changed values and ignore the edges that were created in that transaction.

Capture events from staging to keep only the events which are NOT cancelled, and update existing

Given that I have a staging table that is incrementally populated daily and keeps track of events where the following scenarios occur:
Created - an event_UUID is created.
Canceled - if the event is canceled, the event_UUID associated with the cancellation will be the same event_UUID from when the event was created.
Rescheduled - upon an event being rescheduled, the old event is canceled and a new event is created; this generates a new event_UUID, but there will be an old_event_uuid which has the details of the old event.
Given the following sample data:
How would I create a table which loads from staging on a daily basis but only keeps the events which are "created"/"active" and NOT canceled?
What would the sql script look like?
Select only the most recent row for every UUID and discard those that have type canceled:
SELECT s.*
FROM staging s
WHERE NOT EXISTS (SELECT 1 FROM staging s2 WHERE s2.event_uuid = s.event_uuid AND s2.time > s.time)
AND s.event_type = 'event.created' -- alternatively: s.event_type <> 'event.canceled'

Trigger is running even if I create a new lead

As per the code below, this is meant to update the first and last name only when I update a lead, but it is updating the first and last name of a newly created lead as well.
Can someone please advise? Thank you!
trigger HelloWorld on Lead (before update) {
for(Lead l : Trigger.new){
l.FirstName = 'Hello';
l.LastName = 'World';
}
}
Do you have a workflow or Process Builder field update action on the Lead object? If so, it would trigger an update on the newly inserted record and thus fire the trigger.
In Lightning there is a default Lead assignment rule that is active; if you disable it, the update doesn't happen. It is odd that the rule causes an update in Lightning but not in Classic, so that might still be a bug that should be reported to Salesforce.

Update properties of zodb objects from a celery task via pyramid_celery not working

In my content system (based on substanced) I set the name of the ZODB content item programmatically, based on the values of other properties of that object. Some of the child elements of this content item are named based on their parent, and I need to programmatically update the names of the children too.
If I do this from within the set function of the parent object's Property it works, but it is slow. Instead I am trying to run it afterwards using Celery. I use very similar code, but it doesn't update the child object's name. Any ideas?
The code I am using to update the zodb object is
new_name = agent_artifact_ref_name_calc_value(content_ob.__parent__, struct_ref)
struct_artifact_ref = get_properties(content_ob, registry)
struct_artifact_ref[u'name'] = new_name
setattr(content_ob, 'name', new_name)
transaction.commit
Indexing the ZODB is slow, so it is deferred; the deferred indexing process runs under supervisor every 5 seconds.
The Celery task that updates the children is run from an event which is triggered after the parent object is updated. I have delayed this task so that it runs after the parent object is re-indexed. I can pick the correct child objects by querying the catalog, and I then update the names of the objects. In substanced, changing the name of an object is equivalent to a remove and then an add. Changing the name of the child object triggers all the reindex methods for the catalog, but the change in the name doesn't seem to be picked up.
Any ideas?

Salesforce - Trigger - Deleting object

Hello fellow code lovers, I am attempting to learn Apex and am stuck on a problem. Since most of you here are avid code lovers who are excited about problem solving, I figured I could run my problem by you.
I am attempting to create a trigger on an object called Book that does the following:
On Delete, all associated Chapters are also deleted.
There is also an object named Chapter that has a lookup to Book.
Here is my attempt. This is my first ever attempt at Apex, so please be patient. Is anyone willing to dabble in this piece of code?
trigger DeleteChaptersTrigger on Book__c (before delete) {
List<Book__c> book = Trigger.old;
List<Chapter__c> chapters = new List<Chapter__c>();
Set set = new Set();
for (Book__c b :books){
if ()
}
}
You need to write every trigger with the consideration that it might be processing many records at one time, so you need to bulkify your trigger code.
Here are the variables that are available on the trigger object.
You want to get all the record Ids that will be deleted. Use the keySet method on Trigger.oldMap to get this information without looping and building your own collection. Then you can simply delete the records returned from the query.
trigger DeleteChaptersTrigger on Book__c (before delete) {
Set<Id> bookIds = Trigger.oldMap.keySet();
delete [SELECT Id FROM Chapter__c WHERE Book__c IN :bookIds];
}
Unlike some other languages, Apex gets confused when you reuse type names, such as set, as a variable name. Apex is also case-insensitive, so set and Set collide.
Since you have full control, I recommend changing the way your custom objects are related to each other.
A chapter has no meaning/value without a book so we want to change the relationship between the two.
Remove the lookup on the Chapter object and replace it with a master-detail relationship. When a master record is deleted, Salesforce automatically deletes the related detail records. This is what you want, without writing any code.