Vtiger 6.4 workflow "Every time the record is modified" not working - email

I have Vtiger 6.4 installed and the SMTP configuration is done. When I create a new lead it sends the mail. I have created a workflow for Leads and selected "Every time the record is modified", so that an email should be sent to the Assigned To user. But it is not working. Can you suggest a solution?

There are two options for triggering a workflow on a save event:
Every time the record is saved
Every time the record is modified
The first, "Every time the record is saved", fires even if no field is updated and you just click Save.
You have selected "Every time the record is modified". That means at least one field must actually be updated, through the Record Model class. If you update the record with a direct database query, the workflow will not be triggered.
Decide which trigger event you really want. If you still have the problem, it is possible that you have set a condition in the second step of the workflow. Provide more details and we can help you better.
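To illustrate the distinction, here is a conceptual sketch (TypeScript, not Vtiger's actual PHP API; all names are hypothetical): saving through a record model runs the registered workflow handlers, while a raw query writes to the database without them.

// Conceptual sketch only: why ORM-style saves trigger workflows and raw queries don't.
type WorkflowHandler = (record: Record<string, unknown>) => void;
const onModifyHandlers: WorkflowHandler[] = [];

function saveViaRecordModel(record: Record<string, unknown>, changes: Record<string, unknown>): void {
  // "Every time the record is modified" fires only when a field actually changed
  const modified = Object.keys(changes).some((k) => record[k] !== changes[k]);
  Object.assign(record, changes);
  if (modified) onModifyHandlers.forEach((h) => h(record));
}

function saveViaRawQuery(db: { query: (sql: string) => void }, sql: string): void {
  db.query(sql); // bypasses the record model, so no workflow handler ever runs
}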

Vtiger workflow emails are sent via cron jobs.
Read more at:
vTigerCRM 7 - Scheduler isn't running any cron jobs unless manually triggered
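If the scheduler is not running, workflow mails are queued but never sent. On a typical Linux install the fix is a crontab entry along these lines (the path and interval are assumptions; adjust them to your installation):

*/15 * * * * php /var/www/html/vtigercrm/vtigercron.php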

Related

Smartsheet API: is there any way to get manually deleted rows using a Smartsheet API or SDK call?

I am deleting rows from a sheet. I have a daily job on that sheet which needs to recognize the deleted records; I need a way to identify them using the Smartsheet API or SDK.
Thanks in advance.
I don't believe this scenario (identifying deleted rows) is explicitly supported by the API at this time. Seems like you could still use the API to achieve your goal though, with a bit more work (code) on your part.
Your code would have to get the sheet data (i.e., all sheet rows) at a regular interval and save that data somewhere; then each time the job runs, get the sheet data again and compare it to the data saved the previous time the job ran (to identify any rows that have been deleted).
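With the Smartsheet Node SDK, the comparison could look roughly like this (a sketch; how you persist the previous run's row IDs is up to you):

// Sketch: diff the current row IDs against the IDs saved by the previous run.
const client = require('smartsheet');
const smartsheet = client.createClient({ accessToken: process.env.SMARTSHEET_TOKEN });

async function findDeletedRowIds(sheetId: number, previousRowIds: Set<number>): Promise<number[]> {
  const sheet = await smartsheet.sheets.getSheet({ id: sheetId });
  const currentRowIds = new Set<number>(sheet.rows.map((r: { id: number }) => r.id));
  // rows that existed last run but are missing now were deleted in between
  return [...previousRowIds].filter((id) => !currentRowIds.has(id));
}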
Edit 9/26: Added Webhooks info
Note that with the approach I've described above, any rows that had been added AND deleted during the interval between job runs would not be detected. If it's important to identify each and every time a row is deleted, a better (and much more efficient) approach would be to use Webhooks. By using webhooks, your application subscribes to notifications for a specified sheet, and then would receive a callback (HTTP POST) from Smartsheet any time the sheet changes. Your application would need to inspect the information in each callback it receives to identify 'deleted row' events (eventType = deleted and objectType = row).
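A minimal callback handler might look like this (a sketch using Express; the endpoint path and port are assumptions):

import express from 'express';

const app = express();
app.use(express.json());

app.post('/smartsheet-callback', (req, res) => {
  // echo the verification challenge when Smartsheet verifies the webhook subscription
  if (req.body.challenge) {
    res.json({ smartsheetHookResponse: req.body.challenge });
    return;
  }
  for (const event of req.body.events ?? []) {
    if (event.objectType === 'row' && event.eventType === 'deleted') {
      console.log(`row ${event.id} was deleted from sheet ${req.body.scopeObjectId}`);
    }
  }
  res.sendStatus(200);
});

app.listen(3000);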
A simple way to do this is to add a checkbox column named "delete" or something similar. With automation, you can then move the row to another sheet when the flag is detected. The row is removed from the original sheet, but you keep a record of the deleted row in a different sheet that you can read or process however you need. This also prevents accidental deletions, and you can even restore the row if you need to. I don't think you need much code to implement this solution.

In Maximo, how do I determine, partway through a workflow process, whether data in the table has been modified?

I want to show different options to the user in the workflow through an input node, depending on whether the user has modified the record or not.
The problem is that if I use a condition node with a custom class to detect whether the object has been modified partway through the workflow process, then as soon as the person clicks Route Workflow, save is automatically called and the isModified() flag becomes false. How do I determine in the condition node whether someone has modified the record?
I have to show one set of options when routing the workflow if the user has modified the record, and a different set if they have not.
Sounds to me like you need to enable eAudit on the object and then check whether eauditusername on the most recent audit record for that object matches the userid of the current user.
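The check itself is straightforward. Here is the idea sketched in TypeScript (illustrative only; in Maximo the real logic would live in a custom condition class or an automation script, and the names are hypothetical):

// Did the most recent eAudit row for this object come from the current user?
interface AuditRecord { eauditusername: string; eaudittimestamp: Date; }

function modifiedByCurrentUser(auditRecords: AuditRecord[], currentUser: string): boolean {
  if (auditRecords.length === 0) return false;
  const latest = [...auditRecords].sort(
    (a, b) => b.eaudittimestamp.getTime() - a.eaudittimestamp.getTime()
  )[0];
  return latest.eauditusername === currentUser;
}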
It's a little hokey and tempts fate, but if your condition node is early in the workflow's route when this button is pressed, you could check whether the changedate on the object (assuming you are working with one of the many objects that has one) is within the last 5 seconds. There is a gap where the record could be routed twice within a few seconds, but that gap is fairly hard to hit. There is also a gap where, if the system slows down at that point and takes more than 5 seconds to reach and run your condition, the record would appear unmodified. You can play with the delay to find a sweet spot with the fewest false positives and negatives.
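The timestamp heuristic boils down to something like this (again illustrative, with hypothetical names; the window is the tunable delay mentioned above):

// true if the record was saved within the last `windowMs` milliseconds
function recentlyModified(changeDate: Date, windowMs: number = 5000): boolean {
  return Date.now() - changeDate.getTime() <= windowMs;
}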

Call an external web service from within CRM 2016

I have some functionality I need to implement in Dynamics CRM 2016. I need to scan all records for a custom entity and update any record where a certain condition is true. This is a bit too complex to do via a workflow (I can't change the owner via a workflow step), so I'm thinking perhaps I could perform this logic in a custom plugin. I don't know if it makes sense to call this plugin from a workflow in CRM though, as I need to perform the logic on all records for this particular entity, and I need the logic to run regularly, i.e. daily/weekly. What's the best way to do this?
I figured this out. It was actually possible to do entirely within CRM. What I was trying to do was the following.
I have a custom entity called announcement, and it has a custom field called embargo date.
I needed to check periodically whether the embargo date has been reached; in other words, is the embargo date today? If so, I needed to change the owner of this entity.
If the embargo date has not yet been reached, I need to wait until it is, checking the date again every day until it is reached.
I managed this with a workflow. I added my check conditions; if they were true, I assigned the entity to another user.
If my conditions weren't true, I added a wait step to wait for 1 day, then another step to start a workflow, where I called the current workflow recursively. In other words, if the conditions aren't true, the workflow calls itself again.
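For anyone who prefers the code route the asker originally considered, the same daily check could be expressed against the Dynamics Web API roughly as below. This is only a sketch: the entity and field names (new_announcement, new_embargodate) are hypothetical, and on some CRM versions changing the owner may require a dedicated Assign request rather than a PATCH.

declare const orgUrl: string;     // e.g. https://yourorg.crm.dynamics.com
declare const token: string;      // OAuth bearer token
declare const newOwnerId: string; // target owner's systemuser id

async function reassignEmbargoedAnnouncements(): Promise<void> {
  const today = new Date().toISOString().slice(0, 10);
  const headers = { Authorization: `Bearer ${token}`, 'Content-Type': 'application/json' };
  // find announcements whose embargo date has been reached
  const resp = await fetch(
    `${orgUrl}/api/data/v8.2/new_announcements?$filter=new_embargodate le ${today}`,
    { headers }
  );
  const { value: announcements } = await resp.json();
  for (const a of announcements) {
    // reassign by updating the owner lookup
    await fetch(`${orgUrl}/api/data/v8.2/new_announcements(${a.new_announcementid})`, {
      method: 'PATCH',
      headers,
      body: JSON.stringify({ 'ownerid@odata.bind': `/systemusers(${newOwnerId})` }),
    });
  }
}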

Akka Persistence: where does the execution of the command go when it is not simply a state update?

Just for clarification: where does the execution of a command go when the execution is not simply a state update (as in most examples found online)?
For instance, in my case,
The command is FetchLastHistoryChangeSet, which consists of fetching the last history changeset from an external service based on where we left off last time; in other words, the time of the newest change in the previously fetched history changeset.
The event would be HistoryChangeSetFetched(changeSet, time). In line with what was said above, the time should be that of the newest change in the newly fetched changeset (per the command currently being handled).
Now, in all the examples I see, it is always: (i) validate the command, then (ii) persist the event, and finally (iii) handle the event.
It is in handling the event that I have seen custom code added on top of the updateState logic, usually after the updateState call. But that custom code is most of the time about sending a message back to the sender, or broadcasting it to the event bus.
In my example, it is clear that I need to do quite a few operations before I can actually call persist(HistoryChangeSetFetched(changeSet, time)). Indeed, I need the new changeset and the time of its newest change.
The only way I see to do this is to perform the fetch while validating the command.
That is:
case FetchLastHistoryChangeSet =>
  // validateCommand fetches the new changeset (if any) and returns Some((changeSet, time))
  validateCommand(FetchLastHistoryChangeSet).foreach { case (changeSet, time) =>
    persist(HistoryChangeSetFetched(changeSet, time))(updateState)
  }
Here validateCommand(FetchLastHistoryChangeSet) would, as its logic, read the last changeset time (the newest change of the changeset), fetch a new changeset based on it, and, if one exists, get the time of its newest change and return the tuple.
My question is: is that how it is supposed to work? Can validating the command be something as complex as that, i.e. actually executing the command?
As it says in the documentation: "validation can mean anything, from simple inspection of a command message's fields up to a conversation with several external services"
So I think what you're trying to do is exactly right. Any interaction with an external service must be done at the command validation stage.

Handling the merge scenario in a User Event script in NetSuite

I have successfully handled the 'create', 'delete' and 'edit' types in the afterSubmit event in a User Event script in NetSuite. What I need now is a way to capture merge events. When I merge two customer records in NetSuite, the function below isn't invoked at all, while it is invoked when I create, delete or edit a customer:
function afterSubmit(type)
{
...
}
Is there any way to handle merge scenarios?
Merge is not an event; it is handled by the duplicate manager.
Unless you hijack the merge button from the client side, I'm not sure it can be done.
Based on #felipechang's advice, I created a custom Merge Suitelet and all of the logic needed to go with it. All code can be found here
Step 1
Create (or add logic to) the Customer User Event script to hide the existing merge button and add a separate one.
gist
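The gists are not reproduced here, but the Step 1 idea is roughly the following (a sketch; the native button id 'merge', the module path, and the client function name are assumptions):

/**
 * @NApiVersion 2.1
 * @NScriptType UserEventScript
 */
define([], () => {
  const beforeLoad = (ctx: any) => {
    const form = ctx.form;
    form.removeButton({ id: 'merge' }); // hide the native merge entry point (assumed id)
    form.clientScriptModulePath = './customer_merge_client.js';
    form.addButton({
      id: 'custpage_custom_merge',
      label: 'Merge',
      functionName: 'openMergeSuitelet', // wired up in the Step 2 client script
    });
  };
  return { beforeLoad };
});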
Step 2
Create (or add logic to) the Customer client script to wire up the merge button click event.
gist
Step 3
Create a Merge page Suitelet that mimics the functionality of the out-of-the-box merge page but behaves differently on submit.
gist
Step 4
Create a Merge page Client script
gist
Step 5
Create a scheduled task that gets launched on Merge submit to check the progress of the merge task and then fire off custom logic if it succeeds
gist
Hopefully that saves someone some time.
An alternative to editing NetSuite standard scripts is to run a scheduled map/reduce script on the NetSuite record type in question: run a search in the map/reduce's getInputData, filtering on ["systemnotes.context","anyof","DPL"] (DPL = Duplicate Resolution), and process the affected records in the map stage. How immediately you need to know about the merge dictates the regularity of the scheduling. Unfortunately, you cannot get the data of the record that was merged away. If that is needed, I would recommend putting a business process in place to add the desired data to the master record before doing a merge.
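For reference, the skeleton of such a map/reduce could look like this (a sketch; the record type, the date window, and the columns are assumptions to adapt to your schedule and schema):

/**
 * @NApiVersion 2.1
 * @NScriptType MapReduceScript
 */
define(['N/search', 'N/log'], (search: any, log: any) => {
  // select customers touched by Duplicate Resolution (DPL) since the last run
  const getInputData = () =>
    search.create({
      type: 'customer',
      filters: [
        ['systemnotes.context', 'anyof', 'DPL'],
        'AND',
        ['systemnotes.date', 'onorafter', 'yesterday'], // window should match the schedule
      ],
      columns: ['internalid', 'entityid'],
    });

  const map = (ctx: any) => {
    const result = JSON.parse(ctx.value);
    // custom post-merge logic for the surviving (master) record goes here
    log.audit({ title: 'post-merge processing', details: result.id });
  };

  return { getInputData, map };
});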