Workflow Bookmark Issue

WF Tasks - T1 > T2 > T3
Steps
1 * The workflow instance is started and bookmarked at the first task, T1.
2 * On loading the workflow instance, it resumes successfully and moves to the next task,
but the bookmark information in
[System.Activities.DurableInstancing].[InstancesTable]
is not updated; it still shows the old bookmark information.
I traced the workflow: it reaches the bookmark stage and sends the bookmark information for the next task, T2, in the bookmark code activity, and context.CreateBookmark(bookmarkName, new BookmarkCallback(OnReadComplete)); is called, but the instance information is not updated with the new bookmark.

The workflow persistence database is not updated until the workflow persists again. That is by design, so you can restart from a known point if your application crashes. You can force persistence by adding Persist activities to your workflow.
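As a minimal sketch of that approach (WaitForTask is a hypothetical stand-in for the bookmark code activity in the question), placing a Persist activity after each bookmarked step forces the resumed state, including the new bookmark, into the instance store:

using System.Activities;
using System.Activities.Statements;

// Hypothetical bookmark activity, standing in for the one in the question.
public sealed class WaitForTask : NativeActivity
{
    public string TaskName { get; set; }

    // Must be true so the workflow can go idle while waiting on the bookmark.
    protected override bool CanInduceIdle { get { return true; } }

    protected override void Execute(NativeActivityContext context)
    {
        context.CreateBookmark(TaskName, OnResumed);
    }

    private void OnResumed(NativeActivityContext context, Bookmark bookmark, object value)
    {
        // Handle the resumed task here.
    }
}

public static class WorkflowDefinition
{
    public static Activity Build()
    {
        return new Sequence
        {
            Activities =
            {
                new WaitForTask { TaskName = "T1" },
                new Persist(), // force the instance store to record the post-T1 state
                new WaitForTask { TaskName = "T2" },
                new Persist()
            }
        };
    }
}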

Adding to "The problem solver"'s answer, you can refer to this article on persistence points:
http://msdn.microsoft.com/en-us/library/dd489420.aspx
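On the host side, attaching an instance store and handling PersistableIdle will also persist the instance (with its current bookmarks) every time the workflow goes idle at a bookmark. A sketch, reusing the WorkflowDefinition from the sketch above and a hypothetical connection string:

using System.Activities;
using System.Activities.DurableInstancing;

public static class Host
{
    public static void Main()
    {
        var app = new WorkflowApplication(WorkflowDefinition.Build())
        {
            InstanceStore = new SqlWorkflowInstanceStore(
                "Server=.;Database=WFPersistence;Integrated Security=True") // hypothetical
        };
        // Persist whenever the workflow goes idle at a bookmark, so the
        // InstancesTable always reflects the latest bookmark.
        app.PersistableIdle = e => PersistableIdleAction.Persist;
        app.Run();
    }
}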

Keeping database table clean using Entity Framework

I am using Entity Framework, and I have a table that I use to record events generated by a 3rd party. These events are valid for 3 days, so I would like to clean out any events older than 3 days to keep my database table lean. Is there any way I can do this, with some approach that won't cause performance issues during cleanup?
There are a couple of options for this:
1) Define a stored procedure mapped in your EF model, and use a Quartz trigger to execute it on a schedule (a sketch follows this list).
https://www.quartz-scheduler.net
2) A SQL Server Agent job that runs every day at the quietest time and removes the old rows.
https://learn.microsoft.com/en-us/sql/ssms/agent/schedule-a-job?view=sql-server-2017
If the method is needed only for cleanup and nothing else, I recommend option 2.
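For option 1, a minimal Quartz.NET sketch; the DbContext, table, and column names are hypothetical, and the delete runs as raw SQL so no entities are loaded:

using System;
using System.Data.Entity;
using System.Threading.Tasks;
using Quartz;
using Quartz.Impl;

public class EventsContext : DbContext { } // hypothetical EF6 context for the events table

public class CleanupJob : IJob
{
    public async Task Execute(IJobExecutionContext context)
    {
        using (var db = new EventsContext())
        {
            // Server-side delete keeps cleanup cheap; table/column names are hypothetical.
            await db.Database.ExecuteSqlCommandAsync(
                "DELETE FROM Events WHERE CreatedOn < DATEADD(day, -3, GETDATE())");
        }
    }
}

public static class Program
{
    public static async Task Main()
    {
        var scheduler = await StdSchedulerFactory.GetDefaultScheduler();
        await scheduler.Start();

        await scheduler.ScheduleJob(
            JobBuilder.Create<CleanupJob>().Build(),
            TriggerBuilder.Create()
                .WithCronSchedule("0 0 3 * * ?") // every day at 03:00
                .Build());
    }
}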
First of all, make sure the 3rd party writes a timestamp on every record; that way you can track how old each record is.
Then create a script that deletes all records older than 3 days:
DELETE FROM yourTable WHERE thatColumn < DATEADD(day, -3, GETDATE())
(Comparing the column against a computed date, rather than wrapping the column in DATEDIFF, lets SQL Server use an index on thatColumn.)
Now create a scheduled job in SQL Server Management Studio:
In SQL Server Management Studio, navigate to the server, then expand the SQL Server Agent item, and finally the Jobs folder to view, edit, and add scheduled jobs.
Set the script to run once every day or whatever pleases you :)

Vtiger 6.4 workflow "Every time record is modified" not working?

I have Vtiger 6.4 installed and SMTP configuration is also done. When I create a new lead it sends the mail. I have created a workflow for leads and selected "Every time the record is modified", so an email should be sent to the assigned-to user. But it is not working. Please suggest a solution.
There are two options to trigger a workflow on the save event:
Every time record is saved
Every time record is modified
The first,
Every time record is saved,
means that even if no field is updated and you just click Save, the workflow event is triggered.
You have selected
Every time the record is modified.
That means at least one field must be updated through the Record Model class; if you update it with a direct query, the workflow will not be triggered.
Now you have to choose which trigger event you want. If you still have a problem, it is possible that you have set a condition in the second step. Please provide more details so we can help you better.
Vtiger workflow emails are sent via cron jobs.
Read more at:
vTigerCRM 7 - Scheduler isn't running any cron jobs unless manually triggered

Handling merge scenario in user event script in NetSuite

I have successfully handled the 'create', 'delete' and 'edit' types in the afterSubmit event of a User Event script in NetSuite. What I need now is a way to capture merge events. When I merge two customer records in NetSuite, the function below isn't invoked at all, while it is invoked when I create, delete or edit a customer:
function afterSubmit(type)
{
...
}
Is there any way to handle merge scenarios?
Merge is not an event; it is handled by the duplicate manager.
Unless you hijack the merge button from the client side, I'm not sure it can be done.
Based on @felipechang's advice, I created a custom merge Suitelet and all of the logic needed to go with it. All code can be found here.
Step 1
Create (or add logic to) the Customer User Event script to hide the existing merge button and add a separate one.
gist
Step 2
Create (or add logic to) the Customer Client script to wire up the merge button click event.
gist
Step 3
Create a Merge page Suitelet that mimics the functionality of the out-of-the-box merge page but behaves differently on submit.
gist
Step 4
Create a Merge page Client script
gist
Step 5
Create a scheduled task that gets launched on merge submit to check the progress of the merge task and then fire off custom logic if it succeeds.
gist
Hopefully that saves someone some time.
An alternative to editing NetSuite standard scripts is to run a scheduled map/reduce script on the record type in question: run a search in the map/reduce's getInputData stage, filtering on ["systemnotes.context","anyof","DPL"] (DPL => Duplicate Resolution), and process the affected records in the map stage. How immediately you need to detect the merge dictates how frequently you schedule it. Unfortunately you cannot get the data of the merging record; if that is needed, I would recommend putting a business process in place to add the desired data to the master record before doing a merge.

Dynamics CRM workflow failing with infinite loop detection - but why?

I want to run a plug-in every 30 minutes, to poll an external system for changes. I am in CRM Online, so I don't have ready access to a scheduling engine.
To run the plug-in, I have a 'trigger' entity with a timezone-independent date field.
Updating the field also triggers a workflow, which in pseudocode has this logic:
If (Trigger_WaitUntil >= [Process-Execution Time])
{
Timeout until Trigger:WaitUntil
{
Set Trigger_WaitUntil to [Process-Execution Time] + 30 minutes
Stop Workflow with status of: Succeeded
}
}
If (Trigger_WaitUntil < [Process-Execution Time])
{
Send email //Tell an admin that the recurring task has self-terminated
Stop Workflow with status of: Canceled
}
So, the behaviour I expect is that every 30 minutes, the 'WaitUntil' field gets updated (and the plug-in and workflow get triggered again), unless the WaitUntil date is before the execution time, in which case the workflow stops.
However, 4 hours or so later (probably 8 executions, although I haven't verified that yet) I get an infinite loop warning: "This workflow job was canceled because the workflow that started it included an infinite loop. Correct the workflow logic and try again."
My question is: why? Do workflows have a correlation ID like plug-ins, which is being carried through to the child workflow? If so, is there any way I can prevent this while maintaining the current basic mechanism of using a single trigger record to manage the schedule? (I've seen other solutions in which workflows create new records, but then you have to go around tidying up the old trigger records as well.)
Yes, this behavior is well known. The only way to implement recurring workflows in Dynamics CRM without infinite-loop issues, using only out-of-the-box features, is the Bulk Deletion functionality. This article describes how to implement it: http://www.crmsoftwareblog.com/2012/08/using-the-bulk-deletion-process-to-schedule-recurring-workflows/
UPD: If you want to run your code every 30 minutes, you will have to create 48 bulk-delete jobs with corresponding start times: 12:00, 12:30, 1:00, and so on.
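For reference, a hedged sketch of creating one such recurring bulk-delete job through the SDK; the trigger entity name is hypothetical, and you would register one job per time slot:

using System;
using Microsoft.Crm.Sdk.Messages;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

public static class RecurringJobSetup
{
    public static void Create(IOrganizationService service, DateTime firstRun)
    {
        var request = new BulkDeleteRequest
        {
            JobName = "Recurring workflow trigger",
            // Deletes throwaway records of a hypothetical custom entity;
            // a workflow registered on delete provides the recurring logic.
            QuerySet = new[] { new QueryExpression("new_workflowtrigger") },
            StartDateTime = firstRun,
            RecurrencePattern = "FREQ=DAILY;INTERVAL=1", // repeats daily at firstRun's time
            SendEmailNotification = false,
            ToRecipients = new Guid[0], // required even when no email is sent
            CCRecipients = new Guid[0]
        };

        service.Execute(request);
    }
}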
The currently supported method for CRM is to use the Azure Scheduler.
Excerpt:
create a Web API application to communicate with CRM and our external
provider running on a shared (free) Azure web site and also utilize
the Azure Scheduler to manage the recurrence pattern.
The free version of the Azure Scheduler limits us to execution no more
than once an hour and a maximum of 5 jobs. If you have a lot going on
$20 a month will get you executions every minute and up to 50 jobs -
which sounds like a pretty good deal.
So if you wanted to run every 30 minutes, you could create two jobs: one on the half hour and one on the hour.
Bulk Deletion is an interesting workaround and something we've used before. It creates extra work and maintenance, though, so I try to avoid it if possible.
I would generally recommend building a Windows application and using the Windows scheduling feature (I know you said you don't have a scheduler available, but this one is often forgotten). This approach works really well and is very easy to troubleshoot; writing to logs and sending error email alerts makes it quite robust. The server doesn't need to be accessible externally, it only needs to reach CRM. If you had CRM on-prem, you could just use the same server. A sketch of such a scheduled console app is below.
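A minimal sketch of that console app, run by Windows Task Scheduler every 30 minutes; the connection string and the entity/field names are hypothetical stand-ins for the trigger record from the question:

using System;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;
using Microsoft.Xrm.Tooling.Connector;

public static class Program
{
    public static void Main()
    {
        // Hypothetical connection string; CrmServiceClient implements IOrganizationService.
        var service = new CrmServiceClient(
            "AuthType=OAuth;Url=https://yourorg.crm.dynamics.com;Username=user@yourorg.com;...");

        // Poke the trigger record's date field; the plug-in registered on
        // update of this field does the actual polling of the external system.
        var triggers = service.RetrieveMultiple(
            new QueryExpression("new_trigger") { ColumnSet = new ColumnSet("new_waituntil") });

        foreach (var entity in triggers.Entities)
        {
            entity["new_waituntil"] = DateTime.UtcNow;
            service.Update(entity);
            Console.WriteLine("Updated trigger {0} at {1:o}", entity.Id, DateTime.UtcNow);
        }
    }
}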
Azure Scheduler is a great suggestion; it keeps you in the cloud, which is nice.
SSIS is another option if you already have KingswaySoft or Cozy Roc in place.
You could build a workflow that creates another record and cleans up after itself; however, this is really using the wrong tool for the job. Also, it's very easy for it to fail and then not initiate the next record.
There is a solution called "Scheduled Workflow Runner". You create a FetchXML query to define the record set to run against, and point it at an on-demand workflow that you want to run on each record.
http://alexanderdevelopment.net/post/2013/05/18/scheduling-recurring-dynamics-crm-workflows-with-fetchxml/

Building audit trail functionality

The following is a use case from a workflow system.
A work order enters the system. The work order has a target which goes through different workflow states before the work order completes.
Say a work order for a target vehicle comes into the system; the workflow for this work order involves two tasks, say:
a) wash vehicle
b) inspect vehicle
Say the "wash vehicle" task changes a vehicle attribute from "not washed" to "washed", and the "inspect vehicle" task changes another vehicle attribute from "not inspected" to "inspection done".
If a user pulls the work order data, the user will always see the latest vehicle data (in this example, assuming both workflow tasks are completed, the user will see "washed" and "inspection done"). However, when the user pulls ONLY the data of the "wash vehicle" task, the user will see only "washed": though the second task was done, task 1 sees only what it modified. Getting data for task 2 will show both "washed" and "inspection done".
This involves milestoning (an audit trail) of data. One approach is shown in the image below: when a workflow task modifies data, it updates the version number and modified_ts, and keeps that version number in its own data row (via a JOIN table as depicted). Basically this is nothing but maintaining a reference to a history record for the workflow task's data, so that when pulling workflow task data it knows which history record to pull back. (Please ignore parent_id and the other notes in the picture; they are not relevant to this question.)
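To make that structure concrete, here is a minimal sketch of the three tables as plain record types (all names are hypothetical; shown in C# for illustration, but the same shape applies on any platform):

using System;

// Hypothetical names, sketching the versioning scheme described above.
public class Vehicle
{
    public long VehicleId { get; set; }
    public int CurrentVersion { get; set; }   // bumped on every modification
    public string WashState { get; set; }     // latest values live on this row
    public string InspectionState { get; set; }
}

public class VehicleHistory
{
    public long VehicleId { get; set; }       // (VehicleId, Version) is the key
    public int Version { get; set; }
    public string WashState { get; set; }     // snapshot as of this version
    public string InspectionState { get; set; }
    public DateTime ModifiedTs { get; set; }
}

// The JOIN table: ties a workflow task to the exact version it produced,
// so reading a task's data resolves to one history row, not the live row.
public class WorkflowTaskVersion
{
    public long WorkflowTaskId { get; set; }
    public long VehicleId { get; set; }
    public int Version { get; set; }
}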
I am thinking event sourcing could be another alternative design; however, I don't want to apply event sourcing (or any similar solution) as a wholesale solution, only for this particular use case (affecting just the 3 or so tables where the audit trail matters). I am trying to evaluate whether CQRS/event sourcing is a right fit as a partial solution (again, limited to the 3-4 tables which need to preserve history/audit trail data), or whether ES/CQRS would be overkill. Any other thoughts?
P.S. Though this isn't related to Scala, Scala is the platform we are using, hence tagging it to see if there are language-specific solutions that can help. Tagging Akka to find out whether ES/CQRS via Akka Persistence is an option. PostgreSQL is the DB, and DB triggers are not a solution I am looking for.