Camunda Run 2 tasks in parallel and cancel the 2nd task if the first completes - workflow

I am using Camunda as my workflow engine.
I define the registry process as follows:
1- User submits a request.
2- Admin checks the request. If the request needs editing, two tasks are raised: "Edit Request by User" and "Approve/Reject by Admin" (the second task exists because the user may not continue, in which case the admin can still finish the process).
If one of the tasks is completed, the other task should be canceled and only one path of execution should continue.
How can I do that?
Right now, if the user completes the "Edit Request by User" task, "Approve/Reject by Admin" remains active (two paths of execution).
Thanks a lot in advance

You have several options:
Use a conditional interrupting boundary event on the task you want to cancel. Let the other user task set a variable which the conditional event reacts to. When the variable is set, the task will be cancelled.
Move that part of the process into an embedded sub-process. Attach an interrupting boundary event to the whole embedded sub-process scope. The event again reacts to a variable you set when the one user task completes (or to another event you throw).
Move that part of the process into an embedded sub-process. Add an event sub-process to the scope of the embedded sub-process. The start event of the event sub-process should be interrupting, e.g. a conditional event reacting to a variable change.
Least favorable: move that part of the process into an embedded sub-process and route the flow into a terminate end event after the user task has been completed.
If all this sounds like gibberish to you, then you may want to check out:
https://docs.camunda.org/manual/latest/reference/bpmn20/events/conditional-events/
https://docs.camunda.org/manual/latest/reference/bpmn20/subprocesses/event-subprocess/
https://docs.camunda.org/manual/latest/reference/bpmn20/events/terminate-event/
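For the first option, a minimal sketch of what the conditional interrupting boundary event could look like in the BPMN XML. The element ids and the variable name userEditDone are assumptions for illustration, not taken from the original model:

```xml
<userTask id="approveRejectByAdmin" name="Approve/Reject by Admin" />

<!-- Interrupting boundary event (cancelActivity="true"): cancels the
     admin task when the variable set by "Edit Request by User" becomes true. -->
<boundaryEvent id="cancelAdminTask" attachedToRef="approveRejectByAdmin"
               cancelActivity="true">
  <conditionalEventDefinition camunda:variableName="userEditDone">
    <condition xsi:type="tFormalExpression">${userEditDone == true}</condition>
  </conditionalEventDefinition>
</boundaryEvent>
```

The completing user task would set userEditDone to true (e.g. in an execution listener or output mapping), which fires the conditional event and interrupts the other task.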

Related

How to manually cancel an ADF pipeline when an activity fails or completes?

I have a pipeline with two concurrent lists of activities running at the same time. The first list has a flag at the end which makes the Until loop in the second list complete once the flag is true. The problem is that the flag is only set at the end of the pipeline, after all the activities in list one have completed. If any activity prior to the Set Flag activity fails, the flag is never set to TRUE and the Until loop runs forever, which is not ideal.
Is there a way that I can manually cancel the activity when any one of the activities fails?
Secondly, the Until loop has an inner Wait activity, so once it enters the loop it waits 50 minutes before checking the flag condition again. The wait is important, so I can't exclude it; however, I would like the Until loop to end as soon as the flag is set to true, even while the Wait is running. Basically, I'm trying to end the Until loop early.
I did try the steps in the MS docs, Pipeline Runs - Cancel: https://learn.microsoft.com/en-us/rest/api/datafactory/pipelineruns/cancel
But this does not work: even though the run ID I pass is correct, the API responds that the run ID is incorrect.
Could some one please advise how to achieve this?
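For reference, the cancel call from the linked docs can be sketched as below. The subscription, resource group, factory, and run id values are placeholders; the isRecursive=true query parameter asks ADF to also cancel child pipeline runs:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class AdfCancelRun {
    // Builds the management-plane URL for the Pipeline Runs - Cancel call.
    static String cancelUrl(String subscriptionId, String resourceGroup,
                            String factoryName, String runId) {
        return "https://management.azure.com/subscriptions/" + subscriptionId
                + "/resourceGroups/" + resourceGroup
                + "/providers/Microsoft.DataFactory/factories/" + factoryName
                + "/pipelineruns/" + runId
                + "/cancel?isRecursive=true&api-version=2018-06-01";
    }

    public static void main(String[] args) throws Exception {
        // Placeholder identifiers -- substitute your own.
        String url = cancelUrl("mySubscriptionId", "myResourceGroup",
                "myDataFactory", "myRunId");
        System.out.println(url);

        // Only attempt the POST when a bearer token is available.
        String token = System.getenv("AZURE_TOKEN");
        if (token != null) {
            HttpRequest request = HttpRequest.newBuilder(URI.create(url))
                    .header("Authorization", "Bearer " + token)
                    .POST(HttpRequest.BodyPublishers.noBody())
                    .build();
            HttpResponse<String> response = HttpClient.newHttpClient()
                    .send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println(response.statusCode() + " " + response.body());
        }
    }
}
```

One thing to check when the API reports an unknown run id: the id must belong to a run in the factory named in the URL, and a child run started by an Execute Pipeline activity has its own run id distinct from the parent's.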

Manually releasing a seized resource from an agent?

I have a fairly straightforward process:
batch-seize-delay process
Order agents are batched into a Batch agent, then a third agent type is seized as a resource for that Batch. On seize, a message is sent to the resource agent's statechart for some actions to be taken. However, if a certain condition is met upon receipt of the message, the Batch agent needs to release the resource agent and seize a different one so the process can be completed. I've written code in the resource agent that adds the rejected Batch agent to the collection shown above when it's rejected (rejectionsCollection.add(batch)). The Batch agent is then reinserted at the second Source block using an inject() call, and I've set the 'New agent' option to rejectionsCollection.get(0). But I also have to call remove() in the Seize and Delay blocks, otherwise I get flowchart errors (same agent in two blocks at the same time).
When I use seize.remove(batch) as the action to take when the condition is met, the problem is that the resource agent doesn't get released. I've also set the Seize block's advanced option 'Canceled units' to 'go to release' and set 'Release for canceled units' to my Release block, but this doesn't work. The third agent remains seized and eventually I run out of resource agents (which shouldn't happen).
I've also tried copying it into a new Batch agent (Batch newBatch = batch;), but it still gives a flowchart error. I've also tried using clone(), but I haven't figured out the right syntax (I'm not the most experienced Java programmer); I get error messages saying 'cannot convert from Object to Batch'. Not sure if it's relevant, but the Batch agent contains two collections as well.
My next thought was that I could manually release the Resource agent but the help file says that even though a seized resource is publicly accessible, users shouldn't do it. What else could I try?
Sorry for the wall of text but any ideas are appreciated!
You do not manually release resources. The setup is agent-centric, so you must tell the agent to let go of the resource. This is done by making the agent "move" to the release block.
In your case, you could make the delay duration conditional: if the agent has met your condition, the delay time should be 0, else the normal delay time.
Use this notation: agent.condition == true ? 0. : normalDelayTime
You could also use a "Split" element after "Seize" and bypass the "delay" object altogether for your special agents.
Many options, but remember to think agent-centric :)
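The conditional-delay idea can be sketched in plain Java. The names are illustrative; in the model, the ternary expression would go directly into the Delay block's duration field:

```java
public class DelayPolicy {
    // The Delay block's duration expression, extracted as a method.
    // 'conditionMet' stands in for agent.condition in the answer above:
    // a rejected batch skips the delay (0 time units), others wait normally.
    static double delayTime(boolean conditionMet, double normalDelayTime) {
        return conditionMet ? 0.0 : normalDelayTime;
    }

    public static void main(String[] args) {
        System.out.println(delayTime(true, 50.0));  // prints 0.0
        System.out.println(delayTime(false, 50.0)); // prints 50.0
    }
}
```

With a zero delay, the rejected batch flows straight on to the Release block, which is the agent-centric way of letting go of the seized resource.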

How to identify a multi-instance sub-process and differentiate it from the main process in jBPM?

I have used a multi-instance subprocess which includes a workflow with a human task. When executing, it creates as many human tasks as there are elements in the collection object. But all the tasks have the same process instance id. How does the relation between the parent process and the multi-instance subprocess work?
If there are multiple elements in the collection list, it will create that many tasks inside the multi-instance sub-process. As all the tasks have the same process instance id, how do I identify the respective process variable values for each task, and the uniqueness of each flow afterwards? And is there a way to make it create a different instance id for each task of the multi-instance subprocess?
I did not get all of the question, but I will try to answer what I got:
Human tasks have their own task instance id.
What is the collection object? If you mean tasks in the BPMN model, then it is as expected: the process instance flow starts after the start node, and when it reaches a human task, it creates a task instance with an id. You can see it in the tasks UI, and with the API you can claim, work on, complete, populate data, etc.
It is wise to have a separate/different variable for every task that can execute in parallel. Then the input is kept in distinct data placeholders and you can use it accordingly.
You can create a different instance (task instance) for each task, or have repeatable tasks.
Well, the answer was to put the multi-instance activity into a sub-process; this allows me to have a separate process instance id for each element of my list (the input of the multi-instance).

How to pre-assign assignees to tasks of a process instance in Activiti

I have a requirement to dynamically set the assignees on the tasks of a process instance created from a process definition id. I get my assignee values from the UI side, submitted for an approval workflow. I then start the process and want to assign those assignees to the respective tasks. The problem is that I get only one task on process start, as Activiti only returns the current/active tasks. Since I don't get the rest of the task list, I am unable to set the assignees on those tasks.
I also have to find pending tasks and completed tasks for an assignee from a process instance. There is a task query I can use, but as I am not able to set the assignee for all tasks, this query is not much help to me.
So how can I get all tasks under a process instance, set the assignee on each user task, and then complete the user tasks whenever needed, using the process instance and task query?
Below is my workflow
Workflow describing the above scenario
To leverage the full power of the process engine, you would not pass runtime information at process start; you would dynamically determine the assignee at runtime, by using a TaskListener on the "create" event.
But if you have to stick to your approach: put the assignees in a map with the taskDefinitionKey as the key, and pass that map to the process instance as process variables.
Afterwards, in your BPMN model use "${taskDefinitionKey}" in the assignee field (the task definition key being the id of your user task, of course).
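A sketch of that approach in plain Java. The task definition keys and assignee names are made up for illustration; the resulting map would be passed to runtimeService.startProcessInstanceByKey(...) as process variables:

```java
import java.util.HashMap;
import java.util.Map;

public class AssigneeMapping {
    // Hypothetical task definition keys from the BPMN model. In Activiti,
    // the engine resolves activiti:assignee="${reviewTask}" against these
    // process variables when the task is created.
    static Map<String, Object> buildVariables() {
        Map<String, Object> vars = new HashMap<>();
        vars.put("reviewTask", "alice");  // assignee picked in the UI
        vars.put("approveTask", "bob");
        return vars;
    }

    // Mimics how the engine resolves "${taskDefinitionKey}" to an assignee.
    static String resolveAssignee(Map<String, Object> vars, String taskDefinitionKey) {
        return (String) vars.get(taskDefinitionKey);
    }

    public static void main(String[] args) {
        Map<String, Object> vars = buildVariables();
        System.out.println(resolveAssignee(vars, "reviewTask"));  // prints alice
        // In a real application the variables would be passed at start:
        // runtimeService.startProcessInstanceByKey("approvalProcess", vars);
    }
}
```

This way every user task in the definition gets its assignee resolved when the task becomes active, without needing all tasks to exist up front.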

Handling the merge scenario in a User Event script in NetSuite

I have successfully handled the 'create', 'delete' and 'edit' types in the afterSubmit event of a User Event script in NetSuite. What I need now is a way to capture merge events. When I merge two customer records in NetSuite, the function below isn't invoked at all, while it is invoked when I create, delete or edit a customer:
function afterSubmit(type)
{
...
}
Is there any way to handle merge scenarios?
Merge is not an event; it is handled by a duplicate manager.
Unless you hijack the merge button from the client side, I'm not sure it can be done.
Based on @felipechang's advice, I created a custom Merge Suitelet and all of the logic needed to go with it. All code can be found here.
Step 1
Create (or add logic to) the Customer User Event script to hide the existing merge button and add a separate one.
gist
Step 2
Create (or add logic to) the Customer Client script to wire up the merge button click event.
gist
Step 3
Create a Merge page Suitelet that mimics the functionality of the out-of-the-box merge page but behaves differently on submit.
gist
Step 4
Create a Merge page Client script
gist
Step 5
Create a scheduled task that is launched on merge submit to check the progress of the merge task and then fire off custom logic if it succeeds.
gist
Hopefully that saves someone some time.
An alternative to editing NetSuite standard scripts is to run a scheduled map/reduce on the NetSuite record type in question. Run a search in the map/reduce's getInputData, filtering on ["systemnotes.context","anyof","DPL"] (DPL => Duplicate Resolution), and process the affected records in the map stage. How immediately you need to react to the merge dictates the regularity of the scheduling. Unfortunately, you cannot get the data of the merging record; if that is needed, I would recommend putting a business process in place to add the desired data to the master record before doing a merge.