How does Activiti dynamic assignment of candidate users work?

There is a way to pass candidate users dynamically to an Activiti workflow, as described in:
How do I pass a list of candidate users to an activiti workflow task in alfresco?
When candidateUser/candidateGroup is set for a UserTask using a variable, when is the expression evaluated? Is the task-to-user/group mapping persisted in the database for fast queries, such as listing all the tasks a particular user can claim? What table is it stored in?

When human tasks are created, two distinct events fire.
Create: the task itself is created and most of the task metadata is associated with it.
Assign: the task assignment is evaluated and the task is assigned to either an assignee or a candidate group.
As such, the candidateGroup expression is evaluated during the assign phase.
This means we can easily manipulate the list of candidates based on a rule, a database result, or some other business logic before the task is actually assigned, by using a task listener that fires on the create event.
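For example, a task listener on the create event could look something like this (a minimal sketch; the listener class name and the candidateUserList variable are illustrative, not part of the original question):

```java
import java.util.List;

import org.activiti.engine.delegate.DelegateTask;
import org.activiti.engine.delegate.TaskListener;

// Illustrative listener; register it on the user task's "create" event in the BPMN model.
public class DynamicCandidateListener implements TaskListener {

    @Override
    @SuppressWarnings("unchecked")
    public void notify(DelegateTask delegateTask) {
        // Decide the candidates from a rule, database result, or other business logic.
        // Here they are simply read from a process variable for illustration.
        List<String> candidates = (List<String>) delegateTask.getVariable("candidateUserList");
        if (candidates != null) {
            delegateTask.addCandidateUsers(candidates);
        }
        // A candidate group can be added the same way.
        delegateTask.addCandidateGroup("managers");
    }
}
```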
Hope this helps,
G

Concerning the "What table is it stored in?" part of your question:
Candidate users/groups for a given task or process are stored in the ACT_RU_IDENTITYLINK table.
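For example, to list all open tasks a particular user can claim, you can use the task query API instead of querying that table directly (a minimal sketch using standard Activiti TaskQuery calls):

```java
import java.util.List;

import org.activiti.engine.TaskService;
import org.activiti.engine.task.Task;

// Minimal sketch: list all open tasks a given user can claim,
// i.e. tasks for which the user is a candidate, directly or via a group.
public class CandidateTaskQuery {

    public List<Task> claimableTasks(TaskService taskService, String userId) {
        return taskService.createTaskQuery()
                .taskCandidateUser(userId)
                .list();
    }
}
```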

Related

How to identify the multi-instance sub-process and differentiate it from the main process in jBPM?

I have used a multi-instance subprocess which includes a workflow with a human task. When executing, it creates as many human tasks as there are elements in the collection object. But all the tasks have the same process instance id. How does the relation work between the parent process and the multi-instance subprocess?
If there are multiple elements in the collection list, it will create that many tasks inside the multi-instance sub-process. As all the tasks have the same process instance id, how do I identify the respective process variable values for each task and the uniqueness of each flow afterwards? And is there a way to make it create a different instance id for each task of the multi-instance subprocess?
I did not get all of the question, but I will try to answer what I got:
Human tasks have their own task instance id (see the sketch after this list).
What is the collection object? If you mean tasks in the BPMN model, then it is as expected: the process instance flow starts after the start node, and when it reaches a human task it creates a task instance with its own id. You can see it in the tasks UI, and with the API you can claim it, work on it, complete it, populate data, etc.
It is wise to have a separate/different variable for every task that can execute in parallel. Then the input is kept in distinct data placeholders and you can use it accordingly.
You can create a different instance (task instance) for each task, or have repeatable tasks.
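For illustration, a minimal sketch using the jBPM task API to list the task instances created under one process instance (the engine setup is assumed, and the printed fields are just examples):

```java
import java.util.List;

import org.kie.api.task.TaskService;
import org.kie.api.task.model.Task;

// Minimal sketch: assumes a TaskService obtained from a running jBPM runtime engine.
public class MultiInstanceTaskLookup {

    public void listTasks(TaskService taskService, long processInstanceId) {
        // All human task instances spawned under this process instance,
        // including the ones created by the multi-instance node.
        List<Long> taskIds = taskService.getTasksByProcessInstanceId(processInstanceId);
        for (Long taskId : taskIds) {
            Task task = taskService.getTaskById(taskId);
            // Each task has its own id, so it can be claimed and completed independently.
            System.out.println("Task " + taskId + ": " + task.getName());
        }
    }
}
```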
Well, the answer was to put the multi-instance into a sub-process; this allows me to have a separate process instance id for each element of my List (the input of the multi-instance).

Adding parameters to VSTS Task Group

I have a Task Group that I created out of a set of build tasks. I am able to edit the tasks quite well, but I now realise I will need to add another parameter to the task group. How do I go about doing that?
Task group parameters are automatically created based on the variables used in the tasks. If you reference a new variable in a task that's within a task group, it will show up as a parameter.
In addition to the accepted answer: if you want to add parameters that are not directly referenced by tasks within the task group (e.g. to use in a config file token replacement task), you can export your task group, edit the .json file, then import it back in. The parameters are in an inputs array near the end of the file. You can also hide parameters here if you only want to use them internally in the task group, by setting a default value and adding a 'visibleRule' property; see this article for details: https://medium.com/objectsharp/how-to-hide-a-task-group-parameter-b95f7c85870c
Note that importing will create a new task group rather than updating your current one. If you want to update the existing task group, you can use this REST API:
https://learn.microsoft.com/en-us/rest/api/azure/devops/distributedtask/taskgroups/update?view=azure-devops-rest-5.1

How to pre-assign assignees to tasks of a process instance in Activiti

I have a requirement to dynamically set the assignees of the tasks of a process instance created from a process definition id. I get my assignee values from the UI, submitted for an approval workflow. Now I start the process and want to assign those assignees to the respective tasks. The problem is that I get only one task on start of the process, as Activiti returns only the current/active tasks. Since I don't get the rest of the task list, I am unable to set the assignee on those tasks.
I also have to find the pending and completed tasks of an assignee in a process instance. There is a task query I could use for that, but as I am not able to set the assignee for all tasks, this query is not much help to me.
So how can I get all tasks under a process instance, set the assignee on each user task, and then complete the user tasks whenever needed, using the process instance and task queries?
Below is my workflow:
(diagram describing the above scenario)
To leverage the full power of the process engine, you would not pass runtime information at process start; you would determine the assignee dynamically at runtime, using a TaskListener on the "create" event.
But if you have to stick to your approach: put the assignees in a map with the taskDefinitionKey as the key, and pass that map to the process instance as its process variables.
Afterwards, in your BPMN model, use "${taskDefinitionKey}" in the assignee field (taskDefinitionKey being the id of your user task, of course).
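A minimal sketch of both ideas in Java; the process key, task ids, and user names are illustrative, not taken from the question:

```java
import java.util.HashMap;
import java.util.Map;

import org.activiti.engine.RuntimeService;
import org.activiti.engine.delegate.DelegateTask;
import org.activiti.engine.delegate.TaskListener;

public class PreAssignSketch {

    // Second approach: pass one variable per user task, named after the task's
    // definition key, so an assignee expression like "${approveTask}" in the model
    // resolves to "alice" for the user task with id "approveTask".
    public void startWithAssignees(RuntimeService runtimeService) {
        Map<String, Object> variables = new HashMap<>();
        variables.put("approveTask", "alice"); // "approveTask" is the user task id in the model
        variables.put("reviewTask", "bob");
        runtimeService.startProcessInstanceByKey("myProcess", variables);
    }

    // First approach: decide the assignee only when the task is actually created,
    // via a TaskListener registered on the "create" event of each user task.
    public static class AssignOnCreateListener implements TaskListener {
        @Override
        public void notify(DelegateTask task) {
            Object assignee = task.getVariable(task.getTaskDefinitionKey());
            if (assignee != null) {
                task.setAssignee(assignee.toString());
            }
        }
    }
}
```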

Quartz capabilities

I am trying to create a Quartz scheduler in Java which will be able to call an API and pass in data.
I am totally new to Quartz, but by now I understand the Job concept and how to create one, I understand the trigger concept and how to trigger one,
and I understand how the scheduler works.
What I am having difficulty with is how to pass in the information that needs to be handed to the API. I have an example where an API is called and the data is entered into the DB, but the information has been hard coded into the class instead of being passed in via the JobDetail.
For example, the user passes a message to the system which needs to be sent back to the user in 12 hours and not before, so what I was planning was to create a Job and a trigger that sets the execution time to 12 hours from now. How do I pass the message into the scheduler? Where should this message be stored? Is what I am trying to do possible? Have I misunderstood what Quartz is capable of doing?
Thank you for your time. Any assistance would be greatly appreciated.
Take a look at JobDataMap. If you are creating a new job for each user action, you can store the message in there and it will be available during execution.
JobDataMap holds state information for Job instances.
JobDataMap instances are stored once when the Job is added to the scheduler. They are also re-persisted after every execution of jobs annotated with @PersistJobDataAfterExecution.
JobDataMap instances can also be stored with a Trigger. This can be useful in the case where you have a Job that is stored in the scheduler for regular/repeated use by multiple Triggers, yet with each independent triggering, you want to supply the Job with different data inputs.
The JobExecutionContext passed to a Job at execution time also contains a convenience JobDataMap that is the result of merging the contents of the trigger's JobDataMap (if any) over the Job's JobDataMap (if any).
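For example, with Quartz 2.x you could store the user's message in the job's JobDataMap and fire the trigger 12 hours later (a minimal sketch; the job class and identities are illustrative):

```java
import static org.quartz.DateBuilder.futureDate;
import static org.quartz.JobBuilder.newJob;
import static org.quartz.TriggerBuilder.newTrigger;

import org.quartz.DateBuilder.IntervalUnit;
import org.quartz.Job;
import org.quartz.JobDetail;
import org.quartz.JobExecutionContext;
import org.quartz.Scheduler;
import org.quartz.SchedulerException;
import org.quartz.Trigger;
import org.quartz.impl.StdSchedulerFactory;

public class DelayedMessageExample {

    // The job reads the message back out of the merged JobDataMap at execution time.
    public static class SendMessageJob implements Job {
        @Override
        public void execute(JobExecutionContext context) {
            String message = context.getMergedJobDataMap().getString("message");
            // Call your API with the message here (placeholder).
            System.out.println("Sending: " + message);
        }
    }

    public static void schedule(String message) throws SchedulerException {
        Scheduler scheduler = StdSchedulerFactory.getDefaultScheduler();
        scheduler.start();

        // Store the user's message in the JobDataMap so it travels with the job.
        JobDetail job = newJob(SendMessageJob.class)
                .withIdentity("sendMessage-" + System.currentTimeMillis())
                .usingJobData("message", message)
                .build();

        // Fire once, 12 hours from now.
        Trigger trigger = newTrigger()
                .startAt(futureDate(12, IntervalUnit.HOUR))
                .build();

        scheduler.scheduleJob(job, trigger);
    }
}
```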
In case you have a single job but for each user action you are creating a new trigger, you can follow the solution given here.
A third option: for each user action, persist the message and the time at which the email has to be sent to the database, and have a job that runs periodically and scans the database for eligible records for which the email still has to be sent.
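A sketch of that polling variant, assuming a repeating Quartz job whose body would query your own message table (the query itself is only a placeholder comment):

```java
import static org.quartz.JobBuilder.newJob;
import static org.quartz.SimpleScheduleBuilder.simpleSchedule;
import static org.quartz.TriggerBuilder.newTrigger;

import org.quartz.Job;
import org.quartz.JobDetail;
import org.quartz.JobExecutionContext;
import org.quartz.Scheduler;
import org.quartz.SchedulerException;
import org.quartz.Trigger;
import org.quartz.impl.StdSchedulerFactory;

public class PollingSketch {

    // Runs every minute and would query the database for messages whose send time has passed.
    public static class ScanAndSendJob implements Job {
        @Override
        public void execute(JobExecutionContext context) {
            // Placeholder: SELECT messages WHERE send_at <= now AND sent = false,
            // call the API for each row, then mark the row as sent.
        }
    }

    public static void start() throws SchedulerException {
        Scheduler scheduler = StdSchedulerFactory.getDefaultScheduler();
        JobDetail job = newJob(ScanAndSendJob.class).withIdentity("scanAndSend").build();
        Trigger trigger = newTrigger()
                .withSchedule(simpleSchedule().withIntervalInMinutes(1).repeatForever())
                .build();
        scheduler.scheduleJob(job, trigger);
        scheduler.start();
    }
}
```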

How do I listen for, load and run user-defined workflows at runtime that have been persisted using SqlWorkflowInstanceStore?

The result of SqlWorkflowInstanceStore.WaitForEvents does not tell me what type of workflow is runnable. The constructor of WorkflowApplication takes a workflow definition, and at a minimum, I need to be able to store a workflow ID in the store and query it, so that I can determine which workflow definition to load for the WorkflowApplication.
I also don't want to create a SqlWorkflowInstanceStore for each custom workflow type, since there may be thousands of different workflows.
I thought about trying to use WorkflowServiceHost, but not every workflow has a Receive activity and I don't think it is feasible to have thousands of WorkflowServiceHosts running, each supporting a different workflow type.
Ideally, I just want to query the database for a runnable workflow, determine its workflow definition ID, load the appropriate XAML from a workflow definition table, instantiate WorkflowApplication with the workflow definition, and call LoadRunnableInstance().
I would like to have a way to correlate which workflow is related to a given HasRunnableWorkflowEvent raised by the SqlWorkflowInstanceStore (along with the custom workflow definition ID), or have an alternate way of supporting potentially thousands of different custom workflow types created at runtime. I must also load balance the execution of workflows across multiple application servers.
There's a free product from Microsoft that does pretty much everything you say there, and then some. Oh, and it's excellent too.
Windows Server AppFabric. No, not Azure.
http://www.microsoft.com/windowsserver2008/en/us/app-main.aspx
-Oisin