My use case is to approve, reject, or return a request, and the workflow should change its status based on that action.
How can Cadence help here to save and retrieve the full action history for each workflow?
A workflow is code, so if you want to return the history of actions, store them in a list variable and return it when queried.
As all workflow variables are persisted by the Temporal service, there is no need to save them to a database explicitly.
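A minimal sketch of that approach with the Cadence Java client (the interface and all names such as ApprovalWorkflow and applyAction are made up for illustration): each action arrives as a signal, is appended to an ordinary list field, and is exposed through a query method.

import com.uber.cadence.workflow.QueryMethod;
import com.uber.cadence.workflow.SignalMethod;
import com.uber.cadence.workflow.Workflow;
import com.uber.cadence.workflow.WorkflowMethod;
import java.util.ArrayList;
import java.util.List;

public interface ApprovalWorkflow {
    @WorkflowMethod
    void processRequest(String requestId);

    @SignalMethod
    void applyAction(String action); // "approve", "reject" or "return"

    @QueryMethod
    List<String> getActionHistory();
}

class ApprovalWorkflowImpl implements ApprovalWorkflow {
    // A plain field; the service persists it as part of the workflow
    // state, so no explicit database write is needed.
    private final List<String> actionHistory = new ArrayList<>();

    @Override
    public void processRequest(String requestId) {
        // Block until a terminal action arrives via signal; the list
        // survives worker restarts like any other workflow state.
        Workflow.await(() -> actionHistory.contains("approve")
                || actionHistory.contains("reject"));
    }

    @Override
    public void applyAction(String action) {
        actionHistory.add(action);
    }

    @Override
    public List<String> getActionHistory() {
        return actionHistory;
    }
}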
In event sourcing, when a command is handled, all of the domain's events have to be stored, and for each event the system must increase the version of the aggregate. My event store is something like this:
(AggregateId, AggregateVersion, Sequence, Data, EventName, CreatedDate)
(AggregateId, AggregateVersion) is the key
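For concreteness, that row as a plain Java value type (the field types are my assumption; the question gives only the column names):

import java.time.Instant;

// One row of the event store described above; the composite key is
// (aggregateId, aggregateVersion).
public final class StoredEvent {
    final String aggregateId;
    final long aggregateVersion; // bumped once per state-changing event
    final long sequence;         // position within the stream
    final byte[] data;           // serialized event payload
    final String eventName;
    final Instant createdDate;

    StoredEvent(String aggregateId, long aggregateVersion, long sequence,
                byte[] data, String eventName, Instant createdDate) {
        this.aggregateId = aggregateId;
        this.aggregateVersion = aggregateVersion;
        this.sequence = sequence;
        this.data = data;
        this.eventName = eventName;
        this.createdDate = createdDate;
    }
}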
In some cases it does not make sense to increase the version of an aggregate. For example,
a command registers a user and raises RegisteredUser, WelcomeEmailEvent, and GiftCardEvent.
How can I handle this problem?
Avoid confusing your representation-of-information-changes events with your publishing-for-use-elsewhere events.
"Event sourcing", as commonly understood in the domain-drive-design and cqrs space, is a kind of data model. We're talking specifically about the messages an aggregate sends to its future self that describe its own changes over time.
It's "just" another way of storing the state of the aggregate, same as we would do if we were storing information in a relational database, or a document store, etc.
Messages that we are going to send to other components and then forget about don't need to have events in the event stream.
In some cases, there can be confusion when we haven't recognized that there are multiple different processes at work.
A requirement like "when a new user is registered, we should send them a welcome email" is not necessarily part of the registration process; it might instead be an independent process that is triggered by the appearance of a RegisteredUser event. The information that you need to save for the SendEmail process would be "somewhere else" - outside of the Users event history.
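A sketch of that separation (all names here - RegisteredUser, EmailSender, WelcomeEmailPolicy - are hypothetical): the policy reacts to the published RegisteredUser event, and whether the mail was sent is bookkeeping of this process, kept outside the User event stream.

// All types here are illustrative, not tied to a specific framework.
public class WelcomeEmailPolicy {

    public interface EmailSender {
        void sendWelcome(String address);
    }

    public record RegisteredUser(String userId, String email) {}

    private final EmailSender emailSender;

    public WelcomeEmailPolicy(EmailSender emailSender) {
        this.emailSender = emailSender;
    }

    // Invoked by whatever pub/sub mechanism delivers published events.
    // Any state of this process (sent / not sent) lives "somewhere
    // else" - never as an event in the User's own stream.
    public void on(RegisteredUser event) {
        emailSender.sendWelcome(event.email());
    }
}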
An event changes the state of an aggregate, and therefore changes its version. If the state is not changed, then there should be no event for this aggregate.
In your example, I would ask myself: if WelcomeEmailEvent does not change the state of the User aggregate, then whose state does it change? Perhaps some other aggregate - some EmailNotification service that cares about successful or failed email attempts. In that case I would make it an event of the aggregate whose state it changes, and it will affect the version of that aggregate.
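A minimal sketch of that rule (names are made up): the version advances only when an event is appended to the aggregate's own stream; notification messages never touch it.

import java.util.ArrayList;
import java.util.List;

// Illustrative only; not tied to any event-sourcing framework.
public class UserAggregate {
    public record RegisteredUser(String email) {}

    private long version = 0;
    private final List<Object> uncommittedEvents = new ArrayList<>();

    public void register(String email) {
        // RegisteredUser changes the User's state, so it is appended
        // here and the version advances.
        apply(new RegisteredUser(email));
        // A welcome email or gift card is someone else's state change;
        // nothing is appended here, so this version stays put.
    }

    private void apply(Object event) {
        uncommittedEvents.add(event);
        version++;
    }

    public long version() {
        return version;
    }
}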
I'm new to Azure DevOps and I'm trying to make a field required when the State changes. For example, when the State changes to Approver, the Reviewer Sign Off should be required. The rule I wrote (see the attached screenshot) is not working.
The fields are On/Off fields. Can you please advise?
In an Azure DevOps work item, if we use a Boolean-type field, the field is set to Required by default and we cannot change that.
Following our call, I have updated my answer to help you create a demo.
Update
This is used to change the state to Reviewer. Here are the steps:
Since we cannot use a Boolean-type field to set the required rule, we need another field to help us do it. Here I use the Date/Time type.
We need to associate the fields Review Sign Off and Review Sign Off Time, so we need to create five more rules (two of them are used to prevent misoperation).
Here are the details:
Note: my test state flow is New -> fa -> Rev -> Close; this demo sets up the rule for the fa -> Rev transition.
Create the ChangeState rule:
Create the SetTime rule:
Create the ReadOnly rule: when the state is not the one immediately before Rev, make the sign-off read-only.
Create two other rules to prevent misoperation:
I want to store an object inside a workflow and then retrieve it through the Cadence API.
// Lists open workflow executions in the domain; this returns visibility
// records, not the state stored inside each workflow.
ListOpenWorkflowExecutionsRequest listOpenWorkflowExecutionsRequest = new ListOpenWorkflowExecutionsRequest();
listOpenWorkflowExecutionsRequest.setDomain(DOMAIN);
listOpenWorkflowExecutionsRequest.setStartTimeFilter(startTimeFilter);
ListOpenWorkflowExecutionsResponse response =
    cadenceService.ListOpenWorkflowExecutions(listOpenWorkflowExecutionsRequest);
I am open to going with any solution.
Use the QueryWorkflowExecution API to retrieve information from a single workflow.
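For example, with the Cadence Java client (the workflow interface and method names below are assumptions for illustration):

import com.uber.cadence.client.WorkflowClient;
import com.uber.cadence.workflow.QueryMethod;
import java.util.List;

public class QueryExample {
    // Hypothetical workflow interface; the query method exposes the
    // object kept inside the workflow.
    public interface MyWorkflow {
        @QueryMethod
        List<String> getState();
    }

    static List<String> fetchState(WorkflowClient client, String workflowId) {
        MyWorkflow workflow = client.newWorkflowStub(MyWorkflow.class, workflowId);
        return workflow.getState(); // executes a query, not a new workflow run
    }
}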
The list API is used to get lists of workflows without querying them directly. You can attach custom information (called memo) to a visibility record that is returned by a list API. Use WorkflowOptions.memo property to add it.
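A sketch of attaching a memo at start time, assuming a Cadence Java client version with memo support ("requestType" is an arbitrary example key):

import com.uber.cadence.client.WorkflowOptions;
import java.util.Collections;

public class MemoExample {
    static WorkflowOptions optionsWithMemo(String taskList) {
        // The memo travels with the visibility record that the list
        // APIs return, so it can be shown without querying the workflow.
        return new WorkflowOptions.Builder()
                .setTaskList(taskList)
                .setMemo(Collections.singletonMap("requestType", "purchase-order"))
                .build();
    }
}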
The memo is not indexable. If you want the ability to index on custom attributes, use the Search Attributes feature. Another feature of search attributes is that they are updatable from the workflow code using the upsertSearchAttributes API. So, for example, if the workflow code updates the "state" attribute on each state transition, it becomes possible to find all the workflows in a given state. Also, all the search attributes are returned by the list API, so their values can be shown in the UI list view even if they are not part of the search predicate. Note that this requires Elasticsearch integration to be enabled on the cluster.
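A sketch of that state-transition idea ("CustomStringField" is one of the default search attribute keys in Cadence development setups; a production cluster needs the key registered and Elasticsearch enabled):

import com.uber.cadence.workflow.Workflow;
import java.util.Collections;

public class StateTransition {
    // Must be called from inside workflow code, on each state change.
    static void recordState(String newState) {
        Workflow.upsertSearchAttributes(
                Collections.singletonMap("CustomStringField", newState));
    }
}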
Sorry if this is sort of confusing because I'm not sure how to word this. I am trying to create a workflow that runs on Accounts in Microsoft CRM 2011. One part of this workflow requires me to retrieve a field contained in the Business Unit of the User in the Account's "Created By" field. However, the workflow will only allow me to access the Business Unit itself, but not any of its fields.
I'm wondering if there is a simple trick or work-around that will allow me access to this data.
Thanks!
For reference, the Account has a User, who has a Business Unit, and the Business Unit has a field I need to access. CRM, however, doesn't want to let me get more than 2 levels deep when accessing fields.
Clunky but do-able if you accept a bit of denormalisation (temporarily or otherwise). I'll assume for the sake of example you want to get at the "cost centre" field from the BU.
Add a field on the User entity to temporarily hold the value from the BU (make it the same type and length, text(100) in this case), and optionally put it on the form.
Create a child workflow for the User entity to update the user with the "cost centre" value from their BU. Make it only available to run as a child, not on demand or anything else. Activate it.
In your Account workflow, add a step to call the child workflow against the relevant user (eg Created By in your case).
Add a step to wait until the new cost centre field on the user record contains data.
Now do whatever you need to with the value from the user record, such as update the Account, or do some branched logic.
Whatever you do, once you have used the value, clear the field on the user record, or do this as the last step of the workflow.
Now, since Users don't change BU very often, you might actually just go ahead and keep that value on the User record permanently, and instead of a child workflow, simply run this on create of a new user, or on change of BU, and store the value permanently on the User record. Yes, it is 'denormalised' and not purest SQL design, but then you don't need a child workflow, you don't need a wait state and you don't have to clear the value at the end, or worry about what happens when two Accounts need to run their workflow at the same time. I include the more general approach above as this might apply to other records which do change their parent quite often.
Just an additional thought - you can access the "owning business unit" of the Account, but this will be the BU of the Owning User rather than the Created By user. Is your business process such that these would normally be the same person? (eg users only have Create privilege to "user owned" depth, so can only create records they own).
If so, then you could get at the BU directly from the Account, and then any fields on it too (in a condition or to update the Account).
An alternative which is less ideal but takes a similar approach - add a relationship from Account to BU (eg "created BU"). Now you can update the Account with this by referring to the Created By user's BU, then in the next step reference this value from the Account. This is again denormalised, and less preferable since the number of Accounts is far greater than the number of users, so the level of duplicate information is much higher.
You can't get deeper with the standard steps of a workflow.
The solution is to create a custom workflow activity; you can start from this article:
http://msdn.microsoft.com/en-us/library/gg328515.aspx
In WF4, I've created a descendant of TrackingParticipant. In the Track method, record.InstanceId gives me the GUID of the workflow instance.
I'm using the SqlWorkflowInstanceStore for persistence. By default records are automatically deleted from the InstancesTable when the workflow completes. I want to keep it that way to keep the transaction database small.
This creates a problem for reporting, though. My TrackingParticipant will log the instance ID to a reporting table (along with other tracking information), but I'll want to join to the ServiceDeploymentsTable. If the workflow is complete, that GUID won't be in the InstancesTable, so I won't be able to look up the ServiceDeploymentId.
How can I obtain the ServiceDeploymentId in the TrackingParticipant? Alternately, how can I obtain it in the workflow to add it to a CustomTrackingRecord?
You can't get the ServiceDeploymentId in the TrackingParticipant. Basically the ServiceDeploymentId is an internal detail of the SqlWorkflowInstanceStore.
One option is to set the SqlWorkflowInstanceStore not to delete the workflow instance upon completion and delete it myself at some later point in time, after saving the ServiceDeploymentId with the InstanceId.
An alternative is to use auto cleanup with the SqlWorkflowInstanceStore and retrieve the ServiceDeploymentId when the first tracking record is generated. At that point the workflow is not complete, so the original instance record is still there.