jBPM 6: how to define a swimlane for a task - jboss

I'm having a problem with tasks and lanes: I need the actor who starts a task to persist, with its own data, throughout the whole life cycle, so that other actors cannot claim these tasks.
In the task properties there is no swimlane definition.
Thanks

In jBPM6, the task itself no longer has a swimlane property (as the Eclipse modeler in jBPM5 had); just use a graphical swimlane and put the task in it.
A second task in the swimlane should automatically be assigned to the person who completed the first one.

The swimlane is a graphical process element, just like tasks, and in jBPM 6 you can find it in the Object Library between Data Objects and Artifacts. There is no entry for swimlanes in the current documentation, but there is a small entry in the documentation for Red Hat's commercial version:
https://access.redhat.com/site/documentation/en-US/Red_Hat_JBoss_BPM_Suite/6.0/html/User_Guide/Lanes.html
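For reference, here is a minimal sketch of how a lane appears in the underlying BPMN2 XML (the ids and names are illustrative). The lane lists its tasks via flowNodeRef, which is what lets jBPM assign later tasks in the lane to whoever completed the earlier one:

```xml
<laneSet id="laneSet_1">
  <lane id="lane_reviewer" name="Reviewer">
    <!-- both tasks live in the same lane, so the same actor keeps them -->
    <flowNodeRef>reviewTask</flowNodeRef>
    <flowNodeRef>approveTask</flowNodeRef>
  </lane>
</laneSet>
```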

Related

How to add Tasks as a backlog navigation level in VSTS

Asking this here rather than on Software Engineering, simply because there is a tag for VSTS here while there isn't one there...
So currently I'm seeing this:
What I'd like to see is 'Tasks' in there alongside Epics, Features, and Stories, and have it mapped to the 'Tasks' WIT.
How do I do this?
I've had a look at https://learn.microsoft.com/en-us/vsts/organizations/settings/work/customize-process-backlogs-boards?view=vsts, but I'm not sure if that's what I want.
This can't be achieved with the standard Task work item. The task level is a special kind of backlog level that enables the Task Board and iterations. The distinction is pretty artificial and stems from the time when TFS didn't even have a Feature or Epic level and there were just two kinds of plans: long-term (Story, PBI, Requirement) and short-term (Task).

Is it possible to trigger a release with another release in VSTS?

In VSTS, is it possible to trigger a release upon the completion of another release? I know the release is typically triggered by the completion of a build, but I am wondering if I can trigger one release with another release, for the sake of organizing my processes into multiple releases and linking them together.
Thanks.
That's not a capability at the present time, although you could write a custom task using the REST API to accomplish the same thing, or check the marketplace to see if someone has already created a task that does it.
There are a lot of considerations when you start doing things like this, though. What build are you releasing in your "sub-releases"? The latest one? That build might not be stable enough to deploy. A hard-coded value? That's going to be a thing that people forget to update.
Typically, my approach in situations like this is to break the releases down into discretely deployable units that have no dependencies on other units. They can be promoted through the stages as necessary. Then, if you have the occasional need to do everything all together (for example, provisioning a brand new environment), have a "combo" release that encompasses everything. There isn't even a need for duplication with the recent introduction of meta-tasks.
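For the custom-task route, the underlying REST call is a POST to the Release endpoint of the downstream pipeline. Here is a minimal sketch that only builds the request; the organization, project, definition id, and PAT are placeholders, and actually sending the request (e.g. with urllib) is left out:

```python
import base64
import json

def build_release_request(organization, project, definition_id, pat):
    """Build URL, headers, and body for the Azure DevOps "create release" call.
    All argument values here are placeholders for your own pipeline."""
    url = (f"https://vsrm.dev.azure.com/{organization}/{project}"
           f"/_apis/release/releases?api-version=7.0")
    # PATs are sent as HTTP basic auth with an empty user name.
    token = base64.b64encode(f":{pat}".encode()).decode()
    headers = {
        "Authorization": f"Basic {token}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "definitionId": definition_id,  # id of the downstream release definition
        "description": "Triggered by upstream release",
    })
    return url, headers, body

url, headers, body = build_release_request("myorg", "myproject", 42, "my-pat")
```

Which build the downstream release picks up is still governed by that definition's artifact settings, which is exactly the "latest vs. hard-coded" concern raised above.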
You can try Release Orchestrator extension from the marketplace which adds a task to execute and track progress of multiple release pipelines in Azure DevOps.
Looks like you can use the VSTS Trigger extension from the marketplace to trigger a new build or release from a build or release.
I think an easier option would be to schedule a release. If you create a schedule for the release of each of your application repositories you can ensure that they will all reach the servers at the same time.
Another option is to use stage scheduled triggers.

Custom WorkFlows vs Plug-ins in MS CRM

I have used a lot of plug-in code to implement business logic in CRM, but now I've come across this feature called Custom Workflow Activity.
Now I wonder: when should I use these custom workflows over plug-ins?
Code activities are custom steps which can be inserted into one or many different workflows. They are a kind of "plug-in", but meant to be inserted into workflows.
Workflows give you more feedback because they are represented visually in CRM, so non-technical people can see the status of a workflow and the steps which were executed since the start. Workflows are executed by the Asynchronous Service, so they run asynchronously; plug-ins run synchronously, inside the application pool.
So workflows are also better for long running processes.
With that being said, plugins are still helpful when:
You need to have an immediate response, because they are triggered and executed inside CRM's application pool and,
You need to run anything inside the transaction, so they can abort it by raising an exception.
Example: you have an integration with a 3rd party service, where a record can't be created in CRM unless something is validated on the other side. Another example is concurrency: the auto-number plugin is a plugin because it needs to lock the database in the transaction, otherwise multiple concurrent threads could create duplicate IDs.
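The abort-by-exception pattern in that example can be sketched language-neutrally (the real CRM SDK is .NET, where a plug-in throws InvalidPluginExecutionException; the names below are illustrative):

```python
class TransactionAborted(Exception):
    """Raised inside the transaction to roll everything back."""

def create_record(record, validate_externally):
    # Plug-in style: because validation runs *inside* the transaction,
    # raising an exception aborts the create instead of committing bad data.
    if not validate_externally(record):
        raise TransactionAborted("3rd-party validation failed; create aborted")
    return dict(record, id=1)  # pretend the record was committed
```

If the 3rd-party check fails, the caller never sees a committed record; a workflow, running asynchronously after the fact, could not prevent the create in the same way.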
So, the answer, like always is: It depends. :)
I went deep into the subject myself and found some interesting things I want to share.
So here is the full comparison:
Plug-ins only fire on data changes, such as creating or updating records, but custom workflows take part inside a process (workflow, dialog, ...).
As a result, workflows can be triggered not only on data changes, but also on demand, at any time, at any point inside their process. As you might have already understood, this is the real flexibility needed for implementing complicated business logic.
Plug-ins won't accept arguments or passed-in data,
but custom workflows make it possible by using InArgument properties like the one below:
[Input("Case")]                // label of the field shown in the workflow designer
[ReferenceTarget("incident")]  // when using EntityReference, the target entity type must be specified
public InArgument<EntityReference> yourArg { get; set; }  // almost every data type is supported
Workflows can easily be used and modified by business users.
Custom workflows are highly reusable: with one registration you have a piece of business logic that can be used in several situations.
In some cases you might even write code which can be used across many different entities.
So far it may sound like custom workflows beat plug-ins, but the point where a plug-in takes over a custom workflow is when you are validating data changes and may need to revert them. Of course this is possible in custom workflows too, but it's much easier to add a plug-in than a workflow.
And bear in mind that plug-ins run faster (as I tested myself)!
Profiling workflows in CRM, however, is still buggy!
Many developers and MS CRM beginners get confused in some scenarios about whether to go with workflows or plug-ins, as both can be used to perform a specific task server-side.
Plug-ins and workflows have some significant differences, such as limitations on event messages and triggering points.
You can refer to the link below for a complete rundown of the differences:
https://mscrm16tech.com/.../workflows-vs-plugins-in-ms-crm/

Workflow Foundation and backward compatibility with long running instances

I recently joined a project where Workflow Foundation 4.0 is being used to model business processes.
We have a designer tool so that consultants for clients can customise the workflow definitions. We also persist the workflow instance along with the definition. The workflows can be long running (e.g. months or potentially years).
My question is: how do we manage backward compatibility for each release, given we don't necessarily know what customisations have been made and which legacy workflows are still in flight? We are loading from XAML, but even seemingly minor changes to workflow definitions prevent them from loading. Migration scripts were my initial thought, but that seems non-trivial given the complexity of WF workflows.
First off, XOML is 3.0; WF4 uses straight up XAML.
There are two options for doing this. It depends on whether you need to upgrade a long-running workflow in process, or whether you want to update the workflow and use it for all new instances while keeping current instances running on the previous version. Let's call these two options the upgrade and the multiversion strategies.
Re multiversion:
I'm doing this currently. Essentially, you must isolate every different version of the same workflow within an AppDomain. Deserializing from XAML and creating a new instance of a type amount to the same thing--both result in an assembly being loaded into the current AppDomain. If v1 of the workflow is defined in assembly A.1 and v2 in assembly A.2, you can run into binding issues if you aren't careful. Isolating each version within its own AppDomain reduces the chance of this happening.
Re upgrade:
This currently isn't supported, but there are plans to include it in a (near) future release. Ron Jacobs gave a presentation at PDC10 last October detailing WF4 futures. Three things (that I can remember) were mentioned in the presentation--metadata errors breaking the build, the state machine, and providing an upgrade path for workflows during execution. I can tell you that the state machine was released in the recent Platform Update, and I've been told the metadata-error-breaks-the-build feature is coming soon; I'd assume the upgrade path feature will follow.
I've done some research on this.
WF4 Workflow Versioning Spike
WF4 Versioning Spike: Planning for Change
Distributed System Versioning Myth #1

Web Application deployment and database/runtime data management

I have decided to finally nail down my team's deployment processes, soup-to-nuts. The last remaining pain point for us is managing database and runtime data migration/management. Here are two examples, though many exist:
If releasing a new "Upload" feature, automatically create the upload directory and configure permissions. In later releases, verify existence/permissions - forever, automatically.
If a value in the database (let's say an Account Status of "Signup") is no longer valid, automatically migrate data in database to proper values, given some set of business rules.
I am interested in implementing a framework that allows developers to manage and deploy these changes with the same ease that we manage and deploy our code.
So the first question is: 1. What tools/frameworks are out there that provide this capacity?
In general, this seems to be an issue in any given language and platform. In my specific case, I am deploying a .NET MVC2 application which uses Fluent NHibernate for database abstraction. I already have in my deployment process a tool which triggers NHibernate's SchemaUpdate - which is awesome.
What I have built up to address this issue in my own way is a tool that scans target assemblies for classes which inherit from a certain abstract class (Deployment). That abstract class exposes hooks which you can override to implement your own arbitrary deployment code, in the context of your application's codebase. The Deployment class also provides a versioning mechanism, and the tool manages the current "deployment version" of a given running app. A custom NAnt task then glues this together with the NAnt deployment script, triggering the hooks at the appropriate times.
This seems to work well and meets my goals - but here's my beef, which leads to my second question: 2. Surely what I just wrote already exists. If so, can you point me to it? And 3. Has anyone started down this path who has insight into problems with this approach?
Lastly, if something like this exists, but not on the .NET platform, please still let me know - as I would be more interested in porting a known solution than starting from zero on my own solution.
Thanks everyone, I really appreciate your feedback!
Each major release, have a script to create the environment with the exact requirements you need.
For minor releases, have a script that is split by release and incrementally alters the environment. There are some big benefits to this:
You can look at the changes to the environment over time by reading the script and matching it with release notes and change logs.
You can create a brand new environment by running the latest major and then latest minor scripts.
You can create a brand new environment of a previous version (perhaps for testing purposes) by specifying it to stop at a certain minor release.
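The major/minor script scheme above can be sketched as a small runner. The version numbers and actions here are illustrative; real actions would create directories, fix permissions, or run data-migration SQL:

```python
# One entry per minor release: (version, description, action on the environment).
MIGRATIONS = [
    ("1.1", "create upload directory", lambda env: env.update(upload_dir=True)),
    ("1.2", "fix upload permissions", lambda env: env.update(upload_perms="rw")),
    ("1.3", "migrate 'Signup' account statuses", lambda env: env.update(signup_migrated=True)),
]

def migrate(env, target):
    """Incrementally apply every migration newer than env's current version,
    stopping at `target` (so older environments can be rebuilt for testing)."""
    for version, _description, action in MIGRATIONS:
        if env["version"] < version <= target:
            action(env)
            env["version"] = version
    return env
```

A brand-new environment runs the major-release script (which establishes version "1.0") and then `migrate(env, "1.3")`; an environment for a previous version simply passes a lower target, matching the last point above.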