Workflow to support multiple customer scenarios

I am building a base workflow that will support around 25 customers. All customers match one basic workflow, but each has slightly different requirements; for example, one customer wants to send an email and another does not.
What I am thinking of doing:
1- Make one workflow and, at each point where requirements differ, add a switch that checks which customer is running and branches to that customer's requirements.
(Advantages) Powerful in terms of maintenance; any common requirement is easy to add.
(Disadvantages) The number of customers may grow to around 100, each slightly different, so we would have 100 users running one workflow that is full of per-customer branches.
2- Make a different workflow for each customer, which means I would eventually have 100 workflows and, at instantiation, create the object from the specific workflow related to the current user.
(Advantages) Each workflow is separate.
(Disadvantages) Hard to add even a simple common feature, since it means writing the same thing 100 times, which is not professional.
So, what do I need? I want to know whether these are the only approaches available in this situation, or whether I am missing another technique.

One way would be to break your workflow into smaller parts, each of which does one specific thing. You could organize a layout like the following to support multiple variations of the inbound request.
Customer1-Activity.xaml
- Common-Activity1.xaml
- Common-Activity2.xaml
Customer2-Activity.xaml
- Common-Activity1.xaml
- Common-Activity2.xaml
For any new customer, you only need to create a new root XAML activity containing the slight changes required for that customer's incoming request parameters.
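As a rough sketch, the same composition could also be expressed in code; the class names below are hypothetical stand-ins for the XAML files above.

    using System.Activities;
    using System.Activities.Statements;

    // Hypothetical code equivalents of the Common-Activity*.xaml files.
    public sealed class CommonActivity1 : CodeActivity
    {
        protected override void Execute(CodeActivityContext context) { /* shared step */ }
    }

    public sealed class SendEmailActivity : CodeActivity
    {
        protected override void Execute(CodeActivityContext context) { /* customer-specific step */ }
    }

    public static class Customer1Workflow
    {
        // The per-customer root only decides which steps to compose;
        // the steps themselves stay shared.
        public static Activity Build() => new Sequence
        {
            Activities =
            {
                new CommonActivity1(),
                new SendEmailActivity() // omitted for customers who don't want email
            }
        };
    }

The per-customer piece stays tiny because it does nothing but compose shared activities.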
Option #2: Pass in a dictionary to your activity
Thought of a better idea: have your workflow take a Dictionary<string, object> as an input argument. The dictionary can contain the parameter/argument set that was given to your workflow, and the workflow can then query the parameter set to initialize itself.
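A minimal sketch of that, with made-up argument and key names:

    using System.Collections.Generic;
    using System.Activities;

    // Hypothetical activity that receives its per-customer settings as a dictionary.
    public sealed class CustomerWorkflow : NativeActivity
    {
        public InArgument<Dictionary<string, object>> Parameters { get; set; }

        protected override void Execute(NativeActivityContext context)
        {
            var parameters = Parameters.Get(context);

            // Query the parameter set to decide which optional steps run,
            // e.g. only send an email if this customer opted in.
            if (parameters.TryGetValue("SendEmail", out var send) && send is true)
            {
                // ... schedule the email step
            }
        }
    }

A caller would then pass the settings in, for example with WorkflowInvoker.Invoke(new CustomerWorkflow(), new Dictionary<string, object> { ["Parameters"] = settings }).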

Related

Best practices for Informatica Webservice workflow

I have created an Informatica webservice workflow which takes one parameter as input. A Web Service Provider source definition is used for this, and the mapping is one-way.
The workflow works fine when the parameter is passed. But when the same workflow is triggered from Informatica PowerCenter directly (in which case no parameters are passed), the mapping that contains the Web Service Provider source definition takes 3 minutes to complete (the log shows a timeout-based commit point).
Is it good practice to run the webservice workflow from PowerCenter directly? And is there a way to improve its performance when it is triggered from PowerCenter directly?
Note: I am trying to use one workflow for both cases - 1) passing the parameter from the web, and 2) scheduling the workflow in Informatica.
Answers to your questions below.
Is it good practice to run the webservice workflow from PowerCenter directly?
Of course it depends on the requirement - whether or not you need to extract the data from the WS automatically. If you pass the parameter through a session, I don't see much of an issue here, as long as your session completes in time.
So you can create a new session/command task/shell script to create a parameter file and then use it in the original session, so the value is passed on to the WS.
In a complex scenario you may have to pass multiple values; in that case I would recommend using a parent workflow that calls the original workflow multiple times, changing the parameter before each call.
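For illustration, the generated parameter file could look something like this (the folder, workflow, session, and parameter names here are made up), with the script pointing the run at it, e.g. via pmcmd's -paramfile option:

    [MyFolder.WF:wf_webservice_load.ST:s_m_webservice_load]
    $$InputParam=12345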
Is there a way to improve its performance when triggered from PowerCenter directly?
It really depends on a few factors.
The web service - make sure you are using the correct input and output columns. Web services are often sensitive to how they are called from outside, and you need to choose optimized columns to extract data for better performance. You can work with the WS admin to identify the correct columns.
If the Informatica flow is complex, then depending on the bottleneck transformation(s) (source, target, expression, lookup, aggregator, sorter), you can investigate and take action:
For a lookup, you can add a new filter to exclude unwanted data, remove unwanted columns, etc.
For an aggregator, you can place a sorter before it to improve performance.
... and so on.

Does OroCRM fit our process?

Hey guys, we need to implement the following workflow.
Our process looks as follows:
We have a customer base with several attributes for each customer, like city, type of product, segment, and so on. It should be possible for the manager to choose the right customers according to these attributes (e.g. all customers from city X with product type Y) and assign these customers to a marketing process.
The marketing process would look like this:
The user gets a notification: call customer X.
The user is asked how the call went and can choose from several categories (no interest, slight interest, wrong number, ...).
The logic for what happens afterwards is in the program.
THE WORKFLOW
In short:
Define the customer range -> put the customers into a process -> the user responds in a predefined way to the task performed -> the process goes on.
Thank you for your response!
PS: There is no need for VoIP integration.
The short answer is yes, OroCRM is flexible enough to handle completely custom workflows. Simple workflows like the one you described can even be configured from the user interface, without custom development.
We have a customer base with several attributes for each customer like the city, type of product, segment, and so on.
Depending on your needs, you can use one of the existing OroCRM entities as the base customer (e.g. the Lead or Account entity) and add all the needed extra fields using entity management. Or you can create a completely new custom entity with all the fields and relations and use it as the base customer.
It should be possible for the manager to choose the correct customers according to the attributes (e.g. all customers from city x and type y of product)
You can use OroCRM data grids to filter base customers by attributes, and even create a custom grid view with predefined filters to reuse later. The flexible reporting system may also be helpful here.
and assign these customers to the marketing process.
In OroCRM these are called workflows. There are a few predefined workflows, and you can create a completely custom one from the user interface.

DDD: How to solve this using Domain-Driven Design?

I'm new to DDD and cutting my teeth on the following exercise. The use case is real, but my attempt to solve it with DDD is purely for learning.
We have multiple Git repos, each containing a file that we call the product spec. The system needs to respond to an HTTP POST by cloning all the repos and then updating the product spec in those that match some information in the POST body. The system also needs to log the POST request as the cause of the product spec update.
I'd like to use aggregates and event sourcing to solve this problem because they seem like a good fit. Event sourcing comes with automatic persistence of the commands, so if I convert the POST body to a command, I get auditing for free.
The problem is that the POST may match multiple product specs, and I'm not sure how to deal with that. Should I create a domain service, let it find all the matching product specs, and then issue an update command to each? Or should I have the aggregate root do so? If an aggregate root is used to update multiple entities, it must itself be an entity, so what would it be in my problem domain?
The first comment on your question (the one by @VoiceOfUnreason) is right: this 'is mostly side effect coordination'.
But I will try to answer your question of how to solve this using DDD / event sourcing.
The first aggregate root could simply be named 'MultipleRepoOperations'. This aggregate root has only one stream of events.
The command that fires the whole process could be 'CloneAndUpdateProdSpecRepos', which carries the list of all the repos to be cloned and updated.
When the aggregate root processes the command, it simply emits a bunch of events of type 'UserRequestedToCloneAndUpdateProdSpec'.
The second bounded context manages all the repos; it is subscribed to the events from 'MultipleRepoOperations' and will receive each event emitted by it. This bounded context's aggregate root could be called 'GitRepoManagement', and it has a stream per repo, e.g. GitRepoManagement-Repo1, GitRepoManagement-Repo215, GitRepoManagement-20158, etc.
'GitRepoManagement' receives each 'UserRequestedToCloneAndUpdateProdSpec' event, replays the corresponding repo stream to rehydrate the current state, and then tries to clone and update the product spec for that repo. When it fails it emits a failure event, or a success event if appropriate.
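A minimal sketch of those command and event shapes in C# (the type and field names are assumptions, not a definitive model):

    using System.Collections.Generic;

    // Command that fires the whole process.
    public sealed record CloneAndUpdateProdSpecRepos(
        IReadOnlyList<string> RepoUrls,
        string PostBody); // kept so the POST is logged as the cause of the update

    // One event per repo, emitted by the MultipleRepoOperations aggregate.
    public sealed record UserRequestedToCloneAndUpdateProdSpec(
        string RepoUrl,
        string PostBody);

    public sealed class MultipleRepoOperations
    {
        // Handling the command just fans out one event per repo;
        // persisting these events provides the requested audit trail.
        public IEnumerable<UserRequestedToCloneAndUpdateProdSpec> Handle(
            CloneAndUpdateProdSpecRepos command)
        {
            foreach (var repo in command.RepoUrls)
                yield return new UserRequestedToCloneAndUpdateProdSpec(repo, command.PostBody);
        }
    }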
For learning purposes, try to choose a problem domain that has more complex rules and logic, where many actions are needed: for example a small game (a card game, a multiplayer quiz game, or whatever), or a simulation of some real-world process like school management or another business process.

How to make an InArgument's value dependent upon the value of another InArgument at design time

I have a requirement to allow a user to specify the value of an InArgument / property from a list of valid values (e.g. a combobox). The list of valid values is determined by the value of another InArgument (whose value will be set by an expression).
For instance, at design time:
User enters a file path into workflow variable FilePath
The DependedUpon InArgument is set to the value of FilePath
The file is queried, and a list of valid values is displayed so the user can select the appropriate one (presumably via a custom PropertyValueEditor).
Is this possible?
Considering this is being done at design time, I'd strongly suggest you put all of this logic in the designer rather than in the Activity itself.
Design-time logic shouldn't be contained within your Activity. Your Activity should be able to run independently of any designer. Think about it this way...
You sit down and design your workflow using Activities and their designers. Once done, you install/xcopy the workflows to a server somewhere else. When the server loads that Activity prior to executing it, what happens when your design logic executes in CacheMetadata? Either it is skipped, using some heuristic to determine that you are not running at design time, or you include extra logic to skip this code when it is unable to locate that file. Either way, why is a server executing this design-time code? The answer is that it shouldn't be; that code belongs with the designers.
This is why, if you look at the framework, you'll see that Activities and their designers live in separate assemblies. Your code should follow the same pattern: deliver design-centric code in assemblies separate from your Activities, so that you can ship both to designers but only the Activity assemblies to your application servers.
When do you want to validate this, at design time or run time?
Design time is limited, because the user can use an expression that depends on another variable, and you can't read that variable's value at design time. You can, however, look at the expression and possibly deduce an invalid combination that way. In this case you need to add code to the CacheMetadata function.
At run time you can get the actual values and validate them in the Execute function.
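A rough sketch of both hooks, assuming a custom activity with the two arguments from the question (all names here are illustrative):

    using System.Activities;
    using System.Activities.Expressions;

    public sealed class SelectFromFileActivity : CodeActivity
    {
        public InArgument<string> FilePath { get; set; }
        public InArgument<string> SelectedValue { get; set; }

        // Design/validation time: only literal expressions can be inspected
        // here; anything bound to a variable is unknown until run time.
        protected override void CacheMetadata(CodeActivityMetadata metadata)
        {
            base.CacheMetadata(metadata);

            if (FilePath?.Expression is Literal<string> literal &&
                string.IsNullOrWhiteSpace(literal.Value))
            {
                metadata.AddValidationError("FilePath must not be empty.");
            }
        }

        // Run time: the actual argument values are available here.
        protected override void Execute(CodeActivityContext context)
        {
            var path = FilePath.Get(context);
            var value = SelectedValue.Get(context);
            // ... verify that 'value' is one of the valid values found in 'path'
        }
    }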

How do I listen for, load and run user-defined workflows at runtime that have been persisted using SqlWorkflowInstanceStore?

The result of SqlWorkflowInstanceStore.WaitForEvents does not tell me what type of workflow is runnable. The constructor of WorkflowApplication takes a workflow definition, and at a minimum, I need to be able to store a workflow ID in the store and query it, so that I can determine which workflow definition to load for the WorkflowApplication.
I also don't want to create a SqlWorkflowInstanceStore for each custom workflow type, since there may be thousands of different workflows.
I thought about trying to use WorkflowServiceHost, but not every workflow has a Receive activity and I don't think it is feasible to have thousands of WorkflowServiceHosts running, each supporting a different workflow type.
Ideally, I just want to query the database for a runnable workflow, determine its workflow definition ID, load the appropriate XAML from a workflow definition table, instantiate WorkflowApplication with the workflow definition, and call LoadRunnableInstance().
I would like to have a way to correlate which workflow relates to a given HasRunnableWorkflowEvent raised by the SqlWorkflowInstanceStore (along with the custom workflow definition ID), or an alternate way of supporting potentially thousands of different custom workflow types created at runtime. I must also load-balance the execution of workflows across multiple application servers.
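To make that concrete, here is a sketch of the loop I have in mind; the two helpers at the bottom stand for exactly the correlation and definition storage that I'm missing (table and method names are made up):

    using System;
    using System.Activities;
    using System.Activities.DurableInstancing;
    using System.Activities.XamlIntegration;
    using System.IO;

    var store = new SqlWorkflowInstanceStore("<connection string>");
    var handle = store.CreateInstanceHandle();
    // (owner registration via CreateWorkflowOwnerCommand omitted for brevity)

    while (true)
    {
        var events = store.WaitForEvents(handle, TimeSpan.MaxValue);
        foreach (var instanceEvent in events)
        {
            if (instanceEvent is HasRunnableWorkflowEvent)
            {
                // 1. Somehow determine which definition the runnable instance
                //    uses - this is the missing correlation.
                Guid definitionId = LookUpDefinitionIdForNextRunnableInstance();

                // 2. Load the XAML stored for that definition.
                string xaml = LoadXamlFromDefinitionTable(definitionId);
                Activity definition = ActivityXamlServices.Load(new StringReader(xaml));

                // 3. Resume the instance against the right definition.
                var app = new WorkflowApplication(definition) { InstanceStore = store };
                app.LoadRunnableInstance();
                app.Run();
            }
        }
    }

    static Guid LookUpDefinitionIdForNextRunnableInstance() =>
        throw new NotImplementedException("the missing correlation");

    static string LoadXamlFromDefinitionTable(Guid definitionId) =>
        throw new NotImplementedException("e.g. SELECT Xaml FROM WorkflowDefinition WHERE Id = @definitionId");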
There's a free product from Microsoft that does pretty much everything you say there, and then some. Oh, and it's excellent too.
Windows Server AppFabric. No, not Azure.
http://www.microsoft.com/windowsserver2008/en/us/app-main.aspx
-Oisin