I'm trying to create a Siebel workflow for a new business requirement. I've checked Siebel Bookshelf and I'm a bit confused by the flow. In particular, I'm having difficulty understanding the relationship between Workflow Policy Object, Workflow Policy Column, Workflow Policy Component, and Workflow Policy Component Col, and how the database triggers get created.
Can someone help me to understand these concepts better?
Workflow policy objects and components are very much like the business logic layer objects and components that support the Siebel GUI. A Workflow policy object might be Account, and its components could be Account, Account Contact, Account Address, etc.
Workflow policies themselves are still (I think) created in the Siebel GUI on the server. If your policy conditions are based on a custom field, you may have to add that field to the relevant workflow policy component using Siebel Tools.
The policy columns define the set of database columns that the triggers are generated for and that the policy conditions reference. The Workflow Monitor Agent components wake up when a trigger fires, check the policy conditions, and then run whatever processes are attached to the policy.
Once the policy is defined, the Generate Triggers command needs to be run against the database to include the new trigger details your policy has created. This can be done using Siebel Server Manager or by running the generated SQL directly against the database. Check the Siebel Business Process Framework: Workflow Guide for more detail on these steps and on how to validate your work.
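For reference, here is a minimal sketch of kicking that off from the command line via Siebel Server Manager. The gateway, enterprise, server, and credential values are placeholders, and the exact GenTrig parameters vary by version, so verify them against the Workflow Guide first.

    # Connect to the Siebel enterprise with Server Manager (all values are placeholders)
    srvrmgr /g MyGateway /e MyEnterprise /s MySiebelServer /u SADMIN /p password

    # At the srvrmgr prompt, run the Generate Triggers (GenTrig) component.
    # EXEC="TRUE" applies the triggers directly instead of just writing a SQL script.
    start task for comp GenTrig with EXEC="TRUE", PrivUser="SIEBEL", PrivUserPass="password"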
In our company we have automated certain tasks for our customers:
Reporting (counting Azure AD accounts, systems, mailboxes);
Creating users (setting all permissions);
Creating mailboxes;
Managing tickets in the ITSM tool;
Deleting users.
We used a lot of PowerShell scripts and Azure DevOps to automate these tasks. With the deprecation of basic authentication, we had to change our scripts and the way we authenticate to all of our customers' Exchange Online tenants. This made us think: is there a better way to set this up so that we don't keep running into these problems? We are already working with config files, modules, and classes in our scripts.
What would be the best way to automate these tasks without having to rework them every time Microsoft changes the authentication method?
Another question: what would be a way to automate these tasks in low code?
The situation is that we connect to the Azure platform of each of our customers.
If you already work with modules and classes, you should probably write a single "authentication" module/class that is then used in all of the subsequent scripts. If and when Microsoft changes the authentication method again, you only need to change this one module, which every other script calls.
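As a minimal sketch, assuming certificate-based app-only authentication with the ExchangeOnlineManagement module and a hypothetical per-customer JSON config file, such a module could look like this:

    # CustomerAuth.psm1 -- the single place where authentication logic lives.
    # If Microsoft changes the supported auth method again, only this file changes.
    function Connect-CustomerExchange {
        param(
            [Parameter(Mandatory)][string]$CustomerName
        )

        # Hypothetical per-customer config file holding tenant, app id and cert thumbprint
        $config = Get-Content ".\customers\$CustomerName.json" -Raw | ConvertFrom-Json

        # Certificate-based app-only auth, which is unaffected by the basic auth deprecation
        Connect-ExchangeOnline -AppId $config.AppId `
                               -CertificateThumbprint $config.CertThumbprint `
                               -Organization $config.TenantDomain `
                               -ShowBanner:$false
    }

    Export-ModuleMember -Function Connect-CustomerExchange

Every task script then just imports this module and makes a single Connect-CustomerExchange "Contoso" call before doing its work.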
In terms of "low code", it depends on what these tasks are doing, but you can use the Power Platform Office 365 connector and from there query Azure AD. There is also the Graph API (a small scripted sketch follows below).
It really depends on your use case but the following link may be of some assistance with the "low code" question:
https://powerusers.microsoft.com/t5/Building-Power-Apps/Query-Active-Directory/td-p/724376
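And if you stay script-based rather than low code, the Microsoft Graph PowerShell SDK is the usual way to query Azure AD now. A small sketch, again using app-only certificate auth, with placeholder app registration values:

    # Query Azure AD via the Microsoft.Graph module using app-only authentication
    Connect-MgGraph -ClientId "00000000-0000-0000-0000-000000000000" `
                    -TenantId "contoso.onmicrosoft.com" `
                    -CertificateThumbprint "THUMBPRINT"

    # For example, the account counting used in the reporting task
    (Get-MgUser -All).Count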
I've been trying to implement a way to download all the changes made by a particular user in Salesforce using a PowerShell script and create a package. The changes could be anything, whether added or modified: Apex classes, profiles, Account, etc., filtered by the modifying user, component ID, timestamp, and so on. Below is the URL that documents the API, but it does not explain any way to do this from a script.
https://developer.salesforce.com/docs/atlas.en-us.api_meta.meta/api_meta/meta_listmetadata.htm
Does anyone know how I can implement this?
Regards,
Kramer
Salesforce orgs other than scratch orgs do not currently provide source tracking, the feature that makes it possible to pinpoint user changes in metadata and extract only those changes. That extraction is done by an SFDX/Metadata API client, like Salesforce DX or CumulusCI (disclaimer: I'm on the CumulusCI team).
I would not try to implement a Metadata API client in PowerShell; instead, harness one of the existing tools to do so.
Salesforce orgs other than scratch orgs don't provide source tracking at present. To identify user changes, you can either
Attempt to extract all metadata and diff it against your version control, which is considerably harder than it sounds and is implemented by a variety of commercial DevOps tools for Salesforce (Gearset, Copado, etc.).
Have the user manually add components to a Change Set or unmanaged package, and use a Metadata API client as above to retrieve the contents of that package. (Little-known fact: a Change Set can be retrieved as a package! A sketch of doing so from PowerShell follows at the end of this answer.)
To emphasize: DevOps on Salesforce does not work like it does on other platforms. Working with the Metadata API requires a fair amount of time investment and specialization. Harness the existing work of the Salesforce community where you can, but be aware that the task you are laying out may be rather more involved than you think, and it's not necessarily something you can just throw together from off-the-shelf components.
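To illustrate the second option from PowerShell: the sketch below drives the Salesforce CLI to pull a change set down as a package. The org alias and change set name are placeholders, and the flags are those of the force:mdapi:retrieve command at the time of writing, so double-check them against your CLI version.

    # Retrieve the contents of a change set ("My User Changes") as a package zip
    sfdx force:mdapi:retrieve `
        --targetusername myOrgAlias `
        --packagenames "My User Changes" `
        --retrievetargetdir ./retrieved

    # The retrieve writes ./retrieved/unpackaged.zip; unpack it to inspect the metadata
    Expand-Archive -Path ./retrieved/unpackaged.zip -DestinationPath ./retrieved/src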
I am using DNN version 7.4.2 for my project, and now I want to implement workflow management for the HTML content module.
I am able to apply a workflow, but I am not able to create a new workflow, and I cannot even see the "Manage Workflow" button.
Please advise which version of DNN I should use, and whether it is paid or free.
The workflow API and tables are part of the DNN Platform, but the user interface to create new workflows is a professional feature. The Workflow Management module is part of all of the Evoq products (Engage, Content, and Content Basic) and can be found on the Admin > Workflow Management page in any of the Evoq versions.
Request a 30-day trial of Evoq Content, review which professional features you need besides workflow management, and discuss licensing costs with DNN Corp.
If you don't need any of the professional features and want to stick with the free DNN Platform, you can still build a custom workflow, but you would need to create it directly in the database by inserting records into the Workflow, WorkflowStates, and WorkflowStatePermissions tables.
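Very roughly, and with the caveat that the column names below are illustrative guesses (inspect the actual table definitions in your database before running anything), that could look like:

    # Rough sketch: insert a two-state workflow straight into the DNN database.
    # Requires the SqlServer module (Invoke-Sqlcmd); server/database names are placeholders.
    Invoke-Sqlcmd -ServerInstance ".\SQLEXPRESS" -Database "DNN" -Query @"
    -- Column names are illustrative; verify them against the real schema first.
    INSERT INTO Workflow (PortalID, WorkflowName, Description)
    VALUES (0, 'Review then Publish', 'Custom two-state workflow');

    DECLARE @wf INT = SCOPE_IDENTITY();

    INSERT INTO WorkflowStates (WorkflowID, StateName, [Order], Notify, Active)
    VALUES (@wf, 'Ready For Review', 1, 1, 1),
           (@wf, 'Published', 2, 0, 1);
    -- WorkflowStatePermissions rows would follow the same pattern.
    "@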
I need to create an on-demand workflow that will auto-populate custom entities in CRM that are not related to each other.
Opportunity/Opportunity Services fields need to populate the CurrentContract/CurrentContract Services entities.
The fields are related as Opportunity Services > Opportunities > Account < CurrentContract < CurrentContract Services.
All fields from Opportunities need to create a CurrentContract and CurrentContract Services record with identical information.
I agree with the other comments that a Custom Workflow Activity would be needed.
Another option would be to perform the action using JScript. There are pros and cons to moving the logic client-side, but some of the CRM REST libraries make it pretty easy to perform CRUD operations. Since you said "on-demand workflow", you may want to consider client-side for user-experience reasons.
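For a flavor of what those REST calls look like under the hood (the JScript helper libraries wrap the same OData endpoint), here is a hedged PowerShell sketch against a hypothetical on-premise org URL:

    # Create a record through the CRM 2011 OData endpoint (hypothetical org URL).
    # Client-side JScript libraries issue the same kind of POST from the form.
    $body = @{ Name = "Test Account" } | ConvertTo-Json

    Invoke-RestMethod -Method Post `
        -Uri "https://crm.contoso.com/MyOrg/XRMServices/2011/OrganizationData.svc/AccountSet" `
        -Body $body -ContentType "application/json" -UseDefaultCredentials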
I want to track my custom activities' property values in my own tracking service, i.e., I don't need them in the built-in SQL tracking service. I have been successful in reproducing the SQL tracking service, in that I can see the workflow and activity states etc., but I want to see property values as well.
We are writing many workflows for a document management system (DMS) using its own workflow engine, which is based on MS WF. I therefore cannot change the workflow runtime (if that were even needed). The solution has to work with the embedded functionality of the underlying Microsoft workflow runtime.
Our workflows typically do database lookups through custom activities we write. These lookup values are then passed on to other activities for program flow or for persistence into the DMS. It would be great if we could see what these lookup values are at runtime, and indeed the values of the DMS's own activities' properties.
From my (admittedly limited) knowledge of MS Workflow, the correct approach is to publish updates to your tracking service from within the workflow; i.e., if your workflow does some step, it should go to the tracking service and say "I did X". Your tracking service can record this information to answer any subsequent queries about what the workflow did (and what the various property values were at the time).
The essential point is that the WF engine is useful for running workflows, but it is not very good at reporting on the progress of those workflows.
If you do have some control over the WF engine (you say it's "based on MS WF"), then one option may be to make your WF engine publish such updates for all workflows. That may allow you to forgo explicit updates within your actual workflow definitions. However, if you really can't make any changes to the engine, then this won't work.