qbo3 Document Review Workflow

I have a series of documents in a document imaging system that need to be manually reviewed.
I can query the relevant documents from my data warehouse via an API call (Process/DataWarehouseDoucmentQuery), which will return data along these lines:
Account   Document     Status   IndexId
12345     Document A   Late     9876
12345     Document B   Late     5432
12345     Document C   Late     1098
23456     Document A   Late     7654
23456     Document D   Late     3210
The goal is to:
Create a Task for each document
Create a Workflow for each document
Group the documents associated with the same account under a common Process

Configuration should include:
Create a Workflow Template, perhaps named Document Review, that applies to Attachment
Add a Task to this workflow, perhaps named Document Review
You can use qbo's ImportFile/BatchApply feature to create a scheduled job along these lines:
Query: Process/DataWarehouseDoucmentQuery
Action:
Attachment/Save?
Attachment={Document}
&SubscriberID={IndexId}
&Status={Status}
&Decisions_DocumentReview_Decision={Document} Review
&Process={Account}
More detail on the parameters you are passing:
Attachment: This is the same as the name of your document; adjust as you see fit.
SubscriberID: This uses qbo's Subscription pattern to ensure you don't create duplicate documents. This assumes that your IndexId is effectively globally unique in your environment.
Status: Since it's part of your data warehouse query, you might as well map it to Attachment.Status.
Decisions_DocumentReview_Decision: Any qbo object can save related table data by specifying {Child}_{Template}_*. Note that when passing Template, you should remove any non-alphanumeric characters from the template name (see the sketch after this list).
Process: The Attachment module is based on qbo's GenericObject, which can be a 'child' of any other object. In this case, the Attachment will be bound to a parent Process, which will be created on the fly if required.
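As an aside, the template-name cleanup mentioned for Decisions_DocumentReview_Decision is just a matter of stripping non-alphanumeric characters. Here is a minimal Python sketch; the helper name is illustrative, not part of qbo:

import re

def child_template_parameter(child, template, field="Decision"):
    # Strip non-alphanumeric characters from the template name,
    # e.g. "Document Review" becomes "DocumentReview".
    clean = re.sub(r"[^A-Za-z0-9]", "", template)
    return f"{child}_{clean}_{field}"

# Prints "Decisions_DocumentReview_Decision", matching the parameter above.
print(child_template_parameter("Decisions", "Document Review"))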
Note that creating a parent Process record as detailed above requires that the application setting GenericParentClasses include Process. This can be verified from the Configuration > Modules > Matrix > Settings panel.
An alternative to adding Process to GenericParentClasses is to specify both an Object to indicate the parent, and {Object}_* to indicate parent field values:
Attachment/Save?
Attachment={Document}
&SubscriberID={IndexId}
&Status={Status}
&Decisions_DocumentReview_Decision={Document} Review
&Object=Process
&Process_Process={Account}
&Process_OpenedReason=Just a Stack Overflow Example
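If you want to prototype the row-to-parameter mapping outside of ImportFile/BatchApply, a rough Python sketch is below. The base URL is a placeholder and the HTTP details are assumptions; in practice the scheduled BatchApply job issues the equivalent Attachment/Save calls for you.

from urllib.parse import urlencode

# Rows in the shape returned by Process/DataWarehouseDoucmentQuery.
rows = [
    {"Account": "12345", "Document": "Document A", "Status": "Late", "IndexId": "9876"},
    {"Account": "23456", "Document": "Document D", "Status": "Late", "IndexId": "3210"},
]

BASE_URL = "https://qbo.example.com"  # placeholder for your qbo site

for row in rows:
    params = {
        "Attachment": row["Document"],
        "SubscriberID": row["IndexId"],
        "Status": row["Status"],
        "Decisions_DocumentReview_Decision": f"{row['Document']} Review",
        "Object": "Process",
        "Process_Process": row["Account"],
        "Process_OpenedReason": "Just a Stack Overflow Example",
    }
    # One Attachment/Save call per document; rows sharing an Account are
    # grouped under the same parent Process.
    print(f"{BASE_URL}/Attachment/Save?{urlencode(params)}")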

Related

How to bring custom fields created under inherited process in Azure DevOps Queries

I created an inherited process based on the Scrum process (e.g. 'X-Scrum') and added new fields like the ones below:
X.Original Estimate (Decimal)
X.Completed work (Decimal)
X.Remaining work (Decimal)
This was done successfully and is reflected in the task screen.
Issue / help required:
I'm not able to bring those three fields into Queries, as I would like to extract all product backlog items, tasks, and the related custom fields.
Can someone help me with how to bring the custom fields into queries?
Some special characters are prohibited in the Definition name of a custom field. If you add a new field containing one of them (e.g. '.', '(', ')'), you will get the error below.
However, you can change the Label of the field in the Layout section, which is what is displayed in the task screen.
In Queries, only the Definition name shows up in the field list.
So I guess you just set the Layout label of the custom field to X.Original Estimate (Decimal), but the Definition name of the custom field is not X.Original Estimate (Decimal). You need to check what the Name of the field is in the Definition section; Queries will reflect this Name.
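If it helps to confirm what the actual field names are (as opposed to the layout labels), one option is the Work Item Tracking Fields - List REST API. A rough Python sketch, with the organization name and personal access token as placeholders:

import base64
import json
import urllib.request

ORG = "your-organization"            # placeholder
PAT = "your-personal-access-token"   # placeholder; needs work item read scope

url = f"https://dev.azure.com/{ORG}/_apis/wit/fields?api-version=5.1"
token = base64.b64encode(f":{PAT}".encode()).decode()
req = urllib.request.Request(url, headers={"Authorization": f"Basic {token}"})

with urllib.request.urlopen(req) as resp:
    fields = json.load(resp)["value"]

# Queries use the field Name / reference name, not the form layout label.
for field in fields:
    print(f'{field["name"]}  ->  {field["referenceName"]}')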

How can I view the Created Date and the Owner of the Field in Azure DevOps

I am working in Azure DevOps and know little about the product. I want to view a field's history: when it was created, by whom it was created, and when it was last modified.
I have gone through a couple of things in the docs, but they didn't help me.
Research so far
It shows me the fields, but no information other than an option to delete a field when I click on it.
Path: Organization Settings > Process > Fields
How can I view the Created Date and the Owner of the Field in Azure DevOps
Indeed, there is no out-of-the-box way to get the Created Date and the Owner of a field.
As a workaround, we could add a custom field in our custom process and set its value to 1 when the target field is defined with a value. For example, I create a MyCustomFiled field and add a rule to set MyCustomFiled to 1 if the target field is defined:
In this case, when the target field is populated by someone, our custom field will be set to 1.
Next, we create a query with the following filters:
Now we can get the Created Date and the Owner of the field.
Update:
What about the fields that are already created?
For this situation, we could query work items whose fields have been modified using the opposite conditions, for example for the Description field:
If this workaround still does not work for you, you could try the REST API Revisions - Get to retrieve all the history for one work item, use scripts to filter the revisions for the specific field, and then loop through all the work items.
GET https://dev.azure.com/{organization}/{project}/_apis/wit/workItems/{id}/revisions/{revisionNumber}?api-version=5.1
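To script the revision walk described above, here is a rough Python sketch using the Revisions list endpoint; the organization, project, work item id, PAT, and target field reference name are placeholders:

import base64
import json
import urllib.request

ORG, PROJECT, WORK_ITEM_ID = "your-org", "your-project", 123   # placeholders
PAT = "your-personal-access-token"                             # placeholder
TARGET_FIELD = "Custom.MyCustomFiled"                          # reference name of the field to check

url = (f"https://dev.azure.com/{ORG}/{PROJECT}/_apis/wit/workItems/"
       f"{WORK_ITEM_ID}/revisions?api-version=5.1")
token = base64.b64encode(f":{PAT}".encode()).decode()
req = urllib.request.Request(url, headers={"Authorization": f"Basic {token}"})

with urllib.request.urlopen(req) as resp:
    revisions = json.load(resp)["value"]

# Find the first revision in which the target field carries a value; its
# ChangedDate/ChangedBy show when and by whom it was first populated.
for rev in revisions:
    fields = rev["fields"]
    if fields.get(TARGET_FIELD) is not None:
        changed_by = fields.get("System.ChangedBy")
        if isinstance(changed_by, dict):   # identity values come back as objects
            changed_by = changed_by.get("displayName")
        print("First set in revision:", rev["rev"])
        print("Changed date:", fields.get("System.ChangedDate"))
        print("Changed by:", changed_by)
        break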
Hope this helps.

Is there a way to assign the primary category for a product in demandware using the Open Commerce API (OCAPI)?

The primary category of a product is present in the product document (primary_category_id) in the DATA API but cannot be written. After sending a PATCH update of the product with a different primary_category_id, it doesn't change.
Is there a way of doing this through the OCAPI?
There may be some limitation with the PATCH method. Fields that can be updated:
name,
page_description,
long_descripton,
page_title,
page_keywords,
brand,
ean,
upc,
manufacture_sku,
manufacture_name,
searchable,
unit,
online_flag,
default_variant_id.
Try with the PUT method: PUT https://hostname:port/dw/data/v19_1/products/{id}. Also, please check the Request Document.
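For reference, a rough Python sketch of the PUT suggested above; the hostname, product id, category id, and OAuth bearer token are placeholders, and whether primary_category_id is actually honoured by the server is exactly what is in question here:

import json
import urllib.request

HOST = "your-instance.demandware.net"   # placeholder
PRODUCT_ID = "your-product-id"          # placeholder
TOKEN = "your-oauth-bearer-token"       # placeholder; Data API calls require OAuth

url = f"https://{HOST}/dw/data/v19_1/products/{PRODUCT_ID}"
body = json.dumps({
    # As noted above, primary_category_id is not in the documented list of
    # writable fields, so the server may simply ignore it.
    "primary_category_id": "your-category-id",
}).encode()

req = urllib.request.Request(
    url,
    data=body,
    method="PUT",
    headers={"Authorization": f"Bearer {TOKEN}",
             "Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(resp.status, resp.read().decode())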
At this time it does not appear that this is possible to manage via OCAPI.
I suspect that in the future you'd be able to achieve it using the following resources:
DELETE /catalogs/{catalog_id}/categories/{category_id}/products/{product_id}
followed by:
PUT /catalogs/{catalog_id}/categories/{category_id}/products/{product_id}
With a ProductCategoryAssignment document in the PUT call.
However, this would require that Salesforce adds those attributes to the ProductCategoryAssignment document.
The reason I suggest this is where it would be added is that, within a catalog import document (XML), the flag is associated with a similar resource representation, e.g.:
<category-assignment category-id="gear-bags-backpacks" product-id="NSF4003100">
<primary-flag>true</primary-flag>
</category-assignment>

Workflow to change order status if a custom field is populated for Sales Orders in NetSuite

Basically, I want the orders to import under Pending Fulfillment only when the selected triggering client field has a value in it (any value). If it is blank, I don't want this workflow to run on that SO. I have another workflow in place that affects other orders imported through Web Services, so this will be just for specific orders where the Triggering Client Field has a value. I set it up, as you can see in the images attached, and no luck!
Parameters: Order Status=Pending Fulfillment (Static value)
Trigger on: After Field Sourcing
Contexts: Web Services
Triggering Client Fields: Custom free form text field that will be populated with a 4-5 digit number
This looks like it will only work within the UI when someone is entering a sales order - and actually it won't work at all, because if that's a free form text field, it won't trigger the "After Field Sourcing" event. I would make the trigger Before Record Submit or After Record Submit instead, and then under the "Condition" section, use the visual builder to set the criteria to run only when Handshake Order ID is not empty. That should do the trick for you.
Please change your workflow action as below:
Workflow Action : Import Status
Trigger On : Entry
Condition : {custbody11} is not null

InfoPath 2010: Mirror data from a Secondary Data Source

Goal: Pull data from a SharePoint 2010 list using a WEB ENABLED form. Then, from the repeating table that contains the secondary data, extract only the desired data and mirror it in my main form fields. The extracted data would then be modified and submitted to another SharePoint list using Nintex Workflows or, if the IT department smiles upon this project, a database.
What I've Tried: Created a field, named "TEST_CyS", in a repeating group, named "TEST", in my main form fields to store the mirrored data. This field has a default value of:
xdXDocument:GetDOM("REMOVED")/dfs:myFields/dfs:dataFields/d:SharePointListItem_RW[(count(../preceding-sibling::*[local-name() = "TEST"]) + 1)]/d:Cy_Statement
This is refreshed when the form updates.
If I set the default value to count(../preceding-sibling::*[local-name() = "TEST"]) + 1 it accurately counts each inserted group.
If I set the default value to xdXDocument:GetDOM("REMOVED")/dfs:myFields/dfs:dataFields/d:SharePointListItem_RW[<INT>]/d:Cy_Statement, where <INT> is any whole value between 1 and n, then the field displays the correct information for the secondary data field whose index is referenced.
It's when I combine the two that things fall apart.
Main Data Tree:
Secondary Data Tree:
Assumptions: I am guessing that the preceding-sibling::*[local-name() = "TEST"] axis is not returning a value because it's being called along with the GetDOM() method. I've tried to point the preceding-sibling back to the correct group in the Main form fields, but then I felt silly for trying that, as it wouldn't know where to start counting, AND InfoPath presented me with an error:
Function 'GetDOM' did not return a value, or it returned a value that cannot be converted to an XSL data type.
Summary: Is this a lost cause without code or purchasing some "plugin" for InfoPath like qRules? The IT department will not budge on allowing forms with code in them to run on the SharePoint site, and the requirements placed on the form state that it must be a web enabled form to be filled out in SharePoint.
Edit: We also do not have access to VSTA, and the possibility of having it installed is very, very slim.
Is there an alternative method I could use to pull this off?
SharePoint admins don't need to be involved if the InfoPath form uses code, so long as it is limited to the SharePoint 2010 sandbox APIs.