Drools - Send input data to business process (JBPM) via REST - jboss

Good day! I'm trying to start the following Drools business process via REST.
This process has a process variable (named contractObject) which is populated at the beginning of the flow and used for validation against the decision table.
I'm sending the following body in the Drools request:
{
  "lookup": "ksession1",
  "commands": [
    {
      "start-process": {
        "processId": "decision-tree-test.businessProcessTest",
        "out-identifier": "firedProcesses",
        "data": [
          {
            "contractObject": {
              "com.myspace.decision_tree_test.Contract": {
                "contractType": "P",
                "service": 2000,
                "serviceType": 3000,
                "promotion": 3470
              }
            }
          }
        ]
      }
    }
  ]
}
But I keep getting the following error:
{
  "type": "FAILURE",
  "msg": "Error calling container decision-tree-test: [decision-tree-test.businessProcessTest:11 - Decision Table:5] -- [decision-tree-test.businessProcessTest:11 - ?:4] -- Exception when trying to evaluate constraint in split null",
  "result": null
}
So I guess I'm not sending the data correctly, since the process doesn't seem to pick it up. The documentation does not specify a particular format for sending data for process variables; the start-process command reference is all I have found.
What could I be doing wrong?
Links to the documentation I've consulted:
https://access.redhat.com/documentation/en-us/red_hat_process_automation_manager/7.1/html-single/interacting_with_red_hat_process_automation_manager_using_kie_apis/index#runtime-commands-con_kie-apis
https://docs.jboss.org/drools/release/5.3.0.Final/droolsjbpm-integration-docs/html/ch04.html#d0e1028
Picture of the Decision Table Data I / O
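One thing worth trying, as a hedged sketch rather than a confirmed fix for this setup: KIE Server also exposes a jBPM-specific process endpoint where the request body is simply a map of process variables, which sidesteps the command format entirely. The host, port, and context path below are assumptions; the container id decision-tree-test is taken from the error message:
POST http://localhost:8080/kie-server/services/rest/server/containers/decision-tree-test/processes/decision-tree-test.businessProcessTest/instances HTTP/1.1
Content-Type: application/json

{
  "contractObject": {
    "com.myspace.decision_tree_test.Contract": {
      "contractType": "P",
      "service": 2000,
      "serviceType": 3000,
      "promotion": 3470
    }
  }
}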

Related

POST request to JIRA REST API to create issue of type Minutes

use Mojo::UserAgent;             # $jira_ua below is assumed to be a Mojo::UserAgent created earlier
use Mojo::JSON qw(decode_json);
my $create_issue_json = '{"fields": { "project": { "key": "ABC" }, "summary": "summary for version 1", "description": "Creating an issue via REST API", "issuetype": { "name": "Minutes" }}}';
my $tx1 = $jira_ua->post($url2 => json => decode_json($create_issue_json));  # $url2 points at the /rest/api/2/issue endpoint
my $res1 = $tx1->res->body;
I am trying to create a Jira issue of type Minutes, but the POST expects some fields that are not available on the Minutes issue type. Below is the response:
{"errorMessages":["Brands: Brands is required.","Detection: Detection is required."],"errors":{"versions":"Affects Version/s is required.","components":"Component/s is required."}}
I also tried to fetch the schema using the createmeta API but didn't find any useful info (see the note after the response below). This is the response from createmeta:
{"maxResults":50,"startAt":0,"total":3,"isLast":true,"values":[
{
"self":"https://some_url.com/rest/api/2/issuetype/1",
"id":"1",
"description":"A problem which impairs or prevents the functions of the product.",
"iconUrl":"https://some_url.com:8443/secure/viewavatar?size=xsmall&avatarId=25683&avatarType=issuetype",
"name":"Bug",
"subtask":false},
{
"self":"https://some_url.com:8443/rest/api/2/issuetype/12",
"id":"12",
"description":"An issue type to document minutes of meetings, telecons and the like",
"iconUrl":"https://some_url.com:8443/secure/viewavatar?size=xsmall&avatarId=28180&avatarType=issuetype",
"name":"Minutes",
"subtask":false
},
{
"self":"https://some_url.com:8443/rest/api/2/issuetype/23",
"id":"23",
"description":"Used to split an existing issue of type \"Bug\"",
"iconUrl":"https://some_url.com:8443/images/icons/cmts_SubBug.gif",
"name":"Sub Bug",
"subtask":true
}
]
}
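A note that may explain the empty-looking schema (an assumption based on the standard Jira REST API, not verified against this instance): createmeta only includes per-field details such as the required flag when you ask for them via the expand parameter, for example:
GET https://some_url.com:8443/rest/api/2/issue/createmeta?projectKeys=ABC&issuetypeNames=Minutes&expand=projects.issuetypes.fields HTTP/1.1
With that expansion, each field in the response carries a required flag, which should surface requirements like the Brands and Detection fields from the error above.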
It turned out the Jira admin had added these as mandatory fields for all the issue types, which I learned after speaking with him. He has now set up individual configurations for the different issue types, and I am able to create Minutes issues.

Not able to get logs related to azure data factory mapping data flows from log analytics

We are working on implementing a custom logging solution. Most of the information we need is already present in Log Analytics via the Data Factory Analytics solution, but getting log info on data flows is a challenge: when querying, we get the error "Too large to parse" in the output.
Since data flows are a complex and critical piece of a pipeline, we urgently need data such as rows copied, skipped, and read for each activity within a data flow. Can you please help with how to get that information?
You can get the same information shown in the ADF portal UI by making a POST request to the REST endpoint below. You can find more information, including how to authenticate, at https://learn.microsoft.com/en-us/rest/api/datafactory/pipelineruns/querybyfactory
You can choose to query by factory or for a specific pipeline run id depending on your needs.
https://management.azure.com/subscriptions/<subscription id>/resourcegroups/<resource group name>/providers/Microsoft.DataFactory/factories/<ADF resource Name>/pipelineruns/<pipeline run id>/queryactivityruns?api-version=2018-06-01
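For example, the request body can use the RunFilterParameters schema described in the linked docs to narrow results to data flow activity runs (the timestamps and the ExecuteDataFlow filter value below are illustrative):
{
  "lastUpdatedAfter": "2020-07-28T00:00:00.000Z",
  "lastUpdatedBefore": "2020-07-29T00:00:00.000Z",
  "filters": [
    {
      "operand": "ActivityType",
      "operator": "Equals",
      "values": [ "ExecuteDataFlow" ]
    }
  ]
}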
Below is an example of the data you can get from one stage:
{
  "stage": 7,
  "partitionTimes": [
    950
  ],
  "lastUpdateTime": "2020-07-28 18:24:55.604",
  "bytesWritten": 0,
  "bytesRead": 544785954,
  "streams": {
    "CleanData": {
      "type": "select",
      "count": 241231,
      "partitionCounts": [
        950
      ],
      "cached": false
    },
    "ProductData": {
      "type": "source",
      "count": 241231,
      "partitionCounts": [
        950
      ],
      "cached": false
    }
  },
  "target": "MergeWithDeltaLakeTable",
  "time": 67589,
  "progressState": "Completed"
}

Camunda correlating a message to running process instance starts new process instance

I have the following problem:
There is a message start event, let's say the message name is MessageX, followed by a task.
If I send a POST request to engine-rest/message with the body
{
  "messageName": "MessageX",
  "processInstanceId": "null",
  "resultEnabled": "true"
}
I get the answer:
{
  "resultType": "ProcessDefinition",
  "execution": null,
  "processInstance": {
    ...
    "id": "1234-567-8910",
    ...
  }
}
So there is now a process with id 1234-567-8910 started and waiting at the task. Fine.
I now want to correlate the same message to the process instance with id 1234-567-8910, like so:
{
  "messageName": "MessageX",
  "processInstanceId": "1234-567-8910",
  "resultEnabled": "true"
}
The BPMN looks like this:
I expect it to respond with something like "process 1234-567-8910 is not waiting for this message", but instead it starts a new process instance...
Is there a way to correlate the message only when the process is actually at a point where it is waiting for it?
Messages can only be correlated when the execution token is waiting at the message event. In your case, the execution token has already been handed over to the task, so correlation fails and a new instance is spawned.
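If you want to check up front whether an instance is currently waiting for a message, one option (a sketch against the Camunda 7 REST API; host and port are assumptions) is to query its message event subscriptions first and only correlate when the result is non-empty:
GET localhost:8080/engine-rest/event-subscription?processInstanceId=1234-567-8910&eventType=message HTTP/1.1
An empty array means the instance has no open message subscription, i.e. it is not currently waiting for any message.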
But why do you want to send the same message twice? Starting an instance with a message has already delivered it. You could separate process start and message delivery by using an explicit start request (this requires separating the start event and the message event in your process definition):
POST localhost:8080/engine-rest/process-definition/key/<your-process-id>/start HTTP/1.1
{
  "variables": {
    "someVar": {
      "value": "hello",
      "type": "string"
    }
  },
  "businessKey": "1234"
}
and afterwards:
POST localhost:8080/engine-rest/message HTTP/1.1
{
  "messageName": "MessageX",
  "businessKey": "1234",
  "processVariables": {
    "someNewVar": {
      "value": 5,
      "type": "Integer"
    }
  }
}
Of course you could also use the process instance id to correlate, but do you really want to keep track of all those ids?
KR, Joachim
You can do this in KIE (jBPM). There the start and other nodes are signals, and there is an API for those. It also accepts process variables as parameters.
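For illustration, the KIE Server signal endpoint looks roughly like this (a hedged sketch; the host, context path, and payload are assumptions, and the body is the event payload delivered with the signal):
POST http://localhost:8080/kie-server/services/rest/server/containers/<container-id>/processes/instances/<process-instance-id>/signal/<signal-name> HTTP/1.1
Content-Type: application/json

{ "someVar": "hello" }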

SSAS Tabular Add Column via TMSL

Good morning,
Objective: I am trying to add new columns to an SSAS Tabular Model table, with the long-term aim of programmatically making large batch changes when needed.
Resources I've found:
https://learn.microsoft.com/en-us/sql/analysis-services/tabular-models-scripting-language-commands/create-command-tmsl
This gives the template I've been following, but it does not seem to work.
What I have tried so far:
{
  "create": {
    "parentObject": {
      "database": "TabularModel_1_dev",
      "table": "TableABC"
    },
    "columns": [
      {
        "name": "New Column",
        "dataType": "string",
        "sourceColumn": "Column from SQL Source"
      }
    ]
  }
}
This first attempt is the most faithful to the example but returns the following error:
"The JSON DDL request failed with the following error: Unrecognized JSON property: columns. Check path 'create.columns', line 7, position 15.."
Attempt Two:
{
  "create": {
    "parentObject": {
      "database": "TabularModel_1_dev",
      "table": "TableABC"
    },
    "table": {
      "name": "Item Details by Branch",
      "columns": [
        {
          "name": "New Column",
          "dataType": "string",
          "sourceColumn": "New Column"
        }
      ]
    }
  }
}
Adding the table within the child list returns an error too:
"...Cannot execute the Create command: the specified parent object cannot have a child object of type Table.."
Omitting the table within the parentObject is unsuccessful as well.
I know it's been three years since the post, but I was attempting the same thing and stumbled across this post in my quest. I ended up reaching out to Microsoft and was told that the Add Column example in their documentation was a "doc bug". In fact, you can't add just a column; you have to feed it an entire table definition via createOrReplace.
SSAS Error Creating Column with TMSL
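For illustration, a minimal createOrReplace sketch built from the names in the question (the existing column and the partition are placeholder assumptions; the real script must repeat the table's full definition, including every existing column and partition, or they will be dropped):
{
  "createOrReplace": {
    "object": {
      "database": "TabularModel_1_dev",
      "table": "TableABC"
    },
    "table": {
      "name": "TableABC",
      "columns": [
        {
          "name": "Existing Column",
          "dataType": "string",
          "sourceColumn": "Existing Column"
        },
        {
          "name": "New Column",
          "dataType": "string",
          "sourceColumn": "Column from SQL Source"
        }
      ],
      "partitions": [
        {
          "name": "Partition",
          "source": {
            "query": "SELECT * FROM [dbo].[TableABC]",
            "dataSource": "SQL Source"
          }
        }
      ]
    }
  }
}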

Talend read JSON data from tRESTRequest

I am trying to learn Talend.
Scenario:
I have to create a REST endpoint (I am using tRESTRequest) that takes a POST request at http://localhost:8086/emp/create, accepts the JSON below, prints each JSON field, and sends a sample JSON response containing only the name field.
How can I do so?
And how do I read the JSON data into a Java component like tJava?
Structure:
{
  "emp": [
    {
      "id": "123",
      "name": "testemp1"
    },
    {
      "id": "456",
      "name": "testemp2"
    }
  ]
}
Expected Response:
{
  "emp": [
    {
      "name": "testemp1"
    },
    {
      "name": "testemp2"
    }
  ]
}
I am using tRESTRequest -> tExtractJSONFields -> tRESTResponse.
For looping over the right elements and parsing the contents, please see my answer to JSON Deserialization on Talend.
I did not quite understand the second question. When deserializing JSON, the data will already be available in the usual row format for further processing; beginner tutorials will show you the standard structure. The component tJava is, of course, an exception to that rule: handling data in this component is different and not necessarily row based.
Talend has an excellent knowledge base for components and examples, see https://help.talend.com/