I'm trying to update a cell with an existing formula in a Smartsheet sheet. I've read in a similar topic that this is now possible, but I can't work out how to get that done through my JSON parse.
Currently, the following string is converted to JSON (using Alteryx) and sent with HTTP Action = PUT:
{"id": "111111111111111", "cells": [{"columnId": 11111111111111111, "value": "=[Actual.]6"}]}
Using this I'm successfully updating the sheet cell to "=[Actual.]6", BUT it comes in as text, with a leading single quotation mark that disables the formula, e.g. '=[Actual.]6. I've tried to pass the value without quotation marks, but that didn't work at all...
I hope this makes sense.
To set a formula, you should use the formula property:
{"id": "111111111111111", "cells": [{"columnId": 11111111111111111, "formula": "=[Actual.]6"}]}
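For reference, a minimal sketch of what the full update request could look like, assuming the Smartsheet 2.0 Update Rows endpoint and the same placeholder IDs (only formula is set on the cell here, since the goal is a live formula rather than text):

PUT https://api.smartsheet.com/2.0/sheets/{sheetId}/rows
Content-Type: application/json

[
  {
    "id": 111111111111111,
    "cells": [
      { "columnId": 11111111111111111, "formula": "=[Actual.]6" }
    ]
  }
]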
I'm using Azure Data Factory to retrieve data and copy it into a database... the source looks like this:
{
"GroupIds": [
"4ee1a-0856-4618-4c3c77302b",
"21259-0ce1-4a30-2a499965d9",
"b2209-4dda-4e2f-029384e4ad",
"63ac6-fcbc-8f7e-36fdc5e4f9",
"821c9-aa73-4a94-3fc0bd2338"
],
"Id": "w5a19-a493-bfd4-0a0c8djc05",
"Name": "Test Item",
"Description": "test item description",
"Notes": null,
"ExternalId": null,
"ExpiryDate": null,
"ActiveStatus": 0,
"TagIds": [
"784083-4c77-b8fb-0135046c",
"86de96-44c1-a497-0a308607",
"7565aa-437f-af36-8f9306c9",
"d5d841-1762-8c14-d8420da2",
"bac054-2b6e-a19b-ef5b0b0c"
],
"ResourceIds": []
}
In my ADF pipeline, I am trying to get the count of GroupIds and store that in a database column (along with the associated Id from the JSON above).
Is there some kind of syntax I can use to tell ADF that I just want the count of GroupIds or is this going to require some kind of recursive loop activity?
You can use the length function in Azure Data Factory (ADF) to get the length of JSON arrays:
length(json(variables('varSource')).GroupIds)
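If you first need to hold that count in a pipeline variable, a minimal sketch of a Set Variable expression (varSource is the variable holding the JSON above; the result is cast to string because pipeline variables have no integer type):

@string(length(json(variables('varSource')).GroupIds))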
If you are loading the data into a SQL database, then you could use OPENJSON. A simple example:
DECLARE @json NVARCHAR(MAX) = '{
"GroupIds": [
"4ee1a-0856-4618-4c3c77302b",
"21259-0ce1-4a30-2a499965d9",
"b2209-4dda-4e2f-029384e4ad",
"63ac6-fcbc-8f7e-36fdc5e4f9",
"821c9-aa73-4a94-3fc0bd2338"
],
"Id": "w5a19-a493-bfd4-0a0c8djc05",
"Name": "Test Item",
"Description": "test item description",
"Notes": null,
"ExternalId": null,
"ExpiryDate": null,
"ActiveStatus": 0,
"TagIds": [
"784083-4c77-b8fb-0135046c",
"86de96-44c1-a497-0a308607",
"7565aa-437f-af36-8f9306c9",
"d5d841-1762-8c14-d8420da2",
"bac054-2b6e-a19b-ef5b0b0c"
],
"ResourceIds": []
}'
SELECT *
FROM OPENJSON( @json, '$.GroupIds' );
SELECT COUNT(*) countOfGroupIds
FROM OPENJSON( @json, '$.GroupIds' );
My results:
If your data is stored in a table the code is similar. Make sense?
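If you also want the associated Id alongside the count in a single row, a sketch combining JSON_VALUE with the OPENJSON count (the column aliases are just examples):

SELECT JSON_VALUE(@json, '$.Id') AS Id,
       (SELECT COUNT(*) FROM OPENJSON(@json, '$.GroupIds')) AS countOfGroupIds;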
Another funky way to approach it, if you really need the count in-line, is to convert the JSON to XML using the built-in functions and then run some XPath on it. It's not as complicated as it sounds and allows you to get the result inside the pipeline.
The Data Factory xml function converts JSON to XML, but that JSON must have a single root property. We can fix up the JSON with concat and a single line of code. In this example I'm using a Set Variable activity, where varSource is your original JSON:
@concat('{"root":', variables('varSource'), '}')
Next, we can just apply the XPath with another simple expression:
@string(xpath(xml(json(variables('varIntermed1'))), 'count(/root/GroupIds)'))
My results:
Easy, huh? It's a shame there isn't more built-in support for JSONPath, unless I'm missing something, although you can use limited JSONPath in the Copy activity.
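If you would rather skip the intermediate variable, the whole thing can probably be collapsed into a single expression (an untested sketch, reusing varSource from above):

@string(xpath(xml(json(concat('{"root":', variables('varSource'), '}'))), 'count(/root/GroupIds)'))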
You can use a Data Flow activity in the Azure Data Factory pipeline to get the count.
Step 1:
Connect the Source to the JSON dataset, and in Source options under JSON settings, select Single document.
In the source preview, you can see there are 5 GroupIDs per ID.
Step 2:
Use the Flatten transformation to denormalize the values into rows for GroupIDs.
Select the GroupIDs array in Unroll by and Unroll root.
Step 3:
Use the Aggregate transformation to get the count of GroupIDs grouped by ID.
Under Group by: select the column from the drop-down to group your aggregation by.
Under Aggregate: build the expression to get the count of the column (GroupIDs).
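For example, the aggregate expression could simply be count(GroupIDs), assuming GroupIDs is the column name coming out of the flatten step; count() with no argument would also work to count all rows in each group.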
Aggregate Data preview:
Step 4: Connect the output to a Sink transformation to load the final output to the database.
Hello everyone,
I am fairly new to Data Factory and I need to copy information from Dynamics Business Central's REST API. I am struggling with the "Details" type entities such as "invoiceSalesHeader".
The API for that entity forces me to provide a header ID as a filter. In that sense, I would have to loop x times (a few thousand) and call the REST API to retrieve the lines of each sales invoice. I find that completely ridiculous and am trying to find other ways to get the information.
To avoid doing that, I am trying to get the information by calling the "salesInvoice" entity and using "$expand=salesInvoiceLines".
That gets me the information I need but inside data factory's Copy Activity, I am struggling with what I should put as a "collection reference" so that I end up with one row per salesInvoiceLine.
The data returned is an array of sales invoices with a sub array of invoice lines.
If I select "salesInvoiceLines" as the collection reference, I end up with "$['value'][0]['salesInvoiceLines']" and that only gives me the lines for the first invoice (since there is an index of zero).
What should I put in Collection Reference so that I get one row per salesInvoiceLine?
It is not supported to ForEach over a nested JSON array in ADF.
Alternatively, we can use a Flatten transformation in a data flow to flatten the nested JSON array.
Here is my example:
This is my example JSON data; the structure is like yours:
[
{
"id": 1,
"Value": "January",
"orders":[{"orderid":1,"orderno":"qaz"},{"orderid":2,"orderno":"edc"}]
},
{
"id": 2,
"Value": "February",
"orders":[{"orderid":3,"orderno":"wsx"},{"orderid":4,"orderno":"rfv"}]
},
{
"id": 3,
"Value": "March",
"orders":[{"orderid":5,"orderno":"rfv"},{"orderid":6,"orderno":"tgb"}]
},
{
"id": 11,
"Value": "November",
"orders":[{"orderid":7,"orderno":"yhn"},{"orderid":8,"orderno":"ujm"}]
}
]
In the data flow, we can select the nested JSON array to unroll by; here it is orders:
Then we can see the result: the JSON orders array, with 2 objects (orderid, orderno) per row, has been transposed into 8 flattened rows:
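If you prefer script to the UI, the flatten step above would look roughly like this in data flow script (a sketch only; source1 and FlattenOrders are assumed names based on the example data):

source1 foldDown(unroll(orders),
    mapColumn(
        id,
        Value,
        orderid = orders.orderid,
        orderno = orders.orderno
    )) ~> FlattenOrders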
I am trying to use a Lookup Activity to return a row count. I am able to do this, but once I do, I would like to run an If statement against it, and if the count returns more than 20 million rows, I want to execute an additional pipeline for further table manipulation. The issue, however, is that I cannot compare the returned value to a static integer. Below is the current dynamic expression I have for this If statement:
@greater(int(activity('COUNT_RL_WK_GRBY_LOOKUP').output),20000000)
and when fired, the following error is returned:
{
"errorCode": "InvalidTemplate",
"message": "The function 'int' was invoked with a parameter that is not valid. The value cannot be converted to the target type",
"failureType": "UserError",
"target": "If Condition1",
"details": ""
}
Is it possible to convert this returned value to an integer in order to make the comparison? If not, is there a possible work around in order to achieve my desired result?
Looks like the issue is with your dynamic expression. Please correct your dynamic expression similar to below and retry.
If firstRowOnly is set to true: @greater(int(activity('COUNT_RL_WK_GRBY_LOOKUP').output.firstRow.propertyname),20000000)
If firstRowOnly is set to false: @greater(int(activity('COUNT_RL_WK_GRBY_LOOKUP').output.value[zero based index].propertyname),20000000)
The lookup result is returned in the output section of the activity run result.
When firstRowOnly is set to true (default), the output format is as shown in the following code. The lookup result is under a fixed firstRow key. To use the result in a subsequent activity, use the pattern of @{activity('MyLookupActivity').output.firstRow.TableName}.
Sample Output JSON code is as follows:
{
"firstRow":
{
"Id": "1",
"TableName" : "Table1"
}
}
When firstRowOnly is set to false, the output format is as shown in the following code. A count field indicates how many records are returned. Detailed values are displayed under a fixed value array. In such a case, the Lookup activity is followed by a ForEach activity. You pass the value array to the ForEach activity items field by using the pattern of @activity('MyLookupActivity').output.value. To access elements in the value array, use the following syntax: @{activity('lookupActivity').output.value[zero based index].propertyname}. An example is @{activity('lookupActivity').output.value[0].tablename}.
Sample Output JSON Code is as follows:
{
"count": "2",
"value": [
{
"Id": "1",
"TableName" : "Table1"
},
{
"Id": "2",
"TableName" : "Table2"
}
]
}
Hope this helps.
Do this: when you run the debugger, look at the output from your Lookup. It will give a JSON string including the alias for the result of your query. If firstRowOnly is not set you get a table; with firstRowOnly you get output, then firstRow, then your alias. That is what you specify.
For example, if you put the alias of your count as Row_Cnt, then...
@greater(activity('COUNT_RL_WK_GRBY_LOOKUP').output.firstRow.Row_Cnt,20000000)
You don't need the int function. You were trying to use it (just like I was!) because the expression was complaining about the datatype. That's because you were returning a chunk of JSON text as the output instead of the value you were after. It totally makes sense once you realize how it works, but it is NOT intuitively obvious, because it does come back with data, just JSON string content rather than the value you're after. Functions like equals are happy with that; it's not until you try something like greater, which expects a numeric value, that it chokes.
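For completeness, the Lookup query that produces that alias might be something like the following (the table name is only an example), with firstRowOnly left at its default of true:

SELECT COUNT(*) AS Row_Cnt FROM dbo.RL_WK_GRBY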
When trying to access the ServiceM8 API to search for customers, I am using the following API request. The endpoint works, but I am not sure how to handle special characters like ' or &.
API Documentation: https://developer.servicem8.com/docs/filtering
So if I call the following:
https://api.servicem8.com/api_1.0/company.json?%24filter=name%20eq%20"BENTON%"
Then I get the following result (3 results, including the ones with a special character):
[
{
"uuid": "60791e9a-8c5a-4e2a-aca5-f9625bef7d8b",
"edit_date": "2018-06-20 15:13:53",
"name": "BENTON'S TEST",
},
{
"uuid": "f722b374-e330-42f1-a09e-333b375af1ab",
"edit_date": "2018-07-05 09:46:30",
"name": "Benton's test",
},
{
"uuid": "df01f8ce-a1c9-438b-8954-297b113689ab",
"edit_date": "2018-07-05 17:02:43",
"name": "BENTONS TEST",
}
]
What I would like to filter for is BENTON'S TEST, so I try the following, but that does not work:
https://api.servicem8.com/api_1.0/company.json?%24filter=name%20eq%20"BENTON'S%"
{
"uuid": "df01f8ce-a1c9-438b-8954-297b113689ab",
"edit_date": "2018-07-05 17:02:43",
"name": "BENTONS TEST",
}
My question now is: how can I filter in this API so that I can search for these special characters, like BENTON'S? If I use %27 to replace the ' sign, that does not change the result, as it seems the % sign in a filter is used as a wildcard.
UPDATE
Reading this: https://community.dynamics.com/crm/b/mscrmshop/archive/2015/12/21/crm-odata-rest-queries-and-special-characters
I assume that the %27 is just not recognized and I should use two single quotes, but that does not work either. If I search in the front-end application, searching for ' does not work, but when I use \' I can search for the correct names. Using that in my query string does not work either.
Please use &$filter=substringof("BENTON'S",name) instead of &$filter=name eq "BENTON'S". This would return all names that contain Benton's.
Alternatively, you can also use &$filter=startswith("BENTON'S",name)
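For example, the full request with the filter URL-encoded might look like this (just a sketch; whether the apostrophe also needs to be encoded as %27 is an assumption to verify against the ServiceM8 docs):

https://api.servicem8.com/api_1.0/company.json?%24filter=substringof(%22BENTON%27S%22,name)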
I have an array which I need to parse in Talend:
{
"sEcho": 1,
"iTotalRecords": 54,
"iTotalDisplayRecords": 54,
"aaData": [
[
"79",
"testowy2",
"testowy samochod",
"12.00",
"14.00",
"2147483647",
"posciel",
""
]
]
}
What would be the XPath query that I need to pass to tExtractJSONFields?
You might use a JSONPath query to read the multi-dimensional array via tExtractJSONFields.
For query building (with respect to the above data), please refer to the screenshot below:
You will get the array in the "id" field.
Let me know if you face any problem.
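In case it helps, one way the component could be configured (a sketch; these queries are assumptions, not the values from the screenshot): set Read By to JsonPath, set the Loop Jsonpath query to "$.aaData[*]", and map the output columns with relative queries such as "[0]", "[1]", "[2]" for the positions within each inner array.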
It is not possible to do this;
the "Please check right XPathExpression or XML source document." error is occurring...
Your input is valid JSON but not a valid XML document.
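That error suggests the component is still set to read by Xpath; switching its Read By property to JsonPath, as suggested above, should avoid it, since the input is JSON rather than XML.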