Collect and merge data from multiple REST responses - Talend

I have a project where I need to perform a REST call and collect values from the response. For each set of values collected, I need to use one value to add as a query parameter for a second REST call. Once I receive the response from the second call, I need to add that value to the set of values from the first call.
Data set A from REST call A:
[
{
"key1": "value1",
"key2": "value2"
},
{
"key1": "value3",
"key2": "value4"
}
]
REST Call B:
GET https://.../widgets?neededValue=value1
Data Set B:
[ {"id": "guid1", "secondaryId": "value1"} ]
Amalgam of data example:
[
{
"key1": "value1",
"key2": "value2",
"key3" "guid1"
},...
]
What is the best way to accomplish this?

Concerning your question, we may need a bit more information, specifically about the JSON structure you want to create.
But for this kind of use case I usually use a tFlowToIterate and store the value I want to reuse in a global variable.
So the structure of the job goes like this:
First REST call (extract the JSON values) and store the needed value in a global variable (using tFlowToIterate).
Iterate over the values in the globalMap and repeat the second REST call, built like this: "http://test.com/widgets?neededValue=" + ((String)globalMap.get("KEY1")). The global variable is set up in the tFlowToIterate component.
Extract the values from the second REST call and combine the values you want into your final JSON structure in a tMap (reading the global variables), then write the JSON with tWriteJSONField.
It should look something like that.
I hope this answers your question
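Outside of Talend, the same enrich-and-merge pattern can be sketched in plain JavaScript (the endpoint for the first call is hypothetical; the widgets URL and field names come from the question):

// Minimal sketch of the enrich-and-merge pattern described above.
// Call A once, then call B once per record, merging the results.
async function collectAndMerge() {
  const dataSetA = await (await fetch("https://example.com/api/source")).json(); // hypothetical call A
  const merged = [];
  for (const record of dataSetA) {
    // key1 from each record of data set A becomes the query parameter for call B.
    const url = "https://example.com/widgets?neededValue=" + encodeURIComponent(record.key1);
    const dataSetB = await (await fetch(url)).json();
    // Add the id returned by call B to the record from call A.
    merged.push({ ...record, key3: dataSetB[0].id });
  }
  return merged;
}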


JSON data retrieved from MongoDB has formats added explicitly inside fields, e.g. {"field": {$numberInt: "20"}}. How do I process that data?

I have used mongoimport to import data into MongoDB from CSV files. I am now trying to retrieve data from a MongoDB Realm service. The returned data for an entry is as follows:
{
"_id": "6124edd04543fb222e",
"Field1": "some string",
"Field2": {
"$numberDouble": "145.81"
},
"Field3": {
"$numberInt": "0"
},
"Field4": {
"$numberInt": "15"
},
"Field5": {
"$numberInt": "0"
}
}
How do I convert this into normal JSON, removing $numberInt and $numberDouble, like this:
{
"_id": "6124edd04543fb222e",
"Field1": "some string",
"Field2": 145.8,
"Field3": 0,
"Field4": 15,
"Field5": 0
}
The fields also differ between documents, so I cannot use Mongoose directly. Are there any solutions to this?
It would also help to know why the numbers are being stored as $numberInt: "".
Edit:
For anyone with the same problem this is how I solved it.
The array of documents is in EJSON format instead of plain JSON, as the upvoted answer says. To convert it back into normal JSON, I used JSON.stringify to first turn each document from the map function into a string, and then parsed that string with EJSON.parse using the {strict: false} option (this option is important) to get normal JSON.
const EJSON = require("mongodb-extjson");

restaurants = restaurants.map((restaurant) =>
  EJSON.parse(JSON.stringify(restaurant), { strict: false })
);
EJSON.parse is documented here; the module to install and import is mongodb-extjson.
The format with $numberInt etc. is called (MongoDB) Extended JSON.
You are getting it on the output side either because this is how you inserted your data (meaning the inserted data was incorrect and you need to fix the ingestion side), or because you requested Extended JSON serialization.
If the data in the database is correct and you want non-extended JSON output, you generally need to write your own serializer to JSON, since there are multiple possibilities for how to format the data. MongoDB's own JSON output format is the Extended JSON you're seeing in your first quote.
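If you do write your own converter, a minimal sketch in JavaScript could look like this (it assumes only $numberInt and $numberDouble wrappers, matching the documents above; other Extended JSON types such as $date or $oid would need their own cases):

// Recursively replace {"$numberInt": "..."} and {"$numberDouble": "..."}
// wrappers with plain JavaScript numbers; everything else passes through.
function fromExtendedJSON(value) {
  if (Array.isArray(value)) return value.map(fromExtendedJSON);
  if (value !== null && typeof value === "object") {
    const keys = Object.keys(value);
    if (keys.length === 1 && keys[0] === "$numberInt") return parseInt(value.$numberInt, 10);
    if (keys.length === 1 && keys[0] === "$numberDouble") return parseFloat(value.$numberDouble);
    const plain = {};
    for (const key of keys) plain[key] = fromExtendedJSON(value[key]);
    return plain;
  }
  return value;
}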

Azure Data Factory - Copy Activity - rest api collection reference

Hello everyone,
I am fairly new to Data Factory and I need to copy information from Dynamics Business Central's REST API. I am struggling with the "Details" type entities such as "invoiceSalesHeader".
The API for that entity forces me to provide a header ID as a filter, so I would have to loop a few thousand times and call the REST API to retrieve the lines of each sales invoice. I find that completely ridiculous and am trying to find another way to get the information.
To avoid doing that, I am trying to get the information by calling the "salesInvoice" entity with "$expand=salesInvoiceLines".
That gets me the information I need, but inside Data Factory's Copy Activity I am struggling with what to put as the "collection reference" so that I end up with one row per salesInvoiceLine.
The data returned is an array of sales invoices, each with a sub-array of invoice lines.
If I select "salesInvoiceLines" as the collection reference, I end up with "$['value'][0]['salesInvoiceLines']", and that only gives me the lines of the first invoice (because of the index of zero).
What should I put in Collection Reference so that I get one row per salesInvoiceLine?
Iterating over a nested JSON array with ForEach is not supported in ADF.
Alternatively, we can use a Flatten transformation in a data flow to flatten the nested JSON array.
Here is my example:
This is my example JSON data; the structure is like yours:
[
{
"id": 1,
"Value": "January",
"orders":[{"orderid":1,"orderno":"qaz"},{"orderid":2,"orderno":"edc"}]
},
{
"id": 2,
"Value": "February",
"orders":[{"orderid":3,"orderno":"wsx"},{"orderid":4,"orderno":"rfv"}]
},
{
"id": 3,
"Value": "March",
"orders":[{"orderid":5,"orderno":"rfv"},{"orderid":6,"orderno":"tgb"}]
},
{
"id": 11,
"Value": "November",
"orders":[{"orderid":7,"orderno":"yhn"},{"orderid":8,"orderno":"ujm"}]
}
]
In the data flow, we can select the name of the nested JSON array to unroll by; here it is orders:
Then we can see the result: the nested orders arrays, each holding objects with two fields (orderid, orderno), have been transposed into 8 flattened rows:
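Conceptually, the Flatten transformation does the equivalent of this JavaScript (a sketch of the unrolling logic, not how ADF implements it; the sample is a shortened version of the data above):

// One output row per element of the nested orders array,
// with the parent columns repeated on each row.
const months = [
  { id: 1, Value: "January", orders: [{ orderid: 1, orderno: "qaz" }, { orderid: 2, orderno: "edc" }] },
  { id: 2, Value: "February", orders: [{ orderid: 3, orderno: "wsx" }, { orderid: 4, orderno: "rfv" }] },
];
const rows = months.flatMap((month) =>
  month.orders.map((order) => ({ id: month.id, Value: month.Value, ...order }))
);
console.log(rows); // 4 flattened rows for this 2-invoice sample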

Updating relation in Hybris data using REST API, nested relation not saved

I've implemented a tree structure and want to save the items to the database. Every item has a "children" field with a list of child nodes.
But if I send a PUT request with something like this:
https://localhost:9001/ws410/rest/pdsfamilies/8796093098749
{
"children": [
{
"pk": "8796093164285"
}
]
}
I'm getting a 200 OK response, but the "children" list doesn't update. If I fetch the item with GET again, it doesn't contain the change.
What am I doing wrong?
The solution was a weird nested object structure like this:
{
"children": {
"pdsFamily" : [
{
"pk": "8796093164285"
}
]
}
}
I don't know why the extra pdsFamily property was needed.
Another weird thing is that the response from GET has a similar structure, but the property is all lowercase, pdsfamily... I have to create separate DTOs for the request and the response just because of that...
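For reference, the working PUT call looks something like this in JavaScript (a sketch; authentication is omitted, and the URL and pk values are the ones from the question):

// PUT the nested structure that Hybris actually accepts.
await fetch("https://localhost:9001/ws410/rest/pdsfamilies/8796093098749", {
  method: "PUT",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    children: {
      pdsFamily: [{ pk: "8796093164285" }],
    },
  }),
});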

How do I use ADF copy activity with multiple rows in source?

I have a source which is a JSON array; the sink is SQL Server. When I use column mapping and look at the code, I can see the mapping targets the first element of the array, so each run produces a single record even though the source has multiple records. How do I use the Copy Activity to import ALL the rows?
"enableStaging": false,
"translator": {
"type": "TabularTranslator",
"schemaMapping": {
"['#odata.context']": "BuyerFinancing",
"['#odata.nextLink']": "PropertyCondition",
"value[0].AssociationFee": "AssociationFee",
"value[0].AssociationFeeFrequency": "AssociationFeeFrequency",
"value[0].AssociationName": "AssociationName",
Use * as the source field to indicate all elements of the array in the JSON. For example, with this JSON:
{
"results": [
{"field1": "valuea", "field2": "valueb"},
{"field1": "valuex", "field2": "valuey"}
]
}
and a database table with a column result to store the JSON, the mapping with results as the collection and * as the sub-element will create two records:
{"field1": "valuea", "field2": "valueb"}
{"field1": "valuex", "field2": "valuey"}
in the result field.
Copy Data Field Mapping
ADF supports cross-apply for JSON arrays. Please check the example in this doc: https://learn.microsoft.com/en-us/azure/data-factory/supported-file-formats-and-compression-codecs#jsonformat-example
For schema mapping, see: https://learn.microsoft.com/en-us/azure/data-factory/copy-activity-schema-and-type-mapping#schema-mapping
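Applied to the source in the question, the translator would look something like this (a sketch based on the schema-mapping doc above; collectionReference unrolls the value array so every element becomes its own row, and paths inside the array are written relative to each element):

"translator": {
    "type": "TabularTranslator",
    "schemaMapping": {
        "$['#odata.context']": "BuyerFinancing",
        "AssociationFee": "AssociationFee",
        "AssociationFeeFrequency": "AssociationFeeFrequency",
        "AssociationName": "AssociationName"
    },
    "collectionReference": "$['value']"
}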

Any default APIs in Dolibarr for creating sales order records?

Dolibarr has a module for RESTful APIs.
The API explorer seems to show all the CRUD tasks for each module, like orders, stock and customers.
But to CREATE a record, the sample VALUE for the POST method shows as:
{
"request_data": [
"string"
]
}
What are the specific field attributes that should go in here?
Where can I look up the field requirements?
You should take a look at the attributes of the Commande class:
https://github.com/Dolibarr/dolibarr/blob/develop/htdocs/commande/class/commande.class.php
The object should be something like this:
{
"date_commande" : "0000-00-00 00:00:00",
"date_livraison" : "0000-00-00 00:00:00",
"attribute3": "and so on"
}
When you need a parameter like
{ "request_data": [ "string" ] }
for a POST API, all you have to do is call the same API with the GET method to fetch an existing record. The result can be copied and pasted to create a new record (just change the id and ref in the answer retrieved by the GET).