Azure Data Factory: Using output of REST in Copy Data activity in next activity

I have a Copy Data activity which copies data from a REST source to SQL Server. The REST API returns a JSON response. I need another Web activity to run after the Copy Data activity succeeds, and this activity needs data from the earlier REST API response (which is consumed by the Copy Data activity). Any idea how we can achieve this?
I have tried using
@{activity('ACTIVITY_NAME').output.<json_field_from_response>}
I get following error.
{
"errorCode": "InvalidTemplate",
"message": "The expression 'activity('ACTIVITY_NAME').output.batch_id' cannot be evaluated because property 'batch_id' doesn't exist, available properties are 'dataRead, dataWritten, rowsRead, rowsCopied, copyDuration, throughput, errors, effectiveIntegrationRuntime, usedDataIntegrationUnits, usedParallelCopies, executionDetails'.",
"failureType": "UserError",
"target": "Web1"
}
I am hoping there is some way in the dataset or pipeline to set a variable that can be used later, but I am not able to find it. Thanks.

You could use a Lookup activity and get the data from the Lookup output.
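For example, here is a rough sketch of that pattern in pipeline JSON (the activity and dataset names, the Web activity URL, and firstRowOnly are placeholders I am assuming; batch_id is the field from the question):

{
    "name": "LookupBatch",
    "type": "Lookup",
    "dependsOn": [
        { "activity": "CopyFromRest", "dependencyConditions": [ "Succeeded" ] }
    ],
    "typeProperties": {
        "source": { "type": "RestSource" },
        "dataset": { "referenceName": "RestResponseDataset", "type": "DatasetReference" },
        "firstRowOnly": true
    }
},
{
    "name": "Web1",
    "type": "WebActivity",
    "dependsOn": [
        { "activity": "LookupBatch", "dependencyConditions": [ "Succeeded" ] }
    ],
    "typeProperties": {
        "url": "https://example.com/api/notify",
        "method": "POST",
        "body": {
            "batch_id": "@{activity('LookupBatch').output.firstRow.batch_id}"
        }
    }
}

Note that the Lookup calls the REST endpoint again; the Copy activity's own output only exposes copy metrics such as rowsCopied, which is why the original expression failed.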

Related

Copy Activity Not able to copy any response from Rest api in Azure Data factory

I am using a REST API URL as the input and I am trying to save the response to a SQL table. When I run the pipeline, it completes successfully, but it shows zero rows copied.
I tested the API in Postman and I am able to see the response data (9 MB).
Has anybody else had this same issue? Please help me.
I tried to reproduce this and faced a similar problem: it does not insert any records.
The problem is caused by the API returning its response as JSON, while the pipeline does not know which object value should be stored in which column.
To resolve this, use Mapping: import the schema and map the particular columns as below:
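Roughly, the mapping ends up as a translator block on the Copy activity (the column names and the $['results'] array path below are only placeholders for whatever the API actually returns):

"typeProperties": {
    "source": { "type": "RestSource" },
    "sink": { "type": "AzureSqlSink" },
    "translator": {
        "type": "TabularTranslator",
        "collectionReference": "$['results']",
        "mappings": [
            { "source": { "path": "$['id']" }, "sink": { "name": "Id" } },
            { "source": { "path": "$['name']" }, "sink": { "name": "Name" } }
        ]
    }
}

The collectionReference tells the copy which JSON array to flatten into rows; without a mapping like this the run can succeed while copying zero rows, as described above.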
Output:
I think the intent here is to copy the response JSON to SQL, and if that is the case then we cannot do that with a Copy activity.
One way is to use a Web activity to call the API and then call a Stored Procedure activity, passing the response as an input parameter to the SP. The SP will insert the record into the table. But a 9 MB response is quite large, and I doubt the Web activity can handle that.
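A rough sketch of that Web activity + Stored Procedure pattern (the URL, linked service, procedure name and ResponseJson parameter below are all hypothetical):

{
    "name": "CallApi",
    "type": "WebActivity",
    "typeProperties": {
        "url": "https://example.com/api/records",
        "method": "GET"
    }
},
{
    "name": "InsertResponse",
    "type": "SqlServerStoredProcedure",
    "dependsOn": [
        { "activity": "CallApi", "dependencyConditions": [ "Succeeded" ] }
    ],
    "linkedServiceName": { "referenceName": "AzureSqlLinkedService", "type": "LinkedServiceReference" },
    "typeProperties": {
        "storedProcedureName": "dbo.usp_InsertApiResponse",
        "storedProcedureParameters": {
            "ResponseJson": {
                "value": "@{string(activity('CallApi').output)}",
                "type": "String"
            }
        }
    }
}

The procedure would then shred the JSON on the SQL side (for example with OPENJSON) before inserting it into the target table.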

How do you handle an empty Rest call when trying to copy data in azure data factory?

I have a Copy Data activity in which my source dataset is a RestResource. It works fine, except that every once in a while the REST call returns an empty dataset: {"d":{"results":[]}}
This results in the following error
{
"errorCode": "2200",
"message": "ErrorCode=UserErrorTypeInSchemaTableNotSupported,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Failed to get the type from schema table. This could be caused by missing Sql Server System CLR Types.,Source=Microsoft.DataTransfer.ClientLibrary,''Type=System.InvalidCastException,Message=Unable to cast object of type 'System.DBNull' to type 'System.Type'.,Source=Microsoft.DataTransfer.ClientLibrary,'",
"failureType": "UserError",
"target": "Import Punch Adjustments",
"details": []
}
I know that for other sources you would do a Lookup or check the metadata and then use a conditional If, but I am unsure how to do that for a REST source. Are there any better options?
You can add an activity on the failure path; this way, even if the Copy activity fails, the pipeline run will still succeed.
Example:
I have added an activity to store the error code on failure.
@string(activity('Copy data1').error.errorCode)
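As a sketch, that failure-path activity can be a Set Variable wired to the Failed dependency condition of the copy (this assumes a String variable named copyErrorCode is declared on the pipeline):

{
    "name": "Store error code",
    "type": "SetVariable",
    "dependsOn": [
        { "activity": "Copy data1", "dependencyConditions": [ "Failed" ] }
    ],
    "typeProperties": {
        "variableName": "copyErrorCode",
        "value": "@{string(activity('Copy data1').error.errorCode)}"
    }
}

Because the failure is consumed by the downstream activity, the pipeline run itself is reported as succeeded.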
Pipeline status:

Rest API call from copy activity

Hi, I am processing a set of ~50K records from a pipe-delimited flat file in Azure Data Factory and need to invoke a REST API call for each input record. So I am using a ForEach loop to access each record, and inside the loop I am using a Copy activity to invoke the REST API call.
My question is: can I invoke the REST API call in bulk for all the records at once, as the ForEach loop is slowing down pipeline execution? I also want to remove the ForEach loop, process the API JSON response, and store it in an Azure SQL database.
Thanks
You will have to check the pagination properties so that you can decide how much payload you need to return from the source API:
https://learn.microsoft.com/en-us/azure/data-factory/connector-rest?tabs=data-factory#pagination-support
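For instance, if the API returns the next page's URL in its response body (the nextLink property name here is only an assumption), the REST source of the Copy activity can page through it without a ForEach:

"source": {
    "type": "RestSource",
    "paginationRules": {
        "AbsoluteUrl": "$.nextLink"
    }
}

If the API instead pages by query parameter, a rule such as "QueryParameters.offset": "RANGE:0:50000:1000" can be used, depending on what the API supports.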
Also, if you need to store the API JSON response in Azure SQL, you can do so with built-in functions such as OPENJSON and JSON_VALUE.
More details can be found in this link:
https://learn.microsoft.com/en-us/azure/azure-sql/database/json-features

Azure Copy Activity Rest Results Unexpected

I'm attempting to pull data from the Square Connect v1 API using ADF, utilizing a Copy activity with a REST source. I am successfully pulling back data; however, the results are unexpected.
The endpoint is /v1/{location_id}/payments. I have three parameters, shown below.
I can successfully pull this data via Postman.
The results are stored in a Blob and are as if I did not specify any parameters whatsoever.
Only when I hardcode the parameters into the relative path do I get correct results.
I feel I must be missing a setting somewhere, but which one?
You can try setting the values you want in a Set Variable activity and then have your Copy activity reference those variables. This will tell you whether or not it is an issue with the dynamic content. I have run into some unexpected behavior myself. The benefit of the intermediate Set Variable activity is twofold: firstly, it coerces the data type; secondly, it lets you see what the value actually is.
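As a sketch of that approach (the variable, dataset and parameter names, and the Square query-string layout, are assumptions based on the question):

{
    "name": "Set beginTime",
    "type": "SetVariable",
    "typeProperties": {
        "variableName": "beginTime",
        "value": "@{addDays(utcNow(), -7)}"
    }
}

and then, in a parameterized REST dataset, build the relative path from those values instead of using separate request parameters:

"relativeUrl": {
    "value": "@concat('/v1/', dataset().locationId, '/payments?begin_time=', dataset().beginTime)",
    "type": "Expression"
}

with the Copy activity passing @variables('beginTime') and the location id in as dataset parameter values.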
My apologies for not using comments. I do not yet have enough points to comment.

Error - Azure Data Factory transfer from SQL Database to Azure Search

I've set up an Azure Data Factory pipeline to transfer the data from one table in our SQL Server Database to our new Azure Search service. The transfer job continuously fails giving the following error:
Copy activity encountered a user error at Sink side: GatewayNodeName=SQLMAIN01,ErrorCode=UserErrorAzuerSearchOperation,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Error happened when writing data to Azure Search Index '001'.,Source=Microsoft.DataTransfer.ClientLibrary.AzureSearch,''Type=Microsoft.Rest.Azure.CloudException,Message=Operation returned an invalid status code 'RequestEntityTooLarge',Source=Microsoft.Azure.Search,'.
From what I've read thus far, the Request Entity Too Large error is the standard HTTP 413 error returned by REST APIs. Of all the research I've done, though, nothing helps me understand how to truly diagnose and resolve this error.
Has anyone dealt with this in the specific context of Azure? I would like to find out how to get all of our database data into our Azure Search service. If there are adjustments that can be made on the Azure side to increase the allowed request size, the process for doing so is certainly not readily available anywhere I've seen on the internet, nor in the Azure documentation.
This error means that the batch size written by the Azure Search sink into Azure Search is too large. The default batch size is 1000 documents (rows). You can decrease it to a value that balances size and performance by using the writeBatchSize property of the Azure Search sink. See Copy Activity Properties in the "Push data to an Azure Search index by using Azure Data Factory" article.
For example, writeBatchSize can be configured on the sink as follows:
"sink": { "type": "AzureSearchIndexSink", "writeBatchSize": 200 }