Issue in passing content as a body in Azure Data Factory

I am calling an Azure Function (HTTP trigger) from Azure Data Factory, and the body comes from a Lookup activity (@activity('Lookup1').output.value). It looks something like this:
"body":[
{
"BaseObject": "2|03|01|01",
"BaseObjectDescription": "Cent",
"CreateDate": "30.09.2021"
},
{
"BaseObject": "9|03|01|01",
"BaseObjectDescription": "Pent",
"CreateDate": "30.09.2021"
}]
When the above JSON is passed as the body to the Azure Function activity, I get the error "Unexpected character encountered while parsing value: S." But if I hardcode the same value in the body of the Azure Function activity, I get the output. When I hardcode it, I see in debug mode that the body is passed as something like this:
"body": "[\n {\n \"BaseObject\": \"02|03|01|01\",\n \"BaseObjectDescription\": \"Cent\",\n \"CreateDate\": \"30.09.2021\" },\n {\n \"BaseObject\": \"04|03|01|01\",\n \"BaseObjectDescription\": \"Pent",\n \"CreateDate\": \"21.09.2021\",\n },\n]"
So the question is: how do I convert the JSON I am getting from the Lookup activity into something like the above, so that my Function recognizes it as the body in Azure Data Factory?
Here is my configuration of the Azure Function linked service (also, why is "Open in Azure Portal" disabled?). The error I get in ADF for the Azure Function activity is:
"Call to provided Azure function 'Function1' failed with status-'InternalServerError' and message - 'Invoking Azure function failed with HttpStatusCode - InternalServerError.'"
Meanwhile, in the Azure Function logs I see the error messages below:
2022-01-06T10:14:42.403 [Information] C# HTTP trigger function processed a request.
2022-01-06T10:14:42.826 [Error] Executed 'Function1' (Failed, Id=c7f2488f-e08f-49a3-8f10-4e82a10d9ac0, Duration=270ms)The argument 'length' is smaller than minimum of '1' (Parameter 'length')
2022-01-06T10:16:40.934 [Information] Executing 'Function1' (Reason='This function was programmatically called via the host APIs.', Id=5ff54bce-679d-4892-87a0-fb342ff02cc7)
2022-01-06T10:16:40.934 [Information] C# HTTP trigger function processed a request.
2022-01-06T10:16:40.994 [Error] Executed 'Function1' (Failed, Id=5ff54bce-679d-4892-87a0-fb342ff02cc7, Duration=2ms)Specified method is not supported.Specified method is not supported.

Use the Lookup activity value as @activity('Lookup1').output.value[0] inside the body of the Azure Function activity. The Lookup activity's output.value is an array, so indexing with [0] passes a single JSON object rather than the array wrapper.
Lookup activity output:
Azure Function activity settings:
Body: @activity('Lookup1').output.value[0]
The value shown in input to Azure functions:
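For reference, a minimal sketch of what the Azure Function activity definition might look like with this expression (the activity name is illustrative; the function name is taken from the question):

{
    "name": "Azure Function1",
    "type": "AzureFunctionActivity",
    "typeProperties": {
        "functionName": "Function1",
        "method": "POST",
        "body": {
            "value": "@activity('Lookup1').output.value[0]",
            "type": "Expression"
        }
    }
}

Alternatively, if the function really expects the whole array serialized as one JSON string (the escaped form shown in the question), @string(activity('Lookup1').output.value) produces exactly that.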

Related

Getting an error when publishing in Azure Synapse or Azure Data Factory

I am trying to publish the Synapse pipelines from the master branch, and I am getting an error that does not point to any details or any specific pipeline or data flow. The error is below:
Error code: OK
Inner error code: BadRequest
Message: Missing parameter definition for Env
"Missing parameter definition for Env" is a parameter-passing error that occurs often. A parameter's default value can't be an expression; it has to be a static value.
You can check whether you follow these steps when passing parameters (see the sketch after this list):
1. Create parameters in the dataset.
2. Change the dynamic content to reference the new dataset parameters in the dataset.
3. Enter dynamic content referencing the original pipeline parameter in the calling pipeline.
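As a hedged illustration (the dataset name and folder path are assumptions, and the exact typeProperties shape varies by dataset type; only the parameter name Env comes from the error message), the dataset could declare the parameter and use it in dynamic content:

{
    "name": "SinkDataset",
    "properties": {
        "parameters": {
            "Env": { "type": "String" }
        },
        "typeProperties": {
            "folderPath": {
                "value": "@concat('landing/', dataset().Env)",
                "type": "Expression"
            }
        }
    }
}

The calling pipeline's dataset reference would then supply it from a pipeline parameter:

"dataset": {
    "referenceName": "SinkDataset",
    "type": "DatasetReference",
    "parameters": {
        "Env": {
            "value": "@pipeline().parameters.Env",
            "type": "Expression"
        }
    }
}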

How can we pass an API payload in a Step Function dynamically?

I created a Step Function which is triggered by API Gateway with input parameters. The graph is here:
Below is my Step function code :
[
I'm trying to pass the body via API Gateway like below:
I'm getting the below error when sending the body request to API Gateway:
{
    "error": "States.Runtime",
    "cause": "An error occurred while executing the state 'API Gateway Invoke' (entered at the event id #2). The JSONPath '$.id' specified for the field 'Payload.$' could not be found in the input '{"ISIN":"12sd"}'"
}
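The error says the state machine tried to resolve the JSONPath $.id against the state input {"ISIN":"12sd"}, which has no id field. A minimal sketch of a fix (only the field name Payload.$ comes from the error message; the fragment below is illustrative) is to reference a path that actually exists in the input, for example forwarding the whole input:

"Parameters": {
    "Payload.$": "$"
}

Alternatively, keep "Payload.$": "$.id" and include an id field in the body posted to API Gateway.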

Can we invoke the AWS SNS service from a Snowflake external function through API Gateway without Lambda? It must accept API query string parameters as well.

I am able to make calls through a Snowflake external function (with parameters) to API Gateway and then invoke a Lambda. The question is: I want to invoke SNS without Lambda, so I need to handle the parameters from the Snowflake function in API Gateway itself, which will then be used to invoke SNS. Is that possible?
Error: Request failed for external function EX_FN_API_EXCEPTIONS with remote service error: 400 '{"message": "Missing required request parameters: [subject, topicArn]"}'; requests batch-id: <>; request batch size: 1 rows; request retries: 0; response time (last retry): 41.9ms
create or replace external function EX_FN_API_EXCEPTIONS(subject1 varchar, topicArn1 varchar, message1 variant)
returns variant
api_integration = aws_api_integration_exceptions
as 'https://tp7nyt3qq0.execute-api.us-east-1.amazonaws.com/dev/event';
select EX_FN_API_EXCEPTIONS('TEST SNOW SUBJECT', 'arn:aws:sns:us-east-1:account:riksd-0015-notification', null )
All I need is a way to pass parameters from the Snowflake external function into the method request's query string parameters below, and also into the message body.
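For context, Snowflake external functions POST a JSON body of the shape {"data": [[row_number, col1, col2, ...]]} to the proxy endpoint, so the select above would send roughly the following (values illustrative):

{
    "data": [
        [0, "TEST SNOW SUBJECT", "arn:aws:sns:us-east-1:account:riksd-0015-notification", null]
    ]
}

Because SNS expects subject and topicArn as request parameters, an API Gateway mapping template would have to extract them from this body; Snowflake does not send them as query string parameters itself.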

Is there a way to capture the name of a task that has been executed in SnapLogic?

We have a lot of triggered Tasks that run on the same pipelines, but with different parameters.
My question: is there a way, such as a function or an expression, to capture the name of the triggered task, so that we could use that information when writing the reports and e-mails about which task started the failing pipeline? I can't find anything even close to it.
Thank you in advance.
This answer addresses the requirement of uniquely identifying the invoker task in the invoked pipeline.
For triggered tasks, there isn't anything provided out of the box by SnapLogic. However, in the case of Ultra tasks you can get $['task_name'] from the input to the pipeline.
Out of the box, SnapLogic provides the following headers that can be captured and used in the pipeline being initiated by the triggered task. These are as follows.
PATH_INFO - The path elements after the Task part of the URL.
REMOTE_USER - The name of the user that invoked this request.
REMOTE_ADDR - The IP address of the host that invoked this request.
REQUEST_METHOD - The method used to invoke this request.
None of these contains the task-name.
In your case, as a workaround, to uniquely identify the invoker task in the invoked pipeline you could do one of the following three things.
Pass the task-name as a parameter
Add the task-name in the URL like https://elastic.snaplogic.com/.../task-name
Add a custom header from the REST call
All three of the above methods can help you capture the task name in the invoked pipeline.
In your case, I would suggest you go for a custom header because the parameters you pass in the pipeline could be task-specific and it is redundant to add the task-name again in the URL.
Following is how you can add a custom header in your triggered task.
From SnapLogic Docs -
Custom Headers
To pass a custom HTTP header, specify a header and its value through the parameter fields in Pipeline Settings. The request matches any header with Pipeline arguments and passes those to the Task, while the Authorization header is never passed to the Pipeline.
Guidelines
The header must be capitalized in its entirety. Headers are case-sensitive.
Hyphens must be changed to underscores.
The HTTP custom headers override both the Task and Pipeline parameters, but the query string parameter has the highest precedence.
For example, if you want to pass a tenant ID (X-TENANT-ID) in a header, add the parameter X_TENANT_ID and provide a default or leave it blank. When you configure the expression, refer to the Pipeline argument following standard convention: _X_TENANT_ID. In the HTTP request, you add the header X-TENANT-ID: 123abc, which results in the value 123abc being substituted for the Pipeline argument X_TENANT_ID.
Creating a task-name parameter in the pipeline settings
Using the task-name parameter in the pipeline
Calling the triggered task
Note: Hyphens must be changed to underscores.
References:
SnapLogic Docs - Triggered Tasks
I'm adding this as a separate answer because it addresses the specific requirement of logging an executed triggered task separate from the pipeline. This solution has to be a separate process (or pipeline) instead of being part of the triggered pipeline itself.
The Pipeline Monitoring API doesn't have any explicit log entry for the task name of a triggered task; invoker is what you have to use.
However, the main API used by SnapLogic to populate the Dashboard is more verbose. The following is a screenshot of how the response looks in Google Chrome Developer Tools.
You can use the invoker_name and pipe_invoker fields for identifying a triggered task.
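For illustration, an entry in that response might look roughly like this (only the invoker_name and pipe_invoker field names are taken from the answer above; the other fields and all values are assumptions):

{
    "pipe_name": "ErrorReportingPipeline",
    "pipe_invoker": "feed",
    "invoker_name": "MyTriggeredTask",
    "state": "Completed"
}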
Following is the API that is being used.
POST https://elastic.snaplogic.com/api/2/<org snode id>/rest/pm/runtime
Body:
{
    "state": "Completed,Stopped,Failed,Queued,Started,Prepared,Stopping,Failing,Suspending,Suspended,Resuming",
    "offset": 0,
    "limit": 25,
    "include_subpipelines": false,
    "sort": {
        "create_time": -1
    },
    "start_ts": null,
    "end_ts": null,
    "last_hours": 1
}
You can have a pipeline that periodically fires this API, then parses the response and populates a log table (or creates a log file).

How to create a text file and insert an error message into it in ADF V2

We have a Copy activity in ADF, and the requirement is to abort the operation on failure, send a mail with the error message, and log the error information in a text file in blob storage.
Up to sending the mail with the error message, the task is completed using Logic Apps. Since skip-and-log cannot be used with ADF fault tolerance in our case to store the error information in blob storage: is there a way to log the error information in a text file in blob storage using ADF when fault tolerance for the Copy activity is set to 'abort activity on first incompatible row'?
Please let me know if there is any way to log the error information in blob storage when fault tolerance is 'abort activity on first incompatible row'.
@vicky, based on the fault tolerance documentation and the options in the ADF UI, there are only three options we can pick.
Only "Skip and log incompatible rows" can save the error log into Azure Blob Storage that we configure; there is no direct way to save the error log into Azure Storage if you have to pick "abort activity on first incompatible row".
So my workaround is adding an Azure Function activity after the Copy activity, executed when the Copy activity's status is Failure, like this:
I created an error situation on purpose for the Copy activity; you can view the output of the Copy activity, which contains the error message:
Then you can reference this message in the Azure Function body with @activity('Copy data1').output.
Inside the Azure Function, you can store it in blob storage using SDK code.
Can you please share the code for the Azure Function to store the error information in blob storage, as I don't have much knowledge of C#?
I followed this tutorial to test C# code for you:

using System.IO;
using System.Linq;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Azure.WebJobs.Host;
using Microsoft.WindowsAzure.Storage.Blob;

namespace FunctionApp2
{
    public static class Function1
    {
        [FunctionName("Function1")]
        public static async Task<HttpResponseMessage> Run(
            [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)] HttpRequestMessage req,
            [Blob("test/error.txt", FileAccess.ReadWrite)] CloudBlockBlob blob,
            TraceWriter log)
        {
            log.Info("C# HTTP trigger function processed a request.");

            // Try to read the message from the "name" query string parameter.
            string name = req.GetQueryNameValuePairs()
                .FirstOrDefault(q => string.Compare(q.Key, "name", true) == 0)
                .Value;

            if (name == null)
            {
                // Fall back to the request body: { "name": "..." }
                dynamic data = await req.Content.ReadAsAsync<object>();
                name = data?.name;
            }

            if (name == null)
            {
                return req.CreateResponse(HttpStatusCode.BadRequest,
                    "Please pass a name on the query string or in the request body");
            }

            // Write the message to test/error.txt in the bound storage account.
            // Uploading only after resolving "name" avoids writing null when the
            // value arrives in the request body instead of the query string.
            await blob.UploadTextAsync(name);

            return req.CreateResponse(HttpStatusCode.OK, "Hello " + name);
        }
    }
}
The name parameter is the error message you need to pass; it will then be saved to the test/error.txt path.
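To wire this up from ADF, the Body of the Azure Function activity could be built with a dynamic expression along these lines (a sketch; the errors[0].Message path is an assumption about the shape of the Copy activity's failure output):

{
    "name": "@{activity('Copy data1').output.errors[0].Message}"
}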