How to insert an entity into Azure Storage table using Web Activity of Azure Data Factory service - azure-data-factory

I have a table in a storage account. I would like to do a test by inserting an entity into this table using Web Activity with the guide from this link (https://learn.microsoft.com/en-us/rest/api/storageservices/insert-entity).
I also tried to create a header in the Web Activity settings with the following format for my shared key (https://learn.microsoft.com/en-us/rest/api/storageservices/authorize-with-shared-key):
Authorization="SharedKey <_AccountName>:<_Signature>"
But it seems that there is no function in the dynamic expression to make a Hash-based Message Authentication Code (HMAC) for the <_Signature>.
Could someone give me some sample or some hints? Thanks.

Data Flows provide a sha2 function in the expression builder, but there is no equivalent for the Web Activity in Data Factory pipelines, so you will have to use a workaround. Here is what I tried: call a serverless, PowerShell-based Azure Function app to compute the signature.
The basic idea in PowerShell:
# Shared Key needs a keyed HMAC-SHA256, not a plain SHA-256 hash;
# the HMAC key is the Base64-decoded storage account key.
$StringToSign = "String_to_sign"
$AccountKey = "<your_base64_account_key>"
$hmac = New-Object System.Security.Cryptography.HMACSHA256
$hmac.Key = [System.Convert]::FromBase64String($AccountKey)
$hash = $hmac.ComputeHash([System.Text.Encoding]::UTF8.GetBytes($StringToSign))
$body = [System.Convert]::ToBase64String($hash)
1. Call the function app:
With a body: a JSON containing the String_to_sign
{
"name": "@{pipeline().parameters.StringToSign}"
}
2. Assign the function app output (the Base64-encoded signature) to a variable:
@activity('Azure Function1').output.Response
3. Configure the Web Activity as per your scenario:
Note: I have used sample data for demonstration purposes; adapt this method to your own scenario.
Prep the Authorization header by concatenating the account name and the signature returned by the function:
Authorization: @concat('SharedKey kteststoragee:', variables('sha256'))
Build the Authorization header following the MS doc, Table service (Shared Key authorization); use a string function such as concat to build the final string.
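For reference, here is a minimal Python sketch of the same signature computation, assuming the Shared Key scheme for the Table service described in the MS doc (the account name, key, and table name are placeholders):

import base64
import hashlib
import hmac
from datetime import datetime, timezone

account_name = "mystorageaccount"  # placeholder
account_key = "AAAAAAAA"           # placeholder; your Base64 storage account key
table_name = "mytable"             # placeholder

# RFC 1123 timestamp, also sent as the x-ms-date header
date = datetime.now(timezone.utc).strftime("%a, %d %b %Y %H:%M:%S GMT")

# Shared Key string-to-sign for the Table service:
# VERB \n Content-MD5 \n Content-Type \n Date \n CanonicalizedResource
string_to_sign = "\n".join([
    "POST",
    "",
    "application/json",
    date,
    f"/{account_name}/{table_name}",
])

# HMAC-SHA256 keyed with the Base64-decoded account key, then Base64-encoded
signature = base64.b64encode(
    hmac.new(base64.b64decode(account_key),
             string_to_sign.encode("utf-8"),
             hashlib.sha256).digest()
).decode()

print(f"SharedKey {account_name}:{signature}")

The Web Activity then sends this value in the Authorization header, together with x-ms-date and the other headers the Insert Entity operation requires.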

Related

connect to REST endpoint using OAuth2

I am trying to explore different options to connect to a REST endpoint using Azure Data Factory. I have the Python code below, which does what I am looking for, but I am not sure whether Azure Data Factory offers something out of the box to connect to the API, or a way to call custom code.
Code:
import logging

from requests_oauthlib import OAuth2Session
from oauthlib.oauth2 import BackendApplicationClient

logging.captureWarnings(True)

api_url = "https://webapi.com/api/v1/data"
client_id = 'client'
client_secret = 'secret'

# Client-credentials ("backend application") flow: fetch the token first
client = BackendApplicationClient(client_id=client_id)
oauth = OAuth2Session(client=client)
token = oauth.fetch_token(token_url='https://webapi.com/connect/accesstoken',
                          client_id=client_id, client_secret=client_secret)

# Then call the API with the token attached to each request
client = OAuth2Session(client_id, token=token)
response = client.get(api_url)
data = response.json()
When I look at the REST linked service, I don't see many authentication options.
Could you please point me to which activities to use to make OAuth2 work in Azure Data Factory?
You would have to use a Web Activity to make a POST call and get the authentication token before getting data from the API.
Here is an example.
First, create a Web Activity.
Set the URL to the authentication endpoint that returns the token.
Set Method to POST.
Create header > Name: Content-Type Value: application/x-www-form-urlencoded
Configure request body for HTTP request.
Format: grant_type=refresh_token&client_id={client_id}&client_secret={client_secret}&refresh_token={refresh_token}
Example: grant_type=refresh_token&client_id=HsdO3t5xxxxxxxxx0VBsbGYb&client_secret=t0_0CqU8oA5snIOKyT8gWxxxxxxxxxYhsQ-S1XfAIYaEYrpB&refresh_token={refresh_token}
The values above are examples; replace them with your own ID and secret when you try this.
As output from this Web Activity you will receive a JSON string, from which you can extract the access_token for use in the request headers of further activities (for example, a REST linked service) in the pipeline, depending on your need.
You can get the access_token as below; I have assigned it to a variable for simplicity.
@activity('GetOauth2 token').output.access_token
Here is an example from the official MS docs of an OAuth authentication implementation for copying data.
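Under the hood, that Web Activity is equivalent to the following Python sketch (the token URL, credentials, and response field names are assumptions based on the example above):

import requests

token_url = "https://webapi.com/connect/accesstoken"  # placeholder
payload = {
    "grant_type": "refresh_token",
    "client_id": "your_client_id",          # placeholder
    "client_secret": "your_client_secret",  # placeholder
    "refresh_token": "your_refresh_token",  # placeholder
}

# POST with Content-Type application/x-www-form-urlencoded (requests'
# default for the data= argument), as configured in the Web Activity
resp = requests.post(token_url, data=payload)
access_token = resp.json()["access_token"]

# Use the token in the Authorization header of subsequent calls
data = requests.get("https://webapi.com/api/v1/data",
                    headers={"Authorization": f"Bearer {access_token}"}).json()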

What is the appropriate way to build JSON within a Data Factory Pipeline

In my earlier post, "SQL Server complains about invalid json", I was advised to use an 'appropriate method' for building a JSON string, which is to be inserted into a SQL Server table for logging purposes. In that post I was using string concatenation to build the JSON string.
What are the appropriate tools/functions to build JSON within a Data Factory pipeline? I've looked into the json() and string() functions, but they would still rely on concatenation.
Clarification: I'm trying to generate a logging message that looks like the example below. Right now I'm using string concatenation to generate the logging JSON. Is there a better, more elegant (but lightweight) way to generate the JSON data?
{ "EventType": "DataFactoryPipelineRunActivity",
"DataFactoryName":"fa603ea7-f1bd-48c0-a690-73b92d12176c",
"DataFactoryPipelineName":"Import Blob Storage Account Key CSV file into generic SQL table using Data Flow Activity Logging to Target SQL Server",
"DataFactoryPipelineActivityName":"Copy Generic CSV Source to Generic SQL Sink",
"DataFactoryPipelineActivityOutput":"{runStatus:{computeAcquisitionDuration:316446,dsl: source() ~> ReadFromCSVInBlobStorage ReadFromCSVInBlobStorage derive() ~> EnrichWithDataFactoryMetadata EnrichWithDataFactoryMetadata sink() ~> WriteToTargetSqlTable,profile:{ReadFromCSVInBlobStorage:{computed:[],lineage:{},dropped:0,drifted:1,newer:1,total:1,updated:0},EnrichWithDataFactoryMetadata:{computed:[],lineage:{},dropped:0,drifted:1,newer:6,total:7,updated:0},WriteToTargetSqlTable:{computed:[],lineage:{__DataFactoryPipelineName:{mapped:false,from:[{source:EnrichWithDataFactoryMetadata,columns:[__DataFactoryPipelineName]}]},__DataFactoryPipelineRunId:{mapped:false,from:[{source:EnrichWithDataFactoryMetadata,columns:[__DataFactoryPipelineRunId]}]},id:{mapped:true,from:[{source:ReadFromCSVInBlobStorage,columns:[id]}]},__InsertDateTimeUTC:{mapped:false,from:[{source:EnrichWithDataFactoryMetadata,columns:[__InsertDateTimeUTC]}]},__DataFactoryName:{mapped:false,from:[{source:EnrichWithDataFactoryMetadata,columns:[__DataFactoryName]}]},__FileName:{mapped:false,from:[{source:EnrichWithDataFactoryMetadata,columns:[__FileName]}]},__StorageAccountName:{mapped:false,from:[{source:EnrichWithDataFactoryMetadata,columns:[__StorageAccountName]}]}},dropped:0,drifted:1,newer:0,total:7,updated:7}},metrics:{WriteToTargetSqlTable:{rowsWritten:4,sinkProcessingTime:1436,sources:{ReadFromCSVInBlobStorage:{rowsRead:4}},stages:[{stage:3,partitionTimes:[621],bytesWritten:0,bytesRead:24,streams:{WriteToTargetSqlTable:{type:sink,count:4,partitionCounts:[4],cached:false},EnrichWithDataFactoryMetadata:{type:derive,count:4,partitionCounts:[4],cached:false},ReadFromCSVInBlobStorage:{type:source,count:4,partitionCounts:[4],cached:false}},target:WriteToTargetSqlTable,time:811}]}}},effectiveIntegrationRuntime:DefaultIntegrationRuntime (East US)}",
"DataFactoryPipelineRunID":"63759585-4acb-48af-8536-ae953efdbbb0",
"DataFactoryPipelineTriggerName":"Manual",
"DataFactoryPipelineTriggerType":"Manual",
"DataFactoryPipelineTriggerTime":"2019-11-05T15:27:44.1568581Z",
"Parameters":{
"StorageAccountName":"fa603ea7",
"FileName":"0030_SourceData1.csv",
"TargetSQLServerName":"5a128a64-659d-4481-9440-4f377e30358c.database.windows.net",
"TargetSQLDatabaseName":"TargetDatabase",
"TargetSQLUsername":"demoadmin"
},
"InterimValues":{
"SchemaName":"utils",
"TableName":"vw_0030_SourceData1.csv-2019-11-05T15:27:57.643"
}
}
You can use a Data Flow; it helps you build the JSON string within a pipeline in Data Factory.
Here's the Data Flow tutorial: Mapping data flow JSON handling.
It can help you:
Creating JSON structures in Derived Column
Source format options
Hope this helps.
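Outside Data Flows, the 'appropriate method' generally means building a structure and letting a serializer produce the string instead of concatenating. A minimal Python sketch of that idea (field values are placeholders trimmed from the example above):

import json

log_event = {
    "EventType": "DataFactoryPipelineRunActivity",
    "DataFactoryName": "my-data-factory",  # placeholder
    "DataFactoryPipelineRunID": "63759585-4acb-48af-8536-ae953efdbbb0",
    "Parameters": {
        "StorageAccountName": "fa603ea7",
        "FileName": "0030_SourceData1.csv",
    },
}

# json.dumps handles quoting and escaping, so the result is always valid JSON
print(json.dumps(log_event, indent=2))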

QLIK Sense - REST api chain call

I need to integrate data into my Qlik Sense project using a cloud REST API.
I need to call a chain of APIs, as I first need the token.
Basically:
1) "Token" REST call, passing user+password and getting the token
2) "API2" REST call, passing the token received from 1) in the BODY
I think I need to use the data load script feature. I'm able to create the two REST calls separately, but how can I pass the token dynamically in the body?
Is there specific code to be added?
Thx
Find an answer here:
https://community.qlikview.com/thread/224957
Basically, just build the body variable and escape its quotes:
let vRequestBody = '{"call":"ListarCategorias","app_key":"XXXXXXXX","app_secret":"XXXXXXXXXX","param":[{"pagina":"$(vPagina)","registros_por_pagina":100,"apenas_importado_api":"N"}]}';
// Double each " so it survives inside the WITH CONNECTION string
let vRequestBody = replace(vRequestBody,'"', chr(34)&chr(34));
then append WITH CONNECTION(BODY "$(vRequestBody)") to the end of the default "RestConnectorMasterTable" scripting snippet:
RestConnectorMasterTable:
SQL SELECT
"__KEY_root",
(SELECT
"codigo",
"totalizadora",
"transferencia",
"__FK_categoria_cadastro"
FROM "categoria_cadastro" FK "__FK_categoria_cadastro")
FROM JSON (wrap on) "root" PK "__KEY_root"
WITH CONNECTION(BODY "$(vRequestBody)");
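For reference, the same two-step chain sketched in Python (the endpoint URLs and field names are assumptions; adapt them to your API):

import requests

# Step 1: "Token" call, passing user + password and getting the token back
token = requests.post("https://api.example.com/token",
                      json={"user": "me", "password": "secret"}).json()["token"]

# Step 2: "API2" call, passing the token from step 1 in the body
data = requests.post("https://api.example.com/api2",
                     json={"token": token, "param": "value"}).json()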

Invalid_request_parameter (create and sending envelopes)

I'm trying to use a service of the DocuSign API in an ABAP project. I want to send a document to a specific email address so it can be signed, but I'm getting the following error:
"errorCode": "INVALID_REQUEST_PARAMETER", "message": "The request contained at least one invalid parameter. Query parameter 'from_date' must be set to a valid DateTime, or 'envelope_ids' or 'transaction_ids' must be specified."
I tried the following:
CALL METHOD cl_http_client=>create_by_url
EXPORTING
url = l_url  " e.g. 'https://demo.docusign.net/restapi/v2/accounts/XXXXXX'
proxy_host = co_proxy_host
proxy_service = co_proxy_service
IMPORTING
client = lo_http_client.
lo_http_client->request->set_method( method = 'POST' ).
CALL METHOD lo_http_client->request->set_header_field
EXPORTING
name = 'Accept'
value = 'application/json'.
CALL METHOD lo_http_client->request->set_header_field
EXPORTING
name = 'X-DocuSign-Authentication'
value = get_auth_header( ).  " JSON auth header
CALL METHOD lo_http_client->request->set_cdata
EXPORTING
data = create_body( ).
This is my body:
CONCATENATE
`{`
`"emailSubject": "DocuSign REST API Quickstart Sample",`
`"emailBlurb": "Shows how to create and send an envelope from a document.",`
`"recipients": {`
`"signers": [{`
`"email": "test#email",`
`"name": "test",`
`"recipientId": "1",`
`"routingOrder": "1"`
`}]`
`},`
`"documents": [{`
`"documentId": "1",`
`"name": "test.pdf",`
`"documentBase64":` `"` l_encoded_doc `"`
`}],`
`"status": "sent"`
`}` INTO re_data.
The API request to get the base URL is working fine. (I know the error is quite specific about what the problem is, but I can't find anything in the DocuSign API documentation saying that one of the mentioned parameters should be added to this request.)
Thanks in advance.
The error message seems to indicate that you're Posting to an endpoint that requires certain query string parameters -- but you're not specifying them as expected in the query string. I'd suggest you check the DocuSign API documentation for the operation you are using, to determine what query string parameters it requires, and then ensure that you're including those parameters in your request URL.
If you can't figure this out using the documentation, then I'd suggest that you update your post to clarify exactly what URL (endpoint) you are using for the request, including any querystring parameters you're specifying in the URL. You can put fake values for things like Account ID, of course -- we just need to see the endpoint you are calling, and what qs params you're sending.
To create an envelope, use
https://demo.docusign.net/restapi/v2/accounts/XXXXXX/envelopes
instead of
https://demo.docusign.net/restapi/v2/accounts/XXXXXX
Thank you for all the answers, I found the mistake. Creating the request wasn't the problem; I was using the wrong "sending" method.
Now it's working:
lo_rest_client->post( EXPORTING io_entity = lo_request_entity ).
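For comparison, a minimal Python sketch of the corrected envelope-creation call (the account ID, authentication header value, and file name are placeholders; see the DocuSign eSignature REST API docs for the exact contract):

import base64
import requests

account_id = "XXXXXX"  # placeholder
base_url = f"https://demo.docusign.net/restapi/v2/accounts/{account_id}"

# Read and Base64-encode the document to sign
with open("test.pdf", "rb") as f:
    doc_b64 = base64.b64encode(f.read()).decode()

envelope = {
    "emailSubject": "DocuSign REST API Quickstart Sample",
    "recipients": {"signers": [{"email": "test@email", "name": "test",
                                "recipientId": "1", "routingOrder": "1"}]},
    "documents": [{"documentId": "1", "name": "test.pdf",
                   "documentBase64": doc_b64}],
    "status": "sent",
}

# Note the /envelopes suffix: POST to the envelopes collection, not the account root
resp = requests.post(f"{base_url}/envelopes",
                     headers={"X-DocuSign-Authentication": "<json auth header>",
                              "Accept": "application/json"},
                     json=envelope)
print(resp.status_code, resp.json())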

Integrate AngularJS App with SoftwareAG webMethods Integration Server

I have been trying to set up a sample AngularJS app with webMethods Integration Server on the backend. Using $resource, I can easily pull normal JSON files and manipulate the data within the file. However, the goal is that I want to create services in webMethods Designer and call them from AngularJS using $resource to display the data in my app. The problem is that from AngularJS I cannot extract the data I need from the service that I'm creating in Designer. In Designer I can use (in WMPublic) documentToJSONString, and output something like:
jsonString {"id":"1", "name":"Dan", "quantity":"3"}
But I cannot extract the data because this is not a pure JSON string. Does anyone know how to (1) extract the JSON string output data using AngularJS or (2) output a JSON document from Designer? I am calling a REST service; something to the effect of
http://localhost:2222/rest/Get/getOrderData
from my services.js file in AngularJS.
Here is my services.js file:
/* Services */
var orderServices = angular.module('orderServices', ['ngResource']);
orderServices.factory('Order', ['$resource',
function($resource){
return $resource('http://localhost:2222/rest/REST/getOrderData', {}, {
query: {method:'GET', isArray:true}
});
}]);
Then, in my app, I want to use an ng-repeat to render things like {{order.id}}, {{order.name}}, etc. Is anyone good with webMethods and Angular, or has anyone done this before?
To force the response that you want, I would have used the service pub.flow:setResponse, mapping the jsonString to its string parameter, and probably hardcoded (eww!) the contentType parameter to 'application/json'.
You may also need to use the service pub.flow:setResponseCode to set the response code.
They would be the last services in getOrderData.
I would have invoked it using the below (where namespace is the folder structure in designer)
http://localhost:2222/invoke/namespace:getOrderData
The above applies to Integration Server v8, and it looks like you're using v9, since some of the services you mention didn't exist in v8. It would also apply to a normal flow service, not a specific REST one (assuming those exist in v9).
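Before wiring Angular to the service, it can help to confirm that the endpoint really returns pure JSON with the right content type. A small Python sketch, assuming the /invoke URL pattern above and default Integration Server credentials (both placeholders):

import requests

resp = requests.get("http://localhost:2222/invoke/namespace:getOrderData",
                    auth=("Administrator", "manage"))  # placeholder credentials

print(resp.headers.get("Content-Type"))  # should be application/json
orders = resp.json()  # raises ValueError if the body is not pure JSON
print(orders)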