I want to publish data to a Service Bus from my Storage Account.
I already tried sending a simple body and it works fine, but I don't know how I should set a dataset.
Web Activity Setting
When I run this activity in a pipeline, it sends
{
    "myMessage": "Sample",
    "datasets": [{
        "name": "MyDataset",
        "properties": {
            ...
        }
    }],
    "linkedServices": [{
        "name": "MyStorageLinkedService1",
        "properties": {
            ...
        }
    }]
}
and I want to send the data from the file in the dataset. Does anyone know how I should set up the Web activity?
You can achieve that by using "Copy Activity".
Here is a quick demo that I made:
I used the JSONPlaceholder API; I want to modify the array and add a custom value by doing a PUT request.
Check it out here: https://jsonplaceholder.typicode.com/guide/
Please read the "Updating a resource" section carefully.
Here is the JSON that I want to modify; I added it as a dataset in ADF.
The main idea is to set the dataset as the source and the REST API as the sink, so we are sending the dataset as the input to the REST request in the Copy activity.
Copy activity:
Source:
Sink:
You can read more about it here:
https://learn.microsoft.com/en-us/azure/data-factory/connector-rest?tabs=data-factory#dataset-properties
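For reference, a REST dataset used as the sink generally follows the shape sketched below (the dataset and linked service names here are placeholders, and relativeUrl depends on your API):
{
    "name": "RESTDataset",
    "properties": {
        "type": "RestResource",
        "linkedServiceName": {
            "referenceName": "RESTLinkedService",
            "type": "LinkedServiceReference"
        },
        "typeProperties": {
            "relativeUrl": "posts/1"
        }
    }
}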
Here is the output of the Copy Activity:
I have an ADF pipeline, and I want to send the output of my activity as an email attachment through a Logic App.
I have a Lookup activity followed by a ForEach activity, and inside the ForEach activity I have a Web activity to call the Logic App.
I want to send the output of the Lookup activity as an email attachment to the Logic App, but I can't figure out this integration part.
Create a Logic App with an HTTP request trigger and Outlook.
Inside "When a HTTP request is received":
Copy the HTTP POST URL.
Request Body JSON Schema:
{ "properties": { "dataFactoryName": { "type": "string" }, "message": { "type": "string" }, "pipelineName": { "type": "string" }, "receiver": { "type": "string" } }, "type": "object" }
Method: POST
Send an Email
Connect your Outlook email.
Use the HTTP POST URL as shown in step 1.
Create a parameter named receiver.
Add this dynamic content:
{
    "message": "This is the row from lookup item #{item().customerID},#{item().gender},#{item().SeniorCitizen},#{item().Partner}.",
    "dataFactoryName": "#{pipeline().DataFactory}",
    "pipelineName": "#{pipeline().Pipeline}",
    "receiver": "#{pipeline().parameters.receiver}"
}
The pipeline executed successfully and I got the output:
There is no direct or easy way to send an email attachment from ADF.
But as a workaround, first save the output of your Lookup activity to a file, and then follow the approach described in this video by a community volunteer, where Logic Apps come into play to send the Lookup activity output file as an attachment: How To Send File as Attachment From Azure Data Factory - Azure Data Factory Tutorial 2021
To save the Lookup output data to a file, you can follow this approach: Get Output of lookup activity in a file
I have an Azure Data Factory pipeline and I would like to send a notification to Slack at the end of the pipeline. The notification body is built from database content.
What is the best practice to integrate Azure Data Factory and Slack?
A) ADF Webhook (to Slack) -> Slack
B) ADF Web -> Logic App Web + Webhook -> Slack
Below is one approach that worked for me.
First, I created two variables:
ListOfFiles - Array
strListOfFiles - String
Here is the pipeline that I'm using:
ForEach loop Activities:
In the ForEach settings, I set Items to read the child items, i.e.
@activity('Get List of Files').output.childItems
Then in the Set Variable activity I'm storing the ListOfFiles array inside strListOfFiles.
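The exact Set Variable expression isn't shown in the screenshots; one way to do it (an assumption on my part) is to convert the array variable to a string:
@string(variables('ListOfFiles'))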
And then in the Web activity I'm using my Logic App URL with the POST method, with { "ListOfFiles": #{variables('strListOfFiles')} } inside the body.
Logic App workflow
I'm using the below JSON schema inside the HTTP request trigger:
{
    "properties": {
        "ListOfFiles": {
            "type": "array"
        }
    },
    "type": "object"
}
I have a Node app that uses the Watson Conversation service. I am able to successfully trigger a call to another API via a dialog node using the JSON that it uses for the reply. However, after reading up, it seems I am doing it wrong. I am triggering my client server to make a REST call by adding an action property to the context.
{
    "context": {
        "action": "lookup"
    },
    "output": {}
}
When I get my result, I add it onto the context object and pass it back to the Conversation service. This seems to work okay, but it causes some issues:
1) I have to manually delete these properties after I trigger the thing I want.
2) In Conversation I must wait for user input, even though I am not actually requesting user input on the front end; rather, my client app is sending a message with no input text and the results of the REST call on the context object. This message, which is returned to the conversation at the node where the action was triggered, is what triggers the child nodes. It seems there is a standardized way IBM wants you to make these programmatic calls, regardless of whether it's to an IBM Cloud function or your own client app: https://console.bluemix.net/docs/services/conversation/dialog-actions.html#dialog-actions
The method from the docs:
{
    "context": {
        "variable_name": "variable_value"
    },
    "actions": [
        {
            "name": "<actionName>",
            "type": "client | server",
            "parameters": {
                "<parameter_name>": "<parameter_value>",
                "<parameter_name>": "<parameter_value>"
            },
            "result_variable": "<result_variable_name>",
            "credentials": "<reference_to_credentials>"
        }
    ],
    "output": {
        "text": "response text"
    }
}
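If I understand the docs correctly, my lookup would become a client action along these lines (the parameter and result variable names are placeholders I made up):
{
    "context": {},
    "actions": [
        {
            "name": "lookup",
            "type": "client",
            "parameters": {
                "query": "<value_to_look_up>"
            },
            "result_variable": "lookup_result"
        }
    ],
    "output": {
        "text": "Let me look that up."
    }
}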
Is this a new feature? I was referencing sample projects for my own app and I didn't see this pattern. By using this format, will it tell the parent node to wait for a response to come back before trying to process the children? And will it prevent me from needing to delete properties off the context object, so that I'm not calling the same action over and over with the same parameters in further turns of the conversation?
I would like to create a Stream Analytics job using only PowerShell. I know that the command to do this is New-AzureRMStreamAnalyticsInput; however, it requires a JSON file with the job details. I found documentation provided by Microsoft where there is a small template of such a JSON file (see the Create paragraph), but it's not enough for me.
I want to create an input from Blob storage, hence my JSON looks like this:
{
    "properties": {
        "type": "stream",
        "datasource": {
            "type": "Microsoft.Storage/Blob",
            "properties": {
                "accountName": "abc",
                "accountKey": "###",
                "container": "appinsights",
                "pathPattern": "test-blob_2324jklj/PageViews/{date}/{time}",
                "dateFormat": "YYYY-MM-DD",
                "timeFormat": "HH"
            }
        }
    }
}
After saving it and passing it as an argument to New-AzureRMStreamAnalyticsInput, I receive the following error: "New-AzureRMStreamAnalyticsInput : Stream analytics input name cannot be null." I think my JSON file is not correct.
Do you have any templates of JSON files containing Stream Analytics job details, or can you just tell me how to correctly set up a job through PowerShell?
A simple way of getting your template right is to manually create an input from the Portal and then run the PowerShell command Get-AzureRmStreamAnalyticsInput to get the JSON payload.
From your example, it seems you missed the input name. Try something like below:
{
    "Name": "BlobInput1",
    "Properties": {
        ... ...
    }
}
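Putting that together with the blob datasource from your question, the full payload would look roughly like this (a sketch only, reusing your values):
{
    "Name": "BlobInput1",
    "Properties": {
        "type": "stream",
        "datasource": {
            "type": "Microsoft.Storage/Blob",
            "properties": {
                "accountName": "abc",
                "accountKey": "###",
                "container": "appinsights",
                "pathPattern": "test-blob_2324jklj/PageViews/{date}/{time}",
                "dateFormat": "YYYY-MM-DD",
                "timeFormat": "HH"
            }
        }
    }
}
Depending on your setup, the input may also need a serialization section (e.g. JSON encoding); comparing against the Get-AzureRmStreamAnalyticsInput output of a Portal-created input will show the exact shape expected.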
I am trying to do a performance test with a REST web service. I added an HTTP Request, Header Manager, HTTP Request Defaults, View Results Tree, and a CSV Data Set Config.
This is how I parameterized my POST message:
{ "groupList": [ {"group":"newgroup"} ], "user": "${user}",
"password": "test", "email": "test#go2group.com", "role": "USER", "ui": false}
This is my CSV Data Set Config:
The problem I face is that "<EOF>" is being substituted instead of the values.
POST http://cawin.go2group.com/ConnectAll/rest/useradmin/user
POST data:
{ "groupList": [ {"group":"newgroup"} ], "user": "<EOF>",
"password": "test", "email": "test#go2group.com", "role": "USER", "ui": false}
Kindly excuse my ignorance.
The most common problems are:
Using an incorrect relative path to the CSV file
Using an incorrect value for the "Recycle on EOF" dropdown
Incorrect placement of the CSV Data Set Config element (the sampler is out of its scope)
In the majority of cases the answer is either in the jmeter.log file or in the Debug Sampler / View Results Tree listener combination.
See Using CSV DATA SET CONFIG for detailed end-to-end instructions. If you still experience problems, please update your question with the first few lines of the CSV file and a Test Plan screenshot showing the CSV Data Set Config location and configuration.
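For reference, a minimal working setup might look like this (the file name and values are only examples, assuming the CSV sits next to the .jmx file):
users.csv:
user1
user2
user3
CSV Data Set Config:
Filename: users.csv
Variable Names (comma-delimited): user
Recycle on EOF?: True
Stop thread on EOF?: False
Sharing mode: All threads
With that in place, each iteration reads the next line into ${user}.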