Azure Data Factory and Log Analytics

I want to perform some validation checks in ADF on my input data and capture any validation failures in Azure Log Analytics.
Can someone guide me on how to capture custom logs in Log Analytics through Azure Data Factory, please?
Any example dataflow/pipeline would be very helpful.
Thanks,
Kumar

If I understand correctly, you want to be able to get the Azure Monitor logs for ADF and query/store those logs?
Well, the good news is that most of the information you would want to see is already collected through Azure Monitor.
One of the simplest methods to pull the information is the Azure Monitor REST API. You can then store the response in a file or table, or just query the API for specific pipelines, triggers, etc.
Here is a link with example of Authorization and using the Azure Monitor API:
https://learn.microsoft.com/en-us/azure/azure-monitor/essentials/rest-api-walkthrough#authenticating-azure-monitor-requests
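As a sketch of that authorization step in Python (the tenant ID, app ID, and secret below are placeholders; the flow mirrors the client-credentials example in the walkthrough):

import requests

# Placeholders - supply your own service principal details
tenant_id = "<tenant-id>"
client_id = "<app-id>"
client_secret = "<app-secret>"

# OAuth2 client-credentials flow against Azure AD for the management plane
token_response = requests.post(
    f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
    data={
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "resource": "https://management.azure.com/",
    },
)
access_token = token_response.json()["access_token"]
# Pass the token as a Bearer header on subsequent management REST calls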
This is an example of the HTTP URL using the Azure Monitor REST API to get Activity Run data (Dynamic content syntax):
@{concat('https://management.azure.com/subscriptions/', linkedService().SubscriptionID, '/resourceGroups/', linkedService().ResourceGroupName, '/providers/Microsoft.DataFactory/factories/', linkedService().DataFactoryName, '/pipelineruns/', linkedService().RunID, '/queryActivityruns?api-version=2018-06-01')}
Here are all the different ADF Metrics that can be pulled from Azure Monitor:
https://learn.microsoft.com/en-us/azure/data-factory/monitor-using-azure-monitor#data-factory-metrics
You can create a REST linked service in ADF that can be used to call the REST API.
You could then create a dataset that passes all the values (subscription, resource group, factory name, run ID) to the linked service so that you can call the API and copy the response to a DB.
This particular example gets the status of a specific pipeline run ID, but it can be much broader than that.
Here is roughly what the request body looks like so that I could filter to just the failed pipelines in the last day:
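(The shape below follows the documented RunFilterParameters schema for the queryPipelineRuns endpoint; the timestamp values are placeholders standing in for "the last day".)

{
    "lastUpdatedAfter": "2021-06-01T00:00:00Z",
    "lastUpdatedBefore": "2021-06-02T00:00:00Z",
    "filters": [
        {
            "operand": "Status",
            "operator": "Equals",
            "values": [ "Failed" ]
        }
    ]
}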

I'm looking into this myself; as far as I can tell, you would have to use a REST or HTTP connector to send a POST request to the HTTP Data Collector API in Log Analytics. More details here: https://learn.microsoft.com/en-gb/azure/azure-monitor/logs/data-collector-api
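For reference, here is a minimal Python sketch of the documented signing scheme for that API (workspace ID, key, log type, and the payload fields below are placeholders):

import base64
import datetime
import hashlib
import hmac
import json

import requests

customer_id = "<workspace-id>"   # Log Analytics workspace ID (placeholder)
shared_key = "<primary-key>"     # workspace primary key (placeholder)
log_type = "AdfValidation"       # surfaces in Log Analytics as AdfValidation_CL

body = json.dumps([{"pipeline": "pl_demo", "check": "NotNull", "result": "Failed"}])

# Build the SharedKey signature exactly as described in the Data Collector API doc
rfc1123_date = datetime.datetime.utcnow().strftime("%a, %d %b %Y %H:%M:%S GMT")
string_to_hash = (
    f"POST\n{len(body)}\napplication/json\nx-ms-date:{rfc1123_date}\n/api/logs"
)
signature = base64.b64encode(
    hmac.new(base64.b64decode(shared_key), string_to_hash.encode("utf-8"),
             hashlib.sha256).digest()
).decode()

response = requests.post(
    f"https://{customer_id}.ods.opinsights.azure.com/api/logs?api-version=2016-04-01",
    data=body,
    headers={
        "Content-Type": "application/json",
        "Authorization": f"SharedKey {customer_id}:{signature}",
        "Log-Type": log_type,
        "x-ms-date": rfc1123_date,
    },
)
print(response.status_code)  # 200 means the records were accepted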

Related

How to extract Jira test results data using Azure Data Factory?

I am facing an issue extracting project test results data from Jira using the API in Azure Data Factory.
Please help me with this.
I don't have experience using the Xray API, but I have made other API calls via ADF. Based on the Xray documentation, you would need to make two API calls in ADF:
1. Get the access token / API key
2. Use the access token to get the test results
Theoretically it would look something like this using these links:
https://docs.getxray.app/display/XRAYCLOUD/Authentication+-+REST
https://docs.getxray.app/display/XRAY/Tests+-+REST#TestsREST-GettingallTestsstatuses
Step 1: Get the access token (see the first link above).
Step 2: Use the access token to get the test results (see the second link above).
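Untested, but outside ADF the same two calls would look roughly like this in Python (the base URL and the results endpoint are taken loosely from the linked docs and may differ for your Xray deployment):

import requests

# Step 1: exchange the API key pair for a token (per the Xray Cloud auth doc)
auth_response = requests.post(
    "https://xray.cloud.getxray.app/api/v2/authenticate",
    json={"client_id": "<client-id>", "client_secret": "<client-secret>"},
)
token = auth_response.json()  # the endpoint returns the token as a JSON string

# Step 2: call a test-results endpoint with the bearer token
# ("<results-endpoint>" is a placeholder; see the second link for real paths)
results = requests.get(
    "https://xray.cloud.getxray.app/api/v2/<results-endpoint>",
    headers={"Authorization": f"Bearer {token}"},
)
print(results.status_code, results.text[:200])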
This should be pretty close to what you need; I am unable to test it since I don't use Xray, but I hope it helps.

How to copy data from a REST API using OAuth to Azure Blob Storage as JSON using ADF

I am having difficulty creating an ADF pipeline that pulls data from a REST API using OAuth and writes it to Azure Blob Storage in JSON format. I am a Python developer and relatively new to ADF. I will explain in more detail below.
I need to obtain an access token by posting a request to the token endpoint. This step works fine.
I need to pull the data from the API using a GET request in a copy activity, and I need to include pagination. Is there something wrong with the way I define the pagination rules? "@odata.nextLink" is the full name of the JSON key; is it spelled correctly in the value section? (See the pagination sketch after this question.)
I need to copy the data to blob storage in JSON format. I have created the sink dataset.
Upon running the pipeline, it asks for two pipeline parameters; what am I supposed to enter here?
I then get an error; what am I doing wrong?
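For reference, with the ADF REST connector a pagination rule that follows an OData-style continuation link is typically given as a JSONPath value; bracket notation is used because the key contains "@" and "." (a sketch, not verified against this particular API):

"paginationRules": {
    "AbsoluteUrl": "$['@odata.nextLink']"
}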

How to explicitly fail an Azure Data Factory pipeline?

Is there any method to explicitly fail an Azure Data Factory pipeline?
If you would like to fail your pipeline explicitly, one possible way is to put an invalid URL in a Web activity: the Web activity will fail, which in turn will cause your pipeline to fail.
There is an existing feature request for this in the ADF user voice forum, suggested by other ADF users. I would recommend you up-vote and/or comment on that feedback, which will help increase the priority of the feature request.
ADF User voice feedback related to this requirement: https://feedback.azure.com/forums/270578-data-factory/suggestions/38143873-a-new-activity-for-cancelling-the-pipeline-executi
Additional info:
In case you just want to cancel your pipeline run, you can have a Web activity which calls the REST API below to cancel the pipeline run by its run ID (you can get this value with the dynamic expression @pipeline().RunId).
REST API to Cancel the Pipeline Run: POST https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DataFactory/factories/{factoryName}/pipelineruns/{runId}/cancel?api-version=2018-06-01
MS doc related to the REST API: ADF Pipeline Runs - Cancel
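Assembled with dynamic content in the style of the earlier example (the pipeline parameters here are hypothetical; only pipeline().RunId is built in), the Web activity URL would look something like:

@{concat('https://management.azure.com/subscriptions/', pipeline().parameters.subscriptionId, '/resourceGroups/', pipeline().parameters.resourceGroupName, '/providers/Microsoft.DataFactory/factories/', pipeline().parameters.dataFactoryName, '/pipelineruns/', pipeline().RunId, '/cancel?api-version=2018-06-01')}

The Web activity also needs to authenticate against management.azure.com, for example with the factory's managed identity.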
Hope this helps.
Just set a variable with @string(div(1, 0)); the divide-by-zero makes the Set Variable activity fail at runtime, which fails the pipeline.
It depends at which level you would like to stop the operation. For example, for an MSSQL DB operation, from a stored procedure you may RAISERROR above severity level 18 to raise an exception. As already stated, in the case of a Web activity, requesting a non-existent URL would also stop the process.

Azure Custom Connector for 3rd Party API with OAuth 2.0 access token

I am trying to get data from a 3rd party API into an Azure SQL DB using Azure Data Factory, without using SSIS.
This led me down a rabbit hole; I have been searching for 3 days now and cannot find a solution. I must be missing something.
I have tried using Azure Data Factory and the copy data controls.
I then tried Power Apps and can't find anything that helps.
I then tried a custom connector: from scratch, from Postman, and from OpenAPI.
I cannot get any of it to work!
I really thought this would be easier than this.
I have read almost all of the standard Microsoft documentation and none of it helps with my specific scenario.
I have a third party web site that I get an authorisation token from using a username and password with grant_type=password.
Using this token I then get JSON data from the site.
I want to get this data into my SQL DB in Azure.
That's it!
Any help would be greatly appreciated.
Thanks...
PS: The next step is the same thing but the API returns XML. Rabbit hole 2...
Here is an official guide that imports data from Azure Storage to Azure SQL Database using a pipeline.
I think this guide will be helpful for you. Before you follow it to meet your requirement, you should import your JSON data from the 3rd party API first. You can use an Azure Logic App to finish this process quickly.
I have finished this process; it is easy and takes just a few steps:
1. Get an access token from the 3rd party identity provider (here I use Azure AD as the identity provider, just for demo) using an HTTP request action.
2. Generally, we get an access token from step 1; use this auth info to call your 3rd party API directly to fetch the JSON data, again using an HTTP request action.
The API I am using is a sample demo one that just replies with JSON:
{
"name": "stan",
"id": "123"
}
3. Use the Create blob action to create a blob file with the JSON content from the previous step (you should create a storage account with a blob container first, so that you can store files; here I name the JSON file with a GUID).
4. Run this Logic App and you can see that a JSON file has been created in the Azure storage account with the JSON data from the API.
For the remaining steps, you can just refer to the official guide to import the data into your SQL database, selecting JSON as the source format in the Configure source section.
If anything is unclear, please feel free to let me know. Hope it helps :)
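If you would rather prototype the two calls from the question in code first, here is a rough Python sketch of the same flow (all endpoints and field names below are placeholders for the 3rd party API):

import requests

# Step 1: get the token with grant_type=password (placeholder endpoint/fields)
token_response = requests.post(
    "https://thirdparty.example.com/oauth/token",
    data={
        "grant_type": "password",
        "username": "<username>",
        "password": "<password>",
        "client_id": "<client-id>",
    },
)
access_token = token_response.json()["access_token"]

# Step 2: fetch the JSON data with the bearer token (placeholder URL)
data_response = requests.get(
    "https://thirdparty.example.com/api/v1/data",
    headers={"Authorization": f"Bearer {access_token}"},
)
print(data_response.json())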

Azure Media Services Encoding Job Callback to URL

Using only the REST API, I am able to upload a file to Azure Media Services from my local machine and start an encoding job. Then I need to poll the job for status to see when it is done. But what I really want is for Azure Media Services to send a request to my callback URL when it is done. Is there a way to do this?
Take a look at our Notifications feature, which supports WebHooks.
https://learn.microsoft.com/en-us/azure/media-services/media-services-dotnet-check-job-progress-with-webhooks
It also integrates well with Azure Functions, if you want to host your callback in Azure Functions and just leverage the WebHook trigger there.
We have some examples of doing that up here:
https://github.com/Azure-Samples/media-services-dotnet-functions-integration/tree/master/101-notify-webhooks
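As a generic illustration of the callback side (this is not the sample's signature-verifying handler, just a bare receiver using only the Python standard library):

from http.server import BaseHTTPRequestHandler, HTTPServer
import json

class JobCallbackHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read and log the notification payload posted to the callback URL
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")
        print("Job notification received:", payload)
        self.send_response(200)  # acknowledge receipt
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), JobCallbackHandler).serve_forever()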
