Azure Data Factory API - azure-data-factory

I have constructed a very simple pipeline to test the concept of a pipeline monitoring itself using the REST API, but I am constantly running into 404 Resource not found errors.
Here is the general format of my GET request:
https://management.azure.com/subscriptions/MY_SUB/resourceGroups/MY_RG/providers/Microsoft.DataFactory/factories/#{pipeline().DataFactory}/pipelineruns/#{pipeline().RunId}?api-version=2018-06-01
Even the "Try it" functionality on learn.microsoft.com, using hard-coded known values, also returns a 404.
The run ID generated by #{pipeline().RunId} matches the one shown in the Monitor view.

Issue solved.
In ADF you cannot query the REST API for run IDs that come from debug runs.
The run ID must be from a 'triggered' pipeline run.
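For anyone hitting the same 404: here is a minimal sketch of the same lookup made from outside ADF against a triggered run (Python with the requests library; the subscription, resource group, factory name, run ID and bearer token are all placeholders you would fill in yourself):

    import requests

    # All values below are placeholders; the run ID must come from a triggered run, not a debug run.
    subscription = "MY_SUB"
    resource_group = "MY_RG"
    factory = "MY_FACTORY"
    run_id = "TRIGGERED_RUN_ID"
    token = "BEARER_TOKEN_FROM_AZURE_AD"  # e.g. from `az account get-access-token`

    url = (
        f"https://management.azure.com/subscriptions/{subscription}"
        f"/resourceGroups/{resource_group}"
        f"/providers/Microsoft.DataFactory/factories/{factory}"
        f"/pipelineruns/{run_id}?api-version=2018-06-01"
    )

    resp = requests.get(url, headers={"Authorization": f"Bearer {token}"})
    resp.raise_for_status()           # a debug-run ID surfaces here as a 404
    print(resp.json().get("status"))  # e.g. "InProgress", "Succeeded", "Failed"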

Related

List multi-stage automated test run results, on a per-stage basis, using Azure DevOps GET API?

Is there a way to list multi-stage automated test run results, on a per-stage basis, using the Azure DevOps GET API?
If you call /_apis/pipelines/{pl_def_id}/runs, you will get a list of builds from a defined pipeline.
If you get the latest build from that list, and that build is multi-stage, the test runs do not seem to sit hierarchically beneath the latest build, as you might expect.
To get the test runs you call /_apis/test/runs ( https://learn.microsoft.com/en-us/rest/api/azure/devops/test/runs/list ), but it can only give you the list of ALL runs within your org+project. There does not seem to be an ID that references the multi-stage "test run" tasks back to the "parent pipeline" with its "build id".
I looked at timeline queries, but they don't seem to have that info either.
How can this be achieved with the ADO GET API? Am I overlooking the answer?
There is a buildUri parameter in the Run List API you mentioned in the doc: Run List
You can use buildUri to get the test runs from a specific build of a pipeline.
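A rough sketch of that approach (Python with requests; the organization, project, PAT, pipeline definition ID and the vstfs:///Build/Build/{id} form of buildUri are assumptions taken from the public docs, so verify them against your own instance):

    import requests

    org, project = "MY_ORG", "MY_PROJECT"   # placeholders
    pat = "PERSONAL_ACCESS_TOKEN"            # placeholder
    auth = ("", pat)                         # basic auth with an empty username and the PAT
    base = f"https://dev.azure.com/{org}/{project}"

    # 1. Get the runs of the pipeline definition (123 is a placeholder definition id),
    #    assuming the newest run is listed first.
    runs = requests.get(
        f"{base}/_apis/pipelines/123/runs?api-version=6.0-preview.1",
        auth=auth,
    ).json()["value"]
    latest_build_id = runs[0]["id"]

    # 2. List test runs filtered to that build via buildUri.
    test_runs = requests.get(
        f"{base}/_apis/test/runs?buildUri=vstfs:///Build/Build/{latest_build_id}&api-version=6.0",
        auth=auth,
    ).json()["value"]
    for tr in test_runs:
        print(tr["id"], tr["name"], tr.get("state"))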
However, if you want to get multi-stage automated test run results on a per-stage basis, I'm afraid there is no supported API to achieve it.
You may submit a feature request at the website below:
feature request

Cannot create a batch pipeline to get data from ZohoCRM with HTTP plugin 1.2.1 to BigQuery. Returns Spark Program 'phase-1' failed

This is my first post here; I'm new to Data Fusion and have little to no coding skills.
I want to get data from ZohoCRM into BigQuery, with each ZohoCRM module (e.g. Accounts, Contacts, ...) as a separate table in BigQuery.
To connect to Zoho CRM I obtained a code, token, refresh token and everything else needed as described here: https://www.zoho.com/crm/developer/docs/api/v2/get-records.html. Then I ran a successful get-records request via Postman, which returned the records from the Zoho CRM Accounts module as JSON.
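For context, the get-records request I ran in Postman is essentially the following (a hedged sketch; the www.zohoapis.com host, the Zoho-oauthtoken header and the "data" wrapper come from the linked v2 docs, and the access token is a placeholder):

    import requests

    access_token = "ZOHO_ACCESS_TOKEN"  # placeholder, obtained via the OAuth flow described above

    resp = requests.get(
        "https://www.zohoapis.com/crm/v2/Accounts",  # one module per request
        headers={"Authorization": f"Zoho-oauthtoken {access_token}"},
    )
    resp.raise_for_status()
    records = resp.json().get("data", [])  # Zoho wraps the records in a "data" array
    print(len(records), "records")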
I thought it would all be fine and set the parameters in Data Fusion (DataFusion_settings_1 and DataFusion_settings_2); it validated fine. Then I previewed and ran the pipeline without deploying it. It failed with the following info from the logs (logs_screenshot). I tried manually entering a few fields in the schema when the format was JSON. I tried changing the format to CSV; neither worked. I tried switching Verify HTTPS Trust Certificates on and off. It did not help.
I'd be really thankful for some help. Thanks.
Update, 2020-12-03
I got in touch with a Google Cloud account manager, who then took my question to their engineers, and here is the info:
The HTTP plugin can be used to "fetch Atom or RSS feeds regularly, or to fetch the status of an external system"; it does not seem to be designed for APIs.
At the moment a more suitable tool for data collected via APIs is Dataflow https://cloud.google.com/dataflow
"Google Cloud Dataflow is used as the primary ETL mechanism, extracting the data from the API Endpoints specified by the customer, which is then transformed into the required format and pushed into BigQuery, Cloud Storage and Pub/Sub."
https://www.onixnet.com/insights/gcp-101-an-introduction-to-google-cloud-platform
So in the next weeks I'll be looking at Dataflow.
Can you please attach the complete logs of the preview run? Make sure to redact any PII. Also, what version of CDF are you using? Is the CDF instance private or public?
Thanks and Regards,
Sagar
Did you end up using Dataflow?
I am also experiencing the same issue with the HTTP plugin, but my temporary workaround was to use Cloud Scheduler to periodically trigger a Cloud Function that fetches my data from the API and exports it as JSON to GCS, which Data Fusion can then access.
My solution is of course non-ideal, so I am still looking for a way to use the Data Fusion HTTP plugin. I was able to make it work for sample data from public API endpoints, but for a reason still unknown to me I can't get it to work for my actual API.
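In case it helps, the workaround looks roughly like this as an HTTP-triggered Cloud Function (Python; the API URL, auth header, bucket and object names are placeholders, and this is only a sketch of the approach, not production code):

    import json
    import requests
    from google.cloud import storage

    API_URL = "https://example.com/api/records"  # placeholder endpoint
    BUCKET = "my-staging-bucket"                  # placeholder bucket

    def export_to_gcs(request):
        """HTTP-triggered Cloud Function: fetch records from the API and write them to GCS as JSON."""
        resp = requests.get(API_URL, headers={"Authorization": "Bearer MY_TOKEN"})  # placeholder auth
        resp.raise_for_status()

        bucket = storage.Client().bucket(BUCKET)
        blob = bucket.blob("exports/records.json")
        blob.upload_from_string(json.dumps(resp.json()), content_type="application/json")
        return "ok"

Cloud Scheduler then calls the function on a schedule, and the Data Fusion pipeline reads the exported file from GCS.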

Testing service session management via REST

I need to write a test for a JAX-RS web service that asserts that a certain value is cached in the session (loaded from disk) on the first request of the session.
The testing process does not have access to the tested process. The use case involves invoking services via the REST API.
I can think of several options to proceed with:
Create a REST endpoint just for testing, and query the needed session value there.
Write and then read a log message.
I am aware that I am trying to test an implementation detail via an external API which does not provide contract for this detail, but currently I'm a bit constrained about which processes may be run by the testing infrastructure.
Are there any additional seams to exploit for testing, and what general good practice exists for this scenario?
I just came up with the idea of changing the cached resource on disk and observing whether the behavior changes.
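For the first option, a rough sketch of what the test could look like from the outside (Python with requests; the base URL and the /internal/test/cached-value endpoint are purely hypothetical and would have to be added to the service, ideally only in test deployments):

    import requests

    BASE = "http://localhost:8080/myservice"  # placeholder base URL
    session = requests.Session()               # keeps the session cookie (e.g. JSESSIONID) across calls

    # The first request in the session should trigger the read-from-disk and cache the value.
    session.get(f"{BASE}/some-business-endpoint").raise_for_status()

    # Hypothetical test-only endpoint exposing the value cached in this session.
    cached = session.get(f"{BASE}/internal/test/cached-value").json()
    assert cached["value"] == "expected-value-from-disk"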

How do I retrieve my custom variables from an Atlassian Bamboo build plan via the REST API

I have a Bamboo plan that runs on every commit to a GitHub pull request. That plan has a few custom variables on it, such as Git SHA, GitHub pull request number, etc.
I want to write a script that stops all previous builds (multiple concurrent builds) that have the same pull request number -- same custom variable value.
The reason for this is that if someone makes a quick change to their pull request (comments on the review, etc.), we don't want multiple builds running when only the last one is necessary.
I know it is possible to stop a build with a REST request, but I need a way to get all running builds whose custom variable value = 27 (the pull request number). Once I know this, I can proceed.
At the time of writing, the REST API documentation doesn't list any method of querying the running builds for a particular build variable.
A solution would be to create your own plugin for Bamboo that exposes a REST service that does this query for you, but I don't know which of the Java APIs you would need to use in order to perform that query.
Here is how I solved this:
You can call /rest/api/latest/result/<plankey>-latest?includeAllStates=true&expand=variables where plankey is the key for the specific Bamboo build plan.
You then loop through the results you get back, looking for a lifeCycleState value that is not Finished, and a custom variable with the desired name to see if it matches the PR number you have.
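Roughly, in script form (Python with requests; the Bamboo URL, plan key, credentials and variable name are placeholders, and the exact JSON nesting of the expanded variables may differ, so inspect a real response first):

    import requests

    BASE = "https://bamboo.example.com"  # placeholder Bamboo URL
    PLAN = "PROJ-PLAN"                   # placeholder plan key
    PR_NUMBER = "27"
    auth = ("user", "password")          # placeholder credentials

    url = f"{BASE}/rest/api/latest/result/{PLAN}-latest?includeAllStates=true&expand=variables"
    payload = requests.get(url, headers={"Accept": "application/json"}, auth=auth).json()

    # The nesting below ("results" -> "result", "variables" -> "variable") reflects Bamboo's usual
    # REST payload shape, but verify it against a real response from your server.
    for result in payload["results"]["result"]:
        if result["lifeCycleState"] == "Finished":
            continue
        for var in result.get("variables", {}).get("variable", []):
            if var["name"] == "pull.request.number" and var["value"] == PR_NUMBER:  # variable name is a placeholder
                print("still running:", result.get("buildResultKey"))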

Jenkins Workflow API - stage status

What information are we able to query on a workflow job? Anything regarding a particular build's stage status (succeeded, failed, not yet reached, aborted, etc.)? I see we can interact with the input step using this method, but where can we find what metadata, if anything, can be obtained about our builds?
The REST exported API for builds (…/job/…/…/api/json?tree=…) is not very extensive yet. You can get some information about nodes in the flow graph (steps, and some associated block nodes—the stuff you see in Workflow Steps). It is possible to extract some information about stages from that, albeit not easily. Much more is available from the Java API.
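For what it is worth, the basic JSON API call looks like this (Python with requests; the Jenkins URL, job name and build number are placeholders, and as noted above this payload only carries the overall build status, not per-stage status):

    import requests

    # Placeholders: adjust the Jenkins URL, job name and build number.
    url = "https://jenkins.example.com/job/my-workflow-job/42/api/json"
    build = requests.get(url, params={"tree": "fullDisplayName,building,result"}).json()

    # Overall build status only; stage-level detail is not exposed in this payload.
    print(build.get("fullDisplayName"), build.get("building"), build.get("result"))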