In Azure DevOps, when trying to find all jobs with time-based triggers, is there an alternative to examining them one by one using the UI or CLI?
You can use the REST API to list builds.
GET https://dev.azure.com/{organization}/{project}/_apis/build/builds?reasonFilter=schedule&api-version=6.1-preview.6
After adding the reasonFilter=schedule parameter, you will get all schedule-triggered builds along with related information such as their definition IDs.
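As a minimal PowerShell sketch of that call (the $org, $project and $pat values are placeholders you would supply yourself, and the grouping at the end is just one way to see which pipelines actually have scheduled runs):

```powershell
# List schedule-triggered builds via the REST API and group them by pipeline definition.
$org     = "your-organization"
$project = "your-project"
$pat     = "your-personal-access-token"   # PAT with Build (Read) scope

$headers = @{ Authorization = "Basic " + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat")) }
$url = "https://dev.azure.com/$org/$project/_apis/build/builds?reasonFilter=schedule&api-version=6.1-preview.6"

$builds = Invoke-RestMethod -Uri $url -Headers $headers -Method Get

# Group by definition to see which pipelines have scheduled runs.
$builds.value | Group-Object { $_.definition.id } | ForEach-Object {
    [pscustomobject]@{
        DefinitionId   = $_.Name
        DefinitionName = $_.Group[0].definition.name
        ScheduledRuns  = $_.Count
    }
}
```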
I am using an npm package-based CI/CD approach for ADF (Azure Data Factory). I want to selectively deploy some pipelines and datasets to prod, instead of deploying everything in the repository.
Is there a PowerShell script to which I can pass the list of ADF objects I want to deploy from my CI/CD pipeline?
If there is another way besides PowerShell, please let me know that as well.
As per the official documentation, Data Factory entities depend on each other.
For example, triggers depend on pipelines, and pipelines depend on datasets and other pipelines. Selectively publishing a subset of resources could lead to unexpected behaviors and errors.
On rare occasions when you need selective publishing, consider using a hotfix.
See "Steps to deploy a hotfix" in the documentation for the procedure.
I have 2 projects in Azure DevOps with 1 pipeline in each of them.
I start the pipeline in the first project manually.
I need this pipeline to then start the pipeline in the second project.
I cannot use $(System.AccessToken) in the REST API call because I get an HTTP 401 error, presumably because the pipelines are in different projects.
How can I start the second pipeline? Any automated way is acceptable.
You can use the Trigger Build Task extension; it allows you to trigger a build in a different team project.
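If you prefer not to install an extension, calling the Builds REST API with a personal access token is another option, since a PAT (unlike $(System.AccessToken), which is often limited to the calling project by the job authorization scope setting) can be granted access to the other project. A minimal sketch, assuming a PAT with Build (Read & execute) scope and a known definition ID in the target project; all variable values are placeholders:

```powershell
# Queue a build in a different project via the REST API using a PAT.
$org           = "your-organization"
$targetProject = "second-project"
$definitionId  = 42                               # ID of the pipeline to start in the second project
$pat           = "your-personal-access-token"     # needs Build (Read & execute) scope

$headers = @{ Authorization = "Basic " + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat")) }
$url  = "https://dev.azure.com/$org/$targetProject/_apis/build/builds?api-version=6.0"
$body = @{ definition = @{ id = $definitionId } } | ConvertTo-Json

Invoke-RestMethod -Uri $url -Headers $headers -Method Post -Body $body -ContentType "application/json"
```

Alternatively, if your organization allows it, relaxing the "Limit job authorization scope to current project" setting can make $(System.AccessToken) usable across projects, though a PAT keeps the pipelines decoupled from that setting.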
Can I run a single job to trigger multiple release pipelines in Azure DevOps? We have multiple microservices which are built and deployed separately and are independent of each other, but the business ask now is to treat them as one release and deploy them together, so I am wondering how that can be achieved off a release tag (1.x.x) which gets tagged for every release run.
We do the same thing for the app I work on. We wrote a PowerShell script to kick off the releases using the DevOps REST API. The documentation is here: https://learn.microsoft.com/en-us/rest/api/azure/devops/release/definitions?view=azure-devops-rest-6.0.
You basically pass in a list of release definition IDs and then set the status of the latest release to "inProgress" to start it.
If you want to target a specific release definition, you can use the searchText query parameter to find it by name. We created a variable group with the names of the releases in it and use that to target them.
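As a rough illustration of that flow (a sketch only; the variable names, the release names, and the use of api-version=6.0 are my assumptions), you can look up each definition by name with searchText and then create a release for it via the vsrm endpoint:

```powershell
# Look up release definitions by name and create a release for each one.
$org          = "your-organization"
$project      = "your-project"
$pat          = "your-personal-access-token"      # needs Release (Read, write & execute) scope
$releaseNames = @("service-a-release", "service-b-release")   # e.g. fed from a variable group

$headers = @{ Authorization = "Basic " + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat")) }

foreach ($name in $releaseNames) {
    # Find the definition ID by name.
    $defsUrl = "https://vsrm.dev.azure.com/$org/$project/_apis/release/definitions?searchText=$name&api-version=6.0"
    $definition = (Invoke-RestMethod -Uri $defsUrl -Headers $headers -Method Get).value | Select-Object -First 1

    # Create (start) a new release for that definition.
    $releasesUrl = "https://vsrm.dev.azure.com/$org/$project/_apis/release/releases?api-version=6.0"
    $body = @{ definitionId = $definition.id; description = "Triggered for release tag 1.x.x" } | ConvertTo-Json
    Invoke-RestMethod -Uri $releasesUrl -Headers $headers -Method Post -Body $body -ContentType "application/json"
}
```

If the stages in a definition are set to manual deployment, you would then also update the release environment's status to "inProgress" via the Releases API, as described above.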
We have a custom Azure DevOps extension in order to inject SonarQube pipeline tasks into every definition using the Pipeline Decorator feature. These tasks are a mixture of both pre- and post-tasks.
In YAML-defined pipelines the tasks run perfectly; however, in Classic pipeline definitions only the post-tasks run, although the Classic and YAML pipelines are defined identically (steps, agents, demands, variables, etc.).
As this is a relatively new feature of Azure DevOps, there is a lack of documentation, especially regarding classic pipelines.
Is there something that we could possibly be missing for this to happen?
"Is there something that we could possibly be missing for this to happen?"
This appears to be an issue on our side, and it only affects the SonarCloud/SonarQube prepare task when it is applied through a decorator.
As you know, decorators use a YAML template for the steps to be inserted at the specified location, and on our backend this template file is processed by the YAML template engine.
By design, after you enable pipeline decorators at the organization level, the Initialize job calls a backend class to build the JobContext, which adds the decorator providers to it. The JobContext then uses these providers to fetch contributions and add the pre/post tasks to the job while it is being prepared to run.
However, the Sonar prepare task is not detected by the engine and is therefore never injected into the JobContext. I point to this specific task because, so far, this abnormality only occurs with the prepare task of SonarCloud and SonarQube.
Our team will investigate and work on a fix together with the Sonar team.
For now, there are two workarounds you could consider.
Workaround 1:
As mentioned above, the prepare task is not detected and injected into the JobContext automatically, so the first workaround is to add that task information to the JobContext yourself by adding the prepare task to the agent job.
The disadvantage is that two prepare tasks are loaded: one runs in the pre-job, and the other runs right after it.
Workaround 2:
Use YAML to define your pipeline until we fix this abnormality, so that the missing prepare task does not cause an error.
We will update the status here to let you know once we have any progress.
I have an Azure DevOps pipeline with multiple jobs (macOS, Ubuntu, Windows).
I am able to get the general build status with a badge built on the following URL:
[![Build Status](https://dev.azure.com/{organization}/{project}/_apis/build/status/{pipelineName}?branchName=dev)](https://dev.azure.com/{organization}/{project}/_build/latest?definitionId={definitionId})
Please note the placeholders {organization}, {project}, {pipelineName} and {definitionId}, and the branchName query parameter (dev in this example).
However, I am not able to show a separate badge for a single job/platform in the pipeline. Is that possible?
Azure Pipelines badges show the overall build status of the pipeline, not of a specific job or task.
As a workaround, create another pipeline and use its badge.
This functionality has since been updated: it is now possible to get a status badge at the job level within Azure Pipelines. This can be configured at the top of the status badge page. The only thing you need to do is make sure that the structure of the YAML pipeline matches the stage/job context you want the badge for.
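For reference, the badge page generates a URL in the same shape as the one above; as far as I can tell it appends stageName and jobName query parameters when you pick a specific stage or job (treat the exact parameter names as an assumption and copy the URL the badge page gives you), along the lines of:
[![Build Status](https://dev.azure.com/{organization}/{project}/_apis/build/status/{pipelineName}?branchName=dev&stageName={stageName}&jobName={jobName})](https://dev.azure.com/{organization}/{project}/_build/latest?definitionId={definitionId})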