I am new to Azure pipelines. My team has inherited an existing pipeline, and I am just trying to figure out what these specific settings are doing.
We have a stage with a job with a task:
- task: DownloadBuildArtifacts@0
  inputs:
    buildType: 'specific'
    specificBuildWithTriggering: true
    buildVersionToDownload: 'latestFromBranch'
    branchName: '<path-to-branch>'
There are more settings of course, but I'm really trying to wrap my head around the "buildType: specific" one. The docs say "Download artifacts produced by the current build, or from a specific build." But I'm not sure how we're specifying "specific". Is it the specificBuildWithTriggering setting? Or is it the buildVersionToDownload setting?
Trying to read this (and ignoring "specific"), it looks like we're saying "build with artifacts from the triggering commit, but also get the latest". To me, those could be different, if 2 builds are triggered by rapid commits. Am I reading that right? What happens if there is a conflict?
Finally, the default for buildType is 'current'. To me this sounds like, "just download the current from the branch" ... which to me also sounds like I could skip 3 of those settings, right?
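If that reading is correct, the trimmed-down version I have in mind would look something like this (the artifact name here is just a placeholder):

- task: DownloadBuildArtifacts@0
  inputs:
    buildType: 'current'   # the default, so the three 'specific'-related settings go away
    artifactName: 'drop'   # placeholder
    downloadPath: '$(System.ArtifactsDirectory)'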
Thanks for any clarity the community can provide!
Related
I need to modify an existing YAML pipeline so that it downloads an artifact published from another existing ADO pipeline. The other pipeline is a Classic one in case it matters.
In the current setup, a daily Release pipeline takes the artifact from the Classic pipeline and pushes it to a company repository external to ADO.
Now, the YAML pipeline is only run occasionally, and it is run manually. Currently it downloads the artifact from the external repo to which the Release pipeline pushed. This hasn't been a problem generally. However, a recent issue highlighted that it would be desirable to avoid the delays built into the current approach and essentially just grab the artifact directly from the latest run of the Classic pipeline.
When I set out to do this, I assumed that it would be straightforward, but I seem to have run into a brick wall. The only information I have found describes using DownloadPipelineArtifact@2, but this depends on various IDs, like the pipeline ID and run ID, which it seems are not easily obtainable.
I am pretty new to ADO. I'm not a devops person at all really but I've had this put on my plate. So before I spend too much time on this, am I missing something or is this just something that one should not really be doing in ADO? If it is possible, is there a guide somewhere?
UPDATE
Thanks to a useful answer from daniel-mann, I was able to get this working but there were some quirks that I thought I should mention in the event that they might be useful to anyone else.
When I started adding the DownloadPipelineArtifact@2 task (this was editing directly in ADO in a browser), ADO was suggesting field names that seemed to be different from the documented ones. Possibly these were aliases, but I had a hard time knowing what to trust with respect to the documentation.
I also noticed a Settings "link" had appeared above the first line of the task definition. When I clicked on this it opened up an editor to the right of the page that helped fill in the fields. It provided dropdowns for things like the project and the pipeline ID.
This is what I ended up with:
- task: DownloadPipelineArtifact@2
  displayName: "my task description"
  inputs:
    buildType: 'specific'
    project: <long "UID" string identifying project>
    definition: <numeric id for pipeline>
    buildVersionToDownload: 'latest'
    artifactName: <name of artifact as defined in upstream pipeline>
    targetPath: '$(Pipeline.Workspace)'
Note that the editor tool added a definition field, but apparently this is an alias for pipeline. I am not sure why it thinks this is more helpful.
Unfortunately the above did not work. I saw this error:
##[error]Pipeline Artifact Task is not supported in on-premises. Please use Build Artifact Task instead.
I don't know what caused this - perhaps the ADO setup in my organization? As I understand it, the Build Artifact task is deprecated in favour of the Pipeline Artifact task, but I had no choice but to try it, and this time it did work for me.
This time I used the "Settings" editor from the outset and ended up with this:
- task: DownloadBuildArtifacts@0
  displayName: "my task description"
  inputs:
    buildType: 'specific'
    project: $(System.TeamProjectId)
    pipeline: <numeric ID as above>
    buildVersionToDownload: 'latest'
    downloadType: 'single'
    artifactName: '$(ARTNAME)'
    downloadPath: '$(System.ArtifactsDirectory)'
The fields that I manually edited here were:
- using our own ARTNAME variable, which we define to be the artifact name in one of our variable groups (the relevant variable group is imported into this pipeline);
- using the built-in System.TeamProjectId for the project name. This seemed preferable to having the "UID" string in there. (Though I also found that the normal name string for our project worked here too.)
but this depends on various IDs like the pipeline ID and run ID
Not for your use case.
You said
just grab the artifact directly from the latest run of the Classic pipeline.
In which case, referring to the parameters explained in the documentation,
# Download pipeline artifacts
# Download build and pipeline artifacts
- task: DownloadPipelineArtifact@2
  inputs:
    #source: 'current' # Options: current, specific
    #project: # Required when source == Specific
    #pipeline: # Required when source == Specific
    #preferTriggeringPipeline: false # Optional
    #runVersion: 'latest' # Required when source == Specific. Options: latest, latestFromBranch, specific
    #runBranch: 'refs/heads/master' # Required when source == Specific && RunVersion == LatestFromBranch
    #runId: # Required when source == Specific && RunVersion == Specific
    #tags: # Optional
    #artifact: # Optional
    #patterns: '**' # Optional
    #path: '$(Pipeline.Workspace)'
You would just need to set the project, pipeline, source: specific, and runVersion: latest parameters.
Or you could use the download alias, which is a little bit simpler but can achieve the same thing.
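For example, with a pipeline resource declared, a single download step pulls its artifacts (a minimal sketch; the alias, project, pipeline name, and artifact name are placeholders):

resources:
  pipelines:
  - pipeline: upstream              # alias you choose for the Classic pipeline
    project: MyProject              # placeholder project name
    source: 'My Classic Pipeline'   # name of the pipeline as shown in ADO

steps:
- download: upstream                # downloads artifacts from that resource's run
  artifact: drop                    # optional; omit to download all artifacts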
The problem I am running into is perfectly explained here, but at the time the user was pointed to another forum and no solution was written up, if one is even available...
https://github.com/microsoft/azure-pipelines-yaml/issues/459
In short, my problem is that I have defined a pipeline resource in a YAML Pipeline.
pipelines:
- pipeline: MainRebuild
  project: ProjectName
  source: 'Main - Rebuild'
  branch: feature/DummySmokeTest
  trigger:
    branches:
    - master
    - feature/DummySmokeTest
The trigger works great: if MainRebuild completes as Successful or PartiallySucceeded, a new run is triggered that picks up the right version and can download the right artifacts.
The problem is when queueing the pipeline manually. The default resource configured is "Last successful run", which is indeed what I am looking for.
But my last run has a result of PartiallySucceeded.
When I do not touch the resource and trigger the build, it will pick up the latest build with state Successful, not the very latest PartiallySucceeded build.
One workaround I found is that when triggering the build manually, I can choose a "different" resource. The pipeline nicely shows me all the Successful and PartiallySucceeded builds and I can choose one. I can pick the latest, say "Use the selected run", and the pipeline will then use that properly.
So that is a workaround, but I would like to fix the default behavior so that "latest successful" also includes partially succeeded builds.
Does anyone know if this is possible?
Update 1:
This is my download build task:
# Download build and pipeline artifacts
- task: DownloadBuildArtifacts@0
  inputs:
    buildType: 'specific'
    project: ProjectName
    pipeline: '$(RESOURCES.PIPELINE.MAINREBUILD.PIPELINEID)'
    specificBuildWithTriggering: true
    buildVersionToDownload: specific
    buildId: '$(RESOURCES.PIPELINE.MAINREBUILD.RUNID)'
    allowPartiallySucceededBuilds: true
    downloadType: specific
    itemPattern: |
      **\Installers\*.exe
    downloadPath: '$(System.ArtifactsDirectory)'
The docs for DownloadPipelineArtifact@2 say there is an option in the task to allow downloading artifacts from partially succeeded builds. I assume this is the task you are using to grab the artifact?
https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/utility/download-pipeline-artifact?view=azure-devops#arguments
allowPartiallySucceededBuilds
Download artifacts from partially succeeded builds (Optional) If checked, this build task will try to download artifacts whether the build is succeeded or partially succeeded
Default value: false
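In a DownloadPipelineArtifact@2 task that would look something like this (a sketch; the project and pipeline values are placeholders):

- task: DownloadPipelineArtifact@2
  inputs:
    source: 'specific'
    project: 'ProjectName'
    pipeline: 12                          # placeholder definition ID
    runVersion: 'latest'
    allowPartiallySucceededBuilds: true   # also accept partially succeeded runs
    path: '$(Pipeline.Workspace)'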
Problem
Azure DevOps has a feature (documented here) to trigger a pipeline on completion from another pipeline.
This works fine in a test organization, but it won't work in our main organization.
There could be something on the organization, project, repository or even branching level, but I'm currently stuck and any help would be appreciated!
Pipelines
Pipeline B should run automatically when Pipeline A completes.
File pipeline-a.yaml for Pipeline A:
pool:
  vmImage: 'ubuntu-latest'

steps:
- script: echo Hello, world!
  displayName: 'Do something'
File pipeline-b.yaml for Pipeline B:
trigger: none

pool:
  vmImage: 'ubuntu-latest'

resources:
  pipelines:
  - pipeline: pipeline-a
    source: 'Pipeline A'
    branch: master
    trigger:
      branches:
      - master

steps:
- script: echo Hello, world!
  displayName: 'Do something'
Organizations
In my test organization the above pipelines run like a charm. This means that Pipeline A runs on a commit, and after completion, Pipeline B runs automatically.
Yet in our production organization, Pipeline B does not run automatically.
Discovery
- Both pipelines run fine when started manually, in both organizations.
- All Preview features are equal on the organization and personal level for both organizations, including the Multi-stage pipelines feature.
- The production organization has branch policies on master, while the test organization does not have policies. I don't see a connection with pipeline triggers and did not investigate this.
- Installing extensions so that they are equal on test and production does not make a difference.
- The test organization seems to be in the slow ring and was still on Sprint 161. EDIT: The issue persists after the organization was updated to Sprint 162.
- It works when I use the classic editor and manually create a build completion trigger. But this overrides the YAML pipeline trigger and I don't want to do this (I want to generate the pipeline and its triggers).
Deleting and re-adding the pipeline did the trick. So keep the YAML file but delete the pipeline and add it again.
The Azure DevOps backend seems to lose the relationship between pipelines now and then.
We troubleshot a similar problem today, with Pipeline-A defined as a resource that is meant to be consumed by Pipeline-B.
The consuming pipeline was never being triggered, and deleting and recreating the pipeline did not work for us. This work/pipeline was net new and on a feature branch. That ended up being important.
The ultimate fix was defining that feature branch as the Default branch for manual and scheduled builds in Pipeline-B. You can find that setting tucked away in Pipeline -> Edit -> Triggers -> YAML -> Get sources. Expect that as we promote this code to the main branch, we will need to update the setting.
So it seems like the "Default branch for manual and scheduled builds" setting would be better named
"Default branch for manual and scheduled builds and pipeline completion triggers".
In my case it was as simple as the source pipeline completing with an error. Testing with a very simple non-erroring pipeline, the code worked fine.
I'm setting up a pipeline using Azure Pipelines YAML format. I have created 3 stages: Build, Staging, and Production. As the names suggest, the Build stage builds the project and publishes the build artifacts. The Staging stage deploys to the Staging environment and the Production stage deploys to the Production environment.
In the Environments section of my project, I have added a check for the Production environment so that I can approve the deployment before going live.
The way my pipeline works is that both the Staging and Production stages are triggered automatically after the Build stage finishes. What I don't like about this is that when developers deploy their code to Staging, they need a couple of days to test it there before pushing their code to Production. So, until then, my pipeline keeps running and waiting for my approval: the spinner in the top-left corner keeps spinning and the "Duration" field keeps counting up.
Is there any way that developers can manually trigger the Production stage whenever they are ready, instead of the Build stage triggering it?
You can set the trigger to none to disable CI and only trigger the pipeline manually:
trigger: none
Manual stages in YAML pipelines are not currently available. This feature request has been submitted to Microsoft. You can go and vote it up or submit a new one.
There are workarounds to achieve this.
You can move your staging and production stages to a Classic Web UI Release pipeline. Manually triggering a stage is available in the Web UI Release pipeline. Please check here for more information.
Another way to achieve this is to separate your YAML pipeline into two pipelines (a staging pipeline and a production pipeline) and disable the CI build for the production pipeline (in the pipeline edit page, click on the 3 dots in the top right corner and choose Triggers).
That way you can manually run the production pipeline after the developers are done with their tests.
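For example, the production pipeline could also disable CI in the YAML itself rather than through the UI (a minimal sketch with placeholder content):

# production-pipeline.yml
trigger: none   # no CI; this pipeline is only ever queued manually

pool:
  vmImage: 'ubuntu-latest'

steps:
- script: echo "Deploying to production..."
  displayName: 'Deploy'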
You can specify which stages you want to run. When you click "Run pipeline", click on "Stages to run" and then choose which stages will run.
I think there's a better way. You can add a pipeline variable which can be overridden when starting the pipeline.
You have to add a new variable to your pipeline and choose 'Let users override this value when running this pipeline'.
In your pipeline add a condition to your stage such as:
condition: and(succeeded(), or(eq(variables['Build.SourceBranch'], 'refs/heads/master'), eq(variables['DEPLOY_PROD'], 'true')))
Now whenever you want a build to deploy to Production, you start the build and then override the variable. Set the value to 'true' and your build will trigger the stage you want.
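Putting it together, the stage could look roughly like this (the stage and job names are illustrative):

- stage: Production
  dependsOn: Staging
  condition: and(succeeded(), or(eq(variables['Build.SourceBranch'], 'refs/heads/master'), eq(variables['DEPLOY_PROD'], 'true')))
  jobs:
  - job: Deploy
    steps:
    - script: echo "Deploying to production..."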
I have a much cleaner solution that I've been using for quite a while. It's similar to my original solution posted here, but instead of manually adding a variable to the pipeline, I add a parameter and use it in a condition to trigger the deployment in a particular environment.
The azure-pipelines.yaml looks like this:
trigger:
- master

parameters:
- name: deployDEV
  displayName: Deploy to DEV
  type: boolean
  default: false

stages:
- stage: Build
  jobs:
  - job: Build
    steps:
    - script: |
        echo "Building something..."

- stage: Release_DEV
  displayName: Release to DEV
  condition: |
    and(
      succeeded('Build'),
      eq(${{ parameters.deployDEV }}, true)
    )
  dependsOn: Build
  jobs:
  - job: Release_DEV
    steps:
    - script: |
        echo "Releasing to DEV..."
The beauty of this solution is that when you're starting a new run, you'll get the parameters as options in the UI.
Yes, it can be done. We do not do it in the YAML directly; instead, we add an environment in the YAML, and on the environment we add a manual approval.
environment: 'smarthotel-dev'
The environment and its triggers are managed through the UI.
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/environments?view=azure-devops
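A minimal sketch of how this looks in YAML: a deployment job targets the environment, and the approval configured on that environment in the UI pauses the stage until someone approves (the stage and job names are placeholders):

- stage: Production
  jobs:
  - deployment: DeployProd
    environment: 'smarthotel-dev'   # approvals/checks are configured on this environment in the UI
    strategy:
      runOnce:
        deploy:
          steps:
          - script: echo "Deploying..."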
One great workaround with YAML is using conditions and variables.
Just add condition: eq(variables['Build.Reason'], 'Manual') to the stage that needs manual intervention and that should be it.
There is a lot of useful information on https://ochzhen.com/blog/manual-trigger-in-yaml-azure-pipelines
Here is a link to view all values of Build.Reason: https://learn.microsoft.com/en-us/azure/devops/pipelines/build/variables?view=azure-devops&tabs=yaml
I'm beginning to play around with Azure Pipelines and I'm having a hard time figuring out what the output of my builds is and where it is stored.
Is there a way to explore the files contained in builds? I feel like I'm blind when using those built-in variables without knowing what's behind them.
Here is a list of predefined variables for referencing files in build pipeline tasks.
If you want to get more visibility into the files, I suggest you create and configure your own build agent; instructions here.
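If you mainly want to see what ends up on the agent's disk, a quick alternative is a script step that prints the common predefined locations (a minimal sketch, assuming a Linux agent for the ls command):

steps:
- script: |
    echo "Sources:  $(Build.SourcesDirectory)"
    echo "Staging:  $(Build.ArtifactStagingDirectory)"
    echo "Binaries: $(Build.BinariesDirectory)"
    ls -R "$(Build.SourcesDirectory)"
  displayName: 'Inspect agent folders'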
One way to "explore" a build is to publish to a "drop" location using:
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'drop'
After this, open up the Artifacts dropdown in the top-right and explore your build.