I am setting up variables at the global level; their values will be provided at queue time before the pipeline runs. I am trying to set a condition using these variables but have not succeeded. Below is my sample pipeline:
trigger:
- develop

variables:
  stagename: $(stagename)

stages:
- stage: deploy
  - task: download artifact
    condition: notin(variables($(stagename),(tst1,tst2)))
  - task: deploy2
    condition: notin(variables($(stagename),(tst1,tst2)))
The goal is to skip those two tasks if stagename equals tst1 or tst2, and to run them in all other cases.
You can refer to the following sample, which worked on my side:
trigger:
- develop

stages:
- stage: deploy
  jobs:
  - job: Build
    steps:
    - task: PowerShell@2
      condition: notin(variables['stagename'],'tst1','tst2')
      inputs:
        targetType: 'inline'
        script: |
          Write-Host "Hello World"
You do not need to set the variable at the pipeline root level; a variable set at the root of the YAML will override a variable set in the Pipeline settings UI. Here are the documents on defining variables and specifying conditions.
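As an illustration of that override (a minimal sketch, not from the original answer; the value tst1 is just an example), if the YAML root contained:

variables:
  stagename: tst1   # this hard-coded value would always win over the UI-defined stagename

then the value supplied in the Pipeline settings UI would be ignored, so define stagename only in the UI (or at queue time) and leave it out of the YAML.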
Related
I have two pipelines: build and publish. The build pipeline can produce up to two artifacts, depending on the given parameters.
The publish pipeline is triggered automatically when the build pipeline completes; it then takes the published artifacts and deploys them. However, I want to run the publish tasks if and only if a particular artifact exists from the build pipeline.
Right now, if the artifact does not exist, the "download" task fails.
Simplified to the important parts, with some secret info redacted:
resources:
  pipelines:
  - pipeline: buildDev # Internal name of the source pipeline, used elsewhere within app-ci YAML, e.g. to reference published artifacts
    source: "Build"
    trigger:
      branches:
      - dev
      - feat/*

stages:
- stage: publish
  displayName: "🚀🔥 Publish to Firebase"
  jobs:
  - job: publish_firebase_android
    displayName: "🔥🤖Publish Android to Firebase"
    steps:
    - download: buildDev
      artifact: android
    - download: buildDev
      artifact: changelog
    - task: DownloadSecureFile@1
      name: firebaseKey
      displayName: "Download Firebase key"
      inputs:
        secureFile: "<secure>.json"
    - script: <upload>
      displayName: "Deploy APK to Firebase"
      workingDirectory: "$(Pipeline.Workspace)/buildDev/android/"
  - job: publish_firebase_ios
    displayName: "🔥🍏Publish iOS to Firebase"
    steps:
    - download: buildDev
      artifact: iOS
    - download: buildDev
      artifact: changelog
    - task: DownloadSecureFile@1
      name: firebaseKey
      displayName: "Download Firebase key"
      inputs:
        secureFile: "<secure>.json"
    - script: <upload...>
      workingDirectory: "$(Pipeline.Workspace)/buildDev/iOS/"
      displayName: "Deploy IPA to Firebase"
I've tried to find a solution, but every solution I found only solves the problem within the same pipeline. Based on the MS Docs I can't tell whether there is a predefined environment variable that points to the "pipeline resources". With such a variable I could theoretically run a script that checks for the presence of the artifact, sets a variable, and uses that variable as a condition for the steps.
I think you can use stage filters in the trigger. I don't know how your build pipeline is structured, but you can set up a dedicated stage to publish artifacts: execute that stage if there are artifacts to publish, otherwise skip it. You can do this with conditions. Here is a simple sample:
stages:
- stage: Build
  jobs:
  - job: build
    steps:
    ...
- stage: Artifact
  condition: ... # Set the condition based on your parameter
  jobs:
  - job: artifact
    steps:
    ...
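For illustration only, the condition placeholder above might look something like the following, assuming a hypothetical string parameter publishArtifacts (with value 'true' or 'false') in the build pipeline; adapt it to whatever actually controls artifact publication on your side:

- stage: Artifact
  # publishArtifacts is a made-up parameter name, expanded at compile time
  condition: and(succeeded(), eq('${{ parameters.publishArtifacts }}', 'true'))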
Then use the stage filter in the publishing pipeline: if the stage executes successfully, the publish pipeline will run; otherwise, it will not.
resources:
  pipelines:
  - pipeline: buildpipeline
    source: buildpipeline
    trigger:
      stages:
      - Artifact
Using variable groups is an option as well: a variable group lets you pass a variable from one pipeline to another. Here are the detailed steps:
(1). Create a variable group in Pipelines/Library and add a new Variable. I will call this variable "var" later.
(2). In your build pipeline, you can update "var" based on your parameters:
variables:
- group: {group name}

steps:
- bash: |
    az pipelines variable-group variable update --group-id {id} --name var --value yes
  env:
    AZURE_DEVOPS_EXT_PAT: $(System.AccessToken)
  condition: ...
Tip 1. If you don't know your variable group id, go to Pipelines/Library and select your variable group. You can find it in the URL: https://dev.azure.com/...&variableGroupId={id}&...
Tip 2. If you meet the error "You do not have permissions to perform this operation on the variable group.", go to Pipelines/Library and select your variable group. Click on "Security" and give "{pipeline name} Build Service" user the Administrator role.
Tip 3. Use your parameter in condition to decide whether to update var.
(3). In your publish pipeline, you can use var from variable group in condition:
condition: eq(variables['var'], 'yes')
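Putting step (3) in context, here is a minimal sketch (not from the original answer) of how the publish pipeline could consume the group and gate a step on var; the group name is a placeholder and the download step reuses the buildpipeline resource alias from the earlier sample:

variables:
- group: {group name}   # the same group the build pipeline updates

steps:
- download: buildpipeline
  artifact: android
  condition: eq(variables['var'], 'yes')   # skip the download when the artifact was not produced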
I am trying to pass a variable from one job to multiple non-dependent jobs. It works for dependent jobs, but not for non-dependent ones. I have a YAML pipeline defined like this:
jobs:
- job: JobA
  steps:
  - task: Bash@3
    name: var_test
    inputs:
      targetType: 'inline'
      script: |
        myVar=demo-value
        echo "##vso[task.setvariable variable=myVarNew;isOutput=true]$myVar"

- job: JobB
  dependsOn: JobA
  variables:
  - name: newVar2
    value: $[ dependencies.JobA.outputs['var_test.myVarNew'] ]
  steps:
  - task: Bash@3
    inputs:
      targetType: 'inline'
      script: |
        echo "$(newVar2)"

- job: JobC
  dependsOn: JobB
  variables:
  - name: newVar2
    value: $[ dependencies.JobA.outputs['var_test.myVarNew'] ]
  steps:
  - task: Bash@3
    inputs:
      targetType: 'inline'
      script: |
        echo "$(newVar2)"
It works fine in JobB, but when I try the same thing in a different downstream job like JobC it doesn't work. I also can't re-set the variable in every job, because there is a manual-validation (agentless) job in between. I want to use the same variable in different non-dependent jobs within the same stage. Please help. Thanks.
dependencies.X.outputs relies on X having been declared as a dependency. If it isn't, Azure Pipelines doesn't know to track it as one for that job (with the sole exception that a job implicitly depends on the preceding job unless dependsOn is set explicitly).
In your case, though, because JobB depends on JobA and JobC depends on JobB, the following is equivalent (even if a little redundant):
- job: JobC
  dependsOn:
  - JobA
  - JobB
  ...
It's redundant for dependency-graph purposes, but it satisfies the need for a dependency to be declared so that you can access its output variables.
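Putting it together, a minimal sketch of JobC with the explicit dependency list and the output-variable mapping from the question:

- job: JobC
  dependsOn:
  - JobA
  - JobB
  variables:
  - name: newVar2
    value: $[ dependencies.JobA.outputs['var_test.myVarNew'] ]
  steps:
  - bash: echo "$(newVar2)"   # prints demo-value once JobA has run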
Let's say I have the following stage and pipeline:
stages:
- stage: A
  lockBehavior: sequential
  jobs:
  - job: Job
    steps:
    - script: Hey!

lockBehavior: runLatest
stages:
- stage: A
  jobs:
  - job: Job
    steps:
    - script: Hey!
How would I print out the lockBehavior setting in Azure DevOps when running the pipeline? I have tried printing all variables using this code:
jobs:
- job: testing
  steps:
  - task: Bash@3
    inputs:
      targetType: 'inline'
      script: 'env | sort'
But this does not work.
I checked the "lockBehavior" keyword from here, but I haven't found a way to show its value in the pipeline log. However, there is a note: "If you do not specify a lockBehavior, it is assumed to be runLatest." In my view, it's not hard to know the value if you set it yourself.
In my experience, you can print the predefined variables ("Use predefined variables") and runtime parameters (https://learn.microsoft.com/en-us/azure/devops/pipelines/process/runtime-parameters?view=azure-devops&tabs=script) by following the official docs.
So I guess this setting will perhaps be added to the predefined variables in the future.
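Not part of the original answer, but one possible workaround to sketch, assuming you control the YAML: drive lockBehavior from a pipeline parameter and echo the parameter, since template expressions are resolved at compile time. The parameter name here is made up.

parameters:
- name: lockBehavior        # hypothetical parameter; lockBehavior itself is not exposed as a variable
  type: string
  default: runLatest
  values:
  - runLatest
  - sequential

stages:
- stage: A
  lockBehavior: ${{ parameters.lockBehavior }}
  jobs:
  - job: Job
    steps:
    - script: echo "lockBehavior was set to ${{ parameters.lockBehavior }}"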
I have the following yaml as used in an Azure DevOps pipeline (this is not the full pipeline - it's just a portion of yaml that is in a template):
jobs:
- job: CheckExcludedWorkspaces
  displayName: Check Excluded Workspaces
  pool:
    name: DefaultWindows
  steps:
  - task: PowerShell@2
    name: GetWorkspaces
    displayName: Check Excluded Workspaces
    inputs:
      filePath: "$(System.DefaultWorkingDirectory)/pipelines_v2/powershell/checkExcludedWorkspaces.ps1"
      targetType: FilePath
      errorActionPreference: 'stop'
      arguments: -environmentFolder "$(rootFolderPrefix)\${{parameters.environmentFolder}}" -excludeFolderList "${{parameters.tagOutList}}"
      pwsh: false

# - ${{ each folder in dependencies.CheckExcludedWorkspaces.outputs['GetWorkspaces.WorkspaceList'] }}:
- job: NewJob
  dependsOn: CheckExcludedWorkspaces
  variables:
    testVar: $[ dependencies.CheckExcludedWorkspaces.outputs['GetWorkspaces.WorkspaceList'] ]
  pool:
    name: DefaultWindows
  steps:
  - powershell: |
      Write-Host "Test var = $(testVar)"
    displayName: Test workspaces output
This works correctly in that the second job retrieves a variable from a PowerShell task in the previous job and outputs its value. The task in the second job prints a list of apps via the testVar variable. The output contains:
app1,app2,app3,app4 etc
I would like to take this to the next stage: create a loop of jobs that runs repeatedly over this application list. Something like:
jobs:
- job: CheckExcludedWorkspaces
  displayName: Check Excluded Workspaces
  pool:
    name: DefaultWindows
  steps:
  - task: PowerShell@2
    name: GetWorkspaces
    displayName: Check Excluded Workspaces
    inputs:
      filePath: "$(System.DefaultWorkingDirectory)/pipelines_v2/powershell/checkExcludedWorkspaces.ps1"
      targetType: FilePath
      errorActionPreference: 'stop'
      arguments: -environmentFolder "$(rootFolderPrefix)\${{parameters.environmentFolder}}" -excludeFolderList "${{parameters.tagOutList}}"
      pwsh: false

- ${{ each folder in dependencies.CheckExcludedWorkspaces.outputs['GetWorkspaces.WorkspaceList'] }}:
  - job: NewJob
    dependsOn: CheckExcludedWorkspaces
    variables:
      testVar: $[ dependencies.CheckExcludedWorkspaces.outputs['GetWorkspaces.WorkspaceList'] ]
    pool:
      name: DefaultWindows
    steps:
    - powershell: |
        Write-Host "Test var = ${{folder}}"
      displayName: Test workspaces output
This code gives me an error:
Unrecognized value: 'dependencies'. Located at position 1 within expression: dependencies.CheckExcludedWorkspaces.outputs['GetWorkspaces.WorkspaceList']
Is there a way I can use a PowerShell task output variable to create a list of jobs in a dependent job? The problem is that I don't know at design time what the list of applications will be (the pipeline should ideally find this out when it runs). The list of applications is based on the folders that exist within a repository, which changes over time.
In the current situation, you cannot use the 'each' keyword with variables. The 'each' keyword works on object-type parameters, which are expanded at compile time, while an output variable is a string that only exists at runtime.
For more details, you can refer to the doc: Each keyword.
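For contrast, a minimal sketch (not from the original answer) of where each does work: iterating over an object-type parameter whose value is known at compile time. The parameter name and values here are made up for illustration.

parameters:
- name: workspaces          # hypothetical compile-time list; cannot come from a task's output variable
  type: object
  default: [app1, app2, app3]

jobs:
- ${{ each folder in parameters.workspaces }}:
  - job: Deploy_${{ folder }}
    steps:
    - powershell: Write-Host "Deploying ${{ folder }}"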
When using the Releases tab in the Azure DevOps web console to create release definitions, the tasks can resolve $(Release.ReleaseId) inside a Bash task.
But if I instead do my deployment in the azure-pipelines.yml file and do echo $(Release.ReleaseId), I get null because the variable doesn't exist. How come?
Here is part of the YAML file:
- stage: Deploy
  dependsOn: BuildAndPublishArtifact
  condition: succeeded('BuildAndPublishArtifact')
  jobs:
  - deployment: DeployToAWSDev
    displayName: My display name
    pool:
      vmImage: 'Ubuntu-16.04'
    environment: 'dev'
    strategy:
      runOnce:
        deploy:
          steps:
          - download: current
            artifact: MyArtifact
          - task: Bash@3
            inputs:
              targetType: 'inline'
              script: |
                echo $(Release.ReleaseId) # Nothing
Thanks for any help pointing me in the right direction on how I can retrieve my release ID.
Refer to the documentation on variables. There's no differentiation of "build" vs "release" in a YAML pipeline. Thus, Build.BuildId would be the run's ID.
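As a minimal sketch, reusing the inline Bash task from the question, the run's identifiers can be printed with predefined variables instead of Release.*:

- task: Bash@3
  inputs:
    targetType: 'inline'
    script: |
      echo "Run (build) ID: $(Build.BuildId)"
      echo "Run number:     $(Build.BuildNumber)"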