I have a pipeline with a deployment job that has an environment:
environment: "myname-$(variable1)"
but when I look at the environments in Azure DevOps, it has not replaced the variable and has named my environment "myname-$(variable1)" instead of "myname-helloworld".
Is there any way to use variables for environment names?
Updated with example
stages.yml
stages:
- stage:
  variables:
    EnvironmentName: Prod
  jobs:
  - template: steps.yml
    ....
steps.yml
jobs:
- deployment: deployment
  environment: ${{ EnvironmentName }}
  strategy:
    ...
The solution is to use parameters instead of variables.
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/templates?view=azure-devops#insert-a-template
stages.yml
stages:
- stage:
  variables:
    EnvironmentName: Prod
  jobs:
  - template: steps.yml
    parameters:
      myParameter: Test
    ....
steps.yml
parameters:
- name: myParameter

jobs:
- deployment: deployment
  environment: ${{ parameters['myParameter'] }}
  strategy:
    ...
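The stages.yml above passes a hardcoded Test value. If EnvironmentName is defined in the YAML itself, as in the original stages.yml, it may also be possible to feed it into the parameter with a compile-time expression; a sketch of that assumption, not verified here:

stages:
- stage:
  variables:
    EnvironmentName: Prod
  jobs:
  - template: steps.yml
    parameters:
      # ${{ variables.EnvironmentName }} is resolved at template-expansion time,
      # so it only works for variables defined in YAML, not ones set at runtime
      myParameter: ${{ variables.EnvironmentName }}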
Refer to the expressions documentation. You need a compile-time (template) expression or a runtime expression.
environment: "myname-${{ variables.variable1 }}"
That expands the variable value when the YAML file is parsed. If variable1 is a variable you've defined in the YAML, it will be expanded accordingly.
If variable1 is only defined at some point during the execution of the pipeline, then you need the runtime syntax $[ variables.variable1 ] instead.
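For the environment name specifically, the compile-time form is the one that applies; a minimal sketch, assuming variable1 is defined directly in the YAML:

variables:
  variable1: helloworld

jobs:
- deployment: deploy
  # expanded when the YAML is compiled, so the environment is named "myname-helloworld"
  environment: myname-${{ variables.variable1 }}
  strategy:
    runOnce:
      deploy:
        steps:
        - script: echo "Deploying to myname-${{ variables.variable1 }}"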
Related
Here is a code snippet:
stages:
- stage: Apply
  dependsOn: Plan
  variables:
    OVERRIDE_ADO_ENVIRONMENT: $[ dependencies.Plan.outputs['Plan.IsTerraformPlanEmpty.OVERRIDE_ADO_ENVIRONMENT'] ]
  condition: and(succeeded(), ${{ parameters.terraform_apply }})
  jobs:
  - deployment: Apply
    environment: ${{ coalesce(variables.OVERRIDE_ADO_ENVIRONMENT, parameters.ado_environment) }}
    strategy:
      runOnce:
        deploy:
          steps:
          - template: start.yaml
          - template: terraform_init.yaml
            parameters:
I know the build variable OVERRIDE_ADO_ENVIRONMENT is declared correctly, because I can use it in the condition to skip the Apply stage completely.
However, this is incorrect. Even if the plan is empty, there could be a change in the terraform output variables. Therefore I must run the Apply logic always. However, there is no need for approvals in this case.
Therefore I would like to switch the environment to the one in the OVERRIDE_ADO_ENVIRONMENT build variable which is a special environment with no approvals.
However, trying to run this pipeline produces the following error message:
Job Apply: Environment $[ dependencies could not be found. The environment does not exist or has not been authorized for use.
From this I conclude that a build variable cannot be used here, even one computed in a previous stage.
The question is - what is the least painful way to implement this logic? If at all possible.
Edit 1
I tried an approach where I create two stages with a condition that is affected by the output variable from the previous stage. However, I found out that:
The condition must be on the stage, not the deployment job. Otherwise, the environment is applied even if the deployment job's condition disables it.
However, the condition on the stage does not see the build variables shared at the same level, so the condition is always false.
Here is my attempt to use this approach
parameters:
- name: terraform_apply
  type: boolean
- name: ado_environment
- name: working_directory
- name: application
  default: terraform
- name: apply_stages
  type: object
  default:
  - name: ApplyNonEmptyPlan
    displayName: Apply Non Empty Plan
    tf_plan_tag: TF_NON_EMPTY_PLAN
  - name: ApplyEmptyPlan
    displayName: Apply Empty Plan
    tf_plan_tag: TF_EMPTY_PLAN
    ado_environment: Empty TF Plan

stages:
- ${{ each apply_stage in parameters.apply_stages }}:
  - stage: ${{ apply_stage.name }}
    displayName: ${{ apply_stage.displayName }}
    dependsOn: Plan
    variables:
      TF_PLAN_TAG: $[ stageDependencies.Plan.Plan.outputs['IS_TERRAFORM_PLAN_EMPTY.TF_PLAN_TAG'] ]
    condition: and(succeeded(), ${{ parameters.terraform_apply }}, eq(variables['TF_PLAN_TAG'], '${{ apply_stage.tf_plan_tag }}'))
    jobs:
    - deployment: ${{ apply_stage.name }}
      environment: ${{ coalesce(apply_stage.ado_environment, parameters.ado_environment) }}
      strategy:
        runOnce:
          deploy:
            steps:
            - template: start.yaml
Environment creation happens at compile time, before run time, so dynamic environment names are not supported. Hence, in the code snippets shared below, each element in coalesce must be known (or hardcoded) before you run the pipeline; it cannot depend on output calculated in a previous stage.
environment: ${{ coalesce(variables.OVERRIDE_ADO_ENVIRONMENT, parameters.ado_environment) }}
and
environment: ${{ coalesce(apply_stage.ado_environment, parameters.ado_environment) }}
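A workaround that stays within this constraint, and also sidesteps the Edit 1 issue of the stage condition not seeing the mapped variable, is to hardcode the environment in each generated stage and have the stage condition read the previous stage's output directly through dependencies. A sketch only; the Plan stage, job, step, and tag names are assumptions carried over from the question:

stages:
- stage: ApplyEmptyPlan
  dependsOn: Plan
  # stage-level conditions can read an earlier stage's outputs via dependencies.<stage>.outputs[...]
  condition: and(succeeded(), eq(dependencies.Plan.outputs['Plan.IS_TERRAFORM_PLAN_EMPTY.TF_PLAN_TAG'], 'TF_EMPTY_PLAN'))
  jobs:
  - deployment: ApplyEmptyPlan
    # hardcoded, so it is known at compile time; configure this environment without approvals
    environment: Empty TF Plan
    strategy:
      runOnce:
        deploy:
          steps:
          - script: echo "Applying an empty plan (outputs may still change)"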
I've got a YAML pipeline that references some templates.
I have a variable group linked to this main YAML file and I want to pass one of its variables to a template.
It's simple when I want to "just use it", as below:
example:
stages:
- stage: Deployment
  variables:
  - group: My_group_variables
  jobs:
  - template: /templates/jobs/myJobTemplate.yml
    parameters:
      someParameter: $(variable_from_my_variable_group)
myJobTemplate.yml:
parameters:
- name: someParameter
  default: ''

jobs:
- job: Myjob
  steps:
  - task: Bash@3
    inputs:
      targetType: 'inline'
      script: cat ${{ parameters.someParameter }}
It does not cooperate when I want to have parameter validation like:
parameters:
- name: environmentName
  type: string
  values:
  - Development
  - Test
  - UAT
  - Production
Or when I want to use a service connection name as a variable.
...
- task: KubernetesManifest@0
  displayName: Deploy to Kubernetes cluster
  inputs:
    action: 'deploy'
    kubernetesServiceConnection: ${{ parameters.kubernetesServiceConnection }}
    namespace: ${{ parameters.kubernetesNamespace }}
    manifests: ${{ variables.manifestFile }}
...
Does anyone know how I should use those variables with pre-validated parameters or service connections?
It's most probably an issue with when the values are resolved: pre-validated parameters and service connection names are checked at compile time, while $() values are resolved at runtime.
I cannot use extends and variables in this template.
Maybe someone has a pattern for this kind of usage?
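One pattern that works within those constraints is to supply the compile-time values (the validated parameter and the service connection name) as literal template parameters per environment rather than as $() macros from a variable group. A minimal sketch, assuming myJobTemplate.yml declares environmentName and kubernetesServiceConnection parameters (names borrowed from the snippets above):

stages:
- stage: Deployment
  jobs:
  - template: /templates/jobs/myJobTemplate.yml
    parameters:
      # a literal value satisfies the 'values:' validation at compile time
      environmentName: Test
      # placeholder service connection name, resolvable at compile time
      kubernetesServiceConnection: my-test-k8s-connection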
We have two deployment jobs that run in the same stage. The first job creates an output variable and the second job uses that output variable (code borrowed from here and implemented the same way in our pipeline).
jobs:
- deployment: producer
  environment:
    name: ${{ parameters.environment }}
    resourceType: VirtualMachine
    tags: ${{ parameters.tags }}
  strategy:
    runOnce:
      deploy:
        steps:
        - script: echo "##vso[task.setvariable variable=myOutputVar;isOutput=true]this is the deployment variable value"
          name: setvarStep
        - script: echo $(setvarStep.myOutputVar)
          name: echovar
- deployment: consumer_deploy
  dependsOn: producer
  variables:
    myVarFromDeploymentJob: $[ dependencies.producer.outputs['deploy_Vm1.setvarStep.myOutputVar'] ]
  environment:
    name: ${{ parameters.environment }}
    resourceType: VirtualMachine
    tags: ${{ parameters.tags }}
  strategy:
    runOnce:
      deploy:
        steps:
        - script: "echo $(myVarFromDeploymentJob)"
          name: echovar
This works because we reference the virtual machine (hardcoded) that the producer deployment job runs on. However, not every stage will run on the same virtual machine.
I've tried regular variables ($(Agent.MachineName)), as well as expression syntax, passing the variable from a template file and changing the scope of the variable template, but none of them work and the 'myVarFromDeploymentJob' variable stays empty.
Is there a way to make the virtual machine name in the expression variable or more flexible? So going from this:
$[ dependencies.producer.outputs['deploy_Vm1.setvarStep.myOutputVar'] ]
To something like this:
$[ dependencies.producer.outputs['deploy_$(Agent.MachineName).setvarStep.myOutputVar'] ]
Adding a solution for others.
Here is the missing link to the docs: https://learn.microsoft.com/en-us/azure/devops/pipelines/process/deployment-jobs?view=azure-devops#support-for-output-variables
For a runOnce deployment it depends on whether you are addressing the whole deployment (repeat the job name) or a resource deployment (use Deploy_<resourceName>):
variables:
  myVarFromDeploymentJob: $[ dependencies.A2.outputs['A2.setvarStepTwo.myOutputVar'] ]
  myOutputVarTwo: $[ dependencies.A2.outputs['Deploy_vmsfortesting.setvarStepTwo.myOutputVarTwo'] ]
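For context, here is a sketch that mirrors the layout from the linked docs page, showing where each addressing form applies; the environment name env1 and the resource name vmsfortesting are assumptions taken from the docs and the snippet above:

jobs:
- deployment: A1
  environment: env1                  # plain environment, no resource
  strategy:
    runOnce:
      deploy:
        steps:
        - script: echo "##vso[task.setvariable variable=myOutputVar;isOutput=true]whole-deployment value"
          name: setvarStep
- deployment: A2
  environment:
    name: env1                       # assumed environment name
    resourceName: vmsfortesting      # resource the deployment runs on
    resourceType: VirtualMachine
  strategy:
    runOnce:
      deploy:
        steps:
        - script: echo "##vso[task.setvariable variable=myOutputVarTwo;isOutput=true]resource-level value"
          name: setvarStepTwo
- job: B
  dependsOn:
  - A1
  - A2
  variables:
    # whole deployment: repeat the deployment job name
    myVarFromDeploymentJob: $[ dependencies.A1.outputs['A1.setvarStep.myOutputVar'] ]
    # resource deployment: use Deploy_<resourceName>
    myOutputVarTwo: $[ dependencies.A2.outputs['Deploy_vmsfortesting.setvarStepTwo.myOutputVarTwo'] ]
  steps:
  - script: echo "$(myVarFromDeploymentJob) / $(myOutputVarTwo)"
    name: echovars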
I'm trying to create a deploy pipeline YAML template for all environments/stages. I've set up the Environments on Azure DevOps so that I can add checks and approvals on the Test and Prod environments before they get deployed. I've set up a library group for each stage and each one of them has a variable called 'env' which defines the current stage running in the pipeline. For some reason, the environment property under the deployment job (see code snippet below) doesn't read that variable.
Has anyone faced this issue before, or is there a reason why the variable won't be read for that specific property?
Note: I've tested the variables and they do work, for example, the stage property outputs as 'deploy-dev/test/prod' (depending on the environment)
- stage: deploy-$(env)
  jobs:
  - deployment: DeployWeb
    displayName: deploy Web App
    pool:
      vmImage: 'Ubuntu-latest'
    # creates an environment if it doesn't exist
    environment: 'smarthotel-$(env)'
    strategy:
      runOnce:
        deploy:
          steps:
          - script: echo Hello world
You can't do this because the environment name has to be known at compile time.
But you can try this (let's name this file deploy.yml):
parameters:
- name: env
  type: string
  default: 'dev'

stages:
- stage: deploy-${{ parameters.env }}
  jobs:
  - deployment: DeployWeb
    displayName: deploy Web App
    pool:
      vmImage: 'Ubuntu-latest'
    # creates an environment if it doesn't exist
    environment: 'smarthotel-${{ parameters.env }}'
    strategy:
      runOnce:
        deploy:
          steps:
          - script: echo Hello world
and then use it as follows (in the build.yml file):
stages:
- template: deploy.yml
  parameters:
    env: dev
- template: deploy.yml
  parameters:
    env: qa
- template: deploy.yml
  parameters:
    env: prod
I'm using an Azure DevOps pipeline to deploy my code and now I'm in need of passing a variable value from a deployment job to a subsequent job that depends on it. I've read up on this example but it does not seem to work at all.
What I'm trying to do is run an Azure ARM Deployment that provisions a Key Vault. The name of the key vault is outputted from the ARM deployment job and I'm then trying to pass that name to another job which needs to add specific secrets. Access control is taken care of, but I still need to pass the name.
I've boiled the problem down to the basics of passing a variable from a deployment to a job. Here is my complete test pipeline (almost entirely copied from here):
trigger: none

stages:
- stage: X
  jobs:
  - deployment: A
    pool:
      vmImage: "ubuntu-16.04"
    environment: test
    strategy:
      runOnce:
        deploy:
          steps:
          - script: echo "##vso[task.setvariable variable=myOutputVar;isOutput=true]this is the deployment variable value"
            name: setvarStep
          - script: echo $(setvarStep.myOutputVar)
            name: echovar
  - job: B
    dependsOn: A
    pool:
      vmImage: "ubuntu-16.04"
    variables:
      myVarFromDeploymentJob: $[ dependencies.A.outputs['deploy.setvarStep.myOutputVar'] ]
    steps:
    - script: "echo $(myVarFromDeploymentJob)"
      name: echovar
Once I run this, the echoed value is blank in job B, but defined in deployment A. Why is this? And is there a way to dump everything in dependencies.A.outputs so that I can see what I have to work with?
How can I pass a variable from a runOnce deployment job to a regular job?
I've solved it. The problem is that the documentation here specifies this schema for fetching the variable for a runOnce deployment:
$[dependencies.<job-name>.outputs['<lifecycle-hookname>.<step-name>.<variable-name>']]
This is in fact WRONG. The <lifecycle-hookname> parameter should be replaced with the name of the deployment like this:
$[dependencies.<job-name>.outputs['<job-name>.<step-name>.<variable-name>']]
The example from this documentation (scroll down a bit) is correct.
A full example pipeline that I've tested and works:
trigger: none

stages:
- stage: X
  jobs:
  - deployment: A # This name is important
    pool:
      vmImage: 'ubuntu-16.04'
    environment: staging
    strategy:
      runOnce:
        deploy:
          steps:
          - script: echo "##vso[task.setvariable variable=myOutputVar;isOutput=true]this is the deployment variable value"
            name: setvarStep # This name is also important
          - script: echo $(setvarStep.myOutputVar)
            name: echovar
  - job: B
    dependsOn: A
    pool:
      vmImage: 'ubuntu-16.04'
    variables:
      myVarFromDeploymentJob: $[ dependencies.A.outputs['A.setvarStep.myOutputVar'] ]
    steps:
    - script: "echo $(myVarFromDeploymentJob)"
      name: echovar