Unexpected value "steps" in yaml template - azure-devops

I don't understand how the hierarchy of a single file like this carries over into templates, because I just get errors:
Pipeline
  Stage A
    Job 1
      Step 1.1
Consider this yml file:
trigger:
- master

stages:
- stage: build
  displayName: "Run Build"
  jobs:
  - template: someTemplate.yml
My template looks sort of like this (I tried putting jobs as the first level of the hierarchy as well):
pool:
  name: "Azure Pipelines"
  vmImage: "windows-2019"

steps:
- task: myTask
  inputs: ...
I believe I follow the right structure but I don't understand how the hierarchy follows to the template.
I get:
/generate-release-notes.yml (Line: 3, Col: 1): Unexpected value 'pool'
/generate-release-notes.yml (Line: 7, Col: 1): Unexpected value 'steps'

You're trying to use a steps template as a job template.
jobs:
- template: someTemplate.yml
is expecting someTemplate.yml to contain a jobs: block with a job or series of jobs in it. You're giving it steps.
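Alternatively, if you want to keep someTemplate.yml as a pure steps template, you could reference it from inside a job instead. A sketch: the pool then has to live in the calling file, and the job name buildJob is made up here:

```yaml
# azure-pipelines.yml (sketch) - referencing a steps template from within a job
stages:
- stage: build
  displayName: "Run Build"
  jobs:
  - job: buildJob                  # hypothetical job name
    pool:
      vmImage: "windows-2019"
    steps:
    - template: someTemplate.yml   # the template then contains only the steps: list
```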

From your YAML template sample, you are defining a steps template.
In this case, you cannot define the pool in the template, because a pool cannot be defined at the step level of the YAML.
On the other hand, since you defined a steps template, you would also need to reference it at the step level in the main YAML.
To solve your issue, modify your template to define a job-level template.
Here is an example:
someTemplate.yml
jobs:
- job: NameA
  pool:
    name: "Azure Pipelines"
    vmImage: "windows-2019"
  steps:
  - script: echo 1
azure-pipelines.yml
trigger:
- master

stages:
- stage: build
  displayName: "Run Build"
  jobs:
  - template: someTemplate.yml

Related

Jobs in azure depending on jobs from other stages (inside templates)

Migrating from .gitlab-ci.yml to azure-pipelines.yml. In the .gitlab-ci.yml, I have a scenario where one job (in the deploy stage) needs two other jobs (from test stages) for its execution:
.deploy:
  stage: deploy
  needs:
    - testmethod1
    - testmethod2

deployPROD:
  extends: .deploy
Now, does the deployPROD job execute the test methods again, or does it just check whether they have been executed?
Moving to the Azure context, I created a templates folder in my repository, with a test file just to replicate this scenario.
My azure-pipelines.yml file is as shown below:
trigger:
- azure-pipelines

pool:
  vmImage: ubuntu-latest

jobs:
- job: InitialA
  steps:
  - script: echo hello from initial A
- job: InitialB
  steps:
  - script: echo hello from initial B
- job: Subsequent
  dependsOn:
  - templates/test1.yml/testme
  steps:
  - script: echo hello from subsequent
I used the dependsOn key to declare the depending jobs. The repo contains the template file under a templates folder (screenshot omitted), but I end up getting an error (screenshot omitted).
So is my approach correct? Am I using the correct keywords in Azure? If yes, what is the path that I need to use in the dependsOn key?
Suggestions welcome.
You can add a template job, and the Subsequent job can depend on that template job.
Please refer to Template types & usage for more template usages in Azure Pipelines.
Code sample:
trigger:
- none

pool:
  vmImage: ubuntu-latest

jobs:
- job: InitialA
  steps:
  - script: echo hello from initial A
- job: InitialB
  steps:
  - script: echo hello from initial B
- job: templates
  steps:
  - template: test.yml # Template reference
- job: Subsequent
  dependsOn: templates
  steps:
  - script: echo hello from subsequent

How to call Azure pipeline template that included "resources"?

For some reasons, I want my A.yml to call another B.yml from a different repository.
Pipeline A is considered an 'empty' pipeline; its job is basically to call pipeline B, and B does the real work.
But B is not just steps; it contains all the details, and it includes 'resources' as well.
This is how I do:
Pipeline A:
trigger:
- main

resources:
  repositories:
  - repository: script
    type: github
    name: xxx/script
    ref: refs/heads/main
    endpoint: smartracks

steps: # <---- What should I put?
- template: B.yml@script
Pipeline B:
resources:
  repositories:
  - repository: rcu-service
    type: github
    name: abc/rcu-service
    ref: refs/heads/main
    endpoint: test

jobs:
- job: OpenEmbedded_Build_And_Export
  steps:
  - checkout: rcu-service
  - script: |
      ......
If I exclude the "resources" section from pipeline B, it succeeds (I need to add those resources to pipeline A).
But once I include the resources in pipeline B, it fails with these messages:
B.yml@script (Line: 1, Col: 1): Unexpected value 'resources'
B.yml@script (Line: 24, Col: 1): Unexpected value 'jobs'
In Pipeline A, this is how I call pipeline B. I use steps, but it doesn't seem to work:
steps: # <---- What should I put?
- template: B.yml@script
I tried with stages and jobs, but those fail too.
So I am wondering what I should do. Please teach me, thank you.
Azure Pipelines supports four kinds of templates:
Stage
Job
Step
Variable
It doesn't support a resources template, so you need to put the resources in your A.yml.
https://learn.microsoft.com/en-us/azure/devops/pipelines/yaml-schema?view=azure-devops&tabs=schema%2Cparameter-schema#template-references
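To make that concrete, here is a sketch of how A.yml could look once the rcu-service repository resource is moved out of B.yml (based on the snippets above; B.yml would then start directly with jobs:):

```yaml
# A.yml (sketch) - all repository resources declared in the top-level pipeline
trigger:
- main

resources:
  repositories:
  - repository: script
    type: github
    name: xxx/script
    ref: refs/heads/main
    endpoint: smartracks
  - repository: rcu-service      # moved here from B.yml
    type: github
    name: abc/rcu-service
    ref: refs/heads/main
    endpoint: test

jobs:
- template: B.yml@script         # B.yml now contains only the jobs: block
```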
Imagine that you were to expand the template so that it showed up in the file where you placed - template:.... In this case, you would have
steps:
- jobs:
...
This doesn't work, because the schema requires steps to be a part of a job.
jobs:
- template: B.yml@script

or

stages:
- stage: stageName
  displayName: "my stage"
  jobs:
  - template: B.yml@script

Azure Pipelines using YAML for multiple environments (stages) with different variable values but no YAML duplication

Let's suppose I have 3 environments on Azure: Dev, Test and Prod. I have the same pipeline for building and deploying the resources and the code for each one of the environments except for two differences:
different trigger branch
different variable values
What is the correct approach for this scenario? Because at least 3 come to my mind, none of which is perfect:
Option 1: I guess I could create a single pipeline on Azure DevOps (triggered by any of 3 branches) with 3 stages for each environment and for each stage add a condition to run depending on the source branch, like this:
condition: eq(variables['Build.SourceBranch'], 'refs/heads/a-branch-name')
and in each stage reference different variables. But this would introduce code duplication in each stage - when adding or modifying a step I would have to remember to edit 3 stages - not desirable.
Option 2: Create 3 separate YAML files in my repository, each one of them with specified trigger branch and referencing the same variable names, then create 3 different pipeline on Azure DevOps, each one of them with different variable values. But this would also introduce code duplication.
Option 3: Create 1 build-and-deploy.yaml file as a template with the steps defined in it and then create another 3 YAML files referring to that template, each with different trigger branch and with different variable values in each Azure Pipeline, like this:
trigger:
  branches:
    include:
    - a-branch-name

steps:
- template: build-and-deploy.yaml
  parameters:
    parameterName1: $(parameterValue1)
    parameterName2: $(parameterValue2)
This seems to be the best option but I haven't seen it used anywhere in the examples so maybe I'm just unaware of downsides of it, if there are any.
Here's how to do it with a shared pipeline config that gets included into env-specific pipelines.
To support 2 environments (dev and prod) you'd need:
1 shared pipeline yaml
2 env-specific yamls, one for each env
2 pipelines created in Azure DevOps, one for each env; each pipeline referencing corresponding yaml
pipeline-shared.yml:
variables:
  ARTIFACT_NAME: ApiBuild
  NPM_CACHE_FOLDER: $(Pipeline.Workspace)/.npm

stages:
- stage: Build
  displayName: Build
  pool:
    vmImage: 'ubuntu-latest'
    demands: npm
  jobs:
  ...
- stage: Release
  displayName: Release
  dependsOn: Build
  pool:
    vmImage: 'ubuntu-latest'
  jobs:
  ...
pipeline-dev.yml:
# Trigger builds on commits to branches
trigger:
- dev

# Do not trigger builds on PRs
pr: none

extends:
  template: pipeline-shared.yml

pipeline-prod.yml:

trigger:
- master

pr: none

extends:
  template: pipeline-shared.yml
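If the environment-specific differences go beyond what pipeline variables can cover, extends can also take parameters, so each env file passes its own values into the shared template. A sketch; the parameter names here are invented for illustration and would have to be declared in pipeline-shared.yml:

```yaml
# pipeline-dev.yml (sketch) - env-specific values passed as template parameters
trigger:
- dev
pr: none

extends:
  template: pipeline-shared.yml
  parameters:
    environmentName: Dev                      # hypothetical parameter
    apiBaseUrl: https://dev.example.invalid   # hypothetical parameter
```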
According to your description, you want different stages to share the same repo resource, but with different trigger branches and variable values.
Regarding the trigger branch, you can use an ${{ if ... }} expression to express the trigger-branch condition.
Regarding variable values, you can define templates and variable groups and pass the values in through parameters.
Here is an example you can refer to:
First go to Library under Pipelines, click on the Variable group to add a variable group. You can add multiple variables to this variable group.
azure-pipelines.yml:
stages:
- template: stage/test.yml
  parameters:
    ${{ if contains(variables['Build.SourceBranch'], 'master') }}:
      variableGroup: devGroup
      stageName: Dev
      test: a
    ${{ if contains(variables['Build.SourceBranch'], 'test') }}:
      stageName: test
      test: b
stage/test.yml:
parameters:
- name: stageName
  displayName: Test
  type: string
  default: test
- name: test
  displayName: Test
  type: string
  default: test
- name: variableGroup
  displayName: Test
  type: string
  default: test

stages:
- stage: Test_${{ parameters.stageName }}
  variables:
  - group: ${{ parameters.variableGroup }}
  jobs:
  - job: Test1
    pool:
      vmImage: vs2017-win2016
    steps:
    - script: echo "Hello Test1"
    - script: echo ${{ parameters.test }}
    - script: echo $(dev1)
Of course, if you want to use a single variable, you can define the variable directly in yaml without adding a variable group.
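For instance, a single value could be passed straight through the test parameter of the template above, with no variable group involved (sketch; assumes the variableGroup parameter is left at its default or removed from the template):

```yaml
# azure-pipelines.yml (sketch) - passing a plain value, no variable group
stages:
- template: stage/test.yml
  parameters:
    stageName: Dev
    test: some-plain-value   # read in the template as ${{ parameters.test }}
```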

Azure Devops - passing variables between job templates

Normal (non-template) jobs in Azure DevOps yaml support inter-job variable passing as follows:
jobs:
- job: A
  steps:
  - script: "echo ##vso[task.setvariable variable=skipsubsequent;isOutput=true]false"
    name: printvar
- job: B
  condition: and(succeeded(), ne(dependencies.A.outputs['printvar.skipsubsequent'], 'true'))
  dependsOn: A
  steps:
  - script: echo hello from B
How do I do something similar in the following, given that templates don't support the dependsOn syntax? I need to get an output from the first template and pass it as 'environmentSlice' to the second template.
- stage: Deploy
  displayName: Deploy stage
  jobs:
  - template: build-templates/get-environment-slice.yml@templates
    parameters:
      configFileLocation: 'config/config.json'
  - template: build-templates/node-app-deploy.yml@templates
    parameters:
      # Build agent VM image name
      vmImageName: $(Common.BuildVmImage)
      environmentPrefix: 'Dev'
      environmentSlice: '-$(dependencies.GetEnvironmentSlice.outputs['getEnvironmentSlice.environmentSlice'])'
The reason I want the separation between the two templates is the second one is a deployment template and I would like input from the first template in naming the environment in the second template. I.e. initial part of node-app-deploy.yml (2nd template) is:
jobs:
- deployment: Deploy
  displayName: Deploy
  # Because we use the environmentSlice to name the environment, we have to have it passed in rather than
  # extracting it from the config file in steps below
  environment: ${{ parameters.environmentPrefix }}${{ parameters.environmentSlice }}
Update:
The accepted solution does allow you to pass variables between separate templates, but won't work for my particular use case. I wanted to be able to name the 'environment' section of the 2nd template dynamically, i.e. environment: ${{ parameters.environmentPrefix }}${{ parameters.environmentSlice }}, but this can only be named statically since templates are compiled on pipeline startup.
The downside of the solution is that it introduces a hidden coupling between the templates. I would have preferred the calling pipeline to orchestrate the parameter passing between templates.
You can apply dependsOn and dependency output variables inside the templates.
See the sample below.
To make the sample clearer, there are two template files: one is azure-pipelines-1.yml, and the other is azure-pipeline-1-copy.yml.
In azure-pipelines-1.yml, specify the environment value as output variable:
parameters:
  environment: ''
jobs:
- job: preDeploy
  variables:
    EnvironmentName: preDeploy-${{ parameters.environment }}
  steps:
  - checkout: none
  - pwsh: |
      echo "##vso[task.setvariable variable=EnvironmentName;isOutput=true]$($env:ENVIRONMENTNAME)"
    name: outputVars
And then, in azure-pipeline-1-copy.yml use dependency to get this output variable:
jobs:
- job: deployment
  dependsOn: preDeploy
  variables:
    EnvironmentNameCopy: $[dependencies.preDeploy.outputs['outputVars.EnvironmentName']]
  steps:
  - checkout: none
  - pwsh: |
      Write-Host "$(EnvironmentNameCopy)"
    name: outputVars
Finally, in the YAML pipeline, you just need to pass in the environment value:
stages:
  - stage: deployQA
    jobs:
    - template: azure-pipelines-1.yml
      parameters:
        environment: FromTemplate1
    - template: azure-pipeline-1-copy.yml
Now you can see that the value is retrieved successfully in the second template job (screenshot omitted).
It is possible to avoid the dependency in the called template. However, as the OP says, the environment name cannot be created dynamically.
Here is an example of the "calling" template, which firstly calls another template (devops-variables.yml) that sets some environment variables that we wish to consume in a later template (devops-callee.yml):
stages:
- stage: 'Caller_Stage'
  displayName: 'Caller Stage'
  jobs:
  - template: 'devops-variables.yml'
    parameters:
      InitialEnvironment: "Development"
  - template: 'devops-callee.yml'
    parameters:
      SomeParameter: $[dependencies.Variables_Job.outputs['Variables_Job.Variables.SomeParameter']]
In the devops-variables.yml file, I have this:
"##vso[task.setvariable variable=SomeParameter;isOutput=true;]Wibble"
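For context, devops-variables.yml is not shown in full; reconstructed from the names the caller references, it presumably looks something like this (the job name Variables_Job and step name Variables are inferred from the caller snippet, not confirmed by the original):

```yaml
# devops-variables.yml (sketch, reconstructed)
parameters:
- name: InitialEnvironment
  default: ''

jobs:
- job: 'Variables_Job'           # inferred job name
  displayName: 'Variables Job'
  steps:
  - pwsh: |
      # Publish SomeParameter as an output variable for downstream jobs
      Write-Host "##vso[task.setvariable variable=SomeParameter;isOutput=true;]Wibble"
    name: Variables              # inferred step name
```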
Then, in the "devops-callee.yml", I just consume it something like this:
parameters:
- name: SomeParameter
  default: ''

jobs:
- deployment: 'Called_Job'
  condition: succeeded()
  displayName: 'Called Job'
  environment: "Development"
  pool:
    vmImage: 'windows-2019'
  dependsOn:
  - Variables_Job
  variables:
    SomeParameter: ${{ parameters.SomeParameter }}
  strategy:
    runOnce:
      deploy:
        steps:
        - download: none
        - task: AzureCLI@2
          condition: succeeded()
          displayName: 'An Echo Task'
          inputs:
            azureSubscription: "$(TheServiceConnection)"
            scriptType: pscore
            scriptLocation: inlineScript
            inlineScript: |
              echo "Before"
              echo "$(SomeParameter)"
              echo "After"
Output:
2021-04-10T09:22:29.6188535Z Before
2021-04-10T09:22:29.6196620Z Wibble
2021-04-10T09:22:29.6197124Z After
This way, the callee doesn't reference the caller. Unfortunately, setting the environment in the callee thus:
environment: "$(SomeParameter)"
doesn't work - you'll just get an environment with the literal characters '$(SomeParameter)'.
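Because the environment is resolved when the pipeline is compiled, only template expressions work there; runtime macro syntax does not. A sketch restating the point above:

```yaml
# Environment naming in a deployment job (sketch)
jobs:
- deployment: 'Called_Job'
  environment: "${{ parameters.SomeParameter }}"  # works only if the value is known at compile time
  # environment: "$(SomeParameter)"               # stays as the literal text '$(SomeParameter)'
```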

Multi-stage YAML pipeline does not apply environment-specific XML transformation

I converted the steps executed within a typical release pipeline into a multi-stage YAML pipeline.
The problem is that the Web.config (XML) transformation applied by the AzureRmWebAppDeployment@4 task no longer works as it did before.
The documented behavior is that the Release config is applied first, followed by the environment-specific config.
From the logs I can see that the Web.Release.config is applied, but no further transformation happens for the Web.TA.config (the environment-specific config), even though it is present and the stage name matches the config name.
I have the following (simplified) YAML file for a multi-stage pipeline:
trigger: none

variables:
  azureSubscriptionProject: 'SubscriptionName'
  artifactDropDirectory: '$(Agent.BuildDirectory)/ProjectName'

stages:
- stage: Build
  jobs:
  - job: Build
    pool:
      vmImage: windows-2019
      demands:
      - msbuild
      - visualstudio
      - vstest
    variables:
      buildConfiguration: 'Release'
      buildPlatformSolutionLevel: 'Any CPU'
      buildPlatformProjectLevel: 'AnyCPU'
    steps:
    - template: azure-pipelines-ci-build-steps-template.yml
    - template: azure-pipelines-ci-build-publishing-steps-template.yml
- stage: TA
  dependsOn: Build
  jobs:
  - deployment: TA
    pool:
      vmImage: windows-2019
    environment: 'TA'
    timeoutInMinutes: 0
    strategy:
      runOnce:
        deploy:
          steps:
          # ...
          - task: AzureRmWebAppDeployment@4
            displayName: 'Deploy ProjectName'
            inputs:
              azureSubscription: $(azureSubscriptionProject)
              webAppName: '$(AzureResourcesPrefix)-project-name'
              package: '$(artifactDropDirectory)/web-applications/ProjectName.zip'
              enableCustomDeployment: true
              removeAdditionalFilesFlag: true
              xmlTransformation: true
              # ...
An excerpt of the logs of this step/task:
Start tranformation to 'D:\a\_temp\temp_web_package_39967480923393817\Content\D_C\a\1\s\src\ProjectName\obj\Release\Package\PackageTmp\Web.config'.
Source file: 'D:\a\_temp\temp_web_package_39967480923393817\Content\D_C\a\1\s\src\ProjectName\obj\Release\Package\PackageTmp\Web.config'.
Transform file: 'D:\a\_temp\temp_web_package_39967480923393817\Content\D_C\a\1\s\src\ProjectName\obj\Release\Package\PackageTmp\Web.Release.config'.
Transformation task is using encoding 'System.Text.UTF8Encoding'. Change encoding in source file, or use the 'encoding' parameter if you want to change encoding.
Has somebody experienced this issue as well?
Am I doing something wrong, or is this a bug, or even by design?
I just realized there is a new FileTransform task, but I would prefer not to use it if this should work in a simpler way.
Setting the variable Release.EnvironmentName solved it, and the environment-specific config transformation was triggered. This is possible at the stage level (in case all jobs share the same environment name) and at the job level.
An example:
# ...
- stage: TA
  dependsOn: Build
  variables:
    Release.EnvironmentName: TA
  jobs:
  - deployment: TA
    pool:
      vmImage: windows-2019
    environment: 'TA'
# ...
Answer provided by MSFT on developercommunity.visualstudio.com.