So, I have the following situation:
20 git repositories with a microservice in each
A repo with a template pipeline for the standard build process
Each of the 20 repos defines its own pipeline that uses this template with some parameters
On a PR build for any of the 20 repos, it will run its own pipeline as a build validation.
That's all working fine.
But now, I want to add an additional Optional Check to each of the 20 repos which would run a code analysis tool (e.g. SonarQube) as part of the PR.
I don't want to add this to the main pipeline as I want it to appear in the PR as a separate optional check which can be skipped or toggled between optional/required.
The only way that I can find to achieve this is to add a CodeAnalysis.yml to each of the 20 repos and create 20 associated pipelines, which is an overhead I'd rather not deal with.
Is there a way that you can have a single pipeline that can be referenced as an optional check in all of these repos?
According to the docs, it should be possible for the shared pipeline to dynamically fetch the code from the right repo using something like this:
- checkout: git://ProjectName/$(Build.Repository.Name)@$(Build.SourceBranch)
But when I try this, the PR is unable to queue the pipeline (unhelpfully, it doesn't give a reason why).
Is there a solution to this?
You need to use templates to design a shared template that runs the code scanning. Essentially, templates are reusable YAML files that you can pass parameters to in order to customise options.
For example, for your use case you could have a template for the code scanning, add a job onto your pipelines that uses this template, pass any optional parameters you need (such as the repo to check out), and add conditions to decide when to run the code scanning check.
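One possible shape for this (the PipelineTemplates repo, the code-analysis.yml file name, and the sonarProjectKey parameter are all illustrative names, not a fixed convention):

# code-analysis.yml, kept in the shared templates repo
parameters:
- name: sonarProjectKey
  type: string

jobs:
- job: CodeAnalysis
  steps:
  # Placeholder for the real scanner invocation (e.g. the SonarQube tasks).
  - script: echo "sonar-scanner -Dsonar.projectKey=${{ parameters.sonarProjectKey }}"
    displayName: 'Run code analysis'

Each of the 20 pipelines would then pull it in with something like:

resources:
  repositories:
  - repository: templates
    type: git
    name: ProjectName/PipelineTemplates

jobs:
- template: code-analysis.yml@templates
  parameters:
    sonarProjectKey: my-service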
I know what you mean, but this is not possible (at least not in the PR interface), because when you press the 'Queue' button in the PR build section there isn't even a popup to select parameters; it just uses the default values.
- checkout: git://ProjectName/$(Build.Repository.Name)@$(Build.SourceBranch)
This is also not possible, because runtime variables are not accepted here; they are read literally as strings.
One suggestion is that you can specify the parameters manually on the pipeline page and then run the pipeline after setting them.
The reasons for this are:
1. The checkout section is expanded before anything else happens in a pipeline run, so runtime variables are not available to it.
2. The Queue button on the PR page doesn't provide a pop-up window to select parameters.
Your pipeline definition should be like this:
trigger: none

parameters:
- name: ProjectName
  default: xxx
- name: RepoName
  default: xxx
- name: BranchName
  default: xxx

pool:
  vmImage: windows-latest

steps:
- checkout: git://${{ parameters.ProjectName }}/${{ parameters.RepoName }}@${{ parameters.BranchName }}
- script: |
    dir
  displayName: 'Run a multi-line script'
Then manually select the parameters each time you queue the pipeline.
I have a YAML file that processes the output of two different pipelines. We successfully have that pipeline working. Now I would like to create another pipeline using the same YAML file, but using variables specific to each pipeline. I am trying to run one pipeline using the output of "develop" branch pipelines, and then a second pipeline that will use the output of "master/main" branch pipelines. I saw parameters, but I do not want people to have to remember to select 'develop' or 'main' when these are run.
When I created the 'master' pipeline, I saw a button to create variables in the ADO UI. I thought those would be specific to the pipeline and not the underlying YAML file. I have tried various formats for the variables, ${{ variables.branch }} and $(branch).
Something like:
resources:
  pipelines:
  - pipeline: xyzBuild # Name of the pipeline resource.
    source: $(xyzLinux_Pipeline) # The name of the pipeline referenced by this pipeline resource.
    trigger: true # Run this pipeline when any run of Build_xyz_MasterBranch completes
So, I would set xyzLinux_Pipeline to the name of the source pipeline in each specific pipeline.
Maybe I'm misunderstanding variables completely...
Sorry for the formatting... Cannot get the text to look like I want....
Thanks.
Edit: I see some Microsoft examples that have a trigger like this:
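# A branch-filter trigger along these lines (assumed from the description below):
trigger:
  branches:
    include:
    - master
    - releases/*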
That looks like it will trigger for either 'master' or 'releases/*'; in my case I could use 'develop'. But how does the checkout 'know' which repo to pull the code from?
# Update the local develop branch from the head of the remote repo.
- script: |
    git checkout develop
    git pull
  displayName: 'Pull the develop branch from the remote repo.'
Our YAML files have the branch name hardcoded. Is there some special keyword I can use that will specify the correct repo based on the trigger name? We have only started to use these pipelines and we're all just guessing. I have looked at the ADO website, but it seems too fragmented for me to piece together the whole concept.
We've been migrating some of our manual deployment processes from Octopus to Azure DevOps YAML pipelines. One of the QoL changes we're sorely missing is being able to select the environment from a drop-down list / auto-complete field, as we could in Octopus.
Is there a way to achieve this? Currently, the only way I can think of doing it is to have a repo with a .yaml template file updated with a list of new environments as part of our provisioning process... which seems less than ideal.
If you are going to trigger the pipeline manually, then you can make use of runtime parameters in the Azure DevOps pipeline.
For example:
To make the environment name selectable from a list of choices, you can use the following snippet.
parameters:
- name: EnvName
  displayName: EnvName
  type: string
  default: A
  values:
  - A
  - B
  - C
  - D
  - E
  - F

trigger: none # trigger is explicitly set to none

jobs:
- job: build
  displayName: build
  steps:
  - script: echo building $(Build.BuildNumber) with ${{ parameters.EnvName }}
Documentation about runtime parameters is here.
The downside is that trigger: none means the pipeline can only be triggered manually. I'm not sure how this works with other trigger options.
Hi, I have a pipeline dependency challenge and have thought up a number of possible solutions. I can try them all out in a lab, but I am wondering whether they work well "in the field", so I would like to know if anyone has tried them.
I have 3 stages, each in their own YML file. Each one is called from a main YML which is called from a main pipeline.
- template: 'build.yml'
- template: 'deploy.yml'
- template: 'test.yml'
The 'deploy.yml' generates a large number of output environment variables and 4 of these are consumed by the 'test.yml' using the "stageDependencies" syntax:
stages:
- stage: 'Test_Stage'
  dependsOn: Deploy_Stage
  jobs:
  - job: 'Test_Job'
    variables:
      MyWebSite: $[ stageDependencies.Deploy_Stage.Variables_Job.outputs['Variables_Job.Variables.MyWebSite'] ]
This works nicely.
But I'd like to be able to create a pipeline that just runs the test stage (to test a pre-existing web site). That doesn't work, of course, because of the dependsOn: Deploy_Stage dependency.
I can think of a few possible solutions:
Instead of having a dependency and using the stageDependencies syntax, send MyWebSite as a pipeline parameter between stages. (Note that there are actually 4 parameters, not 1; I simplified to demonstrate the challenge.) If I do that, the tester gets prompted to fill out (or choose from a list) the various parameters. But it does create linkage between Deploy_Stage and Test_Stage; I don't know if that's a bad thing.
Pass a Boolean parameter from Deploy_Stage to Test_Stage such as "CalledFromDeployStage" and then in Test_Stage, do this:
stages:
- stage: 'Test_Stage'
  ${{ if eq(parameters.CalledFromDeployStage, true) }}:
    dependsOn: Deploy_Stage
  jobs:
  - job: 'Test_Job'
    variables:
      MyWebSite: $[ stageDependencies.Deploy_Stage.Variables_Job.outputs['Variables_Job.Variables.MyWebSite'] ]
This feels a bit clunky.
Create a new YML called "Test_Stage_Manual" and get it to prompt for the various parameters and leave the rest as-is. (If I do this, I would probably put the jobs into their own YML file and call that YML from both Test stages.)
Something else?
You can try something like the below:
Create an individual YAML pipeline that only runs the test.
In the "Deploy_Stage" of the main pipeline, add a step or job at the end of the stage that calls the "Runs - Run Pipeline" REST API to trigger the test pipeline after all the previous steps and jobs in the stage have completed successfully.
When calling the "Runs - Run Pipeline" API, you can pass the variables and parameters generated in the "Deploy_Stage" to the test pipeline via the request body (JSON) of the API.
Because the test is in an individual pipeline, you can also trigger it manually if you like. When triggering manually, you can set the values of the required variables and parameters yourself.
This way, you can trigger the test pipeline both from the "Deploy_Stage" and manually.
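As a minimal sketch of that trigger job (the pipeline ID 42, the Variables_Job job, and its Variables step that sets MyWebSite are all placeholder names, not anything from your setup):

- job: Trigger_Test_Pipeline
  dependsOn: Variables_Job
  variables:
    # Placeholder names: job Variables_Job, step Variables, output MyWebSite.
    MyWebSite: $[ dependencies.Variables_Job.outputs['Variables.MyWebSite'] ]
  steps:
  - script: |
      curl -sS -X POST \
        -H "Authorization: Bearer $(System.AccessToken)" \
        -H "Content-Type: application/json" \
        -d '{"variables": {"MyWebSite": {"value": "$(MyWebSite)"}}}' \
        "$(System.CollectionUri)$(System.TeamProject)/_apis/pipelines/42/runs?api-version=6.0"
    displayName: 'Trigger the test pipeline via Runs - Run Pipeline'

Note that the build service identity behind $(System.AccessToken) needs permission to queue runs of the test pipeline.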
I have a pipeline in Azure DevOps somewhat like this:
parameters:
- name: Scenario
  displayName: Scenario suite
  type: string
  default: 'Default'

variables:
  Scenario: ${{ parameters.Scenario }}

...

steps:
- script: echo Scenario is $(Scenario)
And I'm executing the pipeline via the VSTS CLI like this:
vsts build queue ... --variables Scenario=Test
When I run my pipeline, it seems that the parameter default value overwrites my cmd line specified variable value and I get the step output Scenario is Default. I tried something like Scenario: $[coalesce(variables['Scenario'], ${{ parameters.Scenario }})] but I think I got the syntax wrong because that caused a parsing issue.
What would be the best way to only use the parameter value if the Scenario variable has not already been set?
What would be the best way to only use the parameter value if the Scenario variable has not already been set?
Sorry, but as far as I know your scenario is not supported by design. The note here states:
When you set a variable in the YAML file, don't define it in the web editor as settable at queue time. You can't currently change variables that are set in the YAML file at queue time. If you need a variable to be settable at queue time, don't set it in the YAML file.
The --variables switch in the command can only be used to overwrite variables that are marked as Settable at queue time. Since YAML pipelines don't support settable variables by design, your --variables Scenario=Test won't actually be passed when queuing the YAML pipeline.
Here are several tests to prove that:
1. A YAML pipeline, which doesn't support settable variables at queue time:
pool:
  vmImage: 'windows-latest'

variables:
  Scenario: Test

steps:
- script: echo Scenario is $(Scenario)
I ran the command vsts build queue ... --variables Scenario=Test123; the pipeline run started, but the output log was always Scenario is Test instead of the expected Scenario is Test123. This proves that the pipeline parameter isn't overwriting the variable value; rather, --variables Scenario=xxx doesn't get passed at all, because YAML pipelines don't support settable variables.
2. A Classic UI build pipeline with a pipeline variable Scenario:
Queuing it via the command az pipelines build queue ... --variables Scenario=Test12345 (which has the same function as vsts build queue ... --variables Scenario=Test) only gives this error:
Could not queue the build because there were validation errors or warnings.
3. The same pipeline, with the Settable at queue time option enabled for this variable:
Running the same command again now queues the build successfully, and the new value set on the command line overwrites the original pipeline variable.
You can run tests similar to mine to figure out the cause of the behavior you're seeing.
In addition: the VSTS CLI was deprecated and replaced by the Azure CLI with the Azure DevOps extension a long time ago, so it's now recommended to use az pipelines build queue instead.
Lance had a great suggestion, but here is how I ended up solving it:
parameters:
- name: Scenario
  displayName: Scenario suite
  type: string
  default: 'Default'

variables:
  ScenarioFinal: $[coalesce(variables['Scenario'], '${{ parameters.Scenario }}')]

...

steps:
- script: echo Scenario is $(ScenarioFinal)
In this case we use the coalesce expression to assign the value of a new variable, ScenarioFinal. This way we can still use --variables Scenario=Test via the CLI or use the parameter via the pipeline UI. coalesce will take the first non-null value and effectively "reorder" the precedence Lance linked to here: https://learn.microsoft.com/en-us/azure/devops/pipelines/process/variables?view=azure-devops&tabs=yaml%2Cbatch#expansion-of-variables
(Note that there need to be single quotes around the parameter reference, '${{ }}', because ${{ }} is simply replaced by the raw value, and the coalesce expression doesn't know how to interpret that value unless the single quotes mark it as a string.)
Note that the ability to set parameters via the CLI is a current feature suggestion here: https://github.com/Azure/azure-devops-cli-extension/issues/972
Is it possible in Azure DevOps YAML pipelines to dynamically create additional steps based on some variable data (without creating our own plugin)?
The thing is, I want to iterate through several directories, but I don't want to lump it all into a single step, since that makes it harder to scan through the log to find an error.
Is it possible in Azure DevOps YAML pipelines to dynamically create additional steps based on some variable data (without creating our own plugin)?
No. YAML pipelines (azure-pipelines.yml) are under version control, so what you want (going by your original title) is to dynamically commit changes to the azure-pipelines.yml file while executing the pipeline. That's not a recommended workflow.
1. Instead, you can consider using Azure DevOps conditions to dynamically enable/disable the additional steps:
Use a template parameter as part of a condition
Use the output variable from a job in a condition in a subsequent job
Or Use some predefined variables:
condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/master'))
2. If you're not using conditions, you can check conditional templates, as Simon suggests in the other answer.
Also, both #1 and #2 can work with the new runtime parameters feature.
3. However, if the dynamic data comes from a command result such as components = result of ls -1 $(Pipeline.Workspace)/components, the tips above won't work. For that situation, you can try something like this:
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      # Write your PowerShell commands here.
      # Some logic to run `components = result of ls -1 $(Pipeline.Workspace)/components`
      # and decide whether to set WhetherToRun=true.
      Write-Host "##vso[task.setvariable variable=WhetherToRun]True"

- task: CmdLine@2
  inputs:
    script: |
      echo Hello world
  condition: eq(variables['WhetherToRun'], 'True')
It is possible to include steps conditionally with an if statement.
I think the example of extending a template on the same page will give you a good indication of how to iterate through a list parameter and create / run a step based on each value.
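As a rough sketch of that pattern (the directories parameter and its default values are invented for illustration), an each expression expands into one step per entry of a list parameter at compile time:

parameters:
- name: directories
  type: object
  default:
  - src/serviceA
  - src/serviceB

steps:
# One individually-labelled step is generated per directory,
# so a failure is easy to spot in the run view.
- ${{ each dir in parameters.directories }}:
  - script: echo "building ${{ dir }}"
    displayName: 'Build ${{ dir }}'

Because the expansion happens at compile time, the list must be known when the pipeline starts; it can't come from a command run during the job.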