I have a YAML file that processes the output of two different pipelines, and we have that pipeline working. Now I would like to create another pipeline from the same YAML file, but with variables specific to each pipeline. I want one pipeline that runs against the output of 'develop' branch builds, and a second that runs against the output of 'master/main' branch builds. I looked at parameters, but I do not want people to have to remember to select 'develop' or 'main' when these are run.
When I created the 'master' pipeline, I saw a button in the ADO UI to create a variable. I thought those would be specific to the pipeline and not the underlying YAML file. I have tried various formats for the variables: ${{ variables.branch }} and $(branch).
Something like:
resources:
  pipelines:
  - pipeline: xyzBuild # Name of the pipeline resource.
    source: $(xyzLinux_Pipeline) # The name of the pipeline referenced by this pipeline resource.
    trigger: true # Run this pipeline when any run of Build_xyz_MasterBranch completes.
So, I would set xyzLinux_Pipeline to the name of the appropriate upstream pipeline in each specific pipeline.
Maybe I'm misunderstanding variables completely...
Sorry for the formatting... Cannot get the text to look like I want....
Thanks.
Edit: I see some Microsoft examples that have something like:
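# (assumed shape of the example, inferred from the next sentence; a typical branch trigger from the Microsoft docs)
trigger:
  branches:
    include:
    - master
    - releases/*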
That looks like it will trigger for either 'master' or 'releases/*'; in my case I could use 'develop'. But how does the checkout 'know' which repo and branch to pull the code from?
# Update the local develop branch from the head of the remote repo.
- script: |
    git checkout develop
    git pull
  displayName: 'Pull the develop branch from the remote repo.'
Our YAML files have the branch name hardcoded. Is there some special keyword I can use that will pick the correct repo/branch based on what triggered the run? We have only started using these pipelines and we're all just guessing. I have looked at the ADO website, but it seems too fragmented for me to piece together the whole concept.
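One way to avoid the hardcoding, sketched below with made-up names (xyzLinux_Develop_Build, process-output-template.yml are illustrative): pipeline resources and checkout are resolved at compile time, so runtime $(var) values can't be used there, but template parameters can. Each branch flavor gets a thin top-level YAML that declares its own resource and parameters, and both extend one shared template:

# develop-pipeline.yml -- a thin per-branch pipeline (all names are illustrative)
resources:
  pipelines:
  - pipeline: xyzBuild                   # alias used by download steps
    source: xyzLinux_Develop_Build       # assumed name of the develop build pipeline
    trigger: true                        # run whenever that pipeline completes

extends:
  template: process-output-template.yml  # hypothetical shared template
  parameters:
    branch: develop

# process-output-template.yml -- the shared logic, parameterized by branch
parameters:
- name: branch
  type: string

stages:
- stage: Process
  jobs:
  - job: ProcessOutput
    steps:
    - checkout: self   # checks out the ref this run was created for, no hardcoded branch
    - script: echo "Processing output built from ${{ parameters.branch }}"
      displayName: 'Process ${{ parameters.branch }} output'

A 'main' flavor would be a second thin YAML with source set to the main build pipeline and branch: main, so nobody has to pick anything at queue time.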
Related
I am using Azure Data Factory in combination with Git.
During execution of a pipeline, can I get the current branch name as a pipeline parameter, maybe like this?
Desired output (an additional column with the branch name):
Sources: ETL / ELT pipelines - Metainformation about the pipeline
A workaround to achieve this scenario is to follow the steps below:
1. Create a pipeline parameter of type string, with your branch name as the default value.
2. In the source of the Copy activity, add an additional column whose value is the parameter you created for the branch name.
Output
The only system variables available are the ones mentioned in the documentation.
So, I have the following situation:
- 20 Git repositories with a microservice in each
- A repo with a template pipeline for the standard build process
- Each of the 20 repos defines its own pipeline that uses this template with some parameters
- On a PR build for any of the 20 repos, it will run its own pipeline as a build validation.
That's all working fine.
But now I want to add an additional optional check to each of the 20 repos, which would run a code analysis tool (e.g. SonarQube) as part of the PR.
I don't want to add this to the main pipeline as I want it to appear in the PR as a separate optional check which can be skipped or toggled between optional/required.
The only way that I can find to achieve this is to add a CodeAnalysis.yml to each of the 20 repos and create 20 associated pipelines, which is an overhead I'd rather not deal with.
Is there a way that you can have a single pipeline that can be referenced as an optional check in all of these repos?
According to the docs, it should be possible for the shared pipeline to dynamically fetch the code from the right repo using something like this:
- checkout: git://ProjectName/$(Build.Repository.Name)#$(Build.SourceBranch)
But when I try this, the PR is unable to queue the pipeline (unhelpfully, it doesn't give a reason why).
Is there a solution to this?
You need to use templates to design a shared template that runs the code scanning. Essentially, templates are reusable YAML files that you can pass parameters to in order to customise options.
For example, for your use case you could have a template for the code scanning, add a job to your existing pipelines that extends this template, pass any optional parameters you need (such as the repo to check out), and add conditions to decide when to run the code-scanning check.
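A rough sketch of that shape, with invented file, project, and repo names (the scanner step is a placeholder):

# code-analysis.yml -- hypothetical shared template in a central templates repo
parameters:
- name: repoName
  type: string
- name: runScan
  type: boolean
  default: true

jobs:
- ${{ if eq(parameters.runScan, true) }}:
  - job: CodeAnalysis
    pool:
      vmImage: ubuntu-latest
    steps:
    - checkout: git://MyProject/${{ parameters.repoName }}  # compile-time value, so this is allowed
    - script: echo "run SonarQube (or another scanner) against the sources here"
      displayName: 'Code analysis'

# azure-pipelines-analysis.yml -- thin pipeline in one of the 20 repos
resources:
  repositories:
  - repository: templates
    type: git
    name: MyProject/pipeline-templates  # assumed location of the template repo

jobs:
- template: code-analysis.yml@templates
  parameters:
    repoName: my-service-repo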
I know what you mean; this is not possible (at least not in the PR interface), because when you press the 'Queue' button in the PR build section, there isn't even a popup for selecting parameters; it just uses the default values.
- checkout: git://ProjectName/$(Build.Repository.Name)#$(Build.SourceBranch)
This is also not possible, because runtime variables are not accepted here; they are read as literal strings.
One suggestion is to specify the parameters manually on the pipeline page and then run the pipeline after setting them.
The reasons for this are:
1. The checkout section is expanded before anything else happens in the pipeline run.
2. The Queue button on the PR page doesn't provide a pop-out window for selecting parameters.
Your pipeline definition should be like this:
trigger:
- none

parameters:
- name: ProjectName
  default: xxx
- name: RepoName
  default: xxx
- name: BranchName
  default: xxx

pool:
  vmImage: windows-latest

steps:
- checkout: git://${{ parameters.ProjectName }}/${{ parameters.RepoName }}@${{ parameters.BranchName }} # note: '@' separates the ref in the inline checkout syntax
- script: |
    dir
  displayName: 'Run a multi-line script'
Manually select the parameters every time:
I have a collection of files in my pipeline from my build stages. Each is put into an appropriately named folder and published with PublishPipelineArtifact. In a release stage (not another pipeline), I would like to download specific artifacts and run a foreach over them. Specifically, I want to run a SqlAzureDacpacDeployment task on each of the dacpacs, but I can't figure out how to get the ${{ each }} expression to work over the set of downloaded files. I will need a similar foreach to do things like push microservices, etc.
I am currently using this to get all the files I want
- task: DownloadPipelineArtifact@2
  inputs:
    source: current
    path: $(Pipeline.Workspace)
    patterns: "SQLScripts-**"
How do I foreach over these files and call SqlAzureDacpacDeployment (or whatever other steps) for each one?
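One thing worth knowing: ${{ each }} is evaluated at template-expansion time, so it can only iterate over parameters, not over files that exist only after the download runs. A minimal sketch of the usual workaround, a script loop (the SqlPackage call and the $(SqlConnectionString) variable are assumptions, standing in for SqlAzureDacpacDeployment):

- task: DownloadPipelineArtifact@2
  inputs:
    source: current
    path: $(Pipeline.Workspace)
    patterns: 'SQLScripts-**'

- pwsh: |
    # find every downloaded dacpac and deploy each one in turn
    $dacpacs = Get-ChildItem "$(Pipeline.Workspace)" -Recurse -Filter '*.dacpac'
    foreach ($dacpac in $dacpacs) {
      Write-Host "Deploying $($dacpac.FullName)"
      # assumed: SqlPackage is on the agent and $(SqlConnectionString) is a pipeline variable
      sqlpackage /Action:Publish /SourceFile:"$($dacpac.FullName)" /TargetConnectionString:"$(SqlConnectionString)"
    }
  displayName: 'Deploy each dacpac'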
I have Azure DevOps YAML deployment pipelines that trigger when a build pipeline completes and publishes an artifact. According to the docs, artifacts are downloaded to the $(Pipeline.Workspace)/pipeline-identifier/artifact-identifier folder.
My trigger is similar to:
resources:
  pipelines:
  - pipeline: SmartHotel-PipelineIdentifier
    project: DevOpsProject
    source: SmartHotel-CI
    trigger:
      branches:
        include:
        - main
How can I access the pipeline-identifier from a template? I need to be able to construct
$(Pipeline.Workspace)/SmartHotel-PipelineIdentifier/artifact-identifier
based on the pipeline definition above.
When the pipeline is triggered by a build, I'm able to use
$(Pipeline.Workspace)/$(resources.triggeringAlias)/artifact-identifier
to give me what I need, but that value is blank when the pipeline is triggered manually.
How can I access what appears to be called the pipeline-identifier for a specific pipeline, for use within templates in the deployment jobs?
You can get various properties of the pipeline resource, but all of that assumes you know the alias, and there's no way to get the alias if the run wasn't triggered automatically. If you had more than one pipeline resource, how would you know which one to pick?
In this case, you could:
- Pass the pipeline identifier (SmartHotel-PipelineIdentifier) as a parameter to the template (a sketch follows the snippet below).
- Assume a convention of well-known pipeline identifiers (i.e. always use artifact-pipeline instead of SmartHotel-PipelineIdentifier).
- Download (or move) the artifact to a well-known location before calling the template. That assumes you already use the pipeline identifier in the download task anyway, so you could simply run mv $(Pipeline.Workspace)/SmartHotel-PipelineIdentifier/artifact-identifier $(Pipeline.Workspace)/artifact-identifier right after the download. You can also use the full DownloadPipelineArtifact task (which download is shorthand for) and configure it to download to a different directory.
- Determine the directory name at runtime once the artifact is downloaded, and set a pipeline variable dynamically:
# there will be directories in $(Pipeline.Workspace) like "a", "b", "s" for checked-out repositories - ignore them
- pwsh: |
    ls $(Pipeline.Workspace)
    $alias = Get-ChildItem $(Pipeline.Workspace) | ? { $_.Name.Length -gt 1 } | select -ExpandProperty Name -First 1
    echo "##vso[task.setvariable variable=ArtifactAlias]$alias"
  displayName: determine artifact pipeline alias
- pwsh: echo 'alias: $(ArtifactAlias)' # use $(ArtifactAlias) where you would use $(resources.triggeringAlias)
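For the first option, a minimal sketch (the template path and parameter name are invented):

# steps/use-artifact.yml -- hypothetical template that receives the alias
parameters:
- name: pipelineAlias
  type: string

steps:
- script: ls "$(Pipeline.Workspace)/${{ parameters.pipelineAlias }}/artifact-identifier"
  displayName: 'Use the downloaded artifact'

# in the consuming pipeline:
steps:
- template: steps/use-artifact.yml
  parameters:
    pipelineAlias: SmartHotel-PipelineIdentifier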
We have a project with a repo on Azure DevOps where we store the ARM templates for our infrastructure. What we want to achieve is to deploy templates on every commit to the master branch.
The question is: is it possible to define one pipeline that deploys only the ARM templates changed by that commit? Let's go with an example. We have 3 templates in the repo:
- t1.json
- t2.json
- t3.json
The latest commit changed only t2.json. In this case we want the pipeline to deploy only t2.json, since t1.json and t3.json weren't changed by this commit.
Is it possible to create one universal pipeline, or should we rather create a separate pipeline for every template, triggered by commits to that specific file?
It is possible to define a single pipeline that deploys only the changed template. You need to add a script task to your pipeline that gets the changed template's file name.
It is easy to get the changed files with the git command git diff-tree --no-commit-id --name-only -r <commitId>. Once you have the changed file's name, assign it to a variable using the logging command ##vso[task.setvariable variable=VariableName]value. Then you can set the csmFile parameter to csmFile: '**\$(fileName)' in the AzureResourceGroupDeployment task.
You can check the YAML pipeline below for an example:
- powershell: |
    # get the changed template
    $a = git diff-tree --no-commit-id --name-only -r $(Build.SourceVersion)
    # assign the file name to a variable
    echo "##vso[task.setvariable variable=fileName]$a"
- task: AzureResourceGroupDeployment@2
  inputs:
    ....
    templateLocation: 'Linked artifact'
    csmFile: '**\$(fileName)'
It is also easy to define multiple pipelines so that only the changed template is deployed. You only need to add a paths trigger for the specific template file in each pipeline, so that a changed template file triggers only its corresponding pipeline.
trigger:
  paths:
    include:
    - pathTo/template1.json
...
- task: AzureResourceGroupDeployment@2
  inputs:
    ....
    templateLocation: 'Linked artifact'
    csmFile: '**\template1.json'
Hope the above helps!
What you ask is not supported out of the box. From what I understood, you want triggers (based on file changes) per step or per job (depending on how you organize your pipeline). However, I'm not sure you need this: deploying an ARM template that was not changed will not affect your Azure resources if you use Create Or Update Resource Group (doc here).
You can also try to detect manually which file was changed (using PowerShell and git commands, for instance), then set a flag and later use this flag to run or skip certain steps, as sketched below. But it looks like overkill for what you want to achieve.
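A minimal sketch of that detect-and-flag pattern (the deployment step is a placeholder and the variable name is invented):

steps:
- powershell: |
    # list the files changed by the triggering commit; flag whether t2.json is among them
    $changed = git diff-tree --no-commit-id --name-only -r $(Build.SourceVersion)
    $flag = if ($changed -contains 't2.json') { 'true' } else { 'false' }
    echo "##vso[task.setvariable variable=t2Changed]$flag"
  displayName: 'Detect whether t2.json changed'

- script: echo "deploy t2.json here"  # placeholder for the real deployment task
  displayName: 'Deploy t2.json'
  condition: and(succeeded(), eq(variables['t2Changed'], 'true'))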