I have a collection of files in my pipeline from my build stages. Each is put into an appropriately named folder and published by PublishPipelineArtifact. In a release stage (not another pipeline), I would like to download specific artifacts and do a foreach over them. Specifically, I want to run a SqlAzureDacpacDeployment task on each of the dacpacs, but I can't figure out how to get an ${{ each }} expression to work over the set of files downloaded. I will need to do a similar foreach to do things like push microservices, etc.
I am currently using this to get all the files I want:
- task: DownloadPipelineArtifact@2
  inputs:
    source: current
    path: $(Pipeline.Workspace)
    patterns: "SQLScripts-**"
How do I foreach over this and call SqlAzureDacpacDeployment (or whatever other steps)?
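For context on why this is tricky: ${{ each }} is expanded at compile time, before any artifacts are downloaded, so it cannot iterate over files found on disk. A common workaround is to loop over a compile-time parameter listing the artifact names. A minimal sketch, assuming hypothetical artifact names and a hypothetical service connection:

```yaml
parameters:
- name: dacpacs
  type: object
  default: [SQLScripts-Orders, SQLScripts-Billing] # hypothetical artifact names

steps:
- task: DownloadPipelineArtifact@2
  inputs:
    source: current
    path: $(Pipeline.Workspace)
    patterns: "SQLScripts-**"

# Expanded at compile time: one deployment task per entry in the parameter.
- ${{ each dacpac in parameters.dacpacs }}:
  - task: SqlAzureDacpacDeployment@1
    inputs:
      azureSubscription: 'my-azure-connection'          # hypothetical
      ServerName: 'myserver.database.windows.net'       # hypothetical
      DatabaseName: 'mydb'                              # hypothetical
      DacpacFile: '$(Pipeline.Workspace)/${{ dacpac }}/*.dacpac'
```

If the set of dacpacs is only known at runtime, the usual alternative is a single script step that enumerates the downloaded folders and invokes SqlPackage itself, since YAML loops cannot react to runtime file listings.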
I have a YAML file that process the output of two different pipelines. We successfully have that pipeline working. Now I would like to create another pipeline using the same YAML file, but using variables specific to each pipeline. I am trying to run one pipeline using the output of "develop" branch pipelines. Then a second pipeline that will use the output of "master/main" branch pipelines. I saw parameters, but I do not want people to have to remember to select 'develop' or 'main' when these are run.
When I created the 'master' pipeline, I saw a button to create variables in the ADO UI. I thought those would be specific to the pipeline and not the underlying YAML file. I have tried various formats for the variables: ${{ variables.branch }} and $(branch).
Something like:
resources:
  pipelines:
  - pipeline: xyzBuild # Name of the pipeline resource.
    source: $(xyzLinux_Pipeline) # The name of the pipeline referenced by this pipeline resource.
    trigger: true # Run this pipeline when any run of Build_xyz_MasterBranch completes
So, I would set xyzLinux_Pipeline to the name of the source pipeline in each specific pipeline.
Maybe I'm misunderstanding variables completely...
Thanks.
Edit: I see some Microsoft examples that have a branch trigger including 'master' or 'releases/*'; in my case I could use 'develop'. But how does the checkout 'know' which repo to pull the code from?
# Update the local develop branch from the head of the remote repo.
- script: |
    git checkout develop
    git pull
  displayName: 'Pull the develop branch from the remote repo.'
Our YAML files have the branch name hardcoded. Is there some special keyword I can use that will select the correct repo based on the trigger name? We have only started to use these pipelines and we're all just guessing. I have looked at the ADO website, but it seems too fragmented for me to piece together the whole concept.
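One pattern that fits this situation (shared logic, per-pipeline values, and nothing for people to select at queue time) is to keep the steps in a template and register one thin wrapper YAML per pipeline that hard-codes its branch and pipeline resource at compile time. A sketch with hypothetical file and pipeline names:

```yaml
# develop-pipeline.yml -- hypothetical wrapper registered as the 'develop' pipeline
resources:
  pipelines:
  - pipeline: xyzBuild
    source: Build_xyz_DevelopBranch # hypothetical source pipeline, hard-coded per wrapper
    trigger: true

extends:
  template: shared-steps.yml # hypothetical shared template
  parameters:
    branch: develop

# shared-steps.yml would declare:
#   parameters:
#   - name: branch
#     type: string
# and use ${{ parameters.branch }} wherever the branch name is currently hardcoded,
# e.g. `git checkout ${{ parameters.branch }}`.
```

A second wrapper (say, main-pipeline.yml) would pass branch: main and point at the main-branch build, so each registered pipeline carries its own values without runtime prompts.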
Because of the limitation of Azure Test Plans not being able to choose a build from a different project, I was wondering if it is possible to create a pipeline that would at least clone the build number from another project.
Here's the narrative:
There's a project ProjA with pipeline P1 that generates a build number using the following line
name: $(date:yyyyMMdd)$(rev:.r)
I want it such that:
Another project ProjB has a pipeline P1 which gets triggered whenever ProjA.P1 is successful, so that a build is recorded with the same name as the build run from ProjA.P1.
UPDATE note: I am looking specifically for ProjA.P1's build number and not whatever would've triggered ProjA.P1. The original accepted answer works for the simple case where ProjA.P1 is triggered from the ProjA.P1 pipeline.
However, if ProjA.P1 has trigger: none and uses resources.pipelines to trigger its build, it uses the build number of the referenced pipeline rather than ProjA.P1.
We can set a pipeline completion trigger to Trigger one pipeline after another. (ProjB.P1 is triggered when ProjA.P1 completes.)
We can get the triggering build's name using the resource variable $(resources.pipeline.<Alias>.runName) in YAML. (It will retrieve the buildNumber of the pipeline ProjA.P1.) See Pipeline resource metadata as predefined variables for details.
Then use the UpdateBuildNumber logging command (##vso[build.updatebuildnumber]build number) to update the build number of ProjB.P1.
ProjB.P1 YAML for your reference:
trigger: none

resources:
  pipelines:
  - pipeline: ProjA-Trigger # Any alias here
    source: P1 # The real pipeline name (ProjA.P1 here)
    project: ProjA # Project ProjA
    trigger:
      branches:
        include:
        - refs/heads/main

steps:
- task: Bash@3
  inputs:
    targetType: 'inline'
    script: 'echo "##vso[build.updatebuildnumber]$(resources.pipeline.ProjA-Trigger.runName)"'
In Azure DevOps, I have one build pipeline that runs which could produce 1 or 2 artifacts I want to release. Let's call them Artifact1 and Artifact2.
Is it possible to have one release pipeline with multiple stages that only start if a specific artifact exists? So if Artifact1 was produced, run the Stage Artifact1Stage, but not Artifact2Stage.
I see there are branch filters in DevOps, but that doesn't get me what I want. I want to filter on the artifact produced.
In the Deployment group job (in a stage) there is an Artifact download option which allows me to select the specific artifact I want, but this doesn't prevent the stage from running (and then failing if the specific artifact wasn't produced).
EDIT: If it matters, I am not using yaml syntax for the release pipeline.
I would do something like the following (assuming this is classic pipeline as you suggested):
- Add a PowerShell task whose script inspects the contents of $(System.DefaultWorkingDirectory) (including whatever subfolder contains your artifact) and then uses Write-Host "##vso[task.setvariable variable=whichArtifact;isoutput=true]some value that tells you whether it's artifact 1 or 2 that's present".
- Create 2 jobs in a single stage, one for each artifact (or two stages with one job each, your call). For the Run on agent job's additional option Run this job, choose Custom condition using variable expressions, with a value like eq(variables['whichArtifact'], 'value that tells you which artifact this is').
The variable value you set and condition you choose should be mutually exclusive so that you can clearly select the stage/job you wish to run.
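The inspection script from the first step could look something like this sketch. It assumes the artifacts land in folders named Artifact1/Artifact2 under the default working directory, which you would adjust to your actual layout:

```powershell
# Hypothetical detection script for the PowerShell task in the release stage.
$base = "$(System.DefaultWorkingDirectory)"
$which = ''
if (Test-Path (Join-Path $base 'Artifact1')) { $which = 'Artifact1' }
elseif (Test-Path (Join-Path $base 'Artifact2')) { $which = 'Artifact2' }
# Publish the result as an output variable so later jobs can use it in conditions.
Write-Host "##vso[task.setvariable variable=whichArtifact;isoutput=true]$which"
```

Note that with isoutput=true the variable is addressed through the task's reference name when consumed from another job, so the exact condition expression depends on where the consuming job sits relative to the task.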
I have Azure DevOps YAML deployment pipelines that trigger when a build pipeline completes and publishes an artifact. According to the docs found here, artifacts will be downloaded to the $(PIPELINE.WORKSPACE)/pipeline-identifier/artifact-identifier folder.
My trigger is similar to:
resources:
  pipelines:
  - pipeline: SmartHotel-PipelineIdentifier
    project: DevOpsProject
    source: SmartHotel-CI
    trigger:
      branches:
        include:
        - main
How can I access the pipeline-identifier from a template? I need to be able to create
$(PIPELINE.WORKSPACE)/SmartHotel-PipelineIdentifier/artifact-identifier
based upon the pipeline definition above.
When the pipeline is triggered by a build, I'm able to use
$(Pipeline.Workspace)/$(resources.triggeringAlias)/artifact-identifier
to give me what I need, but that value is blank when the pipeline is triggered manually.
How can I access what appears to be called the pipeline-identifier value for a specific pipeline to be used within templates in the deployment jobs of pipelines?
You can get various properties of the pipeline resource, but all of that assumes you know the alias. There's no way to get that alias if the pipeline wasn't triggered automatically. And if you had more pipeline resources, how would you know which one to get?
In this case, you could:
- Pass the pipeline identifier (SmartHotel-PipelineIdentifier) as a parameter to the template.
- Assume a convention for well-known pipeline identifiers (i.e. always use artifact-pipeline instead of SmartHotel-PipelineIdentifier).
- Download (or move) the artifact to a well-known location before calling the template. That's assuming you already use the pipeline identifier for the download task anyway, so you could simply do mv $(Pipeline.Workspace)/SmartHotel-PipelineIdentifier/artifact-identifier $(Pipeline.Workspace)/artifact-identifier right after download. You can also use the full DownloadPipelineArtifact task (which download is a shorthand for) and configure it to download to a different directory.
- Determine the directory name at runtime, once the artifact is downloaded, and set a pipeline variable dynamically:
# there will be directories in $(Pipeline.Workspace) like "a", "b", "s" for checked out repositories - ignore them
- pwsh: |
    ls $(Pipeline.Workspace)
    $alias = Get-ChildItem $(Pipeline.Workspace) | ? { $_.Name.Length -gt 1 } | select -ExpandProperty Name -First 1
    echo "##vso[task.setvariable variable=ArtifactAlias]$alias"
  displayName: determine artifact pipeline alias
- pwsh: echo 'alias: $(ArtifactAlias)' # use $(ArtifactAlias) where you would use $(resources.triggeringAlias)
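The first option (passing the identifier as a template parameter) could be sketched like this, with hypothetical file and parameter names:

```yaml
# deploy-steps.yml -- hypothetical template
parameters:
- name: pipelineAlias
  type: string

steps:
- script: ls $(Pipeline.Workspace)/${{ parameters.pipelineAlias }}/artifact-identifier
  displayName: work with the downloaded artifact

# Caller (the main pipeline) would pass the alias explicitly:
# steps:
# - template: deploy-steps.yml
#   parameters:
#     pipelineAlias: SmartHotel-PipelineIdentifier
```

Since template expressions are resolved at compile time, the parameter works regardless of whether the run was triggered automatically or manually, which is exactly where $(resources.triggeringAlias) falls short.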
We have a project with repo on Azure DevOps where we store ARM templates of our infrastructure. What we want to achieve is to deploy templates on every commit on master branch.
The question is: is it possible to define one pipeline which could trigger a deployment of only the ARM templates changed by that commit? Let's go with an example. We have 3 templates in the repo:
t1.json
t2.json
t3.json
The latest commit changed only t2.json. In this case we want the pipeline to deploy only t2.json, as t1.json and t3.json haven't been changed in this commit.
Is it possible to create one universal pipeline, or should we rather create a separate pipeline for every template, triggered by a commit on a specific file?
It is possible to define only one pipeline to deploy the changed template. You need to add a script task to get the changed template file name in your pipeline.
It is easy to get the changed files using the git command git diff-tree --no-commit-id --name-only -r commitId. When you get the changed file's name, you need to assign it to a variable using the expression ##vso[task.setvariable variable=VariableName]value. Then you can set the csmFile parameter like csmFile: '**\$(fileName)' in the AzureResourceGroupDeployment task.
You can check below yaml pipeline for example:
- powershell: |
    # get the changed template
    $a = git diff-tree --no-commit-id --name-only -r $(Build.SourceVersion)
    # assign the filename to a variable
    echo "##vso[task.setvariable variable=fileName]$a"

- task: AzureResourceGroupDeployment@2
  inputs:
    ....
    templateLocation: 'Linked artifact'
    csmFile: '**\$(fileName)'
It is also easy to define multiple pipelines to achieve deploying only the changed template. You only need to add a paths trigger for the specific template file in each pipeline, so that a changed template file triggers only its corresponding pipeline.
trigger:
  paths:
    include:
    - pathTo/template1.json
...
- task: AzureResourceGroupDeployment@2
  inputs:
    ....
    templateLocation: 'Linked artifact'
    csmFile: '**\template1.json'
Hope the above helps!
What you asked is not supported out of the box. From what I understood, you want to have triggers (based on file changes) per step or per job (depending on how you organize your pipeline). However, I'm not sure if you need this. Deploying an ARM template which was not changed will not affect your Azure resources if you use Create Or Update Resource Group doc here
You can also try to manually detect which file was changed (using powershell for instance and git commands), then set a flag and later use this flag to fire or skip some steps. But it looks like overkill for what you want to achieve.
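The "set a flag, then condition later steps on it" approach could be sketched like this (hypothetical variable name, with a single template t2.json shown):

```yaml
steps:
- powershell: |
    # Flag whether t2.json changed in the commit being built (hypothetical file name).
    $changed = git diff-tree --no-commit-id --name-only -r $(Build.SourceVersion)
    $flag = if ($changed -contains 't2.json') { 'true' } else { 'false' }
    echo "##vso[task.setvariable variable=deployT2]$flag"
  displayName: detect changed templates

# Runs only when the flag was set to 'true' by the detection step above.
- task: AzureResourceGroupDeployment@2
  condition: and(succeeded(), eq(variables['deployT2'], 'true'))
  inputs:
    templateLocation: 'Linked artifact'
    csmFile: '**\t2.json'
    # ...other deployment inputs (subscription, resource group, etc.)...
```

With one flag and one conditioned task per template, all three templates can live in a single pipeline, at the cost of the extra detection step the answer warns may be overkill.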