Equivalent predefined variables in Azure DevOps from GitLab

I am migrating from .gitlab-ci.yml to azure-pipelines.yml.
One of the lines inside the .gitlab-ci.yml uses the variables
$CI_PROJECT_ID, $CI_PIPELINE_ID, and $CI_JOB_ID.
I have figured out the equivalent for $CI_PROJECT_ID: in Azure it is $(System.TeamProjectId).
However, I need help figuring out the equivalents for $CI_PIPELINE_ID and $CI_JOB_ID.
Looking forward to some suggestions.

I'm not sure exactly what those variables mean, but consider these on Azure Pipelines:
$(System.DefinitionId) - The ID of the build pipeline (the definition itself, not an individual run).
$(System.JobId) - A unique identifier for a single attempt of a single job. The value is unique to the current pipeline.
$(Build.BuildId) - The ID of the record for the completed build, i.e. the individual run.
You can find all predefined variables here.
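As a sketch, the mapping can be checked with a simple script step. This assumes $CI_PIPELINE_ID should map to the per-run ID, $(Build.BuildId), rather than to the definition ID:

```yaml
# Minimal azure-pipelines.yml step printing the closest equivalents;
# $(Build.BuildId) is the per-run ID, $(System.DefinitionId) identifies
# the pipeline definition itself.
steps:
- script: |
    echo "CI_PROJECT_ID  -> $(System.TeamProjectId)"
    echo "CI_PIPELINE_ID -> $(Build.BuildId)"
    echo "CI_JOB_ID      -> $(System.JobId)"
  displayName: Print GitLab-equivalent IDs
```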

Related

Azure DevOps: How to eliminate the warning "Tags set for the trigger didn't match the pipeline" in Azure DevOps?

I have two Azure DevOps pipelines set up so that the completion of Pipeline One triggers Pipeline Two. It works fine, but it generates an unnecessary error message.
All recent Pipeline Two builds are listed on this page (not really a link, don't bother clicking on it) : https://dev.azure.com/mycompany/myproject/_build?definitionId=29
Any trigger issues are listed on this page (not really a link, don't bother clicking on it) : https://dev.azure.com/mycompany/myproject/_build?definitionId=29&view=triggerIssues
It appears that every run of Pipeline One -> Pipeline Two adds this warning to the Trigger Issues page: "Tags set for the trigger didn't match the pipeline". It's only a warning, not an error, and Pipeline Two executes successfully. But how can I eliminate this warning message?
The pipeline resource is specified in Pipeline Two as follows:
resources:
  pipelines:
  - pipeline: pipeline-one
    source: mycompany.pipeline-one
    # project: myproject # optional - only required if first pipeline is in a different project
    trigger:
      enabled: true
      branches:
        include:
        - master
        - develop
        - release_*
No tags are specified, because tags are not used.
I have reviewed the following documentation without finding an answer. I may have missed something in the docs.
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/pipeline-triggers?view=azure-devops
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/resources?view=azure-devops&tabs=schema#define-a-pipelines-resource
https://learn.microsoft.com/en-us/azure/devops/pipelines/yaml-schema/resources-pipelines-pipeline?view=azure-pipelines

Azure devops release variables are not getting resolved in custom server task

The release variables below are not getting resolved at run time when used in a custom server task on a release pipeline.
System.JobName
System.JobDisplayName
System.StageDisplayName
System.DefinitionName
I am able to fetch the JobId by using $(System.JobId), but not the other job details mentioned above (such as JobName).
Am I missing anything here?
Those are YAML pipeline variables; they will not work in a classic release pipeline.
Use the release-specific equivalents: https://learn.microsoft.com/en-us/azure/devops/pipelines/release/variables?view=azure-devops&tabs=batch#default-variables
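As a sketch, inside an agent-based script task on a classic release pipeline the release-specific counterparts look like the following (see the linked docs for the full list; note that availability inside a custom *server* task may differ, since server tasks run without an agent):

```shell
# Release-scoped variables (classic release pipelines only)
echo "Release definition: $(Release.DefinitionName)"
echo "Release ID:         $(Release.ReleaseId)"
echo "Stage:              $(Release.EnvironmentName)"
```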

How do I load values from a .json file into a Devops Yaml Pipeline Parameter

Microsoft documentation explains the use of parameters in YAML pipeline jobs as:
# File: azure-pipelines.yml
trigger:
- master

extends:
  template: simple-param.yml
  parameters:
    yesNo: false # set to a non-boolean value to have the build fail
But instead of statically specifying the value of yesNo: I'd prefer to load it from a completely separate json config file. Preferably a json file that both my Build Job and my Application could share so that parameters specified for the Application could also be used in the Build Job.
Thus the question:
How do I load values from a .json file into a Devops Yaml Pipeline Parameter?
I've been using this marketplace task:
https://marketplace.visualstudio.com/items?itemName=OneLuckiDev.json2variable
And it's been working great so far. I haven't tried it with separate build pipelines/multi-stage builds, but I can't see why it wouldn't work. There are a few things you have to be aware of/stumble upon, like double-escaping slashes in directory paths - and you'll have to fetch secrets from someplace else, like traditional variable groups.
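One caveat worth keeping in mind: YAML template *parameters* are expanded at compile time, so they cannot be read from a file during the run; what a task like this (or a short script step) actually produces are runtime *variables*. A minimal sketch of the same idea without a marketplace task, using Azure DevOps logging commands - the file name build-config.json and the flat key/value layout are assumptions:

```python
import json

def json_to_vso_commands(path):
    """Read a flat JSON config file and return Azure DevOps logging
    commands that expose each top-level key as a pipeline variable."""
    with open(path) as f:
        config = json.load(f)
    # When a script step prints these "##vso[task.setvariable ...]" lines,
    # later steps can read the values as $(key).
    return [f"##vso[task.setvariable variable={k}]{v}" for k, v in config.items()]
```

A script step would run this over the shared config file and print the returned lines; subsequent steps then see $(yesNo), etc.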

Specify runtime parameter in a pipeline task

We have a requirement to somehow pass a dynamic runtime parameter to a pipeline task.
For example, the parameter APPROVAL below would be different for each run of the task.
This APPROVAL parameter holds the change and release number, so that the task can tag the Terraform resources it creates for audit purposes.
I've been searching the web for a while with no luck finding a solution. Is this possible in a Concourse pipeline, and what is best practice?
- task: plan-terraform
  file: ci/concourse-jobs/pipelines/tasks/terraform/plan-terraform.yaml
  params:
    ENV: dev
    APPROVAL: test
    CHANNEL: Developement
    GITLAB_KEY: ((gitlab_key))
    REGION: eu-west-2
    TF_FOLDER: terraform/squid
  input_mapping:
    ci: ci
    tf: squid
  output_mapping:
    plan: plan
  tags:
  - dev
From https://concourse-ci.org/tasks.html:
ideally tasks are pure functions: given the same set of inputs, it should either always succeed with the same outputs or always fail.
A dynamic parameter would break that contract and produce different outputs from the same set of inputs. Could you possibly make APPROVAL an input? Then you'd maintain your build traceability. If it's a (file) input, you could then load it into a variable:
APPROVAL=$(cat <filename>)
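Concretely, the task invocation could declare the approval as an input rather than a param - a sketch, where the "approval" input name and the file it carries are hypothetical:

```yaml
- task: plan-terraform
  file: ci/concourse-jobs/pipelines/tasks/terraform/plan-terraform.yaml
  params:
    ENV: dev
    REGION: eu-west-2
  input_mapping:
    ci: ci
    tf: squid
    approval: approval   # hypothetical input carrying the change/release number
```

The task script would then read the number from a file inside that input, keeping the task a pure function of its inputs.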

Download artifact from Azure DevOps Pipeline grandparent Pipeline

Given 3 Azure DevOps Pipelines (more may exist), as follows:
1. Build, Unit Test, Publish Artifacts
2. Deploy Staging, Integration Test
3. Deploy Production, Smoke Test
How can I ensure Pipeline 3 downloads the specific artifacts published in Pipeline 1?
The challenge, as I see it, is that the DownloadPipelineArtifact@2 task only offers a means to do this if the artifact came from the immediately preceding pipeline, by using the following pipeline task:
- task: DownloadPipelineArtifact@2
  inputs:
    buildType: 'specific'
    project: '$(System.TeamProjectId)'
    definition: 1
    specificBuildWithTriggering: true
    buildVersionToDownload: 'latest'
    artifactName: 'example.zip'
This works fine for a parent "triggering pipeline", but not a grandparent. Instead it returns the error message:
Artifact example.zip was not found for build nnn.
where nnn is the run ID of the immediate predecessor, as though I had specified pipelineId: $(Build.TriggeredBy.BuildId). Effectively, Pipeline 3 attempts to retrieve the Pipeline 1 artifact from Pipeline 2. It would be nice if that definition: 1 line did something, but alas, it seems to do nothing when specificBuildWithTriggering: true is set.
Note that buildType: 'latest' isn't safe; it appears it permits publishing an untested artifact, if emitted from Pipeline 1 while Pipeline 2 is running.
There may be no way to accomplish this with DownloadPipelineArtifact@2. It's hard to be sure, because the documentation doesn't have much detail. Perhaps there's another reasonable way to accomplish this. I suppose publishing another copy of the artifact at each of the intervening pipelines, even the ones that don't use it, is one way, but not a very reasonable one. We could eliminate the ugly aspect of creating copies of the binaries by instead publishing an artifact with the BuildId recorded in it, but we'd still have to retrieve and republish it from every pipeline.
If there is a way to identify the original CI trigger, e.g. the hash of the initiating Git commit, I could use that to name and refer to the artifacts. Does Build.SourceVersion remain constant between triggered builds? Any other "initiating ID" would work equally well.
You are welcome to comment on the example pipeline scenario, as I'm actually currently using it, but it isn't the point of my question. I think this problem is broadly applicable, as it will apply when building dependent packages, or for any other reasons for which "Triggers" are useful.
An MS representative suggested using REST for this. For example:
HTTP GET https://dev.azure.com/ORGNAME/PROJECTGUID/_apis/build/Builds/2536

{
  "id": 2536,
  "definition": {
    "id": 17
  },
  "triggeredByBuild": {
    "id": 2535,
    "definition": {
      "id": 10
    }
  }
}
By walking the parents, one could find the ancestor with the desired definition ID (e.g. 10). Then its run ID (e.g. 2535) could be used to download the artifact.
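That walk could be sketched as follows. The triggeredByBuild/definition field names come from the example response above; the HTTP call is abstracted behind a get_build callable (which in a real pipeline would issue the GET above, authenticating with e.g. $(System.AccessToken)), so only the traversal logic is shown:

```python
def find_ancestor_run(get_build, run_id, target_definition_id, max_depth=10):
    """Walk triggeredByBuild links upward from run_id until a run of the
    target pipeline definition is found; return its run ID, or None.
    get_build(run_id) must return the build record as a dict."""
    for _ in range(max_depth):
        build = get_build(run_id)
        if build["definition"]["id"] == target_definition_id:
            return build["id"]
        parent = build.get("triggeredByBuild")
        if parent is None:
            # Reached a run with no triggering build: no such ancestor.
            return None
        run_id = parent["id"]
    return None
```

The returned run ID can then be passed to the artifact-download call (e.g. as pipelineId/buildId) to fetch the grandparent's artifact.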
@merlin-liang-msft suggested a similar process for a different requirement from @sschmeck, and their answer has accompanying code.
There are extensions that allow you to do this, but the official solution is to use a multi-stage pipeline rather than three independent pipelines.
One way is to use release pipelines (you can't code/edit them in YAML), which let you use the same artifacts through the whole deployment.
Release pipeline
You can also specify the required triggers to start a deployment on.
Approval and triggers
Alternatively, there exist multi-stage pipelines, which are in preview (https://devblogs.microsoft.com/devops/whats-new-with-azure-pipelines/).
You can access them by enabling them in your preview features.
Why don't you output a pipeline artifact with meta info from each pipeline and concatenate these along the chain, like:
Grandparent > meta about the pipe
Parent > meta about the pipe plus the grandparent's meta
Etc.
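A rough sketch of the first (grandparent) pipeline's side of this; later pipelines would first download the "lineage" artifact from their triggering run, append their own line, and republish it. The artifact and file names are hypothetical:

```yaml
steps:
- script: |
    mkdir -p $(Build.ArtifactStagingDirectory)/lineage
    echo "$(Build.DefinitionName)=$(Build.BuildId)" >> $(Build.ArtifactStagingDirectory)/lineage/lineage.txt
  displayName: Record this run in the lineage file
- publish: $(Build.ArtifactStagingDirectory)/lineage
  artifact: lineage
```

The final pipeline can then read the grandparent's run ID out of lineage.txt and download the original artifact by that ID.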