Capturing build date in deployment pipeline

I'm trying to capture the build date of an artifact to be logged elsewhere. Join me on my convoluted journey to solve this.
I know we have these handy variables
$(Build.SourceVersion)
$(Build.BuildNumber)
EDIT: These are not as handy as I thought. These are just the identifiers for the deploy pipeline, not the original build pipeline that generated the artefact. So I can repeatedly deploy the same build / artefact, and these numbers will continue to increment, having no relevance to what I built - I'm not interested in that.
But there is no build date. I know it can be derived from the BuildNumber but it seems over the top to call a REST API to get that info.
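(For reference, this is roughly what that REST call would look like from the deploy pipeline, assuming the pipeline resource alias DBBuild that I use later in this question; treat it as an untested sketch.)

- powershell: |
    # Sketch: look up the run that produced the artifact and read its timestamps.
    # resources.pipeline.DBBuild.runID is the ID of the consumed build run.
    $url = "$(System.CollectionUri)$(System.TeamProject)/_apis/build/builds/$(resources.pipeline.DBBuild.runID)?api-version=6.0"
    $build = Invoke-RestMethod -Uri $url -Headers @{ Authorization = "Bearer $(System.AccessToken)" }
    Write-Host "Build started: $($build.startTime)  finished: $($build.finishTime)"
  displayName: Get build date via REST API (sketch)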
So in my build pipeline I am writing Get-Date to a file, then publishing that as an artefact:
- powershell: (Get-Date).ToString("yyyy-MM-dd HH:mm:ss") | Out-File -FilePath $(Build.ArtifactStagingDirectory)\BuildDt.txt
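The staging directory then gets published so the file lands in the drop folder referenced below; a minimal sketch (the artifact name drop is my assumption, matching the download path used later):

- publish: $(Build.ArtifactStagingDirectory)
  artifact: drop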
Then I pick that up in the deploy pipeline and save it to a variable using the kludgy Write-Host method:
- stage: DownloadDBArtifacts
  displayName: Download DB Artifacts
  dependsOn: []
  jobs:
    - job: GetArtefacts
      displayName: Get Artefacts
      steps:
        - download: DBBuild
        - task: PowerShell@2
          displayName: Get Build timestamp
          name: GetBuildDt
          inputs:
            targetType: inline
            script: |
              $BuildDt = Get-Content -Path $(Pipeline.Workspace)\DBBuild\drop\BuildDt.txt
              Write-Host "##vso[task.setvariable variable=BuildDt;isoutput=true]$BuildDt"
              Write-Host "##[debug]Artifact Creation Date: $BuildDt"
This is done in stage DownloadDBArtifacts
Now I need to use it in a later stage, that is also in a child YAML template
I believe this is the syntax for extracting the variable:
stageDependencies.DownloadDBArtifacts.GetArtefacts.outputs['GetBuildDt.BuildDt']
I'm having difficulty getting this recognised in later stages. Here is a subsequent stage that tries to capture the value based on examples from here:
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/variables?view=azure-devops&tabs=yaml%2Cbatch#use-outputs-in-a-different-stage
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/expressions?view=azure-devops#job-to-job-dependencies-across-stages
- stage: DeployDBtoTST
  displayName: Deploy DB to TST
  dependsOn: DownloadDBArtifacts
  variables:
    - group: vgTST
  jobs:
    - deployment: DeployDBtoTST
      displayName: Deploy DB to TST
      environment: TST Environment
      variables:
        BuildDt: $[ stageDependencies.DownloadDBArtifacts.GetArtefacts.outputs['GetBuildDt.BuildDt'] ]
      strategy:
        runOnce:
          deploy:
            steps:
              - powershell: |
                  Write-Host "var: $(BuildDt)"
However, the value is not being passed through, as the final PowerShell step just produces this output:
var:

I could reproduce this issue with your YAML sample.
To resolve this issue, please update your DownloadDBArtifacts stage with the following code:
- stage: DownloadDBArtifacts
  displayName: Download DB Artifacts
  dependsOn: []
  jobs:
    - job: GetArtefacts
      displayName: Get Artefacts
      steps:
        - download: DBBuild
        - task: InlinePowershell@1
          displayName: 'Get Artefacts'
          inputs:
            Script: |
              $BuildDt = Get-Content -Path $(Pipeline.Workspace)\DBBuild\drop\BuildDt.txt
              Write-Host "##vso[task.setvariable variable=BuildDt;isOutput=true]$BuildDt"
          name: GetBuildDt
The test result:
Update:
Sorry, I tried changing the DownloadDBArtifacts as you mentioned above and it made no difference.
You have a slight spelling error in your code that is causing the issue.
One place has name: GetBuiltDt and another has outputs['GetBuildDt.BuildDt']. The Built should be Build.
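In other words, the step name and the downstream reference have to match exactly. A corrected pairing, sketched from the snippets above:

# Producer (stage DownloadDBArtifacts, job GetArtefacts)
- task: PowerShell@2
  name: GetBuildDt             # must match the reference below exactly
  inputs:
    targetType: inline
    script: |
      $BuildDt = Get-Content -Path $(Pipeline.Workspace)\DBBuild\drop\BuildDt.txt
      Write-Host "##vso[task.setvariable variable=BuildDt;isOutput=true]$BuildDt"

# Consumer (a later stage that depends on DownloadDBArtifacts)
variables:
  BuildDt: $[ stageDependencies.DownloadDBArtifacts.GetArtefacts.outputs['GetBuildDt.BuildDt'] ]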

Related

Azure DevOPS - run task only if artifact exists from build

I have two pipelines - build and publish. The build pipeline can produce up to two artifacts, but that depends on the given parameters.
The publish pipeline is automatically triggered when the build pipeline completes. The publish pipeline then takes the published artifacts and deploys them. However, I want to run the publish tasks if and only if the particular artifacts exist from the build pipeline.
Right now, if an artifact does not exist, the "download" task fails.
Simplified to the important parts, with some secret info redacted:
resources:
  pipelines:
    - pipeline: buildDev # internal name of the source pipeline, used elsewhere within app-ci YAML, e.g. to reference published artifacts
      source: "Build"
      trigger:
        branches:
          - dev
          - feat/*
stages:
  - stage: publish
    displayName: "🚀🔥 Publish to Firebase"
    jobs:
      - job: publish_firebase_android
        displayName: "🔥🤖Publish Android to Firebase"
        steps:
          - script: |
          - download: buildDev
            artifact: android
          - download: buildDev
            artifact: changelog
          - task: DownloadSecureFile@1
            name: firebaseKey
            displayName: "Download Firebase key"
            inputs:
              secureFile: "<secure>.json"
          - script: <upload>
            displayName: "Deploy APK to Firebase"
            workingDirectory: "$(Pipeline.Workspace)/buildDev/android/"
      - job: publish_firebase_ios
        displayName: "🔥🍏Publish iOS to Firebase"
        steps:
          - download: buildDev
            artifact: iOS
          - download: buildDev
            artifact: changelog
          - task: DownloadSecureFile@1
            name: firebaseKey
            displayName: "Download Firebase key"
            inputs:
              secureFile: "<secure>.json"
          - script: <upload...>
            workingDirectory: "$(Pipeline.Workspace)/buildDev/iOS/"
            displayName: "Deploy IPA to Firebase"
I've tried to find a solution, but every solution I found only solves the problem within the same pipeline. Based on the MS Docs I can't tell whether there is a predefined environment variable that points to the "pipeline resources". With such a variable I could theoretically run a script that checks for the presence of the artifact, sets a variable, and uses that variable as a condition for steps.
I think you can use stage filters in the trigger. I don't know how your build pipeline is structured, but you can set up a stage that publishes the artifacts. Execute that stage if there are artifacts to publish, otherwise skip it; you can do this using conditions. Here is a simple sample:
stages:
  - stage: Build
    jobs:
      - job: build
        steps:
          ...
  - stage: Artifact
    condition: ... # Set the condition based on your parameter
    jobs:
      - job: artifact
        steps:
          ...
Then use the stage filter in the publishing pipeline. If the stage executes successfully, the publish pipeline will run; otherwise, it will not.
resources:
  pipelines:
    - pipeline: buildpipeline
      source: buildpipeline
      trigger:
        stages:
          - Artifact
Using variable groups is an option as well. You can use a variable group to pass a variable from one pipeline to another. Here are the detailed steps:
(1). Create a variable group in Pipelines/Library and add a new Variable. I will call this variable "var" later.
(2). In your build pipeline, you can update "var" based on your parameters:
variables:
  - group: {group name}

steps:
  - bash: |
      az pipelines variable-group variable update --group-id {id} --name var --value yes
    env:
      AZURE_DEVOPS_EXT_PAT: $(System.AccessToken)
    condition: ...
Tip 1. If you don't know your variable group id, go to Pipelines/Library and select your variable group. You can find it in the URL: https://dev.azure.com/...&variableGroupId={id}&...
Tip 2. If you meet the error "You do not have permissions to perform this operation on the variable group.", go to Pipelines/Library and select your variable group. Click on "Security" and give "{pipeline name} Build Service" user the Administrator role.
Tip 3. Use your parameter in condition to decide whether to update var.
(3). In your publish pipeline, you can use var from variable group in condition:
condition: eq(variables['var'], 'yes')
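On the consuming side, a sketch of linking the group and gating a job on that variable (the group and job names follow the samples above):

variables:
  - group: {group name}

jobs:
  - job: publish_firebase_android
    condition: eq(variables['var'], 'yes')
    steps:
      - download: buildDev
        artifact: android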

Accessing variables in Azure loops using templates

I want to loop through the pipeline artifacts and pass them as variables to a task.
I followed this answer here, but no luck:
https://stackoverflow.com/a/59451690/5436341
I have this PowerShell script in the build stage to get the artifact names and store them in a variable:
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      # Write your PowerShell commands here.
      Write-Host "Fetching value files"
      cd $(Build.ArtifactStagingDirectory)\MSI
      $a = dir -Filter "*.msi"
      $List = $a | foreach {$_}
      Write-Host $List
      $d = '"{0}"' -f ($List -join '","')
      Write-Host $d
      Write-Host "##vso[task.setvariable variable=MSINames;isOutput=true]$d"
  name: getMSINames
And I pass them as parameters to a template from another stage, as below:
- stage: deployPoolsStage
  displayName: Deploy Pools
  dependsOn:
    - Build
  jobs:
    - job: CloudTest_AgentBased_Job
      displayName: 'CloudTest AgentBased Job'
      timeoutInMinutes: 120
      variables:
        MSIFiles: $[dependencies.Build.outputs['getMSINames.MSINames']]
      steps:
        - template: TestPipeline.yml
          parameters:
            files: $(MSIFiles)
Now, my template looks like this:
parameters:
  files: []

steps:
  - ${{ each filename in parameters.files }}:
    - task: SomeTask
      inputs:
        Properties: worker:VsTestVersion=V150;worker:MSIFile=${{ filename }}
      displayName: 'tests'
Now this is failing with an error saying: "Expected a sequence or mapping. Actual value '$(MSIFiles)'". It's the same error even without using the template and directly accessing the variable in the original yml file.
Please let me know of a way to loop through my pipeline artifacts and pass them to my task.
You are exporting the variable MSINames from a task in the Build stage, and that task needs to be inside a job (say BuildJob). Also, you are trying to access the exported variable from another stage, inside a job, so you should use the format for reading a variable from another stage within a job. What you have used in the deployPoolsStage stage is the wrong format. Try correcting the format as below inside the CloudTest_AgentBased_Job job:
variables:
  MSIFiles: $[stageDependencies.Build.BuildJob.outputs['getMSINames.MSINames']]
NOTE: I have assumed that the getMSINames task is defined inside the BuildJob job under the Build stage, based on what you have provided.
See docs related to this here.
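For clarity, that assumption corresponds to a Build stage shaped roughly like this (BuildJob is the assumed job name):

- stage: Build
  jobs:
    - job: BuildJob
      steps:
        - task: PowerShell@2
          name: getMSINames
          inputs:
            targetType: 'inline'
            script: |
              # ... compute $d as in the question, then export it
              Write-Host "##vso[task.setvariable variable=MSINames;isOutput=true]$d"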

How do I print out a pipeline's or a stage's parameter?

Let's say I have the following stage and pipeline:
stages:
  - stage: A
    lockBehavior: sequential
    jobs:
      - job: Job
        steps:
          - script: Hey!

lockBehavior: runLatest
stages:
  - stage: A
    jobs:
      - job: Job
        steps:
          - script: Hey!
How would I print out the lockBehavior parameter in Azure DevOps when running the pipeline? I have tried printing all variables using this code:
jobs:
  - job: testing
    steps:
      - task: Bash@3
        inputs:
          targetType: 'inline'
          script: 'env | sort'
But this does not work.
I checked the "lockBehavior" parameter from here. But I haven't found a method to show its value in the pipeline log. However, there is a note: "If you do not specify a lockBehavior, it is assumed to be runLatest." In my view, it's not hard to know the value if you set it up yourself.
In my experience, you can print predefined variables ("Use predefined variables") or runtime parameters (https://learn.microsoft.com/en-us/azure/devops/pipelines/process/runtime-parameters?view=azure-devops&tabs=script) by following the official docs.
So I guess that perhaps this parameter will be added to "Use predefined variables" in the future.
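For what it's worth, a runtime parameter you define yourself can be printed like this (lockBehavior itself is not exposed this way; lockMode below is a hypothetical parameter, shown only to illustrate the pattern from the runtime-parameters doc):

parameters:
  - name: lockMode           # hypothetical parameter mirroring the lockBehavior value
    type: string
    default: runLatest

jobs:
  - job: testing
    steps:
      - script: echo "lockMode is ${{ parameters.lockMode }}"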

Azure DevOps yaml: use a powershell task output parameter to generate a loop in dependent job

I have the following yaml as used in an Azure DevOps pipeline (this is not the full pipeline - it's just a portion of yaml that is in a template):
jobs:
  - job: CheckExcludedWorkspaces
    displayName: Check Excluded Workspaces
    pool:
      name: DefaultWindows
    steps:
      - task: PowerShell@2
        name: GetWorkspaces
        displayName: Check Excluded Workspaces
        inputs:
          filePath: "$(System.DefaultWorkingDirectory)/pipelines_v2/powershell/checkExcludedWorkspaces.ps1"
          targetType: FilePath
          errorActionPreference: 'stop'
          arguments: -environmentFolder "$(rootFolderPrefix)\${{parameters.environmentFolder}}" -excludeFolderList "${{parameters.tagOutList}}"
          pwsh: false
  # - ${{ each folder in dependencies.CheckExcludedWorkspaces.outputs['GetWorkspaces.WorkspaceList'] }}:
  - job: NewJob
    dependsOn: CheckExcludedWorkspaces
    variables:
      testVar: $[ dependencies.CheckExcludedWorkspaces.outputs['GetWorkspaces.WorkspaceList'] ]
    pool:
      name: DefaultWindows
    steps:
      - powershell: |
          Write-Host "Test var = $(testVar)"
        displayName: Test workspaces output
This works correctly, in that the second job retrieves a variable from a PowerShell task in the previous job and outputs that variable's value. The task in the second job outputs a list of apps using the variable testVar. The output contains:
app1,app2,app3,app4 etc
I would like to take this to the next stage: I would like to create a loop of jobs that repeatedly runs for this application list. Something like:
jobs:
  - job: CheckExcludedWorkspaces
    displayName: Check Excluded Workspaces
    pool:
      name: DefaultWindows
    steps:
      - task: PowerShell@2
        name: GetWorkspaces
        displayName: Check Excluded Workspaces
        inputs:
          filePath: "$(System.DefaultWorkingDirectory)/pipelines_v2/powershell/checkExcludedWorkspaces.ps1"
          targetType: FilePath
          errorActionPreference: 'stop'
          arguments: -environmentFolder "$(rootFolderPrefix)\${{parameters.environmentFolder}}" -excludeFolderList "${{parameters.tagOutList}}"
          pwsh: false
  - ${{ each folder in dependencies.CheckExcludedWorkspaces.outputs['GetWorkspaces.WorkspaceList'] }}:
    - job: NewJob
      dependsOn: CheckExcludedWorkspaces
      variables:
        testVar: $[ dependencies.CheckExcludedWorkspaces.outputs['GetWorkspaces.WorkspaceList'] ]
      pool:
        name: DefaultWindows
      steps:
        - powershell: |
            Write-Host "Test var = ${{folder}}"
          displayName: Test workspaces output
This code gives me an error:
Unrecognized value: 'dependencies'. Located at position 1 within expression: dependencies.CheckExcludedWorkspaces.outputs['GetWorkspaces.WorkspaceList']
Is there a way I can use a PowerShell task output variable to create a list of jobs in a dependent job? The problem is that I don't know at design time what the list of applications will be (the pipeline should ideally find this out when it runs). The list of applications is based on the list of folders created within a repository, which changes over time.
In the current situation, we cannot use the 'each' keyword with variables. The 'each' keyword works on object types (sequences and mappings), but the variable is a string.
For more details, you can refer to the doc: Each keyword.
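As a workaround (not part of the answer above), the comma-separated list can be iterated at runtime inside a single job instead of being expanded into separate jobs at compile time; a sketch based on the question's own snippet:

- job: NewJob
  dependsOn: CheckExcludedWorkspaces
  pool:
    name: DefaultWindows
  variables:
    testVar: $[ dependencies.CheckExcludedWorkspaces.outputs['GetWorkspaces.WorkspaceList'] ]
  steps:
    - powershell: |
        # Iterate the comma-separated list at runtime; each entry gets the same steps.
        foreach ($app in "$(testVar)".Split(',')) {
          Write-Host "Processing workspace: $app"
        }
      displayName: Process each workspace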

Azure YAML Get variable from a job run in a previous stage

I am creating a YAML pipeline in Azure DevOps that consists of two stages.
The first stage (Prerequisites) is responsible for reading the git commit and creating a comma-separated variable containing the list of services that have been affected by the commit.
The second stage (Build) is responsible for building and unit testing the project. This stage consists of many templates, one for each service. In the template script, the job checks whether the relevant service is in the variable created in the previous stage. If the job finds the service, it continues to build and test the service. However, if it cannot find the service, it skips that job.
Run.yml:
stages:
  - stage: Prerequisites
    jobs:
      - job: SetBuildQueue
        steps:
          - task: PowerShell@2
            name: SetBuildQueue
            displayName: 'Set.Build.Queue'
            inputs:
              targetType: inline
              script: |
                ## ... PowerShell script to get changes - working as expected
                Write-Host "Build Queue Auto: $global:buildQueueVariable"
                Write-Host "##vso[task.setvariable variable=buildQueue;isOutput=true]$global:buildQueueVariable"

  - stage: Build
    jobs:
      - job: StageInitialization
      - template: Build.yml
        parameters:
          projectName: Service001
          projectLocation: src/Service001
      - template: Build.yml
        parameters:
          projectName: Service002
          projectLocation: src/Service002
Build.yml:
parameters:
  projectName: ''
  projectLocation: ''

jobs:
  - job:
    displayName: '${{ parameters.projectName }} - Build'
    dependsOn: SetBuildQueue
    continueOnError: true
    condition: and(succeeded(), contains(dependencies.SetBuildQueue.outputs['SetBuildQueue.buildQueue'], '${{ parameters.projectName }}'))
    steps:
      - task: NuGetToolInstaller@1
        displayName: 'Install Nuget'
Issue:
When the first stage runs, it creates a variable called buildQueue, which is populated as seen in the console output of the PowerShell script task:
Service001 Changed
Build Queue Auto: Service001;
However, when it gets to stage two and tries to run the build template, checking the condition returns the following output:
Started: Today at 12:05 PM
Duration: 16m 7s
Evaluating: and(succeeded(), contains(dependencies['SetBuildQueue']['outputs']['SetBuildQueue.buildQueue'], 'STARS.API.Customer.Assessment'))
Expanded: and(True, contains(Null, 'service001'))
Result: False
So my question is how do I set the dependsOn and condition to get the information from the previous stage?
That's because you want to access the variable in a different stage from where you defined it. Currently, that's not possible this way: each stage is a new instance of a fresh agent.
In this blog you can find a workaround that involves writing the variable to disk and then passing it as a file, leveraging pipeline artifacts.
To pass the variable FOO from a job to another one in a different stage:
(1). Create a folder that will contain all the variables you want to pass; any folder could work, but something like mkdir -p $(Pipeline.Workspace)/variables might be a good idea.
(2). Write the contents of the variable to a file, for example echo "$FOO" > $(Pipeline.Workspace)/variables/FOO. Even though the name could be anything you’d like, giving the file the same name as the variable might be a good idea.
(3). Publish the $(Pipeline.Workspace)/variables folder as a pipeline artifact named variables.
(4). In the second stage, download the variables pipeline artifact.
(5). Read each file into a variable, for example FOO=$(cat $(Pipeline.Workspace)/variables/FOO).
(6). Expose the variable in the current job, just like we did in the first example: echo "##vso[task.setvariable variable=FOO]$FOO".
You can then access the variable by expanding it within Azure Pipelines ($(FOO)) or use it as an environmental variable inside a bash script ($FOO).
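A minimal YAML sketch of that workaround, following the steps above (stage, job, and variable names are placeholders):

stages:
  - stage: One
    jobs:
      - job: ProduceVar
        steps:
          - bash: |
              # Write the variable to a file and publish it as an artifact.
              mkdir -p $(Pipeline.Workspace)/variables
              echo "$FOO" > $(Pipeline.Workspace)/variables/FOO
            env:
              FOO: some-value
          - publish: $(Pipeline.Workspace)/variables
            artifact: variables

  - stage: Two
    dependsOn: One
    jobs:
      - job: ConsumeVar
        steps:
          - download: current
            artifact: variables
          - bash: |
              # Read the file back and re-expose it as a job-scoped variable.
              # PIPELINE_WORKSPACE is the environment-variable form of $(Pipeline.Workspace).
              FOO=$(cat "$PIPELINE_WORKSPACE/variables/FOO")
              echo "##vso[task.setvariable variable=FOO]$FOO"
          - bash: echo "FOO is $(FOO)"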