How can I ensure that all stages of my pipeline run in the same working directory?
I have a pipeline that looks like this:
resources:
  repositories:
  - repository: AzureRepoDatagovernance
    type: git
    name: DIF_data_governance
    ref: develop
    trigger:
      branches:
        include:
        - main
      paths:
        include:
        - terraform/DIF

variables:
- group: PRD_new_resources
- name: initial_deployment
  value: false

pool: $(agent_pool_name)

stages:
- stage: VariableCheck
  jobs:
  - job: VariableMerge
    steps:
    - checkout: self
    - checkout: AzureRepoDatagovernance
    - ${{ if eq(variables.initial_deployment, 'false') }}:
      - task: PythonScript@0
        inputs:
          scriptSource: filePath
          scriptPath: DIF-devops/config/dynamic_containers.py
          pythonInterpreter: /usr/bin/python3
          arguments: --automount-path $(System.DefaultWorkingDirectory)/DIF_data_governance/data_ingestion_framework/$(env)/AutoMount_Config.json --variables-path $(System.DefaultWorkingDirectory)/DIF-devops/terraform/DIF/DIF.tfvars.json
        displayName: "Adjust container names in variables.tf.json"
- stage: Plan
  jobs:
  - job: Plan
    steps:
    - checkout: self
    - checkout: AzureRepoDatagovernance
    - script: |
        cd $(System.DefaultWorkingDirectory)$(terraform_folder_name) && ls -lah
        terraform init
        terraform plan -out=outfile -var-file=DIF.tfvars.json
      displayName: "Plan infrastructure changes to $(terraform_folder_name) environment"
- stage: ManualCheck
  jobs:
  - job: ManualCheck
    pool: server
    steps:
    - task: ManualValidation@0
      timeoutInMinutes: 5
      displayName: "Validate the configuration changes"
- stage: Apply
  jobs:
  - job: Apply
    steps:
    - checkout: self
    - checkout: AzureRepoDatagovernance
    - script: |
        cd $(System.DefaultWorkingDirectory)$(terraform_folder_name) && ls -lah
        terraform apply -auto-approve "outfile"
      displayName: "Apply infrastructure changes to $(terraform_folder_name) environment"
How can I make sure that all 4 stages run inside the same working directory, so that I can check out just once and every stage has access to the work done by previous jobs? I know that my pipeline has some flaws that will need to be polished.
This is not possible. Each Azure DevOps stage has its own working directory, because every job, including jobs in different stages, runs as a separate agent job. Only the steps within a single job share the same working directory.
If you need to pass code or artifacts between stages, you should use the native Publish Pipeline Artifacts and Download Pipeline Artifacts tasks.
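For example, here is a hedged sketch of how the Plan and Apply stages above could hand the Terraform plan over as a pipeline artifact (the artifact name terraform-plan is illustrative; publishing the whole Terraform folder keeps the init output and the plan file together):

- stage: Plan
  jobs:
  - job: Plan
    steps:
    - checkout: self
    - checkout: AzureRepoDatagovernance
    - script: |
        cd $(System.DefaultWorkingDirectory)$(terraform_folder_name)
        terraform init
        terraform plan -out=outfile -var-file=DIF.tfvars.json
    # Publish the planned folder so a later stage can reuse it.
    - task: PublishPipelineArtifact@1
      inputs:
        targetPath: $(System.DefaultWorkingDirectory)$(terraform_folder_name)
        artifactName: terraform-plan
- stage: Apply
  jobs:
  - job: Apply
    steps:
    # Restore the published folder instead of checking out and re-planning.
    - task: DownloadPipelineArtifact@2
      inputs:
        artifactName: terraform-plan
        targetPath: $(System.DefaultWorkingDirectory)$(terraform_folder_name)
    - script: |
        cd $(System.DefaultWorkingDirectory)$(terraform_folder_name)
        terraform apply -auto-approve "outfile"

One caveat: pipeline artifacts do not preserve Unix file permissions, so provider binaries under .terraform may lose their execute bit; you may need to re-run terraform init in the Apply stage or restore permissions with chmod.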
Related
I'm trying to implement approval gates in an Azure DevOps YAML pipeline and am following these steps. An issue I am seeing is that, since configuring the pipeline to use a blank environment (so it is only used for the approval process, as per the article), the pipeline run no longer shows the "Checkout" step where my repo contents are loaded onto the virtual machine runner used by the pipeline. How can I use an environment and still load the repo contents?
This code has the checkout step:
trigger: none
pool:
  vmImage: ubuntu-latest
stages:
- stage: tempname
  jobs:
  - job: tempname
    steps:
    - bash: |
        echo hello world
This code does not have the checkout step:
trigger: none
pool:
  vmImage: ubuntu-latest
stages:
- stage: tempname
  jobs:
  - deployment: tempname
    environment: test
    strategy:
      runOnce:
        deploy:
          steps:
          - bash: |
              echo hello world
checkout is a step, so it should go in the steps section - see https://learn.microsoft.com/en-us/azure/devops/pipelines/yaml-schema/steps-checkout?view=azure-pipelines.
Example of how it might look:
trigger: none
pool:
  vmImage: ubuntu-latest
stages:
- stage: tempname
  jobs:
  - deployment: tempname
    environment: test
    strategy:
      runOnce:
        deploy:
          steps:
          - checkout: self # Use none to avoid checking out code.
          - bash: |
              echo hello world
By referencing another pipeline in a YAML pipeline's resources, all artifacts published by the referenced pipeline get downloaded automatically. I'm not sure how to stop this behavior and download only the needed artifacts. Adding a download task for only the needed artifacts does not stop the initial download of the full set of artifacts.
So what you need is to disable the default behavior. As the documentation puts it:
Artifacts are only downloaded automatically in deployment jobs. In a regular build job, you need to explicitly use the download step keyword or Download Pipeline Artifact task.
To stop artifacts from being downloaded automatically, add a download step and set its value to none:
steps:
- download: none
and then add an additional step to download the specific artifact.
Here is an example:
resources:
  pipelines:
  - pipeline: MultipleArtifact
    project: 'DevOps Manual'
    source: 'kmadof.devops-manual (64)'

jobs:
- job: Build
  pool:
    vmImage: 'ubuntu-latest'
  steps:
  - script: echo Hello, world!
    displayName: 'Run a one-line script'
  - script: |
      echo Add other tasks to build, test, and deploy your project.
      echo See https://aka.ms/yaml
    displayName: 'Run a multi-line script'

# Track deployments on the environment.
- deployment: DeployWeb
  displayName: deploy Web App
  pool:
    vmImage: 'Ubuntu-16.04'
  # Creates an environment if it doesn't exist.
  environment: 'smarthotel-dev'
  strategy:
    # Default deployment strategy, more coming...
    runOnce:
      deploy:
        steps:
        - download: none
        - download: MultipleArtifact
          artifact: art-1
        - checkout: self
        - script: echo my first deployment
To download a specific artifact rather than all artifacts, you can include the following:
steps:
- download: current
  artifact: 'Artifact-Name'
Below is a complete example comparing the default behaviour with download: current and download: none.
Here is a screenshot from Azure Pipelines showing the correct number of Download steps in each case.
jobs:
- job: Create_Two_Artifacts
  steps:
  - bash: echo "test" >> file.txt
  - task: PublishBuildArtifacts@1
    inputs:
      pathToPublish: 'file.txt'
      artifactName: 'Artifact1'
  - task: PublishBuildArtifacts@1
    inputs:
      pathToPublish: 'file.txt'
      artifactName: 'Artifact2'
- deployment: Download_All_Deployment
  environment: MyEnvironment
  dependsOn: [Create_Two_Artifacts]
  strategy:
    runOnce:
      deploy:
        steps: []
- deployment: Download_SpecificArtefact_Deployment
  environment: MyEnvironment
  dependsOn: [Create_Two_Artifacts]
  strategy:
    runOnce:
      deploy:
        steps:
        - download: current
          artifact: 'Artifact1'
- deployment: Download_None_Deployment
  environment: MyEnvironment
  dependsOn: [Create_Two_Artifacts]
  strategy:
    runOnce:
      deploy:
        steps:
        - download: none
The deploy stage of the pipeline fails without an error after the build stage completes successfully.
Enabling system diagnostics does not give any additional information (see the screenshot below).
The following pipelines yaml file was used:
trigger:
- master

resources:
- repo: self

variables:
  vmImageName: 'ubuntu-latest'

stages:
- stage: Build
  displayName: Build stage
  jobs:
  - job: Build
    displayName: Build
    pool:
      vmImage: $(vmImageName)
    steps:
    - task: CmdLine@2
      inputs:
        script: |
          ls -la
- stage: Deploy
  displayName: Deploy Notebook Instance Stage
  dependsOn: Build
  jobs:
  - deployment: Deploy
    displayName: Deploy
    pool:
      vmImage: $(vmImageName)
    environment: 'myenv.default'
    strategy:
      runOnce:
        deploy:
          steps:
          - task: CmdLine@2
            inputs:
              script: echo Some debug text!
I used your script and changed only the environment, as I don't have myenv.default, and everything works fine.
Please double-check your environment setting.
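For context: in the environment value, the part after the dot refers to a resource inside the environment, so 'myenv.default' targets a resource named default (for example a Kubernetes namespace) in the myenv environment. If that resource does not exist, the stage can fail before any step output appears. A minimal sketch of the difference (the environment names are illustrative):

# Targets the environment as a whole; it is created automatically if missing.
environment: 'myenv'

# Targets the resource "default" inside "myenv" (e.g. a Kubernetes
# namespace); the resource must already exist, or the job fails early.
environment: 'myenv.default'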
In an Azure DevOps multistage YAML pipeline we have multiple environments.
In a normal run we do a build and deploy only in QA, so we need to deselect each stage manually. By default all stages are selected; is it possible to have the exact opposite, where all stages are deselected by default?
trigger: none
pr: none
stages:
- stage: 'Build'
  jobs:
  - deployment: 'Build'
    pool:
      name: Default
    # testing
    environment: INT
    strategy:
      runOnce:
        deploy:
          steps:
          - checkout: none
          - powershell: |
              echo "Hello Testing"
              Start-Sleep -Seconds 10
- stage: 'Sandbox'
  jobs:
  - job: 'Sandbox'
    pool:
      name: Default
    steps:
    - checkout: none
    # testing
    - powershell: |
        echo "Hello Testing"
- stage: 'Test'
  jobs:
  - job: 'DEV'
    pool:
      name: Default
    steps:
    - checkout: none
    - powershell: |
        echo "Hello Testing"
- stage: 'QA'
  dependsOn: ['Test','Test1','Test2']
  jobs:
  - job: 'QA'
    pool:
      name: Default
    steps:
    - checkout: none
    # Testing
    - powershell: |
        echo "Hello Testing"
I am afraid there is no UI method (like the "stage to run" picker) that can meet your needs.
You could try adding parameters to your YAML sample.
Here is an example:
trigger: none
pr: none

parameters:
- name: stageTest
  displayName: Run Stage test
  type: boolean
  default: false
- name: stageBuild
  displayName: Run Stage build
  type: boolean
  default: false

stages:
- ${{ if eq(parameters.stageBuild, true) }}:
  - stage: 'Build'
    jobs:
    - deployment: 'Build'
      pool:
        name: Default
      environment: INT
      strategy:
        runOnce:
          deploy:
            steps:
            - checkout: none
            - powershell: |
                echo "Hello Testing"
                Start-Sleep -Seconds 10
- ${{ if eq(parameters.stageTest, true) }}:
  - stage: Test
    dependsOn: []
    jobs:
    - job: B1
      steps:
      - script: echo "B1"
The parameters are used to determine whether to run these stages. You can add an expression before each stage that checks whether the parameter value meets the condition.
The default value is false. This means that the stages will not run by default.
Here is the result:
You can select the stage you need to run by clicking the selection box.
Update
The workaround has some limitations. When the selected stage has dependencies, you need to select all the stages it depends on as well.
For example:
- stage: 'QA'
  dependsOn: ['Test','Test1','Test2']
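If you want a dependent stage to stay selectable on its own, one possible workaround (a sketch reusing the stageTest parameter from the example above; not something the UI offers out of the box) is to make the dependsOn list itself conditional, so the dependency only exists when the depended-on stage is actually included:

- stage: 'QA'
  ${{ if eq(parameters.stageTest, true) }}:
    dependsOn: ['Test']
  ${{ else }}:
    dependsOn: []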
On the other hand, I have created a suggestion ticket to report this feature request. Here is the suggestion ticket link: Pipeline Deselect Stages By Default. You can vote and add comments on it.
I've used this solution to build a NuGet package, and:
- always push packages from master
- conditionally push packages from other branches
Using GitVersion ensures that the packages from other branches get prerelease version numbers, e.g. 2.2.12-my-branch-name.3 or 2.2.12-PullRequest7803.4. The master branch simply gets 2.2.12, so it is recognized as a "regular" version.
The reason I'm repeating the answer above is that I chose to make the stage conditional instead of using an if:
trigger:
- master

parameters:
- name: pushPackage
  displayName: Push the NuGet package
  type: boolean
  default: false

stages:
- stage: Build
  jobs:
  - job: DoBuild
    steps:
    - script: echo "I'm building a NuGet package (versioned with GitVersion)"
- stage: Push
  condition: and(succeeded('build'), or(eq('${{ parameters.pushPackage }}', true), eq(variables['build.sourceBranch'], 'refs/heads/master')))
  jobs:
  - job: DoPush
    steps:
    - script: echo "I'm pushing the NuGet package"
Like the other answer, this results in a dialog:
But what's different from the (equally valid) solution with '${{ if }}' is that the stage is always shown (even if it's skipped):
A deployment job automatically downloads all the pipeline resources. However, a standard job does not. I tried to use - download: current, but that does not download the pipeline resources.
The reason I want to do this is to simulate a deployment for GitOps. The simulation will include a step that does a git diff that shows the differences for review.
However, I don't see an all or * option in https://learn.microsoft.com/en-us/azure/devops/pipelines/yaml-schema?view=azure-devops&tabs=schema%2Cparameter-schema#download
My current workaround is to do a deployment to a "Temp" environment prior to the real deployment.
UPDATE:
Here's an example of what I have tried
resources:
  pipelines:
  - pipeline: auth
    project: myproj
    source: auth CI
    trigger:
      branches:
        include:
        - master
...
jobs:
- job: diagnostics
  displayName: Job Diagnostics
  steps:
  - checkout: self
  # - download:
  - task: DownloadPipelineArtifact@2
    displayName: 'Download Pipeline Artifact'
    inputs:
      path: $(Build.SourcesDirectory)
  - bash: |
      env | sort
    displayName: Display environment variables
  - bash: |
      pwd
    displayName: Present working directory
  - bash: |
      find $(Pipeline.Workspace) -type f -print
    displayName: Display files
UPDATE:
Another approach I was mulling over is to create a pipeline that creates another pipeline. That way the list of pipeline resources does not need to be Azure Pipelines YAML; it could be a CSV, or a simplified YAML that I transform, in which case I can generate:
resources:
  pipelines:
  - pipeline: pipelineAlias1
    project: myproj
    source: auth CI
    branch: master
    trigger:
      branches:
        include:
        - master
  - pipeline: pipelineAlias2
    project: myproj
    source: auth CI
    branch: master
    trigger:
      branches:
        include:
        - master

...

jobs:
- job:
  steps:
  - download: pipelineAlias1
  - download: pipelineAlias2
and then set that up as another Azure pipeline to execute when updated.
How do you download all pipeline resources in a job?
Indeed, this is a known issue with using the download keyword to download pipeline artifacts.
You can track this issue in the ticket below:
Artifacts do not download on multi stage yaml build using DownloadPipelineArtifactV2
To resolve this issue, please try to use the DownloadPipelineArtifact task instead of the download keyword:
- task: DownloadPipelineArtifact@2
  displayName: 'Download Pipeline Artifact'
  inputs:
    path: $(Build.SourcesDirectory)
Update:
I have noticed in your YAML file that you do not add a build job to your pipeline. The DownloadPipelineArtifact task is used to download pipeline artifacts from earlier stages in this pipeline, or from another pipeline.
So we need to add a stage that builds the pipeline and generates the artifact; otherwise, no pipeline artifacts are downloaded.
Check my test YAML file:
variables:
  ArtifactName: drop

stages:
- stage: Build
  jobs:
  - job: Build
    displayName: Build
    pool:
      name: MyPrivateAgent
    steps:
    - task: CopyFiles@2
      displayName: 'Copy Files to: $(build.artifactstagingdirectory)'
      inputs:
        SourceFolder: '$(System.DefaultWorkingDirectory)\LibmanTest'
        targetFolder: '$(build.artifactstagingdirectory)'
    - task: PublishBuildArtifacts@1
      displayName: 'Publish Artifact: LibmanTest'
      inputs:
        ArtifactName: $(ArtifactName)
- stage: Dev
  dependsOn: Build
  jobs:
  - job: Dev
    displayName: Dev
    pool:
      name: MyPrivateAgent
    steps:
    - task: CopyFiles@2
      displayName: 'Copy Files to: $(build.artifactstagingdirectory)'
      inputs:
        SourceFolder: '$(System.DefaultWorkingDirectory)\TestSample'
        targetFolder: '$(build.artifactstagingdirectory)'
    - task: PublishBuildArtifacts@1
      displayName: 'Publish Artifact: TestSample'
      inputs:
        ArtifactName: $(ArtifactName)
- stage: Deployment
  dependsOn: Dev
  pool:
    name: MyPrivateAgent
  jobs:
  - deployment: Deployment
    displayName: DeployA
    environment: 7-1-0
    strategy:
      runOnce:
        deploy:
          steps:
          - task: DownloadPipelineArtifact@2
            displayName: 'Download Pipeline Artifact'
            inputs:
              path: $(Build.SourcesDirectory)
As you can see, I use two stages, Build and Dev, to copy the projects LibmanTest and TestSample as artifacts, then use the DownloadPipelineArtifact task to download those two artifacts.
The test result:
Update 2:
"your example still does not show resources.pipelines"
So, since you want to download artifacts from other pipelines, you should not use the default configuration for the DownloadPipelineArtifact task:
resources:
  pipelines:
  - pipeline: xxx
    project: MyTestProject
    source: xxx
    trigger:
      branches:
        include:
        - master

jobs:
- deployment: Deployment
  displayName: DeployA
  environment: 7-1-0
  strategy:
    runOnce:
      deploy:
        steps:
        - task: DownloadPipelineArtifact@2
          displayName: 'Download Pipeline Artifact For Test'
          inputs:
            buildType: specific
            project: MyTestProject
            definition: 13
The test result:
Besides, you could check the configuration from the classic editor, then get the YAML file:
Hope this helps.