Azure Pipelines Deploy Stage Failing without Error - azure-devops

The deploy stage of the pipeline fails without an error after the build stage completes successfully.
Enabling system diagnostics does not give any additional information (see the screenshot below).
The following pipeline YAML file was used:
trigger:
- master

resources:
- repo: self

variables:
  vmImageName: 'ubuntu-latest'

stages:
- stage: Build
  displayName: Build stage
  jobs:
  - job: Build
    displayName: Build
    pool:
      vmImage: $(vmImageName)
    steps:
    - task: CmdLine@2
      inputs:
        script: |
          ls -la
- stage: Deploy
  displayName: Deploy Notebook Instance Stage
  dependsOn: Build
  jobs:
  - deployment: Deploy
    displayName: Deploy
    pool:
      vmImage: $(vmImageName)
    environment: 'myenv.default'
    strategy:
      runOnce:
        deploy:
          steps:
          - task: CmdLine@2
            inputs:
              script: echo Some debug text!

I used your script and changed only the environment, since I don't have myenv.default, and everything ran fine.
Please double-check your environment setting.
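
For reference, the only change was the environment value on the deployment job; a minimal sketch, where my-environment is a placeholder for an environment that exists in your project:

- deployment: Deploy
  displayName: Deploy
  pool:
    vmImage: $(vmImageName)
  # 'myenv.default' targets a resource named 'default' inside the environment
  # 'myenv'; the name below is a placeholder for an environment you actually have.
  environment: 'my-environment'
  strategy:
    runOnce:
      deploy:
        steps:
        - task: CmdLine@2
          inputs:
            script: echo Some debug text!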

Related

Azure DevOps - How to ensure working directory

How can I ensure that all stages of my pipeline run in the same working directory?
I have a pipeline that looks like this:
resources:
  repositories:
  - repository: AzureRepoDatagovernance
    type: git
    name: DIF_data_governance
    ref: develop

trigger:
  branches:
    include:
    - main
  paths:
    include:
    - terraform/DIF

variables:
- group: PRD_new_resources
- name: initial_deployment
  value: false

pool: $(agent_pool_name)

stages:
- stage: VariableCheck
  jobs:
  - job: VariableMerge
    steps:
    - checkout: self
    - checkout: AzureRepoDatagovernance
    - ${{ if eq(variables.initial_deployment, 'false') }}:
      - task: PythonScript@0
        inputs:
          scriptSource: filePath
          scriptPath: DIF-devops/config/dynamic_containers.py
          pythonInterpreter: /usr/bin/python3
          arguments: --automount-path $(System.DefaultWorkingDirectory)/DIF_data_governance/data_ingestion_framework/$(env)/AutoMount_Config.json --variables-path $(System.DefaultWorkingDirectory)/DIF-devops/terraform/DIF/DIF.tfvars.json
        displayName: "Adjust container names in variables.tf.json"
- stage: Plan
  jobs:
  - job: Plan
    steps:
    - checkout: self
    - checkout: AzureRepoDatagovernance
    - script: |
        cd $(System.DefaultWorkingDirectory)$(terraform_folder_name) && ls -lah
        terraform init
        terraform plan -out=outfile -var-file=DIF.tfvars.json
      displayName: "Plan infrastructure changes to $(terraform_folder_name) environment"
- stage: ManualCheck
  jobs:
  - job: ManualCheck
    pool: server
    steps:
    - task: ManualValidation@0
      timeoutInMinutes: 5
      displayName: "Validate the configuration changes"
- stage: Apply
  jobs:
  - job: Apply
    steps:
    - checkout: self
    - checkout: AzureRepoDatagovernance
    - script: |
        cd $(System.DefaultWorkingDirectory)$(terraform_folder_name) && ls -lah
        terraform apply -auto-approve "outfile"
      displayName: "Apply infrastructure changes to $(terraform_folder_name) environment"
How can I make sure that all 4 stages use the same working directory, so that I check out just once and all stages have access to the work done by previous jobs?
I know that my pipeline has some flaws that will need to be polished.
This is not possible. Each Azure DevOps stage has its own working directory, and each job runs as a separate agent job; only the steps inside the same job share a working directory.
If you need to pass code or artifacts between stages, you should use the native publish pipeline artifacts and download pipeline artifacts tasks.
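
For example, a minimal sketch of that pattern (the artifact name terraform-workspace is a placeholder, not something from the pipeline above):

stages:
- stage: Plan
  jobs:
  - job: Plan
    steps:
    - checkout: self
    - script: terraform init && terraform plan -out=outfile
    # Publish the working folder (including the plan file) as a pipeline artifact.
    - publish: $(System.DefaultWorkingDirectory)
      artifact: terraform-workspace
- stage: Apply
  dependsOn: Plan
  jobs:
  - job: Apply
    steps:
    # Pipeline artifacts are downloaded to $(Pipeline.Workspace)/<artifact-name>.
    - download: current
      artifact: terraform-workspace
    - script: |
        cd $(Pipeline.Workspace)/terraform-workspace
        terraform apply -auto-approve outfile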

Azure DevOps yaml file recommendation when I have multi build and multi environments

I am working on implementing a YAML pipeline to deploy an application to four different environments. Unfortunately I need to perform a build specific to each environment.
I have four front-end portals and 8 microservices (.NET Web APIs) in a single repo.
For example, I have Angular code for one of the front ends, and I need to execute npm run dev:build, npm run dev:qa ... npm run prod:prod to generate artifacts per environment, and then I can deploy to the respective environment.
With an environment branching strategy I could deploy easily, meaning maintaining branches like Dev, QA, UAT and PROD; by creating pull requests I can merge and deploy to the respective environment. But I don't want this complex branching strategy.
So I decided to create a single YAML file and used parameters and conditionals to deploy to the respective environment with one click, like below.
Azure DevOps yaml not picking up the parameters
Since I have 4 front-end portals and 8 microservices, it's difficult to run the pipeline with one click manually. So I decided to add triggers for Dev and QA at least, and to use a Release/* branch to deploy the code to higher environments.
But when I use triggers and parameters, the pipeline does not run. When I include parameters, I suspect the trigger becomes none.
trigger:
- dev

parameters:
- name: environment
  displayName: Environment to choose
  type: string
  values:
  - UAT

variables:
  vmImageName: 'windows-2019'

stages:
- stage: DevBuildStage
  dependsOn: []
  displayName: Dev_Build
  jobs:
  - job: Build_Job
    displayName: Build
    pool:
      vmImage: $(vmImageName)
    steps:
    - task: PowerShell@2
      displayName: command1
      inputs:
        targetType: 'inline'
        script: |
          Write-Host "This is Dev build"
- stage: UATBuildStage
  condition: and(succeeded(), eq('${{ parameters.environment }}', 'UAT'))
  dependsOn: []
  displayName: UAT_Build
  jobs:
  - job: Build_Job
    displayName: Build
    pool:
      vmImage: $(vmImageName)
    steps:
    - task: PowerShell@2
      displayName: command1
      inputs:
        targetType: 'inline'
        script: |
          Write-Host "This is UAT build"
- stage: DeploytoDev
  displayName: Deploy to Dev
  dependsOn: DevBuildStage
  condition: succeeded()
  jobs:
  - deployment: Deploy
    displayName: Deploy
    environment: 'development'
    variables:
    - group: dev
    pool:
      vmImage: $(vmImageName)
    strategy:
      runOnce:
        deploy:
          steps:
          - task: PowerShell@2
            inputs:
              targetType: 'inline'
              script: |
                Write-Host "Deploy to Dev"
- stage: DeploytoUAT
  displayName: Deploy to UAT
  dependsOn: UATBuildStage
  condition: succeeded()
  jobs:
  - deployment: Deploy
    displayName: Deploy
    environment: 'uat'
    pool:
      vmImage: $(vmImageName)
    strategy:
      runOnce:
        deploy:
          steps:
          - task: PowerShell@2
            inputs:
              targetType: 'inline'
              script: |
                # Write your PowerShell commands here.
                Write-Host "Deploy to UAT"
This is how I want it: run the Dev build and deployment automatically, but run UAT manually whenever I need to, using parameters. This way I want to reduce the number of branches.
The simple branching strategy would be Dev (Dev environment) --- Release/* (QA, UAT and PROD) --- main (just to maintain the stable code).
Please help me with this issue.
Finally, I decided to use an environment branching strategy, meaning a specific branch deploys to a specific environment: Dev (branch) --- DEV (environment). I have also used templates to reuse steps, roughly as sketched below.
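
For illustration only; the file name build-steps.yml and the buildCommand parameter are placeholders, not the actual files used:

# build-steps.yml - reusable steps shared by the build stages
parameters:
- name: buildCommand
  type: string
  default: 'npm run dev:build'

steps:
- task: PowerShell@2
  displayName: Run build command
  inputs:
    targetType: 'inline'
    script: |
      Write-Host "Running: ${{ parameters.buildCommand }}"

Each stage in the pipeline then passes its own command to the template:

steps:
- template: build-steps.yml
  parameters:
    buildCommand: 'npm run uat:build'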

Azure DevOps yaml pipeline : how to download only specific artifacts of a referenced pipeline?

By referencing another pipeline in a YAML pipeline's resources, all artifacts published by the referenced pipeline get downloaded automatically. I'm not sure how to stop this behavior and download only the needed artifacts. Adding a download task for only the needed artifacts does not stop the initial download of the full set of artifacts.
So what you need is to disable the default behavior:
Artifacts are only downloaded automatically in deployment jobs. In a regular build job, you need to explicitly use the download step keyword or Download Pipeline Artifact task.
To stop artifacts from being downloaded automatically, add a download step and set its value to none:
steps:
- download: none
and then add an additional step to download the specific artifact.
Here is an example:
resources:
  pipelines:
  - pipeline: MultipleArtifact
    project: 'DevOps Manual'
    source: 'kmadof.devops-manual (64)'

jobs:
- job: Build
  pool:
    vmImage: 'ubuntu-latest'
  steps:
  - script: echo Hello, world!
    displayName: 'Run a one-line script'
  - script: |
      echo Add other tasks to build, test, and deploy your project.
      echo See https://aka.ms/yaml
    displayName: 'Run a multi-line script'

# Track deployments on the environment.
- deployment: DeployWeb
  displayName: deploy Web App
  pool:
    vmImage: 'Ubuntu-16.04'
  # Creates an environment if it doesn't exist.
  environment: 'smarthotel-dev'
  strategy:
    # Default deployment strategy, more coming...
    runOnce:
      deploy:
        steps:
        - download: none
        - download: MultipleArtifact
          artifact: art-1
        - checkout: self
        - script: echo my first deployment
To download a specific artifact rather than all artifacts, you can include the following:
steps:
- download: current
  artifact: 'Artifact-Name'
Below is a complete example comparing the default behaviour with download: current and download: none.
Here is a screenshot from Azure Pipelines showing the correct number of download steps in each case.
jobs:
- job: Create_Two_Artifacts
  steps:
  - bash: echo "test" >> file.txt
  - task: PublishBuildArtifacts@1
    inputs:
      pathToPublish: 'file.txt'
      artifactName: 'Artifact1'
  - task: PublishBuildArtifacts@1
    inputs:
      pathToPublish: 'file.txt'
      artifactName: 'Artifact2'
- deployment: Download_All_Deployment
  environment: MyEnvironment
  dependsOn: [Create_Two_Artifacts]
  strategy:
    runOnce:
      deploy:
        steps: []
- deployment: Download_SpecificArtefact_Deployment
  environment: MyEnvironment
  dependsOn: [Create_Two_Artifacts]
  strategy:
    runOnce:
      deploy:
        steps:
        - download: current
          artifact: 'Artifact1'
- deployment: Download_None_Deployment
  environment: MyEnvironment
  dependsOn: [Create_Two_Artifacts]
  strategy:
    runOnce:
      deploy:
        steps:
        - download: none

How do you download *all* pipeline resources in a job?

The deployment job automatically downloads all the pipeline resources. However, a standard job does not. I tried to use - download: current, but that does not download the pipeline resources.
The reason I want to do this is to simulate a deployment for GitOps. The simulation will include a step that does a git diff that shows the differences for review.
However, I don't see an all or * option in https://learn.microsoft.com/en-us/azure/devops/pipelines/yaml-schema?view=azure-devops&tabs=schema%2Cparameter-schema#download
My current workaround is to do a deployment to a "Temp" environment prior to the real deployment.
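For context, a rough sketch of that workaround, assuming a throwaway environment named Temp (a deployment job downloads every referenced pipeline resource automatically):

jobs:
- deployment: PrefetchResources
  displayName: Download all pipeline resources
  environment: Temp
  strategy:
    runOnce:
      deploy:
        steps:
        # All pipeline resources land under $(Pipeline.Workspace)/<alias>.
        - bash: find $(Pipeline.Workspace) -type f -print
          displayName: Show downloaded files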
UPDATE:
Here's an example of what I have tried:
resources:
  pipelines:
  - pipeline: auth
    project: myproj
    source: auth CI
    trigger:
      branches:
        include:
        - master
...
jobs:
- job: diagnostics
  displayName: Job Diagnostics
  steps:
  - checkout: self
  # - download:
  - task: DownloadPipelineArtifact@2
    displayName: 'Download Pipeline Artifact'
    inputs:
      path: $(Build.SourcesDirectory)
  - bash: |
      env | sort
    displayName: Display environment variables
  - bash: |
      pwd
    displayName: Present working directory
  - bash: |
      find $(Pipeline.Workspace) -type f -print
    displayName: Display files
UPDATE:
Another approach I was mulling is to create a pipeline that creates another pipeline. That way the list of pipeline resources does not need to be Azure Pipelines YAML; it could be a CSV or a simplified YAML that I transform, in which case I can generate:
resources:
  pipelines:
  - pipeline: pipelineAlias1
    project: myproj
    source: auth CI
    branch: master
    trigger:
      branches:
        include:
        - master
  - pipeline: pipelineAlias2
    project: myproj
    source: auth CI
    branch: master
    trigger:
      branches:
        include:
        - master
...
jobs:
- job:
  steps:
  - download: pipelineAlias1
  - download: pipelineAlias2
and then set that up as another Azure pipeline to execute when updated.
How do you download all pipeline resources in a job?
Indeed, this is a known issue with using the download keyword to download pipeline artifacts.
You can track it in the ticket below:
Artifacts do not download on multi stage yaml build using DownloadPipelineArtifactV2
To resolve this issue, please try to use the DownloadPipelineArtifact task instead of the download keyword:
- task: DownloadPipelineArtifact@2
  displayName: 'Download Pipeline Artifact'
  inputs:
    path: $(Build.SourcesDirectory)
Update:
Looking at your YAML file, it seems you do not have a build job in your pipeline. The DownloadPipelineArtifact task is used to:
download pipeline artifacts from earlier stages in this pipeline,
or from another pipeline.
So, we need to add a stage that builds the pipeline and generates the artifact; otherwise, no pipeline artifacts will be downloaded.
Check my test YAML file:
variables:
  ArtifactName: drop

stages:
- stage: Build
  jobs:
  - job: Build
    displayName: Build
    pool:
      name: MyPrivateAgent
    steps:
    - task: CopyFiles@2
      displayName: 'Copy Files to: $(build.artifactstagingdirectory)'
      inputs:
        SourceFolder: '$(System.DefaultWorkingDirectory)\LibmanTest'
        targetFolder: '$(build.artifactstagingdirectory)'
    - task: PublishBuildArtifacts@1
      displayName: 'Publish Artifact: LibmanTest'
      inputs:
        ArtifactName: $(ArtifactName)
- stage: Dev
  dependsOn: Build
  jobs:
  - job: Dev
    displayName: Dev
    pool:
      name: MyPrivateAgent
    steps:
    - task: CopyFiles@2
      displayName: 'Copy Files to: $(build.artifactstagingdirectory)'
      inputs:
        SourceFolder: '$(System.DefaultWorkingDirectory)\TestSample'
        targetFolder: '$(build.artifactstagingdirectory)'
    - task: PublishBuildArtifacts@1
      displayName: 'Publish Artifact: TestSample'
      inputs:
        ArtifactName: $(ArtifactName)
- stage: Deployment
  dependsOn: Dev
  pool:
    name: MyPrivateAgent
  jobs:
  - deployment: Deployment
    displayName: DeployA
    environment: 7-1-0
    strategy:
      runOnce:
        deploy:
          steps:
          - task: DownloadPipelineArtifact@2
            displayName: 'Download Pipeline Artifact'
            inputs:
              path: $(Build.SourcesDirectory)
As you can see, I use the two stages Build and Dev to publish the projects LibmanTest and TestSample as artifacts, then use the DownloadPipelineArtifact task to download those two artifacts.
The test result:
Update 2:
your example still does not show resources.pipelines
Since you now want to download artifacts from other pipelines, you cannot use the default configuration for the DownloadPipelineArtifact task:
resources:
  pipelines:
  - pipeline: xxx
    project: MyTestProject
    source: xxx
    trigger:
      branches:
        include:
        - master

jobs:
- deployment: Deployment
  displayName: DeployA
  environment: 7-1-0
  strategy:
    runOnce:
      deploy:
        steps:
        - task: DownloadPipelineArtifact@2
          displayName: 'Download Pipeline Artifact For Test'
          inputs:
            buildType: specific
            project: MyTestProject
            definition: 13
The test result:
Besides, you could check the configuration in the classic editor and get the YAML file from there.
Hope this helps.

How do I specify that a specific Azure job needs a given OS on the level of the jobs?

All of my build agents are in one pool, but different build agents have different operating systems. Certain jobs, however, need Windows, and so far I have been unsuccessfully trying to tell that to Azure via demands:
stages:
- stage: project_frontend
  dependsOn: common_container
  demands: Agent.OS -equals Windows_NT
  jobs:
  - job: build_container
    steps:
    - task: Docker@2
      displayName: 'login to docker hub'
      inputs:
        command: login
        containerRegistry: dockerHubServiceConnection
Check the documentation here: YAML schema reference. The job level supports pool and demands. As an example, for the Microsoft-hosted agents:
jobs:
- job: Windows
  pool:
    vmImage: 'vs2017-win2016'
  steps:
  - script: echo hello from Windows
- job: macOS
  pool:
    vmImage: 'macOS-10.14'
  steps:
  - script: echo hello from macOS
- job: Linux
  pool:
    vmImage: 'ubuntu-16.04'
  steps:
  - script: echo hello from Linux
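
For the self-hosted pool in the question, demands belong under pool at the job level; a minimal sketch, assuming the pool is named Default (replace with your own pool name):

jobs:
- job: build_container
  pool:
    name: Default                      # placeholder for your agent pool
    demands:
    - Agent.OS -equals Windows_NT      # only agents reporting Windows_NT match
  steps:
  - task: Docker@2
    displayName: 'login to docker hub'
    inputs:
      command: login
      containerRegistry: dockerHubServiceConnection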