Azure DevOps Pipeline with environments gets stuck - azure-devops

I created an environment and registered an on-premises virtual machine as a resource.
Whenever I try to run the deployment against that resource, the pipeline gets stuck in the deployment stage. Inside the job, the only log I see is "Job is pending...".
This is the relevant section of the pipeline:
- stage: deployInTest
  displayName: Deploy in Test Envs
  dependsOn: build
  jobs:
  - deployment: Deploy
    displayName: "Deploy in Test"
    environment:
      name: 'Development'
      resourceType: VirtualMachine
    strategy:
      runOnce:
        deploy:
          steps:
          - task: DownloadBuildArtifacts@1
            inputs:
              buildType: 'current'
              downloadType: 'single'
              artifactName: 'frontEnd'
              downloadPath: '$(System.ArtifactsDirectory)'
Note that if I change this to the following YAML, the stage runs, but it tries to execute the IISWebAppManagementOnMachineGroup@0 task on the deployment server (also on-prem), where IIS is not installed.
- stage: deployInTest
  displayName: Deploy in Test Envs
  dependsOn: build
  jobs:
  - deployment: Deploy
    displayName: "Deploy in dev3"
    environment: "Development"
    strategy:
      runOnce:
        deploy:
          steps:
          - task: DownloadBuildArtifacts@1
            inputs:
              buildType: 'current'
              downloadType: 'single'
              artifactName: 'frontEnd'
              downloadPath: '$(System.ArtifactsDirectory)'

OK, so after trying something that had been in the back of my mind since yesterday, it worked. I saw another post on SO (I can't find it right now) with the same issue, but they had two deployment jobs.
Their solution was to give the jobs unique names, i.e. 'DeployInTest' instead of 'Deploy'.
For me, changing

- stage: deployInTest
  displayName: Deploy in Test Envs
  dependsOn: build
  jobs:
  - deployment: Deploy
    displayName: "Deploy in Test"

into

- stage: deployInTest
  displayName: Deploy in Test Envs
  dependsOn: build
  jobs:
  - deployment: DeployInTest # <-- this is what changed
    displayName: "Deploy in Test"
did the trick. I just realized I had even tried that before writing the question. I'll edit the question to show the actual state in which it was not working.

Related

How to use Azure DevOps Pipelines Machine File Copy Using Environments?

I need to move files from ADO to a VM. The VM is set up using "Environments" and tagged appropriately. I would like to copy those files to the VM using its environment name and tag. So far I've only found "Windows machine file copy", which requires storing an admin login. Is there a way to use Environments instead of hardcoding a login?
You can set up the YAML pipeline as below.
If you want to copy the build artifact files to the VM, reference the sample below.
In the Build stage, set up the jobs to build the source code and publish the artifact files for later use.
In the Deploy stage, when you set up the deployment job with an environment whose resource is a VM, all steps in that job run on the VM. As its very first step, the deployment job automatically downloads the artifact files to the working directory on the VM.
Then you can use the CopyFiles task to copy the artifact files to any accessible directory on the VM.
stages:
- stage: Build
  displayName: 'Build'
  . . .
- stage: Deploy
  displayName: 'Deploy'
  dependsOn: Build
  condition: succeeded()
  jobs:
  - deployment: Deployment
    displayName: 'Deployment'
    environment: '{EnvironmentName.ResourceName}'
    strategy:
      runOnce:
        deploy:
          steps:
          - task: CopyFiles@2
            displayName: 'Copy Files to: {Destination Directory}'
            inputs:
              SourceFolder: '$(Pipeline.Workspace)/drop'
              Contents: '**'
              TargetFolder: '{Destination Directory}'
              CleanTargetFolder: true
If the files you want to copy to the VM are the source files in the repository, reference the sample below.
stages:
- stage: Deploy
  displayName: 'Deploy'
  dependsOn: Build
  condition: succeeded()
  jobs:
  - deployment: Deployment
    displayName: 'Deployment'
    environment: '{EnvironmentName.ResourceName}'
    strategy:
      runOnce:
        deploy:
          steps:
          - checkout: self
          - task: CopyFiles@2
            displayName: 'Copy Files to: {Destination Directory}'
            inputs:
              SourceFolder: '$(Pipeline.Workspace)'
              Contents: '**'
              TargetFolder: '{Destination Directory}'
              CleanTargetFolder: true
For more details, you can see: https://learn.microsoft.com/en-us/azure/devops/pipelines/process/environments-kubernetes?view=azure-devops
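To make the '{EnvironmentName.ResourceName}' placeholder concrete, here is a small illustration (the names Development and WebVM01 are hypothetical, not from the answer above):

```yaml
- deployment: Deployment
  # 'Development' is the environment name, 'WebVM01' the name of the
  # VM resource registered in it; both are made-up examples.
  environment: 'Development.WebVM01'
```

Targeting the resource by name like this makes the deployment job run on that specific VM's agent.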
I had been struggling for a while with setting up an Azure Pipeline to deploy a .NET Core service to a VM. I had the following requirements:
to use YAML files instead of the classic UI
to deploy as a Windows service, not to IIS
to use stages in the pipeline
I was using a monorepo with the service residing in the MyService folder.
In addition, I had an external NuGet feed.
My solution consisted of several projects, and I was building only one of them.
appsettings.release.json was replaced with the one persisted on the server to preserve settings.
I was inspired by Bright Ran-MSFT's answer and suggest my complete azure-pipelines.yml file:
trigger:
  branches:
    include:
    - staging
  paths:
    include:
    - MyService
pool:
  vmImage: "windows-latest"
variables:
  solution: "MyService/MyService.sln"
  buildPlatform: "Any CPU"
  buildConfiguration: "Release"
stages:
- stage: Build
  jobs:
  - job: BuildJob
    steps:
    - task: NuGetCommand@2
      inputs:
        restoreSolution: "$(solution)"
        feedsToUse: "config"
        nugetConfigPath: "MyService/NuGet.Config"
    - task: VSBuild@1
      inputs:
        solution: "$(solution)"
        msbuildArgs: '/t:MyService:Rebuild /p:DeployOnBuild=true /p:WebPublishMethod=Package /p:SkipInvalidConfigurations=true /p:OutDir="$(build.artifactStagingDirectory)\service_package"'
        platform: "$(buildPlatform)"
        configuration: "$(buildConfiguration)"
    - task: VSTest@2
      inputs:
        platform: "$(buildPlatform)"
        configuration: "$(buildConfiguration)"
    - task: PublishPipelineArtifact@1
      inputs:
        targetPath: '$(build.artifactStagingDirectory)\service_package'
        artifactName: "service_package"
- stage: Deploy
  displayName: 'Deploy'
  dependsOn: Build
  condition: succeeded()
  jobs:
  - deployment: Deployment
    displayName: 'Deployment'
    environment: 'MainDeployEnv.MY_VM_SERVER'
    strategy:
      runOnce:
        deploy:
          steps:
          - task: CopyFiles@2
            displayName: 'Copy Package to: C:\azservices\MyService\service'
            inputs:
              SourceFolder: '$(Pipeline.Workspace)/service_package'
              Contents: '**'
              TargetFolder: 'C:\azservices\MyService\service\'
              CleanTargetFolder: true
          - task: CopyFiles@2
            displayName: 'Replace appsettings.release.json'
            inputs:
              SourceFolder: 'C:\azservices\MyService\settings'
              Contents: 'appsettings.release.json'
              TargetFolder: 'C:\azservices\MyService\service\'
              OverWrite: true
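Since the requirement was to run the app as a Windows service rather than under IIS, the file copy usually needs to be bracketed by stopping and restarting the service. A hedged sketch of such steps inside the deploy: section (the service name MyService is an assumption, not part of the answer above):

```yaml
# Hypothetical extra steps placed around the CopyFiles tasks
- task: PowerShell@2
  displayName: 'Stop Windows service before copy'
  inputs:
    targetType: inline
    # SilentlyContinue so the first-ever deployment (no service yet) does not fail
    script: Stop-Service -Name 'MyService' -ErrorAction SilentlyContinue
# ... CopyFiles tasks here ...
- task: PowerShell@2
  displayName: 'Start Windows service after copy'
  inputs:
    targetType: inline
    script: Start-Service -Name 'MyService'
```

Without the stop step, the copy can fail because the running service keeps its binaries locked.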

Configure approval for Azure Pipelines deployment stage against Azure App Services

I'm defining an Azure Pipeline as code in which there will be several deployment stages (staging and production). I need the production stage to be executed only on approval from certain users.
Currently there's a way to define approvals for "Environments". However, this only covers resources such as VMs and K8s, while the application will be deployed to Azure App Services:
Pipeline excerpt:
- stage: Deploy_Production
  pool:
    vmImage: ubuntu-latest
  jobs:
  - job: deploy
    steps:
    - script: find ./
    - task: DownloadBuildArtifacts@0
      inputs:
        buildType: 'current'
        downloadType: 'single'
        artifactName: 'drop'
        downloadPath: '$(System.ArtifactsDirectory)'
    - script: 'find $(System.ArtifactsDirectory)'
    - task: AzureRmWebAppDeployment@4
      inputs:
        ConnectionType: 'AzureRM'
        azureSubscription: 'Free Trial(xxx)'
        appType: 'webAppLinux'
        WebAppName: 'app'
        packageForLinux: '$(System.ArtifactsDirectory)/**/*.jar'
        RuntimeStack: 'JAVA|11-java11'
        StartupCommand: 'java -jar $(System.ArtifactsDirectory)/drop/build/libs/app.jar'
How can I configure approvals in this scenario?
UPDATE:
Following MorrowSolutions' answer, I updated my pipeline.
If I leave it as shown in the answer, the steps entry is highlighted as invalid syntax:
If I indent it, it seems to be correct. The deployment stage executes and downloads the artifact, but nothing else seems to be executed (scripts, deploy task...):
So, the resources you tie to an environment do not restrict which pipelines can be associated with that environment. They're also not required, and at the moment Microsoft only supports Kubernetes and VMs, so you won't be able to associate an Azure App Service.
In your case, don't associate any resources with your environment. You'll want to update your YAML to use a deployment job specifically and specify the environment within its parameters. This tells your pipeline to associate releases with the environment you've configured, so the approvals defined on that environment will gate the stage. It should look something like this in your case:
stages:
- stage: Deploy_Production
  pool:
    vmImage: ubuntu-latest
  jobs:
  - deployment: DeployWeb
    displayName: Deploy Web App
    environment: YourApp-QA
    pool:
      vmImage: 'ubuntu-latest'
    strategy:
      runOnce:
        deploy:
          steps:
          - script: find ./
          - task: DownloadBuildArtifacts@0
            inputs:
              buildType: 'current'
              downloadType: 'single'
              artifactName: 'drop'
              downloadPath: '$(System.ArtifactsDirectory)'
          - script: 'find $(System.ArtifactsDirectory)'
          - task: AzureRmWebAppDeployment@4
            inputs:
              ConnectionType: 'AzureRM'
              azureSubscription: 'Free Trial(xxx)'
              appType: 'webAppLinux'
              WebAppName: 'app'
              packageForLinux: '$(System.ArtifactsDirectory)/**/*.jar'
              RuntimeStack: 'JAVA|11-java11'
              StartupCommand: 'java -jar $(System.ArtifactsDirectory)/drop/build/libs/app.jar'
Here is Microsoft's documentation on the deployment job schema, with more information on how to use the environment parameter:
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/deployment-jobs?view=azure-devops#schema
I actually just had a conversation with someone about this. You're not alone in thinking that the resources you tie to an environment have to be associated with the resources you're deploying within your YAML pipeline :)

Azure DevOps yaml pipeline : how to download only specific artifacts of a referenced pipeline?

By referencing another pipeline in a YAML pipeline's resources, all artifacts published by the referenced pipeline get downloaded automatically. I'm not sure how to stop this behavior and download only the needed artifacts. Adding a download task for only the needed artifacts does not stop the initial download of the full set.
So what you need is to disable the default behavior:
Artifacts are only downloaded automatically in deployment jobs. In a regular build job, you need to explicitly use the download step keyword or the Download Pipeline Artifact task.
To stop artifacts from being downloaded automatically, add a download step and set its value to none:
steps:
- download: none
and then add an additional step to download the specific artifact.
Here is an example:
resources:
  pipelines:
  - pipeline: MultipleArtifact
    project: 'DevOps Manual'
    source: 'kmadof.devops-manual (64)'
jobs:
- job: Build
  pool:
    vmImage: 'ubuntu-latest'
  steps:
  - script: echo Hello, world!
    displayName: 'Run a one-line script'
  - script: |
      echo Add other tasks to build, test, and deploy your project.
      echo See https://aka.ms/yaml
    displayName: 'Run a multi-line script'
# Track deployments on the environment.
- deployment: DeployWeb
  displayName: deploy Web App
  pool:
    vmImage: 'Ubuntu-16.04'
  # Creates an environment if it doesn't exist.
  environment: 'smarthotel-dev'
  strategy:
    # Default deployment strategy, more coming...
    runOnce:
      deploy:
        steps:
        - download: none
        - download: MultipleArtifact
          artifact: art-1
        - checkout: self
        - script: echo my first deployment
To download a specific artifact rather than all artifacts, you can include the following:
steps:
- download: current
  artifact: 'Artifact-Name'
Below is a complete example comparing the default behaviour with download: current and download: none.
Here is a screenshot from Azure Pipelines showing the correct number of download steps in each case.
jobs:
- job: Create_Two_Artifacts
  steps:
  - bash: echo "test" >> file.txt
  - task: PublishBuildArtifacts@1
    inputs:
      pathToPublish: 'file.txt'
      artifactName: 'Artifact1'
  - task: PublishBuildArtifacts@1
    inputs:
      pathToPublish: 'file.txt'
      artifactName: 'Artifact2'
- deployment: Download_All_Deployment
  environment: MyEnvironment
  dependsOn: [Create_Two_Artifacts]
  strategy:
    runOnce:
      deploy:
        steps: []
- deployment: Download_SpecificArtefact_Deployment
  environment: MyEnvironment
  dependsOn: [Create_Two_Artifacts]
  strategy:
    runOnce:
      deploy:
        steps:
        - download: current
          artifact: 'Artifact1'
- deployment: Download_None_Deployment
  environment: MyEnvironment
  dependsOn: [Create_Two_Artifacts]
  strategy:
    runOnce:
      deploy:
        steps:
        - download: none

Azure Pipeline with multiple drop folders in YAML

I have created a YAML pipeline for Azure deployment. There are many templates, but I will only show the master pipeline to illustrate my issue.
Basically:
the first stage builds from repository source;
the next stage is pre-deployment, followed by deployment.
The build drops the output files to a drop folder. During pre-deployment, some of these files go through transformations (replacing tokens with values according to the target environment).
The problem is that currently there is only one drop folder, so you can see the problem coming... If I deploy to DEV, the files are transformed using the DEV values. But then if I deploy to INT, the files are already transformed, and I end up deploying files with DEV values to INT.
It gets worse if the DEV and INT deployments run at the same time...
So how can I use a separate drop folder per environment? In pre-deployment, should I copy the drop folder to another location before transformation? In that case, how do I specify the new location in the deployment stages?
Here's the master pipeline for reference :
trigger:
- master
pool:
  name: Default
  demands:
  - npm
  - msbuild
  - visualstudio
stages:
- stage: build
  displayName: 'Build & Test stage'
  jobs:
  - template: templates/pipeline-build/master.yml
    parameters:
      buildConfiguration: 'Release'
      dropFolder: '\\srvbuild\DROP'
- stage: deployDev
  displayName: 'Deploy Dev Stage'
  dependsOn: build
  condition: succeeded()
  jobs:
  - deployment: deploymentjob
    displayName: deployment job
    environment: dev
    variables:
    - template: variables/dev.yml
    strategy:
      runOnce:
        preDeploy:
          steps:
          - template: templates/pipeline-predeploy/master.yml
        deploy:
          steps:
          - template: templates/pipeline-deploy/master.yml
- stage: deployInt
  displayName: 'Deploy Int Stage'
  dependsOn: build
  condition: succeeded()
  jobs:
  - deployment: deploymentjob
    displayName: deployment job
    environment: int
    variables:
    - template: variables/int.yml
    strategy:
      runOnce:
        preDeploy:
          steps:
          - template: templates/pipeline-predeploy/master.yml
        deploy:
          steps:
          - template: templates/pipeline-deploy/master.yml
As a workaround, you can publish the build artifact to a file share, and then download it through the Download Fileshare Artifacts task in each stage to transform it separately.
- task: PublishPipelineArtifact@1
  displayName: 'Publish Pipeline Artifact'
  inputs:
    artifact: drop
    publishLocation: filepath
    fileSharePath: '***'
Use this task to download fileshare artifacts:
- task: DownloadFileshareArtifacts@1
  inputs:
    filesharePath:
    artifactName:
    #itemPattern: '**' # Optional
    #downloadPath: '$(System.ArtifactsDirectory)'
    #parallelizationLimit: '8' # Optional
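Putting the two tasks together, a sketch of what the Dev stage's pre-deploy step might look like (the share path \\srvbuild\DROP comes from the question's build template; the per-environment downloadPath is an assumption):

```yaml
- stage: deployDev
  jobs:
  - deployment: deploymentjob
    environment: dev
    strategy:
      runOnce:
        preDeploy:
          steps:
          - task: DownloadFileshareArtifacts@1
            inputs:
              filesharePath: '\\srvbuild\DROP'
              artifactName: 'drop'
              # Separate folder per environment, so the DEV transformation
              # never touches the files INT will deploy.
              downloadPath: '$(System.ArtifactsDirectory)/dev'
```

Because each stage downloads its own untransformed copy from the share, token replacement in one environment no longer affects another, even when DEV and INT deployments run concurrently.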

How do you download *all* pipeline resources in a job?

A deployment job automatically downloads all the pipeline resources. However, a standard job does not. I tried to use - download: current, but that does not download the pipeline resources.
The reason I want to do this is to simulate a deployment for GitOps. The simulation will include a step that does a git diff showing the differences for review.
However, I don't see an all or * option in https://learn.microsoft.com/en-us/azure/devops/pipelines/yaml-schema?view=azure-devops&tabs=schema%2Cparameter-schema#download
My current workaround is to do a deployment to a "Temp" environment prior to the real deployment.
UPDATE:
Here's an example of what I have tried
resources:
  pipelines:
  - pipeline: auth
    project: myproj
    source: auth CI
    trigger:
      branches:
        include:
        - master
...
jobs:
- job: diagnostics
  displayName: Job Diagnostics
  steps:
  - checkout: self
  # - download:
  - task: DownloadPipelineArtifact@2
    displayName: 'Download Pipeline Artifact'
    inputs:
      path: $(Build.SourcesDirectory)
  - bash: |
      env | sort
    displayName: Display environment variables
  - bash: |
      pwd
    displayName: Present working directory
  - bash: |
      find $(Pipeline.Workspace) -type f -print
    displayName: Display files
UPDATE:
Another approach I was mulling over is to create a pipeline that creates another pipeline. That way the list of pipeline resources does not need to be Azure Pipelines YAML; it could be a CSV, or a simplified YAML that I transform, in which case I can generate:
resources:
  pipelines:
  - pipeline: pipelineAlias1
    project: myproj
    source: auth CI
    branch: master
    trigger:
      branches:
        include:
        - master
  - pipeline: pipelineAlias2
    project: myproj
    source: auth CI
    branch: master
    trigger:
      branches:
        include:
        - master
...
job:
  steps:
  - download: pipelineAlias1
  - download: pipelineAlias2
and then set that up as another Azure pipeline to execute when updated.
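The generation step described above can be sketched in a few lines of Python. This is only an illustration of the idea, not part of the original post; the CSV column names and the generate_resources_yaml helper are made up for the example:

```python
import csv
import io

def generate_resources_yaml(csv_text):
    """Emit a resources.pipelines block plus matching download steps
    from a CSV listing pipelines (columns: alias, project, source, branch)."""
    lines = ["resources:", "  pipelines:"]
    downloads = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        lines += [
            f"  - pipeline: {row['alias']}",
            f"    project: {row['project']}",
            f"    source: {row['source']}",
            f"    branch: {row['branch']}",
        ]
        downloads.append(f"  - download: {row['alias']}")
    # One explicit download step per declared pipeline resource.
    lines += ["jobs:", "- job: diff", "  steps:"] + downloads
    return "\n".join(lines)

example = """alias,project,source,branch
auth,myproj,auth CI,master
web,myproj,web CI,master"""

print(generate_resources_yaml(example))
```

A small script like this, run whenever the CSV changes, can regenerate and check in the full pipeline YAML, so the resource list and the download steps never drift apart.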
How do you download all pipeline resources in a job?
Indeed, this is a known issue with using the download keyword to download pipeline artifacts.
You can track it in the ticket below:
Artifacts do not download on multi stage yaml build using DownloadPipelineArtifactV2
To resolve this issue, try using the DownloadPipelineArtifact task instead of the download keyword:
- task: DownloadPipelineArtifact@2
  displayName: 'Download Pipeline Artifact'
  inputs:
    path: $(Build.SourcesDirectory)
Update:
I have looked at your YAML file; it seems your pipeline has no build job. The DownloadPipelineArtifact task is used to:
download pipeline artifacts from earlier stages in this pipeline,
or from another pipeline
So we need to add a stage that builds the code and generates the artifact; otherwise, no pipeline artifacts are downloaded.
Check my test YAML file:
variables:
  ArtifactName: drop
stages:
- stage: Build
  jobs:
  - job: Build
    displayName: Build
    pool:
      name: MyPrivateAgent
    steps:
    - task: CopyFiles@2
      displayName: 'Copy Files to: $(build.artifactstagingdirectory)'
      inputs:
        SourceFolder: '$(System.DefaultWorkingDirectory)\LibmanTest'
        targetFolder: '$(build.artifactstagingdirectory)'
    - task: PublishBuildArtifacts@1
      displayName: 'Publish Artifact: LibmanTest'
      inputs:
        ArtifactName: $(ArtifactName)
- stage: Dev
  dependsOn: Build
  jobs:
  - job: Dev
    displayName: Dev
    pool:
      name: MyPrivateAgent
    steps:
    - task: CopyFiles@2
      displayName: 'Copy Files to: $(build.artifactstagingdirectory)'
      inputs:
        SourceFolder: '$(System.DefaultWorkingDirectory)\TestSample'
        targetFolder: '$(build.artifactstagingdirectory)'
    - task: PublishBuildArtifacts@1
      displayName: 'Publish Artifact: TestSample'
      inputs:
        ArtifactName: $(ArtifactName)
- stage: Deployment
  dependsOn: Dev
  pool:
    name: MyPrivateAgent
  jobs:
  - deployment: Deployment
    displayName: DeployA
    environment: 7-1-0
    strategy:
      runOnce:
        deploy:
          steps:
          - task: DownloadPipelineArtifact@2
            displayName: 'Download Pipeline Artifact'
            inputs:
              path: $(Build.SourcesDirectory)
As you can see, I use two stages, Build and Dev, to publish the projects LibmanTest and TestSample as artifacts, then use the DownloadPipelineArtifact task to download those two artifacts.
The test result:
Update 2:
your example still does not show resources.pipelines
So, now that you want to download artifacts from other pipelines, you cannot use the default configuration for the DownloadPipelineArtifact task:
resources:
  pipelines:
  - pipeline: xxx
    project: MyTestProject
    source: xxx
    trigger:
      branches:
        include:
        - master
jobs:
- deployment: Deployment
  displayName: DeployA
  environment: 7-1-0
  strategy:
    runOnce:
      deploy:
        steps:
        - task: DownloadPipelineArtifact@2
          displayName: 'Download Pipeline Artifact For Test'
          inputs:
            buildType: specific
            project: MyTestProject
            definition: 13
The test result:
Besides, you can check the configuration in the classic editor and then get the YAML file from it:
Hope this helps.