I need to move files from ADO to a VM. This VM is set up using "Environments" and tagged appropriately. I would like to copy those files to that VM using its environment name and tag. So far I've only found "Windows machine file copy", which requires storing an admin login. Is there a way to use the Environments instead of hardcoding a login?
You can set up the YAML pipeline as below.
If you want to copy the build artifact files to the VM, refer to the sample below.
In the Build stage, set up the jobs to build the source code and publish the artifact files for later use.
In the Deploy stage, when you target the deployment job at an environment resource on a VM, all steps in that job run on that VM. As its implicit first step, the deployment job automatically downloads the artifact files to the working directory on the VM.
Then you can use the CopyFiles task to copy the artifact files to any accessible directory on the VM.
stages:
- stage: Build
  displayName: 'Build'
  . . .

- stage: Deploy
  displayName: 'Deploy'
  dependsOn: Build
  condition: succeeded()
  jobs:
  - deployment: Deployment
    displayName: 'Deployment'
    environment: '{EnvironmentName.ResourceName}'
    strategy:
      runOnce:
        deploy:
          steps:
          - task: CopyFiles@2
            displayName: 'Copy Files to: {Destination Directory}'
            inputs:
              SourceFolder: '$(Pipeline.Workspace)/drop'
              Contents: '**'
              TargetFolder: '{Destination Directory}'
              CleanTargetFolder: true
If the files you want to copy to the VM are the source files in the repository, refer to the sample below.
stages:
- stage: Deploy
  displayName: 'Deploy'
  dependsOn: Build
  condition: succeeded()
  jobs:
  - deployment: Deployment
    displayName: 'Deployment'
    environment: '{EnvironmentName.ResourceName}'
    strategy:
      runOnce:
        deploy:
          steps:
          - checkout: self
          - task: CopyFiles@2
            displayName: 'Copy Files to: {Destination Directory}'
            inputs:
              SourceFolder: '$(Pipeline.Workspace)'
              Contents: '**'
              TargetFolder: '{Destination Directory}'
              CleanTargetFolder: true
For more details, you can see: https://learn.microsoft.com/en-us/azure/devops/pipelines/process/environments-kubernetes?view=azure-devops
I have been struggling for a while with setting up an Azure Pipeline to deploy a .NET Core service to a VM. I had the following requirements:
to use YAML files instead of the classic UI
to deploy as a Windows service, not to IIS
to use stages in the pipeline
I was using a monorepo, with the service residing in the MyService folder.
In addition, I had an external NuGet feed.
My solution consisted of several projects, and I was building only one of them.
appsettings.release.json was replaced with the one persisted on the server to preserve settings.
I was inspired by Bright Ran-MSFT's answer and suggest my complete azure-pipelines.yml file:
trigger:
  branches:
    include:
    - staging
  paths:
    include:
    - MyService

pool:
  vmImage: "windows-latest"

variables:
  solution: "MyService/MyService.sln"
  buildPlatform: "Any CPU"
  buildConfiguration: "Release"

stages:
- stage: Build
  jobs:
  - job: BuildJob
    steps:
    - task: NuGetCommand@2
      inputs:
        restoreSolution: "$(solution)"
        feedsToUse: "config"
        nugetConfigPath: "MyService/NuGet.Config"
    - task: VSBuild@1
      inputs:
        solution: "$(solution)"
        msbuildArgs: '/t:MyService:Rebuild /p:DeployOnBuild=true /p:WebPublishMethod=Package /p:SkipInvalidConfigurations=true /p:OutDir="$(build.artifactStagingDirectory)\service_package"'
        platform: "$(buildPlatform)"
        configuration: "$(buildConfiguration)"
    - task: VSTest@2
      inputs:
        platform: "$(buildPlatform)"
        configuration: "$(buildConfiguration)"
    - task: PublishPipelineArtifact@1
      inputs:
        targetPath: '$(build.artifactStagingDirectory)\service_package'
        artifactName: "service_package"

- stage: Deploy
  displayName: 'Deploy'
  dependsOn: Build
  condition: succeeded()
  jobs:
  - deployment: Deployment
    displayName: 'Deployment'
    environment: 'MainDeployEnv.MY_VM_SERVER'
    strategy:
      runOnce:
        deploy:
          steps:
          - task: CopyFiles@2
            displayName: 'Copy Package to: C:\azservices\MyService\service'
            inputs:
              SourceFolder: '$(Pipeline.Workspace)/service_package'
              Contents: '**'
              TargetFolder: 'C:\azservices\MyService\service\'
              CleanTargetFolder: true
          - task: CopyFiles@2
            displayName: 'Replace appsettings.release.json'
            inputs:
              SourceFolder: 'C:\azservices\MyService\settings'
              Contents: 'appsettings.release.json'
              TargetFolder: 'C:\azservices\MyService\service\'
              OverWrite: true
I am getting the error "Solution not found using search pattern 'D:\a\1\s\**\*.sln'" while building and deploying a dacpac via a YAML file.
My YAML file is below.
trigger:
- master

pool:
  name: Azure Pipelines
  vmImage: 'vs2017-win2016'

jobs:
- deployment: BuildAndDeploy
  displayName: Build And Deploy Dacpac
  environment: 'DEV'
  strategy:
    runOnce:
      deploy:
        steps:
        - task: VSBuild@1
          displayName: 'Build solution **\*.sln'
        - task: CopyFiles@2
          displayName: 'Copy Files to: $(build.artifactstagingdirectory)'
          inputs:
            SourceFolder: '$(agent.builddirectory)'
            TargetFolder: '$(build.artifactstagingdirectory)'
        - task: AzureKeyVault@1
          displayName: 'Azure Key Vault: kv-agaurav-poc'
          inputs:
            azureSubscription: 'Visual Studio Enterprise-abonnement (b5970491-02a8-4fd0-b9b4-73a6e63a273a)'
            KeyVaultName: 'kv-agaurav-poc'
            RunAsPreJob: true
        - task: SqlAzureDacpacDeployment@1
          displayName: 'Azure SQL DacpacTask'
          inputs:
            azureSubscription: 'Visual Studio Enterprise-abonnement (b5970491-02a8-4fd0-b9b4-73a6e63a273a)'
            ServerName: 'fastbin-server.database.windows.net'
            DatabaseName: 'fastbin-db'
            SqlUsername: agaurav
            SqlPassword: '$(sqlpassword)'
            DacpacFile: 'D:\a\1\a\s\bin\Debug\fastbin-db.dacpac'
One thing to note is that if I have the steps and tasks outside the environment, it works.
So, my question is: how can I make the YAML file find the solution inside an environment tag (in this case environment: 'DEV')?
A deployment job doesn't check out your code by default. You need to add - checkout: self to download the code before you try to build your solution.
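As a minimal sketch, reusing the deployment job from the question, the checkout step goes at the top of the deploy steps; the remaining tasks stay unchanged and are omitted here for brevity:

jobs:
- deployment: BuildAndDeploy
  displayName: Build And Deploy Dacpac
  environment: 'DEV'
  strategy:
    runOnce:
      deploy:
        steps:
        - checkout: self   # fetch the repository so VSBuild can find the .sln
        - task: VSBuild@1
          displayName: 'Build solution **\*.sln'
        # ... CopyFiles, AzureKeyVault and SqlAzureDacpacDeployment tasks as before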
The deployment job automatically downloads all the pipeline resources. However, a standard job does not. I tried using - download: current, but that does not download the pipeline resources.
The reason I want to do this is to simulate a deployment for GitOps. The simulation will include a step that does a git diff that shows the differences for review.
However, I don't see an all or * option in https://learn.microsoft.com/en-us/azure/devops/pipelines/yaml-schema?view=azure-devops&tabs=schema%2Cparameter-schema#download
My current workaround is to do a deployment to a "Temp" environment prior to the real deployment.
UPDATE:
Here's an example of what I have tried
resources:
  pipelines:
  - pipeline: auth
    project: myproj
    source: auth CI
    trigger:
      branches:
        include:
        - master

...

jobs:
- job: diagnostics
  displayName: Job Diagnostics
  steps:
  - checkout: self
  # - download:
  - task: DownloadPipelineArtifact@2
    displayName: 'Download Pipeline Artifact'
    inputs:
      path: $(Build.SourcesDirectory)
  - bash: |
      env | sort
    displayName: Display environment variables
  - bash: |
      pwd
    displayName: Present working directory
  - bash: |
      find $(Pipeline.Workspace) -type f -print
    displayName: Display files
UPDATE:
Another approach I was mulling is to create a pipeline that creates another pipeline. That way the list of pipeline resources does not need to be Azure Pipelines YAML; it could be a CSV, or a simplified YAML that I transform, in which case I can generate:
resources:
  pipelines:
  - pipeline: pipelineAlias1
    project: myproj
    source: auth CI
    branch: master
    trigger:
      branches:
        include:
        - master
  - pipeline: pipelineAlias2
    project: myproj
    source: auth CI
    branch: master
    trigger:
      branches:
        include:
        - master

...

job:
  steps:
  - download: pipelineAlias1
  - download: pipelineAlias2
and then set that up as another Azure pipeline to execute when updated.
How do you download all pipeline resources in a job?
Indeed, this is a known issue with using the download keyword to download pipeline artifacts.
You can track this issue via the ticket below:
Artifacts do not download on multi stage yaml build using DownloadPipelineArtifactV2
To resolve this issue, please try to use the DownloadPipelineArtifact task instead of the download keyword:
- task: DownloadPipelineArtifact@2
  displayName: 'Download Pipeline Artifact'
  inputs:
    path: $(Build.SourcesDirectory)
Update:
I have noticed in your YAML file that you do not seem to add a build job to your pipeline. The DownloadPipelineArtifact task is used to:
download pipeline artifacts from earlier stages in this pipeline,
or from another pipeline.
So we need to add a stage that builds the pipeline and generates the artifact; otherwise, no pipeline artifacts are downloaded.
Check my test YAML file:
variables:
  ArtifactName: drop

stages:
- stage: Build
  jobs:
  - job: Build
    displayName: Build
    pool:
      name: MyPrivateAgent
    steps:
    - task: CopyFiles@2
      displayName: 'Copy Files to: $(build.artifactstagingdirectory)'
      inputs:
        SourceFolder: '$(System.DefaultWorkingDirectory)\LibmanTest'
        targetFolder: '$(build.artifactstagingdirectory)'
    - task: PublishBuildArtifacts@1
      displayName: 'Publish Artifact: LibmanTest'
      inputs:
        ArtifactName: $(ArtifactName)

- stage: Dev
  dependsOn: Build
  jobs:
  - job: Dev
    displayName: Dev
    pool:
      name: MyPrivateAgent
    steps:
    - task: CopyFiles@2
      displayName: 'Copy Files to: $(build.artifactstagingdirectory)'
      inputs:
        SourceFolder: '$(System.DefaultWorkingDirectory)\TestSample'
        targetFolder: '$(build.artifactstagingdirectory)'
    - task: PublishBuildArtifacts@1
      displayName: 'Publish Artifact: TestSample'
      inputs:
        ArtifactName: $(ArtifactName)

- stage: Deployment
  dependsOn: Dev
  pool:
    name: MyPrivateAgent
  jobs:
  - deployment: Deployment
    displayName: DeployA
    environment: 7-1-0
    strategy:
      runOnce:
        deploy:
          steps:
          - task: DownloadPipelineArtifact@2
            displayName: 'Download Pipeline Artifact'
            inputs:
              path: $(Build.SourcesDirectory)
As you can see, I use two stages, Build and Dev, to publish the projects LibmanTest and TestSample as artifacts, then use the DownloadPipelineArtifact task to download those two artifacts.
The test result:
Update 2:
Your example still does not show resources.pipelines
So, since you now want to download the artifact from other pipelines, you should not use the default configuration for the DownloadPipelineArtifact task:
resources:
  pipelines:
  - pipeline: xxx
    project: MyTestProject
    source: xxx
    trigger:
      branches:
        include:
        - master

jobs:
- deployment: Deployment
  displayName: DeployA
  environment: 7-1-0
  strategy:
    runOnce:
      deploy:
        steps:
        - task: DownloadPipelineArtifact@2
          displayName: 'Download Pipeline Artifact For Test'
          inputs:
            buildType: specific
            project: MyTestProject
            definition: 13
The test result:
Besides, you could check the configuration from the classic editor, then get the YAML file:
Hope this helps.
I have one solution with multiple projects.
The DSS.DMN.Client project depends on other projects (it has project references).
This is how my YAML file looks:
- task: MSBuild@1
  displayName: 'Build solution Client'
  inputs:
    platform: anyCPU
    maximumCpuCount: true
    configuration: 'Integration'
    solution: DSS.DMN.Client

- task: CopyFiles@2
  inputs:
    SourceFolder: '$(Build.SourcesDirectory)'
    Contents: '**\bin\**'
    TargetFolder: '$(Build.ArtifactStagingDirectory)'

- task: PublishBuildArtifacts@1
  displayName: 'Publish Artifact: drop'
  inputs:
    PathtoPublish: '$(build.artifactstagingdirectory)'
The thing is that the Client project has an INT configuration, but the other two dependent projects only have a Debug configuration.
Now, when I run the build, I get this error:
##[error]C:\Program Files (x86)\Microsoft Visual Studio\2019\Enterprise\MSBuild\Current\Bin\Microsoft.Common.CurrentVersion.targets(777,5): Error : The OutputPath property is not set for project 'DSS.DMN.AVModule.csproj'. Please check to make sure that you have specified a valid combination of Configuration and Platform for this project. Configuration='INT' Platform='AnyCPU'. You may be seeing this message because you are trying to build a project without a solution file, and have specified a non-default Configuration or Platform that doesn't exist for this project.
The problem is that the build is trying to find an INT configuration for the project DSS.DMN.AVModule.csproj, although it only has a Debug configuration.
Question: how do I provide a different configuration for different projects in a single build pipeline?
Try this; note the src/ folder.
My project is under the src root folder.
name: functions-name

trigger:
  branches:
    include:
    - main
  paths:
    include:
    - src/*
    exclude:
    - README.md

variables:
- name: functionAppName
  value: 'func-name-env'
- name: azureSubscriptionENV
  value: 'Azure ENV'
- name: vmImageName
  value: 'windows-2019'
- name: workingDirectory
  value: 'src/xxx.Functions'
- name: systemName
  value: BatchAllocation

pool:
  vmImage: $(vmImageName)

stages:
- stage: Build
  displayName: Build stage
  jobs:
  - job: Build
    displayName: Build
    workspace:
      clean: all
    pool:
      vmImage: $(vmImageName)
    steps:
    - task: NuGetCommand@2
      inputs:
        command: 'restore'
        restoreSolution: '**\*.sln'
        feedsToUse: 'config'
        noCache: false
    - task: VSBuild@1
      inputs:
        solution: '**\*.sln'
        msbuildArgs: '/p:DeployOnBuild=true /p:SkipInvalidConfigurations=false /p:OutDir="$(System.DefaultWorkingDirectory)\publish_output"'
        platform: 'Any CPU'
        configuration: 'Release'
    - task: ArchiveFiles@2
      displayName: 'Archive files'
      inputs:
        rootFolderOrFile: '$(System.DefaultWorkingDirectory)/publish_output'
        includeRootFolder: false
        archiveType: zip
        archiveFile: $(Build.ArtifactStagingDirectory)/$(systemName)-v$(Build.BuildId).zip
        replaceExistingArchive: true
    - publish: $(Build.ArtifactStagingDirectory)/$(systemName)-v$(Build.BuildId).zip
I created a YAML-based, multi-stage pipeline in Azure DevOps.
variables:
  versionPrefix: '7.1.0.'
  versionRevision: $[counter(variables['versionPrefix'], 100)]
  version: $[format('{0}{1}',variables['versionPrefix'],variables['versionRevision'])]
  solution: '**/product.sln'
  buildPlatform: 'Any CPU'
  buildConfiguration: 'Debug'

name: $(version)_$(Date:yyyyMMdd)$(Rev:.r)

stages:
- stage: Build
  pool: Default
  jobs:
  - job: Build
    displayName: Build
    steps:
    - task: NuGetToolInstaller@1
    - task: NuGetCommand@2
      inputs:
        restoreSolution: '$(solution)'
    - task: VersionAssemblies@2
      displayName: Version Assemblies
      inputs:
        Path: '$(Build.SourcesDirectory)'
        VersionNumber: '$(version)'
        InjectVersion: true
        FilenamePattern: 'AssemblyInfo.*'
        OutputVersion: 'OutputedVersion'
    - task: VSBuild@1
      displayName: Build product
      inputs:
        solution: '$(solution)'
        platform: '$(buildPlatform)'
        configuration: '$(buildConfiguration)'
        maximumCpuCount: true

- stage: Deploy
  dependsOn: Build
  pool: Default
  jobs:
  - deployment: Deployment
    displayName: DeployA
    environment: 7-1-0
    strategy:
      runOnce:
        deploy:
          steps:
          - task: PowerShell@2
            inputs:
              targetType: 'inline'
              script: |
                Write-Host "Deployed"
As shown above, the pipeline includes a deployment stage that references an environment named '7-1-0'. After the pipeline runs, a deployment is displayed in the UI for that environment. However, under that environment both the Changes and Workitems pages are empty. I confirmed there are new changes that have not previously been deployed to this environment. Why?
Note the deployment stage doesn't actually do anything. We're doing the actual deployment manually, but we were hoping to track the changes to the environment via DevOps. Also, we haven't defined any resources for the environment; I couldn't find anything stating that a resource must be defined for traceability of commits and work items.
UPDATE 1
Per @Leo-Liu-MSFT's suggestion below, I updated the pipeline to publish an artifact. Note that the build runs on a self-hosted agent. However, I'm still not getting any results under the environment's Changes and Workitems.
variables:
  versionPrefix: '7.1.0.'
  versionRevision: $[counter(variables['versionPrefix'], 100)]
  version: $[format('{0}{1}',variables['versionPrefix'],variables['versionRevision'])]
  solution: '**/product.sln'
  buildPlatform: 'Any CPU'
  buildConfiguration: 'Debug'

name: $(version)_$(Date:yyyyMMdd)$(Rev:.r)

stages:
- stage: Build
  pool: Default
  jobs:
  - job: Build
    displayName: Build
    steps:
    - task: NuGetToolInstaller@1
    - task: NuGetCommand@2
      inputs:
        restoreSolution: '$(solution)'
    - task: VersionAssemblies@2
      displayName: Version Assemblies
      inputs:
        Path: '$(Build.SourcesDirectory)'
        VersionNumber: '$(version)'
        InjectVersion: true
        FilenamePattern: 'AssemblyInfo.*'
        OutputVersion: 'OutputedVersion'
    - task: VSBuild@1
      displayName: Build product
      inputs:
        solution: '$(solution)'
        platform: '$(buildPlatform)'
        configuration: '$(buildConfiguration)'
        maximumCpuCount: true
    - task: PowerShell@2
      inputs:
        targetType: 'inline'
        script: |
          New-Item -Path '$(build.artifactstagingdirectory)' -Name "testfile1.txt" -ItemType "file" -Value "Hello, DevOps!" -force
    - task: PublishBuildArtifacts@1
      inputs:
        PathtoPublish: '$(Build.ArtifactStagingDirectory)'
        ArtifactName: 'drop'
        publishLocation: 'FilePath'
        TargetPath: 'C:\a\p\\$(Build.DefinitionName)\\$(Build.BuildNumber)'

- stage: Deploy
  dependsOn: Build
  pool: Default
  jobs:
  - deployment: Deployment
    displayName: DeployA
    environment: 7-1-0
    strategy:
      runOnce:
        deploy:
          steps:
          - task: PowerShell@2
            inputs:
              targetType: 'inline'
              script: |
                Write-Host "Deployed"
UPDATE 2
Per a follow-up suggestion from @Leo-Liu-MSFT, I created the following attempt, publishing the artifact to Azure Pipelines. I also simplified the YAML to use a Microsoft-hosted agent. Note that I did have the issue described here, which is why I configured the deployment job the way I did, with download: none. I'm still not getting any Changes or Workitems in the environment.
variables:
  ArtifactName: drop

stages:
- stage: Build
  jobs:
  - job: Build
    displayName: Build
    pool:
      vmImage: ubuntu-latest
    steps:
    - task: CopyFiles@2
      displayName: 'Copy Files to: $(build.artifactstagingdirectory)'
      inputs:
        SourceFolder: '$(System.DefaultWorkingDirectory)/Build'
        targetFolder: '$(build.artifactstagingdirectory)'
    - task: PublishBuildArtifacts@1
      displayName: 'Publish Artifact: drop'
      inputs:
        ArtifactName: $(ArtifactName)

- stage: Deploy
  dependsOn: Build
  pool:
    vmImage: ubuntu-latest
  jobs:
  - deployment: Deployment
    displayName: DeployA
    environment: 7-1-0
    strategy:
      runOnce:
        deploy:
          steps:
          - download: none
          - task: DownloadBuildArtifacts@0
            inputs:
              artifactName: $(ArtifactName)
              buildType: 'current'
              downloadType: 'single'
              downloadPath: '$(System.ArtifactsDirectory)'
FINAL UPDATE
Here's the working YAML. The final trick was to set download to current and specify the artifact name.
variables:
  ArtifactName: drop

stages:
- stage: Build
  jobs:
  - job: Build
    displayName: Build
    pool:
      vmImage: ubuntu-latest
    steps:
    - task: CopyFiles@2
      displayName: 'Copy Files to: $(build.artifactstagingdirectory)'
      inputs:
        SourceFolder: '$(System.DefaultWorkingDirectory)/Build'
        targetFolder: '$(build.artifactstagingdirectory)'
    - task: PublishBuildArtifacts@1
      displayName: 'Publish Artifact: drop'
      inputs:
        ArtifactName: $(ArtifactName)

- stage: Deploy
  dependsOn: Build
  pool:
    vmImage: ubuntu-latest
  jobs:
  - deployment: Deployment
    displayName: DeployA
    environment: 7-1-0
    strategy:
      runOnce:
        deploy:
          steps:
          - download: current
            artifact: $(ArtifactName)
Why are the Changes and Workitems pages empty under an Environment in a multi-stage Azure DevOps pipeline?
You need to add the Publish Build Artifacts task to publish the build artifacts to Azure Pipelines.
Azure DevOps tracks the changes and work items via the REST API, and then passes this information to other environments along with the transferred files.
So we need to publish the artifact to Azure Pipelines so that the deploy stage can pick up that information when it downloads the artifact.
As a test, I just added the copy task and the publish build artifacts task to the build stage, like:
stages:
- stage: Build
  jobs:
  - job: Build
    displayName: Build
    pool:
      vmImage: ubuntu-latest
    steps:
    - task: CopyFiles@2
      displayName: 'Copy Files to: $(build.artifactstagingdirectory)'
      inputs:
        SourceFolder: '$(System.DefaultWorkingDirectory)'
        targetFolder: '$(build.artifactstagingdirectory)'
    - task: PublishBuildArtifacts@1
      displayName: 'Publish Artifact: drop'

- stage: Deploy
  dependsOn: Build
  pool:
    vmImage: ubuntu-latest
  jobs:
  - deployment: Deployment
    displayName: DeployA
    environment: 7-1-0
    strategy:
      runOnce:
        deploy:
          steps:
          - task: PowerShell@2
            inputs:
              targetType: 'inline'
              script: |
                Write-Host "Deployed"
The result:
Update:
I had to make a few updates to deal with an issue downloading the artifact. Still not getting any Changes or Workitems on the environment. I do appreciate your help!
That's because you disabled the built-in download step and used the DownloadBuildArtifacts task instead, and that task does not have the feature to fetch the commits and work items:
- download: none
You need to delete that line from your YAML. When I tested your updated YAML without - download: none, it worked fine.
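As a rough sketch of what that change looks like, assuming the deploy stage from UPDATE 2 above (only the - download: none line is removed; the DownloadBuildArtifacts task stays):

strategy:
  runOnce:
    deploy:
      steps:
      # the implicit download step stays enabled, so the deployment records
      # the associated commits and work items against the environment
      - task: DownloadBuildArtifacts@0
        inputs:
          artifactName: $(ArtifactName)
          buildType: 'current'
          downloadType: 'single'
          downloadPath: '$(System.ArtifactsDirectory)'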
Hope this helps.
I am using Azure DevOps multi-stage pipelines and have the following YAML file. I can create a build and then publish the build artifact to drop. When I try to deploy, I get the error shown below.
I have tried many things, but I want my deployment to be in the same pipeline, rather than in a classic release pipeline, which I know is an option. Am I missing something?
stages:
- stage: Build
  jobs:
  - job: Build
    pool:
      name: Hosted Windows 2019 with VS2019
      demands: azureps
    steps:
    # Restore
    - task: DotNetCoreCLI@2
      displayName: Restore
      inputs:
        command: restore
        projects: '**/*.csproj'
        feedsToUse: select
        vstsFeed: myfeed
        includeNuGetOrg: true
    # Build
    - task: DotNetCoreCLI@2
      displayName: Build
      inputs:
        command: build
        projects: '**/*.csproj'
        arguments: '--configuration Release'
    # Publish
    - task: DotNetCoreCLI@2
      displayName: Publish
      inputs:
        command: publish
        publishWebProjects: True
        arguments: '--configuration $(BuildConfiguration) --output $(Build.ArtifactStagingDirectory)'
        zipAfterPublish: True
    # Publish Artifact
    - task: PublishBuildArtifacts@1

- stage: Dev
  jobs:
  # track deployments on the environment
  - deployment: DeployWeb
    pool:
      vmImage: 'ubuntu-latest'
    # creates an environment if it doesn't exist
    environment: 'my-dev'
    strategy:
      # default deployment strategy
      runOnce:
        deploy:
          steps:
          - task: DownloadBuildArtifacts@0
            inputs:
              buildType: 'current'
              downloadType: 'single'
              artifactName: 'drop'
              downloadPath: '$(build.ArtifactStagingDirectory)'
          - task: AzureWebApp@1
            displayName: Azure Web App Deploy
            inputs:
              appType: 'webapp'
              azureSubscription: '213456123'
              appName: mytestapp
              package: $(System.DefaultWorkingDirectory)/**/*.zip
Error I am getting
The artifact has been downloaded to the artifact folder $(build.ArtifactStagingDirectory), so the package path could be: package: $(build.ArtifactStagingDirectory)/**/*.zip
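For illustration, a minimal sketch of the corrected AzureWebApp step from the question, keeping every other input as posted and only changing the package path:

- task: AzureWebApp@1
  displayName: Azure Web App Deploy
  inputs:
    appType: 'webapp'
    azureSubscription: '213456123'
    appName: mytestapp
    # glob for the zip inside the folder the artifact was downloaded to
    package: '$(build.ArtifactStagingDirectory)/**/*.zip'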
In my case, I simply copied the artifact download path from the DownloadBuildArtifacts@0 task after it ran and pasted it into the AzureWebApp@1 task's package property.
Yes, it required me to let a single run fail so that I could find the exact path where the artifacts are downloaded. You can simply run the DownloadBuildArtifacts@0 task on its own to find the exact download path of the artifacts.