I'm writing my first Terraform YAML pipeline in Azure DevOps. I define four repositories as resources, check them out, and run Terraform Plan. The pipeline succeeds, but only the Terraform code from the first repository runs, judging by what that repository's output.tf prints to the screen. The others don't generate any output, even when I define an output variable with a string value, something like this:

output "rg_module_debug" { value = "rg module ran" }

Here is the pipeline code; any feedback on why the other code isn't running would be appreciated.
name: 'Naming Test'

trigger: none

pool:
  vmImage: ubuntu-latest

resources:
  repositories:
    - repository: VariablesRepo # identifier (A-Z, a-z, 0-9, and underscore)
      type: git # git refers to Azure Repos Git repos
      name: AzureTutorial/terraform-azurerm-variables-environment
      ref: main
    - repository: NameRepo
      type: git
      name: AzureTutorial/terraform-azurerm-module-name
      ref: main
    - repository: NamingRepo
      type: git
      name: AzureTutorial/terraform-azurerm-module-naming
      ref: main
    - repository: ResourceGrpRepo
      type: git
      name: AzureTutorial/terraform-azurerm-module-resource_group
      ref: main

stages:
  - stage: Install
    jobs:
      - job:
        timeoutInMinutes: 60 # how long to run the job before automatically cancelling
        cancelTimeoutInMinutes: 2 # how much time to give 'run always even if cancelled' tasks before stopping them
        steps:
          - checkout: self
          - checkout: VariablesRepo
          - checkout: NameRepo
          - checkout: NamingRepo
          - checkout: ResourceGrpRepo
          - task: TerraformInstaller@0
            displayName: 'install'
            inputs:
              terraformVersion: 'latest'
          - task: TerraformCLI@0
            displayName: 'terraform init'
            inputs:
              provider: 'azurerm'
              command: 'init'
              workingDirectory: '$(System.DefaultWorkingDirectory)/terraform-azurerm-module-name'
              #environmentServiceName: TutorialSvcCon
          - task: TerraformCLI@0
            displayName: 'terraform plan'
            inputs:
              provider: 'azurerm'
              command: 'plan'
              workingDirectory: '$(System.DefaultWorkingDirectory)/terraform-azurerm-module-name'
              #environmentServiceName: TutorialSvcCon
You are checking out multiple repositories, so according to the documentation:

Multiple repositories:

If you have multiple checkout steps in your job, your source code is checked out into directories named after the repositories, as a subfolder of s in $(Agent.BuildDirectory). If $(Agent.BuildDirectory) is C:\agent_work\1 and your repositories are named tools and code, your code is checked out to C:\agent_work\1\s\tools and C:\agent_work\1\s\code.
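If you want to confirm exactly which folder names your agent ends up with, a throwaway debug step helps; a minimal sketch, assuming the ubuntu-latest agent from your pool:

steps:
  - script: ls $(Agent.BuildDirectory)/s
    displayName: 'Debug: list checked-out repository folders'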
Thus, to achieve what you want, you have to create one TerraformCLI@0 init/plan task pair for each checked-out repository. Your stages code would then be:
...
stages:
  - stage: Install
    jobs:
      - job:
        timeoutInMinutes: 60 # how long to run the job before automatically cancelling
        cancelTimeoutInMinutes: 2 # how much time to give 'run always even if cancelled' tasks before stopping them
        steps:
          - checkout: self
          - checkout: VariablesRepo
          - checkout: NameRepo
          - checkout: NamingRepo
          - checkout: ResourceGrpRepo
          - task: TerraformInstaller@0
            displayName: 'install'
            inputs:
              terraformVersion: 'latest'
          - task: TerraformCLI@0
            displayName: 'terraform init'
            inputs:
              provider: 'azurerm'
              command: 'init'
              workingDirectory: '$(System.DefaultWorkingDirectory)/terraform-azurerm-module-name'
              #environmentServiceName: TutorialSvcCon
          - task: TerraformCLI@0
            displayName: 'terraform plan'
            inputs:
              provider: 'azurerm'
              command: 'plan'
              workingDirectory: '$(System.DefaultWorkingDirectory)/terraform-azurerm-module-name'
              #environmentServiceName: TutorialSvcCon
If, for instance, you want to do the same operations for each of the other repositories, you add more task pairs pointing at the corresponding repository folder, matching the repository names you set up at the beginning:
- task: TerraformCLI@0
  displayName: 'terraform init'
  inputs:
    provider: 'azurerm'
    command: 'init'
    workingDirectory: '$(System.DefaultWorkingDirectory)/terraform-azurerm-variables-environment'
    #environmentServiceName: TutorialSvcCon
- task: TerraformCLI@0
  displayName: 'terraform plan'
  inputs:
    provider: 'azurerm'
    command: 'plan'
    workingDirectory: '$(System.DefaultWorkingDirectory)/terraform-azurerm-variables-environment'
    #environmentServiceName: TutorialSvcCon
And so forth.
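If repeating the init/plan pair four times feels noisy, a template-expression loop can generate the tasks at compile time. This is only a sketch, assuming the checkout folders match the repository names; the repoDirectories parameter name is mine:

parameters:
  - name: repoDirectories
    type: object
    default:
      - terraform-azurerm-variables-environment
      - terraform-azurerm-module-name
      - terraform-azurerm-module-naming
      - terraform-azurerm-module-resource_group

steps:
  # Expands to one init/plan pair per directory listed above
  - ${{ each dir in parameters.repoDirectories }}:
      - task: TerraformCLI@0
        displayName: 'terraform init (${{ dir }})'
        inputs:
          provider: 'azurerm'
          command: 'init'
          workingDirectory: '$(System.DefaultWorkingDirectory)/${{ dir }}'
      - task: TerraformCLI@0
        displayName: 'terraform plan (${{ dir }})'
        inputs:
          provider: 'azurerm'
          command: 'plan'
          workingDirectory: '$(System.DefaultWorkingDirectory)/${{ dir }}'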
Related
I have a multi-stage Azure DevOps YAML pipeline that had been running fine until I included a block of parameters (in the part of the YAML before the first stage). Now, with these parameters, the Build stage still runs, but the Release stage does nothing. It doesn't produce an error; it just doesn't run. I'm left with the message:

"Parent pipeline used these runtime parameters"

followed by a collapsible UI element you can expand to see the parameters, which isn't super helpful, given that I already know what the parameters were.
The parameter block that is now making my multi-stage YAML pipeline fail without any helpful errors is this:
parameters:
  - name: configuration
    type: object
    default:
      - Texas
      - Arizona

Is it even possible to make a multi-stage Azure DevOps YAML pipeline that accepts parameters? Or is that the (seemingly undocumented) limitation I am running up against?
EDIT: Here's the whole YAML:
# ASP.NET
# Build and test ASP.NET projects.
# Add steps that publish symbols, save build artifacts, deploy, and more:
# https://learn.microsoft.com/azure/devops/pipelines/apps/aspnet/build-aspnet-4
# Gus' first working multi-stage pipeline

trigger: none

pool:
  # windows-latest worked until Microsoft changed it to windows-2022 and I got
  # "The reference assemblies for .NETFramework,Version=v4.6.1 were not found." error
  vmImage: 'windows-2019'

parameters:
  - name: configuration
    type: object
    default:
      - Texas
      - Arizona

variables:
  - name: solution
    value: '**/*.sln'
  - name: buildPlatform
    value: 'Any CPU'
  - name: buildConfiguration
    value: "Texas"

stages:
  - stage: build
    displayName: Build
    jobs:
      - job: Build
        steps:
          - ${{ each configuration in parameters.configuration }}:
              - script: 'echo ${{ configuration }}'
          - checkout: self
          - task: CmdLine@2
            inputs:
              script: |
                cd $(build.sourcesdirectory)
          # without this first one, bad things happen!!
          - task: NuGetCommand@2
            inputs:
              command: 'restore'
              restoreSolution: '**/*.sln'
              feedsToUse: 'config'
              nugetConfigPath: 'NuGet.config'
          - task: NuGetToolInstaller@1
          - task: VSBuild@1
            inputs:
              solution: '$(solution)'
              msbuildArgs: '/p:DeployOnBuild=true /p:WebPublishMethod=Package /p:PackageAsSingleFile=true /p:SkipInvalidConfigurations=true /p:PackageLocation="$(build.artifactStagingDirectory)"'
              platform: '$(buildPlatform)'
              configuration: '$(buildConfiguration)'
          - task: VSTest@2
            inputs:
              platform: '$(buildPlatform)'
              configuration: '$(buildConfiguration)'
          - task: PublishBuildArtifacts@1
            inputs:
              PathtoPublish: '$(Build.ArtifactStagingDirectory)'
              ArtifactName: 'drop'
              publishLocation: 'Container'

  - stage: Release
    displayName: Release
    condition: and(succeeded(), eq(variables['build.sourceBranch'], 'refs/heads/master'))
    jobs:
      - deployment:
        displayName: Release
        environment:
          name: QA
          resourceType: VirtualMachine
        strategy:
          runOnce:
            deploy:
              steps:
                #- script: echo building $(Build.BuildNumber) with ${{ parameters.configuration }}
                - task: CopyFiles@2
                  inputs:
                    SourceFolder: '$(Agent.WorkFolder)\1\drop'
                    Contents: '**\*.zip'
                    OverWrite: true
                    TargetFolder: 'C:\QA\Web Sites\$(buildConfiguration)'
I need to move files from ADO to a VM. This VM is set up using "Environments" and tagged appropriately. I would like to copy those files to that VM using its environment name and tag. So far I've only found "Windows machine file copy", which requires storing an admin login. Is there a way to use the Environments instead of hardcoding a login?
You can set up the YAML pipeline as below.

If you want to copy the build artifact files to the VM, reference the sample below.

In the Build stage, set up the jobs to build the source code and publish the artifact files for later use.

In the Deploy stage, when you set up the deployment job with an environment resource that is a VM, all steps in this job run on that VM. At the very first step, the deployment job automatically downloads the artifact files to the working directory on the VM.

Then you can use the CopyFiles task to copy the artifact files to any accessible directory on the VM.
stages:
  - stage: Build
    displayName: 'Build'
    . . .

  - stage: Deploy
    displayName: 'Deploy'
    dependsOn: Build
    condition: succeeded()
    jobs:
      - deployment: Deployment
        displayName: 'Deployment'
        environment: '{EnvironmentName.ResourceName}'
        strategy:
          runOnce:
            deploy:
              steps:
                - task: CopyFiles@2
                  displayName: 'Copy Files to: {Destination Directory}'
                  inputs:
                    SourceFolder: '$(Pipeline.Workspace)/drop'
                    Contents: '**'
                    TargetFolder: '{Destination Directory}'
                    CleanTargetFolder: true
If the files you want to copy to the VM are the source files in the repository, reference the sample below.
stages:
  - stage: Deploy
    displayName: 'Deploy'
    dependsOn: Build
    condition: succeeded()
    jobs:
      - deployment: Deployment
        displayName: 'Deployment'
        environment: '{EnvironmentName.ResourceName}'
        strategy:
          runOnce:
            deploy:
              steps:
                - checkout: self
                - task: CopyFiles@2
                  displayName: 'Copy Files to: {Destination Directory}'
                  inputs:
                    SourceFolder: '$(Pipeline.Workspace)'
                    Contents: '**'
                    TargetFolder: '{Destination Directory}'
                    CleanTargetFolder: true
For more details, see: https://learn.microsoft.com/en-us/azure/devops/pipelines/process/environments-kubernetes?view=azure-devops
I had been struggling for a while with setting up an Azure Pipeline to deploy a .NET Core service to a VM. I had the following requirements:

- use YAML files instead of the classic UI
- deploy as a Windows service, not to IIS
- use stages in the pipeline

I was using a monorepo with the service residing in the MyService folder, and in addition I had an external NuGet feed. My solution consisted of several projects and I was building only one of them. appsettings.release.json was replaced with the one persisted on the server to preserve settings.

I was inspired by Bright Ran-MSFT's answer and suggest my complete azure-pipelines.yml file:
trigger:
  branches:
    include:
      - staging
  paths:
    include:
      - MyService

pool:
  vmImage: "windows-latest"

variables:
  solution: "MyService/MyService.sln"
  buildPlatform: "Any CPU"
  buildConfiguration: "Release"

stages:
  - stage: Build
    jobs:
      - job: BuildJob
        steps:
          - task: NuGetCommand@2
            inputs:
              restoreSolution: "$(solution)"
              feedsToUse: "config"
              nugetConfigPath: "MyService/NuGet.Config"
          - task: VSBuild@1
            inputs:
              solution: "$(solution)"
              msbuildArgs: '/t:MyService:Rebuild /p:DeployOnBuild=true /p:WebPublishMethod=Package /p:SkipInvalidConfigurations=true /p:OutDir="$(build.artifactStagingDirectory)\service_package"'
              platform: "$(buildPlatform)"
              configuration: "$(buildConfiguration)"
          - task: VSTest@2
            inputs:
              platform: "$(buildPlatform)"
              configuration: "$(buildConfiguration)"
          - task: PublishPipelineArtifact@1
            inputs:
              targetPath: '$(build.artifactStagingDirectory)\service_package'
              artifactName: "service_package"

  - stage: Deploy
    displayName: 'Deploy'
    dependsOn: Build
    condition: succeeded()
    jobs:
      - deployment: Deployment
        displayName: 'Deployment'
        environment: 'MainDeployEnv.MY_VM_SERVER'
        strategy:
          runOnce:
            deploy:
              steps:
                - task: CopyFiles@2
                  displayName: 'Copy Package to: C:\azservices\MyService\service'
                  inputs:
                    SourceFolder: '$(Pipeline.Workspace)/service_package'
                    Contents: '**'
                    TargetFolder: 'C:\azservices\MyService\service\'
                    CleanTargetFolder: true
                - task: CopyFiles@2
                  displayName: 'Replace appsettings.release.json'
                  inputs:
                    SourceFolder: 'C:\azservices\MyService\settings'
                    Contents: 'appsettings.release.json'
                    TargetFolder: 'C:\azservices\MyService\service\'
                    OverWrite: true
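Note that the copy steps above replace the binaries but never touch the Windows service itself; in practice you also need to stop the service before overwriting its (possibly locked) files and start it again afterwards. A minimal sketch, assuming the service is already registered under the hypothetical name MyService; the stop step goes before the CopyFiles tasks and the start step after them:

- task: PowerShell@2
  displayName: 'Stop MyService before copying files'
  inputs:
    targetType: inline
    script: |
      # 'MyService' is an assumed service name; adjust to your installation
      Stop-Service -Name 'MyService' -ErrorAction SilentlyContinue
# ... the CopyFiles tasks from above go here ...
- task: PowerShell@2
  displayName: 'Start MyService after copying files'
  inputs:
    targetType: inline
    script: |
      Start-Service -Name 'MyService'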
I'm deploying my application to Azure Functions using the pipeline below.

Deploying to Functions on the Consumption plan completes the deployment stage in less than a minute, but deploying to Functions on an App Service plan takes more than 10 minutes.

Why is it taking so long to deploy to Functions on an App Service plan from Azure Pipelines?
trigger:
  branches:
    include:
      - develop

resources:
  - repo: self

variables:
  pythonVersion: 3.6
  serviceConnection: 'subscription'
  functionAppName: 'functionAppName'
  vmImageName: 'ubuntu-latest'
  workingDirectory: '$(System.DefaultWorkingDirectory)/'

pool:
  vmImage: $(vmImageName)

stages:
  - stage: Build
    displayName: 'Build stage'
    jobs:
      - job: Build
        displayName: Build
        steps:
          - bash: |
              if [ -f extensions.csproj ]
              then
                dotnet build extensions.csproj --runtime ubuntu.16.04-x64 --output ./bin
              fi
            workingDirectory: $(workingDirectory)
            displayName: 'Build extensions'
          - task: UsePythonVersion@0
            displayName: 'Use Python $(pythonVersion)'
            inputs:
              versionSpec: $(pythonVersion)
          - bash: |
              python -m venv worker_venv
              source worker_venv/bin/activate
              pip install -r requirements.txt
            workingDirectory: $(workingDirectory)
            displayName: 'Install application dependencies'
          - task: ArchiveFiles@2
            displayName: 'Archive files'
            inputs:
              rootFolderOrFile: '$(workingDirectory)'
              includeRootFolder: false
              archiveType: zip
              archiveFile: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
              replaceExistingArchive: true
          - publish: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
            artifact: drop

  - stage: Deploy
    displayName: 'Develop stage'
    dependsOn: Build
    condition: succeeded()
    jobs:
      - deployment: Deploy
        displayName: Deploy
        environment: 'development'
        strategy:
          runOnce:
            deploy:
              steps:
                - task: AzureFunctionApp@1
                  displayName: 'Azure functions app deploy'
                  inputs:
                    azureSubscription: '$(serviceConnection)'
                    appType:
                    appName: $(functionAppName)
                    package: '$(Pipeline.Workspace)/drop/$(Build.BuildId).zip'
                    deploymentMethod: 'runFromPackage'
Remote build may be enabled; please disable it. See: https://learn.microsoft.com/en-us/azure/azure-functions/functions-deployment-technologies#remote-build
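One way to turn remote build off from the pipeline itself is to pass the relevant app settings through the deploy task. A sketch based on the deploy step above; the SCM_DO_BUILD_DURING_DEPLOYMENT and ENABLE_ORYX_BUILD values are the usual switches, but verify they match your plan and stack:

- task: AzureFunctionApp@1
  displayName: 'Azure functions app deploy (remote build disabled)'
  inputs:
    azureSubscription: '$(serviceConnection)'
    appName: $(functionAppName)
    package: '$(Pipeline.Workspace)/drop/$(Build.BuildId).zip'
    deploymentMethod: 'runFromPackage'
    # These app settings switch off the remote (Oryx/Kudu) build during deployment
    appSettings: '-SCM_DO_BUILD_DURING_DEPLOYMENT false -ENABLE_ORYX_BUILD false'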
If you see my YAML below, I have $(Build.SourcesDirectory), which holds artifacts that I download and copy to Azure Blob storage. In the next stage I would like to make use of the $(Build.SourcesDirectory) contents, but I get nothing if I reference $(Build.SourcesDirectory) in that stage. Why is that happening, and how can I fix it?
- task: PublishPipelineArtifact@1
  displayName: 'Publish PS'
  inputs:
    targetPath: $(Build.ArtifactStagingDirectory)\flow_ps
    artifactName: 'Flow_Tools_PS'
- task: DownloadPipelineArtifact@2
  displayName: Download Flow CodeGen
  inputs:
    artifact: 'Flow_Tools_PS'
    path: '$(Build.SourcesDirectory)/Flow_Tools_PS'
- task: AzureFileCopy@2
  displayName: 'Publish Flow_Tools_PS to Blob'
  inputs:
    SourcePath: '$(Build.SourcesDirectory)/Flow_Tools_PS'
    azureSubscription: 'Azure CICD'
    Destination: AzureBlob
    storage: '$(BlobStorageAccount)'
    ContainerName: '$(BlobContainer)'
    BlobPrefix: '$(BlobPrefix)/$(DeploymentVersion)/Flow_Tools_PS'
    AdditionalArgumentsForBlobCopy: '/V /S'
    outputStorageUri: BlobUri
    outputStorageContainerSasToken: BlobSASToken

- stage: PublishFlowNotebooks
  dependsOn: Flow
  jobs:
    - job: DevOpsScripts
      pool:
        vmImage: 'windows-latest'
      environment: 'Flow'
      steps:
        - checkout: DevOpsScripts
        - powershell: |
            Get-ChildItem $(Build.SourcesDirectory) -Recurse
          name: DebugCheckout
          displayName: Debug script checkout
        - task: DownloadPipelineArtifact@2
          inputs:
            buildVersionToDownload: 'latest'
            targetPath: '$(Pipeline.Workspace)'
        - task: PowerShell@2
          displayName: 'PowerShell Script'
          inputs:
            targetType: filePath
            filePath: '$(System.DefaultWorkingDirectory)/ReleaseNoteScripts/UploadWithReleaseNotes.ps1'
If you want to share a directory between jobs, you should publish the content of that directory:
steps:
  - publish: $(Build.SourcesDirectory)
    artifact: Flow
and then, in the next job, download this artifact:
steps:
  - download: current
    artifact: Flow
Be aware that
By default, files are downloaded to $(Pipeline.Workspace)/. If an artifact name was not specified, a sub-directory will be created for each downloaded artifact.
You can read more about this here.
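For example, after the download step above the files land under $(Pipeline.Workspace)/Flow and a later step can reference them directly; a quick sketch:

steps:
  - download: current
    artifact: Flow
  - powershell: |
      # List the downloaded files to confirm the layout described above
      Get-ChildItem "$(Pipeline.Workspace)/Flow" -Recurse
    displayName: Inspect downloaded artifact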
And if you want to check out multiple repositories, this is possible as well:
resources:
  repositories:
    - repository: MyGitHubRepo # The name used to reference this repository in the checkout step
      type: github
      endpoint: MyGitHubServiceConnection
      name: MyGitHubOrgOrUser/MyGitHubRepo
    - repository: MyBitbucketRepo
      type: bitbucket
      endpoint: MyBitbucketServiceConnection
      name: MyBitbucketOrgOrUser/MyBitbucketRepo
    - repository: MyAzureReposGitRepository # In a different organization
      endpoint: MyAzureReposGitServiceConnection
      type: git
      name: OtherProject/MyAzureReposGitRepo

trigger:
  - main

pool:
  vmImage: 'ubuntu-latest'

steps:
  - checkout: self
  - checkout: MyGitHubRepo
  - checkout: MyBitbucketRepo
  - checkout: MyAzureReposGitRepository
  - script: dir $(Build.SourcesDirectory)
The deployment job automatically downloads all the pipeline resources. However, a standard job does not. I tried using - download: current, but that does not download the pipeline resources.

The reason I want to do this is to simulate a deployment for GitOps. The simulation will include a step that does a git diff showing the differences for review.

However, I don't see an all or * option in https://learn.microsoft.com/en-us/azure/devops/pipelines/yaml-schema?view=azure-devops&tabs=schema%2Cparameter-schema#download

My current workaround is to do a deployment to a "Temp" environment prior to the real deployment.
UPDATE:
Here's an example of what I have tried
resources:
  pipelines:
    - pipeline: auth
      project: myproj
      source: auth CI
      trigger:
        branches:
          include:
            - master
...
jobs:
  - job: diagnostics
    displayName: Job Diagnostics
    steps:
      - checkout: self
      # - download:
      - task: DownloadPipelineArtifact@2
        displayName: 'Download Pipeline Artifact'
        inputs:
          path: $(Build.SourcesDirectory)
      - bash: |
          env | sort
        displayName: Display environment variables
      - bash: |
          pwd
        displayName: Present working directory
      - bash: |
          find $(Pipeline.Workspace) -type f -print
        displayName: Display files
UPDATE:
Another approach I was mulling is to create a pipeline that creates another pipeline. That way the list of pipeline resources does not need to be Azure Pipelines YAML; it could be a CSV, or a simplified YAML that I transform, in which case I can generate:
resources:
  pipelines:
    - pipeline: pipelineAlias1
      project: myproj
      source: auth CI
      branch: master
      trigger:
        branches:
          include:
            - master
    - pipeline: pipelineAlias2
      project: myproj
      source: auth CI
      branch: master
      trigger:
        branches:
          include:
            - master
...
jobs:
  - job:
    steps:
      - download: pipelineAlias1
      - download: pipelineAlias2
and then set that up as another Azure pipeline to execute when updated.
How do you download all pipeline resources in a job?
Indeed, this is a known issue with using the download keyword to download pipeline artifacts.

You can track it in this ticket:

Artifacts do not download on multi stage yaml build using DownloadPipelineArtifactV2

To resolve this issue, please try the DownloadPipelineArtifact task instead of the download keyword:
- task: DownloadPipelineArtifact@2
  displayName: 'Download Pipeline Artifact'
  inputs:
    path: $(Build.SourcesDirectory)
Update:

I noticed that your YAML file does not include a build job. The DownloadPipelineArtifact task is used to:

download pipeline artifacts from earlier stages in this pipeline, or from another pipeline

So we need to add a stage that builds the project and generates the artifact; otherwise, there are no pipeline artifacts to download.

Check my test YAML file:
Check my test YAML file:
variables:
ArtifactName: drop
stages:
- stage: Build
jobs:
- job: Build
displayName: Build
pool:
name: MyPrivateAgent
steps:
- task: CopyFiles#2
displayName: 'Copy Files to: $(build.artifactstagingdirectory)'
inputs:
SourceFolder: '$(System.DefaultWorkingDirectory)\LibmanTest'
targetFolder: '$(build.artifactstagingdirectory)'
- task: PublishBuildArtifacts#1
displayName: 'Publish Artifact: LibmanTest'
inputs:
ArtifactName: $(ArtifactName)
- stage: Dev
dependsOn: Build
jobs:
- job: Dev
displayName: Dev
pool:
name: MyPrivateAgent
steps:
- task: CopyFiles#2
displayName: 'Copy Files to: $(build.artifactstagingdirectory)'
inputs:
SourceFolder: '$(System.DefaultWorkingDirectory)\TestSample'
targetFolder: '$(build.artifactstagingdirectory)'
- task: PublishBuildArtifacts#1
displayName: 'Publish Artifact: TestSample'
inputs:
ArtifactName: $(ArtifactName)
- stage: Deployment
dependsOn: Dev
pool:
name: MyPrivateAgent
jobs:
- deployment: Deployment
displayName: DeployA
environment: 7-1-0
strategy:
runOnce:
deploy:
steps:
- task: DownloadPipelineArtifact#2
displayName: 'Download Pipeline Artifact'
inputs:
path: $(Build.SourcesDirectory)
As you can see, I use two stages Build and Dev to copy the project LibmanTest and TestSample as artifact, then use the task DownloadPipelineArtifact to download those two artifacts.
Update 2:

"your example still does not show resources.pipelines"

So, if you want to download artifacts from other pipelines, you cannot use the default configuration for the DownloadPipelineArtifact task; you need to point it at the specific pipeline:
resources:
  pipelines:
    - pipeline: xxx
      project: MyTestProject
      source: xxx
      trigger:
        branches:
          include:
            - master

jobs:
  - deployment: Deployment
    displayName: DeployA
    environment: 7-1-0
    strategy:
      runOnce:
        deploy:
          steps:
            - task: DownloadPipelineArtifact@2
              displayName: 'Download Pipeline Artifact For Test'
              inputs:
                buildType: specific
                project: MyTestProject
                definition: 13
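If you only need one named artifact from that pipeline rather than everything it published, you can also pin the run and the artifact down; a sketch, where drop and latest are assumed values:

- task: DownloadPipelineArtifact@2
  displayName: 'Download a single named artifact'
  inputs:
    buildType: specific
    project: MyTestProject
    definition: 13
    # Take the newest completed run and only the one artifact we care about
    buildVersionToDownload: latest
    artifactName: drop
    targetPath: '$(Pipeline.Workspace)/drop'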
Besides, you could set up the configuration in the classic editor and then get the YAML file from it.

Hope this helps.