I have an Azure DevOps Build pipeline that publishes the entire repository as an artifact to be used with the Release pipeline.
# Publish artifacts to be used in release
- task: PublishBuildArtifacts@1
  displayName: 'publish artifacts'
  inputs:
    PathtoPublish: '$(System.DefaultWorkingDirectory)'
    ArtifactName: 'TerraformModule'
    publishLocation: 'Container'
The build pipeline triggers a release in the release pipeline, where I try to deploy the Terraform configuration.
I can successfully run terraform init in this pipeline but when I try to run plan or apply, I get the following error:
Looking at the screenshot, it looks like it tries to execute the command from /usr/local/bin instead of the working directory I specified in the step, which has me confused. Below is the YAML for my plan step:
steps:
- task: ms-devlabs.custom-terraform-tasks.custom-terraform-release-task.TerraformTaskV3@3
  displayName: 'terraform plan'
  inputs:
    provider: aws
    command: plan
    workingDirectory: '/home/vsts/work/r1/a/_terraform/TerraformModule/Projects/Potentium/Prod'
    environmentServiceNameAWS: 'AWS-Terraform-Build'
I manually set workingDirectory to the location the artifacts from the build pipeline were downloaded to. See the log below for an example:
2022-08-14T23:41:31.3359557Z Downloaded TerraformModule/Projects/Potentium/Prod/main.tf to /home/vsts/work/r1/a/_terraform/TerraformModule/Projects/Potentium/Prod/main.tf
The plan step in my build pipeline executes without any issues, so I have a feeling it is something to do with the artifact extraction that occurs in the download step. Looking for any advice.
I've had similar issues in the extraction phase when doing a similar thing with Terraform, using ExtractFiles@1 against archives produced by ArchiveFiles@2. I think there's a bug in it: I could not get it to extract files back to the root of System.DefaultWorkingDirectory unless the root folder was included in the archive, so I was ending up with paths like /opt/az_devops/_work/*/s/s.
My solution was to shell out a command to do the extraction; that way there were no problems extracting to the root of System.DefaultWorkingDirectory.
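As a rough illustration, the shell-out can be a plain script step along these lines (untested sketch; the artifact name build_artifact and archive name build.zip are placeholders, and it assumes unzip is available on a Linux agent):
- task: Bash@3
  displayName: 'Extract artifact archive'
  inputs:
    targetType: inline
    script: |
      # Placeholders: adjust the artifact and archive names to your pipeline
      unzip -o "$(System.ArtifactsDirectory)/build_artifact/build.zip" \
            -d "$(System.DefaultWorkingDirectory)"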
Just remember that if you're running a subsequent terraform plan, the working directory System.DefaultWorkingDirectory will by default change between runs, so make sure you use these variables rather than an explicit path.
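Applied to the plan step above, that would look roughly like this (a sketch only; it assumes _terraform is the artifact source alias and TerraformModule the artifact name, as in the download log):
- task: ms-devlabs.custom-terraform-tasks.custom-terraform-release-task.TerraformTaskV3@3
  displayName: 'terraform plan'
  inputs:
    provider: aws
    command: plan
    # In a release, System.DefaultWorkingDirectory resolves to the artifacts
    # folder (e.g. /home/vsts/work/r1/a), so no hard-coded agent path is needed.
    workingDirectory: '$(System.DefaultWorkingDirectory)/_terraform/TerraformModule/Projects/Potentium/Prod'
    environmentServiceNameAWS: 'AWS-Terraform-Build'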
I have an Azure DevOps pipeline step running the OWASP dependency check that is failing, and I want to find out which dependencies need to be updated.
The logs that are written during the dependency check pipeline step say:
[INFO] Writing report to: e:\vsts\a\7567\TestResults\dependency-check\dependency-check-report.html
I assume this dependency-check-report.html is where it will tell me what dependencies need to be updated. But I do not understand where this e:\vsts\a\7567\TestResults\ location is, as this step is being run in DevOps. Is this somewhere in DevOps? I cannot seem to find it anywhere. "Download logs" on the pipeline page doesn't seem to have it either.
where this e:\vsts\a\7567\TestResults\ location is
When you run the pipeline in Azure DevOps, this path is a local path on the machine where the agent is located.
In your case, the agent is a self-hosted agent. Go to the machine hosting the agent and you will find dependency-check-report.html under e:\vsts\a\7567\TestResults\dependency-check.
Alternatively, you can use the Publish Pipeline Artifacts task to upload the report to the pipeline artifacts.
For example:
steps:
- task: dependency-check-build-task@6
  displayName: 'Dependency Check'
  inputs:
    projectName: test
    scanPath: test
  continueOnError: true

- task: PublishPipelineArtifact@1
  displayName: 'Publish Pipeline Artifact'
  inputs:
    targetPath: '$(Common.TestResultsDirectory)'
    artifact: drop
Note: you need to set continueOnError: true on the OWASP dependency check task so that the publish task still runs if the check fails.
In this case, the dependency-check-report.html from the agent machine will be uploaded to the run's pipeline artifacts.
I'm building a pipeline in Azure DevOps that first builds a project using CMake and then, conditionally, creates an installer out of it. The creation of the installer is also done using CMake (and CPack). Since the installer is only supposed to be created and published under certain conditions (a tag is created), I'd like to share the build folder from the first job and reuse it in the second job. I do so by uploading the artifacts as the last step from the first job
- task: PublishBuildArtifacts@1
  inputs:
    pathToPublish: $(System.DefaultWorkingDirectory)/build
    artifactName: build_release
(pipeline artifacts are currently not supported by our environment)
and by downloading and moving it in the second job:
- task: DownloadBuildArtifacts@0
  inputs:
    downloadType: 'single'
    artifactName: build_release
    downloadPath: $(System.DefaultWorkingDirectory)/tmp

- task: CopyFiles@2
  displayName: Copy Artifacts into Build Directory
  inputs:
    SourceFolder: $(System.DefaultWorkingDirectory)/tmp/build_release
    TargetFolder: $(System.DefaultWorkingDirectory)/build
Unfortunately, I'm running into an issue with cached absolute paths now:
CMake is re-running because D:/.../_work/171/s/build/vs-project/CMakeFiles/generate.stamp dependency file is missing.
CMake Error: The current CMakeCache.txt directory D:/.../_work/212/s/build/vs-project/CMakeCache.txt is different than the directory d:/.../_work/171/s/build/vs-project where CMakeCache.txt was created. This may result in binaries being created in the wrong place. If you are not sure, reedit the CMakeCache.txt
CMake Error: The source directory "D:/.../_work/171/s/build" does not exist.
I understand that several paths are cached in CMakeCache.txt and do not match since the build is taking place in a separate folder. Is there an elegant way to have ADO rewrite the paths in CMakeCache.txt automatically? Or to make sure the folder names stay constant for the two jobs?
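The brute-force workaround I can imagine is a script step that rewrites the old workspace path inside the copied build folder before CMake runs again; a rough, untested sketch (the old path would have to be captured somehow in the first job, it is hard-coded here only for illustration):
# Untested sketch: rewrite the workspace path cached by the first job to the
# current workspace path. 'D:/old-agent/_work/171/s' is a placeholder.
- task: PowerShell@2
  displayName: 'Rewrite cached CMake paths'
  inputs:
    targetType: inline
    script: |
      $oldDir = 'D:/old-agent/_work/171/s'                                # placeholder for the first job's workspace
      $newDir = '$(System.DefaultWorkingDirectory)'.Replace('\', '/')     # current workspace, CMake-style slashes
      Get-ChildItem '$(System.DefaultWorkingDirectory)/build' -Recurse -File |
        Where-Object { $_.Name -eq 'CMakeCache.txt' -or $_.Extension -eq '.cmake' } |
        ForEach-Object {
          (Get-Content $_.FullName -Raw) -replace [regex]::Escape($oldDir), $newDir |
            Set-Content $_.FullName -NoNewline
        }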
Any help appreciated here!
I'm relatively new with Azure DevOps and I was wondering what will be the most practical way to publish Cypress test screenshots in Azure pipelines (or maybe even somewhere external)?
The only way I found online is this:
http://codestyle.dk/2020/05/19/cypress-screenshots-are-missing-in-azure-pipelines/
But maybe there is a more practical solution?
To publish the screenshots of your failed Cypress tests, you can add the following task to your pipeline definition .yaml file after running your tests. This will publish all created screenshots in the pipeline artifacts of the current pipeline run.
- task: PublishBuildArtifacts@1
  displayName: 'Publish Cypress Screenshot Files'
  condition: failed()
  inputs:
    PathtoPublish: 'cypress/screenshots/'
    ArtifactName: 'screenshots'
Two notes about this:
If you want to publish screenshots not only when the tests fail, remove the condition: failed() line (or use condition: succeededOrFailed() so the step runs regardless of the test outcome).
The cypress/screenshots folder is only created automatically by Cypress if the test run actually produces screenshots. If no screenshot was created, the folder does not exist and the pipeline task above would fail. Therefore I would also keep the empty screenshots folder in the repo by committing a .gitkeep file.
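As an alternative to the .gitkeep file, a small script step before the publish can make sure the folder exists; a minimal sketch, assuming a Linux or macOS agent:
# Create the screenshots folder if Cypress did not, so the publish step
# always has a path to point at.
- script: mkdir -p cypress/screenshots
  displayName: 'Ensure screenshots folder exists'
  condition: succeededOrFailed()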
I'm trying to deploy my project in Azure DevOps through an IIS website and SQL deployment. However, I am struggling with the SQL part because I do not have a .dacpac file in my build artifacts. How do I generate it? Every option I have tried ended up with the process failing.
P.S. I do not have access to the VM I am deploying to due to restrictions. I can access the database, as I am marked as DBO on the machine.
My question is also: do I need to generate this DACPAC file through a build every time, or can it be generated only once, stored on the machine, and pointed to from the deployment process?
Thank you for your help!
However I am struggling with deploying the SQL part as I do not have a .dacpac file in my build artifacts. How do I generate it? Every option I have tried ended up with the process failing. I can access the database as I am marked as DBO on the machine.
Firstly, you have to create a SQL Server Database Project using SSDT (or the Azure Data Studio insiders preview) by importing the objects of the live database.
The database project is then placed into a repository.
The pipeline (classic or YAML) needs an MSBuild@1 build task. Here is a YAML example that generates the dacpac:
- task: MSBuild@1
  displayName: 'Build solution YourDatabase.sln'
  inputs:
    solution: 'src/YourDatabase.sln'
This task compiles the database project and produces the dacpac file(s).
The produced files are then copied to the artifact staging directory:
- task: CopyFiles@2
  displayName: 'Extract DACPACs'
  inputs:
    CleanTargetFolder: false
    SourceFolder: '$(agent.builddirectory)\s\src\YourDatabase\bin\Debug\'
    Contents: '*.dacpac'
    TargetFolder: '$(build.artifactstagingdirectory)'
And finally, published as an artifact:
- task: PublishPipelineArtifact@1
  displayName: 'Publish Artifact'
  inputs:
    targetPath: '$(build.artifactstagingdirectory)'
    artifact: 'drop'
Deployment of the dacpac is the final goal and can be done using SqlDacpacDeploymentOnMachineGroup@0; however, that is out of the scope of the original question.
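For completeness, a rough sketch of what such a deployment step could look like (the server, database and credential values are placeholders, and the input names should be checked against the task's documentation):
# Rough sketch only: deploys the dacpac produced above to a target SQL Server.
# ServerName, DatabaseName and the credentials are placeholders.
- task: SqlDacpacDeploymentOnMachineGroup@0
  displayName: 'Deploy dacpac'
  inputs:
    TaskType: 'dacpac'
    DacpacFile: '$(Pipeline.Workspace)/drop/YourDatabase.dacpac'
    TargetMethod: 'server'
    ServerName: 'your-sql-server'
    DatabaseName: 'YourDatabase'
    AuthScheme: 'sqlServerAuthentication'
    SqlUsername: '$(sqlUser)'
    SqlPassword: '$(sqlPassword)'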
My question is also: do I need to generate this DACPAC file through a build every time, or can it be generated only once, stored on the machine, and pointed to from the deployment process?
It depends.
Classic pipelines separate the BUILD and RELEASE phases. In this case, you can build the dacpac once and reuse it for many future releases.
With multi-stage YAML pipelines, it is common for every pipeline run to trigger both the build and deployment stages, because they belong to the same pipeline and run as a single unit of work.
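A trimmed-down sketch of that multi-stage shape, reusing the example project from above (the CopyFiles step and the actual deployment step are omitted for brevity; the environment name is a placeholder):
# Illustrative sketch: the Deploy stage consumes the dacpac artifact
# produced by the Build stage of the same run.
stages:
- stage: Build
  jobs:
  - job: BuildDacpac
    steps:
    - task: MSBuild@1
      inputs:
        solution: 'src/YourDatabase.sln'
    # ...CopyFiles step from above...
    - task: PublishPipelineArtifact@1
      inputs:
        targetPath: '$(build.artifactstagingdirectory)'
        artifact: 'drop'

- stage: Deploy
  dependsOn: Build
  jobs:
  - deployment: DeployDacpac
    environment: 'your-environment'   # placeholder
    strategy:
      runOnce:
        deploy:
          steps:
          # Pipeline artifacts from the current run are downloaded to
          # $(Pipeline.Workspace) automatically in deployment jobs.
          - script: echo "deploy $(Pipeline.Workspace)/drop/YourDatabase.dacpac here"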
I need to have a build definition contained entirely inside a PowerShell script.
I can install dotnet, restore packages, and build everything, and now I need to create an artifact.
I cannot find a way to call from PowerShell the equivalent of the PublishBuildArtifacts@1 task, and nothing comes up when googling either. It shouldn't be difficult...
Thanks in advance.
Maybe I found it, finally: I will try it using this documentation:
https://learn.microsoft.com/en-us/rest/api/azure/devops/build/artifacts/create?view=azure-devops-rest-4.1
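Besides the REST API, the ##vso[artifact.upload] logging command is an option: any script running inside a pipeline job can write it to standard output and the agent will attach the referenced file as a build artifact. A minimal sketch from a PowerShell build script (the folder and artifact names are just examples):
# Zip the output and let the agent upload it as a build artifact via the
# artifact.upload logging command.
$zip = Join-Path $env:BUILD_ARTIFACTSTAGINGDIRECTORY "$($env:BUILD_BUILDID).zip"
Compress-Archive -Path .\Deliver\* -DestinationPath $zip -Force
Write-Host "##vso[artifact.upload containerfolder=drop;artifactname=drop]$zip"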
'Publish build artifacts' should solve it.
I would also add a task to archive the files that should become the artifact before publishing.
The build report then shows a link to download the files produced by the PowerShell script.
Summary:
Archive task to $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
Publish Build Artifacts task publishing $(Build.ArtifactStagingDirectory)
yaml for the archive:
- task: ArchiveFiles@2
  displayName: 'Archive output'
  inputs:
    rootFolderOrFile: Deliver
    archiveFile: '$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip'
yaml for the publish:
- task: PublishBuildArtifacts@1
  displayName: 'Publish Artifact: drop'
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'drop'