Share CMake build folder between jobs in Azure DevOps

I'm building a pipeline in Azure DevOps that first builds a project using CMake and then, conditionally, creates an installer out of it. The creation of the installer is also done using CMake (and CPack). Since the installer is only supposed to be created and published under certain conditions (a tag is created), I'd like to share the build folder from the first job and reuse it in the second job. I do so by uploading the artifacts as the last step of the first job:
- task: PublishBuildArtifacts@1
  inputs:
    pathToPublish: $(System.DefaultWorkingDirectory)/build
    artifactName: build_release
(pipeline artifacts are currently not supported by our environment)
and then downloading and copying it in the second job:
- task: DownloadBuildArtifacts@0
  inputs:
    downloadType: 'single'
    artifactName: build_release
    downloadPath: $(System.DefaultWorkingDirectory)/tmp
- task: CopyFiles@2
  displayName: Copy Artifacts into Build Directory
  inputs:
    SourceFolder: $(System.DefaultWorkingDirectory)/tmp/build_release
    TargetFolder: $(System.DefaultWorkingDirectory)/build
Unfortunately, I'm running into an issue with cached absolute paths now:
CMake is re-running because D:/.../_work/171/s/build/vs-project/CMakeFiles/generate.stamp dependency file is missing.
CMake Error: The current CMakeCache.txt directory D:/.../_work/212/s/build/vs-project/CMakeCache.txt is different than the directory d:/.../_work/171/s/build/vs-project where CMakeCache.txt was created. This may result in binaries being created in the wrong place. If you are not sure, reedit the CMakeCache.txt
CMake Error: The source directory "D:/.../_work/171/s/build" does not exist.
I understand that several paths are cached in CMakeCache.txt and do not match since the build is taking place in a separate folder. Is there an elegant way to have ADO rewrite the paths in CMakeCache.txt automatically? Or to make sure the folder names stay constant for the two jobs?
Any help appreciated here!
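One direction that might help (not part of the original question, just a minimal sketch): add a script step in the second job that rewrites the absolute agent paths cached by the first job before CMake/CPack runs. The cache location, the helper file holding the old path, and the assumption that patching CMakeCache.txt is sufficient are all hypothetical; generated files under CMakeFiles/ may embed absolute paths as well.
- task: PowerShell@2
  displayName: Patch cached absolute paths in CMakeCache.txt (sketch)
  inputs:
    targetType: inline
    script: |
      # old_build_root.txt would have to be written and published by the first job
      # (hypothetical helper file containing the workspace path of that job).
      $cache = "$(System.DefaultWorkingDirectory)/build/vs-project/CMakeCache.txt"
      $old   = (Get-Content "$(System.DefaultWorkingDirectory)/build/old_build_root.txt" -Raw).Trim()
      $new   = "$(System.DefaultWorkingDirectory)".Replace('\', '/')
      # Replace every occurrence of the old workspace path with the current one.
      (Get-Content $cache) -replace [regex]::Escape($old), $new | Set-Content $cache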

Related

Terraform: Error while loading schemas for plugin components

I have an Azure DevOps Build pipeline that publishes the entire repository as an artifact to be used with the Release pipeline.
# Publish artifacts to be used in release
- task: PublishBuildArtifacts@1
  displayName: 'publish artifacts'
  inputs:
    PathtoPublish: '$(System.DefaultWorkingDirectory)'
    ArtifactName: 'TerraformModule'
    publishLocation: 'Container'
The build pipeline triggers the creation of a release pipeline where I try to deploy the terraform configuration.
I can successfully run terraform init in this pipeline but when I try to run plan or apply, I get the following error:
Looking at the screenshot, it looks like it tries to execute the command from /usr/local/bin instead of from the directory I specified in the step? Confused by this. Below is the YAML for my plan step:
steps:
- task: ms-devlabs.custom-terraform-tasks.custom-terraform-release-task.TerraformTaskV3@3
  displayName: 'terraform plan'
  inputs:
    provider: aws
    command: plan
    workingDirectory: '/home/vsts/work/r1/a/_terraform/TerraformModule/Projects/Potentium/Prod'
    environmentServiceNameAWS: 'AWS-Terraform-Build'
I manually changed workingDirectory to where the Artifacts from the build pipeline were downloaded to. See log below for example:
2022-08-14T23:41:31.3359557Z Downloaded TerraformModule/Projects/Potentium/Prod/main.tf to /home/vsts/work/r1/a/_terraform/TerraformModule/Projects/Potentium/Prod/main.tf
The plan step in my build pipeline executes without any issues so I have a feeling it is something to do with the artefacts/extraction that is occurring in the download step. Looking for any advice.
I've had similar issues with the extraction phase when using ExtractFiles@1 to do a similar thing with Terraform. I think there's a bug in it: I could not get it to extract files back to the root of System.DefaultWorkingDirectory unless the root folder was included in the archive (I am using ArchiveFiles@2), so I was ending up with /opt/az_devops/_work/*/s/s.
My solution was to shell out a command to do the extraction. No problems extracting to the root of System.DefaultWorkingDirectory that way.
Just remember that if you're running a subsequent terraform plan, the working directory System.DefaultWorkingDirectory will by default change between runs, so make sure you use these variables rather than an explicit path.
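A minimal sketch of that "shell out" approach; the archive name, its location, and the use of unzip are assumptions and would need to match whatever ArchiveFiles@2 actually produced:
- task: Bash@3
  displayName: Extract archive to the working directory root (sketch)
  inputs:
    targetType: inline
    script: |
      # Extract straight into the root of the working directory instead of using ExtractFiles@1.
      unzip -o "$(System.DefaultWorkingDirectory)/TerraformModule.zip" -d "$(System.DefaultWorkingDirectory)"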

How to generate DACPAC file

I'm trying to deploy my project in Azure DevOps through an IIS website and SQL deployment. However, I am struggling with deploying the SQL part as I do not have a .dacpac file in my build artifacts. How do I generate this? Every option I have tried ended up failing.
P.S. I do not have access to the VM where I am deploying due to restrictions. I can access the database as I am marked as DBO on the machine.
My question is also: do I need to generate this DACPAC file through the build every time, or can it be generated only once, stored on the machine, and pointed to from the deployment process?
Thank you for your help!
However, I am struggling with deploying the SQL part as I do not have a .dacpac file in my build artifacts. How do I generate this? Every option I have tried ended up failing. I can access the database as I am marked as DBO on the machine.
Firstly, you have to create a SQL Server Database Project using SSDT (or the Azure Data Studio insiders preview) by importing the objects of the live database.
The database project is then placed into a repository.
The pipeline (classic or YAML) needs an MSBuild@1 build task. Here is a YAML example; it generates the dacpac:
- task: MSBuild@1
  displayName: 'Build solution YourDatabase.sln'
  inputs:
    solution: 'src/YourDatabase.sln'
This task compiles the database project and produces dacpac file(s).
Then the produced files are to be extracted:
- task: CopyFiles@2
  displayName: 'Extract DACPACs'
  inputs:
    CleanTargetFolder: false
    SourceFolder: '$(agent.builddirectory)\s\src\YourDatabase\bin\Debug\'
    Contents: '*.dacpac'
    TargetFolder: '$(build.artifactstagingdirectory)'
And finally, published as an artifact:
- task: PublishPipelineArtifact@1
  displayName: 'Publish Artifact'
  inputs:
    targetPath: '$(build.artifactstagingdirectory)'
    artifact: 'drop'
Deployment of the dacpac is the final goal and can be done using SqlDacpacDeploymentOnMachineGroup@0; however, this is out of the scope of the original question.
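For completeness, a rough, unverified sketch of what that deployment step could look like in YAML. The input names are written from memory and should be checked against the task reference; the file path, server, and database below are placeholders:
- task: SqlDacpacDeploymentOnMachineGroup@0
  displayName: Deploy dacpac (sketch)
  inputs:
    TaskType: 'dacpac'
    DacpacFile: '$(Pipeline.Workspace)/drop/YourDatabase.dacpac'  # placeholder path
    TargetMethod: 'server'
    ServerName: 'localhost'                                       # placeholder
    DatabaseName: 'YourDatabase'                                  # placeholder
    AuthScheme: 'windowsAuthentication'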
My question is also: do I need to generate this DACPAC file through the build every time, or can it be generated only once, stored on the machine, and pointed to from the deployment process?
It depends.
Classic pipelines have a separation of BUILD and RELEASE phases. In this case, you can build the dacpac once and reuse it for many future releases.
In the case of multi-stage YAML pipelines, it is common that every pipeline run triggers both the build and deployment stages, because they still belong to the same pipeline and run as a single unit of work.
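For illustration, a minimal multi-stage skeleton of that kind (stage, job, and artifact names are placeholders, not from the original answer); every run executes both stages as one unit:
stages:
- stage: Build
  jobs:
  - job: BuildDacpac
    steps:
    - task: MSBuild@1
      inputs:
        solution: 'src/YourDatabase.sln'
    # copy the produced .dacpac into $(Build.ArtifactStagingDirectory) here, as shown above
    - task: PublishPipelineArtifact@1
      inputs:
        targetPath: '$(Build.ArtifactStagingDirectory)'
        artifact: 'drop'
- stage: Deploy
  dependsOn: Build
  jobs:
  - job: DeployDacpac
    steps:
    - download: current
      artifact: drop
    # the dacpac deployment task would go here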

UI Selenium tests on Azure Devops 2019

What is the correct way to run automated UI tests by a self-hosted agent?
I tried to add a test step in the release pipeline, but it's not working because the agent cannot find the DLLs with the tests (they are in a few separate projects):
##[warning]No test assemblies found matching the pattern: **\Test.UI.dll,!**\*TestAdapter.dll,!**\obj\**.
Currently, the release pipeline is simple: one artifact from the build pipeline and one stage with the following steps:
1. IIS Web App Deploy
2. IIS Web App Manage
3. VsTest: tests are selected using the Test assemblies option
No test assemblies found matching the pattern.
It seems that the .dll files don't exist in the artifact or under the path on the release agent. You may need to share more information about the pipeline (e.g. build agent, pipeline definition, and VsTest task definition).
Before this, you can refer to the following steps for troubleshooting.
First of all, you need to make sure that the .dll files exist in the build artifacts.
You could check this in Build Summary -> Artifacts. You could download it and check the files in the artifact.
If the files don't exist, you could add a Copy files task before the Publish Artifacts task.
For example:
- task: CopyFiles@2
  displayName: 'Copy Files to: $(build.artifactstagingdirectory)'
  inputs:
    SourceFolder: '$(agent.builddirectory)'
    TargetFolder: '$(build.artifactstagingdirectory)'
Then the "VsTest" task will find the files in $(System.DefaultWorkingDirectory) by default(No customization). In release pipeline, the path is equals to C:\agent\_work\rx\a (e.g. C:\agent\_work\r1\a)
Since you are using the self-hosted agent , you could directly check the files in the target path on your local machine.
If you couldn't find the target files, you may need to modify the search folder or Test files path.
You also could check the file path in the Release Pipeline log -> Download Artifacts Step.
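If the assemblies do end up in a sub-folder of the downloaded artifact, pointing the VsTest task's search folder at that sub-folder is usually enough. A hedged sketch in YAML (the artifact alias and folder layout are assumptions):
- task: VSTest@2
  displayName: Run UI tests (sketch)
  inputs:
    testSelector: 'testAssemblies'
    testAssemblyVer2: |
      **\Test.UI.dll
      !**\*TestAdapter.dll
      !**\obj\**
    searchFolder: '$(System.DefaultWorkingDirectory)/_YourBuildPipeline/drop'  # assumed artifact path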

Why is this Azure DevOps pipeline release failing?

This is an ASP.NET Core 3.0 project that builds with no errors, but when it triggers the pipeline to release to Azure App Service, it fails with the following error:
2019-11-10T23:09:23.8008460Z ##[error]Error: No package found with specified pattern: D:\a\r1\a***.zip
What needs to be done to fix the release pipeline? The pipeline release is pulling the latest build as its artifact.
Assumptions
The following info assumes that you are appropriately publishing your build artifact from your build pipeline, and that you have added the correct build artifact into your release pipeline.
In your release pipeline you have specified a build artifact in the Artifacts area.
When adding your build artifact to your release pipeline, you chose to give it an alias of Build Artifact. This means that at the very least (with default settings) your .zip file will be in some sub-directory of $(system.DefaultWorkingDirectory)/Build Artifact/
A new unique folder in the agent is created for every release pipeline when you initiate a release, and the artifacts are downloaded into that folder. The $(System.DefaultWorkingDirectory) variable maps to this folder.
To ensure the uniqueness of every artifact download, each artifact source linked to a release pipeline is automatically provided with a specific download location known as the source alias. This location can be accessed through the variable:
$(System.DefaultWorkingDirectory)\[source alias]
This uniqueness also ensures that, if you later rename a linked artifact source in its original location (for example, rename a build pipeline in Azure Pipelines or a project in Jenkins), you don't need to edit the task properties because the download location defined in the agent does not change.
The source alias is, by default, the name of the source selected when you linked the artifact source, prefixed with an underscore; depending on the type of the artifact source this will be the name of the build pipeline, job, project, or repository. You can edit the source alias from the artifacts tab of a release pipeline; for example, when you change the name of the build pipeline and you want to use a source alias that reflects the name of the build pipeline.
(from some of the abundant documentation)
Instead of searching for your package using ***.zip (which isn't proper wildcard syntax), use Build Artifact/**/*.zip.
** recursively searches through directories, which is useful when you don't know which folder the file ends up in.
* matches part of a single level of the path: any file or folder that starts with a prefix (SomeFile.*), ends with a suffix (*File.zip), or contains a substring (*meFi*).
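For example, in a YAML App Service deployment step the corrected pattern might be used like this (the task choice, service connection, and app name are placeholders, not from the original question):
- task: AzureWebApp@1
  displayName: Deploy package (sketch)
  inputs:
    azureSubscription: 'MyServiceConnection'  # placeholder
    appName: 'my-app-service'                 # placeholder
    package: '$(System.DefaultWorkingDirectory)/Build Artifact/**/*.zip'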
The pipeline YAML was missing the following tasks. Not sure why these aren't included in the ASP.NET Core template; very confusing for developers new to Azure DevOps.
- task: DotNetCoreCLI@2
  inputs:
    command: 'publish'
    publishWebProjects: true
- task: CopyFiles@2
  inputs:
    targetFolder: '$(Build.ArtifactStagingDirectory)'
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'drop'
    publishLocation: 'Container'
When adding an artifact of source type "Build", select "Default version" as "Latest from the build pipeline default branch with tags".

How to generate a build artifact in Azure DevOps through PowerShell

I would need to have a build definition totally included inside a PowerShell script.
I can install dotnet, restore packages, build everything, and now I need to create an artifact.
I cannot find a way to call, from PowerShell, the equivalent of the PublishBuildArtifacts@1 task; nothing comes up from googling, either. It shouldn't be difficult...
Thanks in advance.
Maybe I found it, finally: I will try it using this documentation:
https://learn.microsoft.com/en-us/rest/api/azure/devops/build/artifacts/create?view=azure-devops-rest-4.1
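If the goal is to avoid a separate publish task entirely, another option worth noting (not from the original answers) is the agent's artifact.upload logging command, which a PowerShell step can write to stdout to upload a folder as a build artifact. A minimal sketch; the folder path and artifact name are placeholders:
- task: PowerShell@2
  displayName: Publish artifact via logging command (sketch)
  inputs:
    targetType: inline
    script: |
      # The agent picks this logging command up from stdout and uploads the folder
      # as a build artifact named 'drop' (path and name are placeholders).
      Write-Host "##vso[artifact.upload containerfolder=drop;artifactname=drop]$(Build.ArtifactStagingDirectory)"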
'Publish build artifacts' should solve it.
I would also add a task to archive the files to be considered artifacts before the publish.
The build report now shows a link to download the files produced by the PowerShell script.
Summary:
Archive task to $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
Publish Build Artifacts task publishing $(Build.ArtifactStagingDirectory)
YAML for the archive (the archiveFile value follows the summary above):
- task: ArchiveFiles@2
  displayName: 'Archive output'
  inputs:
    rootFolderOrFile: Deliver
    archiveFile: '$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip'
YAML for the publish:
- task: PublishBuildArtifacts@1
  displayName: 'Publish Artifact: drop'
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'