I'm beginning to play around with Azure Pipelines and am having a hard time figuring out what the output of my builds is and where it is stored.
Is there a way to explore the files contained in builds? I feel like I'm blind when using those built-in variables without knowing what's behind them.
Here is a list of predefined variables for referencing files in build pipeline tasks.
If you want more visibility into the files, I suggest you create and configure your own build agent; instructions here.
One way to "explore" a build is to publish to a "drop" location using:
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'drop'
After this, open up the Artifacts dropdown in the top-right and explore your build.
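If you just want to see what is on the agent without publishing anything, a plain script step can also dump the directory contents. This is a minimal sketch using the predefined variables; on a Windows agent you would use dir instead of ls:

- script: |
    echo "Sources: $(Build.SourcesDirectory)"
    ls -R "$(Build.SourcesDirectory)"
    echo "Staging: $(Build.ArtifactStagingDirectory)"
    ls -R "$(Build.ArtifactStagingDirectory)"
  displayName: 'List pipeline directories'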
I need to modify an existing YAML pipeline so that it downloads an artifact published from another existing ADO pipeline. The other pipeline is a Classic one in case it matters.
In the current setup, a daily Release pipeline takes the artifact from the Classic pipeline and pushes it to a company repository external to ADO.
Now, the YAML pipeline is only run occasionally and it is run manually. Currently it downloads the artifact from the external repo to which the Release pipeline pushed. This hasn't been a problem generally. However a recent issue highlighted that it would be desirable to be able to avoid the delays built in to the current approach and essentially just grab the artifact directly from the latest run of the Classic pipeline.
When I set out to do this, I assumed it would be straightforward, but I seem to have run into a brick wall. The only information I have found describes using DownloadPipelineArtifact@2, but this depends on various IDs like the pipeline ID and run ID, which it seems are not easily obtainable.
I am pretty new to ADO. I'm not a devops person at all really but I've had this put on my plate. So before I spend too much time on this, am I missing something or is this just something that one should not really be doing in ADO? If it is possible, is there a guide somewhere?
UPDATE
Thanks to a useful answer from daniel-mann, I was able to get this working but there were some quirks that I thought I should mention in the event that they might be useful to anyone else.
When I started adding the DownloadPipelineArtifact@2 task (I was editing directly in ADO in a browser), ADO was suggesting field names to me that seemed to be different from the documented ones. Possibly these were aliases, but I had a hard time knowing what to trust with respect to the documentation.
I also noticed a Settings "link" had appeared above the first line of the task definition. When I clicked on this it opened up an editor to the right of the page that helped fill in the fields. It provided dropdowns for things like the project and the pipeline ID.
This is what I ended up with:
- task: DownloadPipelineArtifact@2
  displayName: "my task description"
  inputs:
    buildType: 'specific'
    project: <long "UID" string identifying project>
    definition: <numeric id for pipeline>
    buildVersionToDownload: 'latest'
    artifactName: <name of artifact as defined in upstream pipeline>
    targetPath: '$(Pipeline.Workspace)'
Note that the editor tool added a definition field, but apparently this is an alias for pipeline. I am not sure why it thinks this is more helpful.
Unfortunately the above did not work. I saw this error:
##[error]Pipeline Artifact Task is not supported in on-premises. Please use Build Artifact Task instead.
I don't know what caused this; perhaps the ADO setup in my organization? As I understand it, the Build Artifact Task is deprecated in favour of the Pipeline Artifact Task, but I did not have any choice but to try it, and this time it did work for me.
This time I used the "Settings" editor from the outset and ended up with this:
- task: DownloadBuildArtifacts@0
  displayName: "my task description"
  inputs:
    buildType: 'specific'
    project: $(System.TeamProjectId)
    pipeline: <numeric ID as above>
    buildVersionToDownload: 'latest'
    downloadType: 'single'
    artifactName: '$(ARTNAME)'
    downloadPath: '$(System.ArtifactsDirectory)'
The fields that I manually edited here were:
using our own ARTNAME variable, which we define to be the artifact name in one of our variable groups. The relevant variable group is imported into this pipeline.
using the built-in System.TeamProjectId for the project name. This seemed preferable to having the "UID" string in there. (Though I also found that the normal name string for our project worked here too.)
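For reference, importing a variable group into a YAML pipeline looks roughly like this (the group name here is a placeholder; ours is the group that defines ARTNAME):

variables:
- group: 'my-artifact-variables'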
but this depends on various IDs like the pipeline ID and run ID
Not for your use case.
You said
just grab the artifact directly from the latest run of the Classic pipeline.
In which case, referring to the parameters explained in the documentation,
# Download pipeline artifacts
# Download build and pipeline artifacts
- task: DownloadPipelineArtifact@2
  inputs:
    #source: 'current' # Options: current, specific
    #project: # Required when source == Specific
    #pipeline: # Required when source == Specific
    #preferTriggeringPipeline: false # Optional
    #runVersion: 'latest' # Required when source == Specific. Options: latest, latestFromBranch, specific
    #runBranch: 'refs/heads/master' # Required when source == Specific && RunVersion == LatestFromBranch
    #runId: # Required when source == Specific && RunVersion == Specific
    #tags: # Optional
    #artifact: # Optional
    #patterns: '**' # Optional
    #path: '$(Pipeline.Workspace)'
You would just need to set the project, pipeline, source: specific, and runVersion: latest parameters.
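A minimal concrete version of the task, using the alias names from the snippet above, might look like this (the project name, pipeline ID, and artifact name are placeholders):

- task: DownloadPipelineArtifact@2
  inputs:
    source: 'specific'
    project: 'MyProject'
    pipeline: 42                  # numeric definition ID of the Classic pipeline
    runVersion: 'latest'
    artifact: 'MyArtifactName'
    path: '$(Pipeline.Workspace)'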
Or you could use the download alias, which is a little bit simpler but can achieve the same thing:
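A hedged sketch of the download shortcut, assuming the Classic pipeline is declared as a pipeline resource (the alias, pipeline name, and artifact name are placeholders):

resources:
  pipelines:
  - pipeline: upstream                    # alias used by the download step
    source: 'Name of the Classic pipeline'

steps:
- download: upstream
  artifact: MyArtifactName

The files then end up under $(Pipeline.Workspace)/upstream/MyArtifactName.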
I have an Azure DevOps Build pipeline that publishes the entire repository as an artifact to be used with the Release pipeline.
# Publish artifacts to be used in release
- task: PublishBuildArtifacts@1
  displayName: 'publish artifacts'
  inputs:
    PathtoPublish: '$(System.DefaultWorkingDirectory)'
    ArtifactName: 'TerraformModule'
    publishLocation: 'Container'
The build pipeline triggers the creation of a release pipeline where I try to deploy the terraform configuration.
I can successfully run terraform init in this pipeline but when I try to run plan or apply, I get the following error:
Looking at the screenshot, it looks like it tries to execute the command from /usr/local/bin instead of from what I specified in the step? I'm confused by this. Below is the YAML for my plan step:
steps:
- task: ms-devlabs.custom-terraform-tasks.custom-terraform-release-task.TerraformTaskV3@3
  displayName: 'terraform plan'
  inputs:
    provider: aws
    command: plan
    workingDirectory: '/home/vsts/work/r1/a/_terraform/TerraformModule/Projects/Potentium/Prod'
    environmentServiceNameAWS: 'AWS-Terraform-Build'
I manually changed workingDirectory to where the Artifacts from the build pipeline were downloaded to. See log below for example:
2022-08-14T23:41:31.3359557Z Downloaded TerraformModule/Projects/Potentium/Prod/main.tf to /home/vsts/work/r1/a/_terraform/TerraformModule/Projects/Potentium/Prod/main.tf
The plan step in my build pipeline executes without any issues so I have a feeling it is something to do with the artefacts/extraction that is occurring in the download step. Looking for any advice.
I've had similar issues with the extraction phase when using ExtractFiles@1 to do a similar thing with Terraform. I think there's a bug in it: I could not get it to extract files back to the root of System.DefaultWorkingDirectory unless the root folder was included in the archive (I am using ArchiveFiles@2). So I was ending up with /opt/az_devops/_work/*/s/s
My solution was to shell out a command to do the extraction; no problems extracting to the root of System.DefaultWorkingDirectory that way.
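As an illustration, the shell step could be as simple as the following (the archive name and location are assumptions based on how the artifact was published):

- script: |
    unzip -o "$(System.ArtifactsDirectory)/drop/build.zip" -d "$(System.DefaultWorkingDirectory)"
  displayName: 'Extract archive to working directory'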
Just remember that if you're running a subsequent terraform plan, by default the working directory System.DefaultWorkingDirectory will change between runs. So ensure you use these variables rather than an explicit path reference.
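Applied to the plan step from the question, that would mean something like this (the _terraform alias and folder layout are taken from the download log above and may differ in your setup):

- task: ms-devlabs.custom-terraform-tasks.custom-terraform-release-task.TerraformTaskV3@3
  displayName: 'terraform plan'
  inputs:
    provider: aws
    command: plan
    workingDirectory: '$(System.DefaultWorkingDirectory)/_terraform/TerraformModule/Projects/Potentium/Prod'
    environmentServiceNameAWS: 'AWS-Terraform-Build'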
I'm building a pipeline in Azure DevOps that first builds a project using CMake and then, conditionally, creates an installer out of it. The creation of the installer is also done using CMake (and CPack). Since the installer is only supposed to be created and published under certain conditions (a tag is created), I'd like to share the build folder from the first job and reuse it in the second job. I do so by uploading the artifacts as the last step of the first job:
- task: PublishBuildArtifacts@1
  inputs:
    pathToPublish: $(System.DefaultWorkingDirectory)/build
    artifactName: build_release
(pipeline artifacts are currently not supported by our environment)
and downloading and moving it in the second step
- task: DownloadBuildArtifacts@0
  inputs:
    downloadType: 'single'
    artifactName: build_release
    downloadPath: $(System.DefaultWorkingDirectory)/tmp
- task: CopyFiles@2
  displayName: Copy Artifacts into Build Directory
  inputs:
    SourceFolder: $(System.DefaultWorkingDirectory)/tmp/build_release
    TargetFolder: $(System.DefaultWorkingDirectory)/build
Unfortunately, I'm running into an issue with cached absolute paths now:
CMake is re-running because D:/.../_work/171/s/build/vs-project/CMakeFiles/generate.stamp dependency file is missing.
CMake Error: The current CMakeCache.txt directory D:/.../_work/212/s/build/vs-project/CMakeCache.txt is different than the directory d:/.../_work/171/s/build/vs-project where CMakeCache.txt was created. This may result in binaries being created in the wrong place. If you are not sure, reedit the CMakeCache.txt
CMake Error: The source directory "D:/.../_work/171/s/build" does not exist.
I understand that several paths are cached in CMakeCache.txt and do not match since the build is taking place in a separate folder. Is there an elegant way to have ADO rewrite the paths in CMakeCache.txt automatically? Or to make sure the folder names stay constant for the two jobs?
Any help appreciated here!
I'm relatively new to Azure DevOps and I was wondering what would be the most practical way to publish Cypress test screenshots in Azure Pipelines (or maybe even somewhere external)?
The only way I found online is this:
http://codestyle.dk/2020/05/19/cypress-screenshots-are-missing-in-azure-pipelines/
But maybe there is a more "practical" solution?
To publish the screenshots of your failed Cypress tests, you can add the following task to your pipeline definition .yaml file after running your tests. This will publish all created screenshots in the pipeline artifacts of the current pipeline run.
- task: PublishBuildArtifacts@1
  displayName: 'Publish Cypress Screenshot Files'
  condition: failed()
  inputs:
    PathtoPublish: 'cypress/screenshots/'
    ArtifactName: 'screenshots'
Two notes about this:
If you want to publish screenshots not only when the tests fail, then you have to remove the line condition: failed()
The cypress/screenshots folder is only automatically created by Cypress if the test execution also creates screenshots. If no screenshot was created, then the folder does not exist and the above pipeline task would fail. Therefore I would also persist the empty screenshots folder in the repo by using a .gitkeep file.
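One way to keep that folder around is to run something like the following once locally and commit the result (the -f is only needed if the screenshots folder is in your .gitignore):

mkdir -p cypress/screenshots
touch cypress/screenshots/.gitkeep
git add -f cypress/screenshots/.gitkeep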
I'm trying to deploy my project in Azure DevOps through an IIS website and SQL deployment. However, I am struggling with the SQL part of the deployment, as I do not have a .dacpac file in my build artifacts. How do I generate this? Any option that I have tried ended up with the process failing.
P.S. I do not have access to the VM where I am deploying due to restrictions. I can access the database as I am marked as DBO on the machine.
My question is also: do I need to generate this DACPAC file through a build every time, or can it be generated only once, stored on the machine, and I point the deployment process to that file?
Thank you for your help!
However I am struggling with deploying the SQL part as I do not have a .dacpac file in my build artifacts. How do I generate this? Any option that I have tried ended up with the process failing. I can access the database as I am marked as DBO on the machine.
Firstly, you have to create a SQL Server Database Project using SSDT (or the Azure Data Studio insiders preview) by importing the objects of the live database.
The database project is then to be placed into a repository.
The pipeline (Classic or YAML) is to have a build task MSBuild@1. Here is a YAML example; it generates the dacpac:
- task: MSBuild@1
  displayName: 'Build solution YourDatabase.sln'
  inputs:
    solution: 'src/YourDatabase.sln'
This task compiles the database project and produces the dacpac file(s).
Then the produced files are to be extracted:
- task: CopyFiles@2
  displayName: 'Extract DACPACs'
  inputs:
    CleanTargetFolder: false
    SourceFolder: '$(agent.builddirectory)\s\src\YourDatabase\bin\Debug\'
    Contents: '*.dacpac'
    TargetFolder: '$(build.artifactstagingdirectory)'
And finally, they are published as the artefact:
- task: PublishPipelineArtifact@1
  displayName: 'Publish Artifact'
  inputs:
    targetPath: '$(build.artifactstagingdirectory)'
    artifact: 'drop'
Deployment of the dacpac is the final goal and can be done using SqlDacpacDeploymentOnMachineGroup@0; however, this is out of the scope of the original question.
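For completeness, a rough sketch of what that deployment step could look like, assuming a deployment group job; the server, database, and dacpac path are placeholders:

- task: SqlDacpacDeploymentOnMachineGroup@0
  displayName: 'Deploy dacpac'
  inputs:
    TaskType: 'dacpac'
    DacpacFile: '$(System.ArtifactsDirectory)\drop\YourDatabase.dacpac'
    TargetMethod: 'server'
    ServerName: 'localhost'
    DatabaseName: 'YourDatabase'
    AuthScheme: 'windowsAuthentication'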
My question is also: do I need to generate this DACPAC file through a build every time, or can it be generated only once, stored on the machine, and I point the deployment process to that file?
It depends.
Classic pipelines have a separation of BUILD and RELEASE phases. In this case, you can build it once and reuse that dacpac for many future releases.
In the case of multi-stage YAML pipelines, it is common that every pipeline run triggers both the build and the deployment stages, because they still belong to the same pipeline and run as a single unit of work.