So I am looking for documentation on the tags available in NAnt for Vault. I saw some samples on SourceGear's site, but I don't think they are an exhaustive list. Does anyone know where I can find documentation on all the eligible tags?
Thanks
This documentation, along with the NAnt tasks for Vault, can be found at Vault Downloads by downloading "NAnt Tasks (zip)". The documentation is included as a CHM in the doc folder of the zip. It covers all the tasks, with all the attributes for each.
To get the NAnt tasks working, copy them to the bin folder of your NAnt installation (the folder where nant.exe lives). Just make sure the Vault tasks match your NAnt release (the current latest is for NAnt 0.85).
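If it helps, a minimal copy step might look like the sketch below; both paths are placeholders for wherever you unzipped the tasks and where NAnt is installed:
# Sketch only: copy the unzipped Vault NAnt task assemblies into NAnt's bin folder.
# Both paths are placeholders - adjust them to your own unzip location and NAnt install.
Copy-Item -Path "C:\Downloads\VaultNAntTasks\*.dll" -Destination "C:\Tools\nant-0.85\bin\" -Force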
At some point I also found the source code for this somewhere on SourceGear's websites, but I am not able to find it right now.
I just started working in Azure DevOps and I keep seeing this combination C:\a\11\s or D:\a\11\a at the beginning of file paths. When I search to find out what it means, I get no results. The C or D is a "drive" reference, but what is the \a\11\s or \a\11\a part of the path referencing?
This is an implementation detail of Azure DevOps. The a is the agent's work folder, the 11 is the numerical ID of the working folder the agent assigned to this particular pipeline (one agent can build multiple pipelines), the s is short for sources (the directory with the sources being built) and the trailing a is short for artifacts (i.e. the staging directory for the build results). The short names have been chosen to prevent build failures due to overly long paths.
For example, underneath ..\s there is the complete directory tree of the source, which could be deep and use long directory and file names.
Tools used in the build process might have issues with paths longer than 255 characters.
This approach does not prevent the problem, but it makes it less likely than if verbose directory names had been chosen.
The directories are also available through predefined variables, for example Build.SourcesDirectory or Build.ArtifactStagingDirectory.
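A quick way to see how the short paths map to those variables is an inline script in a PowerShell task; the variable names below are the standard predefined ones, and the exact drive and ID will vary per agent:
# Print the predefined directory variables so the short paths can be matched up.
Write-Host "Agent work folder:       $(Agent.WorkFolder)"                # e.g. D:\a
Write-Host "Pipeline working folder: $(Agent.BuildDirectory)"            # e.g. D:\a\11
Write-Host "Sources (s):             $(Build.SourcesDirectory)"          # e.g. D:\a\11\s
Write-Host "Artifact staging (a):    $(Build.ArtifactStagingDirectory)"  # e.g. D:\a\11\a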
It's a local working folder on the pipeline agent machine. A search for e.g. "pipeline agent working folder" might find some info.
I'm completely new to Azure DevOps Pipelines, so if I'm doing something incorrectly I'd appreciate a nudge in the right direction... I set up a build pipeline and that seems to be working. Now I'm trying to set up a release pipeline in order to run tests; it's mostly based on Microsoft's documentation:
https://learn.microsoft.com/en-us/azure/devops/test/run-automated-tests-from-test-hub?view=azure-devops
Before running tests I need to transform a config file to replace some variables like access keys, usernames, etc. What I set up is what I have below, but for the life of me I can't figure out what the Package or folder text box refers to. The documentation is super helpful, as you can imagine:
File path to the package or a folder
but what package or what folder is this referring to? I've tried several different things, but everything errors with
##[error]Error: No package found with specified pattern D:\a\r1\a\**\*.zip
or pretty much whatever I specify for a value.
The File Transform task supports .zip files.
Testing with the default File Transform task settings, I could reproduce this issue.
In a release pipeline, the file path has one more node for the build artifact's .zip file.
The format looks like this:
$(System.DefaultWorkingDirectory)\{Source alias name}\{Artifacts name}\*.zip
So you could try setting $(System.DefaultWorkingDirectory)/**/**/*.zip in the Package or folder field.
On the other hand, you can check the specific path in the release log, under the Download Artifacts step.
$(System.DefaultWorkingDirectory): D:\a\r1\a
You could also use this specific path in the task.
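If the pattern still doesn't match, a quick way to discover the correct value for the Package or folder field is to list every .zip the release downloaded; this is just a sketch using an inline PowerShell task, and the folder layout under the working directory depends on your source alias and artifact names:
# List all .zip files downloaded with the release artifacts.
Get-ChildItem -Path "$(System.DefaultWorkingDirectory)" -Filter *.zip -Recurse |
    ForEach-Object { Write-Host $_.FullName }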
Update:
If your file is in a project folder rather than a .zip package, you can refer to the following sample:
File structure:
Task Settings:
Note: you only need to point the field at the folder node. You could also select the folder path via the ... option.
Context
I'm creating an SDK that uses AWS DynamoDB in .NET 4.7.2. In my test project, I have an app.config file that references a secrets.config file that stores my AWS keys. The secrets config is not included in any commits, for obvious reasons. However, I've uploaded secrets.config to my Library in DevOps as a "secure file". I also have a Download Secure File task in the build process that downloads secrets.config to $(Agent.TempDirectory).
Issue
I don't understand how to modify the build process to "pick up" my secrets.config file so that when running my tests, the test project's app.config file knows where to look for secrets.config.
I've looked over a lot of documentation and I can't find exactly how to do this.
I figured this out by using a couple of tasks within the Build Configuration.
I added a Download Secure File task to download my app.config from the Library, and then set up a Copy Files task for each project that needed the app.config.
Each Copy Files task was set with the following (a rough script equivalent is sketched after the list):
Source Folder to "$(Agent.TempDirectory)"
Contents to "app.config"
Target folder to "$(Build.SourcesDirectory)\Project.Tests\"
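As a reference, a minimal PowerShell equivalent of those Copy Files settings might look like this; the variables are the standard predefined ones and Project.Tests is just the example project name from the settings above:
# Copy the downloaded app.config from the agent temp folder into the test project.
Copy-Item -Path "$(Agent.TempDirectory)\app.config" -Destination "$(Build.SourcesDirectory)\Project.Tests\" -Force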
In the end, it wasn't any more complicated than that. I hope this will help others...
I'm trying to upload log files created by a 3rd-party exe during a deployment and include them in the results of my deployment, on a separate tab if possible.
I tried using the Publish Artifact build task, but that only works for builds, not releases.
I tried logging commands, but ##vso[build.uploadlog]<local file path> seems to be for builds as well, since it complains about not finding a container for the build.
Release management does not have a container for build artifacts; that's why you see this error message.
You can try the following command instead:
Write-host "##vso[task.uploadfile]<filename>"
View and download attachments associated with releases
Would you like to upload additional logs or diagnostics or images when
running tasks in a release? This feature enables users to upload
additional files during deployments. To upload a new file, use the
following agent command in your script:
Write-host "##vso[task.uploadfile]"
The file is then available as part of the release logs. When you
download all the logs associated with the release, you will be able to
retrieve this file as well.
You can also add a PowerShell script task in your release definition to read the log files and output them to the console. Then you will see the content of the log files in the "Logs" tab, under the PowerShell script step. You can also click "Download all logs as zip" to download the logs.
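Putting both suggestions together, a minimal PowerShell step might look like the sketch below; the log path is only an example, so point it at wherever your exe actually writes its log:
# Example log path only - adjust to wherever the 3rd-party exe writes its log.
$logFile = "$(System.DefaultWorkingDirectory)\drop\deployment.log"
# Show the contents in the "Logs" tab of the release.
Get-Content -Path $logFile | Write-Host
# Attach the file so it is included when downloading all logs as a zip.
Write-Host "##vso[task.uploadfile]$logFile"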
I hope to provide more clarity to those looking for an answer. The accepted answer does work.
I had a lot of files (browser screenshots) to add to the release logs. Here's what I did:
If you have lots of files, archive them into a zip.
Attach the zip to the release logs via PowerShell.
Download the logs
Unzip and enjoy!
@DonRolling, thanks for the detailed answer. In my case, instead of adding a new task to compress the folder, I just included that part in the PowerShell:
Compress-Archive -Path "$(System.DefaultWorkingDirectory)/TestFolder/ScreenShots" -DestinationPath "$(System.DefaultWorkingDirectory)/TestFolder/ScreenShots" -Force
Write-host "##vso[task.uploadfile]$(System.DefaultWorkingDirectory)/TestFolder/ScreenShots.zip"
I was facing a similar problem but I also wanted to use the Artifacts in a subsequent agent phase.
Based on the previous answers, I created an extension that makes it possible to:
Upload a file or a folder to the release logs
Automatically download a previously uploaded artifact from the logs
The upload task makes use of the logging command mentioned before. The download task then queries the Azure DevOps REST API to download all logs collected thus far, tries to find the specified artifact, and copies it to a specific place.
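As a rough sketch of the download side, the Releases REST API exposes a logs endpoint that returns a zip of everything collected so far; the organization, project, release ID and PAT below are placeholders, and the exact api-version may differ on your server:
# Placeholders - replace with your own organization, project, release ID and PAT.
$org       = "my-org"
$project   = "my-project"
$releaseId = 123
$pat       = "<personal access token>"
$headers = @{
    Authorization = "Basic " + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat"))
}
$url = "https://vsrm.dev.azure.com/$org/$project/_apis/release/releases/$releaseId/logs?api-version=7.1-preview.2"
# Download the zip of release logs and extract it; the uploaded files are inside.
Invoke-WebRequest -Uri $url -Headers $headers -OutFile "release-logs.zip"
Expand-Archive -Path "release-logs.zip" -DestinationPath "release-logs" -Force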
If anyone is interested, it can be found on the Marketplace.
We've got a TeamCity (9.1) build configuration which relies on several snapshot dependencies to build correctly. I'm looking for a convenient way to provide each developer with a way to set up a proper build environment on their desktops. For this, I would like to download all the snapshot dependencies for a given build configuration from the TeamCity server onto the developer's desktop using the REST API.
I'm aware of how to access artifacts using REST, but that addresses the artifacts created by a specific build configuration. I'm looking for a way to download all artifacts used by a given configuration, as specified by its dependencies.
There isn't an easy way to do this; however, it's not impossible. My answer is provided below, followed by a possible alternative solution.
Answer:
The artifacts used by your target build are really just the artifacts that were created by its dependencies, right?
I think what you are looking for is referenced here, where you can query a build for all of its snapshot dependencies.
Once you have a list of the dependencies, you would then need to query each of them for the artifacts they generated, and then you could proceed to download them.
It's not the most straightforward thing and would require some slick PowerShell or Python or whatever, but it is doable.
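As a rough PowerShell sketch of that approach: the server URL, build ID and credentials are placeholders, the endpoints are the standard TeamCity REST ones for build details and artifact listings, and authentication should be adjusted to whatever your server uses:
# Placeholders - point these at your own server and the build whose dependencies you want.
$server  = "https://teamcity.example.com"
$buildId = 12345
$headers = @{ Accept = "application/json" }
$cred    = Get-Credential   # a TeamCity user allowed to view the builds and artifacts
# 1. Ask the build for its snapshot dependencies.
$build = Invoke-RestMethod -Uri "$server/httpAuth/app/rest/builds/id:$buildId" -Headers $headers -Credential $cred
foreach ($dep in $build.'snapshot-dependencies'.build) {
    # 2. List the artifacts produced by each dependency build.
    $artifacts = Invoke-RestMethod -Uri "$server/httpAuth/app/rest/builds/id:$($dep.id)/artifacts/children" -Headers $headers -Credential $cred
    foreach ($file in $artifacts.file) {
        # 3. Download each artifact into a folder named after the dependency's build configuration.
        $target = Join-Path (Join-Path "artifacts" $dep.buildTypeId) $file.name
        New-Item -ItemType Directory -Path (Split-Path $target) -Force | Out-Null
        Invoke-WebRequest -Uri "$server/httpAuth/app/rest/builds/id:$($dep.id)/artifacts/content/$($file.name)" -Credential $cred -OutFile $target
    }
}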
Another Idea:
Have you looked into something like Artifactory? It sounds like what you really need is a binary repository of sorts to track artifacts used, and artifacts created.
Or, for small projects, you could probably get away with just using a network file share: the build could "copy" to the share, organizing files into "build" directories of some sort, and developers could "read" from the share.