I run UI tests for my app in a release definition in Azure DevOps, and I generate a test report. I decided that it is convenient to save it in the build directory (wrong assumption?). The directory where the report is stored is:
    browserName + DateTime.Now.ToString("ddMMyyyyHHmmss", CultureInfo.InvariantCulture) + @"\";
so the directory name would match, for instance, the regex Chrome\d+.
I build the test project in the release pipeline, run the tests, and then try to publish my report. The goal is to make it available in Azure DevOps, send a link to download it, or make it accessible some other way.
To do so I added a Publish Build Artifact step, but then I get an error. The problem is that I don't have a file share available (I am able to create an Azure Storage Account, for instance); additionally, Publish Build Artifact doesn't support wildcards, so I can't use the regex Chrome\d+ to pin down the report directory.
Question: How can I make a file that is generated during the release pipeline accessible?
EDIT: I found out in the meantime that I have SharePoint available with enough storage.
Unfortunately, publishing artifacts from a release pipeline is not allowed:
Can we publish artifacts in release pipeline - Azure DevOps?
One way to get around this is to publish a Universal Package from the release pipeline, though it has limitations. Create a feed and publish your files there so you can share the URL with others. It is not the best option, but if your test result files are not large you can publish them to the feed and clean it up occasionally (manually, because the REST API provides a way to delete a package but no way to list all published packages).
https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/package/universal-packages?view=azure-devops
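For reference, a minimal sketch of the task in YAML (the classic release editor exposes the same inputs); the feed, package name, and report directory here are placeholder assumptions:

    steps:
    - task: UniversalPackages@0
      displayName: 'Publish test report to a feed'
      inputs:
        command: publish
        # Placeholder: the folder your UI test report was written to
        publishDirectory: '$(System.DefaultWorkingDirectory)/TestReport'
        feedsToUsePublish: internal
        vstsFeedPublish: 'MyProject/MyFeed'        # placeholder project/feed
        vstsFeedPackagePublish: 'ui-test-report'   # package names must be lowercase
        versionOption: minor
        packagePublishDescription: 'UI test report'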
The disadvantage of this option is that free users have a storage limit of 2 GB. You can delete old packages when required, but it takes around 24 hours for the space to actually be freed. If you forget to free space, your pipeline will fail with a not-enough-storage error, and for the next 24 hours you will have to disable this task to let the pipeline pass.
"I decided that it is convenient to save it in the build directory"
Remember that the agent's working directory is cleaned between runs, depending on the option you choose.
https://learn.microsoft.com/en-us/azure/devops/pipelines/repos/pipeline-options-for-git?view=azure-devops#clean-the-local-repo-on-the-agent
Related
I have created a C++ pipeline where the output of the build pipeline is published to the drop container. The structure is the following:
drop/v1.0.0/Release/MyService.dll
drop/v1.1.0/Release/MyService.dll
My engineers will need to view the drop folder and, according to the version that needs to be manually deployed to a client, they will download the DLL file.
As far as I understand there is no way to view them under Artifacts (what a shame). I went to the project settings under Storage, but I cannot view them there either. The only place I am able to find them is under the pipeline run, and then I have to work out which run of the pipeline produced a specific service version. This is a maze. We have dozens of C++ projects, and we have to keep track of which pipeline run of each project matches which service version.
Is there any way to access them in a folder structure?
You could use the Builds - List REST API to get all the builds for a pipeline, then use the Artifacts - List REST API to get all the artifacts for a build. It lists the download URL for each artifact, so you can download them together or choose the ones you want.
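A rough sketch of how that could look as a pipeline step with inline PowerShell, using the job's OAuth token; the organization, project, and definition ID are placeholders:

    steps:
    - powershell: |
        $headers = @{ Authorization = "Bearer $env:SYSTEM_ACCESSTOKEN" }
        $base = "https://dev.azure.com/{organization}/{project}/_apis/build"
        # Builds - List: all builds for one pipeline definition (id 42 is a placeholder)
        $builds = Invoke-RestMethod -Headers $headers -Uri "$base/builds?definitions=42&api-version=6.0"
        foreach ($build in $builds.value) {
          # Artifacts - List: every artifact of the build, including its download URL
          $artifacts = Invoke-RestMethod -Headers $headers -Uri "$base/builds/$($build.id)/artifacts?api-version=6.0"
          foreach ($artifact in $artifacts.value) {
            Write-Host "$($build.buildNumber) -> $($artifact.resource.downloadUrl)"
          }
        }
      env:
        SYSTEM_ACCESSTOKEN: $(System.AccessToken)

Your engineers could also run the inner script outside a pipeline, authenticating with a personal access token instead of $(System.AccessToken).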
Besides, you could use the publishLocation argument of the Publish Build Artifacts task to copy the artifacts to a file share (FilePath); the file share must be accessible from the agent running the pipeline. This way you can publish all your artifacts to whichever file share you want, for better management.
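In YAML that might look like the following (the share path is an assumption):

    steps:
    - task: PublishBuildArtifacts@1
      inputs:
        PathtoPublish: '$(Build.ArtifactStagingDirectory)'
        ArtifactName: 'drop'
        publishLocation: 'FilePath'
        # Assumption: a share the agent account can write to
        TargetPath: '\\fileshare\builds\$(Build.DefinitionName)\$(Build.BuildNumber)'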
In addition, you could also use the Universal Packages task to publish your artifacts to a feed for easier review.
I have an Azure DevOps build pipeline that has several steps, and the build is long. Every time something is wrong with the build, we review the logs and identify issues or come up with theories. In the case of a theory, we have to insert a diagnostic command line (such as listing a directory or showing the contents of a file) between the steps; in the case of a fix, we add the fix but have to wait for the whole pipeline to rerun to find out. This makes fixing build issues take a lot of time.
If we had access to the state of the agent of an unfinished build, and could just log on using RDP or any other terminal and check the contents and state of the files on disk, that would have saved us a lot of hours.
Is there any way with Azure DevOps to do any diagnostic of this type?
No, not if you are using a hosted agent. If you are using a self-hosted agent you can obviously log in to that one. You can, however, implement steps that only run if the build failed, and those steps can attempt to capture the information you are interested in (say, publish the state of the build directory).
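For instance, a sketch of such steps in YAML; condition: failed() makes them run only when an earlier step has failed:

    steps:
    - powershell: |
        # Dump the build directory tree so it can be inspected after the run
        Get-ChildItem -Recurse "$(Agent.BuildDirectory)" |
          Out-File "$(Build.ArtifactStagingDirectory)/build-dir-listing.txt"
      condition: failed()
      displayName: Capture build directory listing on failure
    - task: PublishBuildArtifacts@1
      condition: failed()
      inputs:
        PathtoPublish: '$(Build.ArtifactStagingDirectory)'
        ArtifactName: 'failure-diagnostics'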
If you are using Azure DevOps Services, there is a new REST API version out that will let you do a "preview" run of changes to the YAML definitions: https://learn.microsoft.com/en-us/azure/devops/release-notes/2020/sprint-165-update#preview-fully-parsed-yaml-document-without-committing-or-running-the-pipeline
Within Azure DevOps I have a build pipeline which builds and publishes artifacts, and a release pipeline which downloads those artifacts, defines some infrastructure configuration, and batch-uploads the artifacts to a web container.
After the configuration definition, I want to add a task that fetches the clientId of an AD-registered app, dumps it into a JSON file, and copies the file into the same folder as the build artifacts. The JSON has to be uploaded to the web container to provide runtime configuration for a SPA app.
What I have tried:
generate a JSON file in a release task and copy it into said folder
commit an empty JSON file in the code, have it published as a build artifact, and update its content in a release task
use the File Transform task, which only seems to allow updating a key/value, not generating a new one
The contents of the folder which gets uploaded seem to be locked.
Is that correct? What can I do to achieve my goal?
Releases don't publish artifacts. Releases consume published artifacts. A release can be run multiple times for the same build. A release can have multiple environments. What you want to do would fall apart immediately in any of those scenarios.
What you should do is write a custom Bash or PowerShell script (depending on your preferences and OS) that does exactly what you describe:
Generate an appropriate JSON file
Upload the JSON file to the "web container"
You haven't provided any details about what a "web container" is or what your deployment environment is (e.g. AWS, Azure, or containers running in Kubernetes), so that's the most thorough answer that can be provided.
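Purely as an illustration (the app display name, service connection, and storage account are assumptions, since the "web container" is unspecified), such a script could run from an Azure CLI step:

    steps:
    - task: AzureCLI@2
      inputs:
        azureSubscription: 'MyServiceConnection'   # assumption: an ARM service connection
        scriptType: ps
        scriptLocation: inlineScript
        inlineScript: |
          # Fetch the clientId of the AD-registered app (display name is a placeholder)
          $clientId = az ad app list --display-name "MySpaApp" --query "[0].appId" -o tsv
          @{ clientId = $clientId } | ConvertTo-Json | Set-Content config.json
          # Assumption: the "web container" is a blob container serving the SPA
          az storage blob upload --account-name mystorageacct --container-name '$web' `
            --name config.json --file config.json --auth-mode login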
I'm new to Azure DevOps. Trying to create build and release pipelines, there's one thing I don't understand:
Commonly, every kind of build finally results in some output, called artifacts.
With Azure DevOps it seems like there is always a final copy or publish task necessary to move the created artifacts from A to B, so that the release pipeline can access the compiled artifacts.
Why aren't these artifacts plainly accessible to a release pipeline right from the location where they have been built? Why don't the build tasks automatically set a variable pointing to the right folder, so the release pipeline may access the files right from there?
Or is this already happening and I'm just missing something from the tutorials I watched?
There are so many reasons.
Two easy ones:
There is no guarantee that the agent's working folder still contains the files. Agents are reused from build to build and release to release, and a given build or release will always use the same working folder. The working folder is cleaned up between builds.
Releases may run on different agents. On different machines. In different domains. Or any combination. There's no guarantee that the agent where the build ran is accessible by the agent where the release is running. Publishing the artifact allows a guarantee: As long as the machine the release is running on has the ability to talk to Azure DevOps (which is a requirement for the agent to function in the first place), it can get the artifacts it needs.
"Why aren't these artifacts plainly accessible to a release pipeline right from the location where they have been built?"
Agree with Daniel.
The main reason, for me, is that we can't hold on to a hosted agent all the time. Since MS wants to use its resources efficiently, an agent is not occupied for long.
When we queue a build, MS assigns us a brand-new clean agent to execute our task; after the build is complete, MS reclaims the agent assigned to our build and restores it to its initial state, ready to accept the next task.
So we cannot keep holding the hosted agent to use it in the next release pipeline. We have to store the artifacts in the cloud or on a server, then download them in the release pipeline; otherwise we could not get the artifacts we need from an agent that has already been restored.
Besides, agents are assigned randomly, and we cannot guarantee that the same agent will be allocated when the release pipeline runs.
That is the main reason why we need to copy or publish the artifacts.
If you do not want to copy or publish the artifacts, you could set up your own private agent and not clean it before executing the release pipeline.
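With a self-hosted pool, for example, a YAML build could keep its working directory between runs by disabling the clean option (the pool name is a placeholder):

    jobs:
    - job: Build
      pool: MySelfHostedPool      # assumption: your own agent pool
      steps:
      - checkout: self
        clean: false              # keep the working directory between runs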
Update:
"why is the user, well, bothered to find a place for the artifacts manually? I would have expected every build pipeline to come with a personal space to store the latest build artifacts. A space where Azure DevOps automatically copies the build artifacts to. To me it looks like things have to be manually copied from A to B and then later from B to C."
That is because not all output is needed. For example, for a test project what we need is the test results/code coverage, not the build output of the solution; in that case we do not need to copy the output to the artifacts. On the other hand, we may need to copy some special files to the artifacts, so automatically copying all build output would not meet your requirements.
That is also the reason why a task is provided to copy files to the artifacts: so that we can customize it to our own needs.
Of course, if you think that manual copying is superfluous, you can use the MSBuild argument /p:OutDir=$(Build.ArtifactStagingDirectory) to send the build output directly to the artifact staging directory.
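A minimal sketch of that in a YAML build (the solution glob is a placeholder):

    steps:
    - task: VSBuild@1
      inputs:
        solution: '**/*.sln'
        # Send build output straight to the artifact staging directory
        msbuildArgs: '/p:OutDir=$(Build.ArtifactStagingDirectory)'
    - task: PublishBuildArtifacts@1
      inputs:
        PathtoPublish: '$(Build.ArtifactStagingDirectory)'
        ArtifactName: 'drop'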
"If I need to copy things from A to B in the Build pipeline, then what should keep me from copying it to C right away? Then a separate Release pipeline would be, well, rather optional, if not redundant."
If you are in the build pipeline, there is another task, Download Build Artifacts, which can download the build artifacts.
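A sketch of it (the pipeline and artifact names are placeholders):

    steps:
    - task: DownloadBuildArtifacts@0
      inputs:
        buildType: 'specific'               # or 'current' for the running pipeline
        project: '$(System.TeamProject)'
        pipeline: 'MyBuildPipeline'         # placeholder: build definition to pull from
        buildVersionToDownload: 'latest'
        artifactName: 'drop'
        downloadPath: '$(System.ArtifactsDirectory)'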
If you are in the release pipeline, you just need to select the build artifacts as the source; the release pipeline will download that artifact automatically.
Check this document for some more details.
Hope this helps.
I'm new to Azure DevOps and I'm trying to understand how to package a release of a PowerShell script project I'm working on.
I'm already familiar with GitHub and the manual process for drafting a new release of my project repo. I'm now experimenting with Azure DevOps, and what I want to achieve is similar output to GitHub, where my repo of PowerShell scripts is packaged into a zip file which I can publish as a release.
I'm not familiar with the pipeline process in Azure DevOps or YAML, as a newbie to proper release-cycle tools. Previously I've just created scripts and shared them as they are, or dropped them into a GitHub repo and manually packaged a release. I'm not likely to be turning out large numbers of builds, so I've never had to come at this from an automated standpoint, which seems to be the way Azure is driving me, unless I'm missing something?
It's pretty simple. I prefer to do this using the old-fashioned GUI (hint: there is a link when starting a new Build Pipeline that says Use the classic editor), and then convert to YAML after I get my Build Pipeline working.
1) Create your standard Build Pipeline.
2) Add the step to zip your files.
3) Add properties to that Archive step: specify the source to zip and the target where you want the zip file to end up.
4) Lastly, convert that single step to YAML by clicking the View YAML link in the upper-right corner.
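The generated YAML for the archive-and-publish steps looks roughly like this sketch (the paths are placeholders):

    steps:
    - task: ArchiveFiles@2
      inputs:
        rootFolderOrFile: '$(Build.SourcesDirectory)'
        includeRootFolder: false
        archiveType: 'zip'
        archiveFile: '$(Build.ArtifactStagingDirectory)/scripts-$(Build.BuildId).zip'
    - task: PublishBuildArtifacts@1
      inputs:
        PathtoPublish: '$(Build.ArtifactStagingDirectory)'
        ArtifactName: 'drop'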
There are a lot of steps I am leaving out, but I hope this points you in the right direction.