GitHub Actions: Azure DevOps "Publish Pipeline Artifact" equivalent?

It looks like Microsoft is moving away from Azure DevOps and leaning more heavily on GitHub Actions as its primary automation platform (speculation, not sure if it's true), so I am trying to move all of my automation off of DevOps onto GitHub Actions. In doing so I noticed some gaps in feature parity.
In this specific case, I am wondering if there is an equivalent to Azure DevOps "Publish Pipeline Artifacts" task in GitHub Actions?
The closest thing I can find in GitHub Actions is actions/upload-artifact@v2, but this more closely resembles Azure DevOps' "Publish Build Artifacts" task. I get the use case and understand what I could use it for, but I want to see if I can upload the entire pipeline/workflow output as one package, rather than file by file.
In Azure DevOps, my pipeline runs in 5-7 minutes because I can use the "Publish Pipeline Artifacts" task, but in GitHub Actions I only have the actions/upload-artifact@v2 action, and the same automation now takes up to 3 hours (an insane difference!). I think the added time is because the upload/publish step in GitHub Actions goes file by file, whereas in Azure DevOps the upload/publish task somehow bundles everything and finishes in about a minute.
Any/All help is greatly appreciated! My Google Fu is not coming up with anything atm.

It is slow because GZip is used internally to compress individual files before starting an upload. So not only is each file sent individually, but each file is also compressed individually. Your best workaround at the moment is to compress the whole directory first, as riQQ already wrote.
It can be done like this:

- name: 'Tar files'
  run: tar -cvf my_files.tar /path/to/my/directory
- name: 'Upload Artifact'
  uses: actions/upload-artifact@v2
  with:
    name: my-artifact
    path: my_files.tar
A big drawback is that you now need to unpack the artifact each time you download it.
For more details, see this topic: Upload artifact dir is very slow
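For completeness, a minimal sketch of the matching download-and-unpack steps (assuming the artifact and tar file names used above; actions/download-artifact@v2 is the download counterpart):

- name: 'Download Artifact'
  uses: actions/download-artifact@v2
  with:
    name: my-artifact
- name: 'Untar files'
  run: tar -xvf my_files.tar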

Related

Accessing Published Artifacts from yaml pipeline tasks

I have a PowerShell task that processes published artifacts from another stage.
I have noticed these are put into a folder "s". Random sampling shows it's "s" every time, but I doubt that will always be the case!
Wondering what's the best way to refer to these files safely?
Here's my publish artifacts task, the published artifacts, and the PowerShell task that consumes these artifacts (screenshots omitted).
The variable that you use does not work with the s folder. As described in the documentation, Build.ArtifactStagingDirectory is the local path on the agent where artifacts are copied to before being pushed to their destination, for example c:\agent_work\1\a. You will find those files under the a folder. The folder referred to with the letter s is where the sources are downloaded: Build.SourcesDirectory.
Documentation on the Azure DevOps predefined variables and where the folders are located:
https://learn.microsoft.com/en-us/azure/devops/pipelines/build/variables?view=azure-devops&tabs=yaml
https://blog.geralexgr.com/devops/how-azure-devops-pipelines-agent-works
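As a quick diagnostic, an inline script step can print where these predefined paths point on the agent. A minimal sketch (the PowerShell@2 task is standard; predefined variables are exposed to scripts as environment variables):

- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      # Predefined variables become env vars (dots replaced by underscores)
      Write-Host "Sources ('s' folder): $env:BUILD_SOURCESDIRECTORY"
      Write-Host "Staging ('a' folder): $env:BUILD_ARTIFACTSTAGINGDIRECTORY"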
The predefined variable you are looking for is $(Pipeline.Workspace). If you look at the documentation for the Download Pipeline Artifact task, you can see the default path. Note that if you are downloading more than one artifact, they will be within subfolders.
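For example, a minimal sketch of the download step plus a PowerShell task that consumes the files (the artifact name 'drop' is a placeholder):

- task: DownloadPipelineArtifact@2
  inputs:
    artifact: 'drop'   # lands in $(Pipeline.Workspace)/drop by default
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      # Process the downloaded files from the default download location
      Get-ChildItem "$(Pipeline.Workspace)/drop" -Recurse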
Edit - Having just looked again at the pipeline: are you publishing and downloading the artifact, or are you doing all of these tasks in a single pipeline?
The best way IMO to set this up would be two pipelines: a CI pipeline to build and publish the artifact, then a CD pipeline to download and use the artifact.

How to make a file that is generated during release pipeline accessible?

I run UI tests for my app in a release definition in Azure DevOps and generate a test report. I decided that it is convenient to save it in the build directory (wrong assumption?). The directory where the report is written is:
browserName + DateTime.Now.ToString("ddMMyyyyHHmmss", CultureInfo.InvariantCulture) + @"\";
so the directory name would match, for instance, the regex Chrome\d+.
I build the test project in the release pipeline, run the tests, and then try to publish my report. The goal is to make it available in Azure DevOps, send a link to download it, or make it accessible in any other way.
To do so I added a Publish Build Artifact step, but then I get an error (screenshot omitted). I don't have a file share available either (I am able to create an Azure Storage Account, for instance), and additionally Publish Build Artifact doesn't support wildcards, so I can't use the regex Chrome\d+ to pin down the report directory.
Question: How can I make a file that is generated during release pipeline accessible?
EDIT: I found out in the meantime that I have SharePoint available with enough storage.
Unfortunately, publishing artifacts from a release pipeline is not allowed. See: Can we publish artifacts in release pipeline - Azure DevOps?
One way to get around this is to publish a Universal Package in the release pipeline, though it has limitations. Create a feed and publish your files there so you can share the URL with others. It is not the best option, but if your test result files are not large you can publish them to the feed and clean it up occasionally (manually, because the REST API provides a way to delete a package but does not provide a way to list all published packages).
https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/package/universal-packages?view=azure-devops
The disadvantage of this option is that for free users the feed has a limit of 2 GB. You can delete old packages when required, but it takes around 24 hours to free the space. If you forget to free space, your pipeline will fail with a not-enough-storage error, and for the next 24 hours you will have to disable this task to let the pipeline pass.
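A minimal publish step might look like this (the feed name, package name, and report directory are placeholders for illustration):

- task: UniversalPackages@0
  inputs:
    command: 'publish'
    publishDirectory: '$(System.DefaultWorkingDirectory)/TestReports'  # folder containing the report
    feedsToUsePublish: 'internal'
    vstsFeedPublish: 'my-feed'
    vstsFeedPackagePublish: 'ui-test-reports'
    versionOption: 'patch'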
"I decided that it is convenient to save it in the build directory"
Remember that agent working directory is cleaned depending on option you choose.
https://learn.microsoft.com/en-us/azure/devops/pipelines/repos/pipeline-options-for-git?view=azure-devops#clean-the-local-repo-on-the-agent
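In a YAML pipeline, the equivalent knob is the job-level workspace setting; a sketch (the job name is a placeholder):

jobs:
- job: ui_tests
  workspace:
    clean: all   # options: outputs | resources | all
  steps:
  - script: echo "Workspace is cleaned before this job runs"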

How can I configure an Azure DevOps Release Pipeline to package PowerShell scripts?

I'm new to Azure DevOps and I'm trying to understand how to package a release of a PowerShell script project I'm working on.
I'm already familiar with GitHub and the manual process for drafting a new release of my project repo. I'm now experimenting with Azure DevOps, and what I want to achieve is a similar output to GitHub, where my repo of PowerShell scripts is packaged into a zip file that I can publish as a release.
I'm not familiar with the pipeline process in Azure DevOps or with YAML, as a newbie to proper release-cycle tools. Previously I've just created scripts and shared them as they are, or dropped them into a GitHub repo and manually packaged a release. I'm not likely to be turning out large numbers of builds, so I've never had to come at this from an automated standpoint, which seems to be the way Azure is driving me, unless I'm missing something?
It's pretty simple. I prefer to do this using the old-fashioned GUI (hint: there is a link when starting a new Build Pipeline that says Use the classic editor), and then convert to YAML after I get my Build Pipeline working.
1) Create your standard Build Pipeline.
2) Add the step to ZIP your files.
3) Add properties to that Archive step: specify the source to zip and the target where you want the zip file to end up.
4) Lastly, convert that single step to a YAML step by clicking the View YAML link in the upper-right corner.
There are a lot of steps I am leaving out, but I hope this leads you into the right direction.
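For reference, the converted YAML might look roughly like this (the paths and artifact name are illustrative assumptions, not output from the classic editor):

steps:
- task: ArchiveFiles@2
  inputs:
    rootFolderOrFile: '$(Build.SourcesDirectory)'   # the repo of PowerShell scripts
    includeRootFolder: false
    archiveType: 'zip'
    archiveFile: '$(Build.ArtifactStagingDirectory)/scripts.zip'
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)/scripts.zip'
    ArtifactName: 'drop'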

What is the difference between Build Artifact and Pipeline Artifact tasks?

In Azure DevOps, we have Download/Publish Build Artifact tasks and Download/Publish Pipeline Artifact tasks.
What is the difference between the build artifact tasks and the pipeline artifact tasks and when would we choose one over the other?
There is an issue about this in the Azure DevOps GitHub repo, and Microsoft answered:
Hey everyone - I'm from the Azure Artifacts team and we built the Pipeline Artifacts feature that you've discovered in Azure Pipelines. I'll address the most important comment around documentation - we've got a whole new page coming out around Artifacts in Azure Pipelines which lists out each artifact type that we support and what they are for, along with links to specific documentation. We think that should answer most of your questions.
Because that is still being edited before we publish it, I thought I would give you the 30,000-foot view on the difference between Pipeline Artifacts and Build Artifacts, and also mention how Pipeline Artifacts relate to Universal Packages.
Build Artifacts (published via the Publish Build Artifacts task) have been in Azure DevOps for a long time and are the built-in artifact storage mechanism for Azure Pipelines. Most builds that store non-package artifacts today would likely use this task. The task can push the content up to the server/cloud but can also copy the files to a local file share.
Pipeline Artifacts (published using the Publish Pipeline Artifact task) are intended as the replacement for Build Artifacts. They are in preview right now and there are a few more things we need to do to reach parity. The benefit of Pipeline Artifacts is that they can dramatically reduce the time it takes to upload and download large artifacts. We do this by first checking whether the content that is being uploaded exists in the service. We do this not just at the per-file level but also at the sub-file level (in up to 128K chunks). It can lead to really dramatic performance improvements.
Universal Packages (also in preview) use the same storage/transfer technology as Pipeline Artifacts. You would use Universal Packages when you want to create an artifact with a lifetime independent of the pipeline that created it. You can download Pipeline Artifacts after a pipeline has completed via the artifacts UX, but if you want something that really exists independent of the pipeline, you would go for Universal Packages. There should be no performance difference between the two.
Hopefully this helps. Feel free to ask any more questions and I'll follow up with answers.
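In YAML terms, the two publish tasks look like this (the paths and artifact names are illustrative, not from the quoted answer):

# Build Artifacts: the long-standing task; can also copy to a file share
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'drop'

# Pipeline Artifacts: the newer task with chunk-level deduplicated uploads
- task: PublishPipelineArtifact@1
  inputs:
    targetPath: '$(Build.ArtifactStagingDirectory)'
    artifact: 'drop'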

Azure DevOps - Build Automation

I have an Azure DevOps Git repo with many solutions in it, and I am starting down the path of build, test, and deploy automation.
I figured out how to run a rebuild if any file changes in the repo.
However, since the repo has many solutions in it, I only want to run a given rebuild of a solution if a specific subfolder changes.
Is that possible, and if so, how do I accomplish this?
You can use path-based trigger filters (I'm fairly certain they are only supported in YAML builds). Example:
trigger:
  paths:
    include:
    - folder1/*
    - folder2/somefile
    # etc.
Reading:
https://learn.microsoft.com/en-us/azure/devops/pipelines/yaml-schema?view=azure-devops&tabs=schema
https://learn.microsoft.com/en-us/azure/devops/pipelines/create-first-pipeline?view=azure-devops&tabs=tfs-2018-2