Use DevOps Release Pipeline to Upload Artifacts to a Linux Server

Is there a simple way to upload build artifacts in a zip file to a Linux Server using SCP or some other protocol?
Right now the build sits in a specific directory on the build server, ready to be uploaded, but the docs I've been reading haven't made it clear how to upload it to Linux using a DevOps Release Pipeline. The documentation and questions/answers I've found appear to cover Windows-to-Windows scenarios rather than Windows-to-Linux.
Thanks!

Is there a simple way to upload build artifacts in a zip file to a Linux Server using SCP or some other protocol?
You could use the Copy Files Over SSH task to copy the artifacts to a Linux server.
Use this task in a build or release pipeline to copy files from a
source folder to a target folder on a remote machine over SSH.
This task allows you to connect to a remote machine using SSH and copy
files matching a set of minimatch patterns from specified source
folder to target folder on the remote machine. Supported protocols for
file transfer are SFTP and SCP via SFTP. In addition to Linux, macOS
is partially supported.
For more details, see the developer community thread Copy Files Over SSH during Continuous Integration and Deployment.
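For example, a minimal YAML sketch of the task (the service connection name my-linux-server and the target folder are placeholders; you would first create an SSH service connection under Project Settings > Service connections):

steps:
- task: CopyFilesOverSSH@0
  inputs:
    sshEndpoint: my-linux-server                    # hypothetical SSH service connection
    sourceFolder: $(Build.ArtifactStagingDirectory) # where the zipped build sits
    contents: '**/*.zip'
    targetFolder: /home/deploy/releases             # hypothetical target path on the Linux box
    overwrite: true

In a classic release pipeline, the same task is added from the task catalog with the same inputs.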

Related

Post Build event that works on both Azure DevOps and Local PC

I have a VS2017 solution that Builds both locally and also on Azure DevOps.
I now need to run a post-build script that runs an EXE. I have this working on my local machine, but I expect there will be an issue with the path to the EXE, which has been added to the DevOps library.
Note: the EXE is installed on DevOps and runs fine from a Command Line task - I just need it to run as a post-build step on one of the projects so that the project is ready to be packaged by the installer setup project (during a full solution build).
This represents the Local Post Build script - How do I handle this on Azure, where the path will be different?
"C:\Program Files (x86)\{dir}\{app}.exe" -file "$(ProjectDir){file.txt}"
Any help appreciated. Thanks!
This represents the Local Post Build script - How do I handle this on
Azure, where the path will be different?
$(ProjectDir) is an MSBuild property, so it works on both Azure DevOps and the local PC. You only need to pay attention to the {dir} part of the path to the EXE.
My suggestion is to put the EXE in the solution folder (where the .sln file lives); then you can use a command like "$(SolutionDir)\{app}.exe" -file "$(ProjectDir){file.txt}". Both $(SolutionDir) and $(ProjectDir) are recognized by MSBuild, so it works on the local PC and on Azure DevOps alike.
Or you can put the EXE under the root directory of your git repo and use $(System.DefaultWorkingDirectory) as its path, but that only works on Azure DevOps, not on the local PC. (Not recommended.)
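For example, the solution-folder suggestion expressed as a post-build event in the project's .csproj (a sketch; app.exe and file.txt stand in for your real names):

<PropertyGroup>
  <!-- $(SolutionDir) already ends with a trailing backslash -->
  <PostBuildEvent>"$(SolutionDir)app.exe" -file "$(ProjectDir)file.txt"</PostBuildEvent>
</PropertyGroup>

This is equivalent to pasting the command into the project's Build Events page in Visual Studio.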

Azure DevOps - During the build pipeline run, what is the path where the Agent downloads the files locally?

We are using a Microsoft-hosted agent to run a build pipeline for an automated test scenario for our application. We would like our automation to download a file (from a headless Chrome browser), then navigate to the path where it was downloaded and open it.
How could I find the path where the files are being downloaded inside the Agent?
Here is a quick compilation of the predefined variables related to paths for builds on Linux and Windows self-hosted agents, taken from the official docs.
The ones you are looking for are Agent.BuildDirectory or Pipeline.Workspace.
List of predefined variables:
| Variable type | Variable | Description | Example |
|---|---|---|---|
| Agent | Agent.BuildDirectory | The local path on the agent where all folders for a given build pipeline are created | D:\..\agent\_work\1 |
| Agent | Agent.HomeDirectory | The directory the agent is installed into | C:\agent |
| Agent | Agent.TempDirectory | A temporary folder that is cleaned after each pipeline job | D:\..\agent\_work\_temp |
| Agent | Agent.ToolsDirectory | The directory used by tasks such as Node Tool Installer and Use Python Version to switch between multiple versions of a tool | D:\..\agent\_work\_tool |
| Agent | Agent.WorkFolder | The working directory for the agent | c:\agent_work |
| Build | Build.SourcesDirectory | The local path on the agent where your source code files are downloaded | c:\agent_work\1\s |
| Build | Build.ArtifactStagingDirectory | The local path on the agent where artifacts are copied to before being pushed to their destination. A typical way to use this folder is to publish your build artifacts with the Copy Files and Publish Build Artifacts tasks | c:\agent_work\1\a |
| Build | Build.StagingDirectory | The local path on the agent where artifacts are copied to before being pushed to their destination | c:\agent_work\1\a |
| Build | Build.BinariesDirectory | The local path on the agent you can use as an output folder for compiled binaries | c:\agent_work\1\b |
| Build | Build.Repository.LocalPath | The local path on the agent where your source code files are downloaded | c:\agent_work\1\s |
| Build | Common.TestResultsDirectory | The local path on the agent where test results are created | c:\agent_work\1\TestResults |
| Pipeline | Pipeline.Workspace | The workspace directory for a particular pipeline | /home/vsts/work/1 |
| System | System.DefaultWorkingDirectory | The local path on the agent where your source code files are downloaded | c:\agent_work\1\s |
When you install the agent, you specify the work directory. In pipeline tasks, you can find out exactly where within that directory files are staged with variables like $(Agent.BuildDirectory). This might not be the exact location you need, but I think it is in the right direction.
For a complete list of predefined variables, see here: https://learn.microsoft.com/en-us/azure/devops/pipelines/build/variables?view=azure-devops&tabs=yaml
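To see the concrete values for the agent a job actually runs on, here is a minimal sketch of a step that echoes them (YAML; the displayName is arbitrary):

steps:
- bash: |
    # Print the resolved path variables for this run
    echo "Agent.BuildDirectory:           $(Agent.BuildDirectory)"
    echo "Pipeline.Workspace:             $(Pipeline.Workspace)"
    echo "System.DefaultWorkingDirectory: $(System.DefaultWorkingDirectory)"
  displayName: Print agent path variables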
Chrome's documentation on download locations says the default on Linux is /home/<username>/Downloads.
I tested this on a Microsoft-hosted windows-2019 agent with test C# code (hint from Daniel!) like:
using System;

class ChromeDownloadTest
{
    static void Main(string[] args)
    {
        // Ask Chrome to download a known NuGet package so we can find out
        // where the hosted agent puts downloaded files.
        string FILEURI = "https://www.nuget.org/api/v2/package/Cauldron.Newton/2.0.0";
        System.Diagnostics.Process prozess = new System.Diagnostics.Process();
        prozess.StartInfo.FileName = @"C:\Program Files (x86)\Google\Chrome\Application\chrome.exe"; // verbatim string: '@', not '#'
        prozess.StartInfo.Arguments = "--download " + FILEURI;
        prozess.Start();
        Console.WriteLine("Test starts.");
    }
}
Then I used a command like dir c:\users\VssAdministrator\cauldron.newton.2.0.0.nupkg /s /b to find the location of the downloaded file, cauldron.newton.2.0.0.nupkg.
This confirmed that Chrome's default download location is still C:/Users/{user}/Downloads, the same as with a self-hosted agent or a local download. (VssAdministrator is the user when running on a Windows hosted agent.)
So I think a Linux hosted agent should behave similarly: try looking for your file in the /home/<username>/Downloads folder. Hope it helps.

Browse file system in Azure Devops

Is it possible to browse the file system in Azure DevOps, like when using SSH to connect to a server, or with an Explorer-style view?
It would really simplify things if I could see which files were created and where they ended up after builds.
Right now I have no way of knowing which files ended up where once the builds are done.
Thanks!
I don't think so. You could add build steps (Build and release tasks - Utility) with a cmd or batch script to browse the file system of the build server, as sketched below.
Alternatively, you can use your own build server (a self-hosted agent) on an Azure VM, which gives you full control.
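For example, a minimal utility step that dumps the build directory's contents into the build log (a sketch; use dir /s /b instead of find on a Windows agent):

steps:
- script: |
    # List every file under the build directory so you can see what ended up where
    find "$(Agent.BuildDirectory)" -type f
  displayName: List build directory contents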

How to move files located in an on-premises Windows-based file share using Scala?

Background: I am using Azure ADFv2 to move data from a file share to ADLS; after a file is moved successfully I want to archive it within the file share location.
How do I connect to an on-premises Windows-based file share and move files from one folder to another within the share using Scala? I am not sure how to establish the connectivity to a file share.
You can use a file system linked service to establish connectivity to a file share (a sample definition is sketched after these steps):
1. Create a self-hosted integration runtime in ADF and install it on your on-premises machine.
2. Create a file system linked service, and for the "Connect via integration runtime" field, choose the self-hosted IR you created in step 1.
3. Configure your linked service and dataset as the doc instructs: https://learn.microsoft.com/en-us/azure/data-factory/connector-file-system
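For reference, a file system linked service definition looks roughly like this (a sketch; the host, user, and integration runtime names are placeholders):

{
    "name": "OnPremFileShareLinkedService",
    "properties": {
        "type": "FileServer",
        "typeProperties": {
            "host": "\\\\myserver\\myshare",
            "userId": "mydomain\\myuser",
            "password": {
                "type": "SecureString",
                "value": "<password>"
            }
        },
        "connectVia": {
            "referenceName": "MySelfHostedIR",
            "type": "IntegrationRuntimeReference"
        }
    }
}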

Download file from VSTS with lightweight client (PowerShell)

I write PowerShell scripts and save them to my VSTS account.
But when I write customer-specific PowerShell scripts, I author them in VS on my computer, then download the files to the customer server via the web.
Is there a lightweight client or PowerShell script to download the latest version of a given file or folder from a private VSTS repository?
That way I don't have to install anything on the customer server, and I can easily update the local file with a saved script.
It's a TFVC repository, not a Git repository, if that makes a difference.
Neno Loje has built a small tool that can download one or more files directly from TFVC. It needs a couple of Client Object Model assemblies, which can be taken from the TFS Client Object Model NuGet packages.
You can find it here:
https://blogs.msmvps.com/vstsblog/2011/03/14/download-files-from-tfs-version-control-and-set-the-file-last-access-timestamp-to-the-file-s-last-check-in-time/
The TFS 2015/2017 version will work against VSTS as well as pretty much any TFS server out there regardless of the version.
Alternatively, you could use the TFS Cross-Platform Command Line, as long as Java is available on the target server.
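If you would rather not install anything at all, a plain PowerShell call against the TFVC REST API can also fetch a single file. A minimal sketch, assuming a personal access token and placeholder account/path values:

# Download the latest version of one file from TFVC via the REST API.
# The account name, item path, and PAT below are placeholders.
$pat     = "<personal-access-token>"
$account = "youraccount"
$path    = '$/YourProject/Scripts/YourScript.ps1'
$headers = @{ Authorization = "Basic " + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat")) }
$uri     = "https://$account.visualstudio.com/_apis/tfvc/items?path=$([uri]::EscapeDataString($path))&download=true&api-version=4.1"
Invoke-WebRequest -Uri $uri -Headers $headers -OutFile "YourScript.ps1"

Saved on the customer server, this small script pulls the latest checked-in version on demand, with nothing to install.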