How to control work folder for Azure DevOps pipeline? - azure-devops

Normally, if you are using a self-hosted Azure agent, it automatically selects target folders for you (I assume based on pipeline name) underneath the work folder, so for example:
/home/xxxxx/azure/_work/2/t/
Is there a way that I can control this choice from a YAML pipeline? My use case is that I have a very large repo, and several different pipelines that all check it out. I only have one agent running on the machine, so my preference is that all of the pipelines I run on this agent check out into the same folder (like /home/xxxxx/azure/_work/MyProject), ensuring that there will only be one checkout of this large repo.

You could avoid syncing sources at all in the pipeline by setting checkout: none:
steps:
- checkout: none
Then add a command line or script task to clone the repo manually via git commands.
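The two steps above can be sketched as a single job that keeps one shared clone up to date (the shared path, organization, and repository URL below are placeholder assumptions, not values from the question):

```yaml
steps:
- checkout: none  # skip the agent's per-pipeline checkout entirely
- script: |
    SHARED=/home/xxxxx/azure/_work/MyProject   # assumed shared location
    if [ -d "$SHARED/.git" ]; then
      # reuse the existing clone instead of checking out again
      git -C "$SHARED" fetch --all --prune
      git -C "$SHARED" checkout "$(Build.SourceBranchName)"
      git -C "$SHARED" pull
    else
      git clone "https://$(System.AccessToken)@dev.azure.com/org/project/_git/MyRepo" "$SHARED"
    fi
  displayName: Clone or update shared checkout
```

Any pipeline on this agent that runs the same step will then reuse the single clone under the shared folder.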

We've recently been testing linking the source folder and it has been working pretty well, so it is possibly an easier answer than the accepted one.
That is, you can create a specific folder (say ~/projects/acme or C:\projects\acme), and then, in your Azure pipeline steps before checkout, delete the default s folder and link it to the target project folder.
# In PowerShell (Windows)
New-Item -ItemType Junction -Path s -Target C:\projects\acme
# In Bash (Linux/MacOS)
ln -s ~/projects/acme s
The advantage to this approach is that you don't have to attempt to override the working folder for dozens of builtin tasks scattered throughout your pipelines and template files; it's a bit of setup at the top of the pipeline and then the rest of the tasks can operate as normal.
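As a runnable sketch of the linking idea on Linux (the folder names and the AGENT_BUILDDIRECTORY/SHARED_CHECKOUT variables are illustrative assumptions; on a real agent the build directory is provided by the agent itself):

```shell
#!/bin/sh
set -e
# Where the agent keeps this pipeline's folders; falls back to a demo path here.
BUILD_DIR="${AGENT_BUILDDIRECTORY:-/tmp/demo-agent/1}"
# The single shared checkout that every pipeline should use.
SHARED="${SHARED_CHECKOUT:-$HOME/projects/acme}"
mkdir -p "$BUILD_DIR" "$SHARED"
# Remove the default 's' source folder and replace it with a link.
rm -rf "$BUILD_DIR/s"
ln -s "$SHARED" "$BUILD_DIR/s"
ls -ld "$BUILD_DIR/s"
```

After this step, checkout and every later task that resolves paths under s transparently reads and writes the shared folder.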

Related

How can I determine current directory of executing YAML pipeline in azure pipelines?

We have a large Azure DevOps monorepo, containing many applications and YAML pipelines.
We try to maximise autonomy of our solutions, so we define pipelines in a \Pipelines\ subdirectory within the solution root directory. The solutions sit at various depths from the repo root.
Every time we need to reference a source file from YAML to pass it to a built-in task such as MSBuild, DotNetCLI or NuGet, we refer to it relative to $(Build.SourcesDirectory), which appears to indicate the repo root. For example, to pass my-solution.sln to a task, we refer to it as $(Build.SourcesDirectory)\path\to\my-solution.sln. This works, but it makes the relation between the pipeline and the solution less atomic, because it requires explicitly defining the full path from the repo root to the solution. If a solution ever moves as a whole unit (which happens), this breaks unless we update the YAML file. This is in contrast to most other entities in our ecosystem (.NET, in our case), which refer to other entities (e.g. solution --> project; project --> referenced project) by paths relative to the source location.
My question:
Is there any pipeline variable (or any other accessible variable e.g. environment variable) which captures the location of the currently executing YAML pipeline? Or is that information lost when Azure compiles the pipelines for execution?
If there is no such variable, is there any other (simple) way to retrieve said location? I understand that I can query the Azure DevOps API, but this seems like it would add more code and maintenance than it would save in the long run.
I've looked here, but if any of those is what I want, then I must have missed it.
Every time we need to reference a source file from YAML to pass it to a built-in task such as MSBuild, DotNetCLI or NuGet, we refer to it as relative to $(Build.SourcesDirectory), which seems to indicate the repo root. For example, to pass my-solution.sln to a task, we refer to it as $(Build.SourcesDirectory)\path\to\my-solution.sln.
You can still use relative paths; you don't have to use the full path. Depending on the folder/file structure, use ../ for the parent folder and ./ for the current folder. For example:
- task: DotNetCoreCLI@2
  displayName: 'dotnet build'
  inputs:
    projects: ../Bank/Bank.csproj # relative path
    arguments: '--configuration $(BuildConfiguration)'
In a DevOps pipeline, your source code is checked out to the agent machine, by default into $(Build.SourcesDirectory), which points to, for example, c:\agent\_work\1\s. The pipeline tasks execute against the files (source code) under this folder, unless you define workingDirectory for individual tasks.
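For instance (a sketch; the path and solution name are illustrative), a script step can be pointed at the solution's own folder rather than the repo root:

```yaml
steps:
- script: dotnet build my-solution.sln
  workingDirectory: $(Build.SourcesDirectory)/path/to   # run from the solution folder
  displayName: Build from the solution directory
```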
In addition, DevOps supports "File matching patterns reference", so you can specify patterns like **/*.proj to match files.
Hope it helps.
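A minimal sketch of the matching-pattern approach (the project layout is assumed):

```yaml
- task: DotNetCoreCLI@2
  displayName: 'dotnet build (all projects)'
  inputs:
    command: build
    projects: '**/*.csproj'   # matches every project under the sources folder
```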

Is there a way to name the Azure agent working folders

When you set up an Azure DevOps agent on a build machine, it has a working folder (by default _work) where it creates subfolders for each pipeline it has to run.
These folders have integer names like "80" or "29", which makes it hard to troubleshoot issues on a given build machine when you have many pipelines, as you don't know which folder each pipeline relates to.
Is there a way to figure out the mapping from pipeline to folder number, or to name these folders more explicitly?
Renaming the folders is currently not supported in Azure DevOps.
Each pipeline maps to a folder under the agent's _work directory.
1. You could check the pipeline log to figure out which folder is your pipeline's working folder (enable system diagnostics).
2. You could also add a command line task in your pipeline to echo this directory:
echo $(System.DefaultWorkingDirectory)
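A small diagnostic step like the following (all three variables are standard predefined pipeline variables) prints the numbered folders, so you can record the pipeline-to-folder mapping once per pipeline:

```yaml
steps:
- script: |
    echo "Agent.BuildDirectory:           $(Agent.BuildDirectory)"
    echo "Build.SourcesDirectory:         $(Build.SourcesDirectory)"
    echo "System.DefaultWorkingDirectory: $(System.DefaultWorkingDirectory)"
  displayName: Print agent working folders
```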

How to exclude files present in the mapped directory when Publishing the artifacts in Azure CI/CD?

I am new to Azure CI/CD pipelines and I am trying to export CRM solutions using a build pipeline in Azure DevOps with the Power Platform task. There is a requirement to keep the exported solution from the build pipeline in Azure Repos (which I am doing from the command line using tf vc).
I am able to export the solution successfully, but when I publish the artifacts it publishes every file present in the mapped folder (I mapped a directory in Azure Repos where all the solution backups are kept).
I see that the Azure agent copies all the files present in the mapped directory and stores them in the agent directory. The problem is that the mapped directory contains all the backup files of the CRM solutions. I found some articles suggesting cloaking the directory so that the files are not included on the agent, but if I cloak the directory then I am not able to check in the exported solution from the command line.
So I was wondering if there is any way to exclude the files present in the mapped directory while still being able to check in the exported file to that directory from the command line.
You can use a .artifactignore file to filter out paths of files that you don't wish to be published as part of the process.
Documentation can be found here
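As a sketch (the patterns and file names below are illustrative, not from the question), .artifactignore uses .gitignore-style syntax and sits alongside the files being published:

```
# .artifactignore - exclude the solution backups from the published artifact
**/*.bak
backups/**
# keep the freshly exported solution (assumed file name)
!ExportedSolution.zip
```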

How to manually publish a file to Azure Artifacts?

I have a file which I have created manually on my local computer. I need it in several Azure DevOps pipelines.
I want to use it as an "Artifact".
I know how to publish artifacts from within an Azure DevOps Pipeline, but this specific file I just want to upload from my computer. How can I do it?
How to manually publish a file to Azure Artifacts?
As we know, the essence of an Artifact is the storage of a shared file. We can roughly see this from the Publish build artifacts task, whose default Artifact publish location is Azure Pipelines, a shared place set up on Azure.
Update:
Thanks for sharing:
We can upload from the local machine to the shared place with the az command line, like:
az artifacts universal publish --organization https://dev.azure.com/example/ --feed my_feed --name my-artifact-name --version 0.0.1 --description "Test Description" --path
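The counterpart command can then pull the file down inside any pipeline (same placeholder organization, feed, and package name as above):

```
az artifacts universal download --organization https://dev.azure.com/example/ --feed my_feed --name my-artifact-name --version 0.0.1 --path .
```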
Now let us return to the first sentence we started with, "the essence of an Artifact is the storage of a shared file": we could create a shared place/folder to save the file, which can then be treated as an "Artifact". We just need to make sure other pipelines can access this shared place/folder.
For example, we could create a folder on the server where our private agent is located and copy your file to that folder. Now we can use it whenever a pipeline runs on the private agent. Obviously this is not limited to local folders; we can also use a network folder, as long as other pipelines can access it.
Hope this helps.
You have to push it through a package manager like NuGet, npm or anything else. But I guess a better option would be to commit and push this single file to a specific repo (if the file is specific to a single project) or a common repo like "Utilities" (if you are going to reuse it across many projects), and then download that repo (or just the file) in your pipeline.

Azure DevOps Server 2019-TFVC Prevent build from occurring if changes are only in certain folder

I've recently set up an Azure DevOps Server 2019 on our local servers using TFVC for our source control. Our branch is structured as follows:
root
- App1
  - App1a
  - App1b
  - etc
- App2
- etc
- Utils
Our build scripts, test utilities, apps used during the build, etc. are stored in Utils. What I want to do is only perform a build when changes occur anywhere in root except Utils. I've seen the option to exclude paths on SO, but only for Git repositories; is this possible with TFVC?
The workaround I'm using is from Triggering Azure DevOps builds based on changes to sub folders, but the build still executes: not the actual build, mind you, but the pipeline, which then sends the team a notification that it was successful. Ultimately I don't want it to run at all unless changes have been made outside of Utils. I also don't want to re-organize the folder structure, since a lot of our utilities rely on relative paths. I've got the trigger set up as:
Thanks in advance.
You can exclude the utils folder in the CI trigger options:
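For a TFVC build definition the filters live in the classic trigger settings rather than YAML; a sketch with assumed server paths:

```
Trigger: Continuous integration
Path filters:
  Include  $/TeamProject/root
  Exclude  $/TeamProject/root/Utils
```

With the Exclude filter in place, check-ins that only touch Utils should not queue the build at all.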