When you set up an Azure DevOps agent on a build machine, it has a working folder (by default _work) where it creates subfolders for each pipeline it runs.
These folders have integer names like "80" or "29", which makes it hard to troubleshoot issues on a given build machine when you have many pipelines, as you don't know which folder each pipeline relates to.
Is there a way to figure out the mapping from pipeline > folder number, or to name these folders more explicitly?
Renaming the folders is currently not supported in Azure DevOps.
Each pipeline maps to its own folder under the agent's _work directory.
1. You could check the pipeline log to figure out which folder is your pipeline's working folder (enable system diagnostics).
2. You could also add a command line task in your pipeline to echo this directory:
echo $(System.DefaultWorkingDirectory)
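For a YAML pipeline, a minimal sketch of such a step could look like this (the display name is illustrative):

steps:
# Prints the pipeline's working folder (e.g. .../_work/29/s) into the build log
- script: echo $(System.DefaultWorkingDirectory)
  displayName: Show working directory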
I have a PowerShell task that processes published artifacts from another stage.
I have noticed these are put into a folder "s". Random sampling shows it's "s" all the time, but I doubt it will always be the case!
Wondering what's the best way to refer to these files safely?
Here's my publish artifacts task, the published artifacts, and the PowerShell task that consumes them (shown as screenshots).
The variable that you use does not point to the s folder. As described in the documentation, Build.ArtifactStagingDirectory is the local path on the agent where any artifacts are copied to before being pushed to their destination, for example c:\agent\_work\1\a. You will find those files under the a folder. The folder referred to with the letter s is where the sources are downloaded: Build.SourcesDirectory.
https://learn.microsoft.com/en-us/azure/devops/pipelines/build/variables?view=azure-devops&tabs=yaml
Documentation on Azure DevOps predefined variables that can be used and where the folders are located.
https://blog.geralexgr.com/devops/how-azure-devops-pipelines-agent-works
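To make the distinction concrete, here is a rough sketch of staging build output into the a folder before publishing (the file pattern and artifact name are illustrative):

steps:
# Sources live in $(Build.SourcesDirectory) (the "s" folder); CopyFiles stages output
# into $(Build.ArtifactStagingDirectory) (the "a" folder) before it is published
- task: CopyFiles@2
  inputs:
    SourceFolder: '$(Build.SourcesDirectory)'
    Contents: '**/*.zip'
    TargetFolder: '$(Build.ArtifactStagingDirectory)'
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'drop'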
The predefined variable you are looking for is $(Pipeline.Workspace). If you look at the documentation for the Download Pipeline Artifact task, you can see the default path. Note that if you are downloading more than one artifact, they will be placed in subfolders.
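As a rough sketch, assuming the default download path is left unchanged:

steps:
# With no 'artifact' input, every artifact from the current run is downloaded into
# its own subfolder under $(Pipeline.Workspace)
- task: DownloadPipelineArtifact@2
- script: ls -R "$(Pipeline.Workspace)"
  displayName: List downloaded artifacts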
Edit - Having just looked again at the pipeline: are you publishing and downloading the artifact, or are you doing all of these tasks in a single pipeline?
The best way, imo, to set this up would be two pipelines: a CI pipeline to build and publish the artifact, and a CD pipeline to download and use it.
I am new to Azure CI/CD pipelines and I am trying to export CRM solutions using a build pipeline in Azure DevOps with the Power Platform task. There is a requirement to keep the exported solution from the build pipeline in Azure Repos (which I am doing from the command line using tf vc).
I am able to export the solution successfully, but the issue is that when I publish the artifacts, it publishes every file present in the mapped folder (I mapped a directory in Azure Repos where all the solution backups are kept).
I see that the Azure agent copies all the files present in the mapped directory and stores them in the agent directory. The problem is that the mapped directory contains all the backup files of the CRM solutions. I found some articles suggesting to cloak the directory so that the files are not included on the agent, but if I cloak the directory I am not able to check in the exported solution from the command line.
So I was wondering if there is any way to exclude all the files present in the mapped directory while still being able to check in the exported file to that directory from the command line.
You can use a .artifactignore file to filter out paths of files that you don't wish to be published as part of the process.
Documentation can be found here
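As a rough sketch, a .artifactignore file placed in the folder being published uses gitignore-style patterns; the paths below are illustrative:

# Exclude everything in the mapped folder except the freshly exported solution
**/*
!ExportedSolution.zip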
Normally, if you are using a self-hosted Azure agent, it automatically selects target folders for you (I assume based on pipeline name), underneath the work folder... so for example:
/home/xxxxx/azure/_work/2/t/
Is there a way that I can control this choice from a YAML pipeline? My use case is that I have a very large repo, and several different pipelines that all check it out. I only have one agent running on the machine, so my preference is that all of the pipelines I run on this agent check out into the same folder (like /home/xxxxx/azure/_work/MyProject), ensuring that there will only be one checkout of this large repo.
You could avoid syncing sources at all in the pipeline by setting checkout: none:
steps:
- checkout: none
Then add a command line or script task to clone the repo manually via git commands.
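A minimal sketch of what that could look like, assuming a repo URL and a shared target folder (both illustrative):

steps:
- checkout: none
# Clone the repository into a fixed folder shared by every pipeline on this agent
- script: |
    git -c http.extraheader="AUTHORIZATION: bearer $(System.AccessToken)" \
      clone https://dev.azure.com/MyOrg/MyProject/_git/MyRepo /home/xxxxx/azure/_work/MyProject
  displayName: Clone repo into shared folder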
We've recently been testing linking the source folder and it has been working pretty well, so that is possibly an easier answer than the accepted one.
That is, you can create a specific folder (let's say ~/projects/Acme or C:\projects\acme), and then in your Azure pipeline steps, before checkout, you delete the s folder and link it to the target project folder.
# In PowerShell (Windows)
New-Item -Type Junction -Path s -Target C:\projects\acme
# In Bash (Linux/MacOS)
ln -s ~/projects/acme s
The advantage to this approach is that you don't have to attempt to override the working folder for dozens of builtin tasks scattered throughout your pipelines and template files; it's a bit of setup at the top of the pipeline and then the rest of the tasks can operate as normal.
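As a rough sketch of how this could be wired into a YAML pipeline on Windows (the target folder is illustrative, and checkout is declared explicitly so the linking step runs first):

steps:
# Replace the default s folder with a junction to the shared project folder
# before the repository is checked out
- pwsh: |
    if (Test-Path s) { Remove-Item s -Recurse -Force }
    New-Item -Type Junction -Path s -Target C:\projects\acme
  workingDirectory: $(Pipeline.Workspace)
  displayName: Link s to shared project folder
- checkout: self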
I am using the IIS Deployment template in a release pipeline to deploy an MVC application to a VM, and it is working fine. But after deploying the application, we want to run ad-hoc SQL changes using script files in SQL Server, using the custom task Run SQLCMD Scripts from the VSTS marketplace.
In the release pipeline the scripts are in a zip file. Can anyone suggest what we should key in for "Path to folder containing SQLCMD script files"?
You can try referencing the variable
$(Build.ArtifactStagingDirectory)
In a release pipeline, the artifacts are downloaded to the path $(System.ArtifactsDirectory).
According to your screenshot, I noticed that you are using the "Extract files" task. This task finds the zip files in $(System.ArtifactsDirectory) and extracts them.
The unzipped folder name is set in the "Extract files" task (Destination folder).
So you could try to use the following path:
$(System.ArtifactsDirectory)/Destination folder name
You can also expand this path according to the actual location of the file.
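For a YAML equivalent, a rough sketch (the destination folder name and file pattern are illustrative):

steps:
# Extract the downloaded zip into a known folder under $(System.ArtifactsDirectory)
- task: ExtractFiles@1
  inputs:
    archiveFilePatterns: '$(System.ArtifactsDirectory)/**/*.zip'
    destinationFolder: '$(System.ArtifactsDirectory)/sqlscripts'
# Then point "Path to folder containing SQLCMD script files" at:
#   $(System.ArtifactsDirectory)/sqlscripts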
Hope this helps.
I'm executing a PowerShell script in an Azure DevOps release pipeline. In the script some JSONs are saved to a local directory, which later in the script are uploaded to Azure Blob Storage.
However, of course, Azure DevOps doesn't see my local directory to save to. Can I save the JSON files to, say, a staging folder on the Azure DevOps agent? Or if not, where can I save to when the script is run in the release pipeline? Thanks
You can save the JSON file to a folder on the Azure DevOps agent. When the agent runs your pipeline, it creates a working space with several folders (such as a, b and s) on the agent machine.
You can point to these folders in your pipeline by referring to the predefined environment variables.
For example:
$(Build.ArtifactStagingDirectory) is mapped to folder '/_work/2/a'
$(Build.BinariesDirectory) is mapped to folder '/_work/2/b'
$(System.DefaultWorkingDirectory) is mapped to folder '/_work/2/s'
You can also save the JSON file to a new folder (e.g. a staging folder) within '/_work/2'.
For example, $(Agent.BuildDirectory)/staging: a new folder named staging will be created within the folder '/_work/2'.
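A minimal sketch of such a step (the folder and file names are illustrative):

steps:
# Create a staging folder under the agent's build directory and write a JSON file into it
- pwsh: |
    $stagingDir = Join-Path $env:AGENT_BUILDDIRECTORY 'staging'
    New-Item -ItemType Directory -Path $stagingDir -Force | Out-Null
    @{ name = 'example'; value = 1 } | ConvertTo-Json | Set-Content (Join-Path $stagingDir 'example.json')
  displayName: Save JSON to staging folder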
For more information about predefined variables, please check here. For release variables, please check here.