How to exclude files present in the mapped directory when publishing artifacts in Azure CI/CD?

I am new to Azure CI/CD pipelines and I am trying to export CRM solutions using a build pipeline in Azure DevOps with the Power Platform task. There is a requirement to keep the exported solution from the build pipeline in Azure Repos (which I am doing from the command line using tf vc).
I am able to export the solution successfully, but the issue is that when I publish the artifacts, every file present in the mapped folder gets published. (I mapped a directory in Azure Repos where all the solution backups are kept.)
I see that the Azure agent copies all the files present in the mapped directory and stores them in the agent directory. The problem is that the mapped directory contains all the backup files of the CRM solutions. I found some articles suggesting that I cloak the directory so those files are not included on the agent, but if I cloak the directory, I can no longer check in the exported solution from the command line.
So, I was wondering if there is any way to exclude all files present in the mapped directory while still being able to check in the exported file to that directory from the command line.

You can use a .artifactignore file to filter out the paths of files that you don't want published as part of the process.
Documentation can be found in the Azure Artifacts docs.
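For example, a .artifactignore placed in the folder being published could exclude everything except the freshly exported solution. The file uses .gitignore syntax; the solution file name below is hypothetical:

# Exclude every file in the published folder...
**/*
# ...then re-include only the solution exported by this run (hypothetical name)
!ExportedSolution.zip

With that in place, the Publish Pipeline Artifacts step should upload only the un-ignored file, while the checked-in backups stay out of the artifact.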

Related

Is there a way to name the Azure agent working folders?

When you set up an Azure DevOps agent on a build machine, it has a working folder (by default _work) in which it creates a subfolder for each pipeline it runs.
These folders have integer names like "80" or "29". This makes it hard to troubleshoot issues on a given build machine when you have many pipelines, because you don't know which folder each pipeline relates to.
Is there a way to figure out the mapping from pipeline to folder number, or to name these folders more explicitly?
Renaming the folders is currently not supported in Azure DevOps. Each pipeline maps to a folder under the agent's _work directory.
1. You could check the pipeline log to figure out which folder is your pipeline's working folder (enable system diagnostics).
2. You could also add a command-line task in your pipeline to echo this directory:
echo $(System.DefaultWorkingDirectory)
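If you want a durable record of the mapping, a small variation on that idea in a PowerShell task could print the pipeline name next to its numbered folder (both variables are standard predefined pipeline variables):

# Record which numbered _work folder this pipeline uses
Write-Host "Pipeline '$(Build.DefinitionName)' uses working folder: $(Agent.BuildDirectory)"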

How do I use an Azure DevOps Services Build Pipeline to retrieve TFVC source files based upon a Label value and then zip those files?

This is a TFVC repo in Azure, not Git. It is running in Azure DevOps Services, not on-premises Azure DevOps Server (2019). This is a classic pipeline, not YAML.
I got as far as adding a variable that contains the Label value I am looking to package into the zip file.
I can't figure out how to get the sources by Label value. In the Pipeline Get Sources step, I've narrowed the path down, but then I need to recursively get source files that have the Label in the variable I defined.
The next step is to zip those source files up; I've added an Archive task, in which I will change the root folder from "build binaries" to the sources folder.
This is necessary for this particular project because we must pass the source files to the vendor as a zip for them to compile and install for us. The developers create/update the source files, build and test them locally, then apply a Label to the sources for a given push to the vendor.
When configuring the 'Get sources' step, there is no option or method that maps only the source files with a specified label.
As a workaround, in the pipeline job you can add steps to fetch the source files with the specified label, use the Copy Files task to copy them to a folder, and then run the Archive Files task against that folder.
[UPDATE]
Normally, a pipeline run automatically checks out the file version (changeset) that triggered the run. If you trigger the pipeline manually, by default the run checks out the latest changeset unless you specify one.
Labels mark a version of files or folders, so you can also get a specific version of files or folders via a label.
In your case, you can try using the 'tf get' command to download the files with the specified label, as sketched below.
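A sketch of that call from a script step (the server path and label name are hypothetical; L<label> is the TFVC version-spec prefix for labels):

# Download the labeled version of the sources, recursively
tf get '$/TeamProject/VendorSources' /version:LVendorDrop1 /recursive

The Copy Files and Archive Files tasks can then be pointed at the folder this populates.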

How can I copy just new and changed files with an Azure DevOps pipeline?

I have a large (lots of dependencies, thousands of files) Node.js app that I am deploying with an Azure DevOps YAML build and an Azure DevOps "classic editor" release pipeline.
Is there some way to copy JUST new and changed files during a file copy, not every single file? My goal is to reduce the time it takes to complete the copy files step of the deploy, as I deploy frequently, but usually with just changes to one or a few files.
About copying only the changed files into the artifacts for release: if the changed files are in a specific folder, you can copy them by specifying the SourceFolder and Contents arguments of the Copy Files task.
If the changed files are spread across different folders, I am afraid there is no out-of-the-box way to pick only the changed files with the Copy Files or Publish Artifacts task.
As a workaround, we could add a PowerShell task that deletes all files (recursively) whose timestamp is older than (Now - x min); that way the artifact directory contains ONLY the changed files. For more detailed info, please refer to this similar case.
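A minimal PowerShell sketch of that deletion step (the 10-minute threshold and the staging path are assumptions to adjust):

# Remove every file whose last write time predates the cutoff,
# leaving only the files this build actually touched
$cutoff = (Get-Date).AddMinutes(-10)
Get-ChildItem -Path "$(Build.ArtifactStagingDirectory)" -Recurse -File |
    Where-Object { $_.LastWriteTime -lt $cutoff } |
    Remove-Item -Force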
Alternatively, you can call the Commits - Get Changes REST API through a script in a PowerShell task, retrieve the changed files from the response, and copy them to a specific target folder:
GET https://dev.azure.com/{organization}/{project}/_apis/git/repositories/{repositoryId}/commits/{commitId}/changes?api-version=5.0
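In a PowerShell task, that could look roughly like the sketch below ({organization} and {project} stay as placeholders; the script assumes $(System.AccessToken) is available to the step and flattens the copied files for brevity):

# Ask the REST API which files the triggering commit changed
$url = "https://dev.azure.com/{organization}/{project}/_apis/git/repositories/$(Build.Repository.ID)/commits/$(Build.SourceVersion)/changes?api-version=5.0"
$resp = Invoke-RestMethod -Uri $url -Headers @{ Authorization = "Bearer $(System.AccessToken)" }
# Copy each changed file (skipping folders and deletions) into the staging folder
foreach ($change in $resp.changes) {
    if ($change.item.isFolder -ne $true -and $change.changeType -ne 'delete') {
        $source = Join-Path "$(Build.SourcesDirectory)" $change.item.path.TrimStart('/')
        Copy-Item $source -Destination "$(Build.ArtifactStagingDirectory)" -Force
    }
}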

Find and download all instances of a file name in Azure Devops

Is there a way to download all files with a specific name from the master branches of all the projects in an Azure DevOps installation?
I have been tasked with documenting all of the entries in all of the appsettings.json files in our entire codebase and I would prefer to not have to go through all 300 repositories to manually download these files if I don't absolutely have to.
Is there a way to download all files with a specific name from the master branches of all the projects in an Azure DevOps installation?
This is not supported. Please check this document: the recommendation is a separate project (with one or more repos) for different products or sub-modules of a big product. So in Azure DevOps Services there is no out-of-the-box feature to find or download files across projects.
A possible direction:
If those appsettings.json files are in the root directory of your repos, you may save some time by using these two REST APIs:
List all repos in current organization:
GET https://dev.azure.com/{OrganizationName}/_apis/git/repositories?api-version=5.1
Get File (Download):
GET https://dev.azure.com/{OrganizationName}/_apis/git/repositories/{Repositoryid}/items?scopePath=/appsettings.json&download=true&api-version=5.1
You can use a PowerShell script to combine these two APIs: the first lists all repository IDs in your organization, and the second downloads appsettings.json from each repo via its repository ID. So the approach is to run the first API once to get the list of repo IDs (you can check this similar case) and then loop over them to get the files one by one. For example:
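A possible PowerShell sketch (the $pat variable and output file names are assumptions; repos without a root-level appsettings.json are simply skipped):

# Basic auth header built from a personal access token (placeholder variable)
$headers = @{ Authorization = "Basic " + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat")) }
$base = "https://dev.azure.com/{OrganizationName}"
# 1. List every repository in the organization
$repos = (Invoke-RestMethod -Uri "$base/_apis/git/repositories?api-version=5.1" -Headers $headers).value
# 2. Try to download appsettings.json from the root of each repo
foreach ($repo in $repos) {
    $fileUrl = "$base/_apis/git/repositories/$($repo.id)/items?scopePath=/appsettings.json&download=true&api-version=5.1"
    try {
        Invoke-RestMethod -Uri $fileUrl -Headers $headers -OutFile ".\appsettings-$($repo.name).json"
    } catch {
        Write-Host "No appsettings.json in $($repo.name)"
    }
}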

Copy files from Azure File Storage to an Azure website after release

I have files that need to be copied over to my website (an Azure website) after a deployment has been made. Usually these files are server-specific (I have multiple servers for different releases), and in the past, before I used Azure, I just had a backup folder with these files and a PowerShell script that I ran after deployment to copy them over.
Now that I'm moving to Azure, I'd like to keep this functionality. I'm interested in copying these files into Azure File Storage and then, in my release task after the Azure website deployment, copying from that file storage into the site\wwwroot folder. I'm not really seeing an easy way to do this. Is there a release task I can use with this in mind?
Is there a release task I can use with this in mind?
Yes, we could use the Azure File Copy task. I did a demo copying a zip file to Azure Storage, and it works correctly on my side.
Note: If you don't want to zip the files, you could remove the Archive File task.
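If you then need to pull those files back down from the file share during the release, one hedged option is an Azure PowerShell step using the Az.Storage cmdlets (account, share, and file names below are hypothetical):

# Build a storage context from the account name and key (placeholders)
$ctx = New-AzStorageContext -StorageAccountName 'mystorageacct' -StorageAccountKey $env:STORAGE_KEY
# Download one server-specific file onto the agent, ready for a
# subsequent deployment step to push it into site\wwwroot
Get-AzStorageFileContent -ShareName 'server-configs' -Path 'server1/web.config' -Destination "$(System.DefaultWorkingDirectory)\web.config" -Context $ctx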