How can I copy just new and changed files with an Azure DevOps pipeline?

I have a large (lots of dependencies, thousands of files) Node.js app that I am deploying with an Azure DevOps YAML build and an Azure DevOps "classic editor" release pipeline.
Is there some way to copy JUST new and changed files during a file copy, not every single file? My goal is to reduce the time the copy files step of the deploy takes, as I deploy frequently but usually with changes to only one or a few files.

About copying only the changed files into artifacts for a release: if the changed files live in a specific folder, you can copy just that folder by setting the SourceFolder and Contents arguments of the Copy Files task.
If the changed files are spread across different folders, I am afraid there is no out-of-the-box way to pick only the changed files with the Copy Files or Publish Build Artifacts tasks.
As a workaround, you can add a PowerShell task that recursively deletes every file whose timestamp is older than (Now - x minutes), so the artifact directory ends up containing ONLY the changed files, as sketched below. For more detailed information, please refer to this similar case.
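For example, a minimal PowerShell sketch of that workaround; the 10-minute threshold is an arbitrary assumption and should be wide enough to cover the build's earlier copy steps:

# Remove every file in the staging directory not written in the last 10 minutes,
# leaving only the files this build actually changed.
$cutoff = (Get-Date).AddMinutes(-10)
Get-ChildItem -Path $env:BUILD_ARTIFACTSTAGINGDIRECTORY -Recurse -File |
    Where-Object { $_.LastWriteTime -lt $cutoff } |
    Remove-Item -Force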
Alternatively, you can call the Commits - Get Changes REST API from a script in a PowerShell task, read the changed files out of the response, and copy them to a specific target folder:
GET https://dev.azure.com/{organization}/{project}/_apis/git/repositories/{repositoryId}/commits/{commitId}/changes?api-version=5.0
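For instance, a minimal PowerShell sketch, assuming the job is allowed to use the OAuth token (System.AccessToken) and runs after checkout; the response fields used here (changes[].item.path, changeType) follow the Get Changes documentation, and the copy targets are assumptions:

# Build the Get Changes URL from predefined pipeline variables.
$url = "$($env:SYSTEM_TEAMFOUNDATIONCOLLECTIONURI)$($env:SYSTEM_TEAMPROJECT)" +
       "/_apis/git/repositories/$($env:BUILD_REPOSITORY_ID)" +
       "/commits/$($env:BUILD_SOURCEVERSION)/changes?api-version=5.0"
$headers  = @{ Authorization = "Bearer $env:SYSTEM_ACCESSTOKEN" }
$response = Invoke-RestMethod -Uri $url -Headers $headers -Method Get

# Copy each changed (non-deleted) file into the artifact staging directory.
foreach ($change in $response.changes) {
    if ($change.item.gitObjectType -eq 'blob' -and $change.changeType -ne 'delete') {
        $source = Join-Path $env:BUILD_SOURCESDIRECTORY $change.item.path
        $target = Join-Path $env:BUILD_ARTIFACTSTAGINGDIRECTORY $change.item.path
        New-Item -ItemType Directory -Path (Split-Path $target) -Force | Out-Null
        Copy-Item -Path $source -Destination $target -Force
    }
}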

Related

How to exclude files present in the mapped directory when Publishing the artifacts in Azure CI/CD?

I am new to Azure CI/CD pipelines and am trying to export CRM solutions using a build pipeline in Azure DevOps with the Power Platform task. There is a requirement to keep the exported solution from the build pipeline in Azure Repos (which I am doing from the command line using tf vc).
I am able to export the solution successfully, but when I publish the artifacts, every file present in the mapped folder gets published. (I mapped a directory in Azure Repos where all the solution backups are kept.)
I see that the Azure agent copies all the files present in the mapped directory and stores them in the agent directory. The problem is that the mapped directory contains all the backup files of the CRM solutions. I found some articles suggesting I cloak the directory so those files are not included on the agent, but if I cloak the directory I can no longer check in the exported solution from the command line.
So, I was wondering if there is any way to exclude all files present in the mapped directory while still being able to check in the exported file to that directory from the command line.
You can use a .artifactignore file to filter out the paths of files that you don't want published as part of the process.
Documentation can be found here
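For example, a rough sketch of a .artifactignore placed in the folder being published; it follows .gitignore syntax, and the re-included file name is hypothetical:

# Exclude everything in the mapped folder...
**/*
# ...except the solution exported by this run.
!ExportedSolution.zip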

How do I use an Azure DevOps Services Build Pipeline to retrieve TFVC source files based upon a Label value and then zip those files?

This is a TFVC repo in Azure, not Git. It is running in Azure DevOps Services, not in an on-premises Azure DevOps Server (2019) instance. This is a classic pipeline, not YAML.
I got as far as adding a variable that contains the Label value I am looking to package into the zip file.
I can't figure out how to get the sources by label value. In the pipeline's Get Sources step I've narrowed the path down, but then I need to recursively get the source files that carry the label held in the variable I defined.
The next step is to zip those source files up; I've added an Archive Files task whose root folder I will change from "build binaries" to the sources folder.
This is necessary for this particular project because we must pass the source files to the vendor as a zip for them to compile and install for us. The developers create/update the source files, build and test them locally, then apply a Label to the sources for a given push to the vendor.
When configuring the 'Get sources' step, there is no option or method that maps only the source files with a specified label.
As a workaround, in the pipeline job you can add steps that filter out the source files with the specified label, use the Copy Files task to copy them to a folder, and then run the Archive Files task against that folder.
[UPDATE]
Normally, a pipeline run automatically checks out the file version (changeset) that triggered the run. If you trigger the pipeline manually, the run checks out the latest changeset by default unless you specify one.
Labels are used to mark a version of files or folders, so you can also get a specific version of files or folders via a label.
In your case, you can try using the 'tf get' command to download the files carrying the specified label.
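For example, a minimal PowerShell sketch, assuming it runs inside the pipeline's existing TFVC workspace (so credentials are already cached), that tf.exe is on the PATH, and that a pipeline variable LabelName holds the label; the server path is hypothetical:

# Download the files marked with the label (versionspec prefix 'L' means label).
& tf get '$/MyProject/Src' /version:"L$env:LABELNAME" /recursive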

Azure DevOps AzureBlob File Copy Task Wildcard Directory Source Path

I have a Release Pipeline set up to extract my build artifact, and then I use the RegEx Find & Replace task to update a few configuration files on my static website before uploading via the AzureBlob File Copy task.
What I have currently works. However, I have to use the full source path to the "wwwroot" folder in order for the correct files to be uploaded.
My concern is that my builds may change the directory structure that gets dropped as a build artifact which will break my release tasks.
I've tried to use various wildcards for the source path such as:
**/wwwroot/*
*/*wwwroot/*
And many other variations to no avail.
I read in some of the docs that wildcards won't work for directory paths in this task, but I was hoping there was a way to work around this, as I have long-term concerns about the path changing over time. I looked at a few other tasks in the marketplace, but nothing seems to fit the bill, and I couldn't find any definitive guidance on the matter. Any suggestions?
1. You can use $(System.DefaultWorkingDirectory)/**/wwwroot/*.* (or *.json, *.xml, ...) in the Source input.
(This works if the wwwroot folder has no subfolders and every file in it matches the *.* pattern.)
2. Otherwise, I suggest you archive the wwwroot folder before uploading it:
Use $(System.DefaultWorkingDirectory) to get the working directory and then the full path (using * here is not recommended) to archive the specific wwwroot folder. Then upload the zipped file, which contains all the content of your wwwroot folder.
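As a rough PowerShell sketch of option 2 — here the wwwroot folder is located with a recursive search instead of a hard-coded path, which also hedges against the drop layout changing; the zip name is an assumption:

# Find the wwwroot folder wherever it lands in the extracted artifact.
$wwwroot = Get-ChildItem -Path $env:SYSTEM_DEFAULTWORKINGDIRECTORY -Directory -Recurse -Filter 'wwwroot' |
    Select-Object -First 1
# Zip its contents for upload.
$zip = Join-Path $env:SYSTEM_DEFAULTWORKINGDIRECTORY 'wwwroot.zip'
Compress-Archive -Path (Join-Path $wwwroot.FullName '*') -DestinationPath $zip -Force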

Copy files from Azure file storage to Azure website after release

I have files that need to be copied to my website (an Azure website) after a deployment has been made. Usually these files are server-specific (I have multiple servers for different releases). In the past, before I used Azure, I just had a backup folder with these files and a PowerShell script that I ran after deployment to copy them over.
Now that I'm moving to Azure, I'd like to keep this functionality. I'm interested in copying these files into Azure file storage and then, in a release task after the Azure website deployment, copying them from file storage into the site\wwwroot folder. I'm not really seeing an easy way to do this. Is there a release task I can use with this in mind?
Is there a release task i can use with this in mind?
Yes, we could use the Azure File Copy task. I did a demo copying a zip file to Azure storage, and it works correctly on my side.
Note: If you don't want to zip the files, you can remove the Archive Files task.
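If you would rather script it than use the task, here is a minimal PowerShell sketch using the Az.Storage module to pull one server-specific file from the file share onto the agent before it is copied into the site; the account, share, and file names are all hypothetical:

# Authenticate to the storage account (key held in a secret pipeline variable).
$ctx = New-AzStorageContext -StorageAccountName 'mystorageacct' -StorageAccountKey $env:STORAGE_KEY
# Download one file from the share into the working directory.
Get-AzStorageFileContent -ShareName 'server-configs' -Path 'stage/web.config' `
    -Destination "$env:SYSTEM_DEFAULTWORKINGDIRECTORY\web.config" -Context $ctx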

Newlines being added to Bamboo artifacts used in deployment project

I've just set up a project's deployment to our stage server. I have a PowerShell script on the remote server that copies files from the Bamboo artifacts to a specific folder on that same server. The folder they are copied to uses TortoiseSVN as its source control. Once the files are copied over, all of my .js, .css, and .html files show up as modified in TortoiseSVN, when only a couple have truly been modified. Diffing some of the files, the only difference between the new files and the previous ones is that newlines are being added to the end of every file.
My main question is whether Bamboo modifies a project's files during the artifact-building process, or whether there is another explanation for these files showing up as modified.
Edit: As a side note, I deployed a different project to the same server using the same script and no newlines were present in any of the files. This project was smaller in size than the first project mentioned.