VSTS Release Manager copy only specific files - azure-devops

I want to download a specific folder from my team project in VSTS and copy it to an on-premises server. I've set up the VSTS agent and it can copy files just fine using the "Windows Machine File Copy" task, but my problem is that the agent downloads my whole team project starting from the root.
In Artifacts, when I choose "Link an artifact source" and pick Team Foundation Version Control as the type, under Source (repository) I can only choose my team project $/myTeamProject in the dropdown list. I'm not able to provide a path like $/myTeamProject/Main/subfolder.
Is this the wrong approach? I basically want to copy some files from a subfolder of my team project in VSTS to an on-premises machine, without downloading everything under the root folder ($/myTeamProject). It takes forever when I trigger a release with a single task that copies files. How can the agent map only a specific folder and not the whole root folder?

My opinion is that it's not a great approach. Your build should publish a set of artifacts representing the full set of deployable bits, and that set should remain static as you deploy it through a pipeline.
Think of this scenario: You have a release definition with a pipeline defined that goes Dev -> QA -> Prod.
You deploy to Dev. Your release definition pulls in Changeset 1234 from source control.
A few hours later, you deploy to QA. Your release definition pulls in Changeset 1234.
Someone changes some source code. You go to deploy to Prod. Your release definition pulls in Changeset 1235. Now you're deploying some stuff that hasn't been tested in a lower environment. You've just drastically increased the likelihood of a problem.
Same scenario applies if you ever want to redeploy an old version to try to repro a bug.
In short: publish that folder as an artifact as part of your build process.
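As a rough sketch, a build that publishes just that folder might look like this in YAML (Copy Files and Publish Build Artifacts are the standard tasks; the Main/subfolder path and the artifact name "drop" are placeholders for your own layout):
```yaml
# Sketch: publish only the subfolder as the build artifact,
# so a release only ever downloads those files.
steps:
- task: CopyFiles@2
  inputs:
    SourceFolder: '$(Build.SourcesDirectory)/Main/subfolder'   # placeholder path
    Contents: '**'
    TargetFolder: '$(Build.ArtifactStagingDirectory)'

- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'drop'
```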

In a release definition, you can't choose to download only part of the files from the artifacts (the artifact source link is just for choosing which build definition the artifact comes from).
But you can specify the files you want to copy in the Windows Machine File Copy task itself. For its Source option, point at the subfolder you want to copy, such as $(System.DefaultWorkingDirectory)/BuildDefinition/drop/Main/subfolder.
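For illustration, a hedged YAML sketch of that task with the Source pointed at the subfolder could look like the following (machine name, credentials, and target path are placeholders; input names are per the v2 task and may differ slightly in other versions):
```yaml
# Sketch: copy only the subfolder from the downloaded artifact
# to the on-premises machine.
- task: WindowsMachineFileCopy@2
  inputs:
    SourcePath: '$(System.DefaultWorkingDirectory)/BuildDefinition/drop/Main/subfolder'
    MachineNames: 'myserver.example.com'    # placeholder machine name
    AdminUserName: '$(adminUser)'           # placeholder secret variables
    AdminPassword: '$(adminPassword)'
    TargetPath: 'C:\deploy\subfolder'       # placeholder target path
```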

How do I use an Azure DevOps Services Build Pipeline to retrieve TFVC source files based upon a Label value and then zip those files?

This is a TFVC repo in Azure DevOps, not Git. It is running in Azure DevOps Services, not on-premises Azure DevOps Server (2019). This is a classic pipeline, not YAML.
I got as far as adding a variable that contains the Label value I am looking to package into the zip file.
I can't figure out how to get the sources by Label value. In the Pipeline Get Sources step, I've narrowed the path down, but then I need to recursively get source files that have the Label in the variable I defined.
The next step is to zip those source files up; I've added an Archive task, in which I will change the root folder from "build binaries" to the sources folder.
This is necessary for this particular project because we must pass the source files to the vendor as a zip for them to compile and install for us. The developers create/update the source files, build and test them locally, then apply a Label to the sources for a given push to the vendor.
When configuring the 'Get sources' step, there is no option or method that maps only the source files with the specified label.
As a workaround, in the pipeline job you can add steps that pick out the source files carrying the specified label, use the Copy Files task to copy them into a folder, and then run the Archive Files task on that folder.
[UPDATE]
Normally, a pipeline run automatically checks out the file version (changeset) that triggered the run. If you trigger the pipeline manually, by default the run checks out the latest changeset unless you specify one.
Labels are used to mark a version of files or folders, so you can also get that specific version of the files or folders via the label.
In your case, you can try using the 'tf get' command to download the files carrying the specified label.
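A minimal sketch of that idea (shown as YAML for brevity; the classic editor has equivalent Command Line and Archive Files tasks). It assumes tf.exe is available on the agent, the Get sources step has already created a TFVC workspace covering the path, and $(LabelName) stands in for the label variable you defined:
```yaml
# Sketch: fetch the version of the files marked by the label, then zip them.
# Authentication and workspace details are omitted; paths are placeholders.
- script: tf get "$/MyTeamProject/Main/subfolder" /version:L$(LabelName) /recursive
  workingDirectory: '$(Build.SourcesDirectory)'
  displayName: 'Get files carrying the label'

- task: ArchiveFiles@2
  inputs:
    rootFolderOrFile: '$(Build.SourcesDirectory)/Main/subfolder'
    includeRootFolder: false
    archiveType: 'zip'
    archiveFile: '$(Build.ArtifactStagingDirectory)/sources-$(LabelName).zip'
```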

how to exclude test projects from continuous deployment

I have a build definition in VSTS and I want the unit test projects to run as part of the build. Do I then need to include those DLLs in the deployment to the Azure web app? Surely they are no longer needed after the build.
Whether or not you are going to use Release, you can copy only the files you need and publish them:
If you are going to deploy with TFS Release, then in the release definition you can just select the build artifact. If you want to deploy from the build definition instead, add a deploy task after the Copy Files task and deploy from that task's target folder.
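As an illustration, and assuming the test projects follow a *Tests* naming convention (an assumption on my part), the Copy Files step could exclude them like this:
```yaml
# Sketch: stage build output except test project folders, then publish only that.
- task: CopyFiles@2
  inputs:
    SourceFolder: '$(Build.SourcesDirectory)'
    Contents: |
      **\bin\**
      !**\*Tests*\**
    TargetFolder: '$(Build.ArtifactStagingDirectory)'

- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'drop'
```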

Deploying from Appveyor to Nuget only on changes in a particular folder

I have a .NET Core project that is auto-built in AppVeyor and deployed to NuGet. By default, every successful build causes a new NuGet release.
However, there are many cases when a new release is meaningless because the library's actual code has not changed:
Readme updated
Unit tests added
Appveyor configuration changed
Other cases
Is it possible to configure the build so that NuGet publishing only runs if there are changes in the actual code (for example, in folder X)?
There are a few options.
Commit filtering. Note that with this approach the whole build, not just the deployment, will be skipped if nothing in folder X changed. You may still need a build without a deployment, at least when unit tests are added. As a workaround, consider adding a separate AppVeyor project that builds and deploys only when folder X changed, and keep the current project building every time without deploying.
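A minimal commit-filtering sketch for appveyor.yml, where src/MyLibrary stands in for your folder X:
```yaml
# Sketch: the entire build is skipped unless something under
# src/MyLibrary (placeholder folder) changed in the pushed commits.
only_commits:
  files:
    - src/MyLibrary/
```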
Inspect the changed files with a script. Please check this sample on how to check those files if you use GitHub. If you see that files in folder X changed, you can set a custom environment variable (let's say you call it deploy_nuget) to true and use it with a conditional deployment.
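A rough sketch of that script-based approach in appveyor.yml; diffing the commit against its parent with git is my own simplification (a push containing several commits may need the GitHub API approach from the sample instead), and src/MyLibrary again stands in for folder X:
```yaml
# Sketch: flag the NuGet deployment only when files under src/MyLibrary changed.
after_test:
  - ps: |
      # Compare the built commit against its parent (simplification).
      $changed = git diff --name-only HEAD~1 HEAD
      if ($changed | Where-Object { $_ -like 'src/MyLibrary/*' }) {
        Set-AppveyorBuildVariable -Name 'deploy_nuget' -Value 'true'
      }

deploy:
  provider: NuGet
  api_key:
    secure: <your encrypted NuGet API key>
  on:
    deploy_nuget: true
```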

Copying files and deploying to Azure without building using Visual Studio Team Services

I'm attempting to deploy a web site to Azure using VSTS. Basically, I commit code to the Git repo and have it set up to run CI, so it begins building as soon as I commit. However, once it hits the release section, it never copies the code to the Azure web app; instead, it gives me this line:
Info: Updating file ({projectname}\error.txt).
It doesn't copy the files I changed, but rather always just copies this file. I checked and there is indeed an error.txt file in my website directory in Azure, but it is always blank.
This build/deploy process isn't "standard" because the build step only downloads the source code; it doesn't build anything, since the site is a "web site" rather than a "web application" and doesn't need to be built.
So my build step is as follows:
Get Sources
Run on Agent - this step is empty
so the idea is that it just downloads everything from source control, that's it.
Then, my release step is as follows:
Artifacts are from the build step above
deploy to environment 1 (dev)
Azure app service deploy, using "package or folder" as $(System.DefaultWorkingDirectory)/
Any idea what I might be doing wrong here?
So I actually figured this out and will leave this here in case anyone else needs it.
I admit I'm pretty new to the Azure/VSTS world, so maybe someone else is making my mistake as well.
If you don't need to "build" your project, then don't. I resolved it by simply skipping the build step altogether. What I was really after was to just download the files from source control and deploy them as-is.
In your release editor, you can specify which "artifact" you want to use to release, and one of the options is source control, which is what I did.
This would be useful for websites like mine where you don't need to build them (mine is DNN/DotNetNuke, so you don't build it before deploying).

Unique Need - Perform Team Services releases using artifacts created in an external build

I have a unique need where I have to perform releases from Team Services using a release pipeline and artifacts that were created in a previous, external build. I have the artifacts that were created: dacpacs, websites, etc.
I would like to deploy these items using the features in release pipelines, but artifact sources can only come from a build or some other version control.
My approach (hack) was to use a build to copy the external files and publish them into the artifact container for that build. I could then use the release pipelines to do my releases. But build copy tasks only seem to work with paths into a repo.
My fallback will be to use the release pipeline and PowerShell to do the releases with these externally created artifacts. I would sure like to avoid this, since there is nice capability in the release pipeline tasks.
This is a compliance requirement my firm has, which results in this rather crazy post.
Any help would really be appreciated.
You can use the Copy Files task and the Publish Build Artifacts task in your build definition.
Copy Files task
Source Folder: specify the folder that contains your external build artifacts, such as C:\project\a.
Contents: use wildcards to specify which files to copy, such as **\*.dll, which will copy all *.dll files in C:\project\a and its subfolders.
Target Folder: where you want to copy these files. Usually it's $(build.artifactstagingdirectory).
Publish Build Artifacts task
Path to Publish: set this to the same value as the Target Folder in the Copy Files task, such as $(build.artifactstagingdirectory).
Note: the Copy Files task looks for the source folder on the machine where the private agent is located.
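For reference, the same settings expressed as a YAML sketch (C:\project\a is the example path from above, and the artifact name is arbitrary):
```yaml
# Sketch: pick up externally built *.dll files from the private agent machine
# and publish them as an artifact that a release pipeline can consume.
steps:
- task: CopyFiles@2
  inputs:
    SourceFolder: 'C:\project\a'
    Contents: '**\*.dll'
    TargetFolder: '$(build.artifactstagingdirectory)'

- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(build.artifactstagingdirectory)'
    ArtifactName: 'external-drop'
```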