VSTS Windows File Copy Variables not working? - azure-devops

I'm attempting to run a "Windows Machine File Copy" task in a deploy step on VSTS via a private agent.
Reading the documentation at https://learn.microsoft.com/en-us/vsts/build-release/tasks/deploy/windows-machine-file-copy?view=vsts, the section for the "Source" parameter says:
You can use pre-defined system variables such as $(Build.Repository.LocalPath) (the working folder on the agent computer), which makes it easy to specify the location of the build artifacts on the computer that hosts the automation agent.
When I attempt to use that exact variable (because I literally want to copy the most recent source files), I get the following error when running the task:
[error]Source path 'C:\agent_work\r1\a\$(Build.Repository.LocalPath)' does not exist.
2018-04-18T05:52:09.2461155Z ==============================================================================
2018-04-18T05:52:09.2461984Z Task : Windows Machine File Copy
2018-04-18T05:52:09.2462630Z Description : Copy files to remote machine(s)
2018-04-18T05:52:09.2463336Z Version : 2.0.4
2018-04-18T05:52:09.2463945Z Author : Microsoft Corporation
2018-04-18T05:52:09.2464620Z Help : [More Information](https://go.microsoft.com/fwlink/?linkid=627415)
2018-04-18T05:52:09.2465332Z ==============================================================================
2018-04-18T05:52:13.1043515Z ##[error]Source path 'C:\agent_work\r1\a\$(Build.Repository.LocalPath)' does not exist.
2018-04-18T05:52:13.1533941Z ##[section]Finishing: Copy files To Server
2018-04-18T05:52:13.1653576Z ##[section]Finishing: Release
Am I missing something?
For background, I want to do this Robocopy/WMFC approach because I'm deploying a static website whose source is around 40 GB total. I don't want to copy the entire build output to the artifacts directory first, because that will take too long; I need quicker deploys than copying 40 GB of data. This is a legacy site and there isn't much I can do about the way it's structured.
Ultimately, I'm trying to pull the latest source (without a clean, because pulling 40 GB is too slow) and then do a Robocopy /MIR to copy only the changed files to the destination IIS directory on another machine.

I can see from your question that you are doing this in a release, based on C:\agent_work\r1\a\$(Build.Repository.LocalPath), where r1 indicates a release.
Based on the release variable documentation, $(Build.Repository.LocalPath) is not available in a release.
You should use one of the release variables instead, like $(System.DefaultWorkingDirectory)\<artifact name>.
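As a minimal sketch, assuming a hypothetical artifact alias MySite and a hypothetical target share: the Windows Machine File Copy source would be $(System.DefaultWorkingDirectory)\MySite, and an inline PowerShell step on the release agent could mirror only the changed files:

```powershell
# Release variables surface as environment variables in script steps
# (dots become underscores). All paths here are placeholders.
$source = Join-Path $Env:SYSTEM_DEFAULTWORKINGDIRECTORY 'MySite'
robocopy $source '\\WEBSERVER\wwwroot\MySite' /MIR
```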

Related

Does makeappx.exe exist in azure pipeline?

I am trying to pack my program in an Azure release pipeline using makeappx.exe.
But I get the following error:
'makeappx' is not recognized as an internal or external command
I use the windows-2019 hosted agent.
Does makeappx.exe exist in azure pipeline?
Indeed, I could also reproduce this issue on my side. That's because makeappx.exe is an external command that Windows cannot resolve from an arbitrary directory: it is not on the PATH.
To investigate this issue, I added a copy task to copy the contents of the default SDK folder:
C:\Program Files (x86)\Windows Kits\10\bin\x64
Obviously, makeappx.exe does not exist under that default folder.
According to the Vs2019-Server2019-Readme.md, the windows-2019 hosted agent includes the Windows 10 SDKs 16299, 17134, 17763 and 18362.
So, to invoke makeappx.exe, we can use the full path of the file, like:
C:\Program Files (x86)\Windows Kits\10\bin\10.0.18362.0\x64\makeappx.exe
As a test, it works fine on the windows-2019 hosted agent.
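A sketch of such a script step; rather than hard-coding one SDK version, it picks the newest one installed (the pack arguments and output names are placeholders, not a known-good recipe):

```powershell
# Find the newest SDK's x64 makeappx.exe under the Windows Kits root.
$sdkBin = 'C:\Program Files (x86)\Windows Kits\10\bin'
$makeappx = Get-ChildItem -Path $sdkBin -Directory |
    Sort-Object Name -Descending |
    ForEach-Object { Join-Path $_.FullName 'x64\makeappx.exe' } |
    Where-Object { Test-Path $_ } |
    Select-Object -First 1

# Hypothetical package contents and output path.
& $makeappx pack /d "$Env:BUILD_ARTIFACTSTAGINGDIRECTORY\AppFiles" /p "$Env:BUILD_ARTIFACTSTAGINGDIRECTORY\MyApp.appx"
```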
Hope this helps.
I think he is asking about this command being runnable in the pipeline build, not from a local drive.

PowerShell script to copy a zip file from one location to another, creating a new folder in the destination, every time a TFS build runs

Hi, can anyone help with this? I need a PowerShell script to run from a TFS build in the post-build section; it has to copy the generated zip file to a local location, into a folder named with the build number or build name, every time a build runs.
If you use TFS 2015, it's suggested to use the new build system, as there is already a Copy Files task in the new build system, which you can use directly.
For detailed information on this task, refer to: https://github.com/Microsoft/vsts-tasks/tree/master/Tasks/CopyFiles
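If you need to stay on the XAML build system, a post-build PowerShell script along these lines could work as a sketch; the drop path is a placeholder, and the TF_BUILD_* environment variables are the ones the TFS 2013/2015 XAML build sets for post-build scripts:

```powershell
# Copy the generated zip into a new folder named after the build number.
$dest = Join-Path '\\fileserver\Drops' $Env:TF_BUILD_BUILDNUMBER
New-Item -ItemType Directory -Path $dest -Force | Out-Null
Copy-Item -Path (Join-Path $Env:TF_BUILD_BINARIESDIRECTORY '*.zip') -Destination $dest
```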

Custom action after ClickOnce deployment / publishing

How can I run a custom script which will upload ClickOnce deployment files to a web server (in my case Windows Azure Blob Storage) right after publishing? Is it possible to modify the MSBuild file in some way so it would run a custom script right after ClickOnce publishes the files into a local folder?
Yes, you can hook into the build process using various techniques:
pre- and post-build actions (from the Visual Studio project properties menu); this is actually an Exec task hooked into your project file
you can override the DependsOn property for a concrete target and append execution of your own target (the pre-MSBuild 4.0 way)
you can declare your own target and hook it in with the AfterTargets/BeforeTargets attributes (the MSBuild 4.0 way).
As for uploading something to Blob storage: you can use an Exec task in your own target, or whatever tool/script you usually use for uploading files to a website/Blob storage.
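A sketch of the AfterTargets approach (the target name, upload script, and its parameter are placeholders for whatever tool you actually use):

```xml
<!-- In the .csproj: runs after ClickOnce publishing completes. -->
<Target Name="UploadClickOnceFiles" AfterTargets="Publish">
  <!-- Replace with your actual upload tool/script (hypothetical here). -->
  <Exec Command="powershell -File Upload-ToBlob.ps1 -Source &quot;$(PublishDir)&quot;" />
</Target>
```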
NB: You could clarify your question on the following points (if you need a more concrete answer):
what kind of build process you are using: build from VS, a CI server with a custom MSBuild script, a CI server building your .sln file, etc.
what kind of script/tool you want to execute to upload the build result.
whether you know the name of the last executed MSBuild target, after which you want to fire your tool.

TFS Build Website deployment package web.config transformation not working

So I am trying to use TFS Build to generate deployment packages for my 3 environments (ST, UAT, Prod).
This is what I followed to successfully generate the package locally:
http://social.msdn.microsoft.com/Forums/en-US/tfsbuild/thread/74bb16ab-5fe6-4c00-951b-666afd639864/
So my local machine will generate the package for the active configuration and everything is good. Here is my build definition:
/p:DeployOnBuild=true;DeployTarget=Package
I run my solution file and the web deployment project in the Projects To Build.
It creates the respective ST, UAT and PROD folders. Each of these contains a _PublishedWebsites folder, which has 2 folders:
1) MyDeploymentProject - contains the transformed web.config
2) MyDeploymentProject_Package - contains the Package folder contents along with the zip file and SetParameters files. Here nothing is transformed. But if I check the TempBuildDir on the TFS server, it does contain the transformed config.
When I compared the local and server logs, I found that locally, after the transformation, the files are updated and the package is created, whereas on TFS the AfterBuild target is called, the transformation is done, and it ends there.
this is my local log
Target "WPPCopyWebApplicaitonPipelineCircularDependencyError" skipped, due to false condition; ($(WPPCopyWebApplicaitonPipelineCircularDependencyError)) was evaluated as (False).
Target "ProcessItemToExcludeFromDeployment" in file "C:\Program Files\MSBuild\Microsoft\VisualStudio\v10.0\Web\Microsoft.Web.Publishing.targets" from project "C:\TAX-IT\Main\Source\TDDB\TDDB_deploy2\TDDB_deploy2.wdproj" (target "PipelineCollectFilesPhase" depends on it):
Done building target "ProcessItemToExcludeFromDeployment" in project "TDDB_deploy2.wdproj".
Target "GetProjectWebProperties" in file "C:\Program Files\MSBuild\Microsoft\VisualStudio\v10.0\Web\Microsoft.Web.Publishing.targets" from project "C:\TAX-IT\Main\Source\TDDB\TDDB_deploy2\TDDB_deploy2.wdproj" (target "PipelineCollectFilesPhase" depends on it):
Using "GetProjectProperties" task from assembly "C:\Program Files\MSBuild\Microsoft\WebDeployment\v10.0\....\VisualStudio\v10.0\Web\Microsoft.Web.Publishing.Tasks.dll".
Task "GetProjectProperties"
I am not sure what is wrong.
Also, I installed VS 2010, Web Deploy 2.0 and 3.0, and the Web Deployment Tools on my build servers.
Has anyone faced and resolved this?
Please help.
Thanks
MadCoder,
From what I've gathered from your description, you have everything set up correctly. It seems like you are just missing the "Configuration" parameter. When you run the build definition, it uses the configuration specified in your "Configurations to Build" argument. If you want multiple configurations built (as you are suggesting), you'll need to have multiple configurations defined. One question I have: when you look at the logs of the TFS build process, do you see multiple configurations built, or only one? If you only see one, then you don't have all of the configurations defined that are needed to transform the config file. Going by your description, you'll need all three configurations (ST, UAT, Prod) listed in your build definition's "Configurations to Build" argument.
If you don't want to deploy to a web server, you can stop reading here.
If you choose to use a TFS Build Definition to deploy to a web server, you'll need to have a target web server somewhere and you'll need to install and configure the Web Deploy v2/v3 on that server as well.
When you are using TFS build definitions to deploy, the transformation happens upon deployment, not during packaging (prior to deployment). It may package up a transformed config somewhere, but it won't actually transform the config bundled with the website. The only way I've been able to get the deployment to actually work with a transformed config is when I had a website specified in the MSBuild args. Here is an example of my MSBuild args:
/p:DeployOnBuild=True /p:DeployTarget=MSDeployPublish /p:MSDeployPublishMethod=RemoteAgent /p:MsDeployServiceUrl=MyWebServer/MsDeployAgentService /p:DeployIisAppPath="MyWebsite as named in IIS" /p:UserName=MyDomain\MyWebDeployUser /p:Password=MyWebDeployPassword
If you don't want MSBuild to do the actual deployment (I prefer not to, because then your deployment process is tied to TFS), you can do the deployment after the build process and use the CTT project, found on CodePlex. This tool performs the exact same transformations as MSBuild, but it also adds the ability to parameterize settings, so you can define classes of environments (for example, 3 QA environments, 2 staging environments, etc.) and still use the respective transforms for that class of environment.
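For reference, the transforms these tools apply are plain XDT; a minimal Web.UAT.config of that kind (the connection string name and server are made up for illustration) looks like:

```xml
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <connectionStrings>
    <add name="MainDb"
         connectionString="Server=UATDB;Database=MyApp;Integrated Security=true"
         xdt:Transform="SetAttributes" xdt:Locator="Match(name)" />
  </connectionStrings>
</configuration>
```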

Powershell in CruiseControl.net to backup existing folder before deploying new version of the code

I would like to zip a bunch of files (.exe and .dll) before I overwrite them with the new build. Is there a simple way to zip files without using some sort of DLL?
Just creating a folder with the build number / date-time stamp would also work great. How do I then pass parameters from the CruiseControl build process into my PowerShell script that will do the work?
Is this a sustainable way to do things?
Thanks
You could either use CCNET's Package Publisher task directly, or zip the files via the PowerShell task introduced in CCNET 1.5.
Configuration sample for PowerShell Task:
<powershell>
  <description>Adding scheduled jobs</description>
  <scriptsDirectory>ScheduledTasks</scriptsDirectory>
  <script>CreateScheduledJobsFromListOfTasks.ps1</script>
  <buildArgs>-zipDir="C:\foo"</buildArgs>
</powershell>
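The script itself could be a sketch like the following; it relies on .NET 4.5's System.IO.Compression (so no third-party DLL is needed), and the parameter names are placeholders wired up via buildArgs as above:

```powershell
param(
    [string]$SourceDir,   # folder containing the .exe/.dll files to back up
    [string]$BackupRoot,  # where the backups are kept
    [string]$Label        # build number or date-time stamp, e.g. from CCNetLabel
)
# Zip the current binaries before the new build overwrites them.
Add-Type -AssemblyName System.IO.Compression.FileSystem
$zip = Join-Path $BackupRoot "backup-$Label.zip"
[System.IO.Compression.ZipFile]::CreateFromDirectory($SourceDir, $zip)
```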