VSTS Online - Copy files from AGENT MACHINE to VSTS - azure-devops

I use the VSTS online build and release process. I have two servers with a firewall between them (I can't just run a script on Server1 to move a file between the servers). I have installed the VSTS agent on both servers, and each server is assigned to its own pool in VSTS. I can release to either server without any issues.
What I can't figure out (or whether it's even possible) is how to copy files FROM an agent in one pool to VSTS' release working directory/temp path (or, even better, from one pool directly to another agent pool).
For example, I have Server1 in Pool1 and Server2 in Pool2. For my release steps, I have selected "Run on agent" and chosen Pool1. I then have a task that copies files over to the agent, and it does its thing. What I need to do is then pull a zip file from a path accessible by the Pool1 server(s) down to VSTS and send that zip file to a path accessible by the Pool2 server(s).
Is it possible to download a file from an agent pool? I assume that if the "run on Pool1" steps could store that zip file somewhere in the release temp path/working directory, I could do a Windows file copy to send it from the working directory to the "run on Pool2" steps.

If the other machine can be reached over the network, you can use the Windows Machine File Copy task to copy files from one machine to another. Settings for the task are as follows:
Source: path for the files you want to copy, such as $(System.DefaultWorkingDirectory)/build/drop.
Machines: IP of the other machine you want to copy to.
Admin Login: ComputerName\AdminUserName.
Password: password for Admin user name.
Destination Folder: path for the other machine you want to copy files to.
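If you would rather do the same thing from a script (for example in a PowerShell task), a minimal sketch looks like this; the share path, account name, and password variable are placeholders you would replace with your own values:
# All names below are placeholders; supply the target machine's share and credentials
$password = ConvertTo-SecureString $env:TARGET_ADMIN_PASSWORD -AsPlainText -Force
$cred = New-Object System.Management.Automation.PSCredential('TARGETPC\AdminUserName', $password)
# Map the remote share with explicit credentials, copy the drop, then clean up
New-PSDrive -Name Target -PSProvider FileSystem -Root '\\192.168.1.10\Deploy' -Credential $cred | Out-Null
Copy-Item -Path "$env:SYSTEM_DEFAULTWORKINGDIRECTORY\build\drop\*" -Destination 'Target:\' -Recurse -Force
Remove-PSDrive -Name Target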
If you still can't access the other machine because of the firewall settings, you can upload the files to a place that both machines can access (such as your own website, GitHub, etc.) and then use a PowerShell task to download the files from there.
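For the download side, a PowerShell task could pull the file back down with something like the following sketch (the URL and paths are hypothetical):
# URL and paths are hypothetical; point them at wherever you uploaded the zip
$url  = 'https://example.com/artifacts/package.zip'
$dest = "$env:TEMP\package.zip"
Invoke-WebRequest -Uri $url -OutFile $dest
# Unpack into a folder the Pool2 steps can reach
Expand-Archive -Path $dest -DestinationPath 'C:\Drops\package' -Force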

No, there isn't a task to download files from one agent to another, and a middle server is needed if these servers can't connect to each other to upload/download files (e.g. over FTP or HTTP).
On the other hand, the files aren't stored in the agent pool; they are on the agent. The agent pool is only used to organize agents: different servers/agents can be in the same agent pool, and a server can be in multiple agent pools. See Agent pools and queues.

It sounds like you are going to end up with a very brittle, spaghetti-type build. Can these files not be held in source control, or produced as an artifact of a build? That way you could clone/pull the repo, or just use the Download Artifact task that is out in preview now.

Related

Copy files from one windows server to another in the release phase of VSTS

I have a use case where I need to copy some binaries from one Windows server to another into a specified directory and then restart IIS. This involves the steps below:
create a backup of the existing files on Win Server 1 -> stop IIS -> copy the new files from Win Server 2 -> start IIS
The app is running on an EC2 instance.
Is there a way to automate this in the release phase of VSTS? How can I achieve this if it is not supported out of the box?
I need to copy some binaries from one windows server to another in a specified directory and then restart the IIS server.
You could use the Windows Machine File Copy task to copy those binaries from one Windows server to another:
- task: WindowsMachineFileCopy@2
  inputs:
    sourcePath:
    #machineNames: # Optional
    #adminUserName: # Optional
    #adminPassword: # Optional
    targetPath:
    #cleanTargetBeforeCopy: false # Optional
    #copyFilesInParallel: true # Optional
    #additionalArguments: # Optional
Or you could create a shared folder on the remote server and then just use the Copy Files task to copy those binaries.
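If you go the shared-folder route, the share only has to be created once on the remote server, for example (the folder name, share name, and account are examples):
# Run on the remote server itself: create the folder and share it (Windows Server 2012+)
New-Item -Path 'C:\Deploy' -ItemType Directory -Force | Out-Null
New-SmbShare -Name 'Deploy' -Path 'C:\Deploy' -FullAccess 'DOMAIN\DeployUser'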
To restart IIS, as Krzysztof Madej said, you could use the PowerShell on Target Machines task to execute a PowerShell script that restarts IIS.
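Put together, the script handed to the PowerShell on Target Machines task could mirror the steps from the question roughly like this (all paths are placeholders):
# 1. Back up the existing files (paths are placeholders)
Copy-Item -Path 'C:\inetpub\MyApp' -Destination "C:\Backups\MyApp-$(Get-Date -Format yyyyMMddHHmm)" -Recurse
# 2. Stop IIS
iisreset /stop
# 3. Copy the new binaries from the other server's share
Copy-Item -Path '\\winserver2\Drop\*' -Destination 'C:\inetpub\MyApp' -Recurse -Force
# 4. Start IIS again
iisreset /start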
It should be doable. What you should do is:
enable PS remoting on the EC2 instance to be able to stop and start IIS (a one-time setup sketch follows after this list). On Azure DevOps you may use the PowerShell on Target Machines task
copy files into and out of the EC2 instance over SCP; for that you can use the Copy Files Over SSH task
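Enabling remoting is a one-time setup; a minimal sketch (the host names are placeholders):
# On the EC2 instance (elevated): allow incoming remote PowerShell sessions
Enable-PSRemoting -Force
# On the machine that opens the connection (e.g. the agent), trust the instance
# explicitly when the two machines do not share a domain; host name is a placeholder
Set-Item WSMan:\localhost\Client\TrustedHosts -Value 'ec2-host.example.com' -Force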

VSTS "Windows Machine File Copy" task failed: ErrorMessage: 'The network path was not found'

In VSTS, is it possible to copy files built in VSTS to my local PC? I found a task called "Windows Machine File Copy" and I tried to use it to copy files to my local PC.
The task has a Machines field.
I followed the instructions and entered my PC's computer name there. Then I shared a folder named "test1". I want to copy files built in VSTS to the "test1" folder on my PC, but the task failed with the error 'The network path was not found'.
Could someone who has experience with this task provide some help? Thank you.
The error message 'The network path was not found' points out the problem clearly.
Copying files to a local PC requires that the agent machine can access the local PC, and obviously the hosted agent cannot reach it.
So as a workaround you can set up a private build agent (on a machine in your local network) for the build (Deploy an agent on Windows), then grant the build agent service account permission to access the other local PC. Then you can copy the files.

Deployment of Windows Services on Remote Servers (Different Domain)

Is there a simpler way of deploying Windows Services from TFS than using a Powershell script, run on the TFS server, which:
Stops the existing Windows service on the remote server
Copies the files to a shared folder on the remote server (Copy-Item)
Starts the Windows service on the remote server
If not, can any other continuous integration/deployment tool do this better?
As the TFS server is in a different domain than the remote server, can we share a folder for a specific user? I tried to run the PowerShell script as a user from the target domain, but of course it is not recognized as a valid user on the TFS server.
Finally, is there any difference between deploying to a hosted remote server and deploying to the cloud?
Thanks,
In a task-based build system (TFS 2015+), you can try installing Windows Service Release Tasks, an extension that contains tasks to start and stop Windows services as well as change their startup type.
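If you stay with a script instead, the same stop/copy/start sequence can run over PS remoting with explicit credentials, which avoids the cross-domain shared-folder problem. A hedged sketch; the server name, service name, and paths are placeholders:
# Server name, service name, and paths below are placeholders
$cred = Get-Credential -Message 'Account that is valid on the remote server'
$session = New-PSSession -ComputerName 'remote-server.example.com' -Credential $cred
Invoke-Command -Session $session -ScriptBlock { Stop-Service -Name 'MyWindowsService' }
# Copy through the remoting session instead of a shared folder (PowerShell 5.0+)
Copy-Item -Path 'C:\Drop\MyService\*' -Destination 'C:\Services\MyService' -ToSession $session -Recurse -Force
Invoke-Command -Session $session -ScriptBlock { Start-Service -Name 'MyWindowsService' }
Remove-PSSession $session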

Managing Multiple servers in an environment with Powershell DSC

I want to manage the servers in our staging pipeline with PowerShell DSC (push model). The servers map to the environments as follows:
Development: 1 server
Test: 2 servers
UAT: 2 servers
Production: 2 servers
The servers within one environment have the same configuration, but the configuration differs between environments. I wanted to go with the push model because I do not have to set up a pull server.
PowerShell DSC offers the option to manage the configuration via configuration data in a separate file, but this comes with the caveat that you need to specify a node name that matches the respective server name. That means I need to duplicate the configuration data for each server in an environment, and when changing a configuration value I need to remember that there is a second place to update it.
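To illustrate the duplication (the node names and settings are made up), the configuration data has to repeat the same values for every server in an environment:
# Node names and settings are made up; note the repeated values per environment
$ConfigData = @{
    AllNodes = @(
        @{ NodeName = 'TEST-WEB01'; Environment = 'Test'; AppPoolCount = 2 }
        @{ NodeName = 'TEST-WEB02'; Environment = 'Test'; AppPoolCount = 2 } # duplicated
    )
}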
Additionally, I do not really care about the server names. If the servers are exchanged tomorrow for new servers, the configuration should be just applied which is relevant to the environment.
What is the best practice approach to manage multiple servers within one environment with the same configuration?
Check these links; I think they cover your scenario:
Using A Single DSC Configuration for Multiple Servers
DSC ConfigurationNames with multiple nodes
The mof file that gets produced does not contain the node name inside it, so as long as you build a generic configuration, you can rename the file after the fact at deploy time.
You can create one config for each environment with some generic name. Then enumerate the list of servers and make a copy of the config for each one with that servers name.
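A minimal sketch of that approach, with made-up environment and server names:
# Compile one generic configuration per environment...
Configuration UatConfig {
    Node 'localhost' {
        File DeployMarker {
            DestinationPath = 'C:\Deploy\environment.txt'
            Contents        = 'UAT'
        }
    }
}
UatConfig -OutputPath 'C:\Dsc\UAT' | Out-Null   # produces localhost.mof
# ...then clone the mof once per server; the mof itself carries no node name
foreach ($server in 'UAT-WEB01', 'UAT-WEB02') {
    Copy-Item 'C:\Dsc\UAT\localhost.mof' "C:\Dsc\Deploy\$server.mof" -Force
}
# Pushes each <server>.mof to the matching server (requires WinRM access)
Start-DscConfiguration -Path 'C:\Dsc\Deploy' -Wait -Verbose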
You can take it a step further: have a share where you create a folder for each server matching the server's name, then copy the mof for that server into its folder as localhost.mof. Each machine can then run Start-DSCConfiguration -Path \\server\share\$env:computername as part of your deployment script.

Jenkins - Publish Over CIFS Plugin

I am getting confused with this plug in.
Basically my goal is to deploy files from Server1 to Server2.
Now the build output directory is in a specific location on Server1,
example: E:\BuildOutput\Apps\Application1\Bin\Release\
I need to deploy them in Server2: C:\Program Files\Tools\Application1\Bin\
How do I set up this plugin to work to what I need?
I am getting stressed by the number of files that need to be deployed to the other server; I just wish a simple xcopy to another server could work.
I am looking for a plugin, if not this one, that deploys only the files that have changed to another server, for automated feature testing.
Any other method will do too, if possible.
XCOPY should work fine. You need to create a share on Server2 in the desired location.
Go to the Jenkins configuration and click "Add build step"->"Execute Windows batch command"
You should be able to execute any DOS commands you need there.
XCOPY E:\BuildOutput\Apps\Application1\Bin\Release\my_app.exe \\SERVER2\Share
If you don't want to share your application's bin directory:
Make a different share on Server2
Configured build to XCOPY to the new share
Add Server2 as a build node (Manage Jenkins->Nodes)
Create a new build job to move the files where you want them
Tie the new job to the Server2 build node (check the box "Restrict where this project can be run" in the job config)
If your account has admin rights on Server2 you can just connect to the admin share of the C: drive like this:
XCOPY E:\BuildOutput\Apps\Application1\Bin\Release\* "\\SERVER2\c$\Program Files\Tools\Application1\Bin"
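If you only want to push the files that actually changed, as the question asks, ROBOCOPY (run from the same batch step) skips files whose size and timestamp are unchanged:
ROBOCOPY E:\BuildOutput\Apps\Application1\Bin\Release "\\SERVER2\c$\Program Files\Tools\Application1\Bin" /E
/E includes subfolders; adding /PURGE (or using /MIR) would also delete files on Server2 that no longer exist in the build output.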