Deploy issue with Azure Release Pipeline

I am having trouble deploying files to my servers through the Release Pipelines.
I need to copy files to a Windows server and a Linux server. I have tried using the file copy and SSH file copy tasks, but they seem to be getting blocked because the Microsoft-hosted servers aren't on my firewall whitelist. What is worse, I can't seem to get a reliable list of IPs that I need to whitelist, and even if I did, they appear to change over time.
So, any advice appreciated.
Also, I am a bit confused about the Azure agent. My understanding was that you install agents on the servers precisely so that you don't need to worry about firewall issues. I just have the feeling I am missing something. I have no idea what that agent is doing at the moment; it certainly doesn't seem to be helping with the file deploy.
Thanks in advance!
Self-hosted agent: an agent that you set up and manage on your own to run jobs.
To resolve this issue, you can create your own private (self-hosted) agent, then add the IP address of the machine where the agent is deployed to the firewall whitelist of your server machines. The release pipeline then runs on your private agent, and because that machine's IP is whitelisted, the copy tasks are no longer blocked by the firewalls on the Windows and Linux servers.
You can refer to the document Self-hosted agents to create your private agent.
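As a rough sketch, registering a Windows self-hosted agent from the extracted agent package looks like the following (the organization URL, PAT, pool, and agent name are placeholders, not values from this thread; on the Linux server the equivalent is ./config.sh):

```powershell
# Run from the extracted agent folder on the machine whose IP you whitelisted.
# All values below are placeholders; substitute your own.
.\config.cmd --unattended `
    --url https://dev.azure.com/yourorg `
    --auth pat `
    --token YOUR_PAT_HERE `
    --pool Default `
    --agent myprivateagent `
    --runAsService
```

Because the agent makes an outbound connection to Azure DevOps, only your servers' firewalls need to allow it; no inbound rules for Microsoft IP ranges are required.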
Related
I'm new to DevOps, and I'm trying to deploy a Windows service to an on-premises VM.
I have added the file copy task and set up the username/password, source, destination, and all the other required settings,
but I get this error, which fails the release:

[error]Failed to Create PSDrive with Destination: '\\my_server\D$\TestCI', ErrorMessage: 'The network path was not found'
Can you help with this?
Since you work on Azure DevOps Server, you should be using self-hosted agents.
ErrorMessage: 'The network path was not found'
The cause of this issue could be that the self-hosted agent and the target Windows machine are not in the same Windows domain.
To solve this issue, you could create a new self-hosted agent on a machine in the same Windows domain (i.e., on the same local network), then grant the build agent service account the permissions to access the other local PC.
Here is a ticket with a similar issue that you can refer to.
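Before re-registering the agent, it is worth confirming from the agent machine that the target path is reachable at all. A quick check, using the \\my_server\D$\TestCI path from the error message as a stand-in for your real server:

```powershell
# Run on the self-hosted agent machine; 'my_server' is the placeholder
# from the error message above, so substitute your target machine name.
Test-NetConnection -ComputerName my_server -Port 445   # is SMB reachable?
Test-Path '\\my_server\D$\TestCI'                      # is the admin share visible?
```

If the port test fails, it is a network/firewall problem; if only Test-Path fails, it is more likely a permissions or domain-membership problem.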
We are trying to start using the Azure Pipelines agents instead of self-hosted ones. While converting over our acceptance tests, I am running into an issue with the agent not allowing our test to connect to an API we spin up within the agent, running on port 44392. I noticed this post, How to open TCP port on localhost on VSTS build agent?, from a couple of years ago, and it is pretty similar to how our test works. Just wondering if the answer is still accurate or not.
Since you are using the hosted agents, the machine is a shared resource between many Azure DevOps organizations (tenants) and is managed (and locked down) by Microsoft.
In other words, end users are not allowed to open ports on these agents. The answer in your link is still valid.
You may have to install an agent on your own virtual machine and run the build there. The VM can be in the cloud or on premises. You trade simplicity and cheapness for full control.
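Note that the restriction concerns opening ports to the outside world; if the linked answer still applies, connections to localhost from within the same agent should work. A minimal sanity check you could run inside the job, assuming your API has already started:

```powershell
# Run inside the pipeline job after the API under test has started.
# Port 44392 is the one mentioned in the question.
Test-NetConnection -ComputerName localhost -Port 44392
```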
I have two web servers, one service server, and a database server, and all of these servers are domain joined. I have set up my private build agent from VSTS, from which I can build my artifacts based on the build configuration. All my DEV, QA, and STAGING environments are set up on those servers.
My problem is that I am looking for a way, using PowerShell Desired State Configuration, to have the scripts, based on the environment artifacts (DEV, QA, and STAGING):
copy the artifacts to a specific location on those two web servers, and ensure the website is configured correctly with all the required permissions where these artifacts are used to host an IIS website;
perform the deletion and creation of a particular Windows service on the service server;
perform the migration activities on the database server for the particular database, since I have a separate database for each environment.
Any kind of help or suggestion would be appreciated. Thank you.
My suggestions are:
Don't use DSC for deployment (i.e. deploying applications or databases)
Use DSC for configuration (e.g. install IIS)
Install the VSTS Agent on each server in Deployment Groups mode, running as a service with local administrator privileges
Use the IIS Deploy Tasks designed for Deployment Groups
Use the PowerShell task to manage the Windows services (tip: help *-Service); see the sketch after this list
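As a minimal sketch of the "DSC for configuration" and "PowerShell for services" suggestions (the feature, service, and path names are illustrative placeholders, not taken from your environment):

```powershell
# Configuration concern: converge machine state, e.g. make sure IIS is installed.
Configuration WebServerConfig {
    Import-DscResource -ModuleName PSDesiredStateConfiguration
    Node 'localhost' {
        WindowsFeature IIS {
            Name   = 'Web-Server'
            Ensure = 'Present'
        }
    }
}
WebServerConfig -OutputPath 'C:\Dsc'
Start-DscConfiguration -Path 'C:\Dsc' -Wait -Verbose

# Deployment concern: recreate a Windows service from a PowerShell task.
# 'MyService' and the binary path are placeholders.
Stop-Service -Name 'MyService' -ErrorAction SilentlyContinue
sc.exe delete 'MyService'
New-Service -Name 'MyService' -BinaryPathName 'D:\Services\MyService.exe'
Start-Service -Name 'MyService'
```

Deployment steps like these run per release on the deployment group targets, while the DSC configuration only needs to run when a machine's desired state changes.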
We are trying to set up a release (continuous deployment) from our VSTS in the cloud. After the build is done, the Hosted VS2017 agent tries to deploy the artifacts to the target server.
Firstly, it failed because our firewall blocked the target server from receiving the artifact (a .zip containing all the stuff). In fact, if I connect to the server via RDP and try to download the artifact from a browser, it's blocked.
Our security team temporarily disabled this firewall rule, and it worked (this also means the hosted agent has line of sight to the target server). Now they don't want this rule off; they would like to know what user account tries to download/publish the artifact from the hosted agent, so they can allow the download of the .zip only for that specific user. I'm not sure if it's the same account that runs the service on the hosted agent, or Network Service (and therefore the target server's own credentials), or some other account.
How do I know what user account should be granted rights in our firewall to download anything?
You can use the Windows Machine File Copy task, which lets you provide a username/password to use for copying the files.
However, it uses RoboCopy over SMB to copy the files. As a result, it's probably safer to set up a private agent within your network that has line of sight to the target servers, rather than opening up a whole slew of ports in your firewall.
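For reference, the task does roughly the equivalent of the following (server name, share, paths, and credentials are placeholders), which also shows why SMB, i.e. TCP port 445, has to be reachable and why the credentials you supply to the task are the account that matters for the firewall rule:

```powershell
# Placeholders throughout; this mirrors the RoboCopy-over-SMB behaviour
# of the Windows Machine File Copy task.
net use '\\target-server\D$' 'P@ssw0rd!' /user:DOMAIN\deployuser
robocopy 'C:\agent\_work\r1\a\drop' '\\target-server\D$\Deploy' /E /Z
net use '\\target-server\D$' /delete
```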
I am using VSTS (Visual Studio Team Services, formerly known as Visual Studio Online) for continuous deployment to an Azure VM, using an Azure File Copy task in my build definition.
The problem I am having is that I have an ACL set up on the Azure VM that only allows connections from my office for remote PowerShell.
With the ACL in place, the Azure File Copy task fails with an error like "WinRM cannot complete the operation. Verify that the specified computer name is valid, that the computer is accessible over the network, and that the firewall exception for the WinRM service is enabled and allows access from this computer." With the ACL removed, everything works.
To be clear, this is not a problem with WinRM configuration or firewalls or anything like that. It is specifically the ACL on the VM that is blocking the activity.
So the question is, how can I get this to work without completely removing the ACL from my VM? I don't want to open up the VM Powershell endpoint to the world, but I need to be able to have the Azure File Copy task of my build succeed.
You can have an on-premises build agent that lives within your office's network and configure things so that the build only uses that agent.
https://msdn.microsoft.com/library/vs/alm/release/getting-started/configure-agents#installing
The Azure File Copy task needs to use the WinRM HTTPS protocol, so when you enable the ACL, the hosted build agent won't be able to access WinRM on the Azure VM, and that will cause the Azure File Copy task to fail.
When copying the files from the blob container to the Azure VMs, Windows Remote Management (WinRM) HTTPS protocol is used. This requires that the WinRM HTTPS service is properly setup on the VMs and a certificate is also installed on the VMs.
There isn't any easy workaround for this as far as I know. I would recommend that you set up your own build agent in your network that can access WinRM on the Azure VM.
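If you want to confirm that it is the ACL (rather than WinRM configuration) blocking the hosted agent, you can probe the WinRM HTTPS port from inside and outside the allowed network; the VM name below is a placeholder:

```powershell
# Checks reachability of the WinRM HTTPS port (default 5986) on the Azure VM.
# 'myvm.cloudapp.net' is a placeholder for your VM's DNS name.
Test-NetConnection -ComputerName 'myvm.cloudapp.net' -Port 5986
```

From your office the TCP test should succeed; from a network outside the ACL it should fail, which matches the error you are seeing.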