TFS 'Powershell on Target Machines' task for machines in different AD domain - powershell

We want to use TFS Release Management for our deployments. We have several environments (dev, qa, staging, prod), each in a separate AD forest. The build machine also resides in a separate forest, and there is no trust between them.
I set up the target machines to accept CredSSP authentication for PS remoting. I was able to enter a PS session on a target machine from the build machine, but had no luck with the TFS 'PowerShell on Target Machines' task.
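For reference, my CredSSP setup was roughly the following (app.dev.local is the target host from the log below; everything else is a placeholder):

# On each target machine: allow it to receive delegated credentials
Enable-WSManCredSSP -Role Server -Force

# On the build machine: allow delegation to the target and trust the non-domain host
Enable-WSManCredSSP -Role Client -DelegateComputer "app.dev.local" -Force
Set-Item WSMan:\localhost\Client\TrustedHosts -Value "app.dev.local" -Force

# Manual test that works from the build machine
Enter-PSSession -ComputerName "app.dev.local" -Authentication CredSSP -Credential (Get-Credential)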
Here is how my task looks in TFS:
TFS PS on Target Machines task
In logs:
2016-12-30T15:04:11.0279893Z System.Management.Automation.Remoting.PSRemotingTransportException: Connecting to remote server app.dev.local failed with the following error message : WinRM cannot process the request. The following error with errorcode 0x80090322 occurred while using Negotiate authentication: An unknown security error occurred.
Is there any way to make TFS run PowerShell on target machines that reside outside of the build machine's AD domain?
An AD trust doesn't look like an option, and without working PS remoting it doesn't seem like Release Management can provide much value for us.

TL;DR;
Not directly. You have two options:
Set up a one-way trust between your primary domain and all of your sub-domains so that your production domain credentials can be used on all of them.
Use shadow accounts to allow cross-domain authentication. These are local accounts with the same username and password on each machine, which allows authentication across the boundary. This is the official MSFT workaround for authentication between non-trusted domains (a sketch follows below).
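A minimal sketch of creating such a shadow account with the LocalAccounts cmdlets; the account name is a placeholder, and the same name and password must be repeated on every machine involved:

# Run on the build machine and on each target machine with identical values
$password = Read-Host -Prompt "Shadow account password" -AsSecureString
New-LocalUser -Name "svc-release" -Password $password -PasswordNeverExpires
Add-LocalGroupMember -Group "Administrators" -Member "svc-release"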
The long answer
Other than that, since you are well off the supported happy path, you would need to implement your own custom tasks that facilitate the cross-domain authentication that you want. It should be fairly simple to implement your own tasks in PowerShell.
https://www.visualstudio.com/en-us/docs/integrate/extensions/develop/add-build-task
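The core of such a custom task could be little more than an Invoke-Command wrapper that passes an explicit credential with CredSSP; the parameter names below are illustrative, not part of any existing task:

param(
    [string]$TargetMachine,   # e.g. app.dev.local
    [string]$UserName,        # shadow or target-domain account
    [string]$Password,
    [string]$ScriptPath       # script to run on the target machine
)

$secure = ConvertTo-SecureString $Password -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential($UserName, $secure)

Invoke-Command -ComputerName $TargetMachine `
               -Credential $credential `
               -Authentication Credssp `
               -FilePath $ScriptPath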
The reality is that there are only a few limited scenarios in which you need a "test AD" environment, and it is never correct to have separate domains for Dev, QA, or Staging. AD is not designed that way, and I have never seen it work for the benefit of the organisation or the development effort. It is a product of over-paranoid sysadmins and it is a lost cause.
The only reason to have a permanent additional domain is for your sysadmins to test their domain changes and configurations.
For software development projects that actively change AD, or that require specific setups for testing, you would dynamically create your test domain along with the test machines required. That is how you create valid and repeatable tests against a domain.
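A rough sketch of standing up such a throwaway test domain with the AD DS cmdlets; the domain name is a placeholder, and in practice this would run as part of provisioning the test machines:

# On the machine that will become the throwaway domain controller
Install-WindowsFeature AD-Domain-Services -IncludeManagementTools

$safeModePassword = Read-Host -Prompt "DSRM password" -AsSecureString
Install-ADDSForest -DomainName "test.contoso.local" `
                   -SafeModeAdministratorPassword $safeModePassword `
                   -InstallDns `
                   -Force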

Related

Multi-Domain permissions problem using Azure Devops Server Build Agent running a PowerShell script

We just upgraded from TFS 2017 to ADS 2020. ADS runs in one domain and the build server runs in another. All appropriate cross-domain permissions are set up; these domains talk to each other all the time, every day.
I had a service account created for the build server from the ADS domain for the ADS pipeline to run under (even though the machine is on the other domain), and when we first tried this I got errors that the build didn't have access to write to the registry, despite the service account (which, again, is on a different domain than the build server) being in the Administrators group on the build server. At some point that error went away, but now I get an error trying to run a PowerShell script from the build.
Just to summarize: Build Server on Domain1, ADS Server on Domain2, and the service account running the pipeline is on Domain2 but is part of the Administrators group on the Build Server (Domain1). The pipeline runs fine up until it tries to execute a PowerShell script, and then it gets an error saying the permissions aren't there to run a script.
Hope this all makes sense.

How to retrieve certificates in a VSTS build if the agent is running as "network service"

In the past, we used VSTS build agents running with domain accounts on on-prem build machines. In that scenario, certificates could be stored in the domain account's personal store (manually, by logging in once with this account), so a later build could get the certificates by thumbprint for signing e.g. a manifest.
Now the agents run as "Network Service", because we no longer have a local domain (everything moved to Azure AD). Everything works except the retrieval of certificates from the store. I already used the MMC snap-in to connect to the service (VSTSAgent) and installed certificates into this personal store, but the build still fails with "Error MSB3323: Unable to find manifest signing certificate in the certificate store.".
If I log on to the machine and build from within VS, all works well, but of course I am then using a different account (with a different personal store); this at least tells me that the solution and projects are fine. The pipelines are OK as well, because they still work on the "old" build machines that use a domain account.
So, if anyone has an idea or can point me to some information on how to use the VSTS agent running as "Network Service" together with signing (from the certificate store), that would be highly appreciated.
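For reference, this is roughly how I check which personal store a given signing certificate is actually visible in from a build step; the thumbprint below is a placeholder:

$thumbprint = "0123456789ABCDEF0123456789ABCDEF01234567"   # placeholder thumbprint

# List the certificate from the user and machine personal stores, if present
Get-ChildItem Cert:\CurrentUser\My, Cert:\LocalMachine\My |
    Where-Object { $_.Thumbprint -eq $thumbprint } |
    Format-List Subject, Thumbprint, PSParentPath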
Many thanks, Sebastian

Desired state configuration

I have two web servers, one service server, and a database server, and all of these servers are domain-joined. I have set up a private VSTS build agent from which I build my artifacts based on the build configuration. All my DEV, QA and STAGING environments are set up on those servers.
My problem is that I am looking for a way, using PowerShell Desired State Configuration, to have the environment-specific artifacts (DEV, QA and STAGING) copied to a specific location on those two web servers, to ensure the website is configured correctly with all the required permissions (these artifacts are used to host an IIS website), to delete and recreate a particular Windows service on the service server, and to perform the migration activities on the database server for the relevant database, since I have a separate database for each environment.
Any kind of help or suggestion would be appreciated. Thank you.
My suggestions are:
Don't use DSC for deployment (i.e. deploy applications or databases)
Use DSC for configuration (e.g. install IIS; see the sketch after this list)
Install the VSTS Agent on each server in Deployment Groups mode, running as a service with local administrator privileges
Use the IIS Deploy Tasks designed for Deployment Groups
Use the PowerShell Task to manage the Windows Services (tip: help *-Service)
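As a rough illustration of the "DSC for configuration" suggestion above, a configuration that ensures IIS and ASP.NET are installed might look something like this; the node name is a placeholder:

Configuration WebServerConfig {
    Import-DscResource -ModuleName PSDesiredStateConfiguration

    Node 'localhost' {
        WindowsFeature IIS {
            Ensure = 'Present'
            Name   = 'Web-Server'
        }
        WindowsFeature AspNet45 {
            Ensure = 'Present'
            Name   = 'Web-Asp-Net45'
        }
    }
}

# Compile the configuration to MOF and apply it
WebServerConfig -OutputPath .\WebServerConfig
Start-DscConfiguration -Path .\WebServerConfig -Wait -Verbose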

VSTS Deployment to a deployment group from a UNC share

I am using Visual Studio Team Services (visualstudio.com) to build and deploy an ASP.NET website to two Azure VMs.
I have a build which on completion triggers a release to my two servers in a deployment group. When you configure a Deployment Group for Visual Studio Team Services you create an agent that by default runs as NT AUTHORITY\SYSTEM.
If I publish my build artifacts to Azure (the server option) then everything works fine and deployment succeeds to both my VMS. However when using a file-drop I get the following error:
The artifact directory does not exist:
\\MACHINE1\drop\RRStore\20170517.20. It can happen if the password of
the account NT AUTHORITY\SYSTEM is changed recently and is not updated
for the agent.
This is basically saying MACHINE2 cannot access \\MACHINE1\drop due to permissions. In Windows I can bring up this folder just fine, but since the agent is running as NT AUTHORITY\SYSTEM it cannot access it.
I want to use a file drop because my website is about 250 MB (although in the meantime I am using the 'publish to server' option and deploying via Team Services).
I am unclear how to give permissions to the file drop, though, as the agent is running as SYSTEM. The machines are in a WORKGROUP, and giving permissions to 'Everyone' does not seem to work.
What is the correct way to configure access to a VSTS drop folder so that the deployment agent can access it?
A few possible options:
Set up a domain (I tried doing this, but then I need a new network interface and it sounds clunky)
Continue using Team Services to deploy the artifacts (or reduce the website size!)
Save to a storage account, but again I'm not sure how to configure that.
Run as a different user account
I have had similar problems when deploying with VSTS. Instead I chose to:
Run VSTS agent on the deployment group VM as a local user with limited access.
Impersonate the account on the deployment group VM to test its access to the drop folder.
Save/cache a different credential to access the drop folder if applicable (see the sketch after this list).
(So the sensitive information stays on the VM.)
The cached credentials can be a different local user account created on the drop server just for this purpose.
Grant the local user access to various parts of the file system explicitly to limit access permission of this VSTS agent service runner account.
This should work in most cases. In fact, this same approach is used in my VSTS, Jenkins and TFS instances, and it should save you from having to set up a domain to solve this problem.
This may not be the best practice, but at least it should get you started in the right direction.
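A hedged sketch of the "test access and cache a credential" steps; the share path is taken from the error above, and the account name is a placeholder:

# From the deployment group VM, check whether the agent account can see the drop
Test-Path "\\MACHINE1\drop"

# Map the share with an explicit credential (e.g. a local account created on MACHINE1 just for this)
$cred = Get-Credential -Message "Account with read access to the drop share"
New-PSDrive -Name Drop -PSProvider FileSystem -Root "\\MACHINE1\drop" -Credential $cred
Get-ChildItem Drop:\

# Or cache the credential for the agent's user profile so UNC access works without an explicit mapping
cmdkey /add:MACHINE1 /user:MACHINE1\dropreader /pass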

How do I use "\\company\network\share\" as a NuGet source in TeamCity?

I've checked that the TeamCity user has access to the network share in question.
All packages from the public NuGet feed are found correctly while packages available on the network share are not.
We use the network share when building via Visual Studio with the exact same path without a problem.
I've tried using "file://ratchet/NuGetRepository" but that doesn't make a difference.
TeamCity log entries and screenshot of the build step configuration shown below:
NuGet command: E:\BuildAgent01\plugins\nuget-agent\bin\JetBrains.TeamCity.NuGetRunner.exe E:\BuildAgent01\tools\NuGet.CommandLine.DEFAULT.nupkg\tools\NuGet.exe restore E:\BuildAgent01\work\95323b7041b60513\MySolution.sln -Source https://nuget.org/api/v2/ -Source \\ratchet\NuGetRepository\
I was able to solve this by specifying the fully qualified name of the network share, e.g. \\ratchet.hq.local\NuGetRepository.
Since the accepted answer did not provide a solution for my setup, I'd like to post what did allow TeamCity to access my network share.
First, a very important note: TeamCity Build Agent may either run as a Windows service or directly in command prompt. For my machine, this had the following consequences:
When run as a Windows service, the build agent was logged in as LocalSystem. For our network share, my machine's credentials were not given permissions.
Note: while this SO thread indicates that the network share can be configured to allow the machine's LocalSystem account to have permission, this was NOT an option for me.
When run in command prompt, the build agent will use the security context of whoever runs it (for me, it was my domain user). Again, for our network share, all domain users are given permissions.
The quick solution was to simply run the build agent in command prompt and call it a day; however, I did really want to run the build agent as a Windows service, since I think it is a cleaner approach.
Here's my solution:
First, I needed to grant my domain user the privilege to log on as a service. This is needed to run the service with my domain user's security context. I navigated to User Rights Assignment within Local Security Policy:
Control Panel -> Administrative Tools -> Local Security Policy -> Local Policies -> User Rights Assignment
Next, I added my domain user to the Log on as a service setting. For this, I made sure to include the domain with my username.
Now that my domain user's security context could be used when starting a service, I navigated to Services (services.msc), located the TeamCity Build Agent service, and edited its properties:
Now, when relaunching the TeamCity Build Agent Windows service, it was able to access the network share since it was using the security context of my domain user. I can now access the NuGet repository on our shared drive and keep the build agent running in the background.
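For reference, the same Log On change can be scripted rather than done through the Services snap-in; the service name and account below are placeholders and may differ on your agent:

# Point the build agent service at the domain account (note the space after each '=' is required by sc.exe)
sc.exe config "TCBuildAgent" obj= "DOMAIN\builduser" password= "P@ssw0rd"

# Restart the service so the new credentials take effect
Restart-Service -Name "TCBuildAgent"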
You can include the package sources in the NuGet.targets file. Just find the commented lines and add your path:
<PackageSource Include="https://nuget.org/api/v2/" />
<PackageSource Include="\\ratchet\NuGetRepository\" />