vNext Workaround for Second Hop Issue on Non-Domain Machines - PowerShell

I've asked the Microsoft Developer Community this question but I haven't had much success.
I am trying to create some automation tests with a vNext Build Definition in which the build agent opens a remote PSSession into a non-domain virtual machine (the test machine) and runs a batch file that can take several arguments. This batch file may read (installer files) from or write (reports) to a network share that is on the domain. The issue that I am running into is the second hop issue. Here is an article about it: https://blogs.technet.microsoft.com/ashleymcglone/2016/08/30/powershell-remoting-kerberos-double-hop-solved-securely/
In my instance, the remote PowerShell session is not able to pass the credentials we authenticated with previously on the test machine through to the network share's resources. We have tried using CredSSP authentication on both the agent and the test machine to enable access, but that has failed. Net use and other commands which call domain resources have also failed. We've even tried modifying the PowerShell on Target Machines task and did not have much luck with it.
What we have discovered is that there is no way to access the domain network shares over a remote PSSession with the following topology: Server A (domain-joined or in a workgroup) ⇒ remote PSSession + CredSSP into Server B (non-domain), using a local admin account on Server B ⇒ calls the network shares with net use using a domain account.
It seems that the second hop only works for domain-joined machines (we have been testing with CredSSP there as well).
Let us know if there is a solution or workaround that we can implement.
Their response was: "If you can make sure the method is correct and the issue is caused by DevOps, we will be happy to help you with your issues about DevOps. Here are some documents that might be helpful: https://learn.microsoft.com/en-us/powershell/scripting/learn/remoting/ps-remoting-second-hop?view=powershell-6."
I've looked at this documentation before and haven't had much success; does anyone else have any suggestions?
Here is a code snippet:
#serverA - local machine
#serverB - $remoteServer
#serverC - DFS namespace server
$session = New-PSSession -ComputerName $remoteServer -Credential $credential -SessionOption (New-PSSessionOption -SkipCACheck -SkipCNCheck -SkipRevocationCheck) -Authentication Credssp
$scriptBlock_runFile = {
#Scenario 0 which works:
#ipconfig
#Scenario 1 which doesn't work:
#& dir \\contoso.com\departments\folder
#Scenario 2 which doesn't work:
#& net use x: \\contoso.com\departments\folder /user:CONTOSO\user "password"
}
Invoke-Command -Session $session -ScriptBlock $scriptBlock_runFile
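A variation we have not exhausted yet would be to skip delegation and map the share inside the session with an explicit domain credential; a rough sketch only (the credential, drive name, and share path are placeholders):
$domainCredential = Get-Credential   # e.g. CONTOSO\user with rights on the share
Invoke-Command -Session $session -ScriptBlock {
    # Map the share with an explicit credential inside the remote session
    New-PSDrive -Name X -PSProvider FileSystem -Root "\\contoso.com\departments\folder" -Credential $Using:domainCredential | Out-Null
    Get-ChildItem X:\
    Remove-PSDrive -Name X
}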

Related

Double-Hop Errors when running Skype for Business Cmdlets

I am attempting to automate the Skype for Business Server installation process in PowerShell. I have a script that remotes into specified machines and begins preparing them as Front End servers. The problem arises when certain SfB cmdlets (SfB commands are all of the form "Verb-Cs...", e.g. Get-CsUser or Get-CsPool) are run in remote sessions; they throw the double-hop error:
Exception: Active Directory error "-2147016672" occurred while searching for domain controllers in domain...
This is after running Enable-CsComputer, which enables the computer's role based on its definition in the topology (the topology was published successfully). The user object is in all required groups (RTCUniversalServerAdmins, Schema Admins, CsAdministrators, and local admin rights on all SfB servers). Oddly enough, the command Import-CsConfiguration -LocalStore does not throw errors, and it's in the same remote session. There may be other local or domain groups that I need to be in, but I cannot pinpoint exactly which, and have not seen them documented in the Skype build guides. Skype commands that have parameters to specify targets or that just pull data, such as Get-CsPool or Get-CsAdForest, do not throw errors because they are run in the local scope. Enable-CsComputer has no parameter for the computer name; it has to be executed on the machine itself.
Enabling CredSSP delegation on each server is not an option, and I don't understand why there is a "second hop" in this command! If the second hop were a resource on a file server or database, that would make sense and be easy to solve, but in this case I can't track it. Can anyone tell me what I may be missing?
Here's a code sample to try to illustrate. From the jumpbox I get the pool data to create an array, and a session is opened to each machine:
$ServerArray = Get-CsPool -Identity $poolName
$i = 0
$SessionArray = @{}    # hashtable of sessions, keyed by index
foreach ($server in $ServerArray.Computers) { $SessionArray[$i] = New-PSSession -ComputerName $server; $i++ }
foreach ($session in $SessionArray.Values) {
    Invoke-Command -Session $session -ScriptBlock {
        # remote commands:
        Import-CsConfiguration -FileName <config file path> -LocalStore   # no errors
        Enable-CsReplica                                                  # no errors
        Enable-CsComputer                                                 # double hop error here
    }
}
If I log into that machine and run the same command, it executes fine, but the intention of the project is to automate it on an arbitrary number of machines.
It looks like it's just trying to authenticate to a domain controller, which is reasonable. You'll have to approach this like any other double-hop issue.
Microsoft has an article dedicated to the double hop issue, and has a few solutions other than CredSSP that you can look at: Making the second hop in PowerShell Remoting
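For example, the resource-based Kerberos constrained delegation option from that article looks roughly like the sketch below. It assumes the ActiveDirectory module, and the computer names (and which machine is actually serving the second hop, possibly a domain controller in your case) are placeholders you would need to confirm:
# Sketch only: allow the front end's computer account to delegate to the second-hop resource
$frontEnd = Get-ADComputer -Identity SfBFrontEnd01            # server the remote session runs on
Set-ADComputer -Identity SecondHopServer -PrincipalsAllowedToDelegateToAccount $frontEnd
# Clear cached Kerberos tickets on the front end so the new delegation is picked up
Invoke-Command -ComputerName SfBFrontEnd01 -ScriptBlock { klist purge -li 0x3e7 }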

Powershell invoke-command multihopping

I have a question regarding multihopping in a windows environment.
Let's say I have a schedule running on Server A (central scheduler) which executes a command on Server B. This script contains a call to save files on a remote filer (UNC path, Server C). Hop 1 (from A to B) works well; hop 2 (from B to C) fails.
I already tested saving the files locally on Server B; that works flawlessly.
I think there's a problem with the second hop. I remember reading something like this on a forum a while ago, but can't remember a solution.
In detail, the command looks like this:
$session = New-PSSession -ComputerName ComputerName
$templatepath = "\\filerpath\"
Invoke-Command -Session $session -Scriptblock { powershell ovpmutil cfg pol dnl $Using:templatepath /p \BSH }
To clarify: PowerShell gives me an "Access denied" when performing the second hop. I already enabled credential delegation as described here:
Enabling Multihop Remoting
Any help is appreciated. Thanks in advance
The solution is a real pain in the backside if you ask me but here it is...
On the originating server (A), enable the client side of CredSSP:
Set-Item WSMan:\localhost\Client\Auth\CredSSP -Value $true
On the intermediate server (B), enable the service side so it will accept the delegated credentials:
Set-Item WSMan:\localhost\Service\Auth\CredSSP -Value $true
Open Group Policy editor on server A, navigate to:
Computer Configuration > Administrative Templates > System > Credentials Delegation
Enable these options:
Allow delegating fresh credentials
Allow delegating fresh credentials with NTLM-only server authentication
Both policies need to have server B added to the allowed list (entries take the form WSMAN/servername); wildcards are allowed. Note that if you use RDP from server A, you'll also need to add TERMSRV/*.
When running Invoke-Command from server A, include the -Authentication CredSSP param.
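With that in place, the call from server A looks something like this (the server name, credential, and paths are placeholders):
$cred = Get-Credential DOMAIN\user
Invoke-Command -ComputerName ServerB -Credential $cred -Authentication Credssp -ScriptBlock {
    # The hop from B to the filer (C) now runs with the delegated credential
    Copy-Item -Path C:\temp\report.txt -Destination \\ServerC\share\
}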
Note that if you are saving SecureStrings somewhere for the credential used to connect to server C, you'll want to either use a fixed encryption key (specify a byte array) or store plain text and convert it.
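A fixed-key variant might look like the sketch below; the 16-byte key and the file paths are placeholders, and you still need to protect the key itself:
# Persist a SecureString with a fixed AES key so it can be read under a different
# account/machine (the default, key-less form uses DPAPI and is user/machine bound).
$key = (1..16 | ForEach-Object { [byte]$_ })        # placeholder key - store it securely
Read-Host -AsSecureString "Password for server C" |
    ConvertFrom-SecureString -Key $key | Set-Content C:\scripts\serverC.cred
# Later, rebuild the credential:
$securePass = Get-Content C:\scripts\serverC.cred | ConvertTo-SecureString -Key $key
$cred = New-Object System.Management.Automation.PSCredential("DOMAIN\user", $securePass)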

Powershell remote access to nanoserver on docker

I have created a W10 VM (guest) running docker, pulled the microsoft/nanoserver image, and hosted a container of the image.
(tutorial here: https://msdn.microsoft.com/en-us/virtualization/windowscontainers/quick_start/quick_start_windows_10)
Everything runs great; even the host can ping the container running under the guest W10. But what I cannot do is connect a remote PowerShell session to the container.
Enter-PSSession -ComputerName "<container ip>" -Credential ~\Administrator
This pops up a dialog asking for a user and password. I cannot leave it blank; the result is access denied. Any ideas how to connect, or how to set a password for the nanoserver container?
I've been struggling with this for a few days now. I think my problem is slightly different though, as I'm trying to do an Enter-PSSession to a Windows docker container from another machine, not from the container host.
In this tutorial (http://dinventive.com/blog/2016/01/30/windows-server-core-hello-container/), the guy makes a nested container PSSession inside a host PSSession.
He uses this command, which is only available in the latest versions of PowerShell (not in v3):
Enter-PSSession -ContainerId "<container ID>"
Get the ID by running:
Get-Container | fl
You also have to check your PowerShell version and upgrade if needed.
To check the PS version:
$PSVersionTable
And to download the latest PowerShell version: https://www.microsoft.com/en-us/download/details.aspx?id=50395
Connecting to a PSSession using an IP address adds some requirements: you must either have the remote device configured to use SSL or have the IP address listed in your trusted hosts.
The solution is either to use the host name for the device (I have had great success with this) or to work with the trusted hosts list. In my experience it works consistently if you add trusted hosts entries on your machine and on the remote machine as well. You can also specify:
Set-Item WSMan:\localhost\Client\TrustedHosts -Value "*"
This will basically put all machines in the trusted hosts list. It has its cons, like all machines being trusted, but in certain restricted networks it's acceptable. Doing this on the host and client machine seems to yield the best results.
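If the wildcard is too broad for your environment, you can append specific entries instead (the names below are placeholders):
# -Concatenate appends to the existing list instead of replacing it
Set-Item WSMan:\localhost\Client\TrustedHosts -Value "192.168.1.50,nanohost01" -Concatenate -Force
Get-Item WSMan:\localhost\Client\TrustedHosts    # verify the resulting list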
When specifying -Credential it expects a credential object. You can craft one before the cmdlet to avoid entering it every time, like so:
$secpass = ConvertTo-SecureString "Password Here" -AsPlainText -Force
$cred = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList "Username Here", $secpass
Enter-PSSession -ComputerName "<container ip>" -Credential $cred
Hard-coding credentials like this in a script is bad practice. You should look into storing credentials in scripts properly; there are plenty of good resources on it.
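One common pattern, as a rough sketch (the path is a placeholder): export the credential once under the account that will run the script, then import it at run time. Export-Clixml protects the password with DPAPI, so the file can only be read by the same user on the same machine:
# Run once, interactively, as the account that will execute the script:
Get-Credential | Export-Clixml -Path C:\scripts\container.cred
# In the script itself:
$cred = Import-Clixml -Path C:\scripts\container.cred
Enter-PSSession -ComputerName "<container ip>" -Credential $cred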

remote powershell script issue for adding IP address as relay - exchange 2010

Firstly, just to say my PowerShell skills are limited, so please be gentle ;-)...
So I've built 4 or 5 runbooks using Microsoft Orchestrator to essentially run some remote PowerShell scripts which do various simple Exchange tasks, such as setting an out-of-office reply, enabling mailboxes, creating shared mailboxes with various permissions, etc. I had been using the same basic connection structure/method for these, which works fine, i.e.
$ExchangeCAS = "<CASServerName>"
$Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri http://<CASServerNameFQDN>/PowerShell/
Import-PSSession $Session
<script body with exchange cmdlets>
Remove-PSSession $Session
However, I'm trying to create an additional runbook to automatically add IP addresses as relays to our 4 hub servers. It's a pretty basic script (that I dug out of / tweaked from the internet) but it's not working when run from Orchestrator. It does work fine when I run it from the PowerShell ISE on the server that has the Exchange Tools installed, and the fact that the other, similar Exchange scripts do work would at least rule out any permissions issues for the Orchestrator service account executing the script. For reference, below is the full script I'm trying to run from Orchestrator as a .NET Activity and am testing using the PowerShell ISE on the Orchestrator server:
$ExchangeCAS = "<ExCAS>"
$Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri http://<ExCASFQDN>/PowerShell/
Import-PSSession $Session
# Get receive connectors to update
$listofIPs = Import-Csv \\<TargetSERVER>\d$\psscripts\MailboxRightsScript\scorch\AddIPlist.TXT
$recCons = Get-ReceiveConnector | Where-Object { $_.Name -match "Anonymous Relay" }
foreach ($recCon in $recCons)
{
    Write-Host "Updating", $recCon.Identity
    foreach ($line in $listofIPs)
    {
        $recCon.RemoteIPRanges += $line.IP
    }
    Set-ReceiveConnector $recCon -RemoteIPRanges $recCon.RemoteIPRanges
}
Remove-PSSession $Session
The error I can see from the Orchestrator server ISE is as below:
Cannot process argument transformation on parameter 'Identity'. Cannot convert the "\Anonymous Relay" value of type "Deserialized.Microsoft.Exchange.Data.Directory.SystemConfiguration.ReceiveConnector" to
type "Microsoft.Exchange.Configuration.Tasks.ReceiveConnectorIdParameter".
From trawling through some articles, it seems this is an issue with how data is passed between local and remote PowerShell and the 'rehydration' of objects. TBH a lot of the detail of those discussions is a bit over my head when it comes to PowerShell, so, not wishing to be lazy, would anyone be able to provide a PowerShell script solution based on what I was trying above, which I can run as a .NET Activity from Orchestrator to add an IP address or addresses (preferably from an input file) as relays? It would also be good to know if the solution is easily modified to remove an address as a relay.
Any help much appreciated...
Use the connector's distinguished name. It will work:
Set-ReceiveConnector $recCon.DistinguishedName -RemoteIPRanges $recCon.RemoteIPRanges
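Applied to the original loop, it would look roughly like this (an untested sketch, reusing the variables from the question):
$recCons = Get-ReceiveConnector | Where-Object { $_.Name -match "Anonymous Relay" }
foreach ($recCon in $recCons)
{
    Write-Host "Updating" $recCon.Identity
    $ranges = $recCon.RemoteIPRanges
    foreach ($line in $listofIPs) { $ranges += $line.IP }
    # Pass a plain string identity so the imported cmdlet doesn't have to convert
    # the deserialized ReceiveConnector object
    Set-ReceiveConnector -Identity $recCon.DistinguishedName -RemoteIPRanges $ranges
}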

How to remotely register static ETW manifests as part of a website deployment?

I'm doing a pilot effort to use the new EventSource (Microsoft.Diagnostics.Tracing.EventSource from NuGet) and its new support for ETW channels in order to write to the Windows event log. The code is in place, and it writes properly to my workstation's event log. I'm thrilled.
Now comes the difficult part. The application that's taking advantage of this capability is a web service, and we deploy it with webdeploy as part of a build-deploy-test system. Usage of ETW channels requires static registration of provider manifests via wevtutil.exe. The EventSource documentation states that this is best done as part of an installer, but this seems a bit beyond webdeploy's capabilities.
Our aim is that we would be able to automatically uninstall the manifest resident on the target server immediately before executing the webdeploy package, and then to import the new manifest after the webdeploy sync has completed. We're not set on this, but it seems like the most sensible way.
For that reason, it seems like maybe this is something that powershell remoting might be able to solve, but it's not an area I know much about.
Has anyone done something like this? Is there a better or simpler way?
There are only a few requirements here: A) the remote machine must have PowerShell remoting enabled, which also means it must have PowerShell 2.0 or higher; B) the script running on the local machine must be able to run as administrator, and the credentials used must have admin privileges on the remote machine. If you can meet those requirements then this should be cake.
On the remote machine you need to execute two commands to enable remoting:
Set-ExecutionPolicy RemoteSigned
Enable-PSRemoting -Force
Then on the local machine from an elevated prompt you should be able to execute something like this from a script:
# these two paths assume these files have been copied to the remote computer and to a directory
# in which the service account has privileges to read i.e. not under a userprofile dir.
$etwDllPath = "c:\somepath\myassembly.mysourcename.etwManifest.dll"
$etwManPath = "c:\somepath\myassembly.mysourcename.etwManifest.man"
$s = New-PSSession -ComputerName <remoteComputerName>
Invoke-Command -Session $s {param($man) wevtutil.exe um $man} -arg $etwManPath
Invoke-Command -Session $s {param($man,$dll) wevtutil.exe im $man /rf:$dll /mf:$dll} -arg $etwManPath, $etwDllPath
Remove-PSSession $s
If you can avoid a remote path with spaces, try to. It will make this easier. :-)